hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
925223d3f0e0b7eb079fde3bd4a1c7613f350d0e | 116 | py | Python | fuzzc/__init__.py | KiLJ4EdeN/fuzzc | 42bedc29198db97c412e5498782b1f80124321ee | [
"MIT"
] | 8 | 2020-04-08T15:05:30.000Z | 2021-05-08T13:21:14.000Z | fuzzc/__init__.py | KiLJ4EdeN/fuzzc | 42bedc29198db97c412e5498782b1f80124321ee | [
"MIT"
] | 1 | 2020-04-09T07:21:32.000Z | 2020-04-11T14:59:44.000Z | fuzzc/__init__.py | KiLJ4EdeN/fuzzc | 42bedc29198db97c412e5498782b1f80124321ee | [
"MIT"
] | 3 | 2020-04-08T17:23:52.000Z | 2020-09-29T10:15:26.000Z | from fuzzc.core import *
from fuzzc.entropy import *
from fuzzc.memberships import *
from fuzzc.operations import *
| 23.2 | 31 | 0.793103 | 16 | 116 | 5.75 | 0.4375 | 0.391304 | 0.48913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 116 | 4 | 32 | 29 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
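The `__init__.py` above re-exports everything from its submodules with star imports. Without an `__all__` list, a star import pulls in every top-level name that does not start with an underscore. A self-contained illustration of that rule (the `fakefuzz` package and `triangular` membership function are made up here; they only stand in for `fuzzc`'s real modules):

```python
import sys
import types

# build a throwaway package in sys.modules to show what
# 'from pkg.mod import *' actually exposes
core = types.ModuleType("fakefuzz.core")
core.triangular = lambda x, a, b, c: max(
    0.0, min((x - a) / (b - a), (c - x) / (c - b)))  # toy membership fn
core._cache = {}  # underscore-prefixed: hidden from star imports
sys.modules["fakefuzz"] = types.ModuleType("fakefuzz")
sys.modules["fakefuzz.core"] = core

ns = {}
exec("from fakefuzz.core import *", ns)
public = sorted(k for k in ns if not k.startswith("__"))
print(public)  # only non-underscore names are pulled in: ['triangular']
```

Defining `__all__ = ['triangular']` in the submodule would make the exported surface explicit instead of relying on the underscore convention.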
9290032db5ff0239fd40f4cecbcf9d3e1245f433 | 1,200 | py | Python | tests/stories/liquidation_stories.py | subba72/balanced-contracts | 23d2c283c26ce78985b169a204db187f0956b35b | [
"MIT"
] | 12 | 2020-08-05T00:28:33.000Z | 2022-03-15T03:55:05.000Z | tests/stories/liquidation_stories.py | subba72/balanced-contracts | 23d2c283c26ce78985b169a204db187f0956b35b | [
"MIT"
] | 212 | 2021-04-20T01:06:45.000Z | 2022-02-04T03:38:28.000Z | tests/stories/liquidation_stories.py | subba72/balanced-contracts | 23d2c283c26ce78985b169a204db187f0956b35b | [
"MIT"
] | 10 | 2021-01-15T03:01:58.000Z | 2022-02-13T03:20:48.000Z | ICX = 10**18
LIQUIDATION_STORIES = {
"stories": [{
"description": "liquidating btest_wallet account by depositing 782 icx collateral and minting 2000 "
"bnusd loan",
"actions": {
"deposited_icx": 782769 * ICX // 1000,
"test_icx": 600,
"test_bnUSD": 2000 * ICX,
"expected_initial_position": "No Debt",
"expected_position": "Liquidate",
"expected_result": "Zero"
}
},
{
"description": "liquidating btest_wallet account by depositing 782 icx collateral and minting "
"1500 bnusd loan",
"actions": {
"deposited_icx": 782769 * ICX // 1000,
"test_icx": 600,
"test_bnUSD": 1500 * ICX,
"expected_initial_position": "No Debt",
"expected_position": "Liquidate",
"expected_result": "Zero"
}
}
]
}
| 41.37931 | 116 | 0.409167 | 84 | 1,200 | 5.642857 | 0.404762 | 0.092827 | 0.113924 | 0.139241 | 0.898734 | 0.898734 | 0.898734 | 0.898734 | 0.898734 | 0.898734 | 0 | 0.086811 | 0.500833 | 1,200 | 28 | 117 | 42.857143 | 0.704508 | 0 | 0 | 0.428571 | 0 | 0 | 0.370833 | 0.041667 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
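Story dictionaries like `LIQUIDATION_STORIES` above are normally consumed by a test driver that loops over the entries and compares outcomes. A minimal sketch of such a driver — the field names come from the dict above, but the `execute` callback (which would perform the deposit and mint against a test network) is hypothetical:

```python
ICX = 10 ** 18

# one story, using the same field names as the table above
STORY = {
    "description": "deposit 782.769 ICX collateral, mint 2000 bnUSD",
    "actions": {
        "deposited_icx": 782769 * ICX // 1000,
        "test_bnUSD": 2000 * ICX,
        "expected_position": "Liquidate",
    },
}

def run_story(story, execute):
    # 'execute' is a hypothetical callback that carries out the actions
    # and returns the resulting position label for the account
    actions = story["actions"]
    return execute(actions) == actions["expected_position"]
```

Keeping the expected labels in the data rather than in the test code is what lets the same driver run every story in the list.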
92abdccad7c40cd6897a76d68b65d6f7ca1d7ca7 | 24,230 | py | Python | final_coding.py | White-Brett/Nutrients-Prediction | bb2c3bfe3718b41fa8e31bcb19ce50c00aa85b66 | [
"CC0-1.0"
] | null | null | null | final_coding.py | White-Brett/Nutrients-Prediction | bb2c3bfe3718b41fa8e31bcb19ce50c00aa85b66 | [
"CC0-1.0"
] | null | null | null | final_coding.py | White-Brett/Nutrients-Prediction | bb2c3bfe3718b41fa8e31bcb19ce50c00aa85b66 | [
"CC0-1.0"
] | null | null | null | Python 3.8.2 (tags/v3.8.2:7b3ab59, Feb 25 2020, 23:03:10) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license()" for more information.
>>> import numpy as np
>>> import pandas as pd
>>> import random
>>> import matplotlib.pyplot as plt
>>> t_1= pd.read_excel("C:/Users/BrettData/Desktop/capst/fertiliser-use.xlsx", sheet_name=0, header=3, names=None, index_col=None, keep_default_na=False)
>>> a=t_1.sample(5)
>>> t1=t_1.iloc[43:59,1:5]
>>> t_2= pd.read_excel("C:/Users/BrettData/Desktop/capst/fertiliser-use.xlsx", sheet_name=1, header=3, names=None, index_col=None, keep_default_na=False)
>>> b=t_2.sample(5)
>>> t2=t_2.iloc[43:59,1:5]
>>> a1=np.where(t1-t2)
>>> a2=np.where(a-b)
>>> a1
(array([ 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 2,
2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4,
4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6,
6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8,
8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 10, 10, 10, 10, 10,
10, 10, 10, 11, 11, 11, 11, 11, 11, 11, 11, 12, 12, 12, 12, 12, 12,
12, 12, 13, 13, 13, 13, 13, 13, 13, 13, 14, 14, 14, 14, 14, 14, 14,
14, 15, 15, 15, 15, 15, 15, 15, 15], dtype=int64), array([0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5,
6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3,
4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1,
2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7,
0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5,
6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7], dtype=int64))
>>> a2
(array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4,
4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5,
5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6,
6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8,
8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9,
9, 9, 9, 9, 9, 9], dtype=int64), array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0,
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2,
3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3,
4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5,
6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6, 7, 8,
9, 10, 11, 12, 13, 14, 15], dtype=int64))
>>> t_1= pd.read_excel("C:/Users/BrettData/Desktop/capst/fertiliser-use.xlsx", sheet_name=2, header=2, names=None, index_col=None, keep_default_na=False)
>>> t_1= pd.read_excel("C:/Users/BrettData/Desktop/capst/fertiliser-use.xlsx", sheet_name=0, header=3, names=None, index_col=None, keep_default_na=False)
>>> t_3= pd.read_excel("C:/Users/BrettData/Desktop/capst/fertiliser-use.xlsx", sheet_name=2, header=2, names=None, index_col=None, keep_default_na=False)
>>> c=t_3.sample(5)
>>> tt3=t_3.iloc[43:59,1:9]
>>> t3=tt3.drop(columns='Unnamed: 3')
>>> t_4= pd.read_excel("C:/Users/BrettData/Desktop/capst/fertiliser-use.xlsx", sheet_name=3, header=2, names=None, index_col=None, keep_default_na=False)
>>> d=t_4.sample(5)
>>> tt4=t_4.iloc[17:35,1:9]
>>> t4=tt4.drop(columns='Unnamed: 5')
>>> b1=np.where(t3-t4)
>>> b2=np.where(c-d)
>>> b1
(array([ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2,
2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3,
3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4,
4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6,
6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7,
7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 8,
8, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9,
9, 9, 9, 9, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10,
10, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 12, 12,
12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 13, 13, 13, 13, 13,
13, 13, 13, 13, 13, 13, 13, 13, 13, 14, 14, 14, 14, 14, 14, 14, 14,
14, 14, 14, 14, 14, 14, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15,
15, 15, 15, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16,
17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 18, 18, 18,
18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 19, 19, 19, 19, 19, 19,
19, 19, 19, 19, 19, 19, 19, 19, 20, 20, 20, 20, 20, 20, 20, 20, 20,
20, 20, 20, 20, 20, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21,
21, 21, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 23,
23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 24, 24, 24, 24,
24, 24, 24, 24, 24, 24, 24, 24, 24, 24, 25, 25, 25, 25, 25, 25, 25,
25, 25, 25, 25, 25, 25, 25, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26,
26, 26, 26, 26, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27,
27, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 29, 29,
29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 30, 30, 30, 30, 30,
30, 30, 30, 30, 30, 30, 30, 30, 30, 31, 31, 31, 31, 31, 31, 31, 31,
31, 31, 31, 31, 31, 31, 32, 32, 32, 32, 32, 32, 32, 32, 32, 32, 32,
32, 32, 32], dtype=int64), array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2,
3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5,
6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8,
9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0,
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3,
4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13,
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2,
3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5,
6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8,
9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0,
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3,
4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
11, 12, 13], dtype=int64))
>>> b2
(array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2,
2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3,
3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4,
4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,
5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6,
6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7,
7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8,
8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9,
9, 9], dtype=int64), array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16,
17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13,
14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18,
19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15,
16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19], dtype=int64))
>>> c1=np.where(a1-b2)
Traceback (most recent call last):
File "<pyshell#28>", line 1, in <module>
c1=np.where(a1-b2)
TypeError: unsupported operand type(s) for -: 'tuple' and 'tuple'
>>> a1.plot()
Traceback (most recent call last):
File "<pyshell#29>", line 1, in <module>
a1.plot()
AttributeError: 'tuple' object has no attribute 'plot'
>>> a2.plot()
Traceback (most recent call last):
File "<pyshell#30>", line 1, in <module>
a2.plot()
AttributeError: 'tuple' object has no attribute 'plot'
>>> a1
(array([ 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 2,
2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4,
4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6,
6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8,
8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 10, 10, 10, 10, 10,
10, 10, 10, 11, 11, 11, 11, 11, 11, 11, 11, 12, 12, 12, 12, 12, 12,
12, 12, 13, 13, 13, 13, 13, 13, 13, 13, 14, 14, 14, 14, 14, 14, 14,
14, 15, 15, 15, 15, 15, 15, 15, 15], dtype=int64), array([0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5,
6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3,
4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1,
2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7,
0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5,
6, 7, 0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7], dtype=int64))
>>> b1
(array([ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2,
2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3,
3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4,
4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6,
6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7,
7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 8,
8, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9,
9, 9, 9, 9, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10,
10, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 12, 12,
12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 13, 13, 13, 13, 13,
13, 13, 13, 13, 13, 13, 13, 13, 13, 14, 14, 14, 14, 14, 14, 14, 14,
14, 14, 14, 14, 14, 14, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15,
15, 15, 15, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16,
17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 18, 18, 18,
18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 19, 19, 19, 19, 19, 19,
19, 19, 19, 19, 19, 19, 19, 19, 20, 20, 20, 20, 20, 20, 20, 20, 20,
20, 20, 20, 20, 20, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21,
21, 21, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 23,
23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 24, 24, 24, 24,
24, 24, 24, 24, 24, 24, 24, 24, 24, 24, 25, 25, 25, 25, 25, 25, 25,
25, 25, 25, 25, 25, 25, 25, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26,
26, 26, 26, 26, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27, 27,
27, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 29, 29,
29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 30, 30, 30, 30, 30,
30, 30, 30, 30, 30, 30, 30, 30, 30, 31, 31, 31, 31, 31, 31, 31, 31,
31, 31, 31, 31, 31, 31, 32, 32, 32, 32, 32, 32, 32, 32, 32, 32, 32,
32, 32, 32], dtype=int64), array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2,
3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5,
6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8,
9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0,
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3,
4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13,
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2,
3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5,
6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8,
9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0,
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3,
4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
11, 12, 13], dtype=int64))
>>> a2
(array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4,
4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5,
5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6,
6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8,
8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9,
9, 9, 9, 9, 9, 9], dtype=int64), array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0,
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2,
3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3,
4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5,
6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 14, 15, 0, 1, 2, 3, 4, 5, 6, 7, 8,
9, 10, 11, 12, 13, 14, 15], dtype=int64))
>>> b2
(array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2,
2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3,
3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4,
4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,
5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6,
6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7,
7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8,
8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9,
9, 9], dtype=int64), array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16,
17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13,
14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7,
8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4,
5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1,
2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18,
19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15,
16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 0, 1, 2, 3, 4, 5, 6,
7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19], dtype=int64))
>>> t1.plot()
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21EF413D0>
>>> plot.show()
Traceback (most recent call last):
File "<pyshell#36>", line 1, in <module>
plot.show()
NameError: name 'plot' is not defined
>>> plt.show()
>>> t1.plot(kind='area')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21EF41640>
>>> plt.show
<function show at 0x000002C21CA5A820>
>>> plt.show()
>>> t1.plot(kind='area')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21EF4B1F0>
>>> plt.show()
>>> t1.plot(kind='hist')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21F7808B0>
>>> plt.show()
>>> t2.plot()
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21F17F370>
>>> plt.show()
>>> t2.plot(kind='area')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C223287A60>
>>> plt.show()
>>> t2.plot(kind='hist')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21F154760>
>>> plt.show()
Traceback (most recent call last):
File "<pyshell#50>", line 1, in <module>
plt.show()
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\matplotlib\pyplot.py", line 269, in show
return _show(*args, **kw)
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\matplotlib\cbook\deprecation.py", line 413, in wrapper
return func(*args, **kwargs)
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\matplotlib\backend_bases.py", line 3311, in show
cls.mainloop()
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\matplotlib\backends\_backend_tk.py", line 981, in mainloop
managers[0].window.mainloop()
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.8_3.8.752.0_x64__qbz5n2kfra8p0\lib\tkinter\__init__.py", line 1420, in mainloop
self.tk.mainloop(n)
KeyboardInterrupt
t3.plot()
>>> t3.plot()
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21EFBE6A0>
>>> plt.show()
>>> t3.plot(kind='area')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C2205C8910>
>>> plt.show()
>>> t3.plot(kind='hist')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21F7BDCA0>
>>> plt.show()
>>> t4.plot()
Traceback (most recent call last):
File "<pyshell#57>", line 1, in <module>
t4.plot()
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\pandas\plotting\_core.py", line 847, in __call__
return plot_backend.plot(data, kind=kind, **kwargs)
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\pandas\plotting\_matplotlib\__init__.py", line 61, in plot
plot_obj.generate()
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\pandas\plotting\_matplotlib\core.py", line 261, in generate
self._compute_plot_data()
File "C:\Users\BrettData\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\pandas\plotting\_matplotlib\core.py", line 410, in _compute_plot_data
raise TypeError("no numeric data to plot")
TypeError: no numeric data to plot
>>> t4
Gypsum Sulfur Sulfuric Acid ... Compost Dried manure Sewage sludge
17 1415900 168985 44801 ... 88124 160680 152740
18 1440326 158030 43998 ... 67563 155974 232965
19 1162055 176607 105010 ... 86146 156870 92814
20 1314697 173377 103366 ... 67799 125754 106018
21 1489254 304627 139220 ... 80149 154455 58450
22 1489116 237807 111041 ... 96220 144157 117134
23 1533445 217365 116585 ... 104166 168805 84824
24 1399232 243301 108109 ... 74860 146050 96066
25 1543232 613711 63688 ... 133126 188751 87425
26 1396707 147768 74235 ... 77143 190435 173635
27 1649057 229019 87729 ... 43327 140137 96556
28 1729496 282780 103005 ... 94159 204104 73086
29 1856233 390792 79648 ... 170071 175738 110876
30 2201487 332744 122441 ... 171785 132704 230134
31 2337083 438076 405524 ... 118068 193179 214638
32 2488392 415240 933589 ... 111425 95810 211798
33 ...
[17 rows x 7 columns]
>>> t4=tt4=t_4.iloc[17:32,1:9]
>>> t4.plot()
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21EFBE820>
>>> plt.show()
>>> t4.plot(kind='area')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C22071C2E0>
>>> plt.show()
>>> t4.plot(kind='hist')
<matplotlib.axes._subplots.AxesSubplot object at 0x000002C21F5A0A90>
>>> plt.show()
>>> t3.columns
Index(['Ammonia (Anhydrous)', 'Ammonia (Aqua)', 'Ammonium (Nitrate)',
'Ammonium (Sulfate)', ' Nitrogen solutions', ' Sodium nitrate',
' Urea '],
dtype='object')
>>> print random.randint(16000, 11000)
SyntaxError: invalid syntax
>>> import random
>>> print( random.randint(16000, 11000)
print
SyntaxError: invalid syntax
>>> print( random.randint(16000, 11000)
| 65.309973 | 219 | 0.446224 | 4,909 | 24,230 | 2.182522 | 0.070483 | 0.027254 | 0.038641 | 0.051521 | 0.794568 | 0.774501 | 0.769274 | 0.748927 | 0.720553 | 0.685925 | 0 | 0.394062 | 0.345316 | 24,230 | 371 | 220 | 65.309973 | 0.281347 | 0 | 0 | 0.6875 | 0 | 0.024457 | 0.089518 | 0.07506 | 0 | 0 | 0.010561 | 0 | 0 | 0 | null | null | 0 | 0.013587 | null | null | 0.01087 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
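The session's `np.where(t1 - t2)` calls return a plain Python tuple of two index arrays — the row indices and column indices of every nonzero difference. That is also why the transcript's `a1 - b2` raised `TypeError` (tuples do not support `-`) and `a1.plot()` raised `AttributeError` (tuples have no `plot`). A small self-contained illustration of the pattern, with toy frames standing in for the fertiliser sheets:

```python
import numpy as np
import pandas as pd

t1 = pd.DataFrame([[1, 2], [3, 4]])
t2 = pd.DataFrame([[1, 0], [3, 9]])

# np.where on the difference yields (row_indices, col_indices)
# of the cells where the two frames disagree
rows, cols = np.where(t1 - t2)
diff_cells = [(int(r), int(c)) for r, c in zip(rows, cols)]
print(diff_cells)  # [(0, 1), (1, 1)]
```

If the goal is only "which cells differ", `t1.ne(t2)` gives a boolean frame directly and avoids juggling the raw index tuple.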
2b7eb48ed749bb68f5eb70986c8f71ab13c46df5 | 3,505 | py | Python | gameComponents/comparisonConditional.py | brktrkmn/Rock-Paper-Scissors-Game | ce0cc9fde52573f1e55e78f40c132670581a12bd | [
"Unlicense"
] | null | null | null | gameComponents/comparisonConditional.py | brktrkmn/Rock-Paper-Scissors-Game | ce0cc9fde52573f1e55e78f40c132670581a12bd | [
"Unlicense"
] | null | null | null | gameComponents/comparisonConditional.py | brktrkmn/Rock-Paper-Scissors-Game | ce0cc9fde52573f1e55e78f40c132670581a12bd | [
"Unlicense"
] | null | null | null | from gameComponents import gameVars
# 'if' introduces a conditional
# parentheses are not required around the condition
# = assigns a value
# == compares values
# the body of an 'if' is indented; indented lines belong to the conditional
def comparisonConditional():
if gameVars.computer_choice == gameVars.player_choice:
print(" ")
print(" //////////////////////")
print(" # #")
print(" # T-I-E ! #")
print(" # #")
print(" //////////////////////")
print(" ")
    # elif -> "else if": checked only when the previous condition was False
elif gameVars.computer_choice == "rock":
if gameVars.player_choice == "scissors":
print(" ")
print(" //////////////////////")
print(" # #")
print(" # Y-O-U L-O-S-E ! #")
print(" # #")
print(" //////////////////////")
print(" ")
            # long form: gameVars.player_lives = gameVars.player_lives - 1
            # shorthand augmented assignment:
gameVars.player_lives -= 1
else:
print(" ")
print(" //////////////////////")
print(" # #")
print(" # Y-O-U W-I-N ! #")
print(" # #")
print(" //////////////////////")
print(" ")
gameVars.computer_lives -= 1
elif gameVars.computer_choice == "paper":
if gameVars.player_choice == "scissors":
print(" ")
print(" //////////////////////")
print(" # #")
print(" # Y-O-U L-O-S-E ! #")
print(" # #")
print(" //////////////////////")
print(" ")
gameVars.player_lives -= 1
else:
print(" ")
print(" //////////////////////")
print(" # #")
print(" # Y-O-U W-I-N ! #")
print(" # #")
print(" //////////////////////")
print(" ")
            gameVars.computer_lives -= 1
elif gameVars.computer_choice == "scissors":
if gameVars.player_choice == "paper":
print(" ")
print(" //////////////////////")
print(" # #")
print(" # Y-O-U L-O-S-E ! #")
print(" # #")
print(" //////////////////////")
print(" ")
gameVars.player_lives -= 1
else:
print(" ")
print(" //////////////////////")
print(" # #")
print(" # Y-O-U W-I-N ! #")
print(" # #")
print(" //////////////////////")
print(" ")
gameVars.computer_lives -= 1 | 42.743902 | 89 | 0.258203 | 201 | 3,505 | 4.422886 | 0.263682 | 0.393701 | 0.354331 | 0.15748 | 0.561305 | 0.561305 | 0.561305 | 0.561305 | 0.561305 | 0.561305 | 0 | 0.004447 | 0.550927 | 3,505 | 82 | 90 | 42.743902 | 0.560356 | 0.066476 | 0 | 0.880597 | 0 | 0 | 0.627337 | 0.094392 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014925 | true | 0 | 0.014925 | 0 | 0.029851 | 0.731343 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
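The banner-printing blocks in `comparisonConditional` are repeated six times, and the win/lose logic is restated for each computer choice. A common refactor is to factor the banner into one helper and drive the outcome from a lookup table — a sketch, independent of the `gameVars` module (the helper names are made up here):

```python
def print_banner(message):
    # frame a message in the same boxed style the game uses
    print()
    print(" " + "/" * 22)
    print(" # {:^18} #".format(""))
    print(" # {:^18} #".format(message))
    print(" # {:^18} #".format(""))
    print(" " + "/" * 22)
    print()

# each key beats the value it maps to
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def judge(computer, player):
    # return 'tie', 'lose' (player loses a life) or 'win' (computer does)
    if computer == player:
        return "tie"
    return "lose" if BEATS[computer] == player else "win"
```

With `judge` in place, the original function shrinks to one banner call and one life decrement per outcome.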
from __future__ import absolute_import, print_function, unicode_literals
import os
from os.path import dirname, join
import sys
import indra.statements
from indra.sources import trips
from indra.assemblers.pysb import PysbAssembler
def test_bind():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'The receptor tyrosine kinase EGFR binds the growth factor ligand EGF.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert has_hgnc_ref(st.members[0])
    assert has_hgnc_ref(st.members[1])
    os.remove(fname)


def test_complex_bind():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'The EGFR-EGF complex binds another EGFR-EGF complex.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert has_hgnc_ref(st.members[0])
    assert has_hgnc_ref(st.members[1])
    assert st.members[0].bound_conditions
    assert st.members[1].bound_conditions
    os.remove(fname)


def test_complex_bind2():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'The EGFR-EGFR complex binds GRB2.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'EGFR'
    assert st.members[1].name == 'GRB2'
    assert len(st.members[0].bound_conditions) == 1
    assert st.members[0].bound_conditions[0].agent.name == 'EGFR'
    os.remove(fname)


def test_complex_bind3():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'RAF binds to the RAS-GTP complex.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'RAF'
    assert st.members[1].name == 'RAS'
    assert len(st.members[1].bound_conditions) == 1
    assert st.members[1].bound_conditions[0].agent.name == 'GTP'
    os.remove(fname)


def test_complex_bind4():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'The RAF-RAS complex binds another RAF-RAS complex.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'RAF'
    assert st.members[1].name == 'RAF'
    assert len(st.members[0].bound_conditions) == 1
    assert st.members[0].bound_conditions[0].agent.name == 'RAS'
    assert len(st.members[1].bound_conditions) == 1
    assert st.members[1].bound_conditions[0].agent.name == 'RAS'
    os.remove(fname)


def test_bound_mod():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'The adaptor protein GRB2 can bind EGFR that is phosphorylated on tyrosine.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert has_hgnc_ref(st.members[0])
    assert has_hgnc_ref(st.members[1])
    assert st.members[1].mods
    assert st.members[1].mods[0].mod_type == 'phosphorylation'
    assert st.members[1].mods[0].residue == 'Y'
    os.remove(fname)


def test_not_bound_to():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'BRAF that is not bound to Vemurafenib binds MEK1.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'BRAF'
    assert has_hgnc_ref(st.members[0])
    assert st.members[1].name == 'MAP2K1'
    assert has_hgnc_ref(st.members[1])
    assert len(st.members[0].bound_conditions) == 1
    assert st.members[0].bound_conditions[0].agent.name.lower() == 'vemurafenib'
    assert st.members[0].bound_conditions[0].is_bound is False
    os.remove(fname)


def test_not_bound_to2():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'BRAF, not bound to Vemurafenib, phosphorylates MEK1.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_phosphorylation(st)
    assert st.enz is not None
    assert st.sub is not None
    assert st.enz.name == 'BRAF'
    assert has_hgnc_ref(st.enz)
    assert st.sub.name == 'MAP2K1'
    assert has_hgnc_ref(st.sub)
    assert len(st.enz.bound_conditions) == 1
    assert st.enz.bound_conditions[0].agent.name.lower() == 'vemurafenib'
    assert st.enz.bound_conditions[0].is_bound is False
    os.remove(fname)


def test_not_bound_to3():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'SOS1 bound to GRB2 binds NRAS that is not bound to BRAF and GTP.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert st.members[0].name == 'SOS1'
    assert st.members[1].name == 'NRAS'
    assert len(st.members[0].bound_conditions) == 1
    assert len(st.members[1].bound_conditions) == 2
    os.remove(fname)


def test_not_bound_to4():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'BRAF that is not bound to NRAS and Vemurafenib binds BRAF ' + \
          'that is not bound to Vemurafenib.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert st.members[0].name == 'BRAF'
    assert st.members[1].name == 'BRAF'
    assert len(st.members[0].bound_conditions) == 2
    assert len(st.members[1].bound_conditions) == 1
    os.remove(fname)


def test_bound_to():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'NRAS, bound to GTP, binds BRAF.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'NRAS'
    assert has_hgnc_ref(st.members[0])
    assert st.members[1].name == 'BRAF'
    assert has_hgnc_ref(st.members[1])
    assert len(st.members[0].bound_conditions) == 1
    assert st.members[0].bound_conditions[0].agent.name == 'GTP'
    os.remove(fname)


def test_bound_to2():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'EGFR-bound GRB2 binds SOS1.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'GRB2'
    assert has_hgnc_ref(st.members[0])
    assert st.members[1].name == 'SOS1'
    assert has_hgnc_ref(st.members[1])
    assert len(st.members[0].bound_conditions) == 1
    assert st.members[0].bound_conditions[0].agent.name == 'EGFR'
    os.remove(fname)


def test_bound_to3():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'SOS1, bound to GRB2 binds RAS.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'SOS1'
    assert has_hgnc_ref(st.members[0])
    assert st.members[1].name == 'RAS'
    assert len(st.members[0].bound_conditions) == 1
    assert st.members[0].bound_conditions[0].agent.name == 'GRB2'
    os.remove(fname)


def test_bound_to4():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'RAS, bound to SOS1, binds GTP.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'RAS'
    assert st.members[1].name == 'GTP'
    assert len(st.members[0].bound_conditions) == 1
    assert st.members[0].bound_conditions[0].agent.name == 'SOS1'
    os.remove(fname)


def test_bound_to5():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'BRAF that is bound to NRAS and Vemurafenib binds ' + \
          'BRAF that is bound to NRAS and Vemurafenib.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_complex(st)
    assert len(st.members) == 2
    assert st.members[0].name == 'BRAF'
    assert st.members[1].name == 'BRAF'
    assert len(st.members[0].bound_conditions) == 2
    assert len(st.members[1].bound_conditions) == 2
    os.remove(fname)


def test_transphosphorylate():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'EGFR, bound to EGFR, transphosphorylates itself.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_transphosphorylation(st)
    assert st.enz is not None
    assert st.enz.name == 'EGFR'
    assert has_hgnc_ref(st.enz)
    assert st.residue is None
    assert len(st.enz.bound_conditions) == 1
    assert st.enz.bound_conditions[0].agent.name == 'EGFR'
    os.remove(fname)


def test_transphosphorylate2():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'EGFR, bound to EGFR, transphosphorylates itself on a tyrosine residue.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_transphosphorylation(st)
    assert st.enz is not None
    assert st.enz.name == 'EGFR'
    assert has_hgnc_ref(st.enz)
    assert st.residue == 'Y'
    assert len(st.enz.bound_conditions) == 1
    assert st.enz.bound_conditions[0].agent.name == 'EGFR'
    os.remove(fname)


def test_act_mod():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'MEK1, phosphorylated at Ser218 and Ser222, is activated.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_actmod(st)
    assert st.agent is not None
    assert st.agent.name == 'MAP2K1'
    residues = [m.residue for m in st.agent.mods]
    positions = [m.position for m in st.agent.mods]
    assert residues == ['S', 'S']
    assert positions == ['218', '222']
    assert st.is_active
    os.remove(fname)


def test_bound_phosphorylate():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'RAF, bound to RAF, phosphorylates MEK1.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_phosphorylation(st)
    assert st.enz is not None
    assert st.enz.name == 'RAF'
    assert st.sub is not None
    assert st.sub.name == 'MAP2K1'
    assert st.residue is None
    os.remove(fname)


'''
def test_bound_not_bound_phosphorylate():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'BRAF-bound BRAF that is not bound to Vemurafenib phosphorylates MEK1.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_phosphorylation(st)
    assert st.enz is not None
    assert st.enz.name == 'MAP2K1'
    assert st.monomer.mod == ['PhosphorylationSerine', 'PhosphorylationSerine']
    assert st.monomer.mod_pos == ['218', '222']
    assert st.relationship == 'increases'
    os.remove(fname)
'''


def test_act_phosphorylate():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'Active MEK1 phosphorylates ERK2 at Tyr187.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_phosphorylation(st)
    assert st.enz is not None
    assert st.enz.name == 'MAP2K1'
    assert st.sub is not None
    assert st.sub.name == 'MAPK1'
    assert st.residue == 'Y'
    assert st.position == '187'
    os.remove(fname)


def test_dephosphorylate():
    fname = sys._getframe().f_code.co_name + '.xml'
    txt = 'DUSP6 dephosphorylates ERK2 at Tyr187.'
    tp = trips.process_text(txt, fname, False)
    assert len(tp.statements) == 1
    st = tp.statements[0]
    assert is_dephosphorylation(st)
    assert st.enz is not None
    assert st.enz.name == 'DUSP6'
    assert st.sub is not None
    assert st.sub.name == 'MAPK1'
    assert st.residue == 'Y'
    assert st.position == '187'
    os.remove(fname)


def is_complex(statement):
    return isinstance(statement, indra.statements.Complex)


def is_phosphorylation(statement):
    return isinstance(statement, indra.statements.Phosphorylation)


def is_transphosphorylation(statement):
    return isinstance(statement, indra.statements.Transphosphorylation)


def is_actmod(statement):
    return isinstance(statement, indra.statements.ActiveForm)


def is_dephosphorylation(statement):
    return isinstance(statement, indra.statements.Dephosphorylation)


def has_hgnc_ref(agent):
    return 'HGNC' in agent.db_refs
from config import Config
import pandas as pd
import numpy as np
from footkit.utils import set_to_list
class Preprocess(Config):
    def __init__(self, Config, df, teams_df):
        # inherit configuration parameters
        super().__init__()
        self.df = df
        self.teams_df = teams_df

    def get_teams_df(self):
        '''
        Reshape the match-level layout (team1 vs team2) into one row per
        (team, match):
            team1 - match
            team2 - match
        '''
        df = self.df.copy()
        self.teams_df.sort_values('DateTime', inplace=True)
        self.teams_df.reset_index(drop=True, inplace=True)
        print('before processing Shape==1: ', self.teams_df[(self.teams_df['TeamId'] == 113) & (self.teams_df['DateTime'] == '2019-09-16 18:45:00')].shape[0] == 1)
        # team reference (one row per team per match)
        trans = {
            'HomeTeamId': 'TeamId',
            'HomeTeam': 'TeamName',
            'HomeTeam_xG': 'xG',
            'AwayTeamId': 'TeamId',
            'AwayTeam': 'TeamName',
            'AwayTeam_xG': 'xG',
        }
        home_df = pd.merge(
            df[['HomeTeamId', 'HomeTeam', 'LeagueName', 'Season', 'DateTime', 'WinProba', 'DrawProba', 'LoseProba']].rename(columns=trans),
            self.teams_df,
            on=['TeamId', 'TeamName', 'LeagueName', 'Season', 'DateTime'],
            how='left'
        )
        home_df['h_a'] = 'h'
        away_df = pd.merge(
            df[['AwayTeamId', 'AwayTeam', 'LeagueName', 'Season', 'DateTime', 'WinProba', 'DrawProba', 'LoseProba']].rename(columns=trans),
            self.teams_df,
            on=['TeamId', 'TeamName', 'LeagueName', 'Season', 'DateTime'],
            how='left'
        )
        away_df['h_a'] = 'a'
        self.teams_df = pd.concat([home_df, away_df], sort=False, axis=0).copy()
        print('after processing Shape==1: ', self.teams_df[(self.teams_df['TeamId'] == 113) & (self.teams_df['DateTime'] == '2019-09-16 18:45:00')].shape[0] == 1)
        return self.teams_df
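# The home/away reshape above can be reproduced on a toy frame. A minimal
# standalone sketch follows; the column set is reduced for brevity, so this is
# illustrative rather than the exact footkit schema.

```python
import pandas as pd

# One row per match, both teams in the same row.
matches = pd.DataFrame({
    'HomeTeam': ['Ajax', 'PSV'],
    'AwayTeam': ['PSV', 'Ajax'],
    'DateTime': ['2019-09-16', '2019-09-23'],
})

# Split into two frames -- one from the home team's view, one from the away
# team's -- then stack them so each match contributes two (team, match) rows.
home = matches.rename(columns={'HomeTeam': 'TeamName'})[['TeamName', 'DateTime']]
home['h_a'] = 'h'
away = matches.rename(columns={'AwayTeam': 'TeamName'})[['TeamName', 'DateTime']]
away['h_a'] = 'a'
teams = pd.concat([home, away], axis=0, sort=False).reset_index(drop=True)
print(len(teams))  # 2 matches -> 4 team-match rows
```

# The same pattern scales to the full schema: rename home/away columns to a
# shared name, tag the perspective in `h_a`, and concatenate.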
    def fillnull_on_test(self, data, targets):
        data = data.copy()
        for i in targets:
            data[i] = np.where(data.IsResult == False, np.nan, data[i])
        return data
    def get_target_on_df(self, data, HomeTeamGoal, AwayTeamGoal):
        '''
        example:
        get_target_on_df(
            df,
            HomeTeamGoal = 'FTHG',
            AwayTeamGoal = 'FTAG',
        )
        '''
        data = data.copy()
        data_columns = list(data.columns)
        # Bet types
        data['П1'] = np.where(data[HomeTeamGoal] > data[AwayTeamGoal], 1, 0)
        data['Х'] = np.where(data[HomeTeamGoal] == data[AwayTeamGoal], 1, 0)
        data['П2'] = np.where(data[HomeTeamGoal] < data[AwayTeamGoal], 1, 0)
        data['Победитель'] = np.where(data[HomeTeamGoal] > data[AwayTeamGoal], 1, np.where(data[HomeTeamGoal] == data[AwayTeamGoal], 0, 2))
        data['1Х'] = np.where(data[HomeTeamGoal] >= data[AwayTeamGoal], 1, 0)
        data['12'] = np.where(data[HomeTeamGoal] != data[AwayTeamGoal], 1, 0)
        data['2Х'] = np.where(data[HomeTeamGoal] <= data[AwayTeamGoal], 1, 0)
        data['ТМ1'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 1, 1, 0)
        data['ТБ1'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 1, 1, 0)
        data['ТМ1.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 1.5, 1, 0)
        data['ТБ1.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 1.5, 1, 0)
        data['ТМ2'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 2, 1, 0)
        data['ТБ2'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 2, 1, 0)
        data['ТМ2.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 2.5, 1, 0)
        data['ТБ2.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 2.5, 1, 0)
        data['ТМ3'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 3, 1, 0)
        data['ТБ3'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 3, 1, 0)
        data['ТМ3.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 3.5, 1, 0)
        data['ТБ3.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 3.5, 1, 0)
        data['ТМ4'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 4, 1, 0)
        data['ТБ4'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 4, 1, 0)
        data['ТМ4.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 4.5, 1, 0)
        data['ТБ4.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 4.5, 1, 0)
        data['ТМ5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 5, 1, 0)
        data['ТБ5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 5, 1, 0)
        data['ТМ5.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 5.5, 1, 0)
        data['ТБ5.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 5.5, 1, 0)
        data['Голов 0-1'] = np.where((data[HomeTeamGoal] + data[AwayTeamGoal] >= 0) & (data[HomeTeamGoal] + data[AwayTeamGoal] <= 1), 1, 0)
        data['Голов 2-3'] = np.where((data[HomeTeamGoal] + data[AwayTeamGoal] >= 2) & (data[HomeTeamGoal] + data[AwayTeamGoal] <= 3), 1, 0)
        data['Голов 4-5'] = np.where((data[HomeTeamGoal] + data[AwayTeamGoal] >= 4) & (data[HomeTeamGoal] + data[AwayTeamGoal] <= 5), 1, 0)
        data['Голов 6>'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 6, 1, 0)
        data['Ком1 забьет'] = np.where(data[HomeTeamGoal] > 0, 1, 0)
        data['Ком2 забьет'] = np.where(data[AwayTeamGoal] > 0, 1, 0)
        data['Обе забьют'] = np.where((data[HomeTeamGoal] > 0) & (data[AwayTeamGoal] > 0), 1, 0)
        data['Колво голов КОМ1(0)'] = np.where(data[HomeTeamGoal] == 0, 1, 0)
        data['Колво голов КОМ1(1)'] = np.where(data[HomeTeamGoal] == 1, 1, 0)
        data['Колво голов КОМ1(2)'] = np.where(data[HomeTeamGoal] == 2, 1, 0)
        data['Колво голов КОМ1(3>)'] = np.where(data[HomeTeamGoal] >= 3, 1, 0)
        data['Колво голов КОМ2(0)'] = np.where(data[AwayTeamGoal] == 0, 1, 0)
        data['Колво голов КОМ2(1)'] = np.where(data[AwayTeamGoal] == 1, 1, 0)
        data['Колво голов КОМ2(2)'] = np.where(data[AwayTeamGoal] == 2, 1, 0)
        data['Колво голов КОМ2(3>)'] = np.where(data[AwayTeamGoal] >= 3, 1, 0)
        data['Инд.Тотал К1 ТМ0.5'] = np.where(data[HomeTeamGoal] < 0.5, 1, 0)
        data['Инд.Тотал К1 ТБ0.5'] = np.where(data[HomeTeamGoal] > 0.5, 1, 0)
        data['Инд.Тотал К2 ТМ0.5'] = np.where(data[AwayTeamGoal] < 0.5, 1, 0)
        data['Инд.Тотал К2 ТБ0.5'] = np.where(data[AwayTeamGoal] > 0.5, 1, 0)
        data['Инд.Тотал К1 ТМ1'] = np.where(data[HomeTeamGoal] <= 1, 1, 0)
        data['Инд.Тотал К1 ТБ1'] = np.where(data[HomeTeamGoal] >= 1, 1, 0)
        data['Инд.Тотал К2 ТМ1'] = np.where(data[AwayTeamGoal] <= 1, 1, 0)
        data['Инд.Тотал К2 ТБ1'] = np.where(data[AwayTeamGoal] >= 1, 1, 0)
        data['Инд.Тотал К1 ТМ1.5'] = np.where(data[HomeTeamGoal] < 1.5, 1, 0)
        data['Инд.Тотал К1 ТБ1.5'] = np.where(data[HomeTeamGoal] > 1.5, 1, 0)
        data['Инд.Тотал К2 ТМ1.5'] = np.where(data[AwayTeamGoal] < 1.5, 1, 0)
        data['Инд.Тотал К2 ТБ1.5'] = np.where(data[AwayTeamGoal] > 1.5, 1, 0)
        data['Инд.Тотал К1 ТМ2'] = np.where(data[HomeTeamGoal] <= 2, 1, 0)
        data['Инд.Тотал К1 ТБ2'] = np.where(data[HomeTeamGoal] >= 2, 1, 0)
        data['Инд.Тотал К2 ТМ2'] = np.where(data[AwayTeamGoal] <= 2, 1, 0)
        data['Инд.Тотал К2 ТБ2'] = np.where(data[AwayTeamGoal] >= 2, 1, 0)
        data['Инд.Тотал К1 ТМ2.5'] = np.where(data[HomeTeamGoal] < 2.5, 1, 0)
        data['Инд.Тотал К1 ТБ2.5'] = np.where(data[HomeTeamGoal] > 2.5, 1, 0)
        data['Инд.Тотал К2 ТМ2.5'] = np.where(data[AwayTeamGoal] < 2.5, 1, 0)
        data['Инд.Тотал К2 ТБ2.5'] = np.where(data[AwayTeamGoal] > 2.5, 1, 0)
        data['Инд.Тотал К1 ТМ3.5'] = np.where(data[HomeTeamGoal] < 3.5, 1, 0)
        data['Инд.Тотал К1 ТБ3.5'] = np.where(data[HomeTeamGoal] > 3.5, 1, 0)
        data['Инд.Тотал К2 ТМ3.5'] = np.where(data[AwayTeamGoal] < 3.5, 1, 0)
        data['Инд.Тотал К2 ТБ3.5'] = np.where(data[AwayTeamGoal] > 3.5, 1, 0)
        data['Точный счет 0:1'] = np.where((data[HomeTeamGoal] == 0) & (data[AwayTeamGoal] == 1), 1, 0)
        data['Точный счет 0:2'] = np.where((data[HomeTeamGoal] == 0) & (data[AwayTeamGoal] == 2), 1, 0)
        data['Точный счет 0:3'] = np.where((data[HomeTeamGoal] == 0) & (data[AwayTeamGoal] == 3), 1, 0)
        data['Точный счет 1:0'] = np.where((data[HomeTeamGoal] == 1) & (data[AwayTeamGoal] == 0), 1, 0)
        data['Точный счет 2:0'] = np.where((data[HomeTeamGoal] == 2) & (data[AwayTeamGoal] == 0), 1, 0)
        data['Точный счет 3:0'] = np.where((data[HomeTeamGoal] == 3) & (data[AwayTeamGoal] == 0), 1, 0)
        data['Точный счет 1:1'] = np.where((data[HomeTeamGoal] == 1) & (data[AwayTeamGoal] == 1), 1, 0)
        data['Точный счет 1:2'] = np.where((data[HomeTeamGoal] == 1) & (data[AwayTeamGoal] == 2), 1, 0)
        data['Точный счет 1:3'] = np.where((data[HomeTeamGoal] == 1) & (data[AwayTeamGoal] == 3), 1, 0)
        data['Точный счет 2:1'] = np.where((data[HomeTeamGoal] == 2) & (data[AwayTeamGoal] == 1), 1, 0)
        data['Точный счет 2:2'] = np.where((data[HomeTeamGoal] == 2) & (data[AwayTeamGoal] == 2), 1, 0)
        data['Точный счет 2:3'] = np.where((data[HomeTeamGoal] == 2) & (data[AwayTeamGoal] == 3), 1, 0)
        data['Точный счет 3:1'] = np.where((data[HomeTeamGoal] == 3) & (data[AwayTeamGoal] == 1), 1, 0)
        data['Точный счет 3:2'] = np.where((data[HomeTeamGoal] == 3) & (data[AwayTeamGoal] == 2), 1, 0)
        data['Точный счет 3:3'] = np.where((data[HomeTeamGoal] == 3) & (data[AwayTeamGoal] == 3), 1, 0)
        data['Победитель + обе забьют 1/да'] = np.where((data['П1'] == 1) & (data['Обе забьют'] == 1), 1, 0)
        data['Победитель + обе забьют 1/нет'] = np.where((data['П1'] == 1) & (data['Обе забьют'] == 0), 1, 0)
        data['Победитель + обе забьют Х/да'] = np.where((data['Х'] == 1) & (data['Обе забьют'] == 1), 1, 0)
        data['Победитель + обе забьют Х/нет'] = np.where((data['Х'] == 1) & (data['Обе забьют'] == 0), 1, 0)
        data['Победитель + обе забьют 2/да'] = np.where((data['П2'] == 1) & (data['Обе забьют'] == 1), 1, 0)
        data['Победитель + обе забьют 2/нет'] = np.where((data['П2'] == 1) & (data['Обе забьют'] == 0), 1, 0)
        data['Сухая победа К1'] = np.where((data[HomeTeamGoal] > data[AwayTeamGoal]) & (data[AwayTeamGoal] == 0), 1, 0)
        data['Сухая победа К2'] = np.where((data[AwayTeamGoal] > data[HomeTeamGoal]) & (data[HomeTeamGoal] == 0), 1, 0)
        data['Обе забьют + ТБ2.5 Да/Бол'] = np.where((data['Обе забьют'] == 1) & (data['ТБ2.5'] == 1), 1, 0)
        data['Обе забьют + ТБ2.5 Да/Мен'] = np.where((data['Обе забьют'] == 1) & (data['ТБ2.5'] == 0), 1, 0)
        data['Обе забьют + ТБ2.5 Нет/Бол'] = np.where((data['Обе забьют'] == 0) & (data['ТБ2.5'] == 1), 1, 0)
        data['Обе забьют + ТБ2.5 Нет/Мен'] = np.where((data['Обе забьют'] == 0) & (data['ТБ2.5'] == 0), 1, 0)
        data['Ф1'] = np.where(data[HomeTeamGoal] - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 0.5'] = np.where((data[HomeTeamGoal] + 0.5) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 -0.5'] = np.where((data[HomeTeamGoal] - 0.5) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 1'] = np.where((data[HomeTeamGoal] + 1) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 -1'] = np.where((data[HomeTeamGoal] - 1) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 1.5'] = np.where((data[HomeTeamGoal] + 1.5) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 -1.5'] = np.where((data[HomeTeamGoal] - 1.5) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 2'] = np.where((data[HomeTeamGoal] + 2) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 -2'] = np.where((data[HomeTeamGoal] - 2) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 2.5'] = np.where((data[HomeTeamGoal] + 2.5) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 -2.5'] = np.where((data[HomeTeamGoal] - 2.5) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 -3'] = np.where((data[HomeTeamGoal] - 3) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф1 -3.5'] = np.where((data[HomeTeamGoal] - 3.5) - data[AwayTeamGoal] > 0, 1, 0)
        data['Ф2'] = np.where(data[AwayTeamGoal] - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 0.5'] = np.where((data[AwayTeamGoal] + 0.5) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 -0.5'] = np.where((data[AwayTeamGoal] - 0.5) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 1'] = np.where((data[AwayTeamGoal] + 1) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 -1'] = np.where((data[AwayTeamGoal] - 1) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 1.5'] = np.where((data[AwayTeamGoal] + 1.5) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 -1.5'] = np.where((data[AwayTeamGoal] - 1.5) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 2'] = np.where((data[AwayTeamGoal] + 2) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 -2'] = np.where((data[AwayTeamGoal] - 2) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 2.5'] = np.where((data[AwayTeamGoal] + 2.5) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 -2.5'] = np.where((data[AwayTeamGoal] - 2.5) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 3'] = np.where((data[AwayTeamGoal] + 3) - data[HomeTeamGoal] > 0, 1, 0)
        data['Ф2 3.5'] = np.where((data[AwayTeamGoal] + 3.5) - data[HomeTeamGoal] > 0, 1, 0)
        targets = list(set(data.columns) - set(data_columns))
        data = self.fillnull_on_test(data, targets)
        self.df_targets = data[targets].copy()
        self.targets = targets.copy()
        return data[targets], targets
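# Every target above is a simple np.where predicate over the two goal columns.
# A self-contained sketch of the same idea on toy data follows; the column and
# target names here are illustrative, not the footkit schema.

```python
import numpy as np
import pandas as pd

toy = pd.DataFrame({'FTHG': [2, 1, 0], 'FTAG': [1, 1, 3]})

# 1X2 outcome flags and one totals line, using the same np.where pattern.
toy['W'] = np.where(toy['FTHG'] > toy['FTAG'], 1, 0)
toy['D'] = np.where(toy['FTHG'] == toy['FTAG'], 1, 0)
toy['Total More 2.5'] = np.where(toy['FTHG'] + toy['FTAG'] > 2.5, 1, 0)
print(toy[['W', 'D', 'Total More 2.5']].values.tolist())
# -> [[1, 0, 1], [0, 1, 0], [0, 0, 1]]
```

# Because each target is derived independently, new bet types can be added as
# one-line predicates and collected afterwards via a set difference of columns.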
def get_target_on_df_eng(
self,
HomeTeamGoal,
AwayTeamGoal
):
'''
example:
get_target_on_df(
df,
HomeTeamGoal = 'FTHG',
AwayTeamGoal = 'FTAG',
)
'''
data = self.df.copy()
data_columns = list(data.columns)
# Виды ставок
data['W'] = np.where(data[HomeTeamGoal] > data[AwayTeamGoal],1,0)
data['D'] = np.where(data[HomeTeamGoal] == data[AwayTeamGoal],1,0)
data['L'] = np.where(data[HomeTeamGoal] < data[AwayTeamGoal],1,0)
data['Winner'] = np.where(data[HomeTeamGoal] > data[AwayTeamGoal],1, np.where(data[HomeTeamGoal] == data[AwayTeamGoal],0,2))
data['1X'] = np.where(data[HomeTeamGoal] >= data[AwayTeamGoal],1,0)
data['12'] = np.where(data[HomeTeamGoal] != data[AwayTeamGoal],1,0)
data['2X'] = np.where(data[HomeTeamGoal] <= data[AwayTeamGoal],1,0)
data['Total Less 1'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 1 , 1 , 0)
data['Total More 1'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 1 , 1 , 0)
data['Total Less 1.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 1.5 , 1 , 0)
data['Total More 1.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 1.5 , 1 , 0)
data['Total Less 2'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 2 , 1 , 0)
data['Total More 2'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 2 , 1 , 0)
data['Total Less 2.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 2.5 , 1 , 0)
data['Total More 2.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 2.5 , 1 , 0)
data['Total Less 3'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 3 , 1 , 0)
data['Total More 3'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 3 , 1 , 0)
data['Total Less 3.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 3.5 , 1 , 0)
data['Total More 3.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 3.5 , 1 , 0)
data['Total Less 4'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 4 , 1 , 0)
data['Total More 4'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 4 , 1 , 0)
data['Total Less 4.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 4.5 , 1 , 0)
data['Total More 4.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 4.5 , 1 , 0)
data['Total Less 5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] <= 5 , 1 , 0)
data['Total More 5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 5 , 1 , 0)
data['Total Less 5.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] < 5.5 , 1 , 0)
data['Total More 5.5'] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] > 5.5 , 1 , 0)
data['Goals 0-1'] = np.where((data[HomeTeamGoal] + data[AwayTeamGoal] >=0)
& (data[HomeTeamGoal] + data[AwayTeamGoal] <=1)
,1,0)
data['Goals 2-3'] = np.where((data[HomeTeamGoal] + data[AwayTeamGoal] >=2)
& (data[HomeTeamGoal] + data[AwayTeamGoal] <=3)
,1,0)
data['Goals 4-5'] = np.where((data[HomeTeamGoal] + data[AwayTeamGoal] >=4)
& (data[HomeTeamGoal] + data[AwayTeamGoal] <=5)
,1,0)
data['Goals 6>' ] = np.where(data[HomeTeamGoal] + data[AwayTeamGoal] >= 6,1,0)
data['HT Score'] = np.where(data[HomeTeamGoal] > 0,1,0)
data['AT Score'] = np.where(data[AwayTeamGoal] > 0,1,0)
data['Both teams to score'] = np.where((data[HomeTeamGoal] > 0) & (data[AwayTeamGoal] > 0),1,0)
data['Ind. Total HT Total Less 0.5'] = np.where( data[HomeTeamGoal] < 0.5 , 1 , 0)
data['Ind. Total HT Total More 0.5'] = np.where( data[HomeTeamGoal] > 0.5 , 1 , 0)
data['Ind. Total AT Total Less 0.5'] = np.where( data[AwayTeamGoal] < 0.5 , 1 , 0)
data['Ind. Total AT Total More 0.5'] = np.where( data[AwayTeamGoal] > 0.5 , 1 , 0)
data['Ind. Total HT Total Less 1'] = np.where( data[HomeTeamGoal] <= 1 , 1 , 0)
data['Ind. Total HT Total More 1'] = np.where( data[HomeTeamGoal] >= 1 , 1 , 0)
data['Ind. Total AT Total Less 1'] = np.where( data[AwayTeamGoal] <= 1 , 1 , 0)
data['Ind. Total AT Total More 1'] = np.where( data[AwayTeamGoal] >= 1 , 1 , 0)
data['Ind. Total HT Total Less 1.5'] = np.where( data[HomeTeamGoal] < 1.5 , 1 , 0)
data['Ind. Total HT Total More 1.5'] = np.where( data[HomeTeamGoal] > 1.5 , 1 , 0)
data['Ind. Total AT Total Less 1.5'] = np.where( data[AwayTeamGoal] < 1.5 , 1 , 0)
data['Ind. Total AT Total More 1.5'] = np.where( data[AwayTeamGoal] > 1.5 , 1 , 0)
data['Ind. Total HT Total Less 2'] = np.where( data[HomeTeamGoal] <= 2 , 1 , 0)
data['Ind. Total HT Total More 2'] = np.where( data[HomeTeamGoal] >= 2 , 1 , 0)
data['Ind. Total AT Total Less 2'] = np.where( data[AwayTeamGoal] <= 2 , 1 , 0)
data['Ind. Total AT Total More 2'] = np.where( data[AwayTeamGoal] >= 2 , 1 , 0)
data['Ind. Total HT Total Less 2.5'] = np.where( data[HomeTeamGoal] < 2.5 , 1 , 0)
data['Ind. Total HT Total More 2.5'] = np.where( data[HomeTeamGoal] > 2.5 , 1 , 0)
data['Ind. Total AT Total Less 2.5'] = np.where( data[AwayTeamGoal] < 2.5 , 1 , 0)
data['Ind. Total AT Total More 2.5'] = np.where( data[AwayTeamGoal] > 2.5 , 1 , 0)
data['Ind. Total HT Total Less 3.5'] = np.where( data[HomeTeamGoal] < 3.5 , 1 , 0)
data['Ind. Total HT Total More 3.5'] = np.where( data[HomeTeamGoal] > 3.5 , 1 , 0)
data['Ind. Total AT Total Less 3.5'] = np.where( data[AwayTeamGoal] < 3.5 , 1 , 0)
data['Ind. Total AT Total More 3.5'] = np.where( data[AwayTeamGoal] > 3.5 , 1 , 0)
# exact-score indicators for every scoreline up to 3 goals each way (0:0 excluded)
for hg in range(4):
    for ag in range(4):
        if hg == 0 and ag == 0:
            continue
        data['Score %d:%d' % (hg, ag)] = np.where( (data[HomeTeamGoal] == hg) & (data[AwayTeamGoal] == ag) , 1, 0)
# Asian-handicap indicators: 1 when the team covers the given handicap line
data['Handicap HT'] = np.where( data[HomeTeamGoal] - data[AwayTeamGoal] > 0 , 1 , 0)
for h in [0.5, -0.5, 1, -1, 1.5, -1.5, 2, -2, 2.5, -2.5, -3, -3.5]:
    data['Handicap HT %g' % h] = np.where( ( data[HomeTeamGoal] + h ) - data[AwayTeamGoal] > 0 , 1 , 0)
data['Handicap AT'] = np.where( data[AwayTeamGoal] - data[HomeTeamGoal] > 0 , 1 , 0)
for h in [0.5, -0.5, 1, -1, 1.5, -1.5, 2, -2, 2.5, -2.5, 3, 3.5]:
    data['Handicap AT %g' % h] = np.where( ( data[AwayTeamGoal] + h ) - data[HomeTeamGoal] > 0 , 1 , 0)
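All of the indicator columns above are plain boolean masks cast to 0/1 with `np.where`; a minimal sketch on toy data (the frame and the column names `HG`/`AG` are hypothetical stand-ins for the goal columns):

```python
import numpy as np
import pandas as pd

demo = pd.DataFrame({'HG': [1, 0, 3], 'AG': [1, 1, 0]})
# exact-score indicator: 1 only when both goal counts match
demo['Score 1:1'] = np.where((demo['HG'] == 1) & (demo['AG'] == 1), 1, 0)
# the home side covers a -1.5 handicap only when it wins by two or more
demo['Handicap HT -1.5'] = np.where((demo['HG'] - 1.5) - demo['AG'] > 0, 1, 0)
print(demo['Score 1:1'].tolist())         # [1, 0, 0]
print(demo['Handicap HT -1.5'].tolist())  # [0, 0, 1]
```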
targets = list(
set(data.columns) - set(data_columns)
)
data = self.fillnull_on_test(data, targets)
self.df_targets=data[targets].copy()
self.targets = targets
return data[targets], targets
def create_features(
self
):
df=self.df.copy()
StatTeam_df = self.teams_df.copy()
StatTeam_df.sort_values([self.SEASON,self.TEAMNAME,self.LEAGUENAME,self.DATETIME],inplace=True)
features_base = list(StatTeam_df.columns)
StatTeam_df[self.ROUNDEVENT] = StatTeam_df.groupby([self.SEASON,self.LEAGUENAME,self.TEAMNAME]).cumcount()+1
# rest days since the team's previous match (use the class constants for consistency)
StatTeam_df['ChillDays'] = (StatTeam_df[self.DATETIME] - StatTeam_df.groupby([self.SEASON,self.TEAMNAME])[self.DATETIME].shift(1)).dt.days
# results of the previous matches
for target in self.FEATURES:
for sh in [1, 2, 3]:
StatTeam_df['Shift'+str(sh)+target] = StatTeam_df.groupby([self.SEASON,self.TEAMNAME,self.LEAGUENAME])[target].shift(sh).fillna(0)
StatTeam_df['Cum'+str(sh)+target] = StatTeam_df.groupby([self.SEASON,self.TEAMNAME,self.LEAGUENAME])['Shift'+str(sh)+target].cumsum()
StatTeam_df['DeltaShift'+target] = ((StatTeam_df['Shift'+str(1)+target] - StatTeam_df['Shift'+str(2)+target]) + (StatTeam_df['Shift'+str(2)+target]-StatTeam_df['Shift'+str(3)+target]))/2
StatTeam_df['DeltaCum'+target] = ((StatTeam_df['Cum'+str(1)+target] - StatTeam_df['Cum'+str(2)+target]) + (StatTeam_df['Cum'+str(2)+target]-StatTeam_df['Cum'+str(3)+target]))/2
# cumulative results (excluding the current match)
StatTeam_df['Cum'+target] = StatTeam_df.groupby([self.SEASON,self.TEAMNAME,self.LEAGUENAME])['Shift'+str(1)+target].cumsum()
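The `Shift*`/`Cum*` columns above follow the standard leakage-free pattern: shift within each group so a row never sees its own match, then take a cumulative sum. A sketch on a hypothetical two-team frame:

```python
import pandas as pd

demo = pd.DataFrame({
    'Team':  ['A', 'A', 'A', 'B', 'B'],
    'Goals': [2, 0, 3, 1, 1],
})
# previous match's goals (0 for each team's first match)
demo['Shift1Goals'] = demo.groupby('Team')['Goals'].shift(1).fillna(0)
# total goals scored strictly before the current match
demo['CumGoals'] = demo.groupby('Team')['Shift1Goals'].cumsum()
print(demo['CumGoals'].tolist())  # [0.0, 2.0, 2.0, 0.0, 1.0]
```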
# league-table position before the match
StatTeam_df['Place'] = StatTeam_df.groupby([self.SEASON,self.LEAGUENAME,self.ROUNDEVENT])['Cum1'+self.POINTS].rank(ascending=False,method='average')
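`Place` comes from ranking each round's cumulative points in descending order, with ties sharing the average rank. Sketch (hypothetical columns):

```python
import pandas as pd

table = pd.DataFrame({'Round': [1, 1, 1], 'CumPoints': [6, 3, 6]})
# higher points -> better (lower) place; tied teams share the average rank
table['Place'] = table.groupby('Round')['CumPoints'].rank(ascending=False, method='average')
print(table['Place'].tolist())  # [1.5, 3.0, 1.5]
```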
# cumulative results (excluding the current match, split by home/away venue)
# to avoid extra columns, shift in place so only matches before "today" are taken, then compute the cumulative sum right away
for target in self.FEATURES:
StatTeam_df['Cum'+target+'InPlace'] = StatTeam_df.groupby([self.SEASON,self.TEAMNAME,self.LEAGUENAME,self.PLACEMATCH])[target].shift(1).fillna(0)
StatTeam_df['Cum'+target+'InPlace'] = StatTeam_df.groupby([self.SEASON,self.TEAMNAME,self.LEAGUENAME,self.PLACEMATCH])['Cum'+target+'InPlace'].cumsum()
StatTeam_df['Perc'+target+'InPlace'] = StatTeam_df['Cum'+target+'InPlace'] / StatTeam_df['Cum'+target]
StatTeam_df['Avg'+target+'Season'] = StatTeam_df['Cum'+target] / StatTeam_df[self.ROUNDEVENT]
# shift + diff
StatTeam_df['ShiftPlace'+self.GOALS+'Diff'] = StatTeam_df.groupby(
[
self.SEASON,self.TEAMNAME,self.LEAGUENAME,self.PLACEMATCH
]
)[self.GOALS].shift(1).fillna(0) - \
StatTeam_df.groupby(
[
self.SEASON,self.TEAMNAME,self.LEAGUENAME,self.PLACEMATCH
]
)[self.MISSED].shift(1).fillna(0)
StatTeam_df['Cum'+self.GOALS+'DiffInPlace'] = StatTeam_df.groupby([self.SEASON,self.TEAMNAME,self.LEAGUENAME,self.PLACEMATCH])['ShiftPlace'+self.GOALS+'Diff'].shift(1).fillna(0)
StatTeam_df['Cum'+self.GOALS+'DiffInPlace'] = StatTeam_df.groupby([self.SEASON,self.TEAMNAME,self.LEAGUENAME,self.PLACEMATCH])['Cum'+self.GOALS+'DiffInPlace'].cumsum()
StatTeam_df.drop('ShiftPlace'+self.GOALS+'Diff',axis=1,inplace=True)
# league-table position before the match
# StatTeam_df['PlaceInPlace'] = StatTeam_df.groupby([self.SEASON,self.LEAGUENAME,self.ROUNDEVENT,self.PLACEMATCH])['Cum'+self.POINTS+'InPlace'].rank(ascending=False,method='average')
# rolling averages over the last 3 and 5 matches
for target in self.FEATURES:
# league - team grouping
agg_group = [self.LEAGUENAME,self.TEAMNAME] #,self.SEASON
StatTeam_df['Avg'+target+'3'] = np.mean([StatTeam_df.groupby(agg_group)[target].shift(i) for i in [1,2,3]], axis=0)
StatTeam_df['Avg'+target+'5'] = np.mean([StatTeam_df.groupby(agg_group)[target].shift(i) for i in [1,2,3,4,5]], axis=0)
# average goals scored/conceded per team (over the last 5 and 3 matches)
# home and away matches are counted separately (5/3 matches each)
agg_group = [self.LEAGUENAME,self.TEAMNAME,self.PLACEMATCH]
StatTeam_df['Avg'+target+'3_PLACE'] = np.mean([StatTeam_df.groupby(agg_group)[target].shift(i) for i in [1,2,3]], axis=0)
StatTeam_df['Avg'+target+'5_PLACE'] = np.mean([StatTeam_df.groupby(agg_group)[target].shift(i) for i in [1,2,3,4,5]], axis=0)
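`np.mean` over a list of shifted Series averages the last N matches, and (unlike `np.nanmean`) it propagates NaN, so each group's first N rounds come out NaN rather than a partial average. Sketch for a single hypothetical team:

```python
import numpy as np
import pandas as pd

s = pd.DataFrame({'Team': ['A'] * 5, 'Goals': [1, 2, 3, 4, 5]})
grp = s.groupby('Team')['Goals']
# average of the previous 3 matches; NaN until 3 matches have been played
s['AvgGoals3'] = np.mean([grp.shift(i) for i in [1, 2, 3]], axis=0)
print(s['AvgGoals3'].tolist())  # [nan, nan, nan, 2.0, 3.0]
```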
# {i : 'Home'+i for i in }
rename_cols_home = {}
rename_cols_away = {}
key_cols = [self.DATETIME,self.LEAGUENAME,'Team',self.SEASON]
rename_base_cols = list(
set(StatTeam_df.columns) - set([self.SEASON, self.LEAGUENAME, self.DATETIME,self.ROUNDEVENT]) - set(self.FEATURES) - set(['h_a', 'result'])
)
rename_targets = self.FEATURES
for i in rename_targets:
rename_cols_home[i] = 'Home'+i
rename_cols_away[i] = 'Away'+i
for i in rename_base_cols:
rename_cols_home[i] = 'Home'+i
rename_cols_away[i] = 'Away'+i
df = pd.merge(
pd.merge(
df
, StatTeam_df.rename(columns=rename_cols_home)
, on=['DateTime','LeagueName','HomeTeamId','Season']
, suffixes = (False, False)
)
, StatTeam_df.rename(columns=rename_cols_away).drop([self.ROUNDEVENT, 'h_a', 'result'], axis = 1)
,on = ['DateTime','LeagueName','AwayTeamId','Season']
)
df['Month'] = df['DateTime'].dt.month
df['Year'] = df['DateTime'].dt.year
df['Week'] = df['DateTime'].dt.isocalendar().week.astype(int)  # Series.dt.week was removed in pandas 2.0
df['Day'] = df['DateTime'].dt.day
df['Dayofweek'] = df['DateTime'].dt.dayofweek
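The calendar features are straightforward `.dt` accessors; note that `Series.dt.week` was deprecated and later removed in favour of `isocalendar().week`. Sketch:

```python
import pandas as pd

dates = pd.DataFrame({'DateTime': pd.to_datetime(['2021-01-04', '2021-12-31'])})
dates['Month'] = dates['DateTime'].dt.month
dates['Week'] = dates['DateTime'].dt.isocalendar().week.astype(int)
dates['Dayofweek'] = dates['DateTime'].dt.dayofweek  # Monday == 0
print(dates['Week'].tolist())       # [1, 52]
print(dates['Dayofweek'].tolist())  # [0, 4]
```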
leaky_features = rename_targets
leaky_features = ['Home'+i for i in leaky_features] + ['Away'+i for i in leaky_features] + set_to_list(leaky_features, ['ChillDays'])
leaky_features = leaky_features+[
'IdMatch',
# 'DateTime',
# 'LeagueName',
# 'Season',
'HomeTeam',
'HomeShortTeam',
# 'HomeTeamId',
'HomeTeam_xG',
'AwayTeam',
'AwayShortTeam',
# 'AwayTeamId',
'AwayTeam_xG',
'FTHG',
'FTAG',
'IsResult',
'h_a',
'result',
'HomeTeamName',
'AwayTeamName',
'HomePlace',
'AwayPlace',
'HomeWinProba', 'HomeDrawProba', 'HomeLoseProba',
'AwayWinProba', 'AwayDrawProba', 'AwayLoseProba',
]
self.engineering_features = list(
set(df.columns) - set(features_base) - set(leaky_features)
)
df[self.engineering_features][:1].to_pickle('./data/example_features.pkl')
for i in ['Awayscored', 'Awaymissed', 'Awaydraws', 'Homescored', 'Homemissed', 'Awayloses', 'Homeloses', 'Awaywins', 'AwayxGA', 'Homedraws', 'AwayxG', 'HomenpxG']:
if i in self.engineering_features:
print('warning: leaky feature, drop this feature:', i)
df.set_index(['IdMatch'], inplace=True)
self.df=df.copy()
return self.df, self.engineering_features
# parser/team28/models/instructions/Expression/string_funcs.py (repo: itsmjoe/tytus, license: MIT)
import hashlib
from models.instructions.Expression.expression import Expression, PrimitiveData, DATA_TYPE
from controllers.error_controller import ErrorController
# TODO: check that DECODE, ENCODE, GETBYTE, SETBYTE and CONVERT do not crash
class Length(Expression):
'''
Returns the number of characters in the argument string.
'''
def __init__(self, value, line, column) :
self.value = value
self.alias = f'LENGTH({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
val = self.value.process(environment).value
l = len(val)
return PrimitiveData(DATA_TYPE.NUMBER, l, self.line, self.column)
except TypeError:
desc = "Tipo de dato invalido para Length"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class Substring(Expression):
'''
Returns the substring of the argument string between the given
lower and upper bounds, applied as a Python slice.
'''
def __init__(self, value, down, up, line, column) :
self.value = value
self.alias = f'SUBSTRING({self.value.alias})'
self.up = up
self.down = down
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
i = self.down.process(environment).value
j = self.up.process(environment).value
cadena = self.value.process(environment).value
substr = cadena[i:j]
return PrimitiveData(DATA_TYPE.STRING, substr, self.line, self.column)
except TypeError:
desc = "Tipo de dato invalido para Substring"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
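Note that `cadena[i:j]` is a zero-based, end-exclusive Python slice, while SQL's SUBSTRING is conventionally one-based and takes a length; if the grammar supplies SQL-style arguments they need translating first. A sketch of that mapping (the helper name is hypothetical, not part of this module):

```python
def sql_substring(s, start, length):
    # SQL SUBSTRING(s FROM start FOR length) is 1-based and length-counted;
    # map it onto Python's 0-based, end-exclusive slicing
    return s[start - 1:start - 1 + length]

print(sql_substring('tytusdb', 1, 5))  # tytus
print(sql_substring('tytusdb', 6, 2))  # db
```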
class Substr(Expression):
'''
Alias of SUBSTRING: returns the substring of the argument string
between the given lower and upper bounds, applied as a Python slice.
'''
def __init__(self, value, down, up, line, column) :
self.value = value
self.alias = f'SUBSTR({self.value.alias})'
self.up = up
self.down = down
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
i = self.down.process(environment).value
j = self.up.process(environment).value
cadena = self.value.process(environment).value
substr = cadena[i:j]
return PrimitiveData(DATA_TYPE.STRING, substr, self.line, self.column)
except TypeError:
desc = "Tipo de dato invalido para Substr"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class Trim(Expression):
'''
Returns the argument string with leading and trailing whitespace removed.
'''
def __init__(self, value, line, column) :
self.value = value
self.alias = f'TRIM({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
cadena = self.value.process(environment).value
trim_str = cadena.strip()
return PrimitiveData(DATA_TYPE.STRING, trim_str, self.line, self.column)
except TypeError:
desc = "Tipo de dato invalido para Trim"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class MD5(Expression):
'''
Returns the MD5 hash of the argument string as a hexadecimal digest.
'''
def __init__(self, value, line, column) :
self.value = value
self.alias = f'MD5({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
cadena = self.value.process(environment).value
result = hashlib.md5(cadena.encode())
return PrimitiveData(DATA_TYPE.STRING, result.hexdigest(), self.line, self.column)
except TypeError:
desc = "Tipo de dato invalido para md5"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class SHA256(Expression):
'''
Returns the SHA-256 hash of the argument string as a hexadecimal digest.
'''
def __init__(self, value, line, column) :
self.value = value
self.alias = f'SHA256({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
cadena = self.value.process(environment).value
result = hashlib.sha256(cadena.encode())
return PrimitiveData(DATA_TYPE.STRING, result.hexdigest(), self.line, self.column)
except TypeError:
desc = "Tipo de dato invalido para sha256"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
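Both hash classes use the same hashlib pattern: encode the string to bytes, hash, and return the hexadecimal digest (32 hex characters for MD5, 64 for SHA-256):

```python
import hashlib

cadena = 'tytus'
md5_digest = hashlib.md5(cadena.encode()).hexdigest()
sha_digest = hashlib.sha256(cadena.encode()).hexdigest()
print(len(md5_digest), len(sha_digest))  # 32 64
```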
class GetByte(Expression):
'''
Returns the byte at the given position of the argument (not yet implemented).
'''
def __init__(self, value, line, column) :
self.value = value
self.alias = f'GETBYTE({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
pass
except TypeError:
desc = "Tipo de dato invalido para GetByte"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class SetByte(Expression):
'''
Sets the byte at the given position of the argument (not yet implemented).
'''
def __init__(self, value, line, column) :
self.value = value
self.alias = f'SETBYTE({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
pass
except TypeError:
desc = "Tipo de dato invalido para SetByte"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class Convert(Expression):
'''
Converts the argument value to the given data type (not yet implemented).
'''
def __init__(self, value, data_type, line, column) :
self.value = value
self.data_type = data_type  # was overwriting self.value
self.alias = f'CONVERT({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
pass
except TypeError:
desc = "Tipo de dato invalido para Convert"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class Encode(Expression):
'''
Encodes the argument string using the given format (not yet implemented).
'''
def __init__(self, value, format_text, line, column) :
self.value = value
self.format_text = format_text  # was overwriting self.value
self.alias = f'ENCODE({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
pass
except TypeError:
desc = "Tipo de dato invalido para Encode"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
class Decode(Expression):
'''
Decodes the argument string using the given format (not yet implemented).
'''
def __init__(self, value, format_text, line, column) :
self.value = value
self.format_text = format_text  # was overwriting self.value
self.alias = f'DECODE({self.value.alias})'
self.line = line
self.column = column
def __repr__(self):
return str(vars(self))
def process(self, environment):
try:
pass
except TypeError:
desc = "Tipo de dato invalido para Decode"
ErrorController().add(37, 'Execution', desc, self.line, self.column)
return
except:
desc = "FATAL ERROR --- StringFuncs"
ErrorController().add(34, 'Execution', desc, self.line, self.column)
# mx_utils/mailer/__init__.py (repo: marxav0319/MX_Utils, license: MIT)
] | null | null | null | from .outlook_mailer import Outlook_Mailer
# molmodmt/forms/classes/get/template.py (repo: LMMV/MolModMT, license: MIT)
def getting(item, element='atom', indices=None, ids=None, **kwargs):
result=[]
args = [ii for ii in kwargs if kwargs[ii]]
if element=='atom':
for option in args:
if option=='n_atoms':
raise NotImplementedError
elif option in ['atom_name', 'name']:
raise NotImplementedError
elif option in ['atom_index', 'index']:
raise NotImplementedError
elif option in ['atom_id', 'id']:
raise NotImplementedError
elif option in ['atom_type', 'type']:
raise NotImplementedError
elif option=='n_residues':
raise NotImplementedError
elif option=='residue_name':
raise NotImplementedError
elif option=='residue_index':
raise NotImplementedError
elif option=='residue_id':
raise NotImplementedError
elif option=='chain_index':
raise NotImplementedError
elif option=='chain_id':
raise NotImplementedError
elif option=='n_frames':
raise NotImplementedError
elif option=='n_chains':
raise NotImplementedError
elif option=='n_molecules':
raise NotImplementedError
elif option=='n_aminoacids':
raise NotImplementedError
elif option=='n_nucleotides':
raise NotImplementedError
elif option=='n_waters':
raise NotImplementedError
elif option=='n_ions':
raise NotImplementedError
elif option=='masses':
raise NotImplementedError
elif option=='charge':
raise NotImplementedError
elif option=='net_charge':
raise NotImplementedError
elif option=='bonded_atoms':
raise NotImplementedError
elif option=='bonds':
raise NotImplementedError
elif option=='graph':
raise NotImplementedError
elif option=='molecules':
raise NotImplementedError
elif option=='molecule_type':
raise NotImplementedError
elif option=='coordinates':
raise NotImplementedError
else:
raise NotImplementedError
elif element=='residue':
for option in args:
if option=='n_atoms':
raise NotImplementedError
elif option=='atom_name':
raise NotImplementedError
elif option=='atom_type':
raise NotImplementedError
elif option=='n_residues':
raise NotImplementedError
elif option=='residue_name':
raise NotImplementedError
elif option=='residue_index':
raise NotImplementedError
elif option=='residue_id':
raise NotImplementedError
elif option=='chain_index':
raise NotImplementedError
elif option=='chain_id':
raise NotImplementedError
elif option=='n_frames':
raise NotImplementedError
elif option=='n_chains':
raise NotImplementedError
elif option=='n_molecules':
raise NotImplementedError
elif option=='n_aminoacids':
raise NotImplementedError
elif option=='n_nucleotides':
raise NotImplementedError
elif option=='n_waters':
raise NotImplementedError
elif option=='n_ions':
raise NotImplementedError
elif option=='masses':
raise NotImplementedError
elif option=='charge':
raise NotImplementedError
elif option=='bonded_atoms':
raise NotImplementedError
elif option=='bonds':
raise NotImplementedError
elif option=='graph':
raise NotImplementedError
elif option=='molecules':
raise NotImplementedError
elif option=='molecule_type':
raise NotImplementedError
elif option=='coordinates':
raise NotImplementedError
else:
raise NotImplementedError
elif element=='molecule':
raise NotImplementedError
elif element=='chain':
for option in args:
if option=='n_atoms':
raise NotImplementedError
elif option=='atom_name':
raise NotImplementedError
elif option=='atom_type':
raise NotImplementedError
elif option=='n_residues':
raise NotImplementedError
elif option=='residue_name':
raise NotImplementedError
elif option=='residue_index':
raise NotImplementedError
elif option=='residue_id':
raise NotImplementedError
elif option=='chain_index':
raise NotImplementedError
elif option=='chain_id':
raise NotImplementedError
elif option=='n_frames':
raise NotImplementedError
elif option=='n_chains':
raise NotImplementedError
elif option=='n_molecules':
raise NotImplementedError
elif option=='n_aminoacids':
raise NotImplementedError
elif option=='n_nucleotides':
raise NotImplementedError
elif option=='n_waters':
raise NotImplementedError
elif option=='n_ions':
raise NotImplementedError
elif option=='masses':
raise NotImplementedError
elif option=='charge':
raise NotImplementedError
elif option=='bonded_atoms':
raise NotImplementedError
elif option=='bonds':
raise NotImplementedError
elif option=='graph':
raise NotImplementedError
elif option=='molecules':
raise NotImplementedError
elif option=='molecule_type':
raise NotImplementedError
elif option=='coordinates':
raise NotImplementedError
else:
raise NotImplementedError
elif element=='trajectory':
raise NotImplementedError
elif element=='system':
for option in args:
if option=='n_atoms':
raise NotImplementedError
elif option=='charge':
raise NotImplementedError
elif option=='net_charge':
raise NotImplementedError
else:
raise NotImplementedError
else:
raise NotImplementedError
if len(result)==1:
return result[0]
else:
return result
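The `args = [ii for ii in kwargs if kwargs[ii]]` idiom at the top of `getting` keeps only the keyword flags passed with a truthy value, which is what drives the option dispatch above. A standalone sketch of the same pattern (the function and option names are hypothetical):

```python
def requested_options(**kwargs):
    # keep only the flags the caller set to a truthy value
    return [name for name in kwargs if kwargs[name]]

opts = requested_options(n_atoms=True, atom_name=False, coordinates=True)
print(sorted(opts))  # ['coordinates', 'n_atoms']
```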
# MHD/FEniCS/MHD/Stabilised/SaddlePointForm/Test/SplitMatrix/ScottTest/MHDprec.py (repo: wathen/PhD, license: MIT)
import sys
petsc4py.init(sys.argv)
from petsc4py import PETSc
import numpy as np
from dolfin import tic, toc
import HiptmairSetup
import PETScIO as IO
import scipy.sparse as sp
import MatrixOperations as MO
class BaseMyPC(object):
def setup(self, pc):
pass
def reset(self, pc):
pass
def apply(self, pc, x, y):
raise NotImplementedError
def applyT(self, pc, x, y):
self.apply(pc, x, y)
def applyS(self, pc, x, y):
self.apply(pc, x, y)
def applySL(self, pc, x, y):
self.applyS(pc, x, y)
def applySR(self, pc, x, y):
self.applyS(pc, x, y)
def applyRich(self, pc, x, y, w, tols):
self.apply(pc, x, y)
class Matrix(object):
def __init__(self):
pass
def create(self, mat):
pass
def destroy(self, mat):
pass
class InnerOuterMAGNETICinverse(BaseMyPC):
def __init__(self, W, kspF, kspA, kspQ,Fp,kspScalar, kspCGScalar, kspVector, G, P, A, Hiptmairtol,Options):
self.W = W
self.kspF = kspF
self.kspA = kspA
self.kspQ = kspQ
self.Fp = Fp
self.kspScalar = kspScalar
self.kspCGScalar = kspCGScalar
self.kspVector = kspVector
# self.Bt = Bt
self.HiptmairIts = 0
self.CGits = 0
# print range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim())
# ss
self.P = P
self.G = G
self.AA = A
self.tol = Hiptmairtol
self.u_is = PETSc.IS().createGeneral(range(self.W[0].dim()))
self.p_is = PETSc.IS().createGeneral(range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim()))
self.b_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()))
self.r_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim()+self.W[2].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()+self.W[3].dim()))
def create(self, pc):
print "Create"
def setUp(self, pc):
A, P = pc.getOperators()
print A.size
if A.type == 'python':
self.Ct = A.getPythonContext().getMatrix("Ct")
self.Bt = A.getPythonContext().getMatrix("Bt")
else:
self.Ct = A.getSubMatrix(self.b_is,self.u_is)
self.Bt = A.getSubMatrix(self.p_is,self.u_is)
self.Dt = A.getSubMatrix(self.r_is,self.b_is)
# print self.Ct.view()
#CFC = sp.csr_matrix( (data,(row,column)), shape=(self.W[1].dim(),self.W[1].dim()) )
#print CFC.shape
#CFC = PETSc.Mat().createAIJ(size=CFC.shape,csr=(CFC.indptr, CFC.indices, CFC.data))
#print CFC.size, self.AA.size
# MO.StoreMatrix(B,"A")
# print FC.todense()
OptDB = PETSc.Options()
OptDB["pc_factor_mat_ordering_type"] = "rcm"
OptDB["pc_factor_mat_solver_package"] = "mumps"
self.kspA.setType('preonly')
self.kspA.getPC().setType('lu')
self.kspA.setFromOptions()
self.kspA.setPCSide(0)
self.kspQ.setType('preonly')
self.kspQ.getPC().setType('lu')
self.kspQ.setFromOptions()
self.kspQ.setPCSide(0)
self.kspScalar.setType('preonly')
self.kspScalar.getPC().setType('lu')
self.kspScalar.setFromOptions()
self.kspScalar.setPCSide(0)
kspMX = PETSc.KSP()
kspMX.create(comm=PETSc.COMM_WORLD)
pcMX = kspMX.getPC()
kspMX.setType('preonly')
pcMX.setType('lu')
OptDB = PETSc.Options()
kspMX.setOperators(self.AA,self.AA)
self.kspMX = kspMX
# self.kspCGScalar.setType('preonly')
# self.kspCGScalar.getPC().setType('lu')
# self.kspCGScalar.setFromOptions()
# self.kspCGScalar.setPCSide(0)
self.kspVector.setType('preonly')
self.kspVector.getPC().setType('lu')
self.kspVector.setFromOptions()
self.kspVector.setPCSide(0)
print "setup"
def apply(self, pc, x, y):
br = x.getSubVector(self.r_is)
xr = br.duplicate()
self.kspScalar.solve(br, xr)
# print self.D.size
x2 = x.getSubVector(self.p_is)
y2 = x2.duplicate()
y3 = x2.duplicate()
xp = x2.duplicate()
self.kspA.solve(x2,y2)
self.Fp.mult(y2,y3)
self.kspQ.solve(y3,xp)
# self.kspF.solve(bu1-bu4-bu2,xu)
bb = x.getSubVector(self.b_is)
xb = bb.duplicate()
xxr = bb.duplicate()
self.Dt.multTranspose(xr,xxr)
self.kspMX.solve(bb-xxr,xb)
bu1 = x.getSubVector(self.u_is)
bu2 = bu1.duplicate()
bu4 = bu1.duplicate()
self.Bt.multTranspose(xp,bu2)
self.Ct.multTranspose(xb,bu4)
XX = bu1.duplicate()
xu = XX.duplicate()
self.kspF.solve(bu1-bu4+bu2,xu)
#self.kspF.solve(bu1,xu)
y.array = (np.concatenate([xu.array, -xp.array,xb.array,xr.array]))
def ITS(self):
return self.CGits, self.HiptmairIts , self.CGtime, self.HiptmairTime
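The `apply` above performs one sweep of a block-triangular preconditioner: solve each field's block in turn and substitute the result into the remaining right-hand sides. A minimal NumPy sketch of the same idea for a 2x2 block system (dense toy matrices with hypothetical sizes, standing in for the PETSc operators):

```python
import numpy as np

# toy 2x2 block system [[F, Bt], [0, S]]; a block upper-triangular
# preconditioner solves the S block first, then back-substitutes into F
rng = np.random.default_rng(0)
n = 4
F = np.eye(n) + 0.1 * rng.standard_normal((n, n))
Bt = 0.1 * rng.standard_normal((n, n))
S = np.eye(n) + 0.1 * rng.standard_normal((n, n))

def apply_prec(b_u, b_p):
    x_p = np.linalg.solve(S, b_p)              # pressure-like solve
    x_u = np.linalg.solve(F, b_u - Bt @ x_p)   # velocity-like solve with substitution
    return x_u, x_p

b_u, b_p = rng.standard_normal(n), rng.standard_normal(n)
x_u, x_p = apply_prec(b_u, b_p)
# the block-triangular residual should be (numerically) zero
assert np.allclose(F @ x_u + Bt @ x_p, b_u)
assert np.allclose(S @ x_p, b_p)
```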
class InnerOuterMAGNETICapprox(BaseMyPC):
def __init__(self, W, kspF, kspA, kspQ,Fp,kspScalar, kspCGScalar, kspVector, G, P, A, Hiptmairtol,Options):
self.W = W
self.kspF = kspF
self.kspA = kspA
self.kspQ = kspQ
self.Fp = Fp
self.kspScalar = kspScalar
self.kspCGScalar = kspCGScalar
self.kspVector = kspVector
# self.Bt = Bt
self.HiptmairIts = 0
self.CGits = 0
# print range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim())
# ss
self.P = P
self.G = G
self.AA = A
self.tol = Hiptmairtol
self.u_is = PETSc.IS().createGeneral(range(self.W[0].dim()))
self.p_is = PETSc.IS().createGeneral(range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim()))
self.b_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()))
self.r_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim()+self.W[2].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()+self.W[3].dim()))
def create(self, pc):
print "Create"
def setUp(self, pc):
A, P = pc.getOperators()
print A.size
if A.type == 'python':
self.Ct = A.getPythonContext().getMatrix("Ct")
self.Bt = A.getPythonContext().getMatrix("Bt")
else:
self.Ct = A.getSubMatrix(self.b_is,self.u_is)
self.Bt = A.getSubMatrix(self.p_is,self.u_is)
self.Dt = A.getSubMatrix(self.r_is,self.b_is)
# print self.Ct.view()
#CFC = sp.csr_matrix( (data,(row,column)), shape=(self.W[1].dim(),self.W[1].dim()) )
#print CFC.shape
#CFC = PETSc.Mat().createAIJ(size=CFC.shape,csr=(CFC.indptr, CFC.indices, CFC.data))
#print CFC.size, self.AA.size
# MO.StoreMatrix(B,"A")
# print FC.todense()
#self.kspF.setType('preonly')
#self.kspF.getPC().setType('lu')
#self.kspF.setFromOptions()
#self.kspF.setPCSide(0)
print "setup"
def apply(self, pc, x, y):
br = x.getSubVector(self.r_is)
xr = br.duplicate()
self.kspScalar.solve(br, xr)
# print self.D.size
x2 = x.getSubVector(self.p_is)
y2 = x2.duplicate()
y3 = x2.duplicate()
xp = x2.duplicate()
self.kspA.solve(x2,y2)
self.Fp.mult(y2,y3)
self.kspQ.solve(y3,xp)
# self.kspF.solve(bu1-bu4-bu2,xu)
bb = x.getSubVector(self.b_is)
xb = bb.duplicate()
#self.kspMX.solve(bb,xb)
xxr = bb.duplicate()
self.Dt.multTranspose(xr,xxr)
xb, its, self.HiptmairTime = HiptmairSetup.HiptmairApply(self.AA, bb-xxr, self.kspScalar, self.kspVector, self.G, self.P, self.tol)
bu1 = x.getSubVector(self.u_is)
bu2 = bu1.duplicate()
bu4 = bu1.duplicate()
self.Bt.multTranspose(xp,bu2)
self.Ct.multTranspose(xb,bu4)
XX = bu1.duplicate()
xu = XX.duplicate()
self.kspF.solve(bu1-bu4+bu2,xu)
#self.kspF.solve(bu1,xu)
y.array = (np.concatenate([xu.array, -xp.array,xb.array,xr.array]))
def ITS(self):
return self.CGits, self.HiptmairIts , self.CGtime, self.HiptmairTime
class P(Matrix):
def __init__(self, Fspace,P,Mass,L,F,M):
self.Fspace = Fspace
self.P = P
self.Mass = Mass
self.L = L
self.kspFp = F
self.M = M
# self.N = (n, n, n)
# self.F = zeros([n+2]*3, order='f')
def create(self, A):
self.IS = MO.IndexSet(self.Fspace)
self.F = self.P.getSubMatrix(self.IS[0],self.IS[0])
self.Bt = self.P.getSubMatrix(self.IS[0],self.IS[2])
self.Ct = self.P.getSubMatrix(self.IS[0],self.IS[1])
self.C = self.P.getSubMatrix(self.IS[1],self.IS[0])
self.A = self.P.getSubMatrix(self.IS[3],self.IS[3])
# ksp = PETSc.KSP()
# ksp.create(comm=PETSc.COMM_WORLD)
# pc = ksp.getPC()
# ksp.setType('preonly')
# pc.setType('hypre')
# ksp.max_it = 1
# ksp.setOperators(self.FF)
# self.ksp = ksp
print 13333
def mult(self, A, x, y):
print 'multi apply'
print 333
u = x.getSubVector(self.IS[0])
p = x.getSubVector(self.IS[2])
b = x.getSubVector(self.IS[1])
r = x.getSubVector(self.IS[3])
FQp = p.duplicate()
uOut = self.F*u+self.Bt*p+self.Ct*b
Qp =self.Mass*p
self.kspFp.solve(Qp,FQp)
pOut = -self.L*FQp
bOut = self.C*u+self.M*b
rOut = self.A*r
y.array = (np.concatenate([uOut.array, bOut.array, pOut.array, rOut.array]))
print "$$$$$$$/$$$$$$$$"
# print x.array
def multTranspose(self, A, x, y):
"y <- A' * x"
self.mult(x, y)
# def getSubMatrix(self, isrow, iscol, submat=None):
# submat = self.P.get
class ApproxInv(BaseMyPC):
def __init__(self, W, kspF, kspA, kspQ,Fp,kspScalar, kspCGScalar, kspVector, G, P, A, Hiptmairtol,Options):
self.W = W
self.kspF = kspF
self.kspA = kspA
self.kspQ = kspQ
self.Fp = Fp
self.kspScalar = kspScalar
self.kspCGScalar = kspCGScalar
self.kspVector = kspVector
self.Options = Options
# self.Bt = Bt
self.HiptmairIts = 0
self.CGits = 0
# print range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim())
# ss
self.P = P
self.G = G
self.AA = A
self.tol = Hiptmairtol
self.u_is = PETSc.IS().createGeneral(range(self.W[0].dim()))
self.p_is = PETSc.IS().createGeneral(range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim()))
self.b_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()))
self.r_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim()+self.W[2].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()+self.W[3].dim()))
def create(self, pc):
print "Create"
def setUp(self, pc):
A, P = pc.getOperators()
print A.size
if A.type == 'python':
self.Ct = A.getPythonContext().getMatrix("Ct")
self.Bt = A.getPythonContext().getMatrix("Bt")
else:
self.Ct = A.getSubMatrix(self.b_is,self.u_is)
self.Bt = A.getSubMatrix(self.p_is,self.u_is)
self.Dt = A.getSubMatrix(self.r_is,self.b_is)
# print self.Ct.view()
#CFC = sp.csr_matrix( (data,(row,column)), shape=(self.W[1].dim(),self.W[1].dim()) )
#print CFC.shape
#CFC = PETSc.Mat().createAIJ(size=CFC.shape,csr=(CFC.indptr, CFC.indices, CFC.data))
#print CFC.size, self.AA.size
# MO.StoreMatrix(B,"A")
# print FC.todense()
OptDB = PETSc.Options()
OptDB["pc_factor_mat_ordering_type"] = "rcm"
OptDB["pc_factor_mat_solver_package"] = "mumps"
self.kspA.setType('preonly')
self.kspA.getPC().setType('lu')
self.kspA.setFromOptions()
self.kspA.setPCSide(0)
self.kspQ.setType('preonly')
self.kspQ.getPC().setType('lu')
self.kspQ.setFromOptions()
self.kspQ.setPCSide(0)
self.kspScalar.setType('preonly')
self.kspScalar.getPC().setType('lu')
self.kspScalar.setFromOptions()
self.kspScalar.setPCSide(0)
kspMX = PETSc.KSP()
kspMX.create(comm=PETSc.COMM_WORLD)
pcMX = kspMX.getPC()
kspMX.setType('preonly')
pcMX.setType('lu')
OptDB = PETSc.Options()
kspMX.setOperators(self.AA,self.AA)
self.kspMX = kspMX
# self.kspCGScalar.setType('preonly')
# self.kspCGScalar.getPC().setType('lu')
# self.kspCGScalar.setFromOptions()
# self.kspCGScalar.setPCSide(0)
self.kspVector.setType('preonly')
self.kspVector.getPC().setType('lu')
self.kspVector.setFromOptions()
self.kspVector.setPCSide(0)
print "setup"
def apply(self, pc, x, y):
if self.Options == 'BT':
b = x.getSubVector(self.b_is)
Mxb = b.duplicate()
self.kspMX.solve(b,Mxb)
r = x.getSubVector(self.r_is)
Lr = r.duplicate()
self.kspScalar.solve(r, Lr)
DL = b.duplicate()
self.Dt.multTranspose(Lr,DL)
K = b.duplicate()
self.kspMX.solve(DL,K)
DM = r.duplicate()
self.Dt.mult(Mxb,DM)
E = r.duplicate()
self.kspScalar.solve(DM,E)
p = x.getSubVector(self.p_is)
Sp2 = p.duplicate()
Sp3 = p.duplicate()
Sp = p.duplicate()
self.kspA.solve(p,Sp2)
self.Fp.mult(Sp2,Sp3)
self.kspQ.solve(Sp3,Sp)
u = x.getSubVector(self.u_is)
Fu = u.duplicate()
Cb = u.duplicate()
Bp = u.duplicate()
self.Ct.multTranspose(Mxb,Cb)
self.Bt.multTranspose(Sp,Bp)
self.kspF.solve(u-Cb+Bp,Fu)
y.array = (np.concatenate([Fu.array, -Sp.array, Mxb.array+K.array,E.array]))
else:
u = x.getSubVector(self.u_is)
Fu = u.duplicate()
self.kspF.solve(u,Fu)
p = x.getSubVector(self.p_is)
Sp2 = p.duplicate()
Sp3 = p.duplicate()
Sp = p.duplicate()
self.kspA.solve(p,Sp2)
self.Fp.mult(Sp2,Sp3)
self.kspQ.solve(Sp3,Sp)
b = x.getSubVector(self.b_is)
Mxb = b.duplicate()
self.kspMX.solve(b,Mxb)
r = x.getSubVector(self.r_is)
Lr = r.duplicate()
self.kspScalar.solve(r, Lr)
if self.Options == 'p4':
Q = u.duplicate()
else:
Q1 = u.duplicate()
self.Bt.multTranspose(Sp,Q1)
Q = u.duplicate()
self.kspF(Q1,Q)
Y1 = u.duplicate()
self.Ct.multTranspose(Mxb,Y1)
Y = u.duplicate()
self.kspF(Y1,Y)
BF = p.duplicate()
self.Bt.mult(Fu,BF)
if self.Options == 'p3':
H = p.duplicate()
else:
H1 = p.duplicate()
H2 = p.duplicate()
H = p.duplicate()
self.kspA.solve(BF,H1)
self.Fp.mult(H1,H2)
self.kspQ.solve(H2,H)
if self.Options == 'p3':
J = p.duplicate()
else:
BY = p.duplicate()
self.Bt.mult(Fu,BY)
J1 = p.duplicate()
J2 = p.duplicate()
J = p.duplicate()
self.kspA.solve(BY,J1)
self.Fp.mult(J1,J2)
self.kspQ.solve(J2,J)
CF = b.duplicate()
self.Ct.mult(Fu,CF)
T = b.duplicate()
self.kspMX.solve(CF,T)
if self.Options == 'p4':
V = b.duplicate()
else:
CQ = b.duplicate()
self.Ct.mult(Q,CQ)
V = b.duplicate()
self.kspMX.solve(CQ,V)
DL = b.duplicate()
self.Dt.multTranspose(Lr,DL)
K = b.duplicate()
self.kspMX.solve(DL,K)
DM = r.duplicate()
self.Dt.mult(Mxb,DM)
E = r.duplicate()
self.kspScalar.solve(DM,E)
y.array = (np.concatenate([Fu.array+Q.array-Y.array, H.array-Sp.array-J.array, T.array+V.array+Mxb.array+K.array,E.array]))
def ITS(self):
return self.CGits, self.HiptmairIts , self.CGtime, self.HiptmairTime
class ApproxInvApprox(BaseMyPC):
def __init__(self, W, kspF, kspA, kspQ,Fp,kspScalar, kspCGScalar, kspVector, G, P, A, Hiptmairtol,Options):
self.W = W
self.kspF = kspF
self.kspA = kspA
self.kspQ = kspQ
self.Fp = Fp
self.kspScalar = kspScalar
self.kspCGScalar = kspCGScalar
self.kspVector = kspVector
self.Options = Options
# self.Bt = Bt
self.HiptmairIts = 0
self.CGits = 0
# print range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim())
# ss
self.P = P
self.G = G
self.AA = A
self.tol = Hiptmairtol
self.u_is = PETSc.IS().createGeneral(range(self.W[0].dim()))
self.p_is = PETSc.IS().createGeneral(range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim()))
self.b_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()))
self.r_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim()+self.W[2].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()+self.W[3].dim()))
def create(self, pc):
print "Create"
def setUp(self, pc):
A, P = pc.getOperators()
print A.size
if A.type == 'python':
self.Ct = A.getPythonContext().getMatrix("Ct")
self.Bt = A.getPythonContext().getMatrix("Bt")
else:
self.Ct = A.getSubMatrix(self.b_is,self.u_is)
self.Bt = A.getSubMatrix(self.p_is,self.u_is)
self.Dt = A.getSubMatrix(self.r_is,self.b_is)
print "setup"
def apply(self, pc, x, y):
if self.Options == 'BT':
b = x.getSubVector(self.b_is)
Mxb = b.duplicate()
# self.kspMX.solve(b,Mxb)
Mxb, its, self.HiptmairTime = HiptmairSetup.HiptmairApply(self.AA, b, self.kspScalar, self.kspVector, self.G, self.P, self.tol)
r = x.getSubVector(self.r_is)
Lr = r.duplicate()
self.kspScalar.solve(r, Lr)
DL = b.duplicate()
self.Dt.multTranspose(Lr,DL)
K = b.duplicate()
K, its, self.HiptmairTime = HiptmairSetup.HiptmairApply(self.AA, DL, self.kspScalar, self.kspVector, self.G, self.P, self.tol)
DM = r.duplicate()
self.Dt.mult(Mxb,DM)
E = r.duplicate()
self.kspScalar.solve(DM,E)
p = x.getSubVector(self.p_is)
Sp2 = p.duplicate()
Sp3 = p.duplicate()
Sp = p.duplicate()
self.kspA.solve(p,Sp2)
self.Fp.mult(Sp2,Sp3)
self.kspQ.solve(Sp3,Sp)
u = x.getSubVector(self.u_is)
Fu = u.duplicate()
Cb = u.duplicate()
Bp = u.duplicate()
self.Ct.multTranspose(Mxb,Cb)
self.Bt.multTranspose(Sp,Bp)
self.kspF.solve(u-Cb+Bp,Fu)
y.array = (np.concatenate([Fu.array, -Sp.array, Mxb.array+K.array,E.array]))
else:
u = x.getSubVector(self.u_is)
Fu = u.duplicate()
self.kspF.solve(u,Fu)
p = x.getSubVector(self.p_is)
Sp2 = p.duplicate()
Sp3 = p.duplicate()
Sp = p.duplicate()
self.kspA.solve(p,Sp2)
self.Fp.mult(Sp2,Sp3)
self.kspQ.solve(Sp3,Sp)
b = x.getSubVector(self.b_is)
Mxb = b.duplicate()
Mxb, its, self.HiptmairTime = HiptmairSetup.HiptmairApply(self.AA, b, self.kspScalar, self.kspVector, self.G, self.P, self.tol)
r = x.getSubVector(self.r_is)
Lr = r.duplicate()
self.kspScalar.solve(r, Lr)
if self.Options == 'p4':
Q = u.duplicate()
else:
Q1 = u.duplicate()
self.Bt.multTranspose(Sp,Q1)
Q = u.duplicate()
self.kspF(Q1,Q)
Y1 = u.duplicate()
self.Ct.multTranspose(Mxb,Y1)
Y = u.duplicate()
self.kspF(Y1,Y)
BF = p.duplicate()
self.Bt.mult(Fu,BF)
if self.Options == 'p3':
H = p.duplicate()
else:
H1 = p.duplicate()
H2 = p.duplicate()
H = p.duplicate()
self.kspA.solve(BF,H1)
self.Fp.mult(H1,H2)
self.kspQ.solve(H2,H)
BY = p.duplicate()
self.Bt.mult(Fu,BY)
if self.Options == 'p3':
J = p.duplicate()
else:
J1 = p.duplicate()
J2 = p.duplicate()
J = p.duplicate()
self.kspA.solve(BY,J1)
self.Fp.mult(J1,J2)
self.kspQ.solve(J2,J)
CF = b.duplicate()
self.Ct.mult(Fu,CF)
T, its, self.HiptmairTime = HiptmairSetup.HiptmairApply(self.AA, CF, self.kspScalar, self.kspVector, self.G, self.P, self.tol)
if self.Options == 'p4':
V = b.duplicate()
else:
CQ = b.duplicate()
self.Ct.mult(Q,CQ)
V, its, self.HiptmairTime = HiptmairSetup.HiptmairApply(self.AA, CQ, self.kspScalar, self.kspVector, self.G, self.P, self.tol)
DL = b.duplicate()
self.Dt.multTranspose(Lr,DL)
K = b.duplicate()
K, its, self.HiptmairTime = HiptmairSetup.HiptmairApply(self.AA, DL, self.kspScalar, self.kspVector, self.G, self.P, self.tol)
DM = r.duplicate()
self.Dt.mult(Mxb,DM)
E = r.duplicate()
self.kspScalar.solve(DM,E)
y.array = (np.concatenate([Fu.array+Q.array-Y.array, H.array-Sp.array-J.array, T.array+V.array+Mxb.array+K.array,E.array]))
def ITS(self):
return self.CGits, self.HiptmairIts , self.CGtime, self.HiptmairTime
# class ApproxBT(BaseMyPC):
# def __init__(self, W, kspF, kspA, kspQ,Fp,kspScalar, kspCGScalar, kspVector, G, P, A, Hiptmairtol,Options):
# self.W = W
# self.kspF = kspF
# self.kspA = kspA
# self.kspQ = kspQ
# self.Fp = Fp
# self.kspScalar = kspScalar
# self.kspCGScalar = kspCGScalar
# self.kspVector = kspVector
# self.Options = Options
# # self.Bt = Bt
# self.HiptmairIts = 0
# self.CGits = 0
# # print range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim())
# # ss
# self.P = P
# self.G = G
# self.AA = A
# self.tol = Hiptmairtol
# self.u_is = PETSc.IS().createGeneral(range(self.W[0].dim()))
# self.p_is = PETSc.IS().createGeneral(range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim()))
# self.b_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim(),
# self.W[0].dim()+self.W[1].dim()+self.W[2].dim()))
# self.r_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim()+self.W[2].dim(),
# self.W[0].dim()+self.W[1].dim()+self.W[2].dim()+self.W[3].dim()))
# def create(self, pc):
# print "Create"
# def setUp(self, pc):
# A, P = pc.getOperators()
# print A.size
# if A.type == 'python':
# self.Ct = A.getPythonContext().getMatrix("Ct")
# self.Bt = A.getPythonContext().getMatrix("Bt")
# else:
# self.Ct = A.getSubMatrix(self.b_is,self.u_is)
# self.Bt = A.getSubMatrix(self.p_is,self.u_is)
# self.Dt = A.getSubMatrix(self.r_is,self.b_is)
# # print self.Ct.view()
# #CFC = sp.csr_matrix( (data,(row,column)), shape=(self.W[1].dim(),self.W[1].dim()) )
# #print CFC.shape
# #CFC = PETSc.Mat().createAIJ(size=CFC.shape,csr=(CFC.indptr, CFC.indices, CFC.data))
# #print CFC.size, self.AA.size
# # MO.StoreMatrix(B,"A")
# # print FC.todense()
# OptDB = PETSc.Options()
# OptDB["pc_factor_mat_ordering_type"] = "rcm"
# OptDB["pc_factor_mat_solver_package"] = "mumps"
# self.kspA.setType('preonly')
# self.kspA.getPC().setType('lu')
# self.kspA.setFromOptions()
# self.kspA.setPCSide(0)
# self.kspQ.setType('preonly')
# self.kspQ.getPC().setType('lu')
# self.kspQ.setFromOptions()
# self.kspQ.setPCSide(0)
# self.kspScalar.setType('preonly')
# self.kspScalar.getPC().setType('lu')
# self.kspScalar.setFromOptions()
# self.kspScalar.setPCSide(0)
# kspMX = PETSc.KSP()
# kspMX.create(comm=PETSc.COMM_WORLD)
# pcMX = kspMX.getPC()
# kspMX.setType('preonly')
# pcMX.setType('lu')
# OptDB = PETSc.Options()
# kspMX.setOperators(self.AA,self.AA)
# self.kspMX = kspMX
# # self.kspCGScalar.setType('preonly')
# # self.kspCGScalar.getPC().setType('lu')
# # self.kspCGScalar.setFromOptions()
# # self.kspCGScalar.setPCSide(0)
# self.kspVector.setType('preonly')
# self.kspVector.getPC().setType('lu')
# self.kspVector.setFromOptions()
# self.kspVector.setPCSide(0)
# print "setup"
# def apply(self, pc, x, y):
# def ITS(self):
# return self.CGits, self.HiptmairIts , self.CGtime, self.HiptmairTime
def FluidSchur(A, b):
if len(A) == 1:
print "exact Schur complement"
x = b.duplicate()
A[0].solve(b, x)
return x
else:
print "PCD Schur complement"
x1 = b.duplicate()
x2 = b.duplicate()
x3 = b.duplicate()
A[0].solve(b,x1)
A[1].mult(x1,x2)
A[2].solve(x2,x3)
return x3
class ApproxInv(BaseMyPC):
def __init__(self, W, kspF, kspA, kspQ,Fp,kspScalar, kspCGScalar, kspVector, G, P, A, Hiptmairtol,Options):
self.W = W
self.kspF = kspF
self.kspA = kspA
self.kspQ = kspQ
self.Fp = Fp
self.kspScalar = kspScalar
self.kspCGScalar = kspCGScalar
self.kspVector = kspVector
# self.Bt = Bt
self.HiptmairIts = 0
self.CGits = 0
# print range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim())
# ss
self.P = P
self.G = G
self.AA = A
self.tol = Hiptmairtol
self.u_is = PETSc.IS().createGeneral(range(self.W[0].dim()))
self.p_is = PETSc.IS().createGeneral(range(self.W[0].dim(),self.W[0].dim()+self.W[1].dim()))
self.b_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()))
self.r_is = PETSc.IS().createGeneral(range(self.W[0].dim()+self.W[1].dim()+self.W[2].dim(),
self.W[0].dim()+self.W[1].dim()+self.W[2].dim()+self.W[3].dim()))
def create(self, pc):
print "Create"
def setUp(self, pc):
A, P = pc.getOperators()
print A.size
if A.type == 'python':
self.Ct = A.getPythonContext().getMatrix("Ct")
self.Bt = A.getPythonContext().getMatrix("Bt")
else:
self.C = A.getSubMatrix(self.u_is,self.b_is)
self.B = A.getSubMatrix(self.u_is,self.p_is)
self.D = A.getSubMatrix(self.b_is,self.r_is)
# print self.Ct.view()
#CFC = sp.csr_matrix( (data,(row,column)), shape=(self.W[1].dim(),self.W[1].dim()) )
#print CFC.shape
#CFC = PETSc.Mat().createAIJ(size=CFC.shape,csr=(CFC.indptr, CFC.indices, CFC.data))
#print CFC.size, self.AA.size
# MO.StoreMatrix(B,"A")
# print FC.todense()
OptDB = PETSc.Options()
OptDB["pc_factor_mat_ordering_type"] = "rcm"
OptDB["pc_factor_mat_solver_package"] = "mumps"
self.kspA.setType('preonly')
self.kspA.getPC().setType('lu')
self.kspA.setFromOptions()
self.kspA.setPCSide(0)
self.kspQ.setType('preonly')
self.kspQ.getPC().setType('lu')
self.kspQ.setFromOptions()
self.kspQ.setPCSide(0)
self.kspScalar.setType('preonly')
self.kspScalar.getPC().setType('lu')
self.kspScalar.setFromOptions()
self.kspScalar.setPCSide(0)
kspMX = PETSc.KSP()
kspMX.create(comm=PETSc.COMM_WORLD)
pcMX = kspMX.getPC()
kspMX.setType('preonly')
pcMX.setType('lu')
OptDB = PETSc.Options()
kspMX.setOperators(self.AA,self.AA)
self.kspMX = kspMX
# self.kspCGScalar.setType('preonly')
# self.kspCGScalar.getPC().setType('lu')
# self.kspCGScalar.setFromOptions()
# self.kspCGScalar.setPCSide(0)
self.kspVector.setType('preonly')
self.kspVector.getPC().setType('lu')
self.kspVector.setFromOptions()
self.kspVector.setPCSide(0)
print "setup"
def apply(self, pc, x, y):
bu = x.getSubVector(self.u_is)
invF = bu.duplicate()
bb = x.getSubVector(self.b_is)
invMX = bb.duplicate()
br = x.getSubVector(self.r_is)
invL = br.duplicate()
self.kspF.solve(bu,invF)
invS = FluidSchur([kspA, Fp, KspQ], bp)
self.kspMX.solve(bb,invMX)
self.kspScalar.solve(br,invL)
# outP = barF - invS - Schur(B*F(C'*invMx));
# outU = invF - F(B'*barF) + barS;
xp1 = xp.duplicate()
self.B.mult(invF, xp1)
barF = FluidSchur([kspA, Fp, KspQ], xp1)
xu1 = xu.duplicate()
barS = xu.duplicate()
self.B.multTranspose(invS, xu1)
self.kspF.solve(xu1, barS)
# outR = (L(D*invMx));
xr1 = xr.duplicate()
outR = xr.duplicate()
self.D.mult(invMX, xr1)
self.kspScalar(xr1, outR)
# outB = (Mx(C*barS) + invMx + Mx(D'*invL));
xb1 = invMX.duplicate()
xb2 = invMX.duplicate()
xb3 = invMX.duplicate()
xb4 = invMX.duplicate()
self.D.multTranspose(invL, xb1)
self.kspMX.solve(xb1, xb2)
self.C.mult(xp, xb3)
self.kspMX.solve(xb3, xb4)
outB = xb4 + xb + xb2
xp1 = xu.duplicate()
xp2 = xu.duplicate()
xp3 = xp.duplicate()
self.C.multTranspose(xb, xp1)
self.kspF.solve(xp1, xp2)
self.B.mult(xp2, xp3)
xp4 = FluidSchur([kspA, Fp, KspQ], xp3)
outP = barF - xp - xp4;
xu1 = xu.duplicate()
xu2 = xu.duplicate()
self.B.multTranspose(barF, xu1)
self.kspF.solve(xu1, xu2)
outU = xu - xu2 + barS;
y.array = (np.concatenate([outU.array, outP.array, outB.array, outR.array]))
def ITS(self):
return self.CGits, self.HiptmairIts , self.CGtime, self.HiptmairTime
| 29.955827 | 142 | 0.534214 | 4,191 | 31,873 | 4.024099 | 0.057027 | 0.04032 | 0.042218 | 0.028817 | 0.868841 | 0.854195 | 0.844234 | 0.835162 | 0.818678 | 0.818678 | 0 | 0.016175 | 0.307533 | 31,873 | 1,063 | 143 | 29.984008 | 0.74795 | 0.187118 | 0 | 0.810185 | 0 | 0 | 0.020561 | 0.006413 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.007716 | 0.015432 | null | null | 0.032407 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a6b4c99ff731f87418d673952732b7a0b7d6f5a6 | 90 | py | Python | Math/10869_사칙연산/10869_사칙연산.py | 7dudtj/BOJ_myCode | 37d105590a7963e2232102b3098fea3c3504b96f | [
"MIT"
] | 1 | 2022-03-30T15:50:47.000Z | 2022-03-30T15:50:47.000Z | Math/10869_사칙연산/10869_사칙연산.py | 7dudtj/BOJ_myCode | 37d105590a7963e2232102b3098fea3c3504b96f | [
"MIT"
] | null | null | null | Math/10869_사칙연산/10869_사칙연산.py | 7dudtj/BOJ_myCode | 37d105590a7963e2232102b3098fea3c3504b96f | [
"MIT"
] | 1 | 2021-07-20T07:11:06.000Z | 2021-07-20T07:11:06.000Z | A, B = map(int, input().split())
print(A+B)
print(A-B)
print(A*B)
print(A//B)
print(A%B)
| 11.25 | 32 | 0.588889 | 21 | 90 | 2.52381 | 0.333333 | 0.226415 | 0.660377 | 0.90566 | 0.660377 | 0.660377 | 0.660377 | 0.660377 | 0.660377 | 0.660377 | 0 | 0 | 0.122222 | 90 | 7 | 33 | 12.857143 | 0.670886 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.833333 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
5b3b240823076c031b7ca093f07c6d2080ca9fb4 | 3,026 | py | Python | test/test_dlt.py | strawlab/flyvr | 335892cae740e53e82e07b526e1ba53fbd34b0ce | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 3 | 2015-01-29T14:09:25.000Z | 2016-04-24T04:25:49.000Z | test/test_dlt.py | strawlab/flyvr | 335892cae740e53e82e07b526e1ba53fbd34b0ce | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | null | null | null | test/test_dlt.py | strawlab/flyvr | 335892cae740e53e82e07b526e1ba53fbd34b0ce | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | null | null | null | #!/usr/bin/env python3
import numpy as np
# ROS imports
import roslib; roslib.load_manifest('freemovr_engine')
import freemovr_engine.dlt as dlt
from pymvg.camera_model import CameraModel
# some sample data -----------------------
XYZ = np.array([[ 2.00000000e-02, 0.00000000e+00, 0.00000000e+00],
[ 1.22464680e-18, 0.00000000e+00, 2.00000000e-02],
[ 1.41421356e-02, 0.00000000e+00, -1.41421356e-02],
[ 1.41421356e-02, 0.00000000e+00, 1.41421356e-02],
[ -3.67394040e-18, -2.00000000e-02, 0.00000000e+00],
[ 1.22464680e-18, 2.00000000e-02, 0.00000000e+00],
[ 1.00000000e-02, 1.00000000e-02, -1.41421356e-02],
[ 1.00000000e-02, -1.00000000e-02, -1.41421356e-02],
[ 1.41421356e-02, 1.41421356e-02, 0.00000000e+00],
[ 1.41421356e-02, -1.41421356e-02, 0.00000000e+00],
[ 2.00000000e-02, 0.00000000e+00, 0.00000000e+00],
[ 1.22464680e-18, 0.00000000e+00, 2.00000000e-02],
[ 1.41421356e-02, 0.00000000e+00, -1.41421356e-02],
[ 1.41421356e-02, 0.00000000e+00, 1.41421356e-02],
[ -3.67394040e-18, -2.00000000e-02, 0.00000000e+00],
[ 1.22464680e-18, 2.00000000e-02, 0.00000000e+00],
[ 1.00000000e-02, 1.00000000e-02, -1.41421356e-02],
[ 1.00000000e-02, -1.00000000e-02, -1.41421356e-02],
[ 1.41421356e-02, 1.41421356e-02, 0.00000000e+00],
[ 1.41421356e-02, -1.41421356e-02, 0.00000000e+00]])
xy = np.array([[ 467.85551727, 663.68835971],
[ 466.81674246, 590.70322096],
[ 469.81695678, 723.93789261],
[ 469.3536242 , 616.7838872 ],
[ 389.60089819, 678.51156788],
[ 549.99472131, 667.72203292],
[ 522.4649629 , 725.09611373],
[ 415.840405 , 729.02607287],
[ 524.74859392, 665.29342667],
[ 414.28125927, 665.94255981],
[ 467.85551727, 663.68835971],
[ 466.81674246, 590.70322096],
[ 469.81695678, 723.93789261],
[ 469.3536242 , 616.7838872 ],
[ 389.60089819, 678.51156788],
[ 549.99472131, 667.72203292],
[ 522.4649629 , 725.09611373],
[ 415.840405 , 729.02607287],
[ 524.74859392, 665.29342667],
[ 414.28125927, 665.94255981]])
# the tests -------------------
def test_basic_dlt():
results = dlt.dlt(XYZ, xy, ransac=False)
assert results['mean_reprojection_error'] < 6.0
c1 = CameraModel.load_camera_from_M( results['pmat'] )
def test_ransac_dlt():
np.random.seed(3) # try to prevent following from failing occasionally
results = dlt.dlt(XYZ, xy, ransac=True)
assert results['mean_reprojection_error'] < 5.0
| 47.28125 | 74 | 0.532716 | 355 | 3,026 | 4.498592 | 0.287324 | 0.03757 | 0.150282 | 0.12273 | 0.792736 | 0.750157 | 0.7201 | 0.7201 | 0.7201 | 0.7201 | 0 | 0.527658 | 0.312954 | 3,026 | 63 | 75 | 48.031746 | 0.2405 | 0.051223 | 0 | 0.615385 | 0 | 0 | 0.02268 | 0.01605 | 0 | 0 | 0 | 0 | 0.038462 | 1 | 0.038462 | false | 0 | 0.076923 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5b4e5e5b4d3143299b3237bf62a8529133f06817 | 319,871 | py | Python | tests/tools/arachni/arachni_reports_1_2_1.py | owtf/ptp | b43e581d7646330810f526432c689c3d88995df9 | [
"BSD-3-Clause"
] | 23 | 2015-03-22T09:18:35.000Z | 2022-03-10T23:28:13.000Z | tests/tools/arachni/arachni_reports_1_2_1.py | owtf/ptp | b43e581d7646330810f526432c689c3d88995df9 | [
"BSD-3-Clause"
] | 22 | 2015-07-12T12:23:40.000Z | 2017-02-26T12:39:48.000Z | tests/tools/arachni/arachni_reports_1_2_1.py | owtf/ptp | b43e581d7646330810f526432c689c3d88995df9 | [
"BSD-3-Clause"
] | 14 | 2015-06-03T19:16:22.000Z | 2022-03-10T23:28:15.000Z | report_high = r"""<?xml version="1.0"?>
<report xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="https://raw.githubusercontent.com/Arachni/arachni/v1.2.1/components/reporters/xml/schema.xsd">
<version>1.2.1</version>
<options>---
input:
values: {}
default_values:
"(?i-mx:name)": arachni_name
"(?i-mx:user)": arachni_user
"(?i-mx:usr)": arachni_user
"(?i-mx:pass)": 5543!%arachni_secret
"(?i-mx:txt)": arachni_text
"(?i-mx:num)": '132'
"(?i-mx:amount)": '100'
"(?i-mx:mail)": arachni@email.gr
"(?i-mx:account)": '12'
"(?i-mx:id)": '1'
without_defaults: false
force: false
audit:
parameter_values: true
exclude_vector_patterns: []
include_vector_patterns: []
link_templates: []
links: true
forms: true
cookies: true
jsons: true
xmls: true
browser_cluster:
wait_for_elements: {}
pool_size: 6
job_timeout: 25
worker_time_to_live: 100
ignore_images: false
screen_width: 1600
screen_height: 1200
session: {}
http:
user_agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
request_timeout: 10000
request_redirect_limit: 5
request_concurrency: 20
request_queue_size: 100
request_headers: {}
response_max_size: 500000
cookies: {}
scope:
redundant_path_patterns: {}
dom_depth_limit: 5
exclude_path_patterns: []
exclude_content_patterns: []
include_path_patterns: []
restrict_paths: []
extend_paths: []
url_rewrites: {}
datastore:
report_path: "/root/Desktop/arachni.afr"
checks:
- htaccess_limit
- insecure_client_access_policy
- interesting_responses
- insecure_cross_domain_policy_access
- insecure_cross_domain_policy_headers
- backdoors
- directory_listing
- origin_spoof_access_restriction_bypass
- backup_directories
- http_put
- xst
- localstart_asp
- common_admin_interfaces
- common_files
- common_directories
- webdav
- backup_files
- captcha
- private_ip
- http_only_cookies
- x_frame_options
- unencrypted_password_forms
- insecure_cookies
- password_autocomplete
- cookie_set_for_parent_domain
- insecure_cors_policy
- credit_card
- ssn
- html_objects
- form_upload
- mixed_resource
- hsts
- cvs_svn_users
- emails
- allowed_methods
- sql_injection
- code_injection_timing
- no_sql_injection
- session_fixation
- path_traversal
- unvalidated_redirect
- trainer
- os_cmd_injection_timing
- response_splitting
- xxe
- file_inclusion
- csrf
- xss_event
- xss_path
- source_code_disclosure
- rfi
- xss_dom_script_context
- no_sql_injection_differential
- unvalidated_redirect_dom
- sql_injection_differential
- xpath_injection
- ldap_injection
- xss_tag
- xss_script_context
- xss
- code_injection_php_input_wrapper
- xss_dom
- os_cmd_injection
- code_injection
- sql_injection_timing
- xss_dom_inputs
platforms: []
plugins: {}
no_fingerprinting: false
authorized_by:
url: http://elearnix.org/
</options>
<start_datetime>2016-06-07T09:37:38-04:00</start_datetime>
<finish_datetime>2016-06-07T09:55:29-04:00</finish_datetime>
<sitemap>
<entry url="http://elearnix.org/" code="200"/>
</sitemap>
<issues>
<issue>
<name>Cross-Site Request Forgery</name>
<description>
In the majority of today's web applications, clients are required to submit forms
which can perform sensitive operations.
An example of such a form being used would be when an administrator wishes to
create a new user for the application.
In the simplest version of the form, the administrator would fill-in:
* Name
* Password
* Role (level of access)
Continuing with this example, Cross Site Request Forgery (CSRF) would occur when
the administrator is tricked into clicking on a link, which if logged into the
application, would automatically submit the form without any further interaction.
Cyber-criminals will look for sites where sensitive functions are performed in
this manner and then craft malicious requests that will be used against clients
via a social engineering attack.
There are 3 things that are required for a CSRF attack to occur:
1. The form must perform some sort of sensitive action.
2. The victim (the administrator the example above) must have an active session.
3. Most importantly, all parameter values must be **known** or **guessable**.
Arachni discovered that all parameters within the form were known or predictable
and therefore the form could be vulnerable to CSRF.
_Manual verification may be required to check whether the submission will then
perform a sensitive action, such as reset a password, modify user profiles, post
content on a forum, etc._
</description>
<remedy_guidance>
Based on the risk (determined by manual verification) of whether the form submission
performs a sensitive action, the addition of anti-CSRF tokens may be required.
These tokens can be configured in such a way that each session generates a new
anti-CSRF token or such that each individual request requires a new token.
It is important that the server track and maintain the status of each token (in
order to reject requests accompanied by invalid ones) and therefore prevent
cyber-criminals from knowing, guessing or reusing them.
_For examples of framework specific remediation options, please refer to the references._
</remedy_guidance>
<remedy_code/>
<severity>high</severity>
<check>
<name>CSRF</name>
<description>
It uses differential analysis to determine which forms affect business logic and
checks them for lack of anti-CSRF tokens.
(Works best with a valid session.)
</description>
<author>Tasos "Zapotek" Laskos <tasos.laskos@arachni-scanner.com> </author>
<version>0.3.5</version>
<shortname>csrf</shortname>
</check>
<cwe>352</cwe>
<digest>1606559286</digest>
<references>
<reference title="Wikipedia" url="http://en.wikipedia.org/wiki/Cross-site_request_forgery"/>
<reference title="OWASP" url="https://www.owasp.org/index.php/Cross-Site_Request_Forgery_(CSRF)"/>
<reference title="CGI Security" url="http://www.cgisecurity.com/csrf-faq.html"/>
</references>
<vector>
<class>Arachni::Element::Form</class>
<type>form</type>
<url>http://elearnix.org/</url>
<action>http://elearnix.org/verify.php</action>
<source><form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge">
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/">
<input type="submit" value="Delist">
</form></source>
<affected_input_name/>
<inputs>
<input name="recaptcha_challenge_field" value=""/>
<input name="recaptcha_response_field" value="manual_challenge"/>
<input name="origin_url" value="http://elearnix.org/"/>
</inputs>
</vector>
<variations>
<variation>
<vector>
<seed/>
<inputs>
<input name="recaptcha_challenge_field" value=""/>
<input name="recaptcha_response_field" value="manual_challenge"/>
<input name="origin_url" value="http://elearnix.org/"/>
</inputs>
</vector>
<remarks/>
<page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature/>
<proof><form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge">
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/">
<input type="submit" value="Delist">
</form></proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>HTTP TRACE</name>
<description>
The `TRACE` HTTP method allows a client to send a request to the server and
have that same request sent back in the server's response. This allows the
client to determine whether the server is receiving the request as expected, or
whether specific parts of the request are being altered in transit, for example
by incorrect encoding, or by a load balancer that has filtered or changed a value.
On many default installations the `TRACE` method is still enabled.
While not a vulnerability by itself, it does provide a method for cyber-criminals to
bypass the `HttpOnly` cookie flag, and could therefore allow an XSS attack to
successfully access a session token.
Arachni has discovered that the affected page permits the HTTP `TRACE` method.
</description>
<remedy_guidance>
The HTTP `TRACE` method is normally not required within production sites and
should therefore be disabled.
Depending on the function performed by the web application, the risk
level starts low and increases as more functionality is implemented.
The remediation is typically a very simple configuration change and in most cases
will have no negative impact on the server or application.
</remedy_guidance>
<remedy_code/>
<severity>medium</severity>
<check>
<name>XST</name>
<description>Sends an HTTP TRACE request and checks if it succeeded.</description>
<author>Tasos "Zapotek" Laskos <tasos.laskos@arachni-scanner.com></author>
<version>0.1.7</version>
<shortname>xst</shortname>
</check>
<cwe>693</cwe>
<digest>1441521763</digest>
<references>
<reference title="CAPEC" url="http://capec.mitre.org/data/definitions/107.html"/>
<reference title="OWASP" url="http://www.owasp.org/index.php/Cross_Site_Tracing"/>
</references>
<vector>
<class>Arachni::Element::Server</class>
<type>server</type>
<url>http://elearnix.org/</url>
<action>http://elearnix.org/</action>
</vector>
<variations>
<variation>
<vector/>
<remarks/>
<page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>trace</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>TRACE / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>2.3537</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:43 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used to violate server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:43 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used to violate server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used to violate server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature/>
<proof>HTTP/1.1 200 OK</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>Missing 'X-Frame-Options' header</name>
<description>
Clickjacking (also called a User Interface redress attack, UI redress attack, or UI redressing)
is a malicious technique that tricks a web user into clicking on something different
from what they perceive they are clicking on, potentially revealing
confidential information or giving an attacker control of their session while they
interact with seemingly innocuous web pages.
The server did not return an `X-Frame-Options` header, which means that this website
may be at risk of a clickjacking attack.
The `X-Frame-Options` HTTP response header indicates whether a browser
should be allowed to render a page inside a `frame` or `iframe`. Sites can
use this header to defend against clickjacking by ensuring that their content is not
embedded into other sites.
</description>
<remedy_guidance>
Configure your web server to include an `X-Frame-Options` header, typically with a value of `DENY` or `SAMEORIGIN`, in all responses.
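Since the response headers above identify the server as Apache/2.2.16 (Debian), a minimal sketch of such a configuration, assuming the `mod_headers` module is enabled, could be:

```apache
# Allow this site's pages to be framed only by pages from the same origin;
# use "DENY" instead to forbid framing entirely.
Header always set X-Frame-Options "SAMEORIGIN"
```

This directive can be placed in the main server configuration or a virtual host block; after reloading Apache, every response should carry the header.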
</remedy_guidance>
<remedy_code/>
<severity>low</severity>
<check>
<name>Missing X-Frame-Options header</name>
<description>Checks the host for a missing `X-Frame-Options` header.</description>
<author>Tasos Laskos <tasos.laskos@arachni-scanner.com></author>
<version>0.1.1</version>
<shortname>x_frame_options</shortname>
</check>
<cwe>693</cwe>
<digest>730375711</digest>
<references>
<reference title="MDN" url="https://developer.mozilla.org/en-US/docs/Web/HTTP/X-Frame-Options"/>
<reference title="RFC" url="http://tools.ietf.org/html/rfc7034"/>
<reference title="OWASP" url="https://www.owasp.org/index.php/Clickjacking"/>
</references>
<vector>
<class>Arachni::Element::Server</class>
<type>server</type>
<url>http://elearnix.org/</url>
<action>http://elearnix.org/</action>
</vector>
<variations>
<variation>
<vector/>
<remarks/>
<page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used to violate server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature/>
<proof>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>Interesting response</name>
<description>
The server responded with a status code that is neither 200 (OK) nor 404 (Not Found).
This is not an issue in itself; however, exotic HTTP response status codes can provide useful
insights into the behavior of the web application and assist with the penetration test.
</description>
<remedy_guidance/>
<remedy_code/>
<severity>informational</severity>
<check>
<name>Interesting responses</name>
<description>Logs all non-200 (OK) server responses.</description>
<author>Tasos "Zapotek" Laskos &lt;tasos.laskos@arachni-scanner.com&gt;</author>
<version>0.2.1</version>
<shortname>interesting_responses</shortname>
</check>
<digest>2686368332</digest>
<references>
<reference title="w3.org" url="http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html"/>
</references>
<vector>
<class>Arachni::Element::Server</class>
<type>server</type>
<url>http://elearnix.org/.adm</url>
<action>http://elearnix.org/.adm</action>
</vector>
<variations>
<variation>
<vector/>
<remarks/>
<page>
<body>Invalid URI /.adm</body>
<request>
<url>http://elearnix.org/.adm</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET /.adm HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/.adm</url>
<code>403</code>
<ip_address>31.220.16.186</ip_address>
<time>7.4241</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:49 GMT"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Content-Type" value="text/plain"/>
<header name="Content-Length" value="17"/>
</headers>
<body>Invalid URI /.adm</body>
<raw_headers>HTTP/1.1 403 Forbidden
Date: Tue, 07 Jun 2016 13:37:49 GMT
Server: Apache/2.2.16 (Debian)
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Content-Type: text/plain
Content-Length: 17

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/.adm</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature/>
<proof>HTTP/1.1 403 Forbidden</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>CAPTCHA protected form</name>
<description>
To prevent automated abuse of a page, applications can implement what is
known as a CAPTCHA.
These challenges are used to ensure human interaction with the application and
are often placed on forms where the application conducts sensitive actions,
such as user registration or submitting email via a "Contact Us" page.
Arachni has flagged this not as a vulnerability, but as a prompt for the
penetration tester to conduct further manual testing of the CAPTCHA function, as
Arachni cannot audit CAPTCHA-protected forms.
Testing for an insecurely implemented CAPTCHA is a manual process, and a weak
implementation could give a cyber-criminal a means to abuse these sensitive
actions.
</description>
<remedy_guidance>
Although no remediation may be required based on this finding alone, manual
testing should ensure that:
1. The server keeps track of CAPTCHA tokens in use and invalidates each token
after its first use or after a short period of time, thereby preventing replay attacks.
2. The CAPTCHA answer is not exposed in plain text within the response that is
sent to the client.
3. The CAPTCHA image is not so weak that it can be solved automatically (e.g. by OCR).
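Point 1 above can be sketched in a few lines of server-side logic. This is a
minimal illustration, assuming an in-memory store; the names (CaptchaTokenStore,
issue, redeem) and the 300-second lifetime are illustrative, not part of any
real framework or of Arachni itself.

```python
# Sketch of single-use CAPTCHA token tracking (hypothetical helper, not a real API).
import secrets
import time

class CaptchaTokenStore:
    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._tokens = {}  # token -> (expected answer, issue timestamp)

    def issue(self, answer):
        """Register a freshly generated CAPTCHA answer and return its token."""
        token = secrets.token_urlsafe(16)
        self._tokens[token] = (answer, time.time())
        return token

    def redeem(self, token, response):
        """Validate a response; the token is consumed whether or not it matches."""
        entry = self._tokens.pop(token, None)  # single use: always removed
        if entry is None:
            return False  # unknown or already-used token -> replay is rejected
        answer, issued_at = entry
        if time.time() - issued_at > self.ttl:
            return False  # token expired
        # Constant-time comparison avoids leaking the answer via timing.
        return secrets.compare_digest(answer, response)
```

Because redeem() removes the token before checking the answer, submitting the
same token twice fails on the second attempt, which is exactly the replay
behaviour manual testing should verify.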
</remedy_guidance>
<remedy_code/>
<severity>informational</severity>
<check>
<name>CAPTCHA</name>
<description>Greps pages for forms with CAPTCHAs.</description>
<author>Tasos "Zapotek" Laskos &lt;tasos.laskos@arachni-scanner.com&gt;</author>
<version>0.2</version>
<shortname>captcha</shortname>
</check>
<digest>1495184166</digest>
<references/>
<vector>
<class>Arachni::Element::Form</class>
<type>form</type>
<url>http://elearnix.org/</url>
<action>http://elearnix.org/verify.php</action>
<source><form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge">
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/">
<input type="submit" value="Delist">
</form></source>
<affected_input_name/>
<inputs>
<input name="recaptcha_challenge_field" value=""/>
<input name="recaptcha_response_field" value="manual_challenge"/>
<input name="origin_url" value="http://elearnix.org/"/>
</inputs>
</vector>
<variations>
<variation>
<vector>
<seed/>
<inputs>
<input name="recaptcha_challenge_field" value=""/>
<input name="recaptcha_response_field" value="manual_challenge"/>
<input name="origin_url" value="http://elearnix.org/"/>
</inputs>
</vector>
<remarks/>
<page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature>captcha</signature>
<proof><form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge">
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/">
<input type="submit" value="Delist">
</form></proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>Interesting response</name>
<description>
The server responded with a status code other than 200 (OK) or 404 (Not Found).
This is not an issue in itself; however, exotic HTTP response status codes can provide
useful insight into the behavior of the web application and assist with the penetration test.
</description>
<remedy_guidance/>
<remedy_code/>
<severity>informational</severity>
<check>
<name>Interesting responses</name>
<description>Logs all non 200 (OK) server responses.</description>
<author>Tasos "Zapotek" Laskos <tasos.laskos@arachni-scanner.com></author>
<version>0.2.1</version>
<shortname>interesting_responses</shortname>
</check>
<digest>3783498189</digest>
<references>
<reference title="w3.org" url="http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html"/>
</references>
<vector>
<class>Arachni::Element::Server</class>
<type>server</type>
<url>http://elearnix.org/.git/HEAD</url>
<action>http://elearnix.org/.git/HEAD</action>
</vector>
<variations>
<variation>
<vector/>
<remarks/>
<page>
<body>Invalid URI /.git/HEAD</body>
<request>
<url>http://elearnix.org/.git/HEAD</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET /.git/HEAD HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/.git/HEAD</url>
<code>403</code>
<ip_address>31.220.16.186</ip_address>
<time>7.3712</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:49 GMT"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Content-Type" value="text/plain"/>
<header name="Content-Length" value="22"/>
</headers>
<body>Invalid URI /.git/HEAD</body>
<raw_headers>HTTP/1.1 403 Forbidden
Date: Tue, 07 Jun 2016 13:37:49 GMT
Server: Apache/2.2.16 (Debian)
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Content-Type: text/plain
Content-Length: 22

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/.git/HEAD</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature/>
<proof>HTTP/1.1 403 Forbidden</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>Interesting response</name>
<description>
The server responded with a status code other than 200 (OK) or 404 (Not Found).
This is not an issue in itself; however, exotic HTTP response status codes can provide
useful insight into the behavior of the web application and assist with the penetration test.
</description>
<remedy_guidance/>
<remedy_code/>
<severity>informational</severity>
<check>
<name>Interesting responses</name>
<description>Logs all non 200 (OK) server responses.</description>
<author>Tasos "Zapotek" Laskos <tasos.laskos@arachni-scanner.com></author>
<version>0.2.1</version>
<shortname>interesting_responses</shortname>
</check>
<digest>680817867</digest>
<references>
<reference title="w3.org" url="http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html"/>
</references>
<vector>
<class>Arachni::Element::Server</class>
<type>server</type>
<url>http://elearnix.org/.admin</url>
<action>http://elearnix.org/.admin</action>
</vector>
<variations>
<variation>
<vector/>
<remarks/>
<page>
<body>Invalid URI /.admin</body>
<request>
<url>http://elearnix.org/.admin</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET /.admin HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/.admin</url>
<code>403</code>
<ip_address>31.220.16.186</ip_address>
<time>4.2526</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:43 GMT"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Content-Type" value="text/plain"/>
<header name="Content-Length" value="19"/>
</headers>
<body>Invalid URI /.admin</body>
<raw_headers>HTTP/1.1 403 Forbidden
Date: Tue, 07 Jun 2016 13:37:43 GMT
Server: Apache/2.2.16 (Debian)
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Content-Type: text/plain
Content-Length: 19

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/.admin</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature/>
<proof>HTTP/1.1 403 Forbidden</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>E-mail address disclosure</name>
<description>
Email addresses are typically found on "Contact us" pages; however, they can also
appear in scripts or code comments of the application. They provide a legitimate
means of contacting an organisation.
As one of the initial steps in information gathering, cyber-criminals will spider
a website and use automated methods to collect as many email addresses as possible,
which they may then use in a social-engineering attack.
Using the same automated methods, Arachni was able to detect one or more email
addresses stored within the affected page.
</description>
<remedy_guidance>E-mail addresses should be presented in such
a way that they are hard to harvest and process automatically.</remedy_guidance>
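One common (though imperfect) way to follow this guidance is to emit the address as HTML character entities rather than literal text, so naive harvesters that grep page source for `user@host` patterns find nothing, while browsers still render a working `mailto:` link. A minimal sketch; the function name is illustrative, not part of Arachni:

```python
def obfuscate_email(address: str) -> str:
    """Encode every character of an e-mail address as a decimal HTML
    entity. The page source then contains no literal '@' or dots,
    defeating simple regex-based harvesters, while browsers decode
    the entities and display the address normally."""
    return "".join(f"&#{ord(ch)};" for ch in address)

# Example: build an entity-encoded mailto link for a page template.
link = f'<a href="mailto:{obfuscate_email("info@example.com")}">Contact us</a>'
```

Determined scrapers can decode entities, so for higher assurance sites typically fall back to contact forms or assemble the address client-side with JavaScript instead of serving it at all.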
<remedy_code/>
<severity>informational</severity>
<check>
<name>E-mail address</name>
<description>Greps pages for disclosed e-mail addresses.</description>
<author>Tasos "Zapotek" Laskos &lt;tasos.laskos@arachni-scanner.com&gt;</author>
<version>0.2.1</version>
<shortname>emails</shortname>
</check>
<cwe>200</cwe>
<digest>4057954726</digest>
<references/>
<vector>
<class>Arachni::Element::Body</class>
<type>body</type>
<url>http://elearnix.org/</url>
<action>http://elearnix.org/</action>
</vector>
<variations>
<variation>
<vector/>
<remarks/>
<page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature>[A-Z0-9._%+-]+(?:@|\s*\[at\]\s*)[A-Z0-9.-]+(?:\.|\s*\[dot\]\s*)[A-Z]{2,4}</signature>
<proof>csapda@web-server.hu</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
<variation>
<vector/>
<remarks/>
<page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used for violations of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used for violations of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used for violations of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reasons. Your IP address has probably been used for violations of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist your IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature>[A-Z0-9._%+-]+(?:@|\s*\[at\]\s*)[A-Z0-9.-]+(?:\.|\s*\[dot\]\s*)[A-Z]{2,4}</signature>
<proof>csapda@astrohost.com</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
<issue>
<name>Interesting response</name>
<description>
The server responded with a status code that is neither 200 (OK) nor 404 (Not Found).
This is not an issue in itself; however, exotic HTTP response status codes can provide useful
insights into the behavior of the web application and assist with the penetration test.
</description>
<remedy_guidance/>
<remedy_code/>
<severity>informational</severity>
<check>
<name>Interesting responses</name>
<description>Logs all non 200 (OK) server responses.</description>
<author>Tasos "Zapotek" Laskos &lt;tasos.laskos@arachni-scanner.com&gt;</author>
<version>0.2.1</version>
<shortname>interesting_responses</shortname>
</check>
<digest>710586659</digest>
<references>
<reference title="w3.org" url="http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html"/>
</references>
<vector>
<class>Arachni::Element::Server</class>
<type>server</type>
<url>http://elearnix.org/.svn/all-wcprops</url>
<action>http://elearnix.org/.svn/all-wcprops</action>
</vector>
<variations>
<variation>
<vector/>
<remarks/>
<page>
<body>Invalid URI /.svn/all-wcprops</body>
<request>
<url>http://elearnix.org/.svn/all-wcprops</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET /.svn/all-wcprops HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/.svn/all-wcprops</url>
<code>403</code>
<ip_address>31.220.16.186</ip_address>
<time>1.4663</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:44 GMT"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Content-Type" value="text/plain"/>
<header name="Content-Length" value="29"/>
</headers>
<body>Invalid URI /.svn/all-wcprops</body>
<raw_headers>HTTP/1.1 403 Forbidden
Date: Tue, 07 Jun 2016 13:37:44 GMT
Server: Apache/2.2.16 (Debian)
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Content-Type: text/plain
Content-Length: 29

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/.svn/all-wcprops</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</page>
<referring_page>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<request>
<url>http://elearnix.org/</url>
<method>get</method>
<parameters/>
<headers>
<header name="Accept" value="text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"/>
<header name="User-Agent" value="Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0"/>
</headers>
<body/>
<raw>GET / HTTP/1.1
Host: elearnix.org
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

</raw>
</request>
<response>
<url>http://elearnix.org/</url>
<code>200</code>
<ip_address>31.220.16.186</ip_address>
<time>0.9839</time>
<return_code>ok</return_code>
<return_message>No error</return_message>
<headers>
<header name="Date" value="Tue, 07 Jun 2016 13:37:38 GMT"/>
<header name="Cache-Control" value="no-cache, no-store, must-revalidate"/>
<header name="Pragma" value="no-cache"/>
<header name="Expires" value="0"/>
<header name="Server" value="Apache/2.2.16 (Debian)"/>
<header name="Content-Length" value="6557"/>
<header name="Connection" value="close"/>
</headers>
<body><html>
<head>
<title>Visitor anti-robot validation</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="robots" content="noindex, nofollow" />
<meta name="keywords" content="joomla, Joomla, joomla 1.5, wordpress 2.5, Drupal" />
<meta name="description" content="Joomla!" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
<meta name="generator" content="WordPress 2.5" />
</head>
<body>
<div class="container">
<div>
<h1>Dear visitor</h1>
<p>To reach the website securely, please fill in the characters shown below.</p>
<p><strong></strong></p>
</div>
<div class="left">
<img src="/img/logo.png" alt="" />
</div>
<div class="right">
<form method="post" action="/verify.php">
<script type="text/javascript" src="http://www.google.com/recaptcha/api/challenge?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC"></script>
<noscript>
<iframe src="http://www.google.com/recaptcha/api/noscript?k=6LfRteUSAAAAAFQ4IlQQdjP_E7ek9ElCzSo5TDxC" height="300" width="500" frameborder="0"></iframe><br/>
<textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
<input type="hidden" name="recaptcha_response_field" value="manual_challenge"/>
</noscript>
<input type="hidden" name="origin_url" value="http://elearnix.org/" />
<input type="submit" value="Delist" />
</form>
</div>
<div class="clear"></div>
<div>
<h1>Why is it necessary?</h1>
<p>Your IP address (125.18.48.110) has been blocked for security reason. Probably your IP address has been used for violation of server security rules before.</p>
<p>We have to make sure that this is not a malicious visit by an automated robot. Filling in the captcha is required to delist you IP address.</p>
<p>Thank you.</p>
<hr/>
<pre>
Remote address: 125.18.48.110
URI: /
Agent: Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
</pre>
</div>
</div>
</body>
<!--
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<a href='index.php?option=com_dshop'>This contact form is about /components/com_dshop/ </a><br>
<a href='index.php?option=com_jobprofile'>This contact form is about /components/com_jobprofile/ </a><br>
<a href='index.php?option=com_fckeditor'>This contact form is about /components/com_fckeditor/ </a><br>
<a href='index.php?option=com_acajoom'>This contact form is about /components/com_acajoom/ </a><br>
<a href='index.php?option=com_content'>This contact form is about /components/com_content/ </a><br>
<a href='index.php?option=com_phocagallery'>This contact form is about /components/com_phocagallery/ </a><br>
<a href='index.php?option=com_mailto'>This contact form is about /components/com_mailto/ </a><br>
<a href='index.php?option=com_qcontacts'>This contact form is about /components/com_qcontacts/ </a><br>
<a href='index.php?option=com_jevents'>This contact form is about /components/com_jevents/ </a><br>
<a href='index.php?option=com_contact'>This contact form is about /components/com_contact/ </a><br>
<a href='index.php?option=com_search'>This contact form is about /components/com_search/ </a><br>
<a href='index.php?option=com_virtuemart'>This contact form is about /components/com_virtuemart/ </a><br>
<a href='index.php?option=com_google'>This contact form is about /components/com_google/ </a><br>
<a href='index.php?option=com_oziogallery2'>This contact form is about /components/com_oziogallery2/ </a><br>
<a href='index.php?option=fckeditor/editor/filemanager/connectors/uploadtest.html'>This contact form is about /components/fckeditor/editor/filemanager/connectors/uploadtest.html/ </a><br>
<a href='index.php?option=FCKeditor - Uploaders Tests'>This contact form is about /components/FCKeditor - Uploaders Tests/ </a><br>
<a href='index.php?option=phpmyadmin'>This contact form is about /components/phpmyadmin/ </a><br>
<a href='index.php?option=phpmyadmin2'>This contact form is about /components/phpmyadmin2/ </a><br>
<a href="demo/GHH%20-%20Haxplorer/1.php">GHDB Signature #833 (filetype:php HAXPLORER &quot;Server Files Browser&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Ping/php-ping.php">GHDB Signature #733 (&quot;Enter ip&quot; inurl:&quot;php-ping.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHP%20Shell/phpshell.php">GHDB Signature #365 (intitle:&quot;PHP Shell *&quot; &quot;Enable stderr&quot; filetype:php)</a><br>
<br>
<a href="demo/GHH%20-%20PHPBB%20Install/phpBB2/install/install.php">GHDB Signature #935 (inurl:&quot;install/install.php&quot;)</a><br>
<br>
<a href="demo/GHH%20-%20PHPFM/index.php">GHDB Signature #361 (&quot;Powered by PHPFM&quot; filetype:php -username)
</a><br><br>
<a href="demo/GHH%20-%20PhpSysInfo/index.php">GHDB Signature #161 (inurl:phpSysInfo/ &quot;created by phpsysinfo&quot;)</a><br><br>
<a href="demo/GHH%20-%20SquirrelMail/src/login.php">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7">GHDB Signature #1013 (&quot;SquirrelMail version 1.4.4&quot; inurl:src ext:php)</a> <br><br>
<a href="/demo/GHH v1.1 - .mdb/admin.mdb">GHDB Signature #162 (allinurl: admin mdb)</a> <br><br>
<a href="/demo/GHH v1.1 - .sql/create.sql">GHDB Signature #1064 (filetype:sql ("passwd values" | "password values" | "pass values" ))</a> <br><br>
<a href="/demo/GHH v1.1 - AIM BuddyList/BuddyList.blt">GHDB Signature #937 (filetype:blt "buddylist")</a> <br><br>
<a href="/demo/GHH v1.1 - File Upload Manager/">GHDB Signature #734 ("File Upload Manager v1.3" "rename to")</a> <br><br>
<a href="/demo/GHH v1.1 - passlist.txt/passlist.txt">GHDB Signature #58 (inurl:passlist.txt)</a> <br><br>
<a href="/demo/GHH v1.1 - passwd.txt/passwd.txt">GHDB Signature #1122 (wwwboard WebAdmin inurl:passwd.txt</a> <br><br>
<a href="/demo/GHH v1.1 - WebUtil 2.7/webutil.pl">GHDB Signature #769 (inurl:webutil.pl)</a> <br><br>
-->
<!--
<a href="mailto:csapda@web-server.hu"></a>
<a href="mailto:csapda@astrohost.com"></a>
-->
</html>
</body>
<raw_headers>HTTP/1.1 200 OK
Date: Tue, 07 Jun 2016 13:37:38 GMT
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
Server: Apache/2.2.16 (Debian)
Content-Length: 6557
Connection: close

</raw_headers>
</response>
<dom>
<url>http://elearnix.org/</url>
<transitions/>
<data_flow_sinks/>
<execution_flow_sinks/>
</dom>
</referring_page>
<signature/>
<proof>HTTP/1.1 403 Forbidden</proof>
<trusted>true</trusted>
<platform_type/>
<platform_name/>
</variation>
</variations>
</issue>
</issues>
<plugins>
<healthmap>
<name>Health map</name>
<description>Generates a simple list of safe/unsafe URLs.</description>
<results>
<map>
<with_issues>http://elearnix.org/</with_issues>
<with_issues>http://elearnix.org/.adm</with_issues>
<with_issues>http://elearnix.org/.admin</with_issues>
<with_issues>http://elearnix.org/.git/HEAD</with_issues>
<with_issues>http://elearnix.org/.svn/all-wcprops</with_issues>
<with_issues>http://elearnix.org/verify.php</with_issues>
</map>
<total>6</total>
<with_issues>6</with_issues>
<without_issues>0</without_issues>
<issue_percentage>100</issue_percentage>
</results>
</healthmap>
</plugins>
</report>
"""
| 61.009155 | 205 | 0.649459 | 53,602 | 319,871 | 3.843961 | 0.016604 | 0.0973 | 0.060686 | 0.079672 | 0.959261 | 0.957058 | 0.955641 | 0.95416 | 0.953369 | 0.952496 | 0 | 0.03275 | 0.158827 | 319,871 | 5,242 | 206 | 61.020794 | 0.733022 | 0 | 1 | 0.93187 | 0 | 0.360323 | 0.999931 | 0.335632 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.021329 | 0.000414 | 0 | 0.000414 | 0.000207 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
5b6b6d0bb9e5486784c1b9d1b773b7cb46c7828e | 158 | py | Python | ident/cdr/__init__.py | AlexShander/Grafana_Asterisk | 1f8b20e2d3e6960c13d223baf9ce134fca402b1f | [
"Apache-2.0"
] | null | null | null | ident/cdr/__init__.py | AlexShander/Grafana_Asterisk | 1f8b20e2d3e6960c13d223baf9ce134fca402b1f | [
"Apache-2.0"
] | null | null | null | ident/cdr/__init__.py | AlexShander/Grafana_Asterisk | 1f8b20e2d3e6960c13d223baf9ce134fca402b1f | [
"Apache-2.0"
] | 1 | 2021-04-22T05:47:13.000Z | 2021-04-22T05:47:13.000Z | from cdr.datasets import Cdr
from cdr.tables import QueueLogForExcel
from cdr.tables import CDRViewer
from cdr.cdr import DBCdr
from cdr.config import Config
| 26.333333 | 39 | 0.841772 | 25 | 158 | 5.32 | 0.36 | 0.263158 | 0.195489 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126582 | 158 | 5 | 40 | 31.6 | 0.963768 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
5b9a06e2c0b90300fbf0ed03bad154464673086e | 162 | py | Python | Python/network/server_basic/cgi/test.py | sug5806/TIL | 2309d8a270e4a7b8961268a40b6492c5db317e37 | [
"MIT"
] | null | null | null | Python/network/server_basic/cgi/test.py | sug5806/TIL | 2309d8a270e4a7b8961268a40b6492c5db317e37 | [
"MIT"
] | 102 | 2020-02-12T00:10:33.000Z | 2022-03-11T23:58:41.000Z | Python/network/server_basic/cgi/test.py | sug5806/TIL | 2309d8a270e4a7b8961268a40b6492c5db317e37 | [
"MIT"
] | null | null | null | #! /home/martine/.pyenv/shims/python3
print("Content-type: text/html\n")
print("<html><head><title>CGI Test</title></head><body>CGI Server Testing</body></html>")
| 32.4 | 87 | 0.703704 | 24 | 162 | 4.75 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006494 | 0.049383 | 162 | 4 | 88 | 40.5 | 0.733766 | 0.222222 | 0 | 0 | 0 | 0.5 | 0.830645 | 0.451613 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
5b9e228664153a592cf7ae6000ed9a2e4046f21d | 3,738 | py | Python | tests/Unit/Evolution/Systems/Cce/Actions/ScriObserveInterpolated.py | nilsvu/spectre | 1455b9a8d7e92db8ad600c66f54795c29c3052ee | [
"MIT"
] | 117 | 2017-04-08T22:52:48.000Z | 2022-03-25T07:23:36.000Z | tests/Unit/Evolution/Systems/Cce/Actions/ScriObserveInterpolated.py | GitHimanshuc/spectre | 4de4033ba36547113293fe4dbdd77591485a4aee | [
"MIT"
] | 3,177 | 2017-04-07T21:10:18.000Z | 2022-03-31T23:55:59.000Z | tests/Unit/Evolution/Systems/Cce/Actions/ScriObserveInterpolated.py | geoffrey4444/spectre | 9350d61830b360e2d5b273fdd176dcc841dbefb0 | [
"MIT"
] | 85 | 2017-04-07T19:36:13.000Z | 2022-03-01T10:21:00.000Z | # Distributed under the MIT License.
# See LICENSE.txt for details.
import numpy as np
def compute_News(linear_coefficient, quadratic_coefficient, time,
news_coefficient, _1, _2, _3, _4, _5, _6):
return news_coefficient * (1.0 + linear_coefficient * time +
quadratic_coefficient * time**2)
def compute_EthInertialRetardedTime(linear_coefficient, quadratic_coefficient,
time, _1, _2, _3, _4, _5, _6,
eth_u_coefficient):
return eth_u_coefficient * (1.0 + linear_coefficient * time +
quadratic_coefficient * time**2)
def compute_Du_TimeIntegral_ScriPlus_Psi4(linear_coefficient,
quadratic_coefficient, time, _1, _2,
_3, _4, _5, psi4_coefficient, _6):
return psi4_coefficient * (linear_coefficient +
2.0 * quadratic_coefficient * time)
def compute_ScriPlus_Psi3(linear_coefficient, quadratic_coefficient, time, _1,
psi3_coefficient, _2, _3, _4, psi4_coefficient,
eth_u_coefficient):
time_factor = (1.0 + linear_coefficient * time +
quadratic_coefficient * time**2)
psi4 = psi4_coefficient * (linear_coefficient +
2.0 * quadratic_coefficient * time)
psi3 = psi3_coefficient * time_factor
eth_u = eth_u_coefficient * time_factor
return psi3 + 0.5 * eth_u * psi4
def compute_ScriPlus_Psi2(linear_coefficient, quadratic_coefficient, time, _1,
psi3_coefficient, psi2_coefficient, _2, _3,
psi4_coefficient, eth_u_coefficient):
time_factor = (1.0 + linear_coefficient * time +
quadratic_coefficient * time**2)
psi4 = psi4_coefficient * (linear_coefficient +
2.0 * quadratic_coefficient * time)
psi3 = psi3_coefficient * time_factor
psi2 = psi2_coefficient * time_factor
eth_u = eth_u_coefficient * time_factor
return psi2 + psi3 * eth_u + 0.25 * psi4 * eth_u**2
def compute_ScriPlus_Psi1(linear_coefficient, quadratic_coefficient, time, _1,
psi3_coefficient, psi2_coefficient, psi1_coefficient,
_2, psi4_coefficient, eth_u_coefficient):
time_factor = (1.0 + linear_coefficient * time +
quadratic_coefficient * time**2)
psi4 = psi4_coefficient * (linear_coefficient +
2.0 * quadratic_coefficient * time)
psi3 = psi3_coefficient * time_factor
psi2 = psi2_coefficient * time_factor
psi1 = psi1_coefficient * time_factor
eth_u = eth_u_coefficient * time_factor
return psi1 + 1.5 * psi2 * eth_u + 0.75 * psi3 * eth_u**2 \
+ 0.125 * psi4 * eth_u**3
def compute_ScriPlus_Psi0(linear_coefficient, quadratic_coefficient, time, _1,
psi3_coefficient, psi2_coefficient, psi1_coefficient,
psi0_coefficient, psi4_coefficient,
eth_u_coefficient):
time_factor = (1.0 + linear_coefficient * time +
quadratic_coefficient * time**2)
psi4 = psi4_coefficient * (linear_coefficient +
2.0 * quadratic_coefficient * time)
psi3 = psi3_coefficient * time_factor
psi2 = psi2_coefficient * time_factor
psi1 = psi1_coefficient * time_factor
psi0 = psi0_coefficient * time_factor
eth_u = eth_u_coefficient * time_factor
return psi0 + 2.0 * psi1 * eth_u + 0.75 * psi2 * eth_u**2 \
+ 0.5 * psi3 * eth_u**3 + 0.0625 * psi4 * eth_u**4
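Every compute_* helper above multiplies its coefficients by the same quadratic time factor, 1 + linear*t + quadratic*t**2. A quick self-contained sanity check of that shared factor (the coefficient values are arbitrary examples, not taken from any test fixture):

```python
def time_factor(linear, quadratic, time):
    # Shared polynomial used by all compute_* helpers in this module.
    return 1.0 + linear * time + quadratic * time**2

# With news_coefficient = 3.0, compute_News(0.5, 0.25, 2.0, 3.0, ...) above
# would return 3.0 * time_factor(0.5, 0.25, 2.0) = 3.0 * 3.0 = 9.0.
print(3.0 * time_factor(0.5, 0.25, 2.0))
# prints 9.0
```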
| 45.585366 | 79 | 0.607277 | 411 | 3,738 | 5.128954 | 0.107056 | 0.298861 | 0.204934 | 0.072106 | 0.787951 | 0.768501 | 0.76518 | 0.76518 | 0.73814 | 0.710626 | 0 | 0.059496 | 0.321027 | 3,738 | 81 | 80 | 46.148148 | 0.77108 | 0.016854 | 0 | 0.578125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.109375 | false | 0 | 0.015625 | 0.046875 | 0.234375 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5bcf68e7f214d8ca992b689129451da448ba26de | 14,237 | py | Python | spec/clean/matrix_builder_spec.py | slavad/py-series-clean | 2a7cbd0cec1a46374e49eae79e6a88afb31ca54a | [
"MIT"
] | null | null | null | spec/clean/matrix_builder_spec.py | slavad/py-series-clean | 2a7cbd0cec1a46374e49eae79e6a88afb31ca54a | [
"MIT"
] | null | null | null | spec/clean/matrix_builder_spec.py | slavad/py-series-clean | 2a7cbd0cec1a46374e49eae79e6a88afb31ca54a | [
"MIT"
] | null | null | null | from spec.spec_helper import *
import clean.matrix_builder as mb
with description(mb) as self:
with shared_context('compare actual and expected with precision'):
with it('returns correct vector'):
expect(
self.actual_result
).to(
equal_ndarray(self.expected_result, self.round_precision)
)
with it('does not contain zeroes'):
expect(self.actual_result).to(contain_non_zero_vals(self.round_precision))
with before.all:
self.round_precision = 7
with description('#generate_index_vector'):
with description('argument is odd'):
with before.all:
self.vector_size = 5
self.expected_vector = np.array(
[-2, -1, 0, 1, 2]
).reshape((-1, 1))
with it('generates correct vector'):
expect(
mb.generate_index_vector(self.vector_size)
).to(equal_ndarray(self.expected_vector))
with description('argument is event'):
with before.all:
self.vector_size = 4
with it('raises error'):
expect(
lambda: mb.generate_index_vector(self.vector_size)
).to(raise_error(ValueError, "matrix_size must be odd"))
with description('#run_ft'):
with shared_context('run_ft results checker'):
with before.all:
self.time_grid = np.array(
[0.0, 0.5, 0.6, 0.9, 2.0]
).reshape((-1, 1))
self.number_of_freq_estimations = 2
self.freq_vector = np.array(
[-2.0,0.0,2.0]
).reshape((-1, 1))
with before.each:
#it's ok to round here, since the results are not exactly the same
self.expected_result = self.expected_result*self.norm
self.actual_result = mb.run_ft(
self.time_grid, self.values,
self.freq_vector, self.number_of_freq_estimations,
self.kind
)
with included_context('compare actual and expected with precision'):
pass
with description('kind == "direct"'):
with before.each:
self.kind = 'direct'
self.coeff = -1j*2*np.pi
self.norm = 1.0/self.time_grid.shape[0]
self.values = np.array(
[
-1.0 + 0.1j,
0.9 - 1.4j,
-4.5 + 1.1j,
1.1 - 2.0j,
0.2 + 1.6j
]
).reshape((-1, 1))
self.expected_result = np.array(
# see e.g. eq 148 ref 2
[
np.sum(
np.exp(
self.coeff*self.freq_vector[0][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[1][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[2][0]*self.time_grid
)*self.values
)
]
).reshape((-1, 1))
with included_context('run_ft results checker'):
pass
with description('kind == "inverse"'):
with before.each:
self.kind = 'inverse'
self.coeff = 1j*2*np.pi
self.values = np.array(
[
-1.0 + 0.1j,
0.9 - 1.4j,
-4.5 + 1.1j
]
).reshape((-1, 1))
self.norm = self.time_grid.shape[0]/self.number_of_freq_estimations
self.expected_result = np.array(
# see e.g. eq 161 ref 2
[
np.sum(
np.exp(
self.coeff*self.time_grid[0][0]*self.freq_vector
)*self.values
),
np.sum(
np.exp(
self.coeff*self.time_grid[1][0]*self.freq_vector
)*self.values
),
np.sum(
np.exp(
self.coeff*self.time_grid[2][0]*self.freq_vector
)*self.values
),
np.sum(
np.exp(
self.coeff*self.time_grid[3][0]*self.freq_vector
)*self.values
),
np.sum(
np.exp(
self.coeff*self.time_grid[4][0]*self.freq_vector
)*self.values
)
]
).reshape((-1, 1))
with included_context('run_ft results checker'):
pass
with description('kind == "qwerty"'):
with before.each:
self.time_grid = None
self.values = None
self.number_of_freq_estimations = None
self.freq_vector = None
self.kind = 'qwerty'
self.coeff = None
self.norm = None
self.action = lambda: mb.run_ft(
self.time_grid, self.values,
self.freq_vector, self.number_of_freq_estimations,
self.kind
)
with it('raises error'):
expect(
self.action
).to(raise_error(ValueError, "unknown kind"))
with description('#generate_freq_vector'):
with before.all:
self.index_vector = np.array([-1.0, 0.0, 1.0])
self.max_freq = 2.0
self.number_of_freq_estimations = 3
self.expected_result = self.index_vector*self.max_freq/self.number_of_freq_estimations
with it('generates correct value'):
expect(
self.expected_result
).to(
equal_ndarray(
mb.generate_freq_vector(
self.index_vector, self.max_freq, self.number_of_freq_estimations
)
)
)
with description('#size_of_spectrum_vector'):
with before.all:
self.number_of_freq_estimations = 3
self.expected_result = 2*self.number_of_freq_estimations + 1
with it('returns correct value'):
expect(self.expected_result).to(
equal(mb.size_of_spectrum_vector(self.number_of_freq_estimations))
)
with description('#size_of_window_vector'):
with before.all:
self.number_of_freq_estimations = 3
self.expected_result = 4*self.number_of_freq_estimations + 1
with it('returns correct value'):
expect(self.expected_result).to(
equal(mb.size_of_window_vector(self.number_of_freq_estimations))
)
with description('vector generators'):
with before.all:
self.time_grid = np.array(
[0.0, 0.5, 0.6, 0.9, 2.0]
).reshape((-1, 1))
self.max_freq = 2
self.number_of_freq_estimations = 3
self.coeff = -1j*2*np.pi
self.norm = 1.0/self.time_grid.shape[0]
with description('#calculate_dirty_vector'):
with before.each:
self.values = np.array(
[-1.0, 0.9, -4.5, 1.1, 0.2]
).reshape((-1, 1))
self.index_vector = mb.generate_index_vector(
mb.size_of_spectrum_vector(self.number_of_freq_estimations)
)
self.freq_vector = mb.generate_freq_vector(
self.index_vector,
self.max_freq,
self.number_of_freq_estimations
)
self.expected_result = np.array(
[
np.sum(
np.exp(
self.coeff*self.freq_vector[0][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[1][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[2][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[3][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[4][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[5][0]*self.time_grid
)*self.values
),
np.sum(
np.exp(
self.coeff*self.freq_vector[6][0]*self.time_grid
)*self.values
)
]
).reshape((-1, 1))*self.norm
self.actual_result = mb.calculate_dirty_vector(
self.time_grid,
self.values,
self.number_of_freq_estimations,
self.max_freq
)
with included_context('compare actual and expected with precision'):
pass
with description('#calculate_window_vector'):
with before.each:
self.values = np.array(
[-1.0, 0.9, -4.5, 1.1, 0.2]
).reshape((-1, 1))
self.index_vector = mb.generate_index_vector(
mb.size_of_window_vector(self.number_of_freq_estimations)
)
self.freq_vector = mb.generate_freq_vector(
self.index_vector,
self.max_freq,
self.number_of_freq_estimations
)
self.expected_result = np.array(
[
np.sum(
np.exp(
self.coeff*self.freq_vector[0][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[1][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[2][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[3][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[4][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[5][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[6][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[7][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[8][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[9][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[10][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[11][0]*self.time_grid
)
),
np.sum(
np.exp(
self.coeff*self.freq_vector[12][0]*self.time_grid
)
)
]
).reshape((-1, 1))*self.norm
self.actual_result = mb.calculate_window_vector(
self.time_grid,
self.number_of_freq_estimations,
self.max_freq
)
with included_context('compare actual and expected with precision'):
pass
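The run_ft expectations above spell out the direct discrete transform term by term. The same sum can be written compactly without numpy; a sketch of the eq.-148-style formula the specs reference, using the 1/N normalisation applied in the 'direct' case:

```python
import cmath

def direct_ft(times, values, freq):
    # Direct transform: sum of v * exp(-2*pi*i*f*t), normalised by sample count.
    n = len(times)
    return sum(v * cmath.exp(-2j * cmath.pi * freq * t)
               for t, v in zip(times, values)) / n

# At freq = 0 every exponential is 1, so the transform is just the mean:
print(direct_ft([0.0, 0.5, 1.0], [1.0, 2.0, 3.0], 0.0))
# prints (2+0j)
```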
| 39.768156 | 98 | 0.389267 | 1,304 | 14,237 | 4.062117 | 0.093558 | 0.057391 | 0.086086 | 0.05286 | 0.819143 | 0.745894 | 0.71701 | 0.700396 | 0.675477 | 0.643005 | 0 | 0.03066 | 0.523495 | 14,237 | 357 | 99 | 39.879552 | 0.750147 | 0.007656 | 0 | 0.620178 | 0 | 0 | 0.048637 | 0.009628 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.014837 | 0.005935 | 0 | 0.005935 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5bea04484f24bb5bb1f85e6a4f3e579017d0c734 | 26,747 | py | Python | xrdsst/api/tokens_api.py | nordic-institute/X-Road-Security-Server-toolkit | 1538dbf3d76647f4fb3a72bbe93bf54f414ee9fb | [
"MIT"
] | 7 | 2020-11-01T19:50:11.000Z | 2022-01-18T17:45:19.000Z | xrdsst/api/tokens_api.py | nordic-institute/X-Road-Security-Server-toolkit | 1538dbf3d76647f4fb3a72bbe93bf54f414ee9fb | [
"MIT"
] | 24 | 2020-11-09T08:09:10.000Z | 2021-06-16T07:22:14.000Z | xrdsst/api/tokens_api.py | nordic-institute/X-Road-Security-Server-toolkit | 1538dbf3d76647f4fb3a72bbe93bf54f414ee9fb | [
"MIT"
] | 1 | 2021-04-27T14:39:48.000Z | 2021-04-27T14:39:48.000Z | # coding: utf-8
"""
X-Road Security Server Admin API
X-Road Security Server Admin API. Note that the error metadata responses described in some endpoints are subjects to change and may be updated in upcoming versions. # noqa: E501
OpenAPI spec version: 1.0.31
Contact: info@niis.org
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from xrdsst.api_client.api_client import ApiClient
class TokensApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
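The constructor above falls back to a freshly built `ApiClient` when none is injected. A minimal, self-contained sketch of that default-client pattern (class names here are hypothetical stand-ins, not part of xrdsst):

```python
class DefaultClient:
    """Stand-in for the real ApiClient (hypothetical, for illustration)."""
    pass

class ToyApi:
    # Mirrors TokensApi.__init__: fall back to a fresh default client when
    # none is injected, so callers and tests can pass a stub instead.
    def __init__(self, api_client=None):
        if api_client is None:
            api_client = DefaultClient()
        self.api_client = api_client

shared = DefaultClient()
assert ToyApi().api_client is not shared    # each call builds its own default
assert ToyApi(shared).api_client is shared  # an injected client is kept as-is
```

Using `None` as the sentinel (rather than `api_client=ApiClient()` in the signature) avoids evaluating the default at import time and keeps the dependency injectable.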
def add_key(self, id, **kwargs): # noqa: E501
"""add new key # noqa: E501
<h3>Adds key for selected token.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_key(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param KeyLabel body:
:return: Key
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_key_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.add_key_with_http_info(id, **kwargs) # noqa: E501
return data
def add_key_with_http_info(self, id, **kwargs): # noqa: E501
"""add new key # noqa: E501
<h3>Adds key for selected token.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_key_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param KeyLabel body:
:return: Key
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_key" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `add_key`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ApiKeyAuth'] # noqa: E501
return self.api_client.call_api(
'/tokens/{id}/keys', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Key', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
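Every `*_with_http_info` method above repeats the same keyword-argument guard: build a whitelist, reject anything unexpected, then check the required `id`. A runnable distillation of just that guard (a sketch with a hypothetical method name, not the real client code):

```python
def add_key_sketch(id, **kwargs):
    # Same guard the generated *_with_http_info methods use: reject any
    # keyword not in the whitelist, then fold the rest into a params dict.
    all_params = ['id', 'body', 'async_req', '_request_timeout']
    params = {'id': id}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s' to method add_key" % key)
        params[key] = val
    # verify the required parameter 'id' is set
    if params.get('id') is None:
        raise ValueError("Missing the required parameter `id` when calling `add_key`")
    return params

assert add_key_sketch('token-0', body={'label': 'sign'})['body'] == {'label': 'sign'}
```

The generated code uses `six.iteritems(params['kwargs'])` for Python 2 compatibility; plain `dict.items()` is the Python 3 equivalent.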
def add_key_and_csr(self, id, **kwargs): # noqa: E501
"""add a new key and generate a csr for it # noqa: E501
<h3>Administrator adds a new key and generates a csr for it.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_key_and_csr(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param KeyLabelWithCsrGenerate body:
:return: KeyWithCertificateSigningRequestId
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_key_and_csr_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.add_key_and_csr_with_http_info(id, **kwargs) # noqa: E501
return data
def add_key_and_csr_with_http_info(self, id, **kwargs): # noqa: E501
"""add a new key and generate a csr for it # noqa: E501
<h3>Administrator adds a new key and generates a csr for it.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_key_and_csr_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param KeyLabelWithCsrGenerate body:
:return: KeyWithCertificateSigningRequestId
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_key_and_csr" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `add_key_and_csr`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ApiKeyAuth'] # noqa: E501
return self.api_client.call_api(
'/tokens/{id}/keys-with-csrs', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='KeyWithCertificateSigningRequestId', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_token(self, id, **kwargs): # noqa: E501
"""get security server token information # noqa: E501
<h3>Administrator views the token details of the security server.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_token(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_token_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_token_with_http_info(id, **kwargs) # noqa: E501
return data
def get_token_with_http_info(self, id, **kwargs): # noqa: E501
"""get security server token information # noqa: E501
<h3>Administrator views the token details of the security server.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_token_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_token" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_token`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ApiKeyAuth'] # noqa: E501
return self.api_client.call_api(
'/tokens/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Token', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_tokens(self, **kwargs): # noqa: E501
"""get security server tokens # noqa: E501
<h3>Administrator views tokens of the security server.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_tokens(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[Token]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_tokens_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_tokens_with_http_info(**kwargs) # noqa: E501
return data
def get_tokens_with_http_info(self, **kwargs): # noqa: E501
"""get security server tokens # noqa: E501
<h3>Administrator views tokens of the security server.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_tokens_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[Token]
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_tokens" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ApiKeyAuth'] # noqa: E501
return self.api_client.call_api(
'/tokens', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[Token]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def login_token(self, id, **kwargs): # noqa: E501
"""login to token # noqa: E501
<h3>Administrator logs in to a token</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.login_token(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param TokenPassword body:
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.login_token_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.login_token_with_http_info(id, **kwargs) # noqa: E501
return data
def login_token_with_http_info(self, id, **kwargs): # noqa: E501
"""login to token # noqa: E501
<h3>Administrator logs in to a token</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.login_token_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param TokenPassword body:
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method login_token" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `login_token`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ApiKeyAuth'] # noqa: E501
return self.api_client.call_api(
'/tokens/{id}/login', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Token', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def logout_token(self, id, **kwargs): # noqa: E501
"""logout from token # noqa: E501
<h3>Administrator logs out from token.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.logout_token(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.logout_token_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.logout_token_with_http_info(id, **kwargs) # noqa: E501
return data
def logout_token_with_http_info(self, id, **kwargs): # noqa: E501
"""logout from token # noqa: E501
<h3>Administrator logs out from token.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.logout_token_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method logout_token" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `logout_token`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ApiKeyAuth'] # noqa: E501
return self.api_client.call_api(
'/tokens/{id}/logout', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Token', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_token(self, id, **kwargs): # noqa: E501
"""update security server token information # noqa: E501
<h3>Administrator updates the token information.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_token(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param TokenName body:
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_token_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.update_token_with_http_info(id, **kwargs) # noqa: E501
return data
def update_token_with_http_info(self, id, **kwargs): # noqa: E501
"""update security server token information # noqa: E501
<h3>Administrator updates the token information.</h3> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_token_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of the token (required)
:param TokenName body:
:return: Token
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_token" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_token`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ApiKeyAuth'] # noqa: E501
return self.api_client.call_api(
'/tokens/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Token', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
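Each public method of `TokensApi` delegates to a `*_with_http_info` twin: with `async_req=True` the raw result (a thread in the real client) is returned, otherwise only the unwrapped data. A toy model of that sync/async dispatch shape (all names hypothetical; the real twin performs the HTTP call via `api_client.call_api`):

```python
class MiniApi:
    # Same dispatch shape as add_key/get_tokens above: the public method
    # forwards to a *_with_http_info twin; async callers get the raw
    # result back untouched, sync callers get the data only.
    def get_thing(self, id, **kwargs):
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_thing_with_http_info(id, **kwargs)
        (data) = self.get_thing_with_http_info(id, **kwargs)
        return data

    def get_thing_with_http_info(self, id, **kwargs):
        # Stub: the real method would issue the HTTP request here.
        return {'id': id, 'async': bool(kwargs.get('async_req'))}

api = MiniApi()
assert api.get_thing('t1') == {'id': 't1', 'async': False}
assert api.get_thing('t1', async_req=True)['async'] is True
```

In the real client the async branch returns a thread, so callers retrieve the response with `thread.get()`, as the docstrings show.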
| 36.994467 | 182 | 0.590459 | 3,128 | 26,747 | 4.823529 | 0.06298 | 0.055673 | 0.025981 | 0.033404 | 0.955329 | 0.951418 | 0.945851 | 0.936638 | 0.936638 | 0.925835 | 0 | 0.019261 | 0.314802 | 26,747 | 722 | 183 | 37.045706 | 0.804005 | 0.335664 | 0 | 0.822917 | 0 | 0 | 0.162515 | 0.032171 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039063 | false | 0 | 0.010417 | 0 | 0.106771 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f3a2b17394a7a897bc7782000f153877145e6441 | 44,467 | py | Python | core/models.py | EzekielCarvalho/costas_commerce_store | d7f00ffd5d5c7bbd0c1dcec9adaa4ddd70d19ea8 | [
"MIT"
] | null | null | null | core/models.py | EzekielCarvalho/costas_commerce_store | d7f00ffd5d5c7bbd0c1dcec9adaa4ddd70d19ea8 | [
"MIT"
] | null | null | null | core/models.py | EzekielCarvalho/costas_commerce_store | d7f00ffd5d5c7bbd0c1dcec9adaa4ddd70d19ea8 | [
"MIT"
] | null | null | null |
# Notes: A model is the single, definitive source of information about your data. It contains the essential fields and behaviors of the data you’re storing. Generally, each model maps to a single database table.
# ref https://docs.djangoproject.com/en/3.2/topics/db/models/
from django.db.models.signals import post_save
from django.conf import settings #This is necessary since we want to make use of our auth user model. This is from Django settings. This isn’t a module – it’s an object. So importing individual settings is not possible. It abstracts the concepts of default settings and site-specific settings and presents a single interface. It also separates (decouples) the code that uses settings from the location of your settings. (ref: https://docs.djangoproject.com/en/2.2/topics/settings/#using-settings-in-python-code)
from django.db import models
from cloudinary.models import CloudinaryField
from django.db.models import Sum
from django.urls import reverse
from django_countries.fields import CountryField
from django.db.utils import OperationalError, ProgrammingError
# CATEGORY_CHOICES = ( # This is a tuple (Python tuples are a data structure that store an ordered sequence of values. Tuples are immutable. This means you cannot change the values in a tuple. Tuples are defined with parenthesis.)
# ('C', 'Compact'), # First entry goes to the database, second entry is displayed on the page.
# ('BR', 'Bridge'), # These are for choices below which are A sequence consisting itself of iterables of exactly two items (e.g. [(A, B), (A, B) ...]) to use as choices for this field. If choices are given, they’re enforced by model validation and the default form widget will be a select box with these choices instead of the standard text field. (Ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#choices))
# ('DR', 'DSLR'),
# ('MR', 'Mirrorless cameras')
# )
LABEL_CHOICES = ( # This is a tuple (Python tuples are a data structure that store an ordered sequence of values. Tuples are immutable. This means you cannot change the values in a tuple. Tuples are defined with parenthesis.)
    ('P', 'primary'), # First entry goes to the database, second entry is displayed on the page.
('W', 'warning'), # These are for choices below which are A sequence consisting itself of iterables of exactly two items (e.g. [(A, B), (A, B) ...]) to use as choices for this field. If choices are given, they’re enforced by model validation and the default form widget will be a select box with these choices instead of the standard text field. (Ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#choices))
('D', 'danger')
)
ADDRESS_CHOICES = ( # This is a tuple (Python tuples are a data structure that store an ordered sequence of values. Tuples are immutable. This means you cannot change the values in a tuple. Tuples are defined with parenthesis.)
    ('B', 'Billing'), # First entry goes to the database, second entry is displayed on the page.
('S', 'Shipping'), # These are for choices below which are A sequence consisting itself of iterables of exactly two items (e.g. [(A, B), (A, B) ...]) to use as choices for this field. If choices are given, they’re enforced by model validation and the default form widget will be a select box with these choices instead of the standard text field. (Ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#choices))
)
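Each choices tuple pairs the value stored in the database column with the label rendered in forms and the admin. A plain-Python illustration of that mapping (Django exposes the same lookup on model instances via `get_<field>_display()`):

```python
ADDRESS_CHOICES = (
    ('B', 'Billing'),
    ('S', 'Shipping'),
)

# Django stores the first element ('B' or 'S') in the column and shows
# the second in forms; dict() makes the code-to-label mapping explicit.
labels = dict(ADDRESS_CHOICES)
assert labels['B'] == 'Billing'
assert labels['S'] == 'Shipping'
```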
class Category(models.Model): # This is the way of connecting the user with his/ her credit card details
    name = models.CharField(max_length=255) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: class CharField(max_length=None, **options))
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return self.name
def get_absolute_url(self): # Define a get_absolute_url() method to tell Django how to calculate the canonical (absolute, recognized) URL for an object. The reverse() function is usually the best approach to be used with get_absolute. One place Django uses get_absolute_url() is in the admin app. If it makes sense for your model’s instances to each have a unique URL, you should define get_absolute_url(). It’s good practice to use get_absolute_url() in templates, instead of hard-coding your objects’ URLs. The logic here is that if you change the URL structure of your objects, even for something small like correcting a spelling error, you don’t want to have to track down every place that the URL might be created. Specify it once, in get_absolute_url() and have all your other code call that one place.
return reverse('core:home-page') # "SLUG" from line 26 models.py. "self.slug" is as per the format. "core" from urls.py from line 11 and product-page from line 18. The reverse() function can reverse a large variety of regular expression patterns for URLs, but not every possible one. kwargs allows you to handle named arguments that you have not defined in advance. ref to for format: https://docs.djangoproject.com/en/3.2/ref/models/instances/#get-absolute-url
try:
CHOICES = Category.objects.all().values_list('name','name') # This is going to grab all the entries made via admin to the Category model
CATEGORY_CHOICES = [] # creates a dictionary
for item in CHOICES: # For each item that is present in the CHOICES results, append or add each of them to the CATEGORY_CHOICES dictionary.
CATEGORY_CHOICES.append(item)
except (OperationalError, ProgrammingError): # This is to avoid Programming errors from arising while deploying onto a new domain and host. This has been used to avoid errors
pass
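The try/except above reads the choices from the database once, at import time, and swallows `OperationalError`/`ProgrammingError` so a fresh deploy (where the `core_category` table does not exist yet) can still import the module. A self-contained sketch of that pattern, with a callable standing in for the ORM query and `RuntimeError` standing in for the database errors:

```python
def load_category_choices(fetch):
    # Mirrors the try/except above: choices are read once at import time,
    # and database errors (e.g. migrations not yet run on a new host)
    # fall back to an empty list instead of crashing the import.
    choices = []
    try:
        for item in fetch():
            choices.append(item)
    except RuntimeError:  # stand-in for OperationalError/ProgrammingError
        pass
    return choices

assert load_category_choices(lambda: [('DSLR', 'DSLR')]) == [('DSLR', 'DSLR')]

def broken():
    raise RuntimeError('no such table: core_category')

assert load_category_choices(broken) == []
```

One caveat of this approach: categories added after startup are not reflected until the process is restarted, since the list is frozen at import time.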
class UserProfile(models.Model): # This is the way of connecting the user with his/ her credit card details
user = models.OneToOneField(settings.AUTH_USER_MODEL, on_delete=models.CASCADE) # We chose one to one field because each credit card info is associated with one user at a time rather than one card for many people, which would not be good. A one-to-one relationship. Conceptually, this is similar to a ForeignKey with unique=True, but the “reverse” side of the relation will directly return a single object. ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.OneToOneField # ref https://thetldr.tech/what-is-the-difference-between-blank-and-null-in-django/ . null=True would tell the underlying database that this field is allowed to save null. blank=True is applicable in the Django forms layer, i.e. any user is allowed to keep empty this field in Django form or Admin page. blank value is stored in the database.For the price.) # This is to associate the order with the user. Note: ForeignKey is a Django ORM (object relational mapping) field-to-column mapping for creating and working with relationships between tables in relational databases. Django has a powerful, built-in user authentication system that makes it quick and easy to add login, logout, and signup functionality to a website. The AUTH_USER_MODEL is a recommended approach for referencing a user in a models.py file (ref: https://learndjango.com/tutorials/django-best-practices-referencing-user-model).
# for the cascade feature, The on_delete method is used to tell Django what to do with model instances (examples, cases) that depend on the model instance you delete. (e.g. a ForeignKey relationship). The on_delete=models. CASCADE tells Django to cascade (pour, flood) the deleting effect i.e. continue deleting the dependent models as well. (Ref https://stackoverflow.com/questions/38388423/what-does-on-delete-do-on-django-models)
    stripe_customer_id = models.CharField(max_length=50, blank=True, null=True) # The stripe customer id will get populated if the user decides to save their customer information when they check out, i.e. if they click the checkbox to save for future purchases on the payment page # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: class CharField(max_length=None, **options))
one_click_purchasing = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return self.user.username # returns the username as the string representation
class Item(models.Model): # This is going to be displayed in the site on the page where the products are displayed for users to purchase. Once it is added to the cart, it becomes an "OrderItem" (next class)
try:
        title = models.CharField(max_length=100) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: class CharField(max_length=None, **options))
price = models.FloatField(blank=True, null=True) # For the price. The FloatField class is sometimes mixed up with the DecimalField class. Although they both represent real numbers, they represent those numbers differently. FloatField uses Python’s float type internally, while DecimalField uses Python’s Decimal type. A floating-point number represented in Python by a float instance. ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.FloatField
discount_price = models.FloatField(blank=True, null=True) # ref https://thetldr.tech/what-is-the-difference-between-blank-and-null-in-django/ . null=True would tell the underlying database that this field is allowed to save null. blank=True is applicable in the Django forms layer, i.e. any user is allowed to keep empty this field in Django form or Admin page. blank value is stored in the database.For the price. The FloatField class is sometimes mixed up with the DecimalField class. Although they both represent real numbers, they represent those numbers differently. FloatField uses Python’s float type internally, while DecimalField uses Python’s Decimal type. A floating-point number represented in Python by a float instance. ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.FloatField
category = models.CharField(choices=CATEGORY_CHOICES, max_length=255, default='uncategorized') # Choices are A sequence consisting itself of iterables of exactly two items (e.g. [(A, B), (A, B) ...]) to use as choices for this field. If choices are given, they’re enforced by model validation and the default form widget will be a select box with these choices instead of the standard text field. (Ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#choices)
label = models.CharField(choices=LABEL_CHOICES, max_length=1)
slug = models.SlugField() # A Slug is basically a short label for something, containing only letters, numbers, underscores or hyphens. They’re generally used in URLs. SlugField in Django is like a CharField, where you can specify max_length attribute also. If max_length is not specified, Django will use a default length of 50. It also implies setting Field.db_index to True.It is often useful to automatically prepopulate a SlugField based on the value of some other value.It uses validate_slug or validate_unicode_slug for validation. ref: https://www.geeksforgeeks.org/slugfield-django-models/
description = models.CharField(max_length=5000) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
additional_description = models.CharField(max_length=10000) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
image = CloudinaryField('image') # For uploading images to cloudinary instead of using Heroku to host images, which is a bad idea.
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return self.title
def get_absolute_url(self): # Define a get_absolute_url() method to tell Django how to calculate the canonical (absolute, recognized) URL for an object. The reverse() function is usually the best approach to be used with get_absolute. One place Django uses get_absolute_url() is in the admin app. If it makes sense for your model’s instances to each have a unique URL, you should define get_absolute_url(). It’s good practice to use get_absolute_url() in templates, instead of hard-coding your objects’ URLs. The logic here is that if you change the URL structure of your objects, even for something small like correcting a spelling error, you don’t want to have to track down every place that the URL might be created. Specify it once, in get_absolute_url() and have all your other code call that one place.
return reverse('core:product-page', kwargs={'slug': self.slug}) # "SLUG" from line 26 models.py. "self.slug" is as per the format. "core" from urls.py from line 11 and product-page from line 18. The reverse() function can reverse a large variety of regular expression patterns for URLs, but not every possible one. kwargs allows you to handle named arguments that you have not defined in advance. ref to for format: https://docs.djangoproject.com/en/3.2/ref/models/instances/#get-absolute-url
def get_addition_to_cart_url(self): # This function was created mainly because to help with the add to cart feature
return reverse('core:add-to-cart', kwargs={'slug': self.slug}) # "SLUG" from line 26 models.py. "self.slug" is as per the format. "core" from urls.py from line 11 and add-to-cart is from line 20. The reverse() function can reverse a large variety of regular expression patterns for URLs, but not every possible one. kwargs allows you to handle named arguments that you have not defined in advance. ref to for format: https://docs.djangoproject.com/en/3.2/ref/models/instances/#get-absolute-url
def get_remove_item_from_cart(self): # This function was created mainly because to help with the add to cart feature
return reverse('core:remove-from-cart', kwargs={'slug': self.slug}) # "SLUG" from line 26 models.py. "self.slug" is as per the format. "core" from urls.py from line 11 and add-to-cart is from line 20. The reverse() function can reverse a large variety of regular expression patterns for URLs, but not every possible one. kwargs allows you to handle named arguments that you have not defined in advance. ref to for format: https://docs.djangoproject.com/en/3.2/ref/models/instances/#get-absolute-url
except (OperationalError, ProgrammingError):
pass
class OrderItem(models.Model): # This is the way of connecting the item with the shopping cart (Order)
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE) # ref https://thetldr.tech/what-is-the-difference-between-blank-and-null-in-django/ . null=True would tell the underlying database that this field is allowed to save null. blank=True is applicable in the Django forms layer, i.e. any user is allowed to keep empty this field in Django form or Admin page. blank value is stored in the database.For the price.) # This is to associate the order with the user. Note: ForeignKey is a Django ORM (object relational mapping) field-to-column mapping for creating and working with relationships between tables in relational databases. Django has a powerful, built-in user authentication system that makes it quick and easy to add login, logout, and signup functionality to a website. The AUTH_USER_MODEL is a recommended approach for referencing a user in a models.py file (ref: https://learndjango.com/tutorials/django-best-practices-referencing-user-model).
# for the cascade feature, The on_delete method is used to tell Django what to do with model instances (examples, cases) that depend on the model instance you delete. (e.g. a ForeignKey relationship). The on_delete=models. CASCADE tells Django to cascade (pour, flood) the deleting effect i.e. continue deleting the dependent models as well. (Ref https://stackoverflow.com/questions/38388423/what-does-on-delete-do-on-django-models)
ordered = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
item = models.ForeignKey(Item, on_delete=models.CASCADE) # We connect the Item from the previous class to the OrderItem class. # for the cascade feature, The on_delete method is used to tell Django what to do with model instances (examples, cases) that depend on the model instance you delete. (e.g. a ForeignKey relationship). The on_delete=models. CASCADE tells Django to cascade (pour, flood) the deleting effect i.e. continue deleting the dependent models as well. (Ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.ForeignKey)
quantity = models.IntegerField(default=1) # IntegerField is a integer number represented in Python by a int instance. This field is generally used to store integer numbers in the database. The default form widget for this field is a NumberInput when localize is False or TextInput otherwise. ref https://www.geeksforgeeks.org/integerfield-django-models/ and https://docs.djangoproject.com/en/3.2/ref/models/fields/
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return f"{self.quantity} of {self.item.title}" # quantity refers to quantity in OrderItem class of the self.item.title (which is of the OderItem class, which connects to the Item class which has title)
def get_total_item_price(self): # This is a method (A function) made to calculate the price based on the quantity of items in the cart. It multiples the quantity with the price of the product. item is form the OrderedItem class and price is from the Item class
return self.quantity * self.item.price
def get_total_discount_item_price(self):
return self.quantity * self.item.discount_price # This is a method (A function) made to calculate the discounted price based on the quantity of items in the cart. It multiples the quantity with the discounted price of the product
def get_amount_saved(self): #This method calculates how much the person saves
return self.get_total_item_price() - self.get_total_discount_item_price() # The total price minus the discounted price gives us how much you would save
def get_final_amount(self): # This is a function made to get the final price, this is made so that we don't have to keep repeating the "inf there is a discounted price" logic as in line 49 order_summary html
if self.item.discount_price: # If the discounted price exists
return self.get_total_discount_item_price() # Show the discounted price
return self.get_total_item_price() # Esle show the original unaltered price
class Order(models.Model): # For the purpose of connecting all the order items to the order. The order is our shopping cart.
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE) # This is to associate the order with the user. Note: ForeignKey is a Django ORM (object relational mapping) field-to-column mapping for creating and working with relationships between tables in relational databases. Django has a powerful, built-in user authentication system that makes it quick and easy to add login, logout, and signup functionality to a website. The AUTH_USER_MODEL is a recommended approach for referencing a user in a models.py file (ref: https://learndjango.com/tutorials/django-best-practices-referencing-user-model).
# for the cascade feature, The on_delete method is used to tell Django what to do with model instances (examples, cases) that depend on the model instance you delete. (e.g. a ForeignKey relationship). The on_delete=models. CASCADE tells Django to cascade (pour, flood) the deleting effect i.e. continue deleting the dependent models as well. (Ref https://stackoverflow.com/questions/38388423/what-does-on-delete-do-on-django-models)
reference_code = models.CharField(max_length=20, blank=True, null=True) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
items = models.ManyToManyField(OrderItem) # The example used in the Django docs is of a Group, Person, and Membership relationship. A group can have many people as members, and a person can be part of many groups, so the Group model has a ManyToManyField that points to Person . ref: https://docs.djangoproject.com/en/3.2/topics/db/examples/many_to_many/ https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.ManyToManyField
start_date = models.DateTimeField(auto_now_add=True) # Moment the order was created. A date and time, represented in Python by a datetime.datetime instance. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#datetimefield
ordered_date = models.DateTimeField() # A date and time, represented in Python by a datetime.datetime instance. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#datetimefield . Nothing has been added in the field since we intend to manually set the value the moment it is ordered
ordered = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
billing_address = models.ForeignKey (
'Address', related_name='billing_address', on_delete=models.SET_NULL, blank=True, null=True) # Because the Foreign Key is on the same model, we had to use "Related_name". The related_name attribute specifies the name of the reverse relation from the User model back to your model. If you don't specify a related_name, Django automatically creates one using the name of your model with the suffix _set, for instance User.map_set.all(). ref https://stackoverflow.com/questions/2642613/what-is-related-name-used-for
shipping_address = models.ForeignKey (
'Address', related_name='shipping_address', on_delete=models.SET_NULL, blank=True, null=True) # Because the Foreign Key is on the same model, we had to use "Related_name". The related_name attribute specifies the name of the reverse relation from the User model back to your model. If you don't specify a related_name, Django automatically creates one using the name of your model with the suffix _set, for instance User.map_set.all(). ref https://stackoverflow.com/questions/2642613/what-is-related-name-used-for
payment = models.ForeignKey (
'Payment', on_delete=models.SET_NULL, blank=True, null=True)
coupon = models.ForeignKey (
'Coupon', on_delete=models.SET_NULL, blank=True, null=True)
being_delivered = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
received = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
refund_requested = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
refund_granted = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
# The process for our ecommerce site involves:
# * adding an item to cart
# * adding a billing address (possibilities of a failed checkout)
# * paying (pre processing, processing, packaging, etc)
# * item being delivered
# * item received
# * refunds
# ForeignKey is A many-to-one relationship (in this case the billing address is one and the people who use it are many). Requires two positional arguments: the class to which the model is related and the on_delete option.
# BillingAddress line 109 models.py and payment line 125, coupon is line 143
# To create a recursive relationship – an object that has a many-to-one relationship with itself – use models.ForeignKey('self', on_delete=models.CASCADE).
# REF https://docs.djangoproject.com/en/3.2/ref/models/fields/#foreignkey
# you can use null=True and on_delete=models.SET_NULL to implement a simple kind of soft deletion.
# ref https://stackoverflow.com/questions/8609192/what-is-the-difference-between-null-true-and-blank-true-in-django
# null=True would tell the underlying database that this field is allowed to save null.
# blank=True is applicable in the Django forms layer, i.e. any user is allowed to keep empty this field in Django form or Admin page. blank value is stored in the database.
# ref https://thetldr.tech/what-is-the-difference-between-blank-and-null-in-django/
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return self.user.username # returns the username as the string representation
def get_total(self): # This is a function made to get the total of all the items in the order
total = 0 # total by default set to 0
for order_item in self.items.all(): # items is from the Order class above in models.py. For every order_item (randomly chosen name) in the items attribute in the Order class
total += order_item.get_final_amount() # add total (which is = 0) to each order item getting the final amount from the OrderItem class, so you get the final price (same as total = total + order_item.get_final_price)
if self.coupon: # If there is a coupon submitted from the user, then minus the amount of that coupon, from the total that we obtained from the previous step
total -= self.coupon.amount # total = total - self.coupon.amount so the total has to be minused with the coupon's amount. "coupon" is definied some lines above as being connected ot the Coupon class below, which also has the "amount" attributed. This is the amount that the coupon is worth. This amount is minused with the total
return total
class Address (models.Model): # This is our billing address
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE) # This is to associate the order with the user. Note: ForeignKey is a Django ORM (object relational mapping) field-to-column mapping for creating and working with relationships between tables in relational databases. Django has a powerful, built-in user authentication system that makes it quick and easy to add login, logout, and signup functionality to a website. The AUTH_USER_MODEL is a recommended approach for referencing a user in a models.py file (ref: https://learndjango.com/tutorials/django-best-practices-referencing-user-model).
# for the cascade feature, The on_delete method is used to tell Django what to do with model instances (examples, cases) that depend on the model instance you delete. (e.g. a ForeignKey relationship). The on_delete=models. CASCADE tells Django to cascade (pour, flood) the deleting effect i.e. continue deleting the dependent models as well. (Ref https://stackoverflow.com/questions/38388423/what-does-on-delete-do-on-django-models)
first_name = models.CharField(max_length=100) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
last_name = models.CharField(max_length=100) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
username = models.CharField(max_length=100) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
street_address = models.CharField(max_length=100) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
apartment_address = models.CharField(max_length=100) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
country = CountryField(multiple=False) # From multi-choice ref https://github.com/SmileyChris/django-countries
zip = models.CharField(max_length=100) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
address_type = models.CharField(max_length=1, choices=ADDRESS_CHOICES) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options)) Choices are A sequence consisting itself of iterables of exactly two items (e.g. [(A, B), (A, B) ...]) to use as choices for this field. If choices are given, they’re enforced by model validation and the default form widget will be a select box with these choices instead of the standard text field. (Ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#choices)
default = models.BooleanField(default=False) # Everytime you tell to use an address as default, you will grab (Via other code) the address you created and set default to be true. A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return self.user.username # returns the username as the string representation
class Meta: # Give your model metadata by using an inner class Meta ref https://docs.djangoproject.com/en/3.2/topics/db/models/#meta-options
verbose_name_plural = 'Addresses' # A human-readable name for the object, singular. ref https://docs.djangoproject.com/en/3.2/ref/models/options/#verbose-name Model metadata is “anything that’s not a field”, such as ordering options (ordering), database table name (db_table), or human-readable singular and plural names (verbose_name and verbose_name_plural). None are required, and adding class Meta to a model is completely optional.
# We need to keep track of the stripe payments. Right now thus far we've not been doing that, so we create the below class
class Payment(models.Model): # This is to keep track of stripe payments
stripe_charge_id = models.CharField(max_length=50) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.SET_NULL, blank=True, null=True) # This is to associate the order with the user. Note: ForeignKey is a Django ORM (object relational mapping) field-to-column mapping for creating and working with relationships between tables in relational databases. Django has a powerful, built-in user authentication system that makes it quick and easy to add login, logout, and signup functionality to a website. The AUTH_USER_MODEL is a recommended approach for referencing a user in a models.py file (ref: https://learndjango.com/tutorials/django-best-practices-referencing-user-model).
# ForeignKey is A many-to-one relationship (in this case the billing address is one and the people who use it are many). Requires two positional arguments: the class to which the model is related and the on_delete option. # This is to associate the order with the user. Note: ForeignKey is a Django ORM (object relational mapping) field-to-column mapping for creating and working with relationships between tables in relational databases. Django has a powerful, built-in user authentication system that makes it quick and easy to add login, logout, and signup functionality to a website. The AUTH_USER_MODEL is a recommended approach for referencing a user in a models.py file (ref: https://learndjango.com/tutorials/django-best-practices-referencing-user-model).
# for the cascade feature, The on_delete method is used to tell Django what to do with model instances (examples, cases) that depend on the model instance you delete. (e.g. a ForeignKey relationship). The on_delete=models. CASCADE tells Django to cascade (pour, flood) the deleting effect i.e. continue deleting the dependent models as well. (Ref https://stackoverflow.com/questions/38388423/what-does-on-delete-do-on-django-models)
# you can use null=True and on_delete=models.SET_NULL to implement a simple kind of soft deletion.
# ref https://stackoverflow.com/questions/8609192/what-is-the-difference-between-null-true-and-blank-true-in-django
# null=True would tell the underlying database that this field is allowed to save null.
# blank=True is applicable in the Django forms layer, i.e. any user is allowed to keep empty this field in Django form or Admin page. blank value is stored in the database.
# ref https://thetldr.tech/what-is-the-difference-between-blank-and-null-in-django/
amount = models.FloatField() # A floating-point number represented in Python by a float instance. ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.FloatField
timestamp = models.DateTimeField(auto_now_add=True) # A date and time, represented in Python by a datetime.datetime instance. Takes the same extra arguments as DateField. ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#datetimefield
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return self.user.username # returns the username as the string representation
class Coupon(models.Model): # This is for the coupon feature. A model is the single, definitive source of information about your data. It contains the essential fields and behaviors of the data you’re storing. Generally, each model maps to a single database table. ref https://docs.djangoproject.com/en/3.2/topics/db/models/
code = models.CharField(max_length=15) # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
amount = models.FloatField() # This is the mount of money worth for the coupon code. A floating-point number represented in Python by a float instance. ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.FloatField
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return self.code # returns the code variable as the string representation
class Refund(models.Model): # This is our refund class
order = models.ForeignKey(Order, on_delete=models.CASCADE) # This is to associate the order with the Order class. Note: ForeignKey is a Django ORM (object relational mapping) field-to-column mapping for creating and working with relationships between tables in relational databases. Django has a powerful, built-in user authentication system that makes it quick and easy to add login, logout, and signup functionality to a website. The AUTH_USER_MODEL is a recommended approach for referencing a user in a models.py file (ref: https://learndjango.com/tutorials/django-best-practices-referencing-user-model).
# for the cascade feature, The on_delete method is used to tell Django what to do with model instances (examples, cases) that depend on the model instance you delete. (e.g. a ForeignKey relationship). The on_delete=models. CASCADE tells Django to cascade (pour, flood) the deleting effect i.e. continue deleting the dependent models as well. (Ref https://stackoverflow.com/questions/38388423/what-does-on-delete-do-on-django-models)
reason = models.TextField() # This is A string field, for small- to large-sized strings. (ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/) (syntax: ass CharField(max_length=None, **options))
accepted = models.BooleanField(default=False) # A true/false field. ref: https://docs.djangoproject.com/en/3.2/ref/models/fields/#booleanfield
email = models.EmailField() # A CharField that checks that the value is a valid email address using EmailValidator. ref https://docs.djangoproject.com/en/3.2/ref/models/fields/#emailfield
def __str__(self): # in every model you should define the standard Python class method __str__() to return a human-readable string for each object. This string is used to represent individual records in the administration site (and anywhere else you need to refer to a model instance). Often this will return a title or name field from the model. (ref: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Models)
return f"{self.pk}" # this returns the primary key but it has to be in f strings because it's an ID not a string
#Ideally, every time a user model instance is created, a corresponding user profile instance must be created as well. This is usually done using signals. ref https://www.oreilly.com/library/view/django-design-patterns/9781788831345/b2ecd556-abe5-47a1-8276-4e18da9402f5.xhtml
# Signals are used to perform any action on modification of a model instance. The signals are utilities that help us to connect events with actions. We can develop a function that will run when a signal calls it. In other words, Signals are used to perform some action on modification/creation of a particular entry in Database. For example, One would want to create a profile instance, as soon as a new user instance is created in Database ref https://www.geeksforgeeks.org/how-to-create-and-use-signals-in-django/
# ref https://docs.djangoproject.com/en/3.2/ref/signals/#post-save
def userprofile_receiver(sender, instance, created, *args, **kwargs): # mostly as per format
if created: # as per format. If that user is created then proceed with the next line
userprofile = UserProfile.objects.create(user=instance) # we use the post_save signal to create a user profile no sooner a user is created
post_save.connect(userprofile_receiver, sender=settings.AUTH_USER_MODEL) # we specify the sender as the Auth user model. pre_save/post_save: This signal works before/after the method save(). ref https://www.geeksforgeeks.org/how-to-create-and-use-signals-in-django/
# So what is happening is when the UserProfile model is saved, a signal is fired called userprofile_receiver which creates a useerprofile receiver instance with a foreign key pointing to the instance of the user.
# receiver – The function who receives the signal and does something.
# sender – Sends the signal
# created — Checks whether the model is created or not
# instance — created model instance
# **kwargs –wildcard keyword arguments | 182.241803 | 1,418 | 0.725234 | 6,779 | 44,467 | 4.712494 | 0.10621 | 0.021286 | 0.037188 | 0.042259 | 0.749546 | 0.731923 | 0.719026 | 0.716647 | 0.715332 | 0.707976 | 0 | 0.008544 | 0.199811 | 44,467 | 244 | 1,419 | 182.241803 | 0.889045 | 0.794679 | 0 | 0.2 | 0 | 0 | 0.028898 | 0.002352 | 0 | 0 | 0 | 0 | 0 | 1 | 0.135714 | false | 0.014286 | 0.057143 | 0.114286 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 9 |
# tests/system/action/poll/test_vote.py
# From ostcar/openslides-backend @ e6ceac497c37a1e3e7f408c6cfb29cf21d985b4c (MIT license)
import json
from typing import Any, Dict

import requests

from openslides_backend.models.models import Poll
from tests.system.action.base import BaseActionTestCase
from tests.system.util import convert_to_test_response
from tests.util import Response


class BaseVoteTestCase(BaseActionTestCase):
    def request(
        self,
        action: str,
        data: Dict[str, Any],
        anonymous: bool = False,
        start_poll_before_vote: bool = True,
        stop_poll_after_vote: bool = True,
    ) -> Response:
        """Overwrite request method to reroute voting requests to the vote service."""
        if action == "poll.vote":
            if start_poll_before_vote:
                self.vote_service.start(data["id"])
            response = self.vote_service.vote(data)
            if stop_poll_after_vote:
                self.execute_action_internally("poll.stop", {"id": data["id"]})
            return response
        else:
            return super().request(action, data, anonymous)

    def anonymous_vote(self, payload: Dict[str, Any], id: int = 1) -> Response:
        # make the request manually to prevent sending of cookie & header
        payload_json = json.dumps(payload, separators=(",", ":"))
        response = requests.post(
            url=self.vote_service.url.replace("internal", "system") + f"?id={id}",
            data=payload_json,
            headers={
                "Content-Type": "application/json",
            },
        )
        return convert_to_test_response(response)
class PollVoteTest(BaseVoteTestCase):
def setUp(self) -> None:
super().setUp()
self.create_model(
"meeting/113",
{"is_active_in_organization_id": 1},
)
def test_vote_correct_pollmethod_Y(self) -> None:
user_id = self.create_user("test2")
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1, user_id]},
"option/11": {"meeting_id": 113, "poll_id": 1},
f"user/{user_id}": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
"vote_weight_$113": "2.000000",
"vote_weight_$": ["113"],
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
"poll/1": {
"title": "my test poll",
"option_ids": [11],
"pollmethod": "Y",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
"min_votes_amount": 1,
"max_votes_amount": 10,
},
"meeting/113": {"users_enable_vote_weight": True},
}
)
response = self.request(
"poll.vote", {"id": 1, "value": {"11": 1}}, stop_poll_after_vote=False
)
self.assert_status_code(response, 200)
self.login(user_id)
response = self.request(
"poll.vote", {"id": 1, "value": {"11": 1}}, start_poll_before_vote=False
)
self.assert_status_code(response, 200)
vote = self.get_model("vote/1")
assert vote.get("value") == "Y"
assert vote.get("option_id") == 11
assert vote.get("weight") == "1.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 1
user = self.get_model("user/1")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [1]
vote = self.get_model("vote/2")
assert vote.get("value") == "Y"
assert vote.get("option_id") == 11
assert vote.get("weight") == "2.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 2
option = self.get_model("option/11")
assert option.get("vote_ids") == [1, 2]
assert option.get("yes") == "3.000000"
assert option.get("no") == "0.000000"
assert option.get("abstain") == "0.000000"
user = self.get_model("user/2")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [2]
def test_value_check(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"option/12": {"meeting_id": 113, "poll_id": 1},
"option/13": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"title": "my test poll",
"option_ids": [11, 12, 13],
"pollmethod": "YN",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"11": "Y", "12": "N", "13": "A"},
},
)
self.assert_status_code(response, 400)
assert (
"Data for option 13 does not fit the poll method."
in response.json["message"]
)
def test_vote_correct_pollmethod_YN(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"option/12": {"meeting_id": 113, "poll_id": 1},
"option/13": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"title": "my test poll",
"option_ids": [11, 12, 13],
"pollmethod": "YN",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
"min_votes_amount": 1,
"max_votes_amount": 10,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"11": "Y", "12": "N"},
},
)
self.assert_status_code(response, 200)
vote = self.get_model("vote/1")
assert vote.get("value") == "Y"
assert vote.get("option_id") == 11
assert vote.get("weight") == "1.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 1
user_token = vote.get("user_token")
vote = self.get_model("vote/2")
assert vote.get("value") == "N"
assert vote.get("option_id") == 12
assert vote.get("weight") == "1.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 1
assert vote.get("user_token") == user_token
option = self.get_model("option/11")
assert option.get("vote_ids") == [1]
assert option.get("yes") == "1.000000"
assert option.get("no") == "0.000000"
assert option.get("abstain") == "0.000000"
option = self.get_model("option/12")
assert option.get("vote_ids") == [2]
assert option.get("yes") == "0.000000"
assert option.get("no") == "1.000000"
assert option.get("abstain") == "0.000000"
user = self.get_model("user/1")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [1, 2]
def test_vote_wrong_votes_total(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"option/12": {"meeting_id": 113, "poll_id": 1},
"option/13": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"title": "my test poll",
"option_ids": [11, 12, 13],
"pollmethod": "Y",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
"min_votes_amount": 1,
"max_votes_amount": 1,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"11": 1, "12": 1},
},
)
self.assert_status_code(response, 400)
assert (
"The sum of your answers has to be between 1 and 1"
in response.json["message"]
)
self.assert_model_not_exists("vote/1")
def test_vote_pollmethod_Y_wrong_value(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"option_ids": [11, 12, 13],
"pollmethod": "Y",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"value": {"11": "Y"},
},
)
self.assert_status_code(response, 400)
assert "Your vote has a wrong format" in response.json["message"]
self.assert_model_not_exists("vote/1")
def test_vote_no_votes_total_check_by_YNA(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"option/12": {"meeting_id": 113, "poll_id": 1},
"option/13": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"title": "my test poll",
"option_ids": [11, 12, 13],
"pollmethod": "YNA",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
"min_votes_amount": 1,
"max_votes_amount": 1,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"11": "Y", "12": "A"},
},
)
self.assert_status_code(response, 200)
self.assert_model_exists("vote/1")
def test_vote_no_votes_total_check_by_YN(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"option/12": {"meeting_id": 113, "poll_id": 1},
"option/13": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"title": "my test poll",
"option_ids": [11, 12, 13],
"pollmethod": "YN",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
"min_votes_amount": 1,
"max_votes_amount": 1,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"11": "Y", "12": "N"},
},
)
self.assert_status_code(response, 200)
self.assert_model_exists("vote/1")
def test_vote_wrong_votes_total_min_case(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"option/12": {"meeting_id": 113, "poll_id": 1},
"option/13": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"title": "my test poll",
"option_ids": [11, 12, 13],
"pollmethod": "Y",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
"min_votes_amount": 2,
"max_votes_amount": 2,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"11": 1},
},
)
self.assert_status_code(response, 400)
assert (
"The sum of your answers has to be between 2 and 2"
in response.json["message"]
)
self.assert_model_not_exists("vote/1")
def test_vote_global(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1, 2]},
"option/11": {"meeting_id": 113, "used_as_global_option_in_poll_id": 1},
"user/2": {
"username": "test2",
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
"poll/1": {
"title": "my test poll",
"global_option_id": 11,
"global_no": True,
"global_yes": False,
"global_abstain": False,
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
"pollmethod": "YNA",
},
}
)
response = self.request(
"poll.vote",
{"id": 1, "user_id": 1, "value": "N"},
stop_poll_after_vote=False,
)
self.assert_status_code(response, 200)
response = self.request("poll.vote", {"id": 1, "user_id": 2, "value": "Y"})
self.assert_status_code(response, 400)
vote = self.get_model("vote/1")
assert vote.get("value") == "N"
assert vote.get("option_id") == 11
assert vote.get("weight") == "1.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 1
option = self.get_model("option/11")
assert option.get("vote_ids") == [1]
assert option.get("yes") == "0.000000"
assert option.get("no") == "1.000000"
assert option.get("abstain") == "0.000000"
user = self.get_model("user/1")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [1]
self.assert_model_not_exists("vote/2")
option = self.get_model("option/11")
assert option.get("vote_ids") == [1]
user = self.get_model("user/1")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [1]
def test_vote_schema_problems(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"poll/1": {
"title": "my test poll",
"entitled_group_ids": [1],
"meeting_id": 113,
"pollmethod": "YNA",
"state": Poll.STATE_STARTED,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request("poll.vote", {"id": 1, "user_id": 1, "value": "X"})
self.assert_status_code(response, 400)
assert "Global vote X is not enabled" in response.json["message"]
def test_vote_invalid_vote_value(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"poll/1": {
"option_ids": [11],
"pollmethod": "YNA",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"11": "X"},
},
)
self.assert_status_code(response, 400)
assert (
"Data for option 11 does not fit the poll method."
in response.json["message"]
)
def test_vote_not_started_in_service(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"poll/1": {
"type": "named",
"meeting_id": 113,
"pollmethod": "YNA",
"global_yes": True,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$_ids": ["113"],
"group_$113_ids": [1],
},
}
)
response = self.request(
"poll.vote",
{"id": 1, "value": "Y"},
start_poll_before_vote=False,
stop_poll_after_vote=False,
)
self.assert_status_code(response, 400)
assert "Poll does not exist" in response.json["message"]
def test_vote_option_not_in_poll(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"poll/1": {
"title": "my test poll",
"type": "named",
"pollmethod": "YNA",
"entitled_group_ids": [1],
"meeting_id": 113,
"state": Poll.STATE_STARTED,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
}
)
response = self.request(
"poll.vote",
{
"id": 1,
"user_id": 1,
"value": {"113": "Y"},
},
)
self.assert_status_code(response, 400)
assert "Option_id 113 does not belong to the poll" in response.json["message"]
def test_double_vote(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1, 2]},
"option/11": {"meeting_id": 113, "used_as_global_option_in_poll_id": 1},
"user/2": {
"username": "test2",
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
"poll/1": {
"title": "my test poll",
"global_option_id": 11,
"global_no": True,
"global_yes": False,
"global_abstain": False,
"meeting_id": 113,
"entitled_group_ids": [1],
"pollmethod": "YN",
"state": Poll.STATE_STARTED,
},
}
)
response = self.request(
"poll.vote",
{"id": 1, "user_id": 1, "value": "N"},
stop_poll_after_vote=False,
)
self.assert_status_code(response, 200)
response = self.request(
"poll.vote",
{"id": 1, "user_id": 1, "value": "N"},
start_poll_before_vote=False,
)
self.assert_status_code(response, 400)
assert "Not the first vote" in response.json["message"]
vote = self.get_model("vote/1")
assert vote.get("value") == "N"
assert vote.get("option_id") == 11
assert vote.get("weight") == "1.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 1
option = self.get_model("option/11")
assert option.get("vote_ids") == [1]
user = self.get_model("user/1")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [1]
def test_check_user_in_entitled_group(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"option/11": {"meeting_id": 113, "used_as_global_option_in_poll_id": 1},
"user/1": {"is_present_in_meeting_ids": [113]},
"poll/1": {
"pollmethod": "YNA",
"global_option_id": 11,
"global_no": True,
"global_yes": False,
"global_abstain": False,
"meeting_id": 113,
"entitled_group_ids": [],
"state": Poll.STATE_STARTED,
},
}
)
response = self.request("poll.vote", {"id": 1, "user_id": 1, "value": "N"})
self.assert_status_code(response, 400)
assert "User 1 is not allowed to vote" in response.json["message"]
def test_check_user_present_in_meeting(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"user/1": {"group_$_ids": ["113"], "group_$113_ids": [1]},
"option/11": {"meeting_id": 113, "used_as_global_option_in_poll_id": 1},
"poll/1": {
"title": "my test poll",
"global_option_id": 11,
"global_no": True,
"global_yes": False,
"global_abstain": False,
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
},
}
)
response = self.request("poll.vote", {"id": 1, "value": "N"})
self.assert_status_code(response, 400)
assert "You have to be present in meeting 113" in response.json["message"]
def test_check_str_validation(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"poll/1": {
"title": "my test poll",
"type": "named",
"meeting_id": 113,
"entitled_group_ids": [1],
"pollmethod": "Y",
"state": Poll.STATE_STARTED,
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$_ids": ["113"],
"group_$113_ids": [1],
},
}
)
response = self.request("poll.vote", {"id": 1, "user_id": 1, "value": "X"})
self.assert_status_code(response, 400)
assert "Global vote X is not enabled" in response.json["message"]
def test_default_vote_weight(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
"default_vote_weight": "3.000000",
},
"poll/1": {
"title": "my test poll",
"option_ids": [11],
"pollmethod": "Y",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
},
"meeting/113": {"users_enable_vote_weight": True},
}
)
response = self.request(
"poll.vote", {"id": 1, "user_id": 1, "value": {"11": 1}}
)
self.assert_status_code(response, 200)
vote = self.get_model("vote/1")
assert vote.get("value") == "Y"
assert vote.get("option_id") == 11
assert vote.get("weight") == "3.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 1
option = self.get_model("option/11")
assert option.get("vote_ids") == [1]
assert option.get("yes") == "3.000000"
assert option.get("no") == "0.000000"
assert option.get("abstain") == "0.000000"
user = self.get_model("user/1")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [1]
def test_vote_weight_not_enabled(self) -> None:
self.set_models(
{
"organization/1": {"enable_electronic_voting": True},
"group/1": {"user_ids": [1]},
"option/11": {"meeting_id": 113, "poll_id": 1},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
"default_vote_weight": "3.000000",
"vote_weight_$113": "4.200000",
"vote_weight_$": ["113"],
},
"poll/1": {
"title": "my test poll",
"option_ids": [11],
"pollmethod": "Y",
"meeting_id": 113,
"entitled_group_ids": [1],
"state": Poll.STATE_STARTED,
},
"meeting/113": {"users_enable_vote_weight": False},
}
)
response = self.request(
"poll.vote", {"id": 1, "user_id": 1, "value": {"11": 1}}
)
self.assert_status_code(response, 200)
vote = self.get_model("vote/1")
assert vote.get("value") == "Y"
assert vote.get("option_id") == 11
assert vote.get("weight") == "1.000000"
assert vote.get("meeting_id") == 113
assert vote.get("user_id") == 1
option = self.get_model("option/11")
assert option.get("vote_ids") == [1]
assert option.get("yes") == "1.000000"
assert option.get("no") == "0.000000"
assert option.get("abstain") == "0.000000"
user = self.get_model("user/1")
assert user.get("vote_$_ids") == ["113"]
assert user.get("vote_$113_ids") == [1]
class VotePollBaseTestClass(BaseVoteTestCase):
def setUp(self) -> None:
super().setUp()
self.set_models(
{
"assignment/1": {
"title": "test_assignment_tcLT59bmXrXif424Qw7K",
"open_posts": 1,
"meeting_id": 113,
},
"meeting/113": {"is_active_in_organization_id": 1},
}
)
self.create_poll()
self.set_models(
{
"group/1": {"user_ids": [1]},
"option/1": {
"meeting_id": 113,
"poll_id": 1,
"yes": "0.000000",
"no": "0.000000",
"abstain": "0.000000",
},
"option/2": {
"meeting_id": 113,
"poll_id": 1,
"yes": "0.000000",
"no": "0.000000",
"abstain": "0.000000",
},
"user/1": {
"is_present_in_meeting_ids": [113],
"group_$113_ids": [1],
"group_$_ids": ["113"],
},
"option/11": {"meeting_id": 113, "used_as_global_option_in_poll_id": 1},
"poll/1": {"global_option_id": 11},
}
)
def create_poll(self) -> None:
        # Must be implemented by subclasses.
raise NotImplementedError()
def start_poll(self) -> None:
self.update_model("poll/1", {"state": Poll.STATE_STARTED})
def add_option(self) -> None:
self.set_models(
{
"option/3": {"meeting_id": 113, "poll_id": 1},
"poll/1": {"option_ids": [1, 2, 3]},
}
)
class VotePollNamedYNA(VotePollBaseTestClass):
def create_poll(self) -> None:
self.create_model(
"poll/1",
{
"content_object_id": "assignment/1",
"title": "test_title_OkHAIvOSIcpFnCxbaL6v",
"pollmethod": "YNA",
"type": Poll.TYPE_NAMED,
"state": Poll.STATE_CREATED,
"meeting_id": 113,
"option_ids": [1, 2],
"entitled_group_ids": [1],
"votescast": "0.000000",
"votesvalid": "0.000000",
"votesinvalid": "0.000000",
"min_votes_amount": 1,
"max_votes_amount": 10,
},
)
def test_vote(self) -> None:
self.add_option()
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y", "2": "N", "3": "A"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_count("vote", 113, 3)
poll = self.get_model("poll/1")
self.assertEqual(poll.get("votesvalid"), "1.000000")
self.assertEqual(poll.get("votesinvalid"), "0.000000")
self.assertEqual(poll.get("votescast"), "1.000000")
self.assertIn(1, poll.get("voted_ids", []))
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
option3 = self.get_model("option/3")
self.assertEqual(option1.get("yes"), "1.000000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "1.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
self.assertEqual(option3.get("yes"), "0.000000")
self.assertEqual(option3.get("no"), "0.000000")
self.assertEqual(option3.get("abstain"), "1.000000")
def test_vote_with_voteweight(self) -> None:
self.set_models(
{
"user/1": {"vote_weight_$113": "4.200000", "vote_weight_$": ["113"]},
"meeting/113": {"users_enable_vote_weight": True},
}
)
self.add_option()
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y", "2": "N", "3": "A"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_count("vote", 113, 3)
poll = self.get_model("poll/1")
self.assertEqual(poll.get("votesvalid"), "4.200000")
self.assertEqual(poll.get("votesinvalid"), "0.000000")
self.assertEqual(poll.get("votescast"), "1.000000")
self.assertEqual(poll.get("state"), Poll.STATE_FINISHED)
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
option3 = self.get_model("option/3")
self.assertEqual(option1.get("yes"), "4.200000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "4.200000")
self.assertEqual(option2.get("abstain"), "0.000000")
self.assertEqual(option3.get("yes"), "0.000000")
self.assertEqual(option3.get("no"), "0.000000")
self.assertEqual(option3.get("abstain"), "4.200000")
def test_change_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y"}, "id": 1, "user_id": 1},
stop_poll_after_vote=False,
)
response = self.request(
"poll.vote",
{"value": {"1": "N"}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/2")
vote = self.get_model("vote/1")
self.assertEqual(vote.get("value"), "Y")
def test_too_many_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y", "3": "N"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y", "3": "N"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_anonymous(self) -> None:
self.start_poll()
response = self.anonymous_vote({"value": {"1": "Y"}})
self.assert_status_code(response, 401)
self.assert_model_not_exists("vote/1")
def test_vote_not_present(self) -> None:
self.start_poll()
self.update_model("user/1", {"is_present_in_meeting_ids": []})
response = self.request(
"poll.vote",
{"value": {"1": "Y"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_state(self) -> None:
response = self.request(
"poll.vote",
{"value": {"1": "Y"}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
stop_poll_after_vote=False,
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_missing_data(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": {}, "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
poll = self.get_model("poll/1")
self.assertNotIn(1, poll.get("voted_ids", []))
def test_wrong_data_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": [1, 2, 5], "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "string"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_id_type(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"id": "Y"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_vote_data(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": [None]}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
class VotePollNamedY(VotePollBaseTestClass):
def create_poll(self) -> None:
self.create_model(
"poll/1",
{
"content_object_id": "assignment/1",
"title": "test_title_Zrvh146QAdq7t6iSDwZk",
"pollmethod": "Y",
"type": Poll.TYPE_NAMED,
"state": Poll.STATE_CREATED,
"meeting_id": 113,
"option_ids": [1, 2],
"entitled_group_ids": [1],
"votesinvalid": "0.000000",
"global_yes": True,
"global_no": True,
"global_abstain": True,
"min_votes_amount": 1,
"max_votes_amount": 10,
},
)
def test_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 0}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_exists("vote/1")
self.assert_model_not_exists("vote/2")
poll = self.get_model("poll/1")
self.assertEqual(poll.get("votesvalid"), "1.000000")
self.assertEqual(poll.get("votesinvalid"), "0.000000")
self.assertEqual(poll.get("votescast"), "1.000000")
self.assertIn(1, poll.get("voted_ids", []))
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
self.assertEqual(option1.get("yes"), "1.000000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
def test_change_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 0}, "id": 1, "user_id": 1},
stop_poll_after_vote=False,
)
response = self.request(
"poll.vote",
{"value": {"1": 0, "2": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
)
self.assert_status_code(response, 400)
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
self.assertEqual(option1.get("yes"), "1.000000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
def test_global_yes(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": "Y", "id": 1, "user_id": 1})
self.assert_status_code(response, 200)
option = self.get_model("option/11")
self.assertEqual(option.get("yes"), "1.000000")
self.assertEqual(option.get("no"), "0.000000")
self.assertEqual(option.get("abstain"), "0.000000")
def test_global_yes_forbidden(self) -> None:
self.update_model("poll/1", {"global_yes": False})
self.start_poll()
response = self.request("poll.vote", {"value": "Y", "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_global_no(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": "N", "id": 1, "user_id": 1})
self.assert_status_code(response, 200)
option = self.get_model("option/11")
self.assertEqual(option.get("yes"), "0.000000")
self.assertEqual(option.get("no"), "1.000000")
self.assertEqual(option.get("abstain"), "0.000000")
def test_global_no_forbidden(self) -> None:
self.update_model("poll/1", {"global_no": False})
self.start_poll()
response = self.request("poll.vote", {"value": "N", "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_global_abstain(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": "A", "id": 1, "user_id": 1})
self.assert_status_code(response, 200)
option = self.get_model("option/11")
self.assertEqual(option.get("yes"), "0.000000")
self.assertEqual(option.get("no"), "0.000000")
self.assertEqual(option.get("abstain"), "1.000000")
def test_global_abstain_forbidden(self) -> None:
self.update_model("poll/1", {"global_abstain": False})
self.start_poll()
response = self.request("poll.vote", {"value": "A", "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_negative_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": -1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_too_many_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 1, "3": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"id": 1, "value": {"3": 1}},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_anonymous(self) -> None:
self.start_poll()
response = self.anonymous_vote({"value": {"1": 1}})
self.assert_status_code(response, 401)
self.assert_model_not_exists("vote/1")
def test_anonymous_as_vote_user(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 0},
)
self.assert_status_code(response, 400)
assert "Votes for anonymous user are not allowed" in response.json["message"]
self.assert_model_not_exists("vote/1")
def test_vote_not_present(self) -> None:
self.start_poll()
self.update_model("user/1", {"is_present_in_meeting_ids": []})
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_state(self) -> None:
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
stop_poll_after_vote=False,
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_missing_data(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": {}, "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
poll = self.get_model("poll/1")
self.assertNotIn(1, poll.get("voted_ids", []))
def test_wrong_data_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": [1, 2, 5], "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "string"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_id_type(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"id": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_vote_data(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": [None]}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
class VotePollNamedN(VotePollBaseTestClass):
def create_poll(self) -> None:
self.create_model(
"poll/1",
{
"content_object_id": "assignment/1",
"title": "test_title_4oi49ckKFk39SDIfj30s",
"pollmethod": "N",
"type": Poll.TYPE_NAMED,
"state": Poll.STATE_CREATED,
"meeting_id": 113,
"option_ids": [1, 2],
"entitled_group_ids": [1],
"votesinvalid": "0.000000",
"global_yes": True,
"global_no": True,
"global_abstain": True,
"min_votes_amount": 1,
"max_votes_amount": 10,
},
)
def test_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 0}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_exists("vote/1")
self.assert_model_not_exists("vote/2")
poll = self.get_model("poll/1")
self.assertEqual(poll.get("votesvalid"), "1.000000")
self.assertEqual(poll.get("votesinvalid"), "0.000000")
self.assertEqual(poll.get("votescast"), "1.000000")
        self.assertIn(1, poll.get("voted_ids", []))
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
self.assertEqual(option1.get("yes"), "0.000000")
self.assertEqual(option1.get("no"), "1.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
def test_change_vote(self) -> None:
self.add_option()
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 0}, "id": 1, "user_id": 1},
stop_poll_after_vote=False,
)
response = self.request(
"poll.vote",
{"value": {"1": 0, "2": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
)
self.assert_status_code(response, 400)
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
self.assertEqual(option1.get("yes"), "0.000000")
self.assertEqual(option1.get("no"), "1.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
def test_global_yes(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": "Y", "id": 1, "user_id": 1})
self.assert_status_code(response, 200)
option = self.get_model("option/11")
self.assertEqual(option.get("yes"), "1.000000")
self.assertEqual(option.get("no"), "0.000000")
self.assertEqual(option.get("abstain"), "0.000000")
def test_global_yes_forbidden(self) -> None:
self.update_model("poll/1", {"global_yes": False})
self.start_poll()
response = self.request("poll.vote", {"value": "Y", "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_global_no(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": "N", "id": 1, "user_id": 1})
self.assert_status_code(response, 200)
option = self.get_model("option/11")
self.assertEqual(option.get("yes"), "0.000000")
self.assertEqual(option.get("no"), "1.000000")
self.assertEqual(option.get("abstain"), "0.000000")
def test_global_no_forbidden(self) -> None:
self.update_model("poll/1", {"global_no": False})
self.start_poll()
response = self.request("poll.vote", {"value": "N", "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_global_abstain(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": "A", "id": 1, "user_id": 1})
self.assert_status_code(response, 200)
option = self.get_model("option/11")
self.assertEqual(option.get("yes"), "0.000000")
self.assertEqual(option.get("no"), "0.000000")
self.assertEqual(option.get("abstain"), "1.000000")
def test_global_abstain_forbidden(self) -> None:
self.update_model("poll/1", {"global_abstain": False})
self.start_poll()
response = self.request("poll.vote", {"value": "A", "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_negative_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": -1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"3": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_anonymous(self) -> None:
self.start_poll()
response = self.anonymous_vote({"value": {"1": 1}})
self.assert_status_code(response, 401)
self.assert_model_not_exists("vote/1")
def test_vote_not_present(self) -> None:
self.start_poll()
self.update_model("user/1", {"is_present_in_meeting_ids": []})
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_state(self) -> None:
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
stop_poll_after_vote=False,
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_missing_data(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": {}, "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
poll = self.get_model("poll/1")
self.assertNotIn(1, poll.get("voted_ids", []))
def test_wrong_data_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": [1, 2, 5], "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "string"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_id_type(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"id": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_vote_data(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": [None]}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
class VotePollPseudoanonymousYNA(VotePollBaseTestClass):
def create_poll(self) -> None:
self.create_model(
"poll/1",
{
"content_object_id": "assignment/1",
"title": "test_title_OkHAIvOSIcpFnCxbaL6v",
"pollmethod": "YNA",
"type": Poll.TYPE_PSEUDOANONYMOUS,
"state": Poll.STATE_CREATED,
"meeting_id": 113,
"option_ids": [1, 2],
"entitled_group_ids": [1],
"votesinvalid": "0.000000",
"min_votes_amount": 1,
"max_votes_amount": 10,
},
)
def test_vote(self) -> None:
self.add_option()
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y", "2": "N", "3": "A"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_count("vote", 113, 3)
poll = self.get_model("poll/1")
self.assertEqual(poll.get("votesvalid"), "1.000000")
self.assertEqual(poll.get("votesinvalid"), "0.000000")
self.assertEqual(poll.get("votescast"), "1.000000")
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
option3 = self.get_model("option/3")
self.assertEqual(option1.get("yes"), "1.000000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "1.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
self.assertEqual(option3.get("yes"), "0.000000")
self.assertEqual(option3.get("no"), "0.000000")
self.assertEqual(option3.get("abstain"), "1.000000")
def test_change_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y"}, "id": 1, "user_id": 1},
stop_poll_after_vote=False,
)
response = self.request(
"poll.vote",
{"value": {"1": "N"}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
)
self.assert_status_code(response, 400)
option1 = self.get_model("option/1")
self.assertEqual(option1.get("yes"), "1.000000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
def test_too_many_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y", "3": "N"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_partial_vote(self) -> None:
self.add_option()
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_exists("vote/1")
def test_wrong_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "Y", "3": "N"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_anonymous(self) -> None:
self.start_poll()
response = self.anonymous_vote({"value": {"1": "Y"}})
self.assert_status_code(response, 401)
self.assert_model_not_exists("vote/1")
def test_vote_not_present(self) -> None:
self.start_poll()
self.update_model("user/1", {"is_present_in_meeting_ids": []})
response = self.request(
"poll.vote",
{"value": {"1": "Y"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_state(self) -> None:
response = self.request(
"poll.vote",
{"value": {}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
stop_poll_after_vote=False,
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_missing_value(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": {}, "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
poll = self.get_model("poll/1")
self.assertNotIn(1, poll.get("voted_ids", []))
def test_wrong_value_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": [1, 2, 5], "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "string"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_id_type(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"id": "Y"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_vote_value(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": [None]}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
class VotePollPseudoanonymousY(VotePollBaseTestClass):
def create_poll(self) -> None:
self.create_model(
"poll/1",
{
"content_object_id": "assignment/1",
"title": "test_title_Zrvh146QAdq7t6iSDwZk",
"pollmethod": "Y",
"type": Poll.TYPE_PSEUDOANONYMOUS,
"state": Poll.STATE_CREATED,
"meeting_id": 113,
"option_ids": [1, 2],
"entitled_group_ids": [1],
"votesinvalid": "0.000000",
"min_votes_amount": 1,
"max_votes_amount": 10,
},
)
def test_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 0}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_exists("vote/1")
self.assert_model_not_exists("vote/2")
poll = self.get_model("poll/1")
self.assertEqual(poll.get("votesvalid"), "1.000000")
self.assertEqual(poll.get("votesinvalid"), "0.000000")
self.assertEqual(poll.get("votescast"), "1.000000")
        self.assertIn(1, poll.get("voted_ids", []))
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
self.assertEqual(option1.get("yes"), "1.000000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
vote = self.get_model("vote/1")
self.assertIsNone(vote.get("user_id"))
def test_change_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 0}, "id": 1, "user_id": 1},
stop_poll_after_vote=False,
)
response = self.request(
"poll.vote",
{"value": {"1": 0, "2": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
)
self.assert_status_code(response, 400)
self.get_model("poll/1")
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
self.assertEqual(option1.get("yes"), "1.000000")
self.assertEqual(option1.get("no"), "0.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
def test_negative_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": -1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_options(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"3": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_vote_not_present(self) -> None:
self.start_poll()
self.update_model("user/1", {"is_present_in_meeting_ids": []})
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_state(self) -> None:
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
stop_poll_after_vote=False,
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_missing_data(self) -> None:
self.start_poll()
response = self.request("poll.vote", {"value": {}, "id": 1, "user_id": 1})
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
poll = self.get_model("poll/1")
self.assertNotIn(1, poll.get("voted_ids", []))
def test_wrong_data_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"value": [1, 2, 5]}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "string"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_option_id_type(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"id": 1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_vote_data(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": [None]}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
class VotePollPseudoanonymousN(VotePollBaseTestClass):
def create_poll(self) -> None:
self.create_model(
"poll/1",
{
"content_object_id": "assignment/1",
"title": "test_title_wWPOVJgL9afm83eamf3e",
"pollmethod": "N",
"type": Poll.TYPE_PSEUDOANONYMOUS,
"state": Poll.STATE_CREATED,
"meeting_id": 113,
"option_ids": [1, 2],
"entitled_group_ids": [1],
"votesinvalid": "0.000000",
"min_votes_amount": 1,
"max_votes_amount": 10,
},
)
def test_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"id": 1, "value": {"1": 1, "2": 0}, "user_id": 1},
)
self.assert_status_code(response, 200)
self.assert_model_exists("vote/1")
self.assert_model_not_exists("vote/2")
poll = self.get_model("poll/1")
self.assertEqual(poll.get("votesvalid"), "1.000000")
self.assertEqual(poll.get("votesinvalid"), "0.000000")
self.assertEqual(poll.get("votescast"), "1.000000")
        self.assertIn(1, poll.get("voted_ids", []))
option1 = self.get_model("option/1")
option2 = self.get_model("option/2")
self.assertEqual(option1.get("yes"), "0.000000")
self.assertEqual(option1.get("no"), "1.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
vote = self.get_model("vote/1")
self.assertIsNone(vote.get("user_id"))
def test_change_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": 1, "2": 0}, "id": 1, "user_id": 1},
stop_poll_after_vote=False,
)
response = self.request(
"poll.vote",
{"value": {"1": 0, "2": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
)
self.assert_status_code(response, 400)
self.get_model("poll/1")
option1 = self.get_model("option/1")
self.assertEqual(option1.get("yes"), "0.000000")
self.assertEqual(option1.get("no"), "1.000000")
self.assertEqual(option1.get("abstain"), "0.000000")
option2 = self.get_model("option/2")
self.assertEqual(option2.get("yes"), "0.000000")
self.assertEqual(option2.get("no"), "0.000000")
self.assertEqual(option2.get("abstain"), "0.000000")
def test_negative_vote(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": -1}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_vote_not_present(self) -> None:
self.start_poll()
self.update_model("user/1", {"is_present_in_meeting_ids": []})
response = self.request(
"poll.vote",
{"id": 1, "user_id": 1, "value": {"1": 1}},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_state(self) -> None:
response = self.request(
"poll.vote",
{"value": {"1": 1}, "id": 1, "user_id": 1},
start_poll_before_vote=False,
stop_poll_after_vote=False,
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_data_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": [1, 2, 5], "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
assert (
"decoding payload: unknown vote value: `[1,2,5]`"
in response.json["message"]
)
self.assert_model_not_exists("vote/1")
def test_wrong_option_format(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"value": {"1": "string"}, "id": 1, "user_id": 1},
)
self.assert_status_code(response, 400)
assert "Your vote has a wrong format" in response.json["message"]
self.assert_model_not_exists("vote/1")
def test_wrong_option_id_type(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"id": 1},
)
self.assert_status_code(response, 400)
self.assert_model_not_exists("vote/1")
def test_wrong_vote_data(self) -> None:
self.start_poll()
response = self.request(
"poll.vote",
{"id": 1, "value": {"1": [None]}, "user_id": 1},
)
self.assert_status_code(response, 400)
assert "decoding payload: unknown vote value:" in response.json["message"]
self.assert_model_not_exists("vote/1")
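Taken together, the 400-responses above all reject malformed `value` payloads: lists, empty dicts, non-digit option keys, and list-valued votes. As a rough standalone summary of the accepted shapes (an illustrative sketch, not the backend's actual validator):

```python
def is_valid_vote_value(value):
    """Accept the global strings or a non-empty {option_id: vote} mapping."""
    if value in ("Y", "N", "A"):
        return True
    if not isinstance(value, dict) or not value:
        return False
    return all(
        isinstance(k, str) and k.isdigit() and isinstance(v, (int, str))
        for k, v in value.items()
    )
```

Under this sketch, `{"1": 1}` and `"Y"` pass while `[1, 2, 5]`, `{}`, `{"id": 1}`, and `{"1": [None]}` are rejected, matching the test expectations above.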
# --- hgapp/characters/migrations/0035_artifact_circumstance_condition.py (shadytradesman/The-Contract-Website, Apache-2.0) ---
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('cells', '0008_auto_20210521_0240'),
('characters', '0034_auto_20210521_0240'),
]
operations = [
migrations.CreateModel(
name='Condition',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=500)),
('description', models.CharField(max_length=1000)),
('system', models.CharField(max_length=1000)),
('cell', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='cells.Cell')),
('character', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='characters.Character')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Circumstance',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=500)),
('description', models.CharField(max_length=1000)),
('system', models.CharField(max_length=1000)),
('cell', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='cells.Cell')),
('character', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='characters.Character')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Artifact',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=500)),
('description', models.CharField(max_length=1000)),
('system', models.CharField(max_length=1000)),
('cell', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='cells.Cell')),
('character', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='characters.Character')),
],
options={
'abstract': False,
},
),
]
# --- ssh/sanity.py (DavidChae2000/cs338, MIT) ---
import math
nhex = '00ea51e8e46b58fd1950721b0c38de73b2e712b5862d781cc1dae4df90d603c6a489a61d57929110a63fbc2caa6d016e928aff26e4973c203a35483008c1b1395fccdf488938c8e2fbfbec220e3e1647baa5657c4e5207b74bbc2b3c338116d827c0d0d01f4422da7e52fc4f6da09c4fcb6155cf9af77b8b7f31c841495f79d57b05e68408cd354017f358cdc1e6fdb786926008038bb8a46778fc08e964c7e47d2e57ee45f33f97b6599a6ac48857acdcdd160fb0bcdab00e55d6d27c45e8faa5fa84a71227adcbdce207e402a0a9cbd39eee104159861c186e3e2511a27f8b56a5a5b65b82ac95cdef1465b155f2b7484f948264c44f72aa43664ef0b3129fcbeb1cf7c2837e505d530c9cec59603de5367393775f877ed20506e2d65d3c00c0aa52d2ea8808fcb9c521b338413f644d4077e931e0fa4ba171d60fe6d1c15b89a527651ac4c244a1218fc538e875cf3a2fd528dfc991a583dd4d72d7dee66bbc5d766c0c6be69a813635a2a2bba5b74506556d9df7737f241f53f6f97c0e09fd'
ehex = '010001'
dhex = '7beefcd62bacf2993c024e2a05be2ac5ba101c30aec5f3b7d8b5eb4568dcb0690060faee0198768ce7f2f850dfbcbd26071c730ebeaacf84d9ffa7604a4a1945814fabc6e060e8254d1694e0a1981932301ee0437081a2420d7174b2ed190f14df97d22c675663e822ae4eb8761596e2c445a5bb9a201f2514488db768274171832a6c35578c4dad1d499ff269e151fb7c5a0c830b988a642c0a6a103ffcb5ee3bdfc827f9436e53c54ab623e2d53499d0d9c54cbe66906dd49afdd33b0ac1de9aebec71cd1b5921db418d1e668c2ba0e18def0f3063084852d62139cee6d5bfb07e65b1bd106469d25aae014b08e76f53015790de2ba68700782b467ffc4c8552cdb96cc343576d602a518005cfca1bb3618fbe707c80defcd3660ff91807486beb1b4ac162568e6f098b590d67c404f68437b3b33ab4c2e90a4536c1660249227c7d1f9cdc21614a94f41de0ac1355097e6ed1c4cfad4c5e61552685fec0a58b65ae8d54baacaa1d270209e13b2336fa2e5cbd3da87aafa7e62236f5c51345'
phex = '00f7835d5a710b1eed86f2946b3305312acf096fb415f0d2ceaa0f1c64be3d08a7cb002c29bd16d70a21dcac4a960d630014aac7d4bfa894a914a4d98a1288d7ed6b8a2a7183564b88d441515a42adc0f4174c83fe8666320c983bf2b6ec715d35ec3674911d40498eb9a399d05e787a378766a937e812480063ee7136ff92763075c8c98bd6ffda1aa1cfc946259163e762af74d9753989f8473510f4e1a7b9340a346e690317b0d21ec30c77bc407006914ef14d39656e171d978ded3968889b'
qhex = '00f25abcb7b95fd41b099ca7254fc5b07d82995050e294eb61550f489a932d9cc591dce820d6c14933d8e2057c65597243e25f380449d59aae91d27813ac227fe6b74430aeee2df39de5e7662b9b49e3a783e51410d68239231e4c41d13c48085f17ebb461fe744ccd045bc2fe049c60f652da273a856af8db4ded6c4439de569c1931a9baea9222d144e10ee684931b21c927db816cb2aaa12d1b4d95d53a92547abb26094dbbbcc8450f14eca223dccfeabefcf8d0fa7879933e9f160db16547'
ex1hex = '00cb725a10962caaee58e71a22075cf42e18e1cbc47de8668ee2efeac9d02940ed130a80428a9fb4802a9745b3452599c39e39466a546390566e96960d4cf0d873d9da46cae946cc6d9af25999548797bbcb4ee48c1912f57eaebf4fe115f694c456f5e7d8c3ef6cc6f946458f25a7e7e5aed12eb4ba781460e5c33fa2126ee3e12f5fac72da75916424af33e509ca39111f67d3274f98cd149c0b3d75dbe9ce6cbd1a9dca897536ad7005a940de27252df0d0afbd67393d4598d8531bb3fd7d93'
ex2hex = '684d2ebcef35d3c6131bed3f7c967aff792e5c47834c4e04e4a3a03e8e2aea3689310661e3aae9c33b8b028b67ded7f369404b8e64ec5e16d5413c565315c1efdc5da78a118d8b342056b73cfb2710a7ee76af6d13b495bc8c748b3aee739dceb72fc4c10bcea7a2f04641a42afdd6290cbd4076ca26a09559735dcdfb0dba3e920532a3f40212a3f7bdd838d343b040991ee3344f409178ae3dfae50a109c1b697a93c31b4639d803089b9bbfa82986fde356620b2650a519feb05ec063e14f'
cohex = '00cba2482285fda2b94a962a77cd1eeea438768deceb35ead3f60378ee6166eec72022edce41a1dd8c403bfb8ab73022a3a0b3fc5e8d49319e65cfd671e876db39546948d8cf0881b28339f1b97800300c2cc54a312a418fd89a188e2cab258b27026a46fdfb16c97c044775a8150d80ab49e9b13d4d1efe9a46e02c66c41a974918598043c870514f6fb2cfd06df71ba260a33b9d45d9684ef0023bf28a3aae47a30c015c410dd373f0dc0c532468f6201cddc833cbceeef2b0c5c089e6200e5e'
n = int(nhex, 16)
e = int(ehex, 16)
d = int(dhex, 16)
p = int(phex, 16)
q = int(qhex, 16)
ex1 = int(ex1hex, 16)
ex2 = int(ex2hex, 16)
co = int(cohex, 16)
# l = int(((p-1)*(q-1))/math.gcd(p-1,q-1)) overflow error so I did this calculation on an online calculator
l = 2658802038852802029953426482625348287435299698020799104511769959716093662204460869464516234703249285615460032089620642000240815741563173471422341759203780293850144402988624465738583838504845051674579949752491489591345987662746959407647943898520794487070084120873506188599453531668545175875630399595146172605726319774185989968386305306734803851270662474575687560751546378104302752184786272121725602413957174425130772316019428842218936255662768586084351198080719032278130774603201306043076416136082159116158329775819589380272151469449360322748921108471486704577121632725026998046220539820847007341164573784011556299765764438683824504053318003992982465497771304090851010699197955634912225218978314424492334986869855621728812342902269728061573850702725417824331095067145085495012059559322101598116565390407523560035002445052000421321052608763333404586308438110315580823837332586879451363261029147994654156175496373854441735589390
# RSA key consistency: count failed invariants (printing 0 means the key is sane)
sanity = 0
if p*q != n:                 # the modulus must factor as p*q
    sanity += 1
if math.gcd(e, l) != 1:      # e must be invertible modulo lambda(n)
    sanity += 1
if e*d % l != 1:             # d must be the inverse of e modulo lambda(n)
    sanity += 1
if ex1 != d % (p-1):         # CRT exponent dP
    sanity += 1
if ex2 != d % (q-1):         # CRT exponent dQ
    sanity += 1
print(sanity)
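The `ex1`/`ex2`/`co` values checked above are the PKCS#1 CRT fields dP, dQ, and qInv. A self-contained toy key (classic textbook values, not the key above) shows how they are used for CRT decryption:

```python
import math

p, q = 61, 53
n = p * q                                            # 3233
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)    # Carmichael lambda = 780
e, d = 17, 2753                                      # e*d % lam == 1
dP, dQ = d % (p - 1), d % (q - 1)                    # CRT exponents
qInv = pow(q, -1, p)                                 # CRT coefficient (Python 3.8+)

m = 65
c = pow(m, e, n)               # encrypt
m1 = pow(c, dP, p)             # decrypt modulo p
m2 = pow(c, dQ, q)             # decrypt modulo q
h = (qInv * (m1 - m2)) % p     # Garner recombination
recovered = m2 + h * q         # == m
```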
# --- bmtk/tests/simulator/bionet/conftest.py (aaberbach/bmtk, BSD-3-Clause) ---
try:
import bmtk.simulator.bionet as bionet
from bmtk.simulator.bionet.gids import GidPool
from bmtk.simulator.bionet.pyfunction_cache import *
nrn_installed = True
except ImportError:
    nrn_installed = False
# --- utils/LineModels.py (farnikn/MithraLIMSims, MIT) ---
from scipy.interpolate import interp2d, NearestNDInterpolator
from nbodykit.utils import DistributedArray
from nbodykit.lab import BigFileCatalog, MultipleSpeciesCatalog
from nbodykit.cosmology.cosmology import Cosmology
from pmesh.pm import ParticleMesh
from sfr import logSFR_Behroozi
gb_k_B = 1.38064852e-16 # Boltzmann constant in unit of erg K^-1
gb_L_sun = 3.83e33 # in unit of erg/s
gb_c = 2.99792458e5 # in units of km/s
gb_len_conv = 3.086e19 # conversion factor from Mpc to km
gb_h = 0.677
cosmodef = {'omegam':0.309167, 'h':0.677, 'omegab':0.048}
cosmo = Cosmology.from_dict(cosmodef)
def H(z):
return 100.* cosmo.h * cosmo.efunc(z)
# ########################################### ###########################################
# ########################################### MODELS ###########################################
# ########################################### ###########################################
class ModelHI_A():
def __init__(self, aa):
self.aa = aa
self.zz = 1/aa-1
self.alp = (1+2*self.zz)/(2+2*self.zz)
#self.mcut = 1e9*( 1.8 + 15*(3*self.aa)**8)
self.mcut = 3e9*( 1 + 10*(3*self.aa)**8)
###self.normhalo = 3e5*(1+(3.5/self.zz)**6)
###self.normhalo = 3e7 *(4+(3.5/self.zz)**6)
self.normhalo = 8e5*(1+(3.5/self.zz)**6)
self.normsat = self.normhalo*(1.75 + 0.25*self.zz)
        self.normsat *= 0.5  # This is to avoid negative masses
#z=1
if np.abs(self.zz-1.0)<0.1:
self.alp = 0.76
self.mcut = 2.6e10
self.normhalo = 4.6e8
self.normsat = self.normsat/5
#z=0.5
if np.abs(self.zz-0.5)<0.1:
self.alp = 0.63
self.mcut = 3.7e10
self.normhalo = 9.8e8
self.normsat = self.normsat/100
#z=0
if np.abs(self.zz - 0.0)<0.1:
#print('Modify')
self.alp = 0.49
self.mcut = 5.2e10
self.normhalo = 2.1e9
self.normsat = self.normsat/100
def assignline(self, halocat, cencat, satcat):
mHIhalo = self.assignhalo(halocat['Mass'].compute())
mHIsat = self.assignsat(satcat['Mass'].compute())
mHIcen = self.assigncen(mHIhalo, mHIsat, satcat['GlobalID'].compute(),
cencat.csize, cencat.comm)
return mHIhalo, mHIcen, mHIsat
def assignhalo(self, mhalo):
xx = mhalo/self.mcut+1e-10
mHI = xx**self.alp * np.exp(-1/xx)
mHI*= self.normhalo
return mHI
def assignsat(self, msat):
xx = msat/self.mcut+1e-10
mHI = xx**self.alp * np.exp(-1/xx)
mHI*= self.normsat
return mHI
def getinsat(self, mHIsat, satid, totalsize, localsize, comm):
#print(comm.rank, np.all(np.diff(satid) >=0))
#diff = np.diff(satid)
#if comm.rank == 260:
# print(satid[:-1][diff <0], satid[1:][diff < 0])
da = DistributedArray(satid, comm)
mHI = da.bincount(mHIsat, shared_edges=False)
zerosize = totalsize - mHI.cshape[0]
zeros = DistributedArray.cempty(cshape=(zerosize, ), dtype=mHI.local.dtype, comm=comm)
zeros.local[...] = 0
mHItotal = DistributedArray.concat(mHI, zeros, localsize=localsize)
return mHItotal
def assigncen(self, mHIhalo, mHIsat, satid, censize, comm):
#Assumes every halo has a central...which it does...usually
mHItotal = self.getinsat(mHIsat, satid, censize, mHIhalo.size, comm)
return mHIhalo - mHItotal.local
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of HI
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
rankweight = sum([wt.sum() for wt in weights])
totweight = comm.allreduce(rankweight)
for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
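assignhalo above is a smoothly truncated power law: mHI = normhalo * x**alp * exp(-1/x) with x = M/mcut. A self-contained sketch of that formula (the z=1 parameter values are copied from __init__; `hi_mass` is an illustrative name, not part of the module):

```python
import numpy as np

def hi_mass(mhalo, alpha=0.76, mcut=2.6e10, norm=4.6e8):
    """M_HI(M_halo) = norm * x**alpha * exp(-1/x), x = M/mcut (z=1 values)."""
    xx = mhalo / mcut + 1e-10          # small offset avoids division by zero
    return norm * xx**alpha * np.exp(-1.0 / xx)

masses = np.array([1e10, 1e11, 1e12, 1e13])
# HI mass is exponentially suppressed below mcut, roughly a power law above it
print(hi_mass(masses))
```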
# ###########################################
class ModelHI_A2(ModelHI_A):
'''Same as model A with a different RSD for satellites
'''
def __init__(self, aa):
super().__init__(aa)
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity_HI']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
# ###########################################
class ModelHI_B():
def __init__(self, aa, h=0.6776):
self.aa = aa
self.zz = 1/aa-1
self.h = h
#self.mcut = 1e9*( 1.8 + 15*(3*self.aa)**8 )
self.mcut = 3e9*( 1 + 10*(3*self.aa)**8)
self.normhalo = 1
#self.slope, self.intercept = np.polyfit([8.1, 11], [0.2, -1.], deg=1)
def assignline(self, halocat, cencat, satcat):
mHIsat = self.assignsat(satcat['Mass'].compute())
mHIcen = self.assigncen(cencat['Mass'].compute())
mHIhalo = self.assignhalo(mHIcen, mHIsat, satcat['GlobalID'].compute(),
halocat.csize, halocat.comm)
return mHIhalo, mHIcen, mHIsat
def assignhalo(self, mHIcen, mHIsat, satid, hsize, comm):
#Assumes every halo has a central...which it does...usually
mHItotal = self.getinsat(mHIsat, satid, hsize, mHIcen.size, comm)
return mHIcen + mHItotal.local
def getinsat(self, mHIsat, satid, totalsize, localsize, comm):
da = DistributedArray(satid, comm)
mHI = da.bincount(mHIsat, shared_edges=False)
zerosize = totalsize - mHI.cshape[0]
zeros = DistributedArray.cempty(cshape=(zerosize, ), dtype=mHI.local.dtype, comm=comm)
zeros.local[...] = 0
mHItotal = DistributedArray.concat(mHI, zeros, localsize=localsize)
return mHItotal
def _assign(self, mstellar):
'''Takes in M_stellar and gives M_HI in M_solar
'''
mm = 3e8 #5e7
f = 0.18 #0.35
alpha = 0.4 #0.35
mfrac = f*(mm/(mstellar + mm))**alpha
mh1 = mstellar * mfrac
return mh1
def assignsat(self, msat, scatter=None):
mstellar = self.moster(msat, scatter=scatter)/self.h
mh1 = self._assign(mstellar)
mh1 = mh1*self.h #* np.exp(-self.mcut/msat)
return mh1
def assigncen(self, mcen, scatter=None):
mstellar = self.moster(mcen, scatter=scatter)/self.h
mh1 = self._assign(mstellar)
mh1 = mh1*self.h #* np.exp(-self.mcut/mcen)
return mh1
def moster(self, Mhalo, scatter=None):
        """
        moster(Minf, z):
        Returns the stellar mass (M*/h) given Minf and z from Table 1 and
        Eqs. (2, 11-14) of Moster+13 [arXiv:1205.5807].
        This version works in Msun/h units: the input is converted to Msun
        internally and the result is converted back to Msun/h on return.
        To get the "true" stellar mass, add 0.15 dex of lognormal scatter.
        To get the "observed" stellar mass, add 0.1-0.45 dex of extra scatter.
        """
z = self.zz
Minf = Mhalo/self.h
zzp1 = z/(1+z)
M1 = 10.0**(11.590+1.195*zzp1)
mM = 0.0351 - 0.0247*zzp1
beta = 1.376 - 0.826*zzp1
gamma = 0.608 + 0.329*zzp1
Mstar = 2*mM/( (Minf/M1)**(-beta) + (Minf/M1)**gamma )
Mstar*= Minf
if scatter is not None:
Mstar = 10**(np.log10(Mstar) + np.random.normal(0, scatter, Mstar.size))
return Mstar*self.h
#
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of HI
'''
pm = ParticleMesh(BoxSize=bs,Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
rankweight = sum([wt.sum() for wt in weights])
totweight = comm.allreduce(rankweight)
for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
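ModelHI_B chains two relations: the Moster+13 stellar-to-halo-mass relation in moster, then the stellar-to-HI fraction in _assign. The Moster step can be checked in isolation; `moster_mstar` below is a standalone re-derivation with the same Table-1 coefficients, no scatter, working directly in Msun:

```python
import numpy as np

def moster_mstar(minf, z=0.0):
    """Stellar mass in Msun from halo mass in Msun (Moster+13, Eq. 2)."""
    zzp1 = z / (1.0 + z)
    m1 = 10.0**(11.590 + 1.195 * zzp1)   # characteristic halo mass
    mm = 0.0351 - 0.0247 * zzp1          # peak stellar fraction / 2
    beta = 1.376 - 0.826 * zzp1          # low-mass slope
    gamma = 0.608 + 0.329 * zzp1         # high-mass slope
    return 2.0 * mm * minf / ((minf / m1)**(-beta) + (minf / m1)**gamma)

# The stellar fraction M*/M peaks near M1 and falls off on both sides:
for m in (1e11, 10**11.59, 1e13):
    print(m, moster_mstar(m) / m)
```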
# ###########################################
class ModelHI_C(ModelHI_A):
    '''Vanilla model with no centrals or satellites, only halos.
    Halos carry the COM velocity with no extra dispersion on top of it.
    '''
def __init__(self, aa):
super().__init__(aa)
self.normsat = 0
#self.alp = 1.0
#self.mcut = 1e9
#self.normhalo = 2e5*(1+3/self.zz**2)
#self.normhalo = 1.1e5*(1+4/self.zz)
self.alp = 0.9
self.mcut = 1e10
self.normhalo = 3.5e6*(1+1/self.zz)
def derivate(self, param, delta):
if param == 'alpha':
self.alp = (1+delta)*self.alp
elif param == 'mcut':
self.mcut = 10**( (1+delta)*np.log10(self.mcut))
        elif param == 'norm':
            self.normhalo = 10**( (1+delta)*np.log10(self.normhalo))
else:
            print('Parameter to vary not recognized. Should be "alpha", "mcut" or "norm"')
def assignline(self, halocat, cencat, satcat):
mHIhalo = self.assignhalo(halocat['Mass'].compute())
mHIsat = self.assignsat(satcat['Mass'].compute())
mHIcen = self.assigncen(cencat['Mass'].compute())
return mHIhalo, mHIcen, mHIsat
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of HI
'''
pm = ParticleMesh(BoxSize=bs,Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
rankweight = sum([wt.sum() for wt in weights])
totweight = comm.allreduce(rankweight)
for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='halos', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
        else: raise ValueError("mode '%s' not recognized; only 'halos' is supported" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
# ###########################################
class ModelHI_C2(ModelHI_C):
'''Vanilla model with no centrals and satellites, only halo
Halos have the COM velocity and a dispersion from VN18 added over it
'''
def __init__(self, aa):
super().__init__(aa)
self.vdisp = self._setupvdisp()
def _setupvdisp(self):
vzdisp0 = np.array([31, 34, 39, 44, 51, 54])
vzdispal = np.array([0.35, 0.37, 0.38, 0.39, 0.39, 0.40])
vdispz = np.arange(0, 6)
vdisp0fit = np.polyfit(vdispz, vzdisp0, 1)
vdispalfit = np.polyfit(vdispz, vzdispal, 1)
vdisp0 = self.zz * vdisp0fit[0] + vdisp0fit[1]
vdispal = self.zz * vdispalfit[0] + vdispalfit[1]
return lambda M: vdisp0*(M/1e10)**vdispal
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
dispersion = np.random.normal(0, self.vdisp(halocat['Mass'].compute())).reshape(-1, 1)
hvel = halocat['Velocity']*los + dispersion*los
hrsdpos = halocat['Position']+ hvel*rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
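_setupvdisp linearly extrapolates the tabulated VN18 dispersion parameters in redshift and returns vdisp(M) = vdisp0(z) * (M/1e10)**alpha(z). The fit can be reproduced standalone (`vdisp` here is a free function rather than the closure returned above):

```python
import numpy as np

# Tabulated dispersion normalisation (km/s) and mass slope at z = 0..5 (VN18)
vzdisp0 = np.array([31, 34, 39, 44, 51, 54])
vzdispal = np.array([0.35, 0.37, 0.38, 0.39, 0.39, 0.40])
zgrid = np.arange(6)

slope0, icept0 = np.polyfit(zgrid, vzdisp0, 1)     # linear fit in z
slopea, icepta = np.polyfit(zgrid, vzdispal, 1)

def vdisp(mass, z):
    """Velocity dispersion in km/s for halo mass in Msun/h at redshift z."""
    v0 = slope0 * z + icept0
    al = slopea * z + icepta
    return v0 * (mass / 1e10)**al
```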
# ###########################################
class ModelHI_D():
    '''Halo-only model: satellite HI is represented by replicating each halo,
    with a VN18 velocity dispersion added on top of the COM velocity.
    '''
def __init__(self, aa):
self.aa = aa
self.zz = 1/aa-1
self.alp = (1.+2*self.zz)/(2.+2*self.zz)
self.mcut = 6e10*np.exp(-0.75*self.zz) + 1
self.normhalo = 1.7e9/(1+self.zz)**(5./3.)
self.nsatfhalo = 0.1
def fsat_h1(self, mhalo):
logmass = np.log10(mhalo)
mminf = 9.5 #10 - 0.2*self.zz
mhalf = 12.8 #13 - 0.1*self.zz
fsat = 0.5/(mhalf-mminf)**2 * (logmass-mminf)**2
fsat[logmass < mminf] = 0
fsat[fsat > 0.8] = 0.8
return fsat
def nsat_h1(self, mhalo):
return ((1 + self.nsatfhalo*mhalo / self.mcut)**0.5).astype(int) #2 #int(mhalo*0 + 2)
def vdisp(self, mhalo):
h = cosmo.efunc(self.zz)
return 1100. * (h * mhalo / 1e15) ** 0.33333
def assignline(self, halocat, cencat, satcat):
if halocat.comm.rank == 0:
if cencat is not None: print("\nCencat not used")
if satcat is not None: print("\nSatcat not used")
mHIhalo = self.assignhalo(halocat['Mass'].compute())
mHIsat = self.assignsat(halocat['Mass'].compute(), mHIhalo)
mHIcen = self.assigncen(mHIhalo, mHIsat)
#now repeat satellite catalog and reduce mass
nsat = self.nsat_h1(halocat['Mass'].compute())
mHIsat = np.repeat(mHIsat/nsat, nsat, axis=0)
return mHIhalo, mHIcen, mHIsat
def assignhalo(self, mhalo):
xx = (mhalo + 1e-30)/self.mcut+1e-10
mHI = xx**self.alp * np.exp(-1.0/xx)
mHI*= self.normhalo
return mHI
def assignsat(self, mhalo, mh1halo):
frac = self.fsat_h1(mhalo)
mHI = mh1halo*frac
return mHI
def assigncen(self, mHIhalo, mHIsat):
#Assumes every halo has a central...which it does...usually
return mHIhalo - mHIsat
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = hrsdpos #cencat['Position']+cencat['Velocity']*los * rsdfac
#now satellites
nsat = self.nsat_h1(halocat['Mass'].compute())
dispersion = np.random.normal(0, np.repeat(self.vdisp(halocat['Mass'].compute()), nsat)).reshape(-1, 1)*los
hvel = np.repeat(halocat['Velocity'].compute(), nsat, axis=0)*los
hveldisp = hvel + dispersion
srsdpos = np.repeat(halocat['Position'].compute(), nsat, axis=0) + hveldisp*rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of HI
'''
pm = ParticleMesh(BoxSize=bs,Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
rankweight = sum([wt.sum() for wt in weights])
totweight = comm.allreduce(rankweight)
for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
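fsat_h1 above gives the fraction of a halo's HI living in satellites: zero below log10(M) = 9.5, growing quadratically in log-mass, and capped at 0.8. A standalone copy for inspection (pivot values copied from the method):

```python
import numpy as np

def fsat_h1(mhalo, mminf=9.5, mhalf=12.8):
    """Satellite HI fraction, quadratic in log10(M/Msun), clipped to [0, 0.8]."""
    logm = np.log10(mhalo)
    fsat = 0.5 / (mhalf - mminf)**2 * (logm - mminf)**2
    fsat[logm < mminf] = 0.0     # no satellite HI below the threshold mass
    fsat[fsat > 0.8] = 0.8       # cap the satellite fraction
    return fsat

# Reaches 0.5 at mhalf by construction, saturates at 0.8 for massive halos
print(fsat_h1(np.array([1e9, 10**12.8, 1e15])))
```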
# ###########################################
class ModelHI_D2():
    '''Same halo-based model as ModelHI_D, but satellites are painted in
    createmesh as self.nsat randomly dispersed copies of each halo.
    '''
def __init__(self, aa):
self.aa = aa
self.zz = 1/aa-1
self.alp = (1.+2*self.zz)/(2.+2*self.zz)
self.mcut = 6e10*np.exp(-0.75*self.zz) + 1
self.normhalo = 1.7e9/(1+self.zz)**(5./3.)
self.nsat = 10
def fsat_h1(self, mhalo):
logmass = np.log10(mhalo)
mminf = 9.5 #10 - 0.2*self.zz
mhalf = 12.8 #13 - 0.1*self.zz
fsat = 0.5/(mhalf-mminf)**2 * (logmass-mminf)**2
fsat[logmass < mminf] = 0
fsat[fsat > 0.8] = 0.8
return fsat
def nsat_h1(self, mhalo):
return 2 #int(mhalo*0 + 2)
def vdisp(self, mhalo):
h = cosmo.efunc(self.zz)
return 1100. * (h * mhalo / 1e15) ** 0.33333
def assignline(self, halocat, cencat, satcat):
if halocat.comm.rank == 0:
if cencat is not None: print("\nCencat not used")
if satcat is not None: print("\nSatcat not used")
mHIhalo = self.assignhalo(halocat['Mass'].compute())
mHIsat = self.assignsat(halocat['Mass'].compute(), mHIhalo)
mHIcen = self.assigncen(mHIhalo, mHIsat)
return mHIhalo, mHIcen, mHIsat
def assignhalo(self, mhalo):
xx = (mhalo + 1e-30)/self.mcut+1e-10
mHI = xx**self.alp * np.exp(-1.0/xx)
mHI*= self.normhalo
return mHI
def assignsat(self, mhalo, mh1halo):
frac = self.fsat_h1(mhalo)
mHI = mh1halo*frac
return mHI
def assigncen(self, mHIhalo, mHIsat):
#Assumes every halo has a central...which it does...usually
return mHIhalo - mHIsat
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = hrsdpos
#now for satellites, return only the dispersion and factor it into account while painting
srsdpos = self.vdisp(halocat['Mass'].compute()).reshape(-1, 1)*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of HI
'''
pm = ParticleMesh(BoxSize=bs,Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
rankweight = sum([wt.sum() for wt in weights])
totweight = comm.allreduce(rankweight)
for wt in weights: wt /= totweight/float(nc)**3
lay = pm.decompose(positions[0])
mesh.paint(positions[0], mass=weights[0], layout=lay, hold=True)
if len(positions) > 1:
for i in range(self.nsat):
shift = np.random.normal(0, positions[1])
pos = positions[0] + shift
lay = pm.decompose(pos)
mesh.paint(pos, mass=weights[1]/self.nsat, layout=lay, hold=True)
return mesh
# ###########################################
class ModelCII_A():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.nu_line = 1902e9
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CII = 0.8475
self.b_CII = 7.2203
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
lCII = np.power(10., self.a_CII * logSFR + self.b_CII) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
# log10(L) = a*log10(SFR) + b + Normal(0, 0.37)
lCII[logSFR == -1000.] = 0.
return self.L_fac * lCII
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of the Line
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
# mesh = mesh/mesh.cmean()
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of the Line
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs, Nmesh=[nc,nc,nc], position=position, weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
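All four CII models share one recipe, log10(L_CII) = a_CII*log10(SFR) + b_CII plus 0.37 dex of lognormal scatter, and differ only in (a, b). A self-contained sketch with the ModelCII_A coefficients; since sfr.py is not shown here, logSFR is taken as a given array, the Behroozi sentinel value -1000 is zeroed out as in assignhalo, and scatter is off by default for reproducibility (`lcii_from_sfr` is an illustrative helper):

```python
import numpy as np

def lcii_from_sfr(logsfr, a=0.8475, b=7.2203, scatter_dex=0.0, rng=None):
    """L_CII in Lsun from log10(SFR); optional lognormal scatter in dex."""
    logsfr = np.asarray(logsfr, dtype=float)
    logl = a * logsfr + b
    if scatter_dex > 0:
        rng = np.random.default_rng() if rng is None else rng
        logl = logl + rng.normal(0.0, scatter_dex, size=logl.shape)
    lcii = 10.0**logl
    lcii[logsfr == -1000.0] = 0.0   # Behroozi sentinel: no star formation
    return lcii
```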
# ###########################################
class ModelCII_B():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.nu_line = 1902e9
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CII = 1.0000
self.b_CII = 6.9647
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
lCII = np.power(10., self.a_CII * logSFR + self.b_CII) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
lCII[logSFR == -1000.] = 0.
return self.L_fac * lCII
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of Line
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
# ###########################################
class ModelCII_C():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.nu_line = 1902e9
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CII = 0.8727
self.b_CII = 6.7250
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
lCII = np.power(10., self.a_CII * logSFR + self.b_CII) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
lCII[logSFR == -1000.] = 0.
return self.L_fac * lCII
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of Line
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
# ###########################################
class ModelCII_D():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.nu_line = 1902e9
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CII = 0.9231
self.b_CII = 6.5234
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
lCII = np.power(10., self.a_CII * logSFR + self.b_CII) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
lCII[logSFR == -1000.] = 0.
return self.L_fac * lCII
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of Line
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
# ###########################################
class ModelCO10():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.delta_MF = 1.0
self.J = 1.0
self.nu_line = self.J*115.27e9 # in unit of Hz for CO
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CO = 1.27
self.b_CO = -1.0
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
L_IR = np.power(10., logSFR)/self.delta_MF * np.power(10., 10.)
L_IR[logSFR == -1000.] = 0.
Lprime_CO = np.power(10., (np.log10(L_IR)-self.b_CO)/self.a_CO) * np.power(10., 0.37*np.random.randn(len(logMhalo))) #in unit of K km/s pc^2
#log10(LCO_prime) = (log10(L_IR) - b) / a + Normal(0, 0.37)
L_CO = 4.9 * 1.0e-5 * self.J**3. * Lprime_CO #in unit of L_sun
return self.L_fac * L_CO
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of the Line
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
# mesh = mesh/mesh.cmean()
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
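The CO models invert the relation log10(L_IR) = a_CO*log10(L'_CO) + b_CO, then convert the line luminosity from K km/s pc^2 to Lsun via L_CO = 4.9e-5 * J**3 * L'_CO. A standalone sketch with the CO(1-0) coefficients above, scatter omitted (`lco_from_lir` is an illustrative helper, not part of the module):

```python
import numpy as np

def lco_from_lir(l_ir, j=1.0, a=1.27, b=-1.0):
    """L_CO in Lsun from L_IR in Lsun, via L'_CO in K km/s pc^2."""
    lprime = 10.0**((np.log10(l_ir) - b) / a)   # invert log10(LIR) = a*log10(L') + b
    return 4.9e-5 * j**3 * lprime               # unit conversion scales as nu^3, i.e. J^3

# A 1e10 Lsun infrared source:
print(lco_from_lir(np.array([1e10])))
```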
# ###########################################
class ModelCO21():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.delta_MF = 1.0
self.J = 2.0
self.nu_line = self.J*115.27e9 # in unit of Hz for CO
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CO = 1.11
self.b_CO = 0.6
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
L_IR = np.power(10., logSFR)/self.delta_MF * np.power(10., 10.)
L_IR[logSFR == -1000.] = 0.
Lprime_CO = np.power(10., (np.log10(L_IR)-self.b_CO)/self.a_CO) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
#log10(LCO_prime) = (log10(L_IR) - b)/a + Normal(0, 0.37)
#in unit of K km/s pc^2
L_CO = 4.9 * 1.0e-5 * self.J**3. * Lprime_CO #in unit of L_sun
return self.L_fac * L_CO
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''use this to create mesh of Line
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''use this to create mesh of HI
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
        else: raise ValueError("mode '%s' not recognized; use 'halos', 'galaxies' or 'all'" % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
# ###########################################
class ModelCO32():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.delta_MF = 1.0
self.J = 3.0
self.nu_line = self.J*115.27e9 # in unit of Hz for CO
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CO = 1.18
self.b_CO = 0.1
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
L_IR = np.power(10., logSFR)/self.delta_MF * np.power(10., 10.)
L_IR[logSFR == -1000.] = 0.
Lprime_CO = np.power(10., (np.log10(L_IR)-self.b_CO)/self.a_CO) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
#in unit of K km/s pc^2
L_CO = 4.9 * 1.0e-5 * self.J**3. * Lprime_CO #in unit of L_sun
return self.L_fac * L_CO
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
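`assignrsd` above displaces each position along the line of sight by `velocity * rsdfac`; since `los = [0, 0, 1]`, only the z-component of the velocity contributes. A pure-Python sketch of the same per-particle operation (hypothetical helper):

```python
def apply_rsd(position, velocity, rsdfac, los=(0.0, 0.0, 1.0)):
    # component-wise: only the line-of-sight component of the velocity
    # displaces the position (los = [0, 0, 1] picks the z axis)
    return [p + v * l * rsdfac for p, v, l in zip(position, velocity, los)]

apply_rsd([10.0, 20.0, 30.0], [1.0, 2.0, 3.0], rsdfac=2.0)  # -> [10.0, 20.0, 36.0]
```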
def createmesh(self, bs, nc, positions, weights):
'''Paint the given positions, weighted by weights, onto a new ParticleMesh.
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''Create a weight-normalized density mesh from the selected catalogs.
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
else: raise ValueError('Mode not recognized: %s' % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
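The `totweight/float(nc)**3` division in `createmesh_catalog` rescales the weights so that they sum to the number of mesh cells, which makes the painted mesh's mean cell value ~1 (i.e. the field is 1 + delta). A single-rank sketch without the MPI allreduce (hypothetical helper):

```python
def normalize_weights(weights, nc):
    totweight = sum(weights)  # stands in for comm.allreduce(rankweight)
    # after dividing by totweight / nc**3, the weights sum to nc**3 cells
    return [w / (totweight / float(nc)**3) for w in weights]

w = normalize_weights([1.0, 2.0, 3.0], nc=2)
sum(w)  # -> 8.0, i.e. 2**3
```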
# ###########################################
class ModelCO43():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.delta_MF = 1.0
self.J = 4.0
self.nu_line = self.J*115.27e9 # rest-frame frequency of the CO(J -> J-1) line, in Hz
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CO = 1.09
self.b_CO = 1.2
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
L_IR = np.power(10., logSFR)/self.delta_MF * np.power(10., 10.)
L_IR[logSFR == -1000.] = 0.
Lprime_CO = np.power(10., (np.log10(L_IR)-self.b_CO)/self.a_CO) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
# Lprime_CO above is in units of K km/s pc^2
L_CO = 4.9 * 1.0e-5 * self.J**3. * Lprime_CO # in units of L_sun
return self.L_fac * L_CO
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''Paint the given positions, weighted by weights, onto a new ParticleMesh.
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''Create a weight-normalized density mesh from the selected catalogs.
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
else: raise ValueError('Mode not recognized: %s' % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
# ###########################################
class ModelCO54():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.delta_MF = 1.0
self.J = 5.0
self.nu_line = self.J*115.27e9 # rest-frame frequency of the CO(J -> J-1) line, in Hz
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CO = 1.05
self.b_CO = 1.8
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
L_IR = np.power(10., logSFR)/self.delta_MF * np.power(10., 10.)
L_IR[logSFR == -1000.] = 0.
Lprime_CO = np.power(10., (np.log10(L_IR)-self.b_CO)/self.a_CO) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
# Lprime_CO above is in units of K km/s pc^2
L_CO = 4.9 * 1.0e-5 * self.J**3. * Lprime_CO # in units of L_sun
return self.L_fac * L_CO
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''Paint the given positions, weighted by weights, onto a new ParticleMesh.
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''Create a weight-normalized density mesh from the selected catalogs.
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
else: raise ValueError('Mode not recognized: %s' % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh
# ###########################################
class ModelCO65():
def __init__(self, aa):
self.aa = aa
self.zz = 1./aa - 1.
self.mcut = 1.0e9
self.delta_MF = 1.0
self.J = 6.0
self.nu_line = self.J*115.27e9 # rest-frame frequency of the CO(J -> J-1) line, in Hz
self.L_fac = 1e6 / (8.*np.pi*gb_k_B*H(self.zz)) * (gb_c/self.nu_line)**3. * (1.+self.zz)**2. * gb_len_conv**-2. * gb_L_sun * gb_h**3.
self.a_CO = 1.04
self.b_CO = 2.2
def assignline(self, halocat, cencat, satcat):
haloL = self.assignhalo(halocat['Mass'].compute())
satL = self.assignsat(satcat['Mass'].compute())
cenL = self.assigncen(cencat['Mass'].compute())
return haloL, satL, cenL
def assignhalo(self, mhalo):
logMhalo = np.log10(mhalo[self.mcut < mhalo]/gb_h)
logSFR = logSFR_Behroozi(z=self.zz, logMList=logMhalo)
L_IR = np.power(10., logSFR)/self.delta_MF * np.power(10., 10.)
L_IR[logSFR == -1000.] = 0.
Lprime_CO = np.power(10., (np.log10(L_IR)-self.b_CO)/self.a_CO) * np.power(10., 0.37*np.random.randn(len(logMhalo)))
# Lprime_CO above is in units of K km/s pc^2
L_CO = 4.9 * 1.0e-5 * self.J**3. * Lprime_CO # in units of L_sun
return self.L_fac * L_CO
def assignsat(self, msat):
return msat*0
def assigncen(self, mcen):
return mcen*0
def assignrsd(self, rsdfac, halocat, cencat, satcat, los=[0,0,1]):
hrsdpos = halocat['Position']+halocat['Velocity']*los * rsdfac
crsdpos = cencat['Position']+cencat['Velocity']*los * rsdfac
srsdpos = satcat['Position']+satcat['Velocity']*los * rsdfac
return hrsdpos, crsdpos, srsdpos
def createmesh(self, bs, nc, positions, weights):
'''Paint the given positions, weighted by weights, onto a new ParticleMesh.
'''
pm = ParticleMesh(BoxSize=bs, Nmesh=[nc,nc,nc])
mesh = pm.create(mode='real', value=0)
comm = pm.comm
# rankweight = sum([wt.sum() for wt in weights])
# totweight = comm.allreduce(rankweight)
# for wt in weights: wt /= totweight/float(nc)**3
for i in range(len(positions)):
lay = pm.decompose(positions[i])
mesh.paint(positions[i], mass=weights[i], layout=lay, hold=True)
return mesh
def createmesh_catalog(self, bs, nc, halocat, cencat, satcat, mode='galaxies', position='RSDpos', weight='HImass', tofield=False):
'''Create a weight-normalized density mesh from the selected catalogs.
'''
comm = halocat.comm
if mode == 'halos': catalogs = [halocat]
elif mode == 'galaxies': catalogs = [cencat, satcat]
elif mode == 'all': catalogs = [halocat, cencat, satcat]
else: raise ValueError('Mode not recognized: %s' % mode)
rankweight = sum([cat[weight].sum().compute() for cat in catalogs])
totweight = comm.allreduce(rankweight)
for cat in catalogs: cat[weight] /= totweight/float(nc)**3
allcat = MultipleSpeciesCatalog(['%d'%i for i in range(len(catalogs))], *catalogs)
mesh = allcat.to_mesh(BoxSize=bs,Nmesh=[nc,nc,nc],\
position=position,weight=weight)
if tofield: mesh = mesh.to_field()
return mesh | 38.205363 | 148 | 0.554611 | 6,929 | 54,137 | 4.274643 | 0.061336 | 0.014788 | 0.035923 | 0.01418 | 0.88612 | 0.876296 | 0.873899 | 0.866471 | 0.855025 | 0.853169 | 0 | 0.032132 | 0.290042 | 54,137 | 1,417 | 149 | 38.205363 | 0.738494 | 0.110571 | 0 | 0.822563 | 0 | 0 | 0.040046 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.145674 | false | 0 | 0.007667 | 0.029573 | 0.297919 | 0.019715 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1b944bba407fbc4932c2d491573b51f9f06a60d8 | 9,902 | py | Python | enel_service/test/modeling/test_model_ops.py | dos-group/enel-experiments | a511b03a2a2577d4ce372aa44e475df8005eb394 | [
"MIT"
] | 1 | 2022-03-25T14:03:57.000Z | 2022-03-25T14:03:57.000Z | enel_service/test/modeling/test_model_ops.py | dos-group/enel-experiments | a511b03a2a2577d4ce372aa44e475df8005eb394 | [
"MIT"
] | null | null | null | enel_service/test/modeling/test_model_ops.py | dos-group/enel-experiments | a511b03a2a2577d4ce372aa44e475df8005eb394 | [
"MIT"
] | null | null | null | from unittest import TestCase
import torch
from enel_service.modeling.model_ops import RuntimeConv, MetricConv, OverheadConv
class TestOverheadConv(TestCase):
def setUp(self) -> None:
self.context_tensor = torch.rand(5, 40)
self.x_start_scale_out_vec = torch.rand(5, 3)
self.x_end_scale_out_vec = torch.rand(5, 3)
self.x_rescaling_time_ratio = torch.rand(5, 1)
self.metric_tensor = torch.rand(5, 10)
self.true_time = torch.abs(torch.rand(5, 1))
self.edge_index = torch.ones(5, 5).nonzero(as_tuple=False).t()
self.real_nodes_batch = torch.tensor(list(range(2, 5)), dtype=torch.long)
self.conv = OverheadConv(57, 1)
self.loss = torch.nn.MSELoss()
def test_module_scriptable(self):
try:
conv = self.conv
torch.jit.script(conv)
print(conv)
print(conv.get_num_parameters())
print(conv.get_num_trainable_parameters())
except BaseException as exc:
self.fail(f"Module is not scriptable, error: '{exc}'")
def test_module_forward(self):
# first: with "normal module"
pred_overhead = self.conv(self.x_start_scale_out_vec, self.x_end_scale_out_vec, self.x_rescaling_time_ratio,
self.context_tensor, self.metric_tensor, self.real_nodes_batch)
self.assertTrue(isinstance(pred_overhead, torch.Tensor))
self.assertEqual(pred_overhead.size(), (self.context_tensor.size()[0], 1))
self.assertEqual(pred_overhead.isnan().sum(), 0)
self.assertEqual(pred_overhead.isinf().sum(), 0)
# now: with torchscript module
script_conv = torch.jit.script(self.conv)
pred_overhead = script_conv(self.x_start_scale_out_vec, self.x_end_scale_out_vec, self.x_rescaling_time_ratio,
self.context_tensor, self.metric_tensor, self.real_nodes_batch)
self.assertTrue(isinstance(pred_overhead, torch.Tensor))
self.assertEqual(pred_overhead.size(), (self.context_tensor.size()[0], 1))
self.assertEqual(pred_overhead.isnan().sum(), 0)
self.assertEqual(pred_overhead.isinf().sum(), 0)
def test_module_backward(self):
try:
# first: with "normal module"
pred_overhead = self.conv(self.x_start_scale_out_vec, self.x_end_scale_out_vec, self.x_rescaling_time_ratio,
self.context_tensor, self.metric_tensor, self.real_nodes_batch)
loss = self.loss(pred_overhead, self.true_time)
loss.backward(retain_graph=True)
# now: with torchscript module
script_conv = torch.jit.script(self.conv)
pred_overhead = script_conv(self.x_start_scale_out_vec, self.x_end_scale_out_vec,
self.x_rescaling_time_ratio, self.context_tensor,
self.metric_tensor, self.real_nodes_batch)
loss = self.loss(pred_overhead, self.true_time)
loss.backward()
except BaseException as exc:
self.fail(f"Module output produces error during backpropagation, error: '{exc}'")
class TestRuntimeConv(TestCase):
def setUp(self) -> None:
self.context_tensor = torch.rand(5, 40)
self.x_scale_out_vec = torch.rand(5, 3)
self.metric_tensor = torch.rand(5, 10)
self.overhead_tensor = torch.rand(5, 1)
self.time_cumsum_tensor = torch.rand(5, 1)
self.true_time = torch.abs(torch.rand(5, 1))
self.edge_index = torch.ones(5, 5).nonzero(as_tuple=False).t()
self.real_nodes_batch = torch.tensor(list(range(2, 5)), dtype=torch.long)
self.conv = RuntimeConv(54, 1)
self.loss = torch.nn.MSELoss()
def test_module_jittable(self):
try:
conv = self.conv.jittable()
torch.jit.script(conv)
print(conv)
print(conv.get_num_parameters())
print(conv.get_num_trainable_parameters())
except BaseException as exc:
self.fail(f"Module is not jittable, error: '{exc}'")
def test_module_forward(self):
# first: with "normal module"
pred_time, prop_time = self.conv(self.edge_index, self.x_scale_out_vec,
self.context_tensor, self.metric_tensor, self.real_nodes_batch,
self.overhead_tensor, self.time_cumsum_tensor)
self.assertTrue(isinstance(pred_time, torch.Tensor))
self.assertEqual(pred_time.size(), (self.context_tensor.size()[0], 1))
self.assertEqual(pred_time.isnan().sum(), 0)
self.assertEqual(pred_time.isinf().sum(), 0)
self.assertTrue(isinstance(prop_time, torch.Tensor))
self.assertEqual(prop_time.size(), (self.context_tensor.size()[0], 1))
self.assertEqual(prop_time.isnan().sum(), 0)
self.assertEqual(prop_time.isinf().sum(), 0)
# now: with torchscript module
script_conv = torch.jit.script(self.conv.jittable())
pred_time, prop_time = script_conv(self.edge_index, self.x_scale_out_vec,
self.context_tensor, self.metric_tensor, self.real_nodes_batch,
self.overhead_tensor, self.time_cumsum_tensor)
self.assertTrue(isinstance(pred_time, torch.Tensor))
self.assertEqual(pred_time.size(), (self.context_tensor.size()[0], 1))
self.assertEqual(pred_time.isnan().sum(), 0)
self.assertEqual(pred_time.isinf().sum(), 0)
self.assertTrue(isinstance(prop_time, torch.Tensor))
self.assertEqual(prop_time.size(), (self.context_tensor.size()[0], 1))
self.assertEqual(prop_time.isnan().sum(), 0)
self.assertEqual(prop_time.isinf().sum(), 0)
def test_module_backward(self):
try:
# first: with "normal module"
pred_time, prop_time = self.conv(self.edge_index, self.x_scale_out_vec,
self.context_tensor, self.metric_tensor, self.real_nodes_batch,
self.overhead_tensor, self.time_cumsum_tensor)
loss = self.loss(pred_time, self.true_time)
loss.backward(retain_graph=True)
loss = self.loss(prop_time, self.true_time)
loss.backward(retain_graph=True)
# now: with torchscript module
script_conv = torch.jit.script(self.conv.jittable())
pred_time, prop_time = script_conv(self.edge_index, self.x_scale_out_vec,
self.context_tensor, self.metric_tensor, self.real_nodes_batch,
self.overhead_tensor, self.time_cumsum_tensor)
loss = self.loss(pred_time, self.true_time)
loss.backward(retain_graph=True)
loss = self.loss(prop_time, self.true_time)
loss.backward()
except BaseException as exc:
self.fail(f"Module output produces error during backpropagation, error: '{exc}'")
class TestMetricConv(TestCase):
def setUp(self) -> None:
self.x_start_scale_out_vec = torch.rand(5, 3)
self.x_end_scale_out_vec = torch.rand(5, 3)
self.context_tensor = torch.rand(5, 40)
self.metric_tensor = torch.rand(5, 10)
self.true_metric = torch.abs(torch.rand(5, 10))
self.edge_index = torch.ones(5, 5).nonzero(as_tuple=False).t()
self.conv = MetricConv(46, 10, 10, 0.5)
self.loss = torch.nn.MSELoss()
def test_module_jittable(self):
try:
conv = self.conv.jittable()
torch.jit.script(conv)
print(conv)
print(conv.get_num_parameters())
print(conv.get_num_trainable_parameters())
except BaseException as exc:
self.fail(f"Module is not jittable, error: '{exc}'")
def test_module_forward(self):
# first: with "normal module"
prop_metric = self.conv(self.edge_index, self.x_start_scale_out_vec, self.x_end_scale_out_vec,
self.context_tensor, self.metric_tensor)
self.assertTrue(isinstance(prop_metric, torch.Tensor))
self.assertEqual(prop_metric.size(), self.metric_tensor.size())
self.assertEqual(prop_metric.isnan().sum(), 0)
self.assertEqual(prop_metric.isinf().sum(), 0)
# now: with torchscript module
script_conv = torch.jit.script(self.conv.jittable())
prop_metric = script_conv(self.edge_index, self.x_start_scale_out_vec, self.x_end_scale_out_vec,
self.context_tensor, self.metric_tensor)
self.assertTrue(isinstance(prop_metric, torch.Tensor))
self.assertEqual(prop_metric.size(), self.metric_tensor.size())
self.assertEqual(prop_metric.isnan().sum(), 0)
self.assertEqual(prop_metric.isinf().sum(), 0)
def test_module_backward(self):
try:
# first: with "normal module"
prop_metric = self.conv(self.edge_index, self.x_start_scale_out_vec, self.x_end_scale_out_vec,
self.context_tensor, self.metric_tensor)
loss = self.loss(prop_metric, self.true_metric)
loss.backward(retain_graph=True)
# now: with torchscript module
script_conv = torch.jit.script(self.conv.jittable())
prop_metric = script_conv(self.edge_index, self.x_start_scale_out_vec, self.x_end_scale_out_vec,
self.context_tensor, self.metric_tensor)
loss = self.loss(prop_metric, self.true_metric)
loss.backward()
except BaseException as exc:
self.fail(f"Module output produces error during backpropagation, error: '{exc}'")
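Each test class above repeats the same invariants: identical behavior eagerly and under `torch.jit.script`, the expected output shape, and no NaN/Inf entries. A framework-free sketch of that last check (hypothetical helper; the real tests use torch's `isnan()`/`isinf()` tensor methods):

```python
import math

def all_finite(values):
    # mirrors assertEqual(tensor.isnan().sum(), 0) and .isinf().sum() == 0
    return all(math.isfinite(v) for v in values)

all_finite([0.1, -2.0, 3.5])     # -> True
all_finite([1.0, float('inf')])  # -> False
```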
| 48.778325 | 120 | 0.626742 | 1,260 | 9,902 | 4.674603 | 0.086508 | 0.061121 | 0.046689 | 0.050934 | 0.957555 | 0.94601 | 0.935484 | 0.935484 | 0.922071 | 0.904754 | 0 | 0.012768 | 0.264391 | 9,902 | 202 | 121 | 49.019802 | 0.795854 | 0.034437 | 0 | 0.87037 | 0 | 0 | 0.033201 | 0 | 0 | 0 | 0 | 0 | 0.197531 | 1 | 0.074074 | false | 0 | 0.018519 | 0 | 0.111111 | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
59ed80b2191f001d26c6354b0a00d14706bf7e0c | 5,287 | py | Python | tests/timemodel_tests/test_event_sequence.py | dpazel/music_rep | 2f9de9b98b13df98f1a0a2120b84714725ce527e | [
"MIT"
] | 1 | 2021-05-06T19:45:54.000Z | 2021-05-06T19:45:54.000Z | tests/timemodel_tests/test_event_sequence.py | dpazel/music_rep | 2f9de9b98b13df98f1a0a2120b84714725ce527e | [
"MIT"
] | null | null | null | tests/timemodel_tests/test_event_sequence.py | dpazel/music_rep | 2f9de9b98b13df98f1a0a2120b84714725ce527e | [
"MIT"
] | null | null | null | import unittest
from timemodel.event_sequence import EventSequence
from timemodel.event import Event
from timemodel.position import Position
class TestEventSequence(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_successor(self):
es = EventSequence()
p1 = Event('str_1', Position(1, 2))
es.add(p1)
print(es)
assert es.successor(p1) is None
p2 = Event('str_2', Position(1))
es.add(p2)
print(es)
assert es.successor(p1) == p2
assert es.successor(p2) is None
p3 = Event('str_2', Position(3, 4))
es.add(p3)
print(es)
assert es.successor(p1) == p3
assert es.successor(p3) == p2
assert es.successor(p2) is None
def test_basic_succ_pred_sequence(self):
events = [Event(3, Position(0)),
Event(6, Position(1, 2)),
Event(2, Position(3, 4)),
Event(7, Position(1)),
Event(8, Position(3, 2))
]
es = EventSequence(events)
print(es)
assert es.successor(events[0]) == events[1]
assert es.successor(events[1]) == events[2]
assert es.successor(events[2]) == events[3]
assert es.successor(events[3]) == events[4]
assert es.successor(events[4]) is None
assert es.predecessor(events[4]) == events[3]
assert es.predecessor(events[3]) == events[2]
assert es.predecessor(events[2]) == events[1]
assert es.predecessor(events[1]) == events[0]
assert es.predecessor(events[0]) is None
assert es.first == events[0]
assert es.last == events[4]
es = EventSequence()
for i in reversed(range(len(events))):
es.add(events[i])
print(es)
assert es.successor(events[0]) == events[1]
assert es.successor(events[1]) == events[2]
assert es.successor(events[2]) == events[3]
assert es.successor(events[3]) == events[4]
assert es.successor(events[4]) is None
assert es.predecessor(events[4]) == events[3]
assert es.predecessor(events[3]) == events[2]
assert es.predecessor(events[2]) == events[1]
assert es.predecessor(events[1]) == events[0]
assert es.predecessor(events[0]) is None
assert es.first == events[0]
assert es.last == events[4]
es = EventSequence()
for i in [2, 4, 1, 3, 0]:
es.add(events[i])
print(es)
assert es.successor(events[0]) == events[1]
assert es.successor(events[1]) == events[2]
assert es.successor(events[2]) == events[3]
assert es.successor(events[3]) == events[4]
assert es.successor(events[4]) is None
assert es.predecessor(events[4]) == events[3]
assert es.predecessor(events[3]) == events[2]
assert es.predecessor(events[2]) == events[1]
assert es.predecessor(events[1]) == events[0]
assert es.predecessor(events[0]) is None
assert es.first == events[0]
assert es.last == events[4]
es.remove(events[3])
print('remove object 7')
print(es)
print(es.print_maps())
assert es.successor(events[0]) == events[1]
assert es.successor(events[1]) == events[2]
assert es.successor(events[2]) == events[4]
assert es.successor(events[4]) is None
assert es.predecessor(events[4]) == events[2]
assert es.predecessor(events[2]) == events[1]
assert es.predecessor(events[1]) == events[0]
assert es.predecessor(events[0]) is None
assert es.first == events[0]
assert es.last == events[4]
new_event = Event(23, Position(3, 4))
print('update object 2 to 23')
es.add(new_event)
print(es)
es.print_maps()
assert es.successor(events[0]) == events[1]
assert es.successor(events[1]) == new_event
assert es.successor(new_event) == events[4]
assert es.successor(events[4]) is None
assert es.predecessor(events[4]) == new_event
assert es.predecessor(new_event) == events[1]
assert es.predecessor(events[1]) == events[0]
assert es.predecessor(events[0]) is None
assert es.first == events[0]
assert es.last == events[4]
es.move_event(new_event, Position(1, 8))
print('move (23, 3/4) to (23, 1/8)')
print(es)
es.print_maps()
assert es.successor(events[0]) == new_event
assert es.successor(new_event) == events[1]
assert es.successor(events[1]) == events[4]
assert es.successor(events[4]) is None
assert es.predecessor(events[4]) == events[1]
assert es.predecessor(events[1]) == new_event
assert es.predecessor(new_event) == events[0]
assert es.predecessor(events[0]) is None
assert es.first == events[0]
assert es.last == events[4]
if __name__ == "__main__":
unittest.main()
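The successor/predecessor/move semantics exercised by the tests above can be sketched with a pair of parallel lists kept sorted by position (a simplification under assumed semantics; the real EventSequence keys on Position objects and supports event identity):

```python
import bisect

class SimpleEventSequence:
    def __init__(self):
        self._positions = []  # kept sorted ascending
        self._payloads = []   # parallel to _positions

    def add(self, position, payload):
        i = bisect.bisect_right(self._positions, position)
        self._positions.insert(i, position)
        self._payloads.insert(i, payload)

    def successor(self, position):
        i = bisect.bisect_right(self._positions, position)
        return self._payloads[i] if i < len(self._payloads) else None

    def predecessor(self, position):
        i = bisect.bisect_left(self._positions, position)
        return self._payloads[i - 1] if i > 0 else None

seq = SimpleEventSequence()
for pos, name in [(0.5, 'a'), (1.0, 'b'), (0.75, 'c')]:
    seq.add(pos, name)
seq.successor(0.5)    # -> 'c'
seq.predecessor(1.0)  # -> 'c'
```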
| 33.462025 | 53 | 0.555324 | 672 | 5,287 | 4.321429 | 0.089286 | 0.198347 | 0.193182 | 0.198003 | 0.767218 | 0.767218 | 0.740358 | 0.712466 | 0.654959 | 0.654959 | 0 | 0.043884 | 0.310384 | 5,287 | 157 | 54 | 33.675159 | 0.752606 | 0 | 0 | 0.596774 | 0 | 0 | 0.016266 | 0 | 0 | 0 | 0 | 0 | 0.580645 | 1 | 0.032258 | false | 0.016129 | 0.032258 | 0 | 0.072581 | 0.120968 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
947942d3aa3b3136543df07fa1502e1de7d44347 | 30 | py | Python | modulepackage/telly/telly-contrib/telly-po/tubbytronic/po.py | Chyi341152/chyi-book | ddeaf49d69a68f5718c20c3b7fe6fd37381d21eb | [
"MIT"
] | null | null | null | modulepackage/telly/telly-contrib/telly-po/tubbytronic/po.py | Chyi341152/chyi-book | ddeaf49d69a68f5718c20c3b7fe6fd37381d21eb | [
"MIT"
] | null | null | null | modulepackage/telly/telly-contrib/telly-po/tubbytronic/po.py | Chyi341152/chyi-book | ddeaf49d69a68f5718c20c3b7fe6fd37381d21eb | [
"MIT"
] | null | null | null | # po.py
print('imported po')
| 7.5 | 20 | 0.633333 | 5 | 30 | 3.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 30 | 3 | 21 | 10 | 0.76 | 0.166667 | 0 | 0 | 0 | 0 | 0.478261 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
947f8a673c4063e7cce492bfdf10ca79fdd7fec4 | 160 | py | Python | datamaps/api/__init__.py | hammerheadlemon/datamaps | 3605bb975f606bed1c009100d04a4048dfa28305 | [
"MIT"
] | 1 | 2020-10-16T12:52:12.000Z | 2020-10-16T12:52:12.000Z | datamaps/api/__init__.py | yulqen/datamaps | f07fd91889749bec2b7a5fce3af4f10b3e2b3f73 | [
"MIT"
] | 13 | 2020-09-07T13:19:11.000Z | 2021-11-06T16:11:36.000Z | datamaps/api/__init__.py | hammerheadlemon/datamaps | 3605bb975f606bed1c009100d04a4048dfa28305 | [
"MIT"
] | 1 | 2019-11-03T14:56:27.000Z | 2019-11-03T14:56:27.000Z | from .api import project_data_from_master_api as project_data_from_master
from .api import project_data_from_master_month_api as project_data_from_master_month
| 53.333333 | 85 | 0.9125 | 28 | 160 | 4.642857 | 0.285714 | 0.338462 | 0.461538 | 0.646154 | 1 | 0.923077 | 0.523077 | 0 | 0 | 0 | 0 | 0 | 0.075 | 160 | 2 | 86 | 80 | 0.878378 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
94873f8385e7780d1a0a17ee247bba0a917e9140 | 9,214 | py | Python | NVIDIAFastPhotoStyle/models.py | sleebapaul/AuriaKathi | d1705fc7e0919fd0a9e9f87a2593a9f7319886cf | [
"MIT"
] | 9 | 2019-12-31T15:53:33.000Z | 2021-05-06T06:36:22.000Z | NVIDIAFastPhotoStyle/models.py | sleebapaul/AuriaKathi | d1705fc7e0919fd0a9e9f87a2593a9f7319886cf | [
"MIT"
] | null | null | null | NVIDIAFastPhotoStyle/models.py | sleebapaul/AuriaKathi | d1705fc7e0919fd0a9e9f87a2593a9f7319886cf | [
"MIT"
] | 2 | 2020-01-01T05:03:24.000Z | 2020-01-03T13:07:26.000Z | """
Copyright (C) 2018 NVIDIA Corporation. All rights reserved.
Licensed under the CC BY-NC-SA 4.0 license (https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode).
"""
import torch.nn as nn
class VGGEncoder(nn.Module):
def __init__(self, level):
super(VGGEncoder, self).__init__()
self.level = level
# 224 x 224
self.conv0 = nn.Conv2d(3, 3, 1, 1, 0)
self.pad1_1 = nn.ReflectionPad2d((1, 1, 1, 1))
# 226 x 226
self.conv1_1 = nn.Conv2d(3, 64, 3, 1, 0)
self.relu1_1 = nn.ReLU(inplace=True)
# 224 x 224
if level < 2: return
self.pad1_2 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv1_2 = nn.Conv2d(64, 64, 3, 1, 0)
self.relu1_2 = nn.ReLU(inplace=True)
# 224 x 224
self.maxpool1 = nn.MaxPool2d(kernel_size=2, stride=2, return_indices=True)
# 112 x 112
self.pad2_1 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv2_1 = nn.Conv2d(64, 128, 3, 1, 0)
self.relu2_1 = nn.ReLU(inplace=True)
# 112 x 112
if level < 3: return
self.pad2_2 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv2_2 = nn.Conv2d(128, 128, 3, 1, 0)
self.relu2_2 = nn.ReLU(inplace=True)
# 112 x 112
self.maxpool2 = nn.MaxPool2d(kernel_size=2, stride=2, return_indices=True)
# 56 x 56
self.pad3_1 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv3_1 = nn.Conv2d(128, 256, 3, 1, 0)
self.relu3_1 = nn.ReLU(inplace=True)
# 56 x 56
if level < 4: return
self.pad3_2 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv3_2 = nn.Conv2d(256, 256, 3, 1, 0)
self.relu3_2 = nn.ReLU(inplace=True)
# 56 x 56
self.pad3_3 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv3_3 = nn.Conv2d(256, 256, 3, 1, 0)
self.relu3_3 = nn.ReLU(inplace=True)
# 56 x 56
self.pad3_4 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv3_4 = nn.Conv2d(256, 256, 3, 1, 0)
self.relu3_4 = nn.ReLU(inplace=True)
# 56 x 56
self.maxpool3 = nn.MaxPool2d(kernel_size=2, stride=2, return_indices=True)
# 28 x 28
self.pad4_1 = nn.ReflectionPad2d((1, 1, 1, 1))
self.conv4_1 = nn.Conv2d(256, 512, 3, 1, 0)
self.relu4_1 = nn.ReLU(inplace=True)
# 28 x 28
def forward(self, x):
out = self.conv0(x)
out = self.pad1_1(out)
out = self.conv1_1(out)
out = self.relu1_1(out)
if self.level < 2:
return out
out = self.pad1_2(out)
out = self.conv1_2(out)
pool1 = self.relu1_2(out)
out, pool1_idx = self.maxpool1(pool1)
out = self.pad2_1(out)
out = self.conv2_1(out)
out = self.relu2_1(out)
if self.level < 3:
return out, pool1_idx, pool1.size()
out = self.pad2_2(out)
out = self.conv2_2(out)
pool2 = self.relu2_2(out)
out, pool2_idx = self.maxpool2(pool2)
out = self.pad3_1(out)
out = self.conv3_1(out)
out = self.relu3_1(out)
if self.level < 4:
return out, pool1_idx, pool1.size(), pool2_idx, pool2.size()
out = self.pad3_2(out)
out = self.conv3_2(out)
out = self.relu3_2(out)
out = self.pad3_3(out)
out = self.conv3_3(out)
out = self.relu3_3(out)
out = self.pad3_4(out)
out = self.conv3_4(out)
pool3 = self.relu3_4(out)
out, pool3_idx = self.maxpool3(pool3)
out = self.pad4_1(out)
out = self.conv4_1(out)
out = self.relu4_1(out)
return out, pool1_idx, pool1.size(), pool2_idx, pool2.size(), pool3_idx, pool3.size()
def forward_multiple(self, x):
out = self.conv0(x)
out = self.pad1_1(out)
out = self.conv1_1(out)
out = self.relu1_1(out)
if self.level < 2: return out
out1 = out
out = self.pad1_2(out)
out = self.conv1_2(out)
pool1 = self.relu1_2(out)
out, pool1_idx = self.maxpool1(pool1)
out = self.pad2_1(out)
out = self.conv2_1(out)
out = self.relu2_1(out)
if self.level < 3: return out, out1
out2 = out
out = self.pad2_2(out)
out = self.conv2_2(out)
pool2 = self.relu2_2(out)
out, pool2_idx = self.maxpool2(pool2)
out = self.pad3_1(out)
out = self.conv3_1(out)
out = self.relu3_1(out)
if self.level < 4: return out, out2, out1
out3 = out
out = self.pad3_2(out)
out = self.conv3_2(out)
out = self.relu3_2(out)
out = self.pad3_3(out)
out = self.conv3_3(out)
out = self.relu3_3(out)
out = self.pad3_4(out)
out = self.conv3_4(out)
pool3 = self.relu3_4(out)
out, pool3_idx = self.maxpool3(pool3)
out = self.pad4_1(out)
out = self.conv4_1(out)
out = self.relu4_1(out)
return out, out3, out2, out1

class VGGDecoder(nn.Module):
    def __init__(self, level):
        super(VGGDecoder, self).__init__()
        self.level = level
        if level > 3:
            self.pad4_1 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv4_1 = nn.Conv2d(512, 256, 3, 1, 0)
            self.relu4_1 = nn.ReLU(inplace=True)
            # 28 x 28

            self.unpool3 = nn.MaxUnpool2d(kernel_size=2, stride=2)
            # 56 x 56

            self.pad3_4 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv3_4 = nn.Conv2d(256, 256, 3, 1, 0)
            self.relu3_4 = nn.ReLU(inplace=True)
            # 56 x 56

            self.pad3_3 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv3_3 = nn.Conv2d(256, 256, 3, 1, 0)
            self.relu3_3 = nn.ReLU(inplace=True)
            # 56 x 56

            self.pad3_2 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv3_2 = nn.Conv2d(256, 256, 3, 1, 0)
            self.relu3_2 = nn.ReLU(inplace=True)
            # 56 x 56

        if level > 2:
            self.pad3_1 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv3_1 = nn.Conv2d(256, 128, 3, 1, 0)
            self.relu3_1 = nn.ReLU(inplace=True)
            # 56 x 56

            self.unpool2 = nn.MaxUnpool2d(kernel_size=2, stride=2)
            # 112 x 112

            self.pad2_2 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv2_2 = nn.Conv2d(128, 128, 3, 1, 0)
            self.relu2_2 = nn.ReLU(inplace=True)
            # 112 x 112

        if level > 1:
            self.pad2_1 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv2_1 = nn.Conv2d(128, 64, 3, 1, 0)
            self.relu2_1 = nn.ReLU(inplace=True)
            # 112 x 112

            self.unpool1 = nn.MaxUnpool2d(kernel_size=2, stride=2)
            # 224 x 224

            self.pad1_2 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv1_2 = nn.Conv2d(64, 64, 3, 1, 0)
            self.relu1_2 = nn.ReLU(inplace=True)
            # 224 x 224

        if level > 0:
            self.pad1_1 = nn.ReflectionPad2d((1, 1, 1, 1))
            self.conv1_1 = nn.Conv2d(64, 3, 3, 1, 0)

    def forward(self, x, pool1_idx=None, pool1_size=None, pool2_idx=None, pool2_size=None, pool3_idx=None,
                pool3_size=None):
        out = x
        if self.level > 3:
            out = self.pad4_1(out)
            out = self.conv4_1(out)
            out = self.relu4_1(out)

            out = self.unpool3(out, pool3_idx, output_size=pool3_size)

            out = self.pad3_4(out)
            out = self.conv3_4(out)
            out = self.relu3_4(out)

            out = self.pad3_3(out)
            out = self.conv3_3(out)
            out = self.relu3_3(out)

            out = self.pad3_2(out)
            out = self.conv3_2(out)
            out = self.relu3_2(out)

        if self.level > 2:
            out = self.pad3_1(out)
            out = self.conv3_1(out)
            out = self.relu3_1(out)

            out = self.unpool2(out, pool2_idx, output_size=pool2_size)

            out = self.pad2_2(out)
            out = self.conv2_2(out)
            out = self.relu2_2(out)

        if self.level > 1:
            out = self.pad2_1(out)
            out = self.conv2_1(out)
            out = self.relu2_1(out)

            out = self.unpool1(out, pool1_idx, output_size=pool1_size)

            out = self.pad1_2(out)
            out = self.conv1_2(out)
            out = self.relu1_2(out)

        if self.level > 0:
            out = self.pad1_1(out)
            out = self.conv1_1(out)

        return out
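The decoder can only invert the encoder's downsampling because each `nn.MaxUnpool2d` receives the argmax indices that the matching `nn.MaxPool2d` recorded (the `pool*_idx`/`pool*_size` arguments threaded through `forward`). The round trip can be sketched with a stdlib-only 1-D analogue — `max_pool_1d` and `max_unpool_1d` are hypothetical helpers for illustration, not part of the file above:

```python
def max_pool_1d(xs, k=2):
    """Max-pool with stride k, also returning argmax positions
    (the role of return_indices=True on the encoder's nn.MaxPool2d)."""
    pooled, indices = [], []
    for start in range(0, len(xs) - k + 1, k):
        window = xs[start:start + k]
        offset = max(range(k), key=window.__getitem__)
        pooled.append(window[offset])
        indices.append(start + offset)
    return pooled, indices


def max_unpool_1d(pooled, indices, size):
    """Scatter pooled values back to their recorded positions,
    zero-filling elsewhere (the role of nn.MaxUnpool2d in the decoder)."""
    out = [0] * size
    for value, idx in zip(pooled, indices):
        out[idx] = value
    return out


xs = [1, 5, 2, 4, 9, 3, 7, 8]
pooled, idx = max_pool_1d(xs)                    # ([5, 4, 9, 8], [1, 3, 4, 7])
restored = max_unpool_1d(pooled, idx, len(xs))   # [0, 5, 0, 4, 9, 0, 0, 8]
```

Without the recorded indices the unpool step would have to guess where each maximum came from; this is why the encoder returns `pool1_idx`/`pool2_idx`/`pool3_idx` alongside the features.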
| 30.919463 | 106 | 0.49631 | 1,296 | 9,214 | 3.385031 | 0.071759 | 0.126054 | 0.136768 | 0.065193 | 0.843857 | 0.804878 | 0.796216 | 0.761796 | 0.757009 | 0.741965 | 0 | 0.135527 | 0.386586 | 9,214 | 297 | 107 | 31.023569 | 0.640658 | 0.042435 | 0 | 0.706806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026178 | false | 0 | 0.005236 | 0 | 0.073298 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
84d0c13683e910b5ff87f3b9b7af71302cbef933 | 3,702 | py | Python | apps/bednets/models.py | rapidsms/rapidsms-legacy | 43c2ecd41fd1541a2538326edee3d9e816d84529 | [
"BSD-3-Clause"
] | null | null | null | apps/bednets/models.py | rapidsms/rapidsms-legacy | 43c2ecd41fd1541a2538326edee3d9e816d84529 | [
"BSD-3-Clause"
] | null | null | null | apps/bednets/models.py | rapidsms/rapidsms-legacy | 43c2ecd41fd1541a2538326edee3d9e816d84529 | [
"BSD-3-Clause"
] | 1 | 2019-11-02T19:35:54.000Z | 2019-11-02T19:35:54.000Z | # vim: ai sts=4 ts=4 et sw=4
from django.db import models
from reporters.models import Location, Reporter, PersistantConnection

import time as taim


class NetDistribution(models.Model):
    reporter = models.ForeignKey(Reporter, null=True, blank=True)
    connection = models.ForeignKey(PersistantConnection, null=True, blank=True)
    location = models.ForeignKey(Location)
    time = models.DateTimeField()
    distributed = models.PositiveIntegerField()
    expected = models.PositiveIntegerField()
    actual = models.PositiveIntegerField()
    discrepancy = models.PositiveIntegerField()

    def __unicode__(self):
        return "%s (%s) %s" % (self.location, self.reporter, self.time)

    @staticmethod
    def net_data(location):
        all = NetDistribution.objects.all().filter(location__pk=location.pk)
        return {"distributed": sum(all.values_list("distributed", flat=True)),
                "expected": sum(all.values_list("expected", flat=True)),
                "actual": sum(all.values_list("actual", flat=True)),
                "discrepancy": sum(all.values_list("discrepancy", flat=True))}

    class Meta:
        # FIXME tell django the old table name (since app has been renamed)
        db_table = "nigeria_netdistribution"

        # define a permission for this app to use the @permission_required
        # decorator in bednet's views
        # in the admin's auth section, we have a group called 'llin' whose
        # users have this permission -- and are able to see this section
        permissions = (
            ("can_view", "Can view"),
        )

    @staticmethod
    def net_data_total(location):
        '''For a given location, this function gets all the descendant locations
        and calculates the totals for net distribution'''
        all = NetDistribution.objects.all().filter(location__code__startswith=location.code)
        return {"distributed": sum(all.values_list("distributed", flat=True)),
                "expected": sum(all.values_list("expected", flat=True)),
                "actual": sum(all.values_list("actual", flat=True)),
                "discrepancy": sum(all.values_list("discrepancy", flat=True))}


class CardDistribution(models.Model):
    reporter = models.ForeignKey(Reporter, null=True, blank=True)
    connection = models.ForeignKey(PersistantConnection, null=True, blank=True)
    location = models.ForeignKey(Location)
    time = models.DateTimeField()
    settlements = models.PositiveIntegerField()
    people = models.PositiveIntegerField()
    distributed = models.PositiveIntegerField()

    def __unicode__(self):
        return "%s (%s) %s" % (self.location, self.reporter, self.time)

    @staticmethod
    def card_data(location):
        all = CardDistribution.objects.all().filter(location__pk=location.pk)
        return {"distributed": sum(all.values_list("distributed", flat=True)),
                "settlements": sum(all.values_list("settlements", flat=True)),
                "people": sum(all.values_list("people", flat=True))}

    @staticmethod
    def card_data_total(location):
        '''For a given location, this function gets all the descendant locations
        and calculates the totals for card distribution'''
        all = CardDistribution.objects.all().filter(location__code__startswith=location.code)
        return {"distributed": sum(all.values_list("distributed", flat=True)),
                "settlements": sum(all.values_list("settlements", flat=True)),
                "people": sum(all.values_list("people", flat=True))}

    class Meta:
        # FIXME tell django the old table name (since app has been renamed)
        db_table = "nigeria_carddistribution"
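The `*_data_total` methods roll descendant locations up with `location__code__startswith=location.code`, which only works because location codes are hierarchical prefixes (a child's code begins with its parent's). The rollup logic, separated from the ORM, looks like this — the rows and codes below are hypothetical stand-ins for `CardDistribution` records:

```python
# Hypothetical (location_code, distributed) rows; codes are hierarchical
# prefixes, e.g. "20" = state, "2001" = LGA in it, "200103" = ward in that LGA.
rows = [
    ("2001", 120),
    ("200103", 40),
    ("2002", 75),
    ("30", 500),
]


def distributed_total(prefix, rows):
    """Mirror of filter(location__code__startswith=...) followed by sum()."""
    return sum(n for code, n in rows if code.startswith(prefix))


print(distributed_total("20", rows))    # 235: the state plus everything under it
print(distributed_total("2001", rows))  # 160: one LGA plus its ward
```

Note that `card_data` (exact `location__pk` match) counts only reports filed against the location itself, while the prefix variant includes every descendant.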
| 44.071429 | 93 | 0.672339 | 419 | 3,702 | 5.830549 | 0.252983 | 0.034384 | 0.068768 | 0.091691 | 0.735571 | 0.735571 | 0.707736 | 0.707736 | 0.707736 | 0.707736 | 0 | 0.001031 | 0.213668 | 3,702 | 83 | 94 | 44.60241 | 0.8382 | 0.166397 | 0 | 0.596491 | 0 | 0 | 0.111038 | 0.015395 | 0 | 0 | 0 | 0.024096 | 0 | 1 | 0.105263 | false | 0 | 0.052632 | 0.035088 | 0.596491 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
ca1c1bac66066672a05b652e586c46207da9d23d | 1,717 | py | Python | tests/test_rules_p4.py | Dratui/AI-Arena | e9693e34a90523bbb86eb2ad3b2c3e9797beed5c | [
"MIT"
] | 2 | 2018-11-16T08:18:42.000Z | 2018-11-22T08:44:10.000Z | tests/test_rules_p4.py | Dratui/2048_online | e9693e34a90523bbb86eb2ad3b2c3e9797beed5c | [
"MIT"
] | 15 | 2018-11-16T10:52:24.000Z | 2018-11-23T08:36:17.000Z | tests/test_rules_p4.py | Dratui/AI-Arena | e9693e34a90523bbb86eb2ad3b2c3e9797beed5c | [
"MIT"
] | 2 | 2018-11-15T09:32:36.000Z | 2018-11-16T08:56:54.000Z | from src.games.game_p4.rules_p4 import *
from pytest import *
from src.board import *
from src.games.games import *


def test_is_over():
    game = init_game("p4")
    game.list_board[0] = generate_board_from_list([[None, None, None, None], [None, None, None, None], [None, None, None, None], [0, None, None, 1]])
    assert game.is_over()[0] == False
    game.list_board[0] = generate_board_from_list([[None, None, None, None], [None, None, None, None], [None, None, None, None], [0, 0, 0, 0]])
    assert game.is_over()[0] == True
    game.list_board[0] = generate_board_from_list([[None, None, None, None], [None, None, None, None], [None, None, None, None], [1, 1, 1, 1]])
    assert game.is_over()[0] == True
    game.list_board[0] = generate_board_from_list([[0, None, None, None], [0, None, None, None], [0, None, None, None], [0, None, None, 1]])
    assert game.is_over()[0] == True


def test_make_a_move():
    game = init_game("p4")
    game.list_board[1] = generate_board_from_list([[None, None, None, None], [None, None, None, None], [None, None, None, None], [0, None, None, 1]])
    game.player_playing = 1
    game.make_a_move(0)
    assert game.list_board[1].get_all_tiles() == generate_board_from_list([[None, None, None, None], [None, None, None, None], [1, None, None, None], [0, None, None, 1]]).get_all_tiles()
    game.list_board[0] = generate_board_from_list([[None, None, None, None], [None, None, None, None], [None, None, None, None], [1, None, 0, 0]])
    game.player_playing = 0
    game.make_a_move(2)
    assert game.list_board[0].get_all_tiles() == generate_board_from_list([[None, None, None, None], [None, None, None, None], [None, None, 0, None], [1, None, 0, 0]]).get_all_tiles()
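The `make_a_move` assertions encode Connect-Four gravity: a token played in column `c` lands in the lowest row whose cell in that column is still `None`. A minimal stdlib sketch of that drop rule — `drop_token` is a hypothetical helper, not the game's actual implementation; the board is a list of rows with row 0 at the top, matching the literals above:

```python
def drop_token(board, column, player):
    """Place `player` in the lowest empty cell of `column` and return the
    row it landed in (the behaviour the make_a_move assertions expect)."""
    for row in range(len(board) - 1, -1, -1):  # scan bottom-up
        if board[row][column] is None:
            board[row][column] = player
            return row
    raise ValueError("column is full")


board = [[None] * 4 for _ in range(4)]
board[3][0] = 0            # bottom-left already occupied, as in the first test
drop_token(board, 0, 1)    # lands in row 2, stacked on top of the 0 in row 3
```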
| 55.387097 | 186 | 0.652301 | 285 | 1,717 | 3.722807 | 0.108772 | 0.625825 | 0.769086 | 0.859566 | 0.760603 | 0.752121 | 0.752121 | 0.6918 | 0.6918 | 0.6918 | 0 | 0.032503 | 0.157833 | 1,717 | 30 | 187 | 57.233333 | 0.701245 | 0 | 0 | 0.208333 | 1 | 0 | 0.00233 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.083333 | false | 0 | 0.166667 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ca6c0c75a3eca1224876074fe5f47afe63494c2f | 147 | py | Python | pydatastructs/__init__.py | Jahnavi-Jonnalagadda/pydatastructs | 27bda667d67b40851b6cb482ef50fe2bccc098de | [
"BSD-3-Clause"
] | 3 | 2020-11-05T09:12:30.000Z | 2021-11-14T06:13:21.000Z | pydatastructs/__init__.py | Jahnavi-Jonnalagadda/pydatastructs | 27bda667d67b40851b6cb482ef50fe2bccc098de | [
"BSD-3-Clause"
] | null | null | null | pydatastructs/__init__.py | Jahnavi-Jonnalagadda/pydatastructs | 27bda667d67b40851b6cb482ef50fe2bccc098de | [
"BSD-3-Clause"
] | 1 | 2021-10-11T23:26:24.000Z | 2021-10-11T23:26:24.000Z | from .linear_data_structures import *
from .trees import *
from .miscellaneous_data_structures import *
from .utils import *
from .graphs import *
| 24.5 | 44 | 0.795918 | 19 | 147 | 5.947368 | 0.473684 | 0.353982 | 0.353982 | 0.424779 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136054 | 147 | 5 | 45 | 29.4 | 0.889764 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
ca93a3238d6e4218966d3c91a07b2250725a81e2 | 2,479 | py | Python | isimip_data/metadata/tests/test_admin.py | ISI-MIP/isimip-data | a0e4772362cc60db91e7689ec397840dcaaacddb | [
"MIT"
] | 3 | 2020-02-10T10:13:17.000Z | 2021-12-21T09:10:50.000Z | isimip_data/metadata/tests/test_admin.py | ISI-MIP/isimip-data | a0e4772362cc60db91e7689ec397840dcaaacddb | [
"MIT"
] | 17 | 2020-02-10T16:09:12.000Z | 2021-07-02T09:03:37.000Z | isimip_data/metadata/tests/test_admin.py | ISI-MIP/isimip-data | a0e4772362cc60db91e7689ec397840dcaaacddb | [
"MIT"
] | null | null | null | from django.urls import reverse
from isimip_data.metadata.models import Dataset, File, Resource


def test_dataset_changelist(db, client):
    client.login(username='admin', password='admin')
    url = reverse('admin:metadata_dataset_changelist')
    response = client.get(url)
    assert response.status_code == 200


def test_dataset_change(db, client):
    client.login(username='admin', password='admin')
    dataset = Dataset.objects.using('metadata').first()
    url = reverse('admin:metadata_dataset_change', args=[dataset.id])
    response = client.get(url)
    assert response.status_code == 200


def test_dataset_delete(db, client):
    client.login(username='admin', password='admin')
    dataset = Dataset.objects.using('metadata').first()
    url = reverse('admin:metadata_dataset_delete', args=[dataset.id])
    response = client.get(url)
    assert response.status_code == 403


def test_file_changelist(db, client):
    client.login(username='admin', password='admin')
    url = reverse('admin:metadata_file_changelist')
    response = client.get(url)
    assert response.status_code == 200


def test_file_change(db, client):
    client.login(username='admin', password='admin')
    file = File.objects.using('metadata').first()
    url = reverse('admin:metadata_file_change', args=[file.id])
    response = client.get(url)
    assert response.status_code == 200


def test_file_delete(db, client):
    client.login(username='admin', password='admin')
    file = File.objects.using('metadata').first()
    url = reverse('admin:metadata_file_delete', args=[file.id])
    response = client.get(url)
    assert response.status_code == 403


def test_resource_changelist(db, client):
    client.login(username='admin', password='admin')
    url = reverse('admin:metadata_resource_changelist')
    response = client.get(url)
    assert response.status_code == 200


def test_resource_change(db, client):
    client.login(username='admin', password='admin')
    resource = Resource.objects.using('metadata').first()
    url = reverse('admin:metadata_resource_change', args=[resource.id])
    response = client.get(url)
    assert response.status_code == 200


def test_resource_delete(db, client):
    client.login(username='admin', password='admin')
    resource = Resource.objects.using('metadata').first()
    url = reverse('admin:metadata_resource_delete', args=[resource.id])
    response = client.get(url)
    assert response.status_code == 403
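The nine tests differ only in the model name and the expected status (changelist/change render with 200 for admins, delete is forbidden with 403). One possible refactor — not part of the file above — is to generate the admin URL-name/status matrix once and feed it to `pytest.mark.parametrize`; the matrix itself is plain Python:

```python
import itertools

MODELS = ["dataset", "file", "resource"]
ACTIONS = {"changelist": 200, "change": 200, "delete": 403}

# Hypothetical parameter matrix for @pytest.mark.parametrize:
# each entry is (reversible admin URL name, expected status code).
CASES = [
    (f"admin:metadata_{model}_{action}", status)
    for model, (action, status) in itertools.product(MODELS, ACTIONS.items())
]

print(len(CASES))  # 9 cases, one per near-identical test body above
```

A parametrized test would still need to branch on whether the URL takes an `args=[obj.id]` argument, so this is a sketch of the idea rather than a drop-in replacement.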
| 28.170455 | 71 | 0.713998 | 311 | 2,479 | 5.543408 | 0.115756 | 0.036543 | 0.073086 | 0.099188 | 0.917633 | 0.906613 | 0.906613 | 0.906613 | 0.906613 | 0.885731 | 0 | 0.012808 | 0.149657 | 2,479 | 87 | 72 | 28.494253 | 0.805028 | 0 | 0 | 0.622642 | 0 | 0 | 0.163372 | 0.107705 | 0 | 0 | 0 | 0 | 0.169811 | 1 | 0.169811 | false | 0.169811 | 0.037736 | 0 | 0.207547 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
0472d1c07b828789360a750d62bedf3d7d5cacc6 | 225 | py | Python | Day_55/generate_random_string.py | kiranrraj/100Days_Of_Coding | ab75d83be9be87fb7bc83a3f3b72a4638dab22a1 | [
"MIT"
] | null | null | null | Day_55/generate_random_string.py | kiranrraj/100Days_Of_Coding | ab75d83be9be87fb7bc83a3f3b72a4638dab22a1 | [
"MIT"
] | null | null | null | Day_55/generate_random_string.py | kiranrraj/100Days_Of_Coding | ab75d83be9be87fb7bc83a3f3b72a4638dab22a1 | [
"MIT"
] | null | null | null | # Title : Generate random string
# Author : Kiran raj R.
# Date : 24:10:2020
import secrets
print(f"Random secure Hexadecimal token is {secrets.token_hex(64)}")
print(f"Random secure URL is {secrets.token_urlsafe(64)}")
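The argument to both functions is the number of random *bytes*, not the length of the resulting string: `token_hex(n)` yields `2*n` hex characters, and `token_urlsafe(n)` base64url-encodes `n` bytes into roughly `1.3*n` characters (padding stripped). A quick check:

```python
import secrets

hex_token = secrets.token_hex(64)      # 64 random bytes -> 128 hex characters
url_token = secrets.token_urlsafe(64)  # 64 random bytes -> 86 URL-safe characters

print(len(hex_token))  # 128
print(len(url_token))  # 86 (base64 of 64 bytes, '=' padding removed)
```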
| 25 | 68 | 0.728889 | 35 | 225 | 4.628571 | 0.685714 | 0.074074 | 0.148148 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062827 | 0.151111 | 225 | 8 | 69 | 28.125 | 0.78534 | 0.324444 | 0 | 0 | 1 | 0 | 0.716216 | 0.337838 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0.666667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
048e662f66c37b819f34a8f7766febc9862c592c | 70 | py | Python | tests/test_DBN.py | WillianFuks/pyClickModels | e0515d8e07310d4b6369f2768647a3808b4904dc | [
"MIT"
] | 13 | 2020-05-23T01:03:49.000Z | 2021-11-08T10:20:46.000Z | tests/test_DBN.py | WillianFuks/pyClickModels | e0515d8e07310d4b6369f2768647a3808b4904dc | [
"MIT"
] | 4 | 2021-02-11T03:58:27.000Z | 2021-06-16T18:10:19.000Z | tests/test_DBN.py | WillianFuks/pyClickModels | e0515d8e07310d4b6369f2768647a3808b4904dc | [
"MIT"
] | 3 | 2020-12-20T11:41:33.000Z | 2021-08-31T15:32:19.000Z | def test_DBN():
    from test_cy_DBN import run_tests
    run_tests()
| 17.5 | 37 | 0.714286 | 12 | 70 | 3.75 | 0.666667 | 0.355556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214286 | 70 | 3 | 38 | 23.333333 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b6c81b666034a656f84d8b4773604d4fdc60b11e | 29 | py | Python | package/subpackage/module_a.py | tagler/Data_Science_Project_Template_Python | 0c60a06e81df2e01db771393a17504f28f4fa432 | [
"MIT"
] | null | null | null | package/subpackage/module_a.py | tagler/Data_Science_Project_Template_Python | 0c60a06e81df2e01db771393a17504f28f4fa432 | [
"MIT"
] | null | null | null | package/subpackage/module_a.py | tagler/Data_Science_Project_Template_Python | 0c60a06e81df2e01db771393a17504f28f4fa432 | [
"MIT"
] | null | null | null | def print_a():
print('a') | 14.5 | 14 | 0.551724 | 5 | 29 | 3 | 0.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206897 | 29 | 2 | 15 | 14.5 | 0.652174 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 1 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
b6dcdb7a5206b7bdc8243e08b918678343c449fb | 3,273 | py | Python | tests/dict/test_dict_representation.py | nikitanovosibirsk/district42 | 0c13248919fc96bde16b9634a8ea468e4882752a | [
"Apache-2.0"
] | 1 | 2016-09-16T04:09:19.000Z | 2016-09-16T04:09:19.000Z | tests/dict/test_dict_representation.py | nikitanovosibirsk/district42 | 0c13248919fc96bde16b9634a8ea468e4882752a | [
"Apache-2.0"
] | 2 | 2021-06-14T05:53:49.000Z | 2022-02-01T14:26:31.000Z | tests/dict/test_dict_representation.py | nikitanovosibirsk/district42 | 0c13248919fc96bde16b9634a8ea468e4882752a | [
"Apache-2.0"
] | null | null | null | from baby_steps import given, then, when
from district42 import optional, represent, schema


def test_dict_representation():
    with given:
        sch = schema.dict

    with when:
        res = represent(sch)

    with then:
        assert res == "schema.dict"


def test_dict_empty_representation():
    with given:
        sch = schema.dict({})

    with when:
        res = represent(sch)

    with then:
        assert res == "schema.dict({})"


def test_dict_one_key_representation():
    with given:
        sch = schema.dict({"id": schema.int})

    with when:
        res = represent(sch)

    with then:
        assert res == "\n".join([
            "schema.dict({",
            "    'id': schema.int",
            "})"
        ])


def test_dict_many_keys_representation():
    with given:
        sch = schema.dict({
            "id": schema.int,
            "name": schema.str("banana")
        })

    with when:
        res = represent(sch)

    with then:
        assert res == "\n".join([
            "schema.dict({",
            "    'id': schema.int,",
            "    'name': schema.str('banana')",
            "})",
        ])


def test_dict_optional_key_representation():
    with given:
        sch = schema.dict({
            "id": schema.int,
            optional("name"): schema.str,
        })

    with when:
        res = represent(sch)

    with then:
        assert res == "\n".join([
            "schema.dict({",
            "    'id': schema.int,",
            "    optional('name'): schema.str",
            "})"
        ])


def test_dict_nested_keys_representation():
    with given:
        sch = schema.dict({
            "id": schema.int,
            "user": schema.dict({
                "id": schema.int,
                "name": schema.str("banana")
            }),
            "is_deleted": schema.bool
        })

    with when:
        res = represent(sch)

    with then:
        assert res == "\n".join([
            "schema.dict({",
            "    'id': schema.int,",
            "    'user': schema.dict({",
            "        'id': schema.int,",
            "        'name': schema.str('banana')",
            "    }),",
            "    'is_deleted': schema.bool",
            "})",
        ])


def test_dict_relaxed_empty_representation():
    with given:
        sch = schema.dict({...: ...})

    with when:
        res = represent(sch)

    with then:
        assert res == "schema.dict({...: ...})"


def test_dict_relaxed_one_key_representation():
    with given:
        sch = schema.dict({"id": schema.int, ...: ...})

    with when:
        res = represent(sch)

    with then:
        assert res == "\n".join([
            "schema.dict({",
            "    'id': schema.int,",
            "    ...: ...",
            "})",
        ])


def test_dict_relaxed_many_keys_representation():
    with given:
        sch = schema.dict({
            "id": schema.int,
            "name": schema.str("banana"),
            ...: ...,
        })

    with when:
        res = represent(sch)

    with then:
        assert res == "\n".join([
            "schema.dict({",
            "    'id': schema.int,",
            "    'name': schema.str('banana'),",
            "    ...: ...",
            "})",
        ])
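The multi-line expectations all follow one layout rule: each nesting level of `schema.dict` indents its keys one step further (conventionally four spaces), with commas between keys and the closing `})` back at the parent's indent. A stdlib sketch of that rule — `render` is a hypothetical mini-renderer over plain dicts and leaf strings, not district42's implementation:

```python
def render(schema, indent=0):
    """Render a {key: leaf-or-dict} tree the way the expected strings
    above are laid out: one indent step per nesting level."""
    if not isinstance(schema, dict):
        return schema  # leaf, e.g. "schema.int"
    pad = " " * indent
    lines = ["schema.dict({"]
    items = list(schema.items())
    for i, (key, value) in enumerate(items):
        comma = "," if i < len(items) - 1 else ""
        lines.append(f"{pad}    {key!r}: {render(value, indent + 4)}{comma}")
    lines.append(pad + "})")
    return "\n".join(lines)


print(render({"id": "schema.int"}))
```

Recursing with `indent + 4` is what produces the doubly indented `'id'`/`'name'` lines inside the nested `'user'` dict in the test above.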
| 21.253247 | 55 | 0.44974 | 310 | 3,273 | 4.625806 | 0.119355 | 0.13947 | 0.117155 | 0.175732 | 0.892608 | 0.892608 | 0.892608 | 0.892608 | 0.892608 | 0.863319 | 0 | 0.000999 | 0.388329 | 3,273 | 153 | 56 | 21.392157 | 0.715285 | 0 | 0 | 0.732759 | 0 | 0 | 0.177513 | 0.006416 | 0 | 0 | 0 | 0 | 0.077586 | 1 | 0.077586 | false | 0 | 0.017241 | 0 | 0.094828 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b6e1c69245737dbd35aa8c9cfb692f1f860e492f | 173 | py | Python | authCredentials.py | SnowCheetos/BubbleMint | 29f2c4ce3debc065790152b3e5f41ee9dff9bd52 | [
"MIT"
] | null | null | null | authCredentials.py | SnowCheetos/BubbleMint | 29f2c4ce3debc065790152b3e5f41ee9dff9bd52 | [
"MIT"
] | null | null | null | authCredentials.py | SnowCheetos/BubbleMint | 29f2c4ce3debc065790152b3e5f41ee9dff9bd52 | [
"MIT"
] | 2 | 2022-01-07T14:43:41.000Z | 2022-01-25T20:08:34.000Z | api_key = ... #Insert your Coinbase Pro API information
api_secret = ... #Insert your Coinbase Pro API information
api_pass = ... #Insert your Coinbase Pro API information
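The `...` placeholders are deliberately left unfilled; a common alternative to pasting secrets into a tracked file is to read them from the environment at runtime. A sketch, where the `CBPRO_*` variable names and the `load_credentials` helper are hypothetical:

```python
import os


def load_credentials(env=os.environ):
    """Fetch Coinbase Pro credentials from the environment instead of
    hardcoding them; unset values stay as the ... sentinel used above."""
    creds = {
        "api_key": env.get("CBPRO_API_KEY", ...),
        "api_secret": env.get("CBPRO_API_SECRET", ...),
        "api_pass": env.get("CBPRO_API_PASS", ...),
    }
    missing = [name for name, value in creds.items() if value is ...]
    return creds, missing
```

This keeps the keys out of version control while preserving the module's three-field shape.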
| 34.6 | 58 | 0.751445 | 24 | 173 | 5.291667 | 0.375 | 0.23622 | 0.425197 | 0.496063 | 0.874016 | 0.874016 | 0.598425 | 0 | 0 | 0 | 0 | 0 | 0.16185 | 173 | 4 | 59 | 43.25 | 0.875862 | 0.693642 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.333333 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
8e3babdf4ccb179a21415ace032a67568984f1df | 9,726 | py | Python | extending_streamlit_usage/001_nlp_spacy_python_realp/001_nlp_spacy_python.py | bflaven/BlogArticlesExamples | 5df2dfc26170ffbbade78ba136bf3172391e3b2a | [
"MIT"
] | 5 | 2018-05-03T08:16:02.000Z | 2021-09-04T03:44:24.000Z | extending_streamlit_usage/001_nlp_spacy_python_realp/001_nlp_spacy_python.py | bflaven/BlogArticlesExamples | 5df2dfc26170ffbbade78ba136bf3172391e3b2a | [
"MIT"
] | 1 | 2022-01-28T19:27:19.000Z | 2022-01-28T19:27:19.000Z | extending_streamlit_usage/001_nlp_spacy_python_realp/001_nlp_spacy_python.py | bflaven/BlogArticlesExamples | 5df2dfc26170ffbbade78ba136bf3172391e3b2a | [
"MIT"
] | 2 | 2020-09-10T13:33:27.000Z | 2022-02-09T11:07:38.000Z | #!/usr/bin/python
# -*- coding: utf-8 -*-
"""
[path]
cd /Users/brunoflaven/Documents/01_work/blog_articles/extending_streamlit_usage/001_nlp_spacy_python_realp/
[file]
python 001_nlp_spacy_python.py
# source
Source: https://realpython.com/natural-language-processing-spacy-python/
"""
import spacy
import spacy_streamlit
def debug_text_enumerate(doc):
    for i, token in enumerate(doc):
        # DEBUG
        print(i, token, token.pos_, token.dep_)
        # print(i, token.text, token.pos_, token.dep_)
        # print(i, token.text, token.ent_type)
        # print(i, token.text, token.ent_type_)
        # print(i, token.text, token.ent_type, token.ent_type_)
        # print(i, token.text, token.head)
        # print(doc[i].text)
        # print(doc[i])
        # print(i, doc[i], doc[i].pos_, doc[i].dep_)
        # and so on...


def debug_text_loop(doc):
    for token in doc:
        # print(token, token.pos_, token.dep_)
        # print(token.text, token.pos_, token.dep_)
        # print(token.text, token.ent_type)
        # print(token.text, token.ent_type_)
        # print(token.text, token.ent_type, token.ent_type_)
        print(token.text, token.head)
nlp = spacy.load('en_core_web_sm')
print("\n --- result_1")
print("EN spacy loaded")
# HOW TO READ A TEXT
bf_text = ('Order is a very relative notion, it is specific for each of us. “The world is my representation” as Arthur “Chop” Schopenhauer says. At the same time, experience teaches us “Every cloud has a silver lining” so modestly I discovered that a relative “disorder” is often more efficient than an absolute “order”. Right? Ooh slow down a bit! It is not an easy-peasy Q/A! For sure, you can ask yourself why such interrogations are part of a so called post dedicated to Python. Well, first of all, it make sense for me for several reasons: Never bad to introduce some philosophy in programming practice. Personally, it reconciles two of my favorite hobbies(bad programming & cheap philosophy) to one: headache! More seriously, these thoughts come from a practical and real-world experience. Let me tell you the story: after a long, long aging process, as a PO, I achieved to build a robust testing strategy with CodeceptJS for a backoffice! But then I discovered that the e2e sequential execution of the suite created a bias(let’s called them errors). OMG! Then I was wondering how to introduce randomization inside the testing suite execution in order to reduce this bias to the maximum. The purpose is to avoid the false positive nightmare! Meanwhile, in a completely chaotic, confused, disordered, disorganised, topsy-turvy, helter-skelter, pell-mell, upside-down, higgledy-piggledy, hugger-mugger, harum-scarum, snafu, slipshod, unplanned, erratic, strayed, hit-and-miss, incidental, spontaneous, uncoordinated, willy-nilly, devil-may-care, unpremeditated, reckless, cockeyed, hit-and-miss, fluky, incidental… anarchic way, I am learning advanced Python techniques(NLP, facial recognition …) but also fundamentals like the difference between a “tuple” and a “dictionnary”. So it was logic that Python popped up in my mind to bring this randomness, given the complexity of doing the same thing in Bash, a language that I am far from mastering! '
           'Incidentally, this experiment proves me that Python is: Easy to read: Python is easy to read and most of the language makes sense at a glimpse. This makes finding issues a lot easier than more complicated languages. Portability: Python runs on many platforms and systems, meaning your programs can reach a wider audience. It can easily replace Bash or Ruby of instance. Two qualities that gave Python a serious advantage in what I am looking for: to create on-demand order or disorder 🙂. Order: when it comes to browsing, summarizing and indexing hundreds of texts in order to store them in a database itself MySQL and / or NoSQL. Disorder: when it comes to run tests in parallel, for example, and thereby overcome a bias linked to a sequential execution. Oddly, sometimes, Disorder proves to be more efficient and effective than Order. So, this post presents my quick researches on how-to handle random in Python. Having in mind that the final objective is to load randomly testing files that achieve a complete Backoffice’s assessment without increasing errors into the process. By the way, it reveals a third and fourth Python qualities: Increasing productivity: how much work you can accomplish in a given time with few lines of code. Relieving Boring Stuff in Bullshit Jobs: Thanks to Al Sweigart and David Graeber to have enabled me to create this “tuna-mayo-sandwich” concept. So, now, let’s move on and show some code…')
# bf_doc = nlp(bf_text)
# Extract tokens for the given doc
# print("\n --- result_2")
# print([token.text for token in bf_doc])
# print("\n --- result_3")
# # HOW TO READ A TEXT FILE TRY_1
# file_name = 'article_bf_1.txt'
# introduction_file_text = open(file_name).read()
# introduction_file_doc = nlp(introduction_file_text)
# # Extract tokens for the given doc
# print([token.text for token in introduction_file_doc])
print("\n --- result_4")
# SENTENCE DETECTION
# In spaCy, the sents property is used to extract sentences. Here’s how you would extract the total number of sentences and the sentences for a given input texts.
# BF detect the sentence's number based on the dot (.)
"""
about_text = ('Python, Randomization, E2E – it’s all about Random and some good reasons to learn and leverage on Python'
'Order is a very relative notion, it is specific for each of us. “The world is my representation” as Arthur “Chop” Schopenhauer says. At the same time, experience teaches us “Every cloud has a silver lining” so modestly I discovered that a relative “disorder” is often more efficient than an absolute “order”. Right? Ooh slow down a bit! It is not an easy-peasy Q/A!'
'For sure, you can ask yourself why such interrogations are part of a so called post dedicated to Python. Well, first of all, it make sense for me for several reasons:'
'Never bad to introduce some philosophy in programming practice. Personally, it reconciles two of my favorite hobbies(bad programming & cheap philosophy) to one: headache! More seriously, these thoughts come from a practical and real-world experience.'
'Let me tell you the story: after a long, long aging process, as a PO, I achieved to build a robust testing strategy with CodeceptJS for a backoffice! But then I discovered that the e2e sequential execution of the suite created a bias(let’s called them errors). OMG! Then I was wondering how to introduce randomization inside the testing suite execution in order to reduce this bias to the maximum. The purpose is to avoid the false positive nightmare!'
'Meanwhile, in a completely chaotic, confused, disordered, disorganized, topsy-turvy, helter-skelter, pell-mell, upside-down, higgledy-piggledy, hugger-mugger, harum-scarum, snafu, slipshod, unplanned, erratic, strayed, hit-and-miss, incidental, spontaneous, uncoordinated, willy-nilly, devil-may-care, unpremeditated, reckless, cockeyed, hit-and-miss, fluky, incidental… anarchic way, I am learning advanced Python techniques(NLP, facial recognition …) but also fundamentals like the difference between a “tuple” and a “dictionnary”. So it was logic that Python popped up in my mind to bring this randomness, given the complexity of doing the same thing in Bash, a language that I am far from mastering!'
'Incidentally, this experiment proves me that Python is : '
'Easy to read: Python is easy to read and most of the language makes sense at a glimpse. This makes finding issues a lot easier than more complicated languages.'
'Portability: Python runs on many platforms and systems, meaning your programs can reach a wider audience. It can easily replace Bash or Ruby of instance.'
'Two qualities that gave Python a serious advantage in what I am looking for: to create on-demand order or disorder 🙂'
'Order: when it comes to browsing, summarizing and indexing hundreds of texts in order to store them in a database itself MySQL and / or NoSQL.'
'Disorder: when it comes to run tests in parallel, for example, and thereby overcome a bias linked to a sequential execution.'
'Oddly, sometimes, Disorder proves to be more efficient and effective than Order.'
'So, this post presents my quick researches on how-to handle random in Python. Having in mind that the final objective is to load randomly testing files that achieve a complete Backoffice’s assessment without increasing errors into the process.'
'By the way, it reveals a third and fourth Python qualities:'
'Increasing productivity: how much work you can accomplish in a given time with few lines of code.'
'Relieving Boring Stuff in Bullshit Jobs: Thanks to Al Sweigart and David Graeber to have enabled me to create this “tuna-mayo-sandwich” concept.'
'So, now, let’s move on and show some code…'
)
about_doc = nlp(about_text)
sentences = list(about_doc.sents)
len(sentences)
for sentence in sentences:
    print(sentence)
"""
# same example with (...) aka ellipsis
# In the above example, spaCy is correctly able to identify sentences in the English language, using a full stop(.) as the sentence delimiter. You can also customize the sentence detection to detect sentences on custom delimiters.
# Here’s an example, where an ellipsis(...) is used as the delimiter:
# BF detect the sentence's number based on the ellipsis (...)
# see 002_nlp_spacy_python.py
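# The custom-delimiter idea referenced above can be illustrated without
# spaCy: split on the ellipsis character instead of the full stop. This is
# a stdlib stand-in (the sample sentence is made up for the sketch), not
# spaCy's pipeline-component API from 002_nlp_spacy_python.py.

```python
import re

ellipsis_text = ('Gus, can you, … never mind, I forgot what I was saying. '
                 'So, do you think we should … well, we should probably wait.')

# Treat "…" as the sentence boundary, the way a custom spaCy component
# would set is_sent_start on the token that follows an ellipsis.
sentences = [chunk.strip() for chunk in re.split('…', ellipsis_text) if chunk.strip()]

for sentence in sentences:
    print(sentence)
# two ellipses -> three sentences
```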
# === File: WSGI-Scripts/WebPageDev/words.py (repo: sch93nz/Linux-Scripts, license: MIT) ===
import random
class Element(object):
    """Shared base for the HTML string-building elements below."""
    tag = ""

    def __init__(self, words):
        # Instance attributes are assigned here so no mutable state is
        # shared through class-level defaults.
        self.ID = random.getrandbits(16)
        self.text = words
        self.webChildren = []

    def getOpening(self):
        return '\r\n<' + self.tag + ' style="">'

    def getEnding(self):
        return '</' + self.tag + '>'

    def webString(self):
        output = self.getOpening()
        output += self.text
        for item in self.webChildren:
            output += item.webString()
        output += self.getEnding()
        return output

class A(Element):
    tag = "a"

    def __init__(self, words):
        Element.__init__(self, words)
        self.link = words

    def getOpening(self):
        # The original emitted a malformed '<a">'; an href attribute was
        # presumably intended here.
        return '\r\n<a href="' + self.link + '">'

class P(Element):
    tag = "p"

class H1(Element):
    tag = "h1"

class H2(Element):
    tag = "h2"

class H3(Element):
    tag = "h3"

class H4(Element):
    tag = "h4"

class H5(Element):
    tag = "h5"

class H6(Element):
    tag = "h6"

# <h7> and <h8> are not standard HTML tags; the classes are kept for
# compatibility with existing callers.
class H7(Element):
    tag = "h7"

class H8(Element):
    tag = "h8"
# === File: assignment2/run_experiments.py (repo: adijo/ift6135-rnn, license: Apache-2.0) ===
import subprocess
def main():
# Since they were too big for git, the saved models (best parameters) for experiments 1, 2 and 3 can be found on
# https://drive.google.com/drive/folders/1CeaePSAqsOERrAY6zxIqkVKJm75TyB1q?usp=sharing
experiments = [
# Experiment 1
"python ptb-lm.py --model=RNN --optimizer=ADAM --initial_lr=0.0001 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.35 --save_best",
# Experiment 2
"python ptb-lm.py --model=GRU --optimizer=SGD_LR_SCHEDULE --initial_lr=10 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.35 --save_best",
# Experiment 3
"python ptb-lm.py --model=TRANSFORMER --optimizer=SGD_LR_SCHEDULE --initial_lr=20 --batch_size=128 --seq_len=35 --hidden_size=512 --num_layers=6 --dp_keep_prob=0.9",
# Experiment 4
"python ptb-lm.py --model=GRU --optimizer=SGD --initial_lr=10 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.35",
# Experiment 5
"python ptb-lm.py --model=GRU --optimizer=ADAM --initial_lr=0.0001 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.35",
# Experiment 6
"python ptb-lm.py --model=GRU --optimizer=SGD_LR_SCHEDULE --initial_lr=10 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.35",
# Experiment 7
"python ptb-lm.py --model=TRANSFORMER --optimizer=SGD --initial_lr=20 --batch_size=128 --seq_len=35 --hidden_size=512 --num_layers=6 --dp_keep_prob=0.9",
# Experiment 8
"python ptb-lm.py --model=TRANSFORMER --optimizer=ADAM --initial_lr=0.001 --batch_size=128 --seq_len=35 --hidden_size=512 --num_layers=2 --dp_keep_prob=0.9",
# Experiment 9
"python ptb-lm.py --model=TRANSFORMER --optimizer=SGD_LR_SCHEDULE --initial_lr=20 --batch_size=128 --seq_len=35 --hidden_size=512 --num_layers=6 --dp_keep_prob=0.9",
# Experiment 10
"python ptb-lm.py --model=RNN --optimizer=SGD --initial_lr=0.0004 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.35",
# Experiment 11
"python ptb-lm.py --model=RNN --optimizer=SGD_LR_SCHEDULE --initial_lr=5 --batch_size=20 --seq_len=35 --hidden_size=512 --num_layers=2 --dp_keep_prob=0.35",
# Experiment 12
"python ptb-lm.py --model=RNN --optimizer=ADAM --initial_lr=0.0001 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.5",
# Experiment 13
"python ptb-lm.py --model=GRU --optimizer=SGD --initial_lr=10 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.25",
# Experiment 14
"python ptb-lm.py --model=GRU --optimizer=SGD_LR_SCHEDULE --initial_lr=15 --batch_size=20 --seq_len=35 --hidden_size=1500 --num_layers=2 --dp_keep_prob=0.35",
# Experiment 15
"python ptb-lm.py --model=GRU --optimizer=ADAM --initial_lr=0.0001 --batch_size=20 --seq_len=35 --hidden_size=1400 --num_layers=2 --dp_keep_prob=0.35",
# Experiment 16
"python ptb-lm.py --model=TRANSFORMER --optimizer=SGD --initial_lr=20 --batch_size=256 --seq_len=35 --hidden_size=256 --num_layers=6 --dp_keep_prob=0.85",
# Experiment 17
"python ptb-lm.py --model=TRANSFORMER --optimizer=SGD_LR_SCHEDULE --initial_lr=20 --batch_size=256 --seq_len=35 --hidden_size=256 --num_layers=6 --dp_keep_prob=0.85",
# Experiment 18
"python ptb-lm.py --model=TRANSFORMER --optimizer=ADAM --initial_lr=0.001 --batch_size=256 --seq_len=35 --hidden_size=128 --num_layers=2 --dp_keep_prob=0.9"
]
for command in experiments:
for message in run_command(command):
print(message, end="")
def run_command(cmd):
    # Without shell=True, Popen needs an argument list rather than a single
    # command string on POSIX systems, so split the command first.
    popen = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    popen.wait()
if __name__ == '__main__':
main()
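`run_command` streams a child process's stdout line by line as a generator, so long training runs print progress immediately instead of buffering. The same pattern can be exercised stand-alone with a trivial command; `stream_command` is an illustrative rename, and passing an argument list avoids needing a shell:

```python
import subprocess
import sys

def stream_command(args):
    # Launch the child process and yield each stdout line as it is produced.
    popen = subprocess.Popen(args, stdout=subprocess.PIPE, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    popen.wait()

lines = list(stream_command([sys.executable, "-c", "print('epoch done')"]))
```

`universal_newlines=True` makes the pipe yield text (with `\n` line endings) rather than bytes, which is what the `iter(..., "")` sentinel relies on.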
# === File: nextcord/ext/interactions/user/user_command.py (repo: abrahammurciano/nextcord, license: MIT) ===
from nextcord.ext.interactions.application_command import ApplicationCommand
class UserCommand(ApplicationCommand):
"""
TODO: Implement
https://discord.com/developers/docs/interactions/application-commands#user-commands
""" | 34.285714 | 87 | 0.7875 | 23 | 240 | 8.173913 | 0.826087 | 0.244681 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1125 | 240 | 7 | 88 | 34.285714 | 0.882629 | 0.4125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
6d225d1d34bd1b59c055f92ad6d726279caafb23 | 3,716 | py | Python | tests/test_blackjack.py | ThatClyde/Simple_BlackJack | e7fb8557b61cd053964ca1cbc21d343330d661b1 | [
"MIT"
] | null | null | null | tests/test_blackjack.py | ThatClyde/Simple_BlackJack | e7fb8557b61cd053964ca1cbc21d343330d661b1 | [
"MIT"
] | null | null | null | tests/test_blackjack.py | ThatClyde/Simple_BlackJack | e7fb8557b61cd053964ca1cbc21d343330d661b1 | [
"MIT"
] | null | null | null | import unittest
from decimal import Decimal
from utils import blackjack_test
from models import Player, Hand, Card
class Test_blackjack_test(unittest.TestCase):
    def _setup(self, player_cards=(), dealer_cards=()):
        """Build a player with a 1.00 bet and bankroll, dealing the given cards."""
        player = Player()
        player.bet = Decimal(1.00)
        player.money = Decimal(1.00)
        player_hand = Hand()
        dealer_hand = Hand()
        for suit, rank in player_cards:
            player_hand.add_card(Card(suit, rank))
        for suit, rank in dealer_cards:
            dealer_hand.add_card(Card(suit, rank))
        return player, player_hand, dealer_hand

    def setup_player_non_blackjack(self):
        return self._setup(player_cards=[('Hearts', 'Jack'), ('Hearts', 'King')])

    def setup_player_blackjack(self):
        return self._setup(player_cards=[('Hearts', 'Jack'), ('Hearts', 'Ace')])

    def setup_dealer_blackjack(self):
        return self._setup(dealer_cards=[('Hearts', 'Jack'), ('Hearts', 'Ace')])

    def setup_dual_blackjack_scenario(self):
        blackjack = [('Hearts', 'Jack'), ('Hearts', 'Ace')]
        return self._setup(player_cards=blackjack, dealer_cards=blackjack)
def test_blackjack_no_blackjack_returns_false(self):
player, player_hand, dealer_hand = self.setup_player_non_blackjack()
self.assertFalse(blackjack_test(player, player_hand, dealer_hand))
def test_blackjack_with_blackjack_return_true(self):
player, player_hand, dealer_hand = self.setup_player_blackjack()
self.assertTrue(blackjack_test(player, player_hand, dealer_hand))
def test_blackjack_with_dual_blackjacks_return_true(self):
player, player_hand, dealer_hand = self.setup_dual_blackjack_scenario()
self.assertTrue(blackjack_test(player, player_hand, dealer_hand))
def test_blackjack_with_dual_blackjack_money_lost(self):
player, player_hand, dealer_hand = self.setup_dual_blackjack_scenario()
blackjack_test(player, player_hand, dealer_hand)
self.assertEqual(player.money, Decimal(0))
def test_blackjack_with_blackjack_wins_increment(self):
player, player_hand, dealer_hand = self.setup_player_blackjack()
blackjack_test(player, player_hand, dealer_hand)
self.assertEqual(player.wins, 1)
def test_blackjack_with_blackjack_money_added(self):
player, player_hand, dealer_hand = self.setup_player_blackjack()
blackjack_test(player, player_hand, dealer_hand)
self.assertEqual(player.money, Decimal(2.5))
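The `Decimal(2.5)` expected here reflects the standard 3:2 blackjack payout: a 1.00 bankroll grows by 1.5x the 1.00 bet. A self-contained sketch of that arithmetic (`blackjack_payout` is an illustrative helper, not part of `utils`):

```python
from decimal import Decimal

def blackjack_payout(money, bet):
    # A natural blackjack pays 3:2, so the bankroll gains 1.5x the bet.
    return money + bet * Decimal("1.5")

# Mirrors the expectation in test_blackjack_with_blackjack_money_added.
payout = blackjack_payout(Decimal("1.00"), Decimal("1.00"))
```

Using `Decimal` strings for the multiplier keeps the arithmetic exact, which is why the test can compare with equality rather than an approximate check.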
def test_blackjack_with_dealer_blackjack_money_lost(self):
player, player_hand, dealer_hand = self.setup_dealer_blackjack()
blackjack_test(player, player_hand, dealer_hand)
self.assertEqual(player.money, Decimal(0))
def test_blackjack_with_dealer_blackjack_return_true(self):
player, player_hand, dealer_hand = self.setup_dealer_blackjack()
self.assertTrue(blackjack_test(player, player_hand, dealer_hand))
if __name__ == '__main__':
    unittest.main()
# === File: com/vmware/vcenter/lcm_client.py (repo: vishal-12/vsphere-automation-sdk-python, license: MIT) ===
# -*- coding: utf-8 -*-
#---------------------------------------------------------------------------
# Copyright 2019 VMware, Inc. All rights reserved.
# AUTO GENERATED FILE -- DO NOT MODIFY!
#
# vAPI stub file for package com.vmware.vcenter.lcm.
#---------------------------------------------------------------------------
"""
"""
__author__ = 'VMware, Inc.'
__docformat__ = 'restructuredtext en'
import sys
from com.vmware.cis_client import Tasks
from vmware.vapi.stdlib.client.task import Task
from vmware.vapi.bindings import type
from vmware.vapi.bindings.converter import TypeConverter
from vmware.vapi.bindings.enum import Enum
from vmware.vapi.bindings.error import VapiError
from vmware.vapi.bindings.struct import VapiStruct
from vmware.vapi.bindings.stub import (
ApiInterfaceStub, StubFactoryBase, VapiInterface)
from vmware.vapi.bindings.common import raise_core_exception
from vmware.vapi.data.validator import (UnionValidator, HasFieldsOfValidator)
from vmware.vapi.exception import CoreException
from vmware.vapi.lib.constants import TaskType
from vmware.vapi.lib.rest import OperationRestMetadata
class ApplianceSize(Enum):
"""
The size of appliance to be deployed.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
TINY = None
"""
Appliance size of 'tiny', Default vCPUs: 2, Memory: 8GB, VM: 100, Hosts: 10
"""
SMALL = None
"""
Appliance size of 'small', Default vCPUs: 4, Memory: 16GB, VM: 1000, Hosts:
100
"""
MEDIUM = None
"""
Appliance size of 'medium', Default vCPUs: 8, Memory: 24GB, VM: 4000,
Hosts: 400
"""
LARGE = None
"""
Appliance size of 'large', Default vCPUs: 16, Memory: 32GB, VM: 10000,
Hosts: 1000
"""
XLARGE = None
"""
Appliance size of 'extra large', Default vCPUs: 24, Memory: 48GB, VM:
35000, Hosts: 2000
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`ApplianceSize` instance.
"""
Enum.__init__(string)
ApplianceSize._set_values([
ApplianceSize('TINY'),
ApplianceSize('SMALL'),
ApplianceSize('MEDIUM'),
ApplianceSize('LARGE'),
ApplianceSize('XLARGE'),
])
ApplianceSize._set_binding_type(type.EnumType(
'com.vmware.vcenter.lcm.appliance_size',
ApplianceSize))
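The docstring's point that newer enum values are handled "by instantiating this class" can be illustrated with a tiny self-contained stand-in. `OpenSize` is illustrative only; the real bindings build on `vmware.vapi`'s `Enum`:

```python
class OpenSize(str):
    """String-backed 'open' enum: known values hang off the class, but any
    string can be wrapped, so values added in newer API versions still work."""

OpenSize.TINY = OpenSize("TINY")
OpenSize.SMALL = OpenSize("SMALL")

newer = OpenSize("XXLARGE")  # a value a newer server version might return
```

Because the wrapper is still a string, clients compiled against an older binding can pass the unknown value through without crashing on it.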
class ApplianceType(Enum):
"""
The type of appliance to be deployed.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
VCSA_EXTERNAL = None
"""
Management node.
"""
VCSA_EMBEDDED = None
"""
Embedded node.
"""
PSC = None
"""
Infrastructure node.
"""
VMC = None
"""
VMC node.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`ApplianceType` instance.
"""
Enum.__init__(string)
ApplianceType._set_values([
ApplianceType('VCSA_EXTERNAL'),
ApplianceType('VCSA_EMBEDDED'),
ApplianceType('PSC'),
ApplianceType('VMC'),
])
ApplianceType._set_binding_type(type.EnumType(
'com.vmware.vcenter.lcm.appliance_type',
ApplianceType))
class StorageSize(Enum):
"""
The storage size of the appliance to be deployed.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
LARGE = None
"""
Large storage
"""
XLARGE = None
"""
Extra large storage
"""
REGULAR = None
"""
Regular storage
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`StorageSize` instance.
"""
Enum.__init__(string)
StorageSize._set_values([
StorageSize('LARGE'),
StorageSize('XLARGE'),
StorageSize('REGULAR'),
])
StorageSize._set_binding_type(type.EnumType(
'com.vmware.vcenter.lcm.storage_size',
StorageSize))
class CeipOnlySso(VapiStruct):
"""
The SSO definition that only contains CEIP setting.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
ceip_enabled=None,
):
"""
:type ceip_enabled: :class:`bool` or ``None``
:param ceip_enabled: This key describes the enabling option for the VMware's Customer
Experience Improvement Program (CEIP). By default we have
``ceipEnabled``: true, which indicates that you are joining CEIP.
If you prefer not to participate in the VMware's CEIP for this
product, you must disable CEIP by setting ``ceipEnabled``: false.
You may join or leave VMware's CEIP for this product at any time.
If None, defaults to True
"""
self.ceip_enabled = ceip_enabled
VapiStruct.__init__(self)
CeipOnlySso._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.ceip_only_sso', {
'ceip_enabled': type.OptionalType(type.BooleanType()),
},
CeipOnlySso,
False,
None))
class Connection(VapiStruct):
"""
Connection information for source/destination location.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
hostname=None,
username=None,
password=None,
https_port=None,
ssl_verify=None,
ssl_thumbprint=None,
):
"""
:type hostname: :class:`str`
:param hostname: The IP address or DNS resolvable name of the ESX/VC host. If a DNS
resolvable name is provided, it must be resolvable from the machine
that is running the installer.
:type username: :class:`str`
:param username: A username with administrative privileges on the ESX/VC host.
:type password: :class:`str`
:param password: The password of the 'username' on the ESX/VC host.
:type https_port: :class:`long` or ``None``
:param https_port: The port number for the ESX/VC.
If None, defaults to 443
:type ssl_verify: :class:`bool` or ``None``
:param ssl_verify: A flag to indicate whether the ssl verification is required.
If ``sslThumbprint`` is provided, this field can be omitted. If
None, defaults to True
:type ssl_thumbprint: :class:`str` or ``None``
:param ssl_thumbprint: Thumbprint for SSL verification.
If ``sslVerify`` is false, this field is not required
"""
self.hostname = hostname
self.username = username
self.password = password
self.https_port = https_port
self.ssl_verify = ssl_verify
self.ssl_thumbprint = ssl_thumbprint
VapiStruct.__init__(self)
Connection._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.connection', {
'hostname': type.StringType(),
'username': type.StringType(),
'password': type.SecretType(),
'https_port': type.OptionalType(type.IntegerType()),
'ssl_verify': type.OptionalType(type.BooleanType()),
'ssl_thumbprint': type.OptionalType(type.StringType()),
},
Connection,
False,
None))
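One reading of the ``ssl_verify``/``ssl_thumbprint`` notes above is that verification needs either a thumbprint to pin against or an explicit opt-out. A hypothetical pre-flight check, not part of the generated bindings:

```python
def connection_ssl_ok(ssl_verify=True, ssl_thumbprint=None):
    # Verification is satisfied either by opting out (ssl_verify=False)
    # or by supplying a thumbprint to verify against.
    return (not ssl_verify) or (ssl_thumbprint is not None)

results = [
    connection_ssl_ok(ssl_verify=False),           # explicit opt-out
    connection_ssl_ok(ssl_thumbprint="AA:BB:CC"),  # thumbprint pinning
    connection_ssl_ok(),                           # neither provided
]
```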
class DeploymentInfo(VapiStruct):
"""
Information about the appliance deployed.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
appliance_name=None,
appliance_fqdn=None,
appliance_ips=None,
):
"""
:type appliance_name: :class:`str`
:param appliance_name: The name of the appliance.
:type appliance_fqdn: :class:`str` or ``None``
:param appliance_fqdn: The FQDN of the appliance.
Not applicable before firstboot.
:type appliance_ips: :class:`list` of :class:`str` or ``None``
:param appliance_ips: The ip addresses of the appliance.
Not applicable before firstboot.
"""
self.appliance_name = appliance_name
self.appliance_fqdn = appliance_fqdn
self.appliance_ips = appliance_ips
VapiStruct.__init__(self)
DeploymentInfo._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.deployment_info', {
'appliance_name': type.StringType(),
'appliance_fqdn': type.OptionalType(type.StringType()),
'appliance_ips': type.OptionalType(type.ListType(type.StringType())),
},
DeploymentInfo,
False,
None))
class DeploymentOption(VapiStruct):
"""
Container to control deployment.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
skip_options=None,
):
"""
:type skip_options: :class:`dict` of :class:`DeploymentOption.SkipOptions` and :class:`bool`
:param skip_options: The options control if a task should be skipped.
"""
self.skip_options = skip_options
VapiStruct.__init__(self)
class SkipOptions(Enum):
"""
Skippable tasks.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
SKIP_SSO_CHECK = None
"""
Skips the sso check. This should only be used when performing precheck for
install/upgrade of management node before infrastructure node is deployed.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`SkipOptions` instance.
"""
Enum.__init__(string)
SkipOptions._set_values([
SkipOptions('SKIP_SSO_CHECK'),
])
SkipOptions._set_binding_type(type.EnumType(
'com.vmware.vcenter.lcm.deployment_option.skip_options',
SkipOptions))
DeploymentOption._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.deployment_option', {
'skip_options': type.MapType(type.ReferenceType(__name__, 'DeploymentOption.SkipOptions'), type.BooleanType()),
},
DeploymentOption,
False,
None))
class DestinationLocation(VapiStruct):
"""
Configuration of destination location.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
esx=None,
vcenter=None,
):
"""
:type esx: :class:`Esx` or ``None``
:param esx: This section describes the ESX host on which to deploy the
appliance. Required if you are deploying the appliance directly on
an ESX host.
Mutually exclusive between ``esx`` and ``vcenter``
:type vcenter: :class:`Vc` or ``None``
:param vcenter: This subsection describes the vCenter on which to deploy the
appliance.
Mutually exclusive between ``esx`` and ``vcenter``
"""
self.esx = esx
self.vcenter = vcenter
VapiStruct.__init__(self)
DestinationLocation._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.destination_location', {
'esx': type.OptionalType(type.ReferenceType(__name__, 'Esx')),
'vcenter': type.OptionalType(type.ReferenceType(__name__, 'Vc')),
},
DestinationLocation,
False,
None))
class EmbeddedReplicatedVcsa(VapiStruct):
"""
Configuration of the replicated Single Sign-On for Embedded type
deployment.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
sso_admin_password=None,
sso_domain_name=None,
partner_hostname=None,
ssl_verify=None,
ssl_thumbprint=None,
https_port=None,
):
"""
:type sso_admin_password: :class:`str`
:param sso_admin_password: Administrator password of the existing Single Sign-On to be
replicated.
:type sso_domain_name: :class:`str`
:param sso_domain_name: Domain name for the remote appliance which is being replicated. For
example, 'vsphere.local'
:type partner_hostname: :class:`str`
:param partner_hostname: The IP address or DNS resolvable name for the remote appliance.
:type ssl_verify: :class:`bool` or ``None``
:param ssl_verify: A flag to indicate whether the ssl verification is required.
If ``sslThumbprint`` is provided, this field can be omitted. If
None, defaults to True
:type ssl_thumbprint: :class:`str` or ``None``
:param ssl_thumbprint: SHA1 thumbprint of the server SSL certificate which will be used
for verification.
If ``sslVerify`` is set to False, this field can be omitted
:type https_port: :class:`long` or ``None``
:param https_port: The HTTPS port of the external PSC appliance.
If None, defaults to 443
"""
self.sso_admin_password = sso_admin_password
self.sso_domain_name = sso_domain_name
self.partner_hostname = partner_hostname
self.ssl_verify = ssl_verify
self.ssl_thumbprint = ssl_thumbprint
self.https_port = https_port
VapiStruct.__init__(self)
EmbeddedReplicatedVcsa._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.embedded_replicated_vcsa', {
'sso_admin_password': type.SecretType(),
'sso_domain_name': type.StringType(),
'partner_hostname': type.StringType(),
'ssl_verify': type.OptionalType(type.BooleanType()),
'ssl_thumbprint': type.OptionalType(type.StringType()),
'https_port': type.OptionalType(type.IntegerType()),
},
EmbeddedReplicatedVcsa,
False,
None))
class EmbeddedStandaloneVcsa(VapiStruct):
"""
Configuration of the standalone Single Sign-On for Embedded type
deployment.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
sso_admin_password=None,
sso_domain_name=None,
):
"""
:type sso_admin_password: :class:`str`
:param sso_admin_password: Password must conform to the following requirements: 1. At least 8
characters. 2. No more than 20 characters. 3. At least 1 uppercase
character. 4. At least 1 lowercase character. 5. At least 1 number.
6. At least 1 special character (e.g., '!', '(', '\\\\@', etc.). 7.
Only visible A-Z, a-z, 0-9 and punctuation (spaces are not allowed)
:type sso_domain_name: :class:`str`
:param sso_domain_name: The Single Sign-On domain name to be used to configure this vCenter
Server Appliance. For example, 'vsphere.local'
"""
self.sso_admin_password = sso_admin_password
self.sso_domain_name = sso_domain_name
VapiStruct.__init__(self)
EmbeddedStandaloneVcsa._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.embedded_standalone_vcsa', {
'sso_admin_password': type.SecretType(),
'sso_domain_name': type.StringType(),
},
EmbeddedStandaloneVcsa,
False,
None))
class Esx(VapiStruct):
"""
Configuration of ESX.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
inventory=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
:type inventory: :class:`EsxInventory`
:param inventory: The configuration of ESX inventory.
"""
self.connection = connection
self.inventory = inventory
VapiStruct.__init__(self)
Esx._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.esx', {
'connection': type.ReferenceType(__name__, 'Connection'),
'inventory': type.ReferenceType(__name__, 'EsxInventory'),
},
Esx,
False,
None))
class EsxInventory(VapiStruct):
"""
Configuration of ESX's inventory.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
datastore_name=None,
network_name=None,
resource_pool_path=None,
):
"""
:type datastore_name: :class:`str`
:param datastore_name: The datastore on which to store the files of the appliance. This
value has to be either a specific datastore name, or a specific
datastore in a datastore cluster. The datastore must be accessible
from the ESX host and must have at least 25 GB of free space.
Otherwise, the new appliance might not power on.
:type network_name: :class:`str` or ``None``
:param network_name: The network of the ESX host to which the new appliance should
connect. Omit this parameter if the ESX host has one network.
If None, defaults to VM Network
:type resource_pool_path: :class:`str` or ``None``
:param resource_pool_path: The path to the resource pool on the ESX host in which the
appliance will be deployed.
Not applicable when not in resource pool
"""
self.datastore_name = datastore_name
self.network_name = network_name
self.resource_pool_path = resource_pool_path
VapiStruct.__init__(self)
EsxInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.esx_inventory', {
'datastore_name': type.StringType(),
'network_name': type.OptionalType(type.StringType()),
'resource_pool_path': type.OptionalType(type.StringType()),
},
EsxInventory,
False,
None))
class ExistingMigrationAssistant(VapiStruct):
"""
Configuration of the migration assistant that is already running on the
source Windows VC.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
ssl_thumbprint=None,
https_port=None,
):
"""
:type ssl_thumbprint: :class:`str`
:param ssl_thumbprint: The SSL thumbprint of Migration Assistant. The SSL thumbprint can
be retrieved from the Migration Assistant console and log file.
:type https_port: :class:`long` or ``None``
:param https_port: Migration Assistant port number shown in the Migration Assistant
console and log file. The default port is 9123.
If None, defaults to 9123
"""
self.ssl_thumbprint = ssl_thumbprint
self.https_port = https_port
VapiStruct.__init__(self)
ExistingMigrationAssistant._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.existing_migration_assistant', {
'ssl_thumbprint': type.StringType(),
'https_port': type.OptionalType(type.IntegerType()),
},
ExistingMigrationAssistant,
False,
None))
class ExternalTool(VapiStruct):
"""
Configuration of the external tools used.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
name=None,
hostname=None,
location=None,
):
"""
:type name: :class:`str`
:param name: The name of the external tool
:type hostname: :class:`str` or ``None``
:param hostname: The host name of the external tool.
Can be absent when external tool does not have a host name.
:type location: :class:`str`
:param location: The location of the external tool.
"""
self.name = name
self.hostname = hostname
self.location = location
VapiStruct.__init__(self)
ExternalTool._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.external_tool', {
'name': type.StringType(),
'hostname': type.OptionalType(type.StringType()),
'location': type.StringType(),
},
ExternalTool,
False,
None))
class ExternalVcsa(VapiStruct):
"""
Configuration of the Single Sign-On for Management type deployment.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
sso_admin_password=None,
sso_domain_name=None,
psc_hostname=None,
ssl_verify=None,
ssl_thumbprint=None,
https_port=None,
):
"""
:type sso_admin_password: :class:`str`
:param sso_admin_password: Administrator password of the external PSC to register with.
:type sso_domain_name: :class:`str`
:param sso_domain_name: Domain name of the external PSC. For example, 'vsphere.local'
:type psc_hostname: :class:`str`
:param psc_hostname: The IP address or DNS resolvable name of the remote PSC to which
this configuring vCenter Server will be registered.
:type ssl_verify: :class:`bool` or ``None``
:param ssl_verify: A flag to indicate whether the SSL verification is required
If ``sslThumbprint`` is provided, this field can be omitted.
If None, defaults to False
:type ssl_thumbprint: :class:`str` or ``None``
:param ssl_thumbprint: SHA1 thumbprint of the server SSL certificate which will be used
for verification.
If ``sslVerify`` is set to False, this field can be omitted.
:type https_port: :class:`long` or ``None``
:param https_port: The HTTPS port of the external PSC appliance.
If None, defaults to 443
"""
self.sso_admin_password = sso_admin_password
self.sso_domain_name = sso_domain_name
self.psc_hostname = psc_hostname
self.ssl_verify = ssl_verify
self.ssl_thumbprint = ssl_thumbprint
self.https_port = https_port
VapiStruct.__init__(self)
ExternalVcsa._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.external_vcsa', {
'sso_admin_password': type.SecretType(),
'sso_domain_name': type.StringType(),
'psc_hostname': type.StringType(),
'ssl_verify': type.OptionalType(type.BooleanType()),
'ssl_thumbprint': type.OptionalType(type.StringType()),
'https_port': type.OptionalType(type.IntegerType()),
},
ExternalVcsa,
False,
None))
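Several of the structures above (``ExternalVcsa``, ``PscReplicated``, ``ExistingMigrationAssistant``) take a SHA1 ``ssl_thumbprint``. As a minimal, standard-library-only sketch (the helper name is illustrative, not part of this module), a DER-encoded certificate can be turned into the colon-separated form these fields expect:

```python
import hashlib

def sha1_thumbprint(der_bytes):
    # SHA1 digest of the DER-encoded certificate, formatted as
    # colon-separated uppercase hex pairs, e.g. "DA:39:A3:...".
    digest = hashlib.sha1(der_bytes).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))
```

A PEM certificate fetched with ``ssl.get_server_certificate`` can be converted to DER with ``ssl.PEM_cert_to_DER_cert`` before hashing.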
class GuestCredential(VapiStruct):
"""
Configuration of the guest credential.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
os_username=None,
os_password=None,
):
"""
:type os_username: :class:`str`
:param os_username: Administrator username for the source Windows operating system.
:type os_password: :class:`str`
:param os_password: Administrator user password for the source Windows operating
system.
"""
self.os_username = os_username
self.os_password = os_password
VapiStruct.__init__(self)
GuestCredential._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.guest_credential', {
'os_username': type.StringType(),
'os_password': type.SecretType(),
},
GuestCredential,
False,
None))
class History(VapiStruct):
"""
Configuration of the data to be exported during upgrade/migrate.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
defer_import=None,
data_set=None,
):
"""
:type defer_import: :class:`bool` or ``None``
:param defer_import: A flag to indicate whether the import of historical data should be
deferred until after upgrade/migrate.
If None, defaults to False
:type data_set: :class:`History.DataSetType` or ``None``
:param data_set: The type of data to be upgraded/migrated.
If None, defaults to ALL
"""
self.defer_import = defer_import
self.data_set = data_set
VapiStruct.__init__(self)
class DataSetType(Enum):
"""
The type of data to be upgraded/migrated.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
EVENTS_TASKS = None
"""
Event and task data.
"""
NONE = None
"""
Core data only.
"""
ALL = None
"""
Core, event, and task data.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`DataSetType` instance.
"""
Enum.__init__(string)
DataSetType._set_values([
DataSetType('EVENTS_TASKS'),
DataSetType('NONE'),
DataSetType('ALL'),
])
DataSetType._set_binding_type(type.EnumType(
'com.vmware.vcenter.lcm.history.data_set_type',
DataSetType))
History._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.history', {
'defer_import': type.OptionalType(type.BooleanType()),
'data_set': type.OptionalType(type.ReferenceType(__name__, 'History.DataSetType')),
},
History,
False,
None))
class MigrationAssistant(VapiStruct):
"""
Configuration of the migration assistant to be uploaded and started on the
source Windows VC.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
source_location=None,
settings=None,
guest_credentials=None,
migration_assistant_installer_location=None,
migration_assistant_installer_location_ssl_verify=None,
migration_assistant_installer_location_ssl_thumbprint=None,
):
"""
:type source_location: :class:`Connection`
:param source_location: The configuration to connect to an ESX/VC.
:type settings: :class:`MigrationAssistantSetting`
:param settings: Spec to automatically launch the Migration Assistant.
:type guest_credentials: :class:`GuestCredential`
:param guest_credentials: Credentials for the Windows system on which the vCenter server is
running.
:type migration_assistant_installer_location: :class:`str` or ``None``
:param migration_assistant_installer_location: Installer location of the migration assistant to be uploaded.
:type migration_assistant_installer_location_ssl_verify: :class:`bool` or ``None``
:param migration_assistant_installer_location_ssl_verify: A flag to indicate whether to verify the SSL connection.
When an SSL thumbprint is provided, SSL verification is not required.
:type migration_assistant_installer_location_ssl_thumbprint: :class:`str` or ``None``
:param migration_assistant_installer_location_ssl_thumbprint: SSL thumbprint of the source appliance.
If SSL verification is set to False, the thumbprint is not required.
"""
self.source_location = source_location
self.settings = settings
self.guest_credentials = guest_credentials
self.migration_assistant_installer_location = migration_assistant_installer_location
self.migration_assistant_installer_location_ssl_verify = migration_assistant_installer_location_ssl_verify
self.migration_assistant_installer_location_ssl_thumbprint = migration_assistant_installer_location_ssl_thumbprint
VapiStruct.__init__(self)
MigrationAssistant._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.migration_assistant', {
'source_location': type.ReferenceType(__name__, 'Connection'),
'settings': type.ReferenceType(__name__, 'MigrationAssistantSetting'),
'guest_credentials': type.ReferenceType(__name__, 'GuestCredential'),
'migration_assistant_installer_location': type.OptionalType(type.StringType()),
'migration_assistant_installer_location_ssl_verify': type.OptionalType(type.BooleanType()),
'migration_assistant_installer_location_ssl_thumbprint': type.OptionalType(type.StringType()),
},
MigrationAssistant,
False,
None))
class MigrationAssistantSetting(VapiStruct):
"""
Configuration of the Migration Assistant settings for the assistant to be
uploaded and started on the source Windows VC.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
migrated_ip=None,
https_port=None,
export_dir=None,
service_account_password=None,
):
"""
:type migrated_ip: :class:`str` or ``None``
:param migrated_ip: The IP address of the network adapter that will be migrated. Only
required if the Windows vCenter Server has multiple network
adapters, making its system name resolve to multiple IP addresses.
May not be applicable.
:type https_port: :class:`long` or ``None``
:param https_port: Migration Assistant port number shown in the Migration Assistant
console.
If None, defaults to 9123
:type export_dir: :class:`str` or ``None``
:param export_dir: Directory to export source configuration and data.
If None, defaults to /var/tmp
:type service_account_password: :class:`str` or ``None``
:param service_account_password: The password of the vCenter Server service account. Required only
if the vCenter Server service is running under a non-LocalSystem
account.
Not applicable when the service runs under the LocalSystem account.
"""
self.migrated_ip = migrated_ip
self.https_port = https_port
self.export_dir = export_dir
self.service_account_password = service_account_password
VapiStruct.__init__(self)
MigrationAssistantSetting._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.migration_assistant_setting', {
'migrated_ip': type.OptionalType(type.StringType()),
'https_port': type.OptionalType(type.IntegerType()),
'export_dir': type.OptionalType(type.StringType()),
'service_account_password': type.OptionalType(type.SecretType()),
},
MigrationAssistantSetting,
False,
None))
class Network(VapiStruct):
"""
Network configuration of the appliance to be deployed.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'mode',
{
'STATIC' : [('hostname', False), ('ip', True), ('dns_servers', True), ('prefix', True), ('gateway', True)],
'DHCP' : [],
}
),
]
def __init__(self,
hostname=None,
ip_family=None,
mode=None,
ip=None,
dns_servers=None,
prefix=None,
gateway=None,
):
"""
:type hostname: :class:`str` or ``None``
:param hostname: Primary network identity. Can be either an IP address or a fully
qualified domain name (FQDN).
Host name may not be applicable.
:type ip_family: :class:`TemporaryNetwork.IpType` or ``None``
:param ip_family: Network IP address family.
If None, defaults to IPV4
:type mode: :class:`TemporaryNetwork.NetworkMode`
:param mode: Network mode.
:type ip: :class:`str`
:param ip: Network IP address. Required for static mode only.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
:type dns_servers: :class:`list` of :class:`str`
:param dns_servers: A list of IP addresses of DNS servers, for example ["1.2.3.4",
"127.0.0.1"]. Required for static mode only. DNS servers must be
reachable from the machine that runs the CLI installer.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
:type prefix: :class:`long`
:param prefix: Network prefix length. Required for static mode only; remove if the
mode is "dhcp". This is the number of bits set in the subnet mask;
for instance, if the subnet mask is 255.255.255.0, there are 24
bits in the binary version of the subnet mask, so the prefix length
is 24. If used, the value must be in the inclusive range of 0 to
32 for IPv4 and 0 to 128 for IPv6.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
:type gateway: :class:`str`
:param gateway: Gateway of the network. Required for static mode only.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
"""
self.hostname = hostname
self.ip_family = ip_family
self.mode = mode
self.ip = ip
self.dns_servers = dns_servers
self.prefix = prefix
self.gateway = gateway
VapiStruct.__init__(self)
Network._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.network', {
'hostname': type.OptionalType(type.StringType()),
'ip_family': type.OptionalType(type.ReferenceType(__name__, 'TemporaryNetwork.IpType')),
'mode': type.ReferenceType(__name__, 'TemporaryNetwork.NetworkMode'),
'ip': type.OptionalType(type.StringType()),
'dns_servers': type.OptionalType(type.ListType(type.StringType())),
'prefix': type.OptionalType(type.IntegerType()),
'gateway': type.OptionalType(type.StringType()),
},
Network,
False,
None))
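The ``prefix`` docstring above defines the prefix length as the number of set bits in the subnet mask (255.255.255.0 → 24). A small standard-library sketch of that conversion (the helper name is illustrative):

```python
import ipaddress

def prefix_from_netmask(netmask):
    # Number of set bits in a dotted-quad subnet mask,
    # e.g. "255.255.255.0" -> 24.
    return ipaddress.ip_network("0.0.0.0/" + netmask).prefixlen
```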
class Notification(VapiStruct):
"""
Notification messages of a single task.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
id=None,
time=None,
message=None,
resolution=None,
):
"""
:type id: :class:`str`
:param id: The identifier of the message.
:type time: :class:`datetime.datetime` or ``None``
:param time: The time the notification was raised/found.
Only :class:`set` if the time information is available.
:type message: :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param message: The notification message.
:type resolution: :class:`com.vmware.vapi.std_client.LocalizableMessage` or ``None``
:param resolution: The resolution message, if any.
Only :class:`set` for warnings and errors.
"""
self.id = id
self.time = time
self.message = message
self.resolution = resolution
VapiStruct.__init__(self)
Notification._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.notification', {
'id': type.StringType(),
'time': type.OptionalType(type.DateTimeType()),
'message': type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage'),
'resolution': type.OptionalType(type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage')),
},
Notification,
False,
None))
class PscReplicated(VapiStruct):
"""
Configuration of the replicated Single Sign-On for PSC type deployment.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
sso_admin_password=None,
sso_domain_name=None,
partner_hostname=None,
ssl_verify=None,
ssl_thumbprint=None,
https_port=None,
sso_site_name=None,
):
"""
:type sso_admin_password: :class:`str`
:param sso_admin_password: Administrator password of the PSC to be replicated.
:type sso_domain_name: :class:`str`
:param sso_domain_name: Domain name of the remote PSC. For example, 'vsphere.local'
:type partner_hostname: :class:`str`
:param partner_hostname: The IP address or DNS resolvable name of the remote PSC.
:type ssl_verify: :class:`bool` or ``None``
:param ssl_verify: A flag to indicate whether the SSL verification is required.
If ``sslThumbprint`` is provided, this field can be omitted.
If None, defaults to False
:type ssl_thumbprint: :class:`str` or ``None``
:param ssl_thumbprint: SHA1 thumbprint of the server SSL certificate which will be used
for verification.
If ``sslVerify`` is set to False, this field can be omitted
:type https_port: :class:`long` or ``None``
:param https_port: The HTTPS port of the remote PSC.
If None, defaults to 443
:type sso_site_name: :class:`str` or ``None``
:param sso_site_name: Site name of the newly deployed PSC.
Site name may not be applicable
"""
self.sso_admin_password = sso_admin_password
self.sso_domain_name = sso_domain_name
self.partner_hostname = partner_hostname
self.ssl_verify = ssl_verify
self.ssl_thumbprint = ssl_thumbprint
self.https_port = https_port
self.sso_site_name = sso_site_name
VapiStruct.__init__(self)
PscReplicated._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.psc_replicated', {
'sso_admin_password': type.SecretType(),
'sso_domain_name': type.StringType(),
'partner_hostname': type.StringType(),
'ssl_verify': type.OptionalType(type.BooleanType()),
'ssl_thumbprint': type.OptionalType(type.StringType()),
'https_port': type.OptionalType(type.IntegerType()),
'sso_site_name': type.OptionalType(type.StringType()),
},
PscReplicated,
False,
None))
class PscStandalone(VapiStruct):
"""
Configuration of the standalone Single Sign-On for Embedded type
deployment.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
sso_admin_password=None,
sso_domain_name=None,
sso_site_name=None,
):
"""
:type sso_admin_password: :class:`str`
:param sso_admin_password: Password must conform to the following requirements: 1. At least 8
characters. 2. No more than 20 characters. 3. At least 1 uppercase
character. 4. At least 1 lowercase character. 5. At least 1 number.
6. At least 1 special character (e.g., '!', '(', '\\\\@', etc.). 7.
Only visible A-Z, a-z, 0-9 and punctuation (spaces are not allowed)
:type sso_domain_name: :class:`str`
:param sso_domain_name: Domain name of the newly deployed PSC. For example, 'vsphere.local'
:type sso_site_name: :class:`str` or ``None``
:param sso_site_name: Site name of the PSC.
Site name may not be applicable
"""
self.sso_admin_password = sso_admin_password
self.sso_domain_name = sso_domain_name
self.sso_site_name = sso_site_name
VapiStruct.__init__(self)
PscStandalone._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.psc_standalone', {
'sso_admin_password': type.SecretType(),
'sso_domain_name': type.StringType(),
'sso_site_name': type.OptionalType(type.StringType()),
},
PscStandalone,
False,
None))
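The ``sso_admin_password`` rules listed above can be checked up front. A hedged sketch (the function is illustrative, not part of the SDK; rule 7 is read as "printable ASCII, no spaces"):

```python
import string

def is_valid_sso_password(password):
    # Checks the seven documented rules: length 8-20, at least one
    # uppercase, one lowercase, one digit, and one punctuation
    # character, and only visible ASCII characters (no spaces).
    visible = set(string.ascii_letters + string.digits + string.punctuation)
    return (
        8 <= len(password) <= 20
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
        and all(c in visible for c in password)
    )
```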
class Result(VapiStruct):
"""
Container of info, warning and error messages associated with a single
task.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
info=None,
warnings=None,
errors=None,
):
"""
:type info: :class:`list` of :class:`Notification` or ``None``
:param info: Info notification messages reported.
Only :class:`set` if an info was reported by the task.
:type warnings: :class:`list` of :class:`Notification` or ``None``
:param warnings: Warning notification messages reported.
Only :class:`set` if a warning was reported by the task.
:type errors: :class:`list` of :class:`Notification` or ``None``
:param errors: Error notification messages reported.
Only :class:`set` if an error was reported by the task.
"""
self.info = info
self.warnings = warnings
self.errors = errors
VapiStruct.__init__(self)
Result._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.result', {
'info': type.OptionalType(type.ListType(type.ReferenceType(__name__, 'Notification'))),
'warnings': type.OptionalType(type.ListType(type.ReferenceType(__name__, 'Notification'))),
'errors': type.OptionalType(type.ListType(type.ReferenceType(__name__, 'Notification'))),
},
Result,
False,
None))
class SourceVcWindows(VapiStruct):
"""
Configuration of the source Windows VC.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
hostname=None,
username=None,
password=None,
ssl_verify=None,
ssl_thumbprint=None,
):
"""
:type hostname: :class:`str`
:param hostname: The IP address or FQDN of the source Windows vCenter server to
migrate. If an FQDN is provided, it must be resolvable from the
machine that is running the installer.
:type username: :class:`str`
:param username: Single Sign-On administrator user on the source Windows vCenter
server. For example, administrator\\\\@vsphere.local. Important:
The user must be administrator\\\\@your_domain_name.
:type password: :class:`str`
:param password: The password of the Single Sign-On administrator on the source
Windows vCenter server.
:type ssl_verify: :class:`bool` or ``None``
:param ssl_verify: A flag to indicate whether the SSL verification is required.
If ``sslThumbprint`` is provided, this field can be omitted.
If None, defaults to False
:type ssl_thumbprint: :class:`str` or ``None``
:param ssl_thumbprint: SHA1 thumbprint of the server SSL certificate which will be used
for verification.
If ``sslVerify`` is set to False, this field can be omitted.
"""
self.hostname = hostname
self.username = username
self.password = password
self.ssl_verify = ssl_verify
self.ssl_thumbprint = ssl_thumbprint
VapiStruct.__init__(self)
SourceVcWindows._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.source_vc_windows', {
'hostname': type.StringType(),
'username': type.StringType(),
'password': type.SecretType(),
'ssl_verify': type.OptionalType(type.BooleanType()),
'ssl_thumbprint': type.OptionalType(type.StringType()),
},
SourceVcWindows,
False,
None))
class SourceVum(VapiStruct):
"""
Configuration of the source VUM.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
hostname=None,
os_username=None,
os_password=None,
export_dir=None,
port=None,
start_migration_assistant=None,
existing_migration_assistant=None,
):
"""
:type hostname: :class:`str`
:param hostname: IP address or fully qualified domain name (FQDN) of the vSphere
Update Manager host. If an FQDN is provided, it has to be
resolvable from the machine that is running the installer.
:type os_username: :class:`str`
:param os_username: Administrator username for the source vSphere Update Manager
Windows operating system.
:type os_password: :class:`str`
:param os_password: Administrator user password for the source vSphere Update Manager
Windows operating system.
:type export_dir: :class:`str` or ``None``
:param export_dir: Directory to export source configuration and data.
If None, defaults to /var/tmp on Mac/Linux and %TEMP% on Windows
:type port: :class:`long` or ``None``
:param port: The port of the source VUM.
If None, defaults to 9123
:type start_migration_assistant: :class:`SourceVumMigrationAssistant` or ``None``
:param start_migration_assistant: Configuration of the migration assistant to be deployed to the
vSphere Update Manager host.
Mutually exclusive with ``existing_migration_assistant``.
:type existing_migration_assistant: :class:`ExistingMigrationAssistant` or ``None``
:param existing_migration_assistant: Configuration of a migration assistant that is already running on
the vSphere Update Manager host.
"""
self.hostname = hostname
self.os_username = os_username
self.os_password = os_password
self.export_dir = export_dir
self.port = port
self.start_migration_assistant = start_migration_assistant
self.existing_migration_assistant = existing_migration_assistant
VapiStruct.__init__(self)
SourceVum._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.source_vum', {
'hostname': type.StringType(),
'os_username': type.StringType(),
'os_password': type.SecretType(),
'export_dir': type.OptionalType(type.StringType()),
'port': type.OptionalType(type.IntegerType()),
'start_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'SourceVumMigrationAssistant')),
'existing_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'ExistingMigrationAssistant')),
},
SourceVum,
False,
None))
class SourceVumMigrationAssistant(VapiStruct):
"""
Configuration of the migration assistant to be deployed to the vSphere Update
Manager.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
installer_location=None,
installer_location_ssl_verify=None,
installer_location_ssl_thumbprint=None,
):
"""
:type installer_location: :class:`str` or ``None``
:param installer_location: Location of the installer of migration assistant to be uploaded to
the source vSphere Update Manager.
Mutually exclusive with ``existing_migration_assistant``.
:type installer_location_ssl_verify: :class:`bool` or ``None``
:param installer_location_ssl_verify: A flag to indicate whether to verify the SSL connection to the
location of the migration assistant installer.
When an SSL thumbprint is provided, SSL verification is not required.
:type installer_location_ssl_thumbprint: :class:`str` or ``None``
:param installer_location_ssl_thumbprint: SSL thumbprint of the location of the migration assistant
installer.
If SSL verification is set to False, the thumbprint is not required.
"""
self.installer_location = installer_location
self.installer_location_ssl_verify = installer_location_ssl_verify
self.installer_location_ssl_thumbprint = installer_location_ssl_thumbprint
VapiStruct.__init__(self)
SourceVumMigrationAssistant._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.source_vum_migration_assistant', {
'installer_location': type.OptionalType(type.StringType()),
'installer_location_ssl_verify': type.OptionalType(type.BooleanType()),
'installer_location_ssl_thumbprint': type.OptionalType(type.StringType()),
},
SourceVumMigrationAssistant,
False,
None))
class Ssh(VapiStruct):
"""
Setting to enable SSH on the deployed appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
enabled=None,
):
"""
:type enabled: :class:`bool` or ``None``
:param enabled: Whether to enable SSH.
If None, defaults to False
"""
self.enabled = enabled
VapiStruct.__init__(self)
Ssh._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.ssh', {
'enabled': type.OptionalType(type.BooleanType()),
},
Ssh,
False,
None))
class SubTaskInfo(VapiStruct):
"""
Container of status information about a single task.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'status',
{
'FAILED' : [('error', False), ('start_time', True), ('end_time', True)],
'RUNNING' : [('start_time', True)],
'BLOCKED' : [('start_time', True)],
'SUCCEEDED' : [('start_time', True), ('end_time', True)],
'PENDING' : [],
}
),
]
def __init__(self,
progress=None,
last_updated_time=None,
result=None,
external_tools=None,
description=None,
service=None,
operation=None,
parent=None,
target=None,
status=None,
cancelable=None,
error=None,
start_time=None,
end_time=None,
user=None,
):
"""
:type progress: :class:`com.vmware.cis.task_client.Progress`
:param progress: The progress info of this deployment task.
:type last_updated_time: :class:`datetime.datetime`
:param last_updated_time: The time that the last update is registered.
:type result: :class:`Result` or ``None``
:param result: Result of the task.
This attribute will be None if result is not available at the
current step of the task.
:type external_tools: :class:`list` of :class:`ExternalTool`
:param external_tools: External tools used for the deployment.
:type description: :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param description: Description of the operation associated with the task.
:type service: :class:`str`
:param service: Identifier of the service containing the operation.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.service``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.service``.
:type operation: :class:`str`
:param operation: Identifier of the operation associated with the task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.operation``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.operation``.
:type parent: :class:`str` or ``None``
:param parent: Parent of the current task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.cis.task``. When methods return a value of this class
as a return value, the attribute will be an identifier for the
resource type: ``com.vmware.cis.task``.
This attribute will be None if the task has no parent.
:type target: :class:`com.vmware.vapi.std_client.DynamicID` or ``None``
:param target: Identifier of the target created by the operation or an existing
one the operation performed on.
This attribute will be None if the operation has no target or
multiple targets.
:type status: :class:`com.vmware.cis.task_client.Status`
:param status: Status of the operation associated with the task.
:type cancelable: :class:`bool`
:param cancelable: Flag to indicate whether or not the operation can be cancelled. The
value may change as the operation progresses.
:type error: :class:`Exception` or ``None``
:param error: Description of the error if the operation status is "FAILED".
If None the description of why the operation failed will be
included in the result of the operation (see
:attr:`com.vmware.cis.task_client.Info.result`).
:type start_time: :class:`datetime.datetime`
:param start_time: Time when the operation is started.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.RUNNING`,
:attr:`com.vmware.cis.task_client.Status.BLOCKED`,
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`, or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type end_time: :class:`datetime.datetime`
:param end_time: Time when the operation is completed.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED` or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type user: :class:`str` or ``None``
:param user: Name of the user who performed the operation.
This attribute will be None if the operation is performed by the
system.
"""
self.progress = progress
self.last_updated_time = last_updated_time
self.result = result
self.external_tools = external_tools
self.description = description
self.service = service
self.operation = operation
self.parent = parent
self.target = target
self.status = status
self.cancelable = cancelable
self.error = error
self.start_time = start_time
self.end_time = end_time
self.user = user
VapiStruct.__init__(self)
SubTaskInfo._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.sub_task_info', {
'progress': type.ReferenceType('com.vmware.cis.task_client', 'Progress'),
'last_updated_time': type.DateTimeType(),
'result': type.OptionalType(type.ReferenceType(__name__, 'Result')),
'external_tools': type.ListType(type.ReferenceType(__name__, 'ExternalTool')),
'description': type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage'),
'service': type.IdType(resource_types='com.vmware.vapi.service'),
'operation': type.IdType(resource_types='com.vmware.vapi.operation'),
'parent': type.OptionalType(type.IdType()),
'target': type.OptionalType(type.ReferenceType('com.vmware.vapi.std_client', 'DynamicID')),
'status': type.ReferenceType('com.vmware.cis.task_client', 'Status'),
'cancelable': type.BooleanType(),
'error': type.OptionalType(type.AnyErrorType()),
'start_time': type.OptionalType(type.DateTimeType()),
'end_time': type.OptionalType(type.DateTimeType()),
'user': type.OptionalType(type.StringType()),
},
SubTaskInfo,
False,
None))
class TaskInfo(VapiStruct):
"""
The container that contains the status information of a deployment.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'status',
{
'RUNNING' : [('progress', True), ('start_time', True)],
'FAILED' : [('progress', True), ('error', False), ('start_time', True), ('end_time', True)],
'SUCCEEDED' : [('progress', True), ('start_time', True), ('end_time', True)],
'BLOCKED' : [('progress', True), ('start_time', True)],
'PENDING' : [],
}
),
]
def __init__(self,
metadata_file=None,
state=None,
progress=None,
last_updated_time=None,
subtask_order=None,
subtasks=None,
appliance_info=None,
result=None,
additional_info=None,
description=None,
service=None,
operation=None,
parent=None,
target=None,
status=None,
cancelable=None,
error=None,
start_time=None,
end_time=None,
user=None,
):
"""
:type metadata_file: :class:`str`
:param metadata_file: The path of the metadata file.
:type state: :class:`str` or ``None``
:param state: The state of the appliance being deployed.
    May not have any state information.
:type progress: :class:`com.vmware.cis.task_client.Progress` or ``None``
:param progress: The total progress of the deployment operation.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.RUNNING`,
:attr:`com.vmware.cis.task_client.Status.FAILED`,
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`, or
:attr:`com.vmware.cis.task_client.Status.BLOCKED`.
:type last_updated_time: :class:`datetime.datetime`
:param last_updated_time: The time when the last update was registered.
:type subtask_order: :class:`list` of :class:`list` of :class:`str` or ``None``
:param subtask_order: The ordered list of subtasks for this deployment operation.
Only set when the appliance state is RUNNING_IN_PROGRESS,
FAILED, CANCELLED, or SUCCEEDED.
:type subtasks: (:class:`dict` of :class:`str` and :class:`SubTaskInfo`) or ``None``
:param subtasks: The map of the deployment subtasks and their status information.
Only set when the appliance state is RUNNING_IN_PROGRESS,
FAILED, CANCELLED, or SUCCEEDED.
:type appliance_info: :class:`DeploymentInfo` or ``None``
:param appliance_info: Information about the appliance deployed.
Such information may not be available for requests that are not for
deployment (validation/recommendation).
:type result: :class:`DataValue` or ``None``
:param result: The result of validation or recommendation requests.
Not applicable for precheck/deployment operation.
:type additional_info: :class:`str` or ``None``
:param additional_info: Additional information that a response may contain.
    Not all responses will contain additional information.
:type description: :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param description: Description of the operation associated with the task.
:type service: :class:`str`
:param service: Identifier of the service containing the operation.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.service``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.service``.
:type operation: :class:`str`
:param operation: Identifier of the operation associated with the task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.operation``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.operation``.
:type parent: :class:`str` or ``None``
:param parent: Parent of the current task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.cis.task``. When methods return a value of this class
as a return value, the attribute will be an identifier for the
resource type: ``com.vmware.cis.task``.
This attribute will be None if the task has no parent.
:type target: :class:`com.vmware.vapi.std_client.DynamicID` or ``None``
:param target: Identifier of the target created by the operation or an existing
one the operation performed on.
This attribute will be None if the operation has no target or
multiple targets.
:type status: :class:`com.vmware.cis.task_client.Status`
:param status: Status of the operation associated with the task.
:type cancelable: :class:`bool`
:param cancelable: Flag to indicate whether or not the operation can be cancelled. The
value may change as the operation progresses.
:type error: :class:`Exception` or ``None``
:param error: Description of the error if the operation status is "FAILED".
If None the description of why the operation failed will be
included in the result of the operation (see
:attr:`com.vmware.cis.task_client.Info.result`).
:type start_time: :class:`datetime.datetime`
:param start_time: Time when the operation is started.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.RUNNING`,
:attr:`com.vmware.cis.task_client.Status.BLOCKED`,
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`, or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type end_time: :class:`datetime.datetime`
:param end_time: Time when the operation is completed.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED` or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type user: :class:`str` or ``None``
:param user: Name of the user who performed the operation.
This attribute will be None if the operation is performed by the
system.
"""
self.metadata_file = metadata_file
self.state = state
self.progress = progress
self.last_updated_time = last_updated_time
self.subtask_order = subtask_order
self.subtasks = subtasks
self.appliance_info = appliance_info
self.result = result
self.additional_info = additional_info
self.description = description
self.service = service
self.operation = operation
self.parent = parent
self.target = target
self.status = status
self.cancelable = cancelable
self.error = error
self.start_time = start_time
self.end_time = end_time
self.user = user
VapiStruct.__init__(self)
TaskInfo._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.task_info', {
'metadata_file': type.StringType(),
'state': type.OptionalType(type.StringType()),
'progress': type.OptionalType(type.ReferenceType('com.vmware.cis.task_client', 'Progress')),
'last_updated_time': type.DateTimeType(),
'subtask_order': type.OptionalType(type.ListType(type.ListType(type.StringType()))),
'subtasks': type.OptionalType(type.MapType(type.StringType(), type.ReferenceType(__name__, 'SubTaskInfo'))),
'appliance_info': type.OptionalType(type.ReferenceType(__name__, 'DeploymentInfo')),
'result': type.OptionalType(type.OpaqueType()),
'additional_info': type.OptionalType(type.StringType()),
'description': type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage'),
'service': type.IdType(resource_types='com.vmware.vapi.service'),
'operation': type.IdType(resource_types='com.vmware.vapi.operation'),
'parent': type.OptionalType(type.IdType()),
'target': type.OptionalType(type.ReferenceType('com.vmware.vapi.std_client', 'DynamicID')),
'status': type.ReferenceType('com.vmware.cis.task_client', 'Status'),
'cancelable': type.BooleanType(),
'error': type.OptionalType(type.AnyErrorType()),
'start_time': type.OptionalType(type.DateTimeType()),
'end_time': type.OptionalType(type.DateTimeType()),
'user': type.OptionalType(type.StringType()),
},
TaskInfo,
False,
None))
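The ``_validator_list`` above encodes which optional attributes must be present for each value of ``status``. As a minimal plain-Python sketch of that rule (no SDK required; the field and status names are copied from the validator above, and this is not the SDK's own validation code):

```python
# Required/forbidden optional fields per task status, mirroring the
# UnionValidator declared on TaskInfo. True means "must be set";
# False means "may be unset" (e.g. a FAILED task may omit 'error').
STATUS_FIELDS = {
    'RUNNING':   [('progress', True), ('start_time', True)],
    'FAILED':    [('progress', True), ('error', False),
                  ('start_time', True), ('end_time', True)],
    'SUCCEEDED': [('progress', True), ('start_time', True),
                  ('end_time', True)],
    'BLOCKED':   [('progress', True), ('start_time', True)],
    'PENDING':   [],
}

def missing_required(task):
    """Return the required fields that are absent for the task's status."""
    rules = STATUS_FIELDS[task['status']]
    return [name for name, required in rules
            if required and task.get(name) is None]
```

For example, a SUCCEEDED task with no ``end_time`` would be flagged, matching what the binding layer's union validation enforces.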
class TemporaryNetwork(VapiStruct):
"""
Configuration of the temporary network which is used during
upgrade/migrate.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'mode',
{
'STATIC' : [('ip', True), ('dns_servers', True), ('prefix', True), ('gateway', True)],
'DHCP' : [],
}
),
]
def __init__(self,
ip_family=None,
mode=None,
ip=None,
dns_servers=None,
prefix=None,
gateway=None,
):
"""
:type ip_family: :class:`TemporaryNetwork.IpType` or ``None``
:param ip_family: Network IP address family.
If None, defaults to IPV4
:type mode: :class:`TemporaryNetwork.NetworkMode`
:param mode: Network mode.
:type ip: :class:`str`
:param ip: Network IP address. Required for static mode only.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
:type dns_servers: :class:`list` of :class:`str`
:param dns_servers: A comma-separated list of IP addresses of DNS servers, as a JSON
    array such as ["1.2.3.4", "127.0.0.1"]. Required for static mode
    only. The DNS servers must be reachable from the machine that runs
    the CLI installer.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
:type prefix: :class:`long`
:param prefix: Network prefix length. Required for static mode only; remove if
    the mode is "dhcp". This is the number of bits set in the subnet
    mask; for instance, if the subnet mask is 255.255.255.0, there are
    24 bits in the binary version of the subnet mask, so the prefix
    length is 24. If used, the values must be in the inclusive range
    of 0 to 32 for IPv4 and 0 to 128 for IPv6.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
:type gateway: :class:`str`
:param gateway: Gateway of the network. Required for static mode only.
This attribute is optional and it is only relevant when the value
of ``mode`` is :attr:`TemporaryNetwork.NetworkMode.STATIC`.
"""
self.ip_family = ip_family
self.mode = mode
self.ip = ip
self.dns_servers = dns_servers
self.prefix = prefix
self.gateway = gateway
VapiStruct.__init__(self)
class IpType(Enum):
"""
Network IP address family.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
IPV4 = None
"""
IPv4 Type of IP address.
"""
IPV6 = None
"""
IPv6 Type of IP address.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`IpType` instance.
"""
Enum.__init__(string)
IpType._set_values([
IpType('IPV4'),
IpType('IPV6'),
])
IpType._set_binding_type(type.EnumType(
'com.vmware.vcenter.lcm.temporary_network.ip_type',
IpType))
class NetworkMode(Enum):
"""
Network mode.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
DHCP = None
"""
DHCP mode.
"""
STATIC = None
"""
Static IP mode.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`NetworkMode` instance.
"""
Enum.__init__(string)
NetworkMode._set_values([
NetworkMode('DHCP'),
NetworkMode('STATIC'),
])
NetworkMode._set_binding_type(type.EnumType(
'com.vmware.vcenter.lcm.temporary_network.network_mode',
NetworkMode))
TemporaryNetwork._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.temporary_network', {
'ip_family': type.OptionalType(type.ReferenceType(__name__, 'TemporaryNetwork.IpType')),
'mode': type.ReferenceType(__name__, 'TemporaryNetwork.NetworkMode'),
'ip': type.OptionalType(type.StringType()),
'dns_servers': type.OptionalType(type.ListType(type.StringType())),
'prefix': type.OptionalType(type.IntegerType()),
'gateway': type.OptionalType(type.StringType()),
},
TemporaryNetwork,
False,
None))
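As the ``prefix`` documentation above explains, the prefix length is the number of set bits in the subnet mask, bounded by 32 for IPv4 and 128 for IPv6. A small illustrative helper (plain Python standard library, independent of the SDK) that derives and range-checks it:

```python
import ipaddress

def prefix_from_netmask(netmask):
    """Number of set bits in a dotted-quad IPv4 subnet mask."""
    return bin(int(ipaddress.IPv4Address(netmask))).count('1')

def valid_prefix(prefix, ip_family='IPV4'):
    """Check the documented inclusive ranges: 0-32 for IPv4, 0-128 for IPv6."""
    upper = 32 if ip_family == 'IPV4' else 128
    return 0 <= prefix <= upper
```

So a mask of 255.255.255.0 yields a prefix length of 24, exactly as the docstring's worked example states.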
class Time(VapiStruct):
"""
NTP setting of the appliance to be deployed.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
ntp_servers=None,
):
"""
:type ntp_servers: :class:`list` of :class:`str` or ``None``
:param ntp_servers: To configure NTP time synchronization for the appliance, set the
    value to a comma-separated list of host names or IP addresses of
    Network Time Protocol (NTP) servers, for example:
    ["time.nist.gov"]. If "ntp_servers" is not provided, the appliance
    clock will be synced to the ESX host and time synchronization via
    VMware Tools will be enabled instead. If
    None, defaults to []
"""
self.ntp_servers = ntp_servers
VapiStruct.__init__(self)
Time._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.time', {
'ntp_servers': type.OptionalType(type.ListType(type.StringType())),
},
Time,
False,
None))
class UpgradeDestinationApplianceService(VapiStruct):
"""
Configurable services of destination appliance for upgrade/migrate
operation.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
ssh=None,
):
"""
:type ssh: :class:`Ssh`
:param ssh: Whether to enable SSH on the vCenter Appliance.
"""
self.ssh = ssh
VapiStruct.__init__(self)
UpgradeDestinationApplianceService._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.upgrade_destination_appliance_service', {
'ssh': type.ReferenceType(__name__, 'Ssh'),
},
UpgradeDestinationApplianceService,
False,
None))
class UpgradeSourceAppliance(VapiStruct):
"""
Configuration of the source appliance to be upgraded/migrated.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
hostname=None,
sso_admin_username=None,
sso_admin_password=None,
root_password=None,
https_port=None,
ssl_verify=None,
ssl_thumbprint=None,
export_dir=None,
):
"""
:type hostname: :class:`str`
:param hostname: The IP address or fully qualified domain name (FQDN) of the vCenter
Server instance. If an FQDN is provided, it has to be resolvable
from the machine that is running the installer.
:type sso_admin_username: :class:`str`
:param sso_admin_username: vCenter Single Sign-On administrator user name of the source
appliance.
:type sso_admin_password: :class:`str`
:param sso_admin_password: vCenter Single Sign-On administrator password of the source
appliance.
:type root_password: :class:`str`
:param root_password: Password of the operating system root user of the appliance.
:type https_port: :class:`long` or ``None``
:param https_port: The HTTPS port number to connect to the source appliance.
If None, defaults to 443
:type ssl_verify: :class:`bool` or ``None``
:param ssl_verify:
:type ssl_thumbprint: :class:`str` or ``None``
:param ssl_thumbprint:
:type export_dir: :class:`str` or ``None``
:param export_dir: Export directory of the source appliance.
    If None, defaults to "/var/tmp/".
"""
self.hostname = hostname
self.sso_admin_username = sso_admin_username
self.sso_admin_password = sso_admin_password
self.root_password = root_password
self.https_port = https_port
self.ssl_verify = ssl_verify
self.ssl_thumbprint = ssl_thumbprint
self.export_dir = export_dir
VapiStruct.__init__(self)
UpgradeSourceAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.upgrade_source_appliance', {
'hostname': type.StringType(),
'sso_admin_username': type.StringType(),
'sso_admin_password': type.SecretType(),
'root_password': type.SecretType(),
'https_port': type.OptionalType(type.IntegerType()),
'ssl_verify': type.OptionalType(type.BooleanType()),
'ssl_thumbprint': type.OptionalType(type.StringType()),
'export_dir': type.OptionalType(type.StringType()),
},
UpgradeSourceAppliance,
False,
None))
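Two of the optional attributes above carry documented defaults: ``https_port`` falls back to 443 and ``export_dir`` to "/var/tmp/". A hedged caller-side sketch of normalizing a plain-dict payload before use (the dict keys simply mirror the struct's field names; this is not how the SDK applies defaults internally):

```python
# Documented defaults for optional UpgradeSourceAppliance fields.
SOURCE_DEFAULTS = {
    'https_port': 443,          # default HTTPS port per the docstring
    'export_dir': '/var/tmp/',  # default export directory per the docstring
}

def with_defaults(spec):
    """Return a copy of `spec` with documented defaults filled in."""
    out = dict(spec)
    for key, value in SOURCE_DEFAULTS.items():
        if out.get(key) is None:
            out[key] = value
    return out
```

Explicitly provided values are left untouched; only unset (or None) fields are filled.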
class Vc(VapiStruct):
"""
Configuration of the VC that hosts/will host an appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
inventory=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
:type inventory: :class:`VcInventory`
:param inventory: All names are case-sensitive. You can install the appliance to one
    of the following destinations: 1. A resource pool in a cluster, use
    'cluster_path'. 2. A specific ESX host in a cluster, use
    'host_path'. 3. A resource pool in a specific ESX host being
    managed by the current vCenter, use 'resource_pool_path'. You must
    always provide the 'network_name' key. To install a new appliance
    to a specific ESX host in a cluster, provide the 'host_path' key
    and 'datastore_name', e.g. 'host_path':
    '/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
    Datastore'. To install a new appliance to a specific resource pool,
    provide 'resource_pool_path' and 'datastore_name', e.g.
    'resource_pool_path': '/Your Datacenter Folder/Your
    Datacenter/host/Your Cluster/Resources/Your Resource Pool',
    'datastore_name': 'Your Datastore'. To place a new appliance in a
    virtual machine folder, provide 'vm_folder_path', e.g.
    'vm_folder_path': 'VM Folder 0/VM Folder1'.
"""
self.connection = connection
self.inventory = inventory
VapiStruct.__init__(self)
Vc._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.vc', {
'connection': type.ReferenceType(__name__, 'Connection'),
'inventory': type.ReferenceType(__name__, 'VcInventory'),
},
Vc,
False,
None))
class VcInventory(VapiStruct):
"""
Inventory information about a VCenter.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vm_folder_path=None,
resource_pool_path=None,
cluster_path=None,
host_path=None,
datastore_name=None,
datastore_cluster_name=None,
network_name=None,
):
"""
:type vm_folder_path: :class:`str` or ``None``
:param vm_folder_path: Path of the VM folder. The VM folder must be visible by the Data
    Center of the compute resource. Format: {vm_folder1}/{vm_folder2},
    e.g. 'VM Folder 0/VM Folder1'.
    Mutually exclusive between ``resource_pool_path``,
    ``cluster_path``, and ``host_path``
:type resource_pool_path: :class:`str` or ``None``
:param resource_pool_path: Full path to resource pool. Format: /{datacenter
folder}/{datacenter name}/host/{host
name}/{cluster_name}/Resources/{resource pool}. e.g: Your
Datacenter Folder/Your Datacenter/host/Your Cluster/Resources/Your
Resource Pool
Mutually exclusive between ``resource_pool_path``,
``cluster_path``, and ``host_path``
:type cluster_path: :class:`str` or ``None``
:param cluster_path: Full path to the cluster. Format: /{datacenter folder}/{datacenter
name}/host/{cluster_name}. e.g: /Your Datacenter Folder/Your
Datacenter/host/Your Cluster
Mutually exclusive between ``resource_pool_path``,
``cluster_path``, and ``host_path``
:type host_path: :class:`str` or ``None``
:param host_path:
:type datastore_name: :class:`str` or ``None``
:param datastore_name: The datastore on which to store the files of the appliance. This
value has to be either a specific datastore name, or a specific
datastore in a datastore cluster. The datastore must be accessible
from the ESX host and must have at least 25 GB of free space.
Otherwise, the new appliance might not power on.
Mutually exclusive between ``datastore_name`` and
``datastore_cluster_name``
:type datastore_cluster_name: :class:`str` or ``None``
:param datastore_cluster_name: The datastore cluster on which to store the files of the appliance.
Mutually exclusive between ``datastore_name`` and
``datastore_cluster_name``
:type network_name: :class:`str`
:param network_name: Name of the network. e.g. VM Network
"""
self.vm_folder_path = vm_folder_path
self.resource_pool_path = resource_pool_path
self.cluster_path = cluster_path
self.host_path = host_path
self.datastore_name = datastore_name
self.datastore_cluster_name = datastore_cluster_name
self.network_name = network_name
VapiStruct.__init__(self)
VcInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.vc_inventory', {
'vm_folder_path': type.OptionalType(type.StringType()),
'resource_pool_path': type.OptionalType(type.StringType()),
'cluster_path': type.OptionalType(type.StringType()),
'host_path': type.OptionalType(type.StringType()),
'datastore_name': type.OptionalType(type.StringType()),
'datastore_cluster_name': type.OptionalType(type.StringType()),
'network_name': type.StringType(),
},
VcInventory,
False,
None))
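The docstring above defines two mutually exclusive groups: one placement attribute among ``vm_folder_path``, ``resource_pool_path``, ``cluster_path``, and ``host_path``, and one datastore attribute between ``datastore_name`` and ``datastore_cluster_name``. A plain-Python sketch of that check (the exact group memberships are read off the docstring above, not from the SDK's own validators, so treat them as an assumption):

```python
# Mutually exclusive attribute groups per the VcInventory docstring.
PLACEMENT_KEYS = ('vm_folder_path', 'resource_pool_path',
                  'cluster_path', 'host_path')
DATASTORE_KEYS = ('datastore_name', 'datastore_cluster_name')

def exclusivity_errors(inventory):
    """Report violations of the documented mutually exclusive groups."""
    errors = []
    for group in (PLACEMENT_KEYS, DATASTORE_KEYS):
        present = [k for k in group if inventory.get(k) is not None]
        if len(present) > 1:
            errors.append('only one of %s may be set, got %s'
                          % (group, present))
    return errors
```

A spec naming both a datastore and a datastore cluster, for instance, would be flagged before it ever reaches the server.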
class Install(VapiInterface):
"""
The service to install Embedded VCSA, PSC, Management VCSA, VMC gateway.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.lcm.install'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _InstallStub)
class Psc(VapiStruct):
"""
Spec used to configure a Platform Services Controller. This section
describes how the Platform Services Controller appliance should be
configured. If unset, either ``vcsaEmbedded`` or ``vcsaExternal`` must be
provided.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
standalone=None,
replicated=None,
ceip_enabled=None,
):
"""
:type standalone: :class:`PscStandalone` or ``None``
:param standalone: Spec used to configure a standalone Platform Services Controller.
This section describes how the standalone PSC should be configured.
Mutually exclusive between ``standalone`` and ``replicated``
:type replicated: :class:`PscReplicated` or ``None``
:param replicated: Spec used to configure a replicated Platform Services Controller.
This section describes how the replicated PSC should be configured.
Mutually exclusive between ``standalone`` and ``replicated``
:type ceip_enabled: :class:`bool`
:param ceip_enabled: This key describes the enabling option for VMware's Customer
    Experience Improvement Program (CEIP). By default,
    ``ceipEnabled`` is set to true, which indicates that you are
    joining CEIP. If you prefer not to participate in VMware's CEIP
    for this product, disable CEIP by setting ``ceipEnabled`` to
    false. You may join or leave VMware's CEIP for this product at
    any time.
"""
self.standalone = standalone
self.replicated = replicated
self.ceip_enabled = ceip_enabled
VapiStruct.__init__(self)
Psc._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.install.psc', {
'standalone': type.OptionalType(type.ReferenceType(__name__, 'PscStandalone')),
'replicated': type.OptionalType(type.ReferenceType(__name__, 'PscReplicated')),
'ceip_enabled': type.BooleanType(),
},
Psc,
False,
None))
class VcsaEmbedded(VapiStruct):
"""
Spec used to configure an embedded vCenter Server. This field describes how
the embedded vCenter Server appliance should be configured.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
standalone=None,
replicated=None,
ceip_enabled=None,
):
"""
:type standalone: :class:`EmbeddedStandaloneVcsa` or ``None``
:param standalone: Spec used to configure a standalone embedded vCenter Server. This
field describes how the standalone vCenter Server appliance should
be configured.
Mutually exclusive between ``standalone`` and ``replicated``
:type replicated: :class:`EmbeddedReplicatedVcsa` or ``None``
:param replicated: Spec used to configure a replicated embedded vCenter Server. This
field describes how the replicated vCenter Server appliance should
be configured.
Mutually exclusive between ``standalone`` and ``replicated``
:type ceip_enabled: :class:`bool`
:param ceip_enabled: This key describes the enabling option for VMware's Customer
    Experience Improvement Program (CEIP). By default,
    ``ceipEnabled`` is set to true, which indicates that you are
    joining CEIP. If you prefer not to participate in VMware's CEIP
    for this product, disable CEIP by setting ``ceipEnabled`` to
    false. You may join or leave VMware's CEIP for this product at
    any time.
"""
self.standalone = standalone
self.replicated = replicated
self.ceip_enabled = ceip_enabled
VapiStruct.__init__(self)
VcsaEmbedded._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.install.vcsa_embedded', {
'standalone': type.OptionalType(type.ReferenceType(__name__, 'EmbeddedStandaloneVcsa')),
'replicated': type.OptionalType(type.ReferenceType(__name__, 'EmbeddedReplicatedVcsa')),
'ceip_enabled': type.BooleanType(),
},
VcsaEmbedded,
False,
None))
class ReverseProxy(VapiStruct):
"""
Port numbers on which the vCenter Server Appliance communicates with the
other vSphere components.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
http_port=None,
https_port=None,
):
"""
:type http_port: :class:`long` or ``None``
:param http_port: Reverse proxy HTTP port.
If None, defaults to 8080
:type https_port: :class:`long` or ``None``
:param https_port: Reverse proxy HTTPS port.
If None, defaults to 8443
"""
self.http_port = http_port
self.https_port = https_port
VapiStruct.__init__(self)
ReverseProxy._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.install.reverse_proxy', {
'http_port': type.OptionalType(type.IntegerType()),
'https_port': type.OptionalType(type.IntegerType()),
},
ReverseProxy,
False,
None))
class DestinationApplianceService(VapiStruct):
"""
The configuration of vCenter services.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
rhttpproxy=None,
ssh=None,
):
"""
:type rhttpproxy: :class:`Install.ReverseProxy` or ``None``
:param rhttpproxy: Port numbers on which the vCenter Server Appliance communicates
    with the other vSphere components.
    Default values are used if the reverse proxy configuration is
    not provided
:type ssh: :class:`Ssh`
:param ssh: Whether to enable SSH on the vCenter Appliance.
"""
self.rhttpproxy = rhttpproxy
self.ssh = ssh
VapiStruct.__init__(self)
DestinationApplianceService._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.install.destination_appliance_service', {
'rhttpproxy': type.OptionalType(type.ReferenceType(__name__, 'Install.ReverseProxy')),
'ssh': type.ReferenceType(__name__, 'Ssh'),
},
DestinationApplianceService,
False,
None))
class DestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'appliance_type',
{
'VCSA_EMBEDDED' : [('appliance_size', False), ('appliance_disk_size', False), ('vcsa_embedded', True)],
'VCSA_EXTERNAL' : [('appliance_size', False), ('appliance_disk_size', False), ('vcsa_external', True)],
'PSC' : [('psc', True)],
'VMC' : [('vmc', True)],
}
),
]
def __init__(self,
appliance_name=None,
appliance_type=None,
appliance_size=None,
appliance_disk_size=None,
root_password=None,
thin_disk_mode=None,
ova_location=None,
ova_location_ssl_verify=None,
ova_location_ssl_thumbprint=None,
ovftool_location=None,
ovftool_location_ssl_verify=None,
ovftool_location_ssl_thumbprint=None,
services=None,
network=None,
time=None,
ovftool_arguments=None,
vcsa_embedded=None,
psc=None,
vcsa_external=None,
vmc=None,
):
"""
:type appliance_name: :class:`str`
:param appliance_name: The name of the appliance to deploy.
:type appliance_type: :class:`ApplianceType`
:param appliance_type: The type of appliance to deploy.
:type appliance_size: :class:`ApplianceSize` or ``None``
:param appliance_size: A size descriptor based on the number of virtual machines which
will be managed by the new vCenter appliance.
If None, defaults to SMALL
:type appliance_disk_size: :class:`StorageSize` or ``None``
:param appliance_disk_size: The disk size of the new vCenter appliance.
If None, defaults to REGULAR
:type root_password: :class:`str`
:param root_password: Password must conform to the following requirements: 1. At least 8
characters. 2. No more than 20 characters. 3. At least 1 uppercase
character. 4. At least 1 lowercase character. 5. At least 1 number.
6. At least 1 special character (e.g., '!', '(', '\\\\@', etc.). 7.
Only visible A-Z, a-z, 0-9 and punctuation (spaces are not allowed)
:type thin_disk_mode: :class:`bool`
:param thin_disk_mode: Whether to deploy the appliance with thin mode virtual disks.
:type ova_location: :class:`str`
:param ova_location: The location of the OVA file.
:type ova_location_ssl_verify: :class:`bool` or ``None``
:param ova_location_ssl_verify: A flag to indicate whether SSL verification is required.
    If ``ovaLocationSslThumbprint`` is provided, this field can be
    omitted. If None, defaults to True
:type ova_location_ssl_thumbprint: :class:`str` or ``None``
:param ova_location_ssl_thumbprint: SSL thumbprint used for SSL verification. If provided, ssl_verify
    can be omitted or set to true. If omitted, ssl_verify must be
    false; if omitted while ssl_verify is true, an error will occur.
    If ova_location_ssl_verify is False, this field can be omitted
:type ovftool_location: :class:`str`
:param ovftool_location: The location of the OVF Tool.
:type ovftool_location_ssl_verify: :class:`bool` or ``None``
:param ovftool_location_ssl_verify: Flag to indicate whether or not to verify the SSL thumbprint of OVF
Tool location.
If None, defaults to True.
:type ovftool_location_ssl_thumbprint: :class:`str` or ``None``
:param ovftool_location_ssl_thumbprint: SSL thumbprint of OVF Tool location to be verified.
When ovftoolLocationSslVerify is set to False, this field can be
omitted.
:type services: :class:`Install.DestinationApplianceService`
:param services: The configuration of vCenter services.
:type network: :class:`Network`
:param network: The network settings of the appliance to be deployed.
:type time: :class:`Time`
:param time: Configuration of the vCSA time synchronization.
:type ovftool_arguments: (:class:`dict` of :class:`str` and :class:`str`) or ``None``
:param ovftool_arguments: The OVF Tool arguments to be included.
Not required when there are no OVF Tool arguments to pass through
:type vcsa_embedded: :class:`Install.VcsaEmbedded`
:param vcsa_embedded: Spec used to configure an embedded vCenter Server. This field
describes how the embedded vCenter Server appliance should be
configured.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.VCSA_EMBEDDED`.
:type psc: :class:`Install.Psc`
:param psc: Spec used to configure a Platform Services Controller. This section
describes how the Platform Services Controller appliance should be
configured. If unset, either ``vcsaEmbedded`` or ``vcsaExternal``
must be provided.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.PSC`.
:type vcsa_external: :class:`ExternalVcsa`
:param vcsa_external: Spec used to configure a vCenter Server registered with an external
PSC. If unset, either vcsa_embedded or psc must be provided.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.VCSA_EXTERNAL`.
:type vmc: :class:`ExternalVcsa`
:param vmc: Spec used to configure a vCenter Server registered with an external
PSC. If unset, either vcsa_embedded or psc must be provided.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.VMC`.
"""
self.appliance_name = appliance_name
self.appliance_type = appliance_type
self.appliance_size = appliance_size
self.appliance_disk_size = appliance_disk_size
self.root_password = root_password
self.thin_disk_mode = thin_disk_mode
self.ova_location = ova_location
self.ova_location_ssl_verify = ova_location_ssl_verify
self.ova_location_ssl_thumbprint = ova_location_ssl_thumbprint
self.ovftool_location = ovftool_location
self.ovftool_location_ssl_verify = ovftool_location_ssl_verify
self.ovftool_location_ssl_thumbprint = ovftool_location_ssl_thumbprint
self.services = services
self.network = network
self.time = time
self.ovftool_arguments = ovftool_arguments
self.vcsa_embedded = vcsa_embedded
self.psc = psc
self.vcsa_external = vcsa_external
self.vmc = vmc
VapiStruct.__init__(self)
DestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.install.destination_appliance', {
'appliance_name': type.StringType(),
'appliance_type': type.ReferenceType(__name__, 'ApplianceType'),
'appliance_size': type.OptionalType(type.ReferenceType(__name__, 'ApplianceSize')),
'appliance_disk_size': type.OptionalType(type.ReferenceType(__name__, 'StorageSize')),
'root_password': type.SecretType(),
'thin_disk_mode': type.BooleanType(),
'ova_location': type.StringType(),
'ova_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ova_location_ssl_thumbprint': type.OptionalType(type.StringType()),
'ovftool_location': type.StringType(),
'ovftool_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ovftool_location_ssl_thumbprint': type.OptionalType(type.StringType()),
'services': type.ReferenceType(__name__, 'Install.DestinationApplianceService'),
'network': type.ReferenceType(__name__, 'Network'),
'time': type.ReferenceType(__name__, 'Time'),
'ovftool_arguments': type.OptionalType(type.MapType(type.StringType(), type.StringType())),
'vcsa_embedded': type.OptionalType(type.ReferenceType(__name__, 'Install.VcsaEmbedded')),
'psc': type.OptionalType(type.ReferenceType(__name__, 'Install.Psc')),
'vcsa_external': type.OptionalType(type.ReferenceType(__name__, 'ExternalVcsa')),
'vmc': type.OptionalType(type.ReferenceType(__name__, 'ExternalVcsa')),
},
DestinationAppliance,
False,
None))
class Spec(VapiStruct):
"""
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
destination_appliance=None,
):
"""
:type destination_location: :class:`DestinationLocation`
:param destination_location: This subsection describes the ESX or VC on which to deploy the
appliance.
:type destination_appliance: :class:`Install.DestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_location = destination_location
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
Spec._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.install.spec', {
'destination_location': type.ReferenceType(__name__, 'DestinationLocation'),
'destination_appliance': type.ReferenceType(__name__, 'Install.DestinationAppliance'),
},
Spec,
False,
None))
def check_task(self,
spec,
options=None,
):
"""
Performs a precheck for the given specification. The result of this
operation can be queried by calling cis/tasks/{task-id} with the
task-id returned in the response of this call.
:type spec: :class:`Install.Spec`
:param spec: The specification of the deployment.
:type options: :class:`DeploymentOption` or ``None``
:param options: The deployment precheck options.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the given spec and/or options contain errors.
"""
task_id = self._invoke('check$task',
{
'spec': spec,
'options': options,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
def start_task(self,
spec,
):
"""
Deploys the appliance for the given specification. The result of this
operation can be queried by calling cis/tasks/{task-id} with the
task-id returned in the response of this call.
:type spec: :class:`Install.Spec`
:param spec: The specification of the deployment.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the given specification contains errors.
"""
task_id = self._invoke('start$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
class Log(VapiInterface):
"""
The service that provides logs associated with a task of a given task ID.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.lcm.log'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _LogStub)
def get(self,
task_id,
):
"""
Retrieves a zipped file that contains the operation log, the
serialized task flow, a record of all configuration, and the current
status of the operation.
:type task_id: :class:`str`
:param task_id: The :class:`vmodl.lang_client.ID` of the operation; it must exist
on the server. If for any reason the server reboots during an
operation, all :class:`vmodl.lang_client.ID`s previously stored are
lost.
The parameter must be an identifier for the resource type:
``com.vmware.cis.task``.
:rtype: :class:`str`
:return: A zipped file that contains the files mentioned above.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the given ID does not exist on the server. Possible causes
include an error in the task ID, or log files having been purged by
the system or manually.
"""
return self._invoke('get',
{
'task_id': task_id,
})
class Migrate(VapiInterface):
"""
The service to migrate a Windows VC to an Embedded VCSA, PSC, or
Management VCSA.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.lcm.migrate'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _MigrateStub)
class MigrateDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'appliance_type',
{
'VCSA_EMBEDDED' : [('appliance_size', False), ('appliance_disk_size', False), ('vcsa_embedded', True)],
'VCSA_EXTERNAL' : [('appliance_size', False), ('appliance_disk_size', False)],
'PSC' : [('psc', True)],
'VMC' : [],
}
),
]
def __init__(self,
appliance_name=None,
appliance_type=None,
appliance_size=None,
appliance_disk_size=None,
root_password=None,
thin_disk_mode=None,
ova_location=None,
ova_location_ssl_verify=None,
ova_location_ssl_thumbprint=None,
ovftool_location=None,
ovftool_location_ssl_verify=None,
ovftool_location_ssl_thumbprint=None,
active_directory_domain=None,
active_directory_username=None,
active_directory_password=None,
services=None,
temporary_network=None,
history=None,
ovftool_arguments=None,
vcsa_embedded=None,
psc=None,
):
"""
:type appliance_name: :class:`str`
:param appliance_name: The name of the appliance to deploy.
:type appliance_type: :class:`ApplianceType` or ``None``
:param appliance_type: The type of appliance to deploy.
If None, defaults to VCSA_EMBEDDED
:type appliance_size: :class:`ApplianceSize` or ``None``
:param appliance_size: A size descriptor based on the number of virtual machines which
will be managed by the new vCenter appliance.
If None, defaults to SMALL
:type appliance_disk_size: :class:`StorageSize` or ``None``
:param appliance_disk_size: The disk size of the new vCenter appliance.
If None, defaults to REGULAR
:type root_password: :class:`str`
:param root_password: Password must conform to the following requirements: 1. At least 8
characters. 2. No more than 20 characters. 3. At least 1 uppercase
character. 4. At least 1 lowercase character. 5. At least 1 number.
6. At least 1 special character (e.g., '!', '(', '\\\\@', etc.). 7.
Only visible A-Z, a-z, 0-9 and punctuation (spaces are not allowed)
:type thin_disk_mode: :class:`bool`
:param thin_disk_mode: Whether to deploy the appliance with thin mode virtual disks.
:type ova_location: :class:`str`
:param ova_location: The location of the OVA file.
:type ova_location_ssl_verify: :class:`bool` or ``None``
:param ova_location_ssl_verify: A flag to indicate whether SSL verification is required.
If ``ovaLocationSslThumbprint`` is provided, this field can be
omitted. If None, defaults to True.
:type ova_location_ssl_thumbprint: :class:`str` or ``None``
:param ova_location_ssl_thumbprint: The SSL thumbprint used for verification. If provided, ssl_verify
can be omitted or set to true. If omitted, ssl_verify must be false;
if it is omitted while ssl_verify is true, an error will occur.
If ova_location_ssl_verify is False, this field can be omitted.
:type ovftool_location: :class:`str`
:param ovftool_location: The location of the OVF Tool.
:type ovftool_location_ssl_verify: :class:`bool` or ``None``
:param ovftool_location_ssl_verify: Flag to indicate whether or not to verify the SSL thumbprint of OVF
Tool location.
If None, defaults to True.
:type ovftool_location_ssl_thumbprint: :class:`str` or ``None``
:param ovftool_location_ssl_thumbprint: SSL thumbprint of OVF Tool location to be verified.
When ovftoolLocationSslVerify is set to False, this field can be
omitted.
:type active_directory_domain: :class:`str` or ``None``
:param active_directory_domain: The name of the Active Directory domain to which the source Windows
installation is joined. If the source Windows installation is not
joined to an Active Directory domain, omit this parameter.
Not required when Active Directory is not applicable.
:type active_directory_username: :class:`str` or ``None``
:param active_directory_username: Administrator user name of the Active Directory domain to which the
source Windows installation is joined. The format can be either
'username' or 'username\\\\@domain'
Not required when Active Directory is not applicable.
:type active_directory_password: :class:`str` or ``None``
:param active_directory_password: Password for the active directory user.
Not required when Active Directory is not applicable.
:type services: :class:`UpgradeDestinationApplianceService`
:param services: Spec to configure vCenter server services.
:type temporary_network: :class:`TemporaryNetwork`
:param temporary_network: The network settings of the appliance to be deployed.
:type history: :class:`History` or ``None``
:param history: History data to be included in the upgrade or migration.
A default value will be applied when absent.
:type ovftool_arguments: (:class:`dict` of :class:`str` and :class:`str`) or ``None``
:param ovftool_arguments: The OVF Tool arguments to be included.
Not required when there are no OVF Tool arguments to pass through.
:type vcsa_embedded: :class:`CeipOnlySso`
:param vcsa_embedded: Spec used to configure an embedded vCenter Server. This field
describes how the embedded vCenter Server appliance should be
configured.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.VCSA_EMBEDDED`.
:type psc: :class:`CeipOnlySso`
:param psc: Spec used to configure a Platform Services Controller. This section
describes how the Platform Services Controller appliance should be
configured. If unset, either ``vcsaEmbedded`` or ``vcsaExternal``
must be provided.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.PSC`.
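The ``root_password`` policy above can be checked locally before submitting
a spec. A minimal sketch (illustrative only, not part of this SDK; the
authoritative validation happens server-side):

```python
import re

# Illustrative pre-check of the documented root_password policy:
# 8-20 characters, at least one uppercase letter, one lowercase
# letter, one digit and one special character, and no spaces.
def is_valid_root_password(pw):
    if not 8 <= len(pw) <= 20:
        return False
    if ' ' in pw:
        return False
    required = [r'[A-Z]', r'[a-z]', r'[0-9]', r'[^A-Za-z0-9]']
    return all(re.search(pattern, pw) for pattern in required)

is_valid_root_password('Secret!123')  # satisfies every rule
is_valid_root_password('short!1')     # fails the length rule
```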
"""
self.appliance_name = appliance_name
self.appliance_type = appliance_type
self.appliance_size = appliance_size
self.appliance_disk_size = appliance_disk_size
self.root_password = root_password
self.thin_disk_mode = thin_disk_mode
self.ova_location = ova_location
self.ova_location_ssl_verify = ova_location_ssl_verify
self.ova_location_ssl_thumbprint = ova_location_ssl_thumbprint
self.ovftool_location = ovftool_location
self.ovftool_location_ssl_verify = ovftool_location_ssl_verify
self.ovftool_location_ssl_thumbprint = ovftool_location_ssl_thumbprint
self.active_directory_domain = active_directory_domain
self.active_directory_username = active_directory_username
self.active_directory_password = active_directory_password
self.services = services
self.temporary_network = temporary_network
self.history = history
self.ovftool_arguments = ovftool_arguments
self.vcsa_embedded = vcsa_embedded
self.psc = psc
VapiStruct.__init__(self)
MigrateDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.migrate.migrate_destination_appliance', {
'appliance_name': type.StringType(),
'appliance_type': type.OptionalType(type.ReferenceType(__name__, 'ApplianceType')),
'appliance_size': type.OptionalType(type.ReferenceType(__name__, 'ApplianceSize')),
'appliance_disk_size': type.OptionalType(type.ReferenceType(__name__, 'StorageSize')),
'root_password': type.SecretType(),
'thin_disk_mode': type.BooleanType(),
'ova_location': type.StringType(),
'ova_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ova_location_ssl_thumbprint': type.OptionalType(type.StringType()),
'ovftool_location': type.StringType(),
'ovftool_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ovftool_location_ssl_thumbprint': type.OptionalType(type.StringType()),
'active_directory_domain': type.OptionalType(type.StringType()),
'active_directory_username': type.OptionalType(type.StringType()),
'active_directory_password': type.OptionalType(type.SecretType()),
'services': type.ReferenceType(__name__, 'UpgradeDestinationApplianceService'),
'temporary_network': type.ReferenceType(__name__, 'TemporaryNetwork'),
'history': type.OptionalType(type.ReferenceType(__name__, 'History')),
'ovftool_arguments': type.OptionalType(type.MapType(type.StringType(), type.StringType())),
'vcsa_embedded': type.OptionalType(type.ReferenceType(__name__, 'CeipOnlySso')),
'psc': type.OptionalType(type.ReferenceType(__name__, 'CeipOnlySso')),
},
MigrateDestinationAppliance,
False,
None))
class Spec(VapiStruct):
"""
Spec to describe the configuration parameters that are required for
migrating a Windows vCenter Server.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
destination_appliance=None,
source_vc_windows=None,
existing_migration_assistant=None,
start_migration_assistant=None,
source_vum_location=None,
source_vum=None,
):
"""
:type destination_location: :class:`DestinationLocation`
:param destination_location: This subsection describes the ESX or VC on which to deploy the
appliance.
:type destination_appliance: :class:`Migrate.MigrateDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
:type source_vc_windows: :class:`SourceVcWindows`
:param source_vc_windows: Spec to describe the existing Windows vCenter server to migrate.
:type existing_migration_assistant: :class:`ExistingMigrationAssistant` or ``None``
:param existing_migration_assistant: Spec to describe the attributes of a running Migration Assistant on
the Windows vCenter server.
Only applicable when the Migration Assistant is already running on
the source appliance.
:type start_migration_assistant: :class:`MigrationAssistant` or ``None``
:param start_migration_assistant: Spec to automate the invocation of Migration Assistant. Automatic
invocation works only if the source Windows installation is running
as a virtual machine.
Only applicable when the Migration Assistant is not running on the
source appliance.
:type source_vum_location: :class:`Connection` or ``None``
:param source_vum_location: The configuration to connect to an ESX/VC.
:type source_vum: :class:`SourceVum` or ``None``
:param source_vum: This section describes the source vSphere Update Manager (VUM)
which you want to upgrade.
Not applicable for appliances without a source vSphere Update
Manager.
"""
self.destination_location = destination_location
self.destination_appliance = destination_appliance
self.source_vc_windows = source_vc_windows
self.existing_migration_assistant = existing_migration_assistant
self.start_migration_assistant = start_migration_assistant
self.source_vum_location = source_vum_location
self.source_vum = source_vum
VapiStruct.__init__(self)
Spec._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.migrate.spec', {
'destination_location': type.ReferenceType(__name__, 'DestinationLocation'),
'destination_appliance': type.ReferenceType(__name__, 'Migrate.MigrateDestinationAppliance'),
'source_vc_windows': type.ReferenceType(__name__, 'SourceVcWindows'),
'existing_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'ExistingMigrationAssistant')),
'start_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'MigrationAssistant')),
'source_vum_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
'source_vum': type.OptionalType(type.ReferenceType(__name__, 'SourceVum')),
},
Spec,
False,
None))
def check_task(self,
spec,
options=None,
):
"""
Performs a precheck for the given specification. The result of this
operation can be queried by calling cis/tasks/{task-id} with the
task-id returned in the response of this call.
:type spec: :class:`Migrate.Spec`
:param spec: The specification of the deployment.
:type options: :class:`DeploymentOption` or ``None``
:param options: The deployment precheck options.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the given spec and/or options contain errors.
"""
task_id = self._invoke('check$task',
{
'spec': spec,
'options': options,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
def start_task(self,
spec,
):
"""
Deploys the appliance for the given specification. The result of this
operation can be queried by calling cis/tasks/{task-id} with the
task-id returned in the response of this call.
:type spec: :class:`Migrate.Spec`
:param spec: The specification of the deployment.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the given specification contains errors.
"""
task_id = self._invoke('start$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
class Recommendation(VapiInterface):
"""
The service that provides recommendations.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.lcm.recommendation'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _RecommendationStub)
class DeploymentSizeDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
ova_location=None,
ova_location_ssl_verify=None,
ova_location_ssl_thumbprint=None,
):
"""
:type ova_location: :class:`str` or ``None``
:param ova_location: The location of the OVA file.
Not required.
:type ova_location_ssl_verify: :class:`bool` or ``None``
:param ova_location_ssl_verify: A flag to indicate whether SSL verification is required.
If ``ovaLocationSslThumbprint`` is provided, this field can be
omitted. If None, defaults to True.
:type ova_location_ssl_thumbprint: :class:`str` or ``None``
:param ova_location_ssl_thumbprint: The SSL thumbprint used for verification. If provided, ssl_verify
can be omitted or set to true. If omitted, ssl_verify must be false;
if it is omitted while ssl_verify is true, an error will occur.
If ova_location_ssl_verify is False, this field can be omitted.
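The interplay between ``ova_location_ssl_verify`` and
``ova_location_ssl_thumbprint`` described above can be sketched as a local
consistency check (an illustration of the documented rules, not SDK code;
the service enforces them server-side):

```python
# Encode the documented rules: verify defaults to True when None, and
# requesting verification without a thumbprint is the error case.
def check_ssl_options(ssl_verify=None, ssl_thumbprint=None):
    effective_verify = True if ssl_verify is None else ssl_verify
    if effective_verify and ssl_thumbprint is None:
        raise ValueError('SSL verification requested but no thumbprint given')
    return effective_verify

check_ssl_options(ssl_thumbprint='AA:BB:CC')  # returns True
check_ssl_options(ssl_verify=False)           # returns False
```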
"""
self.ova_location = ova_location
self.ova_location_ssl_verify = ova_location_ssl_verify
self.ova_location_ssl_thumbprint = ova_location_ssl_thumbprint
VapiStruct.__init__(self)
DeploymentSizeDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.deployment_size_destination_appliance', {
'ova_location': type.OptionalType(type.StringType()),
'ova_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ova_location_ssl_thumbprint': type.OptionalType(type.StringType()),
},
DeploymentSizeDestinationAppliance,
False,
None))
class MigrateDeploymentSizeRequest(VapiStruct):
"""
The request for recommending migrate deployment size.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
source_vc_windows=None,
existing_migration_assistant=None,
start_migration_assistant=None,
source_vum_location=None,
source_vum=None,
):
"""
:type destination_appliance: :class:`Recommendation.DeploymentSizeDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
:type source_vc_windows: :class:`SourceVcWindows`
:param source_vc_windows: Spec to describe the existing Windows vCenter server to migrate.
:type existing_migration_assistant: :class:`ExistingMigrationAssistant` or ``None``
:param existing_migration_assistant: Spec to describe the attributes of a running Migration Assistant on
the Windows vCenter server.
Only applicable when the Migration Assistant is already running on
the source appliance.
:type start_migration_assistant: :class:`MigrationAssistant` or ``None``
:param start_migration_assistant: Spec to automate the invocation of Migration Assistant. Automatic
invocation works only if the source Windows installation is running
as a virtual machine.
Only applicable when the Migration Assistant is not running on the
source appliance.
:type source_vum_location: :class:`Connection` or ``None``
:param source_vum_location: The configuration to connect to an ESX/VC.
:type source_vum: :class:`SourceVum` or ``None``
:param source_vum: This section describes the source vSphere Update Manager (VUM)
which you want to upgrade.
Not applicable for appliances without a source vSphere Update
Manager.
"""
self.destination_appliance = destination_appliance
self.source_vc_windows = source_vc_windows
self.existing_migration_assistant = existing_migration_assistant
self.start_migration_assistant = start_migration_assistant
self.source_vum_location = source_vum_location
self.source_vum = source_vum
VapiStruct.__init__(self)
MigrateDeploymentSizeRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.migrate_deployment_size_request', {
'destination_appliance': type.ReferenceType(__name__, 'Recommendation.DeploymentSizeDestinationAppliance'),
'source_vc_windows': type.ReferenceType(__name__, 'SourceVcWindows'),
'existing_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'ExistingMigrationAssistant')),
'start_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'MigrationAssistant')),
'source_vum_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
'source_vum': type.OptionalType(type.ReferenceType(__name__, 'SourceVum')),
},
MigrateDeploymentSizeRequest,
False,
None))
class UpgradeDeploymentSizeRequest(VapiStruct):
"""
The request for recommending upgrade deployment size.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
source_location=None,
source_appliance=None,
source_vum=None,
source_vum_location=None,
):
"""
:type destination_appliance: :class:`Recommendation.DeploymentSizeDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance
:type source_location: :class:`Connection`
:param source_location: The configuration to connect to an ESX/VC.
:type source_appliance: :class:`UpgradeSourceAppliance`
:param source_appliance: Spec to describe the existing appliance to upgrade.
:type source_vum: :class:`SourceVum` or ``None``
:param source_vum: This section describes the source vSphere Update Manager (VUM)
which you want to upgrade.
Not applicable for appliances without a source vSphere Update
Manager.
:type source_vum_location: :class:`Connection` or ``None``
:param source_vum_location: The configuration to connect to an ESX/VC.
"""
self.destination_appliance = destination_appliance
self.source_location = source_location
self.source_appliance = source_appliance
self.source_vum = source_vum
self.source_vum_location = source_vum_location
VapiStruct.__init__(self)
UpgradeDeploymentSizeRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.upgrade_deployment_size_request', {
'destination_appliance': type.ReferenceType(__name__, 'Recommendation.DeploymentSizeDestinationAppliance'),
'source_location': type.ReferenceType(__name__, 'Connection'),
'source_appliance': type.ReferenceType(__name__, 'UpgradeSourceAppliance'),
'source_vum': type.OptionalType(type.ReferenceType(__name__, 'SourceVum')),
'source_vum_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
},
UpgradeDeploymentSizeRequest,
False,
None))
class DatastoreRequest(VapiStruct):
"""
The request for recommending datastore.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
destination_appliance=None,
):
"""
:type destination_location: :class:`Recommendation.DatastoreDestinationLocation`
:param destination_location: This subsection describes the ESX or VC on which to deploy the
appliance.
:type destination_appliance: :class:`Recommendation.DatastoreDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_location = destination_location
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
DatastoreRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.datastore_request', {
'destination_location': type.ReferenceType(__name__, 'Recommendation.DatastoreDestinationLocation'),
'destination_appliance': type.ReferenceType(__name__, 'Recommendation.DatastoreDestinationAppliance'),
},
DatastoreRequest,
False,
None))
class DatastoreDestinationAppliance(VapiStruct):
"""
The configuration of destination appliance to recommend datastore.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
appliance_size=None,
appliance_disk_size=None,
ova_location=None,
ova_location_ssl_verify=None,
ova_location_ssl_thumbprint=None,
):
"""
:type appliance_size: :class:`ApplianceSize` or ``None``
:param appliance_size: A size descriptor based on the number of virtual machines which
will be managed by the new vCenter appliance.
If None, defaults to SMALL
:type appliance_disk_size: :class:`StorageSize` or ``None``
:param appliance_disk_size: The disk size of the new vCenter appliance.
If None, defaults to REGULAR
:type ova_location: :class:`str` or ``None``
:param ova_location: The location of the OVA file.
Not required.
:type ova_location_ssl_verify: :class:`bool` or ``None``
:param ova_location_ssl_verify: A flag to indicate whether SSL verification is required.
If ``ovaLocationSslThumbprint`` is provided, this field can be
omitted. If None, defaults to True.
:type ova_location_ssl_thumbprint: :class:`str` or ``None``
:param ova_location_ssl_thumbprint: The SSL thumbprint used for verification. If provided, ssl_verify
can be omitted or set to true. If omitted, ssl_verify must be false;
if it is omitted while ssl_verify is true, an error will occur.
If ova_location_ssl_verify is False, this field can be omitted.
"""
self.appliance_size = appliance_size
self.appliance_disk_size = appliance_disk_size
self.ova_location = ova_location
self.ova_location_ssl_verify = ova_location_ssl_verify
self.ova_location_ssl_thumbprint = ova_location_ssl_thumbprint
VapiStruct.__init__(self)
DatastoreDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.datastore_destination_appliance', {
'appliance_size': type.OptionalType(type.ReferenceType(__name__, 'ApplianceSize')),
'appliance_disk_size': type.OptionalType(type.ReferenceType(__name__, 'StorageSize')),
'ova_location': type.OptionalType(type.StringType()),
'ova_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ova_location_ssl_thumbprint': type.OptionalType(type.StringType()),
},
DatastoreDestinationAppliance,
False,
None))
class DatastoreDestinationLocation(VapiStruct):
"""
This subsection describes the ESX or VC on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
esx=None,
vcenter=None,
):
"""
:type esx: :class:`Recommendation.DatastoreEsx` or ``None``
:param esx: This section describes the ESX host on which to deploy the
appliance. Required if you are deploying the appliance directly on
an ESX host.
Mutually exclusive with ``vcenter``.
:type vcenter: :class:`Recommendation.DatastoreVc` or ``None``
:param vcenter: This subsection describes the vCenter on which to deploy the
appliance.
Mutually exclusive with ``esx``.
"""
self.esx = esx
self.vcenter = vcenter
VapiStruct.__init__(self)
DatastoreDestinationLocation._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.datastore_destination_location', {
'esx': type.OptionalType(type.ReferenceType(__name__, 'Recommendation.DatastoreEsx')),
'vcenter': type.OptionalType(type.ReferenceType(__name__, 'Recommendation.DatastoreVc')),
},
DatastoreDestinationLocation,
False,
None))
class DatastoreEsx(VapiStruct):
"""
This section describes the ESX host on which to deploy the appliance.
Required if you are deploying the appliance directly on an ESX host.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
"""
self.connection = connection
VapiStruct.__init__(self)
DatastoreEsx._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.datastore_esx', {
'connection': type.ReferenceType(__name__, 'Connection'),
},
DatastoreEsx,
False,
None))
class DatastoreVc(VapiStruct):
"""
This subsection describes the vCenter on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
inventory=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
:type inventory: :class:`Recommendation.DatastoreVcInventory`
:param inventory: All names are case-sensitive. You can install the appliance to one
of the following destinations: 1. A resource pool in a cluster, use
'cluster_path'. 2. A specific ESX host in a cluster, use
'host_path'. 3. A resource pool in a specific ESX host being
managed by the current vCenter, use 'resource_pool_path'. You must
always provide the 'network_name' key. To install a new appliance
to a specific ESX host in a cluster, provide the 'host_path' key,
and the 'datastore_name', e.g. 'host_path':
'/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
Datastore'. To install a new appliance to a specific resource pool,
provide the 'resource_pool_path', and the 'datastore_name', e.g.
'resource_pool_path': '/Your Datacenter Folder/Your
Datacenter/host/Your Cluster/Resources/Your Resource Pool',
'datastore_name': 'Your Datastore'. To place a new appliance to a
virtual machine Folder, provide the 'vm_folder_path', e.g.
'vm_folder_path': 'VM Folder 0/VM Folder1'.
"""
self.connection = connection
self.inventory = inventory
VapiStruct.__init__(self)
DatastoreVc._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.datastore_vc', {
'connection': type.ReferenceType(__name__, 'Connection'),
'inventory': type.ReferenceType(__name__, 'Recommendation.DatastoreVcInventory'),
},
DatastoreVc,
False,
None))
class DatastoreVcInventory(VapiStruct):
"""
All names are case-sensitive. You can install the appliance to one of the
following destinations: 1. A resource pool in a cluster, use
'cluster_path'. 2. A specific ESX host in a cluster, use 'host_path'. 3. A
resource pool in a specific ESX host being managed by the current vCenter,
use 'resource_pool_path'. You must always provide the 'network_name' key.
To install a new appliance to a specific ESX host in a cluster, provide the
'host_path' key, and the 'datastore_name', e.g. 'host_path':
'/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
Datastore'. To install a new appliance to a specific resource pool, provide
the 'resource_pool_path', and the 'datastore_name', e.g.
'resource_pool_path': '/Your Datacenter Folder/Your Datacenter/host/Your
Cluster/Resources/Your Resource Pool', 'datastore_name': 'Your Datastore'.
To place a new appliance to a virtual machine Folder, provide the
'vm_folder_path', e.g. 'vm_folder_path': 'VM Folder 0/VM Folder1'.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
host_path=None,
):
"""
:type host_path: :class:`str`
:param host_path: Full path to an ESX host. Format: /{datacenter folder}/{datacenter
name}/host/{host name}, e.g. /Your Datacenter Folder/Your
Datacenter/host/Your Host
"""
self.host_path = host_path
VapiStruct.__init__(self)
DatastoreVcInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.datastore_vc_inventory', {
'host_path': type.StringType(),
},
DatastoreVcInventory,
False,
None))
class DatastoreInfo(VapiStruct):
"""
Datastore space information. Space values are in GB.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
freespace=None,
freespace_after_placement=None,
required_space=None,
):
"""
:type freespace: :class:`float`
:param freespace: The amount of free space currently available on the datastore.
:type freespace_after_placement: :class:`float`
:param freespace_after_placement: The amount of free space the datastore will have after deployment.
:type required_space: :class:`float`
:param required_space: The amount of space that the deployment will occupy.
"""
self.freespace = freespace
self.freespace_after_placement = freespace_after_placement
self.required_space = required_space
VapiStruct.__init__(self)
DatastoreInfo._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.recommendation.datastore_info', {
'freespace': type.DoubleType(),
'freespace_after_placement': type.DoubleType(),
'required_space': type.DoubleType(),
},
DatastoreInfo,
False,
None))
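The three ``DatastoreInfo`` fields are related by simple arithmetic: the post-deployment free space should equal the current free space minus the required space (all in GB). A small consistency check, as a sketch; the function is illustrative and not part of the bindings:

```python
# Sketch of the arithmetic relation among DatastoreInfo fields (values in GB).
# The helper is illustrative only; the SDK does not expose such a check.

def placement_is_consistent(freespace, freespace_after_placement, required_space,
                            tolerance=0.01):
    """True when free space after placement equals free space minus required space."""
    return abs(freespace - required_space - freespace_after_placement) <= tolerance

print(placement_is_consistent(500.0, 375.0, 125.0))  # a 125 GB deployment on 500 GB free
```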
def scan_migrate_deployment_size_task(self,
spec,
):
"""
Recommend deployment sizes based on the configuration given in the
specification.
:type spec: :class:`Recommendation.MigrateDeploymentSizeRequest`
:param spec: The specification that contains the information needed to recommend
a deployment size.
:rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('scan_migrate_deployment_size$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.MapType(type.StringType(), type.ListType(type.StringType())))
return task_instance
def scan_datastore_task(self,
spec,
):
"""
Recommend a datastore for the appliance to be deployed in.
:type spec: :class:`Recommendation.DatastoreRequest`
:param spec: The specification that contains the information needed to recommend
a datastore.
:rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('scan_datastore$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.MapType(type.StringType(), type.ReferenceType(__name__, 'Recommendation.DatastoreInfo')))
return task_instance
def scan_upgrade_deployment_size_task(self,
spec,
):
"""
Recommend deployment sizes based on the configuration given in the
specification.
:type spec: :class:`Recommendation.UpgradeDeploymentSizeRequest`
:param spec: The specification that contains the information needed to recommend
a deployment size.
:rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('scan_upgrade_deployment_size$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.MapType(type.StringType(), type.ListType(type.StringType())))
return task_instance
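Each of the ``scan_*`` methods above returns a ``vmware.vapi.stdlib.client.task.Task`` whose result type differs (a string-list map for the deployment-size scans, a ``DatastoreInfo`` map for the datastore scan). A sketch of the polling pattern, using a stand-in ``FakeTask`` so it runs without a live vCenter; the ``is_done``/``get_result`` accessor names on the stub are assumptions for illustration only:

```python
# Minimal polling sketch for the Task objects returned by the scan_* methods.
# FakeTask stands in for vmware.vapi.stdlib.client.task.Task so the pattern
# can run offline; its method names are illustrative assumptions.

import time

class FakeTask:
    """Stand-in task that becomes ready after a fixed number of polls."""
    def __init__(self, result, polls_until_done=2):
        self._result = result
        self._polls_left = polls_until_done
    def is_done(self):
        self._polls_left -= 1
        return self._polls_left <= 0
    def get_result(self):
        return self._result

def wait_for_result(task, interval=0.0):
    # For scan_datastore_task, the result would map datastore names to
    # Recommendation.DatastoreInfo values.
    while not task.is_done():
        time.sleep(interval)
    return task.get_result()

print(wait_for_result(FakeTask({"datastore-1": "DatastoreInfo(...)"})))
```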
class Upgrade(VapiInterface):
"""
The service to upgrade an existing appliance to Embedded VCSA, PSC, and
Management VCSA.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.lcm.upgrade'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _UpgradeStub)
class UpgradeDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'appliance_type',
{
'VCSA_EMBEDDED' : [('appliance_size', False), ('appliance_disk_size', False), ('vcsa_embedded', True)],
'VCSA_EXTERNAL' : [('appliance_size', False), ('appliance_disk_size', False)],
'PSC' : [('psc', True)],
'VMC' : [],
}
),
]
def __init__(self,
appliance_name=None,
appliance_type=None,
appliance_size=None,
appliance_disk_size=None,
thin_disk_mode=None,
ova_location=None,
ova_location_ssl_verify=None,
ova_location_ssl_thumbprint=None,
ovftool_location=None,
ovftool_location_ssl_verify=None,
ovftool_location_ssl_thumbprint=None,
services=None,
temporary_network=None,
history=None,
ovftool_arguments=None,
vcsa_embedded=None,
psc=None,
):
"""
:type appliance_name: :class:`str`
:param appliance_name: The name of the appliance to deploy.
:type appliance_type: :class:`ApplianceType`
:param appliance_type: The type of appliance to deploy.
:type appliance_size: :class:`ApplianceSize` or ``None``
:param appliance_size: A size descriptor based on the number of virtual machines which
will be managed by the new vCenter appliance.
If None, defaults to SMALL
:type appliance_disk_size: :class:`StorageSize` or ``None``
:param appliance_disk_size: The disk size of the new vCenter appliance.
If None, defaults to REGULAR
:type thin_disk_mode: :class:`bool`
:param thin_disk_mode: Whether to deploy the appliance with thin mode virtual disks.
:type ova_location: :class:`str`
:param ova_location: The location of the OVA file.
:type ova_location_ssl_verify: :class:`bool` or ``None``
:param ova_location_ssl_verify: A flag to indicate whether SSL verification is required.
If ``ovaLocationSslThumbprint`` is provided, this field can be
omitted. If None, defaults to True.
:type ova_location_ssl_thumbprint: :class:`str` or ``None``
:param ova_location_ssl_thumbprint: SSL thumbprint used for verification. If provided, ``ssl_verify``
can be omitted or set to true. If omitted, ``ssl_verify`` must be
false; if omitted while ``ssl_verify`` is true, an error will occur.
If ``ova_location_ssl_verify`` is False, this field can be omitted.
:type ovftool_location: :class:`str`
:param ovftool_location: The location of the OVF Tool.
:type ovftool_location_ssl_verify: :class:`bool` or ``None``
:param ovftool_location_ssl_verify: A flag to indicate whether to verify the SSL thumbprint of the
OVF Tool location.
If None, defaults to True.
:type ovftool_location_ssl_thumbprint: :class:`str` or ``None``
:param ovftool_location_ssl_thumbprint: SSL thumbprint of OVF Tool location to be verified.
When ``ovftoolLocationSslVerify`` is set to False, this field can
be omitted.
:type services: :class:`UpgradeDestinationApplianceService`
:param services: Spec to configure vCenter server services.
:type temporary_network: :class:`TemporaryNetwork`
:param temporary_network: The network settings of the appliance to be deployed.
:type history: :class:`History` or ``None``
:param history: History data to be included in the upgrade and migration.
A default value is applied when absent.
:type ovftool_arguments: (:class:`dict` of :class:`str` and :class:`str`) or ``None``
:param ovftool_arguments: The OVF Tool arguments to be included.
Not required when there are no OVF Tool arguments to pass through.
:type vcsa_embedded: :class:`CeipOnlySso`
:param vcsa_embedded: Spec used to configure an embedded vCenter Server. This field
describes how the embedded vCenter Server appliance should be
configured.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.VCSA_EMBEDDED`.
:type psc: :class:`CeipOnlySso`
:param psc: Spec used to configure a Platform Services Controller. This section
describes how the Platform Services Controller appliance should be
configured. If unset, either ``vcsaEmbedded`` or ``vcsaExternal``
must be provided.
This attribute is optional and it is only relevant when the value
of ``applianceType`` is :attr:`ApplianceType.PSC`.
"""
self.appliance_name = appliance_name
self.appliance_type = appliance_type
self.appliance_size = appliance_size
self.appliance_disk_size = appliance_disk_size
self.thin_disk_mode = thin_disk_mode
self.ova_location = ova_location
self.ova_location_ssl_verify = ova_location_ssl_verify
self.ova_location_ssl_thumbprint = ova_location_ssl_thumbprint
self.ovftool_location = ovftool_location
self.ovftool_location_ssl_verify = ovftool_location_ssl_verify
self.ovftool_location_ssl_thumbprint = ovftool_location_ssl_thumbprint
self.services = services
self.temporary_network = temporary_network
self.history = history
self.ovftool_arguments = ovftool_arguments
self.vcsa_embedded = vcsa_embedded
self.psc = psc
VapiStruct.__init__(self)
UpgradeDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.upgrade.upgrade_destination_appliance', {
'appliance_name': type.StringType(),
'appliance_type': type.ReferenceType(__name__, 'ApplianceType'),
'appliance_size': type.OptionalType(type.ReferenceType(__name__, 'ApplianceSize')),
'appliance_disk_size': type.OptionalType(type.ReferenceType(__name__, 'StorageSize')),
'thin_disk_mode': type.BooleanType(),
'ova_location': type.StringType(),
'ova_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ova_location_ssl_thumbprint': type.OptionalType(type.StringType()),
'ovftool_location': type.StringType(),
'ovftool_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ovftool_location_ssl_thumbprint': type.OptionalType(type.StringType()),
'services': type.ReferenceType(__name__, 'UpgradeDestinationApplianceService'),
'temporary_network': type.ReferenceType(__name__, 'TemporaryNetwork'),
'history': type.OptionalType(type.ReferenceType(__name__, 'History')),
'ovftool_arguments': type.OptionalType(type.MapType(type.StringType(), type.StringType())),
'vcsa_embedded': type.OptionalType(type.ReferenceType(__name__, 'CeipOnlySso')),
'psc': type.OptionalType(type.ReferenceType(__name__, 'CeipOnlySso')),
},
UpgradeDestinationAppliance,
False,
None))
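The ``UnionValidator`` registered on ``UpgradeDestinationAppliance`` ties case-specific fields to ``appliance_type``: each entry lists a field name plus a flag for whether it is required (True) or merely allowed (False) for that type. A re-statement of that rule table as plain data, with a hypothetical checker for illustration; the bindings enforce these rules natively:

```python
# The UnionValidator rules above, as plain data: per appliance_type, which
# case-specific fields are required (True) or only allowed (False).
# missing_required_fields is a hypothetical helper, not part of the SDK.

RULES = {
    'VCSA_EMBEDDED': {'appliance_size': False, 'appliance_disk_size': False,
                      'vcsa_embedded': True},
    'VCSA_EXTERNAL': {'appliance_size': False, 'appliance_disk_size': False},
    'PSC': {'psc': True},
    'VMC': {},
}

def missing_required_fields(appliance_type, provided):
    """Return the case-specific fields that are required but not provided."""
    rules = RULES.get(appliance_type, {})
    return sorted(name for name, required in rules.items()
                  if required and name not in provided)

print(missing_required_fields('PSC', {'appliance_name'}))  # ['psc']
```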
class Spec(VapiStruct):
"""
Spec to describe the configuration parameters that are required for upgrade
of a vCenter Server Appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
destination_appliance=None,
source_location=None,
source_appliance=None,
source_vum=None,
source_vum_location=None,
):
"""
:type destination_location: :class:`DestinationLocation`
:param destination_location: This subsection describes the ESX or VC on which to deploy the
appliance.
:type destination_appliance: :class:`Upgrade.UpgradeDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
:type source_location: :class:`Connection`
:param source_location: The configuration to connect to an ESX/VC.
:type source_appliance: :class:`UpgradeSourceAppliance`
:param source_appliance: Spec to describe the existing appliance to upgrade.
:type source_vum: :class:`SourceVum` or ``None``
:param source_vum: This section describes the source vSphere Update Manager (VUM)
that you want to upgrade.
Not applicable for appliances that have no source vSphere Update
Manager.
:type source_vum_location: :class:`Connection` or ``None``
:param source_vum_location: The configuration to connect to an ESX/VC.
"""
self.destination_location = destination_location
self.destination_appliance = destination_appliance
self.source_location = source_location
self.source_appliance = source_appliance
self.source_vum = source_vum
self.source_vum_location = source_vum_location
VapiStruct.__init__(self)
Spec._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.upgrade.spec', {
'destination_location': type.ReferenceType(__name__, 'DestinationLocation'),
'destination_appliance': type.ReferenceType(__name__, 'Upgrade.UpgradeDestinationAppliance'),
'source_location': type.ReferenceType(__name__, 'Connection'),
'source_appliance': type.ReferenceType(__name__, 'UpgradeSourceAppliance'),
'source_vum': type.OptionalType(type.ReferenceType(__name__, 'SourceVum')),
'source_vum_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
},
Spec,
False,
None))
def check_task(self,
spec,
options=None,
):
"""
Performs a precheck for the given specification. The result of this
operation can be queried by calling cis/tasks/{task-id} with the
task-id returned in the response of this call.
:type spec: :class:`Upgrade.Spec`
:param spec: The specification of the deployment.
:type options: :class:`DeploymentOption` or ``None``
:param options: The deployment precheck options.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the given spec and/or options contain errors.
"""
task_id = self._invoke('check$task',
{
'spec': spec,
'options': options,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
def start_task(self,
spec,
):
"""
Deploys the appliance for the given specification. The result of this
operation can be queried by calling cis/tasks/{task-id} with the
task-id returned in the response of this call.
:type spec: :class:`Upgrade.Spec`
:param spec: The specification of the deployment.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the given specification contains errors.
"""
task_id = self._invoke('start$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
class Validation(VapiInterface):
"""
The service that validates a section of the full deployment
specification.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.lcm.validation'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _ValidationStub)
class UpgradeSourceApplianceDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
ova_location=None,
ova_location_ssl_verify=None,
ova_location_ssl_thumbprint=None,
):
"""
:type ova_location: :class:`str` or ``None``
:param ova_location: The location of the OVA file.
Not required.
:type ova_location_ssl_verify: :class:`bool` or ``None``
:param ova_location_ssl_verify: A flag to indicate whether SSL verification is required.
If ``ovaLocationSslThumbprint`` is provided, this field can be
omitted. If None, defaults to True.
:type ova_location_ssl_thumbprint: :class:`str` or ``None``
:param ova_location_ssl_thumbprint: SSL thumbprint used for verification. If provided, ``ssl_verify``
can be omitted or set to true. If omitted, ``ssl_verify`` must be
false; if omitted while ``ssl_verify`` is true, an error will occur.
If ``ova_location_ssl_verify`` is False, this field can be omitted.
"""
self.ova_location = ova_location
self.ova_location_ssl_verify = ova_location_ssl_verify
self.ova_location_ssl_thumbprint = ova_location_ssl_thumbprint
VapiStruct.__init__(self)
UpgradeSourceApplianceDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.upgrade_source_appliance_destination_appliance', {
'ova_location': type.OptionalType(type.StringType()),
'ova_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ova_location_ssl_thumbprint': type.OptionalType(type.StringType()),
},
UpgradeSourceApplianceDestinationAppliance,
False,
None))
class UpgradeSourceApplianceRequest(VapiStruct):
"""
The configuration to validate source appliance for upgrade.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
source_appliance=None,
source_location=None,
):
"""
:type destination_appliance: :class:`Validation.UpgradeSourceApplianceDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
:type source_appliance: :class:`UpgradeSourceAppliance`
:param source_appliance: The source appliance configuration.
:type source_location: :class:`Connection`
:param source_location: The source location configuration.
"""
self.destination_appliance = destination_appliance
self.source_appliance = source_appliance
self.source_location = source_location
VapiStruct.__init__(self)
UpgradeSourceApplianceRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.upgrade_source_appliance_request', {
'destination_appliance': type.ReferenceType(__name__, 'Validation.UpgradeSourceApplianceDestinationAppliance'),
'source_appliance': type.ReferenceType(__name__, 'UpgradeSourceAppliance'),
'source_location': type.ReferenceType(__name__, 'Connection'),
},
UpgradeSourceApplianceRequest,
False,
None))
class SourceLocationRequest(VapiStruct):
"""
This subsection describes the source ESX or VC connection to be validated.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
source_location=None,
):
"""
:type source_location: :class:`Connection`
:param source_location: The source location configuration.
"""
self.source_location = source_location
VapiStruct.__init__(self)
SourceLocationRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.source_location_request', {
'source_location': type.ReferenceType(__name__, 'Connection'),
},
SourceLocationRequest,
False,
None))
class SourceVumRequest(VapiStruct):
"""
The request that contains information needed to verify the credentials of
the source vSphere Update Manager and run the migration assistant.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
source_vc_windows=None,
source_appliance=None,
source_vum_location=None,
source_vum=None,
):
"""
:type source_vc_windows: :class:`Validation.SourceVumSourceVcWindows` or ``None``
:param source_vc_windows: Spec to describe the existing Windows vCenter server to migrate.
:type source_appliance: :class:`Validation.SourceVumUpgradeSourceAppliance` or ``None``
:param source_appliance: Source appliance configuration for upgrade service.
:type source_vum_location: :class:`Connection` or ``None``
:param source_vum_location: The configuration to connect to an ESX/VC.
:type source_vum: :class:`SourceVum`
:param source_vum: This section describes the source vSphere Update Manager (VUM)
that you want to upgrade.
"""
self.source_vc_windows = source_vc_windows
self.source_appliance = source_appliance
self.source_vum_location = source_vum_location
self.source_vum = source_vum
VapiStruct.__init__(self)
SourceVumRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.source_vum_request', {
'source_vc_windows': type.OptionalType(type.ReferenceType(__name__, 'Validation.SourceVumSourceVcWindows')),
'source_appliance': type.OptionalType(type.ReferenceType(__name__, 'Validation.SourceVumUpgradeSourceAppliance')),
'source_vum_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
'source_vum': type.ReferenceType(__name__, 'SourceVum'),
},
SourceVumRequest,
False,
None))
class SourceVumUpgradeSourceAppliance(VapiStruct):
"""
Source appliance configuration for upgrade service.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
sso_admin_username=None,
sso_admin_password=None,
):
"""
:type sso_admin_username: :class:`str`
:param sso_admin_username: vCenter Single Sign-On administrator user name of the source
appliance.
:type sso_admin_password: :class:`str`
:param sso_admin_password: vCenter Single Sign-On administrator password of the source
appliance.
"""
self.sso_admin_username = sso_admin_username
self.sso_admin_password = sso_admin_password
VapiStruct.__init__(self)
SourceVumUpgradeSourceAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.source_vum_upgrade_source_appliance', {
'sso_admin_username': type.StringType(),
'sso_admin_password': type.SecretType(),
},
SourceVumUpgradeSourceAppliance,
False,
None))
class SourceVumSourceVcWindows(VapiStruct):
"""
Spec to describe the existing Windows vCenter server to migrate.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
username=None,
password=None,
):
"""
:type username: :class:`str`
:param username: Single Sign-On administrator user on the source Windows vCenter
server. For example, administrator\\\\@vsphere.local. Important:
The user must be administrator\\\\@your_domain_name.
:type password: :class:`str`
:param password: The password of the Single Sign-On administrator on the source
Windows vCenter server.
"""
self.username = username
self.password = password
VapiStruct.__init__(self)
SourceVumSourceVcWindows._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.source_vum_source_vc_windows', {
'username': type.StringType(),
'password': type.SecretType(),
},
SourceVumSourceVcWindows,
False,
None))
class OsPasswordRequest(VapiStruct):
"""
The request that contains information needed to verify that the given
password conforms to the password policy.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
):
"""
:type destination_appliance: :class:`Validation.OsPasswordDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
OsPasswordRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.os_password_request', {
'destination_appliance': type.ReferenceType(__name__, 'Validation.OsPasswordDestinationAppliance'),
},
OsPasswordRequest,
False,
None))
class OsPasswordDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
root_password=None,
):
"""
:type root_password: :class:`str`
:param root_password: The password must conform to the following requirements: 1. At
least 8 characters. 2. No more than 20 characters. 3. At least 1
uppercase character. 4. At least 1 lowercase character. 5. At least
1 number. 6. At least 1 special character (e.g., '!', '(', '\\\\@',
etc.). 7. Only visible A-Z, a-z, 0-9 and punctuation characters
(spaces are not allowed).
"""
self.root_password = root_password
VapiStruct.__init__(self)
OsPasswordDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.os_password_destination_appliance', {
'root_password': type.SecretType(),
},
OsPasswordDestinationAppliance,
False,
None))
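The ``root_password`` docstring above enumerates seven policy rules. A local sketch that mirrors rules 1-7; the server-side validation service remains authoritative, and this helper is an illustrative assumption rather than SDK code:

```python
# Local mirror of the documented root_password policy (rules 1-7).
# Illustrative only; the Validation service performs the real check.

import string

def check_root_password(password):
    """Return a list of violated rules (an empty list means the password passes)."""
    allowed = set(string.ascii_letters + string.digits + string.punctuation)
    problems = []
    if len(password) < 8:
        problems.append("at least 8 characters")
    if len(password) > 20:
        problems.append("no more than 20 characters")
    if not any(c.isupper() for c in password):
        problems.append("at least 1 uppercase character")
    if not any(c.islower() for c in password):
        problems.append("at least 1 lowercase character")
    if not any(c.isdigit() for c in password):
        problems.append("at least 1 number")
    if not any(c in string.punctuation for c in password):
        problems.append("at least 1 special character")
    if any(c not in allowed for c in password):
        problems.append("only visible ASCII characters, no spaces")
    return problems

print(check_root_password("Sup3rSecret!"))  # []
```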
class NtpServerRequest(VapiStruct):
"""
A request that contains the information needed to validate the given NTP
servers.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
):
"""
:type destination_appliance: :class:`Validation.NtpServerDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
NtpServerRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.ntp_server_request', {
'destination_appliance': type.ReferenceType(__name__, 'Validation.NtpServerDestinationAppliance'),
},
NtpServerRequest,
False,
None))
class NtpServerDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
time=None,
):
"""
:type time: :class:`Time`
:param time: Configuration of the vCSA time synchronization.
"""
self.time = time
VapiStruct.__init__(self)
NtpServerDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.ntp_server_destination_appliance', {
'time': type.ReferenceType(__name__, 'Time'),
},
NtpServerDestinationAppliance,
False,
None))
class EsxRequest(VapiStruct):
"""
The request that contains the information needed to verify the ESX management status.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
source_location=None,
source_vum_location=None,
):
"""
:type destination_location: :class:`Validation.EsxDestinationLocation`
:param destination_location: This subsection describes the ESX or VC on which to deploy the
appliance.
:type source_location: :class:`Connection` or ``None``
:param source_location: The configuration to connect to an ESX/VC.
:type source_vum_location: :class:`Connection` or ``None``
:param source_vum_location: The configuration to connect to an ESX/VC.
"""
self.destination_location = destination_location
self.source_location = source_location
self.source_vum_location = source_vum_location
VapiStruct.__init__(self)
EsxRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.esx_request', {
'destination_location': type.ReferenceType(__name__, 'Validation.EsxDestinationLocation'),
'source_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
'source_vum_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
},
EsxRequest,
False,
None))
class EsxDestinationLocation(VapiStruct):
"""
This subsection describes the ESX or VC on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
esx=None,
vcenter=None,
):
"""
:type esx: :class:`Validation.ContainerWithoutInventory` or ``None``
:param esx: This section describes the ESX host on which to deploy the
appliance. Required if you are deploying the appliance directly on
an ESX host.
Mutually exclusive with ``vcenter``.
:type vcenter: :class:`Validation.ContainerWithoutInventory` or ``None``
:param vcenter: This subsection describes the vCenter on which to deploy the
appliance.
Mutually exclusive with ``esx``.
"""
self.esx = esx
self.vcenter = vcenter
VapiStruct.__init__(self)
EsxDestinationLocation._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.esx_destination_location', {
'esx': type.OptionalType(type.ReferenceType(__name__, 'Validation.ContainerWithoutInventory')),
'vcenter': type.OptionalType(type.ReferenceType(__name__, 'Validation.ContainerWithoutInventory')),
},
EsxDestinationLocation,
False,
None))
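``EsxDestinationLocation`` documents ``esx`` and ``vcenter`` as mutually exclusive: exactly one of the two optional sections should be supplied. The exactly-one-of rule as a one-line sketch; the helper is hypothetical and not part of the bindings:

```python
# Sketch of the esx/vcenter mutual-exclusion rule documented above.
# Hypothetical helper; the service validates this on the server side.

def exactly_one_destination(esx, vcenter):
    """True when exactly one of the two destination sections is supplied."""
    return (esx is None) != (vcenter is None)

print(exactly_one_destination({"connection": "..."}, None))  # True
```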
class ContainerWithoutInventory(VapiStruct):
"""
This section describes the ESX host on which to deploy the appliance.
Required if you are deploying the appliance directly on an ESX host.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
"""
self.connection = connection
VapiStruct.__init__(self)
ContainerWithoutInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.container_without_inventory', {
'connection': type.ReferenceType(__name__, 'Connection'),
},
ContainerWithoutInventory,
False,
None))
class NetworkRequest(VapiStruct):
"""
The request that contains network information to be validated.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
):
"""
:type destination_appliance: :class:`Validation.NetworkDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
NetworkRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.network_request', {
'destination_appliance': type.ReferenceType(__name__, 'Validation.NetworkDestinationAppliance'),
},
NetworkRequest,
False,
None))
class NetworkDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
network=None,
):
"""
:type network: :class:`Network`
:param network: The network settings of the appliance to be deployed.
"""
self.network = network
VapiStruct.__init__(self)
NetworkDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.network_destination_appliance', {
'network': type.ReferenceType(__name__, 'Network'),
},
NetworkDestinationAppliance,
False,
None))
class TemporaryNetworkRequest(VapiStruct):
"""
The request that contains network information to be validated.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
):
"""
:type destination_appliance: :class:`Validation.TemporaryNetworkDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
TemporaryNetworkRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.temporary_network_request', {
'destination_appliance': type.ReferenceType(__name__, 'Validation.TemporaryNetworkDestinationAppliance'),
},
TemporaryNetworkRequest,
False,
None))
class TemporaryNetworkDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
network=None,
):
"""
:type network: :class:`TemporaryNetwork`
:param network: The network settings of the appliance to be deployed.
"""
self.network = network
VapiStruct.__init__(self)
TemporaryNetworkDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.temporary_network_destination_appliance', {
'network': type.ReferenceType(__name__, 'TemporaryNetwork'),
},
TemporaryNetworkDestinationAppliance,
False,
None))
class ApplianceNameRequest(VapiStruct):
"""
Data container that contains the information needed to validate appliance
name.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
destination_appliance=None,
):
"""
:type destination_location: :class:`Validation.ApplianceNameDestinationLocation`
:param destination_location: This subsection describes the ESX or VC on which to deploy the
appliance.
:type destination_appliance: :class:`Validation.ApplianceNameDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_location = destination_location
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
ApplianceNameRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.appliance_name_request', {
'destination_location': type.ReferenceType(__name__, 'Validation.ApplianceNameDestinationLocation'),
'destination_appliance': type.ReferenceType(__name__, 'Validation.ApplianceNameDestinationAppliance'),
},
ApplianceNameRequest,
False,
None))
class ApplianceNameDestinationAppliance(VapiStruct):
"""
Data container for appliance name information used in validation of
appliance name request.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
appliance_name=None,
):
"""
:type appliance_name: :class:`str`
:param appliance_name: The name of the appliance to deploy.
"""
self.appliance_name = appliance_name
VapiStruct.__init__(self)
ApplianceNameDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.appliance_name_destination_appliance', {
'appliance_name': type.StringType(),
},
ApplianceNameDestinationAppliance,
False,
None))
class ApplianceNameDestinationLocation(VapiStruct):
"""
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
esx=None,
vcenter=None,
):
"""
:type esx: :class:`Validation.ApplianceNameEsx` or ``None``
:param esx: This section describes the ESX host on which to deploy the
appliance. Required if you are deploying the appliance directly on
an ESX host.
Mutually exclusive between ``esx`` and ``vcenter``
:type vcenter: :class:`Validation.ApplianceNameVc` or ``None``
:param vcenter: This subsection describes the vCenter on which to deploy the
appliance.
Mutually exclusive between ``esx`` and ``vcenter``
"""
self.esx = esx
self.vcenter = vcenter
VapiStruct.__init__(self)
ApplianceNameDestinationLocation._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.appliance_name_destination_location', {
'esx': type.OptionalType(type.ReferenceType(__name__, 'Validation.ApplianceNameEsx')),
'vcenter': type.OptionalType(type.ReferenceType(__name__, 'Validation.ApplianceNameVc')),
},
ApplianceNameDestinationLocation,
False,
None))
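The ``esx``/``vcenter`` fields above are both optional yet mutually exclusive. As a minimal sketch of that constraint (a hypothetical helper, not part of the SDK):

```python
def pick_destination(esx=None, vcenter=None):
    """Enforce that exactly one of esx/vcenter is provided, as the
    ApplianceNameDestinationLocation docstring requires."""
    if (esx is None) == (vcenter is None):
        raise ValueError("exactly one of 'esx' or 'vcenter' must be set")
    return ("esx", esx) if esx is not None else ("vcenter", vcenter)
```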
class ApplianceNameEsx(VapiStruct):
"""
This section describes the ESX host on which to deploy the appliance.
Required if you are deploying the appliance directly on an ESX host.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
"""
self.connection = connection
VapiStruct.__init__(self)
ApplianceNameEsx._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.appliance_name_esx', {
'connection': type.ReferenceType(__name__, 'Connection'),
},
ApplianceNameEsx,
False,
None))
class ApplianceNameEsxInventory(VapiStruct):
"""
The configuration of ESX inventory.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
resource_pool_path=None,
):
"""
:type resource_pool_path: :class:`str` or ``None``
:param resource_pool_path: The path to the resource pool on the ESX host in which the
appliance will be deployed.
Not applicable when the appliance is not deployed in a resource pool
"""
self.resource_pool_path = resource_pool_path
VapiStruct.__init__(self)
ApplianceNameEsxInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.appliance_name_esx_inventory', {
'resource_pool_path': type.OptionalType(type.StringType()),
},
ApplianceNameEsxInventory,
False,
None))
class ApplianceNameVc(VapiStruct):
"""
This subsection describes the vCenter on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
inventory=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
:type inventory: :class:`Validation.ApplianceNameVcInventory`
:param inventory: All names are case-sensitive. You can install the appliance to one
of the following destinations: 1. A resource pool in a cluster, use
'cluster_path'. 2. A specific ESX host in a cluster, use
'host_path'. 3. A resource pool in a specific ESX host being
managed by the current vCenter, use 'resource_pool_path'. You must
always provide the 'network_name' key. To install a new appliance
to a specific ESX host in a cluster, provide the 'host_path' key,
and the 'datastore_name', e.g. 'host_path':
'/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
Datastore'. To install a new appliance to a specific resource pool,
provide the 'resource_pool_path', and the 'datastore_name', e.g.
'resource_pool_path': '/Your Datacenter Folder/Your
Datacenter/host/Your Cluster/Resources/Your Resource Pool',
'datastore_name': 'Your Datastore'. To place a new appliance to a
virtual machine Folder, provide the 'vm_folder_path', e.g.
'vm_folder_path': 'VM Folder 0/VM Folder1'.
"""
self.connection = connection
self.inventory = inventory
VapiStruct.__init__(self)
ApplianceNameVc._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.appliance_name_vc', {
'connection': type.ReferenceType(__name__, 'Connection'),
'inventory': type.ReferenceType(__name__, 'Validation.ApplianceNameVcInventory'),
},
ApplianceNameVc,
False,
None))
class ApplianceNameVcInventory(VapiStruct):
"""
All names are case-sensitive. You can install the appliance to one of the
following destinations: 1. A resource pool in a cluster, use
'cluster_path'. 2. A specific ESX host in a cluster, use 'host_path'. 3. A
resource pool in a specific ESX host being managed by the current vCenter,
use 'resource_pool_path'. You must always provide the 'network_name' key.
To install a new appliance to a specific ESX host in a cluster, provide the
'host_path' key, and the 'datastore_name', e.g. 'host_path':
'/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
Datastore'. To install a new appliance to a specific resource pool, provide
the 'resource_pool_path', and the 'datastore_name', e.g.
'resource_pool_path': '/Your Datacenter Folder/Your Datacenter/host/Your
Cluster/Resources/Your Resource Pool', 'datastore_name': 'Your Datastore'.
To place a new appliance to a virtual machine Folder, provide the
'vm_folder_path', e.g. 'vm_folder_path': 'VM Folder 0/VM Folder1'.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vm_folder_path=None,
resource_pool_path=None,
cluster_path=None,
host_path=None,
):
"""
:type vm_folder_path: :class:`str` or ``None``
:param vm_folder_path:
:type resource_pool_path: :class:`str` or ``None``
:param resource_pool_path: Full path to resource pool. Format: /{datacenter
folder}/{datacenter name}/host/{host
name}/{cluster_name}/Resources/{resource pool}. e.g: /Your
Datacenter Folder/Your Datacenter/host/Your Cluster/Resources/Your
Resource Pool
Mutually exclusive between ``#resource_pool_path``,
``#cluster_path``, and ``#host_path``
:type cluster_path: :class:`str` or ``None``
:param cluster_path: Full path to the cluster. Format: /{datacenter folder}/{datacenter
name}/host/{cluster_name}. e.g: /Your Datacenter Folder/Your
Datacenter/host/Your Cluster
Mutually exclusive between ``#resource_pool_path``,
``#cluster_path``, and ``#host_path``
:type host_path: :class:`str` or ``None``
:param host_path: Full path to an ESX host. Format: /{datacenter folder}/{datacenter
name}/host/{host name}. e.g: /Your Datacenter Folder/Your
Datacenter/host/Your Host
Mutually exclusive between ``#resource_pool_path``,
``#cluster_path``, and ``#host_path``
"""
self.vm_folder_path = vm_folder_path
self.resource_pool_path = resource_pool_path
self.cluster_path = cluster_path
self.host_path = host_path
VapiStruct.__init__(self)
ApplianceNameVcInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.appliance_name_vc_inventory', {
'vm_folder_path': type.OptionalType(type.StringType()),
'resource_pool_path': type.OptionalType(type.StringType()),
'cluster_path': type.OptionalType(type.StringType()),
'host_path': type.OptionalType(type.StringType()),
},
ApplianceNameVcInventory,
False,
None))
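The inventory path formats documented above can be assembled mechanically. An illustrative helper (hypothetical, not part of the bindings) that builds the documented ``/{datacenter folder}/{datacenter name}/host/...`` shapes:

```python
def vc_paths(dc_folder, datacenter, cluster=None, host=None, pool=None):
    """Build the slash-separated inventory paths described in
    ApplianceNameVcInventory (all names are case-sensitive)."""
    base = "/%s/%s/host" % (dc_folder, datacenter)
    paths = {}
    if cluster:
        paths["cluster_path"] = "%s/%s" % (base, cluster)
        if pool:
            # resource pool lives under the cluster's Resources node
            paths["resource_pool_path"] = "%s/%s/Resources/%s" % (base, cluster, pool)
    if host:
        paths["host_path"] = "%s/%s" % (base, host)
    return paths
```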
class DestinationLocationRequest(VapiStruct):
"""
This subsection describes the ESX or VC on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
):
"""
:type destination_location: :class:`Validation.ValidationDestinationLocation`
:param destination_location: The destination location configuration.
"""
self.destination_location = destination_location
VapiStruct.__init__(self)
DestinationLocationRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.destination_location_request', {
'destination_location': type.ReferenceType(__name__, 'Validation.ValidationDestinationLocation'),
},
DestinationLocationRequest,
False,
None))
class ValidationDestinationLocation(VapiStruct):
"""
This subsection describes the ESX or VC on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
esx=None,
vcenter=None,
):
"""
:type esx: :class:`Validation.DestinationLocationEsx` or ``None``
:param esx: This section describes the ESX host on which to deploy the
appliance. Required if you are deploying the appliance directly on
an ESX host.
Mutually exclusive between ``esx`` and ``vcenter``
:type vcenter: :class:`Validation.DestinationLocationVc` or ``None``
:param vcenter: This subsection describes the vCenter on which to deploy the
appliance.
Mutually exclusive between ``esx`` and ``vcenter``
"""
self.esx = esx
self.vcenter = vcenter
VapiStruct.__init__(self)
ValidationDestinationLocation._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.validation_destination_location', {
'esx': type.OptionalType(type.ReferenceType(__name__, 'Validation.DestinationLocationEsx')),
'vcenter': type.OptionalType(type.ReferenceType(__name__, 'Validation.DestinationLocationVc')),
},
ValidationDestinationLocation,
False,
None))
class DestinationLocationEsx(VapiStruct):
"""
This section describes the ESX host on which to deploy the appliance.
Required if you are deploying the appliance directly on an ESX host.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
"""
self.connection = connection
VapiStruct.__init__(self)
DestinationLocationEsx._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.destination_location_esx', {
'connection': type.ReferenceType(__name__, 'Connection'),
},
DestinationLocationEsx,
False,
None))
class DestinationLocationVc(VapiStruct):
"""
This subsection describes the vCenter on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
inventory=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
:type inventory: :class:`VcInventory` or ``None``
:param inventory: All names are case-sensitive. You can install the appliance to one
of the following destinations: 1. A resource pool in a cluster, use
'cluster_path'. 2. A specific ESX host in a cluster, use
'host_path'. 3. A resource pool in a specific ESX host being
managed by the current vCenter, use 'resource_pool_path'. You must
always provide the 'network_name' key. To install a new appliance
to a specific ESX host in a cluster, provide the 'host_path' key,
and the 'datastore_name', e.g. 'host_path':
'/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
Datastore'. To install a new appliance to a specific resource pool,
provide the 'resource_pool_path', and the 'datastore_name', e.g.
'resource_pool_path': '/Your Datacenter Folder/Your
Datacenter/host/Your Cluster/Resources/Your Resource Pool',
'datastore_name': 'Your Datastore'. To place a new appliance to a
virtual machine Folder, provide the 'vm_folder_path', e.g.
'vm_folder_path': 'VM Folder 0/VM Folder1'.
"""
self.connection = connection
self.inventory = inventory
VapiStruct.__init__(self)
DestinationLocationVc._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.destination_location_vc', {
'connection': type.ReferenceType(__name__, 'Connection'),
'inventory': type.OptionalType(type.ReferenceType(__name__, 'VcInventory')),
},
DestinationLocationVc,
False,
None))
class HostConfigRequest(VapiStruct):
"""
Data container that contains the information needed to validate the host
configuration.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_location=None,
destination_appliance=None,
):
"""
:type destination_location: :class:`Validation.HostConfigDestinationLocation`
:param destination_location: This subsection describes the ESX or VC on which to deploy the
appliance.
:type destination_appliance: :class:`Validation.HostConfigDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
"""
self.destination_location = destination_location
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
HostConfigRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.host_config_request', {
'destination_location': type.ReferenceType(__name__, 'Validation.HostConfigDestinationLocation'),
'destination_appliance': type.ReferenceType(__name__, 'Validation.HostConfigDestinationAppliance'),
},
HostConfigRequest,
False,
None))
class HostConfigDestinationAppliance(VapiStruct):
"""
Data container for the appliance information used in validation of the
host configuration request.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'appliance_type',
{
'VCSA_EXTERNAL' : [],
'VCSA_EMBEDDED' : [],
'PSC' : [],
'VMC' : [],
}
),
]
def __init__(self,
appliance_name=None,
appliance_type=None,
appliance_size=None,
appliance_disk_size=None,
thin_disk_mode=None,
ova_location=None,
ova_location_ssl_verify=None,
ova_location_ssl_thumbprint=None,
):
"""
:type appliance_name: :class:`str`
:param appliance_name: The name of the appliance to deploy.
:type appliance_type: :class:`ApplianceType`
:param appliance_type: The type of appliance to deploy.
:type appliance_size: :class:`ApplianceSize` or ``None``
:param appliance_size: A size descriptor based on the number of virtual machines which
will be managed by the new vCenter appliance.
If None, defaults to SMALL
:type appliance_disk_size: :class:`StorageSize` or ``None``
:param appliance_disk_size: The disk size of the new vCenter appliance.
If None, defaults to REGULAR
:type thin_disk_mode: :class:`bool`
:param thin_disk_mode: Whether to deploy the appliance with thin mode virtual disks.
:type ova_location: :class:`str` or ``None``
:param ova_location: The location of the ova file.
Not required.
:type ova_location_ssl_verify: :class:`bool` or ``None``
:param ova_location_ssl_verify: A flag to indicate whether the ssl verification is required.
If ``ovaLocationSslThumbprint`` is provided, this field can be
omitted. If None, defaults to True
:type ova_location_ssl_thumbprint: :class:`str` or ``None``
:param ova_location_ssl_thumbprint: The SSL thumbprint used for SSL verification. If provided,
ssl_verify can be omitted or set to true. If this field is
omitted, ssl_verify must be false; if it is omitted while
ssl_verify is true, an error will occur.
If ova_location_ssl_verify is False, this field can be omitted
"""
self.appliance_name = appliance_name
self.appliance_type = appliance_type
self.appliance_size = appliance_size
self.appliance_disk_size = appliance_disk_size
self.thin_disk_mode = thin_disk_mode
self.ova_location = ova_location
self.ova_location_ssl_verify = ova_location_ssl_verify
self.ova_location_ssl_thumbprint = ova_location_ssl_thumbprint
VapiStruct.__init__(self)
HostConfigDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.host_config_destination_appliance', {
'appliance_name': type.StringType(),
'appliance_type': type.ReferenceType(__name__, 'ApplianceType'),
'appliance_size': type.OptionalType(type.ReferenceType(__name__, 'ApplianceSize')),
'appliance_disk_size': type.OptionalType(type.ReferenceType(__name__, 'StorageSize')),
'thin_disk_mode': type.BooleanType(),
'ova_location': type.OptionalType(type.StringType()),
'ova_location_ssl_verify': type.OptionalType(type.BooleanType()),
'ova_location_ssl_thumbprint': type.OptionalType(type.StringType()),
},
HostConfigDestinationAppliance,
False,
None))
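The interplay between ``ova_location_ssl_verify`` and ``ova_location_ssl_thumbprint`` described above can be sketched as a plain check. This is a hypothetical mirror of the docstring rules only, not the SDK's actual validation:

```python
def check_ova_ssl(ssl_verify=None, thumbprint=None):
    """ssl_verify defaults to True when omitted; verifying without a
    thumbprint is rejected, per HostConfigDestinationAppliance's docs."""
    verify = True if ssl_verify is None else ssl_verify
    if verify and thumbprint is None:
        raise ValueError("SSL verification requires ova_location_ssl_thumbprint")
    return verify
```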
class HostConfigDestinationLocation(VapiStruct):
"""
This subsection describes the ESX or VC on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
esx=None,
vcenter=None,
):
"""
:type esx: :class:`Validation.HostConfigEsx` or ``None``
:param esx: This section describes the ESX host on which to deploy the
appliance. Required if you are deploying the appliance directly on
an ESX host.
Mutually exclusive between ``esx`` and ``vcenter``
:type vcenter: :class:`Validation.HostConfigVc` or ``None``
:param vcenter: This subsection describes the vCenter on which to deploy the
appliance.
Mutually exclusive between ``esx`` and ``vcenter``
"""
self.esx = esx
self.vcenter = vcenter
VapiStruct.__init__(self)
HostConfigDestinationLocation._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.host_config_destination_location', {
'esx': type.OptionalType(type.ReferenceType(__name__, 'Validation.HostConfigEsx')),
'vcenter': type.OptionalType(type.ReferenceType(__name__, 'Validation.HostConfigVc')),
},
HostConfigDestinationLocation,
False,
None))
class HostConfigEsx(VapiStruct):
"""
This section describes the ESX host on which to deploy the appliance.
Required if you are deploying the appliance directly on an ESX host.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
inventory=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
:type inventory: :class:`Validation.HostConfigEsxInventory`
:param inventory: The configuration of ESX inventory.
"""
self.connection = connection
self.inventory = inventory
VapiStruct.__init__(self)
HostConfigEsx._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.host_config_esx', {
'connection': type.ReferenceType(__name__, 'Connection'),
'inventory': type.ReferenceType(__name__, 'Validation.HostConfigEsxInventory'),
},
HostConfigEsx,
False,
None))
class HostConfigEsxInventory(VapiStruct):
"""
The configuration of ESX inventory.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
datastore_name=None,
resource_pool_path=None,
):
"""
:type datastore_name: :class:`str`
:param datastore_name: The datastore on which to store the files of the appliance. This
value has to be either a specific datastore name, or a specific
datastore in a datastore cluster. The datastore must be accessible
from the ESX host and must have at least 25 GB of free space.
Otherwise, the new appliance might not power on.
:type resource_pool_path: :class:`str` or ``None``
:param resource_pool_path: The path to the resource pool on the ESX host in which the
appliance will be deployed.
Not applicable when the appliance is not deployed in a resource pool
"""
self.datastore_name = datastore_name
self.resource_pool_path = resource_pool_path
VapiStruct.__init__(self)
HostConfigEsxInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.host_config_esx_inventory', {
'datastore_name': type.StringType(),
'resource_pool_path': type.OptionalType(type.StringType()),
},
HostConfigEsxInventory,
False,
None))
class HostConfigVc(VapiStruct):
"""
This subsection describes the vCenter on which to deploy the appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
connection=None,
inventory=None,
):
"""
:type connection: :class:`Connection`
:param connection: The configuration to connect to an ESX/VC.
:type inventory: :class:`Validation.HostConfigVcInventory`
:param inventory: All names are case-sensitive. You can install the appliance to one
of the following destinations: 1. A resource pool in a cluster, use
'cluster_path'. 2. A specific ESX host in a cluster, use
'host_path'. 3. A resource pool in a specific ESX host being
managed by the current vCenter, use 'resource_pool_path'. You must
always provide the 'network_name' key. To install a new appliance
to a specific ESX host in a cluster, provide the 'host_path' key,
and the 'datastore_name', e.g. 'host_path':
'/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
Datastore'. To install a new appliance to a specific resource pool,
provide the 'resource_pool_path', and the 'datastore_name', e.g.
'resource_pool_path': '/Your Datacenter Folder/Your
Datacenter/host/Your Cluster/Resources/Your Resource Pool',
'datastore_name': 'Your Datastore'. To place a new appliance to a
virtual machine Folder, provide the 'vm_folder_path', e.g.
'vm_folder_path': 'VM Folder 0/VM Folder1'.
"""
self.connection = connection
self.inventory = inventory
VapiStruct.__init__(self)
HostConfigVc._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.host_config_vc', {
'connection': type.ReferenceType(__name__, 'Connection'),
'inventory': type.ReferenceType(__name__, 'Validation.HostConfigVcInventory'),
},
HostConfigVc,
False,
None))
class HostConfigVcInventory(VapiStruct):
"""
All names are case-sensitive. You can install the appliance to one of the
following destinations: 1. A resource pool in a cluster, use
'cluster_path'. 2. A specific ESX host in a cluster, use 'host_path'. 3. A
resource pool in a specific ESX host being managed by the current vCenter,
use 'resource_pool_path'. You must always provide the 'network_name' key.
To install a new appliance to a specific ESX host in a cluster, provide the
'host_path' key, and the 'datastore_name', e.g. 'host_path':
'/MyDataCenter/host/MyCluster/10.20.30.40', 'datastore_name': 'Your
Datastore'. To install a new appliance to a specific resource pool, provide
the 'resource_pool_path', and the 'datastore_name', e.g.
'resource_pool_path': '/Your Datacenter Folder/Your Datacenter/host/Your
Cluster/Resources/Your Resource Pool', 'datastore_name': 'Your Datastore'.
To place a new appliance to a virtual machine Folder, provide the
'vm_folder_path', e.g. 'vm_folder_path': 'VM Folder 0/VM Folder1'.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vm_folder_path=None,
resource_pool_path=None,
cluster_path=None,
host_path=None,
datastore_name=None,
datastore_cluster_name=None,
):
"""
:type vm_folder_path: :class:`str` or ``None``
:param vm_folder_path: Path of the VM folder. The VM folder must be visible from the data
center of the compute resource. Format: {vm_folder1}/{vm_folder2}.
e.g.: 'VM Folder 0/VM Folder1'.
Mutually exclusive between ``#resource_pool_path``,
``#cluster_path``, and ``#host_path``
:type resource_pool_path: :class:`str` or ``None``
:param resource_pool_path: Full path to resource pool. Format: /{datacenter
folder}/{datacenter name}/host/{host
name}/{cluster_name}/Resources/{resource pool}. e.g: /Your
Datacenter Folder/Your Datacenter/host/Your Cluster/Resources/Your
Resource Pool
Mutually exclusive between ``#resource_pool_path``,
``#cluster_path``, and ``#host_path``
:type cluster_path: :class:`str` or ``None``
:param cluster_path: Full path to the cluster. Format: /{datacenter folder}/{datacenter
name}/host/{cluster_name}. e.g: /Your Datacenter Folder/Your
Datacenter/host/Your Cluster
Mutually exclusive between ``#resource_pool_path``,
``#cluster_path``, and ``#host_path``
:type host_path: :class:`str` or ``None``
:param host_path:
:type datastore_name: :class:`str` or ``None``
:param datastore_name: The datastore on which to store the files of the appliance. This
value has to be either a specific datastore name, or a specific
datastore in a datastore cluster. The datastore must be accessible
from the ESX host and must have at least 25 GB of free space.
Otherwise, the new appliance might not power on.
Mutually exclusive between ``#datastore_name`` and
``#datastore_cluster_name``
:type datastore_cluster_name: :class:`str` or ``None``
:param datastore_cluster_name: The datastore cluster on which to store the files of the appliance.
Mutually exclusive between ``#datastore_name`` and
``#datastore_cluster_name``
"""
self.vm_folder_path = vm_folder_path
self.resource_pool_path = resource_pool_path
self.cluster_path = cluster_path
self.host_path = host_path
self.datastore_name = datastore_name
self.datastore_cluster_name = datastore_cluster_name
VapiStruct.__init__(self)
HostConfigVcInventory._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.host_config_vc_inventory', {
'vm_folder_path': type.OptionalType(type.StringType()),
'resource_pool_path': type.OptionalType(type.StringType()),
'cluster_path': type.OptionalType(type.StringType()),
'host_path': type.OptionalType(type.StringType()),
'datastore_name': type.OptionalType(type.StringType()),
'datastore_cluster_name': type.OptionalType(type.StringType()),
},
HostConfigVcInventory,
False,
None))
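Taken together, HostConfigVcInventory accepts exactly one placement key (``resource_pool_path``, ``cluster_path``, or ``host_path``) and exactly one datastore key. A hypothetical validator for that shape, operating on a plain dict rather than the binding class:

```python
def classify_vc_inventory(inv):
    """Return (placement_key, datastore_key) for a HostConfigVcInventory-like
    dict, enforcing the two mutual-exclusion groups from the docstring."""
    placement = [k for k in ("resource_pool_path", "cluster_path", "host_path")
                 if inv.get(k)]
    storage = [k for k in ("datastore_name", "datastore_cluster_name")
               if inv.get(k)]
    if len(placement) != 1 or len(storage) != 1:
        raise ValueError("need exactly one placement key and one datastore key")
    return placement[0], storage[0]
```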
class MigrateSourceApplianceRequest(VapiStruct):
"""
Data container that contains the information needed to validate the source
appliance for migration.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
source_vc_windows=None,
existing_migration_assistant=None,
start_migration_assistant=None,
source_vum_location=None,
source_vum=None,
):
"""
:type destination_appliance: :class:`Recommendation.DeploymentSizeDestinationAppliance`
:param destination_appliance: Spec to describe the new appliance.
:type source_vc_windows: :class:`SourceVcWindows`
:param source_vc_windows: Spec to describe the existing Windows vCenter server to migrate.
:type existing_migration_assistant: :class:`ExistingMigrationAssistant` or ``None``
:param existing_migration_assistant: Spec to describe the attributes of a running Migration Assistant on
the Windows vCenter server.
Only applicable when migration assistant is already running on the
source appliance
:type start_migration_assistant: :class:`MigrationAssistant` or ``None``
:param start_migration_assistant: Spec to automate the invocation of Migration Assistant. Automatic
invocation works only if the source Windows installation is running
as a virtual machine.
Only applicable when migration assistant is not running on the
source appliance.
:type source_vum_location: :class:`Connection` or ``None``
:param source_vum_location: The configuration to connect to an ESX/VC.
:type source_vum: :class:`SourceVum` or ``None``
:param source_vum: This section describes the source vSphere Update Manager (VUM)
which you want to upgrade.
Not applicable for appliance not having source vSphere Update
Manager
"""
self.destination_appliance = destination_appliance
self.source_vc_windows = source_vc_windows
self.existing_migration_assistant = existing_migration_assistant
self.start_migration_assistant = start_migration_assistant
self.source_vum_location = source_vum_location
self.source_vum = source_vum
VapiStruct.__init__(self)
MigrateSourceApplianceRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.migrate_source_appliance_request', {
'destination_appliance': type.ReferenceType(__name__, 'Recommendation.DeploymentSizeDestinationAppliance'),
'source_vc_windows': type.ReferenceType(__name__, 'SourceVcWindows'),
'existing_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'ExistingMigrationAssistant')),
'start_migration_assistant': type.OptionalType(type.ReferenceType(__name__, 'MigrationAssistant')),
'source_vum_location': type.OptionalType(type.ReferenceType(__name__, 'Connection')),
'source_vum': type.OptionalType(type.ReferenceType(__name__, 'SourceVum')),
},
MigrateSourceApplianceRequest,
False,
None))
class SsoConfigurationRequest(VapiStruct):
"""
The request that contains information needed to verify the Single Sign-On
configuration.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
destination_appliance=None,
):
"""
:type destination_appliance: :class:`Validation.SsoConfigurationDestinationAppliance`
:param destination_appliance: Destination appliance configuration needed to validate Single
Sign-On.
"""
self.destination_appliance = destination_appliance
VapiStruct.__init__(self)
SsoConfigurationRequest._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.sso_configuration_request', {
'destination_appliance': type.ReferenceType(__name__, 'Validation.SsoConfigurationDestinationAppliance'),
},
SsoConfigurationRequest,
False,
None))
class SsoConfigurationDestinationAppliance(VapiStruct):
"""
Spec to describe the new appliance.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vcsa_embedded=None,
psc=None,
vcsa_external=None,
):
"""
:type vcsa_embedded: :class:`Validation.SsoConfigurationVcsaEmbedded` or ``None``
:param vcsa_embedded: Configuration of Single Sign-On for deploying Embedded.
Mutually exclusive between embedded, PSC, and management node.
:type psc: :class:`Validation.SsoConfigurationPsc` or ``None``
:param psc: Configuration of Single Sign-On for deploying PSC.
Mutually exclusive between embedded, PSC, and management node.
:type vcsa_external: :class:`ExternalVcsa` or ``None``
:param vcsa_external: Configuration of Single Sign-On for deploying management node.
Mutually exclusive between embedded, PSC, and management node.
"""
self.vcsa_embedded = vcsa_embedded
self.psc = psc
self.vcsa_external = vcsa_external
VapiStruct.__init__(self)
SsoConfigurationDestinationAppliance._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.sso_configuration_destination_appliance', {
'vcsa_embedded': type.OptionalType(type.ReferenceType(__name__, 'Validation.SsoConfigurationVcsaEmbedded')),
'psc': type.OptionalType(type.ReferenceType(__name__, 'Validation.SsoConfigurationPsc')),
'vcsa_external': type.OptionalType(type.ReferenceType(__name__, 'ExternalVcsa')),
},
SsoConfigurationDestinationAppliance,
False,
None))
class SsoConfigurationVcsaEmbedded(VapiStruct):
"""
Spec used to configure an embedded vCenter Server. This field describes how
the embedded vCenter Server appliance should be configured.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
standalone=None,
replicated=None,
):
"""
:type standalone: :class:`EmbeddedStandaloneVcsa` or ``None``
:param standalone: Spec used to configure a standalone embedded vCenter Server. This
field describes how the standalone vCenter Server appliance should
be configured.
Mutually exclusive between ``standalone`` and ``replicated``
:type replicated: :class:`EmbeddedReplicatedVcsa` or ``None``
:param replicated: Spec used to configure a replicated embedded vCenter Server. This
field describes how the replicated vCenter Server appliance should
be configured.
Mutually exclusive between ``standalone`` and ``replicated``
"""
self.standalone = standalone
self.replicated = replicated
VapiStruct.__init__(self)
SsoConfigurationVcsaEmbedded._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.sso_configuration_vcsa_embedded', {
'standalone': type.OptionalType(type.ReferenceType(__name__, 'EmbeddedStandaloneVcsa')),
'replicated': type.OptionalType(type.ReferenceType(__name__, 'EmbeddedReplicatedVcsa')),
},
SsoConfigurationVcsaEmbedded,
False,
None))
class SsoConfigurationPsc(VapiStruct):
"""
Spec used to configure a Platform Services Controller. This section
describes how the Platform Services Controller appliance should be
configured. If unset, either ``#vcsaEmbedded`` or ``#vcsaExternal`` must be
provided.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
standalone=None,
replicated=None,
):
"""
:type standalone: :class:`PscStandalone` or ``None``
:param standalone: Spec used to configure a standalone Platform Services Controller.
This section describes how the standalone PSC should be configured.
Mutually exclusive between ``standalone`` and ``replicated``
:type replicated: :class:`PscReplicated` or ``None``
:param replicated: Spec used to configure a replicated Platform Services Controller.
This section describes how the replicated PSC should be configured.
Mutually exclusive between ``standalone`` and ``replicated``
"""
self.standalone = standalone
self.replicated = replicated
VapiStruct.__init__(self)
SsoConfigurationPsc._set_binding_type(type.StructType(
'com.vmware.vcenter.lcm.validation.sso_configuration_psc', {
'standalone': type.OptionalType(type.ReferenceType(__name__, 'PscStandalone')),
'replicated': type.OptionalType(type.ReferenceType(__name__, 'PscReplicated')),
},
SsoConfigurationPsc,
False,
None))
def check_appliance_name_task(self,
spec,
):
"""
        Validate the name of the appliance to be deployed.
        #. 1. Return False if an appliance with the same name as given in
           the spec already exists in the path.
:type spec: :class:`Validation.ApplianceNameRequest`
:param spec: The configuration needed to validate the name of the appliance to
be deployed.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_appliance_name$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_os_password_task(self,
spec,
):
"""
        Validate whether the given password conforms to the password policy.
        #. 1. Return False if the password in the spec violates the password
           policy.
:type spec: :class:`Validation.OsPasswordRequest`
:param spec: The configuration needed to validate the given password against
password policy.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_os_password$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_network_task(self,
spec,
):
"""
Check to see if the given network configuration is valid.
#. 1. Return False if the given network will cause conflict.
#. 2. Always return True if network mode is set to DHCP.
:type spec: :class:`Validation.NetworkRequest`
:param spec: The configuration needed to validate network.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_network$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_host_config_task(self,
spec,
):
"""
Validate the host configuration.
        #. 1. Return False if the provided appliance type, appliance size, and
           disk size combination is not valid.
        #. 2. Return False if the provided deployment path does not have
           sufficient memory allocated.
        #. 3. Return False if the provided deployment path does not have
           sufficient CPU allocated.
        #. 4. Return False if the provided deployment path does not have
           sufficient datastore space.
:type spec: :class:`Validation.HostConfigRequest`
:param spec: The configuration needed to validate host configuration.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_host_config$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_destination_location_task(self,
spec,
):
"""
Validate the ESX of the appliance to be deployed.
:type spec: :class:`Validation.DestinationLocationRequest`
:param spec: The configuration needed to validate the ESX of the appliance to be
deployed.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_destination_location$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_temporary_network_task(self,
spec,
):
"""
Check to see if the given network configuration is valid.
#. 1. Return False if the given network will cause conflict.
#. 2. Always return True if network mode is set to DHCP.
:type spec: :class:`Validation.TemporaryNetworkRequest`
:param spec: The configuration needed to validate network.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_temporary_network$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_source_vum_task(self,
spec,
):
"""
Validate the source VUM configuration.
:type spec: :class:`Validation.SourceVumRequest`
:param spec: The configuration needed to validate the source VUM.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_source_vum$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_upgrade_source_appliance_task(self,
spec,
):
"""
Validate the source appliance to be upgraded.
        #. 1. Return False if the provided source appliance credentials are
           incorrect.
#. 2. Return False if the provided source location credentials are
incorrect.
#. 3. Return False if upgrade runner precheck results in error.
#. 4. Return False if export directory provided is invalid.
:type spec: :class:`Validation.UpgradeSourceApplianceRequest`
:param spec: The configuration of the source appliance to be upgraded.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_upgrade_source_appliance$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
def check_migrate_source_appliance_task(self,
spec,
):
"""
Validate the source appliance to be migrated.
        #. 1. Return False if the provided source Windows vCenter credentials
           are incorrect.
#. 2. Return False if migration assistant precheck results in error.
:type spec: :class:`Validation.MigrateSourceApplianceRequest`
:param spec: The configuration of the source appliance to be migrated.
        :rtype: :class:`vmware.vapi.stdlib.client.task.Task`
:return: Task instance
"""
task_id = self._invoke('check_migrate_source_appliance$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.BooleanType())
return task_instance
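# ---------------------------------------------------------------------------
# Illustrative sketch (NOT part of the generated bindings; the helper and
# callable names below are hypothetical): every check_*_task method above
# follows the same shape -- invoke the '<operation>$task' variant of the
# operation with the spec, then wrap the returned task identifier in a Task
# bound to a boolean result for the caller to poll.
def _sketch_check_task(invoke, operation, spec):
    # 'invoke' stands in for ApiInterfaceStub._invoke; the '$task' suffix
    # selects the asynchronous, task-returning form of the operation.
    task_id = invoke(operation + '$task', {'spec': spec})
    return task_id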
class _InstallStub(ApiInterfaceStub):
def __init__(self, config):
# properties for check operation
check_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Install.Spec'),
'options': type.OptionalType(type.ReferenceType(__name__, 'DeploymentOption')),
})
check_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
}
check_input_value_validator_list = [
]
check_output_validator_list = [
]
check_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/install?action=check',
path_variables={
},
query_parameters={
}
)
# properties for start operation
start_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Install.Spec'),
})
start_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
}
start_input_value_validator_list = [
]
start_output_validator_list = [
]
start_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/install?action=start',
path_variables={
},
query_parameters={
}
)
operations = {
'check$task': {
'input_type': check_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_error_dict,
'input_value_validator_list': check_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'start$task': {
'input_type': start_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': start_error_dict,
'input_value_validator_list': start_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
}
rest_metadata = {
'check': check_rest_metadata,
'start': start_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.lcm.install',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _LogStub(ApiInterfaceStub):
def __init__(self, config):
# properties for get operation
get_input_type = type.StructType('operation-input', {
'task_id': type.IdType(resource_types='com.vmware.cis.task'),
})
get_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/vcenter/lcm/logs/{taskId}',
path_variables={
'task_id': 'taskId',
},
query_parameters={
}
)
operations = {
'get': {
'input_type': get_input_type,
'output_type': type.URIType(),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.lcm.log',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
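# ---------------------------------------------------------------------------
# Illustrative sketch (hypothetical helper, not generated code): the
# OperationRestMetadata above binds the Python parameter 'task_id' to the
# '{taskId}' placeholder in the URL template. Substitution works roughly
# like this:
def _sketch_fill_path(url_template, path_variables, values):
    # e.g. ('/vcenter/lcm/logs/{taskId}', {'task_id': 'taskId'},
    #       {'task_id': '42'}) -> '/vcenter/lcm/logs/42'
    for py_name, template_name in path_variables.items():
        url_template = url_template.replace('{%s}' % template_name,
                                            values[py_name])
    return url_template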
class _MigrateStub(ApiInterfaceStub):
def __init__(self, config):
# properties for check operation
check_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Migrate.Spec'),
'options': type.OptionalType(type.ReferenceType(__name__, 'DeploymentOption')),
})
check_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
}
check_input_value_validator_list = [
]
check_output_validator_list = [
]
check_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/migration?action=check',
path_variables={
},
query_parameters={
}
)
# properties for start operation
start_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Migrate.Spec'),
})
start_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
}
start_input_value_validator_list = [
]
start_output_validator_list = [
]
start_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/migration?action=start',
path_variables={
},
query_parameters={
}
)
operations = {
'check$task': {
'input_type': check_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_error_dict,
'input_value_validator_list': check_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'start$task': {
'input_type': start_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': start_error_dict,
'input_value_validator_list': start_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
}
rest_metadata = {
'check': check_rest_metadata,
'start': start_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.lcm.migrate',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _RecommendationStub(ApiInterfaceStub):
def __init__(self, config):
# properties for scan_migrate_deployment_size operation
scan_migrate_deployment_size_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Recommendation.MigrateDeploymentSizeRequest'),
})
scan_migrate_deployment_size_error_dict = {}
scan_migrate_deployment_size_input_value_validator_list = [
]
scan_migrate_deployment_size_output_validator_list = [
]
scan_migrate_deployment_size_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/recommendation?action=scan-migrate-deployment-size',
path_variables={
},
query_parameters={
}
)
# properties for scan_datastore operation
scan_datastore_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Recommendation.DatastoreRequest'),
})
scan_datastore_error_dict = {}
scan_datastore_input_value_validator_list = [
]
scan_datastore_output_validator_list = [
]
scan_datastore_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/recommendation?action=scan-datastore',
path_variables={
},
query_parameters={
}
)
# properties for scan_upgrade_deployment_size operation
scan_upgrade_deployment_size_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Recommendation.UpgradeDeploymentSizeRequest'),
})
scan_upgrade_deployment_size_error_dict = {}
scan_upgrade_deployment_size_input_value_validator_list = [
]
scan_upgrade_deployment_size_output_validator_list = [
]
scan_upgrade_deployment_size_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/recommendation?action=scan-upgrade-deployment-size',
path_variables={
},
query_parameters={
}
)
operations = {
'scan_migrate_deployment_size$task': {
'input_type': scan_migrate_deployment_size_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': scan_migrate_deployment_size_error_dict,
'input_value_validator_list': scan_migrate_deployment_size_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'scan_datastore$task': {
'input_type': scan_datastore_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': scan_datastore_error_dict,
'input_value_validator_list': scan_datastore_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'scan_upgrade_deployment_size$task': {
'input_type': scan_upgrade_deployment_size_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': scan_upgrade_deployment_size_error_dict,
'input_value_validator_list': scan_upgrade_deployment_size_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
}
rest_metadata = {
'scan_migrate_deployment_size': scan_migrate_deployment_size_rest_metadata,
'scan_datastore': scan_datastore_rest_metadata,
'scan_upgrade_deployment_size': scan_upgrade_deployment_size_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.lcm.recommendation',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _UpgradeStub(ApiInterfaceStub):
def __init__(self, config):
# properties for check operation
check_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Upgrade.Spec'),
'options': type.OptionalType(type.ReferenceType(__name__, 'DeploymentOption')),
})
check_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
}
check_input_value_validator_list = [
]
check_output_validator_list = [
]
check_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/upgrade?action=check',
path_variables={
},
query_parameters={
}
)
# properties for start operation
start_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Upgrade.Spec'),
})
start_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
}
start_input_value_validator_list = [
]
start_output_validator_list = [
]
start_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/upgrade?action=start',
path_variables={
},
query_parameters={
}
)
operations = {
'check$task': {
'input_type': check_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_error_dict,
'input_value_validator_list': check_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'start$task': {
'input_type': start_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': start_error_dict,
'input_value_validator_list': start_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
}
rest_metadata = {
'check': check_rest_metadata,
'start': start_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.lcm.upgrade',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _ValidationStub(ApiInterfaceStub):
def __init__(self, config):
# properties for check_appliance_name operation
check_appliance_name_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.ApplianceNameRequest'),
})
check_appliance_name_error_dict = {}
check_appliance_name_input_value_validator_list = [
]
check_appliance_name_output_validator_list = [
]
check_appliance_name_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-appliance-name',
path_variables={
},
query_parameters={
}
)
# properties for check_os_password operation
check_os_password_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.OsPasswordRequest'),
})
check_os_password_error_dict = {}
check_os_password_input_value_validator_list = [
]
check_os_password_output_validator_list = [
]
check_os_password_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-os-password',
path_variables={
},
query_parameters={
}
)
# properties for check_network operation
check_network_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.NetworkRequest'),
})
check_network_error_dict = {}
check_network_input_value_validator_list = [
]
check_network_output_validator_list = [
]
check_network_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-network',
path_variables={
},
query_parameters={
}
)
# properties for check_host_config operation
check_host_config_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.HostConfigRequest'),
})
check_host_config_error_dict = {}
check_host_config_input_value_validator_list = [
]
check_host_config_output_validator_list = [
]
check_host_config_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-host-config',
path_variables={
},
query_parameters={
}
)
# properties for check_destination_location operation
check_destination_location_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.DestinationLocationRequest'),
})
check_destination_location_error_dict = {}
check_destination_location_input_value_validator_list = [
]
check_destination_location_output_validator_list = [
]
check_destination_location_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-destination-location',
path_variables={
},
query_parameters={
}
)
# properties for check_temporary_network operation
check_temporary_network_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.TemporaryNetworkRequest'),
})
check_temporary_network_error_dict = {}
check_temporary_network_input_value_validator_list = [
]
check_temporary_network_output_validator_list = [
]
check_temporary_network_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-temporary-network',
path_variables={
},
query_parameters={
}
)
# properties for check_source_vum operation
check_source_vum_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.SourceVumRequest'),
})
check_source_vum_error_dict = {}
check_source_vum_input_value_validator_list = [
]
check_source_vum_output_validator_list = [
]
check_source_vum_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-source-vum',
path_variables={
},
query_parameters={
}
)
# properties for check_upgrade_source_appliance operation
check_upgrade_source_appliance_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.UpgradeSourceApplianceRequest'),
})
check_upgrade_source_appliance_error_dict = {}
check_upgrade_source_appliance_input_value_validator_list = [
]
check_upgrade_source_appliance_output_validator_list = [
]
check_upgrade_source_appliance_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-upgrade-source-appliance',
path_variables={
},
query_parameters={
}
)
# properties for check_migrate_source_appliance operation
check_migrate_source_appliance_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Validation.MigrateSourceApplianceRequest'),
})
check_migrate_source_appliance_error_dict = {}
check_migrate_source_appliance_input_value_validator_list = [
]
check_migrate_source_appliance_output_validator_list = [
]
check_migrate_source_appliance_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/lcm/validation?action=check-migrate-source-appliance',
path_variables={
},
query_parameters={
}
)
operations = {
'check_appliance_name$task': {
'input_type': check_appliance_name_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_appliance_name_error_dict,
'input_value_validator_list': check_appliance_name_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_os_password$task': {
'input_type': check_os_password_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_os_password_error_dict,
'input_value_validator_list': check_os_password_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_network$task': {
'input_type': check_network_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_network_error_dict,
'input_value_validator_list': check_network_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_host_config$task': {
'input_type': check_host_config_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_host_config_error_dict,
'input_value_validator_list': check_host_config_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_destination_location$task': {
'input_type': check_destination_location_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_destination_location_error_dict,
'input_value_validator_list': check_destination_location_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_temporary_network$task': {
'input_type': check_temporary_network_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_temporary_network_error_dict,
'input_value_validator_list': check_temporary_network_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_source_vum$task': {
'input_type': check_source_vum_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_source_vum_error_dict,
'input_value_validator_list': check_source_vum_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_upgrade_source_appliance$task': {
'input_type': check_upgrade_source_appliance_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_upgrade_source_appliance_error_dict,
'input_value_validator_list': check_upgrade_source_appliance_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
'check_migrate_source_appliance$task': {
'input_type': check_migrate_source_appliance_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': check_migrate_source_appliance_error_dict,
'input_value_validator_list': check_migrate_source_appliance_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
}
rest_metadata = {
'check_appliance_name': check_appliance_name_rest_metadata,
'check_os_password': check_os_password_rest_metadata,
'check_network': check_network_rest_metadata,
'check_host_config': check_host_config_rest_metadata,
'check_destination_location': check_destination_location_rest_metadata,
'check_temporary_network': check_temporary_network_rest_metadata,
'check_source_vum': check_source_vum_rest_metadata,
'check_upgrade_source_appliance': check_upgrade_source_appliance_rest_metadata,
'check_migrate_source_appliance': check_migrate_source_appliance_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.lcm.validation',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
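# ---------------------------------------------------------------------------
# Illustrative note (hypothetical helper, not generated code): the
# rest_metadata table above pairs each snake_case operation name with a
# kebab-case '?action=' query value on the shared /vcenter/lcm/validation
# endpoint. That naming convention can be sketched as:
def _sketch_action_url(operation_name, base='/vcenter/lcm/validation'):
    # e.g. 'check_os_password' -> '/vcenter/lcm/validation?action=check-os-password'
    return '%s?action=%s' % (base, operation_name.replace('_', '-'))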
class StubFactory(StubFactoryBase):
_attrs = {
'Install': Install,
'Log': Log,
'Migrate': Migrate,
'Recommendation': Recommendation,
'Upgrade': Upgrade,
'Validation': Validation,
}
| 39.690209 | 142 | 0.610506 | 27,171 | 261,876 | 5.666041 | 0.029406 | 0.011276 | 0.026112 | 0.013946 | 0.837553 | 0.812039 | 0.783575 | 0.758905 | 0.736022 | 0.718406 | 0 | 0.002169 | 0.306485 | 261,876 | 6,597 | 143 | 39.696226 | 0.845514 | 0.416533 | 0 | 0.609781 | 1 | 0 | 0.157086 | 0.097703 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044182 | false | 0.02597 | 0.005734 | 0 | 0.102192 | 0.019224 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
"""
Language Detection APIs
Language detection.<BR />[Endpoint] https://api.apitore.com/api/22 # noqa: E501
OpenAPI spec version: 0.0.1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from swagger_client.api_client import ApiClient
class LangDetectControllerApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def get_using_get(self, access_token, text, **kwargs): # noqa: E501
"""Language Detection. This supports 53 languages. # noqa: E501
Language Detection.<BR />Response<BR /> Github: <a href=\"https://github.com/keigohtr/apitore-response-parent/tree/master/langdetect-response\">langdetect-response</a><BR /> Class: com.apitore.banana.response.org.jsoup.LanguageResponseEntity<BR /> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_using_get(access_token, text, async=True)
>>> result = thread.get()
:param async bool
:param str access_token: Access Token (required)
:param str text: Text [10-20 words over is recommended] (required)
:return: LanguageResponseEntity
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_using_get_with_http_info(access_token, text, **kwargs) # noqa: E501
else:
(data) = self.get_using_get_with_http_info(access_token, text, **kwargs) # noqa: E501
return data
def get_using_get_with_http_info(self, access_token, text, **kwargs): # noqa: E501
"""Language Detection. This supports 53 languages. # noqa: E501
Language Detection.<BR />Response<BR /> Github: <a href=\"https://github.com/keigohtr/apitore-response-parent/tree/master/langdetect-response\">langdetect-response</a><BR /> Class: com.apitore.banana.response.org.jsoup.LanguageResponseEntity<BR /> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_using_get_with_http_info(access_token, text, async=True)
>>> result = thread.get()
:param async bool
:param str access_token: Access Token (required)
:param str text: Text [10-20 words over is recommended] (required)
:return: LanguageResponseEntity
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['access_token', 'text'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'access_token' is set
if ('access_token' not in params or
params['access_token'] is None):
raise ValueError("Missing the required parameter `access_token` when calling `get_using_get`") # noqa: E501
# verify the required parameter 'text' is set
if ('text' not in params or
params['text'] is None):
raise ValueError("Missing the required parameter `text` when calling `get_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'access_token' in params:
query_params.append(('access_token', params['access_token'])) # noqa: E501
if 'text' in params:
query_params.append(('text', params['text'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/langdetect/get', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='LanguageResponseEntity', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def sm_get_using_get(self, access_token, text, **kwargs): # noqa: E501
"""Language Detection for Short Messages. This supports 53 languages. # noqa: E501
Language Detection.<BR />Response<BR /> Github: <a href=\"https://github.com/keigohtr/apitore-response-parent/tree/master/langdetect-response\">langdetect-response</a><BR /> Class: com.apitore.banana.response.org.jsoup.LanguageResponseEntity<BR /> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.sm_get_using_get(access_token, text, async=True)
>>> result = thread.get()
:param async bool
:param str access_token: Access Token (required)
:param str text: Text [Short message like tweet is supported] (required)
:return: LanguageResponseEntity
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.sm_get_using_get_with_http_info(access_token, text, **kwargs) # noqa: E501
else:
(data) = self.sm_get_using_get_with_http_info(access_token, text, **kwargs) # noqa: E501
return data
    def sm_get_using_get_with_http_info(self, access_token, text, **kwargs):  # noqa: E501
        """Language Detection for Short Messages. This supports 53 languages.  # noqa: E501

        Language Detection.<BR />Response<BR /> Github: <a href=\"https://github.com/keigohtr/apitore-response-parent/tree/master/langdetect-response\">langdetect-response</a><BR /> Class: com.apitore.banana.response.org.jsoup.LanguageResponseEntity<BR />  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async=True
        >>> thread = api.sm_get_using_get_with_http_info(access_token, text, async=True)
        >>> result = thread.get()

        :param async bool
        :param str access_token: Access Token (required)
        :param str text: Text [Short message like tweet is supported] (required)
        :return: LanguageResponseEntity
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['access_token', 'text']  # noqa: E501
        all_params.append('async')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method sm_get_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'access_token' is set
        if ('access_token' not in params or
                params['access_token'] is None):
            raise ValueError("Missing the required parameter `access_token` when calling `sm_get_using_get`")  # noqa: E501
        # verify the required parameter 'text' is set
        if ('text' not in params or
                params['text'] is None):
            raise ValueError("Missing the required parameter `text` when calling `sm_get_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'access_token' in params:
            query_params.append(('access_token', params['access_token']))  # noqa: E501
        if 'text' in params:
            query_params.append(('text', params['text']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/langdetect/short/get', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='LanguageResponseEntity',  # noqa: E501
            auth_settings=auth_settings,
            async=params.get('async'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
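# The two methods above use the standard swagger-codegen sync/async dispatch:
# pop `async` out of kwargs, then either call `*_with_http_info` directly or
# run it on a worker thread. A minimal stdlib sketch of the same pattern (the
# `fetch` worker and `async_req` flag are illustrative stand-ins, not part of
# this client; `async` itself is a reserved word in Python 3.7+):

```python
from concurrent.futures import ThreadPoolExecutor

_pool = ThreadPoolExecutor(max_workers=2)


def fetch(text):
    # Stand-in for the real HTTP call; returns a fake detection result.
    return {'text': text, 'language': 'en'}


def detect(text, **kwargs):
    # Synchronous by default; pass async_req=True to get a future back
    # (the generated client instead returns a thread with .get()).
    if kwargs.get('async_req'):
        return _pool.submit(fetch, text)
    return fetch(text)


sync_result = detect('hello')
async_result = detect('hello', async_req=True).result()
```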

# --- File: tests/v2/test_1142-numbers-to-type.py (repo: colesbury/awkward-1.0, license: BSD-3-Clause) ---

# BSD 3-Clause License; see https://github.com/scikit-hep/awkward-1.0/blob/main/LICENSE
import pytest # noqa: F401
import numpy as np # noqa: F401
import awkward as ak # noqa: F401


def test_numbers_to_type():
    one, two = (
        ak._v2.highlevel.Array([1, 2, 3]).layout,
        ak._v2.highlevel.Array([4, 5]).layout,
    )
    # Plain numeric and complex dtypes, plus every datetime64/timedelta64 unit,
    # expressed once as dtype-name strings; np.dtype treats these names and the
    # corresponding scalar types (np.bool_, np.int8, ...) as the same dtype.
    units = ["Y", "M", "W", "D", "h", "m", "s", "ms", "us", "ns", "ps", "fs", "as"]
    dtype_names = (
        ["bool", "int8", "uint8", "int16", "uint16", "int32", "uint32",
         "int64", "uint64", "float32", "float64", "complex64", "complex128",
         "datetime64", "timedelta64"]
        + ["datetime64[%s]" % unit for unit in units]
        + ["timedelta64[%s]" % unit for unit in units]
    )
    for layout in (one, two):
        for name in dtype_names:
            assert np.asarray(layout.numbers_to_type(name)).dtype == np.dtype(name)
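# Every assertion in the test above reduces to two NumPy facts: np.dtype
# accepts both scalar types and dtype-name strings and treats them as equal,
# and unit-qualified datetime64/timedelta64 names round-trip through str().
# A quick plain-NumPy sanity check of that equivalence (no awkward required):

```python
import numpy as np

# A dtype built from a string equals one built from the scalar type.
assert np.dtype("bool") == np.dtype(np.bool_)
assert np.dtype("float32") == np.dtype(np.float32)

# Unit-qualified datetime64/timedelta64 names round-trip through str().
for name in ("datetime64[ms]", "timedelta64[ns]"):
    assert str(np.dtype(name)) == name

# Casting an array reports the requested dtype, mirroring numbers_to_type.
assert np.asarray([1, 2, 3]).astype("int16").dtype == np.dtype(np.int16)
```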

# --- File: startup/users/30-user_Ruan.py (repo: NSLS-II-SMI/profile_collection, license: BSD-3-Clause) ---


def NEXAFS_S_edge(t=0.5):
    yield from bps.mv(waxs, 60)

    dets = [pil300KW]
    name = '17_Phil_pk61_buffer_NEXAFS_3rd'
    # x = [8800]
    energies = np.linspace(2450, 2500, 51)

    # for name, x in zip(names, x):
    #     bps.mv(piezo.x, x)
    det_exposure_time(t, t)

    name_fmt = '{sample}_{energy}eV_xbpm{xbpm}'
    for e in energies:
        yield from bps.mv(energy, e)
        sample_name = name_fmt.format(sample=name, energy=e, xbpm='%3.1f' % xbpm3.sumY.value)
        sample_id(user_name='ZR', sample_name=sample_name)
        print(f'\n\t=== Sample: {sample_name} ===\n')
        yield from bp.count(dets, num=1)
        yield from bps.sleep(2)
    yield from bps.mv(energy, 2470)
    yield from bps.sleep(10)

def NEXAFS_Cl_edge(t=0.5):
    yield from bps.mv(waxs, 60)

    dets = [pil300KW]
    name = '7_Le_13_Cl_saxs_solution'
    # x = [8800]
    energies = np.linspace(2800, 2850, 51)

    # for name, x in zip(names, x):
    #     bps.mv(piezo.x, x)
    det_exposure_time(t, t)

    name_fmt = '{sample}_{energy}eV_xbpm{xbpm}'
    for e in energies:
        yield from bps.mv(energy, e)
        sample_name = name_fmt.format(sample=name, energy=e, xbpm='%3.1f' % xbpm3.sumY.value)
        sample_id(user_name='ZR', sample_name=sample_name)
        print(f'\n\t=== Sample: {sample_name} ===\n')
        yield from bp.count(dets, num=1)
        yield from bps.sleep(2)
    yield from bps.mv(energy, 2800)
    yield from bps.sleep(10)

def SAXS_Cl_edge(t=1):
    dets = [pil300KW, pil1M]
    name = '7_Le_13_Cl_saxs_solution'
    energies = [2810, 2820, 2826, 2827, 2829, 2832, 2850]
    # energies = [2470]
    det_exposure_time(t, t)

    name_fmt = '{sample}_{energy}eV_xbpm{xbpm}_wa{wa}'
    wa = [0.0, 6.5, 13.0]
    # y0 = piezo.y.position
    # ys = np.linspace(y0, y0 + 750, 6)
    for wax in wa:
        yield from bps.mv(waxs, wax)
        for k, e in enumerate(energies):
            yield from bps.mv(energy, e)
            # yield from bps.mv(piezo.y, yss)
            sample_name = name_fmt.format(sample=name, energy=e, xbpm='%3.1f' % xbpm3.sumY.value, wa='%2.1f' % wax)
            sample_id(user_name='OS', sample_name=sample_name)
            print(f'\n\t=== Sample: {sample_name} ===\n')
            yield from bp.count(dets, num=1)
        yield from bps.mv(energy, 2810)
        # yield from bps.mvr(piezo.y, 150)

    for wax in wa:
        yield from bps.mv(waxs, wax)
        name_fmt = '{sample}_2810eV_postmeas_xbpm{xbpm}_wa{wa}'
        sample_name = name_fmt.format(sample=name, xbpm='%3.1f' % xbpm3.sumY.value, wa='%2.1f' % wax)
        sample_id(user_name='OS', sample_name=sample_name)
        print(f'\n\t=== Sample: {sample_name} ===\n')
        yield from bp.count(dets, num=1)

    sample_id(user_name='test', sample_name='test')

def NEXAFS_Br_edge(t=0.5):
    yield from bps.mv(waxs, 60)

    dets = [pil300KW]
    name = '1_Le_15_Br_nexafs_solution'
    # x = [8800]
    energies = np.linspace(13450, 13500, 51)

    # for name, x in zip(names, x):
    #     bps.mv(piezo.x, x)
    det_exposure_time(t, t)

    name_fmt = '{sample}_{energy}eV_xbpm{xbpm}'
    for e in energies:
        yield from bps.mv(energy, e)
        sample_name = name_fmt.format(sample=name, energy=e, xbpm='%3.1f' % xbpm3.sumY.value)
        sample_id(user_name='ZR', sample_name=sample_name)
        print(f'\n\t=== Sample: {sample_name} ===\n')
        yield from bp.count(dets, num=1)
        yield from bps.sleep(2)
    yield from bps.mv(energy, 13450)
    yield from bps.sleep(10)

def SAXS_Br_edge(t=1):
    dets = [pil300KW, pil1M]
    name = '5_Le_15_Br_saxs'
    energies = [13450, 13465, 13469, 13471, 13478, 13500]
    # energies = [13450]
    det_exposure_time(t, t)

    name_fmt = '{sample}_{energy}eV_xbpm{xbpm}_wa{wa}'
    wa = [0.0, 6.5, 13.0]
    # y0 = piezo.y.position
    # ys = np.linspace(y0, y0 + 750, 6)
    for wax in wa:
        yield from bps.mv(waxs, wax)
        for k, e in enumerate(energies):
            yield from bps.mv(energy, e)
            # yield from bps.mv(piezo.y, yss)
            sample_name = name_fmt.format(sample=name, energy=e, xbpm='%3.1f' % xbpm3.sumY.value, wa='%2.1f' % wax)
            sample_id(user_name='OS', sample_name=sample_name)
            print(f'\n\t=== Sample: {sample_name} ===\n')
            yield from bp.count(dets, num=1)
        yield from bps.mv(energy, 13450)
        # yield from bps.mvr(piezo.y, 150)

    for wax in wa:
        yield from bps.mv(waxs, wax)
        name_fmt = '{sample}_13450eV_postmeas_xbpm{xbpm}_wa{wa}'
        sample_name = name_fmt.format(sample=name, xbpm='%3.1f' % xbpm3.sumY.value, wa='%2.1f' % wax)
        sample_id(user_name='OS', sample_name=sample_name)
        print(f'\n\t=== Sample: {sample_name} ===\n')
        yield from bp.count(dets, num=1)

    sample_id(user_name='test', sample_name='test')

def SAXS_s_edge(t=1):
    dets = [pil300KW]
    name = '17_Phil_pk61_buffer_saxs'
    energies = [2470, 2477, 2480, 2482, 2484, 2500]
    # energies = [2470]
    det_exposure_time(t, t)

    name_fmt = '{sample}_{energy}eV_xbpm{xbpm}_wa{wa}'
    wa = [0.0, 6.5, 13.0, 19.5]

    yield from bps.mv(GV7.close_cmd, 1)
    yield from bps.sleep(1)
    yield from bps.mv(GV7.close_cmd, 1)

    y0 = piezo.y.position
    ys = np.linspace(y0, y0 + 750, 6)
    for wax in wa:
        yield from bps.mv(waxs, wax)
        for k, (e, yss) in enumerate(zip(energies, ys)):
            yield from bps.mv(energy, e)
            yield from bps.mv(piezo.y, yss)
            sample_name = name_fmt.format(sample=name, energy=e, xbpm='%3.1f' % xbpm3.sumY.value, wa='%2.1f' % wax)
            sample_id(user_name='OS', sample_name=sample_name)
            print(f'\n\t=== Sample: {sample_name} ===\n')
            yield from bp.count(dets, num=1)
        yield from bps.mv(energy, 2470)
        yield from bps.mvr(piezo.y, 150)

    for wax in wa:
        yield from bps.mv(waxs, wax)
        name_fmt = '{sample}_2470eV_postmeas_xbpm{xbpm}_wa{wa}'
        sample_name = name_fmt.format(sample=name, xbpm='%3.1f' % xbpm3.sumY.value, wa='%2.1f' % wax)
        sample_id(user_name='OS', sample_name=sample_name)
        print(f'\n\t=== Sample: {sample_name} ===\n')
        yield from bp.count(dets, num=1)

    sample_id(user_name='test', sample_name='test')
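# Every plan above derives scan names from the same str.format template; the
# formatting itself is easy to check in isolation (the xbpm and waxs values
# below are made up for illustration, standing in for xbpm3.sumY.value and wax):

```python
name_fmt = '{sample}_{energy}eV_xbpm{xbpm}_wa{wa}'

sample_name = name_fmt.format(
    sample='7_Le_13_Cl_saxs_solution',
    energy=2826,
    xbpm='%3.1f' % 123.456,  # one decimal place, as in the plans above
    wa='%2.1f' % 6.5,
)
```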

# --- File: tests/test_middleware.py (repo: lastfm/djangohmac, license: Unlicense) ---

# Third Party Libs
from django.core.exceptions import PermissionDenied
from django.test import RequestFactory, TestCase
# First Party Libs
from djangohmac.middleware import HmacMiddleware
from djangohmac.sign import Hmac


class SingleHmacMiddlewareTestCase(TestCase):

    def setUp(self):
        self.hmacmiddleware = HmacMiddleware()
        self.factory = RequestFactory()
        self.hmac = Hmac()

    def test_raise_exception_when_signature_is_not_send(self):
        request = self.factory.get('/example')
        with self.assertRaises(PermissionDenied):
            self.hmacmiddleware.process_request(request)

    def test_should_be_ok_when_correct_hmac_is_send(self):
        signature = self.hmac.make_hmac()
        request = self.factory.get('/example', **{'HTTP_' + self.hmac.header.upper(): signature})
        self.hmacmiddleware.process_request(request)

    def test_raise_exception_when_invalid_hmac_is_send(self):
        request = self.factory.get('/example', **{'HTTP_' + self.hmac.header.upper(): '00'})
        with self.assertRaises(PermissionDenied):
            self.hmacmiddleware.process_request(request)


class MultipleHmacMiddlewareTestCase(TestCase):

    def setUp(self):
        self.hmacmiddleware = HmacMiddleware()
        self.factory = RequestFactory()
        self.hmac = Hmac()

    def test_raise_exception_when_signature_is_not_send(self):
        request = self.factory.get('/example')
        with self.assertRaises(PermissionDenied):
            self.hmacmiddleware.process_request(request)

    def test_should_be_ok_when_correct_hmac_is_send(self):
        signature = self.hmac.make_hmac_for('serviceA')
        request = self.factory.get('/example', **{'HTTP_' + self.hmac.header.upper(): signature})
        self.hmacmiddleware.process_request(request)

    def test_raise_exception_when_signature_changed(self):
        signature = self.hmac.make_hmac_for('serviceA', 'some data')
        request = self.factory.get('/example', **{'HTTP_' + self.hmac.header.upper(): signature})
        with self.assertRaises(PermissionDenied):
            self.hmacmiddleware.process_request(request)

    def test_raise_exception_when_invalid_hmac_is_send(self):
        request = self.factory.get('/example', **{'HTTP_' + self.hmac.header.upper(): '00'})
        with self.assertRaises(PermissionDenied):
            self.hmacmiddleware.process_request(request)
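# Conceptually, Hmac.make_hmac computes a keyed digest over the request data
# and the middleware compares it against the client-supplied header. A minimal
# stdlib sketch of that verification step (the sha256/hexdigest choice and the
# key below are illustrative assumptions, not djangohmac's exact encoding):

```python
import hashlib
import hmac


def make_signature(key, message=b''):
    return hmac.new(key, message, hashlib.sha256).hexdigest()


def verify(key, message, signature):
    # compare_digest is constant-time, avoiding timing side channels.
    return hmac.compare_digest(make_signature(key, message), signature)


key = b'shared-secret'
good = make_signature(key, b'some data')
assert verify(key, b'some data', good)
assert not verify(key, b'tampered', good)   # payload changed
assert not verify(key, b'some data', '00')  # bogus signature, as in the tests
```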

# --- File: source/test_everything.py (repo: jaesikchoi/gpss-research, license: MIT) ---

import unittest
import numpy as np
import experiment
import flexible_function as ff
import grammar
# import translation
class ff_testcase(unittest.TestCase):
    def test_noise_kernel(self):
        k = ff.NoiseKernel()
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        print '\n', k.get_gpml_expression(dimensions=3), '\n'
        k.initialise_params(data_shape={'y_sd': 0})
        print '\n', k, '\n'
        k = k.copy()
        print '\n', k, '\n'
        assert k == k.copy()
        k.load_param_vector(k.param_vector)
        print '\n', k, '\n'

    def test_sq_exp(self):
        k = ff.SqExpKernel()
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        k.dimension = 1
        print '\n', k.get_gpml_expression(dimensions=3), '\n'
        k.initialise_params(data_shape={'y_sd': 0, 'x_sd': [0, 2], 'x_min': [-10, -100], 'x_max': [10, 100]})
        print '\n', k, '\n'
        assert k == k.copy()
        k = k.copy()
        print '\n', k, '\n'
        assert k == k.copy()
        k.load_param_vector(k.param_vector)
        print '\n', k, '\n'
    def test_sum(self):
        k = ff.SqExpKernel()
        k1 = k.copy()
        k2 = k.copy()
        k = k1 + k2
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        k.operands[0].dimension = 0
        k.operands[1].dimension = 1
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        print '\n', k.get_gpml_expression(dimensions=3), '\n'
        k.initialise_params(data_shape={'y_sd': 0, 'x_sd': [0, 2], 'x_min': [-10, -100], 'x_max': [10, 100]})
        print '\n', k, '\n'
        assert k == k.copy()
        k = k.copy()
        print '\n', k, '\n'
        assert k == k.copy()
        k.load_param_vector(k.param_vector)
        print '\n', k, '\n'
        k = k + k.copy()
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        print '\n', k.get_gpml_expression(dimensions=3), '\n'
        k.initialise_params(data_shape={'y_sd': 0, 'x_sd': [0, 2], 'x_min': [-10, -100], 'x_max': [10, 100]})
        print '\n', k, '\n'
        assert k == k.copy()
        k = k.copy()
        print '\n', k, '\n'
        assert k == k.copy()
        k.load_param_vector(k.param_vector)
        print '\n', k, '\n'
        k.sf
    def test_prod(self):
        k = ff.SqExpKernel()
        k1 = k.copy()
        k2 = k.copy()
        k = k1 * k2
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        k.operands[0].dimension = 0
        k.operands[1].dimension = 1
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        print '\n', k.get_gpml_expression(dimensions=3), '\n'
        k.initialise_params(data_shape={'y_sd': 0, 'x_sd': [0, 2], 'x_min': [-10, -100], 'x_max': [10, 100]})
        print '\n', k, '\n'
        assert k == k.copy()
        k = k.copy()
        print '\n', k, '\n'
        assert k == k.copy()
        k.load_param_vector(k.param_vector)
        print '\n', k, '\n'
        k = k + k.copy()
        print '\n', k.pretty_print(), '\n'
        print '\n', k.syntax, '\n'
        print '\n', k, '\n'
        print '\n', k.get_gpml_expression(dimensions=3), '\n'
        k.initialise_params(data_shape={'y_sd': 0, 'x_sd': [0, 2], 'x_min': [-10, -100], 'x_max': [10, 100]})
        print '\n', k, '\n'
        assert k == k.copy()
        k = k.copy()
        print '\n', k, '\n'
        assert k == k.copy()
        k.load_param_vector(k.param_vector)
        print '\n', k, '\n'
        k.sf
    def test_model(self):
        print 'model'
        m = ff.MeanZero()
        k = ff.SqExpKernel()
        l = ff.LikGauss()
        regression_model = ff.GPModel(mean=m, kernel=k, likelihood=l, nll=0, ndata=100)
        print '\n', regression_model.pretty_print(), '\n'
        print '\n', regression_model.__repr__(), '\n'
        print regression_model.bic
        print regression_model.aic
        print regression_model.pl2
        print ff.GPModel.score(regression_model, criterion='nll')

    def test_base(self):
        kernels = ff.base_kernels_without_dimension('SE,Const,Noise')
        for k in kernels:
            print '\n', k.pretty_print(), '\n'
        kernels = ff.base_kernels(3, 'SE,Const,Noise')
        for k in kernels:
            print '\n', k.pretty_print(), '\n'

    def test_repr(self):
        m = ff.MeanZero()
        k = ff.SqExpKernel()
        l = ff.LikGauss()
        regression_model = ff.GPModel(mean=m, kernel=k, likelihood=l)
        print regression_model
        print ff.repr_to_model(regression_model.__repr__())
        assert regression_model == ff.repr_to_model(regression_model.__repr__())
    def test_collapse_add_idempotent(self):
        k = ff.SqExpKernel()
        k1 = k.copy()
        k2 = k.copy()
        k = ff.NoiseKernel(sf=-1)
        k3 = k.copy()
        k4 = k.copy()
        k = ff.ConstKernel(sf=1)
        k5 = k.copy()
        k6 = k.copy()
        k = k1 + k2 + k3 + k4 + k5 + k6
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_additive_idempotency()
        print '\n', k.pretty_print(), '\n'

    def test_collapse_mult_idempotent(self):
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = k1 * k2
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_multiplicative_idempotency()
        assert (isinstance(k, ff.SqExpKernel)) and (k.dimension == 0)
        print '\n', k.pretty_print(), '\n'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = ff.SqExpKernel(dimension=1, lengthscale=2, sf=2)
        k3 = k.copy()
        k4 = k.copy()
        k5 = ff.NoiseKernel(sf=-1)
        k6 = ff.ConstKernel(sf=1)
        k = k1 * k2 * k3 * k4 * k5 * k5.copy() + k6 + k6.copy()
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_multiplicative_idempotency()
        print '\n', k.pretty_print(), '\n'
        k = k1 * k2 * k3 * k4 * k5 * k5.copy() * k6 * k6.copy()
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_multiplicative_idempotency()
        print '\n', k.pretty_print(), '\n'
    def test_collapse_zero(self):
        k1 = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k2 = ff.NoiseKernel(sf=-1)
        k = k1 * k2
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_multiplicative_zero()
        assert isinstance(k, ff.NoiseKernel)
        print '\n', k.pretty_print(), '\n'
        k = (k1 + k1.copy() + k1.copy() * k2.copy()) * k2
        print (k1 + k1.copy()).sf
        print (k1.copy() * k2.copy()).sf
        print (k1 + k1.copy() + k1.copy() * k2.copy()).sf
        print k.sf
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_multiplicative_zero()
        assert isinstance(k, ff.NoiseKernel)
        print '\n', k.pretty_print(), '\n'

    def test_collapse_identity(self):
        print 'collapse identity'
        k1 = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k2 = ff.ConstKernel(sf=-1)
        k = k1 * k2
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_multiplicative_identity()
        assert isinstance(k, ff.SqExpKernel)
        print '\n', k.pretty_print(), '\n'
        k1 = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k = (k1 + k1.copy() + k1.copy() * k2.copy()) * k2
        print (k1 + k1.copy()).sf
        print (k1.copy() * k2.copy()).sf
        print (k1 + k1.copy() + k1.copy() * k2.copy()).sf
        print k.sf
        print '\n', k.pretty_print(), '\n'
        k = k.collapse_multiplicative_identity()
        print '\n', k.pretty_print(), '\n'
    def test_simplified_k(self):
        print 'simplified_k'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = k1 * k2
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = ff.SqExpKernel(dimension=1, lengthscale=2, sf=2)
        k3 = k.copy()
        k4 = k.copy()
        k5 = ff.NoiseKernel(sf=-1)
        k6 = ff.ConstKernel(sf=1)
        k = k1 * k2 * k3 * k4 * k5 * k5.copy() + k6 + k6.copy() + k1.copy() * k1.copy() * k3.copy()
        print '\n', k.pretty_print(), '\n'
        k = k.simplified()
        print '\n', k.pretty_print(), '\n'

    def test_distribute_products_k(self):
        print 'distribute'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = ff.SqExpKernel(dimension=1, lengthscale=2, sf=2)
        k3 = k.copy()
        k4 = k.copy()
        k5 = ff.NoiseKernel(sf=-1)
        k6 = ff.ConstKernel(sf=1)
        k = (k1 + k2 + k3) * (k4 + k5)
        print '\n', k.pretty_print(), '\n'
        components = k.distribute_products().simplified()
        print components
        print components.collapse_additive_idempotency()
        for k in components.operands:
            print '\n', k.pretty_print(), '\n'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = ff.SqExpKernel(dimension=1, lengthscale=2, sf=2)
        k3 = k.copy()
        k4 = k.copy()
        k5 = ff.NoiseKernel(sf=-1)
        k6 = ff.ConstKernel(sf=1)
        k = (k1 * (k2 + k3)) + (k4 * k5)
        print '\n', k.pretty_print(), '\n'
        components = k.distribute_products().simplified()
        print components
        print components.collapse_additive_idempotency()
        for k in components.operands:
            print '\n', k.pretty_print(), '\n'
    def test_jitter(self):
        print 'jitter'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        print [k, k1, k2]
        assert (k == k1) and (k == k2) and (k1 == k2)
        ff.add_jitter_k([k1, k2])
        assert (not k == k1) and (not k == k2) and (not k1 == k2)
        print [k, k1, k2]

    def test_jitter_model(self):
        print 'jitter model'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        print [k, k1, k2]
        assert (k == k1) and (k == k2) and (k1 == k2)
        m1 = ff.GPModel(kernel=k1)
        m2 = ff.GPModel(kernel=k2)
        ff.add_jitter([m1, m2])
        assert (not k == k1) and (not k == k2) and (not k1 == k2)
        print [k, k1, k2]

    def test_restarts(self):
        print 'restart'
        data_shape = {'y_sd': 0, 'x_sd': [0, 2], 'x_min': [-10, -100], 'x_max': [10, 100]}
        k = ff.SqExpKernel(dimension=0)
        k1 = k.copy()
        k2 = k.copy()
        print [k, k1, k2]
        assert (k == k1) and (k == k2) and (k1 == k2)
        kernel_list = ff.add_random_restarts_k([k1, k2], data_shape=data_shape, sd=1)
        k1 = kernel_list[0]
        k2 = kernel_list[1]
        assert (not k == k1) and (not k == k2) and (not k1 == k2)
        print [k, k1, k2]

    def test_restarts_model(self):
        print 'restart model'
        data_shape = {'y_sd': 0, 'x_sd': [0, 2], 'x_min': [-10, -100], 'x_max': [10, 100]}
        k = ff.SqExpKernel(dimension=0)
        k1 = k.copy()
        k2 = k.copy()
        print [k, k1, k2]
        assert (k == k1) and (k == k2) and (k1 == k2)
        m1 = ff.GPModel(kernel=k1)
        m2 = ff.GPModel(kernel=k2)
        model_list = ff.add_random_restarts([m1, m2], n_rand=1, data_shape=data_shape, sd=1)
        k1 = model_list[0].kernel
        k2 = model_list[1].kernel
        assert (not k == k1) and (not k == k2) and (not k1 == k2)
        print [k, k1, k2]
    def test_additive_form_k(self):
        print 'additive form'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = ff.SqExpKernel(dimension=1, lengthscale=2, sf=2)
        k3 = k.copy()
        k4 = k.copy()
        k5 = ff.NoiseKernel(sf=-1)
        k6 = ff.ConstKernel(sf=1)
        k = (k1 * (k2 + k3)) + (k4 * k5)
        print '\n', k.pretty_print(), '\n'
        components = k.additive_form().simplified()
        print components
        for k in components.operands:
            print '\n', k.pretty_print(), '\n'

    def test_canonical_k(self):
        print 'canonical_k form'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = k.copy()
        k = ff.SqExpKernel(dimension=1, lengthscale=2, sf=2)
        k3 = k.copy()
        k4 = k.copy()
        k5 = ff.NoiseKernel(sf=-1)
        k6 = ff.ConstKernel(sf=1)
        k = k1 * k2 * k3 * k4 * k5 * k5.copy() + k6 + k6.copy() + k1.copy() * k1.copy() * k3.copy()
        print '\n', k.pretty_print(), '\n'
        print '\n', k.canonical().pretty_print(), '\n'

    def test_canonical_k_2(self):
        print 'canonical_k form 2'
        k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
        k1 = k.copy()
        k2 = ff.NoneKernel()
        k = ff.ChangePointKernel(operands=[k1, k2])
        print '\n', k, '\n'
        k = k.canonical()
        print '\n', k, '\n'
        assert k == k1
        k = ff.ChangePointKernel(operands=[ff.ChangePointKernel(operands=[k1, k2]), k2])
        print '\n', k, '\n'
        k = k.canonical()
        print '\n', k, '\n'
        assert k == k1
def test_hash_and_cmp(self):
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
k1 = k.copy()
k2 = k.copy()
k3 = ff.SqExpKernel(dimension=1, lengthscale=0, sf=1)
assert sorted(list(set([k1,k2,k3]))) == sorted([k1,k3])
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=1)
k4 = k.copy()
k5 = ff.NoneKernel()
k6 = ff.ChangePointKernel(operands=[k4,k5])
k7 = ff.ChangePointKernel(operands=[ff.ChangePointKernel(operands=[k4,k5]),k1])
assert sorted([k1,k2,k3,k4,k5,k6,k7]) == sorted([k7,k1,k6,k2,k5,k3,k4])
assert sorted(k.canonical() for k in [k1,k2,k3,k4,k5,k6,k7]) == sorted(k.canonical() for k in [k7,k1,k6,k2,k5,k3,k4])
assert sorted(k.additive_form() for k in [k1,k2,k3,k4,k5,k6,k7]) == sorted(k.additive_form() for k in [k7,k1,k6,k2,k5,k3,k4])
class grammar_testcase(unittest.TestCase):
def test_expand(self):
print 'expand'
print '1d'
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=0)
expanded = grammar.expand_kernels(1, [k], base_kernels='SE', rules=None)
for k in expanded:
print '\n', k.pretty_print(), '\n'
print '2d'
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=0)
expanded = grammar.expand_kernels(2, [k], base_kernels='SE', rules=None)
for k in expanded:
print '\n', k.pretty_print(), '\n'
print '3d'
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=0)
expanded = grammar.expand_kernels(3, [k], base_kernels='SE', rules=None)
for k in expanded:
print '\n', k.pretty_print(), '\n'
print '3d with two SEs'
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=0)
expanded = grammar.expand_kernels(3, [k + k.copy()], base_kernels='SE', rules=None)
for k in expanded:
print '\n', k.pretty_print(), '\n'
def test_expand_model(self):
print 'expand model'
print '2d'
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=0)
m = ff.GPModel(mean=ff.MeanZero(), kernel=k, likelihood=ff.LikGauss())
expanded = grammar.expand_models(2, [m], base_kernels='SE', rules=None)
for k in expanded:
print '\n', k.pretty_print(), '\n'
class experiment_testcase(unittest.TestCase):
def test_nan_score(self):
k = ff.SqExpKernel(dimension=0, lengthscale=0, sf=0)
m1 = ff.GPModel(kernel=k, nll=np.nan, ndata=100)
m2 = ff.GPModel(kernel=k.copy(), nll=0, ndata=100)
(not_nan, eq_nan) = experiment.remove_nan_scored_models([m1,m2], score='bic')
assert (len(not_nan) == 1) and (len(eq_nan) == 1)
class misc_testcase(unittest.TestCase):
def test_3_operands_to_binary(self):
assert len(ff.ChangePointKernel(operands=[ff.ConstKernel(), ff.ChangePointKernel(operands=[ff.ConstKernel(), ff.ConstKernel()])]).canonical().operands) == 2
def test_simplify(self):
m = ff.GPModel(mean=ff.MeanZero(), kernel=ff.SumKernel(operands=[ff.ProductKernel(operands=[ff.ConstKernel(sf=0.170186999131), ff.SqExpKernel(dimension=0, lengthscale=1.02215322228, sf=5.9042619611)]), ff.ProductKernel(operands=[ff.NoiseKernel(sf=2.43188502201), ff.ConstKernel(sf=-0.368638271154)]), ff.ProductKernel(operands=[ff.NoiseKernel(sf=1.47110516981), ff.PeriodicKernel(dimension=0, lengthscale=-1.19651800365, period=0.550394248167, sf=0.131044872864)]), ff.ProductKernel(operands=[ff.SqExpKernel(dimension=0, lengthscale=3.33346140605, sf=3.7579461353), ff.PeriodicKernel(dimension=0, lengthscale=0.669624964607, period=0.00216264543496, sf=2.41995024965)])]), likelihood=ff.LikGauss(sf=-np.inf), nll=599.59757993, ndata=144)
assert not m.simplified() == m
m = ff.GPModel(mean=ff.MeanZero(), kernel=ff.SumKernel(operands=[ff.ProductKernel(operands=[ff.ConstKernel(sf=0.170186999131), ff.SqExpKernel(dimension=0, lengthscale=1.02215322228, sf=5.9042619611)]), ff.ProductKernel(operands=[ff.NoiseKernel(sf=2.43188502201), ff.ConstKernel(sf=-0.368638271154)])]), likelihood=ff.LikGauss(sf=-np.inf), nll=599.59757993, ndata=144)
assert not m.simplified() == m
def test_param_loading(self):
k = ff.ChangePointKernel(dimension=0, location=0, steepness=0, operands=[ff.ConstKernel(sf=0), ff.ConstKernel(sf=0)])
param_vector = [1,1,1,1]
k.load_param_vector(param_vector)
assert np.all(k.param_vector == param_vector)
param_vector = [0,0,0,0]
assert not np.any(k.param_vector == param_vector)
# def test_wrong_dimension(self):
# try:
# k = fk.MaskKernelFamily(1,1,fk.SqExpKernelFamily())
# except:
# pass
# else:
# raise RuntimeError('I gave a mask kernel inconsistent number of dimensions and active dimension')
# #def test_none_dimensions(self):
# # k = fk.MaskKernelFamily(None,None,fk.SqExpKernelFamily())
# def test_addition(self):
# k = fk.SqExpKernelFamily().default() + fk.SqExpKernelFamily().default()
# assert isinstance(k, fk.SumKernel)
# def test_creation(self):
# # Check that both of these ways of creating a kernel work
# k = fk.SqExpKernelFamily().default()
# k = fk.SqExpKernelFamily.default()
# def test_addition_2(self):
# k = fk.SqExpKernelFamily().default() + fk.SqExpKernelFamily().default()
# k = k + k.copy()
# assert isinstance(k, fk.SumKernel) and (not isinstance(k.operands[0], fk.SumKernel))
# def test_multiplication(self):
# k = fk.SqExpKernelFamily().default() * fk.SqExpKernelFamily().default()
# assert isinstance(k, fk.ProductKernel)
# def test_addition_2(self):
# k = fk.SqExpKernelFamily().default() * fk.SqExpKernelFamily().default()
# k = k * k.copy()
# assert isinstance(k, fk.ProductKernel) and (not isinstance(k.operands[0], fk.ProductKernel))
# def test_defaults(self):
# k = fk.SqExpKernelFamily()
# dummy = fk.ChangePointTanhKernelFamily(operands=[k,k]).default()
# dummy = fk.ChangeBurstTanhKernelFamily(operands=[k,k]).default()
# dummy = (k.default() + k.default()).family().default()
# dummy = (k.default() * k.default()).family().default()
# def test_default_family_default(self):
# k = fk.SqExpKernelFamily()
# assert (k.default() * k.default()).family().default() == (k.default() * k.default())
# class experiment_testcase(unittest.TestCase):
# def test_nan_score(self):
# k1 = fk.ScoredKernel(fk.SqExpKernelFamily.default())
# k2 = fk.ScoredKernel(fk.SqExpKernelFamily.default(), bic_nle=0)
# (not_nan, eq_nan) = experiment.remove_nan_scored_kernels([k1,k2], score='bic')
# assert (len(not_nan) == 1) and (len(eq_nan) == 1)
# class grammar_testcase(unittest.TestCase):
# def test_type_match(self):
# g = grammar.MultiDGrammar(ndim=2)
# k = fk.MaskKernel(2,0,fk.SqExpKernelFamily.default())
# assert g.type_matches(k, 'multi')
# k = fk.MaskKernel(2,0,fk.FourierKernelFamily.default())
# assert g.type_matches(k, 'multi')
# k = k + k.copy()
# assert g.type_matches(k, 'multi')
# k = k * k.copy()
# assert g.type_matches(k, 'multi')
# k = fk.MaskKernel(2,0,fk.SqExpKernelFamily.default()).family()
# k = fk.ChangePointTanhKernelFamily(operands=[k,k]).default()
# assert g.type_matches(k, 'multi')
# k = fk.MaskKernel(2,0,fk.SqExpKernelFamily.default()).family()
# k = fk.ChangeBurstTanhKernelFamily(operands=[k,k]).default()
# assert g.type_matches(k, 'multi')
# class translation_testcase(unittest.TestCase):
# def test_SE(self):
# k = fk.SqExpKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([-1,0,1]), monotonic=0, gradient=0, unit='year')
# def test_SE_metres(self):
# k = fk.SqExpKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([-1,0,1]), monotonic=0, gradient=0, unit='metre')
# def test_SE_number(self):
# k = fk.SqExpKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([-1,0,1]), monotonic=0, gradient=0, unit='number')
# def test_SE_no_unit(self):
# k = fk.SqExpKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([-1,0,1]), monotonic=0, gradient=0, unit='')
# def test_BroadSE(self):
# k = fk.SqExpKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([0,0.5]), monotonic=0, gradient=0, unit='year')
# def test_poly(self):
# k = fk.LinKernelFamily().default() * fk.LinKernelFamily().default() * fk.LinKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([0,0.5]), monotonic=0, gradient=0, unit='year')
# def test_SEpolydecrease(self):
# k = fk.SqExpKernelFamily().default() * fk.LinKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([0.2,0.5]), monotonic=0, gradient=0, unit='year')
# def test_complicated(self):
# k = fk.SqExpKernelFamily().default() * fk.CentredPeriodicKernelFamily().default() * fk.CosineKernelFamily().default() * fk.CosineKernelFamily().default() * fk.LinKernelFamily().default() * fk.LinKernelFamily().default()
# op = [fk.ZeroKernel(), k]
# k = fk.ChangePointTanhKernel(location = 1.5, steepness=2, operands=op)
# op = [k, fk.ZeroKernel()]
# k = fk.ChangePointTanhKernel(location = 1.8, steepness=2, operands=op)
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_IMT3(self):
# k = fk.IMT3LinKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_Const(self):
# k = fk.ConstKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_ConstSE(self):
# k = fk.ConstKernelFamily().default() * fk.SqExpKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_Window(self):
# k = fk.SqExpKernelFamily().default()
# op = [k, fk.ZeroKernel()]
# k = fk.ChangeBurstTanhKernel(location = 1.5, steepness=2, width=np.log(0.2), operands=op)
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_cos(self):
# k = fk.CosineKernelFamily().default()
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_Window2(self):
# k = fk.SqExpKernelFamily().default()
# op = [fk.ZeroKernel(), k]
# k = fk.ChangeBurstTanhKernel(location = 1.5, steepness=2, width=np.log(0.2), operands=op)
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_IMT3Complicated(self):
# k = fk.SqExpKernelFamily().default() * fk.CentredPeriodicKernelFamily().default() * fk.CosineKernelFamily().default() * fk.CosineKernelFamily().default() * fk.LinKernelFamily().default() * fk.IMT3LinKernelFamily().default()
# op = [fk.ZeroKernel(), k]
# k = fk.ChangePointTanhKernel(location = 1.5, steepness=2, operands=op)
# op = [k, fk.ZeroKernel()]
# k = fk.ChangePointTanhKernel(location = 1.8, steepness=2, operands=op)
# sentences = translation.translate_additive_component(k, np.array([1,2]), monotonic=0, gradient=0, unit='year')
# def test_error_1(self):
# k = fk.MaskKernelFamily(1,0,fk.SqExpKernelFamily())
# try:
# sentences = translation.translate_additive_component(k, np.array([-1,0,1]), monotonic=0, gradient=0, unit='year')
# except:
# pass
# else:
# raise RuntimeError('I should not be able to describe a mask kernel on its own')
if __name__ == "__main__":
unittest.main()
| 42.504854 | 745 | 0.573473 | 3,528 | 26,268 | 4.162982 | 0.073129 | 0.055968 | 0.049568 | 0.037176 | 0.823585 | 0.772179 | 0.729693 | 0.701641 | 0.682372 | 0.667053 | 0 | 0.048927 | 0.256929 | 26,268 | 617 | 746 | 42.573744 | 0.70352 | 0.283006 | 0 | 0.706573 | 0 | 0 | 0.038706 | 0 | 0 | 0 | 0 | 0 | 0.084507 | 0 | null | null | 0 | 0.011737 | null | null | 0.323944 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
780cf56ce02f1a51585e0c3e4af291c4a2120b3a | 237 | py | Python | headless/__init__.py | gbrucepayne/headless | cd1729f9a4ff4f2cc5499d486f4df402df9f45e9 | [
"MIT"
] | null | null | null | headless/__init__.py | gbrucepayne/headless | cd1729f9a4ff4f2cc5499d486f4df402df9f45e9 | [
"MIT"
] | null | null | null | headless/__init__.py | gbrucepayne/headless | cd1729f9a4ff4f2cc5499d486f4df402df9f45e9 | [
"MIT"
] | null | null | null | from headless import RepeatingTimer
from headless import is_logger, is_log_handler, get_caller_name, get_wrapping_logger
from headless import get_serial_ports, validate_serial_port
from headless import get_net_interfaces, get_ip_address
| 47.4 | 84 | 0.890295 | 36 | 237 | 5.444444 | 0.555556 | 0.244898 | 0.367347 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088608 | 237 | 4 | 85 | 59.25 | 0.907407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
780dfc6d923ee69137f9515095e6c796fbe5a5e5 | 9,588 | py | Python | pizza_cutter/des_pizza_cutter/tests/test_se_image_psf.py | beckermr/pizza-cutter | 04eefd2d4b2a63975fe809c60b5c8e7e3fcf26c6 | [
"BSD-3-Clause"
] | null | null | null | pizza_cutter/des_pizza_cutter/tests/test_se_image_psf.py | beckermr/pizza-cutter | 04eefd2d4b2a63975fe809c60b5c8e7e3fcf26c6 | [
"BSD-3-Clause"
] | 194 | 2018-10-24T23:40:47.000Z | 2021-11-17T16:02:35.000Z | pizza_cutter/des_pizza_cutter/tests/test_se_image_psf.py | beckermr/pizza-cutter | 04eefd2d4b2a63975fe809c60b5c8e7e3fcf26c6 | [
"BSD-3-Clause"
] | null | null | null | import os
import numpy as np
import pytest
import galsim
import piff
from .._se_image import SEImageSlice, PIFF_STAMP_SIZE
@pytest.mark.skipif(
os.environ.get('TEST_DESDATA', None) is None,
reason=(
'SEImageSlice can only be tested if '
'test data is at TEST_DESDATA'))
@pytest.mark.parametrize('x,y', [
(np.ones(10), 10),
(np.ones((10, 10)), 10)])
def test_se_image_psf_array(se_image_data, x, y):
se_im = SEImageSlice(
source_info=se_image_data['source_info'],
psf_model=None,
wcs=se_image_data['eu_wcs'],
wcs_position_offset=1,
wcs_color=0,
psf_kwargs=None,
noise_seeds=[10],
mask_tape_bumps=False,
)
with pytest.raises(AssertionError):
se_im.get_psf_image(x, y)
with pytest.raises(AssertionError):
se_im.get_psf_image(y, x)
@pytest.mark.skipif(
os.environ.get('TEST_DESDATA', None) is None,
reason=(
'SEImageSlice can only be tested if '
'test data is at TEST_DESDATA'))
@pytest.mark.parametrize('wcs_pos_offset', [0, 1])
@pytest.mark.parametrize('eps_x', [
-0.75, -0.5, -0.25, 0.0, 0.25, 0.5, 0.75])
@pytest.mark.parametrize('eps_y', [
-0.75, -0.5, -0.25, 0.0, 0.25, 0.5, 0.75])
def test_se_image_psf_gsobject(se_image_data, eps_x, eps_y, wcs_pos_offset):
x = 10 + eps_x
y = 11 + eps_y
dx = x - np.floor(x + 0.5)
dy = y - np.floor(y + 0.5)
se_im = SEImageSlice(
source_info=se_image_data['source_info'],
psf_model=galsim.Gaussian(fwhm=0.8),
wcs=se_image_data['eu_wcs'],
wcs_position_offset=wcs_pos_offset,
wcs_color=0,
psf_kwargs=None,
noise_seeds=[10],
mask_tape_bumps=False,
)
psf_im = se_im.get_psf_image(x, y)
cen = (psf_im.shape[0] - 1) / 2
# check mean (x, y) to make sure it is in the right spot
_y, _x = np.mgrid[:psf_im.shape[0], :psf_im.shape[1]]
xbar = np.mean((_x - cen) * psf_im) / np.mean(psf_im)
ybar = np.mean((_y - cen) * psf_im) / np.mean(psf_im)
assert np.abs(xbar - dx) < 1e-3, xbar
assert np.abs(ybar - dy) < 1e-3, ybar
true_psf_im = galsim.Gaussian(fwhm=0.8).drawImage(
nx=19,
ny=19,
wcs=se_im.get_wcs_jacobian(x, y),
offset=galsim.PositionD(x=dx, y=dy)
).array
true_psf_im /= np.sum(true_psf_im)
assert np.array_equal(psf_im, true_psf_im)
@pytest.mark.skipif(
os.environ.get('TEST_DESDATA', None) is None,
reason=(
'SEImageSlice can only be tested if '
'test data is at TEST_DESDATA'))
@pytest.mark.parametrize('wcs_pos_offset', [0, 1])
@pytest.mark.parametrize('eps_x', [
-0.75, -0.5, -0.25, 0.0, 0.25, 0.5, 0.75])
@pytest.mark.parametrize('eps_y', [
-0.75, -0.5, -0.25, 0.0, 0.25, 0.5, 0.75])
@pytest.mark.parametrize('use_wcs', [False, True])
def test_se_image_psf_psfex(
se_image_data, use_wcs, eps_x, eps_y, wcs_pos_offset):
if use_wcs:
psfex = galsim.des.DES_PSFEx(
se_image_data['source_info']['psf_path'],
se_image_data['source_info']['image_path'],
)
else:
psfex = galsim.des.DES_PSFEx(
se_image_data['source_info']['psf_path'])
x = 10 + eps_x
y = 11 + eps_y
dx = x - np.floor(x + 0.5)
dy = y - np.floor(y + 0.5)
se_im = SEImageSlice(
source_info=se_image_data['source_info'],
psf_model=psfex,
wcs=se_image_data['eu_wcs'],
wcs_position_offset=wcs_pos_offset,
wcs_color=0,
psf_kwargs=None,
noise_seeds=[10],
mask_tape_bumps=False,
)
if use_wcs:
wcs = se_im.get_wcs_jacobian(x, y)
else:
wcs = galsim.PixelScale(1.0)
psf_im = se_im.get_psf_image(x, y)
cen = (psf_im.shape[0] - 1) / 2
    # check mean (x, y) to make sure the PSF sits at the expected sub-pixel offset
_y, _x = np.mgrid[:psf_im.shape[0], :psf_im.shape[1]]
xbar = np.mean((_x - cen) * psf_im) / np.mean(psf_im)
ybar = np.mean((_y - cen) * psf_im) / np.mean(psf_im)
# PSFEx is not exactly centered, so the tolerance here is bigger
assert np.abs(xbar - dx) < 1e-1, xbar
assert np.abs(ybar - dy) < 1e-1, ybar
psf = psfex.getPSF(galsim.PositionD(
x=x+wcs_pos_offset, y=y+wcs_pos_offset))
true_psf_im = psf.drawImage(
nx=psf_im.shape[0],
ny=psf_im.shape[0],
wcs=wcs,
offset=galsim.PositionD(x=dx, y=dy),
method='no_pixel',
).array
true_psf_im /= np.sum(true_psf_im)
assert np.array_equal(psf_im, true_psf_im)
def get_center_delta(x):
return x - np.floor(x+0.5)
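The `get_center_delta` helper above maps a pixel coordinate to its signed offset from the nearest integer pixel centre, which is what the tests compare the measured PSF centroid against. A standalone sketch of the same rule (with `np.floor` swapped for `math.floor`, an assumption that holds for the scalar inputs used here):

```python
import math

def get_center_delta(x):
    # Signed offset of x from the nearest integer pixel centre,
    # mirroring the np.floor-based helper above. Result lies in [-0.5, 0.5).
    return x - math.floor(x + 0.5)

print(get_center_delta(10.75))  # -> -0.25 (nearest centre is 11)
print(get_center_delta(11.25))  # -> 0.25
print(get_center_delta(11.0))   # -> 0.0 (already on a pixel centre)
```

Note the half-open range: an exact half-pixel input such as `10.5` rounds up to centre `11` and yields `-0.5`.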
@pytest.mark.skipif(
os.environ.get('TEST_DESDATA', None) is None,
reason=(
'SEImageSlice can only be tested if '
'test data is at TEST_DESDATA'))
@pytest.mark.parametrize('wcs_pos_offset', [0, 1])
@pytest.mark.parametrize('eps_x', [
-0.75, -0.50, -0.25, 0.0, 0.25, 0.50, 0.75])
@pytest.mark.parametrize('eps_y', [
-0.75, -0.50, -0.25, 0.0, 0.25, 0.50, 0.75])
def test_se_image_psf_piff(se_image_data, eps_x, eps_y, wcs_pos_offset):
x = 101 + eps_x
y = 111 + eps_y
dx = get_center_delta(x)
dy = get_center_delta(y)
psf_mod = piff.PSF.read(se_image_data['source_info']['piff_path'])
se_im = SEImageSlice(
source_info=se_image_data['source_info'],
psf_model=psf_mod,
wcs=se_image_data['eu_wcs'],
wcs_position_offset=wcs_pos_offset,
wcs_color=0,
psf_kwargs={"GI_COLOR": 0.61},
noise_seeds=[10],
mask_tape_bumps=False,
)
psf_im_cen = se_im.get_psf_image(np.floor(x+0.5), np.floor(y+0.5))
_y, _x = np.mgrid[:psf_im_cen.shape[0], :psf_im_cen.shape[1]]
xcen = np.mean(_x * psf_im_cen) / np.mean(psf_im_cen)
ycen = np.mean(_y * psf_im_cen) / np.mean(psf_im_cen)
    # check mean (x, y) to make sure the PSF sits at the expected sub-pixel offset
psf_im = se_im.get_psf_image(x, y)
_y, _x = np.mgrid[:psf_im.shape[0], :psf_im.shape[1]]
xbar = np.mean((_x - xcen) * psf_im) / np.mean(psf_im)
ybar = np.mean((_y - ycen) * psf_im) / np.mean(psf_im)
# Piff is not exactly centered, so the tolerance here is bigger
print("\ncenter offsets:", xbar, dx, ybar, dy)
assert np.abs(xbar - dx) < 1e-1, 'x: %g xbar: %g dx: %g' % (x, xbar, dx)
assert np.abs(ybar - dy) < 1e-1, 'y: %g ybar: %g dy: %g' % (y, ybar, dy)
psf_mod = piff.PSF.read(se_image_data['source_info']['piff_path'])
image = galsim.ImageD(
PIFF_STAMP_SIZE,
PIFF_STAMP_SIZE,
wcs=se_im.get_wcs_jacobian(x, y),
)
true_psf_im = psf_mod.draw(
x=x+wcs_pos_offset,
y=y+wcs_pos_offset,
image=image,
center=True,
offset=(x - np.floor(x+0.5), y - np.floor(y+0.5)),
GI_COLOR=0.61,
chipnum=se_image_data["source_info"]["ccdnum"],
).array
true_psf_im /= np.sum(true_psf_im)
assert np.array_equal(psf_im, true_psf_im)
@pytest.mark.skipif(
os.environ.get('TEST_DESDATA', None) is None,
reason=(
'SEImageSlice can only be tested if '
'test data is at TEST_DESDATA'))
@pytest.mark.parametrize('wcs_pos_offset', [1])
@pytest.mark.parametrize('eps_x', [-0.50])
@pytest.mark.parametrize('eps_y', [0.25])
def test_se_image_psf_piff_color(se_image_data, eps_x, eps_y, wcs_pos_offset):
x = 10 + eps_x
y = 11 + eps_y
dx = get_center_delta(x)
dy = get_center_delta(y)
psf_mod = piff.PSF.read(se_image_data['source_info']['piff_path'])
se_im = SEImageSlice(
source_info=se_image_data['source_info'],
psf_model=psf_mod,
wcs=se_image_data['eu_wcs'],
wcs_position_offset=wcs_pos_offset,
wcs_color=0.7,
psf_kwargs={"GI_COLOR": 0.61},
noise_seeds=[10],
mask_tape_bumps=False,
)
psf_im_cen = se_im.get_psf_image(np.floor(x+0.5), np.floor(y+0.5))
_y, _x = np.mgrid[:psf_im_cen.shape[0], :psf_im_cen.shape[1]]
xcen = np.mean(_x * psf_im_cen) / np.mean(psf_im_cen)
ycen = np.mean(_y * psf_im_cen) / np.mean(psf_im_cen)
    # check mean (x, y) to make sure the PSF sits at the expected sub-pixel offset
psf_im = se_im.get_psf_image(x, y)
_y, _x = np.mgrid[:psf_im.shape[0], :psf_im.shape[1]]
xbar = np.mean((_x - xcen) * psf_im) / np.mean(psf_im)
ybar = np.mean((_y - ycen) * psf_im) / np.mean(psf_im)
# Piff is not exactly centered, so the tolerance here is bigger
assert np.abs(xbar - dx) < 1e-1, 'x: %g xbar: %g dx: %g' % (x, xbar, dx)
assert np.abs(ybar - dy) < 1e-1, ybar
psf_mod = piff.PSF.read(se_image_data['source_info']['piff_path'])
image = galsim.ImageD(
PIFF_STAMP_SIZE,
PIFF_STAMP_SIZE,
wcs=se_im.get_wcs_jacobian(x, y),
)
true_psf_im = psf_mod.draw(
x=x+wcs_pos_offset,
y=y+wcs_pos_offset,
image=image,
center=True,
offset=(x - np.floor(x+0.5), y - np.floor(y+0.5)),
chipnum=se_image_data['source_info']['ccdnum'],
GI_COLOR=0.61,
).array
true_psf_im /= np.sum(true_psf_im)
assert np.array_equal(psf_im, true_psf_im)
# piff defaults to no color used for this test data
not_true_psf_im = psf_mod.draw(
x=x+wcs_pos_offset,
y=y+wcs_pos_offset,
stamp_size=psf_im.shape[0],
chipnum=se_image_data['source_info']['ccdnum'],
GI_COLOR=0.61,
).array
not_true_psf_im /= np.sum(not_true_psf_im)
assert not np.array_equal(psf_im, not_true_psf_im)
| 32.282828 | 78 | 0.619942 | 1,640 | 9,588 | 3.351829 | 0.087195 | 0.06549 | 0.050027 | 0.046389 | 0.891759 | 0.868838 | 0.854102 | 0.818446 | 0.810078 | 0.79789 | 0 | 0.035031 | 0.231852 | 9,588 | 296 | 79 | 32.391892 | 0.711337 | 0.046621 | 0 | 0.743902 | 0 | 0 | 0.094174 | 0 | 0 | 0 | 0 | 0 | 0.069106 | 1 | 0.02439 | false | 0 | 0.02439 | 0.004065 | 0.052846 | 0.004065 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
784a4bf9e31b15d539e9f06faf8867c88f51d76c | 1,396 | py | Python | tessled/effects/argtypes.py | hodgestar/tesseract-control-software | 41f47a4b901a0069f1745c90abe28f0778704b0e | [
"MIT"
] | 2 | 2019-07-13T14:15:30.000Z | 2020-01-04T10:44:47.000Z | tessled/effects/argtypes.py | hodgestar/tesseract-control-software | 41f47a4b901a0069f1745c90abe28f0778704b0e | [
"MIT"
] | 1 | 2018-04-11T17:29:04.000Z | 2018-04-11T17:29:04.000Z | earthstar/effects/argtypes.py | hodgestar/earthstar-control-software | 52c02338ad21907a78d9814063d9845b0e64b91e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
""" Effect argument types. """
class StrArg(object):
def __init__(self, allow_null=False):
self._allow_null = allow_null
def __call__(self, v):
if self._allow_null and v is None:
return None
return str(v)
class DictArg(object):
def __init__(self, allow_null=False):
self._allow_null = allow_null
def __call__(self, v):
if self._allow_null and v is None:
return v
assert isinstance(v, dict)
return v
class IntArg(object):
def __init__(self, default, min=None, max=None):
self._default = default
self._min = min
self._max = max
def __call__(self, v):
try:
v = int(v)
except Exception:
return self._default
if self._min is not None:
v = max(v, self._min)
if self._max is not None:
v = min(v, self._max)
return v
class FloatArg(object):
def __init__(self, default, min=None, max=None):
self._default = default
self._min = min
self._max = max
def __call__(self, v):
try:
v = float(v)
except Exception:
return self._default
if self._min is not None:
v = max(v, self._min)
if self._max is not None:
v = min(v, self._max)
return v
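A quick illustration of how the numeric coercers above behave — clamping into `[min, max]` and falling back to the default on unparseable input. This is a hypothetical usage demo, with `IntArg` restated verbatim so the snippet is self-contained:

```python
class IntArg(object):
    """Copy of the IntArg coercer defined above, restated for a standalone demo."""
    def __init__(self, default, min=None, max=None):
        self._default = default
        self._min = min
        self._max = max

    def __call__(self, v):
        try:
            v = int(v)
        except Exception:
            return self._default
        if self._min is not None:
            v = max(v, self._min)   # clamp up to the lower bound
        if self._max is not None:
            v = min(v, self._max)   # clamp down to the upper bound
        return v

# Hypothetical effect parameter: an 8-bit brightness value.
brightness = IntArg(default=128, min=0, max=255)
print(brightness("300"))   # -> 255 (clamped to max)
print(brightness(-5))      # -> 0 (clamped to min)
print(brightness("oops"))  # -> 128 (unparseable, falls back to default)
```

`FloatArg` follows the same pattern with `float()` in place of `int()`, and `StrArg`/`DictArg` only differ in whether `None` is passed through.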
| 22.885246 | 52 | 0.546562 | 185 | 1,396 | 3.8 | 0.205405 | 0.102418 | 0.110953 | 0.096728 | 0.819346 | 0.819346 | 0.819346 | 0.819346 | 0.819346 | 0.819346 | 0 | 0.001114 | 0.356734 | 1,396 | 60 | 53 | 23.266667 | 0.781737 | 0.032951 | 0 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022222 | 1 | 0.177778 | false | 0 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
78580b41028e4ad94de4a7190414cfdf8ca4fedb | 193 | py | Python | src/Play.py | TestowanieAutomatyczneUG/laboratorium-9-melkorw | a1501911943af9acf1fa81f88f69054ca1475e06 | [
"MIT"
] | null | null | null | src/Play.py | TestowanieAutomatyczneUG/laboratorium-9-melkorw | a1501911943af9acf1fa81f88f69054ca1475e06 | [
"MIT"
] | null | null | null | src/Play.py | TestowanieAutomatyczneUG/laboratorium-9-melkorw | a1501911943af9acf1fa81f88f69054ca1475e06 | [
"MIT"
] | null | null | null | class Play:
def get_time(self):
pass
def play_wav_file(self, file):
pass
def wav_was_played(self, file):
pass
def reset_wav(self, file):
pass
| 14.846154 | 35 | 0.564767 | 27 | 193 | 3.814815 | 0.444444 | 0.203884 | 0.349515 | 0.291262 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.352332 | 193 | 12 | 36 | 16.083333 | 0.824 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0.444444 | 0 | 0 | 0.555556 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
789bccda1b6a828cf65a61f0caa7e2e3f66cf804 | 2,738 | py | Python | python_test/test_parser_addition.py | lubkoll/friendly-type-erasure | 719830233a8652ccf18164653b466b0054a617f6 | [
"MIT"
] | null | null | null | python_test/test_parser_addition.py | lubkoll/friendly-type-erasure | 719830233a8652ccf18164653b466b0054a617f6 | [
"MIT"
] | 22 | 2016-08-03T16:51:10.000Z | 2016-11-23T20:53:03.000Z | python_test/test_parser_addition.py | lubkoll/friendly-type-erasure | 719830233a8652ccf18164653b466b0054a617f6 | [
"MIT"
] | null | null | null | import unittest
import type_erasure.parser_addition
single_line_test_comments = ['/// comment',
'//! comment',
'// comment',
' \n\r\t/// comment']
multi_line_test_comments = [['/** comment */'],
['/* comment */'],
['/* comment', '*/'],
['/**', '* comment', '*/']]
class TestIsComment(unittest.TestCase):
def test_is_single_line_comment(self):
for comment in single_line_test_comments:
self.assertTrue(type_erasure.parser_addition.is_single_line_comment(comment))
for comment in multi_line_test_comments:
for line in comment:
self.assertFalse(type_erasure.parser_addition.is_single_line_comment(line))
def test_is_multi_line_comment(self):
for comment in single_line_test_comments:
self.assertFalse(type_erasure.parser_addition.is_multi_line_comment(comment, in_multi_line_comment=False))
self.assertFalse(type_erasure.parser_addition.is_multi_line_comment(comment, in_multi_line_comment=True))
for comment in multi_line_test_comments:
for line in comment:
if line is comment[0]:
self.assertTrue(type_erasure.parser_addition.is_multi_line_comment(line, in_multi_line_comment=False))
self.assertFalse(type_erasure.parser_addition.is_multi_line_comment(line, in_multi_line_comment=True))
else:
self.assertFalse(type_erasure.parser_addition.is_multi_line_comment(line, in_multi_line_comment=False))
self.assertTrue(type_erasure.parser_addition.is_multi_line_comment(line, in_multi_line_comment=True))
def test_is_comment(self):
for comment in single_line_test_comments:
self.assertTrue(type_erasure.parser_addition.is_comment(comment, in_multi_line_comment=False))
self.assertTrue(type_erasure.parser_addition.is_comment(comment, in_multi_line_comment=True))
for comment in multi_line_test_comments:
for line in comment:
if line is comment[0]:
self.assertTrue(type_erasure.parser_addition.is_comment(line, in_multi_line_comment=False))
self.assertFalse(type_erasure.parser_addition.is_comment(line, in_multi_line_comment=True))
else:
self.assertFalse(type_erasure.parser_addition.is_comment(line, in_multi_line_comment=False))
self.assertTrue(type_erasure.parser_addition.is_comment(line, in_multi_line_comment=True))
if __name__ == '__main__':
unittest.main()
| 53.686275 | 123 | 0.65851 | 322 | 2,738 | 5.170807 | 0.099379 | 0.124324 | 0.182583 | 0.225225 | 0.893093 | 0.893093 | 0.848649 | 0.83964 | 0.803003 | 0.803003 | 0 | 0.000979 | 0.2542 | 2,738 | 50 | 124 | 54.76 | 0.814398 | 0 | 0 | 0.302326 | 0 | 0 | 0.041636 | 0 | 0 | 0 | 0 | 0 | 0.325581 | 1 | 0.069767 | false | 0 | 0.046512 | 0 | 0.139535 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
78a89808acceeaa265c6a8b54012d586f7dbf31e | 4,304 | py | Python | tests/test_pyros_schemas/__init__.py | pyros-dev/pyros-schemas | a460920260ee77a1b5b6d5c0b97df52f1572ff79 | [
"MIT"
] | 3 | 2018-01-01T17:10:16.000Z | 2018-11-15T15:41:46.000Z | tests/test_pyros_schemas/__init__.py | pyros-dev/pyros-schemas | a460920260ee77a1b5b6d5c0b97df52f1572ff79 | [
"MIT"
] | 7 | 2018-02-02T10:05:55.000Z | 2018-02-17T15:15:46.000Z | tests/test_pyros_schemas/__init__.py | pyros-dev/pyros-schemas | a460920260ee77a1b5b6d5c0b97df52f1572ff79 | [
"MIT"
] | 2 | 2017-09-27T09:46:31.000Z | 2018-02-02T09:37:13.000Z | from __future__ import absolute_import, division, print_function, unicode_literals
import os
import site
added_site_dir = os.path.join(os.path.dirname(os.path.dirname(__file__)), 'rosdeps')
srvs_site_dir = os.path.join(os.path.dirname(os.path.dirname(__file__)), 'rosdeps', 'ros_comm_msgs')
print("Adding site directory {0} to access std_msgs".format(added_site_dir))
site.addsitedir(added_site_dir)
site.addsitedir(srvs_site_dir)
import rosimport
rosimport.activate()
from . import msg as pyros_schemas_test_msgs
# patching (need to know the field name)
import pyros_msgs.opt_as_array
# duck-punch every generated optional-field test message type on its 'data' field
for _opt_type in (
        'bool', 'int8', 'int16', 'int32', 'int64',
        'uint8', 'uint16', 'uint32', 'uint64',
        'float32', 'float64', 'string', 'time', 'duration',
):
    pyros_msgs.opt_as_array.duck_punch(
        getattr(pyros_schemas_test_msgs, 'test_opt_{0}_as_array'.format(_opt_type)), ['data'])
import hypothesis
import hypothesis.strategies as st
import six
six_long = six.integer_types[-1]
def maybe_list(l):
    """Return ``l`` unchanged if it is None or already a list, otherwise wrap it in a one-element list."""
    return l if l is None or isinstance(l, list) else [l]
from .strategies.ros import std_msgs_types_strat_ok, std_msgs_dicts_strat_ok
from .strategies.ros import pyros_schemas_opttypes_strat_ok, pyros_schemas_dicts_strat_ok
#
# def proper_basic_msg_strategy_selector(*msg_types):
# """Accept a (list of) rostype and return it with the matching strategy for ros message"""
# # TODO : break on error (type not in map)
# # we use a list comprehension here to avoid creating a generator (tuple comprehension)
# return tuple([(msg_type, std_msgs_types_strat_ok.get(msg_type)) for msg_type in msg_types])
#
#
# def proper_basic_dict_strategy_selector(*msg_types):
# """Accept a (list of) rostype and return it with the matching strategy for dict"""
# # TODO : break on error (type not in map)
# # we use a list comprehension here to avoid creating a generator (tuple comprehension)
# return tuple([(msg_type, std_msgs_dicts_strat_ok.get(msg_type)) for msg_type in msg_types])
#
#
# def proper_basic_optmsg_strategy_selector(*msg_types):
# """Accept a (list of) rostype and return it with the matching strategy for ros message"""
# # TODO : break on error (type not in map)
# # we use a list comprehension here to avoid creating a generator (tuple comprehension)
# return tuple([(msg_type, pyros_schemas_opttypes_strat_ok.get(msg_type)) for msg_type in msg_types])
#
#
# def proper_basic_optdict_strategy_selector(*msg_types):
# """Accept a (list of) rostype and return it with the matching strategy for dict"""
# # TODO : break on error (type not in map)
# # we use a list comprehension here to avoid creating a generator (tuple comprehension)
# return tuple([(msg_type, pyros_schemas_dicts_strat_ok.get(msg_type)) for msg_type in msg_types])
#
#
# def proper_basic_optdata_strategy_selector(*field_types):
# """Accept a (list of) rostype and return it with the matching strategy for data"""
# # TODO : break on error (type not in map)
# # we use a list comprehension here to avoid creating a generator (tuple comprehension)
# return tuple([(field_type, optfield_strat_ok.get(field_type)) for field_type in field_types])
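The commented-out selectors above all share the same shape; as a minimal sketch, they could collapse into one helper with the TODO ("break on error: type not in map") addressed by failing fast. `strategy_selector` and `strategy_map` are assumed names, with `strategy_map` standing in for maps like `std_msgs_types_strat_ok`:

```python
def strategy_selector(strategy_map, *msg_types):
    """Pair each rostype with its strategy from strategy_map, failing fast on unknown types."""
    missing = [msg_type for msg_type in msg_types if msg_type not in strategy_map]
    if missing:
        raise KeyError("no strategy registered for: {0}".format(", ".join(missing)))
    # list comprehension (not a generator expression) so the tuple is built eagerly
    return tuple([(msg_type, strategy_map[msg_type]) for msg_type in msg_types])
```

Each of the commented selectors would then be a partial application over its own map, e.g. `strategy_selector(std_msgs_dicts_strat_ok, *msg_types)`.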
| 44.833333 | 105 | 0.787872 | 705 | 4,304 | 4.42695 | 0.175887 | 0.065043 | 0.076898 | 0.096123 | 0.779558 | 0.724127 | 0.724127 | 0.724127 | 0.724127 | 0.724127 | 0 | 0.005263 | 0.1171 | 4,304 | 95 | 106 | 45.305263 | 0.816053 | 0.46027 | 0 | 0 | 0 | 0 | 0.055873 | 0 | 0 | 0 | 0 | 0.010526 | 0 | 1 | 0.029412 | false | 0 | 0.352941 | 0 | 0.411765 | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
1545279bcb5a12955234366bb5649f42d39382ee | 1,440 | py | Python | server/tasks/migrations/0002_auto_20180509_1601.py | akshlu/taskboard | ee18ea517321a95685c4c6e98ed7035627de425a | [
"MIT"
] | null | null | null | server/tasks/migrations/0002_auto_20180509_1601.py | akshlu/taskboard | ee18ea517321a95685c4c6e98ed7035627de425a | [
"MIT"
] | null | null | null | server/tasks/migrations/0002_auto_20180509_1601.py | akshlu/taskboard | ee18ea517321a95685c4c6e98ed7035627de425a | [
"MIT"
] | null | null | null | # Generated by Django 2.0.5 on 2018-05-09 16:01
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('tasks', '0001_initial'),
]
operations = [
migrations.AddField(
model_name='epic',
name='description',
field=models.TextField(default=''),
),
migrations.AddField(
model_name='epic',
name='name',
field=models.TextField(default='No Name'),
),
migrations.AddField(
model_name='project',
name='description',
field=models.TextField(default=''),
),
migrations.AddField(
model_name='project',
name='name',
field=models.TextField(default='No Name'),
),
migrations.AddField(
model_name='subtask',
name='description',
field=models.TextField(default=''),
),
migrations.AddField(
model_name='subtask',
name='name',
field=models.TextField(default='No Name'),
),
migrations.AddField(
model_name='task',
name='description',
field=models.TextField(default=''),
),
migrations.AddField(
model_name='task',
name='name',
field=models.TextField(default='No Name'),
),
]
| 26.666667 | 54 | 0.513889 | 123 | 1,440 | 5.943089 | 0.292683 | 0.19699 | 0.25171 | 0.295486 | 0.80985 | 0.80985 | 0.712722 | 0.712722 | 0.656635 | 0.656635 | 0 | 0.020518 | 0.356944 | 1,440 | 53 | 55 | 27.169811 | 0.768898 | 0.03125 | 0 | 0.851064 | 1 | 0 | 0.106963 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.021277 | 0 | 0.085106 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
154efb0ce384cec38d429dc5beadc2c4f99a390f | 144 | py | Python | car/debug.py | jieguangzhou/spider | 81f25fd19753f1a50b4e442b078fdc2dc31fda12 | [
"MIT"
] | null | null | null | car/debug.py | jieguangzhou/spider | 81f25fd19753f1a50b4e442b078fdc2dc31fda12 | [
"MIT"
] | null | null | null | car/debug.py | jieguangzhou/spider | 81f25fd19753f1a50b4e442b078fdc2dc31fda12 | [
"MIT"
] | null | null | null | from scrapy import cmdline
# cmdline.execute("scrapy crawl autohome_brand".split())
cmdline.execute("scrapy crawl autohome_car_config".split()) | 36 | 59 | 0.805556 | 19 | 144 | 5.947368 | 0.578947 | 0.247788 | 0.353982 | 0.442478 | 0.584071 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076389 | 144 | 4 | 59 | 36 | 0.849624 | 0.375 | 0 | 0 | 0 | 0 | 0.359551 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
15a431a689e584783d0ce411ed84956d9415a66a | 7,749 | py | Python | users/tests.py | wizath/django-cookiejwt | 8a749f1e9b8e6ce1cf061aa28c2621606279aae3 | [
"MIT"
] | null | null | null | users/tests.py | wizath/django-cookiejwt | 8a749f1e9b8e6ce1cf061aa28c2621606279aae3 | [
"MIT"
] | 8 | 2020-02-12T00:05:43.000Z | 2021-09-22T17:52:01.000Z | users/tests.py | wizath/django-cookiejwt | 8a749f1e9b8e6ce1cf061aa28c2621606279aae3 | [
"MIT"
] | null | null | null | import json
import os
from http import cookies
import datetime
from rest_framework import status
from rest_framework.test import APITestCase
from rest_framework_simplejwt.tokens import AccessToken, RefreshToken
from rest_framework_simplejwt.settings import api_settings
from users.authentication import CookieAccessTokenAuthentication
from users.models import User
class TestCookieTokenVerify(APITestCase):
def setUp(self):
user = User(username='testuser', email='test@test.com')
user.set_password('testpassword')
user.save()
def test_cookie_token_verify(self):
u = User.objects.first()
token = AccessToken.for_user(u)
token_cookie = cookies.SimpleCookie({'access_token': token})
self.client.cookies = token_cookie
response = self.client.get('/api/token/verify')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data['user_id'], u.id)
def test_cookie_token_verify_wrong_token(self):
token_cookie = cookies.SimpleCookie({'access_token': str(os.urandom(32))})
self.client.cookies = token_cookie
response = self.client.get('/api/token/verify')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_cookie_token_verify_no_cookie(self):
response = self.client.get('/api/token/verify')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
class TestCookieTokenObtain(APITestCase):
def setUp(self):
user = User(username='testuser', email='test@test.com')
user.set_password('testpassword')
user.save()
def test_cookie_tokens_obtain_session(self):
response = self.client.post('/api/token', json.dumps({
'username': 'testuser',
'password': 'testpassword',
'remember': False
}), content_type="application/json")
raw_token = response.client.cookies['access_token']
# Morsel dict
# {'expires': 'Mon, 18 Nov 2019 23:45:35 GMT', 'path': '/', 'comment': '', 'domain': '', 'max-age': 300, 'secure': '', 'httponly': True, 'version': '', 'samesite': ''}
self.assertTrue(raw_token['httponly'])
self.assertEqual(raw_token['expires'], '')
raw_refresh = response.client.cookies['refresh_token']
self.assertTrue(raw_refresh['httponly'])
self.assertEqual(raw_refresh['expires'], '')
backend = CookieAccessTokenAuthentication()
validated_token = backend.get_validated_token(raw_token.value)
user = backend.get_user(validated_token)
self.assertEqual(user.id, 1)
def test_cookie_tokens_obtain_remember(self):
response = self.client.post('/api/token', json.dumps({
'username': 'testuser',
'password': 'testpassword',
'remember': True
}), content_type="application/json")
raw_token = response.client.cookies['access_token']
# Morsel dict
# {'expires': 'Mon, 18 Nov 2019 23:45:35 GMT', 'path': '/', 'comment': '', 'domain': '', 'max-age': 300, 'secure': '', 'httponly': True, 'version': '', 'samesite': ''}
self.assertTrue(raw_token['httponly'])
dt = datetime.datetime.strptime(raw_token['expires'], "%a, %d %b %Y %H:%M:%S %Z")
delta = dt - datetime.datetime.now()
expire = datetime.timedelta(seconds=round(delta.seconds, -2))
self.assertEqual(expire, api_settings.ACCESS_TOKEN_LIFETIME)
raw_refresh = response.client.cookies['refresh_token']
self.assertTrue(raw_refresh['httponly'])
dt = datetime.datetime.strptime(raw_refresh['expires'], "%a, %d %b %Y %H:%M:%S %Z")
delta = dt - datetime.datetime.now()
expire = datetime.timedelta(seconds=round(delta.seconds, -2))
self.assertEqual(expire, api_settings.REFRESH_TOKEN_LIFETIME)
backend = CookieAccessTokenAuthentication()
validated_token = backend.get_validated_token(raw_token.value)
user = backend.get_user(validated_token)
self.assertEqual(user.id, 1)
def test_cookie_tokens_obtain_wrong_password(self):
response = self.client.post('/api/token', json.dumps({
'username': 'testuser',
'password': 'wrongpassword',
'remember': False
}), content_type="application/json")
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_cookie_tokens_obtain_no_exisiting_user(self):
response = self.client.post('/api/token', json.dumps({
'username': 'nonexistinguser',
'password': 'testpassword',
'remember': False
}), content_type="application/json")
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_cookie_tokens_obtain_no_data(self):
response = self.client.post('/api/token', content_type="application/json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_cookie_tokens_obtain_wrong_method(self):
response = self.client.get('/api/token', content_type="application/json")
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
class TestCookieTokenRefresh(APITestCase):
def setUp(self):
user = User(username='testuser', email='test@test.com')
user.set_password('testpassword')
user.save()
def test_token_refresh_endpoint_no_cookie(self):
response = self.client.post('/api/token/refresh')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_token_refresh_endpoint_bad_refresh_cookie(self):
token_cookie = cookies.SimpleCookie({'refresh_token': str(os.urandom(32))})
self.client.cookies = token_cookie
response = self.client.post('/api/token/refresh')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_cookie_token_refresh(self):
u = User.objects.first()
token = RefreshToken.for_user(u)
token_cookie = cookies.SimpleCookie({'refresh_token': token})
self.client.cookies = token_cookie
# verify status code
response = self.client.post('/api/token/refresh')
self.assertEqual(response.status_code, status.HTTP_200_OK)
# verify token
raw_token = response.client.cookies['access_token']
backend = CookieAccessTokenAuthentication()
validated_token = backend.get_validated_token(raw_token.value)
user = backend.get_user(validated_token)
self.assertEqual(user.id, 1)
self.assertTrue(raw_token['httponly'])
dt = datetime.datetime.strptime(raw_token['expires'], "%a, %d %b %Y %H:%M:%S %Z")
delta = dt - datetime.datetime.now()
expire = datetime.timedelta(seconds=round(delta.seconds, -2))
self.assertEqual(expire, api_settings.ACCESS_TOKEN_LIFETIME)
class TestCookieTokenClear(APITestCase):
def setUp(self):
user = User(username='testuser', email='test@test.com')
user.set_password('testpassword')
user.save()
    def test_token_clear_endpoint_clears_both_cookies(self):
        # both cookies must live in a single SimpleCookie; assigning a second
        # SimpleCookie would overwrite the first instead of adding to it
        token_cookie = cookies.SimpleCookie({'refresh_token': str(os.urandom(32)),
                                             'access_token': str(os.urandom(32))})
self.client.cookies = token_cookie
response = self.client.post('/api/token/clear')
self.assertEqual(response.status_code, status.HTTP_200_OK)
raw_access = response.client.cookies['access_token']
raw_refresh = response.client.cookies['refresh_token']
self.assertEqual(raw_access.value, "")
self.assertEqual(raw_refresh.value, "")
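The expiry check above (parse the Morsel's `expires` string, subtract now, round to 100 seconds) is repeated in several tests; a sketch of a shared helper for this suite (the names `COOKIE_DATE_FMT` and `cookie_lifetime` are assumptions, not part of the original code):

```python
import datetime

COOKIE_DATE_FMT = "%a, %d %b %Y %H:%M:%S %Z"  # e.g. 'Mon, 18 Nov 2019 23:45:35 GMT'


def cookie_lifetime(expires, now=None):
    """Remaining lifetime of a cookie 'expires' string, rounded to 100 seconds."""
    now = now if now is not None else datetime.datetime.now()
    expiry = datetime.datetime.strptime(expires, COOKIE_DATE_FMT)
    delta = expiry - now
    return datetime.timedelta(seconds=round(delta.seconds, -2))
```

A test could then assert `self.assertEqual(cookie_lifetime(raw_token['expires']), api_settings.ACCESS_TOKEN_LIFETIME)` in place of the four-line dance.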
| 40.784211 | 175 | 0.67712 | 894 | 7,749 | 5.657718 | 0.147651 | 0.065243 | 0.046263 | 0.063068 | 0.845987 | 0.828391 | 0.801107 | 0.757019 | 0.742784 | 0.731317 | 0 | 0.012348 | 0.195251 | 7,749 | 189 | 176 | 41 | 0.798749 | 0.049942 | 0 | 0.683453 | 0 | 0 | 0.123589 | 0 | 0 | 0 | 0 | 0 | 0.194245 | 1 | 0.122302 | false | 0.064748 | 0.071942 | 0 | 0.223022 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
15b922a7548680b63fa816709a05c04219eba710 | 41 | py | Python | src/__init__.py | Loggi-pro/prettier-size-printer | 77ae84da60aeb1b803844f79e19c1ddca9836e46 | [
"MIT"
] | null | null | null | src/__init__.py | Loggi-pro/prettier-size-printer | 77ae84da60aeb1b803844f79e19c1ddca9836e46 | [
"MIT"
] | null | null | null | src/__init__.py | Loggi-pro/prettier-size-printer | 77ae84da60aeb1b803844f79e19c1ddca9836e46 | [
"MIT"
] | null | null | null | from .size_printer import run, Arguments
| 20.5 | 40 | 0.829268 | 6 | 41 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121951 | 41 | 1 | 41 | 41 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
ec761a5514f7fce967741ac76031384c21c7b884 | 30,704 | py | Python | sdk/python/pulumi_azure/core/template_deployment.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2018-06-18T00:19:44.000Z | 2022-02-20T05:32:57.000Z | sdk/python/pulumi_azure/core/template_deployment.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 663 | 2018-06-18T21:08:46.000Z | 2022-03-31T20:10:11.000Z | sdk/python/pulumi_azure/core/template_deployment.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 41 | 2018-07-19T22:37:38.000Z | 2022-03-14T10:56:26.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['TemplateDeploymentArgs', 'TemplateDeployment']
@pulumi.input_type
class TemplateDeploymentArgs:
def __init__(__self__, *,
deployment_mode: pulumi.Input[str],
resource_group_name: pulumi.Input[str],
name: Optional[pulumi.Input[str]] = None,
parameters: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
parameters_body: Optional[pulumi.Input[str]] = None,
template_body: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a TemplateDeployment resource.
:param pulumi.Input[str] deployment_mode: Specifies the mode that is used to deploy resources. This value could be either `Incremental` or `Complete`.
Note that you will almost *always* want this to be set to `Incremental` otherwise the deployment will destroy all infrastructure not
specified within the template, and this provider will not be aware of this.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to
create the template deployment.
:param pulumi.Input[str] name: Specifies the name of the template deployment. Changing this forces a
new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] parameters: Specifies the name and value pairs that define the deployment parameters for the template.
:param pulumi.Input[str] parameters_body: Specifies a valid Azure JSON parameters file that define the deployment parameters. It can contain KeyVault references
:param pulumi.Input[str] template_body: Specifies the JSON definition for the template.
"""
pulumi.set(__self__, "deployment_mode", deployment_mode)
pulumi.set(__self__, "resource_group_name", resource_group_name)
if name is not None:
pulumi.set(__self__, "name", name)
if parameters is not None:
pulumi.set(__self__, "parameters", parameters)
if parameters_body is not None:
pulumi.set(__self__, "parameters_body", parameters_body)
if template_body is not None:
pulumi.set(__self__, "template_body", template_body)
@property
@pulumi.getter(name="deploymentMode")
def deployment_mode(self) -> pulumi.Input[str]:
"""
Specifies the mode that is used to deploy resources. This value could be either `Incremental` or `Complete`.
Note that you will almost *always* want this to be set to `Incremental` otherwise the deployment will destroy all infrastructure not
specified within the template, and this provider will not be aware of this.
"""
return pulumi.get(self, "deployment_mode")
@deployment_mode.setter
def deployment_mode(self, value: pulumi.Input[str]):
pulumi.set(self, "deployment_mode", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the resource group in which to
create the template deployment.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the template deployment. Changing this forces a
new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def parameters(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Specifies the name and value pairs that define the deployment parameters for the template.
"""
return pulumi.get(self, "parameters")
@parameters.setter
def parameters(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "parameters", value)
@property
@pulumi.getter(name="parametersBody")
def parameters_body(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a valid Azure JSON parameters file that define the deployment parameters. It can contain KeyVault references
"""
return pulumi.get(self, "parameters_body")
@parameters_body.setter
def parameters_body(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "parameters_body", value)
@property
@pulumi.getter(name="templateBody")
def template_body(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the JSON definition for the template.
"""
return pulumi.get(self, "template_body")
@template_body.setter
def template_body(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "template_body", value)
@pulumi.input_type
class _TemplateDeploymentState:
def __init__(__self__, *,
deployment_mode: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
outputs: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
parameters: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
parameters_body: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
template_body: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering TemplateDeployment resources.
:param pulumi.Input[str] deployment_mode: Specifies the mode that is used to deploy resources. This value could be either `Incremental` or `Complete`.
Note that you will almost *always* want this to be set to `Incremental` otherwise the deployment will destroy all infrastructure not
specified within the template, and this provider will not be aware of this.
:param pulumi.Input[str] name: Specifies the name of the template deployment. Changing this forces a
new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] outputs: A map of supported scalar output types returned from the deployment (currently, Azure Template Deployment outputs of type String, Int and Bool are supported, and are converted to strings - others will be ignored) and can be accessed using `.outputs["name"]`.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] parameters: Specifies the name and value pairs that define the deployment parameters for the template.
:param pulumi.Input[str] parameters_body: Specifies a valid Azure JSON parameters file that define the deployment parameters. It can contain KeyVault references
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to
create the template deployment.
:param pulumi.Input[str] template_body: Specifies the JSON definition for the template.
"""
if deployment_mode is not None:
pulumi.set(__self__, "deployment_mode", deployment_mode)
if name is not None:
pulumi.set(__self__, "name", name)
if outputs is not None:
pulumi.set(__self__, "outputs", outputs)
if parameters is not None:
pulumi.set(__self__, "parameters", parameters)
if parameters_body is not None:
pulumi.set(__self__, "parameters_body", parameters_body)
if resource_group_name is not None:
pulumi.set(__self__, "resource_group_name", resource_group_name)
if template_body is not None:
pulumi.set(__self__, "template_body", template_body)
@property
@pulumi.getter(name="deploymentMode")
def deployment_mode(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the mode that is used to deploy resources. This value could be either `Incremental` or `Complete`.
Note that you will almost *always* want this to be set to `Incremental` otherwise the deployment will destroy all infrastructure not
specified within the template, and this provider will not be aware of this.
"""
return pulumi.get(self, "deployment_mode")
@deployment_mode.setter
def deployment_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "deployment_mode", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the template deployment. Changing this forces a
new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def outputs(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A map of supported scalar output types returned from the deployment (currently, Azure Template Deployment outputs of type String, Int and Bool are supported, and are converted to strings - others will be ignored) and can be accessed using `.outputs["name"]`.
"""
return pulumi.get(self, "outputs")
@outputs.setter
def outputs(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "outputs", value)
@property
@pulumi.getter
def parameters(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Specifies the name and value pairs that define the deployment parameters for the template.
"""
return pulumi.get(self, "parameters")
@parameters.setter
def parameters(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "parameters", value)
@property
@pulumi.getter(name="parametersBody")
def parameters_body(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a valid Azure JSON parameters file that define the deployment parameters. It can contain KeyVault references
"""
return pulumi.get(self, "parameters_body")
@parameters_body.setter
def parameters_body(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "parameters_body", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the resource group in which to
create the template deployment.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="templateBody")
def template_body(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the JSON definition for the template.
"""
return pulumi.get(self, "template_body")
@template_body.setter
def template_body(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "template_body", value)
class TemplateDeployment(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
deployment_mode: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
parameters: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
parameters_body: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
template_body: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages a template deployment of resources
> **Note on ARM Template Deployments:** Due to the way the underlying Azure API is designed, this provider can only manage the deployment of the ARM Template - and not any resources which are created by it.
This means that when deleting the `core.TemplateDeployment` resource, this provider will only remove the reference to the deployment, whilst leaving any resources created by that ARM Template Deployment.
One workaround for this is to use a unique Resource Group for each ARM Template Deployment, which means deleting the Resource Group would contain any resources created within it - however this isn't ideal. [More information](https://docs.microsoft.com/en-us/rest/api/resources/deployments#Deployments_Delete).
## Example Usage
> **Note:** This example uses Storage Accounts and Public IP's which are natively supported by this provider - we'd highly recommend using the Native Resources where possible instead rather than an ARM Template, for the reasons outlined above.
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_template_deployment = azure.core.TemplateDeployment("exampleTemplateDeployment",
resource_group_name=example_resource_group.name,
template_body=\"\"\"{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"storageAccountType": {
"type": "string",
"defaultValue": "Standard_LRS",
"allowedValues": [
"Standard_LRS",
"Standard_GRS",
"Standard_ZRS"
],
"metadata": {
"description": "Storage Account type"
}
}
},
"variables": {
"location": "[resourceGroup().location]",
"storageAccountName": "[concat(uniquestring(resourceGroup().id), 'storage')]",
"publicIPAddressName": "[concat('myPublicIp', uniquestring(resourceGroup().id))]",
"publicIPAddressType": "Dynamic",
"apiVersion": "2015-06-15",
"dnsLabelPrefix": "example-acctest"
},
"resources": [
{
"type": "Microsoft.Storage/storageAccounts",
"name": "[variables('storageAccountName')]",
"apiVersion": "[variables('apiVersion')]",
"location": "[variables('location')]",
"properties": {
"accountType": "[parameters('storageAccountType')]"
}
},
{
"type": "Microsoft.Network/publicIPAddresses",
"apiVersion": "[variables('apiVersion')]",
"name": "[variables('publicIPAddressName')]",
"location": "[variables('location')]",
"properties": {
"publicIPAllocationMethod": "[variables('publicIPAddressType')]",
"dnsSettings": {
"domainNameLabel": "[variables('dnsLabelPrefix')]"
}
}
}
],
"outputs": {
"storageAccountName": {
"type": "string",
"value": "[variables('storageAccountName')]"
}
}
}
\"\"\",
parameters={
"storageAccountType": "Standard_GRS",
},
deployment_mode="Incremental")
pulumi.export("storageAccountName", example_template_deployment.outputs["storageAccountName"])
```
## Note
This provider does not know about the individual resources created by Azure using a deployment template and therefore cannot delete these resources during a destroy. Destroying a template deployment removes the associated deployment operations, but will not delete the Azure resources created by the deployment. In order to delete these resources, the containing resource group must also be destroyed. [More information](https://docs.microsoft.com/en-us/rest/api/resources/deployments#Deployments_Delete).
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] deployment_mode: Specifies the mode that is used to deploy resources. This value could be either `Incremental` or `Complete`.
Note that you will almost *always* want this to be set to `Incremental` otherwise the deployment will destroy all infrastructure not
specified within the template, and this provider will not be aware of this.
:param pulumi.Input[str] name: Specifies the name of the template deployment. Changing this forces a
new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] parameters: Specifies the name and value pairs that define the deployment parameters for the template.
:param pulumi.Input[str] parameters_body: Specifies a valid Azure JSON parameters file that define the deployment parameters. It can contain KeyVault references
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to
create the template deployment.
:param pulumi.Input[str] template_body: Specifies the JSON definition for the template.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: TemplateDeploymentArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a template deployment of resources
> **Note on ARM Template Deployments:** Due to the way the underlying Azure API is designed, this provider can only manage the deployment of the ARM Template - and not any resources which are created by it.
This means that when deleting the `core.TemplateDeployment` resource, this provider will only remove the reference to the deployment, whilst leaving any resources created by that ARM Template Deployment.
One workaround for this is to use a unique Resource Group for each ARM Template Deployment, which means deleting the Resource Group would contain any resources created within it - however this isn't ideal. [More information](https://docs.microsoft.com/en-us/rest/api/resources/deployments#Deployments_Delete).
## Example Usage
> **Note:** This example uses Storage Accounts and Public IP's which are natively supported by this provider - we'd highly recommend using the Native Resources where possible instead rather than an ARM Template, for the reasons outlined above.
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_template_deployment = azure.core.TemplateDeployment("exampleTemplateDeployment",
resource_group_name=example_resource_group.name,
template_body=\"\"\"{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"storageAccountType": {
"type": "string",
"defaultValue": "Standard_LRS",
"allowedValues": [
"Standard_LRS",
"Standard_GRS",
"Standard_ZRS"
],
"metadata": {
"description": "Storage Account type"
}
}
},
"variables": {
"location": "[resourceGroup().location]",
"storageAccountName": "[concat(uniquestring(resourceGroup().id), 'storage')]",
"publicIPAddressName": "[concat('myPublicIp', uniquestring(resourceGroup().id))]",
"publicIPAddressType": "Dynamic",
"apiVersion": "2015-06-15",
"dnsLabelPrefix": "example-acctest"
},
"resources": [
{
"type": "Microsoft.Storage/storageAccounts",
"name": "[variables('storageAccountName')]",
"apiVersion": "[variables('apiVersion')]",
"location": "[variables('location')]",
"properties": {
"accountType": "[parameters('storageAccountType')]"
}
},
{
"type": "Microsoft.Network/publicIPAddresses",
"apiVersion": "[variables('apiVersion')]",
"name": "[variables('publicIPAddressName')]",
"location": "[variables('location')]",
"properties": {
"publicIPAllocationMethod": "[variables('publicIPAddressType')]",
"dnsSettings": {
"domainNameLabel": "[variables('dnsLabelPrefix')]"
}
}
}
],
"outputs": {
"storageAccountName": {
"type": "string",
"value": "[variables('storageAccountName')]"
}
}
}
\"\"\",
parameters={
"storageAccountType": "Standard_GRS",
},
deployment_mode="Incremental")
pulumi.export("storageAccountName", example_template_deployment.outputs["storageAccountName"])
```
## Note
This provider does not know about the individual resources created by Azure using a deployment template and therefore cannot delete these resources during a destroy. Destroying a template deployment removes the associated deployment operations, but will not delete the Azure resources created by the deployment. In order to delete these resources, the containing resource group must also be destroyed. [More information](https://docs.microsoft.com/en-us/rest/api/resources/deployments#Deployments_Delete).
:param str resource_name: The name of the resource.
:param TemplateDeploymentArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
        ...
    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(TemplateDeploymentArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)
    def _internal_init(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 deployment_mode: Optional[pulumi.Input[str]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 parameters: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
                 parameters_body: Optional[pulumi.Input[str]] = None,
                 resource_group_name: Optional[pulumi.Input[str]] = None,
                 template_body: Optional[pulumi.Input[str]] = None,
                 __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = TemplateDeploymentArgs.__new__(TemplateDeploymentArgs)

            if deployment_mode is None and not opts.urn:
                raise TypeError("Missing required property 'deployment_mode'")
            __props__.__dict__["deployment_mode"] = deployment_mode
            __props__.__dict__["name"] = name
            __props__.__dict__["parameters"] = parameters
            __props__.__dict__["parameters_body"] = parameters_body
            if resource_group_name is None and not opts.urn:
                raise TypeError("Missing required property 'resource_group_name'")
            __props__.__dict__["resource_group_name"] = resource_group_name
            __props__.__dict__["template_body"] = template_body
            __props__.__dict__["outputs"] = None
        super(TemplateDeployment, __self__).__init__(
            'azure:core/templateDeployment:TemplateDeployment',
            resource_name,
            __props__,
            opts)
    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            deployment_mode: Optional[pulumi.Input[str]] = None,
            name: Optional[pulumi.Input[str]] = None,
            outputs: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
            parameters: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
            parameters_body: Optional[pulumi.Input[str]] = None,
            resource_group_name: Optional[pulumi.Input[str]] = None,
            template_body: Optional[pulumi.Input[str]] = None) -> 'TemplateDeployment':
        """
        Get an existing TemplateDeployment resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to look up.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] deployment_mode: Specifies the mode that is used to deploy resources. This value can be either `Incremental` or `Complete`.
               Note that you will almost *always* want this to be set to `Incremental`, otherwise the deployment will destroy all infrastructure not
               specified within the template, and this provider will not be aware of this.
        :param pulumi.Input[str] name: Specifies the name of the template deployment. Changing this forces a
               new resource to be created.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] outputs: A map of supported scalar output types returned from the deployment (currently, Azure Template Deployment outputs of type String, Int and Bool are supported, and are converted to strings - others will be ignored). These can be accessed using `.outputs["name"]`.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] parameters: Specifies the name and value pairs that define the deployment parameters for the template.
        :param pulumi.Input[str] parameters_body: Specifies a valid Azure JSON parameters file that defines the deployment parameters. It can contain KeyVault references.
        :param pulumi.Input[str] resource_group_name: The name of the resource group in which to
               create the template deployment.
        :param pulumi.Input[str] template_body: Specifies the JSON definition for the template.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _TemplateDeploymentState.__new__(_TemplateDeploymentState)

        __props__.__dict__["deployment_mode"] = deployment_mode
        __props__.__dict__["name"] = name
        __props__.__dict__["outputs"] = outputs
        __props__.__dict__["parameters"] = parameters
        __props__.__dict__["parameters_body"] = parameters_body
        __props__.__dict__["resource_group_name"] = resource_group_name
        __props__.__dict__["template_body"] = template_body
        return TemplateDeployment(resource_name, opts=opts, __props__=__props__)
    @property
    @pulumi.getter(name="deploymentMode")
    def deployment_mode(self) -> pulumi.Output[str]:
        """
        Specifies the mode that is used to deploy resources. This value can be either `Incremental` or `Complete`.
        Note that you will almost *always* want this to be set to `Incremental`, otherwise the deployment will destroy all infrastructure not
        specified within the template, and this provider will not be aware of this.
        """
        return pulumi.get(self, "deployment_mode")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[str]:
        """
        Specifies the name of the template deployment. Changing this forces a
        new resource to be created.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter
    def outputs(self) -> pulumi.Output[Mapping[str, str]]:
        """
        A map of supported scalar output types returned from the deployment (currently, Azure Template Deployment outputs of type String, Int and Bool are supported, and are converted to strings - others will be ignored). These can be accessed using `.outputs["name"]`.
        """
        return pulumi.get(self, "outputs")

    @property
    @pulumi.getter
    def parameters(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
        """
        Specifies the name and value pairs that define the deployment parameters for the template.
        """
        return pulumi.get(self, "parameters")

    @property
    @pulumi.getter(name="parametersBody")
    def parameters_body(self) -> pulumi.Output[Optional[str]]:
        """
        Specifies a valid Azure JSON parameters file that defines the deployment parameters. It can contain KeyVault references.
        """
        return pulumi.get(self, "parameters_body")

    @property
    @pulumi.getter(name="resourceGroupName")
    def resource_group_name(self) -> pulumi.Output[str]:
        """
        The name of the resource group in which to
        create the template deployment.
        """
        return pulumi.get(self, "resource_group_name")

    @property
    @pulumi.getter(name="templateBody")
    def template_body(self) -> pulumi.Output[str]:
        """
        Specifies the JSON definition for the template.
        """
        return pulumi.get(self, "template_body")

# === tests/gui_tests/test_gui_methods.py (repo: debrief/pepys-import, licence: Apache-2.0) ===
from unittest.mock import Mock
import pytest
from pepys_admin.maintenance.gui import MaintenanceGUI
from pepys_import.core.store.db_status import TableTypes
# These tests only work properly if pytest is run with the -s option
# that stops pytest trying to change where stdin is pointing to.
# This is because the constructor of the prompt_toolkit Application class
# tries to sort out input/output terminals, even if we don't call
# the run method.
# I tried various ways to configure this programmatically, and they all
# failed in various interesting and intermittent ways - so it is best
# just to run these tests with -s.
# The first few lines of each test skip the test if pytest hasn't been
# run with -s - otherwise they would fail.
# The CI configuration has been updated to do two test runs: one
# for most of the tests without -s, and then the GUI tests with -s.
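# The capture guard described above is repeated at the top of every test in
# this file. As a sketch, it amounts to a small predicate over pytest's
# "capture" option (the helper and fake-config names here are hypothetical,
# not part of the real module):

```python
# Hypothetical helper mirroring the guard used in each GUI test:
# the tests are only valid when pytest capture is disabled (-s).
def run_with_s_option(pytestconfig) -> bool:
    """Return True when pytest was invoked with -s (capture == "no")."""
    return pytestconfig.getoption("capture") == "no"


class FakeConfig:
    """Stand-in for pytestconfig, for illustration only."""

    def __init__(self, capture):
        self._capture = capture

    def getoption(self, name):
        # pytest stores -s as capture == "no"; default capture is "fd".
        return self._capture if name == "capture" else None
```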
def set_selected_table_to_platform(gui):
    gui.current_table_object = gui.data_store.db_classes.Platform
    gui.get_column_data(gui.current_table_object)
    gui.dropdown_table.text = "Platforms"
    gui.get_default_preview_fields()
def test_generating_column_data(pytestconfig, test_datastore):
    if pytestconfig.getoption("capture") != "no":
        pytest.skip("Skipped because pytest was not run with -s option")
    gui = MaintenanceGUI(test_datastore)
    set_selected_table_to_platform(gui)

    correct_col_data = {
"created date": {
"required": False,
"sqlalchemy_type": "column",
"system_name": "created_date",
"type": "datetime",
},
"identifier": {
"required": True,
"sqlalchemy_type": "column",
"system_name": "identifier",
"type": "string",
"values": ["A643", "A816", "C045", "P543"],
},
"name": {
"required": True,
"sqlalchemy_type": "column",
"system_name": "name",
"type": "string",
"values": ["ADRI", "JEAN", "NARV", "SPAR"],
},
"nationality": {
"foreign_table_type": TableTypes.REFERENCE,
"multiple_values_allowed": False,
"required": True,
"second_level": False,
"sqlalchemy_type": "relationship",
"system_name": "nationality",
"type": "string",
"values": [
"United Kingdom",
"Canada",
"France",
"Germany",
"Italy",
"Netherlands",
"United States",
"Afghanistan",
"Albania",
"Algeria",
"American Samoa",
"Andorra",
"Angola",
"Anguilla",
"Antarctica",
"Antigua and Barbuda",
"Argentina",
"Armenia",
"Aruba",
"Australia",
"Austria",
"Azerbaijan",
"Bahamas",
"Bahrain",
"Bangladesh",
"Barbados",
"Belarus",
"Belgium",
"Belize",
"Benin",
"Bermuda",
"Bhutan",
"Bolivia, Plurinational State of",
"Bolivia",
"Bosnia and Herzegovina",
"Botswana",
"Bouvet Island",
"Brazil",
"British Indian Ocean Territory",
"Brunei Darussalam",
"Brunei",
"Bulgaria",
"Burkina Faso",
"Burundi",
"Cambodia",
"Cameroon",
"Cape Verde",
"Cayman Islands",
"Central African Republic",
"Chad",
"Chile",
"China",
"Christmas Island",
"Cocos (Keeling) Islands",
"Colombia",
"Comoros",
"Congo",
"Congo, the Democratic Republic of the",
"Cook Islands",
"Costa Rica",
"Cote d'Ivoire",
"Ivory Coast",
"Croatia",
"Cuba",
"Cyprus",
"Czech Republic",
"Denmark",
"Djibouti",
"Dominica",
"Dominican Republic",
"Ecuador",
"Egypt",
"El Salvador",
"Equatorial Guinea",
"Eritrea",
"Estonia",
"Ethiopia",
"Falkland Islands (Malvinas)",
"Faroe Islands",
"Fiji",
"Finland",
"French Guiana",
"French Polynesia",
"French Southern Territories",
"Gabon",
"Gambia",
"Georgia",
"Ghana",
"Gibraltar",
"Greece",
"Greenland",
"Grenada",
"Guadeloupe",
"Guam",
"Guatemala",
"Guernsey",
"Guinea",
"Guinea-Bissau",
"Guyana",
"Haiti",
"Heard Island and McDonald Islands",
"Holy See (Vatican City State)",
"Honduras",
"Hong Kong",
"Hungary",
"Iceland",
"India",
"Indonesia",
"Iran, Islamic Republic of",
"Iraq",
"Ireland",
"Isle of Man",
"Israel",
"Jamaica",
"Japan",
"Jersey",
"Jordan",
"Kazakhstan",
"Kenya",
"Kiribati",
"Korea, Democratic People's Republic of",
"Korea, Republic of",
"South Korea",
"Kuwait",
"Kyrgyzstan",
"Lao People's Democratic Republic",
"Latvia",
"Lebanon",
"Lesotho",
"Liberia",
"Libyan Arab Jamahiriya",
"Libya",
"Liechtenstein",
"Lithuania",
"Luxembourg",
"Macao",
"Macedonia, the former Yugoslav Republic of",
"Madagascar",
"Malawi",
"Malaysia",
"Maldives",
"Mali",
"Malta",
"Marshall Islands",
"Martinique",
"Mauritania",
"Mauritius",
"Mayotte",
"Mexico",
"Micronesia, Federated States of",
"Moldova, Republic of",
"Monaco",
"Mongolia",
"Montenegro",
"Montserrat",
"Morocco",
"Mozambique",
"Myanmar",
"Burma",
"Namibia",
"Nauru",
"Nepal",
"Netherlands Antilles",
"New Caledonia",
"New Zealand",
"Nicaragua",
"Niger",
"Nigeria",
"Niue",
"Norfolk Island",
"Northern Mariana Islands",
"Norway",
"Oman",
"Pakistan",
"Palau",
"Palestinian Territory, Occupied",
"Panama",
"Papua New Guinea",
"Paraguay",
"Peru",
"Philippines",
"Pitcairn",
"Poland",
"Portugal",
"Puerto Rico",
"Qatar",
"Reunion",
"Romania",
"Russian Federation",
"Russia",
"Rwanda",
"Saint Helena, Ascension and Tristan da Cunha",
"Saint Kitts and Nevis",
"Saint Lucia",
"Saint Pierre and Miquelon",
"Saint Vincent and the Grenadines",
"Saint Vincent & the Grenadines",
"St. Vincent and the Grenadines",
"Samoa",
"San Marino",
"Sao Tome and Principe",
"Saudi Arabia",
"Senegal",
"Serbia",
"Seychelles",
"Sierra Leone",
"Singapore",
"Slovakia",
"Slovenia",
"Solomon Islands",
"Somalia",
"South Africa",
"South Georgia and the South Sandwich Islands",
"South Sudan",
"Spain",
"Sri Lanka",
"Sudan",
"Suriname",
"Svalbard and Jan Mayen",
"Swaziland",
"Sweden",
"Switzerland",
"Syrian Arab Republic",
"Taiwan, Province of China",
"Taiwan",
"Tajikistan",
"Tanzania, United Republic of",
"Thailand",
"Timor-Leste",
"Togo",
"Tokelau",
"Tonga",
"Trinidad and Tobago",
"Tunisia",
"Turkey",
"Turkmenistan",
"Turks and Caicos Islands",
"Tuvalu",
"Uganda",
"Ukraine",
"United Arab Emirates",
"United States Minor Outlying Islands",
"Uruguay",
"Uzbekistan",
"Vanuatu",
"Venezuela, Bolivarian Republic of",
"Venezuela",
"Viet Nam",
"Vietnam",
"Virgin Islands, British",
"Virgin Islands, U.S.",
"Wallis and Futuna",
"Western Sahara",
"Yemen",
"Zambia",
"Zimbabwe",
"Unknown",
],
},
"nationality name": {
"required": True,
"sqlalchemy_type": "assoc_proxy",
"system_name": "nationality_name",
"type": "string",
"values": [
"United Kingdom",
"Canada",
"France",
"Germany",
"Italy",
"Netherlands",
"United States",
"Afghanistan",
"Albania",
"Algeria",
"American Samoa",
"Andorra",
"Angola",
"Anguilla",
"Antarctica",
"Antigua and Barbuda",
"Argentina",
"Armenia",
"Aruba",
"Australia",
"Austria",
"Azerbaijan",
"Bahamas",
"Bahrain",
"Bangladesh",
"Barbados",
"Belarus",
"Belgium",
"Belize",
"Benin",
"Bermuda",
"Bhutan",
"Bolivia, Plurinational State of",
"Bolivia",
"Bosnia and Herzegovina",
"Botswana",
"Bouvet Island",
"Brazil",
"British Indian Ocean Territory",
"Brunei Darussalam",
"Brunei",
"Bulgaria",
"Burkina Faso",
"Burundi",
"Cambodia",
"Cameroon",
"Cape Verde",
"Cayman Islands",
"Central African Republic",
"Chad",
"Chile",
"China",
"Christmas Island",
"Cocos (Keeling) Islands",
"Colombia",
"Comoros",
"Congo",
"Congo, the Democratic Republic of the",
"Cook Islands",
"Costa Rica",
"Cote d'Ivoire",
"Ivory Coast",
"Croatia",
"Cuba",
"Cyprus",
"Czech Republic",
"Denmark",
"Djibouti",
"Dominica",
"Dominican Republic",
"Ecuador",
"Egypt",
"El Salvador",
"Equatorial Guinea",
"Eritrea",
"Estonia",
"Ethiopia",
"Falkland Islands (Malvinas)",
"Faroe Islands",
"Fiji",
"Finland",
"French Guiana",
"French Polynesia",
"French Southern Territories",
"Gabon",
"Gambia",
"Georgia",
"Ghana",
"Gibraltar",
"Greece",
"Greenland",
"Grenada",
"Guadeloupe",
"Guam",
"Guatemala",
"Guernsey",
"Guinea",
"Guinea-Bissau",
"Guyana",
"Haiti",
"Heard Island and McDonald Islands",
"Holy See (Vatican City State)",
"Honduras",
"Hong Kong",
"Hungary",
"Iceland",
"India",
"Indonesia",
"Iran, Islamic Republic of",
"Iraq",
"Ireland",
"Isle of Man",
"Israel",
"Jamaica",
"Japan",
"Jersey",
"Jordan",
"Kazakhstan",
"Kenya",
"Kiribati",
"Korea, Democratic People's Republic of",
"Korea, Republic of",
"South Korea",
"Kuwait",
"Kyrgyzstan",
"Lao People's Democratic Republic",
"Latvia",
"Lebanon",
"Lesotho",
"Liberia",
"Libyan Arab Jamahiriya",
"Libya",
"Liechtenstein",
"Lithuania",
"Luxembourg",
"Macao",
"Macedonia, the former Yugoslav Republic of",
"Madagascar",
"Malawi",
"Malaysia",
"Maldives",
"Mali",
"Malta",
"Marshall Islands",
"Martinique",
"Mauritania",
"Mauritius",
"Mayotte",
"Mexico",
"Micronesia, Federated States of",
"Moldova, Republic of",
"Monaco",
"Mongolia",
"Montenegro",
"Montserrat",
"Morocco",
"Mozambique",
"Myanmar",
"Burma",
"Namibia",
"Nauru",
"Nepal",
"Netherlands Antilles",
"New Caledonia",
"New Zealand",
"Nicaragua",
"Niger",
"Nigeria",
"Niue",
"Norfolk Island",
"Northern Mariana Islands",
"Norway",
"Oman",
"Pakistan",
"Palau",
"Palestinian Territory, Occupied",
"Panama",
"Papua New Guinea",
"Paraguay",
"Peru",
"Philippines",
"Pitcairn",
"Poland",
"Portugal",
"Puerto Rico",
"Qatar",
"Reunion",
"Romania",
"Russian Federation",
"Russia",
"Rwanda",
"Saint Helena, Ascension and Tristan da Cunha",
"Saint Kitts and Nevis",
"Saint Lucia",
"Saint Pierre and Miquelon",
"Saint Vincent and the Grenadines",
"Saint Vincent & the Grenadines",
"St. Vincent and the Grenadines",
"Samoa",
"San Marino",
"Sao Tome and Principe",
"Saudi Arabia",
"Senegal",
"Serbia",
"Seychelles",
"Sierra Leone",
"Singapore",
"Slovakia",
"Slovenia",
"Solomon Islands",
"Somalia",
"South Africa",
"South Georgia and the South Sandwich Islands",
"South Sudan",
"Spain",
"Sri Lanka",
"Sudan",
"Suriname",
"Svalbard and Jan Mayen",
"Swaziland",
"Sweden",
"Switzerland",
"Syrian Arab Republic",
"Taiwan, Province of China",
"Taiwan",
"Tajikistan",
"Tanzania, United Republic of",
"Thailand",
"Timor-Leste",
"Togo",
"Tokelau",
"Tonga",
"Trinidad and Tobago",
"Tunisia",
"Turkey",
"Turkmenistan",
"Turks and Caicos Islands",
"Tuvalu",
"Uganda",
"Ukraine",
"United Arab Emirates",
"United States Minor Outlying Islands",
"Uruguay",
"Uzbekistan",
"Vanuatu",
"Venezuela, Bolivarian Republic of",
"Venezuela",
"Viet Nam",
"Vietnam",
"Virgin Islands, British",
"Virgin Islands, U.S.",
"Wallis and Futuna",
"Western Sahara",
"Yemen",
"Zambia",
"Zimbabwe",
"Unknown",
],
},
"platform id": {
"required": True,
"sqlalchemy_type": "column",
"system_name": "platform_id",
"type": "id",
},
"platform type": {
"foreign_table_type": TableTypes.REFERENCE,
"multiple_values_allowed": False,
"required": True,
"second_level": False,
"sqlalchemy_type": "relationship",
"system_name": "platform_type",
"type": "string",
"values": [
"Attack Craft",
"Commercial aircraft",
"Fishing Vessel",
"High Speed Craft",
"Law Enforcement",
"Medical",
"Merchant",
"Naval - aircraft carrier",
"Naval - amphib",
"Naval - auxiliary",
"Naval - cruiser",
"Naval - destroyer",
"Naval - ExCon",
"Naval - fixed wing",
"Naval - frigate",
"Naval - minewarfare",
"Naval - miscellaneous",
"Naval - patrol",
"Naval - rotary wing",
"Naval - submarine",
"Naval - survey",
"Passenger vessel",
"Pleasure Craft",
"Research vessel",
"Sailing vessel",
"Search and Rescue",
"Offshore support",
"Civilian - uncrewed surface vehicle",
"Civilian - uncrewed underwater vehicle",
"Civilian - uncrewed air vehicle",
"Naval - uncrewed surface vehicle",
"Naval - uncrewed underwater vehicle",
"Naval - uncrewed air vehicle",
"Unknown",
],
},
"platform type name": {
"required": True,
"sqlalchemy_type": "assoc_proxy",
"system_name": "platform_type_name",
"type": "string",
"values": [
"Attack Craft",
"Civilian - uncrewed air vehicle",
"Civilian - uncrewed surface vehicle",
"Civilian - uncrewed underwater vehicle",
"Commercial aircraft",
"Fishing Vessel",
"High Speed Craft",
"Law Enforcement",
"Medical",
"Merchant",
"Naval - ExCon",
"Naval - aircraft carrier",
"Naval - amphib",
"Naval - auxiliary",
"Naval - cruiser",
"Naval - destroyer",
"Naval - fixed wing",
"Naval - frigate",
"Naval - minewarfare",
"Naval - miscellaneous",
"Naval - patrol",
"Naval - rotary wing",
"Naval - submarine",
"Naval - survey",
"Naval - uncrewed air vehicle",
"Naval - uncrewed surface vehicle",
"Naval - uncrewed underwater vehicle",
"Offshore support",
"Passenger vessel",
"Pleasure Craft",
"Research vessel",
"Sailing vessel",
"Search and Rescue",
"Unknown",
],
},
"privacy": {
"foreign_table_type": TableTypes.REFERENCE,
"multiple_values_allowed": False,
"required": True,
"second_level": False,
"sqlalchemy_type": "relationship",
"system_name": "privacy",
"type": "string",
"values": [
"Public",
"Public Sensitive",
"Private",
"Private UK/IE",
"Very Private UK/IE",
"Private UK/IE/FR",
"Very Private UK/IE/FR",
"Very Private",
],
},
"privacy name": {
"required": True,
"sqlalchemy_type": "assoc_proxy",
"system_name": "privacy_name",
"type": "string",
"values": [
"Public",
"Public Sensitive",
"Private",
"Private UK/IE",
"Very Private UK/IE",
"Private UK/IE/FR",
"Very Private UK/IE/FR",
"Very Private",
],
},
"quadgraph": {
"required": False,
"sqlalchemy_type": "column",
"system_name": "quadgraph",
"type": "string",
"values": [],
},
"trigraph": {
"required": False,
"sqlalchemy_type": "column",
"system_name": "trigraph",
"type": "string",
"values": [],
},
"wargame participations": {
"required": True,
"sqlalchemy_type": "assoc_proxy",
"system_name": "wargame_participations",
"type": "string",
"values": [],
},
    }

    output_col_data = gui.column_data
    del output_col_data["nationality"]["ids"]
    del output_col_data["platform type"]["ids"]
    del output_col_data["privacy"]["ids"]

    assert output_col_data == correct_col_data
def test_running_query_single_condition(pytestconfig, test_datastore):
    if pytestconfig.getoption("capture") != "no":
        pytest.skip("Skipped because pytest was not run with -s option")
    gui = MaintenanceGUI(test_datastore)
    set_selected_table_to_platform(gui)

    gui.filter_widget = Mock()
    gui.filter_widget.filters = [["name", "=", "ADRI"]]
    gui.run_query()

    # Should be 2 entries because of the header line,
    # plus the one result
    assert len(gui.table_data) == 2
    assert len(gui.table_objects) == 2
    assert gui.table_data[0] == ["Name", "Identifier", "Nationality name", "Platform type name"]
    assert gui.table_data[1] == ["ADRI", "A643", "United Kingdom", "Naval - frigate"]
    assert isinstance(gui.table_objects[1], gui.data_store.db_classes.Platform)
    assert gui.table_objects[0] is None
    assert gui.table_objects[1].name == "ADRI"
def test_running_query_two_conditions_or(pytestconfig, test_datastore):
    if pytestconfig.getoption("capture") != "no":
        pytest.skip("Skipped because pytest was not run with -s option")
    gui = MaintenanceGUI(test_datastore)
    set_selected_table_to_platform(gui)

    gui.filter_widget = Mock()
    gui.filter_widget.filters = [["name", "=", "ADRI"], ["OR"], ["name", "=", "JEAN"]]
    gui.run_query()

    # Should be 3 entries because of the header line,
    # plus the two results
    assert len(gui.table_data) == 3
    assert len(gui.table_objects) == 3
    assert gui.table_data[0] == ["Name", "Identifier", "Nationality name", "Platform type name"]
    assert gui.table_data[2] == ["JEAN", "A816", "United Kingdom", "Naval - frigate"]
    assert isinstance(gui.table_objects[2], gui.data_store.db_classes.Platform)
    assert gui.table_objects[0] is None
    assert gui.table_objects[2].name == "JEAN"
def test_running_query_two_conditions_and(pytestconfig, test_datastore):
    if pytestconfig.getoption("capture") != "no":
        pytest.skip("Skipped because pytest was not run with -s option")
    gui = MaintenanceGUI(test_datastore)
    set_selected_table_to_platform(gui)

    gui.filter_widget = Mock()
    gui.filter_widget.filters = [
        ["nationality_name", "=", "United Kingdom"],
        ["AND"],
        ["identifier", "LIKE", "A"],
    ]
    gui.run_query()

    # Should be 3 entries because of the header line,
    # plus the two results
    assert len(gui.table_data) == 3
    assert len(gui.table_objects) == 3
    assert gui.table_data[0] == ["Name", "Identifier", "Nationality name", "Platform type name"]
    assert gui.table_data[1] == ["ADRI", "A643", "United Kingdom", "Naval - frigate"]
    assert gui.table_data[2] == ["JEAN", "A816", "United Kingdom", "Naval - frigate"]
    assert isinstance(gui.table_objects[2], gui.data_store.db_classes.Platform)
    assert gui.table_objects[0] is None
    assert gui.table_objects[1].name == "ADRI"
    assert gui.table_objects[2].name == "JEAN"
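# For reference, the flat filter lists assigned to `filter_widget.filters`
# above alternate condition triplets with ["AND"] / ["OR"] separators. A
# minimal, illustrative evaluator for that shape (NOT the real pepys
# implementation, which builds SQLAlchemy queries) could look like:

```python
import operator

# Supported comparison operators for this sketch; "LIKE" is approximated
# as a substring test.
OPS = {"=": operator.eq, "LIKE": lambda value, pattern: pattern in str(value)}


def matches(row, filters):
    """Evaluate a flat filter list left-to-right (no operator precedence)."""
    result = None
    combine = None
    for item in filters:
        if item == ["AND"] or item == ["OR"]:
            combine = item[0]
            continue
        field, op, value = item
        cond = OPS[op](row.get(field), value)
        if result is None:
            result = cond
        elif combine == "AND":
            result = result and cond
        else:
            result = result or cond
    return bool(result)
```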

# === test/terra/utils/qasm_simulator/qasm_cliffords.py (repo: mtreinish/qiskit-aer, licence: Apache-2.0) ===
# -*- coding: utf-8 -*-
# Copyright 2018, IBM.
#
# This source code is licensed under the Apache License, Version 2.0 found in
# the LICENSE.txt file in the root directory of this source tree.
"""
QasmSimulator Integration Tests
"""
from test.terra.utils import common
from test.terra.utils import ref_1q_clifford
from test.terra.utils import ref_2q_clifford
from qiskit import compile
from qiskit.providers.aer import QasmSimulator
class QasmCliffordTests(common.QiskitAerTestCase):
    """QasmSimulator Clifford gate tests in default basis."""

    SIMULATOR = QasmSimulator()
    BACKEND_OPTS = {}
    # ---------------------------------------------------------------------
    # Test h-gate
    # ---------------------------------------------------------------------
    def test_h_gate_deterministic_default_basis_gates(self):
        """Test h-gate circuits compiling to backend default basis_gates."""
        shots = 100
        circuits = ref_1q_clifford.h_gate_circuits_deterministic(final_measure=True)
        targets = ref_1q_clifford.h_gate_counts_deterministic(shots)
        qobj = compile(circuits, self.SIMULATOR, shots=shots)
        result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
        self.is_completed(result)
        self.compare_counts(result, circuits, targets, delta=0.05 * shots)

    def test_h_gate_nondeterministic_default_basis_gates(self):
        """Test h-gate circuits compiling to backend default basis_gates."""
        shots = 2000
        circuits = ref_1q_clifford.h_gate_circuits_nondeterministic(final_measure=True)
        targets = ref_1q_clifford.h_gate_counts_nondeterministic(shots)
        qobj = compile(circuits, self.SIMULATOR, shots=shots)
        result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
        self.is_completed(result)
        self.compare_counts(result, circuits, targets, delta=0.05 * shots)
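# Each nondeterministic test allows a statistical tolerance of 5% of the
# shot count (delta=0.05 * shots), while deterministic tests use delta=0.
# A sketch of the kind of check `compare_counts` performs (assumed
# behaviour for illustration; the real helper lives in
# common.QiskitAerTestCase):

```python
# Assumed behaviour: every observed count must lie within `delta`
# of its target count for the comparison to pass.
def counts_close(observed, target, delta):
    keys = set(observed) | set(target)
    return all(abs(observed.get(k, 0) - target.get(k, 0)) <= delta for k in keys)
```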
    # ---------------------------------------------------------------------
    # Test x-gate
    # ---------------------------------------------------------------------
    def test_x_gate_deterministic_default_basis_gates(self):
        """Test x-gate circuits compiling to backend default basis_gates."""
        shots = 100
        circuits = ref_1q_clifford.x_gate_circuits_deterministic(final_measure=True)
        targets = ref_1q_clifford.x_gate_counts_deterministic(shots)
        qobj = compile(circuits, self.SIMULATOR, shots=shots)
        result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
        self.is_completed(result)
        self.compare_counts(result, circuits, targets, delta=0)
    # ---------------------------------------------------------------------
    # Test z-gate
    # ---------------------------------------------------------------------
    def test_z_gate_deterministic_default_basis_gates(self):
        """Test z-gate circuits compiling to backend default basis_gates."""
        shots = 100
        circuits = ref_1q_clifford.z_gate_circuits_deterministic(final_measure=True)
        targets = ref_1q_clifford.z_gate_counts_deterministic(shots)
        qobj = compile(circuits, self.SIMULATOR, shots=shots)
        result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
        self.is_completed(result)
        self.compare_counts(result, circuits, targets, delta=0)
    # ---------------------------------------------------------------------
    # Test y-gate
    # ---------------------------------------------------------------------
    def test_y_gate_deterministic_default_basis_gates(self):
        """Test y-gate circuits compiling to backend default basis_gates."""
        shots = 100
        circuits = ref_1q_clifford.y_gate_circuits_deterministic(final_measure=True)
        targets = ref_1q_clifford.y_gate_counts_deterministic(shots)
        qobj = compile(circuits, self.SIMULATOR, shots=shots)
        result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
        self.is_completed(result)
        self.compare_counts(result, circuits, targets, delta=0)
    # ---------------------------------------------------------------------
    # Test s-gate
    # ---------------------------------------------------------------------
    def test_s_gate_deterministic_default_basis_gates(self):
        """Test s-gate circuits compiling to backend default basis_gates."""
        shots = 100
        circuits = ref_1q_clifford.s_gate_circuits_deterministic(final_measure=True)
        targets = ref_1q_clifford.s_gate_counts_deterministic(shots)
        qobj = compile(circuits, self.SIMULATOR, shots=shots)
        result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
        self.is_completed(result)
        self.compare_counts(result, circuits, targets, delta=0)

    def test_s_gate_nondeterministic_default_basis_gates(self):
        """Test s-gate circuits compiling to backend default basis_gates."""
        shots = 2000
        circuits = ref_1q_clifford.s_gate_circuits_nondeterministic(final_measure=True)
        targets = ref_1q_clifford.s_gate_counts_nondeterministic(shots)
        qobj = compile(circuits, self.SIMULATOR, shots=shots)
        result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
        self.is_completed(result)
        self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test sdg-gate
# ---------------------------------------------------------------------
def test_sdg_gate_deterministic_default_basis_gates(self):
"""Test sdg-gate circuits compiling to backend default basis_gates."""
shots = 100
circuits = ref_1q_clifford.sdg_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.sdg_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_sdg_gate_nondeterministic_default_basis_gates(self):
"""Test sdg-gate circuits compiling to backend default basis_gates."""
shots = 2000
circuits = ref_1q_clifford.sdg_gate_circuits_nondeterministic(final_measure=True)
targets = ref_1q_clifford.sdg_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test cx-gate
# ---------------------------------------------------------------------
def test_cx_gate_deterministic_default_basis_gates(self):
"""Test cx-gate circuits compiling to backend default basis_gates."""
shots = 100
circuits = ref_2q_clifford.cx_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.cx_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_cx_gate_nondeterministic_default_basis_gates(self):
"""Test cx-gate circuits compiling to backend default basis_gates."""
shots = 2000
circuits = ref_2q_clifford.cx_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.cx_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test cz-gate
# ---------------------------------------------------------------------
def test_cz_gate_deterministic_default_basis_gates(self):
"""Test cz-gate circuits compiling to backend default basis_gates."""
shots = 100
circuits = ref_2q_clifford.cz_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.cz_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_cz_gate_nondeterministic_default_basis_gates(self):
"""Test cz-gate circuits compiling to backend default basis_gates."""
shots = 2000
circuits = ref_2q_clifford.cz_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.cz_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test swap-gate
# ---------------------------------------------------------------------
def test_swap_gate_deterministic_default_basis_gates(self):
"""Test swap-gate circuits compiling to backend default basis_gates."""
shots = 100
circuits = ref_2q_clifford.swap_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.swap_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_swap_gate_nondeterministic_default_basis_gates(self):
"""Test swap-gate circuits compiling to backend default basis_gates."""
shots = 2000
circuits = ref_2q_clifford.swap_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.swap_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
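The pattern repeated in every test above — compile, run, check completion, then compare measured counts to reference counts within a delta — hinges on the counts comparison. A minimal stand-alone sketch of such a check (a hypothetical helper, not the Qiskit test base-class API, whose `compare_counts` may differ):

```python
def assert_counts_close(counts, targets, delta):
    # Compare observed measurement counts against reference counts,
    # allowing each outcome to differ by at most `delta` shots.
    for outcome, expected in targets.items():
        observed = counts.get(outcome, 0)
        assert abs(observed - expected) <= delta, (
            f'{outcome}: observed {observed}, expected {expected} +/- {delta}')
```

Deterministic circuits use delta=0 (exact counts); nondeterministic ones allow 5% of the shot count.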
class QasmCliffordTestsWaltzBasis(common.QiskitAerTestCase):
"""QasmSimulator Clifford gate tests in Waltz u1,u2,u3,cx basis."""
SIMULATOR = QasmSimulator()
BACKEND_OPTS = {}
# ---------------------------------------------------------------------
# Test h-gate
# ---------------------------------------------------------------------
def test_h_gate_deterministic_waltz_basis_gates(self):
"""Test h-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_1q_clifford.h_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.h_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
def test_h_gate_nondeterministic_waltz_basis_gates(self):
"""Test h-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 2000
circuits = ref_1q_clifford.h_gate_circuits_nondeterministic(final_measure=True)
targets = ref_1q_clifford.h_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test x-gate
# ---------------------------------------------------------------------
def test_x_gate_deterministic_waltz_basis_gates(self):
"""Test x-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_1q_clifford.x_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.x_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
# ---------------------------------------------------------------------
# Test z-gate
# ---------------------------------------------------------------------
def test_z_gate_deterministic_waltz_basis_gates(self):
"""Test z-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_1q_clifford.z_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.z_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_z_gate_deterministic_minimal_basis_gates(self):
"""Test z-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_1q_clifford.z_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.z_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
# ---------------------------------------------------------------------
# Test y-gate
# ---------------------------------------------------------------------
def test_y_gate_deterministic_default_basis_gates(self):
"""Test y-gate circuits compiling to backend default basis_gates."""
shots = 100
circuits = ref_1q_clifford.y_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.y_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots)
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_y_gate_deterministic_waltz_basis_gates(self):
"""Test y-gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_1q_clifford.y_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.y_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
# ---------------------------------------------------------------------
# Test s-gate
# ---------------------------------------------------------------------
def test_s_gate_deterministic_waltz_basis_gates(self):
"""Test s-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_1q_clifford.s_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.s_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_s_gate_nondeterministic_waltz_basis_gates(self):
"""Test s-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 2000
circuits = ref_1q_clifford.s_gate_circuits_nondeterministic(final_measure=True)
targets = ref_1q_clifford.s_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test sdg-gate
# ---------------------------------------------------------------------
def test_sdg_gate_deterministic_waltz_basis_gates(self):
"""Test sdg-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_1q_clifford.sdg_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.sdg_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_sdg_gate_nondeterministic_waltz_basis_gates(self):
"""Test sdg-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 2000
circuits = ref_1q_clifford.sdg_gate_circuits_nondeterministic(final_measure=True)
targets = ref_1q_clifford.sdg_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test cx-gate
# ---------------------------------------------------------------------
def test_cx_gate_deterministic_waltz_basis_gates(self):
"""Test cx-gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_2q_clifford.cx_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.cx_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_cx_gate_nondeterministic_waltz_basis_gates(self):
"""Test cx-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 2000
circuits = ref_2q_clifford.cx_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.cx_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test cz-gate
# ---------------------------------------------------------------------
def test_cz_gate_deterministic_waltz_basis_gates(self):
"""Test cz-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_2q_clifford.cz_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.cz_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_cz_gate_nondeterministic_waltz_basis_gates(self):
"""Test cz-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 2000
circuits = ref_2q_clifford.cz_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.cz_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test swap-gate
# ---------------------------------------------------------------------
def test_swap_gate_deterministic_waltz_basis_gates(self):
"""Test swap-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 100
circuits = ref_2q_clifford.swap_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.swap_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_swap_gate_nondeterministic_waltz_basis_gates(self):
"""Test swap-gate gate circuits compiling to u1,u2,u3,cx"""
shots = 2000
circuits = ref_2q_clifford.swap_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.swap_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u1', 'u2', 'u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
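The nondeterministic tests above use delta = 0.05 * shots. For a roughly fair-coin outcome over n shots, the binomial standard deviation sqrt(n * p * (1 - p)) indicates how generous that tolerance is; a quick illustrative sketch (the 5% figure is the tests' choice, not derived from Qiskit):

```python
import math

def binomial_std(shots, p=0.5):
    # Standard deviation of one outcome's count over `shots` Bernoulli trials.
    return math.sqrt(shots * p * (1 - p))

# With shots = 2000, delta = 0.05 * shots = 100, while one standard
# deviation is sqrt(2000 * 0.25) ~= 22.4 -- roughly 4.5 sigma of headroom.
```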
class QasmCliffordTestsMinimalBasis(common.QiskitAerTestCase):
"""QasmSimulator Clifford gate tests in minimam U,CX basis."""
SIMULATOR = QasmSimulator()
BACKEND_OPTS = {}
# ---------------------------------------------------------------------
# Test h-gate
# ---------------------------------------------------------------------
def test_h_gate_deterministic_minimal_basis_gates(self):
"""Test h-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_1q_clifford.h_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.h_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
def test_h_gate_nondeterministic_minimal_basis_gates(self):
"""Test h-gate gate circuits compiling to u3,cx"""
shots = 2000
circuits = ref_1q_clifford.h_gate_circuits_nondeterministic(final_measure=True)
targets = ref_1q_clifford.h_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test x-gate
# ---------------------------------------------------------------------
def test_x_gate_deterministic_minimal_basis_gates(self):
"""Test x-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_1q_clifford.x_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.x_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
# ---------------------------------------------------------------------
# Test z-gate
# ---------------------------------------------------------------------
def test_z_gate_deterministic_minimal_basis_gates(self):
"""Test z-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_1q_clifford.z_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.z_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
# ---------------------------------------------------------------------
# Test y-gate
# ---------------------------------------------------------------------
def test_y_gate_deterministic_minimal_basis_gates(self):
"""Test y-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_1q_clifford.y_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.y_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
# ---------------------------------------------------------------------
# Test s-gate
# ---------------------------------------------------------------------
def test_s_gate_deterministic_minimal_basis_gates(self):
"""Test s-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_1q_clifford.s_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.s_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_s_gate_nondeterministic_minimal_basis_gates(self):
"""Test s-gate gate circuits compiling to u3,cx"""
shots = 2000
circuits = ref_1q_clifford.s_gate_circuits_nondeterministic(final_measure=True)
targets = ref_1q_clifford.s_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test sdg-gate
# ---------------------------------------------------------------------
def test_sdg_gate_deterministic_minimal_basis_gates(self):
"""Test sdg-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_1q_clifford.sdg_gate_circuits_deterministic(final_measure=True)
targets = ref_1q_clifford.sdg_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_sdg_gate_nondeterministic_minimal_basis_gates(self):
"""Test sdg-gate gate circuits compiling to u3,cx"""
shots = 2000
circuits = ref_1q_clifford.sdg_gate_circuits_nondeterministic(final_measure=True)
targets = ref_1q_clifford.sdg_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test cx-gate
# ---------------------------------------------------------------------
def test_cx_gate_deterministic_minimal_basis_gates(self):
"""Test cx-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_2q_clifford.cx_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.cx_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_cx_gate_nondeterministic_minimal_basis_gates(self):
"""Test cx-gate gate circuits compiling to u3,cx"""
shots = 2000
circuits = ref_2q_clifford.cx_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.cx_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test cz-gate
# ---------------------------------------------------------------------
def test_cz_gate_deterministic_minimal_basis_gates(self):
"""Test cz-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_2q_clifford.cz_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.cz_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_cz_gate_nondeterministic_minimal_basis_gates(self):
"""Test cz-gate gate circuits compiling to u3,cx"""
shots = 2000
circuits = ref_2q_clifford.cz_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.cz_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# ---------------------------------------------------------------------
# Test swap-gate
# ---------------------------------------------------------------------
def test_swap_gate_deterministic_minimal_basis_gates(self):
"""Test swap-gate gate circuits compiling to u3,cx"""
shots = 100
circuits = ref_2q_clifford.swap_gate_circuits_deterministic(final_measure=True)
targets = ref_2q_clifford.swap_gate_counts_deterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0)
def test_swap_gate_nondeterministic_minimal_basis_gates(self):
"""Test swap-gate gate circuits compiling to u3,cx"""
shots = 2000
circuits = ref_2q_clifford.swap_gate_circuits_nondeterministic(final_measure=True)
targets = ref_2q_clifford.swap_gate_counts_nondeterministic(shots)
qobj = compile(circuits, self.SIMULATOR, shots=shots, basis_gates=['u3', 'cx'])
result = self.SIMULATOR.run(qobj, backend_options=self.BACKEND_OPTS).result()
self.is_completed(result)
self.compare_counts(result, circuits, targets, delta=0.05 * shots)
# program/cpu.py
import psutil
def freq():
    # Current CPU frequency, converted from MHz to GHz.
    return round(psutil.cpu_freq().current / 1000, 1)


def cores():
    # Total number of logical cores.
    return psutil.cpu_count()


def phyCores():
    # Number of physical cores only.
    return psutil.cpu_count(logical=False)


def percentage():
    # CPU time percentages sampled over a 1-second interval.
    return psutil.cpu_times_percent(interval=1)
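The helpers above depend on psutil. For the logical core count specifically, the standard library's `os.cpu_count()` reports the same number on typical systems; a stdlib-only fallback sketch for environments without psutil:

```python
import os

def logical_cores():
    # Logical core count from the standard library; psutil.cpu_count()
    # returns the same value on typical systems. May be None if the
    # count cannot be determined.
    return os.cpu_count()
```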
# data_structures/graphs/tests/test_adjacency_map_directed_weighted_graph.py
# coding: utf-8
import unittest
from data_structures.graphs.adjacency_map_directed_weighted_graph import DirectedGraph
class TestCase(unittest.TestCase):
def setUp(self):
self.graph = DirectedGraph()
self.edges = [
# (source, destination, weight)
('A', 'B', 1),
('A', 'C', 1),
('B', 'C', 1),
('B', 'D', 1),
('B', 'I', 1),
('C', 'D', 1),
('D', 'C', 1),
('D', 'H', 1),
('H', 'I', 1),
('E', 'A', 1),
('E', 'F', 1),
('F', 'C', 1),
('F', 'G', 1),
]
self.vertices = set()
for source, destination, weight in self.edges:
self.vertices.add(source)
self.vertices.add(destination)
self.graph.add_vertex(source, value=source)
self.graph.add_vertex(destination, value=destination)
self.graph.add_edge(source, destination, weight)
def test_add_vertex(self):
self.graph.add_vertex('X')
self.assertEqual(self.graph.vertex_count(), len(self.vertices) + 1)
def test_add_edge(self):
self.graph.add_vertex('X', value='X')
self.graph.add_vertex('Y', value='Y')
self.graph.add_edge('X', 'Y', 0)
self.assertEqual(self.graph.vertex_count(), len(self.vertices) + 2)
self.assertEqual(self.graph.edge_count(), len(self.edges) + 1)
def test_remove_vertex(self):
self.graph.remove_vertex('C')
self.assertEqual(self.graph.vertex_count(), len(self.vertices) - 1)
edge_count = 0
for source, destination, _ in self.edges:
if not source == 'C' and not destination == 'C':
edge_count += 1
self.assertEqual(self.graph.edge_count(), edge_count)
def test_remove_edge(self):
self.graph.remove_edge('A', 'B')
self.assertEqual(self.graph.edge_count(), len(self.edges) - 1)
with self.assertRaises(ValueError):
self.graph.remove_edge('Z', 'Z')
def test_vertex_count(self):
self.assertEqual(self.graph.vertex_count(), len(self.vertices))
def test_edge_count(self):
self.assertEqual(self.graph.edge_count(), len(self.edges))
def test_vertices(self):
self.assertCountEqual(self.graph.vertices(), self.vertices)
def test_edges(self):
self.assertCountEqual(self.graph.edges(), self.edges)
def test_incident_edges(self):
vertex = 'A'
outgoing_edges = [edge for edge in self.edges if edge[0] == vertex]
self.assertCountEqual(self.graph.incident_edges(vertex, edge_type='outgoing'), outgoing_edges)
incoming_edges = [edge for edge in self.edges if edge[1] == vertex]
self.assertCountEqual(self.graph.incident_edges(vertex, edge_type='incoming'), incoming_edges)
self.assertCountEqual(self.graph.incident_edges('NOT EXIST', edge_type='outgoing'), [])
self.assertCountEqual(self.graph.incident_edges('NOT EXIST', edge_type='incoming'), [])
def test_edge_weight(self):
for source, destination, weight in self.edges:
self.assertEqual(self.graph.edge_weight(source, destination), weight)
with self.assertRaises(ValueError):
self.graph.edge_weight('NOT EXIST', 'NOT EXIST')
def test_breadth_first_search(self):
v = 'A'
visited = self.graph.breadth_first_search(v)
self.assertCountEqual(visited, ['A', 'B', 'C', 'D', 'H', 'I'])
v = 'E'
visited = self.graph.breadth_first_search(v)
self.assertCountEqual(visited, ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I'])
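The traversal checked above can be sketched independently of the DirectedGraph class. A minimal BFS over a plain adjacency map (assumed shape: vertex -> list of neighbors; this is an illustrative stand-in, not the tested implementation):

```python
from collections import deque

def breadth_first_search(adj, start):
    # Visit every vertex reachable from `start` in breadth-first order.
    visited = [start]
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for w in adj.get(v, []):
            if w not in visited:
                visited.append(w)
                queue.append(w)
    return visited
```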
def test_depth_first_search(self):
v = 'A'
visited = self.graph.depth_first_search(v)
self.assertCountEqual(visited, ['A', 'B', 'C', 'D', 'H', 'I'])
v = 'E'
visited = self.graph.depth_first_search(v)
self.assertCountEqual(visited, ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I'])
def test_has_cycles_dfs(self):
self.assertEqual(self.graph.has_cycles_dfs(), True)
graph = DirectedGraph()
edges = [
('A', 'A', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
self.assertEqual(graph.has_cycles_dfs(), True)
graph = DirectedGraph()
edges = [
('A', 'B', 1),
('B', 'C', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
self.assertEqual(graph.has_cycles_dfs(), False)
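The cycle checks above (self-loop, acyclic chain) follow the classic DFS three-coloring scheme. A stand-alone sketch over (source, destination) pairs — the tested `has_cycles_dfs` may be implemented differently, and weights are irrelevant to cycle detection:

```python
from collections import defaultdict

def has_cycle(edges):
    # Detect a cycle in a directed graph via DFS coloring:
    # 0 = unvisited, 1 = on the current DFS stack, 2 = finished.
    adj = defaultdict(list)
    nodes = set()
    for src, des in edges:
        adj[src].append(des)
        nodes.update((src, des))
    color = {v: 0 for v in nodes}

    def dfs(v):
        color[v] = 1
        for w in adj[v]:
            if color[w] == 1:  # back edge to the active stack -> cycle
                return True
            if color[w] == 0 and dfs(w):
                return True
        color[v] = 2
        return False

    return any(color[v] == 0 and dfs(v) for v in nodes)
```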
def test_find_shortest_paths_bfs(self):
graph = DirectedGraph()
edges = [ # https://www.geeksforgeeks.org/shortest-path-unweighted-graph/
(0, 1, 1),
(0, 3, 1),
(1, 0, 1),
(1, 2, 1),
(2, 1, 1),
(3, 0, 1),
(3, 4, 1),
(3, 7, 1),
(4, 3, 1),
(4, 5, 1),
(4, 6, 1),
(4, 7, 1),
(5, 4, 1),
(5, 6, 1),
(6, 4, 1),
(6, 5, 1),
(6, 7, 1),
(7, 3, 1),
(7, 4, 1),
(7, 6, 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_paths_bfs(0)
path = graph.construct_path(previous, 0, 5)
self.assertEqual(path, [0, 3, 4, 5])
path = graph.construct_path(previous, 0, 7)
self.assertEqual(path, [0, 3, 7])
# No such path.
with self.assertRaises(ValueError):
graph.construct_path(previous, 0, 10)
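The behavior exercised above — BFS predecessor tracking plus path reconstruction, with ValueError for unreachable targets — can be sketched as one stand-alone function (an illustrative mirror of `find_shortest_paths_bfs`/`construct_path`, not the tested code):

```python
from collections import deque

def shortest_path_bfs(edges, start, end):
    # Shortest path by hop count in an unweighted directed graph.
    adj = {}
    for src, des, _weight in edges:  # weights are ignored for BFS
        adj.setdefault(src, []).append(des)
        adj.setdefault(des, [])
    previous = {start: None}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for w in adj.get(v, []):
            if w not in previous:
                previous[w] = v
                queue.append(w)
    if end not in previous:
        raise ValueError('no such path')
    path = []
    node = end
    while node is not None:
        path.append(node)
        node = previous[node]
    return path[::-1]
```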
def test_find_shortest_path_dijkstra(self):
graph = DirectedGraph()
edges = [ # https://cs.stackexchange.com/questions/18138/dijkstra-algorithm-vs-breadth-first-search-for-shortest-path-in-graph
('A', 'B', 1),
('B', 'C', 3),
('B', 'D', 2),
('B', 'E', 1),
('C', 'E', 4),
('D', 'E', 2),
('E', 'F', 3),
('G', 'D', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_dijkstra('A')
path = graph.construct_path(previous, 'A', 'D')
self.assertEqual(path, ['A', 'B', 'D'])
path = graph.construct_path(previous, 'A', 'E')
self.assertEqual(path, ['A', 'B', 'E'])
path = graph.construct_path(previous, 'A', 'F')
self.assertEqual(path, ['A', 'B', 'E', 'F'])
# No such path.
with self.assertRaises(ValueError):
graph.construct_path(previous, 'A', 'G')
graph = DirectedGraph()
edges = [ # https://www.youtube.com/watch?v=pVfj6mxhdMw
('A', 'B', 6),
('A', 'D', 1),
('B', 'A', 6),
('B', 'C', 5),
('B', 'D', 2),
('B', 'E', 2),
('C', 'B', 5),
('C', 'E', 5),
('D', 'A', 1),
('D', 'B', 2),
('D', 'E', 1),
('E', 'B', 2),
('E', 'C', 5),
('E', 'D', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_dijkstra('A')
path = graph.construct_path(previous, 'A', 'B')
self.assertEqual(path, ['A', 'D', 'B'])
path = graph.construct_path(previous, 'A', 'C')
self.assertEqual(path, ['A', 'D', 'E', 'C'])
path = graph.construct_path(previous, 'A', 'D')
self.assertEqual(path, ['A', 'D'])
path = graph.construct_path(previous, 'A', 'E')
self.assertEqual(path, ['A', 'D', 'E'])
graph = DirectedGraph()
edges = [ # https://www.chegg.com/homework-help/questions-and-answers/8-4-14-10-2-figure-2-directed-graph-computing-shortest-path-3-dijkstra-s-algorithm-computi-q25960616#question-transcript
('A', 'B', 4),
('B', 'C', 11),
('B', 'D', 9),
('C', 'A', 8),
('D', 'C', 7),
('D', 'E', 2),
('D', 'F', 6),
('E', 'B', 8),
('E', 'G', 7),
('E', 'H', 4),
('F', 'C', 1),
('F', 'E', 5),
('G', 'H', 14),
('G', 'I', 9),
('H', 'F', 2),
('H', 'I', 10),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_dijkstra('A')
path = graph.construct_path(previous, 'A', 'I')
self.assertEqual(path, ['A', 'B', 'D', 'E', 'H', 'I'])
previous, distances = graph.find_shortest_path_dijkstra('E')
path = graph.construct_path(previous, 'E', 'C')
self.assertEqual(path, ['E', 'H', 'F', 'C'])
previous, distances = graph.find_shortest_path_dijkstra('I')
# No such path.
with self.assertRaises(ValueError):
graph.construct_path(previous, 'I', 'A')
graph = DirectedGraph()
edges = [ # https://www.bogotobogo.com/python/python_Prims_Spanning_Tree_Data_Structure.php
('A', 'B', 7),
('A', 'C', 9),
('A', 'F', 14),
('B', 'A', 7),
('B', 'C', 10),
('B', 'D', 15),
('C', 'A', 9),
('C', 'B', 10),
('C', 'D', 11),
('C', 'F', 2),
('D', 'B', 15),
('D', 'C', 11),
('D', 'E', 6),
('E', 'D', 6),
('E', 'F', 9),
('F', 'A', 14),
('F', 'C', 2),
('F', 'E', 9),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_dijkstra('A')
path = graph.construct_path(previous, 'A', 'E')
self.assertEqual(path, ['A', 'C', 'F', 'E'])
path = graph.construct_path(previous, 'A', 'D')
self.assertEqual(path, ['A', 'C', 'D'])
graph = DirectedGraph()
adj_list = { # https://adityakamath.com/2018/06/17/Dijktra%27s-Algorithm-In-Python.html
'S': {'A': 7, 'B': 2, 'C': 3},
'A': {'S': 7, 'B': 3, 'D': 4},
'B': {'S': 2, 'A': 3, 'D': 4, 'H': 1, 'C': 12},
'C': {'S': 3, 'L': 2},
'D': {'A': 4, 'B': 4, 'F': 5},
'E': {'G': 2, 'K': 5},
'F': {'D': 5, 'H': 3},
'G': {'H': 2, 'I': 10, 'E': 2},
'H': {'B': 1, 'F': 3, 'G': 2},
'I': {'J': 6, 'K': 4, 'L': 4},
'J': {'K': 4, 'L': 4},
'K': {'I': 4, 'J': 4, 'E': 5},
'L': {'C': 2, 'I': 4, 'J': 4},
}
for src, neighbors in adj_list.items():
for des, weight in neighbors.items():
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_dijkstra('S')
path = graph.construct_path(previous, 'S', 'E')
self.assertEqual(path, ['S', 'B', 'H', 'G', 'E'])
def test_find_shortest_path_bellman_ford(self):
graph = DirectedGraph()
edges = [ # https://www.programiz.com/dsa/bellman-ford-algorithm
('A', 'B', 2),
('B', 'C', 2),
('B', 'D', 1),
('C', 'D', -4),
('D', 'B', 1),
('D', 'E', 3),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
# There is a negative weight cycle among {B, C, D}.
with self.assertRaises(ValueError):
previous, distances = graph.find_shortest_path_bellman_ford('A')
graph = DirectedGraph()
edges = [ # https://www.programiz.com/dsa/bellman-ford-algorithm
('A', 'B', 4),
('A', 'C', 2),
('B', 'C', 3),
('B', 'D', 2),
('B', 'E', 4),
('C', 'B', 1),
('C', 'D', 3),
('C', 'E', 5),
('E', 'D', -5),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_bellman_ford('A')
path = graph.construct_path(previous, 'A', 'A')
self.assertEqual(path, ['A', ])
path = graph.construct_path(previous, 'A', 'B')
self.assertEqual(path, ['A', 'C', 'B'])
path = graph.construct_path(previous, 'A', 'C')
self.assertEqual(path, ['A', 'C'])
path = graph.construct_path(previous, 'A', 'D')
self.assertEqual(path, ['A', 'C', 'E', 'D'])
path = graph.construct_path(previous, 'A', 'E')
self.assertEqual(path, ['A', 'C', 'E'])
previous, distances = graph.find_shortest_path_bellman_ford('D')
# No such path.
with self.assertRaises(ValueError):
graph.construct_path(previous, 'D', 'E')
graph = DirectedGraph()
edges = [ # https://cs.stackexchange.com/questions/18138/dijkstra-algorithm-vs-breadth-first-search-for-shortest-path-in-graph
('A', 'B', 1),
('B', 'C', 1),
('B', 'D', 1),
('B', 'E', 1),
('C', 'E', 1),
('D', 'E', 1),
('E', 'F', 1),
('G', 'D', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_bellman_ford('A')
path = graph.construct_path(previous, 'A', 'E')
self.assertEqual(path, ['A', 'B', 'E'])
path = graph.construct_path(previous, 'A', 'D')
self.assertEqual(path, ['A', 'B', 'D'])
# No such path.
with self.assertRaises(ValueError):
graph.construct_path(previous, 'A', 'G')
graph = DirectedGraph()
edges = [ # https://www.youtube.com/watch?v=pVfj6mxhdMw
('A', 'B', 6),
('A', 'D', 1),
('B', 'A', 6),
('B', 'C', 5),
('B', 'D', 2),
('B', 'E', 2),
('C', 'B', 5),
('C', 'E', 5),
('D', 'A', 1),
('D', 'B', 2),
('D', 'E', 1),
('E', 'B', 2),
('E', 'C', 5),
('E', 'D', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_bellman_ford('A')
path = graph.construct_path(previous, 'A', 'C')
self.assertEqual(path, ['A', 'D', 'E', 'C'])
graph = DirectedGraph()
edges = [ # https://www.chegg.com/homework-help/questions-and-answers/8-4-14-10-2-figure-2-directed-graph-computing-shortest-path-3-dijkstra-s-algorithm-computi-q25960616#question-transcript
('A', 'B', 4),
('B', 'C', 11),
('B', 'D', 9),
('C', 'A', 8),
('D', 'C', 7),
('D', 'E', 2),
('D', 'F', 6),
('E', 'B', 8),
('E', 'G', 7),
('E', 'H', 4),
('F', 'C', 1),
('F', 'E', 5),
('G', 'H', 14),
('G', 'I', 9),
('H', 'F', 2),
('H', 'I', 10),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_bellman_ford('A')
path = graph.construct_path(previous, 'A', 'I')
self.assertEqual(path, ['A', 'B', 'D', 'E', 'H', 'I'])
previous, distances = graph.find_shortest_path_bellman_ford('E')
path = graph.construct_path(previous, 'E', 'C')
self.assertEqual(path, ['E', 'H', 'F', 'C'])
previous, distances = graph.find_shortest_path_bellman_ford('I')
# No such path.
with self.assertRaises(ValueError):
graph.construct_path(previous, 'I', 'A')
graph = DirectedGraph()
edges = [ # https://www.bogotobogo.com/python/python_Prims_Spanning_Tree_Data_Structure.php
('A', 'B', 7),
('A', 'C', 9),
('A', 'F', 14),
('B', 'A', 7),
('B', 'C', 10),
('B', 'D', 15),
('C', 'A', 9),
('C', 'B', 10),
('C', 'D', 11),
('C', 'F', 2),
('D', 'B', 15),
('D', 'C', 11),
('D', 'E', 6),
('E', 'D', 6),
('E', 'F', 9),
('F', 'A', 14),
('F', 'C', 2),
('F', 'E', 9),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
previous, distances = graph.find_shortest_path_bellman_ford('A')
path = graph.construct_path(previous, 'A', 'E')
self.assertEqual(path, ['A', 'C', 'F', 'E'])
path = graph.construct_path(previous, 'A', 'D')
self.assertEqual(path, ['A', 'C', 'D'])
def test_find_minimum_spanning_tree_prim_jarnik(self):
graph = DirectedGraph()
edges = [ # https://www.programiz.com/dsa/spanning-tree-and-minimum-spanning-tree
('A', 'B', 1),
('A', 'D', 1),
('B', 'A', 1),
('B', 'C', 1),
('C', 'B', 1),
('C', 'D', 1),
('D', 'A', 1),
('D', 'C', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
tree_edges = graph.find_minimum_spanning_tree_prim_jarnik('A')
total_weights = sum(edge[2] for edge in tree_edges)
self.assertEqual(total_weights, 3)
graph = DirectedGraph()
edges = [ # https://en.wikipedia.org/wiki/Minimum_spanning_tree
('A', 'B', 1),
('A', 'D', 4),
('A', 'E', 3),
('B', 'A', 1),
('B', 'D', 4),
('B', 'E', 2),
('C', 'E', 4),
('C', 'F', 5),
('D', 'A', 4),
('D', 'B', 4),
('D', 'E', 4),
('E', 'A', 3),
('E', 'B', 2),
('E', 'C', 4),
('E', 'D', 4),
('E', 'F', 7),
('F', 'C', 5),
('F', 'E', 7),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
tree_edges = graph.find_minimum_spanning_tree_prim_jarnik('A')
total_weights = sum(edge[2] for edge in tree_edges)
self.assertEqual(total_weights, 16)
graph = DirectedGraph()
edges = [ # http://dev.tutorialspoint.com/design_and_analysis_of_algorithms/design_and_analysis_of_algorithms_quick_guide.htm
(1, 2, 5),
(1, 3, 2),
(2, 1, 5),
(2, 3, 2),
(2, 4, 3),
(2, 5, 7),
(3, 1, 2),
(3, 2, 2),
(3, 4, 3),
(3, 7, 9),
(4, 2, 3),
(4, 3, 3),
(4, 5, 2),
(4, 7, 6),
(5, 2, 7),
(5, 4, 2),
(5, 6, 8),
(5, 7, 5),
(5, 8, 7),
(6, 5, 8),
(6, 8, 3),
(6, 9, 4),
(7, 3, 9),
(7, 4, 6),
(7, 5, 5),
(7, 8, 2),
(8, 5, 7),
(8, 6, 3),
(8, 7, 2),
(9, 6, 4),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
tree_edges = graph.find_minimum_spanning_tree_prim_jarnik(1)
total_weights = sum(edge[2] for edge in tree_edges)
self.assertEqual(total_weights, 23)
def test_find_minimum_spanning_tree_kruskal(self):
graph = DirectedGraph()
edges = [ # https://www.programiz.com/dsa/spanning-tree-and-minimum-spanning-tree
('A', 'B', 1),
('A', 'D', 1),
('B', 'A', 1),
('B', 'C', 1),
('C', 'B', 1),
('C', 'D', 1),
('D', 'A', 1),
('D', 'C', 1),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
tree_edges = graph.find_minimum_spanning_tree_kruskal('A')
total_weights = sum(edge[2] for edge in tree_edges)
self.assertEqual(total_weights, 3)
graph = DirectedGraph()
edges = [ # https://en.wikipedia.org/wiki/Minimum_spanning_tree
('A', 'B', 1),
('A', 'D', 4),
('A', 'E', 3),
('B', 'A', 1),
('B', 'D', 4),
('B', 'E', 2),
('C', 'E', 4),
('C', 'F', 5),
('D', 'A', 4),
('D', 'B', 4),
('D', 'E', 4),
('E', 'A', 3),
('E', 'B', 2),
('E', 'C', 4),
('E', 'D', 4),
('E', 'F', 7),
('F', 'C', 5),
('F', 'E', 7),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
tree_edges = graph.find_minimum_spanning_tree_kruskal('A')
total_weights = sum(edge[2] for edge in tree_edges)
self.assertEqual(total_weights, 16)
graph = DirectedGraph()
edges = [ # http://dev.tutorialspoint.com/design_and_analysis_of_algorithms/design_and_analysis_of_algorithms_quick_guide.htm
(1, 2, 5),
(1, 3, 2),
(2, 1, 5),
(2, 3, 2),
(2, 4, 3),
(2, 5, 7),
(3, 1, 2),
(3, 2, 2),
(3, 4, 3),
(3, 7, 9),
(4, 2, 3),
(4, 3, 3),
(4, 5, 2),
(4, 7, 6),
(5, 2, 7),
(5, 4, 2),
(5, 6, 8),
(5, 7, 5),
(5, 8, 7),
(6, 5, 8),
(6, 8, 3),
(6, 9, 4),
(7, 3, 9),
(7, 4, 6),
(7, 5, 5),
(7, 8, 2),
(8, 5, 7),
(8, 6, 3),
(8, 7, 2),
(9, 6, 4),
]
for src, des, weight in edges:
graph.add_edge(src, des, weight)
tree_edges = graph.find_minimum_spanning_tree_kruskal(1)
total_weights = sum(edge[2] for edge in tree_edges)
self.assertEqual(total_weights, 23)
if __name__ == '__main__':
unittest.main()
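The `DirectedGraph` class under test is not included in this chunk, so as a point of reference, here is a minimal standalone Dijkstra sketch that reproduces the first test case above. The `dijkstra` and `construct_path` functions are local stand-ins for illustration, not the class's actual methods:

```python
import heapq

def dijkstra(adj, source):
    """adj maps node -> [(neighbor, weight), ...]; returns (prev, dist)."""
    dist = {source: 0}
    prev = {}
    visited = set()
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        for v, w in adj.get(u, []):
            nd = d + w
            if v not in dist or nd < dist[v]:
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    return prev, dist

def construct_path(prev, src, dst):
    # Walk predecessors back from dst; raise if dst was never reached.
    if dst != src and dst not in prev:
        raise ValueError('no path from {} to {}'.format(src, dst))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1]

# Graph from the first Dijkstra test case above.
edges = [('A', 'B', 1), ('B', 'C', 3), ('B', 'D', 2), ('B', 'E', 1),
         ('C', 'E', 4), ('D', 'E', 2), ('E', 'F', 3), ('G', 'D', 1)]
adj = {}
for s, t, w in edges:
    adj.setdefault(s, []).append((t, w))

prev, dist = dijkstra(adj, 'A')
print(construct_path(prev, 'A', 'F'))  # ['A', 'B', 'E', 'F']
```

This matches the expected paths in the test: `A -> D` resolves to `['A', 'B', 'D']` and the unreachable target `G` raises `ValueError`, mirroring the `assertRaises` checks.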
# --- File: Python/branching.py | repo: MichaelSDavid/TwilioQuest-Code | license: MIT ---

import sys
if (int(sys.argv[1]) + int(sys.argv[2])) > 100:
    print("You have chosen the path of excess.")
elif 1 < (int(sys.argv[1]) + int(sys.argv[2])) <= 100:
    print("You have chosen the path of plenty.")
elif (int(sys.argv[1]) + int(sys.argv[2])) <= 0:
    print("You have chosen the path of destitution.")
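The same branch logic, refactored into a testable function (the name `choose_path` is illustrative, not from the original script). Note that with integer arguments a total of exactly 1 falls through every branch in the original, so the sketch returns `None` for that case:

```python
def choose_path(a, b):
    total = a + b
    if total > 100:
        return "excess"
    elif 1 < total <= 100:
        return "plenty"
    elif total <= 0:
        return "destitution"
    return None  # total == 1: no branch matches in the original script

print(choose_path(60, 50))  # excess
```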
# --- File: models/resnet.py | repo: lionelmessi6410/ntga | license: Apache-2.0 ---

import tensorflow as tf
from models.residual_block import make_basic_block_layer, make_bottleneck_layer


class ResNetTypeISmall(tf.keras.Model):
    def __init__(self, num_classes, layer_params):
        super(ResNetTypeISmall, self).__init__()

        self.conv1 = tf.keras.layers.Conv2D(filters=64,
                                            kernel_size=(3, 3),
                                            strides=1,
                                            padding="same")
        self.bn1 = tf.keras.layers.BatchNormalization()

        self.layer1 = make_basic_block_layer(filter_num=64,
                                             blocks=layer_params[0])
        self.layer2 = make_basic_block_layer(filter_num=128,
                                             blocks=layer_params[1],
                                             stride=2)
        self.layer3 = make_basic_block_layer(filter_num=256,
                                             blocks=layer_params[2],
                                             stride=2)
        self.layer4 = make_basic_block_layer(filter_num=512,
                                             blocks=layer_params[3],
                                             stride=2)

        self.avgpool = tf.keras.layers.GlobalAveragePooling2D()
        self.fc = tf.keras.layers.Dense(units=num_classes, activation=tf.keras.activations.softmax)

    def call(self, inputs, training=None, mask=None):
        x = self.conv1(inputs)
        x = self.bn1(x, training=training)
        x = tf.nn.relu(x)
        x = self.layer1(x, training=training)
        x = self.layer2(x, training=training)
        x = self.layer3(x, training=training)
        x = self.layer4(x, training=training)
        x = self.avgpool(x)
        output = self.fc(x)

        return output


class ResNetTypeI(tf.keras.Model):
    def __init__(self, num_classes, layer_params):
        super(ResNetTypeI, self).__init__()

        self.conv1 = tf.keras.layers.Conv2D(filters=64,
                                            kernel_size=(7, 7),
                                            strides=2,
                                            padding="same")
        self.bn1 = tf.keras.layers.BatchNormalization()
        self.pool1 = tf.keras.layers.MaxPool2D(pool_size=(3, 3),
                                               strides=2,
                                               padding="same")

        self.layer1 = make_basic_block_layer(filter_num=64,
                                             blocks=layer_params[0])
        self.layer2 = make_basic_block_layer(filter_num=128,
                                             blocks=layer_params[1],
                                             stride=2)
        self.layer3 = make_basic_block_layer(filter_num=256,
                                             blocks=layer_params[2],
                                             stride=2)
        self.layer4 = make_basic_block_layer(filter_num=512,
                                             blocks=layer_params[3],
                                             stride=2)

        self.avgpool = tf.keras.layers.GlobalAveragePooling2D()
        self.fc = tf.keras.layers.Dense(units=num_classes, activation=tf.keras.activations.softmax)

    def call(self, inputs, training=None, mask=None):
        x = self.conv1(inputs)
        x = self.bn1(x, training=training)
        x = tf.nn.relu(x)
        x = self.pool1(x)
        x = self.layer1(x, training=training)
        x = self.layer2(x, training=training)
        x = self.layer3(x, training=training)
        x = self.layer4(x, training=training)
        x = self.avgpool(x)
        output = self.fc(x)

        return output


class ResNetTypeIISmall(tf.keras.Model):
    def __init__(self, num_classes, layer_params):
        super(ResNetTypeIISmall, self).__init__()

        self.conv1 = tf.keras.layers.Conv2D(filters=64,
                                            kernel_size=(3, 3),
                                            strides=1,
                                            padding="same")
        self.bn1 = tf.keras.layers.BatchNormalization()

        self.layer1 = make_bottleneck_layer(filter_num=64,
                                            blocks=layer_params[0])
        self.layer2 = make_bottleneck_layer(filter_num=128,
                                            blocks=layer_params[1],
                                            stride=2)
        self.layer3 = make_bottleneck_layer(filter_num=256,
                                            blocks=layer_params[2],
                                            stride=2)
        self.layer4 = make_bottleneck_layer(filter_num=512,
                                            blocks=layer_params[3],
                                            stride=2)

        self.avgpool = tf.keras.layers.GlobalAveragePooling2D()
        self.fc = tf.keras.layers.Dense(units=num_classes, activation=tf.keras.activations.softmax)

    def call(self, inputs, training=None, mask=None):
        x = self.conv1(inputs)
        x = self.bn1(x, training=training)
        x = tf.nn.relu(x)
        x = self.layer1(x, training=training)
        x = self.layer2(x, training=training)
        x = self.layer3(x, training=training)
        x = self.layer4(x, training=training)
        x = self.avgpool(x)
        output = self.fc(x)

        return output


class ResNetTypeII(tf.keras.Model):
    def __init__(self, num_classes, layer_params):
        super(ResNetTypeII, self).__init__()

        self.conv1 = tf.keras.layers.Conv2D(filters=64,
                                            kernel_size=(7, 7),
                                            strides=2,
                                            padding="same")
        self.bn1 = tf.keras.layers.BatchNormalization()
        self.pool1 = tf.keras.layers.MaxPool2D(pool_size=(3, 3),
                                               strides=2,
                                               padding="same")

        self.layer1 = make_bottleneck_layer(filter_num=64,
                                            blocks=layer_params[0])
        self.layer2 = make_bottleneck_layer(filter_num=128,
                                            blocks=layer_params[1],
                                            stride=2)
        self.layer3 = make_bottleneck_layer(filter_num=256,
                                            blocks=layer_params[2],
                                            stride=2)
        self.layer4 = make_bottleneck_layer(filter_num=512,
                                            blocks=layer_params[3],
                                            stride=2)

        self.avgpool = tf.keras.layers.GlobalAveragePooling2D()
        self.fc = tf.keras.layers.Dense(units=num_classes, activation=tf.keras.activations.softmax)

    def call(self, inputs, training=None, mask=None):
        x = self.conv1(inputs)
        x = self.bn1(x, training=training)
        x = tf.nn.relu(x)
        x = self.pool1(x)
        x = self.layer1(x, training=training)
        x = self.layer2(x, training=training)
        x = self.layer3(x, training=training)
        x = self.layer4(x, training=training)
        x = self.avgpool(x)
        output = self.fc(x)

        return output


def ResNet18(input_shape, num_classes):
    if input_shape != (224, 224, 3):
        return ResNetTypeISmall(num_classes, layer_params=[2, 2, 2, 2])
    return ResNetTypeI(num_classes, layer_params=[2, 2, 2, 2])


def ResNet34(input_shape, num_classes):
    if input_shape != (224, 224, 3):
        return ResNetTypeISmall(num_classes, layer_params=[3, 4, 6, 3])
    return ResNetTypeI(num_classes, layer_params=[3, 4, 6, 3])


def ResNet50(input_shape, num_classes):
    if input_shape != (224, 224, 3):
        return ResNetTypeIISmall(num_classes, layer_params=[3, 4, 6, 3])
    return ResNetTypeII(num_classes, layer_params=[3, 4, 6, 3])


def ResNet101(input_shape, num_classes):
    if input_shape != (224, 224, 3):
        return ResNetTypeIISmall(num_classes, layer_params=[3, 4, 23, 3])
    return ResNetTypeII(num_classes, layer_params=[3, 4, 23, 3])


def ResNet152(input_shape, num_classes):
    if input_shape != (224, 224, 3):
        return ResNetTypeIISmall(num_classes, layer_params=[3, 8, 36, 3])
    return ResNetTypeII(num_classes, layer_params=[3, 8, 36, 3])
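The `layer_params` lists encode how many residual blocks each of the four stages gets, and the standard ResNet names count weighted layers: one stem conv, plus 2 convs per basic block (Type I) or 3 per bottleneck (Type II), plus the final dense layer. A quick arithmetic check (`resnet_depth` is a hypothetical helper, not part of the file above):

```python
def resnet_depth(layer_params, convs_per_block):
    # stem conv + (convs per block * total blocks) + final fc layer
    return 1 + convs_per_block * sum(layer_params) + 1

print(resnet_depth([2, 2, 2, 2], 2))   # 18  -> ResNet18 (basic blocks)
print(resnet_depth([3, 4, 6, 3], 3))   # 50  -> ResNet50 (bottlenecks)
print(resnet_depth([3, 8, 36, 3], 3))  # 152 -> ResNet152 (bottlenecks)
```

The same formula reproduces 34 for `[3, 4, 6, 3]` with basic blocks and 101 for `[3, 4, 23, 3]` with bottlenecks, matching every factory function above.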
# --- File: temp/test.py | repo: HiPeople21/Terminwind | license: MIT ---

import time
import asyncio

import aiofile


async def main():
    now = time.perf_counter()
    with open("./temp/f.py", "r") as f:
        g = f.read()
    print(time.perf_counter() - now)

    now = time.perf_counter()
    with open("./temp/f.py", "r") as f:
        g = ""
        for i in f:
            g += i
    print(time.perf_counter() - now)

    now = time.perf_counter()
    async with aiofile.async_open("./temp/f.py", "r") as f:
        g = await f.read()
    print(time.perf_counter() - now)

    now = time.perf_counter()
    async with aiofile.async_open("./temp/f.py", "r") as f:
        async for i in f:
            g += i
    print(time.perf_counter() - now)


asyncio.run(main())
# --- File: monk/system/imports.py | repo: gstearmit/monk_v1 | license: Apache-2.0 ---

import os
import sys
import shutil
import json
import pandas as pd
import numpy as np
import logging
import datetime
import functools
import inspect
import string
import warnings

from pylg import TraceFunction
from pylg import trace


class ArgumentValidationError(ValueError):
    '''
    Raised when the type of an argument to a function is not what it should be.
    '''
    def __init__(self, arg_num, func_name, accepted_arg_type, given_arg_type, list_type):
        if(list_type):
            self.error = 'The {0} argument of {1}() is not in the list {2}, but is {3}'.format(arg_num,
                                                                                               func_name,
                                                                                               accepted_arg_type,
                                                                                               given_arg_type)
        else:
            self.error = 'The {0} argument of {1}() is not a {2}, but is {3}'.format(arg_num,
                                                                                     func_name,
                                                                                     accepted_arg_type,
                                                                                     given_arg_type)

    def __str__(self):
        return self.error


class InvalidArgumentNumberError(ValueError):
    '''
    Raised when the number of arguments supplied to a function is incorrect.
    Note that this check is only performed from the number of arguments
    specified in the validate_accept() decorator. If the validate_accept()
    call is incorrect, it is possible to have a valid function where this
    will report a false validation.
    '''
    def __init__(self, func_name):
        self.error = 'Invalid number of arguments for {0}()'.format(func_name)

    def __str__(self):
        return self.error


class InvalidReturnType(ValueError):
    '''
    As the name implies, the return value is the wrong type.
    '''
    def __init__(self, return_type, func_name):
        self.error = 'Invalid return type {0} for {1}()'.format(return_type,
                                                                func_name)

    def __str__(self):
        return self.error


def ordinal(num):
    '''
    Returns the ordinal number of a given integer, as a string.
    eg. 1 -> 1st, 2 -> 2nd, 3 -> 3rd, etc.
    '''
    if 10 <= num % 100 < 20:
        return '{0}th'.format(num)
    else:
        ord = {1: 'st', 2: 'nd', 3: 'rd'}.get(num % 10, 'th')
        return '{0}{1}'.format(num, ord)
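A quick sanity check of the `ordinal()` logic, re-declared here so the snippet runs standalone. The `10 <= num % 100 < 20` guard is what makes 11, 12, and 13 (and 111, 212, ...) take the "th" suffix instead of "st"/"nd"/"rd":

```python
def ordinal(num):
    # 11th-19th (and 111th, 212th, ...) are special-cased by the
    # `10 <= num % 100 < 20` guard; otherwise the last digit decides.
    if 10 <= num % 100 < 20:
        return '{0}th'.format(num)
    suffix = {1: 'st', 2: 'nd', 3: 'rd'}.get(num % 10, 'th')
    return '{0}{1}'.format(num, suffix)

print([ordinal(n) for n in (1, 2, 3, 4, 11, 12, 13, 21, 22, 111)])
# ['1st', '2nd', '3rd', '4th', '11th', '12th', '13th', '21st', '22nd', '111th']
```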
def accepts(*accepted_arg_types, **accepted_arg_dicts):
    '''
    A decorator to validate the parameter types of a given function.
    It is passed a tuple of types. eg. (<type 'tuple'>, <type 'int'>)

    Note: It doesn't do a deep check, for example checking through a
    tuple of types. The argument passed must only be types.
    '''
    def accept_decorator(validate_function):
        # Check if the number of arguments to the validator
        # function is the same as the arguments provided
        # to the actual function to validate. We don't need
        # to check if the function to validate has the right
        # amount of arguments, as Python will do this
        # automatically (also with a TypeError).
        @functools.wraps(validate_function)
        def decorator_wrapper(*function_args, **function_args_dicts):
            if len(function_args) > len(accepted_arg_types):
                raise InvalidArgumentNumberError(validate_function.__name__)

            # We're using enumerate to get the index, so we can pass the
            # argument number with the incorrect type to ArgumentValidationError.
            i = 0;
            for arg_num, (actual_arg, accepted_arg_type) in enumerate(zip(function_args, accepted_arg_types)):
                if(accepted_arg_type == "self"):
                    continue;
                if(type(accepted_arg_type) == list):
                    if type(actual_arg) not in accepted_arg_type:
                        ord_num = ordinal(arg_num + 1)
                        if(accepted_arg_dicts["post_trace"]):
                            raise ArgumentValidationError(ord_num,
                                                          validate_function.function.function.__name__,
                                                          accepted_arg_type,
                                                          type(actual_arg),
                                                          True)
                        else:
                            raise ArgumentValidationError(ord_num,
                                                          validate_function.__name__,
                                                          accepted_arg_type,
                                                          type(actual_arg),
                                                          True)
                else:
                    if not type(actual_arg) is accepted_arg_type:
                        ord_num = ordinal(arg_num + 1)
                        if(accepted_arg_dicts["post_trace"]):
                            raise ArgumentValidationError(ord_num,
                                                          validate_function.function.function.__name__,
                                                          accepted_arg_type,
                                                          type(actual_arg),
                                                          False)
                        else:
                            raise ArgumentValidationError(ord_num,
                                                          validate_function.__name__,
                                                          accepted_arg_type,
                                                          type(actual_arg),
                                                          False)
                i += 1;

            keys = list(function_args_dicts.keys());
            for i in range(len(keys)):
                func_arg_type = type(function_args_dicts[keys[i]]);
                accepted_arg_type = accepted_arg_dicts[keys[i]];
                if(type(accepted_arg_type) == list):
                    if(func_arg_type not in accepted_arg_type):
                        if(accepted_arg_dicts["post_trace"]):
                            raise ArgumentValidationError(keys[i],
                                                          validate_function.function.function.__name__,
                                                          accepted_arg_type,
                                                          func_arg_type,
                                                          True)
                        else:
                            raise ArgumentValidationError(keys[i],
                                                          validate_function.__name__,
                                                          accepted_arg_type,
                                                          func_arg_type,
                                                          True)
                else:
                    if(func_arg_type != accepted_arg_type):
                        if(accepted_arg_dicts["post_trace"]):
                            raise ArgumentValidationError(keys[i],
                                                          validate_function.function.function.__name__,
                                                          accepted_arg_type,
                                                          func_arg_type,
                                                          False)
                        else:
                            raise ArgumentValidationError(keys[i],
                                                          validate_function.__name__,
                                                          accepted_arg_type,
                                                          func_arg_type,
                                                          False)

            return validate_function(*function_args, **function_args_dicts)
        return decorator_wrapper
    return accept_decorator
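The core pattern in `accepts()` above — a decorator factory that zips incoming positional arguments against declared types and raises on mismatch — can be shown in a much smaller standalone sketch. `accepts_simple` and `repeat` below are illustrative names, and this version drops the `post_trace`/pylg plumbing and the keyword-argument loop:

```python
import functools

def accepts_simple(*types):
    # Simplified sketch of the accepts() pattern: validate positional
    # argument types before invoking the wrapped function.
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for i, (arg, t) in enumerate(zip(args, types)):
                if t == "self":  # same placeholder convention as accepts()
                    continue
                if not isinstance(arg, t):
                    raise TypeError('arg {} of {}() must be {}, got {}'.format(
                        i + 1, fn.__name__, t.__name__, type(arg).__name__))
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@accepts_simple(int, str)
def repeat(n, s):
    return s * n

print(repeat(3, 'ab'))  # ababab
```

Calling `repeat('3', 'ab')` raises `TypeError` before the body runs, which is the same fail-fast behavior the monk decorators rely on.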
class ConstraintError(ValueError):
    '''
    Raised when the value of an argument to a function violates its constraint.
    '''
    def __init__(self, msg):
        self.error = msg

    def __str__(self):
        return self.error


def ConstraintWarning(msg):
    warnings.warn(msg)


def error_checks(*arg_constraints, **kwargs_constraints):
    def check_gte(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in [int, float]):
            if(actual_value < limit):
                msg += "Value expected to be greater than or equal to \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);
        if(type(actual_value) in [list, tuple]):
            for i in range(len(actual_value)):
                if(actual_value[i] < limit):
                    msg += "List's arg number \"{}\" expected to be greater than or equal to \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
                    raise ConstraintError(msg);

    def check_gt(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in [int, float]):
            if(actual_value <= limit):
                msg += "Value expected to be strictly greater than \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);
        if(type(actual_value) in [list, tuple]):
            for i in range(len(actual_value)):
                if(actual_value[i] <= limit):
                    msg += "List's arg number \"{}\" expected to be strictly greater than \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
                    raise ConstraintError(msg);

    def check_lte(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in [int, float]):
            if(actual_value > limit):
                msg += "Value expected to be less than or equal to \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);
        if(type(actual_value) in [list, tuple]):
            for i in range(len(actual_value)):
                if(actual_value[i] > limit):
                    msg += "List's arg number \"{}\" expected to be less than or equal to \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
                    raise ConstraintError(msg);

    def check_lt(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in [int, float]):
            if(actual_value >= limit):
                msg += "Value expected to be strictly less than \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);
        if(type(actual_value) in [list, tuple]):
            for i in range(len(actual_value)):
                if(actual_value[i] >= limit):
                    msg += "List's arg number \"{}\" expected to be strictly less than \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
                    raise ConstraintError(msg);

    def check_eq(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in [int, float, str, list, tuple]):
            if(actual_value != limit):
                msg += "Value expected to be strictly equal to \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);

    def check_neq(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in [int, float, str, list, tuple]):
            if(actual_value == limit):
                msg += "Value expected to be strictly not equal to \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);

    def check_in(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in list(map(type, limit))):
            if(actual_value not in limit):
                msg += "Value expected to be one among \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);

    def check_nin(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) in list(map(type, limit))):
            if(actual_value in limit):
                msg += "Value expected to be anything except \"{}\", but is \"{}\"".format(limit, actual_value);
                raise ConstraintError(msg);

    def check_folder(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) == str):
            if(not os.path.isdir(actual_value)):
                msg = "Folder \"{}\" not found".format(actual_value)
                raise ConstraintError(msg);
            if(limit == "r"):
                if(not os.access(actual_value, os.R_OK)):
                    msg = "Folder \"{}\" has no read access".format(actual_value)
                    raise ConstraintError(msg);
            if(limit == "w"):
                if(not os.access(actual_value, os.W_OK)):
                    msg = "Folder \"{}\" has no write access".format(actual_value)
                    raise ConstraintError(msg);
        if(type(actual_value) == list):
            for i in range(len(actual_value)):
                if(not os.path.isdir(actual_value[i])):
                    msg = "Folder \"{}\" not found".format(actual_value[i])
                    raise ConstraintError(msg);
                if(limit == "r"):
                    if(not os.access(actual_value[i], os.R_OK)):
                        msg = "Folder \"{}\" has no read access".format(actual_value[i])
                        raise ConstraintError(msg);
                if(limit == "w"):
                    if(not os.access(actual_value[i], os.W_OK)):
                        msg = "Folder \"{}\" has no write access".format(actual_value[i])
                        raise ConstraintError(msg);

    def check_file(actual_value, limit, function_name, arg_num=None, arg_name=None):
        if(arg_num):
            arg = arg_num;
            msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
        if(arg_name):
            arg = arg_name;
            msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);

        if(type(actual_value) == str):
            if(not os.path.isfile(actual_value)):
                msg = "File \"{}\" not found".format(actual_value)
                raise ConstraintError(msg);
if(limit == "r"):
if(not os.access(actual_value, os.R_OK)):
msg = "File \"{}\" has no read access".format(actual_value)
raise ConstraintError(msg);
if(limit == "w"):
if(not os.access(actual_value, os.W_OK)):
msg = "File \"{}\" has no write access".format(actual_value)
raise ConstraintError(msg);
if(type(actual_value) == list):
for i in range(len(actual_value)):
if(not os.path.isfile(actual_value[i])):
msg = "File \"{}\" not found".format(actual_value[i])
raise ConstraintError(msg);
if(limit == "r"):
if(not os.access(actual_value[i], os.R_OK)):
msg = "File \"{}\" has no read access".format(actual_value[i])
raise ConstraintError(msg);
if(limit == "w"):
if(not os.access(actual_value[i], os.W_OK)):
msg = "File \"{}\" has no write access".format(actual_value[i])
raise ConstraintError(msg);
def check_inc(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == list):
if(sorted(actual_value) != actual_value):
msg += "List expected to be in increasing order, but is \"{}\"".format(actual_value);
raise ConstraintError(msg);
def check_dec(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == list):
if(sorted(actual_value, reverse=True) != actual_value):
msg += "List expected to be in decreasing order, but is \"{}\"".format(actual_value);
raise ConstraintError(msg);
def check_name(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == str):
total_list = [];
for i in range(len(limit)):
if(limit[i] == "a-z"):
total_list += list(string.ascii_lowercase)
elif(limit[i] == "A-Z"):
total_list += list(string.ascii_uppercase)
elif(limit[i] == "0-9"):
total_list += list(string.digits)
else:
total_list += limit[i];
actual_value = list(actual_value)
for j in range(len(actual_value)):
if(actual_value[j] not in total_list):
msg += "Character \"{}\" not allowed as per constraints \"{}\"".format(actual_value[j], limit);
raise ConstraintError(msg);
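The "name" constraint above expands tokens such as "a-z", "A-Z" and "0-9" into an explicit character allow-list before scanning the value. A standalone sketch of that expansion (the helper name `allowed_chars` is illustrative, not part of the library):

```python
import string

def allowed_chars(limit):
    # Expand check_name-style constraint tokens into a character allow-list.
    total_list = []
    for token in limit:
        if token == "a-z":
            total_list += list(string.ascii_lowercase)
        elif token == "A-Z":
            total_list += list(string.ascii_uppercase)
        elif token == "0-9":
            total_list += list(string.digits)
        else:
            # Any other token contributes its characters literally, e.g. "_-".
            total_list += token
    return total_list

chars = allowed_chars(["a-z", "0-9", "_"])
print("_" in chars and "q" in chars and "Q" not in chars)  # True
```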
def accept_decorator(validate_function):
@functools.wraps(validate_function)
def decorator_wrapper(*function_args, **function_args_dicts):
if len(arg_constraints) != len(function_args):
raise InvalidArgumentNumberError(validate_function.__name__)
if(kwargs_constraints.get("post_trace")):
function_name = validate_function.function.function.__name__;
else:
function_name = validate_function.__name__;
for arg_num, (actual_arg, arg_constraint) in enumerate(zip(function_args, arg_constraints)):
if(arg_constraint):
for i in range(len(arg_constraint)//2):
if(arg_constraint[i*2] == "gte"):
check_gte(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "gt"):
check_gt(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "lte"):
check_lte(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "lt"):
check_lt(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "eq"):
check_eq(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "neq"):
check_neq(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "in"):
check_in(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "nin"):
check_nin(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "folder"):
check_folder(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "file"):
check_file(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "inc"):
check_inc(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "dec"):
check_dec(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "name"):
check_name(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
keys = list(function_args_dicts.keys());
for x in range(len(keys)):
actual_arg = function_args_dicts[keys[x]];
arg_constraint = kwargs_constraints.get(keys[x]);
if(arg_constraint):
for i in range(len(arg_constraint)//2):
if(arg_constraint[i*2] == "gte"):
check_gte(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "gt"):
check_gt(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "lte"):
check_lte(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "lt"):
check_lt(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "eq"):
check_eq(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "neq"):
check_neq(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "in"):
check_in(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "nin"):
check_nin(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "folder"):
check_folder(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "file"):
check_file(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "inc"):
check_inc(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "dec"):
check_dec(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "name"):
check_name(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
return validate_function(*function_args, **function_args_dicts)
return decorator_wrapper
return accept_decorator
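The dispatch loops above consume each constraint list as flat (operation, limit) pairs: `arg_constraint[i*2]` names the rule and `arg_constraint[i*2+1]` supplies its limit. A minimal sketch of that pairing (the sample list is illustrative):

```python
# Flat constraint lists such as ["gte", 0, "lt", 10] are read in
# (operation, limit) pairs, exactly as the dispatch loop does.
constraint = ["gte", 0, "lt", 10]
pairs = [(constraint[i * 2], constraint[i * 2 + 1]) for i in range(len(constraint) // 2)]
print(pairs)  # [('gte', 0), ('lt', 10)]
```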
def warning_checks(*arg_constraints, **kwargs_constraints):
import warnings
def check_gte(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in [int, float]):
if(actual_value < limit):
msg += "Value expected to be greater than or equal to \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
if(type(actual_value) in [list, tuple]):
for i in range(len(actual_value)):
if(actual_value[i] < limit):
msg += "List's arg number \"{}\" expected to be greater than or equal to \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
warnings.warn(msg, ConstraintWarning);
def check_gt(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in [int, float]):
if(actual_value <= limit):
msg += "Value expected to be strictly greater than \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
if(type(actual_value) in [list, tuple]):
for i in range(len(actual_value)):
if(actual_value[i] <= limit):
msg += "List's arg number \"{}\" expected to be strictly greater than \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
warnings.warn(msg, ConstraintWarning);
def check_lte(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in [int, float]):
if(actual_value > limit):
msg += "Value expected to be less than or equal to \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
if(type(actual_value) in [list, tuple]):
for i in range(len(actual_value)):
if(actual_value[i] > limit):
msg += "List's arg number \"{}\" expected to be less than or equal to \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
warnings.warn(msg, ConstraintWarning);
def check_lt(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in [int, float]):
if(actual_value >= limit):
msg += "Value expected to be strictly less than \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
if(type(actual_value) in [list, tuple]):
for i in range(len(actual_value)):
if(actual_value[i] >= limit):
msg += "List's arg number \"{}\" expected to be strictly less than \"{}\", but is \"{}\"".format(i+1, limit, actual_value[i]);
warnings.warn(msg, ConstraintWarning);
def check_eq(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in [int, float, str, list, tuple]):
if(actual_value != limit):
msg += "Value expected to be strictly equal to \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
def check_neq(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in [int, float, str, list, tuple]):
if(actual_value == limit):
msg += "Value expected to be strictly not equal to \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
def check_in(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in list(map(type, limit))):
if(actual_value not in limit):
msg += "Value expected to be one among \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
def check_nin(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) in list(map(type, limit))):
if(actual_value in limit):
msg += "Value expected to be anything except \"{}\", but is \"{}\"".format(limit, actual_value);
warnings.warn(msg, ConstraintWarning);
def check_folder(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == str):
if(not os.path.isdir(actual_value)):
msg = "Folder \"{}\" not found".format(actual_value)
warnings.warn(msg, ConstraintWarning);
if(limit == "r"):
if(not os.access(actual_value, os.R_OK)):
msg = "Folder \"{}\" has no read access".format(actual_value)
warnings.warn(msg, ConstraintWarning);
if(limit == "w"):
if(not os.access(actual_value, os.W_OK)):
msg = "Folder \"{}\" has no write access".format(actual_value)
warnings.warn(msg, ConstraintWarning);
if(type(actual_value) == list):
for i in range(len(actual_value)):
if(not os.path.isdir(actual_value[i])):
msg = "Folder \"{}\" not found".format(actual_value[i])
warnings.warn(msg, ConstraintWarning);
if(limit == "r"):
if(not os.access(actual_value[i], os.R_OK)):
msg = "Folder \"{}\" has no read access".format(actual_value[i])
warnings.warn(msg, ConstraintWarning);
if(limit == "w"):
if(not os.access(actual_value[i], os.W_OK)):
msg = "Folder \"{}\" has no write access".format(actual_value[i])
warnings.warn(msg, ConstraintWarning);
def check_file(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == str):
if(not os.path.isfile(actual_value)):
msg = "File \"{}\" not found".format(actual_value)
warnings.warn(msg, ConstraintWarning);
if(limit == "r"):
if(not os.access(actual_value, os.R_OK)):
msg = "File \"{}\" has no read access".format(actual_value)
warnings.warn(msg, ConstraintWarning);
if(limit == "w"):
if(not os.access(actual_value, os.W_OK)):
msg = "File \"{}\" has no write access".format(actual_value)
warnings.warn(msg, ConstraintWarning);
if(type(actual_value) == list):
for i in range(len(actual_value)):
if(not os.path.isfile(actual_value[i])):
msg = "File \"{}\" not found".format(actual_value[i])
warnings.warn(msg, ConstraintWarning);
if(limit == "r"):
if(not os.access(actual_value[i], os.R_OK)):
msg = "File \"{}\" has no read access".format(actual_value[i])
warnings.warn(msg, ConstraintWarning);
if(limit == "w"):
if(not os.access(actual_value[i], os.W_OK)):
msg = "File \"{}\" has no write access".format(actual_value[i])
warnings.warn(msg, ConstraintWarning);
def check_inc(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == list):
if(sorted(actual_value) != actual_value):
msg += "List expected to be in increasing order, but is \"{}\"".format(actual_value);
warnings.warn(msg, ConstraintWarning);
def check_dec(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == list):
if(sorted(actual_value, reverse=True) != actual_value):
msg += "List expected to be in decreasing order, but is \"{}\"".format(actual_value);
warnings.warn(msg, ConstraintWarning);
def check_name(actual_value, limit, function_name, arg_num=None, arg_name=None):
if(arg_num):
arg = arg_num;
msg = "Constraint Mismatch for argument number \"{}\" in function \"{}\".\n".format(arg, function_name);
if(arg_name):
arg = arg_name;
msg = "Constraint Mismatch for argument name \"{}\" in function \"{}\".\n".format(arg, function_name);
if(type(actual_value) == str):
total_list = [];
for i in range(len(limit)):
if(limit[i] == "a-z"):
total_list += list(string.ascii_lowercase)
elif(limit[i] == "A-Z"):
total_list += list(string.ascii_uppercase)
elif(limit[i] == "0-9"):
total_list += list(string.digits)
else:
total_list += limit[i];
actual_value = list(actual_value)
for j in range(len(actual_value)):
if(actual_value[j] not in total_list):
msg += "Character \"{}\" not allowed as per constraints \"{}\"".format(actual_value[j], limit);
warnings.warn(msg, ConstraintWarning);
def accept_decorator(validate_function):
@functools.wraps(validate_function)
def decorator_wrapper(*function_args, **function_args_dicts):
if len(arg_constraints) != len(function_args):
raise InvalidArgumentNumberError(validate_function.__name__)
if(kwargs_constraints.get("post_trace")):
function_name = validate_function.function.function.__name__;
else:
function_name = validate_function.__name__;
for arg_num, (actual_arg, arg_constraint) in enumerate(zip(function_args, arg_constraints)):
if(arg_constraint):
for i in range(len(arg_constraint)//2):
if(arg_constraint[i*2] == "gte"):
check_gte(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "gt"):
check_gt(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "lte"):
check_lte(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "lt"):
check_lt(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "eq"):
check_eq(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "neq"):
check_neq(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "in"):
check_in(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "nin"):
check_nin(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "folder"):
check_folder(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "file"):
check_file(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "inc"):
check_inc(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "dec"):
check_dec(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
if(arg_constraint[i*2] == "name"):
check_name(actual_arg, arg_constraint[i*2+1], function_name, arg_num=arg_num+1);
keys = list(function_args_dicts.keys());
for x in range(len(keys)):
actual_arg = function_args_dicts[keys[x]];
arg_constraint = kwargs_constraints.get(keys[x]);
if(arg_constraint):
for i in range(len(arg_constraint)//2):
if(arg_constraint[i*2] == "gte"):
check_gte(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "gt"):
check_gt(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "lte"):
check_lte(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "lt"):
check_lt(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "eq"):
check_eq(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "neq"):
check_neq(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "in"):
check_in(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "nin"):
check_nin(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "folder"):
check_folder(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "file"):
check_file(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "inc"):
check_inc(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "dec"):
check_dec(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
if(arg_constraint[i*2] == "name"):
check_name(actual_arg, arg_constraint[i*2+1], function_name, arg_name=keys[x]);
return validate_function(*function_args, **function_args_dicts)
return decorator_wrapper
return accept_decorator
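`warning_checks` mirrors the raising variant but is meant to surface a `ConstraintWarning` rather than abort the call; constructing `ConstraintWarning(msg)` alone is a no-op, so the warning must be emitted through `warnings.warn`. A self-contained sketch of emitting and capturing such a warning (the `ConstraintWarning` subclass here is a local stand-in, and `warn_if_negative` is illustrative):

```python
import warnings

class ConstraintWarning(UserWarning):
    """Local stand-in for the library's ConstraintWarning."""
    pass

def warn_if_negative(value, limit=0):
    # Emit the warning; merely constructing ConstraintWarning(msg) would do nothing.
    if value < limit:
        msg = "Value expected to be greater than or equal to \"{}\", but is \"{}\"".format(limit, value)
        warnings.warn(msg, ConstraintWarning)
    return value

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warn_if_negative(-3)
print(len(caught))  # 1
```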
"""
def accepts(*accepted_arg_types, post_trace=False):
'''
A decorator to validate the parameter types of a given function.
It is passed a tuple of types. eg. (<type 'tuple'>, <type 'int'>)
Note: It doesn't do a deep check, for example checking through a
tuple of types. The argument passed must only be types.
'''
def accept_decorator(validate_function):
# Check if the number of arguments to the validator
# function is the same as the arguments provided
# to the actual function to validate. We don't need
# to check if the function to validate has the right
# amount of arguments, as Python will do this
# automatically (also with a TypeError).
@functools.wraps(validate_function)
def decorator_wrapper(*function_args, **function_args_dict):
if len(accepted_arg_types) != len(function_args):
raise InvalidArgumentNumberError(validate_function.__name__)
# We're using enumerate to get the index, so we can pass the
# argument number with the incorrect type to ArgumentValidationError.
i = 0;
#print(function_args, accepted_arg_types);
for arg_num, (actual_arg, accepted_arg_type) in enumerate(zip(function_args, accepted_arg_types)):
if(accepted_arg_type=="self"):
continue;
if(type(accepted_arg_type)) == list:
print(actual_arg, accepted_arg_type)
if type(actual_arg) not in accepted_arg_type:
ord_num = ordinal(arg_num + 1)
if(post_trace):
raise ArgumentValidationError(ord_num,
validate_function.function.function.__name__,
accepted_arg_type)
else:
raise ArgumentValidationError(ord_num,
validate_function.__name__,
accepted_arg_type)
else:
if not type(actual_arg) is accepted_arg_type:
ord_num = ordinal(arg_num + 1)
if(post_trace):
raise ArgumentValidationError(ord_num,
validate_function.function.function.__name__,
accepted_arg_type)
else:
raise ArgumentValidationError(ord_num,
validate_function.__name__,
accepted_arg_type)
i += 1;
return validate_function(*function_args, **function_args_dict)
return decorator_wrapper
return accept_decorator
"""
"""
def returns(*accepted_return_type_tuple, post_trace=False):
'''
Validates the return type. Since there's only ever one
return type, this makes life simpler. Along with the
accepts() decorator, this also only does a check for
the top argument. For example you couldn't check
(<type 'tuple'>, <type 'int'>, <type 'str'>).
In that case you could only check if it was a tuple.
'''
def return_decorator(validate_function):
# No return type has been specified.
if len(accepted_return_type_tuple) == 0:
raise TypeError('You must specify a return type.')
@functools.wraps(validate_function)
def decorator_wrapper(*function_args):
# More than one return type has been specified.
if len(accepted_return_type_tuple) > 1:
raise TypeError('You must specify one return type.')
# Since the decorator receives a tuple of arguments
# and the is only ever one object returned, we'll just
# grab the first parameter.
accepted_return_type = accepted_return_type_tuple[0]
# We'll execute the function, and
# take a look at the return type.
return_value = validate_function(*function_args)
return_value_type = type(return_value)
if return_value_type is not accepted_return_type:
if(post_trace):
raise InvalidReturnType(return_value_type,
validate_function.function.function.__name__)
else:
raise InvalidReturnType(return_value_type,
validate_function.__name__)
return return_value
return decorator_wrapper
return return_decorator
# new_f.__name__ = f.__name__
# return new_f
# check_accepts.__name__ = f.__name__
'''
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
DATE_TIME = datetime.datetime.now()
HANDLER = logging.FileHandler("process.log", 'w')
HANDLER.setLevel(logging.DEBUG)
FORMATTER = logging.Formatter('[%(asctime)s] p%(process)s {%(filename)s:%(lineno)d} %(levelname)s \
- %(message)s', '%m-%d %H:%M:%S')
HANDLER.setFormatter(FORMATTER)
logger.addHandler(HANDLER)
def logged(class_func=False):
def wrap(function):
@functools.wraps(function)
def wrapper(*args, **kwargs):
if(class_func):
logger.debug("Calling function '{}' with args={} kwargs={}".format(function.__name__, args[1:], kwargs))
else:
logger.debug("Calling function '{}' with args={} kwargs={}".format(function.__name__, args, kwargs))
try:
response = function(*args, **kwargs)
except Exception as error:
logger.error("Function '{}' raised {} with error '{}'"
.format(function.__name__,
error.__class__.__name__,
str(error)))
raise error
logger.debug("Function '{}' returned {}"
.format(function.__name__,
response))
return function(*args, **kwargs)
return wrapper
return wrap
'''
""" | 50.568204 | 158 | 0.52846 | 5,856 | 51,529 | 4.423839 | 0.048839 | 0.082375 | 0.056203 | 0.060218 | 0.904076 | 0.894272 | 0.893345 | 0.881649 | 0.877828 | 0.8703 | 0 | 0.007238 | 0.351161 | 51,529 | 1,019 | 159 | 50.568204 | 0.767602 | 0.025015 | 0 | 0.921127 | 0 | 0.001408 | 0.128546 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.06338 | false | 0 | 0.019718 | 0.005634 | 0.109859 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
"""
SCORM Cloud Rest API
REST API used for SCORM Cloud integrations.
OpenAPI spec version: 2.0
Contact: systems@rusticisoftware.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class RegistrationApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def build_registration_launch_link(self, registration_id, launch_link_request, **kwargs):
"""
Get registration launch link.
Returns the link to use to launch this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.build_registration_launch_link(registration_id, launch_link_request, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param LaunchLinkRequestSchema launch_link_request: (required)
:return: LaunchLinkSchema
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.build_registration_launch_link_with_http_info(registration_id, launch_link_request, **kwargs)
else:
(data) = self.build_registration_launch_link_with_http_info(registration_id, launch_link_request, **kwargs)
return data
def build_registration_launch_link_with_http_info(self, registration_id, launch_link_request, **kwargs):
"""
Get registration launch link.
Returns the link to use to launch this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.build_registration_launch_link_with_http_info(registration_id, launch_link_request, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param LaunchLinkRequestSchema launch_link_request: (required)
:return: LaunchLinkSchema
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'launch_link_request']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method build_registration_launch_link" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `build_registration_launch_link`")
# verify the required parameter 'launch_link_request' is set
if ('launch_link_request' not in params) or (params['launch_link_request'] is None):
raise ValueError("Missing the required parameter `launch_link_request` when calling `build_registration_launch_link`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/launchLink'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'launch_link_request' in params:
body_params = params['launch_link_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='LaunchLinkSchema',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_new_registration_instance(self, registration_id, **kwargs):
"""
Create a new instance for this registration specified by the registration ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_new_registration_instance(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.create_new_registration_instance_with_http_info(registration_id, **kwargs)
else:
data = self.create_new_registration_instance_with_http_info(registration_id, **kwargs)
return data
def create_new_registration_instance_with_http_info(self, registration_id, **kwargs):
"""
Create a new instance for this registration specified by the registration ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_new_registration_instance_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_new_registration_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `create_new_registration_instance`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/instances'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_registration(self, registration, **kwargs):
"""
Create a registration.
This method is used to create a new registration. A registration will contain a few pieces of information such as a learner name, a learner id, and optionally, information about where activity data should be posted (for client consumption), as well as a way to specify simple authentication schemes for posting said data. A registration must be tied to a specific course at creation time. When the created registration is “launched”, the course specified at creation time will be launched.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_registration(registration, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param CreateRegistrationSchema registration: (required)
:param int course_version: The version of the course you want to create the registration for. Unless you have a specific reason to target a particular version, you probably do not need to set this.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.create_registration_with_http_info(registration, **kwargs)
else:
data = self.create_registration_with_http_info(registration, **kwargs)
return data
def create_registration_with_http_info(self, registration, **kwargs):
"""
Create a registration.
This method is used to create a new registration. A registration will contain a few pieces of information such as a learner name, a learner id, and optionally, information about where activity data should be posted (for client consumption), as well as a way to specify simple authentication schemes for posting said data. A registration must be tied to a specific course at creation time. When the created registration is “launched”, the course specified at creation time will be launched.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_registration_with_http_info(registration, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param CreateRegistrationSchema registration: (required)
:param int course_version: The version of the course you want to create the registration for. Unless you have a specific reason to target a particular version, you probably do not need to set this.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration', 'course_version']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_registration" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration' is set
if ('registration' not in params) or (params['registration'] is None):
raise ValueError("Missing the required parameter `registration` when calling `create_registration`")
collection_formats = {}
resource_path = '/registrations'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'course_version' in params:
query_params['courseVersion'] = params['course_version']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'registration' in params:
body_params = params['registration']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_registration(self, registration_id, **kwargs):
"""
Delete a registration.
Delete `registrationId`. This includes all instances of this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_registration_with_http_info(registration_id, **kwargs)
else:
data = self.delete_registration_with_http_info(registration_id, **kwargs)
return data
def delete_registration_with_http_info(self, registration_id, **kwargs):
"""
Delete a registration.
Delete `registrationId`. This includes all instances of this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_registration" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `delete_registration`")
collection_formats = {}
resource_path = '/registrations/{registrationId}'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_registration_configuration_setting(self, registration_id, setting_id, **kwargs):
"""
Clear a registration configuration.
Clears the `settingId` value for this registration. The effective value will become the value at the next level which has an explicit value set. Possibilities are course, application, or default.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_configuration_setting(registration_id, setting_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param str setting_id: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_registration_configuration_setting_with_http_info(registration_id, setting_id, **kwargs)
else:
data = self.delete_registration_configuration_setting_with_http_info(registration_id, setting_id, **kwargs)
return data
def delete_registration_configuration_setting_with_http_info(self, registration_id, setting_id, **kwargs):
"""
Clear a registration configuration.
Clears the `settingId` value for this registration. The effective value will become the value at the next level which has an explicit value set. Possibilities are course, application, or default.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_configuration_setting_with_http_info(registration_id, setting_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param str setting_id: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'setting_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_registration_configuration_setting" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `delete_registration_configuration_setting`")
# verify the required parameter 'setting_id' is set
if ('setting_id' not in params) or (params['setting_id'] is None):
raise ValueError("Missing the required parameter `setting_id` when calling `delete_registration_configuration_setting`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/configuration/{settingId}'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
if 'setting_id' in params:
path_params['settingId'] = params['setting_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_registration_global_data(self, registration_id, **kwargs):
"""
Delete the global data of a registration.
Delete global data associated with `registrationId`. Calling this method will reset all global objectives associated with this registration, if any exist.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_global_data(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_registration_global_data_with_http_info(registration_id, **kwargs)
else:
data = self.delete_registration_global_data_with_http_info(registration_id, **kwargs)
return data
def delete_registration_global_data_with_http_info(self, registration_id, **kwargs):
"""
Delete the global data of a registration.
Delete global data associated with `registrationId`. Calling this method will reset all global objectives associated with this registration, if any exist.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_global_data_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_registration_global_data" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `delete_registration_global_data`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/globalData'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_registration_instance_configuration_setting(self, registration_id, instance_id, setting_id, **kwargs):
"""
Clear a configuration for an instance of a registration.
Clears the `settingId` value for this registration instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_instance_configuration_setting(registration_id, instance_id, setting_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param int instance_id: The instance of this registration (required)
:param str setting_id: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_registration_instance_configuration_setting_with_http_info(registration_id, instance_id, setting_id, **kwargs)
else:
data = self.delete_registration_instance_configuration_setting_with_http_info(registration_id, instance_id, setting_id, **kwargs)
return data
def delete_registration_instance_configuration_setting_with_http_info(self, registration_id, instance_id, setting_id, **kwargs):
"""
Clear a configuration for an instance of a registration.
Clears the `settingId` value for this registration instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_instance_configuration_setting_with_http_info(registration_id, instance_id, setting_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param int instance_id: The instance of this registration (required)
:param str setting_id: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'instance_id', 'setting_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_registration_instance_configuration_setting" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `delete_registration_instance_configuration_setting`")
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params) or (params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `delete_registration_instance_configuration_setting`")
# verify the required parameter 'setting_id' is set
if ('setting_id' not in params) or (params['setting_id'] is None):
raise ValueError("Missing the required parameter `setting_id` when calling `delete_registration_instance_configuration_setting`")
if 'instance_id' in params and params['instance_id'] < 0:
raise ValueError("Invalid value for parameter `instance_id` when calling `delete_registration_instance_configuration_setting`, must be a value greater than or equal to `0`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/instances/{instanceId}/configuration/{settingId}'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id']
if 'setting_id' in params:
path_params['settingId'] = params['setting_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_registration_progress(self, registration_id, **kwargs):
"""
Reset a registration.
This method will reset the specified registration. This is essentially the same as deleting and recreating the registration, and as such, will delete all the data associated with the registration (including launch history, etc.). If the course for which the registration is registered has multiple versions, the registration being reset will automatically be registered for the latest version.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_progress(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_registration_progress_with_http_info(registration_id, **kwargs)
else:
data = self.delete_registration_progress_with_http_info(registration_id, **kwargs)
return data
def delete_registration_progress_with_http_info(self, registration_id, **kwargs):
"""
Reset a registration.
This method will reset the specified registration. This is essentially the same as deleting and recreating the registration, and as such, will delete all the data associated with the registration (including launch history, etc.). If the course for which the registration is registered has multiple versions, the registration being reset will automatically be registered for the latest version.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_progress_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_registration_progress" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `delete_registration_progress`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/progress'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_registration_tags(self, registration_id, tags, **kwargs):
"""
Delete tags from a registration.
Delete the provided tags for this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_registration_tags(registration_id, tags, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param TagListSchema tags: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_registration_tags_with_http_info(registration_id, tags, **kwargs)
else:
data = self.delete_registration_tags_with_http_info(registration_id, tags, **kwargs)
return data
    def delete_registration_tags_with_http_info(self, registration_id, tags, **kwargs):
        """
        Delete tags from a registration.
        Delete the provided tags for this registration.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.delete_registration_tags_with_http_info(registration_id, tags, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param TagListSchema tags: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id', 'tags']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_registration_tags" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `delete_registration_tags`")
        # verify the required parameter 'tags' is set
        if ('tags' not in params) or (params['tags'] is None):
            raise ValueError("Missing the required parameter `tags` when calling `delete_registration_tags`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}/tags'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'tags' in params:
            body_params = params['tags']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['APP_NORMAL', 'OAUTH']

        return self.api_client.call_api(resource_path, 'DELETE',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
    def get_registration(self, registration_id, **kwargs):
        """
        See if a registration exists.
        This method is meant to check if a registration with `registrationId` exists in the system.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration(registration_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_registration_with_http_info(registration_id, **kwargs)
        else:
            (data) = self.get_registration_with_http_info(registration_id, **kwargs)
            return data
    def get_registration_with_http_info(self, registration_id, **kwargs):
        """
        See if a registration exists.
        This method is meant to check if a registration with `registrationId` exists in the system.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_with_http_info(registration_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_registration" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `get_registration`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['APP_NORMAL', 'OAUTH']

        return self.api_client.call_api(resource_path, 'HEAD',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
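
    # Illustrative usage sketch (not part of the generated client; `api` and
    # the id 'reg-001' below are hypothetical). Because get_registration issues
    # a HEAD request and returns None, existence is typically determined by
    # whether the call succeeds or raises an ApiException (e.g. for a 404):
    #
    #     try:
    #         api.get_registration('reg-001')
    #         registration_exists = True
    #     except ApiException:
    #         registration_exists = False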
    def get_registration_configuration(self, registration_id, **kwargs):
        """
        Get registration configuration.
        Returns all configuration settings for this registration.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_configuration(registration_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param bool include_metadata:
        :return: SettingListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_registration_configuration_with_http_info(registration_id, **kwargs)
        else:
            (data) = self.get_registration_configuration_with_http_info(registration_id, **kwargs)
            return data
    def get_registration_configuration_with_http_info(self, registration_id, **kwargs):
        """
        Get registration configuration.
        Returns all configuration settings for this registration.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_configuration_with_http_info(registration_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param bool include_metadata:
        :return: SettingListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id', 'include_metadata']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_registration_configuration" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_configuration`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}/configuration'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']

        query_params = {}
        if 'include_metadata' in params:
            query_params['includeMetadata'] = params['include_metadata']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['APP_NORMAL', 'OAUTH']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='SettingListSchema',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
    def get_registration_instance_configuration(self, registration_id, instance_id, **kwargs):
        """
        Get configuration for instance of registration.
        Returns all configuration settings for this registration instance.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_configuration(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param bool include_metadata:
        :return: SettingListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_registration_instance_configuration_with_http_info(registration_id, instance_id, **kwargs)
        else:
            (data) = self.get_registration_instance_configuration_with_http_info(registration_id, instance_id, **kwargs)
            return data
    def get_registration_instance_configuration_with_http_info(self, registration_id, instance_id, **kwargs):
        """
        Get configuration for instance of registration.
        Returns all configuration settings for this registration instance.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_configuration_with_http_info(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param bool include_metadata:
        :return: SettingListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id', 'instance_id', 'include_metadata']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_registration_instance_configuration" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_instance_configuration`")
        # verify the required parameter 'instance_id' is set
        if ('instance_id' not in params) or (params['instance_id'] is None):
            raise ValueError("Missing the required parameter `instance_id` when calling `get_registration_instance_configuration`")

        if 'instance_id' in params and params['instance_id'] < 0:
            raise ValueError("Invalid value for parameter `instance_id` when calling `get_registration_instance_configuration`, must be a value greater than or equal to `0`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}/instances/{instanceId}/configuration'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']
        if 'instance_id' in params:
            path_params['instanceId'] = params['instance_id']

        query_params = {}
        if 'include_metadata' in params:
            query_params['includeMetadata'] = params['include_metadata']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['APP_NORMAL', 'OAUTH']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='SettingListSchema',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
    def get_registration_instance_launch_history(self, registration_id, instance_id, **kwargs):
        """
        Get launch history for an instance of a registration.
        Returns history of the launches of the specified instance of this registration.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_launch_history(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param bool include_history_log: Whether to include the history log in the launch history
        :return: LaunchHistoryListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_registration_instance_launch_history_with_http_info(registration_id, instance_id, **kwargs)
        else:
            (data) = self.get_registration_instance_launch_history_with_http_info(registration_id, instance_id, **kwargs)
            return data
    def get_registration_instance_launch_history_with_http_info(self, registration_id, instance_id, **kwargs):
        """
        Get launch history for an instance of a registration.
        Returns history of the launches of the specified instance of this registration.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_launch_history_with_http_info(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param bool include_history_log: Whether to include the history log in the launch history
        :return: LaunchHistoryListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id', 'instance_id', 'include_history_log']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_registration_instance_launch_history" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_instance_launch_history`")
        # verify the required parameter 'instance_id' is set
        if ('instance_id' not in params) or (params['instance_id'] is None):
            raise ValueError("Missing the required parameter `instance_id` when calling `get_registration_instance_launch_history`")

        if 'instance_id' in params and params['instance_id'] < 0:
            raise ValueError("Invalid value for parameter `instance_id` when calling `get_registration_instance_launch_history`, must be a value greater than or equal to `0`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}/instances/{instanceId}/launchHistory'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']
        if 'instance_id' in params:
            path_params['instanceId'] = params['instance_id']

        query_params = {}
        if 'include_history_log' in params:
            query_params['includeHistoryLog'] = params['include_history_log']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['APP_NORMAL', 'OAUTH']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='LaunchHistoryListSchema',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
    def get_registration_instance_progress(self, registration_id, instance_id, **kwargs):
        """
        Get details of an instance of a registration.
        Get registration progress for instance `instanceId` of `registrationId`.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_progress(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param bool include_child_results: Include information about each learning object, not just the top level in the results
        :param bool include_interactions_and_objectives: Include interactions and objectives in the results
        :param bool include_runtime: Include runtime details in the results
        :return: RegistrationSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_registration_instance_progress_with_http_info(registration_id, instance_id, **kwargs)
        else:
            (data) = self.get_registration_instance_progress_with_http_info(registration_id, instance_id, **kwargs)
            return data
    def get_registration_instance_progress_with_http_info(self, registration_id, instance_id, **kwargs):
        """
        Get details of an instance of a registration.
        Get registration progress for instance `instanceId` of `registrationId`.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_progress_with_http_info(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param bool include_child_results: Include information about each learning object, not just the top level in the results
        :param bool include_interactions_and_objectives: Include interactions and objectives in the results
        :param bool include_runtime: Include runtime details in the results
        :return: RegistrationSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id', 'instance_id', 'include_child_results', 'include_interactions_and_objectives', 'include_runtime']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_registration_instance_progress" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_instance_progress`")
        # verify the required parameter 'instance_id' is set
        if ('instance_id' not in params) or (params['instance_id'] is None):
            raise ValueError("Missing the required parameter `instance_id` when calling `get_registration_instance_progress`")

        if 'instance_id' in params and params['instance_id'] < 0:
            raise ValueError("Invalid value for parameter `instance_id` when calling `get_registration_instance_progress`, must be a value greater than or equal to `0`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}/instances/{instanceId}'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']
        if 'instance_id' in params:
            path_params['instanceId'] = params['instance_id']

        query_params = {}
        if 'include_child_results' in params:
            query_params['includeChildResults'] = params['include_child_results']
        if 'include_interactions_and_objectives' in params:
            query_params['includeInteractionsAndObjectives'] = params['include_interactions_and_objectives']
        if 'include_runtime' in params:
            query_params['includeRuntime'] = params['include_runtime']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['APP_NORMAL', 'OAUTH']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='RegistrationSchema',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
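
    # Illustrative usage sketch (not part of the generated client; `api` and
    # the values below are hypothetical). The optional include_* flags map to
    # the includeChildResults / includeInteractionsAndObjectives /
    # includeRuntime query parameters and expand the returned
    # RegistrationSchema accordingly:
    #
    #     progress = api.get_registration_instance_progress(
    #         'reg-001', 0,
    #         include_child_results=True,
    #         include_interactions_and_objectives=True)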
    def get_registration_instance_statements(self, registration_id, instance_id, **kwargs):
        """
        Get xAPI statements for an instance of a registration.
        Get xAPI statements for instance `instanceId` of `registrationId`.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_statements(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value, construction and parsing may change without notice.
        :return: XapiStatementResult
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_registration_instance_statements_with_http_info(registration_id, instance_id, **kwargs)
        else:
            (data) = self.get_registration_instance_statements_with_http_info(registration_id, instance_id, **kwargs)
            return data
    def get_registration_instance_statements_with_http_info(self, registration_id, instance_id, **kwargs):
        """
        Get xAPI statements for an instance of a registration.
        Get xAPI statements for instance `instanceId` of `registrationId`.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instance_statements_with_http_info(registration_id, instance_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param int instance_id: The instance of this registration (required)
        :param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value, construction and parsing may change without notice.
        :return: XapiStatementResult
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id', 'instance_id', 'since', 'until', 'more']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_registration_instance_statements" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_instance_statements`")
        # verify the required parameter 'instance_id' is set
        if ('instance_id' not in params) or (params['instance_id'] is None):
            raise ValueError("Missing the required parameter `instance_id` when calling `get_registration_instance_statements`")

        if 'instance_id' in params and params['instance_id'] < 0:
            raise ValueError("Invalid value for parameter `instance_id` when calling `get_registration_instance_statements`, must be a value greater than or equal to `0`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}/instances/{instanceId}/xAPIStatements'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']
        if 'instance_id' in params:
            path_params['instanceId'] = params['instance_id']

        query_params = {}
        if 'since' in params:
            query_params['since'] = params['since']
        if 'until' in params:
            query_params['until'] = params['until']
        if 'more' in params:
            query_params['more'] = params['more']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['APP_NORMAL', 'OAUTH']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='XapiStatementResult',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
    def get_registration_instances(self, registration_id, **kwargs):
        """
        Get all instances of a registration.
        Get all the instances of the registration specified by the registration ID.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instances(registration_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value, construction and parsing may change without notice.
        :param bool include_child_results: Include information about each learning object, not just the top level in the results
        :param bool include_interactions_and_objectives: Include interactions and objectives in the results
        :param bool include_runtime: Include runtime details in the results
        :return: RegistrationListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_registration_instances_with_http_info(registration_id, **kwargs)
        else:
            (data) = self.get_registration_instances_with_http_info(registration_id, **kwargs)
            return data
    def get_registration_instances_with_http_info(self, registration_id, **kwargs):
        """
        Get all instances of a registration.
        Get all the instances of the registration specified by the registration ID.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_registration_instances_with_http_info(registration_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str registration_id: id for this registration (required)
        :param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
        :param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value, construction and parsing may change without notice.
        :param bool include_child_results: Include information about each learning object, not just the top level in the results
        :param bool include_interactions_and_objectives: Include interactions and objectives in the results
        :param bool include_runtime: Include runtime details in the results
        :return: RegistrationListSchema
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['registration_id', 'until', 'since', 'more', 'include_child_results', 'include_interactions_and_objectives', 'include_runtime']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_registration_instances" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'registration_id' is set
        if ('registration_id' not in params) or (params['registration_id'] is None):
            raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_instances`")

        collection_formats = {}

        resource_path = '/registrations/{registrationId}/instances'.replace('{format}', 'json')
        path_params = {}
        if 'registration_id' in params:
            path_params['registrationId'] = params['registration_id']

        query_params = {}
        if 'until' in params:
            query_params['until'] = params['until']
        if 'since' in params:
            query_params['since'] = params['since']
        if 'more' in params:
            query_params['more'] = params['more']
        if 'include_child_results' in params:
            query_params['includeChildResults'] = params['include_child_results']
        if 'include_interactions_and_objectives' in params:
            query_params['includeInteractionsAndObjectives'] = params['include_interactions_and_objectives']
        if 'include_runtime' in params:
            query_params['includeRuntime'] = params['include_runtime']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RegistrationListSchema',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_registration_launch_history(self, registration_id, **kwargs):
"""
Get launch history for a registration.
Returns history of this registration's launches.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_launch_history(registration_id, callback=callback_function)
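>>> # Illustrative sketch (identifier values assumed): fetch launch history
>>> # with the full history log included in the response
>>> history = api.get_registration_launch_history('registration_id_example', include_history_log=True)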
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param bool include_history_log: Whether to include the history log in the launch history
:return: LaunchHistoryListSchema
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_registration_launch_history_with_http_info(registration_id, **kwargs)
else:
(data) = self.get_registration_launch_history_with_http_info(registration_id, **kwargs)
return data
def get_registration_launch_history_with_http_info(self, registration_id, **kwargs):
"""
Get launch history for a registration.
Returns history of this registration's launches.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_launch_history_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param bool include_history_log: Whether to include the history log in the launch history
:return: LaunchHistoryListSchema
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'include_history_log']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_registration_launch_history" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_launch_history`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/launchHistory'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
if 'include_history_log' in params:
query_params['includeHistoryLog'] = params['include_history_log']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='LaunchHistoryListSchema',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_registration_progress(self, registration_id, **kwargs):
"""
Get details of a registration.
Get detailed information about the latest instance of `registrationId`. Additional detail may be obtained by using the optional query parameters.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_progress(registration_id, callback=callback_function)
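>>> # Illustrative sketch (identifier values assumed): request extra detail
>>> # via the optional include flags
>>> registration = api.get_registration_progress('registration_id_example', include_child_results=True, include_interactions_and_objectives=True)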
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param bool include_child_results: Include information about each learning object, not just the top level in the results
:param bool include_interactions_and_objectives: Include interactions and objectives in the results
:param bool include_runtime: Include runtime details in the results
:return: RegistrationSchema
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_registration_progress_with_http_info(registration_id, **kwargs)
else:
(data) = self.get_registration_progress_with_http_info(registration_id, **kwargs)
return data
def get_registration_progress_with_http_info(self, registration_id, **kwargs):
"""
Get details of a registration.
Get detailed information about the latest instance of `registrationId`. Additional detail may be obtained by using the optional query parameters.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_progress_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param bool include_child_results: Include information about each learning object, not just the top level in the results
:param bool include_interactions_and_objectives: Include interactions and objectives in the results
:param bool include_runtime: Include runtime details in the results
:return: RegistrationSchema
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'include_child_results', 'include_interactions_and_objectives', 'include_runtime']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_registration_progress" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_progress`")
collection_formats = {}
resource_path = '/registrations/{registrationId}'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
if 'include_child_results' in params:
query_params['includeChildResults'] = params['include_child_results']
if 'include_interactions_and_objectives' in params:
query_params['includeInteractionsAndObjectives'] = params['include_interactions_and_objectives']
if 'include_runtime' in params:
query_params['includeRuntime'] = params['include_runtime']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RegistrationSchema',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_registration_statements(self, registration_id, **kwargs):
"""
Get xAPI statements for a registration.
Get xAPI statements for `registrationId`.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_statements(registration_id, callback=callback_function)
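>>> # Illustrative sketch (identifier values assumed): limit statements to a
>>> # time window; a datetime without a zone is treated as UTC
>>> from datetime import datetime, timezone
>>> statements = api.get_registration_statements('registration_id_example', since=datetime(2023, 1, 1, tzinfo=timezone.utc))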
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value; its construction and parsing may change without notice.
:return: XapiStatementResult
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_registration_statements_with_http_info(registration_id, **kwargs)
else:
(data) = self.get_registration_statements_with_http_info(registration_id, **kwargs)
return data
def get_registration_statements_with_http_info(self, registration_id, **kwargs):
"""
Get xAPI statements for a registration.
Get xAPI statements for `registrationId`.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_statements_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value; its construction and parsing may change without notice.
:return: XapiStatementResult
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'since', 'until', 'more']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_registration_statements" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_statements`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/xAPIStatements'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
if 'since' in params:
query_params['since'] = params['since']
if 'until' in params:
query_params['until'] = params['until']
if 'more' in params:
query_params['more'] = params['more']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='XapiStatementResult',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_registration_tags(self, registration_id, **kwargs):
"""
Get tags for a registration.
Get a list of the tags applied to this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_tags(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: TagListSchema
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_registration_tags_with_http_info(registration_id, **kwargs)
else:
(data) = self.get_registration_tags_with_http_info(registration_id, **kwargs)
return data
def get_registration_tags_with_http_info(self, registration_id, **kwargs):
"""
Get tags for a registration.
Get a list of the tags applied to this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registration_tags_with_http_info(registration_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:return: TagListSchema
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_registration_tags" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `get_registration_tags`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/tags'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TagListSchema',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_registrations(self, **kwargs):
"""
Get a list of all registrations.
Gets a list of registrations including a summary of the status of each registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registrations(callback=callback_function)
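>>> # Illustrative sketch (identifier values assumed): page through results by
>>> # passing back the opaque 'more' token from each returned list
>>> page = api.get_registrations(course_id='course_id_example')
>>> while page.more:
>>>     page = api.get_registrations(more=page.more)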
:param callback function: The callback function
for asynchronous request. (optional)
:param str course_id: Only registrations for the specified course id will be included.
:param str learner_id: Only registrations for the specified learner id will be included.
:param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value; its construction and parsing may change without notice.
:param bool include_child_results: Include information about each learning object, not just the top level in the results
:param bool include_interactions_and_objectives: Include interactions and objectives in the results
:param bool include_runtime: Include runtime details in the results
:param list[str] tags:
:return: RegistrationListSchema
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_registrations_with_http_info(**kwargs)
else:
(data) = self.get_registrations_with_http_info(**kwargs)
return data
def get_registrations_with_http_info(self, **kwargs):
"""
Get a list of all registrations.
Gets a list of registrations including a summary of the status of each registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_registrations_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str course_id: Only registrations for the specified course id will be included.
:param str learner_id: Only registrations for the specified learner id will be included.
:param datetime since: Only items updated since the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param datetime until: Only items updated before the specified ISO 8601 TimeStamp (inclusive) are included. If a time zone is not specified, UTC time zone will be used.
:param str more: Value for this parameter will be provided in the 'more' property of registration lists, where needed. An opaque value; its construction and parsing may change without notice.
:param bool include_child_results: Include information about each learning object, not just the top level in the results
:param bool include_interactions_and_objectives: Include interactions and objectives in the results
:param bool include_runtime: Include runtime details in the results
:param list[str] tags:
:return: RegistrationListSchema
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['course_id', 'learner_id', 'since', 'until', 'more', 'include_child_results', 'include_interactions_and_objectives', 'include_runtime', 'tags']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_registrations" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/registrations'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'course_id' in params:
query_params['courseId'] = params['course_id']
if 'learner_id' in params:
query_params['learnerId'] = params['learner_id']
if 'since' in params:
query_params['since'] = params['since']
if 'until' in params:
query_params['until'] = params['until']
if 'more' in params:
query_params['more'] = params['more']
if 'include_child_results' in params:
query_params['includeChildResults'] = params['include_child_results']
if 'include_interactions_and_objectives' in params:
query_params['includeInteractionsAndObjectives'] = params['include_interactions_and_objectives']
if 'include_runtime' in params:
query_params['includeRuntime'] = params['include_runtime']
if 'tags' in params:
query_params['tags'] = params['tags']
collection_formats['tags'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RegistrationListSchema',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def put_registration_tags(self, registration_id, tags, **kwargs):
"""
Set tags on a registration.
Set the tags for this registration. Note: any tags currently on this registration will be overwritten with the new array of tags.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.put_registration_tags(registration_id, tags, callback=callback_function)
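>>> # Illustrative sketch (identifier values and schema constructor assumed):
>>> # note that this replaces any tags already on the registration
>>> api.put_registration_tags('registration_id_example', TagListSchema(tags=['cohort-a']))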
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param TagListSchema tags: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.put_registration_tags_with_http_info(registration_id, tags, **kwargs)
else:
(data) = self.put_registration_tags_with_http_info(registration_id, tags, **kwargs)
return data
def put_registration_tags_with_http_info(self, registration_id, tags, **kwargs):
"""
Set tags on a registration.
Set the tags for this registration. Note: any tags currently on this registration will be overwritten with the new array of tags.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.put_registration_tags_with_http_info(registration_id, tags, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param TagListSchema tags: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'tags']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method put_registration_tags" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `put_registration_tags`")
# verify the required parameter 'tags' is set
if ('tags' not in params) or (params['tags'] is None):
raise ValueError("Missing the required parameter `tags` when calling `put_registration_tags`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/tags'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'tags' in params:
body_params = params['tags']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def put_registration_tags_batch(self, batch, **kwargs):
"""
Set tags on registrations.
Sets all of the provided tags on all of the provided registrations.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.put_registration_tags_batch(batch, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BatchTagsSchema batch: Object representing an array of ids to apply an array of tags to. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.put_registration_tags_batch_with_http_info(batch, **kwargs)
else:
(data) = self.put_registration_tags_batch_with_http_info(batch, **kwargs)
return data
def put_registration_tags_batch_with_http_info(self, batch, **kwargs):
"""
Set tags on registrations.
Sets all of the provided tags on all of the provided registrations.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.put_registration_tags_batch_with_http_info(batch, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BatchTagsSchema batch: Object representing an array of ids to apply an array of tags to. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['batch']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method put_registration_tags_batch" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'batch' is set
if ('batch' not in params) or (params['batch'] is None):
raise ValueError("Missing the required parameter `batch` when calling `put_registration_tags_batch`")
collection_formats = {}
resource_path = '/registrations/tags'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'batch' in params:
body_params = params['batch']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def set_registration_configuration(self, registration_id, configuration_settings, **kwargs):
"""
Set registration configuration.
Set configuration settings for this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.set_registration_configuration(registration_id, configuration_settings, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param SettingsPostSchema configuration_settings: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.set_registration_configuration_with_http_info(registration_id, configuration_settings, **kwargs)
else:
(data) = self.set_registration_configuration_with_http_info(registration_id, configuration_settings, **kwargs)
return data
def set_registration_configuration_with_http_info(self, registration_id, configuration_settings, **kwargs):
"""
Set registration configuration.
Set configuration settings for this registration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.set_registration_configuration_with_http_info(registration_id, configuration_settings, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param SettingsPostSchema configuration_settings: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'configuration_settings']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method set_registration_configuration" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `set_registration_configuration`")
# verify the required parameter 'configuration_settings' is set
if ('configuration_settings' not in params) or (params['configuration_settings'] is None):
raise ValueError("Missing the required parameter `configuration_settings` when calling `set_registration_configuration`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/configuration'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'configuration_settings' in params:
body_params = params['configuration_settings']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def set_registration_instance_configuration(self, registration_id, instance_id, configuration_settings, **kwargs):
"""
Set configuration for instance of registration.
Set configuration settings for this registration instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.set_registration_instance_configuration(registration_id, instance_id, configuration_settings, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param int instance_id: The instance of this registration (required)
:param SettingsPostSchema configuration_settings: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.set_registration_instance_configuration_with_http_info(registration_id, instance_id, configuration_settings, **kwargs)
else:
(data) = self.set_registration_instance_configuration_with_http_info(registration_id, instance_id, configuration_settings, **kwargs)
return data
def set_registration_instance_configuration_with_http_info(self, registration_id, instance_id, configuration_settings, **kwargs):
"""
Set configuration for instance of registration.
Set configuration settings for this registration instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.set_registration_instance_configuration_with_http_info(registration_id, instance_id, configuration_settings, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str registration_id: id for this registration (required)
:param int instance_id: The instance of this registration (required)
:param SettingsPostSchema configuration_settings: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['registration_id', 'instance_id', 'configuration_settings']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method set_registration_instance_configuration" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'registration_id' is set
if ('registration_id' not in params) or (params['registration_id'] is None):
raise ValueError("Missing the required parameter `registration_id` when calling `set_registration_instance_configuration`")
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params) or (params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `set_registration_instance_configuration`")
# verify the required parameter 'configuration_settings' is set
if ('configuration_settings' not in params) or (params['configuration_settings'] is None):
raise ValueError("Missing the required parameter `configuration_settings` when calling `set_registration_instance_configuration`")
if 'instance_id' in params and params['instance_id'] < 0:
raise ValueError("Invalid value for parameter `instance_id` when calling `set_registration_instance_configuration`, must be a value greater than or equal to `0`")
collection_formats = {}
resource_path = '/registrations/{registrationId}/instances/{instanceId}/configuration'.replace('{format}', 'json')
path_params = {}
if 'registration_id' in params:
path_params['registrationId'] = params['registration_id']
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'configuration_settings' in params:
body_params = params['configuration_settings']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def test_registration_postback(self, post_back, **kwargs):
"""
Send a test postback with a provided configuration.
This method will allow testing a postback configuration that you provide by sending dummy data to the url specified, with the format you specify.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.test_registration_postback(post_back, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param PostBackSchema post_back: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.test_registration_postback_with_http_info(post_back, **kwargs)
else:
(data) = self.test_registration_postback_with_http_info(post_back, **kwargs)
return data
def test_registration_postback_with_http_info(self, post_back, **kwargs):
"""
Send a test postback with a provided configuration.
This method will allow testing a postback configuration that you provide by sending dummy data to the url specified, with the format you specify.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.test_registration_postback_with_http_info(post_back, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param PostBackSchema post_back: (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['post_back']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method test_registration_postback" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'post_back' is set
if ('post_back' not in params) or (params['post_back'] is None):
raise ValueError("Missing the required parameter `post_back` when calling `test_registration_postback`")
collection_formats = {}
resource_path = '/registrations/postBackTest'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'post_back' in params:
body_params = params['post_back']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['APP_NORMAL', 'OAUTH']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
# === __init__.py from ElgaSalvadore/watools (Apache-2.0) ===
# -*- coding: utf-8 -*-
from watools import General, Collect_Tools, WebAccounts, Collect, Products, Sheets, Models, Generator, Functions
__all__ = ['General', 'Collect_Tools', 'WebAccounts', 'Collect', 'Products', 'Sheets', 'Models', 'Generator', 'Functions']
__version__ = '0.1'
# === src/abaqus/EngineeringFeature/ContourIntegral.py from Haiiliin/PyAbaqus (MIT) ===
from abaqusConstants import *
from .Crack import Crack
from ..Region.RegionArray import RegionArray
class ContourIntegral(Crack):
"""The ContourIntegral object defines contour integral objects on an region. Currently only
assembly regions are supported.
The ContourIntegral object is derived from the Crack object.
Attributes
----------
suppressed: Boolean
A Boolean specifying whether the crack is suppressed or not. The default value is OFF.
Notes
-----
This object can be accessed by:
.. code-block:: python
import part
mdb.models[name].parts[name].engineeringFeatures.cracks[name]
import assembly
mdb.models[name].rootAssembly.engineeringFeatures.cracks[name]
The corresponding analysis keywords are:
- CONTOUR INTEGRAL
"""
# A Boolean specifying whether the crack is suppressed or not. The default value is OFF.
suppressed: Boolean = OFF
def __init__(self, name: str, crackFront: RegionArray, crackTip: RegionArray,
extensionDirectionMethod: SymbolicConstant, symmetric: Boolean = OFF,
listOfRegions: Boolean = OFF, crackFrontName: str = '', crackTipName: str = '',
crackNormal: tuple = (), qVectors: tuple = (), midNodePosition: float = 0,
collapsedElementAtTip: SymbolicConstant = NONE):
"""This method creates a ContourIntegral object. Although the constructor is available both
for parts and for the assembly, ContourIntegral objects are currently supported only
under the assembly.
Notes
-----
This function can be accessed by:
.. code-block:: python
mdb.models[name].parts[name].engineeringFeatures.ContourIntegral
mdb.models[name].rootAssembly.engineeringFeatures.ContourIntegral
Parameters
----------
name
A String specifying the repository key.
crackFront
A RegionArray object specifying the crack-front region to which the contour integral is
applied. If the crack-front consists of a single region, a Region object may be
specified instead of a sequence with a single item in it.
crackTip
A RegionArray object specifying the crack-tip region to which the contour integral is
applied. If the crack-tip consists of a single region, a Region object may be specified
instead of a sequence with a single item in it.
extensionDirectionMethod
A SymbolicConstant specifying how the virtual crack extension direction vectors are
defined. Possible values are CRACK_NORMAL and Q_VECTORS.
symmetric
A Boolean specifying whether the crack is defined on a half model (about a symmetry
plane) or whether it is defined on the whole model. The default value is OFF.
listOfRegions
A Boolean specifying whether the regions specified by *crackFront* and *crackTip* are
specified using a single region or tuples of region objects. The default value is OFF.
crackFrontName
A String specifying the name of the crack-front region generated from the tuple of
regions specifying the crack-front region. This argument is valid only when
*listOfRegions* is ON. The default value is *name*+Front.
crackTipName
A String specifying the name of the crack-tip region generated from the tuple of regions
specifying the crack-tip region. This parameter is valid only when *listOfRegions*=ON.
The default value is *name*+Tip.
crackNormal
A sequence of sequences of Floats specifying the two points of the vector that describes
the crack normal direction. Each point is defined by a tuple of two or three coordinates
indicating its position. This argument is required only when
*extensionDirectionMethod*=CRACK_NORMAL. The default value is an empty sequence.
qVectors
A sequence of sequences of sequences of Floats specifying the vectors that indicate the
set of crack extension directions. Each vector is described by a tuple of two points,
and each point is described by a tuple of two or three coordinates indicating its
position. This argument is required only when *extensionDirectionMethod*=Q_VECTORS. The
default value is an empty sequence.
midNodePosition
A Float specifying the position of the midside node along the edges of the second-order
elements that radiate from the crack tip. Possible values are 0.0 << *midNodeParameter*
<< 1.0. The default value is 0.5.
collapsedElementAtTip
A SymbolicConstant specifying the crack-tip singularity. Possible values are NONE,
SINGLE_NODE, and DUPLICATE_NODES. The default value is NONE.
Returns
-------
A ContourIntegral object.
"""
super().__init__()
pass
def setValues(self, symmetric: Boolean = OFF, listOfRegions: Boolean = OFF, crackFrontName: str = '',
crackTipName: str = '', crackNormal: tuple = (), qVectors: tuple = (),
midNodePosition: float = 0, collapsedElementAtTip: SymbolicConstant = NONE):
"""This method modifies the ContourIntegral object.
Parameters
----------
symmetric
A Boolean specifying whether the crack is defined on a half model (about a symmetry
plane) or whether it is defined on the whole model. The default value is OFF.
listOfRegions
A Boolean specifying whether the regions specified by *crackFront* and *crackTip* are
specified using a single region or tuples of region objects. The default value is OFF.
crackFrontName
A String specifying the name of the crack-front region generated from the tuple of
regions specifying the crack-front region. This argument is valid only when
*listOfRegions* is ON. The default value is *name*+Front.
crackTipName
A String specifying the name of the crack-tip region generated from the tuple of regions
specifying the crack-tip region. This parameter is valid only when *listOfRegions*=ON.
The default value is *name*+Tip.
crackNormal
A sequence of sequences of Floats specifying the two points of the vector that describes
the crack normal direction. Each point is defined by a tuple of two or three coordinates
indicating its position. This argument is required only when
*extensionDirectionMethod*=CRACK_NORMAL. The default value is an empty sequence.
qVectors
A sequence of sequences of sequences of Floats specifying the vectors that indicate the
set of crack extension directions. Each vector is described by a tuple of two points,
and each point is described by a tuple of two or three coordinates indicating its
position. This argument is required only when *extensionDirectionMethod*=Q_VECTORS. The
default value is an empty sequence.
midNodePosition
A Float specifying the position of the midside node along the edges of the second-order
elements that radiate from the crack tip. Possible values are 0.0 << *midNodeParameter*
<< 1.0. The default value is 0.5.
collapsedElementAtTip
A SymbolicConstant specifying the crack-tip singularity. Possible values are NONE,
SINGLE_NODE, and DUPLICATE_NODES. The default value is NONE.
"""
pass
# === maths78.py from IssamMerikhi/MathApp (Apache-2.0) ===
# Environment used: dash1_8_0_env
import pandas as pd  # (version 1.0.0)
import numpy as np
import plotly.express as px  # (version 4.5.0)
import plotly.graph_objects as go
import dash  # (version 1.8.0)
import dash_core_components as dcc
import dash_html_components as html
from dash.dependencies import Input, Output
import random
app = dash.Dash(__name__)
app.title = 'Maths 78'
server = app.server
#---------------------------------------------------------------
app.layout = html.Div([
html.Div(["Enjoy Maths"], className='titre'),
html.Div([], style={'width': '30%'} ,className='einstein'),
html.Div([], className = 'h'),
html.Div([
dcc.Dropdown(id = 'type',
options=[{'label': 'Droites aléatoires', 'value': 'a'},
{'label': 'Droites orthogonales', 'value': 'o'},
{'label': 'Droites parallèles', 'value': 'p'}],
value='a')
],
style={'width': '20%',
'display': 'inline-block',
'padding-top': '10px',
'margin-left': '10%',
'font-family':'-apple-system, BlinkMacSystemFont, Segoe UI, Roboto, Oxygen, Ubuntu, Cantarell, Open Sans, Helvetica Neue, sans-serif'}),
html.Div([
dcc.Dropdown(id = 'monotonie',
options=[{'label': 'Ari croissante', 'value': 'acroi'},
{'label': 'Ari décroissante', 'value': 'adecroi'},
{'label': 'Géo croissante', 'value': 'gcroi'},
{'label': 'Géo décroissante', 'value': 'gdecroi'}],
value='acroi')
],
style={'width': '20%',
'display': 'inline-block',
'padding-top': '10px',
'margin-left': '40%',
'font-family':'-apple-system, BlinkMacSystemFont, Segoe UI, Roboto, Oxygen, Ubuntu, Cantarell, Open Sans, Helvetica Neue, sans-serif'}),
html.Div([
dcc.Graph(id='droites')
],
style={'width': '50%', 'display': 'inline-block'}),
html.Div([
dcc.Graph(id='suite')
],
style={'width': '50%', 'display': 'inline-block'}),
html.Div([
dcc.Dropdown(id = 'new',
options=[{'label': 'Vecteurs aléatoires', 'value': 'A'},
{'label': 'Vecteurs orthogonaux', 'value': 'O'}],
value='A')
],
style={'width': '20%',
'display': 'inline-block',
'padding-top': '10px',
'margin-left': '10%',
'font-family':'-apple-system, BlinkMacSystemFont, Segoe UI, Roboto, Oxygen, Ubuntu, Cantarell, Open Sans, Helvetica Neue, sans-serif'}),
html.Div([
dcc.Dropdown(id = 'form',
options=[{'label': 'Parabole en U', 'value': 'U'},
{'label': 'Parabole en n', 'value': 'n'},
{'label': 'Trinome a>0', 'value': 'apos'},
{'label': 'Trinome a<0', 'value': 'aneg'}],
value='U')
],
style={'width': '20%',
'display': 'inline-block',
'padding-top': '10px',
'margin-left': '40%',
'font-family':'-apple-system, BlinkMacSystemFont, Segoe UI, Roboto, Oxygen, Ubuntu, Cantarell, Open Sans, Helvetica Neue, sans-serif'}),
html.Div([
dcc.Graph(id='produit')
],
style={'width': '50%', 'display': 'inline-block'}),
html.Div([
dcc.Graph(id='fonction')
],
style={'width': '50%', 'display': 'inline-block'}),
html.Div(["Copyright - Issam Merikhi 2021 - All rights reserved"], className = 'footer'),
])
#---------------------------------------------------------------
@app.callback(
Output(component_id='fonction', component_property='figure'),
[Input(component_id='form', component_property='value')]
)
def function_output(form):
fonction = go.Figure()
a_prime = 0
if (form == 'U'):
a = random.randint(0,10)
b = random.randint(-10,10)
c = random.randint(-100,100)
x = np.linspace(-20,20,1000)
y = a*x**2 + b*x + c
a_prime = 2*a
y_prime = a_prime*x + b
fonction = go.Figure(data=go.Scatter(x=x, y=y, name = "f(X)"))
fonction.add_trace(go.Scatter(x=x, y=y_prime, name = "f'(X)"))
fonction.update_layout(title = "La fonction : "+str(a)+"x^2 + "+str(b)+"x + "+str(c)+"<br> Sa dérivée : "+str(a_prime)+"x + "+str(b))
fonction.update_layout(
title = {
'y':0.9,
'x':0.5,
'xanchor': 'center',
'yanchor': 'top'})
fonction.update_layout(
xaxis_title="X",
yaxis_title="Y",
legend_title="Functions",
font=dict(
family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
size=13,
color="black"
)
)
if (form == 'n'):
a = random.randint(-10,0)
b = random.randint(-10,10)
c = random.randint(-100,100)
x = np.linspace(-20,20,1000)
y = a*x**2 + b*x + c
a_prime = 2*a
y_prime = a_prime*x + b
fonction = go.Figure(data=go.Scatter(x=x, y=y, name ="f(X)"))
fonction.add_trace(go.Scatter(x=x, y=y_prime, name ="f'(X)"))
fonction.update_layout(title = "La fonction : "+str(a)+"x^2 + "+str(b)+"x + "+str(c)+"<br> Sa dérivée : "+str(a_prime)+"x + "+str(b))
fonction.update_layout(
title = {
'y':0.9,
'x':0.5,
'xanchor': 'center',
'yanchor': 'top'}),
fonction.update_layout(
xaxis_title="X",
yaxis_title="Y",
legend_title="Functions",
font=dict(
family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
size=13,
color="black"
)
)
if (form == 'apos'):
a = random.randint(0,10)
b = random.randint(-10,10)
c = random.randint(-10,10)
d = random.randint(-10,10)
x = np.linspace(-100,100,1000)
y = a*x**3 + b*x**2 + c*x + d
y_prime = 3*a*x**2 + 2*b*x + c
a_prime = 3*a
b_prime = 2*b
c_prime = c
fonction = go.Figure(data=go.Scatter(x=x, y=y, name ="f(X)"))
fonction.add_trace(go.Scatter(x=x, y=y_prime, name ="f'(X)"))
fonction.update_layout(title = "La fonction : "+str(a)+"x^3 + "+str(b)+"x^2 + "+str(c)+"x +"+str(d)+"<br> Sa dérivée : "+str(a_prime)+"x^2 + "+str(b_prime)+"x +"+str(c_prime))
fonction.update_layout(
title = {
'y':0.9,
'x':0.5,
'xanchor': 'center',
'yanchor': 'top'}),
fonction.update_layout(
xaxis_title="X",
yaxis_title="Y",
legend_title="Functions",
font=dict(
family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
size=13,
color="black"
)
)
if (form == 'aneg'):
a = random.randint(-10,0)
b = random.randint(-10,10)
c = random.randint(-10,10)
d = random.randint(-10,10)
x = np.linspace(-100,100,1000)
y = a*x**3 + b*x**2 + c*x + d
y_prime = 3*a*x**2 + 2*b*x + c
a_prime = 3*a
b_prime = 2*b
c_prime = c
fonction = go.Figure(data=go.Scatter(x=x, y=y, name ="f(X)"))
fonction.add_trace(go.Scatter(x=x, y=y_prime, name ="f'(X)"))
fonction.update_layout(title = "La fonction : "+str(a)+"x^3 + "+str(b)+"x^2 + "+str(c)+"x +"+str(d)+"<br> Sa dérivée : "+str(a_prime)+"x^2 + "+str(b_prime)+"x +"+str(c_prime))
fonction.update_layout(
title = {
'y':0.9,
'x':0.5,
'xanchor': 'center',
'yanchor': 'top'}),
fonction.update_layout(
xaxis_title="X",
yaxis_title="Y",
legend_title="Functions",
font=dict(
family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
size=13,
color="black"
)
)
return fonction
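The branches above hard-code the derivatives 2ax + b and 3ax^2 + 2bx + c. As a cross-check, here is a small standalone helper (hypothetical, not used by the callback) that differentiates any polynomial given as a list of coefficients:

```python
def derivative_coeffs(coeffs):
    # coeffs[i] is the coefficient of x**i, e.g. [c, b, a] for a*x**2 + b*x + c;
    # returns the coefficient list of the derivative, since d/dx(c*x**i) = i*c*x**(i-1).
    return [i * c for i, c in enumerate(coeffs)][1:]
```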
@app.callback(
Output(component_id='suite', component_property='figure'),
[Input(component_id='monotonie', component_property='value')]
)
def suite_output(monotonie):
suite = px.bar()
if (monotonie == 'acroi'):
U1 = random.randint(-10,10)
r = random.randint(0,10)
n = random.randint(50,100)
X = [(n_loc) for n_loc in range(1,n+1)]
Y = [U1 + (n_loc-1)*r for n_loc in range(1,n+1)]
df = pd.DataFrame(X,Y)
suite = px.bar(df, x=X, y=Y)
suite.update_layout(title = "La suite : Un = "+str(U1)+" + (n-1)x"+str(r)+"<br> C'est une suite arithmétique croissante")
suite.update_layout(
title = {
'y':0.95,
'x':0.5,
'xanchor': 'center',
'yanchor': 'top'})
suite.update_layout(
xaxis_title="n",
yaxis_title="Un",
font=dict(
family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
size=13,
color="black"
)
)
if (monotonie == 'adecroi'):
U1 = random.randint(-10,10)
r = random.randint(-10,0)
n = random.randint(50,100)
X = [(n_loc) for n_loc in range(1,n+1)]
Y = [U1 + (n_loc-1)*r for n_loc in range(1,n+1)]
df = pd.DataFrame(X,Y)
suite = px.bar(df, x=X, y=Y)
suite.update_layout(title = "La suite : Un = "+str(U1)+" + (n-1)x"+str(r)+"<br> C'est une suite arithmétique décroissante")
suite.update_layout(
title = {
'y':0.95,
'x':0.5,
'xanchor': 'center',
'yanchor': 'top'})
suite.update_layout(
xaxis_title="n",
yaxis_title="Un",
font=dict(
family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
size=13,
color="black"
)
)
if (monotonie == 'gcroi'):
U1 = random.randint(1,10)  # positive first term so that q > 1 gives a growing sequence
q = random.randint(2,5)
n = random.randint(3,6)
X = [(n_loc) for n_loc in range(1,n+1)]
Y = [U1 * q**(n_loc-1) for n_loc in range(1,n+1)]  # geometric general term: Un = U1 * q^(n-1)
df = pd.DataFrame(X,Y)
suite = px.bar(df, x=X, y=Y)
suite.update_layout(title = "La suite : Un = "+str(U1)+" x "+str(q)+"^(n-1)<br> C'est une suite géométrique croissante")
suite.update_layout(
title = {
'y':0.95,
'x':0.5,
'xanchor': 'center',
'yanchor': 'top'})
suite.update_layout(
xaxis_title="n",
yaxis_title="Un",
font=dict(
family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
size=13,
color="black"
)
)
    if (monotonie == 'gdecroi'):
        U1 = random.randint(1, 10)              # positive first term
        q = np.random.choice([1/4, 1/2, 3/4])   # ratio in (0, 1) so the sequence decreases; q = 1 would be constant
        n = random.randint(3, 6)
        Un = U1 * q ** (n - 1)
        X = list(range(1, n + 1))
        Y = [U1 * q ** (n_loc - 1) for n_loc in range(1, n + 1)]
        df = pd.DataFrame({'n': X, 'Un': Y})
        suite = px.bar(df, x='n', y='Un')
        suite.update_layout(
            title="The sequence: Un = " + str(U1) + " × " + str(q) + "^(n-1)"
                  + "<br>This is a decreasing geometric sequence")
        suite.update_layout(
            title={
                'y': 0.95,
                'x': 0.5,
                'xanchor': 'center',
                'yanchor': 'top'})
        suite.update_layout(
            xaxis_title="n",
            yaxis_title="Un",
            font=dict(
                family="-apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif",
                size=13,
                color="black"
            )
        )
    return suite
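The four branches of the callback above all reduce to closed-form term formulas; a minimal standalone sketch of the two formulas, independent of Dash/Plotly:

```python
def arithmetic_term(u1, r, n):
    """n-th term of an arithmetic sequence: Un = U1 + (n - 1) * r."""
    return u1 + (n - 1) * r


def geometric_term(u1, q, n):
    """n-th term of a geometric sequence: Un = U1 * q ** (n - 1)."""
    return u1 * q ** (n - 1)


print(arithmetic_term(2, 3, 5))  # 2 + 4*3 = 14
print(geometric_term(2, 3, 4))   # 2 * 3**3 = 54
```

The `Y` lists plotted by the callback are just these terms evaluated for n = 1 … n.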
@app.callback(
    Output(component_id='produit', component_property='figure'),
    [Input(component_id='new', component_property='value')]
)
def produit_scalaire(new):
    produit = go.Figure()
    if (new == 'A'):
        x1 = random.randint(-10, 10)
        y1 = random.randint(-10, 10)
        A = (x1, y1)
        x2 = random.randint(-10, 10)
        y2 = random.randint(-10, 10)
        B = (x2, y2)
        x3 = random.randint(-10, 10)
        y3 = random.randint(-10, 10)
        C = (x3, y3)
        x4 = random.randint(-10, 10)
        y4 = random.randint(-10, 10)
        D = (x4, y4)
        u = (x2 - x1, y2 - y1)
        v = (x4 - x3, y4 - y3)
        df2 = {'xAB': [x1, x2],
               'xCD': [x3, x4],
               'yAB': [y1, y2],
               'yCD': [y3, y4],
               }
        df2 = pd.DataFrame(df2, columns=['xAB', 'xCD', 'yAB', 'yCD'])
        produit.add_trace(go.Scatter(x=df2['xAB'], y=df2['yAB'],
                                     marker=dict(size=10),
                                     mode='lines+markers',
                                     name='AB'))
        produit.add_trace(go.Scatter(x=df2['xCD'], y=df2['yCD'],
                                     marker=dict(size=10),
                                     mode='lines+markers',
                                     name='CD'))
        produit.update_layout(yaxis=dict(scaleanchor="x", scaleratio=1))
        produit.update_layout(
            title="Vector AB: " + str(u) + "<br>Vector CD: " + str(v)
                  + "<br>The dot product is: "
                  + str((x2 - x1) * (x4 - x3) + (y2 - y1) * (y4 - y3)))
        produit.update_layout(
            title={
                'y': 0.95,
                'x': 0.5,
                'xanchor': 'center',
                'yanchor': 'top'})
    if (new == 'O'):
        # Loop-and-a-half rewritten as a single loop: redraw AB until its
        # y-component is non-zero, which is required below to solve
        # u · v = 0 for the y-component of CD.
        y1 = y2 = 0
        while (y2 - y1 == 0):
            x1 = random.randint(-10, 10)
            y1 = random.randint(-10, 10)
            A = (x1, y1)
            x2 = x1 + random.randint(0, 5)
            y2 = y1 + random.randint(0, 5)
            B = (x2, y2)
            u = (x2 - x1, y2 - y1)
            x3 = random.randint(-10, 10)
            y3 = random.randint(-10, 10)
            x4 = x3 + random.randint(0, 5)
        y4 = (-(x2 - x1) * (x4 - x3)) / (y2 - y1) + y3
        v = (x4 - x3, y4 - y3)
        df2 = {'xAB': [x1, x2],
               'xCD': [x3, x4],
               'yAB': [y1, y2],
               'yCD': [y3, y4],
               }
        df2 = pd.DataFrame(df2, columns=['xAB', 'xCD', 'yAB', 'yCD'])
        produit.add_trace(go.Scatter(x=df2['xAB'], y=df2['yAB'],
                                     marker=dict(size=10),
                                     mode='markers+lines',
                                     name='AB'))
        produit.add_trace(go.Scatter(x=df2['xCD'], y=df2['yCD'],
                                     marker=dict(size=10),
                                     mode='markers+lines',
                                     name='CD'))
        produit.update_layout(yaxis=dict(scaleanchor="x", scaleratio=1))
        produit.update_layout(
            title="Vector AB: " + str(u) + "<br>Vector CD: " + str(v)
                  + "<br>The dot product is: "
                  + str((x2 - x1) * (x4 - x3) + (y2 - y1) * (y4 - y3)))
        produit.update_layout(
            title={
                'y': 0.95,
                'x': 0.5,
                'xanchor': 'center',
                'yanchor': 'top'})
    return produit
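The 'O' branch constructs CD orthogonal to AB by solving u · v = 0 for the y-component of CD, which is why AB's y-component must be non-zero. A minimal sketch of the same coordinate formula and construction (the values are illustrative):

```python
def dot(u, v):
    """2-D dot product: u.x * v.x + u.y * v.y."""
    return u[0] * v[0] + u[1] * v[1]


u = (3, 4)
# Build v orthogonal to u: pick v's x-component freely, then solve
# u.x * v.x + u.y * v.y = 0 for v.y (requires u.y != 0).
vx = 2.0
vy = -(u[0] * vx) / u[1]
v = (vx, vy)
print(dot(u, v))  # 0.0
```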
@app.callback(
    Output(component_id='droites', component_property='figure'),
    [Input(component_id='type', component_property='value')]
)
def droite(line_type):  # renamed from `type` to avoid shadowing the builtin
    droites = go.Figure()
    if (line_type == 'a'):
        a1 = random.randint(1, 10)
        b1 = random.randint(-10, 10)
        a2 = random.randint(1, 10)
        b2 = random.randint(0, 10)
        x = np.linspace(-20, 20, 1000)
        y1 = a1 * x + b1
        y2 = a2 * x + b2
        droites = go.Figure(data=go.Scatter(x=x, y=y1, name="y1"))
        droites.add_trace(go.Scatter(x=x, y=y2, name="y2"))
        droites.update_layout(
            title="Line y1: " + str(a1) + "x + " + str(b1)
                  + "<br>Line y2: " + str(a2) + "x + " + str(b2))
        droites.update_layout(
            title={
                'y': 0.9,
                'x': 0.5,
                'xanchor': 'center',
                'yanchor': 'top'})
    if (line_type == 'o'):
        a1 = random.randint(1, 10)
        b1 = random.choice([i for i in range(-11, 11) if i != 0])
        a2 = -(1 / a1)  # perpendicular slope: a1 * a2 == -1
        b2 = random.randint(0, 10)
        x = np.linspace(-20, 20, 1000)
        y1 = a1 * x + b1
        y2 = a2 * x + b2
        droites = go.Figure(data=go.Scatter(x=x, y=y1, name="y1"))
        droites.add_trace(go.Scatter(x=x, y=y2, name="y2"))
        droites.update_layout(
            title="Line y1: " + str(a1) + "x + " + str(b1)
                  + "<br>Line y2: " + str(a2) + "x + " + str(b2))
        droites.update_layout(
            title={
                'y': 0.9,
                'x': 0.5,
                'xanchor': 'center',
                'yanchor': 'top'})
        droites.update_layout(yaxis=dict(scaleanchor="x", scaleratio=1))
    if (line_type == 'p'):
        a1 = random.randint(-10, 10)
        b1 = random.randint(-10, 10)
        a2 = a1  # same slope: parallel lines
        b2 = random.randint(-10, 10)
        x = np.linspace(-20, 20, 1000)
        y1 = a1 * x + b1
        y2 = a2 * x + b2
        droites = go.Figure(data=go.Scatter(x=x, y=y1, name="y1"))
        droites.add_trace(go.Scatter(x=x, y=y2, name="y2"))
        droites.update_layout(
            title="Line y1: " + str(a1) + "x + " + str(b1)
                  + "<br>Line y2: " + str(a2) + "x + " + str(b2))
        droites.update_layout(
            title={
                'y': 0.9,
                'x': 0.5,
                'xanchor': 'center',
                'yanchor': 'top'})
    return droites
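The 'o' branch relies on the perpendicular-slope relation a2 = -(1/a1), i.e. the product of the two slopes is -1. A tiny standalone check of that relation:

```python
a1 = 4            # any non-zero slope
a2 = -(1 / a1)    # slope of a perpendicular line
print(a1 * a2)    # -1.0

# With a1 an integer power of two, -(1/a1) is exact in binary floating point,
# so the product is exactly -1.0 here.
```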
if __name__ == '__main__':
    app.run_server(debug=True)
# === src/stackoverflow/59179990/otherModule.py (mrdulin/python-codelab, MIT) ===
def B(x, y):
    print(x, y)
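Given the `stackoverflow/59179990` path, `B` is presumably the patch target of the linked question about mocking a function from another module. A hedged sketch of stubbing it with the stdlib `unittest.mock` (the in-module patch target is an illustrative stand-in for patching `otherModule.B`):

```python
from unittest import mock


def B(x, y):
    print(x, y)


# Patch B where it is looked up; here the current module stands in for
# otherModule in the original question.
with mock.patch(__name__ + '.B') as mocked_B:
    B(1, 2)  # dispatches to the mock, not the real B
    mocked_B.assert_called_once_with(1, 2)
```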
# === src/ralph/util/demo/core/__init__.py (fossabot/ralph, Apache-2.0) ===
# -*- coding: utf-8 -*-
from ralph.util.demo.core.discovery import * # noqa
from ralph.util.demo.core.cmdb import * # noqa
# === adder.py (ShadenSmith/practice-azure-pipelines, MIT) ===
def add(x, y):
    return x + y


def add3(x, y, z):
    return x + y + z
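A quick self-contained usage check for the two helpers (redefined here so the snippet runs on its own):

```python
def add(x, y):
    return x + y


def add3(x, y, z):
    return x + y + z


assert add(1, 2) == 3
assert add3(-1, 0, 1) == 0
```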
# === test/test_replay.py (nerdoid/dqn, MIT) ===
"""Tests for replay memory"""
import numpy as np
import tensorflow as tf

from hedonist.replay import Replay


class TestReplay:
    def setup_method(self):
        self.sess = tf.Session()

    def test_add_same_frame_repeatedly(self):
        """Mimic the initial state wherein the first frame is duplicated
        history_length times to populate replay for first retrieved state
        """
        self.replay = Replay(self.sess, 10, (3, 4), 4, 3, 6)
        self.sess.run(tf.global_variables_initializer())

        frame = np.array(
            [
                [0, 63, 127, 191],
                [255, 191, 127, 63],
                [127, 191, 255, 0]
            ]
        )
        action = 0
        reward = 0.0
        terminal = False

        for _ in range(4):
            self.replay.insert(frame, action, reward, terminal)

        expected = np.array(
            [
                [
                    [0, 0, 0, 0],
                    [63, 63, 63, 63],
                    [127, 127, 127, 127],
                    [191, 191, 191, 191]
                ],
                [
                    [255, 255, 255, 255],
                    [191, 191, 191, 191],
                    [127, 127, 127, 127],
                    [63, 63, 63, 63]
                ],
                [
                    [127, 127, 127, 127],
                    [191, 191, 191, 191],
                    [255, 255, 255, 255],
                    [0, 0, 0, 0]
                ]
            ]
        )

        result = self.replay.get_current_state()

        assert np.array_equal(result, expected)
    def test_add_unique_frames(self):
        """Mimic a state that is populated with unique consecutive frames"""
        self.replay = Replay(self.sess, 10, (3, 4), 4, 3, 6)
        self.sess.run(tf.global_variables_initializer())

        frame_1 = np.array(
            [
                [11, 12, 13, 14],
                [21, 22, 23, 24],
                [31, 32, 33, 34]
            ]
        )
        frame_2 = np.array(
            [
                [41, 42, 43, 44],
                [51, 52, 53, 54],
                [61, 62, 63, 64]
            ]
        )
        frame_3 = np.array(
            [
                [71, 72, 73, 74],
                [81, 82, 83, 84],
                [91, 92, 93, 94]
            ]
        )
        frame_4 = np.array(
            [
                [101, 102, 103, 104],
                [111, 112, 113, 114],
                [121, 122, 123, 124]
            ]
        )
        action = 0
        reward = 0.0
        terminal = False

        for frame in [frame_1, frame_2, frame_3, frame_4]:
            self.replay.insert(frame, action, reward, terminal)

        expected = np.array(
            [
                [
                    [11, 41, 71, 101],
                    [12, 42, 72, 102],
                    [13, 43, 73, 103],
                    [14, 44, 74, 104]
                ],
                [
                    [21, 51, 81, 111],
                    [22, 52, 82, 112],
                    [23, 53, 83, 113],
                    [24, 54, 84, 114]
                ],
                [
                    [31, 61, 91, 121],
                    [32, 62, 92, 122],
                    [33, 63, 93, 123],
                    [34, 64, 94, 124]
                ]
            ]
        )

        result = self.replay.get_current_state()

        assert np.array_equal(result, expected)
    def add_test_frames(self, actions, rewards, terminals):
        frame_1 = np.array(
            [
                [11, 12, 13, 14],
                [21, 22, 23, 24],
                [31, 32, 33, 34]
            ]
        )
        frame_2 = np.array(
            [
                [41, 42, 43, 44],
                [51, 52, 53, 54],
                [61, 62, 63, 64]
            ]
        )
        frame_3 = np.array(
            [
                [71, 72, 73, 74],
                [81, 82, 83, 84],
                [91, 92, 93, 94]
            ]
        )
        frame_4 = np.array(
            [
                [101, 102, 103, 104],
                [111, 112, 113, 114],
                [121, 122, 123, 124]
            ]
        )
        advance_frame = np.array(
            [
                [127, 127, 127, 127],
                [191, 191, 191, 191],
                [255, 255, 255, 255]
            ]
        )
        frames = [frame_1, frame_2, frame_3, frame_4, advance_frame]

        for i in range(5):
            self.replay.insert(frames[i], actions[i], rewards[i], terminals[i])
    def test_advance_state_window(self):
        """Does it return the most recent history_length frames, ignoring
        frames outside of the window?
        """
        self.replay = Replay(self.sess, 10, (3, 4), 4, 3, 6)
        self.sess.run(tf.global_variables_initializer())

        actions = [0] * 5
        rewards = [0.0] * 5
        terminals = [False] * 5
        self.add_test_frames(actions, rewards, terminals)

        expected = np.array(
            [
                [
                    [41, 71, 101, 127],
                    [42, 72, 102, 127],
                    [43, 73, 103, 127],
                    [44, 74, 104, 127]
                ],
                [
                    [51, 81, 111, 191],
                    [52, 82, 112, 191],
                    [53, 83, 113, 191],
                    [54, 84, 114, 191]
                ],
                [
                    [61, 91, 121, 255],
                    [62, 92, 122, 255],
                    [63, 93, 123, 255],
                    [64, 94, 124, 255]
                ]
            ]
        )

        result = self.replay.get_current_state()

        assert np.array_equal(result, expected)
    def test_wrap(self):
        """Mimic adding past the capacity of the replay memory"""
        self.replay = Replay(self.sess, 4, (3, 4), 4, 3, 6)
        self.sess.run(tf.global_variables_initializer())

        frame_1 = np.array(
            [
                [11, 12, 13, 14],
                [21, 22, 23, 24],
                [31, 32, 33, 34]
            ]
        )
        frame_2 = np.array(
            [
                [41, 42, 43, 44],
                [51, 52, 53, 54],
                [61, 62, 63, 64]
            ]
        )
        frame_3 = np.array(
            [
                [71, 72, 73, 74],
                [81, 82, 83, 84],
                [91, 92, 93, 94]
            ]
        )
        frame_4 = np.array(
            [
                [101, 102, 103, 104],
                [111, 112, 113, 114],
                [121, 122, 123, 124]
            ]
        )
        wrap_frame = np.array(
            [
                [127, 127, 127, 127],
                [191, 191, 191, 191],
                [255, 255, 255, 255]
            ]
        )
        frames = [frame_1, frame_2, frame_3, frame_4, wrap_frame]
        action = 0
        reward = 0.0
        terminal = False

        for frame in frames:
            self.replay.insert(frame, action, reward, terminal)

        expected = np.array(
            [
                [
                    [41, 71, 101, 127],
                    [42, 72, 102, 127],
                    [43, 73, 103, 127],
                    [44, 74, 104, 127]
                ],
                [
                    [51, 81, 111, 191],
                    [52, 82, 112, 191],
                    [53, 83, 113, 191],
                    [54, 84, 114, 191]
                ],
                [
                    [61, 91, 121, 255],
                    [62, 92, 122, 255],
                    [63, 93, 123, 255],
                    [64, 94, 124, 255]
                ]
            ]
        )

        result = self.replay.get_current_state()

        assert np.array_equal(result, expected)
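`test_wrap` exercises circular overwriting once capacity is reached: the fifth frame replaces the oldest one. The same wrap-around behavior can be sketched with a minimal ring buffer (a standalone illustration, not the actual `Replay` implementation):

```python
class RingBuffer:
    """Fixed-capacity buffer that overwrites the oldest item when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.index = 0  # next slot to overwrite once full

    def insert(self, item):
        if len(self.items) < self.capacity:
            self.items.append(item)
        else:
            self.items[self.index] = item
        self.index = (self.index + 1) % self.capacity

    def last(self, n):
        """Most recent n items, oldest first."""
        order = [(self.index - n + i) % self.capacity for i in range(n)]
        return [self.items[i] for i in order]


buf = RingBuffer(4)
for frame in ['f1', 'f2', 'f3', 'f4', 'wrap']:
    buf.insert(frame)
print(buf.last(4))  # ['f2', 'f3', 'f4', 'wrap']
```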
    def test_single_sample(self):
        """Verify a single sample"""
        self.replay = Replay(self.sess, 10, (3, 4), 4, 1, 6)
        self.sess.run(tf.global_variables_initializer())

        actions = [0, 1, 2, 3, 4]
        rewards = [0.0, -1.0, 2.0, 0.0, 3.0]
        terminals = [False, False, False, False, True]
        self.add_test_frames(actions, rewards, terminals)

        sample = self.replay.sample()

        expected_first_state = np.array([
            [
                [
                    [11, 41, 71, 101],
                    [12, 42, 72, 102],
                    [13, 43, 73, 103],
                    [14, 44, 74, 104]
                ],
                [
                    [21, 51, 81, 111],
                    [22, 52, 82, 112],
                    [23, 53, 83, 113],
                    [24, 54, 84, 114]
                ],
                [
                    [31, 61, 91, 121],
                    [32, 62, 92, 122],
                    [33, 63, 93, 123],
                    [34, 64, 94, 124]
                ]
            ]
        ])
        expected_next_state = np.array([
            [
                [
                    [41, 71, 101, 127],
                    [42, 72, 102, 127],
                    [43, 73, 103, 127],
                    [44, 74, 104, 127]
                ],
                [
                    [51, 81, 111, 191],
                    [52, 82, 112, 191],
                    [53, 83, 113, 191],
                    [54, 84, 114, 191]
                ],
                [
                    [61, 91, 121, 255],
                    [62, 92, 122, 255],
                    [63, 93, 123, 255],
                    [64, 94, 124, 255]
                ]
            ]
        ])

        assert len(sample) == 5
        assert np.array_equal(sample[0], expected_first_state)
        assert np.array_equal(sample[1], np.array([[0, 0, 0, 0, 1, 0]]))
        assert np.array_equal(sample[2], np.array([3]))
        assert np.array_equal(sample[3], expected_next_state)
        assert np.array_equal(sample[4], np.array([True]))
    def test_no_terminals_in_first_state_1(self):
        """Verify that the sample is not the one with a terminal in the first
        state.
        """
        self.replay = Replay(self.sess, 10, (3, 4), 4, 1, 6)
        self.sess.run(tf.global_variables_initializer())

        actions = [0, 1, 2, 3, 4]
        rewards = [0.0, -1.0, 2.0, 0.0, 3.0]
        terminals = [False, False, False, False, True]
        self.add_test_frames(actions, rewards, terminals)

        frame = np.array(
            [
                [95, 95, 95, 95],
                [159, 159, 159, 159],
                [221, 221, 221, 221]
            ]
        )
        self.replay.insert(frame, 3, 1.0, False)

        sample = self.replay.sample()

        expected_first_state = np.array([
            [
                [
                    [11, 41, 71, 101],
                    [12, 42, 72, 102],
                    [13, 43, 73, 103],
                    [14, 44, 74, 104]
                ],
                [
                    [21, 51, 81, 111],
                    [22, 52, 82, 112],
                    [23, 53, 83, 113],
                    [24, 54, 84, 114]
                ],
                [
                    [31, 61, 91, 121],
                    [32, 62, 92, 122],
                    [33, 63, 93, 123],
                    [34, 64, 94, 124]
                ]
            ]
        ])
        expected_next_state = np.array([
            [
                [
                    [41, 71, 101, 127],
                    [42, 72, 102, 127],
                    [43, 73, 103, 127],
                    [44, 74, 104, 127]
                ],
                [
                    [51, 81, 111, 191],
                    [52, 82, 112, 191],
                    [53, 83, 113, 191],
                    [54, 84, 114, 191]
                ],
                [
                    [61, 91, 121, 255],
                    [62, 92, 122, 255],
                    [63, 93, 123, 255],
                    [64, 94, 124, 255]
                ]
            ]
        ])

        assert len(sample) == 5
        assert np.array_equal(sample[0], expected_first_state)
        assert np.array_equal(sample[1], np.array([[0, 0, 0, 0, 1, 0]]))
        assert np.array_equal(sample[2], np.array([3]))
        assert np.array_equal(sample[3], expected_next_state)
        assert np.array_equal(sample[4], np.array([True]))
    def test_no_terminals_in_first_state_2(self):
        """Verify that the sample is not the one with a terminal in the first
        state.
        """
        self.replay = Replay(self.sess, 10, (3, 4), 4, 1, 6)
        self.sess.run(tf.global_variables_initializer())

        actions = [0, 1, 2, 3, 4]
        rewards = [0.0, -1.0, 2.0, 0.0, 3.0]
        terminals = [True, False, False, False, False]
        self.add_test_frames(actions, rewards, terminals)

        frame = np.array(
            [
                [95, 95, 95, 95],
                [159, 159, 159, 159],
                [221, 221, 221, 221]
            ]
        )
        self.replay.insert(frame, 3, 1.0, True)

        sample = self.replay.sample()

        expected_first_state = np.array([
            [
                [
                    [41, 71, 101, 127],
                    [42, 72, 102, 127],
                    [43, 73, 103, 127],
                    [44, 74, 104, 127]
                ],
                [
                    [51, 81, 111, 191],
                    [52, 82, 112, 191],
                    [53, 83, 113, 191],
                    [54, 84, 114, 191]
                ],
                [
                    [61, 91, 121, 255],
                    [62, 92, 122, 255],
                    [63, 93, 123, 255],
                    [64, 94, 124, 255]
                ]
            ]
        ])
        expected_next_state = np.array([
            [
                [
                    [71, 101, 127, 95],
                    [72, 102, 127, 95],
                    [73, 103, 127, 95],
                    [74, 104, 127, 95]
                ],
                [
                    [81, 111, 191, 159],
                    [82, 112, 191, 159],
                    [83, 113, 191, 159],
                    [84, 114, 191, 159]
                ],
                [
                    [91, 121, 255, 221],
                    [92, 122, 255, 221],
                    [93, 123, 255, 221],
                    [94, 124, 255, 221]
                ]
            ]
        ])

        assert len(sample) == 5
        assert np.array_equal(sample[0], expected_first_state)
        assert np.array_equal(sample[1], np.array([[0, 0, 0, 1, 0, 0]]))
        assert np.array_equal(sample[2], np.array([1.0]))
        assert np.array_equal(sample[3], expected_next_state)
        assert np.array_equal(sample[4], np.array([True]))
    def test_sample_alignment(self):
        """Verify that sample data lines up"""
        self.replay = Replay(self.sess, 10, (3, 4), 4, 32, 6)
        self.sess.run(tf.global_variables_initializer())

        actions = [0, 1, 2, 3, 4, 5, 0, 1, 2, 3]
        rewards = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, -1.0, -2.0, -3.0, -4.0]
        terminals = [
            False, False, False, True, False, False, False, False, False, False
        ]
        frame = np.array(
            [
                [1, 1, 1, 1],
                [2, 2, 2, 2],
                [3, 3, 3, 3]
            ]
        )
        for i in range(len(actions)):
            self.replay.insert(frame * i, actions[i], rewards[i], terminals[i])

        batch = self.replay.sample()
        b_frames1, b_actions, b_rewards, b_frames2, b_terminals = batch

        for sample_i, r in enumerate(b_rewards):
            input_i = rewards.index(r)
            b_action = np.nonzero(b_actions[sample_i])[0][0]
            assert b_action == actions[input_i]
            assert b_terminals[sample_i] == terminals[input_i]
# === Account/app/mod_database/helpers.py (TamSzaGot/mydata-sdk, MIT) ===
# -*- coding: utf-8 -*-
"""
__author__ = "Jani Yli-Kantola"
__copyright__ = ""
__credits__ = ["Harri Hirvonsalo", "Aleksi Palomäki"]
__license__ = "MIT"
__version__ = "1.3.0"
__maintainer__ = "Jani Yli-Kantola"
__contact__ = "https://github.com/HIIT/mydata-stack"
__status__ = "Development"
"""
# Import dependencies
from flask import current_app
from app.helpers import get_custom_logger
from app.app_modules import db
logger = get_custom_logger(__name__)
def log_query(sql_query=None, arguments=None):
    logger.info("Executing")

    if sql_query is None:
        raise AttributeError("Provide sql_query as parameter")
    if arguments is None:
        raise AttributeError("Provide arguments as parameter")

    logger.debug('sql_query: ' + repr(sql_query))

    for index in range(len(arguments)):
        logger.debug("arguments[" + str(index) + "]: " + str(arguments[index]))

    logger.debug('SQL query to execute: ' + repr(sql_query % arguments))
def get_db_cursor():
    logger.info("Executing")
    try:
        cursor = db.connection.cursor()
    except Exception as exp:
        logger.debug('db.connection.cursor(): ' + repr(exp))
        raise RuntimeError('Could not get cursor for database connection')
    else:
        logger.debug('DB cursor at ' + repr(cursor))
        return cursor
def execute_sql_insert(cursor, sql_query):
    """
    :param cursor:
    :param sql_query:
    :return: cursor:
    :return: last_id:

    INSERT to MySQL
    """
    logger.info("Executing")

    last_id = ""

    if current_app.config["SUPER_DEBUG"]:
        logger.debug('sql_query: ' + repr(sql_query))

    try:
        # Should be done like here: http://stackoverflow.com/questions/3617052/escape-string-python-for-mysql/27575399#27575399
        cursor.execute(sql_query)
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        raise

    try:
        last_id = str(cursor.lastrowid)
    except Exception as exp:
        logger.debug('cursor.lastrowid not found: ' + repr(exp))
        raise
    else:
        logger.debug('cursor.lastrowid: ' + last_id)

    return cursor, last_id
def execute_sql_insert_2(cursor, sql_query, arguments):
    """
    :param cursor:
    :param sql_query:
    :return: cursor:
    :return: last_id:

    INSERT to MySQL
    """
    logger.info("Executing")

    last_id = ""

    log_query(sql_query=sql_query, arguments=arguments)

    try:
        # Should be done like here: http://stackoverflow.com/questions/3617052/escape-string-python-for-mysql/27575399#27575399
        cursor.execute(sql_query, (arguments))
        logger.debug("Executed SQL query: " + str(cursor._last_executed))
        logger.debug("Affected rows: " + str(cursor.rowcount))
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        raise

    try:
        last_id = str(cursor.lastrowid)
    except Exception as exp:
        logger.debug('cursor.lastrowid not found: ' + repr(exp))
        raise
    else:
        logger.debug('cursor.lastrowid: ' + last_id)

    return cursor, last_id
def execute_sql_update(cursor, sql_query, arguments):
    """
    :param arguments:
    :param cursor:
    :param sql_query:
    :return: cursor:

    INSERT to MySQL
    """
    logger.info("Executing")

    logger.debug('sql_query: ' + str(sql_query))

    for index in range(len(arguments)):
        logger.debug("arguments[" + str(index) + "]: " + str(arguments[index]))

    try:
        # Should be done like here: http://stackoverflow.com/questions/3617052/escape-string-python-for-mysql/27575399#27575399
        cursor.execute(sql_query, (arguments))
        logger.debug("Executed SQL query: " + str(cursor._last_executed))
        logger.debug("Affected rows SQL query: " + str(cursor.rowcount))
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        raise
    else:
        logger.debug('db entry updated')

    return cursor
def execute_sql_select(cursor=None, sql_query=None):
    """
    :param cursor:
    :param sql_query:
    :return: cursor:
    :return: last_id:

    SELECT from MySQL
    """
    logger.info("Executing")

    if current_app.config["SUPER_DEBUG"]:
        logger.debug('sql_query: ' + repr(sql_query))

    try:
        cursor.execute(sql_query)
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        raise

    try:
        data = cursor.fetchall()
    except Exception as exp:
        logger.debug('cursor.fetchall() failed: ' + repr(exp))
        data = 'No content'

    if current_app.config["SUPER_DEBUG"]:
        logger.debug('data ' + repr(data))

    return cursor, data
def execute_sql_select_2(cursor=None, sql_query=None, arguments=None):
    """
    :param cursor:
    :param sql_query:
    :return: cursor:
    :return: last_id:

    SELECT from MySQL
    """
    logger.info("Executing")

    log_query(sql_query=sql_query, arguments=arguments)

    try:
        cursor.execute(sql_query, (arguments))
        logger.debug("Executed SQL query: " + str(cursor._last_executed))
        logger.debug("Affected rows: " + str(cursor.rowcount))
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        raise

    try:
        data = cursor.fetchall()
    except Exception as exp:
        logger.debug('cursor.fetchall() failed: ' + repr(exp))
        data = 'No content'

    logger.debug('data ' + repr(data))

    return cursor, data
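The helpers above pass query arguments to `cursor.execute(sql, args)` so the driver does the escaping, rather than interpolating values into the SQL string. A minimal sketch of the same pattern using stdlib sqlite3 (the MySQL driver used here takes `%s` placeholders where sqlite3 takes `?`; the table is illustrative):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE Accounts (id INTEGER PRIMARY KEY, name TEXT)')

# Values are bound by the driver, never concatenated into the SQL text.
cur.execute('INSERT INTO Accounts (name) VALUES (?)', ('alice',))
cur.execute('SELECT id, name FROM Accounts WHERE name = ?', ('alice',))
print(cur.fetchall())  # [(1, 'alice')]
```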
def execute_sql_count(cursor=None, sql_query=None):
    """
    :param cursor:
    :param sql_query:
    :return: cursor:
    :return: last_id:

    SELECT from MySQL
    """
    logger.info("Executing")

    consent_count = 0
    data = None  # ensure defined even if fetchone() below fails

    if current_app.config["SUPER_DEBUG"]:
        logger.debug('sql_query: ' + repr(sql_query))

    try:
        cursor.execute(sql_query)
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        raise

    try:
        data = cursor.fetchone()
        if current_app.config["SUPER_DEBUG"]:
            logger.debug('data: ' + repr(data))
        consent_count = int(data[0])
    except Exception as exp:
        logger.debug('cursor.fetchone() failed: ' + repr(exp))

    if current_app.config["SUPER_DEBUG"]:
        logger.debug('data ' + repr(data))

    return cursor, consent_count
def drop_table_content():
    """
    http://stackoverflow.com/questions/5452760/truncate-foreign-key-constrained-table/5452798#5452798

    Drop table content
    """
    logger.info("Executing")
    try:
        cursor = get_db_cursor()
    except Exception as exp:
        logger.debug('Could not get db cursor: ' + repr(exp))
        raise

    sql_query = "SELECT Concat('TRUNCATE TABLE ',table_schema,'.',TABLE_NAME, ';') " \
                "FROM INFORMATION_SCHEMA.TABLES where table_schema in ('MyDataAccount');"

    try:
        cursor.execute(sql_query)
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        db.connection.rollback()
        raise
    else:
        sql_queries = cursor.fetchall()
        logger.debug("Fetched sql_queries: " + repr(sql_queries))

        try:
            logger.debug("SET FOREIGN_KEY_CHECKS = 0;")
            cursor.execute("SET FOREIGN_KEY_CHECKS = 0;")

            for query in sql_queries:
                logger.debug("Executing: " + str(query[0]))
                sql_query = str(query[0])
                cursor.execute(sql_query)
        except Exception as exp:
            logger.debug('Error in SQL query execution: ' + repr(exp))
            db.connection.rollback()
            logger.debug("SET FOREIGN_KEY_CHECKS = 1;")
            cursor.execute("SET FOREIGN_KEY_CHECKS = 1;")
            raise
        else:
            db.connection.commit()
            logger.debug("Committed")
            logger.debug("SET FOREIGN_KEY_CHECKS = 1;")
            cursor.execute("SET FOREIGN_KEY_CHECKS = 1;")
            return True
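`drop_table_content` wipes foreign-key-constrained tables by disabling FK checks, truncating every table in any order, then re-enabling the checks. The same pattern can be sketched with stdlib sqlite3, where `PRAGMA foreign_keys` plays the role of MySQL's `SET FOREIGN_KEY_CHECKS` (autocommit mode is used because sqlite ignores that pragma inside a transaction):

```python
import sqlite3

conn = sqlite3.connect(':memory:', isolation_level=None)  # autocommit
conn.execute('PRAGMA foreign_keys = ON')
conn.execute('CREATE TABLE parent (id INTEGER PRIMARY KEY)')
conn.execute('CREATE TABLE child (id INTEGER PRIMARY KEY, '
             'parent_id INTEGER REFERENCES parent(id))')
conn.execute('INSERT INTO parent (id) VALUES (1)')
conn.execute('INSERT INTO child (id, parent_id) VALUES (1, 1)')

# Disable FK checks, wipe tables in arbitrary order, re-enable checks.
# Deleting parent rows first would normally violate the child FK.
conn.execute('PRAGMA foreign_keys = OFF')
for table in ('parent', 'child'):
    conn.execute('DELETE FROM ' + table)
conn.execute('PRAGMA foreign_keys = ON')
print(conn.execute('SELECT COUNT(*) FROM child').fetchone())  # (0,)
```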
def delete_account_from_database(account_id=None):
    """
    Delete all entries related to Account
    """
    logger.info("Executing")
    if account_id is None:
        raise AttributeError("Provide account_id as parameter")
    if not isinstance(account_id, int):
        try:
            account_id = int(account_id)
        except Exception as exp:
            logger.error("account_id has wrong type: " + repr(type(account_id)) + ' - ' + repr(exp))
            raise TypeError("account_id MUST be int")
    else:
        logger.info("account_id: " + str(account_id))

    try:
        cursor = get_db_cursor()
    except Exception as exp:
        logger.debug('Could not get db cursor: ' + repr(exp))
        raise

    sql_query_for_account_table = "DELETE FROM MyDataAccount.Accounts WHERE id = {0};".format(account_id)

    sql_query = "SELECT Concat('DELETE FROM ',table_schema,'.',TABLE_NAME, ' ', 'WHERE Accounts_id = %s',';') " \
                "FROM INFORMATION_SCHEMA.TABLES where table_schema in ('MyDataAccount');"

    arguments = (
        int(account_id),
    )

    try:
        log_query(sql_query=sql_query, arguments=arguments)
        cursor.execute(sql_query, (arguments))
    except Exception as exp:
        logger.debug('Error in SQL query execution: ' + repr(exp))
        db.connection.rollback()
        raise
    else:
        sql_queries = cursor.fetchall()
        logger.debug("Fetched sql_queries: " + repr(sql_queries))

        try:
            logger.debug("SET FOREIGN_KEY_CHECKS = 0;")
            cursor.execute("SET FOREIGN_KEY_CHECKS = 0;")

            for query in sql_queries:
                if "MyDataAccount.Accounts" in query[0]:  # MyDataAccount.Accounts has to be skipped here because it lacks an "Accounts_id" column
                    logger.debug("Skipping MyDataAccount.Accounts table because missing table column Accounts_id")
                else:
                    logger.debug("Executing: " + str(query[0]))
                    sql_query = str(query[0])
                    cursor.execute(sql_query)

            logger.debug("Executing: " + str(sql_query_for_account_table))  # Handling MyDataAccount.Accounts
            cursor.execute(sql_query_for_account_table)
        except Exception as exp:
            logger.debug('Error in SQL query execution: ' + repr(exp))
            db.connection.rollback()
            logger.debug("SET FOREIGN_KEY_CHECKS = 1;")
            cursor.execute("SET FOREIGN_KEY_CHECKS = 1;")
            raise
        else:
            db.connection.commit()
            logger.debug("Committed")
            logger.debug("SET FOREIGN_KEY_CHECKS = 1;")
            cursor.execute("SET FOREIGN_KEY_CHECKS = 1;")
            return True
def get_primary_keys_by_account_id(cursor=None, account_id=None, table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if account_id is None:
        raise AttributeError("Provide account_id as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")

    sql_query = "SELECT id " \
                "FROM " + table_name + " " \
                "WHERE Accounts_id LIKE %s;"

    arguments = (
        '%' + str(account_id) + '%',
    )

    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data[0])
        logger.info("Got data_list: " + repr(data_list))
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i])
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list
def get_slr_ids(cursor=None, account_id=None, table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if account_id is None:
        raise AttributeError("Provide account_id as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")

    sql_query = "SELECT serviceLinkRecordId " \
                "FROM " + table_name + " " \
                "WHERE Accounts_id LIKE %s;"

    arguments = (
        '%' + str(account_id) + '%',
    )

    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        # logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data)
        logger.info("Got data_list: " + repr(data_list))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i][0])
        logger.info("Formatted data_list: " + repr(data_list))
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list
def get_slr_ids_by_service(cursor=None, service_id=None, surrogate_id="", account_id="", table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if service_id is None:
        raise AttributeError("Provide service_id as parameter")
    if surrogate_id is None:
        raise AttributeError("Provide surrogate_id as parameter")
    if account_id is None:
        raise AttributeError("Provide account_id as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")

    sql_query = "SELECT serviceLinkRecordId " \
                "FROM " + table_name + " " \
                "WHERE serviceId LIKE %s AND surrogateId LIKE %s AND Accounts_id LIKE %s;"

    arguments = (
        '%' + str(service_id) + '%',
        '%' + str(surrogate_id) + '%',
        '%' + str(account_id) + '%',
    )

    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        # logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data)
        logger.info("Got data_list: " + repr(data_list))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i][0])
        logger.info("Formatted data_list: " + repr(data_list))
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list


def get_slsr_ids(cursor=None, slr_id=None, table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if slr_id is None:
        raise AttributeError("Provide slr_id as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    sql_query = "SELECT serviceLinkStatusRecordId " \
                "FROM " + table_name + " " \
                "WHERE serviceLinkRecordId LIKE %s;"
    arguments = (
        '%' + str(slr_id) + '%',
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data[0])
        logger.info("Got data_list: " + repr(data_list))
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i])
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list


def get_last_slsr_id(cursor=None, slr_id=None, table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if slr_id is None:
        raise AttributeError("Provide slr_id as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    sql_query = "SELECT serviceLinkStatusRecordId " \
                "FROM " + table_name + " " \
                "WHERE serviceLinkRecordId LIKE %s " \
                "ORDER BY id DESC " \
                "LIMIT 1;"
    arguments = (
        '%' + str(slr_id) + '%',
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data[0])
        logger.info("Got data_list: " + repr(data_list))
        entry_id = str(data_list[0])
        logger.info("Got entry_id: " + repr(entry_id))
        return cursor, entry_id


def get_cr_ids(cursor=None, slr_id=None, table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if slr_id is None:
        raise AttributeError("Provide slr_id as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    sql_query = "SELECT consentRecordId " \
                "FROM " + table_name + " " \
                "WHERE serviceLinkRecordId LIKE %s;"
    arguments = (
        '%' + str(slr_id) + '%',
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data[0])
        logger.info("Got data_list: " + repr(data_list))
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i])
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list


def get_csr_ids(cursor=None, cr_id=None, csr_primary_key=None, table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if cr_id is None:
        raise AttributeError("Provide cr_id as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    if csr_primary_key is None:
        sql_query = "SELECT consentStatusRecordId " \
                    "FROM " + table_name + " " \
                    "WHERE consentRecordId LIKE %s;"
        arguments = (
            '%' + str(cr_id) + '%',
        )
    else:
        sql_query = "SELECT consentStatusRecordId " \
                    "FROM " + table_name + " " \
                    "WHERE consentRecordId LIKE %s AND id > %s;"
        arguments = (
            '%' + str(cr_id) + '%',
            int(csr_primary_key),
        )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data)
        logger.info("Got data_list: " + repr(data_list))
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i][-1])
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list


def get_last_csr_id(cursor=None, consent_id=None, account_id="", table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    try:
        consent_id = str(consent_id)
    except Exception:
        raise TypeError("consent_id MUST be str, not " + str(type(consent_id)))
    try:
        account_id = int(account_id)
    except Exception:
        logger.warning("account_id SHOULD be int, not " + str(type(account_id)))
        logger.warning("Querying without Account ID")
        sql_query = "SELECT consentStatusRecordId " \
                    "FROM " + table_name + " " \
                    "WHERE consentRecordId LIKE %s " \
                    "ORDER BY id DESC " \
                    "LIMIT 1;"
        arguments = (
            '%' + str(consent_id) + '%',
        )
    else:
        logger.debug("Querying with account_id")
        sql_query = "SELECT consentStatusRecordId " \
                    "FROM " + table_name + " " \
                    "WHERE consentRecordId LIKE %s " \
                    "AND Accounts_id = %s " \
                    "ORDER BY id DESC " \
                    "LIMIT 1;"
        arguments = (
            '%' + str(consent_id) + '%',
            int(account_id),
        )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data[0])
        logger.info("Got data_list: " + repr(data_list))
        entry_id = str(data_list[0])
        logger.info("Got entry_id: " + repr(entry_id))
        return cursor, entry_id


def get_account_id_by_csr_id(cursor=None, cr_id=None, acc_table_name=None, slr_table_name=None, cr_table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if cr_id is None:
        raise AttributeError("Provide cr_id as parameter")
    if acc_table_name is None:
        raise AttributeError("Provide acc_table_name as parameter")
    if slr_table_name is None:
        raise AttributeError("Provide slr_table_name as parameter")
    if cr_table_name is None:
        raise AttributeError("Provide cr_table_name as parameter")
    sql_query = "SELECT `Accounts`.`id` " \
                "FROM " + acc_table_name + " " \
                "INNER JOIN " + slr_table_name + " on " + acc_table_name + ".`id` = " + slr_table_name + ".`Accounts_id` " \
                "INNER JOIN " + cr_table_name + " on " + slr_table_name + ".`id` = " + cr_table_name + ".`ServiceLinkRecords_id` " \
                "WHERE " + cr_table_name + ".`consentRecordId` LIKE %s " \
                "LIMIT 1;"
    arguments = (
        '%' + str(cr_id) + '%',
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data[0])
        logger.info("Got data_list: " + repr(data_list))
        entry_id = str(data_list[0])
        logger.info("Got entry_id: " + repr(entry_id))
        return cursor, entry_id


def get_consent_ids(cursor=None, surrogate_id="", slr_id="", subject_id="", consent_pair_id="", account_id="", table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    try:
        surrogate_id = str(surrogate_id)
    except Exception:
        raise TypeError("surrogate_id MUST be str, not " + str(type(surrogate_id)))
    try:
        slr_id = str(slr_id)
    except Exception:
        raise TypeError("slr_id MUST be str, not " + str(type(slr_id)))
    try:
        subject_id = str(subject_id)
    except Exception:
        raise TypeError("subject_id MUST be str, not " + str(type(subject_id)))
    try:
        consent_pair_id = str(consent_pair_id)
    except Exception:
        raise TypeError("consent_pair_id MUST be str, not " + str(type(consent_pair_id)))
    try:
        account_id = str(account_id)
    except Exception:
        raise TypeError("account_id MUST be str, not " + str(type(account_id)))
    sql_query = "SELECT consentRecordId " \
                "FROM " + table_name + " " \
                "WHERE surrogateId LIKE %s " \
                "AND serviceLinkRecordId LIKE %s " \
                "AND subjectId LIKE %s " \
                "AND consentPairId LIKE %s " \
                "AND Accounts_id LIKE %s;"
    arguments = (
        '%' + str(surrogate_id) + '%',
        '%' + str(slr_id) + '%',
        '%' + str(subject_id) + '%',
        '%' + str(consent_pair_id) + '%',
        '%' + str(account_id) + '%',
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        #logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data)
        logger.info("Got data_list: " + repr(data_list))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i][0])
        logger.info("Formatted data_list: " + repr(data_list))
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list


def get_last_consent_id(cursor=None, surrogate_id="", slr_id="", subject_id="", consent_pair_id="", account_id="", table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    try:
        surrogate_id = str(surrogate_id)
    except Exception:
        raise TypeError("surrogate_id MUST be str, not " + str(type(surrogate_id)))
    try:
        slr_id = str(slr_id)
    except Exception:
        raise TypeError("slr_id MUST be str, not " + str(type(slr_id)))
    try:
        subject_id = str(subject_id)
    except Exception:
        raise TypeError("subject_id MUST be str, not " + str(type(subject_id)))
    try:
        consent_pair_id = str(consent_pair_id)
    except Exception:
        raise TypeError("consent_pair_id MUST be str, not " + str(type(consent_pair_id)))
    try:
        account_id = str(account_id)
    except Exception:
        raise TypeError("account_id MUST be str, not " + str(type(account_id)))
    sql_query = "SELECT consentRecordId " \
                "FROM " + table_name + " " \
                "WHERE surrogateId LIKE %s " \
                "AND serviceLinkRecordId LIKE %s " \
                "AND subjectId LIKE %s " \
                "AND consentPairId LIKE %s " \
                "AND Accounts_id LIKE %s " \
                "ORDER BY id DESC LIMIT 1;"
    arguments = (
        '%' + str(surrogate_id) + '%',
        '%' + str(slr_id) + '%',
        '%' + str(subject_id) + '%',
        '%' + str(consent_pair_id) + '%',
        '%' + str(account_id) + '%',
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        #logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data)
        logger.info("Got data_list: " + repr(data_list))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i][0])
        logger.info("Formatted data_list: " + repr(data_list))
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list


def get_consent_status_ids(cursor=None, cr_id="", account_id="", primary_key_filter=0, table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    try:
        cr_id = str(cr_id)
    except Exception:
        raise TypeError("cr_id MUST be str, not " + str(type(cr_id)))
    try:
        account_id = str(account_id)
    except Exception:
        raise TypeError("account_id MUST be str, not " + str(type(account_id)))
    try:
        primary_key_filter = int(primary_key_filter)
    except Exception:
        raise TypeError("primary_key_filter MUST be int, not " + str(type(primary_key_filter)))
    sql_query = "SELECT consentStatusRecordId " \
                "FROM " + table_name + " " \
                "WHERE consentRecordId LIKE %s " \
                "AND Accounts_id LIKE %s " \
                "AND id > %s;"
    arguments = (
        '%' + str(cr_id) + '%',
        '%' + str(account_id) + '%',
        int(primary_key_filter),
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        #logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data)
        logger.info("Got data_list: " + repr(data_list))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i][0])
        logger.info("Formatted data_list: " + repr(data_list))
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        return cursor, id_list


def get_consent_status_id_filter(cursor=None, csr_id="", account_id="", table_name=None):
    logger.info("Executing")
    if cursor is None:
        raise AttributeError("Provide cursor as parameter")
    if table_name is None:
        raise AttributeError("Provide table_name as parameter")
    try:
        csr_id = str(csr_id)
    except Exception:
        raise TypeError("csr_id MUST be str, not " + str(type(csr_id)))
    try:
        account_id = str(account_id)
    except Exception:
        raise TypeError("account_id MUST be str, not " + str(type(account_id)))
    sql_query = "SELECT id " \
                "FROM " + table_name + " " \
                "WHERE consentStatusRecordId LIKE %s " \
                "AND Accounts_id LIKE %s;"
    arguments = (
        '%' + str(csr_id) + '%',
        '%' + str(account_id) + '%',
    )
    try:
        cursor, data = execute_sql_select_2(cursor=cursor, sql_query=sql_query, arguments=arguments)
    except Exception as exp:
        logger.debug('sql_query: ' + repr(exp))
        raise
    else:
        logger.debug("Got data: " + repr(data))
        #logger.debug("Got data[0]: " + repr(data[0]))
        data_list = list(data)
        logger.info("Got data_list: " + repr(data_list))
        if len(data) == 0:
            logger.error("IndexError('DB query returned no results')")
            raise IndexError("DB query returned no results")
        for i in range(len(data_list)):
            data_list[i] = str(data_list[i][0])
        logger.info("Formatted data_list: " + repr(data_list))
        id_list = data_list
        logger.info("Got id_list: " + repr(id_list))
        id = max(id_list)
        logger.info("Got max id: " + str(id))
        return cursor, id
| 32.107486 | 151 | 0.599892 | 4,106 | 33,456 | 4.70263 | 0.05017 | 0.048889 | 0.023357 | 0.053084 | 0.881506 | 0.847688 | 0.833653 | 0.817857 | 0.805479 | 0.788959 | 0 | 0.007896 | 0.280727 | 33,456 | 1,041 | 152 | 32.138329 | 0.794506 | 0.051471 | 0 | 0.847884 | 0 | 0 | 0.22603 | 0.012672 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030423 | false | 0 | 0.003968 | 0 | 0.063492 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
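Every helper in the record-store module above follows the same SELECT → fetch → empty-check → flatten pattern: the table name is concatenated into the query while the matched value travels as a bound `%s` parameter. A minimal, self-contained sketch of that pattern, using the stdlib `sqlite3` in place of the project's MySQL cursor and `execute_sql_select_2` helper (the table and column names below are illustrative, not taken from the project schema):

```python
import sqlite3

def get_ids_like(cursor, table_name, id_column, filter_column, filter_value):
    # Same shape as the helpers above: identifier names are concatenated,
    # the wildcard LIKE value is bound ('?' in sqlite3, '%s' in mysql-connector).
    sql_query = ("SELECT " + id_column + " FROM " + table_name +
                 " WHERE " + filter_column + " LIKE ?")
    cursor.execute(sql_query, ('%' + str(filter_value) + '%',))
    data = cursor.fetchall()
    if len(data) == 0:
        raise IndexError("DB query returned no results")
    # Each fetched row is a 1-tuple; flatten to a list of strings.
    return [str(row[0]) for row in data]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE ServiceLinkRecords "
            "(id INTEGER PRIMARY KEY, serviceLinkRecordId TEXT, Accounts_id INTEGER)")
cur.executemany("INSERT INTO ServiceLinkRecords (serviceLinkRecordId, Accounts_id) VALUES (?, ?)",
                [("slr-1", 7), ("slr-2", 7), ("slr-3", 8)])
ids = get_ids_like(cur, "ServiceLinkRecords", "serviceLinkRecordId", "Accounts_id", 7)
print(ids)  # ['slr-1', 'slr-2']
```

As in the module above, only identifier names go into the SQL string itself; the driver quotes and escapes the bound value.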
e5506c8e9cd6e52ca7041b8c86c74d2e49f47621 | 9,196 | py | Python | InfectiousDiseasePrevention_SupportSystem_Kakemizu/program/dbFunction.py | YamashitaTsuyoshi/Graduate-Research-2020 | d3d46e991e789836622cfccb529de871f94d28f1 | [
"MIT"
] | null | null | null | InfectiousDiseasePrevention_SupportSystem_Kakemizu/program/dbFunction.py | YamashitaTsuyoshi/Graduate-Research-2020 | d3d46e991e789836622cfccb529de871f94d28f1 | [
"MIT"
] | null | null | null | InfectiousDiseasePrevention_SupportSystem_Kakemizu/program/dbFunction.py | YamashitaTsuyoshi/Graduate-Research-2020 | d3d46e991e789836622cfccb529de871f94d28f1 | [
"MIT"
] | 15 | 2021-02-23T04:18:51.000Z | 2021-03-12T07:33:49.000Z | # -*- coding: utf-8 -*-
import mysql.connector as mydb
import datetime
import random  # for testing only; remove later


def add_record(dt, from_serial_ID, LQI, temp, humid, pressure, VCC, CO2):
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True)
    # Look up the devID for this serial number; register the device first if it is unknown
    sql = 'SELECT devID FROM device WHERE serialNo = ' + str(from_serial_ID)
    cur.execute(sql)
    if cur.rowcount != 1:
        cur.fetchall()
        sql = 'INSERT INTO device (serialNo) VALUES (' + str(from_serial_ID) + ')'
        cur.execute(sql)
        conn.commit()
        sql = 'SELECT devID FROM device WHERE serialNo = ' + str(from_serial_ID)
        cur.execute(sql)
    dev = cur.fetchone()
    dev_ID = dev[0]
    sql = 'INSERT INTO records (dt,devID,LQI,temp,humid,pressure,VCC,CO2) VALUES ("' + \
          dt.strftime("%Y-%m-%d %H:%M:%S") + '",' + str(dev_ID) + ',' + str(LQI) + ',' + \
          str(temp) + ',' + str(humid) + ',' + str(pressure) + ',' + str(VCC) + ',' + str(CO2) + ')'
    cur.execute(sql)
    conn.commit()
    cur.close()
    conn.close()


def add_results(dt, co2Concentration, wlevel, numofPersons, rlevel):
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'INSERT INTO results (dt,co2Concentration,wlevel,numofPersons,rlevel) VALUES ("' + \
          dt.strftime("%Y-%m-%d %H:%M:%S") + '",' + str(co2Concentration) + ',' + str(wlevel) + ',' + \
          str(numofPersons) + ',"' + rlevel + '")'
    cur.execute(sql)
    conn.commit()
    cur.close()
    conn.close()


def get_all_record(srcDate):
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True, dictionary=True)
    #sql = 'SELECT dt,device.serialNo,LQI,temp,humid,pressure,VCC,CO2 FROM records JOIN device USING (devID)'
    sql = 'SELECT dt,devID,LQI,temp,humid,pressure,VCC,CO2 FROM records ' + 'WHERE DATE_FORMAT(dt, "%Y-%m-%d") = "' + \
          srcDate.strftime("%Y-%m-%d") + '" ORDER BY dt DESC'
    cur.execute(sql)
    ans = cur.fetchall()
    # Rename keys to something clearer
    #for record in ans:
    #    record['from_serial_ID'] = record['serialNo']
    #    del record['serialNo']
    cur.close()
    conn.close()
    return ans


def get_all_devID():
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'SELECT devID FROM device'
    cur.execute(sql)
    ans = cur.fetchall()
    IDarray = []
    for rec in ans:
        IDarray.append(rec[0])
    cur.close()
    conn.close()
    return IDarray


'''
# Fetch the most recent num records for devID
def get_records(devID, num, srcDate):
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'SELECT dt,devID,LQI,temp,humid,pressure,VCC,CO2 FROM records WHERE devID = ' + str(devID) + ' AND DATE_FORMAT(dt, "%Y-%m-%d") = "' + \
          srcDate.strftime("%Y-%m-%d") + '" ORDER BY dt DESC LIMIT ' + str(num)
    cur.execute(sql)
    ans = cur.fetchall()
    cur.close()
    conn.close()
    return ans
'''


# Fetch records for devID newer than the given time
def get_records(devID, srcDate):
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'SELECT dt,devID,LQI,temp,humid,pressure,VCC,CO2 FROM records WHERE devID = ' + str(devID) + ' AND dt >= "' + \
          srcDate.strftime("%Y-%m-%d %H:%M:%S") + '" ORDER BY dt DESC'
    cur.execute(sql)
    ans = cur.fetchall()
    cur.close()
    conn.close()
    return ans


def get_recent_dev(srcDate):
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'SELECT devID FROM records WHERE dt >= "' + \
          srcDate.strftime("%Y-%m-%d %H:%M:%S") + '" ORDER BY dt DESC'
    cur.execute(sql)
    ans = cur.fetchall()
    IDarray = []
    for rec in ans:
        IDarray.append(rec[0])
    cur.close()
    conn.close()
    IDarray = list(set(IDarray))
    return IDarray


def regi_enter_risk(enterLisk):
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="RiskDB"
    )
    cur = conn.cursor(buffered=True)
    # Normalise the colour name to upper case
    EnterLisk = enterLisk.upper()
    dt = datetime.datetime.now()
    sql = 'INSERT INTO enterLisk (dt,risk) VALUES ("' + \
          dt.strftime("%Y-%m-%d %H:%M:%S") + '","' + EnterLisk + '")'
    cur.execute(sql)
    conn.commit()
    cur.close()
    conn.close()


def get_last_enter_risk():
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="RiskDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'SELECT risk FROM enterLisk ORDER BY dt DESC LIMIT 1'
    cur.execute(sql)
    ans = cur.fetchall()
    IDarray = []
    for rec in ans:
        IDarray.append(rec[0])
    cur.close()
    conn.close()
    return IDarray


def del_all_records():
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="SensorDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'DELETE FROM device'
    cur.execute(sql)
    sql = 'DELETE FROM records'
    cur.execute(sql)
    conn.commit()
    # Reset the auto-increment counters
    sql = 'alter table device auto_increment = 1'
    cur.execute(sql)
    sql = 'alter table records auto_increment = 1'
    cur.execute(sql)
    conn.commit()
    cur.close()
    conn.close()


def del_all_risks():
    conn = mydb.connect(
        user="EnviMonitor",
        password="EnviMonitor",
        host="localhost",
        database="RiskDB"
    )
    cur = conn.cursor(buffered=True)
    sql = 'DELETE FROM enterLisk'
    cur.execute(sql)
    conn.commit()
    # Reset the auto-increment counter
    sql = 'alter table enterLisk auto_increment = 1'
    cur.execute(sql)
    conn.commit()
    cur.close()
    conn.close()


def make_test_data():
    '''
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 16),12345,100,20.5,55.6,1000,3100,400)
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 15),67890,90,20.8,55.6,1000,3100,400)
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 12),12345,100,20.5,55.6,1000,3100,410)
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 12),67890,105,20.8,30.0,1000,3100,320)
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 9),12345,100,20.5,55.6,1000,3000,450)
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 10),67890,106,21.8,30.0,1000,3000,1000)
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 6),12345,88,20.5,55.6,1000,3000,480)
    add_record(datetime.datetime.now() - datetime.timedelta(minutes= 3),67890,108,22.7,30.0,1000,3000,400)
    add_record(datetime.datetime.now(),12345,100,20.5,55.6,1000,3000,500)
    add_record(datetime.datetime.now(),67890,120,23.6,30.0,1000,2900,450)
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 3),12345,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 3),67890,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 6),12345,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 6),67890,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 9),12345,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 9),67890,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 12),12345,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 12),67890,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 15),12345,100,20.5,55.6,1000,3100,random.randint(500,1200))
    add_record(datetime.datetime.now()- datetime.timedelta(minutes= 15),67890,100,20.5,55.6,1000,3100,random.randint(500,1200))
    regi_enter_risk("Yellow")
    '''
    #del_all_records()
    #random.seed()
    #add_record(datetime.datetime.now(),12345,100,20.5,55.6,1000,3100,random.randint(500,1200))
    #add_record(datetime.datetime.now(),67890,100,20.5,55.6,1000,3100,random.randint(500,1200))
| 32.609929 | 143 | 0.617986 | 1,187 | 9,196 | 4.733783 | 0.133109 | 0.036839 | 0.077772 | 0.097882 | 0.82488 | 0.814558 | 0.775583 | 0.769532 | 0.734116 | 0.693006 | 0 | 0.089932 | 0.221292 | 9,196 | 281 | 144 | 32.725979 | 0.694735 | 0.303936 | 0 | 0.705556 | 0 | 0.011111 | 0.242896 | 0.030769 | 0.005556 | 0 | 0 | 0 | 0 | 1 | 0.061111 | false | 0.055556 | 0.016667 | 0 | 0.105556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
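`add_record` and the other writers in the file above build their INSERT statements by string concatenation. mysql-connector's `cursor.execute` also accepts a parameter tuple, which leaves quoting and escaping to the driver. A runnable sketch of the same INSERT with placeholders, shown against the stdlib `sqlite3` so it runs standalone (the `records` schema here is reduced to a few columns and is illustrative only):

```python
import sqlite3
import datetime

def add_record(conn, dt, dev_id, temp, humid, co2):
    # Bound parameters instead of string concatenation: only the timestamp
    # formatting is done in Python; every value is escaped by the driver.
    conn.execute(
        "INSERT INTO records (dt, devID, temp, humid, CO2) VALUES (?, ?, ?, ?, ?)",
        (dt.strftime("%Y-%m-%d %H:%M:%S"), dev_id, temp, humid, co2),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (dt TEXT, devID INTEGER, temp REAL, humid REAL, CO2 INTEGER)")
add_record(conn, datetime.datetime(2021, 2, 23, 12, 0, 0), 1, 20.5, 55.6, 400)
row = conn.execute("SELECT dt, devID, CO2 FROM records").fetchone()
print(row)  # ('2021-02-23 12:00:00', 1, 400)
```

With mysql-connector the only change would be `%s` placeholders instead of `?`.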
e5c0e576e2e3c0ccc12eb2fe94b8f53e1d3f2bfb | 4,251 | py | Python | tests/test_cli.py | WhaleJ84/hb-organiser | 6bb540aead050a453fc6913c9df9378b06fe08bc | [
"MIT"
] | 1 | 2021-02-20T11:56:38.000Z | 2021-02-20T11:56:38.000Z | tests/test_cli.py | WhaleJ84/hb-organiser | 6bb540aead050a453fc6913c9df9378b06fe08bc | [
"MIT"
] | 16 | 2020-12-31T20:00:30.000Z | 2021-06-02T03:52:40.000Z | tests/test_cli.py | WhaleJ84/hb-organiser | 6bb540aead050a453fc6913c9df9378b06fe08bc | [
"MIT"
] | null | null | null | from os.path import abspath
from unittest import TestCase
from hb_organiser.cli import cli, verify_args
class HBOrganiserCliTestCase(TestCase):
    def setUp(self) -> None:
        pass

    def tearDown(self) -> None:
        pass

    def test_verify_args_returns_true_on_good_relative_source_path_with_trailing_forward_slash(self):
        arguments = cli(['-s', 'tests/test_libraries/library/', 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[0])

    def test_verify_args_returns_true_on_good_absolute_source_path_with_trailing_forward_slash(self):
        arguments = cli(['-s', abspath('tests/test_libraries/library/'), 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[0])

    def test_verify_args_returns_true_on_good_relative_source_path_without_trailing_forward_slash(self):
        arguments = cli(['-s', 'tests/test_libraries/library', 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[0])

    def test_verify_args_returns_true_on_good_absolute_source_path_without_trailing_forward_slash(self):
        arguments = cli(['-s', abspath('tests/test_libraries/library'), 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[0])

    def test_verify_args_returns_none_on_source_bad_path(self):
        arguments = cli(['-s', 'tests/test_libraries/fake_library', 'all'])
        result = verify_args(arguments)
        self.assertIsNone(result[0])

    def test_verify_args_returns_true_on_good_relative_destination_path_with_trailing_forward_slash(self):
        arguments = cli(['-s', 'tests/test_libraries/library/', '-d', 'tests/test_destination/', 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[1])

    def test_verify_args_returns_true_on_good_absolute_destination_path_with_trailing_forward_slash(self):
        arguments = cli(['-s', 'tests/test_libraries/library/', '-d', abspath('tests/test_destination/'), 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[1])

    def test_verify_args_returns_true_on_good_relative_destination_path_without_trailing_forward_slash(self):
        arguments = cli(['-s', 'tests/test_libraries/library/', '-d', 'tests/test_destination', 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[1])

    def test_verify_args_returns_true_on_good_absolute_destination_path_without_trailing_forward_slash(self):
        arguments = cli(['-s', 'tests/test_libraries/library/', '-d', abspath('tests/test_destination'), 'all'])
        result = verify_args(arguments)
        self.assertTrue(result[1])

    def test_dry_run_flag_overwrites_destination_flag_to_none(self):
        arguments = cli(['-s', 'tests/test_libraries/library', '-d', 'tests/test_destination', '-D', 'all'])
        result = verify_args(arguments)
        self.assertIsNone(result[1])

    def test_verify_args_returns_source_on_good_config_path(self):
        arguments = cli(['-c', 'tests/test.conf', 'all'])
        result = verify_args(arguments)
        self.assertEqual(result[0], 'tests/test_libraries/library')

    def test_verify_args_return_destination_on_good_config_path(self):
        arguments = cli(['-c', 'tests/test.conf', 'all'])
        result = verify_args(arguments)
        self.assertEqual(result[1], 'tests/test_destination')

    def test_verify_args_overwrites_config_source_on_set_source_flag(self):
        arguments = cli(['-c', 'tests/test.conf', '-s', 'tests/test_destination', 'all'])
        result = verify_args(arguments)
        self.assertEqual(result[0], 'tests/test_destination')

    def test_verify_args_overwrites_config_destination_on_set_destination_flag(self):
        arguments = cli(['-c', 'tests/test.conf', '-d', 'tests/test_libraries/library', 'all'])
        result = verify_args(arguments)
        self.assertEqual(result[1], 'tests/test_libraries/library')
| 51.216867 | 113 | 0.729946 | 509 | 4,251 | 5.719057 | 0.115914 | 0.096187 | 0.07695 | 0.091378 | 0.881141 | 0.881141 | 0.881141 | 0.866712 | 0.834765 | 0.752319 | 0 | 0.003884 | 0.151964 | 4,251 | 82 | 114 | 51.841463 | 0.803606 | 0 | 0 | 0.375 | 0 | 0 | 0.164432 | 0.130087 | 0 | 0 | 0 | 0 | 0.21875 | 1 | 0.25 | false | 0.03125 | 0.046875 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
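The tests above pin down the expected `verify_args` behaviour: accept relative or absolute directory paths with or without a trailing slash, and yield `None` for paths that do not exist. A hypothetical sketch of that check (this is not the actual `hb_organiser.cli` implementation, only the behaviour the tests describe):

```python
import tempfile
from os.path import abspath, isdir

def verify_path(path):
    # Accept a relative or absolute directory path, with or without a
    # trailing slash; return a normalised absolute path, or None if missing.
    if path is None:
        return None
    normalised = abspath(path.rstrip("/"))
    return normalised if isdir(normalised) else None

existing = tempfile.mkdtemp()
print(verify_path(existing + "/") == existing)   # True
print(verify_path("definitely/not/a/real/dir"))  # None
```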
005ba18477bdf0c534ad9ed79361eac4fcf618c1 | 38 | py | Python | Chapter 2 - Variables & Data Types/tempCodeRunnerFile.py | alex-dsouza777/Python-Basics | 8f1c406f2319cd65b5d54dfea990d09fa69d9adf | [
"MIT"
] | null | null | null | Chapter 2 - Variables & Data Types/tempCodeRunnerFile.py | alex-dsouza777/Python-Basics | 8f1c406f2319cd65b5d54dfea990d09fa69d9adf | [
"MIT"
] | null | null | null | Chapter 2 - Variables & Data Types/tempCodeRunnerFile.py | alex-dsouza777/Python-Basics | 8f1c406f2319cd65b5d54dfea990d09fa69d9adf | [
"MIT"
] | 1 | 2021-04-21T10:23:08.000Z | 2021-04-21T10:23:08.000Z | b = (14<7)
# b = (14==7)
# b = (14!=7) | 12.666667 | 13 | 0.315789 | 9 | 38 | 1.333333 | 0.333333 | 0.75 | 1 | 0.833333 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0.321429 | 0.263158 | 38 | 3 | 14 | 12.666667 | 0.107143 | 0.605263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
0066f5827983eb5ea73729137065a1aa2f0d1e79 | 86 | py | Python | intersim/viz/__init__.py | lassepe/InteractionSimulator | bf6bba18780ca61df8ac62627805fb00850832d2 | [
"MIT"
] | null | null | null | intersim/viz/__init__.py | lassepe/InteractionSimulator | bf6bba18780ca61df8ac62627805fb00850832d2 | [
"MIT"
] | null | null | null | intersim/viz/__init__.py | lassepe/InteractionSimulator | bf6bba18780ca61df8ac62627805fb00850832d2 | [
"MIT"
] | null | null | null |
from intersim.viz.animatedviz import animate
from intersim.viz.utils import build_map | 28.666667 | 44 | 0.860465 | 13 | 86 | 5.615385 | 0.692308 | 0.328767 | 0.410959 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 86 | 3 | 45 | 28.666667 | 0.935897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
00a2c81e9084bc7b8b5bfbc0d1a21d197f873dee | 704 | py | Python | src/LivestockCV/core/threshold/__init__.py | peschelgroup/LivestockCV | e5746af75935d5000ba3ad26d09b6868fae76b76 | [
"MIT"
] | null | null | null | src/LivestockCV/core/threshold/__init__.py | peschelgroup/LivestockCV | e5746af75935d5000ba3ad26d09b6868fae76b76 | [
"MIT"
] | null | null | null | src/LivestockCV/core/threshold/__init__.py | peschelgroup/LivestockCV | e5746af75935d5000ba3ad26d09b6868fae76b76 | [
"MIT"
] | null | null | null | from LivestockCV.core.threshold.threshold_methods import binary
from LivestockCV.core.threshold.threshold_methods import gaussian
from LivestockCV.core.threshold.threshold_methods import mean
from LivestockCV.core.threshold.threshold_methods import otsu
from LivestockCV.core.threshold.threshold_methods import triangle
from LivestockCV.core.threshold.threshold_methods import texture
from LivestockCV.core.threshold.threshold_methods import custom_range
from LivestockCV.core.threshold.threshold_methods import saturation
from LivestockCV.core.threshold.threshold_methods import mask_bad
__all__ = ["binary", "gaussian", "mean", "otsu", "triangle", "texture", "custom_range", "saturation", "mask_bad"]
| 58.666667 | 113 | 0.852273 | 86 | 704 | 6.77907 | 0.209302 | 0.231561 | 0.29331 | 0.432247 | 0.77187 | 0.77187 | 0.77187 | 0 | 0 | 0 | 0 | 0 | 0.068182 | 704 | 11 | 114 | 64 | 0.88872 | 0 | 0 | 0 | 0 | 0 | 0.09517 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.9 | 0 | 0.9 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
00c1cce4aaeb8229c87a9c0e55894159542e0e78 | 145 | py | Python | gcraft/resources/resource.py | ddomurad/gcraft | 7174fcacee875fd90e8d878463108aae3c77873e | [
"MIT"
] | null | null | null | gcraft/resources/resource.py | ddomurad/gcraft | 7174fcacee875fd90e8d878463108aae3c77873e | [
"MIT"
] | null | null | null | gcraft/resources/resource.py | ddomurad/gcraft | 7174fcacee875fd90e8d878463108aae3c77873e | [
"MIT"
] | null | null | null | class Resource:
def __init__(self, r_id, r_type):
self.r_id = r_id
self.r_type = r_type
def release(self):
pass | 18.125 | 37 | 0.57931 | 23 | 145 | 3.217391 | 0.434783 | 0.202703 | 0.189189 | 0.216216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.331034 | 145 | 8 | 38 | 18.125 | 0.762887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.166667 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
979998b7e4e74d755782c398073cdc3300522dc1 | 135 | py | Python | pangram/honorable_mentions/pangram_1.py | ederst/exercism-python | 8791f145ff4ce1a3b78ac3566fbe428ce3a3bd7b | [
"Unlicense"
] | 1 | 2021-06-25T16:09:02.000Z | 2021-06-25T16:09:02.000Z | pangram/honorable_mentions/pangram_1.py | ederst/exercism-python | 8791f145ff4ce1a3b78ac3566fbe428ce3a3bd7b | [
"Unlicense"
] | 1 | 2021-05-17T23:45:29.000Z | 2021-05-17T23:46:01.000Z | pangram/honorable_mentions/pangram_1.py | ederst/exercism-python | 8791f145ff4ce1a3b78ac3566fbe428ce3a3bd7b | [
"Unlicense"
] | null | null | null | from string import ascii_lowercase
def is_pangram(sentence):
return bool(min(sentence.lower().count(c) for c in ascii_lowercase))
| 27 | 72 | 0.777778 | 21 | 135 | 4.857143 | 0.809524 | 0.27451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125926 | 135 | 4 | 73 | 33.75 | 0.864407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
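The `is_pangram` solution above can be exercised directly: `min(...)` over the per-letter counts is `0` whenever any letter of the alphabet is absent, so `bool()` of it answers the question. The function is reproduced so this sketch runs on its own; the sample sentences are illustrative.

```python
from string import ascii_lowercase


def is_pangram(sentence):
    # min() of the 26 letter counts is 0 iff some letter never appears.
    return bool(min(sentence.lower().count(c) for c in ascii_lowercase))


print(is_pangram("The quick brown fox jumps over the lazy dog"))  # True
print(is_pangram("Hello, world"))                                 # False
```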
c131393e5792ccc979117b51f933fac574549250 | 807 | py | Python | utils.py | machacek/statistical-nlp-assignment-3 | 0b25562bb050160e6d68e83427dd0414aac5566a | [
"MIT"
] | 1 | 2016-05-23T23:40:41.000Z | 2016-05-23T23:40:41.000Z | utils.py | machacek/statistical-nlp-assignment-3 | 0b25562bb050160e6d68e83427dd0414aac5566a | [
"MIT"
] | null | null | null | utils.py | machacek/statistical-nlp-assignment-3 | 0b25562bb050160e6d68e83427dd0414aac5566a | [
"MIT"
] | null | null | null | import sys
class LabeledReader(object):
def __init__(self, filename):
if filename == '-':
self.file = sys.stdin
else:
self.file = open(filename, mode='r', encoding='utf-8')
def __iter__(self):
for line in self.file:
yield tuple(line.strip().split('/',2))
def __del__(self):
if self.file is not sys.stdin:
self.file.close()
class UnlabeledReader(object):
def __init__(self, filename):
if filename == '-':
self.file = sys.stdin
else:
self.file = open(filename, mode='r', encoding='utf-8')
def __iter__(self):
for line in self.file:
yield line.strip()
def __del__(self):
if self.file is not sys.stdin:
self.file.close()
| 25.21875 | 66 | 0.547708 | 99 | 807 | 4.222222 | 0.333333 | 0.191388 | 0.062201 | 0.08134 | 0.818182 | 0.818182 | 0.818182 | 0.818182 | 0.818182 | 0.818182 | 0 | 0.005484 | 0.322181 | 807 | 31 | 67 | 26.032258 | 0.758684 | 0 | 0 | 0.8 | 0 | 0 | 0.018587 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.24 | false | 0 | 0.04 | 0 | 0.36 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
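A hedged usage sketch for the `LabeledReader` class above: each input line like `word/TAG` is stripped and split on `/` (maxsplit 2), yielding a tuple. The class is reproduced for self-containment, and the sample file contents are invented for illustration.

```python
import os
import sys
import tempfile


class LabeledReader(object):
    def __init__(self, filename):
        if filename == '-':
            self.file = sys.stdin
        else:
            self.file = open(filename, mode='r', encoding='utf-8')

    def __iter__(self):
        for line in self.file:
            yield tuple(line.strip().split('/', 2))

    def __del__(self):
        if self.file is not sys.stdin:
            self.file.close()


# Write a tiny labeled corpus to a temp file, then read it back.
with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False,
                                 encoding='utf-8') as f:
    f.write("dog/NN\nruns/VBZ\n")
    path = f.name

pairs = list(LabeledReader(path))
os.unlink(path)
# pairs == [('dog', 'NN'), ('runs', 'VBZ')]
```

Passing `'-'` instead of a path makes the reader consume standard input, which is why `__del__` guards against closing `sys.stdin`.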
c152c2eec5d408f540234e793d0ab860a685eb20 | 128,013 | py | Python | msgraph-cli-extensions/beta/people_beta/azext_people_beta/generated/custom.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | [
"MIT"
] | null | null | null | msgraph-cli-extensions/beta/people_beta/azext_people_beta/generated/custom.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | [
"MIT"
] | null | null | null | msgraph-cli-extensions/beta/people_beta/azext_people_beta/generated/custom.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | [
"MIT"
] | null | null | null | # --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=too-many-lines
def people_user_create_person(client,
user_id,
id_=None,
birthday=None,
company_name=None,
department=None,
display_name=None,
email_addresses=None,
given_name=None,
is_favorite=None,
mailbox_type=None,
office_location=None,
person_notes=None,
person_type=None,
phones=None,
postal_addresses=None,
profession=None,
sources=None,
surname=None,
title=None,
user_principal_name=None,
websites=None,
yomi_company=None):
body = {}
body['id'] = id_
body['birthday'] = birthday
body['company_name'] = company_name
body['department'] = department
body['display_name'] = display_name
body['email_addresses'] = email_addresses
body['given_name'] = given_name
body['is_favorite'] = is_favorite
body['mailbox_type'] = mailbox_type
body['office_location'] = office_location
body['person_notes'] = person_notes
body['person_type'] = person_type
body['phones'] = phones
body['postal_addresses'] = postal_addresses
body['profession'] = profession
body['sources'] = sources
body['surname'] = surname
body['title'] = title
body['user_principal_name'] = user_principal_name
body['websites'] = websites
body['yomi_company'] = yomi_company
return client.create_people(user_id=user_id,
body=body)
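The long runs of `body['x'] = x` assignments in the generated functions above could, in principle, be collapsed by a small helper. This is a hedged sketch of that refactor, not part of the generated module: `build_body` is a hypothetical name, and stripping the trailing underscore mirrors the `id_` convention the generator uses to avoid shadowing the `id` builtin.

```python
def build_body(**fields):
    # None values are kept deliberately: the generated code also assigns
    # unset parameters into body as None rather than omitting them.
    return {name.rstrip('_'): value for name, value in fields.items()}


body = build_body(id_='42', display_name='Ada', surname=None)
# body == {'id': '42', 'display_name': 'Ada', 'surname': None}
```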
def people_user_delete_analytic(client,
user_id,
if_match=None):
return client.delete_analytics(user_id=user_id,
if_match=if_match)
def people_user_delete_person(client,
user_id,
person_id,
if_match=None):
return client.delete_people(user_id=user_id,
person_id=person_id,
if_match=if_match)
def people_user_delete_profile(client,
user_id,
if_match=None):
return client.delete_profile(user_id=user_id,
if_match=if_match)
def people_user_list_person(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_people(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_show_analytic(client,
user_id,
select=None,
expand=None):
return client.get_analytics(user_id=user_id,
select=select,
expand=expand)
def people_user_show_person(client,
user_id,
person_id,
select=None,
expand=None):
return client.get_people(user_id=user_id,
person_id=person_id,
select=select,
expand=expand)
def people_user_show_profile(client,
user_id,
select=None,
expand=None):
return client.get_profile(user_id=user_id,
select=select,
expand=expand)
def people_user_update_analytic(client,
user_id,
id_=None,
settings=None,
activity_statistics=None):
body = {}
body['id'] = id_
body['settings'] = settings
body['activity_statistics'] = activity_statistics
return client.update_analytics(user_id=user_id,
body=body)
def people_user_update_person(client,
user_id,
person_id,
id_=None,
birthday=None,
company_name=None,
department=None,
display_name=None,
email_addresses=None,
given_name=None,
is_favorite=None,
mailbox_type=None,
office_location=None,
person_notes=None,
person_type=None,
phones=None,
postal_addresses=None,
profession=None,
sources=None,
surname=None,
title=None,
user_principal_name=None,
websites=None,
yomi_company=None):
body = {}
body['id'] = id_
body['birthday'] = birthday
body['company_name'] = company_name
body['department'] = department
body['display_name'] = display_name
body['email_addresses'] = email_addresses
body['given_name'] = given_name
body['is_favorite'] = is_favorite
body['mailbox_type'] = mailbox_type
body['office_location'] = office_location
body['person_notes'] = person_notes
body['person_type'] = person_type
body['phones'] = phones
body['postal_addresses'] = postal_addresses
body['profession'] = profession
body['sources'] = sources
body['surname'] = surname
body['title'] = title
body['user_principal_name'] = user_principal_name
body['websites'] = websites
body['yomi_company'] = yomi_company
return client.update_people(user_id=user_id,
person_id=person_id,
body=body)
def people_user_update_profile(client,
user_id,
id_=None,
account=None,
addresses=None,
anniversaries=None,
awards=None,
certifications=None,
educational_activities=None,
emails=None,
interests=None,
languages=None,
names=None,
notes=None,
patents=None,
phones=None,
positions=None,
projects=None,
publications=None,
skills=None,
web_accounts=None,
websites=None):
body = {}
body['id'] = id_
body['account'] = account
body['addresses'] = addresses
body['anniversaries'] = anniversaries
body['awards'] = awards
body['certifications'] = certifications
body['educational_activities'] = educational_activities
body['emails'] = emails
body['interests'] = interests
body['languages'] = languages
body['names'] = names
body['notes'] = notes
body['patents'] = patents
body['phones'] = phones
body['positions'] = positions
body['projects'] = projects
body['publications'] = publications
body['skills'] = skills
body['web_accounts'] = web_accounts
body['websites'] = websites
return client.update_profile(user_id=user_id,
body=body)
def people_user_analytic_create_activity_statistics(client,
user_id,
id_=None,
activity=None,
duration=None,
end_date=None,
start_date=None,
time_zone_used=None):
body = {}
body['id'] = id_
body['activity'] = activity
body['duration'] = duration
body['end_date'] = end_date
body['start_date'] = start_date
body['time_zone_used'] = time_zone_used
return client.create_activity_statistics(user_id=user_id,
body=body)
def people_user_analytic_delete_activity_statistics(client,
user_id,
activity_statistics_id,
if_match=None):
return client.delete_activity_statistics(user_id=user_id,
activity_statistics_id=activity_statistics_id,
if_match=if_match)
def people_user_analytic_list_activity_statistics(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_activity_statistics(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_analytic_show_activity_statistics(client,
user_id,
activity_statistics_id,
select=None,
expand=None):
return client.get_activity_statistics(user_id=user_id,
activity_statistics_id=activity_statistics_id,
select=select,
expand=expand)
def people_user_analytic_update_activity_statistics(client,
user_id,
activity_statistics_id,
id_=None,
activity=None,
duration=None,
end_date=None,
start_date=None,
time_zone_used=None):
body = {}
body['id'] = id_
body['activity'] = activity
body['duration'] = duration
body['end_date'] = end_date
body['start_date'] = start_date
body['time_zone_used'] = time_zone_used
return client.update_activity_statistics(user_id=user_id,
activity_statistics_id=activity_statistics_id,
body=body)
def people_user_profile_create_account(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
age_group=None,
country_code=None,
preferred_language_tag=None,
user_principal_name=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['age_group'] = age_group
body['country_code'] = country_code
body['preferred_language_tag'] = preferred_language_tag
body['user_principal_name'] = user_principal_name
return client.create_account(user_id=user_id,
body=body)
def people_user_profile_create_address(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
detail=None,
display_name=None,
geo_coordinates=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['detail'] = detail
body['display_name'] = display_name
body['geo_coordinates'] = geo_coordinates
return client.create_addresses(user_id=user_id,
body=body)
def people_user_profile_create_anniversary(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
date=None,
type_=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['date'] = date
body['type'] = type_
return client.create_anniversaries(user_id=user_id,
body=body)
def people_user_profile_create_award(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
display_name=None,
issued_date=None,
issuing_authority=None,
thumbnail_url=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['display_name'] = display_name
body['issued_date'] = issued_date
body['issuing_authority'] = issuing_authority
body['thumbnail_url'] = thumbnail_url
body['web_url'] = web_url
return client.create_awards(user_id=user_id,
body=body)
def people_user_profile_create_certification(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
certification_id=None,
description=None,
display_name=None,
end_date=None,
issued_date=None,
issuing_authority=None,
issuing_company=None,
start_date=None,
thumbnail_url=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['certification_id'] = certification_id
body['description'] = description
body['display_name'] = display_name
body['end_date'] = end_date
body['issued_date'] = issued_date
body['issuing_authority'] = issuing_authority
body['issuing_company'] = issuing_company
body['start_date'] = start_date
body['thumbnail_url'] = thumbnail_url
body['web_url'] = web_url
return client.create_certifications(user_id=user_id,
body=body)
def people_user_profile_create_educational_activity(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
completion_month_year=None,
end_month_year=None,
program=None,
start_month_year=None,
description=None,
display_name=None,
location=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['completion_month_year'] = completion_month_year
body['end_month_year'] = end_month_year
body['program'] = program
body['start_month_year'] = start_month_year
body['institution'] = {}
body['institution']['description'] = description
body['institution']['display_name'] = display_name
body['institution']['location'] = location
body['institution']['web_url'] = web_url
return client.create_educational_activities(user_id=user_id,
body=body)
def people_user_profile_create_email(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
address=None,
display_name=None,
type_=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['address'] = address
body['display_name'] = display_name
body['type'] = type_
return client.create_emails(user_id=user_id,
body=body)
def people_user_profile_create_interest(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
collaboration_tags=None,
description=None,
display_name=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['collaboration_tags'] = collaboration_tags
body['description'] = description
body['display_name'] = display_name
body['web_url'] = web_url
return client.create_interests(user_id=user_id,
body=body)
def people_user_profile_create_language(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
display_name=None,
proficiency=None,
reading=None,
spoken=None,
tag=None,
written=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['display_name'] = display_name
body['proficiency'] = proficiency
body['reading'] = reading
body['spoken'] = spoken
body['tag'] = tag
body['written'] = written
return client.create_languages(user_id=user_id,
body=body)
def people_user_profile_create_name(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
display_name=None,
first=None,
initials=None,
language_tag=None,
last=None,
maiden=None,
middle=None,
nickname=None,
pronunciation=None,
suffix=None,
title=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['display_name'] = display_name
body['first'] = first
body['initials'] = initials
body['language_tag'] = language_tag
body['last'] = last
body['maiden'] = maiden
body['middle'] = middle
body['nickname'] = nickname
body['pronunciation'] = pronunciation
body['suffix'] = suffix
body['title'] = title
return client.create_names(user_id=user_id,
body=body)
def people_user_profile_create_note(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
detail=None,
display_name=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['detail'] = detail
body['display_name'] = display_name
return client.create_notes(user_id=user_id,
body=body)
def people_user_profile_create_patent(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
display_name=None,
is_pending=None,
issued_date=None,
issuing_authority=None,
number=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['display_name'] = display_name
body['is_pending'] = is_pending
body['issued_date'] = issued_date
body['issuing_authority'] = issuing_authority
body['number'] = number
body['web_url'] = web_url
return client.create_patents(user_id=user_id,
body=body)
def people_user_profile_create_phone(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
display_name=None,
number=None,
type_=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['display_name'] = display_name
body['number'] = number
body['type'] = type_
return client.create_phones(user_id=user_id,
body=body)
def people_user_profile_create_position(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
colleagues=None,
is_current=None,
manager=None,
company=None,
description=None,
end_month_year=None,
job_title=None,
role=None,
start_month_year=None,
summary=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['colleagues'] = colleagues
body['is_current'] = is_current
body['manager'] = manager
body['detail'] = {}
body['detail']['company'] = company
body['detail']['description'] = description
body['detail']['end_month_year'] = end_month_year
body['detail']['job_title'] = job_title
body['detail']['role'] = role
body['detail']['start_month_year'] = start_month_year
body['detail']['summary'] = summary
return client.create_positions(user_id=user_id,
body=body)
def people_user_profile_create_project(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
collaboration_tags=None,
colleagues=None,
display_name=None,
sponsors=None,
company=None,
description=None,
end_month_year=None,
job_title=None,
role=None,
start_month_year=None,
summary=None,
address=None,
department=None,
microsoft_graph_company_detail_display_name=None,
office_location=None,
pronunciation=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['collaboration_tags'] = collaboration_tags
body['colleagues'] = colleagues
body['display_name'] = display_name
body['sponsors'] = sponsors
body['detail'] = {}
body['detail']['company'] = company
body['detail']['description'] = description
body['detail']['end_month_year'] = end_month_year
body['detail']['job_title'] = job_title
body['detail']['role'] = role
body['detail']['start_month_year'] = start_month_year
body['detail']['summary'] = summary
body['client'] = {}
body['client']['address'] = address
body['client']['department'] = department
body['client']['display_name'] = microsoft_graph_company_detail_display_name
body['client']['office_location'] = office_location
body['client']['pronunciation'] = pronunciation
body['client']['web_url'] = web_url
return client.create_projects(user_id=user_id,
body=body)
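# Example (hypothetical, for illustration only): the create helpers in this
# module flatten nested Graph resources (e.g. 'detail', 'client',
# 'created_by') into keyword arguments, rebuild the nested body, and delegate
# to the generated client. A call might look like the sketch below;
# `graph_client` and all id/field values are placeholders, and only the
# keyword arguments actually defined above are used.
#
#     people_user_profile_create_project(
#         graph_client,
#         user_id='00000000-0000-0000-0000-000000000000',
#         display_name='Contoso Migration',
#         categories=['engineering'],
#         company='Contoso',
#         role='Tech lead')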
def people_user_profile_create_publication(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
display_name=None,
published_date=None,
publisher=None,
thumbnail_url=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['display_name'] = display_name
body['published_date'] = published_date
body['publisher'] = publisher
body['thumbnail_url'] = thumbnail_url
body['web_url'] = web_url
return client.create_publications(user_id=user_id,
body=body)
def people_user_profile_create_skill(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
collaboration_tags=None,
display_name=None,
proficiency=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['collaboration_tags'] = collaboration_tags
body['display_name'] = display_name
body['proficiency'] = proficiency
body['web_url'] = web_url
return client.create_skills(user_id=user_id,
body=body)
def people_user_profile_create_web_account(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
service=None,
status_message=None,
microsoft_graph_web_account_user_id=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['service'] = service
body['status_message'] = status_message
body['user_id'] = microsoft_graph_web_account_user_id
body['web_url'] = web_url
return client.create_web_accounts(user_id=user_id,
body=body)
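# Note: parameters prefixed with `microsoft_graph_` (such as
# `microsoft_graph_web_account_user_id` above) are disambiguated names for
# body fields whose natural name would collide with another argument — here
# the path-level `user_id` — and they map back to the plain field in the
# request body. Hypothetical call (placeholder values):
#
#     people_user_profile_create_web_account(
#         graph_client,
#         user_id='00000000-0000-0000-0000-000000000000',   # path user
#         microsoft_graph_web_account_user_id='contoso_dev', # body 'user_id'
#         service='GitHub')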
def people_user_profile_create_website(client,
user_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
description=None,
display_name=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['description'] = description
body['display_name'] = display_name
body['web_url'] = web_url
return client.create_websites(user_id=user_id,
body=body)
def people_user_profile_delete_account(client,
user_id,
user_account_information_id,
if_match=None):
return client.delete_account(user_id=user_id,
user_account_information_id=user_account_information_id,
if_match=if_match)
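# Note: `if_match` in the delete helpers carries an ETag for optimistic
# concurrency; when supplied, the delete only succeeds if the resource's
# current ETag still matches. Hypothetical call (placeholder values):
#
#     people_user_profile_delete_account(
#         graph_client,
#         user_id='00000000-0000-0000-0000-000000000000',
#         user_account_information_id='<account-id>',
#         if_match='W/"<etag-value>"')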
def people_user_profile_delete_address(client,
user_id,
item_address_id,
if_match=None):
return client.delete_addresses(user_id=user_id,
item_address_id=item_address_id,
if_match=if_match)
def people_user_profile_delete_anniversary(client,
user_id,
person_anniversary_id,
if_match=None):
return client.delete_anniversaries(user_id=user_id,
person_anniversary_id=person_anniversary_id,
if_match=if_match)
def people_user_profile_delete_award(client,
user_id,
person_award_id,
if_match=None):
return client.delete_awards(user_id=user_id,
person_award_id=person_award_id,
if_match=if_match)
def people_user_profile_delete_certification(client,
user_id,
person_certification_id,
if_match=None):
return client.delete_certifications(user_id=user_id,
person_certification_id=person_certification_id,
if_match=if_match)
def people_user_profile_delete_educational_activity(client,
user_id,
educational_activity_id,
if_match=None):
return client.delete_educational_activities(user_id=user_id,
educational_activity_id=educational_activity_id,
if_match=if_match)
def people_user_profile_delete_email(client,
user_id,
item_email_id,
if_match=None):
return client.delete_emails(user_id=user_id,
item_email_id=item_email_id,
if_match=if_match)
def people_user_profile_delete_interest(client,
user_id,
person_interest_id,
if_match=None):
return client.delete_interests(user_id=user_id,
person_interest_id=person_interest_id,
if_match=if_match)
def people_user_profile_delete_language(client,
user_id,
language_proficiency_id,
if_match=None):
return client.delete_languages(user_id=user_id,
language_proficiency_id=language_proficiency_id,
if_match=if_match)
def people_user_profile_delete_name(client,
user_id,
person_name_id,
if_match=None):
return client.delete_names(user_id=user_id,
person_name_id=person_name_id,
if_match=if_match)
def people_user_profile_delete_note(client,
user_id,
person_annotation_id,
if_match=None):
return client.delete_notes(user_id=user_id,
person_annotation_id=person_annotation_id,
if_match=if_match)
def people_user_profile_delete_patent(client,
user_id,
item_patent_id,
if_match=None):
return client.delete_patents(user_id=user_id,
item_patent_id=item_patent_id,
if_match=if_match)
def people_user_profile_delete_phone(client,
user_id,
item_phone_id,
if_match=None):
return client.delete_phones(user_id=user_id,
item_phone_id=item_phone_id,
if_match=if_match)
def people_user_profile_delete_position(client,
user_id,
work_position_id,
if_match=None):
return client.delete_positions(user_id=user_id,
work_position_id=work_position_id,
if_match=if_match)
def people_user_profile_delete_project(client,
user_id,
project_participation_id,
if_match=None):
return client.delete_projects(user_id=user_id,
project_participation_id=project_participation_id,
if_match=if_match)
def people_user_profile_delete_publication(client,
user_id,
item_publication_id,
if_match=None):
return client.delete_publications(user_id=user_id,
item_publication_id=item_publication_id,
if_match=if_match)
def people_user_profile_delete_skill(client,
user_id,
skill_proficiency_id,
if_match=None):
return client.delete_skills(user_id=user_id,
skill_proficiency_id=skill_proficiency_id,
if_match=if_match)
def people_user_profile_delete_web_account(client,
user_id,
web_account_id,
if_match=None):
return client.delete_web_accounts(user_id=user_id,
web_account_id=web_account_id,
if_match=if_match)
def people_user_profile_delete_website(client,
user_id,
person_website_id,
if_match=None):
return client.delete_websites(user_id=user_id,
person_website_id=person_website_id,
if_match=if_match)
def people_user_profile_list_account(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_account(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
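# Note: in the list helpers, `orderby`, `select`, and `expand` correspond to
# the OData query options $orderby, $select, and $expand on the underlying
# Graph request. Hypothetical call (placeholder values; the exact accepted
# shape of these arguments is determined by the generated client):
#
#     people_user_profile_list_position(
#         graph_client,
#         user_id='00000000-0000-0000-0000-000000000000',
#         select=['id', 'detail'],
#         orderby=['lastModifiedDateTime desc'])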
def people_user_profile_list_address(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_addresses(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_anniversary(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_anniversaries(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_award(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_awards(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_certification(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_certifications(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_educational_activity(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_educational_activities(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_email(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_emails(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_interest(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_interests(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_language(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_languages(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_name(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_names(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_note(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_notes(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_patent(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_patents(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_phone(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_phones(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_position(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_positions(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_project(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_projects(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_publication(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_publications(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_skill(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_skills(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_web_account(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_web_accounts(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_list_website(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_websites(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def people_user_profile_show_account(client,
user_id,
user_account_information_id,
select=None,
expand=None):
return client.get_account(user_id=user_id,
user_account_information_id=user_account_information_id,
select=select,
expand=expand)
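# Note: the show helpers fetch a single child resource by its id; `select`
# and `expand` behave as in the list helpers ($select / $expand). Hypothetical
# call (placeholder values):
#
#     people_user_profile_show_skill(
#         graph_client,
#         user_id='00000000-0000-0000-0000-000000000000',
#         skill_proficiency_id='<skill-id>',
#         select=['displayName', 'proficiency'])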
def people_user_profile_show_address(client,
user_id,
item_address_id,
select=None,
expand=None):
return client.get_addresses(user_id=user_id,
item_address_id=item_address_id,
select=select,
expand=expand)
def people_user_profile_show_anniversary(client,
user_id,
person_anniversary_id,
select=None,
expand=None):
return client.get_anniversaries(user_id=user_id,
person_anniversary_id=person_anniversary_id,
select=select,
expand=expand)
def people_user_profile_show_award(client,
user_id,
person_award_id,
select=None,
expand=None):
return client.get_awards(user_id=user_id,
person_award_id=person_award_id,
select=select,
expand=expand)
def people_user_profile_show_certification(client,
user_id,
person_certification_id,
select=None,
expand=None):
return client.get_certifications(user_id=user_id,
person_certification_id=person_certification_id,
select=select,
expand=expand)
def people_user_profile_show_educational_activity(client,
user_id,
educational_activity_id,
select=None,
expand=None):
return client.get_educational_activities(user_id=user_id,
educational_activity_id=educational_activity_id,
select=select,
expand=expand)
def people_user_profile_show_email(client,
user_id,
item_email_id,
select=None,
expand=None):
return client.get_emails(user_id=user_id,
item_email_id=item_email_id,
select=select,
expand=expand)
def people_user_profile_show_interest(client,
user_id,
person_interest_id,
select=None,
expand=None):
return client.get_interests(user_id=user_id,
person_interest_id=person_interest_id,
select=select,
expand=expand)
def people_user_profile_show_language(client,
user_id,
language_proficiency_id,
select=None,
expand=None):
return client.get_languages(user_id=user_id,
language_proficiency_id=language_proficiency_id,
select=select,
expand=expand)
def people_user_profile_show_name(client,
user_id,
person_name_id,
select=None,
expand=None):
return client.get_names(user_id=user_id,
person_name_id=person_name_id,
select=select,
expand=expand)
def people_user_profile_show_note(client,
user_id,
person_annotation_id,
select=None,
expand=None):
return client.get_notes(user_id=user_id,
person_annotation_id=person_annotation_id,
select=select,
expand=expand)
def people_user_profile_show_patent(client,
user_id,
item_patent_id,
select=None,
expand=None):
return client.get_patents(user_id=user_id,
item_patent_id=item_patent_id,
select=select,
expand=expand)
def people_user_profile_show_phone(client,
user_id,
item_phone_id,
select=None,
expand=None):
return client.get_phones(user_id=user_id,
item_phone_id=item_phone_id,
select=select,
expand=expand)
def people_user_profile_show_position(client,
user_id,
work_position_id,
select=None,
expand=None):
return client.get_positions(user_id=user_id,
work_position_id=work_position_id,
select=select,
expand=expand)
def people_user_profile_show_project(client,
user_id,
project_participation_id,
select=None,
expand=None):
return client.get_projects(user_id=user_id,
project_participation_id=project_participation_id,
select=select,
expand=expand)
def people_user_profile_show_publication(client,
user_id,
item_publication_id,
select=None,
expand=None):
return client.get_publications(user_id=user_id,
item_publication_id=item_publication_id,
select=select,
expand=expand)
def people_user_profile_show_skill(client,
user_id,
skill_proficiency_id,
select=None,
expand=None):
return client.get_skills(user_id=user_id,
skill_proficiency_id=skill_proficiency_id,
select=select,
expand=expand)
def people_user_profile_show_web_account(client,
user_id,
web_account_id,
select=None,
expand=None):
return client.get_web_accounts(user_id=user_id,
web_account_id=web_account_id,
select=select,
expand=expand)
def people_user_profile_show_website(client,
user_id,
person_website_id,
select=None,
expand=None):
return client.get_websites(user_id=user_id,
person_website_id=person_website_id,
select=select,
expand=expand)
def people_user_profile_update_account(client,
user_id,
user_account_information_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
age_group=None,
country_code=None,
preferred_language_tag=None,
user_principal_name=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['age_group'] = age_group
body['country_code'] = country_code
body['preferred_language_tag'] = preferred_language_tag
body['user_principal_name'] = user_principal_name
return client.update_account(user_id=user_id,
user_account_information_id=user_account_information_id,
body=body)
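# Note: the update helpers rebuild the full request body from keyword
# arguments, so every field not passed is set to None in the body; whether
# None values are dropped or serialized as explicit nulls depends on the
# generated client's serializer. Hypothetical call (placeholder values):
#
#     people_user_profile_update_account(
#         graph_client,
#         user_id='00000000-0000-0000-0000-000000000000',
#         user_account_information_id='<account-id>',
#         preferred_language_tag='en-US')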
def people_user_profile_update_address(client,
user_id,
item_address_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
detail=None,
display_name=None,
geo_coordinates=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['detail'] = detail
body['display_name'] = display_name
body['geo_coordinates'] = geo_coordinates
return client.update_addresses(user_id=user_id,
item_address_id=item_address_id,
body=body)
def people_user_profile_update_anniversary(client,
user_id,
person_anniversary_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
date=None,
type_=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['date'] = date
body['type'] = type_
return client.update_anniversaries(user_id=user_id,
person_anniversary_id=person_anniversary_id,
body=body)
def people_user_profile_update_award(client,
user_id,
person_award_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
display_name=None,
issued_date=None,
issuing_authority=None,
thumbnail_url=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['display_name'] = display_name
body['issued_date'] = issued_date
body['issuing_authority'] = issuing_authority
body['thumbnail_url'] = thumbnail_url
body['web_url'] = web_url
return client.update_awards(user_id=user_id,
person_award_id=person_award_id,
body=body)
def people_user_profile_update_certification(client,
user_id,
person_certification_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
certification_id=None,
description=None,
display_name=None,
end_date=None,
issued_date=None,
issuing_authority=None,
issuing_company=None,
start_date=None,
thumbnail_url=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['certification_id'] = certification_id
body['description'] = description
body['display_name'] = display_name
body['end_date'] = end_date
body['issued_date'] = issued_date
body['issuing_authority'] = issuing_authority
body['issuing_company'] = issuing_company
body['start_date'] = start_date
body['thumbnail_url'] = thumbnail_url
body['web_url'] = web_url
return client.update_certifications(user_id=user_id,
person_certification_id=person_certification_id,
body=body)
def people_user_profile_update_educational_activity(client,
user_id,
educational_activity_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
completion_month_year=None,
end_month_year=None,
program=None,
start_month_year=None,
description=None,
display_name=None,
location=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['completion_month_year'] = completion_month_year
body['end_month_year'] = end_month_year
body['program'] = program
body['start_month_year'] = start_month_year
body['institution'] = {}
body['institution']['description'] = description
body['institution']['display_name'] = display_name
body['institution']['location'] = location
body['institution']['web_url'] = web_url
return client.update_educational_activities(user_id=user_id,
educational_activity_id=educational_activity_id,
body=body)
def people_user_profile_update_email(client,
user_id,
item_email_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
address=None,
display_name=None,
type_=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['address'] = address
body['display_name'] = display_name
body['type'] = type_
return client.update_emails(user_id=user_id,
item_email_id=item_email_id,
body=body)
def people_user_profile_update_interest(client,
user_id,
person_interest_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
collaboration_tags=None,
description=None,
display_name=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['collaboration_tags'] = collaboration_tags
body['description'] = description
body['display_name'] = display_name
body['web_url'] = web_url
return client.update_interests(user_id=user_id,
person_interest_id=person_interest_id,
body=body)
def people_user_profile_update_language(client,
user_id,
language_proficiency_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
display_name=None,
proficiency=None,
reading=None,
spoken=None,
tag=None,
written=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['display_name'] = display_name
body['proficiency'] = proficiency
body['reading'] = reading
body['spoken'] = spoken
body['tag'] = tag
body['written'] = written
return client.update_languages(user_id=user_id,
language_proficiency_id=language_proficiency_id,
body=body)
def people_user_profile_update_name(client,
user_id,
person_name_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
display_name=None,
first=None,
initials=None,
language_tag=None,
last=None,
maiden=None,
middle=None,
nickname=None,
pronunciation=None,
suffix=None,
title=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['display_name'] = display_name
body['first'] = first
body['initials'] = initials
body['language_tag'] = language_tag
body['last'] = last
body['maiden'] = maiden
body['middle'] = middle
body['nickname'] = nickname
body['pronunciation'] = pronunciation
body['suffix'] = suffix
body['title'] = title
return client.update_names(user_id=user_id,
person_name_id=person_name_id,
body=body)
def people_user_profile_update_note(client,
user_id,
person_annotation_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
detail=None,
display_name=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['detail'] = detail
body['display_name'] = display_name
return client.update_notes(user_id=user_id,
person_annotation_id=person_annotation_id,
body=body)
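# Illustrative sketch (not part of the generated module): every update helper in
# this file follows one pattern — flatten flat CLI parameters into a nested
# request body, then delegate to the SDK client. The client and helper below are
# hypothetical stand-ins that show that payload shape without calling Microsoft
# Graph.

```python
# Hypothetical stub: records the assembled body instead of issuing a PATCH.
class StubProfileClient:
    def update_notes(self, user_id, person_annotation_id, body):
        # The real generated client would PATCH the Graph endpoint;
        # here we just echo the payload for inspection.
        return body

def update_note(client, user_id, person_annotation_id, detail=None,
                display_name=None, application=None):
    # Flat parameters are folded into nested identity sets, mirroring how
    # the generated helpers populate 'last_modified_by' and 'created_by'.
    body = {
        'detail': detail,
        'display_name': display_name,
        'last_modified_by': {'application': application},
    }
    return client.update_notes(user_id=user_id,
                               person_annotation_id=person_annotation_id,
                               body=body)

result = update_note(StubProfileClient(), 'user-1', 'note-1',
                     detail='Met at the conference', application='app-1')
```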
def people_user_profile_update_patent(client,
user_id,
item_patent_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
display_name=None,
is_pending=None,
issued_date=None,
issuing_authority=None,
number=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['display_name'] = display_name
body['is_pending'] = is_pending
body['issued_date'] = issued_date
body['issuing_authority'] = issuing_authority
body['number'] = number
body['web_url'] = web_url
return client.update_patents(user_id=user_id,
item_patent_id=item_patent_id,
body=body)
def people_user_profile_update_phone(client,
user_id,
item_phone_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
display_name=None,
number=None,
type_=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['display_name'] = display_name
body['number'] = number
body['type'] = type_
return client.update_phones(user_id=user_id,
item_phone_id=item_phone_id,
body=body)
def people_user_profile_update_position(client,
user_id,
work_position_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
colleagues=None,
is_current=None,
manager=None,
company=None,
description=None,
end_month_year=None,
job_title=None,
role=None,
start_month_year=None,
summary=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['colleagues'] = colleagues
body['is_current'] = is_current
body['manager'] = manager
body['detail'] = {}
body['detail']['company'] = company
body['detail']['description'] = description
body['detail']['end_month_year'] = end_month_year
body['detail']['job_title'] = job_title
body['detail']['role'] = role
body['detail']['start_month_year'] = start_month_year
body['detail']['summary'] = summary
return client.update_positions(user_id=user_id,
work_position_id=work_position_id,
body=body)
def people_user_profile_update_project(client,
user_id,
project_participation_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
collaboration_tags=None,
colleagues=None,
display_name=None,
sponsors=None,
company=None,
description=None,
end_month_year=None,
job_title=None,
role=None,
start_month_year=None,
summary=None,
address=None,
department=None,
microsoft_graph_company_detail_display_name=None,
office_location=None,
pronunciation=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['collaboration_tags'] = collaboration_tags
body['colleagues'] = colleagues
body['display_name'] = display_name
body['sponsors'] = sponsors
body['detail'] = {}
body['detail']['company'] = company
body['detail']['description'] = description
body['detail']['end_month_year'] = end_month_year
body['detail']['job_title'] = job_title
body['detail']['role'] = role
body['detail']['start_month_year'] = start_month_year
body['detail']['summary'] = summary
body['client'] = {}
body['client']['address'] = address
body['client']['department'] = department
body['client']['display_name'] = microsoft_graph_company_detail_display_name
body['client']['office_location'] = office_location
body['client']['pronunciation'] = pronunciation
body['client']['web_url'] = web_url
return client.update_projects(user_id=user_id,
project_participation_id=project_participation_id,
body=body)
def people_user_profile_update_publication(client,
user_id,
item_publication_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
display_name=None,
published_date=None,
publisher=None,
thumbnail_url=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['display_name'] = display_name
body['published_date'] = published_date
body['publisher'] = publisher
body['thumbnail_url'] = thumbnail_url
body['web_url'] = web_url
return client.update_publications(user_id=user_id,
item_publication_id=item_publication_id,
body=body)
def people_user_profile_update_skill(client,
user_id,
skill_proficiency_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
collaboration_tags=None,
display_name=None,
proficiency=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['collaboration_tags'] = collaboration_tags
body['display_name'] = display_name
body['proficiency'] = proficiency
body['web_url'] = web_url
return client.update_skills(user_id=user_id,
skill_proficiency_id=skill_proficiency_id,
body=body)
def people_user_profile_update_web_account(client,
user_id,
web_account_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
description=None,
service=None,
status_message=None,
microsoft_graph_web_account_user_id=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['description'] = description
body['service'] = service
body['status_message'] = status_message
body['user_id'] = microsoft_graph_web_account_user_id
body['web_url'] = web_url
return client.update_web_accounts(user_id=user_id,
web_account_id=web_account_id,
body=body)
def people_user_profile_update_website(client,
user_id,
person_website_id,
id_=None,
allowed_audiences=None,
created_date_time=None,
inference=None,
last_modified_date_time=None,
source=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
categories=None,
description=None,
display_name=None,
web_url=None):
body = {}
body['id'] = id_
body['allowed_audiences'] = allowed_audiences
body['created_date_time'] = created_date_time
body['inference'] = inference
body['last_modified_date_time'] = last_modified_date_time
body['source'] = source
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['categories'] = categories
body['description'] = description
body['display_name'] = display_name
body['web_url'] = web_url
return client.update_websites(user_id=user_id,
person_website_id=person_website_id,
body=body)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['RouterInterfaceArgs', 'RouterInterface']
@pulumi.input_type
class RouterInterfaceArgs:
def __init__(__self__, *,
opposite_region: pulumi.Input[str],
role: pulumi.Input[str],
router_id: pulumi.Input[str],
router_type: pulumi.Input[str],
description: Optional[pulumi.Input[str]] = None,
health_check_source_ip: Optional[pulumi.Input[str]] = None,
health_check_target_ip: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
opposite_access_point_id: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
specification: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a RouterInterface resource.
        :param pulumi.Input[str] opposite_region: The Region of peer side.
        :param pulumi.Input[str] role: The role the router interface plays. Valid values: `InitiatingSide`, `AcceptingSide`.
        :param pulumi.Input[str] router_id: The Router ID.
        :param pulumi.Input[str] router_type: Router type. Valid values: VRouter, VBR. The accepting side's router interface type can only be VRouter.
        :param pulumi.Input[str] description: Description of the router interface. It can be 2-256 characters long or left blank. It cannot start with http:// or https://.
        :param pulumi.Input[str] health_check_source_ip: Used as the Packet Source IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_target_ip` must be specified at the same time.
        :param pulumi.Input[str] health_check_target_ip: Used as the Packet Target IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_source_ip` must be specified at the same time.
        :param pulumi.Input[str] instance_charge_type: The billing method of the router interface. Valid values are "PrePaid" and "PostPaid". Defaults to "PostPaid". Router Interface doesn't support "PrePaid" when region and opposite_region are the same.
        :param pulumi.Input[str] name: Name of the router interface. It must be 2-80 characters long. Only Chinese characters, English letters, numbers, period (.), underline (_), or dash (-) are permitted.
               If it is not specified, the default value is the interface ID. The name cannot start with http:// or https://.
        :param pulumi.Input[str] opposite_access_point_id: It has been deprecated from version 1.11.0.
        :param pulumi.Input[str] specification: Specification of router interfaces. It is valid when `role` is `InitiatingSide`. The accepting side's role defaults to 'Negative'. For more about the specification, refer to [Router interface specification](https://www.alibabacloud.com/help/doc-detail/36037.htm).
pulumi.set(__self__, "opposite_region", opposite_region)
pulumi.set(__self__, "role", role)
pulumi.set(__self__, "router_id", router_id)
pulumi.set(__self__, "router_type", router_type)
if description is not None:
pulumi.set(__self__, "description", description)
if health_check_source_ip is not None:
pulumi.set(__self__, "health_check_source_ip", health_check_source_ip)
if health_check_target_ip is not None:
pulumi.set(__self__, "health_check_target_ip", health_check_target_ip)
if instance_charge_type is not None:
pulumi.set(__self__, "instance_charge_type", instance_charge_type)
if name is not None:
pulumi.set(__self__, "name", name)
if opposite_access_point_id is not None:
warnings.warn("""Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""", DeprecationWarning)
pulumi.log.warn("""opposite_access_point_id is deprecated: Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""")
if opposite_access_point_id is not None:
pulumi.set(__self__, "opposite_access_point_id", opposite_access_point_id)
if period is not None:
pulumi.set(__self__, "period", period)
if specification is not None:
pulumi.set(__self__, "specification", specification)
@property
@pulumi.getter(name="oppositeRegion")
def opposite_region(self) -> pulumi.Input[str]:
"""
The Region of peer side.
"""
return pulumi.get(self, "opposite_region")
@opposite_region.setter
def opposite_region(self, value: pulumi.Input[str]):
pulumi.set(self, "opposite_region", value)
@property
@pulumi.getter
def role(self) -> pulumi.Input[str]:
"""
        The role the router interface plays. Valid values: `InitiatingSide`, `AcceptingSide`.
"""
return pulumi.get(self, "role")
@role.setter
def role(self, value: pulumi.Input[str]):
pulumi.set(self, "role", value)
@property
@pulumi.getter(name="routerId")
def router_id(self) -> pulumi.Input[str]:
"""
The Router ID.
"""
return pulumi.get(self, "router_id")
@router_id.setter
def router_id(self, value: pulumi.Input[str]):
pulumi.set(self, "router_id", value)
@property
@pulumi.getter(name="routerType")
def router_type(self) -> pulumi.Input[str]:
"""
        Router type. Valid values: VRouter, VBR. The accepting side's router interface type can only be VRouter.
"""
return pulumi.get(self, "router_type")
@router_type.setter
def router_type(self, value: pulumi.Input[str]):
pulumi.set(self, "router_type", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
        Description of the router interface. It can be 2-256 characters long or left blank. It cannot start with http:// or https://.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="healthCheckSourceIp")
def health_check_source_ip(self) -> Optional[pulumi.Input[str]]:
"""
Used as the Packet Source IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_target_ip` must be specified at the same time.
"""
return pulumi.get(self, "health_check_source_ip")
@health_check_source_ip.setter
def health_check_source_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "health_check_source_ip", value)
@property
@pulumi.getter(name="healthCheckTargetIp")
def health_check_target_ip(self) -> Optional[pulumi.Input[str]]:
"""
Used as the Packet Target IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_source_ip` must be specified at the same time.
"""
return pulumi.get(self, "health_check_target_ip")
@health_check_target_ip.setter
def health_check_target_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "health_check_target_ip", value)
@property
@pulumi.getter(name="instanceChargeType")
def instance_charge_type(self) -> Optional[pulumi.Input[str]]:
"""
        The billing method of the router interface. Valid values are "PrePaid" and "PostPaid". Defaults to "PostPaid". Router Interface doesn't support "PrePaid" when region and opposite_region are the same.
"""
return pulumi.get(self, "instance_charge_type")
@instance_charge_type.setter
def instance_charge_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_charge_type", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
        Name of the router interface. It must be 2-80 characters long. Only Chinese characters, English letters, numbers, period (.), underline (_), or dash (-) are permitted.
        If it is not specified, the default value is the interface ID. The name cannot start with http:// or https://.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="oppositeAccessPointId")
def opposite_access_point_id(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated from version 1.11.0.
"""
return pulumi.get(self, "opposite_access_point_id")
@opposite_access_point_id.setter
def opposite_access_point_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "opposite_access_point_id", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def specification(self) -> Optional[pulumi.Input[str]]:
"""
        Specification of router interfaces. It is valid when `role` is `InitiatingSide`. The accepting side's role defaults to 'Negative'. For more about the specification, refer to [Router interface specification](https://www.alibabacloud.com/help/doc-detail/36037.htm).
"""
return pulumi.get(self, "specification")
@specification.setter
def specification(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "specification", value)
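# Sketch (illustrative, stdlib only — not tfgen output): the deprecated inputs
# above emit a DeprecationWarning via warnings.warn at construction time before
# the value is stored for backwards compatibility. The same pattern in
# miniature, with a hypothetical args class:

```python
import warnings

class ExampleArgs:
    # Hypothetical mini version of the tfgen pattern: warn when a deprecated
    # argument is supplied, then store it anyway so old callers keep working.
    def __init__(self, opposite_access_point_id=None):
        if opposite_access_point_id is not None:
            warnings.warn(
                "Attribute 'opposite_access_point_id' has been deprecated "
                "from version 1.11.0.", DeprecationWarning)
            self.opposite_access_point_id = opposite_access_point_id

# Capture the warning to show it fires exactly when the argument is set.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    args = ExampleArgs(opposite_access_point_id="ap-123")
```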
@pulumi.input_type
class _RouterInterfaceState:
def __init__(__self__, *,
access_point_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
health_check_source_ip: Optional[pulumi.Input[str]] = None,
health_check_target_ip: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
opposite_access_point_id: Optional[pulumi.Input[str]] = None,
opposite_interface_id: Optional[pulumi.Input[str]] = None,
opposite_interface_owner_id: Optional[pulumi.Input[str]] = None,
opposite_region: Optional[pulumi.Input[str]] = None,
opposite_router_id: Optional[pulumi.Input[str]] = None,
opposite_router_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
role: Optional[pulumi.Input[str]] = None,
router_id: Optional[pulumi.Input[str]] = None,
router_type: Optional[pulumi.Input[str]] = None,
specification: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering RouterInterface resources.
        :param pulumi.Input[str] access_point_id: It has been deprecated from version 1.11.0.
        :param pulumi.Input[str] description: Description of the router interface. It can be 2-256 characters long or left blank. It cannot start with http:// or https://.
        :param pulumi.Input[str] health_check_source_ip: Used as the Packet Source IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_target_ip` must be specified at the same time.
        :param pulumi.Input[str] health_check_target_ip: Used as the Packet Target IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_source_ip` must be specified at the same time.
        :param pulumi.Input[str] instance_charge_type: The billing method of the router interface. Valid values are "PrePaid" and "PostPaid". Defaults to "PostPaid". Router Interface doesn't support "PrePaid" when region and opposite_region are the same.
        :param pulumi.Input[str] name: Name of the router interface. It must be 2-80 characters long. Only Chinese characters, English letters, numbers, period (.), underline (_), or dash (-) are permitted.
               If it is not specified, the default value is the interface ID. The name cannot start with http:// or https://.
        :param pulumi.Input[str] opposite_access_point_id: It has been deprecated from version 1.11.0.
        :param pulumi.Input[str] opposite_interface_id: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_id' instead.
        :param pulumi.Input[str] opposite_interface_owner_id: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_owner_id' instead.
        :param pulumi.Input[str] opposite_region: The Region of peer side.
        :param pulumi.Input[str] opposite_router_id: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_id' instead.
        :param pulumi.Input[str] opposite_router_type: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_type' instead.
        :param pulumi.Input[str] role: The role the router interface plays. Valid values: `InitiatingSide`, `AcceptingSide`.
        :param pulumi.Input[str] router_id: The Router ID.
        :param pulumi.Input[str] router_type: Router type. Valid values: VRouter, VBR. The accepting side's router interface type can only be VRouter.
        :param pulumi.Input[str] specification: Specification of router interfaces. It is valid when `role` is `InitiatingSide`. The accepting side's role defaults to 'Negative'. For more about the specification, refer to [Router interface specification](https://www.alibabacloud.com/help/doc-detail/36037.htm).
"""
if access_point_id is not None:
warnings.warn("""Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""", DeprecationWarning)
pulumi.log.warn("""access_point_id is deprecated: Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""")
if access_point_id is not None:
pulumi.set(__self__, "access_point_id", access_point_id)
if description is not None:
pulumi.set(__self__, "description", description)
if health_check_source_ip is not None:
pulumi.set(__self__, "health_check_source_ip", health_check_source_ip)
if health_check_target_ip is not None:
pulumi.set(__self__, "health_check_target_ip", health_check_target_ip)
if instance_charge_type is not None:
pulumi.set(__self__, "instance_charge_type", instance_charge_type)
if name is not None:
pulumi.set(__self__, "name", name)
if opposite_access_point_id is not None:
warnings.warn("""Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""", DeprecationWarning)
pulumi.log.warn("""opposite_access_point_id is deprecated: Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""")
if opposite_access_point_id is not None:
pulumi.set(__self__, "opposite_access_point_id", opposite_access_point_id)
if opposite_interface_id is not None:
warnings.warn("""Attribute 'opposite_interface_id' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_id' instead.""", DeprecationWarning)
pulumi.log.warn("""opposite_interface_id is deprecated: Attribute 'opposite_interface_id' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_id' instead.""")
if opposite_interface_id is not None:
pulumi.set(__self__, "opposite_interface_id", opposite_interface_id)
if opposite_interface_owner_id is not None:
warnings.warn("""Attribute 'opposite_interface_owner_id' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_owner_id' instead.""", DeprecationWarning)
pulumi.log.warn("""opposite_interface_owner_id is deprecated: Attribute 'opposite_interface_owner_id' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_owner_id' instead.""")
if opposite_interface_owner_id is not None:
pulumi.set(__self__, "opposite_interface_owner_id", opposite_interface_owner_id)
if opposite_region is not None:
pulumi.set(__self__, "opposite_region", opposite_region)
if opposite_router_id is not None:
warnings.warn("""Attribute 'opposite_router_id' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_id' instead.""", DeprecationWarning)
pulumi.log.warn("""opposite_router_id is deprecated: Attribute 'opposite_router_id' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_id' instead.""")
if opposite_router_id is not None:
pulumi.set(__self__, "opposite_router_id", opposite_router_id)
if opposite_router_type is not None:
warnings.warn("""Attribute 'opposite_router_type' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_type' instead.""", DeprecationWarning)
pulumi.log.warn("""opposite_router_type is deprecated: Attribute 'opposite_router_type' has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_type' instead.""")
if opposite_router_type is not None:
pulumi.set(__self__, "opposite_router_type", opposite_router_type)
if period is not None:
pulumi.set(__self__, "period", period)
if role is not None:
pulumi.set(__self__, "role", role)
if router_id is not None:
pulumi.set(__self__, "router_id", router_id)
if router_type is not None:
pulumi.set(__self__, "router_type", router_type)
if specification is not None:
pulumi.set(__self__, "specification", specification)
@property
@pulumi.getter(name="accessPointId")
def access_point_id(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated from version 1.11.0.
"""
return pulumi.get(self, "access_point_id")
@access_point_id.setter
def access_point_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "access_point_id", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
        Description of the router interface. It can be 2-256 characters long or left blank. It cannot start with http:// or https://.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="healthCheckSourceIp")
def health_check_source_ip(self) -> Optional[pulumi.Input[str]]:
"""
Used as the Packet Source IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_target_ip` must be specified at the same time.
"""
return pulumi.get(self, "health_check_source_ip")
@health_check_source_ip.setter
def health_check_source_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "health_check_source_ip", value)
@property
@pulumi.getter(name="healthCheckTargetIp")
def health_check_target_ip(self) -> Optional[pulumi.Input[str]]:
"""
Used as the Packet Target IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_source_ip` must be specified at the same time.
"""
return pulumi.get(self, "health_check_target_ip")
@health_check_target_ip.setter
def health_check_target_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "health_check_target_ip", value)
@property
@pulumi.getter(name="instanceChargeType")
def instance_charge_type(self) -> Optional[pulumi.Input[str]]:
"""
The billing method of the router interface. Valid values are "PrePaid" and "PostPaid". Default to "PostPaid". Router Interface doesn't support "PrePaid" when region and opposite_region are the same.
"""
return pulumi.get(self, "instance_charge_type")
@instance_charge_type.setter
def instance_charge_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_charge_type", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the router interface. Length must be 2-80 characters long. Only Chinese characters, English letters, numbers, period (.), underline (_), or dash (-) are permitted.
If it is not specified, the default value is interface ID. The name cannot start with http:// and https://.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="oppositeAccessPointId")
def opposite_access_point_id(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated from version 1.11.0.
"""
return pulumi.get(self, "opposite_access_point_id")
@opposite_access_point_id.setter
def opposite_access_point_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "opposite_access_point_id", value)
@property
@pulumi.getter(name="oppositeInterfaceId")
def opposite_interface_id(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_id' instead.
"""
return pulumi.get(self, "opposite_interface_id")
@opposite_interface_id.setter
def opposite_interface_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "opposite_interface_id", value)
@property
@pulumi.getter(name="oppositeInterfaceOwnerId")
def opposite_interface_owner_id(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_owner_id' instead.
"""
return pulumi.get(self, "opposite_interface_owner_id")
@opposite_interface_owner_id.setter
def opposite_interface_owner_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "opposite_interface_owner_id", value)
@property
@pulumi.getter(name="oppositeRegion")
def opposite_region(self) -> Optional[pulumi.Input[str]]:
"""
The Region of peer side.
"""
return pulumi.get(self, "opposite_region")
@opposite_region.setter
def opposite_region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "opposite_region", value)
@property
@pulumi.getter(name="oppositeRouterId")
def opposite_router_id(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_id' instead.
"""
return pulumi.get(self, "opposite_router_id")
@opposite_router_id.setter
def opposite_router_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "opposite_router_id", value)
@property
@pulumi.getter(name="oppositeRouterType")
def opposite_router_type(self) -> Optional[pulumi.Input[str]]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_type' instead.
"""
return pulumi.get(self, "opposite_router_type")
@opposite_router_type.setter
def opposite_router_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "opposite_router_type", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def role(self) -> Optional[pulumi.Input[str]]:
"""
The role the router interface plays. Optional value: `InitiatingSide`, `AcceptingSide`.
"""
return pulumi.get(self, "role")
@role.setter
def role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "role", value)
@property
@pulumi.getter(name="routerId")
def router_id(self) -> Optional[pulumi.Input[str]]:
"""
The Router ID.
"""
return pulumi.get(self, "router_id")
@router_id.setter
def router_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "router_id", value)
@property
@pulumi.getter(name="routerType")
def router_type(self) -> Optional[pulumi.Input[str]]:
"""
Router type. Valid values: VRouter, VBR. The accepting-side router interface type can only be VRouter.
"""
return pulumi.get(self, "router_type")
@router_type.setter
def router_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "router_type", value)
@property
@pulumi.getter
def specification(self) -> Optional[pulumi.Input[str]]:
"""
Specification of the router interface. It is only valid when `role` is `InitiatingSide`; the accepting side's specification defaults to 'Negative'. For more about the specification, refer to [Router interface specification](https://www.alibabacloud.com/help/doc-detail/36037.htm).
"""
return pulumi.get(self, "specification")
@specification.setter
def specification(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "specification", value)
class RouterInterface(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[str]] = None,
health_check_source_ip: Optional[pulumi.Input[str]] = None,
health_check_target_ip: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
opposite_access_point_id: Optional[pulumi.Input[str]] = None,
opposite_region: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
role: Optional[pulumi.Input[str]] = None,
router_id: Optional[pulumi.Input[str]] = None,
router_type: Optional[pulumi.Input[str]] = None,
specification: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
## Import
The router interface can be imported using the id, e.g.
```sh
$ pulumi import alicloud:vpc/routerInterface:RouterInterface interface ri-abc123456
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] description: Description of the router interface. It can be 2-256 characters long or left blank. It cannot start with http:// and https://.
:param pulumi.Input[str] health_check_source_ip: Used as the Packet Source IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_target_ip` must be specified at the same time.
:param pulumi.Input[str] health_check_target_ip: Used as the Packet Target IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_source_ip` must be specified at the same time.
:param pulumi.Input[str] instance_charge_type: The billing method of the router interface. Valid values are "PrePaid" and "PostPaid". Default to "PostPaid". Router Interface doesn't support "PrePaid" when region and opposite_region are the same.
:param pulumi.Input[str] name: Name of the router interface. Length must be 2-80 characters long. Only Chinese characters, English letters, numbers, period (.), underline (_), or dash (-) are permitted.
If it is not specified, the default value is interface ID. The name cannot start with http:// and https://.
:param pulumi.Input[str] opposite_access_point_id: It has been deprecated from version 1.11.0.
:param pulumi.Input[str] opposite_region: The Region of peer side.
:param pulumi.Input[str] role: The role the router interface plays. Optional value: `InitiatingSide`, `AcceptingSide`.
:param pulumi.Input[str] router_id: The Router ID.
:param pulumi.Input[str] router_type: Router type. Valid values: VRouter, VBR. The accepting-side router interface type can only be VRouter.
:param pulumi.Input[str] specification: Specification of the router interface. It is only valid when `role` is `InitiatingSide`; the accepting side's specification defaults to 'Negative'. For more about the specification, refer to [Router interface specification](https://www.alibabacloud.com/help/doc-detail/36037.htm).
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: RouterInterfaceArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
## Import
The router interface can be imported using the id, e.g.
```sh
$ pulumi import alicloud:vpc/routerInterface:RouterInterface interface ri-abc123456
```
:param str resource_name: The name of the resource.
:param RouterInterfaceArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(RouterInterfaceArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
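The runtime dispatch between the two `__init__` overloads above can be shown in isolation; this is a self-contained sketch of the same pattern (`Args`, `init`, and `_internal_init` are illustrative stand-ins, not the Pulumi internals):

```python
class Args:
    """Stand-in for a generated *Args class that bundles resource properties."""
    def __init__(self, role, router_id):
        self.role = role
        self.router_id = router_id

def _internal_init(name, **props):
    # The real SDK would register the resource; here we just echo the inputs.
    return (name, props)

def init(name, *args, **kwargs):
    # Accept either init(name, Args(...)) or init(name, role=..., router_id=...).
    if args and isinstance(args[0], Args):
        return _internal_init(name, **args[0].__dict__)
    return _internal_init(name, **kwargs)

# Both calling styles resolve to the same property dictionary.
assert init("ri", Args("InitiatingSide", "rtr-1")) == \
       init("ri", role="InitiatingSide", router_id="rtr-1")
```

The generated `__init__` does the same job, with `_utilities.get_resource_args_opts` additionally separating out the `ResourceOptions` argument.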
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[str]] = None,
health_check_source_ip: Optional[pulumi.Input[str]] = None,
health_check_target_ip: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
opposite_access_point_id: Optional[pulumi.Input[str]] = None,
opposite_region: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
role: Optional[pulumi.Input[str]] = None,
router_id: Optional[pulumi.Input[str]] = None,
router_type: Optional[pulumi.Input[str]] = None,
specification: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = RouterInterfaceArgs.__new__(RouterInterfaceArgs)
__props__.__dict__["description"] = description
__props__.__dict__["health_check_source_ip"] = health_check_source_ip
__props__.__dict__["health_check_target_ip"] = health_check_target_ip
__props__.__dict__["instance_charge_type"] = instance_charge_type
__props__.__dict__["name"] = name
if opposite_access_point_id is not None and not opts.urn:
warnings.warn("""Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""", DeprecationWarning)
pulumi.log.warn("""opposite_access_point_id is deprecated: Attribute 'opposite_access_point_id' has been deprecated from version 1.11.0.""")
__props__.__dict__["opposite_access_point_id"] = opposite_access_point_id
if opposite_region is None and not opts.urn:
raise TypeError("Missing required property 'opposite_region'")
__props__.__dict__["opposite_region"] = opposite_region
__props__.__dict__["period"] = period
if role is None and not opts.urn:
raise TypeError("Missing required property 'role'")
__props__.__dict__["role"] = role
if router_id is None and not opts.urn:
raise TypeError("Missing required property 'router_id'")
__props__.__dict__["router_id"] = router_id
if router_type is None and not opts.urn:
raise TypeError("Missing required property 'router_type'")
__props__.__dict__["router_type"] = router_type
__props__.__dict__["specification"] = specification
__props__.__dict__["access_point_id"] = None
__props__.__dict__["opposite_interface_id"] = None
__props__.__dict__["opposite_interface_owner_id"] = None
__props__.__dict__["opposite_router_id"] = None
__props__.__dict__["opposite_router_type"] = None
super(RouterInterface, __self__).__init__(
'alicloud:vpc/routerInterface:RouterInterface',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
access_point_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
health_check_source_ip: Optional[pulumi.Input[str]] = None,
health_check_target_ip: Optional[pulumi.Input[str]] = None,
instance_charge_type: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
opposite_access_point_id: Optional[pulumi.Input[str]] = None,
opposite_interface_id: Optional[pulumi.Input[str]] = None,
opposite_interface_owner_id: Optional[pulumi.Input[str]] = None,
opposite_region: Optional[pulumi.Input[str]] = None,
opposite_router_id: Optional[pulumi.Input[str]] = None,
opposite_router_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
role: Optional[pulumi.Input[str]] = None,
router_id: Optional[pulumi.Input[str]] = None,
router_type: Optional[pulumi.Input[str]] = None,
specification: Optional[pulumi.Input[str]] = None) -> 'RouterInterface':
"""
Get an existing RouterInterface resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] access_point_id: It has been deprecated from version 1.11.0.
:param pulumi.Input[str] description: Description of the router interface. It can be 2-256 characters long or left blank. It cannot start with http:// and https://.
:param pulumi.Input[str] health_check_source_ip: Used as the Packet Source IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_target_ip` must be specified at the same time.
:param pulumi.Input[str] health_check_target_ip: Used as the Packet Target IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_source_ip` must be specified at the same time.
:param pulumi.Input[str] instance_charge_type: The billing method of the router interface. Valid values are "PrePaid" and "PostPaid". Default to "PostPaid". Router Interface doesn't support "PrePaid" when region and opposite_region are the same.
:param pulumi.Input[str] name: Name of the router interface. Length must be 2-80 characters long. Only Chinese characters, English letters, numbers, period (.), underline (_), or dash (-) are permitted.
If it is not specified, the default value is interface ID. The name cannot start with http:// and https://.
:param pulumi.Input[str] opposite_access_point_id: It has been deprecated from version 1.11.0.
:param pulumi.Input[str] opposite_interface_id: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_id' instead.
:param pulumi.Input[str] opposite_interface_owner_id: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_owner_id' instead.
:param pulumi.Input[str] opposite_region: The Region of peer side.
:param pulumi.Input[str] opposite_router_id: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_id' instead.
:param pulumi.Input[str] opposite_router_type: It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_type' instead.
:param pulumi.Input[str] role: The role the router interface plays. Optional value: `InitiatingSide`, `AcceptingSide`.
:param pulumi.Input[str] router_id: The Router ID.
:param pulumi.Input[str] router_type: Router type. Valid values: VRouter, VBR. The accepting-side router interface type can only be VRouter.
:param pulumi.Input[str] specification: Specification of the router interface. It is only valid when `role` is `InitiatingSide`; the accepting side's specification defaults to 'Negative'. For more about the specification, refer to [Router interface specification](https://www.alibabacloud.com/help/doc-detail/36037.htm).
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _RouterInterfaceState.__new__(_RouterInterfaceState)
__props__.__dict__["access_point_id"] = access_point_id
__props__.__dict__["description"] = description
__props__.__dict__["health_check_source_ip"] = health_check_source_ip
__props__.__dict__["health_check_target_ip"] = health_check_target_ip
__props__.__dict__["instance_charge_type"] = instance_charge_type
__props__.__dict__["name"] = name
__props__.__dict__["opposite_access_point_id"] = opposite_access_point_id
__props__.__dict__["opposite_interface_id"] = opposite_interface_id
__props__.__dict__["opposite_interface_owner_id"] = opposite_interface_owner_id
__props__.__dict__["opposite_region"] = opposite_region
__props__.__dict__["opposite_router_id"] = opposite_router_id
__props__.__dict__["opposite_router_type"] = opposite_router_type
__props__.__dict__["period"] = period
__props__.__dict__["role"] = role
__props__.__dict__["router_id"] = router_id
__props__.__dict__["router_type"] = router_type
__props__.__dict__["specification"] = specification
return RouterInterface(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="accessPointId")
def access_point_id(self) -> pulumi.Output[str]:
"""
It has been deprecated from version 1.11.0.
"""
return pulumi.get(self, "access_point_id")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
Description of the router interface. It can be 2-256 characters long or left blank. It cannot start with http:// and https://.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="healthCheckSourceIp")
def health_check_source_ip(self) -> pulumi.Output[Optional[str]]:
"""
Used as the Packet Source IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_target_ip` must be specified at the same time.
"""
return pulumi.get(self, "health_check_source_ip")
@property
@pulumi.getter(name="healthCheckTargetIp")
def health_check_target_ip(self) -> pulumi.Output[Optional[str]]:
"""
Used as the Packet Target IP of health check for disaster recovery or ECMP. It is only valid when `router_type` is `VBR`. The IP must be an unused IP in the local VPC. It and `health_check_source_ip` must be specified at the same time.
"""
return pulumi.get(self, "health_check_target_ip")
@property
@pulumi.getter(name="instanceChargeType")
def instance_charge_type(self) -> pulumi.Output[Optional[str]]:
"""
The billing method of the router interface. Valid values are "PrePaid" and "PostPaid". Default to "PostPaid". Router Interface doesn't support "PrePaid" when region and opposite_region are the same.
"""
return pulumi.get(self, "instance_charge_type")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Name of the router interface. Length must be 2-80 characters long. Only Chinese characters, English letters, numbers, period (.), underline (_), or dash (-) are permitted.
If it is not specified, the default value is interface ID. The name cannot start with http:// and https://.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="oppositeAccessPointId")
def opposite_access_point_id(self) -> pulumi.Output[Optional[str]]:
"""
It has been deprecated from version 1.11.0.
"""
return pulumi.get(self, "opposite_access_point_id")
@property
@pulumi.getter(name="oppositeInterfaceId")
def opposite_interface_id(self) -> pulumi.Output[str]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_id' instead.
"""
return pulumi.get(self, "opposite_interface_id")
@property
@pulumi.getter(name="oppositeInterfaceOwnerId")
def opposite_interface_owner_id(self) -> pulumi.Output[str]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_interface_owner_id' instead.
"""
return pulumi.get(self, "opposite_interface_owner_id")
@property
@pulumi.getter(name="oppositeRegion")
def opposite_region(self) -> pulumi.Output[str]:
"""
The Region of peer side.
"""
return pulumi.get(self, "opposite_region")
@property
@pulumi.getter(name="oppositeRouterId")
def opposite_router_id(self) -> pulumi.Output[str]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_id' instead.
"""
return pulumi.get(self, "opposite_router_id")
@property
@pulumi.getter(name="oppositeRouterType")
def opposite_router_type(self) -> pulumi.Output[str]:
"""
It has been deprecated from version 1.11.0. Use resource alicloud_router_interface_connection's 'opposite_router_type' instead.
"""
return pulumi.get(self, "opposite_router_type")
@property
@pulumi.getter
def period(self) -> pulumi.Output[Optional[int]]:
return pulumi.get(self, "period")
@property
@pulumi.getter
def role(self) -> pulumi.Output[str]:
"""
The role the router interface plays. Optional value: `InitiatingSide`, `AcceptingSide`.
"""
return pulumi.get(self, "role")
@property
@pulumi.getter(name="routerId")
def router_id(self) -> pulumi.Output[str]:
"""
The Router ID.
"""
return pulumi.get(self, "router_id")
@property
@pulumi.getter(name="routerType")
def router_type(self) -> pulumi.Output[str]:
"""
Router type. Valid values: VRouter, VBR. The accepting-side router interface type can only be VRouter.
"""
return pulumi.get(self, "router_type")
@property
@pulumi.getter
def specification(self) -> pulumi.Output[Optional[str]]:
"""
Specification of the router interface. It is only valid when `role` is `InitiatingSide`; the accepting side's specification defaults to 'Negative'. For more about the specification, refer to [Router interface specification](https://www.alibabacloud.com/help/doc-detail/36037.htm).
"""
return pulumi.get(self, "specification")
| 56.139452 | 316 | 0.685039 | 6,065 | 47,101 | 5.084089 | 0.039077 | 0.066353 | 0.079455 | 0.076342 | 0.941819 | 0.925215 | 0.903584 | 0.882277 | 0.861424 | 0.852181 | 0 | 0.007293 | 0.216874 | 47,101 | 838 | 317 | 56.206444 | 0.828661 | 0.363708 | 0 | 0.741313 | 1 | 0.015444 | 0.182732 | 0.070225 | 0 | 0 | 0 | 0 | 0 | 1 | 0.158301 | false | 0.001931 | 0.009653 | 0.005792 | 0.264479 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c1804dc503cc990279d12d8ae343d09521e4916e | 1,403 | py | Python | key_move.py | homma/terminal-dots | 84b866e6f148ac2c631652de73903e1413be5bc8 | [
"MIT"
] | null | null | null | key_move.py | homma/terminal-dots | 84b866e6f148ac2c631652de73903e1413be5bc8 | [
"MIT"
] | null | null | null | key_move.py | homma/terminal-dots | 84b866e6f148ac2c631652de73903e1413be5bc8 | [
"MIT"
] | null | null | null |
## key_move.py

import curses

def up(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    if y != 0:
        app.win.move(y - 1, x)
    app.win.chgat(1, curses.A_UNDERLINE)

def down(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    h, w = app.win.getmaxyx()
    max_y = h - 1
    if y < max_y:
        app.win.move(y + 1, x)
    app.win.chgat(1, curses.A_UNDERLINE)

def top(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    app.win.move(0, x)
    app.win.chgat(1, curses.A_UNDERLINE)

def bottom(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    h, w = app.win.getmaxyx()
    max_y = h - 1
    app.win.move(max_y, x)
    app.win.chgat(1, curses.A_UNDERLINE)

def left(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    if x != 0:
        app.win.move(y, x - 1)
    app.win.chgat(1, curses.A_UNDERLINE)

def right(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    h, w = app.win.getmaxyx()
    max_x = w - 1
    if x < max_x:
        app.win.move(y, x + 1)
    app.win.chgat(1, curses.A_UNDERLINE)

def right_end(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    h, w = app.win.getmaxyx()
    max_x = w - 1
    app.win.move(y, max_x)
    app.win.chgat(1, curses.A_UNDERLINE)

def left_end(app):
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    app.win.move(y, 0)
    app.win.chgat(1, curses.A_UNDERLINE)
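Each handler above only needs an object whose `win` supports `chgat`, `getyx`, `getmaxyx`, and `move`, so the cursor-clamping logic can be exercised without a real terminal. A sketch with a stub window (`FakeWin` and `App` are illustrative names, not part of this module):

```python
import curses  # only the A_NORMAL / A_UNDERLINE constants are used, no initscr()

class FakeWin:
    """Minimal stand-in for a curses window."""
    def __init__(self, h, w):
        self.h, self.w = h, w
        self.y = self.x = 0
    def chgat(self, n, attr):   # attribute changes are a no-op here
        pass
    def getyx(self):
        return self.y, self.x
    def getmaxyx(self):
        return self.h, self.w
    def move(self, y, x):
        self.y, self.x = y, x

class App:
    def __init__(self, h, w):
        self.win = FakeWin(h, w)

def down(app):                  # same body as the down() handler above
    app.win.chgat(1, curses.A_NORMAL)
    y, x = app.win.getyx()
    h, w = app.win.getmaxyx()
    max_y = h - 1
    if y < max_y:
        app.win.move(y + 1, x)
    app.win.chgat(1, curses.A_UNDERLINE)

app = App(3, 4)
down(app); down(app); down(app)  # third call is clamped at the last row
print(app.win.getyx())           # -> (2, 0)
```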
| 21.257576 | 38 | 0.626515 | 279 | 1,403 | 3.053763 | 0.103943 | 0.253521 | 0.206573 | 0.225352 | 0.899061 | 0.896714 | 0.896714 | 0.86385 | 0.86385 | 0.826291 | 0 | 0.024648 | 0.190306 | 1,403 | 65 | 39 | 21.584615 | 0.725352 | 0.00784 | 0 | 0.603774 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.150943 | false | 0 | 0.018868 | 0 | 0.169811 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
c1b7403616a8717012c0fb5686229ed9b29fa574 | 88 | py | Python | app/settings/defaults.py | pedrolp85/pydevice | 39b961bb67f59ac9a9373ecc99748e07505b249e | [
"Apache-2.0"
] | null | null | null | app/settings/defaults.py | pedrolp85/pydevice | 39b961bb67f59ac9a9373ecc99748e07505b249e | [
"Apache-2.0"
] | null | null | null | app/settings/defaults.py | pedrolp85/pydevice | 39b961bb67f59ac9a9373ecc99748e07505b249e | [
"Apache-2.0"
] | null | null | null | from .settings import Settings
def get_settings() -> Settings:
    return Settings()
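A common refinement of this accessor pattern (an assumption, not something this module does) is to memoise the settings object so every caller shares one instance:

```python
from functools import lru_cache

class Settings:
    """Stand-in for the Settings class imported from .settings above."""

@lru_cache(maxsize=None)
def get_settings() -> Settings:
    # The first call constructs Settings(); later calls return the cached one.
    return Settings()

print(get_settings() is get_settings())  # -> True
```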
| 14.666667 | 32 | 0.715909 | 10 | 88 | 6.2 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.193182 | 88 | 5 | 33 | 17.6 | 0.873239 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
c1e0f7502de0ced75ef63e164c1de15460d7f76b | 47 | py | Python | cffi/golay.py | CuckooEXE/golay | 41aaf9346a527811a401d3cdc6b0664fe59c1bbb | [
"MIT"
] | null | null | null | cffi/golay.py | CuckooEXE/golay | 41aaf9346a527811a401d3cdc6b0664fe59c1bbb | [
"MIT"
] | null | null | null | cffi/golay.py | CuckooEXE/golay | 41aaf9346a527811a401d3cdc6b0664fe59c1bbb | [
"MIT"
] | null | null | null | import pygolay.lib
print(pygolay.lib.add(1,2)) | 15.666667 | 27 | 0.765957 | 9 | 47 | 4 | 0.777778 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.06383 | 47 | 3 | 27 | 15.666667 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
c1e83655c635d0e648f3bbbc4540fabe38df13d0 | 90,070 | py | Python | escriptcore/test/python/test_util_spatial_functions1.py | markendr/esys-escript.github.io | 0023eab09cd71f830ab098cb3a468e6139191e8d | [
"Apache-2.0"
] | null | null | null | escriptcore/test/python/test_util_spatial_functions1.py | markendr/esys-escript.github.io | 0023eab09cd71f830ab098cb3a468e6139191e8d | [
"Apache-2.0"
] | 1 | 2019-01-14T03:07:43.000Z | 2019-01-14T03:07:43.000Z | escriptcore/test/python/test_util_spatial_functions1.py | markendr/esys-escript.github.io | 0023eab09cd71f830ab098cb3a468e6139191e8d | [
"Apache-2.0"
] | null | null | null |
##############################################################################
#
# Copyright (c) 2003-2018 by The University of Queensland
# http://www.uq.edu.au
#
# Primary Business: Queensland, Australia
# Licensed under the Apache License, version 2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Development until 2012 by Earth Systems Science Computational Center (ESSCC)
# Development 2012-2013 by School of Earth Sciences
# Development from 2014 by Centre for Geoscience Computing (GeoComp)
#
##############################################################################
from __future__ import print_function, division
__copyright__="""Copyright (c) 2003-2018 by The University of Queensland
http://www.uq.edu.au
Primary Business: Queensland, Australia"""
__license__="""Licensed under the Apache License, version 2.0
http://www.apache.org/licenses/LICENSE-2.0"""
__url__="https://launchpad.net/escript-finley"
"""
basic tests for functions in util.py affecting the spatial distribution
it is assumed that the domain is the unit square/cube
not all of these tests will run for all domains; check the doc string for the assumptions of a particular test
:var __author__: name of author
:var __copyright__: copyrights
:var __license__: licence agreement
:var __url__: url entry point on documentation
:var __version__: version
:var __date__: date of the version
"""
__author__="Lutz Gross, l.gross@uq.edu.au"
import esys.escriptcore.utestselect as unittest
from esys.escript import *
from numpy import array
import numpy
from test_util_grad import Test_Util_Gradient_noBoundary
from test_util_integrals import Test_Util_Integration_noContact
from test_util_interpolation import Test_Util_Interpolation_noContact
class Test_Util_SpatialFunctions_noGradOnBoundary_noContact(Test_Util_Integration_noContact, Test_Util_Interpolation_noContact, Test_Util_Gradient_noBoundary):
    RES_TOL=1.e-8

    def test_x_ofDomain(self):
        """
        test getX() of the domain to be in the [0,1]^dim box
        """
        dim=self.domain.getDim()
        x=self.domain.getX()
        self.assertEqual(x.getShape(),(dim,),"wrong shape of result.")
        self.assertAlmostEqual(inf(x[0]),0.,int(-log10(self.RES_TOL)),"min x0 wrong")
        self.assertAlmostEqual(sup(x[0]),1.,int(-log10(self.RES_TOL)),"max x0 wrong")
        self.assertAlmostEqual(inf(x[1]),0.,int(-log10(self.RES_TOL)),"min x1 wrong")
        self.assertAlmostEqual(sup(x[1]),1.,int(-log10(self.RES_TOL)),"max x1 wrong")
        if dim>2:
            self.assertAlmostEqual(inf(x[2]),0.,int(-log10(self.RES_TOL)),"min x2 wrong")
            self.assertAlmostEqual(sup(x[2]),1.,int(-log10(self.RES_TOL)),"max x2 wrong")

    def test_SolutionOrder(self):
        """
        test the approximation order
        """
        self.assertEqual(self.order, Solution(self.domain).getApproximationOrder(), "wrong order (Solution)")
        self.assertEqual(self.order, ContinuousFunction(self.domain).getApproximationOrder(), "wrong order (continuous function)")
        self.assertEqual(1, ReducedSolution(self.domain).getApproximationOrder(), "wrong order (ReducedSolution)")
        self.assertEqual(1, ReducedContinuousFunction(self.domain).getApproximationOrder(), "wrong order (Reduced continuous function)")
        for i in range(self.domain.getDim()):
            for k in range(Function(self.domain).getApproximationOrder()+1):
                self.assertAlmostEqual(integrate(Function(self.domain).getX()[i]**k),1./(k+1),8,"wrong integral (i=%s, order = %s)"%(i,k))
            for k in range(ReducedFunction(self.domain).getApproximationOrder()+1):
                self.assertAlmostEqual(integrate(ReducedFunction(self.domain).getX()[i]**k),1./(k+1),8,"wrong integral (i=%s, order = %s (reduced))"%(i,k))
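The reference value `1./(k+1)` used above is the closed form of the monomial integral, ∫₀¹ xᵏ dx = 1/(k+1); a quick standalone check with a midpoint rule (plain Python, no escript required):

```python
def integrate_01(f, n=100000):
    """Midpoint-rule approximation of the integral of f over [0, 1]."""
    h = 1.0 / n
    return h * sum(f((i + 0.5) * h) for i in range(n))

# The midpoint rule is O(h^2) accurate, far tighter than the 1e-8 tolerance.
for k in range(5):
    assert abs(integrate_01(lambda x, k=k: x**k) - 1.0 / (k + 1)) < 1e-8
print("ok")
```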
def test_normal_FunctionOnBoundary(self):
"""
test getNormal() on boundary
assumptions: FunctionOnBoundary(self.domain) exists
"""
dim=self.domain.getDim()
f=FunctionOnBoundary(self.domain)
x=f.getX()
ref=Vector(0.,what=f)
if dim==3:
ref.setTaggedValue(200,[0,0,1])
ref.setTaggedValue(100,[0,0,-1])
ref.setTaggedValue(20,[0,1,0])
ref.setTaggedValue(10,[0,-1,0])
ref.setTaggedValue(2,[1,0,0])
ref.setTaggedValue(1,[-1,0,0])
else:
ref.setTaggedValue(2,[1,0])
ref.setTaggedValue(1,[-1,0])
ref.setTaggedValue(20, [0,1])
ref.setTaggedValue(10, [0,-1])
res=f.getNormal()
self.assertEqual(res.getShape(),(dim,),"wrong shape of result.")
self.assertEqual(res.getFunctionSpace(),f,"wrong functionspace of result.")
self.assertLess(Lsup(ref-res), self.RES_TOL, "wrong result")
def test_normal_ReducedFunctionOnBoundary(self):
"""
test getNormal() on boundary
assumptions: FunctionOnBoundary(self.domain) exists
"""
dim=self.domain.getDim()
f=ReducedFunctionOnBoundary(self.domain)
x=f.getX()
ref=Vector(0.,what=f)
if dim==3:
ref.setTaggedValue(200,[0,0,1])
ref.setTaggedValue(100,[0,0,-1])
ref.setTaggedValue(20,[0,1,0])
ref.setTaggedValue(10,[0,-1,0])
ref.setTaggedValue(2,[1,0,0])
ref.setTaggedValue(1,[-1,0,0])
else:
ref.setTaggedValue(2,[1,0])
ref.setTaggedValue(1,[-1,0])
ref.setTaggedValue(20, [0,1])
ref.setTaggedValue(10, [0,-1])
res=f.getNormal()
self.assertEqual(res.getShape(),(dim,),"wrong shape of result.")
self.assertEqual(res.getFunctionSpace(),f,"wrong function space of result.")
self.assertLess(Lsup(ref-res), self.RES_TOL, "wrong result")
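The tagged reference values used in these two tests are the outward unit normals of the faces of the unit square/cube; the tag numbering (1/2 for the x0 faces, 10/20 for x1, 100/200 for x2) is taken from the values set above and is assumed to follow the usual escript rectangle/brick convention. A minimal sketch checking that each reference normal is indeed a unit vector:

```python
# Hedged sketch: the reference normals tagged above, checked for unit length.
# The tag -> normal mapping mirrors the 3-D branch of the tests.
import math

normals_3d = {200: [0, 0, 1], 100: [0, 0, -1], 20: [0, 1, 0],
              10: [0, -1, 0], 2: [1, 0, 0], 1: [-1, 0, 0]}
for tag, n in normals_3d.items():
    assert math.isclose(sum(c * c for c in n), 1.0), tag  # unit length
```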
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunction_fromData_rank0(self):
"""
tests the L2-norm of a rank-0 Data object on Function
assumptions: self.domain supports integration on Function
"""
dim=self.domain.getDim()
w=Function(self.domain)
x=w.getX()
if dim==2:
arg=(0.608797336225)*x[0]
ref=sqrt((0.123544732198))
else:
arg=(0.136031275673)*x[0]
ref=sqrt((0.00616816932037))
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunction_fromData_rank1(self):
"""
tests the L2-norm of a rank-1 Data object on Function
assumptions: self.domain supports integration on Function
"""
dim=self.domain.getDim()
w=Function(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(2,),w)
arg[0]=(-0.212143919436)*x[0]
arg[1]=(-0.256194155686)*x[1]
ref=sqrt((0.0368801626538))
else:
arg=Data(0,(3,),w)
arg[0]=(0.0452831341416)*x[0]
arg[1]=(-0.278640180656)*x[1]
arg[2]=(-0.607035001062)*x[2]
ref=sqrt((0.149394135009))
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunction_fromData_rank2(self):
"""
tests the L2-norm of a rank-2 Data object on Function
assumptions: self.domain supports integration on Function
"""
dim=self.domain.getDim()
w=Function(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 2),w)
arg[0,0]=(0.239448813076)*x[0]
arg[0,1]=(-0.529349708753)*x[1]
arg[1,0]=(-0.381557161859)*x[0]
arg[1,1]=(0.731658534249)*x[1]
arg[2,0]=(-0.813679062342)*x[0]
arg[2,1]=(0.528100089704)*x[1]
arg[3,0]=(-0.480867528161)*x[0]
arg[3,1]=(-0.167862206972)*x[1]
ref=sqrt((0.739610516051))
else:
arg=Data(0,(4, 3),w)
arg[0,0]=(0.951209543612)*x[0]
arg[0,1]=(0.735178735637)*x[1]
arg[0,2]=(0.13074673272)*x[2]
arg[1,0]=(0.412295676715)*x[0]
arg[1,1]=(-0.657695950153)*x[1]
arg[1,2]=(-0.900044734695)*x[2]
arg[2,0]=(0.741773926224)*x[0]
arg[2,1]=(0.0521828807406)*x[1]
arg[2,2]=(0.797728501985)*x[2]
arg[3,0]=(-0.61235554051)*x[0]
arg[3,1]=(0.456652747412)*x[1]
arg[3,2]=(-0.734303857319)*x[2]
ref=sqrt((1.72901661926))
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunction_fromData_rank3(self):
"""
tests the L2-norm of a rank-3 Data object on Function
assumptions: self.domain supports integration on Function
"""
dim=self.domain.getDim()
w=Function(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(6, 2, 2),w)
arg[0,0,0]=(0.449174971953)*x[0]
arg[0,0,1]=(-0.0109398763289)*x[1]
arg[0,1,0]=(-0.202497187709)*x[0]
arg[0,1,1]=(-0.12970879334)*x[1]
arg[1,0,0]=(-0.138092481719)*x[0]
arg[1,0,1]=(-0.528752200917)*x[1]
arg[1,1,0]=(-0.605919441662)*x[0]
arg[1,1,1]=(0.215615032334)*x[1]
arg[2,0,0]=(-0.998734541972)*x[0]
arg[2,0,1]=(0.725811901251)*x[1]
arg[2,1,0]=(-0.966536503228)*x[0]
arg[2,1,1]=(-0.528692217355)*x[1]
arg[3,0,0]=(0.757633851466)*x[0]
arg[3,0,1]=(-0.524660157377)*x[1]
arg[3,1,0]=(0.983733431677)*x[0]
arg[3,1,1]=(0.061279109546)*x[1]
arg[4,0,0]=(0.85914215305)*x[0]
arg[4,0,1]=(0.941714045112)*x[1]
arg[4,1,0]=(0.172235529555)*x[0]
arg[4,1,1]=(-0.108381454437)*x[1]
arg[5,0,0]=(-0.736373697727)*x[0]
arg[5,0,1]=(-0.599337929679)*x[1]
arg[5,1,0]=(0.661072686392)*x[0]
arg[5,1,1]=(-0.55107327409)*x[1]
ref=sqrt((2.94641432714))
else:
arg=Data(0,(6, 2, 3),w)
arg[0,0,0]=(0.69227064904)*x[0]
arg[0,0,1]=(-0.968336177418)*x[1]
arg[0,0,2]=(-0.634883146685)*x[2]
arg[0,1,0]=(-0.12640661422)*x[0]
arg[0,1,1]=(-0.637386589888)*x[1]
arg[0,1,2]=(0.26060859356)*x[2]
arg[1,0,0]=(-0.986864633297)*x[0]
arg[1,0,1]=(-0.441589142379)*x[1]
arg[1,0,2]=(-0.587865539582)*x[2]
arg[1,1,0]=(0.596052465031)*x[0]
arg[1,1,1]=(0.312732336652)*x[1]
arg[1,1,2]=(-0.514423945092)*x[2]
arg[2,0,0]=(-0.892391254794)*x[0]
arg[2,0,1]=(0.377920185756)*x[1]
arg[2,0,2]=(-0.120174597181)*x[2]
arg[2,1,0]=(-0.469951576468)*x[0]
arg[2,1,1]=(-0.788362249555)*x[1]
arg[2,1,2]=(0.745625354986)*x[2]
arg[3,0,0]=(0.542802498569)*x[0]
arg[3,0,1]=(-0.814541028706)*x[1]
arg[3,0,2]=(0.298410992196)*x[2]
arg[3,1,0]=(0.981190341206)*x[0]
arg[3,1,1]=(0.666421298608)*x[1]
arg[3,1,2]=(-0.369751722626)*x[2]
arg[4,0,0]=(-0.75379530597)*x[0]
arg[4,0,1]=(0.283357267139)*x[1]
arg[4,0,2]=(0.247787072861)*x[2]
arg[4,1,0]=(0.301766692533)*x[0]
arg[4,1,1]=(0.828183439224)*x[1]
arg[4,1,2]=(-0.580824060547)*x[2]
arg[5,0,0]=(0.637345610764)*x[0]
arg[5,0,1]=(-0.234409115997)*x[1]
arg[5,0,2]=(-0.192639300316)*x[2]
arg[5,1,0]=(-0.62609237162)*x[0]
arg[5,1,1]=(0.463404958552)*x[1]
arg[5,1,2]=(-0.547814448738)*x[2]
ref=sqrt((4.2381131862))
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunction_fromData_rank4(self):
"""
tests the L2-norm of a rank-4 Data object on Function
assumptions: self.domain supports integration on Function
"""
dim=self.domain.getDim()
w=Function(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 5, 3, 2),w)
arg[0,0,0,0]=(-0.232618585183)*x[0]
arg[0,0,0,1]=(0.39796117869)*x[1]
arg[0,0,1,0]=(-0.997336958262)*x[0]
arg[0,0,1,1]=(-0.351780915076)*x[1]
arg[0,0,2,0]=(-0.876764070136)*x[0]
arg[0,0,2,1]=(0.808730805817)*x[1]
arg[0,1,0,0]=(-0.197154744966)*x[0]
arg[0,1,0,1]=(0.416246096086)*x[1]
arg[0,1,1,0]=(0.708038457121)*x[0]
arg[0,1,1,1]=(-0.00954021503188)*x[1]
arg[0,1,2,0]=(-0.62630809425)*x[0]
arg[0,1,2,1]=(0.430228727912)*x[1]
arg[0,2,0,0]=(0.0306704747648)*x[0]
arg[0,2,0,1]=(-0.913877199453)*x[1]
arg[0,2,1,0]=(-0.697612800829)*x[0]
arg[0,2,1,1]=(-0.17996376822)*x[1]
arg[0,2,2,0]=(-0.304509578871)*x[0]
arg[0,2,2,1]=(-0.610556755811)*x[1]
arg[0,3,0,0]=(-0.452355972234)*x[0]
arg[0,3,0,1]=(-0.368921242518)*x[1]
arg[0,3,1,0]=(-0.478275554932)*x[0]
arg[0,3,1,1]=(0.257178549127)*x[1]
arg[0,3,2,0]=(0.530736487177)*x[0]
arg[0,3,2,1]=(-0.567126272463)*x[1]
arg[0,4,0,0]=(0.801519165938)*x[0]
arg[0,4,0,1]=(-0.509816703951)*x[1]
arg[0,4,1,0]=(-0.255412646934)*x[0]
arg[0,4,1,1]=(0.437540101896)*x[1]
arg[0,4,2,0]=(-0.815574969538)*x[0]
arg[0,4,2,1]=(-0.94691547137)*x[1]
arg[1,0,0,0]=(-0.732550722593)*x[0]
arg[1,0,0,1]=(0.515752381704)*x[1]
arg[1,0,1,0]=(-0.343590210899)*x[0]
arg[1,0,1,1]=(-0.0601907964915)*x[1]
arg[1,0,2,0]=(0.0199916154421)*x[0]
arg[1,0,2,1]=(-0.136927227821)*x[1]
arg[1,1,0,0]=(0.397994441702)*x[0]
arg[1,1,0,1]=(0.953873148948)*x[1]
arg[1,1,1,0]=(0.419416235967)*x[0]
arg[1,1,1,1]=(0.700998577193)*x[1]
arg[1,1,2,0]=(-0.497358799271)*x[0]
arg[1,1,2,1]=(0.0851768858379)*x[1]
arg[1,2,0,0]=(0.0936678875202)*x[0]
arg[1,2,0,1]=(0.869883786896)*x[1]
arg[1,2,1,0]=(0.582700123485)*x[0]
arg[1,2,1,1]=(-0.433381106794)*x[1]
arg[1,2,2,0]=(-0.527031777974)*x[0]
arg[1,2,2,1]=(0.105105137652)*x[1]
arg[1,3,0,0]=(-0.716750829134)*x[0]
arg[1,3,0,1]=(0.774519209008)*x[1]
arg[1,3,1,0]=(-0.568743372716)*x[0]
arg[1,3,1,1]=(0.794732483944)*x[1]
arg[1,3,2,0]=(0.246606002015)*x[0]
arg[1,3,2,1]=(-0.988869494994)*x[1]
arg[1,4,0,0]=(0.482379298083)*x[0]
arg[1,4,0,1]=(-0.386268387903)*x[1]
arg[1,4,1,0]=(0.137184889675)*x[0]
arg[1,4,1,1]=(-0.140520035321)*x[1]
arg[1,4,2,0]=(0.822755050415)*x[0]
arg[1,4,2,1]=(-0.815562139522)*x[1]
arg[2,0,0,0]=(-0.462891511962)*x[0]
arg[2,0,0,1]=(-0.122643411631)*x[1]
arg[2,0,1,0]=(-0.520861119962)*x[0]
arg[2,0,1,1]=(-0.881189618018)*x[1]
arg[2,0,2,0]=(-0.776157842774)*x[0]
arg[2,0,2,1]=(-0.12354053207)*x[1]
arg[2,1,0,0]=(0.395495230826)*x[0]
arg[2,1,0,1]=(-0.388106659423)*x[1]
arg[2,1,1,0]=(0.354250242834)*x[0]
arg[2,1,1,1]=(-0.666514210192)*x[1]
arg[2,1,2,0]=(0.951294655083)*x[0]
arg[2,1,2,1]=(0.074024416386)*x[1]
arg[2,2,0,0]=(0.335448485459)*x[0]
arg[2,2,0,1]=(-0.40988282528)*x[1]
arg[2,2,1,0]=(-0.805725968875)*x[0]
arg[2,2,1,1]=(-0.949883082118)*x[1]
arg[2,2,2,0]=(0.531549210683)*x[0]
arg[2,2,2,1]=(-0.398401016682)*x[1]
arg[2,3,0,0]=(-0.953963433205)*x[0]
arg[2,3,0,1]=(0.643431126406)*x[1]
arg[2,3,1,0]=(-0.167611998738)*x[0]
arg[2,3,1,1]=(0.226130056552)*x[1]
arg[2,3,2,0]=(0.0752687641131)*x[0]
arg[2,3,2,1]=(-0.115742756362)*x[1]
arg[2,4,0,0]=(0.579694491028)*x[0]
arg[2,4,0,1]=(-0.112005738299)*x[1]
arg[2,4,1,0]=(0.657291764224)*x[0]
arg[2,4,1,1]=(0.62671154177)*x[1]
arg[2,4,2,0]=(0.103695027944)*x[0]
arg[2,4,2,1]=(0.462828491544)*x[1]
arg[3,0,0,0]=(0.697692979998)*x[0]
arg[3,0,0,1]=(-0.123481859619)*x[1]
arg[3,0,1,0]=(-0.749745629459)*x[0]
arg[3,0,1,1]=(-0.541969524069)*x[1]
arg[3,0,2,0]=(0.819484470759)*x[0]
arg[3,0,2,1]=(-0.860592326469)*x[1]
arg[3,1,0,0]=(-0.716566084771)*x[0]
arg[3,1,0,1]=(-0.949235434827)*x[1]
arg[3,1,1,0]=(-0.826699498174)*x[0]
arg[3,1,1,1]=(-0.138511521583)*x[1]
arg[3,1,2,0]=(-0.951682890904)*x[0]
arg[3,1,2,1]=(0.413293316925)*x[1]
arg[3,2,0,0]=(0.909516836775)*x[0]
arg[3,2,0,1]=(-0.919989721277)*x[1]
arg[3,2,1,0]=(0.0994860369337)*x[0]
arg[3,2,1,1]=(-0.933647246623)*x[1]
arg[3,2,2,0]=(-0.759215183015)*x[0]
arg[3,2,2,1]=(0.0975793309286)*x[1]
arg[3,3,0,0]=(-0.130256739381)*x[0]
arg[3,3,0,1]=(-0.582280862311)*x[1]
arg[3,3,1,0]=(0.206970526192)*x[0]
arg[3,3,1,1]=(-0.8678322258)*x[1]
arg[3,3,2,0]=(0.133004501279)*x[0]
arg[3,3,2,1]=(0.802921710935)*x[1]
arg[3,4,0,0]=(-0.255737792764)*x[0]
arg[3,4,0,1]=(-0.34168114937)*x[1]
arg[3,4,1,0]=(-0.859309090399)*x[0]
arg[3,4,1,1]=(0.245043986435)*x[1]
arg[3,4,2,0]=(0.893062018695)*x[0]
arg[3,4,2,1]=(0.709422742588)*x[1]
ref=sqrt((13.7289280362))
else:
arg=Data(0,(4, 5, 3, 3),w)
arg[0,0,0,0]=(0.0312828390439)*x[0]
arg[0,0,0,1]=(-0.524970416212)*x[1]
arg[0,0,0,2]=(0.561865217554)*x[2]
arg[0,0,1,0]=(0.692457187384)*x[0]
arg[0,0,1,1]=(0.946967182157)*x[1]
arg[0,0,1,2]=(-0.863842279464)*x[2]
arg[0,0,2,0]=(0.993922921598)*x[0]
arg[0,0,2,1]=(0.322812768679)*x[1]
arg[0,0,2,2]=(0.901876132204)*x[2]
arg[0,1,0,0]=(0.967569979365)*x[0]
arg[0,1,0,1]=(0.840979131355)*x[1]
arg[0,1,0,2]=(0.0494811460856)*x[2]
arg[0,1,1,0]=(0.315178456102)*x[0]
arg[0,1,1,1]=(0.449848313024)*x[1]
arg[0,1,1,2]=(0.765887852886)*x[2]
arg[0,1,2,0]=(0.975541574352)*x[0]
arg[0,1,2,1]=(-0.797851290751)*x[1]
arg[0,1,2,2]=(0.628918775319)*x[2]
arg[0,2,0,0]=(0.685635794312)*x[0]
arg[0,2,0,1]=(0.10341799962)*x[1]
arg[0,2,0,2]=(-0.964822756043)*x[2]
arg[0,2,1,0]=(-0.56160368212)*x[0]
arg[0,2,1,1]=(0.676344298102)*x[1]
arg[0,2,1,2]=(-0.713924121843)*x[2]
arg[0,2,2,0]=(-0.276655136263)*x[0]
arg[0,2,2,1]=(0.336046973788)*x[1]
arg[0,2,2,2]=(-0.68789392396)*x[2]
arg[0,3,0,0]=(0.0172861311571)*x[0]
arg[0,3,0,1]=(-0.301075956456)*x[1]
arg[0,3,0,2]=(0.779442985415)*x[2]
arg[0,3,1,0]=(-0.517629576558)*x[0]
arg[0,3,1,1]=(0.584779586639)*x[1]
arg[0,3,1,2]=(-0.53266435436)*x[2]
arg[0,3,2,0]=(0.841533567102)*x[0]
arg[0,3,2,1]=(0.0458746415489)*x[1]
arg[0,3,2,2]=(0.921237870758)*x[2]
arg[0,4,0,0]=(0.0548343238805)*x[0]
arg[0,4,0,1]=(0.687022707412)*x[1]
arg[0,4,0,2]=(-0.319803609795)*x[2]
arg[0,4,1,0]=(0.409763007811)*x[0]
arg[0,4,1,1]=(0.165501957435)*x[1]
arg[0,4,1,2]=(0.116001692781)*x[2]
arg[0,4,2,0]=(-0.515571394238)*x[0]
arg[0,4,2,1]=(0.209467945147)*x[1]
arg[0,4,2,2]=(-0.344827191247)*x[2]
arg[1,0,0,0]=(0.57193838014)*x[0]
arg[1,0,0,1]=(-0.0880683799076)*x[1]
arg[1,0,0,2]=(0.956899617441)*x[2]
arg[1,0,1,0]=(-0.783689636357)*x[0]
arg[1,0,1,1]=(-0.25177506885)*x[1]
arg[1,0,1,2]=(-0.97074584634)*x[2]
arg[1,0,2,0]=(0.432543519806)*x[0]
arg[1,0,2,1]=(0.481003021954)*x[1]
arg[1,0,2,2]=(-0.0630751518268)*x[2]
arg[1,1,0,0]=(-0.65152446796)*x[0]
arg[1,1,0,1]=(-0.0323685084425)*x[1]
arg[1,1,0,2]=(-0.508674033909)*x[2]
arg[1,1,1,0]=(-0.533367818916)*x[0]
arg[1,1,1,1]=(0.310738340288)*x[1]
arg[1,1,1,2]=(0.694612234326)*x[2]
arg[1,1,2,0]=(-0.622052473032)*x[0]
arg[1,1,2,1]=(0.0498443793671)*x[1]
arg[1,1,2,2]=(0.61023707512)*x[2]
arg[1,2,0,0]=(0.0730267406859)*x[0]
arg[1,2,0,1]=(0.146909334607)*x[1]
arg[1,2,0,2]=(-0.641860284448)*x[2]
arg[1,2,1,0]=(0.917976589737)*x[0]
arg[1,2,1,1]=(0.50219672122)*x[1]
arg[1,2,1,2]=(0.634559579812)*x[2]
arg[1,2,2,0]=(0.0578772734534)*x[0]
arg[1,2,2,1]=(0.288730973517)*x[1]
arg[1,2,2,2]=(-0.0525978796154)*x[2]
arg[1,3,0,0]=(-0.926152433388)*x[0]
arg[1,3,0,1]=(0.0616647680855)*x[1]
arg[1,3,0,2]=(-0.875889217846)*x[2]
arg[1,3,1,0]=(-0.638931542845)*x[0]
arg[1,3,1,1]=(0.708848122964)*x[1]
arg[1,3,1,2]=(0.119066979792)*x[2]
arg[1,3,2,0]=(0.853716218591)*x[0]
arg[1,3,2,1]=(-0.92754322201)*x[1]
arg[1,3,2,2]=(-0.671530626265)*x[2]
arg[1,4,0,0]=(0.337424536231)*x[0]
arg[1,4,0,1]=(0.335704451719)*x[1]
arg[1,4,0,2]=(-0.484565969466)*x[2]
arg[1,4,1,0]=(-0.855476192012)*x[0]
arg[1,4,1,1]=(0.405674615553)*x[1]
arg[1,4,1,2]=(0.728310771323)*x[2]
arg[1,4,2,0]=(0.363651308265)*x[0]
arg[1,4,2,1]=(0.174460594531)*x[1]
arg[1,4,2,2]=(-0.0418244838617)*x[2]
arg[2,0,0,0]=(-0.531341992511)*x[0]
arg[2,0,0,1]=(0.584996796272)*x[1]
arg[2,0,0,2]=(-0.752430968716)*x[2]
arg[2,0,1,0]=(-0.341989849747)*x[0]
arg[2,0,1,1]=(0.153572646953)*x[1]
arg[2,0,1,2]=(-0.197130051737)*x[2]
arg[2,0,2,0]=(-0.338082424082)*x[0]
arg[2,0,2,1]=(0.000173657394772)*x[1]
arg[2,0,2,2]=(0.365272907692)*x[2]
arg[2,1,0,0]=(0.904304126564)*x[0]
arg[2,1,0,1]=(0.161252368484)*x[1]
arg[2,1,0,2]=(0.246854092422)*x[2]
arg[2,1,1,0]=(-0.299880647529)*x[0]
arg[2,1,1,1]=(-0.566917528608)*x[1]
arg[2,1,1,2]=(0.243183337285)*x[2]
arg[2,1,2,0]=(0.437406011474)*x[0]
arg[2,1,2,1]=(0.727447394053)*x[1]
arg[2,1,2,2]=(0.380752950664)*x[2]
arg[2,2,0,0]=(0.172292846911)*x[0]
arg[2,2,0,1]=(0.334201791643)*x[1]
arg[2,2,0,2]=(0.739989926962)*x[2]
arg[2,2,1,0]=(-0.0669843715042)*x[0]
arg[2,2,1,1]=(-0.540497281635)*x[1]
arg[2,2,1,2]=(-0.744217027088)*x[2]
arg[2,2,2,0]=(-0.287295952259)*x[0]
arg[2,2,2,1]=(-0.512411849183)*x[1]
arg[2,2,2,2]=(0.953107417666)*x[2]
arg[2,3,0,0]=(0.998168116695)*x[0]
arg[2,3,0,1]=(0.960065646359)*x[1]
arg[2,3,0,2]=(0.110048258832)*x[2]
arg[2,3,1,0]=(-0.477271134724)*x[0]
arg[2,3,1,1]=(0.707182612251)*x[1]
arg[2,3,1,2]=(0.285500891755)*x[2]
arg[2,3,2,0]=(-0.863497506661)*x[0]
arg[2,3,2,1]=(-0.293917669879)*x[1]
arg[2,3,2,2]=(-0.403384244295)*x[2]
arg[2,4,0,0]=(0.848455277702)*x[0]
arg[2,4,0,1]=(-0.530101455578)*x[1]
arg[2,4,0,2]=(0.33887313048)*x[2]
arg[2,4,1,0]=(-0.195313538124)*x[0]
arg[2,4,1,1]=(-0.62754572008)*x[1]
arg[2,4,1,2]=(-0.385132960582)*x[2]
arg[2,4,2,0]=(0.240048012886)*x[0]
arg[2,4,2,1]=(0.900766252969)*x[1]
arg[2,4,2,2]=(0.669620533505)*x[2]
arg[3,0,0,0]=(0.375766827301)*x[0]
arg[3,0,0,1]=(0.705484960308)*x[1]
arg[3,0,0,2]=(0.440931516034)*x[2]
arg[3,0,1,0]=(-0.44724403177)*x[0]
arg[3,0,1,1]=(-0.31558249626)*x[1]
arg[3,0,1,2]=(-0.00419436365172)*x[2]
arg[3,0,2,0]=(0.750599752032)*x[0]
arg[3,0,2,1]=(0.367649951795)*x[1]
arg[3,0,2,2]=(0.0488013073654)*x[2]
arg[3,1,0,0]=(-0.992890068274)*x[0]
arg[3,1,0,1]=(0.671447745511)*x[1]
arg[3,1,0,2]=(0.85613331404)*x[2]
arg[3,1,1,0]=(-0.46064764242)*x[0]
arg[3,1,1,1]=(0.48138877715)*x[1]
arg[3,1,1,2]=(0.396741761803)*x[2]
arg[3,1,2,0]=(-0.879391967543)*x[0]
arg[3,1,2,1]=(-0.44039462138)*x[1]
arg[3,1,2,2]=(0.0330511573872)*x[2]
arg[3,2,0,0]=(-0.367413701648)*x[0]
arg[3,2,0,1]=(0.0359818324891)*x[1]
arg[3,2,0,2]=(-0.307532667032)*x[2]
arg[3,2,1,0]=(0.334663597166)*x[0]
arg[3,2,1,1]=(0.541941978066)*x[1]
arg[3,2,1,2]=(-0.609184079318)*x[2]
arg[3,2,2,0]=(0.359349239826)*x[0]
arg[3,2,2,1]=(0.0419272305685)*x[1]
arg[3,2,2,2]=(0.557189794296)*x[2]
arg[3,3,0,0]=(-0.85864165554)*x[0]
arg[3,3,0,1]=(-0.185411404213)*x[1]
arg[3,3,0,2]=(0.254294865253)*x[2]
arg[3,3,1,0]=(0.870362177541)*x[0]
arg[3,3,1,1]=(-0.439688612864)*x[1]
arg[3,3,1,2]=(0.26006729357)*x[2]
arg[3,3,2,0]=(-0.0724034754175)*x[0]
arg[3,3,2,1]=(0.444871564246)*x[1]
arg[3,3,2,2]=(0.485634530531)*x[2]
arg[3,4,0,0]=(-0.744756961758)*x[0]
arg[3,4,0,1]=(0.429761406102)*x[1]
arg[3,4,0,2]=(-0.584963735834)*x[2]
arg[3,4,1,0]=(0.684578379159)*x[0]
arg[3,4,1,1]=(0.949460132601)*x[1]
arg[3,4,1,2]=(-0.592179909559)*x[2]
arg[3,4,2,0]=(0.707154437797)*x[0]
arg[3,4,2,1]=(0.619200407063)*x[1]
arg[3,4,2,2]=(-0.338547165)*x[2]
ref=sqrt((19.2170638478))
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunction_fromData_rank0(self):
"""
tests the L2-norm of a rank-0 Data object on ReducedFunction
assumptions: self.domain supports integration on ReducedFunction
"""
dim=self.domain.getDim()
w=ReducedFunction(self.domain)
x=w.getX()
if dim==2:
arg=1.*sqrt(x[0])
ref=sqrt(0.5)
else:
arg=1.*sqrt(x[0])
ref=sqrt(0.5)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunction_fromData_rank1(self):
"""
tests the L2-norm of a rank-1 Data object on ReducedFunction
assumptions: self.domain supports integration on ReducedFunction
"""
dim=self.domain.getDim()
w=ReducedFunction(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(2,),w)
arg[0]=1.*sqrt(x[0])
arg[1]=2.*sqrt(x[1])
ref=sqrt(2.5)
else:
arg=Data(0,(3,),w)
arg[0]=1.*sqrt(x[0])
arg[1]=2.*sqrt(x[1])
arg[2]=3.*sqrt(x[2])
ref=sqrt(7.)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunction_fromData_rank2(self):
"""
tests the L2-norm of a rank-2 Data object on ReducedFunction
assumptions: self.domain supports integration on ReducedFunction
"""
dim=self.domain.getDim()
w=ReducedFunction(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 2),w)
arg[0,0]=11.*sqrt(x[0])
arg[0,1]=1.*sqrt(x[1])
arg[1,0]=10.*sqrt(x[0])
arg[1,1]=11.*sqrt(x[1])
arg[2,0]=20.*sqrt(x[0])
arg[2,1]=21.*sqrt(x[1])
arg[3,0]=30.*sqrt(x[0])
arg[3,1]=31.*sqrt(x[1])
ref=sqrt(1522.5)
else:
arg=Data(0,(4, 3),w)
arg[0,0]=11.*sqrt(x[0])
arg[0,1]=1.*sqrt(x[1])
arg[0,2]=2.*sqrt(x[2])
arg[1,0]=10.*sqrt(x[0])
arg[1,1]=11.*sqrt(x[1])
arg[1,2]=12.*sqrt(x[2])
arg[2,0]=20.*sqrt(x[0])
arg[2,1]=21.*sqrt(x[1])
arg[2,2]=22.*sqrt(x[2])
arg[3,0]=30.*sqrt(x[0])
arg[3,1]=31.*sqrt(x[1])
arg[3,2]=32.*sqrt(x[2])
ref=sqrt(2350.5)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
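The reference value in the 2-D branch above follows from ∫₀¹ (a√t)² dt = a²/2 per component, so ref² is half the sum of the squared coefficients. A standalone check of the 2-D value (plain Python, no escript dependency):

```python
# Hedged sketch: reproduce ref = sqrt(1522.5) from the 2-D branch above.
# Each component a*sqrt(x[i]) contributes a**2 * integral_0^1 t dt = a**2/2
# to the squared L2-norm over the unit square.
coeffs = [11., 1., 10., 11., 20., 21., 30., 31.]
ref_sq = sum(a * a for a in coeffs) / 2.0
assert ref_sq == 1522.5
```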
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunction_fromData_rank3(self):
"""
tests the L2-norm of a rank-3 Data object on ReducedFunction
assumptions: self.domain supports integration on ReducedFunction
"""
dim=self.domain.getDim()
w=ReducedFunction(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(6, 2, 2),w)
arg[0,0,0]=(0.449174971953)*sqrt(x[0])
arg[0,0,1]=(-0.0109398763289)*sqrt(x[1])
arg[0,1,0]=(-0.202497187709)*sqrt(x[0])
arg[0,1,1]=(-0.12970879334)*sqrt(x[1])
arg[1,0,0]=(-0.138092481719)*sqrt(x[0])
arg[1,0,1]=(-0.528752200917)*sqrt(x[1])
arg[1,1,0]=(-0.605919441662)*sqrt(x[0])
arg[1,1,1]=(0.215615032334)*sqrt(x[1])
arg[2,0,0]=(-0.998734541972)*sqrt(x[0])
arg[2,0,1]=(0.725811901251)*sqrt(x[1])
arg[2,1,0]=(-0.966536503228)*sqrt(x[0])
arg[2,1,1]=(-0.528692217355)*sqrt(x[1])
arg[3,0,0]=(0.757633851466)*sqrt(x[0])
arg[3,0,1]=(-0.524660157377)*sqrt(x[1])
arg[3,1,0]=(0.983733431677)*sqrt(x[0])
arg[3,1,1]=(0.061279109546)*sqrt(x[1])
arg[4,0,0]=(0.85914215305)*sqrt(x[0])
arg[4,0,1]=(0.941714045112)*sqrt(x[1])
arg[4,1,0]=(0.172235529555)*sqrt(x[0])
arg[4,1,1]=(-0.108381454437)*sqrt(x[1])
arg[5,0,0]=(-0.736373697727)*sqrt(x[0])
arg[5,0,1]=(-0.599337929679)*sqrt(x[1])
arg[5,1,0]=(0.661072686392)*sqrt(x[0])
arg[5,1,1]=(-0.55107327409)*sqrt(x[1])
ref=sqrt(4.4196214907099591)
else:
arg=Data(0,(6, 2, 3),w)
arg[0,0,0]=(0.69227064904)*sqrt(x[0])
arg[0,0,1]=(-0.968336177418)*sqrt(x[1])
arg[0,0,2]=(-0.634883146685)*sqrt(x[2])
arg[0,1,0]=(-0.12640661422)*sqrt(x[0])
arg[0,1,1]=(-0.637386589888)*sqrt(x[1])
arg[0,1,2]=(0.26060859356)*sqrt(x[2])
arg[1,0,0]=(-0.986864633297)*sqrt(x[0])
arg[1,0,1]=(-0.441589142379)*sqrt(x[1])
arg[1,0,2]=(-0.587865539582)*sqrt(x[2])
arg[1,1,0]=(0.596052465031)*sqrt(x[0])
arg[1,1,1]=(0.312732336652)*sqrt(x[1])
arg[1,1,2]=(-0.514423945092)*sqrt(x[2])
arg[2,0,0]=(-0.892391254794)*sqrt(x[0])
arg[2,0,1]=(0.377920185756)*sqrt(x[1])
arg[2,0,2]=(-0.120174597181)*sqrt(x[2])
arg[2,1,0]=(-0.469951576468)*sqrt(x[0])
arg[2,1,1]=(-0.788362249555)*sqrt(x[1])
arg[2,1,2]=(0.745625354986)*sqrt(x[2])
arg[3,0,0]=(0.542802498569)*sqrt(x[0])
arg[3,0,1]=(-0.814541028706)*sqrt(x[1])
arg[3,0,2]=(0.298410992196)*sqrt(x[2])
arg[3,1,0]=(0.981190341206)*sqrt(x[0])
arg[3,1,1]=(0.666421298608)*sqrt(x[1])
arg[3,1,2]=(-0.369751722626)*sqrt(x[2])
arg[4,0,0]=(-0.75379530597)*sqrt(x[0])
arg[4,0,1]=(0.283357267139)*sqrt(x[1])
arg[4,0,2]=(0.247787072861)*sqrt(x[2])
arg[4,1,0]=(0.301766692533)*sqrt(x[0])
arg[4,1,1]=(0.828183439224)*sqrt(x[1])
arg[4,1,2]=(-0.580824060547)*sqrt(x[2])
arg[5,0,0]=(0.637345610764)*sqrt(x[0])
arg[5,0,1]=(-0.234409115997)*sqrt(x[1])
arg[5,0,2]=(-0.192639300316)*sqrt(x[2])
arg[5,1,0]=(-0.62609237162)*sqrt(x[0])
arg[5,1,1]=(0.463404958552)*sqrt(x[1])
arg[5,1,2]=(-0.547814448738)*sqrt(x[2])
ref=sqrt(6.3571697792950923)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunction_fromData_rank4(self):
"""
tests the L2-norm of a rank-4 Data object on ReducedFunction
assumptions: self.domain supports integration on ReducedFunction
"""
dim=self.domain.getDim()
w=ReducedFunction(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 5, 3, 2),w)
arg[0,0,0,0]=(-0.232618585183)*sqrt(x[0])
arg[0,0,0,1]=(0.39796117869)*sqrt(x[1])
arg[0,0,1,0]=(-0.997336958262)*sqrt(x[0])
arg[0,0,1,1]=(-0.351780915076)*sqrt(x[1])
arg[0,0,2,0]=(-0.876764070136)*sqrt(x[0])
arg[0,0,2,1]=(0.808730805817)*sqrt(x[1])
arg[0,1,0,0]=(-0.197154744966)*sqrt(x[0])
arg[0,1,0,1]=(0.416246096086)*sqrt(x[1])
arg[0,1,1,0]=(0.708038457121)*sqrt(x[0])
arg[0,1,1,1]=(-0.00954021503188)*sqrt(x[1])
arg[0,1,2,0]=(-0.62630809425)*sqrt(x[0])
arg[0,1,2,1]=(0.430228727912)*sqrt(x[1])
arg[0,2,0,0]=(0.0306704747648)*sqrt(x[0])
arg[0,2,0,1]=(-0.913877199453)*sqrt(x[1])
arg[0,2,1,0]=(-0.697612800829)*sqrt(x[0])
arg[0,2,1,1]=(-0.17996376822)*sqrt(x[1])
arg[0,2,2,0]=(-0.304509578871)*sqrt(x[0])
arg[0,2,2,1]=(-0.610556755811)*sqrt(x[1])
arg[0,3,0,0]=(-0.452355972234)*sqrt(x[0])
arg[0,3,0,1]=(-0.368921242518)*sqrt(x[1])
arg[0,3,1,0]=(-0.478275554932)*sqrt(x[0])
arg[0,3,1,1]=(0.257178549127)*sqrt(x[1])
arg[0,3,2,0]=(0.530736487177)*sqrt(x[0])
arg[0,3,2,1]=(-0.567126272463)*sqrt(x[1])
arg[0,4,0,0]=(0.801519165938)*sqrt(x[0])
arg[0,4,0,1]=(-0.509816703951)*sqrt(x[1])
arg[0,4,1,0]=(-0.255412646934)*sqrt(x[0])
arg[0,4,1,1]=(0.437540101896)*sqrt(x[1])
arg[0,4,2,0]=(-0.815574969538)*sqrt(x[0])
arg[0,4,2,1]=(-0.94691547137)*sqrt(x[1])
arg[1,0,0,0]=(-0.732550722593)*sqrt(x[0])
arg[1,0,0,1]=(0.515752381704)*sqrt(x[1])
arg[1,0,1,0]=(-0.343590210899)*sqrt(x[0])
arg[1,0,1,1]=(-0.0601907964915)*sqrt(x[1])
arg[1,0,2,0]=(0.0199916154421)*sqrt(x[0])
arg[1,0,2,1]=(-0.136927227821)*sqrt(x[1])
arg[1,1,0,0]=(0.397994441702)*sqrt(x[0])
arg[1,1,0,1]=(0.953873148948)*sqrt(x[1])
arg[1,1,1,0]=(0.419416235967)*sqrt(x[0])
arg[1,1,1,1]=(0.700998577193)*sqrt(x[1])
arg[1,1,2,0]=(-0.497358799271)*sqrt(x[0])
arg[1,1,2,1]=(0.0851768858379)*sqrt(x[1])
arg[1,2,0,0]=(0.0936678875202)*sqrt(x[0])
arg[1,2,0,1]=(0.869883786896)*sqrt(x[1])
arg[1,2,1,0]=(0.582700123485)*sqrt(x[0])
arg[1,2,1,1]=(-0.433381106794)*sqrt(x[1])
arg[1,2,2,0]=(-0.527031777974)*sqrt(x[0])
arg[1,2,2,1]=(0.105105137652)*sqrt(x[1])
arg[1,3,0,0]=(-0.716750829134)*sqrt(x[0])
arg[1,3,0,1]=(0.774519209008)*sqrt(x[1])
arg[1,3,1,0]=(-0.568743372716)*sqrt(x[0])
arg[1,3,1,1]=(0.794732483944)*sqrt(x[1])
arg[1,3,2,0]=(0.246606002015)*sqrt(x[0])
arg[1,3,2,1]=(-0.988869494994)*sqrt(x[1])
arg[1,4,0,0]=(0.482379298083)*sqrt(x[0])
arg[1,4,0,1]=(-0.386268387903)*sqrt(x[1])
arg[1,4,1,0]=(0.137184889675)*sqrt(x[0])
arg[1,4,1,1]=(-0.140520035321)*sqrt(x[1])
arg[1,4,2,0]=(0.822755050415)*sqrt(x[0])
arg[1,4,2,1]=(-0.815562139522)*sqrt(x[1])
arg[2,0,0,0]=(-0.462891511962)*sqrt(x[0])
arg[2,0,0,1]=(-0.122643411631)*sqrt(x[1])
arg[2,0,1,0]=(-0.520861119962)*sqrt(x[0])
arg[2,0,1,1]=(-0.881189618018)*sqrt(x[1])
arg[2,0,2,0]=(-0.776157842774)*sqrt(x[0])
arg[2,0,2,1]=(-0.12354053207)*sqrt(x[1])
arg[2,1,0,0]=(0.395495230826)*sqrt(x[0])
arg[2,1,0,1]=(-0.388106659423)*sqrt(x[1])
arg[2,1,1,0]=(0.354250242834)*sqrt(x[0])
arg[2,1,1,1]=(-0.666514210192)*sqrt(x[1])
arg[2,1,2,0]=(0.951294655083)*sqrt(x[0])
arg[2,1,2,1]=(0.074024416386)*sqrt(x[1])
arg[2,2,0,0]=(0.335448485459)*sqrt(x[0])
arg[2,2,0,1]=(-0.40988282528)*sqrt(x[1])
arg[2,2,1,0]=(-0.805725968875)*sqrt(x[0])
arg[2,2,1,1]=(-0.949883082118)*sqrt(x[1])
arg[2,2,2,0]=(0.531549210683)*sqrt(x[0])
arg[2,2,2,1]=(-0.398401016682)*sqrt(x[1])
arg[2,3,0,0]=(-0.953963433205)*sqrt(x[0])
arg[2,3,0,1]=(0.643431126406)*sqrt(x[1])
arg[2,3,1,0]=(-0.167611998738)*sqrt(x[0])
arg[2,3,1,1]=(0.226130056552)*sqrt(x[1])
arg[2,3,2,0]=(0.0752687641131)*sqrt(x[0])
arg[2,3,2,1]=(-0.115742756362)*sqrt(x[1])
arg[2,4,0,0]=(0.579694491028)*sqrt(x[0])
arg[2,4,0,1]=(-0.112005738299)*sqrt(x[1])
arg[2,4,1,0]=(0.657291764224)*sqrt(x[0])
arg[2,4,1,1]=(0.62671154177)*sqrt(x[1])
arg[2,4,2,0]=(0.103695027944)*sqrt(x[0])
arg[2,4,2,1]=(0.462828491544)*sqrt(x[1])
arg[3,0,0,0]=(0.697692979998)*sqrt(x[0])
arg[3,0,0,1]=(-0.123481859619)*sqrt(x[1])
arg[3,0,1,0]=(-0.749745629459)*sqrt(x[0])
arg[3,0,1,1]=(-0.541969524069)*sqrt(x[1])
arg[3,0,2,0]=(0.819484470759)*sqrt(x[0])
arg[3,0,2,1]=(-0.860592326469)*sqrt(x[1])
arg[3,1,0,0]=(-0.716566084771)*sqrt(x[0])
arg[3,1,0,1]=(-0.949235434827)*sqrt(x[1])
arg[3,1,1,0]=(-0.826699498174)*sqrt(x[0])
arg[3,1,1,1]=(-0.138511521583)*sqrt(x[1])
arg[3,1,2,0]=(-0.951682890904)*sqrt(x[0])
arg[3,1,2,1]=(0.413293316925)*sqrt(x[1])
arg[3,2,0,0]=(0.909516836775)*sqrt(x[0])
arg[3,2,0,1]=(-0.919989721277)*sqrt(x[1])
arg[3,2,1,0]=(0.0994860369337)*sqrt(x[0])
arg[3,2,1,1]=(-0.933647246623)*sqrt(x[1])
arg[3,2,2,0]=(-0.759215183015)*sqrt(x[0])
arg[3,2,2,1]=(0.0975793309286)*sqrt(x[1])
arg[3,3,0,0]=(-0.130256739381)*sqrt(x[0])
arg[3,3,0,1]=(-0.582280862311)*sqrt(x[1])
arg[3,3,1,0]=(0.206970526192)*sqrt(x[0])
arg[3,3,1,1]=(-0.8678322258)*sqrt(x[1])
arg[3,3,2,0]=(0.133004501279)*sqrt(x[0])
arg[3,3,2,1]=(0.802921710935)*sqrt(x[1])
arg[3,4,0,0]=(-0.255737792764)*sqrt(x[0])
arg[3,4,0,1]=(-0.34168114937)*sqrt(x[1])
arg[3,4,1,0]=(-0.859309090399)*sqrt(x[0])
arg[3,4,1,1]=(0.245043986435)*sqrt(x[1])
arg[3,4,2,0]=(0.893062018695)*sqrt(x[0])
arg[3,4,2,1]=(0.709422742588)*sqrt(x[1])
ref=sqrt(20.5933920543)
else:
arg=Data(0,(4, 5, 3, 3),w)
arg[0,0,0,0]=(0.0312828390439)*sqrt(x[0])
arg[0,0,0,1]=(-0.524970416212)*sqrt(x[1])
arg[0,0,0,2]=(0.561865217554)*sqrt(x[2])
arg[0,0,1,0]=(0.692457187384)*sqrt(x[0])
arg[0,0,1,1]=(0.946967182157)*sqrt(x[1])
arg[0,0,1,2]=(-0.863842279464)*sqrt(x[2])
arg[0,0,2,0]=(0.993922921598)*sqrt(x[0])
arg[0,0,2,1]=(0.322812768679)*sqrt(x[1])
arg[0,0,2,2]=(0.901876132204)*sqrt(x[2])
arg[0,1,0,0]=(0.967569979365)*sqrt(x[0])
arg[0,1,0,1]=(0.840979131355)*sqrt(x[1])
arg[0,1,0,2]=(0.0494811460856)*sqrt(x[2])
arg[0,1,1,0]=(0.315178456102)*sqrt(x[0])
arg[0,1,1,1]=(0.449848313024)*sqrt(x[1])
arg[0,1,1,2]=(0.765887852886)*sqrt(x[2])
arg[0,1,2,0]=(0.975541574352)*sqrt(x[0])
arg[0,1,2,1]=(-0.797851290751)*sqrt(x[1])
arg[0,1,2,2]=(0.628918775319)*sqrt(x[2])
arg[0,2,0,0]=(0.685635794312)*sqrt(x[0])
arg[0,2,0,1]=(0.10341799962)*sqrt(x[1])
arg[0,2,0,2]=(-0.964822756043)*sqrt(x[2])
arg[0,2,1,0]=(-0.56160368212)*sqrt(x[0])
arg[0,2,1,1]=(0.676344298102)*sqrt(x[1])
arg[0,2,1,2]=(-0.713924121843)*sqrt(x[2])
arg[0,2,2,0]=(-0.276655136263)*sqrt(x[0])
arg[0,2,2,1]=(0.336046973788)*sqrt(x[1])
arg[0,2,2,2]=(-0.68789392396)*sqrt(x[2])
arg[0,3,0,0]=(0.0172861311571)*sqrt(x[0])
arg[0,3,0,1]=(-0.301075956456)*sqrt(x[1])
arg[0,3,0,2]=(0.779442985415)*sqrt(x[2])
arg[0,3,1,0]=(-0.517629576558)*sqrt(x[0])
arg[0,3,1,1]=(0.584779586639)*sqrt(x[1])
arg[0,3,1,2]=(-0.53266435436)*sqrt(x[2])
arg[0,3,2,0]=(0.841533567102)*sqrt(x[0])
arg[0,3,2,1]=(0.0458746415489)*sqrt(x[1])
arg[0,3,2,2]=(0.921237870758)*sqrt(x[2])
arg[0,4,0,0]=(0.0548343238805)*sqrt(x[0])
arg[0,4,0,1]=(0.687022707412)*sqrt(x[1])
arg[0,4,0,2]=(-0.319803609795)*sqrt(x[2])
arg[0,4,1,0]=(0.409763007811)*sqrt(x[0])
arg[0,4,1,1]=(0.165501957435)*sqrt(x[1])
arg[0,4,1,2]=(0.116001692781)*sqrt(x[2])
arg[0,4,2,0]=(-0.515571394238)*sqrt(x[0])
arg[0,4,2,1]=(0.209467945147)*sqrt(x[1])
arg[0,4,2,2]=(-0.344827191247)*sqrt(x[2])
arg[1,0,0,0]=(0.57193838014)*sqrt(x[0])
arg[1,0,0,1]=(-0.0880683799076)*sqrt(x[1])
arg[1,0,0,2]=(0.956899617441)*sqrt(x[2])
arg[1,0,1,0]=(-0.783689636357)*sqrt(x[0])
arg[1,0,1,1]=(-0.25177506885)*sqrt(x[1])
arg[1,0,1,2]=(-0.97074584634)*sqrt(x[2])
arg[1,0,2,0]=(0.432543519806)*sqrt(x[0])
arg[1,0,2,1]=(0.481003021954)*sqrt(x[1])
arg[1,0,2,2]=(-0.0630751518268)*sqrt(x[2])
arg[1,1,0,0]=(-0.65152446796)*sqrt(x[0])
arg[1,1,0,1]=(-0.0323685084425)*sqrt(x[1])
arg[1,1,0,2]=(-0.508674033909)*sqrt(x[2])
arg[1,1,1,0]=(-0.533367818916)*sqrt(x[0])
arg[1,1,1,1]=(0.310738340288)*sqrt(x[1])
arg[1,1,1,2]=(0.694612234326)*sqrt(x[2])
arg[1,1,2,0]=(-0.622052473032)*sqrt(x[0])
arg[1,1,2,1]=(0.0498443793671)*sqrt(x[1])
arg[1,1,2,2]=(0.61023707512)*sqrt(x[2])
arg[1,2,0,0]=(0.0730267406859)*sqrt(x[0])
arg[1,2,0,1]=(0.146909334607)*sqrt(x[1])
arg[1,2,0,2]=(-0.641860284448)*sqrt(x[2])
arg[1,2,1,0]=(0.917976589737)*sqrt(x[0])
arg[1,2,1,1]=(0.50219672122)*sqrt(x[1])
arg[1,2,1,2]=(0.634559579812)*sqrt(x[2])
arg[1,2,2,0]=(0.0578772734534)*sqrt(x[0])
arg[1,2,2,1]=(0.288730973517)*sqrt(x[1])
arg[1,2,2,2]=(-0.0525978796154)*sqrt(x[2])
arg[1,3,0,0]=(-0.926152433388)*sqrt(x[0])
arg[1,3,0,1]=(0.0616647680855)*sqrt(x[1])
arg[1,3,0,2]=(-0.875889217846)*sqrt(x[2])
arg[1,3,1,0]=(-0.638931542845)*sqrt(x[0])
arg[1,3,1,1]=(0.708848122964)*sqrt(x[1])
arg[1,3,1,2]=(0.119066979792)*sqrt(x[2])
arg[1,3,2,0]=(0.853716218591)*sqrt(x[0])
arg[1,3,2,1]=(-0.92754322201)*sqrt(x[1])
arg[1,3,2,2]=(-0.671530626265)*sqrt(x[2])
arg[1,4,0,0]=(0.337424536231)*sqrt(x[0])
arg[1,4,0,1]=(0.335704451719)*sqrt(x[1])
arg[1,4,0,2]=(-0.484565969466)*sqrt(x[2])
arg[1,4,1,0]=(-0.855476192012)*sqrt(x[0])
arg[1,4,1,1]=(0.405674615553)*sqrt(x[1])
arg[1,4,1,2]=(0.728310771323)*sqrt(x[2])
arg[1,4,2,0]=(0.363651308265)*sqrt(x[0])
arg[1,4,2,1]=(0.174460594531)*sqrt(x[1])
arg[1,4,2,2]=(-0.0418244838617)*sqrt(x[2])
arg[2,0,0,0]=(-0.531341992511)*sqrt(x[0])
arg[2,0,0,1]=(0.584996796272)*sqrt(x[1])
arg[2,0,0,2]=(-0.752430968716)*sqrt(x[2])
arg[2,0,1,0]=(-0.341989849747)*sqrt(x[0])
arg[2,0,1,1]=(0.153572646953)*sqrt(x[1])
arg[2,0,1,2]=(-0.197130051737)*sqrt(x[2])
arg[2,0,2,0]=(-0.338082424082)*sqrt(x[0])
arg[2,0,2,1]=(0.000173657394772)*sqrt(x[1])
arg[2,0,2,2]=(0.365272907692)*sqrt(x[2])
arg[2,1,0,0]=(0.904304126564)*sqrt(x[0])
arg[2,1,0,1]=(0.161252368484)*sqrt(x[1])
arg[2,1,0,2]=(0.246854092422)*sqrt(x[2])
arg[2,1,1,0]=(-0.299880647529)*sqrt(x[0])
arg[2,1,1,1]=(-0.566917528608)*sqrt(x[1])
arg[2,1,1,2]=(0.243183337285)*sqrt(x[2])
arg[2,1,2,0]=(0.437406011474)*sqrt(x[0])
arg[2,1,2,1]=(0.727447394053)*sqrt(x[1])
arg[2,1,2,2]=(0.380752950664)*sqrt(x[2])
arg[2,2,0,0]=(0.172292846911)*sqrt(x[0])
arg[2,2,0,1]=(0.334201791643)*sqrt(x[1])
arg[2,2,0,2]=(0.739989926962)*sqrt(x[2])
arg[2,2,1,0]=(-0.0669843715042)*sqrt(x[0])
arg[2,2,1,1]=(-0.540497281635)*sqrt(x[1])
arg[2,2,1,2]=(-0.744217027088)*sqrt(x[2])
arg[2,2,2,0]=(-0.287295952259)*sqrt(x[0])
arg[2,2,2,1]=(-0.512411849183)*sqrt(x[1])
arg[2,2,2,2]=(0.953107417666)*sqrt(x[2])
arg[2,3,0,0]=(0.998168116695)*sqrt(x[0])
arg[2,3,0,1]=(0.960065646359)*sqrt(x[1])
arg[2,3,0,2]=(0.110048258832)*sqrt(x[2])
arg[2,3,1,0]=(-0.477271134724)*sqrt(x[0])
arg[2,3,1,1]=(0.707182612251)*sqrt(x[1])
arg[2,3,1,2]=(0.285500891755)*sqrt(x[2])
arg[2,3,2,0]=(-0.863497506661)*sqrt(x[0])
arg[2,3,2,1]=(-0.293917669879)*sqrt(x[1])
arg[2,3,2,2]=(-0.403384244295)*sqrt(x[2])
arg[2,4,0,0]=(0.848455277702)*sqrt(x[0])
arg[2,4,0,1]=(-0.530101455578)*sqrt(x[1])
arg[2,4,0,2]=(0.33887313048)*sqrt(x[2])
arg[2,4,1,0]=(-0.195313538124)*sqrt(x[0])
arg[2,4,1,1]=(-0.62754572008)*sqrt(x[1])
arg[2,4,1,2]=(-0.385132960582)*sqrt(x[2])
arg[2,4,2,0]=(0.240048012886)*sqrt(x[0])
arg[2,4,2,1]=(0.900766252969)*sqrt(x[1])
arg[2,4,2,2]=(0.669620533505)*sqrt(x[2])
arg[3,0,0,0]=(0.375766827301)*sqrt(x[0])
arg[3,0,0,1]=(0.705484960308)*sqrt(x[1])
arg[3,0,0,2]=(0.440931516034)*sqrt(x[2])
arg[3,0,1,0]=(-0.44724403177)*sqrt(x[0])
arg[3,0,1,1]=(-0.31558249626)*sqrt(x[1])
arg[3,0,1,2]=(-0.00419436365172)*sqrt(x[2])
arg[3,0,2,0]=(0.750599752032)*sqrt(x[0])
arg[3,0,2,1]=(0.367649951795)*sqrt(x[1])
arg[3,0,2,2]=(0.0488013073654)*sqrt(x[2])
arg[3,1,0,0]=(-0.992890068274)*sqrt(x[0])
arg[3,1,0,1]=(0.671447745511)*sqrt(x[1])
arg[3,1,0,2]=(0.85613331404)*sqrt(x[2])
arg[3,1,1,0]=(-0.46064764242)*sqrt(x[0])
arg[3,1,1,1]=(0.48138877715)*sqrt(x[1])
arg[3,1,1,2]=(0.396741761803)*sqrt(x[2])
arg[3,1,2,0]=(-0.879391967543)*sqrt(x[0])
arg[3,1,2,1]=(-0.44039462138)*sqrt(x[1])
arg[3,1,2,2]=(0.0330511573872)*sqrt(x[2])
arg[3,2,0,0]=(-0.367413701648)*sqrt(x[0])
arg[3,2,0,1]=(0.0359818324891)*sqrt(x[1])
arg[3,2,0,2]=(-0.307532667032)*sqrt(x[2])
arg[3,2,1,0]=(0.334663597166)*sqrt(x[0])
arg[3,2,1,1]=(0.541941978066)*sqrt(x[1])
arg[3,2,1,2]=(-0.609184079318)*sqrt(x[2])
arg[3,2,2,0]=(0.359349239826)*sqrt(x[0])
arg[3,2,2,1]=(0.0419272305685)*sqrt(x[1])
arg[3,2,2,2]=(0.557189794296)*sqrt(x[2])
arg[3,3,0,0]=(-0.85864165554)*sqrt(x[0])
arg[3,3,0,1]=(-0.185411404213)*sqrt(x[1])
arg[3,3,0,2]=(0.254294865253)*sqrt(x[2])
arg[3,3,1,0]=(0.870362177541)*sqrt(x[0])
arg[3,3,1,1]=(-0.439688612864)*sqrt(x[1])
arg[3,3,1,2]=(0.26006729357)*sqrt(x[2])
arg[3,3,2,0]=(-0.0724034754175)*sqrt(x[0])
arg[3,3,2,1]=(0.444871564246)*sqrt(x[1])
arg[3,3,2,2]=(0.485634530531)*sqrt(x[2])
arg[3,4,0,0]=(-0.744756961758)*sqrt(x[0])
arg[3,4,0,1]=(0.429761406102)*sqrt(x[1])
arg[3,4,0,2]=(-0.584963735834)*sqrt(x[2])
arg[3,4,1,0]=(0.684578379159)*sqrt(x[0])
arg[3,4,1,1]=(0.949460132601)*sqrt(x[1])
arg[3,4,1,2]=(-0.592179909559)*sqrt(x[2])
arg[3,4,2,0]=(0.707154437797)*sqrt(x[0])
arg[3,4,2,1]=(0.619200407063)*sqrt(x[1])
arg[3,4,2,2]=(-0.338547165)*sqrt(x[2])
ref=sqrt(28.8255957718)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunctionOnBoundary_fromData_rank0(self):
"""
tests L2-norm of Data on the FunctionOnBoundary
assumptions: self.domain supports integration on FunctionOnBoundary
"""
dim=self.domain.getDim()
w=FunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(),w)
arg=(-0.245574919477)*x[0]
ref=sqrt((0.0603070410759)*(2.*dim+1.)/3.)
else:
arg=Data(0,(),w)
arg=(0.757324521515)*x[0]
ref=sqrt((0.573540430888)*(2.*dim+1.)/3.)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunctionOnBoundary_fromData_rank1(self):
"""
tests L2-norm of Data on the FunctionOnBoundary
assumptions: self.domain supports integration on FunctionOnBoundary
"""
dim=self.domain.getDim()
w=FunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(2,),w)
arg[0]=(0.723421565407)*x[0]
arg[1]=(-0.460477393103)*x[1]
ref=sqrt((0.735378190855)*(2.*dim+1.)/3.)
else:
arg=Data(0,(3,),w)
arg[0]=(-0.88528497163)*x[0]
arg[1]=(-0.65510214636)*x[1]
arg[2]=(0.399538866363)*x[2]
ref=sqrt((1.37251960889)*(2.*dim+1.)/3.)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunctionOnBoundary_fromData_rank2(self):
"""
tests L2-norm of Data on the FunctionOnBoundary
assumptions: self.domain supports integration on FunctionOnBoundary
"""
dim=self.domain.getDim()
w=FunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 2),w)
arg[0,0]=(0.955620993904)*x[0]
arg[0,1]=(-0.0987865813703)*x[1]
arg[1,0]=(0.0288267231531)*x[0]
arg[1,1]=(0.655440599879)*x[1]
arg[2,0]=(0.685627284533)*x[0]
arg[2,1]=(-0.989832824892)*x[1]
arg[3,0]=(0.292184093194)*x[0]
arg[3,1]=(0.149553857773)*x[1]
ref=sqrt((2.91099532781)*(2.*dim+1.)/3.)
else:
arg=Data(0,(4, 3),w)
arg[0,0]=(-0.325908541533)*x[0]
arg[0,1]=(-0.992480479749)*x[1]
arg[0,2]=(0.660360271799)*x[2]
arg[1,0]=(0.173485908581)*x[0]
arg[1,1]=(-0.328755199781)*x[1]
arg[1,2]=(-0.943354674948)*x[2]
arg[2,0]=(0.680713222646)*x[0]
arg[2,1]=(-0.765971835693)*x[1]
arg[2,2]=(0.0413284847528)*x[2]
arg[3,0]=(0.990074004708)*x[0]
arg[3,1]=(0.941801786766)*x[1]
arg[3,2]=(0.886926192201)*x[2]
ref=sqrt((6.26107155228)*(2.*dim+1.)/3.)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunctionOnBoundary_fromData_rank3(self):
"""
tests L2-norm of Data on the FunctionOnBoundary
assumptions: self.domain supports integration on FunctionOnBoundary
"""
dim=self.domain.getDim()
w=FunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(6, 2, 2),w)
arg[0,0,0]=(-0.0781611551598)*x[0]
arg[0,0,1]=(0.291016249575)*x[1]
arg[0,1,0]=(-0.107555233086)*x[0]
arg[0,1,1]=(-0.559067108546)*x[1]
arg[1,0,0]=(-0.0818406701266)*x[0]
arg[1,0,1]=(-0.594866806483)*x[1]
arg[1,1,0]=(-0.725814803863)*x[0]
arg[1,1,1]=(0.59128101992)*x[1]
arg[2,0,0]=(-0.15381555291)*x[0]
arg[2,0,1]=(-0.679882948503)*x[1]
arg[2,1,0]=(-0.58437193917)*x[0]
arg[2,1,1]=(0.136304615849)*x[1]
arg[3,0,0]=(0.0671365410096)*x[0]
arg[3,0,1]=(-0.645687212187)*x[1]
arg[3,1,0]=(-0.642492412392)*x[0]
arg[3,1,1]=(-0.125760054735)*x[1]
arg[4,0,0]=(0.731110824794)*x[0]
arg[4,0,1]=(0.491668422979)*x[1]
arg[4,1,0]=(-0.775841478292)*x[0]
arg[4,1,1]=(0.728265567974)*x[1]
arg[5,0,0]=(0.84511832373)*x[0]
arg[5,0,1]=(-0.513796801068)*x[1]
arg[5,1,0]=(0.113072243554)*x[0]
arg[5,1,1]=(0.246630838744)*x[1]
ref=sqrt((6.30829536252)*(2.*dim+1.)/3.)
else:
arg=Data(0,(6, 2, 3),w)
arg[0,0,0]=(0.369748116859)*x[0]
arg[0,0,1]=(-0.758056560031)*x[1]
arg[0,0,2]=(-0.873984709951)*x[2]
arg[0,1,0]=(0.311680165784)*x[0]
arg[0,1,1]=(0.374400673651)*x[1]
arg[0,1,2]=(0.712484217076)*x[2]
arg[1,0,0]=(0.829379714484)*x[0]
arg[1,0,1]=(-0.0551589596149)*x[1]
arg[1,0,2]=(0.965672208426)*x[2]
arg[1,1,0]=(-0.205044281547)*x[0]
arg[1,1,1]=(0.238197452756)*x[1]
arg[1,1,2]=(-0.33456139292)*x[2]
arg[2,0,0]=(0.649928288926)*x[0]
arg[2,0,1]=(-0.661384953389)*x[1]
arg[2,0,2]=(-0.253241222975)*x[2]
arg[2,1,0]=(-0.491716575992)*x[0]
arg[2,1,1]=(-0.970872527468)*x[1]
arg[2,1,2]=(0.222410198921)*x[2]
arg[3,0,0]=(0.205752630262)*x[0]
arg[3,0,1]=(0.864804362697)*x[1]
arg[3,0,2]=(-0.417975564033)*x[2]
arg[3,1,0]=(0.586425694033)*x[0]
arg[3,1,1]=(0.952661122184)*x[1]
arg[3,1,2]=(0.608680080453)*x[2]
arg[4,0,0]=(0.625968903369)*x[0]
arg[4,0,1]=(-0.573909405003)*x[1]
arg[4,0,2]=(-0.762256394595)*x[2]
arg[4,1,0]=(0.0710742394418)*x[0]
arg[4,1,1]=(0.583378040574)*x[1]
arg[4,1,2]=(0.719032893115)*x[2]
arg[5,0,0]=(0.032173368884)*x[0]
arg[5,0,1]=(-0.434042549492)*x[1]
arg[5,0,2]=(0.363504765447)*x[2]
arg[5,1,0]=(0.598817469198)*x[0]
arg[5,1,1]=(-0.163967008775)*x[1]
arg[5,1,2]=(0.546778730604)*x[2]
ref=sqrt((11.9696343123)*(2.*dim+1.)/3.)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onFunctionOnBoundary_fromData_rank4(self):
"""
tests L2-norm of Data on the FunctionOnBoundary
assumptions: self.domain supports integration on FunctionOnBoundary
"""
dim=self.domain.getDim()
w=FunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 5, 3, 2),w)
arg[0,0,0,0]=(-0.273446520069)*x[0]
arg[0,0,0,1]=(-0.913305910831)*x[1]
arg[0,0,1,0]=(0.0745566747537)*x[0]
arg[0,0,1,1]=(0.98803601919)*x[1]
arg[0,0,2,0]=(-0.244120818875)*x[0]
arg[0,0,2,1]=(0.247509644998)*x[1]
arg[0,1,0,0]=(-0.548756417777)*x[0]
arg[0,1,0,1]=(-0.354587911923)*x[1]
arg[0,1,1,0]=(0.104268198867)*x[0]
arg[0,1,1,1]=(-0.541700877072)*x[1]
arg[0,1,2,0]=(-0.25900060689)*x[0]
arg[0,1,2,1]=(-0.859660175231)*x[1]
arg[0,2,0,0]=(0.195235560321)*x[0]
arg[0,2,0,1]=(0.175518738589)*x[1]
arg[0,2,1,0]=(-0.0638854232272)*x[0]
arg[0,2,1,1]=(-0.586161016541)*x[1]
arg[0,2,2,0]=(0.580258892247)*x[0]
arg[0,2,2,1]=(-0.931927145435)*x[1]
arg[0,3,0,0]=(-0.408209600298)*x[0]
arg[0,3,0,1]=(-0.0344882667014)*x[1]
arg[0,3,1,0]=(-0.131763534163)*x[0]
arg[0,3,1,1]=(-0.787653739965)*x[1]
arg[0,3,2,0]=(0.0910808104711)*x[0]
arg[0,3,2,1]=(-0.280409023611)*x[1]
arg[0,4,0,0]=(0.97745754012)*x[0]
arg[0,4,0,1]=(-0.59829020936)*x[1]
arg[0,4,1,0]=(0.890520260543)*x[0]
arg[0,4,1,1]=(-0.600760090231)*x[1]
arg[0,4,2,0]=(-0.897992297974)*x[0]
arg[0,4,2,1]=(-0.841169898923)*x[1]
arg[1,0,0,0]=(-0.249868816409)*x[0]
arg[1,0,0,1]=(0.620375082228)*x[1]
arg[1,0,1,0]=(-0.660480789306)*x[0]
arg[1,0,1,1]=(-0.73638571806)*x[1]
arg[1,0,2,0]=(0.339987316643)*x[0]
arg[1,0,2,1]=(0.541112529894)*x[1]
arg[1,1,0,0]=(-0.468808186705)*x[0]
arg[1,1,0,1]=(-0.32919679792)*x[1]
arg[1,1,1,0]=(0.917292803419)*x[0]
arg[1,1,1,1]=(-0.834265058005)*x[1]
arg[1,1,2,0]=(-0.247536849264)*x[0]
arg[1,1,2,1]=(-0.197503469238)*x[1]
arg[1,2,0,0]=(0.897591919909)*x[0]
arg[1,2,0,1]=(-0.807446231234)*x[1]
arg[1,2,1,0]=(-0.369878499382)*x[0]
arg[1,2,1,1]=(0.985678692179)*x[1]
arg[1,2,2,0]=(-0.709976427525)*x[0]
arg[1,2,2,1]=(-0.368744647016)*x[1]
arg[1,3,0,0]=(0.299630726462)*x[0]
arg[1,3,0,1]=(-0.445295757899)*x[1]
arg[1,3,1,0]=(-0.922386577254)*x[0]
arg[1,3,1,1]=(0.234794853697)*x[1]
arg[1,3,2,0]=(0.953377720197)*x[0]
arg[1,3,2,1]=(0.409778183998)*x[1]
arg[1,4,0,0]=(0.271967945488)*x[0]
arg[1,4,0,1]=(-0.578629001202)*x[1]
arg[1,4,1,0]=(0.210755226769)*x[0]
arg[1,4,1,1]=(-0.0902751419945)*x[1]
arg[1,4,2,0]=(0.70033387381)*x[0]
arg[1,4,2,1]=(0.305733565661)*x[1]
arg[2,0,0,0]=(-0.662483167298)*x[0]
arg[2,0,0,1]=(0.585252048652)*x[1]
arg[2,0,1,0]=(-0.398813785959)*x[0]
arg[2,0,1,1]=(-0.797438697186)*x[1]
arg[2,0,2,0]=(-0.508308971009)*x[0]
arg[2,0,2,1]=(0.302249407524)*x[1]
arg[2,1,0,0]=(0.208644491879)*x[0]
arg[2,1,0,1]=(-0.604749055374)*x[1]
arg[2,1,1,0]=(0.641654284594)*x[0]
arg[2,1,1,1]=(0.456898593356)*x[1]
arg[2,1,2,0]=(-0.398043867778)*x[0]
arg[2,1,2,1]=(-0.0712344657587)*x[1]
arg[2,2,0,0]=(0.0967860954865)*x[0]
arg[2,2,0,1]=(0.520449905952)*x[1]
arg[2,2,1,0]=(0.770198029595)*x[0]
arg[2,2,1,1]=(-0.594004671621)*x[1]
arg[2,2,2,0]=(-0.744571452885)*x[0]
arg[2,2,2,1]=(0.544447367825)*x[1]
arg[2,3,0,0]=(-0.137087966968)*x[0]
arg[2,3,0,1]=(0.120672667497)*x[1]
arg[2,3,1,0]=(0.204800088057)*x[0]
arg[2,3,1,1]=(0.626526346076)*x[1]
arg[2,3,2,0]=(-0.696480393227)*x[0]
arg[2,3,2,1]=(0.188533741996)*x[1]
arg[2,4,0,0]=(-0.403523821067)*x[0]
arg[2,4,0,1]=(-0.428048989483)*x[1]
arg[2,4,1,0]=(-0.244186366584)*x[0]
arg[2,4,1,1]=(0.00866444909003)*x[1]
arg[2,4,2,0]=(-0.445991308853)*x[0]
arg[2,4,2,1]=(-0.899951068935)*x[1]
arg[3,0,0,0]=(0.609340085418)*x[0]
arg[3,0,0,1]=(0.878750391425)*x[1]
arg[3,0,1,0]=(0.258064654464)*x[0]
arg[3,0,1,1]=(-0.482402612985)*x[1]
arg[3,0,2,0]=(0.943732283389)*x[0]
arg[3,0,2,1]=(0.65514211843)*x[1]
arg[3,1,0,0]=(-0.894551979619)*x[0]
arg[3,1,0,1]=(0.220116541042)*x[1]
arg[3,1,1,0]=(0.386887699577)*x[0]
arg[3,1,1,1]=(-0.422560075108)*x[1]
arg[3,1,2,0]=(0.00387273783493)*x[0]
arg[3,1,2,1]=(0.465673505613)*x[1]
arg[3,2,0,0]=(0.987383428982)*x[0]
arg[3,2,0,1]=(0.376320055964)*x[1]
arg[3,2,1,0]=(-0.463778689128)*x[0]
arg[3,2,1,1]=(0.179816566227)*x[1]
arg[3,2,2,0]=(0.961522856801)*x[0]
arg[3,2,2,1]=(-0.257779627946)*x[1]
arg[3,3,0,0]=(0.748886531458)*x[0]
arg[3,3,0,1]=(-0.257342282566)*x[1]
arg[3,3,1,0]=(0.377494024401)*x[0]
arg[3,3,1,1]=(-0.334588346017)*x[1]
arg[3,3,2,0]=(0.502495149189)*x[0]
arg[3,3,2,1]=(0.534612429702)*x[1]
arg[3,4,0,0]=(-0.308551337355)*x[0]
arg[3,4,0,1]=(-0.471825826745)*x[1]
arg[3,4,1,0]=(-0.262606531584)*x[0]
arg[3,4,1,1]=(0.766089616367)*x[1]
arg[3,4,2,0]=(-0.136526755642)*x[0]
arg[3,4,2,1]=(0.675111459363)*x[1]
ref=sqrt((37.3453550914)*(2.*dim+1.)/3.)
else:
arg=Data(0,(4, 5, 3, 3),w)
arg[0,0,0,0]=(0.532946231146)*x[0]
arg[0,0,0,1]=(0.269364089513)*x[1]
arg[0,0,0,2]=(-0.207412457081)*x[2]
arg[0,0,1,0]=(-0.843104704858)*x[0]
arg[0,0,1,1]=(0.0416216508473)*x[1]
arg[0,0,1,2]=(-0.836074693662)*x[2]
arg[0,0,2,0]=(0.943609268731)*x[0]
arg[0,0,2,1]=(0.0154543737816)*x[1]
arg[0,0,2,2]=(0.0726268788381)*x[2]
arg[0,1,0,0]=(0.108422740078)*x[0]
arg[0,1,0,1]=(-0.296667916638)*x[1]
arg[0,1,0,2]=(-0.769732600535)*x[2]
arg[0,1,1,0]=(-0.428575493834)*x[0]
arg[0,1,1,1]=(0.421245456722)*x[1]
arg[0,1,1,2]=(-0.588277652277)*x[2]
arg[0,1,2,0]=(0.145294576795)*x[0]
arg[0,1,2,1]=(0.323206623794)*x[1]
arg[0,1,2,2]=(0.788115602892)*x[2]
arg[0,2,0,0]=(-0.227877282292)*x[0]
arg[0,2,0,1]=(-0.630647460719)*x[1]
arg[0,2,0,2]=(0.58754882135)*x[2]
arg[0,2,1,0]=(0.347191113403)*x[0]
arg[0,2,1,1]=(0.464093634725)*x[1]
arg[0,2,1,2]=(-0.0412800774497)*x[2]
arg[0,2,2,0]=(0.223364317185)*x[0]
arg[0,2,2,1]=(0.257201130157)*x[1]
arg[0,2,2,2]=(0.063203467463)*x[2]
arg[0,3,0,0]=(-0.723240451643)*x[0]
arg[0,3,0,1]=(-0.862468295097)*x[1]
arg[0,3,0,2]=(-0.149283247587)*x[2]
arg[0,3,1,0]=(0.15680097839)*x[0]
arg[0,3,1,1]=(0.421563637547)*x[1]
arg[0,3,1,2]=(0.111549188549)*x[2]
arg[0,3,2,0]=(-0.272783329363)*x[0]
arg[0,3,2,1]=(-0.420352789853)*x[1]
arg[0,3,2,2]=(0.570865117722)*x[2]
arg[0,4,0,0]=(-0.321910078414)*x[0]
arg[0,4,0,1]=(0.988695599439)*x[1]
arg[0,4,0,2]=(0.920200893398)*x[2]
arg[0,4,1,0]=(0.0260910072651)*x[0]
arg[0,4,1,1]=(0.460012578184)*x[1]
arg[0,4,1,2]=(0.848099524112)*x[2]
arg[0,4,2,0]=(0.242157803251)*x[0]
arg[0,4,2,1]=(0.394528777004)*x[1]
arg[0,4,2,2]=(0.562996837311)*x[2]
arg[1,0,0,0]=(0.459886225958)*x[0]
arg[1,0,0,1]=(-0.721868942003)*x[1]
arg[1,0,0,2]=(0.432203082994)*x[2]
arg[1,0,1,0]=(0.409831045482)*x[0]
arg[1,0,1,1]=(-0.481677513473)*x[1]
arg[1,0,1,2]=(0.439387853437)*x[2]
arg[1,0,2,0]=(0.261583198434)*x[0]
arg[1,0,2,1]=(0.290993423577)*x[1]
arg[1,0,2,2]=(0.477993114134)*x[2]
arg[1,1,0,0]=(0.586344598248)*x[0]
arg[1,1,0,1]=(-0.105390792831)*x[1]
arg[1,1,0,2]=(0.335990751314)*x[2]
arg[1,1,1,0]=(-0.191500562856)*x[0]
arg[1,1,1,1]=(0.244514598216)*x[1]
arg[1,1,1,2]=(-0.804402720669)*x[2]
arg[1,1,2,0]=(-0.455225710648)*x[0]
arg[1,1,2,1]=(-0.505052700585)*x[1]
arg[1,1,2,2]=(-0.0240295199362)*x[2]
arg[1,2,0,0]=(-0.718487964893)*x[0]
arg[1,2,0,1]=(-0.0899522570462)*x[1]
arg[1,2,0,2]=(-0.293353754696)*x[2]
arg[1,2,1,0]=(-0.180013826342)*x[0]
arg[1,2,1,1]=(0.793689231922)*x[1]
arg[1,2,1,2]=(0.673066555571)*x[2]
arg[1,2,2,0]=(0.705362155032)*x[0]
arg[1,2,2,1]=(0.54476742883)*x[1]
arg[1,2,2,2]=(-0.331195064878)*x[2]
arg[1,3,0,0]=(-0.360927441647)*x[0]
arg[1,3,0,1]=(0.230772030282)*x[1]
arg[1,3,0,2]=(0.912342489431)*x[2]
arg[1,3,1,0]=(-0.817510690014)*x[0]
arg[1,3,1,1]=(0.397583721353)*x[1]
arg[1,3,1,2]=(-0.982551067917)*x[2]
arg[1,3,2,0]=(0.86380240427)*x[0]
arg[1,3,2,1]=(-0.415018976841)*x[1]
arg[1,3,2,2]=(0.271582572267)*x[2]
arg[1,4,0,0]=(0.252845347406)*x[0]
arg[1,4,0,1]=(0.687786802906)*x[1]
arg[1,4,0,2]=(0.465501171342)*x[2]
arg[1,4,1,0]=(-0.613703721675)*x[0]
arg[1,4,1,1]=(-0.110297640533)*x[1]
arg[1,4,1,2]=(-0.836768056501)*x[2]
arg[1,4,2,0]=(-0.0400898232224)*x[0]
arg[1,4,2,1]=(0.0358172759009)*x[1]
arg[1,4,2,2]=(-0.335751455408)*x[2]
arg[2,0,0,0]=(-0.309992915015)*x[0]
arg[2,0,0,1]=(-0.721404217867)*x[1]
arg[2,0,0,2]=(-0.548000635629)*x[2]
arg[2,0,1,0]=(0.651175831531)*x[0]
arg[2,0,1,1]=(0.158960783491)*x[1]
arg[2,0,1,2]=(-0.310676926155)*x[2]
arg[2,0,2,0]=(-0.122289734411)*x[0]
arg[2,0,2,1]=(-0.252405938421)*x[1]
arg[2,0,2,2]=(-0.938280244213)*x[2]
arg[2,1,0,0]=(0.559495801686)*x[0]
arg[2,1,0,1]=(-0.547182622716)*x[1]
arg[2,1,0,2]=(0.397441517898)*x[2]
arg[2,1,1,0]=(-0.406112472071)*x[0]
arg[2,1,1,1]=(0.355063810677)*x[1]
arg[2,1,1,2]=(0.760400203215)*x[2]
arg[2,1,2,0]=(0.992201320481)*x[0]
arg[2,1,2,1]=(0.0580660882576)*x[1]
arg[2,1,2,2]=(-0.643170879939)*x[2]
arg[2,2,0,0]=(-0.280644461832)*x[0]
arg[2,2,0,1]=(-0.0467430285531)*x[1]
arg[2,2,0,2]=(0.314050593255)*x[2]
arg[2,2,1,0]=(-0.230032618609)*x[0]
arg[2,2,1,1]=(0.0996058698273)*x[1]
arg[2,2,1,2]=(-0.0270266073208)*x[2]
arg[2,2,2,0]=(0.767914132956)*x[0]
arg[2,2,2,1]=(0.496930363612)*x[1]
arg[2,2,2,2]=(-0.599525033616)*x[2]
arg[2,3,0,0]=(-0.326433376073)*x[0]
arg[2,3,0,1]=(-0.0366374501025)*x[1]
arg[2,3,0,2]=(0.22555705749)*x[2]
arg[2,3,1,0]=(-0.162548813895)*x[0]
arg[2,3,1,1]=(-0.110074212194)*x[1]
arg[2,3,1,2]=(-0.143600895553)*x[2]
arg[2,3,2,0]=(0.771148880174)*x[0]
arg[2,3,2,1]=(0.112528116552)*x[1]
arg[2,3,2,2]=(-0.955735294341)*x[2]
arg[2,4,0,0]=(-0.968392951034)*x[0]
arg[2,4,0,1]=(-0.36901708507)*x[1]
arg[2,4,0,2]=(0.283692515492)*x[2]
arg[2,4,1,0]=(0.997238032837)*x[0]
arg[2,4,1,1]=(-0.625794124653)*x[1]
arg[2,4,1,2]=(0.533386027556)*x[2]
arg[2,4,2,0]=(0.977311695557)*x[0]
arg[2,4,2,1]=(0.693009976689)*x[1]
arg[2,4,2,2]=(0.711179347652)*x[2]
arg[3,0,0,0]=(-0.155585788931)*x[0]
arg[3,0,0,1]=(0.0228078851234)*x[1]
arg[3,0,0,2]=(0.510104938032)*x[2]
arg[3,0,1,0]=(0.74865995369)*x[0]
arg[3,0,1,1]=(0.672153736284)*x[1]
arg[3,0,1,2]=(0.588012355098)*x[2]
arg[3,0,2,0]=(-0.924508475715)*x[0]
arg[3,0,2,1]=(-0.392784674758)*x[1]
arg[3,0,2,2]=(-0.36371454642)*x[2]
arg[3,1,0,0]=(-0.709783490337)*x[0]
arg[3,1,0,1]=(0.844136172222)*x[1]
arg[3,1,0,2]=(0.621011730043)*x[2]
arg[3,1,1,0]=(0.428807337181)*x[0]
arg[3,1,1,1]=(0.126300214574)*x[1]
arg[3,1,1,2]=(0.795972806221)*x[2]
arg[3,1,2,0]=(-0.252334324004)*x[0]
arg[3,1,2,1]=(-0.722829467938)*x[1]
arg[3,1,2,2]=(-0.551540062366)*x[2]
arg[3,2,0,0]=(-0.134668475963)*x[0]
arg[3,2,0,1]=(-0.598747540536)*x[1]
arg[3,2,0,2]=(0.426422436624)*x[2]
arg[3,2,1,0]=(-0.363050323762)*x[0]
arg[3,2,1,1]=(0.980891457977)*x[1]
arg[3,2,1,2]=(0.162831912555)*x[2]
arg[3,2,2,0]=(-0.126505493475)*x[0]
arg[3,2,2,1]=(-0.578567864811)*x[1]
arg[3,2,2,2]=(-0.509843129095)*x[2]
arg[3,3,0,0]=(-0.446171262265)*x[0]
arg[3,3,0,1]=(-0.715175197494)*x[1]
arg[3,3,0,2]=(-0.881016888806)*x[2]
arg[3,3,1,0]=(-0.942020866327)*x[0]
arg[3,3,1,1]=(0.156434646828)*x[1]
arg[3,3,1,2]=(0.523624761583)*x[2]
arg[3,3,2,0]=(-0.683550923926)*x[0]
arg[3,3,2,1]=(0.857075218033)*x[1]
arg[3,3,2,2]=(0.297672594023)*x[2]
arg[3,4,0,0]=(0.74317121113)*x[0]
arg[3,4,0,1]=(0.076464540756)*x[1]
arg[3,4,0,2]=(0.781965468281)*x[2]
arg[3,4,1,0]=(0.417750169098)*x[0]
arg[3,4,1,1]=(0.82275428729)*x[1]
arg[3,4,1,2]=(0.919072321093)*x[2]
arg[3,4,2,0]=(-0.0246706472217)*x[0]
arg[3,4,2,1]=(0.179863245513)*x[1]
arg[3,4,2,2]=(0.539115287766)*x[2]
ref=sqrt((52.492676775)*(2.*dim+1.)/3.)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunctionOnBoundary_fromData_rank0(self):
"""
tests L2-norm of Data on the ReducedFunctionOnBoundary
assumptions: self.domain supports integration on ReducedFunctionOnBoundary
"""
dim=self.domain.getDim()
w=ReducedFunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(),w)
arg=(-0.245574919477)*sqrt(x[0])
ref=sqrt((0.0603070410759)*dim)
else:
arg=Data(0,(),w)
arg=(0.757324521515)*sqrt(x[0])
ref=sqrt((0.573540430888)*dim)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunctionOnBoundary_fromData_rank1(self):
"""
tests L2-norm of Data on the ReducedFunctionOnBoundary
assumptions: self.domain supports integration on ReducedFunctionOnBoundary
"""
dim=self.domain.getDim()
w=ReducedFunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(2,),w)
arg[0]=(0.723421565407)*sqrt(x[0])
arg[1]=(-0.460477393103)*sqrt(x[1])
ref=sqrt((0.735378190855)*dim)
else:
arg=Data(0,(3,),w)
arg[0]=(-0.88528497163)*sqrt(x[0])
arg[1]=(-0.65510214636)*sqrt(x[1])
arg[2]=(0.399538866363)*sqrt(x[2])
ref=sqrt((1.37251960889)*dim)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunctionOnBoundary_fromData_rank2(self):
"""
tests L2-norm of Data on the ReducedFunctionOnBoundary
assumptions: self.domain supports integration on ReducedFunctionOnBoundary
"""
dim=self.domain.getDim()
w=ReducedFunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 2),w)
arg[0,0]=(0.955620993904)*sqrt(x[0])
arg[0,1]=(-0.0987865813703)*sqrt(x[1])
arg[1,0]=(0.0288267231531)*sqrt(x[0])
arg[1,1]=(0.655440599879)*sqrt(x[1])
arg[2,0]=(0.685627284533)*sqrt(x[0])
arg[2,1]=(-0.989832824892)*sqrt(x[1])
arg[3,0]=(0.292184093194)*sqrt(x[0])
arg[3,1]=(0.149553857773)*sqrt(x[1])
ref=sqrt((2.91099532781)*dim)
else:
arg=Data(0,(4, 3),w)
arg[0,0]=(-0.325908541533)*sqrt(x[0])
arg[0,1]=(-0.992480479749)*sqrt(x[1])
arg[0,2]=(0.660360271799)*sqrt(x[2])
arg[1,0]=(0.173485908581)*sqrt(x[0])
arg[1,1]=(-0.328755199781)*sqrt(x[1])
arg[1,2]=(-0.943354674948)*sqrt(x[2])
arg[2,0]=(0.680713222646)*sqrt(x[0])
arg[2,1]=(-0.765971835693)*sqrt(x[1])
arg[2,2]=(0.0413284847528)*sqrt(x[2])
arg[3,0]=(0.990074004708)*sqrt(x[0])
arg[3,1]=(0.941801786766)*sqrt(x[1])
arg[3,2]=(0.886926192201)*sqrt(x[2])
ref=sqrt((6.26107155228)*dim)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunctionOnBoundary_fromData_rank3(self):
"""
tests L2-norm of Data on the ReducedFunctionOnBoundary
assumptions: self.domain supports integration on ReducedFunctionOnBoundary
"""
dim=self.domain.getDim()
w=ReducedFunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(6, 2, 2),w)
arg[0,0,0]=(-0.0781611551598)*sqrt(x[0])
arg[0,0,1]=(0.291016249575)*sqrt(x[1])
arg[0,1,0]=(-0.107555233086)*sqrt(x[0])
arg[0,1,1]=(-0.559067108546)*sqrt(x[1])
arg[1,0,0]=(-0.0818406701266)*sqrt(x[0])
arg[1,0,1]=(-0.594866806483)*sqrt(x[1])
arg[1,1,0]=(-0.725814803863)*sqrt(x[0])
arg[1,1,1]=(0.59128101992)*sqrt(x[1])
arg[2,0,0]=(-0.15381555291)*sqrt(x[0])
arg[2,0,1]=(-0.679882948503)*sqrt(x[1])
arg[2,1,0]=(-0.58437193917)*sqrt(x[0])
arg[2,1,1]=(0.136304615849)*sqrt(x[1])
arg[3,0,0]=(0.0671365410096)*sqrt(x[0])
arg[3,0,1]=(-0.645687212187)*sqrt(x[1])
arg[3,1,0]=(-0.642492412392)*sqrt(x[0])
arg[3,1,1]=(-0.125760054735)*sqrt(x[1])
arg[4,0,0]=(0.731110824794)*sqrt(x[0])
arg[4,0,1]=(0.491668422979)*sqrt(x[1])
arg[4,1,0]=(-0.775841478292)*sqrt(x[0])
arg[4,1,1]=(0.728265567974)*sqrt(x[1])
arg[5,0,0]=(0.84511832373)*sqrt(x[0])
arg[5,0,1]=(-0.513796801068)*sqrt(x[1])
arg[5,1,0]=(0.113072243554)*sqrt(x[0])
arg[5,1,1]=(0.246630838744)*sqrt(x[1])
ref=sqrt((6.30829536252)*dim)
else:
arg=Data(0,(6, 2, 3),w)
arg[0,0,0]=(0.369748116859)*sqrt(x[0])
arg[0,0,1]=(-0.758056560031)*sqrt(x[1])
arg[0,0,2]=(-0.873984709951)*sqrt(x[2])
arg[0,1,0]=(0.311680165784)*sqrt(x[0])
arg[0,1,1]=(0.374400673651)*sqrt(x[1])
arg[0,1,2]=(0.712484217076)*sqrt(x[2])
arg[1,0,0]=(0.829379714484)*sqrt(x[0])
arg[1,0,1]=(-0.0551589596149)*sqrt(x[1])
arg[1,0,2]=(0.965672208426)*sqrt(x[2])
arg[1,1,0]=(-0.205044281547)*sqrt(x[0])
arg[1,1,1]=(0.238197452756)*sqrt(x[1])
arg[1,1,2]=(-0.33456139292)*sqrt(x[2])
arg[2,0,0]=(0.649928288926)*sqrt(x[0])
arg[2,0,1]=(-0.661384953389)*sqrt(x[1])
arg[2,0,2]=(-0.253241222975)*sqrt(x[2])
arg[2,1,0]=(-0.491716575992)*sqrt(x[0])
arg[2,1,1]=(-0.970872527468)*sqrt(x[1])
arg[2,1,2]=(0.222410198921)*sqrt(x[2])
arg[3,0,0]=(0.205752630262)*sqrt(x[0])
arg[3,0,1]=(0.864804362697)*sqrt(x[1])
arg[3,0,2]=(-0.417975564033)*sqrt(x[2])
arg[3,1,0]=(0.586425694033)*sqrt(x[0])
arg[3,1,1]=(0.952661122184)*sqrt(x[1])
arg[3,1,2]=(0.608680080453)*sqrt(x[2])
arg[4,0,0]=(0.625968903369)*sqrt(x[0])
arg[4,0,1]=(-0.573909405003)*sqrt(x[1])
arg[4,0,2]=(-0.762256394595)*sqrt(x[2])
arg[4,1,0]=(0.0710742394418)*sqrt(x[0])
arg[4,1,1]=(0.583378040574)*sqrt(x[1])
arg[4,1,2]=(0.719032893115)*sqrt(x[2])
arg[5,0,0]=(0.032173368884)*sqrt(x[0])
arg[5,0,1]=(-0.434042549492)*sqrt(x[1])
arg[5,0,2]=(0.363504765447)*sqrt(x[2])
arg[5,1,0]=(0.598817469198)*sqrt(x[0])
arg[5,1,1]=(-0.163967008775)*sqrt(x[1])
arg[5,1,2]=(0.546778730604)*sqrt(x[2])
ref=sqrt((11.9696343123)*dim)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
#+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
def test_L2_onReducedFunctionOnBoundary_fromData_rank4(self):
"""
tests L2-norm of Data on the ReducedFunctionOnBoundary
assumptions: self.domain supports integration on ReducedFunctionOnBoundary
"""
dim=self.domain.getDim()
w=ReducedFunctionOnBoundary(self.domain)
x=w.getX()
if dim==2:
arg=Data(0,(4, 5, 3, 2),w)
arg[0,0,0,0]=(-0.273446520069)*sqrt(x[0])
arg[0,0,0,1]=(-0.913305910831)*sqrt(x[1])
arg[0,0,1,0]=(0.0745566747537)*sqrt(x[0])
arg[0,0,1,1]=(0.98803601919)*sqrt(x[1])
arg[0,0,2,0]=(-0.244120818875)*sqrt(x[0])
arg[0,0,2,1]=(0.247509644998)*sqrt(x[1])
arg[0,1,0,0]=(-0.548756417777)*sqrt(x[0])
arg[0,1,0,1]=(-0.354587911923)*sqrt(x[1])
arg[0,1,1,0]=(0.104268198867)*sqrt(x[0])
arg[0,1,1,1]=(-0.541700877072)*sqrt(x[1])
arg[0,1,2,0]=(-0.25900060689)*sqrt(x[0])
arg[0,1,2,1]=(-0.859660175231)*sqrt(x[1])
arg[0,2,0,0]=(0.195235560321)*sqrt(x[0])
arg[0,2,0,1]=(0.175518738589)*sqrt(x[1])
arg[0,2,1,0]=(-0.0638854232272)*sqrt(x[0])
arg[0,2,1,1]=(-0.586161016541)*sqrt(x[1])
arg[0,2,2,0]=(0.580258892247)*sqrt(x[0])
arg[0,2,2,1]=(-0.931927145435)*sqrt(x[1])
arg[0,3,0,0]=(-0.408209600298)*sqrt(x[0])
arg[0,3,0,1]=(-0.0344882667014)*sqrt(x[1])
arg[0,3,1,0]=(-0.131763534163)*sqrt(x[0])
arg[0,3,1,1]=(-0.787653739965)*sqrt(x[1])
arg[0,3,2,0]=(0.0910808104711)*sqrt(x[0])
arg[0,3,2,1]=(-0.280409023611)*sqrt(x[1])
arg[0,4,0,0]=(0.97745754012)*sqrt(x[0])
arg[0,4,0,1]=(-0.59829020936)*sqrt(x[1])
arg[0,4,1,0]=(0.890520260543)*sqrt(x[0])
arg[0,4,1,1]=(-0.600760090231)*sqrt(x[1])
arg[0,4,2,0]=(-0.897992297974)*sqrt(x[0])
arg[0,4,2,1]=(-0.841169898923)*sqrt(x[1])
arg[1,0,0,0]=(-0.249868816409)*sqrt(x[0])
arg[1,0,0,1]=(0.620375082228)*sqrt(x[1])
arg[1,0,1,0]=(-0.660480789306)*sqrt(x[0])
arg[1,0,1,1]=(-0.73638571806)*sqrt(x[1])
arg[1,0,2,0]=(0.339987316643)*sqrt(x[0])
arg[1,0,2,1]=(0.541112529894)*sqrt(x[1])
arg[1,1,0,0]=(-0.468808186705)*sqrt(x[0])
arg[1,1,0,1]=(-0.32919679792)*sqrt(x[1])
arg[1,1,1,0]=(0.917292803419)*sqrt(x[0])
arg[1,1,1,1]=(-0.834265058005)*sqrt(x[1])
arg[1,1,2,0]=(-0.247536849264)*sqrt(x[0])
arg[1,1,2,1]=(-0.197503469238)*sqrt(x[1])
arg[1,2,0,0]=(0.897591919909)*sqrt(x[0])
arg[1,2,0,1]=(-0.807446231234)*sqrt(x[1])
arg[1,2,1,0]=(-0.369878499382)*sqrt(x[0])
arg[1,2,1,1]=(0.985678692179)*sqrt(x[1])
arg[1,2,2,0]=(-0.709976427525)*sqrt(x[0])
arg[1,2,2,1]=(-0.368744647016)*sqrt(x[1])
arg[1,3,0,0]=(0.299630726462)*sqrt(x[0])
arg[1,3,0,1]=(-0.445295757899)*sqrt(x[1])
arg[1,3,1,0]=(-0.922386577254)*sqrt(x[0])
arg[1,3,1,1]=(0.234794853697)*sqrt(x[1])
arg[1,3,2,0]=(0.953377720197)*sqrt(x[0])
arg[1,3,2,1]=(0.409778183998)*sqrt(x[1])
arg[1,4,0,0]=(0.271967945488)*sqrt(x[0])
arg[1,4,0,1]=(-0.578629001202)*sqrt(x[1])
arg[1,4,1,0]=(0.210755226769)*sqrt(x[0])
arg[1,4,1,1]=(-0.0902751419945)*sqrt(x[1])
arg[1,4,2,0]=(0.70033387381)*sqrt(x[0])
arg[1,4,2,1]=(0.305733565661)*sqrt(x[1])
arg[2,0,0,0]=(-0.662483167298)*sqrt(x[0])
arg[2,0,0,1]=(0.585252048652)*sqrt(x[1])
arg[2,0,1,0]=(-0.398813785959)*sqrt(x[0])
arg[2,0,1,1]=(-0.797438697186)*sqrt(x[1])
arg[2,0,2,0]=(-0.508308971009)*sqrt(x[0])
arg[2,0,2,1]=(0.302249407524)*sqrt(x[1])
arg[2,1,0,0]=(0.208644491879)*sqrt(x[0])
arg[2,1,0,1]=(-0.604749055374)*sqrt(x[1])
arg[2,1,1,0]=(0.641654284594)*sqrt(x[0])
arg[2,1,1,1]=(0.456898593356)*sqrt(x[1])
arg[2,1,2,0]=(-0.398043867778)*sqrt(x[0])
arg[2,1,2,1]=(-0.0712344657587)*sqrt(x[1])
arg[2,2,0,0]=(0.0967860954865)*sqrt(x[0])
arg[2,2,0,1]=(0.520449905952)*sqrt(x[1])
arg[2,2,1,0]=(0.770198029595)*sqrt(x[0])
arg[2,2,1,1]=(-0.594004671621)*sqrt(x[1])
arg[2,2,2,0]=(-0.744571452885)*sqrt(x[0])
arg[2,2,2,1]=(0.544447367825)*sqrt(x[1])
arg[2,3,0,0]=(-0.137087966968)*sqrt(x[0])
arg[2,3,0,1]=(0.120672667497)*sqrt(x[1])
arg[2,3,1,0]=(0.204800088057)*sqrt(x[0])
arg[2,3,1,1]=(0.626526346076)*sqrt(x[1])
arg[2,3,2,0]=(-0.696480393227)*sqrt(x[0])
arg[2,3,2,1]=(0.188533741996)*sqrt(x[1])
arg[2,4,0,0]=(-0.403523821067)*sqrt(x[0])
arg[2,4,0,1]=(-0.428048989483)*sqrt(x[1])
arg[2,4,1,0]=(-0.244186366584)*sqrt(x[0])
arg[2,4,1,1]=(0.00866444909003)*sqrt(x[1])
arg[2,4,2,0]=(-0.445991308853)*sqrt(x[0])
arg[2,4,2,1]=(-0.899951068935)*sqrt(x[1])
arg[3,0,0,0]=(0.609340085418)*sqrt(x[0])
arg[3,0,0,1]=(0.878750391425)*sqrt(x[1])
arg[3,0,1,0]=(0.258064654464)*sqrt(x[0])
arg[3,0,1,1]=(-0.482402612985)*sqrt(x[1])
arg[3,0,2,0]=(0.943732283389)*sqrt(x[0])
arg[3,0,2,1]=(0.65514211843)*sqrt(x[1])
arg[3,1,0,0]=(-0.894551979619)*sqrt(x[0])
arg[3,1,0,1]=(0.220116541042)*sqrt(x[1])
arg[3,1,1,0]=(0.386887699577)*sqrt(x[0])
arg[3,1,1,1]=(-0.422560075108)*sqrt(x[1])
arg[3,1,2,0]=(0.00387273783493)*sqrt(x[0])
arg[3,1,2,1]=(0.465673505613)*sqrt(x[1])
arg[3,2,0,0]=(0.987383428982)*sqrt(x[0])
arg[3,2,0,1]=(0.376320055964)*sqrt(x[1])
arg[3,2,1,0]=(-0.463778689128)*sqrt(x[0])
arg[3,2,1,1]=(0.179816566227)*sqrt(x[1])
arg[3,2,2,0]=(0.961522856801)*sqrt(x[0])
arg[3,2,2,1]=(-0.257779627946)*sqrt(x[1])
arg[3,3,0,0]=(0.748886531458)*sqrt(x[0])
arg[3,3,0,1]=(-0.257342282566)*sqrt(x[1])
arg[3,3,1,0]=(0.377494024401)*sqrt(x[0])
arg[3,3,1,1]=(-0.334588346017)*sqrt(x[1])
arg[3,3,2,0]=(0.502495149189)*sqrt(x[0])
arg[3,3,2,1]=(0.534612429702)*sqrt(x[1])
arg[3,4,0,0]=(-0.308551337355)*sqrt(x[0])
arg[3,4,0,1]=(-0.471825826745)*sqrt(x[1])
arg[3,4,1,0]=(-0.262606531584)*sqrt(x[0])
arg[3,4,1,1]=(0.766089616367)*sqrt(x[1])
arg[3,4,2,0]=(-0.136526755642)*sqrt(x[0])
arg[3,4,2,1]=(0.675111459363)*sqrt(x[1])
ref=sqrt((37.3453550914)*dim)
else:
arg=Data(0,(4, 5, 3, 3),w)
arg[0,0,0,0]=(0.532946231146)*sqrt(x[0])
arg[0,0,0,1]=(0.269364089513)*sqrt(x[1])
arg[0,0,0,2]=(-0.207412457081)*sqrt(x[2])
arg[0,0,1,0]=(-0.843104704858)*sqrt(x[0])
arg[0,0,1,1]=(0.0416216508473)*sqrt(x[1])
arg[0,0,1,2]=(-0.836074693662)*sqrt(x[2])
arg[0,0,2,0]=(0.943609268731)*sqrt(x[0])
arg[0,0,2,1]=(0.0154543737816)*sqrt(x[1])
arg[0,0,2,2]=(0.0726268788381)*sqrt(x[2])
arg[0,1,0,0]=(0.108422740078)*sqrt(x[0])
arg[0,1,0,1]=(-0.296667916638)*sqrt(x[1])
arg[0,1,0,2]=(-0.769732600535)*sqrt(x[2])
arg[0,1,1,0]=(-0.428575493834)*sqrt(x[0])
arg[0,1,1,1]=(0.421245456722)*sqrt(x[1])
arg[0,1,1,2]=(-0.588277652277)*sqrt(x[2])
arg[0,1,2,0]=(0.145294576795)*sqrt(x[0])
arg[0,1,2,1]=(0.323206623794)*sqrt(x[1])
arg[0,1,2,2]=(0.788115602892)*sqrt(x[2])
arg[0,2,0,0]=(-0.227877282292)*sqrt(x[0])
arg[0,2,0,1]=(-0.630647460719)*sqrt(x[1])
arg[0,2,0,2]=(0.58754882135)*sqrt(x[2])
arg[0,2,1,0]=(0.347191113403)*sqrt(x[0])
arg[0,2,1,1]=(0.464093634725)*sqrt(x[1])
arg[0,2,1,2]=(-0.0412800774497)*sqrt(x[2])
arg[0,2,2,0]=(0.223364317185)*sqrt(x[0])
arg[0,2,2,1]=(0.257201130157)*sqrt(x[1])
arg[0,2,2,2]=(0.063203467463)*sqrt(x[2])
arg[0,3,0,0]=(-0.723240451643)*sqrt(x[0])
arg[0,3,0,1]=(-0.862468295097)*sqrt(x[1])
arg[0,3,0,2]=(-0.149283247587)*sqrt(x[2])
arg[0,3,1,0]=(0.15680097839)*sqrt(x[0])
arg[0,3,1,1]=(0.421563637547)*sqrt(x[1])
arg[0,3,1,2]=(0.111549188549)*sqrt(x[2])
arg[0,3,2,0]=(-0.272783329363)*sqrt(x[0])
arg[0,3,2,1]=(-0.420352789853)*sqrt(x[1])
arg[0,3,2,2]=(0.570865117722)*sqrt(x[2])
arg[0,4,0,0]=(-0.321910078414)*sqrt(x[0])
arg[0,4,0,1]=(0.988695599439)*sqrt(x[1])
arg[0,4,0,2]=(0.920200893398)*sqrt(x[2])
arg[0,4,1,0]=(0.0260910072651)*sqrt(x[0])
arg[0,4,1,1]=(0.460012578184)*sqrt(x[1])
arg[0,4,1,2]=(0.848099524112)*sqrt(x[2])
arg[0,4,2,0]=(0.242157803251)*sqrt(x[0])
arg[0,4,2,1]=(0.394528777004)*sqrt(x[1])
arg[0,4,2,2]=(0.562996837311)*sqrt(x[2])
arg[1,0,0,0]=(0.459886225958)*sqrt(x[0])
arg[1,0,0,1]=(-0.721868942003)*sqrt(x[1])
arg[1,0,0,2]=(0.432203082994)*sqrt(x[2])
arg[1,0,1,0]=(0.409831045482)*sqrt(x[0])
arg[1,0,1,1]=(-0.481677513473)*sqrt(x[1])
arg[1,0,1,2]=(0.439387853437)*sqrt(x[2])
arg[1,0,2,0]=(0.261583198434)*sqrt(x[0])
arg[1,0,2,1]=(0.290993423577)*sqrt(x[1])
arg[1,0,2,2]=(0.477993114134)*sqrt(x[2])
arg[1,1,0,0]=(0.586344598248)*sqrt(x[0])
arg[1,1,0,1]=(-0.105390792831)*sqrt(x[1])
arg[1,1,0,2]=(0.335990751314)*sqrt(x[2])
arg[1,1,1,0]=(-0.191500562856)*sqrt(x[0])
arg[1,1,1,1]=(0.244514598216)*sqrt(x[1])
arg[1,1,1,2]=(-0.804402720669)*sqrt(x[2])
arg[1,1,2,0]=(-0.455225710648)*sqrt(x[0])
arg[1,1,2,1]=(-0.505052700585)*sqrt(x[1])
arg[1,1,2,2]=(-0.0240295199362)*sqrt(x[2])
arg[1,2,0,0]=(-0.718487964893)*sqrt(x[0])
arg[1,2,0,1]=(-0.0899522570462)*sqrt(x[1])
arg[1,2,0,2]=(-0.293353754696)*sqrt(x[2])
arg[1,2,1,0]=(-0.180013826342)*sqrt(x[0])
arg[1,2,1,1]=(0.793689231922)*sqrt(x[1])
arg[1,2,1,2]=(0.673066555571)*sqrt(x[2])
arg[1,2,2,0]=(0.705362155032)*sqrt(x[0])
arg[1,2,2,1]=(0.54476742883)*sqrt(x[1])
arg[1,2,2,2]=(-0.331195064878)*sqrt(x[2])
arg[1,3,0,0]=(-0.360927441647)*sqrt(x[0])
arg[1,3,0,1]=(0.230772030282)*sqrt(x[1])
arg[1,3,0,2]=(0.912342489431)*sqrt(x[2])
arg[1,3,1,0]=(-0.817510690014)*sqrt(x[0])
arg[1,3,1,1]=(0.397583721353)*sqrt(x[1])
arg[1,3,1,2]=(-0.982551067917)*sqrt(x[2])
arg[1,3,2,0]=(0.86380240427)*sqrt(x[0])
arg[1,3,2,1]=(-0.415018976841)*sqrt(x[1])
arg[1,3,2,2]=(0.271582572267)*sqrt(x[2])
arg[1,4,0,0]=(0.252845347406)*sqrt(x[0])
arg[1,4,0,1]=(0.687786802906)*sqrt(x[1])
arg[1,4,0,2]=(0.465501171342)*sqrt(x[2])
arg[1,4,1,0]=(-0.613703721675)*sqrt(x[0])
arg[1,4,1,1]=(-0.110297640533)*sqrt(x[1])
arg[1,4,1,2]=(-0.836768056501)*sqrt(x[2])
arg[1,4,2,0]=(-0.0400898232224)*sqrt(x[0])
arg[1,4,2,1]=(0.0358172759009)*sqrt(x[1])
arg[1,4,2,2]=(-0.335751455408)*sqrt(x[2])
arg[2,0,0,0]=(-0.309992915015)*sqrt(x[0])
arg[2,0,0,1]=(-0.721404217867)*sqrt(x[1])
arg[2,0,0,2]=(-0.548000635629)*sqrt(x[2])
arg[2,0,1,0]=(0.651175831531)*sqrt(x[0])
arg[2,0,1,1]=(0.158960783491)*sqrt(x[1])
arg[2,0,1,2]=(-0.310676926155)*sqrt(x[2])
arg[2,0,2,0]=(-0.122289734411)*sqrt(x[0])
arg[2,0,2,1]=(-0.252405938421)*sqrt(x[1])
arg[2,0,2,2]=(-0.938280244213)*sqrt(x[2])
arg[2,1,0,0]=(0.559495801686)*sqrt(x[0])
arg[2,1,0,1]=(-0.547182622716)*sqrt(x[1])
arg[2,1,0,2]=(0.397441517898)*sqrt(x[2])
arg[2,1,1,0]=(-0.406112472071)*sqrt(x[0])
arg[2,1,1,1]=(0.355063810677)*sqrt(x[1])
arg[2,1,1,2]=(0.760400203215)*sqrt(x[2])
arg[2,1,2,0]=(0.992201320481)*sqrt(x[0])
arg[2,1,2,1]=(0.0580660882576)*sqrt(x[1])
arg[2,1,2,2]=(-0.643170879939)*sqrt(x[2])
arg[2,2,0,0]=(-0.280644461832)*sqrt(x[0])
arg[2,2,0,1]=(-0.0467430285531)*sqrt(x[1])
arg[2,2,0,2]=(0.314050593255)*sqrt(x[2])
arg[2,2,1,0]=(-0.230032618609)*sqrt(x[0])
arg[2,2,1,1]=(0.0996058698273)*sqrt(x[1])
arg[2,2,1,2]=(-0.0270266073208)*sqrt(x[2])
arg[2,2,2,0]=(0.767914132956)*sqrt(x[0])
arg[2,2,2,1]=(0.496930363612)*sqrt(x[1])
arg[2,2,2,2]=(-0.599525033616)*sqrt(x[2])
arg[2,3,0,0]=(-0.326433376073)*sqrt(x[0])
arg[2,3,0,1]=(-0.0366374501025)*sqrt(x[1])
arg[2,3,0,2]=(0.22555705749)*sqrt(x[2])
arg[2,3,1,0]=(-0.162548813895)*sqrt(x[0])
arg[2,3,1,1]=(-0.110074212194)*sqrt(x[1])
arg[2,3,1,2]=(-0.143600895553)*sqrt(x[2])
arg[2,3,2,0]=(0.771148880174)*sqrt(x[0])
arg[2,3,2,1]=(0.112528116552)*sqrt(x[1])
arg[2,3,2,2]=(-0.955735294341)*sqrt(x[2])
arg[2,4,0,0]=(-0.968392951034)*sqrt(x[0])
arg[2,4,0,1]=(-0.36901708507)*sqrt(x[1])
arg[2,4,0,2]=(0.283692515492)*sqrt(x[2])
arg[2,4,1,0]=(0.997238032837)*sqrt(x[0])
arg[2,4,1,1]=(-0.625794124653)*sqrt(x[1])
arg[2,4,1,2]=(0.533386027556)*sqrt(x[2])
arg[2,4,2,0]=(0.977311695557)*sqrt(x[0])
arg[2,4,2,1]=(0.693009976689)*sqrt(x[1])
arg[2,4,2,2]=(0.711179347652)*sqrt(x[2])
arg[3,0,0,0]=(-0.155585788931)*sqrt(x[0])
arg[3,0,0,1]=(0.0228078851234)*sqrt(x[1])
arg[3,0,0,2]=(0.510104938032)*sqrt(x[2])
arg[3,0,1,0]=(0.74865995369)*sqrt(x[0])
arg[3,0,1,1]=(0.672153736284)*sqrt(x[1])
arg[3,0,1,2]=(0.588012355098)*sqrt(x[2])
arg[3,0,2,0]=(-0.924508475715)*sqrt(x[0])
arg[3,0,2,1]=(-0.392784674758)*sqrt(x[1])
arg[3,0,2,2]=(-0.36371454642)*sqrt(x[2])
arg[3,1,0,0]=(-0.709783490337)*sqrt(x[0])
arg[3,1,0,1]=(0.844136172222)*sqrt(x[1])
arg[3,1,0,2]=(0.621011730043)*sqrt(x[2])
arg[3,1,1,0]=(0.428807337181)*sqrt(x[0])
arg[3,1,1,1]=(0.126300214574)*sqrt(x[1])
arg[3,1,1,2]=(0.795972806221)*sqrt(x[2])
arg[3,1,2,0]=(-0.252334324004)*sqrt(x[0])
arg[3,1,2,1]=(-0.722829467938)*sqrt(x[1])
arg[3,1,2,2]=(-0.551540062366)*sqrt(x[2])
arg[3,2,0,0]=(-0.134668475963)*sqrt(x[0])
arg[3,2,0,1]=(-0.598747540536)*sqrt(x[1])
arg[3,2,0,2]=(0.426422436624)*sqrt(x[2])
arg[3,2,1,0]=(-0.363050323762)*sqrt(x[0])
arg[3,2,1,1]=(0.980891457977)*sqrt(x[1])
arg[3,2,1,2]=(0.162831912555)*sqrt(x[2])
arg[3,2,2,0]=(-0.126505493475)*sqrt(x[0])
arg[3,2,2,1]=(-0.578567864811)*sqrt(x[1])
arg[3,2,2,2]=(-0.509843129095)*sqrt(x[2])
arg[3,3,0,0]=(-0.446171262265)*sqrt(x[0])
arg[3,3,0,1]=(-0.715175197494)*sqrt(x[1])
arg[3,3,0,2]=(-0.881016888806)*sqrt(x[2])
arg[3,3,1,0]=(-0.942020866327)*sqrt(x[0])
arg[3,3,1,1]=(0.156434646828)*sqrt(x[1])
arg[3,3,1,2]=(0.523624761583)*sqrt(x[2])
arg[3,3,2,0]=(-0.683550923926)*sqrt(x[0])
arg[3,3,2,1]=(0.857075218033)*sqrt(x[1])
arg[3,3,2,2]=(0.297672594023)*sqrt(x[2])
arg[3,4,0,0]=(0.74317121113)*sqrt(x[0])
arg[3,4,0,1]=(0.076464540756)*sqrt(x[1])
arg[3,4,0,2]=(0.781965468281)*sqrt(x[2])
arg[3,4,1,0]=(0.417750169098)*sqrt(x[0])
arg[3,4,1,1]=(0.82275428729)*sqrt(x[1])
arg[3,4,1,2]=(0.919072321093)*sqrt(x[2])
arg[3,4,2,0]=(-0.0246706472217)*sqrt(x[0])
arg[3,4,2,1]=(0.179863245513)*sqrt(x[1])
arg[3,4,2,2]=(0.539115287766)*sqrt(x[2])
ref=sqrt((52.492676775)*dim)
res=L2(arg)
self.assertTrue(isinstance(res,float),"wrong type of result.")
self.assertAlmostEqual(res,ref,int(-log10(self.RES_TOL)),"wrong result")
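The hundreds of `arg[...]` assignments above follow a strictly regular pattern (one random coefficient per tensor entry, with the last index selecting which coordinate's `sqrt` the entry scales) and were almost certainly emitted by a script. The actual generator is not part of this file; the following is only a hypothetical sketch of what such a generator might look like, with the function name and seed chosen for illustration.

```python
import itertools
import random

def emit_arg_lines(shape=(4, 5, 3, 3), seed=0):
    """Emit one 'arg[i,j,k,m]=(coeff)*sqrt(x[m])' line per tensor entry.

    The last index m selects the spatial coordinate, matching the
    sqrt(x[m]) factor in the generated test above.
    """
    rng = random.Random(seed)
    lines = []
    for idx in itertools.product(*(range(n) for n in shape)):
        coeff = rng.uniform(-1.0, 1.0)
        subscripts = ",".join(map(str, idx))
        lines.append(f"arg[{subscripts}]=({coeff})*sqrt(x[{idx[-1]}])")
    return lines

lines = emit_arg_lines()
print(len(lines))  # 180 entries for a (4, 5, 3, 3) tensor
```

The reference value `ref` would then be derived analytically from the same coefficients, so the test can compare `L2(arg)` against a closed-form answer.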
| 42.768281 | 159 | 0.53112 | 16,574 | 90,070 | 2.874925 | 0.061723 | 0.041764 | 0.064639 | 0.058175 | 0.952633 | 0.943357 | 0.927113 | 0.90388 | 0.790321 | 0.188902 | 0 | 0.387442 | 0.197979 | 90,070 | 2,105 | 160 | 42.788599 | 0.27217 | 0.047829 | 0 | 0.139301 | 0 | 0 | 0.016004 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 1 | 0.012712 | false | 0 | 0.004237 | 0 | 0.018008 | 0.00053 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c1ed42b1b273f6771b22bd977457cacaeb047271 | 28 | py | Python | my.py | nowage/penDetect | dab67dd0e29e9409076b204b76184437e9cd855b | [
"MIT"
] | 1 | 2022-02-05T02:02:28.000Z | 2022-02-05T02:02:28.000Z | my.py | panafamily/penDetect | dab67dd0e29e9409076b204b76184437e9cd855b | [
"MIT"
] | null | null | null | my.py | panafamily/penDetect | dab67dd0e29e9409076b204b76184437e9cd855b | [
"MIT"
] | 31 | 2020-12-22T11:24:35.000Z | 2021-01-08T07:16:15.000Z | def f1(x,y):
return x+y+1
| 9.333333 | 14 | 0.571429 | 8 | 28 | 2 | 0.75 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 0.214286 | 28 | 2 | 15 | 14 | 0.636364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
de133a73f1ab7d94339d44084247af4b5d9e670e | 10,902 | py | Python | integration-tests/integration_tests/integration_tests/component_tests/component_asynchronous_express_message_pattern_tests.py | tomzo/integration-adaptors | d4f296d3e44475df6f69a78a27fac6ed5b67513b | [
"Apache-2.0"
] | null | null | null | integration-tests/integration_tests/integration_tests/component_tests/component_asynchronous_express_message_pattern_tests.py | tomzo/integration-adaptors | d4f296d3e44475df6f69a78a27fac6ed5b67513b | [
"Apache-2.0"
] | 4 | 2021-03-31T19:46:30.000Z | 2021-03-31T19:55:03.000Z | integration-tests/integration_tests/integration_tests/component_tests/component_asynchronous_express_message_pattern_tests.py | tomzo/integration-adaptors | d4f296d3e44475df6f69a78a27fac6ed5b67513b | [
"Apache-2.0"
] | 2 | 2020-04-02T11:22:17.000Z | 2021-04-11T07:24:48.000Z | """Component tests related to the asynchronous-express message pattern"""
import unittest
from integration_tests.assertors.text_error_response_assertor import TextErrorResponseAssertor
from integration_tests.dynamo.dynamo import MHS_STATE_TABLE_DYNAMO_WRAPPER, MHS_SYNC_ASYNC_TABLE_DYNAMO_WRAPPER
from integration_tests.dynamo.dynamo_mhs_table import DynamoMhsTableStateAssertor
from integration_tests.helpers.build_message import build_message
from integration_tests.http.mhs_http_request_builder import MhsHttpRequestBuilder
class AsynchronousExpressMessagingPatternTests(unittest.TestCase):
"""
These tests show an asynchronous express response from Spine via the MHS for the example message interaction
    of PSIS (Personal Spine Information Service).
They make use of the fake-spine service, which has known responses for certain message ids.
They make use of the fake-spine-route-lookup service, which has known responses for certain interaction ids.
"""
def setUp(self):
MHS_STATE_TABLE_DYNAMO_WRAPPER.clear_all_records_in_table()
MHS_SYNC_ASYNC_TABLE_DYNAMO_WRAPPER.clear_all_records_in_table()
def test_should_return_information_from_soap_fault_returned_from_spine_in_original_post_request_to_client(self):
"""
Message ID: AD7D39A8-1B6C-4520-8367-6B7BEBD7B842 configured in fakespine to return a SOAP Fault error.
Error found here: fake_spine/fake_spine/configured_responses/soap_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='AD7D39A8-1B6C-4520-8367-6B7BEBD7B842'
)
# Act
response = MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=False) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
TextErrorResponseAssertor(response.text) \
.assert_error_code(200) \
.assert_code_context('urn:nhs:names:error:tms') \
.assert_severity('Error')
def test_should_record_message_status_as_nackd_when_soap_error_response_returned_from_spine(self):
"""
Message ID: AD7D39A8-1B6C-4520-8367-6B7BEBD7B842 configured in fakespine to return a SOAP Fault error.
Error found here: fake_spine/fake_spine/configured_responses/soap_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='AD7D39A8-1B6C-4520-8367-6B7BEBD7B842'
)
# Act
MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=False) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
DynamoMhsTableStateAssertor(MHS_STATE_TABLE_DYNAMO_WRAPPER.get_all_records_in_table()) \
.assert_single_item_exists_with_key(message_id) \
.assert_item_contains_values(
{
'INBOUND_STATUS': None,
'OUTBOUND_STATUS': 'OUTBOUND_MESSAGE_NACKD',
'WORKFLOW': 'async-express'
})
def test_should_return_information_from_ebxml_fault_returned_from_spine_in_original_post_request(self):
"""
        Message ID: '7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA' configured in fakespine to return an ebxml Fault error.
Error found here: fake_spine/fake_spine/configured_responses/ebxml_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA'
)
# Act
response = MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=False) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
TextErrorResponseAssertor(response.text) \
.assert_code_context('urn:oasis:names:tc:ebxml') \
.assert_severity('Error') \
.assert_error_type('ebxml_error')
def test_should_record_message_status_as_nackd_when_ebxml_error_response_returned_from_spine(self):
"""
        Message ID: '7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA' configured in fakespine to return an ebxml Fault error.
Error found here: fake_spine/fake_spine/configured_responses/ebxml_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA'
)
# Act
response = MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=False) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
DynamoMhsTableStateAssertor(MHS_STATE_TABLE_DYNAMO_WRAPPER.get_all_records_in_table()) \
.assert_single_item_exists_with_key(message_id) \
.assert_item_contains_values(
{
'INBOUND_STATUS': None,
'OUTBOUND_STATUS': 'OUTBOUND_MESSAGE_NACKD',
'WORKFLOW': 'async-express'
})
def test_should_return_information_from_soap_fault_returned_from_spine_in_original_request_to_client_when_sync_async_requested(self):
"""
Message ID: AD7D39A8-1B6C-4520-8367-6B7BEBD7B842 configured in fakespine to return a SOAP Fault error.
Error found here: fake_spine/fake_spine/configured_responses/soap_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='AD7D39A8-1B6C-4520-8367-6B7BEBD7B842'
)
# Act
response = MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=True) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
TextErrorResponseAssertor(response.text) \
.assert_error_code(200) \
.assert_code_context('urn:nhs:names:error:tms') \
.assert_severity('Error')
def test_should_record_message_status_when_soap_error_response_returned_from_spine_and_sync_async_requested(self):
"""
Message ID: AD7D39A8-1B6C-4520-8367-6B7BEBD7B842 configured in fakespine to return a SOAP Fault error.
Error found here: fake_spine/fake_spine/configured_responses/soap_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='AD7D39A8-1B6C-4520-8367-6B7BEBD7B842'
)
# Act
MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=True) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
DynamoMhsTableStateAssertor(MHS_STATE_TABLE_DYNAMO_WRAPPER.get_all_records_in_table()) \
.assert_single_item_exists_with_key(message_id) \
.assert_item_contains_values(
{
'INBOUND_STATUS': None,
'OUTBOUND_STATUS': 'OUTBOUND_SYNC_ASYNC_MESSAGE_SUCCESSFULLY_RESPONDED',
'WORKFLOW': 'sync-async'
})
def test_should_return_information_in_ebxml_fault_returned_from_spine_in_original_post_request_to_client_when_sync_async_requested(self):
"""
        Message ID: '7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA' configured in fakespine to return an ebxml Fault error.
Error found here: fake_spine/fake_spine/configured_responses/ebxml_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA'
)
# Act
response = MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=True) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
TextErrorResponseAssertor(response.text) \
.assert_code_context('urn:oasis:names:tc:ebxml') \
.assert_severity('Error') \
.assert_error_type('ebxml_error')
def test_should_record_message_status_when_ebxml_error_response_returned_from_spine_and_sync_async_requested(self):
"""
        Message ID: '7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA' configured in fakespine to return an ebxml Fault error.
Error found here: fake_spine/fake_spine/configured_responses/ebxml_fault_single_error.xml
"""
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689177923',
message_id='7AA57E38-8B20-4AE0-9E73-B9B0C0C42BDA'
)
# Act
MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=True) \
.with_body(message) \
.execute_post_expecting_error_response()
# Assert
DynamoMhsTableStateAssertor(MHS_STATE_TABLE_DYNAMO_WRAPPER.get_all_records_in_table()) \
.assert_single_item_exists_with_key(message_id) \
.assert_item_contains_values(
{
'INBOUND_STATUS': None,
'OUTBOUND_STATUS': 'OUTBOUND_SYNC_ASYNC_MESSAGE_SUCCESSFULLY_RESPONDED',
'WORKFLOW': 'sync-async'
})
def test_should_return_bad_request_when_client_sends_invalid_message(self):
# Arrange
message, message_id = build_message('QUPC_IN160101UK05', '9689174606')
# Act
response = MhsHttpRequestBuilder() \
.with_headers(interaction_id='QUPC_IN160101UK05', message_id=message_id, sync_async=False) \
.with_body({'blah': '123'}) \
.execute_post_expecting_bad_request_response()
# Assert
self.assertEqual(response.text, "400: Invalid request. Validation errors: {'payload': ['Not a valid string.']}")
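Both `MhsHttpRequestBuilder` and `DynamoMhsTableStateAssertor` in the tests above use a fluent style: every method performs one step or one check and returns `self`, so calls can be chained with line continuations. The real assertor is defined elsewhere in the repository; the following is only a minimal sketch of the pattern, with the class name and the `records` structure (a mapping of message id to item dict) assumed for illustration.

```python
class TableStateAssertor:
    """Minimal fluent-assertor sketch: each method checks one thing and
    returns self so assertions can be chained, as in the tests above."""

    def __init__(self, records):
        self.records = records  # assumed: mapping of message id -> item dict
        self.item = None

    def assert_single_item_exists_with_key(self, key):
        assert key in self.records, f"no item with key {key!r}"
        self.item = self.records[key]
        return self

    def assert_item_contains_values(self, expected):
        for field, value in expected.items():
            actual = self.item.get(field)
            assert actual == value, (
                f"{field}: expected {value!r}, got {actual!r}")
        return self
```

A chained call then reads like the assertions in the tests, e.g. `TableStateAssertor(records).assert_single_item_exists_with_key(message_id).assert_item_contains_values({'WORKFLOW': 'async-express'})`.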
| 49.108108 | 141 | 0.657311 | 1,139 | 10,902 | 5.882353 | 0.134328 | 0.063134 | 0.017463 | 0.030896 | 0.886866 | 0.870597 | 0.864328 | 0.842537 | 0.812836 | 0.804925 | 0 | 0.068759 | 0.266281 | 10,902 | 221 | 142 | 49.330317 | 0.768846 | 0.196936 | 0 | 0.728682 | 0 | 0 | 0.147017 | 0.062263 | 0 | 0 | 0 | 0 | 0.24031 | 1 | 0.077519 | false | 0 | 0.046512 | 0 | 0.131783 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e72a256d1f13c3c018b3dfc49d59525f77979a93 | 168 | py | Python | finitewave/cpuwave3D/fibrosis/__init__.py | ArsOkenov/Finitewave | 14274d74be824a395b47a5c53ba18188798ab70d | [
"MIT"
] | null | null | null | finitewave/cpuwave3D/fibrosis/__init__.py | ArsOkenov/Finitewave | 14274d74be824a395b47a5c53ba18188798ab70d | [
"MIT"
] | null | null | null | finitewave/cpuwave3D/fibrosis/__init__.py | ArsOkenov/Finitewave | 14274d74be824a395b47a5c53ba18188798ab70d | [
"MIT"
] | null | null | null | from finitewave.cpuwave3D.fibrosis.diffuse_3d_pattern import Diffuse3DPattern
from finitewave.cpuwave3D.fibrosis.structural_3d_pattern \
import Structural3DPattern
| 42 | 77 | 0.880952 | 18 | 168 | 8 | 0.611111 | 0.194444 | 0.319444 | 0.430556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03871 | 0.077381 | 168 | 3 | 78 | 56 | 0.890323 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
99b9e497fadf448104dfeeff28fdaa6696ba8315 | 22,511 | py | Python | tests/scruples/demos/norms/test_app.py | allenai/scruples | 9a43459c507e57d89ab8442a4f3985cedecb8710 | [
"Apache-2.0"
] | 29 | 2020-05-09T10:55:45.000Z | 2022-03-28T16:18:02.000Z | tests/scruples/demos/norms/test_app.py | allenai/scruples | 9a43459c507e57d89ab8442a4f3985cedecb8710 | [
"Apache-2.0"
] | null | null | null | tests/scruples/demos/norms/test_app.py | allenai/scruples | 9a43459c507e57d89ab8442a4f3985cedecb8710 | [
"Apache-2.0"
] | 6 | 2020-10-05T12:24:28.000Z | 2021-12-06T19:51:06.000Z | """Tests for scruples.demos.norms.app."""
import unittest
import pytest
from scruples import settings
from scruples.demos.norms import app
class AppTestCase(unittest.TestCase):
"""Test scruples.demos.norms.app.app."""
def setUp(self):
# initialize the test client
app.app.config['TESTING'] = True
self.client = app.app.test_client()
def test_home(self):
response = self.client.get('/')
self.assertEqual(response.status_code, 200)
def test_predict_actions_throws_error_for_no_json(self):
response = self.client.post('/api/actions/predict')
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'No JSON',
'message': 'Requests to this API endpoint must contain JSON.'
}
])
def test_predict_actions_throws_error_for_non_list(self):
# when the posted JSON is an object
response = self.client.post('/api/actions/predict', json={'foo': 'bar'})
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The request data must be an array of objects.'
}
])
# when the posted JSON is a string
response = self.client.post('/api/actions/predict', json='foo')
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The request data must be an array of objects.'
}
])
    def test_predict_actions_throws_error_when_entries_are_not_objects(self):
# when the posted JSON has null entries
response = self.client.post('/api/actions/predict', json=[None])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar'},
None
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
# when the posted JSON has string entries
response = self.client.post('/api/actions/predict', json=['foo'])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar'},
'foo'
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
# when the posted JSON has array entries
response = self.client.post('/api/actions/predict', json=[['foo']])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar'},
['foo']
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
def test_predict_actions_throws_error_for_missing_keys_in_object(self):
# when action1 is missing in an object
response = self.client.post('/api/actions/predict', json=[
{'action2': 'foo'}
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Missing Key',
'message': 'Each object must have an "action1" key.'
}
])
# when action1 is missing in some objects
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar'},
{'action2': 'foo'}
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Missing Key',
'message': 'Each object must have an "action1" key.'
}
])
# when action2 is missing in an object
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo'}
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Missing Key',
'message': 'Each object must have an "action2" key.'
}
])
# when action2 is missing in some objects
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar'},
{'action1': 'foo'}
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Missing Key',
'message': 'Each object must have an "action2" key.'
}
])
def test_predict_actions_throws_error_on_non_string_values(self):
# when action1 has a null, non-string value
response = self.client.post('/api/actions/predict', json=[
{'action1': None, 'action2': 'bar'},
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The value corresponding to the "action1" key'
' must be a string.'
}
])
# when action2 has a null, non-string value
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': None},
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The value corresponding to the "action2" key'
' must be a string.'
}
])
# when action1 has an integer, non-string value
response = self.client.post('/api/actions/predict', json=[
{'action1': 1, 'action2': 'bar'},
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The value corresponding to the "action1" key'
' must be a string.'
}
])
# when action2 has an integer, non-string value
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 5},
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The value corresponding to the "action2" key'
' must be a string.'
}
])
def test_predict_actions_throws_error_for_unrecognized_keys(self):
# when an object has an unrecognized key
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar', 'baz': 'qux'}
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Unexpected Key',
'message': 'Each object must only have "action1" and "action2"'
' keys.'
}
])
# when some objects have unrecognized keys
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar'},
{'action1': 'foo', 'action2': 'bar', 'baz': 'qux'}
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Unexpected Key',
'message': 'Each object must only have "action1" and "action2"'
' keys.'
}
])
def test_predict_actions_can_throw_multiple_errors(self):
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo'},
{'action1': 'foo', 'action2': 'bar', 'baz': 'qux'}
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Missing Key',
'message': 'Each object must have an "action2" key.'
},
{
'error': 'Unexpected Key',
'message': 'Each object must only have "action1" and "action2"'
' keys.'
}
])
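The validation behaviour pinned down by the assertions above can be summarised as a pure function over the request body. The demo's actual validator is not shown in this file; this is only a hypothetical sketch that reproduces the documented error objects for the actions endpoint, including the de-duplication implied by `test_predict_actions_can_throw_multiple_errors` (repeated problems across objects collapse to one error entry each).

```python
def validate_pairs(data):
    """Sketch of the /api/actions/predict validation implied by the tests.

    Returns a list of {'error', 'message'} dicts; an empty list means the
    payload is valid. Duplicate errors are collapsed, preserving order.
    """
    if not isinstance(data, list):
        return [{'error': 'Wrong Type',
                 'message': 'The request data must be an array of objects.'}]
    errors = []
    for obj in data:
        if not isinstance(obj, dict):
            errors.append({
                'error': 'Wrong Type',
                'message': 'Each element of the data array must be an object.'})
            continue
        for key in ('action1', 'action2'):
            if key not in obj:
                errors.append({
                    'error': 'Missing Key',
                    'message': f'Each object must have an "{key}" key.'})
            elif not isinstance(obj[key], str):
                errors.append({
                    'error': 'Wrong Type',
                    'message': f'The value corresponding to the "{key}" key'
                               ' must be a string.'})
        if set(obj) - {'action1', 'action2'}:
            errors.append({
                'error': 'Unexpected Key',
                'message': 'Each object must only have "action1" and "action2"'
                           ' keys.'})
    deduped = []
    for err in errors:
        if err not in deduped:
            deduped.append(err)
    return deduped
```

Run against the payload from the multiple-errors test above, this yields exactly the two error objects the test asserts, in the same order.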
@pytest.mark.slow
@pytest.mark.skipif(
settings.NORMS_ACTIONS_BASELINE is None
or settings.NORMS_ACTIONS_MODEL is None
or settings.NORMS_PREDICT_BATCH_SIZE is None,
reason='requires the norms demo environment variables.')
def test_predict_actions_computes_action_prediction(self):
response = self.client.post('/api/actions/predict', json=[
{'action1': 'foo', 'action2': 'bar'}
])
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.json), 1)
self.assertEqual(
set(response.json[0].keys()),
set(['action1', 'action2']))
self.assertIsInstance(response.json[0]['action1'], float)
self.assertIsInstance(response.json[0]['action2'], float)
@pytest.mark.slow
@pytest.mark.skipif(
settings.NORMS_ACTIONS_BASELINE is None
or settings.NORMS_ACTIONS_MODEL is None
or settings.NORMS_PREDICT_BATCH_SIZE is None,
reason='requires the norms demo environment variables.')
def test_predict_actions_computes_plots(self):
response = self.client.post('/api/actions/predict?plot=true', json=[
{'action1': 'foo', 'action2': 'bar'}
])
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.json), 1)
self.assertEqual(
set(response.json[0].keys()),
set(['action1', 'action2', 'plot']))
self.assertIsInstance(response.json[0]['action1'], float)
self.assertIsInstance(response.json[0]['action2'], float)
self.assertIsInstance(response.json[0]['plot'], str)
def test_predict_corpus_throws_error_for_no_json(self):
response = self.client.post('/api/corpus/predict')
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'No JSON',
'message': 'Requests to this API endpoint must contain JSON.'
}
])
def test_predict_corpus_throws_error_for_non_list(self):
# when the posted JSON is an object
response = self.client.post('/api/corpus/predict', json={'foo': 'bar'})
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The request data must be an array of objects.'
}
])
# when the posted JSON is a string
response = self.client.post('/api/corpus/predict', json='foo')
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'The request data must be an array of objects.'
}
])
def test_predict_corpus_throws_error_when_entries_are_not_objects(self):
# when the posted JSON has null entries
response = self.client.post('/api/corpus/predict', json=[None])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
response = self.client.post('/api/corpus/predict', json=[
{'title': 'foo', 'text': 'bar'},
None
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
# when the posted JSON has string entries
response = self.client.post('/api/corpus/predict', json=['foo'])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
response = self.client.post('/api/corpus/predict', json=[
{'title': 'foo', 'text': 'bar'},
'foo'
])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
# when the posted JSON has array entries
response = self.client.post('/api/corpus/predict', json=[['foo']])
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json, [
{
'error': 'Wrong Type',
'message': 'Each element of the data array must be an object.'
}
])
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': 'bar'},
            ['foo']
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Wrong Type',
                'message': 'Each element of the data array must be an object.'
            }
        ])

    def test_predict_corpus_throws_error_for_missing_keys_in_object(self):
        # when title is missing in an object
        response = self.client.post('/api/corpus/predict', json=[
            {'text': 'foo'}
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Missing Key',
                'message': 'Each object must have a "title" key.'
            }
        ])

        # when title is missing in some objects
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': 'bar'},
            {'text': 'foo'}
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Missing Key',
                'message': 'Each object must have a "title" key.'
            }
        ])

        # when text is missing in an object
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo'}
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Missing Key',
                'message': 'Each object must have a "text" key.'
            }
        ])

        # when text is missing in some objects
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': 'bar'},
            {'title': 'foo'}
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Missing Key',
                'message': 'Each object must have a "text" key.'
            }
        ])

    def test_predict_corpus_throws_error_on_non_string_values(self):
        # when title has a null, non-string value
        response = self.client.post('/api/corpus/predict', json=[
            {'title': None, 'text': 'bar'},
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Wrong Type',
                'message': 'The value corresponding to the "title" key'
                           ' must be a string.'
            }
        ])

        # when text has a null, non-string value
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': None},
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Wrong Type',
                'message': 'The value corresponding to the "text" key'
                           ' must be a string.'
            }
        ])

        # when title has an integer, non-string value
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 1, 'text': 'bar'},
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Wrong Type',
                'message': 'The value corresponding to the "title" key'
                           ' must be a string.'
            }
        ])

        # when text has an integer, non-string value
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': 5},
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Wrong Type',
                'message': 'The value corresponding to the "text" key'
                           ' must be a string.'
            }
        ])

    def test_predict_corpus_throws_error_for_unrecognized_keys(self):
        # when an object has an unrecognized key
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': 'bar', 'baz': 'qux'}
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Unexpected Key',
                'message': 'Each object must only have "title" and "text"'
                           ' keys.'
            }
        ])

        # when some objects have unrecognized keys
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': 'bar'},
            {'title': 'foo', 'text': 'bar', 'baz': 'qux'}
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Unexpected Key',
                'message': 'Each object must only have "title" and "text"'
                           ' keys.'
            }
        ])

    def test_predict_corpus_can_throw_multiple_errors(self):
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo'},
            {'title': 'foo', 'text': 'bar', 'baz': 'qux'}
        ])
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json, [
            {
                'error': 'Missing Key',
                'message': 'Each object must have a "text" key.'
            },
            {
                'error': 'Unexpected Key',
                'message': 'Each object must only have "title" and "text"'
                           ' keys.'
            }
        ])

    @pytest.mark.slow
    @pytest.mark.skipif(
        settings.NORMS_CORPUS_BASELINE is None
        or settings.NORMS_CORPUS_MODEL is None
        or settings.NORMS_PREDICT_BATCH_SIZE is None,
        reason='requires the norms demo environment variables.')
    def test_predict_corpus_computes_corpus_prediction(self):
        response = self.client.post('/api/corpus/predict', json=[
            {'title': 'foo', 'text': 'bar'}
        ])
        self.assertEqual(response.status_code, 200)
        self.assertEqual(len(response.json), 1)
        self.assertEqual(
            set(response.json[0].keys()),
            set([
                'AUTHOR',
                'OTHER',
                'EVERYBODY',
                'NOBODY',
                'INFO'
            ]))
        self.assertIsInstance(response.json[0]['AUTHOR'], float)
        self.assertIsInstance(response.json[0]['OTHER'], float)
        self.assertIsInstance(response.json[0]['EVERYBODY'], float)
        self.assertIsInstance(response.json[0]['NOBODY'], float)
        self.assertIsInstance(response.json[0]['INFO'], float)

    @pytest.mark.slow
    @pytest.mark.skipif(
        settings.NORMS_CORPUS_BASELINE is None
        or settings.NORMS_CORPUS_MODEL is None
        or settings.NORMS_PREDICT_BATCH_SIZE is None,
        reason='requires the norms demo environment variables.')
    def test_predict_corpus_computes_plots(self):
        response = self.client.post('/api/corpus/predict?plot=true', json=[
            {'title': 'foo', 'text': 'bar'}
        ])
        self.assertEqual(response.status_code, 200)
        self.assertEqual(len(response.json), 1)
        self.assertEqual(
            set(response.json[0].keys()),
            set([
                'AUTHOR',
                'OTHER',
                'EVERYBODY',
                'NOBODY',
                'INFO',
                'plot_author',
                'plot_other'
            ]))
        self.assertIsInstance(response.json[0]['AUTHOR'], float)
        self.assertIsInstance(response.json[0]['OTHER'], float)
        self.assertIsInstance(response.json[0]['EVERYBODY'], float)
        self.assertIsInstance(response.json[0]['NOBODY'], float)
        self.assertIsInstance(response.json[0]['INFO'], float)
        self.assertIsInstance(response.json[0]['plot_author'], str)
        self.assertIsInstance(response.json[0]['plot_other'], str)
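```python
# A hedged sketch of the validation behaviour the tests above exercise.
# `validate_corpus_payload` is a hypothetical helper name, not the
# endpoint's actual implementation. Note the first-seen de-duplication of
# (error, message) pairs: it explains why "missing in some objects" yields
# a single error and why the multiple-error test returns them in order.
def validate_corpus_payload(data):
    errors = []
    seen = set()

    def add(error, message):
        # Record each distinct error once, preserving first-seen order.
        if (error, message) not in seen:
            seen.add((error, message))
            errors.append({'error': error, 'message': message})

    for item in data:
        if not isinstance(item, dict):
            add('Wrong Type',
                'Each element of the data array must be an object.')
            continue
        for key in ('title', 'text'):
            if key not in item:
                add('Missing Key',
                    'Each object must have a "%s" key.' % key)
            elif not isinstance(item[key], str):
                add('Wrong Type',
                    'The value corresponding to the "%s" key'
                    ' must be a string.' % key)
        if set(item) - {'title', 'text'}:
            add('Unexpected Key',
                'Each object must only have "title" and "text" keys.')
    return errors
```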


# ===========================================================================
# File: tests/test_script_questions.py
# Repo: qrkourier/milc (head 28bb8f5df201283cc61e87439944e2bf3ded147b,
# license: X11)
# ===========================================================================
from .common import check_assert, check_command, check_returncode


def test_questions():
    result = check_command('./questions')
    check_returncode(result)
    check_assert(result, 'User has chosen to stop.' in result.stdout)
    check_assert(result, 'User is stopping.' in result.stdout)
    check_assert(result, 'Interesting answer: None' in result.stdout)


def test_questions_interactive():
    result = check_command('./questions', '--interactive', input='y\n2\nbecause\n')
    check_returncode(result)
    check_assert(result, 'User has chosen to continue.' in result.stdout)
    check_assert(result, 'User is not stopping.' in result.stdout)
    check_assert(result, 'Interesting answer: because' in result.stdout)


def test_questions_yes():
    result = check_command('./questions', '--yes')
    check_returncode(result)
    check_assert(result, 'User has chosen to continue.' in result.stdout)
    check_assert(result, 'User is stopping.' in result.stdout)
    check_assert(result, 'Interesting answer: None' in result.stdout)


def test_questions_no():
    result = check_command('./questions', '--no')
    check_returncode(result)
    check_assert(result, 'User has chosen to stop.' in result.stdout)
    check_assert(result, 'User is stopping.' in result.stdout)
    check_assert(result, 'Interesting answer: None' in result.stdout)
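```python
# A hedged sketch of what the `.common` helpers imported above might look
# like; the real module's implementations may differ. `check_command` runs
# the CLI under test and captures its text output, and the two check
# helpers surface that output when an expectation fails.
import subprocess


def check_command(*args, input=None):
    # Run the command, capturing stdout/stderr as text.
    return subprocess.run(args, input=input, capture_output=True, text=True)


def check_returncode(result, expected=0):
    # Fail loudly with stderr attached when the exit code is unexpected.
    assert result.returncode == expected, result.stderr


def check_assert(result, condition):
    # Attach the captured output to any failing assertion.
    assert condition, result.stdout + result.stderr
```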


# ===========================================================================
# File: venv/lib/python3.8/site-packages/azureml/_restclient/operations/jasmine_operations.py
# Repo: amcclead7336/Enterprise_Data_Science_Final
# (head ccdc0aa08d4726bf82d71c11a1cc0c63eb301a28, licenses: Unlicense, MIT)
# ===========================================================================
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator 2.3.33.0
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
from msrest.pipeline import ClientRawResponse
from .. import models
class JasmineOperations(object):
"""JasmineOperations operations.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.config = config
    def get_curated_environment(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, auto_ml_curated_environment_input_dto=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param auto_ml_curated_environment_input_dto:
        :type auto_ml_curated_environment_input_dto:
         ~_restclient.models.AutoMLCuratedEnvInput
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: AutoMLCuratedEnvOutput or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.AutoMLCuratedEnvOutput or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.get_curated_environment.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if auto_ml_curated_environment_input_dto is not None:
            body_content = self._serialize.body(auto_ml_curated_environment_input_dto, 'AutoMLCuratedEnvInput')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('AutoMLCuratedEnvOutput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_curated_environment.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/getCuratedEnvironment'}
    def submit_remote_snapshot_run(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, parent_run_id, run_definition=None, snapshot_id=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param parent_run_id:
        :type parent_run_id: str
        :param run_definition:
        :type run_definition: ~_restclient.models.RunDefinition
        :param snapshot_id:
        :type snapshot_id: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: StartRunResult or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.StartRunResult or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.submit_remote_snapshot_run.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'parentRunId': self._serialize.url("parent_run_id", parent_run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if snapshot_id is not None:
            query_parameters['snapshotId'] = self._serialize.query("snapshot_id", snapshot_id, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if run_definition is not None:
            body_content = self._serialize.body(run_definition, 'RunDefinition')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('StartRunResult', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    submit_remote_snapshot_run.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/runs/{parentRunId}/submitSnapshotRun'}
    def continue_run(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, parent_run_id, updated_iterations=None, updated_time=-1, updated_exit_score=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param parent_run_id:
        :type parent_run_id: str
        :param updated_iterations:
        :type updated_iterations: int
        :param updated_time:
        :type updated_time: int
        :param updated_exit_score:
        :type updated_exit_score: float
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: None or ClientRawResponse if raw=true
        :rtype: None or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.continue_run.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'parentRunId': self._serialize.url("parent_run_id", parent_run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if updated_iterations is not None:
            query_parameters['updatedIterations'] = self._serialize.query("updated_iterations", updated_iterations, 'int')
        if updated_time is not None:
            query_parameters['updatedTime'] = self._serialize.query("updated_time", updated_time, 'int')
        if updated_exit_score is not None:
            query_parameters['updatedExitScore'] = self._serialize.query("updated_exit_score", updated_exit_score, 'float')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response
    # The URL template uses {parentRunId} to match the 'parentRunId' path
    # format argument above (the template previously read {parentrunId}).
    continue_run.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/runs/{parentRunId}/continueRun'}
    def create_parent_run_method(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, create_parent_run_dto=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param create_parent_run_dto:
        :type create_parent_run_dto: ~_restclient.models.CreateParentRun
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: str or ClientRawResponse if raw=true
        :rtype: str or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.create_parent_run_method.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if create_parent_run_dto is not None:
            body_content = self._serialize.body(create_parent_run_dto, 'CreateParentRun')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('str', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    create_parent_run_method.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/run'}
    def local_run_get_next_task(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, parent_run_id, start_child_run=None, local_run_get_next_task_input=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param parent_run_id:
        :type parent_run_id: str
        :param start_child_run:
        :type start_child_run: bool
        :param local_run_get_next_task_input:
        :type local_run_get_next_task_input:
         ~_restclient.models.LocalRunGetNextTaskInput
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: IterationTask or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.IterationTask or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.local_run_get_next_task.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'parentRunId': self._serialize.url("parent_run_id", parent_run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if start_child_run is not None:
            query_parameters['startChildRun'] = self._serialize.query("start_child_run", start_child_run, 'bool')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if local_run_get_next_task_input is not None:
            body_content = self._serialize.body(local_run_get_next_task_input, 'LocalRunGetNextTaskInput')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('IterationTask', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    local_run_get_next_task.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/runs/{parentRunId}/next'}
    def local_run_get_next_task_batch(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, parent_run_id, start_child_runs=None, local_run_get_next_task_batch_input=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param parent_run_id:
        :type parent_run_id: str
        :param start_child_runs:
        :type start_child_runs: bool
        :param local_run_get_next_task_batch_input:
        :type local_run_get_next_task_batch_input:
         ~_restclient.models.LocalRunGetNextTaskBatchInput
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: BatchIterationTask or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.BatchIterationTask or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.local_run_get_next_task_batch.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'parentRunId': self._serialize.url("parent_run_id", parent_run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if start_child_runs is not None:
            query_parameters['startChildRuns'] = self._serialize.query("start_child_runs", start_child_runs, 'bool')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if local_run_get_next_task_batch_input is not None:
            body_content = self._serialize.body(local_run_get_next_task_batch_input, 'LocalRunGetNextTaskBatchInput')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('BatchIterationTask', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    local_run_get_next_task_batch.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/runs/{parentRunId}/next_task_batch'}
    def get_pipeline(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, parent_run_id, worker_id, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param parent_run_id:
        :type parent_run_id: str
        :param worker_id:
        :type worker_id: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: IterationTask or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.IterationTask or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.get_pipeline.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'parentRunId': self._serialize.url("parent_run_id", parent_run_id, 'str'),
            'workerId': self._serialize.url("worker_id", worker_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('IterationTask', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_pipeline.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/runs/{parentRunId}/workers/{workerId}'}
    def change_run_status(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, run_id, target_status, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param run_id:
        :type run_id: str
        :param target_status:
        :type target_status: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: None or ClientRawResponse if raw=true
        :rtype: None or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.change_run_status.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'runId': self._serialize.url("run_id", run_id, 'str'),
            'targetStatus': self._serialize.url("target_status", target_status, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response
    change_run_status.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/runs/{runId}/changerunstatus/{targetStatus}'}

    def cancel_run(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, run_id, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param run_id:
        :type run_id: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: None or ClientRawResponse if raw=true
        :rtype: None or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.cancel_run.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'runId': self._serialize.url("run_id", run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response
    cancel_run.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/cancelrun/{runId}'}
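All of these operations share one return convention: with `raw=True` the caller receives a `ClientRawResponse` wrapping both the deserialized payload and the raw HTTP response; otherwise a void operation such as `cancel_run` simply returns `None`. A toy sketch of that convention (`FakeRawResponse` is a hypothetical stand-in, not the real `msrest.pipeline.ClientRawResponse`):

```python
class FakeRawResponse:
    # Stand-in for msrest.pipeline.ClientRawResponse: pairs the deserialized
    # output with the raw HTTP response object.
    def __init__(self, output, response):
        self.output = output
        self.response = response

def finish_void_call(response, raw=False):
    # Mirrors the tail of cancel_run: a void operation returns None unless
    # raw=True, in which case a wrapper (carrying no payload) is returned.
    if raw:
        return FakeRawResponse(None, response)
    return None

wrapped = finish_void_call("<http response>", raw=True)
```

Callers who need status codes or headers from a 200 response therefore pass `raw=True`; everyone else can treat success as "no exception was raised".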

    def get_feature_profiles(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, run_id, feature_profile_input_dto=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param run_id:
        :type run_id: str
        :param feature_profile_input_dto:
        :type feature_profile_input_dto:
         ~_restclient.models.FeatureProfileInput
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: FeatureProfileOutput or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.FeatureProfileOutput or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.get_feature_profiles.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'runId': self._serialize.url("run_id", run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if feature_profile_input_dto is not None:
            body_content = self._serialize.body(feature_profile_input_dto, 'FeatureProfileInput')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('FeatureProfileOutput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_feature_profiles.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/featureprofiles/{runId}'}
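Unlike the two operations above, `get_feature_profiles` takes an optional request body: the DTO is serialized only when it is not `None`, and `body_content = None` means no payload is sent at all. A simplified sketch of that branch (using `json.dumps` as a stand-in for `self._serialize.body`, which in msrest also applies the model's attribute map):

```python
import json

def build_body(dto):
    # Mirrors the "Construct body" branch: serialize the DTO when present,
    # otherwise return None so that no payload is posted.
    if dto is not None:
        return json.dumps(dto)  # stand-in for self._serialize.body(dto, 'FeatureProfileInput')
    return None
```

So `build_body({"runId": "r1"})` yields a JSON string, while `build_body(None)` yields `None` and the POST goes out body-less.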

    def on_demand_model_explain(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, run_id, model_explain_dto=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param run_id:
        :type run_id: str
        :param model_explain_dto:
        :type model_explain_dto: ~_restclient.models.ModelExplain
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: str or ClientRawResponse if raw=true
        :rtype: str or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.on_demand_model_explain.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'runId': self._serialize.url("run_id", run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if model_explain_dto is not None:
            body_content = self._serialize.body(model_explain_dto, 'ModelExplain')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('str', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    on_demand_model_explain.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/modelexplain/{runId}'}

    def on_demand_model_test(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, run_id, model_test_dto=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param run_id:
        :type run_id: str
        :param model_test_dto:
        :type model_test_dto: ~_restclient.models.ModelTest
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: str or ClientRawResponse if raw=true
        :rtype: str or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.on_demand_model_test.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'runId': self._serialize.url("run_id", run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if model_test_dto is not None:
            body_content = self._serialize.body(model_test_dto, 'ModelTest')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('str', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    on_demand_model_test.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/modeltest/{runId}'}

    def notebook_gen_method(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, run_id, notebook_gen_dto=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param run_id:
        :type run_id: str
        :param notebook_gen_dto:
        :type notebook_gen_dto: ~_restclient.models.NotebookGen
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: str or ClientRawResponse if raw=true
        :rtype: str or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.notebook_gen_method.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str'),
            'runId': self._serialize.url("run_id", run_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if notebook_gen_dto is not None:
            body_content = self._serialize.body(notebook_gen_dto, 'NotebookGen')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('str', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    notebook_gen_method.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/notebookgen/{runId}'}

    def create_job(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, create_auto_ml_job_input_dto=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param create_auto_ml_job_input_dto:
        :type create_auto_ml_job_input_dto:
         ~_restclient.models.CreateAutoMLJobInput
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: CreateAutoMLJobOutput or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.CreateAutoMLJobOutput or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.create_job.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        if create_auto_ml_job_input_dto is not None:
            body_content = self._serialize.body(create_auto_ml_job_input_dto, 'CreateAutoMLJobInput')
        else:
            body_content = None

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('CreateAutoMLJobOutput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    create_job.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/jobs/create'}

    def validate_run_input(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, create_parent_run_dto, maximum_severity=2, flight="Prod", custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param create_parent_run_dto:
        :type create_parent_run_dto: ~_restclient.models.CreateParentRun
        :param maximum_severity:
        :type maximum_severity: int
        :param flight: Possible values include: 'Prod', 'Test'
        :type flight: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: ErrorResponse or ClientRawResponse if raw=true
        :rtype: ~_restclient.models.ErrorResponse or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.validate_run_input.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if maximum_severity is not None:
            query_parameters['maximumSeverity'] = self._serialize.query("maximum_severity", maximum_severity, 'int')
        if flight is not None:
            query_parameters['flight'] = self._serialize.query("flight", flight, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        body_content = self._serialize.body(create_parent_run_dto, 'CreateParentRun')

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200, 204]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('ErrorResponse', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    validate_run_input.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/validation'}
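`validate_run_input` is the first operation here with query parameters: the defaults (`maximum_severity=2`, `flight="Prod"`) are applied in the method signature, and each parameter is emitted only when it is not `None`, so passing `None` explicitly suppresses it from the query string. A sketch of that logic (plain `str()` stands in for `self._serialize.query`, which also validates the declared type):

```python
def build_query(maximum_severity=2, flight="Prod"):
    # Mirrors the "Construct parameters" block: defaults come from the
    # signature, and a parameter is only added when it is not None.
    query = {}
    if maximum_severity is not None:
        query["maximumSeverity"] = str(maximum_severity)  # stand-in for _serialize.query(..., 'int')
    if flight is not None:
        query["flight"] = flight  # stand-in for _serialize.query(..., 'str')
    return query
```

Calling `build_query()` yields both defaults, while `build_query(flight=None)` drops the `flight` key entirely.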

    def validate_many_models_run_input(
            self, subscription_id, resource_group_name, workspace_name, experiment_id, many_models_input_dto, experiment_name=None, custom_headers=None, raw=False, **operation_config):
        """
        :param subscription_id: The Azure Subscription ID.
        :type subscription_id: str
        :param resource_group_name: The Name of the resource group in which
         the workspace is located.
        :type resource_group_name: str
        :param workspace_name: The name of the workspace.
        :type workspace_name: str
        :param experiment_id: Experiment Id.
        :type experiment_id: str
        :param many_models_input_dto:
        :type many_models_input_dto:
         ~_restclient.models.ManyModelsRunValidationInput
        :param experiment_name:
        :type experiment_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: ManyModelsRunValidationOutput or ClientRawResponse if
         raw=true
        :rtype: ~_restclient.models.ManyModelsRunValidationOutput or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<_restclient.models.ErrorResponseException>`
        """
        # Construct URL
        url = self.validate_many_models_run_input.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("subscription_id", subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str'),
            'experimentId': self._serialize.url("experiment_id", experiment_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if experiment_name is not None:
            query_parameters['experimentName'] = self._serialize.query("experiment_name", experiment_name, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        body_content = self._serialize.body(many_models_input_dto, 'ManyModelsRunValidationInput')

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('ManyModelsRunValidationOutput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    validate_many_models_run_input.metadata = {'url': '/jasmine/v1.0/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.MachineLearningServices/workspaces/{workspaceName}/experimentids/{experimentId}/validateManyModelsRun'}
# File: alpyro_msgs/nav_msgs/getmapaction.py (repo: rho2/alpyro_msgs, MIT License)
from typing_extensions import Annotated
from typing import Final
from alpyro_msgs import RosMessage
from alpyro_msgs.nav_msgs.getmapactionfeedback import GetMapActionFeedback
from alpyro_msgs.nav_msgs.getmapactiongoal import GetMapActionGoal
from alpyro_msgs.nav_msgs.getmapactionresult import GetMapActionResult


class GetMapAction(RosMessage):
    __msg_typ__ = "nav_msgs/GetMapAction"
    __msg_def__ = "bmF2X21zZ3MvR2V0TWFwQWN0aW9uR29hbCBhY3Rpb25fZ29hbAogIHN0ZF9tc2dzL0hlYWRlciBoZWFkZXIKICAgIHVpbnQzMiBzZXEKICAgIHRpbWUgc3RhbXAKICAgIHN0cmluZyBmcmFtZV9pZAogIGFjdGlvbmxpYl9tc2dzL0dvYWxJRCBnb2FsX2lkCiAgICB0aW1lIHN0YW1wCiAgICBzdHJpbmcgaWQKICBuYXZfbXNncy9HZXRNYXBHb2FsIGdvYWwKbmF2X21zZ3MvR2V0TWFwQWN0aW9uUmVzdWx0IGFjdGlvbl9yZXN1bHQKICBzdGRfbXNncy9IZWFkZXIgaGVhZGVyCiAgICB1aW50MzIgc2VxCiAgICB0aW1lIHN0YW1wCiAgICBzdHJpbmcgZnJhbWVfaWQKICBhY3Rpb25saWJfbXNncy9Hb2FsU3RhdHVzIHN0YXR1cwogICAgdWludDggUEVORElORz0wCiAgICB1aW50OCBBQ1RJVkU9MQogICAgdWludDggUFJFRU1QVEVEPTIKICAgIHVpbnQ4IFNVQ0NFRURFRD0zCiAgICB1aW50OCBBQk9SVEVEPTQKICAgIHVpbnQ4IFJFSkVDVEVEPTUKICAgIHVpbnQ4IFBSRUVNUFRJTkc9NgogICAgdWludDggUkVDQUxMSU5HPTcKICAgIHVpbnQ4IFJFQ0FMTEVEPTgKICAgIHVpbnQ4IExPU1Q9OQogICAgYWN0aW9ubGliX21zZ3MvR29hbElEIGdvYWxfaWQKICAgICAgdGltZSBzdGFtcAogICAgICBzdHJpbmcgaWQKICAgIHVpbnQ4IHN0YXR1cwogICAgc3RyaW5nIHRleHQKICBuYXZfbXNncy9HZXRNYXBSZXN1bHQgcmVzdWx0CiAgICBuYXZfbXNncy9PY2N1cGFuY3lHcmlkIG1hcAogICAgICBzdGRfbXNncy9IZWFkZXIgaGVhZGVyCiAgICAgICAgdWludDMyIHNlcQogICAgICAgIHRpbWUgc3RhbXAKICAgICAgICBzdHJpbmcgZnJhbWVfaWQKICAgICAgbmF2X21zZ3MvTWFwTWV0YURhdGEgaW5mbwogICAgICAgIHRpbWUgbWFwX2xvYWRfdGltZQogICAgICAgIGZsb2F0MzIgcmVzb2x1dGlvbgogICAgICAgIHVpbnQzMiB3aWR0aAogICAgICAgIHVpbnQzMiBoZWlnaHQKICAgICAgICBnZW9tZXRyeV9tc2dzL1Bvc2Ugb3JpZ2luCiAgICAgICAgICBnZW9tZXRyeV9tc2dzL1BvaW50IHBvc2l0aW9uCiAgICAgICAgICAgIGZsb2F0NjQgeAogICAgICAgICAgICBmbG9hdDY0IHkKICAgICAgICAgICAgZmxvYXQ2NCB6CiAgICAgICAgICBnZW9tZXRyeV9tc2dzL1F1YXRlcm5pb24gb3JpZW50YXRpb24KICAgICAgICAgICAgZmxvYXQ2NCB4CiAgICAgICAgICAgIGZsb2F0NjQgeQogICAgICAgICAgICBmbG9hdDY0IHoKICAgICAgICAgICAgZmxvYXQ2NCB3CiAgICAgIGludDhbXSBkYXRhCm5hdl9tc2dzL0dldE1hcEFjdGlvbkZlZWRiYWNrIGFjdGlvbl9mZWVkYmFjawogIHN0ZF9tc2dzL0hlYWRlciBoZWFkZXIKICAgIHVpbnQzMiBzZXEKICAgIHRpbWUgc3RhbXAKICAgIHN0cmluZyBmcmFtZV9pZAogIGFjdGlvbmxpYl9tc2dzL0dvYWxTdGF0dXMgc3RhdHVzCiAgICB1aW50OCBQRU5ESU5HPTAKICAgIHVpbnQ4IEFDVElWRT0xCiAgICB1aW50OCBQUkVFTVBURUQ9MgogICAgdWludDggU1VDQ0VFREVEPTMKICAgIHVpbnQ4IEFCT1JURUQ9NAogICAgdWludDggUkVKRUNURUQ9NQogICAgdWludDggUFJFRU1QVElORz02CiAgICB1aW50OCBSRUNBTExJTkc9NwogICAgdWludDggUkVDQUxMRUQ9OAogICAgdWludDggTE9TVD05CiAgICBhY3Rpb25saWJfbXNncy9Hb2FsSUQgZ29hbF9pZAogICAgICB0aW1lIHN0YW1wCiAgICAgIHN0cmluZyBpZAogICAgdWludDggc3RhdHVzCiAgICBzdHJpbmcgdGV4dAogIG5hdl9tc2dzL0dldE1hcEZlZWRiYWNrIGZlZWRiYWNrCgo="
    __md5_sum__ = "e611ad23fbf237c031b7536416dc7cd7"

    action_goal: GetMapActionGoal
    action_result: GetMapActionResult
    action_feedback: GetMapActionFeedback
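The `__msg_def__` field appears to hold the base64-encoded plain-text ROS definition of `nav_msgs/GetMapAction`; decoding it recovers the human-readable field layout. For illustration, decoding just the first 16 characters of the string yields the start of that definition:

```python
import base64

# Decoding a prefix of __msg_def__ recovers the readable ROS definition;
# the full string decodes to the complete nav_msgs/GetMapAction layout.
prefix = base64.b64decode("bmF2X21zZ3MvR2V0").decode("ascii")
```

The accompanying `__md5_sum__` is the ROS message MD5 checksum used to verify that publisher and subscriber agree on this definition.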
# File: extensions/.stubs/clrclasses/System/Runtime/InteropServices/__init__.py (repo: vicwjb/Pycad, MIT License)
import __clrclasses__.System.Runtime.InteropServices.Expando as Expando
import __clrclasses__.System.Runtime.InteropServices.ComTypes as ComTypes
from __clrclasses__.System.Runtime.InteropServices import _Activator
from __clrclasses__.System.Runtime.InteropServices import _Assembly
from __clrclasses__.System.Runtime.InteropServices import _AssemblyBuilder
from __clrclasses__.System.Runtime.InteropServices import _AssemblyName
from __clrclasses__.System.Runtime.InteropServices import _Attribute
from __clrclasses__.System.Runtime.InteropServices import _ConstructorBuilder
from __clrclasses__.System.Runtime.InteropServices import _ConstructorInfo
from __clrclasses__.System.Runtime.InteropServices import _CustomAttributeBuilder
from __clrclasses__.System.Runtime.InteropServices import _EnumBuilder
from __clrclasses__.System.Runtime.InteropServices import _EventBuilder
from __clrclasses__.System.Runtime.InteropServices import _EventInfo
from __clrclasses__.System.Runtime.InteropServices import _Exception
from __clrclasses__.System.Runtime.InteropServices import _FieldBuilder
from __clrclasses__.System.Runtime.InteropServices import _FieldInfo
from __clrclasses__.System.Runtime.InteropServices import _ILGenerator
from __clrclasses__.System.Runtime.InteropServices import _LocalBuilder
from __clrclasses__.System.Runtime.InteropServices import _MemberInfo
from __clrclasses__.System.Runtime.InteropServices import _MethodBase
from __clrclasses__.System.Runtime.InteropServices import _MethodBuilder
from __clrclasses__.System.Runtime.InteropServices import _MethodInfo
from __clrclasses__.System.Runtime.InteropServices import _MethodRental
from __clrclasses__.System.Runtime.InteropServices import _Module
from __clrclasses__.System.Runtime.InteropServices import _ModuleBuilder
from __clrclasses__.System.Runtime.InteropServices import _ParameterBuilder
from __clrclasses__.System.Runtime.InteropServices import _ParameterInfo
from __clrclasses__.System.Runtime.InteropServices import _PropertyBuilder
from __clrclasses__.System.Runtime.InteropServices import _PropertyInfo
from __clrclasses__.System.Runtime.InteropServices import _SignatureHelper
from __clrclasses__.System.Runtime.InteropServices import _Thread
from __clrclasses__.System.Runtime.InteropServices import _Type
from __clrclasses__.System.Runtime.InteropServices import _TypeBuilder
from __clrclasses__.System.Runtime.InteropServices import AllowReversePInvokeCallsAttribute
from __clrclasses__.System.Runtime.InteropServices import Architecture
from __clrclasses__.System.Runtime.InteropServices import ArrayWithOffset
from __clrclasses__.System.Runtime.InteropServices import AssemblyRegistrationFlags
from __clrclasses__.System.Runtime.InteropServices import AutomationProxyAttribute
from __clrclasses__.System.Runtime.InteropServices import BestFitMappingAttribute
from __clrclasses__.System.Runtime.InteropServices import BIND_OPTS
from __clrclasses__.System.Runtime.InteropServices import BINDPTR
from __clrclasses__.System.Runtime.InteropServices import BStrWrapper
from __clrclasses__.System.Runtime.InteropServices import CALLCONV
from __clrclasses__.System.Runtime.InteropServices import CallingConvention
from __clrclasses__.System.Runtime.InteropServices import CharSet
from __clrclasses__.System.Runtime.InteropServices import ClassInterfaceAttribute
from __clrclasses__.System.Runtime.InteropServices import ClassInterfaceType
from __clrclasses__.System.Runtime.InteropServices import CoClassAttribute
from __clrclasses__.System.Runtime.InteropServices import ComAliasNameAttribute
from __clrclasses__.System.Runtime.InteropServices import ComAwareEventInfo
from __clrclasses__.System.Runtime.InteropServices import ComCompatibleVersionAttribute
from __clrclasses__.System.Runtime.InteropServices import ComConversionLossAttribute
from __clrclasses__.System.Runtime.InteropServices import ComDefaultInterfaceAttribute
from __clrclasses__.System.Runtime.InteropServices import ComEventInterfaceAttribute
from __clrclasses__.System.Runtime.InteropServices import ComEventsHelper
from __clrclasses__.System.Runtime.InteropServices import COMException
from __clrclasses__.System.Runtime.InteropServices import ComImportAttribute
from __clrclasses__.System.Runtime.InteropServices import ComInterfaceType
from __clrclasses__.System.Runtime.InteropServices import ComMemberType
from __clrclasses__.System.Runtime.InteropServices import ComRegisterFunctionAttribute
from __clrclasses__.System.Runtime.InteropServices import ComSourceInterfacesAttribute
from __clrclasses__.System.Runtime.InteropServices import ComUnregisterFunctionAttribute
from __clrclasses__.System.Runtime.InteropServices import ComVisibleAttribute
from __clrclasses__.System.Runtime.InteropServices import CONNECTDATA
from __clrclasses__.System.Runtime.InteropServices import CriticalHandle
from __clrclasses__.System.Runtime.InteropServices import CurrencyWrapper
from __clrclasses__.System.Runtime.InteropServices import CustomQueryInterfaceMode
from __clrclasses__.System.Runtime.InteropServices import CustomQueryInterfaceResult
from __clrclasses__.System.Runtime.InteropServices import DefaultCharSetAttribute
from __clrclasses__.System.Runtime.InteropServices import DefaultDllImportSearchPathsAttribute
from __clrclasses__.System.Runtime.InteropServices import DefaultParameterValueAttribute
from __clrclasses__.System.Runtime.InteropServices import DESCKIND
from __clrclasses__.System.Runtime.InteropServices import DispatchWrapper
from __clrclasses__.System.Runtime.InteropServices import DispIdAttribute
from __clrclasses__.System.Runtime.InteropServices import DISPPARAMS
from __clrclasses__.System.Runtime.InteropServices import DllImportAttribute
from __clrclasses__.System.Runtime.InteropServices import DllImportSearchPath
from __clrclasses__.System.Runtime.InteropServices import ELEMDESC
from __clrclasses__.System.Runtime.InteropServices import ErrorWrapper
from __clrclasses__.System.Runtime.InteropServices import EXCEPINFO
from __clrclasses__.System.Runtime.InteropServices import ExporterEventKind
from __clrclasses__.System.Runtime.InteropServices import ExtensibleClassFactory
from __clrclasses__.System.Runtime.InteropServices import ExternalException
from __clrclasses__.System.Runtime.InteropServices import FieldOffsetAttribute
from __clrclasses__.System.Runtime.InteropServices import FILETIME
from __clrclasses__.System.Runtime.InteropServices import FUNCDESC
from __clrclasses__.System.Runtime.InteropServices import FUNCFLAGS
from __clrclasses__.System.Runtime.InteropServices import FUNCKIND
from __clrclasses__.System.Runtime.InteropServices import GCHandle
from __clrclasses__.System.Runtime.InteropServices import GCHandleType
from __clrclasses__.System.Runtime.InteropServices import GuidAttribute
from __clrclasses__.System.Runtime.InteropServices import HandleCollector
from __clrclasses__.System.Runtime.InteropServices import HandleRef
from __clrclasses__.System.Runtime.InteropServices import ICustomAdapter
from __clrclasses__.System.Runtime.InteropServices import ICustomFactory
from __clrclasses__.System.Runtime.InteropServices import ICustomMarshaler
from __clrclasses__.System.Runtime.InteropServices import ICustomQueryInterface
from __clrclasses__.System.Runtime.InteropServices import IDispatchImplAttribute
from __clrclasses__.System.Runtime.InteropServices import IDispatchImplType
from __clrclasses__.System.Runtime.InteropServices import IDLDESC
from __clrclasses__.System.Runtime.InteropServices import IDLFLAG
from __clrclasses__.System.Runtime.InteropServices import IMPLTYPEFLAGS
from __clrclasses__.System.Runtime.InteropServices import ImportedFromTypeLibAttribute
from __clrclasses__.System.Runtime.InteropServices import ImporterEventKind
from __clrclasses__.System.Runtime.InteropServices import InAttribute
from __clrclasses__.System.Runtime.InteropServices import InterfaceTypeAttribute
from __clrclasses__.System.Runtime.InteropServices import InvalidComObjectException
from __clrclasses__.System.Runtime.InteropServices import InvalidOleVariantTypeException
from __clrclasses__.System.Runtime.InteropServices import INVOKEKIND
from __clrclasses__.System.Runtime.InteropServices import IRegistrationServices
from __clrclasses__.System.Runtime.InteropServices import ITypeLibConverter
from __clrclasses__.System.Runtime.InteropServices import ITypeLibExporterNameProvider
from __clrclasses__.System.Runtime.InteropServices import ITypeLibExporterNotifySink
from __clrclasses__.System.Runtime.InteropServices import ITypeLibImporterNotifySink
from __clrclasses__.System.Runtime.InteropServices import LayoutKind
from __clrclasses__.System.Runtime.InteropServices import LCIDConversionAttribute
from __clrclasses__.System.Runtime.InteropServices import LIBFLAGS
from __clrclasses__.System.Runtime.InteropServices import ManagedToNativeComInteropStubAttribute
from __clrclasses__.System.Runtime.InteropServices import Marshal
from __clrclasses__.System.Runtime.InteropServices import MarshalAsAttribute
from __clrclasses__.System.Runtime.InteropServices import MarshalDirectiveException
from __clrclasses__.System.Runtime.InteropServices import ObjectCreationDelegate
from __clrclasses__.System.Runtime.InteropServices import OptionalAttribute
from __clrclasses__.System.Runtime.InteropServices import OSPlatform
from __clrclasses__.System.Runtime.InteropServices import OutAttribute
from __clrclasses__.System.Runtime.InteropServices import PARAMDESC
from __clrclasses__.System.Runtime.InteropServices import PARAMFLAG
from __clrclasses__.System.Runtime.InteropServices import PreserveSigAttribute
from __clrclasses__.System.Runtime.InteropServices import PrimaryInteropAssemblyAttribute
from __clrclasses__.System.Runtime.InteropServices import ProgIdAttribute
from __clrclasses__.System.Runtime.InteropServices import RegistrationClassContext
from __clrclasses__.System.Runtime.InteropServices import RegistrationConnectionType
from __clrclasses__.System.Runtime.InteropServices import RegistrationServices
from __clrclasses__.System.Runtime.InteropServices import RuntimeEnvironment
from __clrclasses__.System.Runtime.InteropServices import RuntimeInformation
from __clrclasses__.System.Runtime.InteropServices import SafeArrayRankMismatchException
from __clrclasses__.System.Runtime.InteropServices import SafeArrayTypeMismatchException
from __clrclasses__.System.Runtime.InteropServices import SafeBuffer
from __clrclasses__.System.Runtime.InteropServices import SafeHandle
from __clrclasses__.System.Runtime.InteropServices import SEHException
from __clrclasses__.System.Runtime.InteropServices import SetWin32ContextInIDispatchAttribute
from __clrclasses__.System.Runtime.InteropServices import StandardOleMarshalObject
from __clrclasses__.System.Runtime.InteropServices import STATSTG
from __clrclasses__.System.Runtime.InteropServices import StructLayoutAttribute
from __clrclasses__.System.Runtime.InteropServices import SYSKIND
from __clrclasses__.System.Runtime.InteropServices import TYPEATTR
from __clrclasses__.System.Runtime.InteropServices import TYPEDESC
from __clrclasses__.System.Runtime.InteropServices import TYPEFLAGS
from __clrclasses__.System.Runtime.InteropServices import TypeIdentifierAttribute
from __clrclasses__.System.Runtime.InteropServices import TYPEKIND
from __clrclasses__.System.Runtime.InteropServices import TYPELIBATTR
from __clrclasses__.System.Runtime.InteropServices import TypeLibConverter
from __clrclasses__.System.Runtime.InteropServices import TypeLibExporterFlags
from __clrclasses__.System.Runtime.InteropServices import TypeLibFuncAttribute
from __clrclasses__.System.Runtime.InteropServices import TypeLibFuncFlags
from __clrclasses__.System.Runtime.InteropServices import TypeLibImportClassAttribute
from __clrclasses__.System.Runtime.InteropServices import TypeLibImporterFlags
from __clrclasses__.System.Runtime.InteropServices import TypeLibTypeAttribute
from __clrclasses__.System.Runtime.InteropServices import TypeLibTypeFlags
from __clrclasses__.System.Runtime.InteropServices import TypeLibVarAttribute
from __clrclasses__.System.Runtime.InteropServices import TypeLibVarFlags
from __clrclasses__.System.Runtime.InteropServices import TypeLibVersionAttribute
from __clrclasses__.System.Runtime.InteropServices import UCOMIBindCtx
from __clrclasses__.System.Runtime.InteropServices import UCOMIConnectionPoint
from __clrclasses__.System.Runtime.InteropServices import UCOMIConnectionPointContainer
from __clrclasses__.System.Runtime.InteropServices import UCOMIEnumConnectionPoints
from __clrclasses__.System.Runtime.InteropServices import UCOMIEnumConnections
from __clrclasses__.System.Runtime.InteropServices import UCOMIEnumMoniker
from __clrclasses__.System.Runtime.InteropServices import UCOMIEnumString
from __clrclasses__.System.Runtime.InteropServices import UCOMIEnumVARIANT
from __clrclasses__.System.Runtime.InteropServices import UCOMIMoniker
from __clrclasses__.System.Runtime.InteropServices import UCOMIPersistFile
from __clrclasses__.System.Runtime.InteropServices import UCOMIRunningObjectTable
from __clrclasses__.System.Runtime.InteropServices import UCOMIStream
from __clrclasses__.System.Runtime.InteropServices import UCOMITypeComp
from __clrclasses__.System.Runtime.InteropServices import UCOMITypeInfo
from __clrclasses__.System.Runtime.InteropServices import UCOMITypeLib
from __clrclasses__.System.Runtime.InteropServices import UnknownWrapper
from __clrclasses__.System.Runtime.InteropServices import UnmanagedFunctionPointerAttribute
from __clrclasses__.System.Runtime.InteropServices import UnmanagedType
from __clrclasses__.System.Runtime.InteropServices import VARDESC
from __clrclasses__.System.Runtime.InteropServices import VarEnum
from __clrclasses__.System.Runtime.InteropServices import VARFLAGS
from __clrclasses__.System.Runtime.InteropServices import VariantWrapper
# Source: uwhpsc/2013/solutions/homework2/hw2b.py, repo philipwangdk/HPC (MIT license).
"""
Demonstration module for quadratic interpolation.
Sample solutions for Homework 2 problems #2 through #7.
"""
import numpy as np
import matplotlib.pyplot as plt
from numpy.linalg import solve
def quad_interp(xi,yi):
"""
Quadratic interpolation. Compute the coefficients of the polynomial
interpolating the points (xi[i],yi[i]) for i = 0,1,2.
Returns c, an array containing the coefficients of
p(x) = c[0] + c[1]*x + c[2]*x**2.
"""
# check inputs and print error message if not valid:
error_message = "xi and yi should have type numpy.ndarray"
assert (type(xi) is np.ndarray) and (type(yi) is np.ndarray), error_message
error_message = "xi and yi should have length 3"
assert len(xi)==3 and len(yi)==3, error_message
# Set up linear system to interpolate through data points:
A = np.vstack([np.ones(3), xi, xi**2]).T
c = solve(A,yi)
return c
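As a quick cross-check of the linear system above (a sketch, not part of the assignment): `numpy.polyfit` with degree 2 fits the same quadratic exactly through three points, returning coefficients in descending order, so reversing them should reproduce `quad_interp`'s result.

```python
import numpy as np

xi = np.array([-1., 0., 2.])
yi = np.array([ 1., -1., 7.])
A = np.vstack([np.ones(3), xi, xi**2]).T   # the matrix quad_interp solves
c = np.linalg.solve(A, yi)                 # ascending-order coefficients
c_check = np.polyfit(xi, yi, 2)[::-1]      # polyfit returns descending order
assert np.allclose(c, c_check)             # both give c = [-1., 0., 2.]
```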
def plot_quad(xi, yi):
"""
Perform quadratic interpolation and plot the resulting function along
with the data points.
"""
# Compute the coefficients:
c = quad_interp(xi,yi)
# Plot the resulting polynomial:
x = np.linspace(xi.min() - 1, xi.max() + 1, 1000)
y = c[0] + c[1]*x + c[2]*x**2
plt.figure(1) # open plot figure window
plt.clf() # clear figure
plt.plot(x,y,'b-') # connect points with a blue line
# Add data points (polynomial should go through these points!)
plt.plot(xi,yi,'ro') # plot as red circles
plt.ylim(-2,8) # set limits in y for plot
plt.title("Data points and interpolating polynomial")
plt.savefig('quadratic.png') # save figure as .png file
def cubic_interp(xi,yi):
"""
Cubic interpolation. Compute the coefficients of the polynomial
interpolating the points (xi[i],yi[i]) for i = 0,1,2,3
Returns c, an array containing the coefficients of
p(x) = c[0] + c[1]*x + c[2]*x**2 + c[3]*x**3.
"""
# check inputs and print error message if not valid:
error_message = "xi and yi should have type numpy.ndarray"
assert (type(xi) is np.ndarray) and (type(yi) is np.ndarray), error_message
error_message = "xi and yi should have length 4"
assert len(xi)==4 and len(yi)==4, error_message
# Set up linear system to interpolate through data points:
A = np.vstack([np.ones(4), xi, xi**2, xi**3]).T
c = solve(A,yi)
return c
def plot_cubic(xi, yi):
"""
Perform cubic interpolation and plot the resulting function along
with the data points.
"""
# Compute the coefficients:
c = cubic_interp(xi,yi)
# Plot the resulting polynomial:
x = np.linspace(xi.min() - 1, xi.max() + 1, 1000)
y = c[0] + c[1]*x + c[2]*x**2 + c[3]*x**3
plt.figure(1) # open plot figure window
plt.clf() # clear figure
plt.plot(x,y,'b-') # connect points with a blue line
# Add data points (polynomial should go through these points!)
plt.plot(xi,yi,'ro') # plot as red circles
plt.ylim(-2,8) # set limits in y for plot
plt.title("Data points and interpolating polynomial")
plt.savefig('cubic.png') # save figure as .png file
def poly_interp(xi,yi):
"""
General polynomial interpolation.
Compute the coefficients of the polynomial
interpolating the points (xi[i],yi[i]) for i = 0,1,2,...,n-1
where n = len(xi) = len(yi).
Returns c, an array containing the coefficients of
p(x) = c[0] + c[1]*x + c[2]*x**2 + ... + c[N-1]*x**(N-1).
"""
# check inputs and print error message if not valid:
error_message = "xi and yi should have type numpy.ndarray"
assert (type(xi) is np.ndarray) and (type(yi) is np.ndarray), error_message
    error_message = "xi and yi should have the same length"
assert len(xi)==len(yi), error_message
# Set up linear system to interpolate through data points:
# Uses a list comprehension, see
# http://docs.python.org/2/tutorial/datastructures.html#list-comprehensions
n = len(xi)
A = np.vstack([xi**j for j in range(n)]).T
c = solve(A,yi)
return c
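As a sketch (not part of the original assignment), the Vandermonde matrix built by the list comprehension above can also be obtained with `numpy.vander` using `increasing=True`, which makes a convenient cross-check of the system `poly_interp` solves:

```python
import numpy as np
from numpy.linalg import solve

xi = np.array([-1., 0., 1., 2.])
yi = np.array([ 2., 7., 4., -5.])
n = len(xi)
A = np.vstack([xi**j for j in range(n)]).T   # matrix used by poly_interp
V = np.vander(xi, n, increasing=True)        # same matrix via numpy.vander
assert np.allclose(A, V)
c = solve(V, yi)                             # interpolating coefficients
assert np.allclose(V.dot(c), yi)             # polynomial passes through data
```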
def plot_poly(xi, yi):
"""
Perform polynomial interpolation and plot the resulting function along
with the data points.
"""
# Compute the coefficients:
c = poly_interp(xi,yi)
# Plot the resulting polynomial:
x = np.linspace(xi.min() - 1, xi.max() + 1, 1000)
# Use Horner's rule:
n = len(xi)
y = c[n-1]
for j in range(n-1, 0, -1):
y = y*x + c[j-1]
plt.figure(1) # open plot figure window
plt.clf() # clear figure
plt.plot(x,y,'b-') # connect points with a blue line
# Add data points (polynomial should go through these points!)
plt.plot(xi,yi,'ro') # plot as red circles
plt.ylim(yi.min()-1, yi.max()+1) # set limits in y for plot
plt.title("Data points and interpolating polynomial")
plt.savefig('poly.png') # save figure as .png file
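Horner's rule appears inline in `plot_poly` and again in the tests below; it can be factored into a small helper. This is an illustrative sketch only, and `horner` is a hypothetical name not defined anywhere in the assignment:

```python
def horner(c, x):
    """Evaluate p(x) = c[0] + c[1]*x + ... + c[n-1]*x**(n-1) by Horner's rule.

    Works for scalar x or a numpy array of x values, since only * and +
    are used.
    """
    y = c[-1]
    for cj in reversed(c[:-1]):   # fold in coefficients from high to low degree
        y = y * x + cj
    return y
```

For the coefficients from `test_cubic1`, `horner([7., -2., -3., 1.], 2.0)` evaluates the cubic at x = 2 and gives -1.0.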
def test_quad1():
"""
Test code, no return value or exception if test runs properly.
"""
xi = np.array([-1., 0., 2.])
yi = np.array([ 1., -1., 7.])
c = quad_interp(xi,yi)
c_true = np.array([-1., 0., 2.])
    print("c = ", c)
    print("c_true = ", c_true)
# test that all elements have small error:
assert np.allclose(c, c_true), \
"Incorrect result, c = %s, Expected: c = %s" % (c,c_true)
# Also produce plot:
plot_quad(xi,yi)
def test_quad2():
"""
Test code, no return value or exception if test runs properly.
"""
# Generate a test by specifying c_true first:
c_true = np.array([7., 2., -3.])
# Points to interpolate:
xi = np.array([-1., 0., 2.])
# Function values to interpolate:
yi = c_true[0] + c_true[1]*xi + c_true[2]*xi**2
# Now interpolate and check we get c_true back again.
c = quad_interp(xi,yi)
    print("c = ", c)
    print("c_true = ", c_true)
# test that all elements have small error:
assert np.allclose(c, c_true), \
"Incorrect result, c = %s, Expected: c = %s" % (c,c_true)
# Also produce plot:
plot_quad(xi,yi)
def test_cubic1():
"""
Test code, no return value or exception if test runs properly.
"""
# Generate a test by specifying c_true first:
c_true = np.array([7., -2., -3., 1.])
# Points to interpolate:
xi = np.array([-1., 0., 1., 2.])
# Function values to interpolate:
yi = c_true[0] + c_true[1]*xi + c_true[2]*xi**2 + c_true[3]*xi**3
# Now interpolate and check we get c_true back again.
c = cubic_interp(xi,yi)
    print("c = ", c)
    print("c_true = ", c_true)
# test that all elements have small error:
assert np.allclose(c, c_true), \
"Incorrect result, c = %s, Expected: c = %s" % (c,c_true)
# Also produce plot:
plot_cubic(xi,yi)
def test_poly1():
"""
Test code, no return value or exception if test runs properly.
Same points as test_cubic1.
"""
# Generate a test by specifying c_true first:
c_true = np.array([7., -2., -3., 1.])
# Points to interpolate:
xi = np.array([-1., 0., 1., 2.])
# Function values to interpolate:
# Use Horner's rule:
n = len(xi)
yi = c_true[n-1]
for j in range(n-1, 0, -1):
yi = yi*xi + c_true[j-1]
# Now interpolate and check we get c_true back again.
c = poly_interp(xi,yi)
    print("c = ", c)
    print("c_true = ", c_true)
# test that all elements have small error:
assert np.allclose(c, c_true), \
"Incorrect result, c = %s, Expected: c = %s" % (c,c_true)
# Also produce plot:
plot_poly(xi,yi)
def test_poly2():
"""
Test code, no return value or exception if test runs properly.
Test with 5 points (quartic interpolating function).
"""
# Generate a test by specifying c_true first:
c_true = np.array([0., -6., 11., -6., 1.])
# Points to interpolate:
xi = np.array([-1., 0., 1., 2., 4.])
# Function values to interpolate:
# Use Horner's rule:
n = len(xi)
yi = c_true[n-1]
for j in range(n-1, 0, -1):
yi = yi*xi + c_true[j-1]
# Now interpolate and check we get c_true back again.
c = poly_interp(xi,yi)
    print("c = ", c)
    print("c_true = ", c_true)
# test that all elements have small error:
assert np.allclose(c, c_true), \
"Incorrect result, c = %s, Expected: c = %s" % (c,c_true)
# Also produce plot:
plot_poly(xi,yi)
if __name__=="__main__":
    print("Running test...")
test_quad1()
test_quad2()
test_cubic1()
test_poly1()
test_poly2()