hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
8c2206d947bb5be7d393960dc49b8f9f58c7b820 | 1,746 | py | Python | floodsystem/flood.py | aliced187/alice-and-charlie | 1851e9f682f81753cc6682988b79116edb8cd341 | ["MIT"] | null | null | null | floodsystem/flood.py | aliced187/alice-and-charlie | 1851e9f682f81753cc6682988b79116edb8cd341 | ["MIT"] | null | null | null | floodsystem/flood.py | aliced187/alice-and-charlie | 1851e9f682f81753cc6682988b79116edb8cd341 | ["MIT"] | null | null | null |
def stations_level_over_threshold(stations, tol):
    """Return (station name, relative level) tuples for every station whose
    relative water level exceeds tol, sorted by relative level, highest first."""
    relwaterlevels = []
    for station in stations:
        # Skip stations with no typical range or no latest reading.
        if station.typical_range is None or station.latest_level is None:
            continue
        low, high = station.typical_range
        # Skip inconsistent ranges (inverted or zero-width, which would
        # divide by zero below).
        if high - low <= 0:
            continue
        relwl = (station.latest_level - low) / (high - low)
        if relwl > tol:
            relwaterlevels.append((station.name, relwl))
    return sorted(relwaterlevels, key=lambda x: -x[1])

def stations_highest_rel_level(stations, N):
    """Return the N stations with the highest relative water level, as
    (station name, relative level) tuples in descending order."""
    relwaterlevels = []
    for station in stations:
        if station.typical_range is None or station.latest_level is None:
            continue
        low, high = station.typical_range
        if high - low <= 0:
            continue
        relwl = (station.latest_level - low) / (high - low)
        relwaterlevels.append((station.name, relwl))
    return sorted(relwaterlevels, key=lambda x: -x[1])[:N]
| 30.631579 | 63 | 0.497136 | 184 | 1,746 | 4.663043 | 0.222826 | 0.097902 | 0.055944 | 0.060606 | 0.871795 | 0.871795 | 0.871795 | 0.69697 | 0.69697 | 0.69697 | 0 | 0.013084 | 0.387171 | 1,746 | 57 | 64 | 30.631579 | 0.788785 | 0 | 0 | 0.867925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037736 | false | 0.132075 | 0 | 0 | 0.056604 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
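
A quick usage sketch for the two functions above. The Station stand-in is hypothetical; only its name, latest_level and typical_range attributes matter, since those are exactly what the code reads:

from collections import namedtuple

Station = namedtuple("Station", ["name", "latest_level", "typical_range"])
stations = [Station("Cam", 0.75, (0.25, 0.5)),    # relative level 2.0
            Station("Ouse", 0.375, (0.25, 0.5)),  # relative level 0.5
            Station("Exe", None, (0.25, 0.5))]    # skipped: no reading
print(stations_level_over_threshold(stations, tol=0.8))  # [('Cam', 2.0)]
print(stations_highest_rel_level(stations, N=2))         # [('Cam', 2.0), ('Ouse', 0.5)]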
8c2ac899912ef1eaba30902764d97bfb47c9d976 | 139 | py | Python | specter/__init__.py | ronaldseoh/specter | 984a80e75d800ef472c828b112e9daf16789fa9e | ["Apache-2.0"] | 248 | 2020-04-17T19:27:05.000Z | 2022-03-27T03:23:54.000Z | specter/__init__.py | ronaldseoh/specter | 984a80e75d800ef472c828b112e9daf16789fa9e | ["Apache-2.0"] | 34 | 2020-04-24T02:38:17.000Z | 2022-03-28T16:19:20.000Z | specter/__init__.py | ronaldseoh/specter | 984a80e75d800ef472c828b112e9daf16789fa9e | ["Apache-2.0"] | 44 | 2020-05-27T01:07:44.000Z | 2022-02-19T05:50:34.000Z |
from specter.data import DataReaderFromPickled, DataReader
from specter.model import Specter
from specter.predictor import SpecterPredictor
| 46.333333 | 58 | 0.884892 | 16 | 139 | 7.6875 | 0.5625 | 0.268293 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086331 | 139 | 3 | 59 | 46.333333 | 0.968504 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8c2c9c1ccf139bde6111eabba67869399644a7d0 | 28 | py | Python | environment-build/main.py | H37kouya/lib-python-study | 6af40c27e2e48f71825e8dc7ecee0e820bf8dbfc | ["MIT"] | null | null | null | environment-build/main.py | H37kouya/lib-python-study | 6af40c27e2e48f71825e8dc7ecee0e820bf8dbfc | ["MIT"] | null | null | null | environment-build/main.py | H37kouya/lib-python-study | 6af40c27e2e48f71825e8dc7ecee0e820bf8dbfc | ["MIT"] | null | null | null |
print('Hello Conda Python')
| 14 | 27 | 0.75 | 4 | 28 | 5.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 28 | 1 | 28 | 28 | 0.84 | 0 | 0 | 0 | 0 | 0 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
8c593d401f9bc1bff29c0dddb264b30f7959be15 | 151 | py | Python | cookomatic.py | mcastellin/cookomatic | 42d01e4168dcd2427b03c72bc9b30af6140254b1 | ["MIT"] | 1 | 2021-06-25T11:10:09.000Z | 2021-06-25T11:10:09.000Z | recipebook.py | mcastellin/yt-docker-compose-tutorial | c3c97e46ad776ab5bf6f2cbb95cefe2910186711 | ["MIT"] | 1 | 2020-10-22T10:08:31.000Z | 2020-10-22T10:08:31.000Z | cookomatic.py | mcastellin/cookomatic | 42d01e4168dcd2427b03c72bc9b30af6140254b1 | ["MIT"] | null | null | null |
from app import app, db
from app.models import Recipe
@app.shell_context_processor
def make_shell_context():
return {"db": db, "Recipe": Recipe}
| 18.875 | 39 | 0.741722 | 23 | 151 | 4.695652 | 0.521739 | 0.12963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152318 | 151 | 7 | 40 | 21.571429 | 0.84375 | 0 | 0 | 0 | 0 | 0 | 0.05298 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
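
A brief note on the row above: registering a shell context processor is Flask's standard way to pre-bind names in the interactive shell, so running "flask shell" starts a REPL where db and Recipe are already available. In plain Python the effect is equivalent to:

ctx = make_shell_context()   # {"db": db, "Recipe": Recipe}
db, Recipe = ctx["db"], ctx["Recipe"]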
4fdcf725b7de459d38ec7319366f794d9d63e4b0 | 150 | py | Python | src/compas_ghpython/install.py | jf---/compas | cd878ece933013b8ac34e9d42cf6d5c62a5396ee | ["MIT"] | 2 | 2021-03-17T18:14:22.000Z | 2021-09-19T13:50:02.000Z | src/compas_ghpython/install.py | jf---/compas | cd878ece933013b8ac34e9d42cf6d5c62a5396ee | ["MIT"] | null | null | null | src/compas_ghpython/install.py | jf---/compas | cd878ece933013b8ac34e9d42cf6d5c62a5396ee | ["MIT"] | null | null | null |
import compas.plugins
@compas.plugins.plugin(category='install')
def installable_rhino_packages(category='install'):
return ['compas_ghpython']
| 21.428571 | 51 | 0.786667 | 17 | 150 | 6.764706 | 0.705882 | 0.226087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086667 | 150 | 6 | 52 | 25 | 0.839416 | 0 | 0 | 0 | 0 | 0 | 0.193333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
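
The row above hooks compas_ghpython into compas's plugin system. A minimal sketch of the decorator-registry pattern such a hook relies on; every name below is hypothetical and for illustration only, not the compas API:

_REGISTRY = {}

def plugin(category):
    """Record the decorated function under a category key."""
    def decorator(func):
        _REGISTRY.setdefault(category, []).append(func)
        return func
    return decorator

@plugin(category='install')
def my_installable_packages():
    return ['my_package']

# A host application can later collect every 'install' contribution:
packages = [pkg for func in _REGISTRY.get('install', []) for pkg in func()]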
4fe5f0bfe46ff78f3190f67bede6d2516480931e | 328 | py | Python | openrec/utils/samplers/__init__.py | BoData-Bot/openrec | 3d655d21b762b40d50e53cea96d7802fd49c74ad | ["Apache-2.0"] | null | null | null | openrec/utils/samplers/__init__.py | BoData-Bot/openrec | 3d655d21b762b40d50e53cea96d7802fd49c74ad | ["Apache-2.0"] | null | null | null | openrec/utils/samplers/__init__.py | BoData-Bot/openrec | 3d655d21b762b40d50e53cea96d7802fd49c74ad | ["Apache-2.0"] | null | null | null |
from openrec.utils.samplers.sampler import Sampler
from openrec.utils.samplers.pairwise_sampler import PairwiseSampler
from openrec.utils.samplers.n_pairwise_sampler import NPairwiseSampler
from openrec.utils.samplers.pointwise_sampler import PointwiseSampler
from openrec.utils.samplers.explicit_sampler import ExplicitSampler
| 54.666667 | 70 | 0.893293 | 40 | 328 | 7.2 | 0.35 | 0.190972 | 0.277778 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060976 | 328 | 5 | 71 | 65.6 | 0.935065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8b4110ffc44363b71d1a966e34d4940911f43f3b | 104 | py | Python | feature/__init__.py | catch-n-release/drig | 1290e60839e6b45ce2c429d2ed9eb4dcf0678755 | ["BSD-3-Clause"] | null | null | null | feature/__init__.py | catch-n-release/drig | 1290e60839e6b45ce2c429d2ed9eb4dcf0678755 | ["BSD-3-Clause"] | null | null | null | feature/__init__.py | catch-n-release/drig | 1290e60839e6b45ce2c429d2ed9eb4dcf0678755 | ["BSD-3-Clause"] | null | null | null |
from drig.feature.condenser import FeatureCondenser
from drig.feature.extractor import FeatureExtractor
| 34.666667 | 51 | 0.884615 | 12 | 104 | 7.666667 | 0.666667 | 0.173913 | 0.326087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 104 | 2 | 52 | 52 | 0.958333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8b4593f1c141d2aa831701d0ee4c5a4832e1f176 | 146 | py | Python | tests/test_version.py | lemoncheesecake/lemoncheesecake-selenium | 02ab57e65ee30ec4be23f4d373ff3025e548fe2e | ["Apache-2.0"] | null | null | null | tests/test_version.py | lemoncheesecake/lemoncheesecake-selenium | 02ab57e65ee30ec4be23f4d373ff3025e548fe2e | ["Apache-2.0"] | null | null | null | tests/test_version.py | lemoncheesecake/lemoncheesecake-selenium | 02ab57e65ee30ec4be23f4d373ff3025e548fe2e | ["Apache-2.0"] | null | null | null |
import re
from lemoncheesecake_selenium.__version__ import __version__
def test_version():
assert re.match(r"^\d+\.\d+\.\d+$", __version__)
| 20.857143 | 60 | 0.732877 | 19 | 146 | 4.894737 | 0.631579 | 0.043011 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116438 | 146 | 6 | 61 | 24.333333 | 0.72093 | 0 | 0 | 0 | 0 | 0 | 0.10274 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8ca9637af1ee70c31fae754ab057c8c6b518b573 | 66 | py | Python | python/testData/refactoring/introduceVariable/substringFromFormatDict.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | ["Apache-2.0"] | 2 | 2018-12-29T09:53:39.000Z | 2018-12-29T09:53:42.000Z | python/testData/refactoring/introduceVariable/substringFromFormatDict.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | ["Apache-2.0"] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/refactoring/introduceVariable/substringFromFormatDict.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | ["Apache-2.0"] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z |
print("<selection>Hello</selection> %(name)s" % {"name": "World"})
| 66 | 66 | 0.636364 | 8 | 66 | 5.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060606 | 66 | 1 | 66 | 66 | 0.677419 | 0 | 0 | 0 | 0 | 0 | 0.686567 | 0.41791 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
8cdb3973bbd65ef9d958ba933becf1f285702994 | 32 | py | Python | Round1/dynamic_analysis/__init__.py | NavneelSinghal/HCLHackIITK | 91ceb865d1ff7c1ff109fbbbcfda8005d3b9cf93 | ["MIT"] | null | null | null | Round1/dynamic_analysis/__init__.py | NavneelSinghal/HCLHackIITK | 91ceb865d1ff7c1ff109fbbbcfda8005d3b9cf93 | ["MIT"] | null | null | null | Round1/dynamic_analysis/__init__.py | NavneelSinghal/HCLHackIITK | 91ceb865d1ff7c1ff109fbbbcfda8005d3b9cf93 | ["MIT"] | null | null | null |
from .model import DynamicModel
| 16 | 31 | 0.84375 | 4 | 32 | 6.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.964286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8cf39e12e0fe2f655783cebdb14d2ce0f69a642c | 35 | py | Python | nevermined_sdk_py/nevermined/__init__.py | nevermined-io/sdk-py | 1e8872688c7181d502d2647f81cfc9786f771f22 | ["Apache-2.0"] | 1 | 2020-12-02T13:49:55.000Z | 2020-12-02T13:49:55.000Z | nevermined_sdk_py/nevermined/__init__.py | nevermined-io/sdk-py | 1e8872688c7181d502d2647f81cfc9786f771f22 | ["Apache-2.0"] | 17 | 2020-11-20T15:07:43.000Z | 2021-10-20T13:18:16.000Z | nevermined_sdk_py/nevermined/__init__.py | nevermined-io/sdk-py | 1e8872688c7181d502d2647f81cfc9786f771f22 | ["Apache-2.0"] | 1 | 2021-05-14T08:49:37.000Z | 2021-05-14T08:49:37.000Z |
from .nevermined import Nevermined
| 17.5 | 34 | 0.857143 | 4 | 35 | 7.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
508ed748a38c1171dca139e7448b68e6e3383f25 | 9,583 | py | Python | magenta/music/encoder_decoder_test.py | danabo/magenta | 291521df5541f11d389fb9aef4c9b0c22064d1a3 | ["Apache-2.0"] | 4 | 2016-09-07T03:39:41.000Z | 2021-06-03T18:53:43.000Z | magenta/music/encoder_decoder_test.py | danabo/magenta | 291521df5541f11d389fb9aef4c9b0c22064d1a3 | ["Apache-2.0"] | null | null | null | magenta/music/encoder_decoder_test.py | danabo/magenta | 291521df5541f11d389fb9aef4c9b0c22064d1a3 | ["Apache-2.0"] | 3 | 2016-08-03T13:53:32.000Z | 2016-08-06T00:08:42.000Z |
# Copyright 2016 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for encoder_decoder."""
# internal imports
import tensorflow as tf
from magenta.common import sequence_example_lib
from magenta.music import encoder_decoder
from magenta.music import testing_lib
class OneHotEventSequenceEncoderDecoderTest(tf.test.TestCase):
def setUp(self):
self.enc = encoder_decoder.OneHotEventSequenceEncoderDecoder(
testing_lib.TrivialOneHotEncoding(3))
def testInputSize(self):
    self.assertEqual(3, self.enc.input_size)
def testNumClasses(self):
self.assertEqual(3, self.enc.num_classes)
def testEventsToInput(self):
events = [0, 1, 0, 2, 0]
self.assertEqual([1.0, 0.0, 0.0], self.enc.events_to_input(events, 0))
self.assertEqual([0.0, 1.0, 0.0], self.enc.events_to_input(events, 1))
self.assertEqual([1.0, 0.0, 0.0], self.enc.events_to_input(events, 2))
self.assertEqual([0.0, 0.0, 1.0], self.enc.events_to_input(events, 3))
self.assertEqual([1.0, 0.0, 0.0], self.enc.events_to_input(events, 4))
def testEventsToLabel(self):
events = [0, 1, 0, 2, 0]
self.assertEqual(0, self.enc.events_to_label(events, 0))
self.assertEqual(1, self.enc.events_to_label(events, 1))
self.assertEqual(0, self.enc.events_to_label(events, 2))
self.assertEqual(2, self.enc.events_to_label(events, 3))
self.assertEqual(0, self.enc.events_to_label(events, 4))
def testClassIndexToEvent(self):
events = [0, 1, 0, 2, 0]
self.assertEqual(0, self.enc.class_index_to_event(0, events))
self.assertEqual(1, self.enc.class_index_to_event(1, events))
self.assertEqual(2, self.enc.class_index_to_event(2, events))
def testEncode(self):
events = [0, 1, 0, 2, 0]
sequence_example = self.enc.encode(events)
expected_inputs = [[1.0, 0.0, 0.0],
[0.0, 1.0, 0.0],
[1.0, 0.0, 0.0],
[0.0, 0.0, 1.0]]
expected_labels = [1, 0, 2, 0]
expected_sequence_example = sequence_example_lib.make_sequence_example(
expected_inputs, expected_labels)
self.assertEqual(sequence_example, expected_sequence_example)
def testGetInputsBatch(self):
event_sequences = [[0, 1, 0, 2, 0], [0, 1, 2]]
expected_inputs_1 = [[1.0, 0.0, 0.0],
[0.0, 1.0, 0.0],
[1.0, 0.0, 0.0],
[0.0, 0.0, 1.0],
[1.0, 0.0, 0.0]]
expected_inputs_2 = [[1.0, 0.0, 0.0],
[0.0, 1.0, 0.0],
[0.0, 0.0, 1.0]]
expected_full_length_inputs_batch = [expected_inputs_1, expected_inputs_2]
expected_last_event_inputs_batch = [expected_inputs_1[-1:],
expected_inputs_2[-1:]]
self.assertListEqual(
expected_full_length_inputs_batch,
self.enc.get_inputs_batch(event_sequences, True))
self.assertListEqual(
expected_last_event_inputs_batch,
self.enc.get_inputs_batch(event_sequences))
def testExtendEventSequences(self):
events1 = [0]
events2 = [0]
events3 = [0]
event_sequences = [events1, events2, events3]
softmax = [[[0.0, 0.0, 1.0]], [[1.0, 0.0, 0.0]], [[0.0, 1.0, 0.0]]]
self.enc.extend_event_sequences(event_sequences, softmax)
self.assertListEqual(list(events1), [0, 2])
self.assertListEqual(list(events2), [0, 0])
self.assertListEqual(list(events3), [0, 1])
class LookbackEventSequenceEncoderDecoderTest(tf.test.TestCase):
def setUp(self):
self.enc = encoder_decoder.LookbackEventSequenceEncoderDecoder(
testing_lib.TrivialOneHotEncoding(3), [1, 2], 2)
def testInputSize(self):
self.assertEqual(13, self.enc.input_size)
def testNumClasses(self):
self.assertEqual(5, self.enc.num_classes)
def testEventsToInput(self):
events = [0, 1, 0, 2, 0]
self.assertEqual([1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0,
1.0, -1.0, 0.0, 0.0],
self.enc.events_to_input(events, 0))
self.assertEqual([0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0,
-1.0, 1.0, 0.0, 0.0],
self.enc.events_to_input(events, 1))
self.assertEqual([1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0,
1.0, 1.0, 0.0, 1.0],
self.enc.events_to_input(events, 2))
self.assertEqual([0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0,
-1.0, -1.0, 0.0, 0.0],
self.enc.events_to_input(events, 3))
self.assertEqual([1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0,
1.0, -1.0, 0.0, 1.0],
self.enc.events_to_input(events, 4))
def testEventsToLabel(self):
events = [0, 1, 0, 2, 0]
self.assertEqual(4, self.enc.events_to_label(events, 0))
self.assertEqual(1, self.enc.events_to_label(events, 1))
self.assertEqual(4, self.enc.events_to_label(events, 2))
self.assertEqual(2, self.enc.events_to_label(events, 3))
self.assertEqual(4, self.enc.events_to_label(events, 4))
def testClassIndexToEvent(self):
events = [0, 1, 0, 2, 0]
self.assertEqual(0, self.enc.class_index_to_event(0, events[:1]))
self.assertEqual(1, self.enc.class_index_to_event(1, events[:1]))
self.assertEqual(2, self.enc.class_index_to_event(2, events[:1]))
self.assertEqual(0, self.enc.class_index_to_event(3, events[:1]))
self.assertEqual(0, self.enc.class_index_to_event(4, events[:1]))
self.assertEqual(0, self.enc.class_index_to_event(0, events[:2]))
self.assertEqual(1, self.enc.class_index_to_event(1, events[:2]))
self.assertEqual(2, self.enc.class_index_to_event(2, events[:2]))
self.assertEqual(1, self.enc.class_index_to_event(3, events[:2]))
self.assertEqual(0, self.enc.class_index_to_event(4, events[:2]))
self.assertEqual(0, self.enc.class_index_to_event(0, events[:3]))
self.assertEqual(1, self.enc.class_index_to_event(1, events[:3]))
self.assertEqual(2, self.enc.class_index_to_event(2, events[:3]))
self.assertEqual(0, self.enc.class_index_to_event(3, events[:3]))
self.assertEqual(1, self.enc.class_index_to_event(4, events[:3]))
self.assertEqual(0, self.enc.class_index_to_event(0, events[:4]))
self.assertEqual(1, self.enc.class_index_to_event(1, events[:4]))
self.assertEqual(2, self.enc.class_index_to_event(2, events[:4]))
self.assertEqual(2, self.enc.class_index_to_event(3, events[:4]))
self.assertEqual(0, self.enc.class_index_to_event(4, events[:4]))
self.assertEqual(0, self.enc.class_index_to_event(0, events[:5]))
self.assertEqual(1, self.enc.class_index_to_event(1, events[:5]))
self.assertEqual(2, self.enc.class_index_to_event(2, events[:5]))
self.assertEqual(0, self.enc.class_index_to_event(3, events[:5]))
self.assertEqual(2, self.enc.class_index_to_event(4, events[:5]))
def testEmptyLookback(self):
enc = encoder_decoder.LookbackEventSequenceEncoderDecoder(
testing_lib.TrivialOneHotEncoding(3), [], 2)
self.assertEqual(5, enc.input_size)
self.assertEqual(3, enc.num_classes)
events = [0, 1, 0, 2, 0]
self.assertEqual([1.0, 0.0, 0.0, 1.0, -1.0],
enc.events_to_input(events, 0))
self.assertEqual([0.0, 1.0, 0.0, -1.0, 1.0],
enc.events_to_input(events, 1))
self.assertEqual([1.0, 0.0, 0.0, 1.0, 1.0],
enc.events_to_input(events, 2))
self.assertEqual([0.0, 0.0, 1.0, -1.0, -1.0],
enc.events_to_input(events, 3))
self.assertEqual([1.0, 0.0, 0.0, 1.0, -1.0],
enc.events_to_input(events, 4))
self.assertEqual(0, enc.events_to_label(events, 0))
self.assertEqual(1, enc.events_to_label(events, 1))
self.assertEqual(0, enc.events_to_label(events, 2))
self.assertEqual(2, enc.events_to_label(events, 3))
self.assertEqual(0, enc.events_to_label(events, 4))
    # The no-lookback encoder maps class indices straight back to events.
    self.assertEqual(0, enc.class_index_to_event(0, events[:1]))
    self.assertEqual(1, enc.class_index_to_event(1, events[:1]))
    self.assertEqual(2, enc.class_index_to_event(2, events[:1]))
    self.assertEqual(0, enc.class_index_to_event(0, events[:2]))
    self.assertEqual(1, enc.class_index_to_event(1, events[:2]))
    self.assertEqual(2, enc.class_index_to_event(2, events[:2]))
    self.assertEqual(0, enc.class_index_to_event(0, events[:3]))
    self.assertEqual(1, enc.class_index_to_event(1, events[:3]))
    self.assertEqual(2, enc.class_index_to_event(2, events[:3]))
    self.assertEqual(0, enc.class_index_to_event(0, events[:4]))
    self.assertEqual(1, enc.class_index_to_event(1, events[:4]))
    self.assertEqual(2, enc.class_index_to_event(2, events[:4]))
    self.assertEqual(0, enc.class_index_to_event(0, events[:5]))
    self.assertEqual(1, enc.class_index_to_event(1, events[:5]))
    self.assertEqual(2, enc.class_index_to_event(2, events[:5]))
if __name__ == '__main__':
tf.test.main()
| 45.20283 | 78 | 0.654179 | 1,533 | 9,583 | 3.915199 | 0.091324 | 0.056981 | 0.062479 | 0.055981 | 0.743086 | 0.723259 | 0.723092 | 0.720593 | 0.712429 | 0.627124 | 0 | 0.073611 | 0.190546 | 9,583 | 211 | 79 | 45.417062 | 0.700142 | 0.063967 | 0 | 0.386905 | 0 | 0 | 0.000894 | 0 | 0 | 0 | 0 | 0 | 0.505952 | 1 | 0.095238 | false | 0 | 0.02381 | 0 | 0.130952 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
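
For readability of the tests above: TrivialOneHotEncoding(3) is assumed to treat the integers 0..2 as their own one-hot class indices. A hypothetical minimal version with that behavior (a sketch of magenta's OneHotEncoding interface, not the real testing_lib code):

class TrivialOneHotEncoding(object):
  """One-hot encoding in which each event is already its class index."""

  def __init__(self, num_classes):
    self._num_classes = num_classes

  @property
  def num_classes(self):
    return self._num_classes

  @property
  def default_event(self):
    return 0

  def encode_event(self, event):
    return event  # events 0..num_classes-1 map to themselves

  def decode_event(self, index):
    return index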
50fa07ce6828b879ddccbee603c3a20535321b8f | 340 | py | Python | backend/oauth2/admin.py | projectpai/paipass | 8b8e70b6808bf026cf957e240c7eed7bfcf4c55d | ["MIT"] | 3 | 2021-04-17T10:20:26.000Z | 2022-03-08T07:36:13.000Z | backend/oauth2/admin.py | projectpai/paipass | 8b8e70b6808bf026cf957e240c7eed7bfcf4c55d | ["MIT"] | null | null | null | backend/oauth2/admin.py | projectpai/paipass | 8b8e70b6808bf026cf957e240c7eed7bfcf4c55d | ["MIT"] | null | null | null |
from django.contrib import admin
#from .models import (PaipassApplication, PaipassAccessToken, PaipassGrant,
# PaipassRefreshToken)
# Register your models here.
#admin.site.register(PaipassApplication)
#admin.site.register(PaipassAccessToken)
#admin.site.register(PaipassGrant)
#admin.site.register(PaipassRefreshToken)
| 34 | 75 | 0.788235 | 32 | 340 | 8.375 | 0.4375 | 0.134328 | 0.253731 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120588 | 340 | 9 | 76 | 37.777778 | 0.896321 | 0.858824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0f9add845f2139b38868af5b9f249cf113182bf9 | 21 | py | Python | scripts/svc_health_check/__init__.py | atsgen/tf-test | 2748fcd81491450c75dadc71849d2a1c11061029 | ["Apache-2.0"] | 5 | 2020-09-29T00:36:57.000Z | 2022-02-16T06:51:32.000Z | scripts/svc_health_check/__init__.py | atsgen/tf-test | 2748fcd81491450c75dadc71849d2a1c11061029 | ["Apache-2.0"] | 27 | 2019-11-02T02:18:34.000Z | 2022-02-24T18:49:08.000Z | scripts/svc_health_check/__init__.py | atsgen/tf-test | 2748fcd81491450c75dadc71849d2a1c11061029 | ["Apache-2.0"] | 20 | 2019-11-28T16:02:25.000Z | 2022-01-06T05:56:58.000Z |
"Health Check tests"
| 10.5 | 20 | 0.761905 | 3 | 21 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 21 | 1 | 21 | 21 | 0.888889 | 0.857143 | 0 | 0 | 0 | 0 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ba0196925d9fc123343df50714e87e102b303c11 | 45 | py | Python | datastore/__init__.py | FinnStutzenstein/openslides-datastore-service | 07a8022b46683a223cbed0f6a925d81499ee71ba | ["MIT"] | 2 | 2020-01-20T13:56:28.000Z | 2020-02-17T10:56:26.000Z | datastore/__init__.py | FinnStutzenstein/openslides-datastore-service | 07a8022b46683a223cbed0f6a925d81499ee71ba | ["MIT"] | 122 | 2020-01-16T15:13:37.000Z | 2022-03-17T10:32:47.000Z | datastore/__init__.py | FinnStutzenstein/openslides-datastore-service | 07a8022b46683a223cbed0f6a925d81499ee71ba | ["MIT"] | 7 | 2020-02-20T12:04:17.000Z | 2021-11-23T17:54:33.000Z |
from . import reader, shared, writer  # noqa
| 22.5 | 44 | 0.711111 | 6 | 45 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 45 | 1 | 45 | 45 | 0.888889 | 0.088889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ba20176d017d4d04abc84783e76c922c75ad561a | 112 | py | Python | bdd/__init__.py | bornobob/ModelChecking | f49d7077e32d3a699d199289349c88ff18206382 | ["MIT"] | 2 | 2020-02-19T21:07:10.000Z | 2020-02-19T21:39:56.000Z | bdd/__init__.py | bornobob/ModelChecking | f49d7077e32d3a699d199289349c88ff18206382 | ["MIT"] | 9 | 2020-02-26T13:16:21.000Z | 2020-04-01T09:55:40.000Z | bdd/__init__.py | bornobob/ModelChecking | f49d7077e32d3a699d199289349c88ff18206382 | ["MIT"] | null | null | null |
from bdd.bddconstructor import BDDConstructor
from bdd.bdd import BDD
from bdd.bddminimiser import BDDMinimiser
| 28 | 45 | 0.866071 | 15 | 112 | 6.466667 | 0.333333 | 0.216495 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 112 | 3 | 46 | 37.333333 | 0.97 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e85fd2b253d6fcfb8c8e00d03de01f8e5cbf7e94 | 109 | py | Python | vision-core/src/bsmu/vision/core/abc.py | IvanKosik/vision | 74603d4b727e6d993b562eb4656952e29173323e | ["BSD-3-Clause"] | 2 | 2019-10-15T11:34:17.000Z | 2021-02-03T10:46:07.000Z | vision-core/src/bsmu/vision/core/abc.py | IvanKosik/vision | 74603d4b727e6d993b562eb4656952e29173323e | ["BSD-3-Clause"] | null | null | null | vision-core/src/bsmu/vision/core/abc.py | IvanKosik/vision | 74603d4b727e6d993b562eb4656952e29173323e | ["BSD-3-Clause"] | null | null | null |
import abc
from PySide2.QtCore import QObject
class QABCMeta(type(QObject), abc.ABCMeta):
pass
| 13.625 | 44 | 0.706422 | 14 | 109 | 5.5 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011765 | 0.220183 | 109 | 7 | 45 | 15.571429 | 0.894118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
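
Why the combined metaclass above is needed: QObject and abc both install their own metaclasses, so a class cannot inherit from both until the two metaclasses are merged. A hypothetical use (class and method names are illustrative):

class Viewer(QObject, metaclass=QABCMeta):
    @abc.abstractmethod
    def render(self):
        ...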
e88e233fe88e0a1f901d84e36f1691520e7c88fe | 3,366 | py | Python | users/migrations/0001_initial.py | Vivek1258/Ecommerce-backend-with-django-and-mongodb | 3aaf28e18a175a68290c02035def7157ba832693 | ["Apache-2.0"] | 1 | 2021-01-14T15:06:01.000Z | 2021-01-14T15:06:01.000Z | users/migrations/0001_initial.py | Vivek1258/Ecommerce-backend-with-django-and-mongodb | 3aaf28e18a175a68290c02035def7157ba832693 | ["Apache-2.0"] | 3 | 2021-02-13T04:49:20.000Z | 2021-03-26T12:59:51.000Z | users/migrations/0001_initial.py | Vivek1258/Django-Ecommerce-website-backend | 3aaf28e18a175a68290c02035def7157ba832693 | ["Apache-2.0"] | null | null | null |
# Generated by Django 3.0.7 on 2020-12-06 16:55
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('items', '0001_initial'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='SellerProfile',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('mobile_number', models.BigIntegerField(blank=True)),
('img_link', models.URLField(blank=True, default='https://cdn.pixabay.com/photo/2015/10/05/22/37/blank-profile-picture-973460_960_720.png', max_length=5000)),
('shop_type', models.TextField(blank=True, max_length=1000)),
('official_doc_link', models.URLField(blank=True, max_length=5000)),
('lane_no', models.TextField(blank=True, max_length=50)),
('landmark', models.TextField(blank=True, max_length=1000)),
('village', models.TextField(blank=True, max_length=100)),
('district', models.TextField(blank=True, max_length=100)),
('state', models.TextField(blank=True, max_length=100)),
('user', models.OneToOneField(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='BuyerProfile',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('mobile_number', models.BigIntegerField()),
('img_link', models.URLField(blank=True, default='https://cdn.pixabay.com/photo/2015/10/05/22/37/blank-profile-picture-973460_960_720.png', max_length=5000)),
('lane_no', models.TextField(blank=True, max_length=50)),
('landmark', models.TextField(blank=True, max_length=1000)),
('village', models.TextField(blank=True, max_length=100)),
('district', models.TextField(blank=True, max_length=100)),
('state', models.TextField(blank=True, max_length=100)),
('pro_user', models.BooleanField(default=False)),
('user', models.OneToOneField(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='BuyerOrder',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('placed', models.DateTimeField(auto_now_add=True)),
('order_sat', models.TextField(max_length=1000)),
('item', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='items.Item')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
]
| 54.290323 | 175 | 0.605764 | 368 | 3,366 | 5.380435 | 0.282609 | 0.077273 | 0.072727 | 0.109091 | 0.777778 | 0.764141 | 0.747475 | 0.728788 | 0.728788 | 0.728788 | 0 | 0.046429 | 0.251337 | 3,366 | 61 | 176 | 55.180328 | 0.739286 | 0.013369 | 0 | 0.574074 | 1 | 0.037037 | 0.134745 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.12963 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
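
A hypothetical reconstruction of the models.py that would generate the BuyerOrder table above; the field names and types are read off the migration, while the import path for Item is assumed:

from django.conf import settings
from django.db import models
from items.models import Item  # assumed location of the 'items.Item' model

class BuyerOrder(models.Model):
    placed = models.DateTimeField(auto_now_add=True)
    order_sat = models.TextField(max_length=1000)
    item = models.ForeignKey(Item, on_delete=models.CASCADE)
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)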
2ceb214ec4e9bc9aa324e4ea3bb0420790ec8383 | 5,621 | py | Python | tests/test_density_SphereVoxelization.py | SyedZiaul/freud | 04bcf9b3bcf45c14b05872205eb16205b2dbcf60 | ["BSD-3-Clause"] | 172 | 2018-11-24T03:07:53.000Z | 2022-02-24T17:18:15.000Z | tests/test_density_SphereVoxelization.py | SyedZiaul/freud | 04bcf9b3bcf45c14b05872205eb16205b2dbcf60 | ["BSD-3-Clause"] | 631 | 2019-01-23T17:49:33.000Z | 2022-03-28T19:46:36.000Z | tests/test_density_SphereVoxelization.py | SyedZiaul/freud | 04bcf9b3bcf45c14b05872205eb16205b2dbcf60 | ["BSD-3-Clause"] | 30 | 2019-07-24T07:57:06.000Z | 2022-02-25T10:58:19.000Z |
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
import pytest
from SphereVoxelization_fft import compute_2d, compute_3d
import freud
matplotlib.use("agg")
class TestSphereVoxelization:
def test_random_points_2d(self):
width = 100
r_max = 10.0
num_points = 10
box_size = r_max * 10
box, points = freud.data.make_random_system(box_size, num_points, is2D=True)
for w in (width, (width, width), [width, width]):
vox = freud.density.SphereVoxelization(w, r_max)
# Test access
with pytest.raises(AttributeError):
vox.box
with pytest.raises(AttributeError):
vox.voxels
vox.compute(system=(box, points))
# Test access
vox.box
vox.voxels
# Verify the output dimensions are correct
assert vox.voxels.shape == (width, width)
assert np.prod(vox.voxels.shape) == np.prod(vox.width)
# Verify the calculation is correct
# here we assert that the calculations (from two different methods)
# are the same up to rounding error
fft_vox = compute_2d(box_size, width, points, r_max)
num_same = len(
np.where(np.isclose(vox.voxels - fft_vox, np.zeros(fft_vox.shape)))[0]
)
total_num = np.prod(fft_vox.shape)
assert num_same / total_num > 0.95
# Verify that the voxels are all 1's and 0's
num_zeros = len(
np.where(np.isclose(vox.voxels, np.zeros(vox.voxels.shape)))[0]
)
num_ones = len(
np.where(np.isclose(vox.voxels, np.ones(vox.voxels.shape)))[0]
)
assert num_zeros > 0
assert num_ones > 0
assert num_zeros + num_ones == np.prod(vox.voxels.shape)
def test_random_points_3d(self):
width = 100
r_max = 10.0
num_points = 10
box_size = r_max * 10
box, points = freud.data.make_random_system(box_size, num_points, is2D=False)
for w in (width, (width, width, width), [width, width, width]):
vox = freud.density.SphereVoxelization(w, r_max)
# Test access
with pytest.raises(AttributeError):
vox.box
with pytest.raises(AttributeError):
vox.voxels
vox.compute(system=(box, points))
# Test access
vox.box
vox.voxels
# Verify the output dimensions are correct
assert vox.voxels.shape == (width, width, width)
# Verify the calculation is correct
# here we assert that the calculations (from two different methods)
# are the same up to rounding error
fft_vox = compute_3d(box_size, width, points, r_max)
num_same = len(
np.where(np.isclose(vox.voxels - fft_vox, np.zeros(fft_vox.shape)))[0]
)
total_num = np.prod(fft_vox.shape)
assert num_same / total_num > 0.95
# Verify that the voxels are all 1's and 0's
num_zeros = len(
np.where(np.isclose(vox.voxels, np.zeros(vox.voxels.shape)))[0]
)
num_ones = len(
np.where(np.isclose(vox.voxels, np.ones(vox.voxels.shape)))[0]
)
assert num_zeros > 0
assert num_ones > 0
assert num_zeros + num_ones == np.prod(vox.voxels.shape)
def test_change_box_dimension(self):
width = 100
r_max = 10.0
num_points = 100
box_size = r_max * 3.1
# test that computing a 3D system after computing a 2D system will fail
box, points = freud.data.make_random_system(box_size, num_points, is2D=True)
vox = freud.density.SphereVoxelization(width, r_max)
vox.compute(system=(box, points))
test_box, test_points = freud.data.make_random_system(
box_size, num_points, is2D=False
)
with pytest.raises(ValueError):
vox.compute((test_box, test_points))
# test that computing a 2D system after computing a 3D system will fail
box, points = freud.data.make_random_system(box_size, num_points, is2D=False)
vox = freud.density.SphereVoxelization(width, r_max)
vox.compute(system=(box, points))
test_box, test_points = freud.data.make_random_system(
box_size, num_points, is2D=True
)
with pytest.raises(ValueError):
vox.compute((test_box, test_points))
def test_repr(self):
vox = freud.density.SphereVoxelization(100, 10.0)
assert str(vox) == str(eval(repr(vox)))
# Use both signatures
vox3 = freud.density.SphereVoxelization((98, 99, 100), 10.0)
assert str(vox3) == str(eval(repr(vox3)))
def test_repr_png(self):
width = 100
r_max = 10.0
num_points = 100
box_size = r_max * 3.1
box, points = freud.data.make_random_system(box_size, num_points, is2D=True)
vox = freud.density.SphereVoxelization(width, r_max)
with pytest.raises(AttributeError):
vox.plot()
assert vox._repr_png_() is None
vox.compute((box, points))
vox.plot()
vox = freud.density.SphereVoxelization(width, r_max)
test_box = freud.box.Box.cube(box_size)
vox.compute((test_box, points))
vox.plot()
assert vox._repr_png_() is None
plt.close("all")
| 34.913043 | 86 | 0.585305 | 728 | 5,621 | 4.35989 | 0.144231 | 0.053875 | 0.042533 | 0.041903 | 0.819471 | 0.794266 | 0.794266 | 0.781033 | 0.755829 | 0.755829 | 0 | 0.025648 | 0.320228 | 5,621 | 160 | 87 | 35.13125 | 0.805025 | 0.114392 | 0 | 0.669565 | 0 | 0 | 0.00121 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 1 | 0.043478 | false | 0 | 0.052174 | 0 | 0.104348 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
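
Distilled from the tests above, the basic SphereVoxelization flow; the parameter values here are arbitrary, but each call is one the tests exercise:

import freud

box, points = freud.data.make_random_system(50.0, 100, is2D=True)
vox = freud.density.SphereVoxelization(100, 5.0)  # (width, r_max)
vox.compute(system=(box, points))
grid = vox.voxels  # 1 inside any sphere of radius r_max around a point, 0 elsewhere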
fa0eb68a99659e288f64c6a85871499357ea9b2c | 11,288 | py | Python | test/fasta_test.py | dreyes17/python-chado | 94f77b1db95010ff4629b869ea5849fcc943a18c | ["MIT"] | 8 | 2017-09-08T15:19:26.000Z | 2022-02-23T17:28:01.000Z | test/fasta_test.py | dreyes17/python-chado | 94f77b1db95010ff4629b869ea5849fcc943a18c | ["MIT"] | 9 | 2018-02-07T18:14:41.000Z | 2022-03-03T13:14:32.000Z | test/fasta_test.py | dreyes17/python-chado | 94f77b1db95010ff4629b869ea5849fcc943a18c | ["MIT"] | 5 | 2018-09-28T08:03:52.000Z | 2022-03-02T17:51:32.000Z |
from nose.tools import raises
from . import ChadoTestCase, ci
class FastaTest(ChadoTestCase):
def test_load_fasta_simple(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'])
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
f = feats[0]
contigterm = self.ci.get_cvterm_id('contig', 'sequence')
assert f['dbxref_id'] is None, "fasta features properly created"
assert f['organism_id'] == org['organism_id'], "fasta features properly created"
assert f['analysis_id'] == an['analysis_id'], "fasta features properly created"
assert f['name'] == 'Q02123|VNBP_POPMV', "fasta features properly created"
assert f['uniquename'] == 'Q02123|VNBP_POPMV', "fasta features properly created"
assert f['residues'] == 'MVNMRKVLALMQVFRERYDHKCDFNFCDIAVSIVCRSELDFINEPGLSNYAKRRRARRLGRCVRCFRVNPGFYFTKRCDGITCVPGISWNYDVEDYIKRGRVTGDRETPSTFHGYGYPVGHKT', "fasta features properly created"
assert f['seqlen'] == 123, "fasta features properly created"
assert f['md5checksum'] == 'a10c50557506954bf61efe1f997aa8d3', "fasta features properly created"
assert f['type_id'] == contigterm, "fasta features properly created"
assert f['is_analysis'] is False, "fasta features properly created"
assert f['is_obsolete'] is False, "fasta features properly created"
def test_load_fasta_seqtype(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], sequence_type='supercontig')
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
f = feats[0]
contigterm = self.ci.get_cvterm_id('supercontig', 'sequence')
assert f['type_id'] == contigterm, "correct sequence_type"
def test_load_fasta_rename(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], re_name=r'([a-zA-Z0-9]+)\|[a-zA-Z0-9_]+')
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123')
f = feats[0]
assert f['name'] == 'Q02123', "fasta re_name"
assert f['uniquename'] == 'Q02123|VNBP_POPMV', "fasta re_uniquename"
def test_load_fasta_reuname(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], re_uniquename=r'[a-zA-Z0-9]+\|([a-zA-Z0-9_]+)')
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
f = feats[0]
assert f['name'] == 'Q02123|VNBP_POPMV', "fasta re_name"
assert f['uniquename'] == 'VNBP_POPMV', "fasta re_uniquename"
def test_load_fasta_wrong_rename(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], re_name=r'([a-zA-Z]+)\|[a-zA-Z0-9_]+', re_uniquename=r'([a-zA-Z]+)\|[a-zA-Z0-9_]+')
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
f = feats[0]
assert f['name'] == 'Q02123|VNBP_POPMV', "fasta wrong re_name"
assert f['uniquename'] == 'Q02123|VNBP_POPMV', "fasta wrong re_uniquename"
def test_load_fasta_update(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'])
self.ci.feature.load_fasta(fasta="./test-data/proteins_alt.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], update=True)
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
f = feats[0]
assert f['residues'] == 'MVXXXDPR', "fasta features properly updated"
assert f['seqlen'] == 8, "fasta features properly updated"
assert f['md5checksum'] == '2eb1f62c8a6af8662f4ad1e155543779', "fasta features properly updated"
@raises(Exception)
def test_load_fasta_update_fail(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'])
self.ci.feature.load_fasta(fasta="./test-data/proteins_alt.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'])
@raises(Exception)
def test_load_fasta_duplicate(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins_dup.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'])
def test_load_fasta_update_match_on_name_fail(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], re_uniquename=r'[a-zA-Z0-9]+\|([a-zA-Z0-9_]+)')
self.ci.feature.load_fasta(fasta="./test-data/proteins_alt.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], update=True)
feats = self.ci.feature.get_features(organism_id=org['organism_id'], uniquename='Q02123|VNBP_POPMV')
f = feats[0]
assert f['residues'] == 'MVXXXDPR', "fasta features properly updated"
assert f['seqlen'] == 8, "fasta features properly updated"
assert f['md5checksum'] == '2eb1f62c8a6af8662f4ad1e155543779', "fasta features properly updated"
feats = self.ci.feature.get_features(organism_id=org['organism_id'], uniquename='VNBP_POPMV')
f = feats[0]
assert f['seqlen'] == 123, "fasta features properly updated"
def test_load_fasta_update_match_on_name(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], re_uniquename=r'[a-zA-Z0-9]+\|([a-zA-Z0-9_]+)')
self.ci.feature.load_fasta(fasta="./test-data/proteins_alt.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], update=True, match_on_name=True)
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
f = feats[0]
assert f['residues'] == 'MVXXXDPR', "fasta features properly updated"
assert f['seqlen'] == 8, "fasta features properly updated"
assert f['md5checksum'] == '2eb1f62c8a6af8662f4ad1e155543779', "fasta features properly updated"
def test_load_fasta_db(self):
org = self._create_fake_org()
an = self._create_fake_an()
# get a test db
db = self.ci.session.query(self.ci.model.db).filter_by(name='null')[0]
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], db=db.db_id, re_db_accession=r'[a-zA-Z0-9]+\|([a-zA-Z0-9_]+)')
feats = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
f = feats[0]
assert f['dbxref_id'] > 0, "dbxref created"
dbxref = self.ci.session.query(self.ci.model.dbxref).filter_by(dbxref_id=f['dbxref_id'])[0]
assert dbxref.accession == 'VNBP_POPMV', "dbxref created correctly"
def test_load_fasta_relation(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], re_name=r'[a-zA-Z0-9]+\|([a-zA-Z0-9_]+)', re_uniquename=r'[a-zA-Z0-9]+\|([a-zA-Z0-9_]+)', sequence_type='supercontig')
self.ci.feature.load_fasta(fasta="./test-data/proteins_alt.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'], re_name=r'([a-zA-Z0-9]+)\|[a-zA-Z0-9_]+', re_uniquename=r'([a-zA-Z0-9]+)\|[a-zA-Z0-9_]+', rel_type='part_of', re_parent=r'[a-zA-Z0-9]+\|([a-zA-Z0-9_]+)', parent_type='supercontig')
child = self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123')[0]
parent = self.ci.feature.get_features(organism_id=org['organism_id'], name='VNBP_POPMV')[0]
rship = self.ci.session.query(self.ci.model.feature_relationship).filter_by(subject_id=child['feature_id']).all()[0]
partofterm = self.ci.get_cvterm_id('part_of', 'sequence')
assert rship.subject_id == child['feature_id'], "fasta features properly loaded with relationship"
assert rship.object_id == parent['feature_id'], "fasta features properly loaded with relationship"
assert rship.type_id == partofterm, "fasta features properly loaded with relationship"
def test_load_fasta_remove_by_org(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'])
self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
assert len(self.ci.feature.get_features()) == 21, "features are loaded"
self.ci.organism.delete_organisms(organism_id=org['organism_id'])
assert len(self.ci.feature.get_features()) == 0, "features removed when removing organism"
def test_load_fasta_remove_by_an(self):
org = self._create_fake_org()
an = self._create_fake_an()
self.ci.feature.load_fasta(fasta="./test-data/proteins.fa", analysis_id=an['analysis_id'], organism_id=org['organism_id'])
self.ci.feature.get_features(organism_id=org['organism_id'], name='Q02123|VNBP_POPMV')
assert len(self.ci.feature.get_features()) == 21, "features are loaded"
self.ci.analysis.delete_analyses(analysis_id=an['analysis_id'])
assert len(self.ci.feature.get_features()) == 0, "features removed when removing analysis"
def _del_dbxref(self):
self.ci.session.query(self.ci.model.dbxref).filter(
self.ci.model.dbxref.db_id == 1,
(self.ci.model.dbxref.accession.like('VNBP%') | self.ci.model.dbxref.accession.like('%VIRU'))
).delete(synchronize_session='fetch')
def setUp(self):
self.ci = ci
self.ci.organism.delete_organisms()
self.ci.analysis.delete_analyses()
self.ci.feature.delete_features()
# Make sure dbxref are deleted too
self._del_dbxref()
self.ci.session.commit()
def tearDown(self):
self.ci.organism.delete_organisms()
self.ci.analysis.delete_analyses()
self.ci.feature.delete_features()
# Make sure dbxref are deleted too
self._del_dbxref()
self.ci.session.commit()
| 52.259259 | 315 | 0.680103 | 1,579 | 11,288 | 4.611146 | 0.080431 | 0.096141 | 0.069633 | 0.100948 | 0.868699 | 0.836836 | 0.777366 | 0.734377 | 0.718995 | 0.667765 | 0 | 0.02746 | 0.164422 | 11,288 | 215 | 316 | 52.502326 | 0.744487 | 0.006999 | 0 | 0.522876 | 0 | 0 | 0.311111 | 0.094065 | 0 | 0 | 0 | 0 | 0.24183 | 1 | 0.111111 | false | 0 | 0.013072 | 0 | 0.130719 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
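
A condensed sketch of the happy path these tests exercise; the arguments are copied from the test bodies, with ci, org and an standing for the connected instance and the fake organism/analysis created in setUp:

ci.feature.load_fasta(fasta="./test-data/proteins.fa",
                      analysis_id=an['analysis_id'],
                      organism_id=org['organism_id'])
feats = ci.feature.get_features(organism_id=org['organism_id'],
                                name='Q02123|VNBP_POPMV')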
fa11768d74a212392fc7b9a10ce959316869beae | 34 | py | Python | hius/responses.py | WoolenSweater/liteapi | 01339f65b68af67253459609fb60ae35e5a50604 | ["MIT"] | null | null | null | hius/responses.py | WoolenSweater/liteapi | 01339f65b68af67253459609fb60ae35e5a50604 | ["MIT"] | null | null | null | hius/responses.py | WoolenSweater/liteapi | 01339f65b68af67253459609fb60ae35e5a50604 | ["MIT"] | null | null | null |
from starlette.responses import *
| 17 | 33 | 0.823529 | 4 | 34 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fa2c663f3b06bff1a2567e240e0eb5a110f2416b | 38,074 | py | Python | third_party/tlslite/scripts/tls.py | nagineni/chromium-crosswalk | 5725642f1c67d0f97e8613ec1c3e8107ab53fdf8 | ["BSD-3-Clause-No-Nuclear-License-2014", "BSD-3-Clause"] | 231 | 2015-01-08T09:04:44.000Z | 2021-12-30T03:03:10.000Z | third_party/tlslite/scripts/tls.py | 1065672644894730302/Chromium | 239dd49e906be4909e293d8991e998c9816eaa35 | ["BSD-3-Clause"] | 5 | 2015-03-27T14:29:23.000Z | 2019-09-25T13:23:12.000Z | third_party/tlslite/scripts/tls.py | 1065672644894730302/Chromium | 239dd49e906be4909e293d8991e998c9816eaa35 | ["BSD-3-Clause"] | 268 | 2015-01-21T05:53:28.000Z | 2022-03-25T22:09:01.000Z |
#! python
import sys
import os
import os.path
import socket
import thread
import time
import httplib
import BaseHTTPServer
import SimpleHTTPServer
try:
from cryptoIDlib.api import *
cryptoIDlibLoaded = True
except:
cryptoIDlibLoaded = False
if __name__ != "__main__":
raise "This must be run as a command, not used as a module!"
#import tlslite
#from tlslite.constants import AlertDescription, Fault
#from tlslite.utils.jython_compat import formatExceptionTrace
#from tlslite.X509 import X509, X509CertChain
from tlslite.api import *
def parsePrivateKey(s):
try:
return parsePEMKey(s, private=True)
except Exception, e:
print e
return parseXMLKey(s, private=True)
def clientTest(address, dir):
#Split address into hostname/port tuple
address = address.split(":")
if len(address)==1:
address.append("4443")
address = ( address[0], int(address[1]) )
def connect():
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if hasattr(sock, 'settimeout'): #It's a python 2.3 feature
sock.settimeout(5)
sock.connect(address)
c = TLSConnection(sock)
return c
test = 0
badFault = False
print "Test 1 - good shared key"
connection = connect()
connection.handshakeClientSharedKey("shared", "key")
connection.close()
connection.sock.close()
print "Test 2 - shared key faults"
for fault in Fault.clientSharedKeyFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientSharedKey("shared", "key")
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
print "Test 3 - good SRP"
connection = connect()
connection.handshakeClientSRP("test", "password")
connection.close()
print "Test 4 - SRP faults"
for fault in Fault.clientSrpFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientSRP("test", "password")
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
print "Test 5 - good SRP: unknown_srp_username idiom"
def srpCallback():
return ("test", "password")
connection = connect()
connection.handshakeClientUnknown(srpCallback=srpCallback)
connection.close()
connection.sock.close()
print "Test 6 - good SRP: with X.509 certificate"
connection = connect()
connection.handshakeClientSRP("test", "password")
assert(isinstance(connection.session.serverCertChain, X509CertChain))
connection.close()
connection.sock.close()
print "Test 7 - X.509 with SRP faults"
for fault in Fault.clientSrpFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientSRP("test", "password")
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
if cryptoIDlibLoaded:
print "Test 8 - good SRP: with cryptoID certificate chain"
connection = connect()
connection.handshakeClientSRP("test", "password")
assert(isinstance(connection.session.serverCertChain, CertChain))
if not (connection.session.serverCertChain.validate()):
print connection.session.serverCertChain.validate(listProblems=True)
connection.close()
connection.sock.close()
print "Test 9 - CryptoID with SRP faults"
for fault in Fault.clientSrpFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientSRP("test", "password")
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
print "Test 10 - good X509"
connection = connect()
connection.handshakeClientCert()
assert(isinstance(connection.session.serverCertChain, X509CertChain))
connection.close()
connection.sock.close()
print "Test 10.a - good X509, SSLv3"
connection = connect()
settings = HandshakeSettings()
settings.minVersion = (3,0)
settings.maxVersion = (3,0)
connection.handshakeClientCert(settings=settings)
assert(isinstance(connection.session.serverCertChain, X509CertChain))
connection.close()
connection.sock.close()
print "Test 11 - X.509 faults"
for fault in Fault.clientNoAuthFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientCert()
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
if cryptoIDlibLoaded:
print "Test 12 - good cryptoID"
connection = connect()
connection.handshakeClientCert()
assert(isinstance(connection.session.serverCertChain, CertChain))
assert(connection.session.serverCertChain.validate())
connection.close()
connection.sock.close()
print "Test 13 - cryptoID faults"
for fault in Fault.clientNoAuthFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientCert()
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
print "Test 14 - good mutual X509"
x509Cert = X509().parse(open(os.path.join(dir, "clientX509Cert.pem")).read())
x509Chain = X509CertChain([x509Cert])
s = open(os.path.join(dir, "clientX509Key.pem")).read()
x509Key = parsePEMKey(s, private=True)
connection = connect()
connection.handshakeClientCert(x509Chain, x509Key)
assert(isinstance(connection.session.serverCertChain, X509CertChain))
connection.close()
connection.sock.close()
print "Test 14.a - good mutual X509, SSLv3"
connection = connect()
settings = HandshakeSettings()
settings.minVersion = (3,0)
settings.maxVersion = (3,0)
connection.handshakeClientCert(x509Chain, x509Key, settings=settings)
assert(isinstance(connection.session.serverCertChain, X509CertChain))
connection.close()
connection.sock.close()
print "Test 15 - mutual X.509 faults"
for fault in Fault.clientCertFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientCert(x509Chain, x509Key)
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
if cryptoIDlibLoaded:
print "Test 16 - good mutual cryptoID"
cryptoIDChain = CertChain().parse(open(os.path.join(dir, "serverCryptoIDChain.xml"), "r").read())
cryptoIDKey = parseXMLKey(open(os.path.join(dir, "serverCryptoIDKey.xml"), "r").read(), private=True)
connection = connect()
connection.handshakeClientCert(cryptoIDChain, cryptoIDKey)
assert(isinstance(connection.session.serverCertChain, CertChain))
assert(connection.session.serverCertChain.validate())
connection.close()
connection.sock.close()
print "Test 17 - mutual cryptoID faults"
for fault in Fault.clientCertFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeClientCert(cryptoIDChain, cryptoIDKey)
print " Good Fault %s" % (Fault.faultNames[fault])
except TLSFaultError, e:
print " BAD FAULT %s: %s" % (Fault.faultNames[fault], str(e))
badFault = True
connection.sock.close()
print "Test 18 - good SRP, prepare to resume..."
connection = connect()
connection.handshakeClientSRP("test", "password")
connection.close()
connection.sock.close()
session = connection.session
print "Test 19 - resumption"
connection = connect()
connection.handshakeClientSRP("test", "garbage", session=session)
#Don't close! -- see below
print "Test 20 - invalidated resumption"
connection.sock.close() #Close the socket without a close_notify!
connection = connect()
try:
connection.handshakeClientSRP("test", "garbage", session=session)
        assert(False) #The handshake should have failed
except TLSRemoteAlert, alert:
if alert.description != AlertDescription.bad_record_mac:
raise
connection.sock.close()
print "Test 21 - HTTPS test X.509"
address = address[0], address[1]+1
if hasattr(socket, "timeout"):
timeoutEx = socket.timeout
else:
timeoutEx = socket.error
while 1:
try:
time.sleep(2)
htmlBody = open(os.path.join(dir, "index.html")).read()
fingerprint = None
for y in range(2):
h = HTTPTLSConnection(\
address[0], address[1], x509Fingerprint=fingerprint)
for x in range(3):
h.request("GET", "/index.html")
r = h.getresponse()
assert(r.status == 200)
s = r.read()
assert(s == htmlBody)
fingerprint = h.tlsSession.serverCertChain.getFingerprint()
assert(fingerprint)
time.sleep(2)
break
except timeoutEx:
print "timeout, retrying..."
pass
if cryptoIDlibLoaded:
print "Test 21a - HTTPS test SRP+cryptoID"
address = address[0], address[1]+1
if hasattr(socket, "timeout"):
timeoutEx = socket.timeout
else:
timeoutEx = socket.error
while 1:
try:
time.sleep(2) #Time to generate key and cryptoID
htmlBody = open(os.path.join(dir, "index.html")).read()
fingerprint = None
protocol = None
for y in range(2):
h = HTTPTLSConnection(\
address[0], address[1],
username="test", password="password",
cryptoID=fingerprint, protocol=protocol)
for x in range(3):
h.request("GET", "/index.html")
r = h.getresponse()
assert(r.status == 200)
s = r.read()
assert(s == htmlBody)
fingerprint = h.tlsSession.serverCertChain.cryptoID
assert(fingerprint)
protocol = "urn:whatever"
time.sleep(2)
break
except timeoutEx:
print "timeout, retrying..."
pass
address = address[0], address[1]+1
implementations = []
if cryptlibpyLoaded:
implementations.append("cryptlib")
if m2cryptoLoaded:
implementations.append("openssl")
if pycryptoLoaded:
implementations.append("pycrypto")
implementations.append("python")
print "Test 22 - different ciphers"
for implementation in implementations:
for cipher in ["aes128", "aes256", "rc4"]:
print "Test 22:",
connection = connect()
settings = HandshakeSettings()
settings.cipherNames = [cipher]
settings.cipherImplementations = [implementation, "python"]
connection.handshakeClientSharedKey("shared", "key", settings=settings)
print ("%s %s" % (connection.getCipherName(), connection.getCipherImplementation()))
connection.write("hello")
h = connection.read(min=5, max=5)
assert(h == "hello")
connection.close()
connection.sock.close()
print "Test 23 - throughput test"
for implementation in implementations:
for cipher in ["aes128", "aes256", "3des", "rc4"]:
if cipher == "3des" and implementation not in ("openssl", "cryptlib", "pycrypto"):
continue
print "Test 23:",
connection = connect()
settings = HandshakeSettings()
settings.cipherNames = [cipher]
settings.cipherImplementations = [implementation, "python"]
connection.handshakeClientSharedKey("shared", "key", settings=settings)
print ("%s %s:" % (connection.getCipherName(), connection.getCipherImplementation())),
startTime = time.clock()
connection.write("hello"*10000)
h = connection.read(min=50000, max=50000)
stopTime = time.clock()
print "100K exchanged at rate of %d bytes/sec" % int(100000/(stopTime-startTime))
assert(h == "hello"*10000)
connection.close()
connection.sock.close()
print "Test 24 - Internet servers test"
try:
i = IMAP4_TLS("cyrus.andrew.cmu.edu")
i.login("anonymous", "anonymous@anonymous.net")
i.logout()
print "Test 24: IMAP4 good"
p = POP3_TLS("pop.gmail.com")
p.quit()
print "Test 24: POP3 good"
except socket.error, e:
print "Non-critical error: socket error trying to reach internet server: ", e
if not badFault:
print "Test succeeded"
else:
print "Test failed"
def serverTest(address, dir):
#Split address into hostname/port tuple
address = address.split(":")
if len(address)==1:
address.append("4443")
address = ( address[0], int(address[1]) )
#Connect to server
lsock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
lsock.bind(address)
lsock.listen(5)
def connect():
return TLSConnection(lsock.accept()[0])
print "Test 1 - good shared key"
sharedKeyDB = SharedKeyDB()
sharedKeyDB["shared"] = "key"
sharedKeyDB["shared2"] = "key2"
connection = connect()
connection.handshakeServer(sharedKeyDB=sharedKeyDB)
connection.close()
connection.sock.close()
print "Test 2 - shared key faults"
for fault in Fault.clientSharedKeyFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(sharedKeyDB=sharedKeyDB)
            assert(False)
except:
pass
connection.sock.close()
print "Test 3 - good SRP"
#verifierDB = tlslite.VerifierDB(os.path.join(dir, "verifierDB"))
#verifierDB.open()
verifierDB = VerifierDB()
verifierDB.create()
entry = VerifierDB.makeVerifier("test", "password", 1536)
verifierDB["test"] = entry
connection = connect()
connection.handshakeServer(verifierDB=verifierDB)
connection.close()
connection.sock.close()
print "Test 4 - SRP faults"
for fault in Fault.clientSrpFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(verifierDB=verifierDB)
            assert(False)
except:
pass
connection.sock.close()
print "Test 5 - good SRP: unknown_srp_username idiom"
connection = connect()
connection.handshakeServer(verifierDB=verifierDB)
connection.close()
connection.sock.close()
print "Test 6 - good SRP: with X.509 cert"
x509Cert = X509().parse(open(os.path.join(dir, "serverX509Cert.pem")).read())
x509Chain = X509CertChain([x509Cert])
s = open(os.path.join(dir, "serverX509Key.pem")).read()
x509Key = parsePEMKey(s, private=True)
connection = connect()
connection.handshakeServer(verifierDB=verifierDB, \
certChain=x509Chain, privateKey=x509Key)
connection.close()
connection.sock.close()
print "Test 7 - X.509 with SRP faults"
for fault in Fault.clientSrpFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(verifierDB=verifierDB, \
certChain=x509Chain, privateKey=x509Key)
            assert(False)
except:
pass
connection.sock.close()
if cryptoIDlibLoaded:
print "Test 8 - good SRP: with cryptoID certs"
cryptoIDChain = CertChain().parse(open(os.path.join(dir, "serverCryptoIDChain.xml"), "r").read())
cryptoIDKey = parseXMLKey(open(os.path.join(dir, "serverCryptoIDKey.xml"), "r").read(), private=True)
connection = connect()
connection.handshakeServer(verifierDB=verifierDB, \
certChain=cryptoIDChain, privateKey=cryptoIDKey)
connection.close()
connection.sock.close()
print "Test 9 - cryptoID with SRP faults"
for fault in Fault.clientSrpFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(verifierDB=verifierDB, \
certChain=cryptoIDChain, privateKey=cryptoIDKey)
                assert(False)
except:
pass
connection.sock.close()
print "Test 10 - good X.509"
connection = connect()
connection.handshakeServer(certChain=x509Chain, privateKey=x509Key)
connection.close()
connection.sock.close()
print "Test 10.a - good X.509, SSL v3"
connection = connect()
settings = HandshakeSettings()
settings.minVersion = (3,0)
settings.maxVersion = (3,0)
connection.handshakeServer(certChain=x509Chain, privateKey=x509Key, settings=settings)
connection.close()
connection.sock.close()
print "Test 11 - X.509 faults"
for fault in Fault.clientNoAuthFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(certChain=x509Chain, privateKey=x509Key)
            assert(False)
except:
pass
connection.sock.close()
if cryptoIDlibLoaded:
print "Test 12 - good cryptoID"
connection = connect()
connection.handshakeServer(certChain=cryptoIDChain, privateKey=cryptoIDKey)
connection.close()
connection.sock.close()
print "Test 13 - cryptoID faults"
for fault in Fault.clientNoAuthFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(certChain=cryptoIDChain, privateKey=cryptoIDKey)
                assert(False)
except:
pass
connection.sock.close()
print "Test 14 - good mutual X.509"
connection = connect()
connection.handshakeServer(certChain=x509Chain, privateKey=x509Key, reqCert=True)
assert(isinstance(connection.session.serverCertChain, X509CertChain))
connection.close()
connection.sock.close()
print "Test 14a - good mutual X.509, SSLv3"
connection = connect()
settings = HandshakeSettings()
settings.minVersion = (3,0)
settings.maxVersion = (3,0)
connection.handshakeServer(certChain=x509Chain, privateKey=x509Key, reqCert=True, settings=settings)
assert(isinstance(connection.session.serverCertChain, X509CertChain))
connection.close()
connection.sock.close()
print "Test 15 - mutual X.509 faults"
for fault in Fault.clientCertFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(certChain=x509Chain, privateKey=x509Key, reqCert=True)
            assert(False)
except:
pass
connection.sock.close()
if cryptoIDlibLoaded:
print "Test 16 - good mutual cryptoID"
connection = connect()
connection.handshakeServer(certChain=cryptoIDChain, privateKey=cryptoIDKey, reqCert=True)
assert(isinstance(connection.session.serverCertChain, CertChain))
assert(connection.session.serverCertChain.validate())
connection.close()
connection.sock.close()
print "Test 17 - mutual cryptoID faults"
for fault in Fault.clientCertFaults + Fault.genericFaults:
connection = connect()
connection.fault = fault
try:
connection.handshakeServer(certChain=cryptoIDChain, privateKey=cryptoIDKey, reqCert=True)
                assert(False)
except:
pass
connection.sock.close()
print "Test 18 - good SRP, prepare to resume"
sessionCache = SessionCache()
connection = connect()
connection.handshakeServer(verifierDB=verifierDB, sessionCache=sessionCache)
connection.close()
connection.sock.close()
print "Test 19 - resumption"
connection = connect()
connection.handshakeServer(verifierDB=verifierDB, sessionCache=sessionCache)
#Don't close! -- see next test
print "Test 20 - invalidated resumption"
try:
connection.read(min=1, max=1)
        assert(False) #Client is going to close the socket without a close_notify
except TLSAbruptCloseError, e:
pass
connection = connect()
try:
connection.handshakeServer(verifierDB=verifierDB, sessionCache=sessionCache)
except TLSLocalAlert, alert:
if alert.description != AlertDescription.bad_record_mac:
raise
connection.sock.close()
print "Test 21 - HTTPS test X.509"
#Close the current listening socket
lsock.close()
#Create and run an HTTP Server using TLSSocketServerMixIn
class MyHTTPServer(TLSSocketServerMixIn,
BaseHTTPServer.HTTPServer):
def handshake(self, tlsConnection):
tlsConnection.handshakeServer(certChain=x509Chain, privateKey=x509Key)
return True
cd = os.getcwd()
os.chdir(dir)
address = address[0], address[1]+1
httpd = MyHTTPServer(address, SimpleHTTPServer.SimpleHTTPRequestHandler)
for x in range(6):
httpd.handle_request()
httpd.server_close()
    os.chdir(cd) #Restore the original working directory
if cryptoIDlibLoaded:
print "Test 21a - HTTPS test SRP+cryptoID"
#Create and run an HTTP Server using TLSSocketServerMixIn
class MyHTTPServer(TLSSocketServerMixIn,
BaseHTTPServer.HTTPServer):
def handshake(self, tlsConnection):
tlsConnection.handshakeServer(certChain=cryptoIDChain, privateKey=cryptoIDKey,
verifierDB=verifierDB)
return True
cd = os.getcwd()
os.chdir(dir)
address = address[0], address[1]+1
httpd = MyHTTPServer(address, SimpleHTTPServer.SimpleHTTPRequestHandler)
for x in range(6):
httpd.handle_request()
httpd.server_close()
        os.chdir(cd) #Restore the original working directory
#Re-connect the listening socket
lsock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
address = address[0], address[1]+1
lsock.bind(address)
lsock.listen(5)
def connect():
return TLSConnection(lsock.accept()[0])
implementations = []
if cryptlibpyLoaded:
implementations.append("cryptlib")
if m2cryptoLoaded:
implementations.append("openssl")
if pycryptoLoaded:
implementations.append("pycrypto")
implementations.append("python")
print "Test 22 - different ciphers"
for implementation in ["python"] * len(implementations):
for cipher in ["aes128", "aes256", "rc4"]:
print "Test 22:",
connection = connect()
settings = HandshakeSettings()
settings.cipherNames = [cipher]
settings.cipherImplementations = [implementation, "python"]
connection.handshakeServer(sharedKeyDB=sharedKeyDB, settings=settings)
print connection.getCipherName(), connection.getCipherImplementation()
h = connection.read(min=5, max=5)
assert(h == "hello")
connection.write(h)
connection.close()
connection.sock.close()
print "Test 23 - throughput test"
for implementation in implementations:
for cipher in ["aes128", "aes256", "3des", "rc4"]:
if cipher == "3des" and implementation not in ("openssl", "cryptlib", "pycrypto"):
continue
print "Test 23:",
connection = connect()
settings = HandshakeSettings()
settings.cipherNames = [cipher]
settings.cipherImplementations = [implementation, "python"]
connection.handshakeServer(sharedKeyDB=sharedKeyDB, settings=settings)
print connection.getCipherName(), connection.getCipherImplementation()
h = connection.read(min=50000, max=50000)
assert(h == "hello"*10000)
connection.write(h)
connection.close()
connection.sock.close()
print "Test succeeded"
if len(sys.argv) == 1 or (len(sys.argv)==2 and sys.argv[1].lower().endswith("help")):
print ""
print "Version: 0.3.8"
print ""
print "RNG: %s" % prngName
print ""
print "Modules:"
if cryptlibpyLoaded:
print " cryptlib_py : Loaded"
else:
print " cryptlib_py : Not Loaded"
if m2cryptoLoaded:
print " M2Crypto : Loaded"
else:
print " M2Crypto : Not Loaded"
if pycryptoLoaded:
print " pycrypto : Loaded"
else:
print " pycrypto : Not Loaded"
if gmpyLoaded:
print " GMPY : Loaded"
else:
print " GMPY : Not Loaded"
if cryptoIDlibLoaded:
print " cryptoIDlib : Loaded"
else:
print " cryptoIDlib : Not Loaded"
print ""
print "Commands:"
print ""
print " clientcert <server> [<chain> <key>]"
print " clientsharedkey <server> <user> <pass>"
print " clientsrp <server> <user> <pass>"
print " clienttest <server> <dir>"
print ""
print " serversrp <server> <verifierDB>"
print " servercert <server> <chain> <key> [req]"
print " serversrpcert <server> <verifierDB> <chain> <key>"
print " serversharedkey <server> <sharedkeyDB>"
print " servertest <server> <dir>"
sys.exit()
cmd = sys.argv[1].lower()
class Args:
def __init__(self, argv):
self.argv = argv
def get(self, index):
if len(self.argv)<=index:
raise SyntaxError("Not enough arguments")
return self.argv[index]
def getLast(self, index):
if len(self.argv)>index+1:
raise SyntaxError("Too many arguments")
return self.get(index)
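#Sketch of how Args behaves, mirroring the calls made below:
#    args = Args(["tls.py", "clientsrp", "host:4443", "user", "pass"])
#    args.get(2)     #-> "host:4443"
#    args.getLast(4) #-> "pass"; raises SyntaxError if extra args follow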
args = Args(sys.argv)
def reformatDocString(s):
lines = s.splitlines()
newLines = []
for line in lines:
newLines.append(" " + line.strip())
return "\n".join(newLines)
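#For example: reformatDocString("a\n  b") == " a\n b"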
try:
if cmd == "clienttest":
address = args.get(2)
dir = args.getLast(3)
clientTest(address, dir)
sys.exit()
elif cmd.startswith("client"):
address = args.get(2)
#Split address into hostname/port tuple
address = address.split(":")
if len(address)==1:
address.append("4443")
address = ( address[0], int(address[1]) )
def connect():
#Connect to server
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if hasattr(sock, "settimeout"):
sock.settimeout(5)
sock.connect(address)
#Instantiate TLSConnections
return TLSConnection(sock)
try:
if cmd == "clientsrp":
username = args.get(3)
password = args.getLast(4)
connection = connect()
start = time.clock()
connection.handshakeClientSRP(username, password)
elif cmd == "clientsharedkey":
username = args.get(3)
password = args.getLast(4)
connection = connect()
start = time.clock()
connection.handshakeClientSharedKey(username, password)
elif cmd == "clientcert":
certChain = None
privateKey = None
if len(sys.argv) > 3:
certFilename = args.get(3)
keyFilename = args.getLast(4)
s1 = open(certFilename, "rb").read()
s2 = open(keyFilename, "rb").read()
#Try to create cryptoID cert chain
if cryptoIDlibLoaded:
try:
certChain = CertChain().parse(s1)
privateKey = parsePrivateKey(s2)
except:
certChain = None
privateKey = None
#Try to create X.509 cert chain
if not certChain:
x509 = X509()
x509.parse(s1)
certChain = X509CertChain([x509])
privateKey = parsePrivateKey(s2)
connection = connect()
start = time.clock()
connection.handshakeClientCert(certChain, privateKey)
else:
raise SyntaxError("Unknown command")
except TLSLocalAlert, a:
if a.description == AlertDescription.bad_record_mac:
if cmd == "clientsharedkey":
print "Bad sharedkey password"
else:
raise
elif a.description == AlertDescription.user_canceled:
print str(a)
else:
raise
sys.exit()
except TLSRemoteAlert, a:
if a.description == AlertDescription.unknown_srp_username:
if cmd == "clientsrp":
print "Unknown username"
else:
raise
elif a.description == AlertDescription.bad_record_mac:
if cmd == "clientsrp":
print "Bad username or password"
else:
raise
elif a.description == AlertDescription.handshake_failure:
print "Unable to negotiate mutually acceptable parameters"
else:
raise
sys.exit()
stop = time.clock()
print "Handshake success"
print " Handshake time: %.4f seconds" % (stop - start)
print " Version: %s.%s" % connection.version
print " Cipher: %s %s" % (connection.getCipherName(), connection.getCipherImplementation())
if connection.session.srpUsername:
print " Client SRP username: %s" % connection.session.srpUsername
if connection.session.sharedKeyUsername:
print " Client shared key username: %s" % connection.session.sharedKeyUsername
if connection.session.clientCertChain:
print " Client fingerprint: %s" % connection.session.clientCertChain.getFingerprint()
if connection.session.serverCertChain:
print " Server fingerprint: %s" % connection.session.serverCertChain.getFingerprint()
connection.close()
connection.sock.close()
elif cmd.startswith("server"):
address = args.get(2)
#Split address into hostname/port tuple
address = address.split(":")
if len(address)==1:
address.append("4443")
address = ( address[0], int(address[1]) )
verifierDBFilename = None
sharedKeyDBFilename = None
certFilename = None
keyFilename = None
sharedKeyDB = None
reqCert = False
if cmd == "serversrp":
verifierDBFilename = args.getLast(3)
elif cmd == "servercert":
certFilename = args.get(3)
keyFilename = args.get(4)
if len(sys.argv)>=6:
req = args.getLast(5)
if req.lower() != "req":
                raise SyntaxError("Expected 'req' as the final argument")
reqCert = True
elif cmd == "serversrpcert":
verifierDBFilename = args.get(3)
certFilename = args.get(4)
keyFilename = args.getLast(5)
elif cmd == "serversharedkey":
sharedKeyDBFilename = args.getLast(3)
elif cmd == "servertest":
address = args.get(2)
dir = args.getLast(3)
serverTest(address, dir)
sys.exit()
verifierDB = None
if verifierDBFilename:
verifierDB = VerifierDB(verifierDBFilename)
verifierDB.open()
sharedKeyDB = None
if sharedKeyDBFilename:
sharedKeyDB = SharedKeyDB(sharedKeyDBFilename)
sharedKeyDB.open()
certChain = None
privateKey = None
if certFilename:
s1 = open(certFilename, "rb").read()
s2 = open(keyFilename, "rb").read()
#Try to create cryptoID cert chain
if cryptoIDlibLoaded:
try:
certChain = CertChain().parse(s1)
privateKey = parsePrivateKey(s2)
except:
certChain = None
privateKey = None
#Try to create X.509 cert chain
if not certChain:
x509 = X509()
x509.parse(s1)
certChain = X509CertChain([x509])
privateKey = parsePrivateKey(s2)
    #Create handler function - performs handshake, then echoes all bytes received
def handler(sock):
try:
connection = TLSConnection(sock)
settings = HandshakeSettings()
connection.handshakeServer(sharedKeyDB=sharedKeyDB, verifierDB=verifierDB, \
certChain=certChain, privateKey=privateKey, \
reqCert=reqCert, settings=settings)
print "Handshake success"
print " Version: %s.%s" % connection.version
print " Cipher: %s %s" % (connection.getCipherName(), connection.getCipherImplementation())
if connection.session.srpUsername:
print " Client SRP username: %s" % connection.session.srpUsername
if connection.session.sharedKeyUsername:
print " Client shared key username: %s" % connection.session.sharedKeyUsername
if connection.session.clientCertChain:
print " Client fingerprint: %s" % connection.session.clientCertChain.getFingerprint()
if connection.session.serverCertChain:
print " Server fingerprint: %s" % connection.session.serverCertChain.getFingerprint()
s = ""
while 1:
newS = connection.read()
if not newS:
break
s += newS
if s[-1]=='\n':
connection.write(s)
s = ""
except TLSLocalAlert, a:
if a.description == AlertDescription.unknown_srp_username:
print "Unknown SRP username"
elif a.description == AlertDescription.bad_record_mac:
if cmd == "serversrp" or cmd == "serversrpcert":
print "Bad SRP password for:", connection.allegedSrpUsername
else:
raise
elif a.description == AlertDescription.handshake_failure:
print "Unable to negotiate mutually acceptable parameters"
else:
raise
except TLSRemoteAlert, a:
if a.description == AlertDescription.bad_record_mac:
if cmd == "serversharedkey":
print "Bad sharedkey password for:", connection.allegedSharedKeyUsername
else:
raise
elif a.description == AlertDescription.user_canceled:
print "Handshake cancelled"
elif a.description == AlertDescription.handshake_failure:
print "Unable to negotiate mutually acceptable parameters"
elif a.description == AlertDescription.close_notify:
pass
else:
raise
#Run multi-threaded server
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(address)
sock.listen(5)
while 1:
(newsock, cliAddress) = sock.accept()
thread.start_new_thread(handler, (newsock,))
else:
print "Bad command: '%s'" % cmd
except TLSRemoteAlert, a:
print str(a)
raise
| 35.417674 | 109 | 0.587146 | 3,527 | 38,074 | 6.320953 | 0.117097 | 0.025029 | 0.040056 | 0.040908 | 0.757199 | 0.743294 | 0.721136 | 0.702476 | 0.672961 | 0.641966 | 0 | 0.022406 | 0.315438 | 38,074 | 1,074 | 110 | 35.450652 | 0.83295 | 0.028497 | 0 | 0.758889 | 0 | 0 | 0.12042 | 0.003004 | 0 | 0 | 0 | 0 | 0.037778 | 0 | null | null | 0.035556 | 0.012222 | null | null | 0.17 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fa2f5e7c774438ccef45173567e9e3b6db90a13f | 118 | py | Python | teamtrees/__init__.py | nm17/teamtrees | 7f8e22cf318e4d553468d4f0cfc1544eb4c568e4 | [
"MIT"
] | 2 | 2019-10-30T13:43:37.000Z | 2021-08-18T11:50:40.000Z | teamtrees/__init__.py | nm17/teamtrees | 7f8e22cf318e4d553468d4f0cfc1544eb4c568e4 | [
"MIT"
] | null | null | null | teamtrees/__init__.py | nm17/teamtrees | 7f8e22cf318e4d553468d4f0cfc1544eb4c568e4 | [
"MIT"
] | null | null | null | from teamtrees.counter import get_trees_count
from teamtrees.donations import get_top_donations, get_recent_donations
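# Usage sketch; the call signatures are assumed from the imported names and
# are not verified against the implementation:
#     print(get_trees_count())
#     print(get_top_donations())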
| 39.333333 | 71 | 0.898305 | 17 | 118 | 5.882353 | 0.588235 | 0.26 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076271 | 118 | 2 | 72 | 59 | 0.917431 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d721e822f3719a9c5150cb034654afb62656a496 | 124 | py | Python | tag/admin.py | noriHanda/study_matching | 55e9525c9502c45c799f8f3a4c9c2b54406ed9e1 | [
"MIT"
] | null | null | null | tag/admin.py | noriHanda/study_matching | 55e9525c9502c45c799f8f3a4c9c2b54406ed9e1 | [
"MIT"
] | null | null | null | tag/admin.py | noriHanda/study_matching | 55e9525c9502c45c799f8f3a4c9c2b54406ed9e1 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Tag
@admin.register(Tag)
class TagAdmin(admin.ModelAdmin):
pass
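# Registering a bare ModelAdmin like this is equivalent to
# admin.site.register(Tag). Display options could be added later, e.g.
# (hypothetical; assumes Tag has a `name` field):
#     list_display = ('name',)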
| 13.777778 | 33 | 0.766129 | 17 | 124 | 5.588235 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153226 | 124 | 8 | 34 | 15.5 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
d75a2da5653c98b8dd536735d408d2bbb13653b6 | 213 | py | Python | tartiflette/__init__.py | dailymotion/test-ttftt | aa999093a43e4681bec0767d1a15f36621b91baa | [
"MIT"
] | null | null | null | tartiflette/__init__.py | dailymotion/test-ttftt | aa999093a43e4681bec0767d1a15f36621b91baa | [
"MIT"
] | null | null | null | tartiflette/__init__.py | dailymotion/test-ttftt | aa999093a43e4681bec0767d1a15f36621b91baa | [
"MIT"
] | null | null | null | from tartiflette.resolver import Resolver, ResolverExecutorFactory
from tartiflette.subscription import Subscription
from tartiflette.sdl import build_graphql_schema_from_sdl
from tartiflette.engine import Engine
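# Typical entry point (sketch only; the Engine API is assumed from early
# tartiflette releases and is not shown in this file):
#     engine = Engine("type Query { hello: String }")
#     result = await engine.execute("{ hello }")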
| 42.6 | 66 | 0.896714 | 25 | 213 | 7.48 | 0.44 | 0.320856 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079812 | 213 | 4 | 67 | 53.25 | 0.954082 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ad18435032b7eca7d5df0a0e53a90f1fcfd0eab0 | 142 | py | Python | assets/shaders/gradient.py | E15dev/pygame-shader-render | 5a773b762c6e8013c1f011a02f8fb0bc2731f86a | [
"MIT"
] | 2 | 2022-02-06T19:58:26.000Z | 2022-03-09T10:40:17.000Z | assets/shaders/gradient.py | E15dev/pygame-shader-render | 5a773b762c6e8013c1f011a02f8fb0bc2731f86a | [
"MIT"
] | null | null | null | assets/shaders/gradient.py | E15dev/pygame-shader-render | 5a773b762c6e8013c1f011a02f8fb0bc2731f86a | [
"MIT"
] | null | null | null | from math import floor
def shader(x, y, z):
# return (255, 255, 255)
return (floor((x + 100) / 2), floor((y + 100) / 2), 127)
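# Worked example: shader(28, 56, 0) returns (64, 78, 127); the red and green
# channels ramp with x and y from 50 at the origin, z is unused, and blue is
# fixed at 127.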
| 23.666667 | 63 | 0.528169 | 23 | 142 | 3.26087 | 0.608696 | 0.16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19802 | 0.288732 | 142 | 5 | 64 | 28.4 | 0.544554 | 0.15493 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
ad4b69e4f76a722525daf1d01491cfa72d6e0443 | 84 | py | Python | models/__init__.py | unademo/UNet_Nested4Tiny_Objects_Keypoints | cdf1b499c526949ca2ae927b70a217cefc18dd08 | [
"MIT"
] | 6 | 2019-10-29T13:05:59.000Z | 2020-03-13T00:47:46.000Z | models/__init__.py | unademo/UNet_Nested4Tiny_Objects_Keypoints | cdf1b499c526949ca2ae927b70a217cefc18dd08 | [
"MIT"
] | 1 | 2021-07-07T12:51:05.000Z | 2021-07-07T12:51:05.000Z | models/__init__.py | unanan/UNet_Nested4Tiny_Objects_Keypoints | cdf1b499c526949ca2ae927b70a217cefc18dd08 | [
"MIT"
] | null | null | null | from .unet import *
from .resnet import *
from .triplet import *
from .onet import * | 21 | 22 | 0.72619 | 12 | 84 | 5.083333 | 0.5 | 0.491803 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178571 | 84 | 4 | 23 | 21 | 0.884058 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ad6d6823ffce26ac40cbb93164088747f36bba42 | 14,004 | py | Python | tensorflow_model_analysis/evaluators/metrics_validator_test.py | BioGeek/model-analysis | 03db02c21e21b092bc409c8bf263174b90c4e2ae | [
"Apache-2.0"
] | null | null | null | tensorflow_model_analysis/evaluators/metrics_validator_test.py | BioGeek/model-analysis | 03db02c21e21b092bc409c8bf263174b90c4e2ae | [
"Apache-2.0"
] | 1 | 2020-03-03T03:34:37.000Z | 2020-03-03T03:34:37.000Z | tensorflow_model_analysis/evaluators/metrics_validator_test.py | Bobgy/model-analysis | a964d2e8430b447c898d271fb6e6d8f5b99adf4b | [
"Apache-2.0"
] | null | null | null | # Lint as: python3
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Test for MetricsAndPlotsEvaluator."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf # pylint: disable=g-explicit-tensorflow-version-import
from tensorflow_model_analysis import config
from tensorflow_model_analysis.eval_saved_model import testutil
from tensorflow_model_analysis.evaluators import metrics_validator
from tensorflow_model_analysis.metrics import metric_types
from tensorflow_model_analysis.proto import validation_result_pb2
from google.protobuf import text_format
class MetricsValidatorTest(testutil.TensorflowModelAnalysisTest):
def testValidateMetricsMetricValueAndThreshold(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='WeightedExampleCount',
# 1.5 < 1, NOT OK.
threshold=config.MetricThreshold(
value_threshold=config.GenericValueThreshold(
upper_bound={'value': 1}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='weighted_example_count'): 1.5,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertFalse(result.validation_ok)
expected = text_format.Parse(
"""
metric_validations_per_slice {
slice_key {
}
failures {
metric_key {
name: "weighted_example_count"
}
metric_threshold {
value_threshold {
upper_bound {
value: 1.0
}
}
}
metric_value {
double_value {
value: 1.5
}
}
}
}""", validation_result_pb2.ValidationResult())
self.assertEqual(result, expected)
def testValidateMetricsValueThresholdUpperBoundFail(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='WeightedExampleCount',
# 1.5 < 1, NOT OK.
threshold=config.MetricThreshold(
value_threshold=config.GenericValueThreshold(
upper_bound={'value': 1}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='weighted_example_count'): 1.5,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertFalse(result.validation_ok)
def testValidateMetricsValueThresholdLowerBoundFail(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='WeightedExampleCount',
# 0 > 1, NOT OK.
threshold=config.MetricThreshold(
value_threshold=config.GenericValueThreshold(
lower_bound={'value': 1}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='weighted_example_count'): 0,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertFalse(result.validation_ok)
def testValidateMetricsValueThresholdUpperBoundPass(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='WeightedExampleCount',
# 0 < 1, OK.
threshold=config.MetricThreshold(
value_threshold=config.GenericValueThreshold(
upper_bound={'value': 1}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='weighted_example_count'): 0,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertTrue(result.validation_ok)
def testValidateMetricsValueThresholdLowerBoundPass(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='WeightedExampleCount',
# 2 > 1, OK.
threshold=config.MetricThreshold(
value_threshold=config.GenericValueThreshold(
lower_bound={'value': 1}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='weighted_example_count'): 2,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertTrue(result.validation_ok)
def testValidateMetricsChangeThresholdAbsoluteFail(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
config.ModelSpec(name='baseline', is_baseline=True)
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='MeanPrediction',
# Diff = 0 - .333 = -.333 < -1, NOT OK.
threshold=config.MetricThreshold(
change_threshold=config.GenericChangeThreshold(
direction=config.MetricDirection
.LOWER_IS_BETTER,
absolute={'value': -1})))
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='mean_prediction', model_name='baseline'):
0.333,
metric_types.MetricKey(name='mean_prediction', is_diff=True):
-0.333,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertFalse(result.validation_ok)
def testValidateMetricsChangeThresholdRelativeFail(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
config.ModelSpec(name='baseline', is_baseline=True)
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='MeanPrediction',
# Diff = -.333
# Diff% = -.333/.333 = -100% < -200%, NOT OK.
threshold=config.MetricThreshold(
change_threshold=config.GenericChangeThreshold(
direction=config.MetricDirection
.LOWER_IS_BETTER,
relative={'value': -2}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='mean_prediction', model_name='baseline'):
0.333,
metric_types.MetricKey(name='mean_prediction', is_diff=True):
-0.333,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertFalse(result.validation_ok)
def testValidateMetricsChangeThresholdAbsolutePass(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
config.ModelSpec(name='baseline', is_baseline=True)
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='MeanPrediction',
# Diff = 0 - .333 = -.333 < 0, OK.
threshold=config.MetricThreshold(
change_threshold=config.GenericChangeThreshold(
direction=config.MetricDirection
.LOWER_IS_BETTER,
absolute={'value': 0})))
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='mean_prediction', model_name='baseline'):
0.333,
metric_types.MetricKey(name='mean_prediction', is_diff=True):
-0.333,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertTrue(result.validation_ok)
def testValidateMetricsChangeThresholdRelativePass(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
config.ModelSpec(name='baseline', is_baseline=True)
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='MeanPrediction',
# Diff = -.333
# Diff% = -.333/.333 = -100% < 0%, OK.
threshold=config.MetricThreshold(
change_threshold=config.GenericChangeThreshold(
direction=config.MetricDirection
.LOWER_IS_BETTER,
relative={'value': 0}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='mean_prediction', model_name='baseline'):
0.333,
metric_types.MetricKey(name='mean_prediction', is_diff=True):
-0.333,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertTrue(result.validation_ok)
def testValidateMetricsChangeThresholdHigherIsBetterPass(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
config.ModelSpec(name='baseline', is_baseline=True)
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='MeanPrediction',
# Diff = -.333 > -1, OK.
threshold=config.MetricThreshold(
change_threshold=config.GenericChangeThreshold(
direction=config.MetricDirection
.HIGHER_IS_BETTER,
absolute={'value': -1}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='mean_prediction', model_name='baseline'):
0.333,
metric_types.MetricKey(name='mean_prediction', is_diff=True):
-0.333,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertTrue(result.validation_ok)
def testValidateMetricsChangeThresholdHigherIsBetterFail(self):
eval_config = config.EvalConfig(
model_specs=[
config.ModelSpec(),
config.ModelSpec(name='baseline', is_baseline=True)
],
slicing_specs=[config.SlicingSpec()],
metrics_specs=[
config.MetricsSpec(
metrics=[
config.MetricConfig(
class_name='MeanPrediction',
# Diff = -.333 > 0, NOT OK.
threshold=config.MetricThreshold(
change_threshold=config.GenericChangeThreshold(
direction=config.MetricDirection
.HIGHER_IS_BETTER,
absolute={'value': 0}))),
],
model_names=['']),
],
)
sliced_metrics = ((()), {
metric_types.MetricKey(name='mean_prediction', model_name='baseline'):
0.333,
metric_types.MetricKey(name='mean_prediction', is_diff=True):
-0.333,
})
result = metrics_validator.validate_metrics(sliced_metrics, eval_config)
self.assertFalse(result.validation_ok)
if __name__ == '__main__':
tf.compat.v1.enable_v2_behavior()
tf.test.main()
| 37.44385 | 79 | 0.546701 | 1,123 | 14,004 | 6.574354 | 0.155833 | 0.049167 | 0.046052 | 0.055262 | 0.756332 | 0.756197 | 0.756061 | 0.756061 | 0.756061 | 0.755926 | 0 | 0.016306 | 0.356255 | 14,004 | 373 | 80 | 37.544236 | 0.802662 | 0.067981 | 0 | 0.849673 | 0 | 0 | 0.050519 | 0.008779 | 0 | 0 | 0 | 0 | 0.039216 | 1 | 0.035948 | false | 0.01634 | 0.03268 | 0 | 0.071895 | 0.003268 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d127c0a129c3df74a8a4406072e858f16df1fcdd | 32 | py | Python | bmw/test_set.py | qcware/bmw | 761e405587bffe5dc4ca9f79432a79df2c7fd8f8 | [
"MIT"
] | null | null | null | bmw/test_set.py | qcware/bmw | 761e405587bffe5dc4ca9f79432a79df2c7fd8f8 | [
"MIT"
] | null | null | null | bmw/test_set.py | qcware/bmw | 761e405587bffe5dc4ca9f79432a79df2c7fd8f8 | [
"MIT"
] | null | null | null | from .bmw_plugin import TestSet
| 16 | 31 | 0.84375 | 5 | 32 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d13d541526ca25adc685c39c907bcf706edbd202 | 2,240 | py | Python | api/migrations/0024_auto_20200327_1726.py | CMPUT404W20-Wed/CMPUT404-project-socialdistribution-tmp | a4198f6c627b1f461bbb982bbfd126aee98cca87 | [
"Apache-2.0"
] | 1 | 2020-02-05T20:59:50.000Z | 2020-02-05T20:59:50.000Z | api/migrations/0024_auto_20200327_1726.py | CMPUT404W20-Wed/CMPUT404-project-socialdistribution-tmp | a4198f6c627b1f461bbb982bbfd126aee98cca87 | [
"Apache-2.0"
] | 33 | 2020-02-17T03:58:23.000Z | 2020-04-07T20:03:44.000Z | api/migrations/0024_auto_20200327_1726.py | CMPUT404W20-Wed/CMPUT404-project-socialdistribution | a4198f6c627b1f461bbb982bbfd126aee98cca87 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.2.10 on 2020-03-27 17:26
from django.db import migrations, models
import django.utils.timezone
class Migration(migrations.Migration):
dependencies = [
('api', '0023_merge_20200325_2106'),
]
operations = [
migrations.AlterField(
model_name='comment',
name='comment',
field=models.TextField(),
),
migrations.AlterField(
model_name='comment',
name='published',
field=models.DateTimeField(default=django.utils.timezone.now),
),
migrations.AlterField(
model_name='login',
name='password',
field=models.CharField(max_length=255),
),
migrations.AlterField(
model_name='login',
name='username',
field=models.CharField(max_length=255),
),
migrations.AlterField(
model_name='post',
name='content',
field=models.TextField(),
),
migrations.AlterField(
model_name='post',
name='description',
field=models.TextField(),
),
migrations.AlterField(
model_name='post',
name='published',
field=models.DateTimeField(default=django.utils.timezone.now),
),
migrations.AlterField(
model_name='post',
name='title',
field=models.TextField(),
),
migrations.AlterField(
model_name='user',
name='bio',
field=models.TextField(),
),
migrations.AlterField(
model_name='user',
name='displayName',
field=models.CharField(max_length=127),
),
migrations.AlterField(
model_name='user',
name='email',
field=models.CharField(max_length=127),
),
migrations.AlterField(
model_name='user',
name='firstName',
field=models.CharField(max_length=63),
),
migrations.AlterField(
model_name='user',
name='lastName',
field=models.CharField(max_length=63),
),
]
| 28 | 74 | 0.526786 | 191 | 2,240 | 6.062827 | 0.293194 | 0.224525 | 0.280656 | 0.325561 | 0.779793 | 0.779793 | 0.624352 | 0.582038 | 0.582038 | 0.388601 | 0 | 0.033264 | 0.355804 | 2,240 | 79 | 75 | 28.35443 | 0.769231 | 0.020536 | 0 | 0.739726 | 1 | 0 | 0.08531 | 0.010949 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.013699 | 0.027397 | 0 | 0.068493 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d14195f83cb3b2c07ab5c1a3aff30857466a56f8 | 22 | py | Python | mkdocs_autozip/__init__.py | martinohanlon/mkdocs_autozip | 52e4e370c168c075971fea05dbb2df370f21faba | [
"MIT"
] | 1 | 2020-11-23T20:41:15.000Z | 2020-11-23T20:41:15.000Z | mkdocs_autozip/__init__.py | martinohanlon/mkdocs_autozip | 52e4e370c168c075971fea05dbb2df370f21faba | [
"MIT"
] | 1 | 2020-11-04T11:38:37.000Z | 2020-11-05T06:25:51.000Z | mkdocs_autozip/__init__.py | martinohanlon/mkdocs_autozip | 52e4e370c168c075971fea05dbb2df370f21faba | [
"MIT"
] | null | null | null | from .autozip import * | 22 | 22 | 0.772727 | 3 | 22 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 22 | 1 | 22 | 22 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d15959c4e2936b6b119fb219bb11ab7f62de5a08 | 133 | py | Python | nitorch/tools/registration/__init__.py | balbasty/nitorch | d30c3125a8a66ea1434f2b39ed03338afd9724b4 | [
"MIT"
] | 46 | 2020-07-31T10:14:05.000Z | 2022-03-24T12:51:46.000Z | nitorch/tools/registration/__init__.py | balbasty/nitorch | d30c3125a8a66ea1434f2b39ed03338afd9724b4 | [
"MIT"
] | 36 | 2020-10-06T19:01:38.000Z | 2022-02-03T18:07:35.000Z | nitorch/tools/registration/__init__.py | balbasty/nitorch | d30c3125a8a66ea1434f2b39ed03338afd9724b4 | [
"MIT"
] | 6 | 2021-01-05T14:59:05.000Z | 2021-11-18T18:26:45.000Z | from . import pairwise
from . import losses
from . import objects
from .losses import gmm
from . import phantoms
from . import utils
| 19 | 23 | 0.774436 | 19 | 133 | 5.421053 | 0.421053 | 0.485437 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180451 | 133 | 6 | 24 | 22.166667 | 0.944954 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0f087a2aea4b8e80c5237bebeac3e0d341b4894d | 21 | py | Python | pyfdb/__init__.py | ecmwf/pyfdb | 90716ddcaa8b3d981e695b47a1690123e0c230ba | [
"Apache-2.0"
] | null | null | null | pyfdb/__init__.py | ecmwf/pyfdb | 90716ddcaa8b3d981e695b47a1690123e0c230ba | [
"Apache-2.0"
] | null | null | null | pyfdb/__init__.py | ecmwf/pyfdb | 90716ddcaa8b3d981e695b47a1690123e0c230ba | [
"Apache-2.0"
] | null | null | null | from .pyfdb import *
| 10.5 | 20 | 0.714286 | 3 | 21 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 1 | 21 | 21 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0f10879683b07d6eaf7e15985a8a95bde0a8f90a | 13,691 | py | Python | tests/user_test.py | kobezt08/SecurityBot | 5d8c8fcd7bf85536e28a9903bdaae90ff91216eb | [
"Apache-2.0"
] | 1 | 2019-02-04T02:57:51.000Z | 2019-02-04T02:57:51.000Z | tests/user_test.py | kobezt08/SecurityBot | 5d8c8fcd7bf85536e28a9903bdaae90ff91216eb | [
"Apache-2.0"
] | null | null | null | tests/user_test.py | kobezt08/SecurityBot | 5d8c8fcd7bf85536e28a9903bdaae90ff91216eb | [
"Apache-2.0"
] | null | null | null | from unittest2 import TestCase
from mock import Mock, patch
from collections import defaultdict
from datetime import timedelta
import securitybot.user as user
import securitybot.bot
import securitybot.chat.chat
import securitybot.auth.auth
# Mock away ignoring alerts
import securitybot.ignored_alerts as ignored_alerts
ignored_alerts.__update_ignored_list = Mock()
ignored_alerts.get_ignored = Mock(return_value={})
ignored_alerts.ignore_task = Mock()
class UserTest(TestCase):
@patch('securitybot.chat.chat.Chat', autospec=True)
@patch('securitybot.bot.SecurityBot', autospec=True)
def setUp(self, bot, chat):
bot.chat = chat
self.bot = bot
def test_construction(self):
'''Tests basic construction of a user.'''
user.User({}, None, None)
def test_get_attributes(self):
'''Tests grabbing attributes like a dictionary.'''
test_user = user.User({'alphabet': 'soup',
'animal': 'crackers'},
None, None)
assert test_user['alphabet'] == 'soup'
assert test_user['animal'] == 'crackers'
def test_name(self):
'''Tests getting a user's name.'''
test_user = user.User({'profile': {'first_name': 'Bot'}}, None, None)
assert test_user.get_name() == 'Bot'
test_user = user.User({'profile': {}, 'name': 'Bot2'}, None, None)
assert test_user.get_name() == 'Bot2'
# User interaction flows
@patch('securitybot.tasker.tasker.Task')
@patch('securitybot.auth.auth.Auth', autospec=True)
def test_basic_flow(self, auth, mock_task):
'''
Tests basic flow through the bot.
This is the most basic flow:
new task => did perform => allow 2FA => valid 2FA => no task
This will ensure that the states progress as expected and the bot
cleans itself up afterwards.
'''
auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.NONE
auth.can_auth.return_value = True
self.bot.messages = defaultdict(str)
test_user = user.User({}, auth, self.bot)
task = mock_task.start()
assert str(test_user._fsm.state) == 'need_task'
# Also test not advancing on no queued task
test_user.step()
assert str(test_user._fsm.state) == 'need_task'
test_user.add_task(task)
test_user.step()
assert str(test_user._fsm.state) == 'action_performed_check'
test_user.positive_response('Dummy explanation.')
test_user.step()
assert str(test_user._fsm.state) == 'auth_permission_check'
assert (test_user._last_message.answer is None and
test_user._last_message.text == '')
test_user.positive_response('Dummy explanation.')
test_user.step()
assert str(test_user._fsm.state) == 'waiting_on_auth'
assert (test_user._last_message.answer is None and
test_user._last_message.text == '')
auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.AUTHORIZED
test_user.step()
assert str(test_user._fsm.state) == 'task_finished'
test_user.step()
self.bot.cleanup_user.assert_called_with(test_user)
assert str(test_user._fsm.state) == 'need_task'
task.set_verifying.assert_called_with()
mock_task.stop()
@patch('securitybot.tasker.tasker.Task')
@patch('securitybot.auth.auth.Auth', autospec=True)
def test_did_not_do_flow(self, auth, mock_task):
'''
Tests flow if a user did not perform an action.
'''
auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.NONE
auth.can_auth.return_value = True
self.bot.messages = defaultdict(str)
self.bot.reporting_channel = None
test_user = user.User({}, auth, self.bot)
task = mock_task.start()
assert str(test_user._fsm.state) == 'need_task'
# Also test not advancing on no queued task
test_user.step()
assert str(test_user._fsm.state) == 'need_task'
test_user.add_task(task)
test_user.step()
assert str(test_user._fsm.state) == 'action_performed_check'
test_user.negative_response('Dummy explanation.')
test_user.step()
assert str(test_user._fsm.state) == 'task_finished'
assert (test_user._last_message.answer is None and
test_user._last_message.text == '')
mock_task.stop()
@patch('securitybot.tasker.tasker.Task')
@patch('securitybot.auth.auth.Auth', autospec=True)
def test_two_task_flow(self, auth, mock_task):
'''
        Tests two tasks. Once the first is completed, the bot should send
        a message announcing that another task exists.
'''
auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.NONE
auth.can_auth.return_value = True
self.bot.messages = defaultdict(str)
test_user = user.User({}, auth, self.bot)
test_user.send_message = Mock()
task = mock_task.start()
assert str(test_user._fsm.state) == 'need_task'
# Add two tasks to the queue
test_user.add_task(task)
test_user.add_task(task)
test_user.step()
assert str(test_user._fsm.state) == 'action_performed_check'
test_user.positive_response('Dummy explanation.')
test_user.step()
assert str(test_user._fsm.state) == 'auth_permission_check'
assert (test_user._last_message.answer is None and
test_user._last_message.text == '')
test_user.positive_response('Dummy explanation.')
test_user.step()
assert str(test_user._fsm.state) == 'waiting_on_auth'
assert (test_user._last_message.answer is None and
test_user._last_message.text == '')
auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.AUTHORIZED
test_user.step()
assert str(test_user._fsm.state) == 'task_finished'
test_user.step()
test_user.send_message.assert_called_with('bwtm')
assert str(test_user._fsm.state) == 'need_task'
task.set_verifying.assert_called_with()
mock_task.stop()
@patch('securitybot.tasker.tasker.Task')
@patch('securitybot.auth.auth.Auth', autospec=True)
def test_already_authorized_flow(self, auth, mock_task):
'''
Tests already being authorized after confirming an alert.
This is the most basic flow:
new task => did perform => already authorized => no task
'''
auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.AUTHORIZED
auth.can_auth.return_value = True
self.bot.messages = defaultdict(str)
test_user = user.User({}, auth, self.bot)
task = mock_task.start()
assert str(test_user._fsm.state) == 'need_task'
test_user.add_task(task)
test_user.step()
assert str(test_user._fsm.state) == 'action_performed_check'
test_user.positive_response('Dummy explanation.')
test_user.step()
assert str(test_user._fsm.state) == 'task_finished'
assert (test_user._last_message.answer is None and
test_user._last_message.text == '')
test_user.step()
self.bot.cleanup_user.assert_called_with(test_user)
assert str(test_user._fsm.state) == 'need_task'
task.set_verifying.assert_called_with()
mock_task.stop()

    @patch('securitybot.tasker.tasker.Task')
    @patch('securitybot.auth.auth.Auth', autospec=True)
    def test_no_2fa(self, auth, mock_task):
        '''
        Tests a user not having 2FA capability.
        Without 2FA capability the flow skips authorization:
        new task => did perform => no 2FA available => no task
        '''
        auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.NONE
        auth.can_auth.return_value = False
        self.bot.messages = defaultdict(str)
        test_user = user.User({}, auth, self.bot)
        task = mock_task.start()
        assert str(test_user._fsm.state) == 'need_task'
        test_user.step()
        assert str(test_user._fsm.state) == 'need_task'
        test_user.add_task(task)
        test_user.step()
        assert str(test_user._fsm.state) == 'action_performed_check'
        test_user.positive_response('Dummy explanation.')
        test_user.step()
        assert str(test_user._fsm.state) == 'task_finished'
        assert (test_user._last_message.answer is None and
                test_user._last_message.text == '')
        test_user.step()
        self.bot.cleanup_user.assert_called_with(test_user)
        assert str(test_user._fsm.state) == 'need_task'
        task.set_verifying.assert_called_with()
        mock_task.stop()

    @patch('securitybot.tasker.tasker.Task')
    @patch('securitybot.auth.auth.Auth', autospec=True)
    def test_not_allow_2fa_flow(self, auth, mock_task):
        '''
        Tests the flow when the user denies being sent a Duo Push.
        '''
        auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.NONE
        auth.can_auth.return_value = True
        self.bot.messages = defaultdict(str)
        test_user = user.User({}, auth, self.bot)
        task = mock_task.start()
        assert str(test_user._fsm.state) == 'need_task'
        # Also test not advancing on no queued task
        test_user.step()
        assert str(test_user._fsm.state) == 'need_task'
        test_user.add_task(task)
        test_user.step()
        assert str(test_user._fsm.state) == 'action_performed_check'
        test_user.positive_response('Dummy explanation.')
        test_user.step()
        assert str(test_user._fsm.state) == 'auth_permission_check'
        assert (test_user._last_message.answer is None and
                test_user._last_message.text == '')
        test_user.negative_response('Dummy explanation.')
        test_user.step()
        assert str(test_user._fsm.state) == 'task_finished'
        assert (test_user._last_message.answer is None and
                test_user._last_message.text == '')
        mock_task.stop()

    @patch('securitybot.user.ESCALATION_TIME', timedelta(seconds=-1))
    @patch('securitybot.tasker.tasker.Task')
    @patch('securitybot.auth.auth.Auth', autospec=True)
    def test_auto_escalate(self, auth, mock_task):
        '''Tests that after some time an alert automatically escalates.'''
        auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.DENIED
        auth.can_auth.return_value = True
        self.bot.messages = defaultdict(str)
        test_user = user.User({}, auth, self.bot)
        task = mock_task.start()
        assert str(test_user._fsm.state) == 'need_task'
        test_user.add_task(task)
        test_user.step()
        assert str(test_user._fsm.state) == 'action_performed_check'
        # Auto-escalation should happen immediately because ESCALATION_TIME
        # is patched to a negative value above
        test_user.step()
        assert str(test_user._fsm.state) == 'task_finished'
        task.set_verifying.assert_called_with()
        mock_task.stop()

    @patch('securitybot.tasker.tasker.Task')
    @patch('securitybot.auth.auth.Auth', autospec=True)
    def test_deny_resets_auth(self, auth, mock_task):
        '''Tests that receiving a deny from 2FA resets any saved authorization.'''
        auth.auth_status.return_value = securitybot.auth.auth.AUTH_STATES.DENIED
        auth.can_auth.return_value = True
        self.bot.messages = defaultdict(str)
        test_user = user.User({}, auth, self.bot)
        task = mock_task.start()
        assert str(test_user._fsm.state) == 'need_task'
        test_user.add_task(task)
        test_user.step()
        assert str(test_user._fsm.state) == 'action_performed_check'
        test_user.positive_response('Dummy explanation.')
        test_user.step()
        assert str(test_user._fsm.state) == 'auth_permission_check'
        assert (test_user._last_message.answer is None and
                test_user._last_message.text == '')
        test_user.positive_response('Dummy explanation.')
        test_user.step()
        assert str(test_user._fsm.state) == 'waiting_on_auth'
        assert (test_user._last_message.answer is None and
                test_user._last_message.text == '')
        test_user.step()
        assert str(test_user._fsm.state) == 'task_finished'
        auth.reset.assert_called_with()
        mock_task.stop()

    # Auth interactions

    @patch('securitybot.tasker.tasker.Task')
    @patch('securitybot.auth.auth.Auth', autospec=True)
    def test_start_auth(self, auth, mock_task):
        '''Tests that authorization calls call the auth object.'''
        self.bot.messages = defaultdict(str)
        test_user = user.User({}, auth, self.bot)
        task = mock_task.start()
        test_user.pending_task = task
        test_user.begin_auth()
        auth.auth.assert_called_with(task.description)
        mock_task.stop()

    @patch('securitybot.auth.auth.Auth', autospec=True)
    def test_check_auth(self, auth):
        '''Tests that auth status calls interact properly.'''
        test_user = user.User({}, auth, None)
        test_user.auth_status()
        auth.auth_status.assert_called_with()

    @patch('securitybot.auth.auth.Auth', autospec=True)
    def test_reset_auth(self, auth):
        '''Tests that auth is properly reset on `reset_auth`.'''
        test_user = user.User({}, auth, None)
        test_user.reset_auth()
        auth.reset.assert_called_with()
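
    # A compact summary of the user state machine these tests walk through,
    # inferred from the assertions above (not library code; transitions are
    # paraphrased):
    #
    #   need_task --(task queued + step)--------> action_performed_check
    #   action_performed_check --(negative response)----> task_finished
    #   action_performed_check --(positive response)----> auth_permission_check
    #       (or straight to task_finished when the user is already
    #        authorized, cannot use 2FA, or the alert auto-escalates)
    #   auth_permission_check --(allows 2FA push)-------> waiting_on_auth
    #   auth_permission_check --(denies 2FA push)-------> task_finished
    #   waiting_on_auth --(AUTHORIZED or DENIED)--------> task_finished
    #   task_finished --(step)--------------------------> need_task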
| 36.412234 | 84 | 0.650281 | 1,759 | 13,691 | 4.797612 | 0.102331 | 0.131769 | 0.061263 | 0.078564 | 0.783979 | 0.767982 | 0.740965 | 0.725441 | 0.717858 | 0.706719 | 0 | 0.00115 | 0.237967 | 13,691 | 375 | 85 | 36.509333 | 0.807725 | 0.103645 | 0 | 0.784861 | 0 | 0 | 0.124032 | 0.075002 | 0 | 0 | 0 | 0 | 0.266932 | 1 | 0.059761 | false | 0 | 0.035857 | 0 | 0.099602 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0f51c07b30e7babd2427ba6a7c7bc04e40658009 | 24 | py | Python | casbin/__init__.py | goodrain/pycasbin | 1a481ba1af7619e1cc7e83896581d14976927d80 | [
"Apache-2.0"
] | 1 | 2019-12-24T17:47:37.000Z | 2019-12-24T17:47:37.000Z | casbin/__init__.py | goodrain/pycasbin | 1a481ba1af7619e1cc7e83896581d14976927d80 | [
"Apache-2.0"
] | null | null | null | casbin/__init__.py | goodrain/pycasbin | 1a481ba1af7619e1cc7e83896581d14976927d80 | [
"Apache-2.0"
] | null | null | null | from .enforcer import *
| 12 | 23 | 0.75 | 3 | 24 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 24 | 1 | 24 | 24 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0f8f648f46f10ebe722f17d17f3d7ff86e1b81a4 | 32 | py | Python | smale/__init__.py | cswiercz/smale | 8c418a66d8e6d8193b9d3d04ffecc0fd02a337e4 | [
"MIT"
] | 4 | 2019-06-12T20:57:54.000Z | 2022-03-04T06:50:20.000Z | smale/__init__.py | cswiercz/smale | 8c418a66d8e6d8193b9d3d04ffecc0fd02a337e4 | [
"MIT"
] | null | null | null | smale/__init__.py | cswiercz/smale | 8c418a66d8e6d8193b9d3d04ffecc0fd02a337e4 | [
"MIT"
] | 1 | 2022-03-04T06:50:27.000Z | 2022-03-04T06:50:27.000Z | from .smale import smale_newton
| 16 | 31 | 0.84375 | 5 | 32 | 5.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7e32967f7684ba3b613f084cbf209afc719eb7df | 47 | py | Python | rdf_io/views/__init__.py | rob-metalinkage/rdf-io | a53cb59446b7de5a8c68ba27ec055cfe6f3d4397 | [
"CC0-1.0"
] | 37 | 2016-07-27T22:07:32.000Z | 2022-03-26T10:25:52.000Z | rdf_io/views/__init__.py | rob-metalinkage/rdf-io | a53cb59446b7de5a8c68ba27ec055cfe6f3d4397 | [
"CC0-1.0"
] | 19 | 2016-05-04T23:20:30.000Z | 2021-03-16T19:00:20.000Z | rdf_io/views/__init__.py | rob-metalinkage/rdf-io | a53cb59446b7de5a8c68ba27ec055cfe6f3d4397 | [
"CC0-1.0"
] | 12 | 2016-11-21T05:30:27.000Z | 2022-02-28T18:59:04.000Z | from .serialize import *
from .manage import *
| 15.666667 | 24 | 0.744681 | 6 | 47 | 5.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170213 | 47 | 2 | 25 | 23.5 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7e458ecffdbbda7189fcb3a2a417b2fcfdc714c5 | 27,083 | py | Python | Algor/flask/lib/python2.7/site-packages/hjson/__init__.py | HesselTjeerdsma/Cyber-Physical-Pacman-Game | cc44ad67e3c492319af8567cb74580c1bbb7ee8d | [
"Apache-2.0"
] | 3 | 2017-09-28T09:06:17.000Z | 2017-11-01T03:43:07.000Z | Algor/flask/lib/python2.7/site-packages/hjson/__init__.py | HesselTjeerdsma/Cyber-Physical-Pacman-Game | cc44ad67e3c492319af8567cb74580c1bbb7ee8d | [
"Apache-2.0"
] | null | null | null | Algor/flask/lib/python2.7/site-packages/hjson/__init__.py | HesselTjeerdsma/Cyber-Physical-Pacman-Game | cc44ad67e3c492319af8567cb74580c1bbb7ee8d | [
"Apache-2.0"
] | null | null | null | r"""Hjson, the Human JSON. A configuration file format that caters to
humans and helps reduce the errors they make.

For details and syntax see <http://hjson.org>.

Decoding Hjson::

    >>> import hjson
    >>> text = "{\n foo: a\n bar: 1\n}"
    >>> hjson.loads(text)
    OrderedDict([('foo', 'a'), ('bar', 1)])

Encoding Python object hierarchies::

    >>> import hjson
    >>> # hjson.dumps({'foo': 'text', 'bar': (1, 2)})
    >>> hjson.dumps(OrderedDict([('foo', 'text'), ('bar', (1, 2))]))
    '{\n foo: text\n bar:\n [\n 1\n 2\n ]\n}'

Encoding as JSON::

    Note that this is probably not as performant as the simplejson version.

    >>> import hjson
    >>> hjson.dumpsJSON(['foo', {'bar': ('baz', None, 1.0, 2)}])
    '["foo", {"bar": ["baz", null, 1.0, 2]}]'

Using hjson.tool from the shell to validate and pretty-print::

    $ echo '{"json":"obj"}' | python -m hjson.tool
    {
      json: obj
    }

Other formats are -c for compact or -j for formatted JSON.
"""
from __future__ import absolute_import
__version__ = '2.0.6'
__all__ = [
    'dump', 'dumps', 'load', 'loads',
    'dumpJSON', 'dumpsJSON',
    'HjsonDecoder', 'HjsonDecodeError', 'HjsonEncoder', 'JSONEncoder',
    'OrderedDict', 'simple_first',
]
# based on simplejson by
# __author__ = 'Bob Ippolito <bob@redivi.com>'
__author__ = 'Christian Zangl <coralllama@gmail.com>'
from decimal import Decimal
from .scanner import HjsonDecodeError
from .decoder import HjsonDecoder
from .encoderH import HjsonEncoder
from .encoder import JSONEncoder
def _import_OrderedDict():
    import collections
    try:
        return collections.OrderedDict
    except AttributeError:
        from . import ordered_dict
        return ordered_dict.OrderedDict
OrderedDict = _import_OrderedDict()

_default_decoder = HjsonDecoder(encoding=None, object_hook=None,
        object_pairs_hook=OrderedDict)
def load(fp, encoding=None, cls=None, object_hook=None, parse_float=None,
        parse_int=None, object_pairs_hook=OrderedDict,
        use_decimal=False, namedtuple_as_object=True, tuple_as_array=True,
        **kw):
    """Deserialize ``fp`` (a ``.read()``-supporting file-like object containing
a JSON document) to a Python object.
*encoding* determines the encoding used to interpret any
:class:`str` objects decoded by this instance (``'utf-8'`` by
default). It has no effect when decoding :class:`unicode` objects.
Note that currently only encodings that are a superset of ASCII work;
strings of other encodings should be passed in as :class:`unicode`.
*object_hook*, if specified, will be called with the result of every
JSON object decoded and its return value will be used in place of the
given :class:`dict`. This can be used to provide custom
deserializations (e.g. to support JSON-RPC class hinting).
*object_pairs_hook* is an optional function that will be called with
the result of any object literal decode with an ordered list of pairs.
The return value of *object_pairs_hook* will be used instead of the
:class:`dict`. This feature can be used to implement custom decoders
that rely on the order that the key and value pairs are decoded (for
example, :func:`collections.OrderedDict` will remember the order of
insertion). If *object_hook* is also defined, the *object_pairs_hook*
takes priority.
*parse_float*, if specified, will be called with the string of every
JSON float to be decoded. By default, this is equivalent to
``float(num_str)``. This can be used to use another datatype or parser
for JSON floats (e.g. :class:`decimal.Decimal`).
*parse_int*, if specified, will be called with the string of every
JSON int to be decoded. By default, this is equivalent to
``int(num_str)``. This can be used to use another datatype or parser
for JSON integers (e.g. :class:`float`).
If *use_decimal* is true (default: ``False``) then it implies
parse_float=decimal.Decimal for parity with ``dump``.
To use a custom ``HjsonDecoder`` subclass, specify it with the ``cls``
kwarg. NOTE: You should use *object_hook* or *object_pairs_hook* instead
of subclassing whenever possible.
"""
    return loads(fp.read(),
        encoding=encoding, cls=cls, object_hook=object_hook,
        parse_float=parse_float, parse_int=parse_int,
        object_pairs_hook=object_pairs_hook,
        use_decimal=use_decimal, **kw)
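
# A minimal usage sketch for ``load`` (illustrative only, not part of the
# hjson API; the file name below is a hypothetical example):
def _example_load(path='config.hjson'):
    with open(path) as fp:
        # Returns an OrderedDict by default (object_pairs_hook=OrderedDict),
        # so key order from the file is preserved.
        return load(fp)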
def loads(s, encoding=None, cls=None, object_hook=None, parse_float=None,
        parse_int=None, object_pairs_hook=None,
        use_decimal=False, **kw):
    """Deserialize ``s`` (a ``str`` or ``unicode`` instance containing a JSON
document) to a Python object.
*encoding* determines the encoding used to interpret any
:class:`str` objects decoded by this instance (``'utf-8'`` by
default). It has no effect when decoding :class:`unicode` objects.
Note that currently only encodings that are a superset of ASCII work;
strings of other encodings should be passed in as :class:`unicode`.
*object_hook*, if specified, will be called with the result of every
JSON object decoded and its return value will be used in place of the
given :class:`dict`. This can be used to provide custom
deserializations (e.g. to support JSON-RPC class hinting).
*object_pairs_hook* is an optional function that will be called with
the result of any object literal decode with an ordered list of pairs.
The return value of *object_pairs_hook* will be used instead of the
:class:`dict`. This feature can be used to implement custom decoders
that rely on the order that the key and value pairs are decoded (for
example, :func:`collections.OrderedDict` will remember the order of
insertion). If *object_hook* is also defined, the *object_pairs_hook*
takes priority.
*parse_float*, if specified, will be called with the string of every
JSON float to be decoded. By default, this is equivalent to
``float(num_str)``. This can be used to use another datatype or parser
for JSON floats (e.g. :class:`decimal.Decimal`).
*parse_int*, if specified, will be called with the string of every
JSON int to be decoded. By default, this is equivalent to
``int(num_str)``. This can be used to use another datatype or parser
for JSON integers (e.g. :class:`float`).
If *use_decimal* is true (default: ``False``) then it implies
parse_float=decimal.Decimal for parity with ``dump``.
To use a custom ``HjsonDecoder`` subclass, specify it with the ``cls``
kwarg. NOTE: You should use *object_hook* or *object_pairs_hook* instead
of subclassing whenever possible.
"""
    if (cls is None and encoding is None and object_hook is None and
            parse_int is None and parse_float is None and
            object_pairs_hook is None
            and not use_decimal and not kw):
        return _default_decoder.decode(s)
    if cls is None:
        cls = HjsonDecoder
    if object_hook is not None:
        kw['object_hook'] = object_hook
    if object_pairs_hook is not None:
        kw['object_pairs_hook'] = object_pairs_hook
    if parse_float is not None:
        kw['parse_float'] = parse_float
    if parse_int is not None:
        kw['parse_int'] = parse_int
    if use_decimal:
        if parse_float is not None:
            raise TypeError("use_decimal=True implies parse_float=Decimal")
        kw['parse_float'] = Decimal
    return cls(encoding=encoding, **kw).decode(s)
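
# A minimal sketch of the ``use_decimal`` behaviour documented above
# (illustrative only; assumes the Hjson snippet parses the value as a number):
def _example_loads_decimal():
    data = loads('{\n rate: 1.10\n}', use_decimal=True)
    # use_decimal=True is equivalent to parse_float=Decimal, so the full
    # textual precision of 1.10 is preserved.
    assert isinstance(data['rate'], Decimal)
    return data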
_default_hjson_encoder = HjsonEncoder(
    skipkeys=False,
    ensure_ascii=True,
    check_circular=True,
    indent=None,
    encoding='utf-8',
    default=None,
    use_decimal=True,
    namedtuple_as_object=True,
    tuple_as_array=True,
    bigint_as_string=False,
    item_sort_key=None,
    for_json=False,
    int_as_string_bitcount=None,
)
def dump(obj, fp, skipkeys=False, ensure_ascii=True, check_circular=True,
        cls=None, indent=None,
        encoding='utf-8', default=None, use_decimal=True,
        namedtuple_as_object=True, tuple_as_array=True,
        bigint_as_string=False, sort_keys=False, item_sort_key=None,
        for_json=False, int_as_string_bitcount=None, **kw):
    """Serialize ``obj`` as a JSON formatted stream to ``fp`` (a
``.write()``-supporting file-like object).
If *skipkeys* is true then ``dict`` keys that are not basic types
(``str``, ``unicode``, ``int``, ``long``, ``float``, ``bool``, ``None``)
will be skipped instead of raising a ``TypeError``.
If *ensure_ascii* is false, then some chunks written to ``fp``
may be ``unicode`` instances, subject to normal Python ``str`` to
``unicode`` coercion rules. Unless ``fp.write()`` explicitly
understands ``unicode`` (as in ``codecs.getwriter()``) this is likely
to cause an error.
If *check_circular* is false, then the circular reference check
for container types will be skipped and a circular reference will
result in an ``OverflowError`` (or worse).
If *indent* is a string, then JSON array elements and object members
will be pretty-printed with a newline followed by that string repeated
for each level of nesting. ``None`` (the default) selects the most compact
representation without any newlines. For backwards compatibility with
versions of hjson earlier than 2.1.0, an integer is also accepted
and is converted to a string with that many spaces.
*encoding* is the character encoding for str instances, default is UTF-8.
*default(obj)* is a function that should return a serializable version
of obj or raise ``TypeError``. The default simply raises ``TypeError``.
If *use_decimal* is true (default: ``True``) then decimal.Decimal
will be natively serialized to JSON with full precision.
If *namedtuple_as_object* is true (default: ``True``),
:class:`tuple` subclasses with ``_asdict()`` methods will be encoded
as JSON objects.
If *tuple_as_array* is true (default: ``True``),
:class:`tuple` (and subclasses) will be encoded as JSON arrays.
If *bigint_as_string* is true (default: ``False``), ints 2**53 and higher
or lower than -2**53 will be encoded as strings. This is to avoid the
rounding that happens in Javascript otherwise. Note that this is still a
lossy operation that will not round-trip correctly and should be used
sparingly.
If *int_as_string_bitcount* is a positive number (n), then ints of size
greater than or equal to 2**n or lower than or equal to -2**n will be
encoded as strings.
If specified, *item_sort_key* is a callable used to sort the items in
each dictionary. This is useful if you want to sort items other than
in alphabetical order by key. This option takes precedence over
*sort_keys*.
If *sort_keys* is true (default: ``False``), the output of dictionaries
will be sorted by item.
If *for_json* is true (default: ``False``), objects with a ``for_json()``
method will use the return value of that method for encoding as JSON
instead of the object.
To use a custom ``HjsonEncoder`` subclass (e.g. one that overrides the
``.default()`` method to serialize additional types), specify it with
the ``cls`` kwarg. NOTE: You should use *default* or *for_json* instead
of subclassing whenever possible.
"""
    # cached encoder
    if (not skipkeys and ensure_ascii and
            check_circular and
            cls is None and indent is None and
            encoding == 'utf-8' and default is None and use_decimal
            and namedtuple_as_object and tuple_as_array
            and not bigint_as_string and not sort_keys
            and not item_sort_key and not for_json
            and int_as_string_bitcount is None
            and not kw
            ):
        iterable = _default_hjson_encoder.iterencode(obj)
    else:
        if cls is None:
            cls = HjsonEncoder
        iterable = cls(skipkeys=skipkeys, ensure_ascii=ensure_ascii,
            check_circular=check_circular, indent=indent,
            encoding=encoding,
            default=default, use_decimal=use_decimal,
            namedtuple_as_object=namedtuple_as_object,
            tuple_as_array=tuple_as_array,
            bigint_as_string=bigint_as_string,
            sort_keys=sort_keys,
            item_sort_key=item_sort_key,
            for_json=for_json,
            int_as_string_bitcount=int_as_string_bitcount,
            **kw).iterencode(obj)
    # could accelerate with writelines in some versions of Python, at
    # a debuggability cost
    for chunk in iterable:
        fp.write(chunk)
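
# A minimal sketch of the ``fp`` contract documented above (illustrative
# only): ``dump`` streams chunks through ``fp.write``, so any object with a
# ``write`` method works as a sink.
def _example_dump_to_string(obj):
    class _ListWriter(object):
        def __init__(self):
            self.chunks = []
        def write(self, chunk):
            self.chunks.append(chunk)
    writer = _ListWriter()
    dump(obj, writer)
    return ''.join(writer.chunks)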
def dumps(obj, skipkeys=False, ensure_ascii=True, check_circular=True,
        cls=None, indent=None,
        encoding='utf-8', default=None, use_decimal=True,
        namedtuple_as_object=True, tuple_as_array=True,
        bigint_as_string=False, sort_keys=False, item_sort_key=None,
        for_json=False, int_as_string_bitcount=None, **kw):
    """Serialize ``obj`` to a JSON formatted ``str``.
If ``skipkeys`` is true then ``dict`` keys that are not basic types
(``str``, ``unicode``, ``int``, ``long``, ``float``, ``bool``, ``None``)
will be skipped instead of raising a ``TypeError``.
If ``ensure_ascii`` is false, then the return value will be a
``unicode`` instance subject to normal Python ``str`` to ``unicode``
coercion rules instead of being escaped to an ASCII ``str``.
If ``check_circular`` is false, then the circular reference check
for container types will be skipped and a circular reference will
result in an ``OverflowError`` (or worse).
If ``indent`` is a string, then JSON array elements and object members
will be pretty-printed with a newline followed by that string repeated
for each level of nesting. ``None`` (the default) selects the most compact
representation without any newlines. For backwards compatibility with
versions of hjson earlier than 2.1.0, an integer is also accepted
and is converted to a string with that many spaces.
``encoding`` is the character encoding for str instances, default is UTF-8.
``default(obj)`` is a function that should return a serializable version
of obj or raise TypeError. The default simply raises TypeError.
If *use_decimal* is true (default: ``True``) then decimal.Decimal
will be natively serialized to JSON with full precision.
If *namedtuple_as_object* is true (default: ``True``),
:class:`tuple` subclasses with ``_asdict()`` methods will be encoded
as JSON objects.
If *tuple_as_array* is true (default: ``True``),
:class:`tuple` (and subclasses) will be encoded as JSON arrays.
If *bigint_as_string* is true (not the default), ints 2**53 and higher
or lower than -2**53 will be encoded as strings. This is to avoid the
rounding that happens in Javascript otherwise.
If *int_as_string_bitcount* is a positive number (n), then ints of size
greater than or equal to 2**n or lower than or equal to -2**n will be
encoded as strings.
If specified, *item_sort_key* is a callable used to sort the items in
each dictionary. This is useful if you want to sort items other than
in alphabetical order by key. This option takes precedence over
*sort_keys*.
If *sort_keys* is true (default: ``False``), the output of dictionaries
will be sorted by item.
If *for_json* is true (default: ``False``), objects with a ``for_json()``
method will use the return value of that method for encoding as JSON
instead of the object.
To use a custom ``HjsonEncoder`` subclass (e.g. one that overrides the
``.default()`` method to serialize additional types), specify it with
the ``cls`` kwarg. NOTE: You should use *default* instead of subclassing
whenever possible.
"""
    # cached encoder
    if (
            not skipkeys and ensure_ascii and
            check_circular and
            cls is None and indent is None and
            encoding == 'utf-8' and default is None and use_decimal
            and namedtuple_as_object and tuple_as_array
            and not bigint_as_string and not sort_keys
            and not item_sort_key and not for_json
            and int_as_string_bitcount is None
            and not kw
            ):
        return _default_hjson_encoder.encode(obj)
    if cls is None:
        cls = HjsonEncoder
    return cls(
        skipkeys=skipkeys, ensure_ascii=ensure_ascii,
        check_circular=check_circular, indent=indent,
        encoding=encoding, default=default,
        use_decimal=use_decimal,
        namedtuple_as_object=namedtuple_as_object,
        tuple_as_array=tuple_as_array,
        bigint_as_string=bigint_as_string,
        sort_keys=sort_keys,
        item_sort_key=item_sort_key,
        for_json=for_json,
        int_as_string_bitcount=int_as_string_bitcount,
        **kw).encode(obj)
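
# A minimal sketch of the sorting options documented above (illustrative
# only): ``sort_keys`` gives deterministic output; an ``item_sort_key``
# callable, if also given, would take precedence.
def _example_dumps_sorted():
    return dumps(OrderedDict([('b', 1), ('a', 2)]), sort_keys=True)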
_default_json_encoder = JSONEncoder(
    skipkeys=False,
    ensure_ascii=True,
    check_circular=True,
    indent=None,
    separators=None,
    encoding='utf-8',
    default=None,
    use_decimal=True,
    namedtuple_as_object=True,
    tuple_as_array=True,
    bigint_as_string=False,
    item_sort_key=None,
    for_json=False,
    int_as_string_bitcount=None,
)
def dumpJSON(obj, fp, skipkeys=False, ensure_ascii=True, check_circular=True,
        cls=None, indent=None, separators=None,
        encoding='utf-8', default=None, use_decimal=True,
        namedtuple_as_object=True, tuple_as_array=True,
        bigint_as_string=False, sort_keys=False, item_sort_key=None,
        for_json=False, int_as_string_bitcount=None, **kw):
    """Serialize ``obj`` as a JSON formatted stream to ``fp`` (a
``.write()``-supporting file-like object).
If *skipkeys* is true then ``dict`` keys that are not basic types
(``str``, ``unicode``, ``int``, ``long``, ``float``, ``bool``, ``None``)
will be skipped instead of raising a ``TypeError``.
If *ensure_ascii* is false, then some chunks written to ``fp``
may be ``unicode`` instances, subject to normal Python ``str`` to
``unicode`` coercion rules. Unless ``fp.write()`` explicitly
understands ``unicode`` (as in ``codecs.getwriter()``) this is likely
to cause an error.
If *check_circular* is false, then the circular reference check
for container types will be skipped and a circular reference will
result in an ``OverflowError`` (or worse).
If *indent* is a string, then JSON array elements and object members
will be pretty-printed with a newline followed by that string repeated
for each level of nesting. ``None`` (the default) selects the most compact
representation without any newlines. An integer is also accepted
and is converted to a string with that many spaces.
If specified, *separators* should be an
``(item_separator, key_separator)`` tuple. The default is ``(', ', ': ')``
if *indent* is ``None`` and ``(',', ': ')`` otherwise. To get the most
compact JSON representation, you should specify ``(',', ':')`` to eliminate
whitespace.
*encoding* is the character encoding for str instances, default is UTF-8.
*default(obj)* is a function that should return a serializable version
of obj or raise ``TypeError``. The default simply raises ``TypeError``.
If *use_decimal* is true (default: ``True``) then decimal.Decimal
will be natively serialized to JSON with full precision.
If *namedtuple_as_object* is true (default: ``True``),
:class:`tuple` subclasses with ``_asdict()`` methods will be encoded
as JSON objects.
If *tuple_as_array* is true (default: ``True``),
:class:`tuple` (and subclasses) will be encoded as JSON arrays.
If *bigint_as_string* is true (default: ``False``), ints 2**53 and higher
or lower than -2**53 will be encoded as strings. This is to avoid the
rounding that happens in Javascript otherwise. Note that this is still a
lossy operation that will not round-trip correctly and should be used
sparingly.
If *int_as_string_bitcount* is a positive number (n), then ints of size
greater than or equal to 2**n or lower than or equal to -2**n will be
encoded as strings.
If specified, *item_sort_key* is a callable used to sort the items in
each dictionary. This is useful if you want to sort items other than
in alphabetical order by key. This option takes precedence over
*sort_keys*.
If *sort_keys* is true (default: ``False``), the output of dictionaries
will be sorted by item.
If *for_json* is true (default: ``False``), objects with a ``for_json()``
method will use the return value of that method for encoding as JSON
instead of the object.
To use a custom ``JSONEncoder`` subclass (e.g. one that overrides the
``.default()`` method to serialize additional types), specify it with
the ``cls`` kwarg. NOTE: You should use *default* or *for_json* instead
of subclassing whenever possible.
"""
    # cached encoder
    if (not skipkeys and ensure_ascii and
            check_circular and
            cls is None and indent is None and separators is None and
            encoding == 'utf-8' and default is None and use_decimal
            and namedtuple_as_object and tuple_as_array
            and not bigint_as_string and not sort_keys
            and not item_sort_key and not for_json
            and int_as_string_bitcount is None
            and not kw
            ):
        iterable = _default_json_encoder.iterencode(obj)
    else:
        if cls is None:
            cls = JSONEncoder
        iterable = cls(skipkeys=skipkeys, ensure_ascii=ensure_ascii,
            check_circular=check_circular, indent=indent,
            separators=separators, encoding=encoding,
            default=default, use_decimal=use_decimal,
            namedtuple_as_object=namedtuple_as_object,
            tuple_as_array=tuple_as_array,
            bigint_as_string=bigint_as_string,
            sort_keys=sort_keys,
            item_sort_key=item_sort_key,
            for_json=for_json,
            int_as_string_bitcount=int_as_string_bitcount,
            **kw).iterencode(obj)
    # could accelerate with writelines in some versions of Python, at
    # a debuggability cost
    for chunk in iterable:
        fp.write(chunk)
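
# A minimal sketch of the ``separators`` option documented above
# (illustrative only): the (',', ':') pair yields the most compact JSON.
def _example_dumpJSON_compact(obj, fp):
    dumpJSON(obj, fp, separators=(',', ':'))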
def dumpsJSON(obj, skipkeys=False, ensure_ascii=True, check_circular=True,
        cls=None, indent=None, separators=None,
        encoding='utf-8', default=None, use_decimal=True,
        namedtuple_as_object=True, tuple_as_array=True,
        bigint_as_string=False, sort_keys=False, item_sort_key=None,
        for_json=False, int_as_string_bitcount=None, **kw):
    """Serialize ``obj`` to a JSON formatted ``str``.
If ``skipkeys`` is true then ``dict`` keys that are not basic types
(``str``, ``unicode``, ``int``, ``long``, ``float``, ``bool``, ``None``)
will be skipped instead of raising a ``TypeError``.
If ``ensure_ascii`` is false, then the return value will be a
``unicode`` instance subject to normal Python ``str`` to ``unicode``
coercion rules instead of being escaped to an ASCII ``str``.
If ``check_circular`` is false, then the circular reference check
for container types will be skipped and a circular reference will
result in an ``OverflowError`` (or worse).
If ``indent`` is a string, then JSON array elements and object members
will be pretty-printed with a newline followed by that string repeated
for each level of nesting. ``None`` (the default) selects the most compact
representation without any newlines. An integer is also accepted
and is converted to a string with that many spaces.
If specified, ``separators`` should be an
``(item_separator, key_separator)`` tuple. The default is ``(', ', ': ')``
if *indent* is ``None`` and ``(',', ': ')`` otherwise. To get the most
compact JSON representation, you should specify ``(',', ':')`` to eliminate
whitespace.
``encoding`` is the character encoding for str instances, default is UTF-8.
``default(obj)`` is a function that should return a serializable version
of obj or raise TypeError. The default simply raises TypeError.
If *use_decimal* is true (default: ``True``) then decimal.Decimal
will be natively serialized to JSON with full precision.
If *namedtuple_as_object* is true (default: ``True``),
:class:`tuple` subclasses with ``_asdict()`` methods will be encoded
as JSON objects.
If *tuple_as_array* is true (default: ``True``),
:class:`tuple` (and subclasses) will be encoded as JSON arrays.
If *bigint_as_string* is true (not the default), ints 2**53 and higher
or lower than -2**53 will be encoded as strings. This is to avoid the
rounding that happens in Javascript otherwise.
If *int_as_string_bitcount* is a positive number (n), then ints of size
greater than or equal to 2**n or lower than or equal to -2**n will be
encoded as strings.
If specified, *item_sort_key* is a callable used to sort the items in
each dictionary. This is useful if you want to sort items other than
in alphabetical order by key. This option takes precedence over
*sort_keys*.
If *sort_keys* is true (default: ``False``), the output of dictionaries
will be sorted by item.
If *for_json* is true (default: ``False``), objects with a ``for_json()``
method will use the return value of that method for encoding as JSON
instead of the object.
To use a custom ``JSONEncoder`` subclass (e.g. one that overrides the
``.default()`` method to serialize additional types), specify it with
the ``cls`` kwarg. NOTE: You should use *default* instead of subclassing
whenever possible.
"""
    # cached encoder
    if (
            not skipkeys and ensure_ascii and
            check_circular and
            cls is None and indent is None and separators is None and
            encoding == 'utf-8' and default is None and use_decimal
            and namedtuple_as_object and tuple_as_array
            and not bigint_as_string and not sort_keys
            and not item_sort_key and not for_json
            and int_as_string_bitcount is None
            and not kw
            ):
        return _default_json_encoder.encode(obj)
    if cls is None:
        cls = JSONEncoder
    return cls(
        skipkeys=skipkeys, ensure_ascii=ensure_ascii,
        check_circular=check_circular, indent=indent,
        separators=separators, encoding=encoding, default=default,
        use_decimal=use_decimal,
        namedtuple_as_object=namedtuple_as_object,
        tuple_as_array=tuple_as_array,
        bigint_as_string=bigint_as_string,
        sort_keys=sort_keys,
        item_sort_key=item_sort_key,
        for_json=for_json,
        int_as_string_bitcount=int_as_string_bitcount,
        **kw).encode(obj)
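
# A minimal sketch pairing ``dumpsJSON`` with the compact separators
# described above (illustrative only):
def _example_dumpsJSON_compact():
    return dumpsJSON(['foo', {'bar': 1}], separators=(',', ':'))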
def simple_first(kv):
    """Helper function to pass to item_sort_key to sort simple
    elements to the top, then container elements.
    """
    return (isinstance(kv[1], (list, dict, tuple)), kv[0])
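
# A minimal sketch of ``simple_first`` in use (illustrative only): scalar
# members are emitted before container members.
def _example_simple_first():
    return dumps({'flag': True, 'nested': {'x': 1}, 'name': 'demo'},
                 item_sort_key=simple_first)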
| 41.730354 | 79 | 0.686962 | 3,932 | 27,083 | 4.601475 | 0.091811 | 0.016581 | 0.012933 | 0.023103 | 0.89145 | 0.884652 | 0.874482 | 0.874482 | 0.869065 | 0.864865 | 0 | 0.003467 | 0.222575 | 27,083 | 648 | 80 | 41.794753 | 0.855854 | 0.627626 | 0 | 0.674107 | 0 | 0 | 0.034025 | 0.002454 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0 | 0.044643 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7e4a9852d7032ce15bd75923009c5b8c45901f9f | 26 | py | Python | b2share_tools/commands/__init__.py | hjhsalo/b2share-tools | c12e568889094ffb28f01adb45cc39aed387c8b5 | [
"MIT"
] | null | null | null | b2share_tools/commands/__init__.py | hjhsalo/b2share-tools | c12e568889094ffb28f01adb45cc39aed387c8b5 | [
"MIT"
] | null | null | null | b2share_tools/commands/__init__.py | hjhsalo/b2share-tools | c12e568889094ffb28f01adb45cc39aed387c8b5 | [
"MIT"
] | null | null | null | from .record import record | 26 | 26 | 0.846154 | 4 | 26 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7e589be78fa33d62ea7e77c907dba7c0ec4fd68f | 32,997 | py | Python | tests/test_inline_markup.py | brabect1/docutils-rstwriter-dev | b74e5d5d32f697d22ab0c38bd16fa87ed7dc1ded | [
"Apache-2.0"
] | null | null | null | tests/test_inline_markup.py | brabect1/docutils-rstwriter-dev | b74e5d5d32f697d22ab0c38bd16fa87ed7dc1ded | [
"Apache-2.0"
] | null | null | null | tests/test_inline_markup.py | brabect1/docutils-rstwriter-dev | b74e5d5d32f697d22ab0c38bd16fa87ed7dc1ded | [
"Apache-2.0"
] | null | null | null | #! /usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright 2020 Tomas Brabec
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import RstWriterTestUtils
import docutils
import docutils.core
def suite():
    s = RstWriterTestUtils.PublishTestSuite(writer_name='docutils-rstwriter',
            test_class=RstWriterTestUtils.WriterNoTransformTestCase)
    s.generateTests(totest)
    return s
totest = {}
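# Each entry in totest maps a section name to a list of [input, expected]
# pairs: the first string is the reStructuredText source fed to the parser,
# the second is the text the 'docutils-rstwriter' writer should emit for it.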
totest['emphasis'] = [
["""\
*emphasis*
""",
"""\
*emphasis*
"""],
[u"""\
l'*emphasis* with the *emphasis*' apostrophe.
l\u2019*emphasis* with the *emphasis*\u2019 apostrophe.
""",
u"""\
l'*emphasis* with the *emphasis*' apostrophe.
l\u2019*emphasis* with the *emphasis*\u2019 apostrophe.
"""],
["""\
*emphasized sentence
across lines*
""",
"""\
*emphasized sentence
across lines*
"""],
["""\
*emphasis without closing asterisk
""",
"""\
*emphasis without closing asterisk
"""],
[r"""some punctuation is allowed around inline markup, e.g.
/*emphasis*/, -*emphasis*-, and :*emphasis*: (delimiters),
(*emphasis*), [*emphasis*], <*emphasis*>, {*emphasis*} (open/close pairs)
*emphasis*., *emphasis*,, *emphasis*!, and *emphasis*\ (closing delimiters),
but not
)*emphasis*(, ]*emphasis*[, >*emphasis*>, }*emphasis*{ (close/open pairs),
(*), [*], '*' or '"*"' ("quoted" start-string),
x*2* or 2*x* (alphanumeric char before),
\*args or * (escaped, whitespace behind start-string),
or *the\* *stars\* *inside* (escaped, whitespace before end-string).
However, '*args' will trigger a warning and may be problematic.
what about *this**?
""",
"""\
some punctuation is allowed around inline markup, e.g.
/*emphasis*/, -*emphasis*-, and :*emphasis*: (delimiters),
(*emphasis*), [*emphasis*], <*emphasis*>, {*emphasis*} (open/close pairs)
*emphasis*., *emphasis*,, *emphasis*!, and *emphasis*(closing delimiters),
but not
)*emphasis*(, ]*emphasis*[, >*emphasis*>, }*emphasis*{ (close/open pairs),
(*), [*], '*' or '"*"' ("quoted" start-string),
x*2* or 2*x* (alphanumeric char before),
*args or * (escaped, whitespace behind start-string),
or *the* *stars* *inside* (escaped, whitespace before end-string).
However, '*args' will trigger a warning and may be problematic.
what about *this**?
"""],
[u"""\
Quotes around inline markup:
'*emphasis*' "*emphasis*" Straight,
‘*emphasis*’ “*emphasis*” English, ...,
« *emphasis* » ‹ *emphasis* › « *emphasis* » ‹ *emphasis* ›
« *emphasis* » ‹ *emphasis* › French,
„*emphasis*“ ‚*emphasis*‘ »*emphasis*« ›*emphasis*‹ German, Czech, ...,
„*emphasis*” «*emphasis*» Romanian,
“*emphasis*„ ‘*emphasis*‚ Greek,
「*emphasis*」 『*emphasis*』traditional Chinese,
”*emphasis*” ’*emphasis*’ »*emphasis*» ›*emphasis*› Swedish, Finnish,
„*emphasis*” ‚*emphasis*’ Polish,
„*emphasis*” »*emphasis*« ’*emphasis*’ Hungarian,
""",
u"""\
Quotes around inline markup:
'*emphasis*' "*emphasis*" Straight,
‘*emphasis*’ “*emphasis*” English, ...,
« *emphasis* » ‹ *emphasis* › « *emphasis* » ‹ *emphasis* ›
« *emphasis* » ‹ *emphasis* › French,
„*emphasis*“ ‚*emphasis*‘ »*emphasis*« ›*emphasis*‹ German, Czech, ...,
„*emphasis*” «*emphasis*» Romanian,
“*emphasis*„ ‘*emphasis*‚ Greek,
「*emphasis*」 『*emphasis*』traditional Chinese,
”*emphasis*” ’*emphasis*’ »*emphasis*» ›*emphasis*› Swedish, Finnish,
„*emphasis*” ‚*emphasis*’ Polish,
„*emphasis*” »*emphasis*« ’*emphasis*’ Hungarian,
"""],
[r"""
Emphasized asterisk: *\**
Emphasized double asterisk: *\***
""",
"""\
Emphasized asterisk: ***
Emphasized double asterisk: ****
"""],
]
totest['strong'] = [
["""\
**strong**
""",
"""\
**strong**
"""],
[u"""\
l'**strong** and l\u2019**strong** with apostrophe
""",
u"""\
l'**strong** and l\u2019**strong** with apostrophe
"""],
[u"""\
quoted '**strong**', quoted "**strong**",
quoted \u2018**strong**\u2019, quoted \u201c**strong**\u201d,
quoted \xab**strong**\xbb
""",
u"""\
quoted '**strong**', quoted "**strong**",
quoted \u2018**strong**\u2019, quoted \u201c**strong**\u201d,
quoted \xab**strong**\xbb
"""],
[r"""
(**strong**) but not (**) or '(** ' or x**2 or \**kwargs or **
(however, '**kwargs' will trigger a warning and may be problematic)
""",
"""\
(**strong**) but not (**) or '(** ' or x**2 or **kwargs or **
(however, '**kwargs' will trigger a warning and may be problematic)
"""],
["""\
Strong asterisk: *****
Strong double asterisk: ******
""",
"""\
Strong asterisk: *****
Strong double asterisk: ******
"""],
["""\
**strong without closing asterisks
""",
"""\
**strong without closing asterisks
"""],
]
#
# literal
#
totest['literal'] = [
["""\
``literal``
""",
"""\
``literal``
"""],
[r"""
``\literal``
""",
"""\
``\\literal``
"""],
[r"""
``lite\ral``
""",
"""\
``lite\\ral``
"""],
[r"""
``literal\``
""",
"""\
``literal\\``
"""],
[u"""\
l'``literal`` and l\u2019``literal`` with apostrophe
""",
u"""\
l'``literal`` and l\u2019``literal`` with apostrophe
"""],
[u"""\
quoted '``literal``', quoted "``literal``",
quoted \u2018``literal``\u2019, quoted \u201c``literal``\u201d,
quoted \xab``literal``\xbb
""",
u"""\
quoted '``literal``', quoted "``literal``",
quoted \u2018``literal``\u2019, quoted \u201c``literal``\u201d,
quoted \xab``literal``\xbb
"""],
[u"""\
``'literal'`` with quotes, ``"literal"`` with quotes,
``\u2018literal\u2019`` with quotes, ``\u201cliteral\u201d`` with quotes,
``\xabliteral\xbb`` with quotes
""",
u"""\
``'literal'`` with quotes, ``"literal"`` with quotes,
``\u2018literal\u2019`` with quotes, ``\u201cliteral\u201d`` with quotes,
``\xabliteral\xbb`` with quotes
"""],
[r"""
``literal ``TeX quotes'' & \backslash`` but not "``" or ``
(however, ``standalone TeX quotes'' will trigger a warning
and may be problematic)
""",
"""\
``literal ``TeX quotes'' & \\backslash`` but not "``" or ``
(however, ``standalone TeX quotes'' will trigger a warning
and may be problematic)
"""],
["""\
Find the ```interpreted text``` in this paragraph!
""",
"""\
Find the ```interpreted text``` in this paragraph!
"""],
["""\
``literal without closing backquotes
""",
"""\
``literal without closing backquotes
"""],
[r"""
Python ``list``\s use square bracket syntax.
""",
"""\
Python ``list``s use square bracket syntax.
"""],
[r"""
Blank after opening `` not allowed.
""",
"""\
Blank after opening `` not allowed.
"""],
[r"""
no blank ``after closing``continues`` literal.
""",
"""\
no blank ``after closing``continues`` literal.
"""],
[r"""
dot ``after closing``. is possible.
""",
"""\
dot ``after closing``. is possible.
"""],
]
#
# reference
#
totest['references'] = [
["""\
ref_
""",
"""\
ref_
"""],
[u"""\
l'ref_ and l\u2019ref_ with apostrophe
""",
u"""\
l'ref_ and l\u2019ref_ with apostrophe
"""],
[u"""\
quoted 'ref_', quoted "ref_",
quoted \u2018ref_\u2019, quoted \u201cref_\u201d,
quoted \xabref_\xbb,
but not 'ref ref'_, "ref ref"_, \u2018ref ref\u2019_,
\u201cref ref\u201d_, or \xabref ref\xbb_
""",
u"""\
quoted 'ref_', quoted "ref_",
quoted \u2018ref_\u2019, quoted \u201cref_\u201d,
quoted \xabref_\xbb,
but not 'ref ref'_, "ref ref"_, \u2018ref ref\u2019_,
\u201cref ref\u201d_, or \xabref ref\xbb_
"""],
["""\
ref__
""",
"""\
ref__
"""],
[u"""\
l'ref__ and l\u2019ref__ with apostrophe
""",
u"""\
l'ref__ and l\u2019ref__ with apostrophe
"""],
[u"""\
quoted 'ref__', quoted "ref__",
quoted \u2018ref__\u2019, quoted \u201cref__\u201d,
quoted \xabref__\xbb,
but not 'ref ref'__, "ref ref"__, \u2018ref ref\u2019__,
\u201cref ref\u201d__, or \xabref ref\xbb__
""",
u"""\
quoted 'ref__', quoted "ref__",
quoted \u2018ref__\u2019, quoted \u201cref__\u201d,
quoted \xabref__\xbb,
but not 'ref ref'__, "ref ref"__, \u2018ref ref\u2019__,
\u201cref ref\u201d__, or \xabref ref\xbb__
"""],
["""\
ref_, r_, r_e-f_, -ref_, and anonymousref__,
but not _ref_ or __attr__ or object.__attr__
""",
"""\
ref_, r_, r_e-f_, -ref_, and anonymousref__,
but not _ref_ or __attr__ or object.__attr__
"""],
]
totest['phrase_references'] = [
["""\
`phrase reference`_
""",
"""\
`phrase reference`_
"""],
[u"""\
l'`phrase reference`_ and l\u2019`phrase reference`_ with apostrophe
""",
u"""\
l'`phrase reference`_ and l\u2019`phrase reference`_ with apostrophe
"""],
[u"""\
quoted '`phrase reference`_', quoted "`phrase reference`_",
quoted \u2018`phrase reference`_\u2019,
quoted \u201c`phrase reference`_\u201d,
quoted \xab`phrase reference`_\xbb
""",
u"""\
quoted '`phrase reference`_', quoted "`phrase reference`_",
quoted \u2018`phrase reference`_\u2019,
quoted \u201c`phrase reference`_\u201d,
quoted \xab`phrase reference`_\xbb
"""],
[u"""\
`'phrase reference'`_ with quotes, `"phrase reference"`_ with quotes,
`\u2018phrase reference\u2019`_ with quotes,
`\u201cphrase reference\u201d`_ with quotes,
`\xabphrase reference\xbb`_ with quotes
""",
u"""\
`'phrase reference'`_ with quotes, `"phrase reference"`_ with quotes,
`\u2018phrase reference\u2019`_ with quotes,
`\u201cphrase reference\u201d`_ with quotes,
`\xabphrase reference\xbb`_ with quotes
"""],
["""\
`anonymous reference`__
""",
"""\
`anonymous reference`__
"""],
[u"""\
l'`anonymous reference`__ and l\u2019`anonymous reference`__ with apostrophe
""",
u"""\
l'`anonymous reference`__ and l\u2019`anonymous reference`__ with apostrophe
"""],
[u"""\
quoted '`anonymous reference`__', quoted "`anonymous reference`__",
quoted \u2018`anonymous reference`__\u2019,
quoted \u201c`anonymous reference`__\u201d,
quoted \xab`anonymous reference`__\xbb
""",
u"""\
quoted '`anonymous reference`__', quoted "`anonymous reference`__",
quoted \u2018`anonymous reference`__\u2019,
quoted \u201c`anonymous reference`__\u201d,
quoted \xab`anonymous reference`__\xbb
"""],
[u"""\
`'anonymous reference'`__ with quotes, `"anonymous reference"`__ with quotes,
`\u2018anonymous reference\u2019`__ with quotes,
`\u201canonymous reference\u201d`__ with quotes,
`\xabanonymous reference\xbb`__ with quotes
""",
u"""\
`'anonymous reference'`__ with quotes, `"anonymous reference"`__ with quotes,
`\u2018anonymous reference\u2019`__ with quotes,
`\u201canonymous reference\u201d`__ with quotes,
`\xabanonymous reference\xbb`__ with quotes
"""],
["""\
`phrase reference
across lines`_
""",
"""\
`phrase reference
across lines`_
"""],
["""\
`phrase\\`_ reference`_
""",
"""\
`phrase\\`_ reference`_
"""],
["""\
Invalid phrase reference:
:role:`phrase reference`_
""",
"""\
Invalid phrase reference:
:role:`phrase reference`_
"""],
["""\
Invalid phrase reference:
`phrase reference`:role:_
""",
"""\
Invalid phrase reference:
`phrase reference`:role:_
"""],
["""\
`phrase reference_ without closing backquote
""",
"""\
`phrase reference_ without closing backquote
"""],
["""\
`anonymous phrase reference__ without closing backquote
""",
"""\
`anonymous phrase reference__ without closing backquote
"""],
]
totest['embedded_URIs'] = [
["""\
`phrase reference <http://example.com>`_
""",
"""\
`phrase reference <http://example.com>`_
"""],
["""\
`anonymous reference <http://example.com>`__
""",
# Anonymous URI references resolve to normal ones!
"""\
`anonymous reference <http://example.com>`_
"""],
["""\
`embedded URI on next line
<http://example.com>`__
""",
"""\
`embedded URI on next line <http://example.com>`_
"""],
["""\
`embedded URI across lines <http://example.com/
long/path>`__
""",
"""\
`embedded URI across lines <http://example.com/long/path>`_
"""],
["""\
`embedded URI with whitespace <http://example.com/
long/path /and /whitespace>`__
""",
"""\
`embedded URI with whitespace <http://example.com/long/path/and/whitespace>`_
"""],
[r"""
`embedded URI with escaped whitespace <http://example.com/a\
long/path\ and/some\ escaped\ whitespace>`__
`<omitted\ reference\ text\ with\ escaped\ whitespace>`__
""",
"""\
`embedded URI with escaped whitespace <http://example.com/a\ long/path\ and/some\ escaped\ whitespace>`_
`<omitted\ reference\ text\ with\ escaped\ whitespace>`_
"""],
["""\
`embedded email address <jdoe@example.com>`__
`embedded email address <mailto:jdoe@example.com>`__
`embedded email address broken across lines <jdoe
@example.com>`__
""",
"""\
`embedded email address <mailto:jdoe@example.com>`_
`embedded email address <mailto:jdoe@example.com>`_
`embedded email address broken across lines <mailto:jdoe@example.com>`_
"""],
[r"""`embedded URI with too much whitespace < http://example.com/
long/path /and /whitespace >`__
`embedded URI with too much whitespace at end <http://example.com/
long/path /and /whitespace >`__
`embedded URI with no preceding whitespace<http://example.com>`__
`escaped URI \<http://example.com>`__
See `HTML Anchors: \<a>`_.
""",
r"""`embedded URI with too much whitespace \< http://example.com/
long/path /and /whitespace >`__
`embedded URI with too much whitespace at end \<http://example.com/
long/path /and /whitespace >`__
`embedded URI with no preceding whitespace\<http://example.com>`__
`escaped URI \<http://example.com>`__
See `HTML Anchors: \<a>`_.
"""],
["""\
Relative URIs' reference text can be omitted:
`<reference>`_
`<anonymous>`__
""",
# A relative anonymous reference aliases with a normal relative reference
"""\
Relative URIs' reference text can be omitted:
`<reference>`_
`<anonymous>`_
"""],
[r"""
Escape trailing low-line char in URIs:
`<reference\_>`_
`<anonymous\_>`__
""",
"""\
Escape trailing low-line char in URIs:
`<reference\_>`_
`<anonymous\_>`_
"""],
["""\
Escape other char in URIs:
`<reference\\:1>`_
`<anonymous\\call>`__
""",
"""\
Escape other char in URIs:
`<reference:1>`_
`<anonymouscall>`_
"""],
["""\
`reference_`_
`reference\\_`_
`<reference\\_>`_
`<reference_>`_
`<reference_name>`_
`<reference\\_name>`_
""",
"""\
`reference\\_`_
`reference\\_`_
`<reference\\_>`_
reference_
`<reference\\_name>`_
`<reference\\_name>`_
"""],
]
#TODO totest['embedded_aliases'] = [
#TODO ["""\
#TODO `phrase reference <alias_>`_
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="phrase reference" refname="alias">
#TODO phrase reference
#TODO <target names="phrase\\ reference" refname="alias">
#TODO """],
#TODO ["""\
#TODO `anonymous reference <alias_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="anonymous reference" refname="alias">
#TODO anonymous reference
#TODO """],
#TODO ["""\
#TODO `embedded alias on next line
#TODO <alias_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="embedded alias on next line" refname="alias">
#TODO embedded alias on next line
#TODO """],
#TODO ["""\
#TODO `embedded alias across lines <alias
#TODO phrase_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="embedded alias across lines" refname="alias phrase">
#TODO embedded alias across lines
#TODO """],
#TODO ["""\
#TODO `embedded alias with whitespace <alias
#TODO long phrase_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="embedded alias with whitespace" refname="alias long phrase">
#TODO embedded alias with whitespace
#TODO """],
#TODO ["""\
#TODO `<embedded alias with whitespace_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="embedded alias with whitespace" refname="embedded alias with whitespace">
#TODO embedded alias with whitespace
#TODO """],
#TODO [r"""
#TODO `no embedded alias (whitespace inside bracket) < alias_ >`__
#TODO
#TODO `no embedded alias (no preceding whitespace)<alias_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference anonymous="1" name="no embedded alias (whitespace inside bracket) < alias_ >">
#TODO no embedded alias (whitespace inside bracket) < alias_ >
#TODO <paragraph>
#TODO <reference anonymous="1" name="no embedded alias (no preceding whitespace)<alias_>">
#TODO no embedded alias (no preceding whitespace)<alias_>
#TODO """],
#TODO [r"""
#TODO `anonymous reference <alias\ with\\ escaped \:characters_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="anonymous reference" refname="aliaswith\\ escaped :characters">
#TODO anonymous reference
#TODO """],
#TODO [r"""
#TODO `anonymous reference <alias\ with\\ escaped \:characters_>`__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference name="anonymous reference" refname="aliaswith\\ escaped :characters">
#TODO anonymous reference
#TODO """],
#TODO ]
totest['inline_targets'] = [
["""\
_`target`
Here is _`another target` in some text. And _`yet
another target`, spanning lines.
_`Here is a TaRgeT` with case and spacial difficulties.
""",
"""\
_`target`
Here is _`another target` in some text. And _`yet
another target`, spanning lines.
_`Here is a TaRgeT` with case and spacial difficulties.
"""],
[u"""\
l'_`target1` and l\u2019_`target2` with apostrophe
""",
u"""\
l'_`target1` and l\u2019_`target2` with apostrophe
"""],
[u"""\
quoted '_`target1`', quoted "_`target2`",
quoted \u2018_`target3`\u2019, quoted \u201c_`target4`\u201d,
quoted \xab_`target5`\xbb
""",
u"""\
quoted '_`target1`', quoted "_`target2`",
quoted \u2018_`target3`\u2019, quoted \u201c_`target4`\u201d,
quoted \xab_`target5`\xbb
"""],
[u"""\
_`'target1'` with quotes, _`"target2"` with quotes,
_`\u2018target3\u2019` with quotes, _`\u201ctarget4\u201d` with quotes,
_`\xabtarget5\xbb` with quotes
""",
u"""\
_`'target1'` with quotes, _`"target2"` with quotes,
_`\u2018target3\u2019` with quotes, _`\u201ctarget4\u201d` with quotes,
_`\xabtarget5\xbb` with quotes
"""],
["""\
But this isn't a _target; targets require backquotes.
And _`this`_ is just plain confusing.
""",
"""\
But this isn't a _target; targets require backquotes.
And _`this`_ is just plain confusing.
"""],
["""\
_`inline target without closing backquote
""",
"""\
_`inline target without closing backquote
"""],
]
totest['footnote_reference'] = [
["""\
[1]_
""",
"""\
[1]_
"""],
["""\
[#]_
""",
"""\
[#]_
"""],
["""\
[#label]_
""",
"""\
[#label]_
"""],
["""\
[*]_
""",
"""\
[*]_
"""],
["""\
[*label]_
""",
"""\
[*label]_
"""],
["""\
Back to back: [*]_ [#label]_ [#]_ [2]_ [1]_ [*label]_
""",
"""\
Back to back: [*]_ [#label]_ [#]_ [2]_ [1]_ [*label]_
"""],
["""\
Adjacent footnote refs are not possible: [*]_[#label]_ [#]_[2]_ [1]_[*]_
""",
"""\
Adjacent footnote refs are not possible: [*]_[#label]_ [#]_[2]_ [1]_[*]_
"""],
]
totest['citation_reference'] = [
["""\
[citation]_
""",
"""\
[citation]_
"""],
["""\
[citation]_ and [cit-ation]_ and [cit.ation]_ and [CIT1]_ but not [CIT 1]_
""",
"""\
[citation]_ and [cit-ation]_ and [cit.ation]_ and [CIT1]_ but not [CIT 1]_
"""],
["""\
Adjacent citation refs are not possible: [citation]_[CIT1]_
""",
"""\
Adjacent citation refs are not possible: [citation]_[CIT1]_
"""],
]
#TODO totest['substitution_references'] = [
#TODO ["""\
#TODO |subref|
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <substitution_reference refname="subref">
#TODO subref
#TODO """],
#TODO ["""\
#TODO |subref|_ and |subref|__
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <reference refname="subref">
#TODO <substitution_reference refname="subref">
#TODO subref
#TODO and \n\
#TODO <reference anonymous="1">
#TODO <substitution_reference refname="subref">
#TODO subref
#TODO """],
#TODO ["""\
#TODO |substitution reference|
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <substitution_reference refname="substitution reference">
#TODO substitution reference
#TODO """],
#TODO ["""\
#TODO |substitution
#TODO reference|
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <substitution_reference refname="substitution reference">
#TODO substitution
#TODO reference
#TODO """],
#TODO ["""\
#TODO |substitution reference without closing verbar
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO <problematic ids="id2" refid="id1">
#TODO |
#TODO substitution reference without closing verbar
#TODO <system_message backrefs="id2" ids="id1" level="2" line="1" source="test data" type="WARNING">
#TODO <paragraph>
#TODO Inline substitution_reference start-string without end-string.
#TODO """],
#TODO ["""\
#TODO first | then || and finally |||
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO first | then || and finally |||
#TODO """],
#TODO ]
totest['standalone_hyperlink'] = [
["""\
http://www.standalone.hyperlink.com
http:/one-slash-only.absolute.path
[http://example.com]
(http://example.com)
<http://example.com>
http://[1080:0:0:0:8:800:200C:417A]/IPv6address.html
http://[3ffe:2a00:100:7031::1] (the final "]" is ambiguous in text)
http://[3ffe:2a00:100:7031::1]/
mailto:someone@somewhere.com
news:comp.lang.python
An email address in a sentence: someone@somewhere.com.
ftp://ends.with.a.period.
(a.question.mark@end?)
""",
"""\
http://www.standalone.hyperlink.com
http:/one-slash-only.absolute.path
[http://example.com]
(http://example.com)
<http://example.com>
http://[1080:0:0:0:8:800:200C:417A]/IPv6address.html
http://[3ffe:2a00:100:7031::1] (the final "]" is ambiguous in text)
http://[3ffe:2a00:100:7031::1]/
mailto:someone@somewhere.com
news:comp.lang.python
An email address in a sentence: someone@somewhere.com.
ftp://ends.with.a.period.
(a.question.mark@end?)
"""],
[r"""
Valid URLs with escaped markup characters:
http://example.com/\*content\*/whatever
http://example.com/\*content*/whatever
""",
"""\
Valid URLs with escaped markup characters:
http://example.com/*content*/whatever
http://example.com/*content*/whatever
"""],
["""\
Valid URLs may end with punctuation inside "<>":
<http://example.org/ends-with-dot.>
""",
"""\
Valid URLs may end with punctuation inside "<>":
<http://example.org/ends-with-dot.>
"""],
["""\
Valid URLs with interesting endings:
http://example.org/ends-with-pluses++
""",
"""\
Valid URLs with interesting endings:
http://example.org/ends-with-pluses++
"""],
["""\
None of these are standalone hyperlinks (their "schemes"
are not recognized): signal:noise, a:b.
""",
"""\
None of these are standalone hyperlinks (their "schemes"
are not recognized): signal:noise, a:b.
"""],
["""\
Escaped email addresses are not recognized: test\\@example.org
""",
"""\
Escaped email addresses are not recognized: test@example.org
"""],
]
#TODO totest['markup recognition rules'] = [
#TODO ["""\
#TODO __This__ should be left alone.
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO __This__ should be left alone.
#TODO """],
#TODO [r"""
#TODO Character-level m\ *a*\ **r**\ ``k``\ `u`:title:\p
#TODO with backslash-escaped whitespace, including new\
#TODO lines.
#TODO """,
#TODO """\
#TODO <document source="test data">
#TODO <paragraph>
#TODO Character-level m
#TODO <emphasis>
#TODO a
#TODO <strong>
#TODO r
#TODO <literal>
#TODO k
#TODO <title_reference>
#TODO u
#TODO p
#TODO with backslash-escaped whitespace, including newlines.
#TODO """],
#TODO [u"""\
#TODO text-*separated*\u2010*by*\u2011*various*\u2012*dashes*\u2013*and*\u2014*hyphens*.
#TODO \u00bf*punctuation*? \u00a1*examples*!\u00a0*no-break-space*\u00a0.
#TODO """,
#TODO u"""\
#TODO <document source="test data">
#TODO <paragraph>
#TODO text-
#TODO <emphasis>
#TODO separated
#TODO \u2010
#TODO <emphasis>
#TODO by
#TODO \u2011
#TODO <emphasis>
#TODO various
#TODO \u2012
#TODO <emphasis>
#TODO dashes
#TODO \u2013
#TODO <emphasis>
#TODO and
#TODO \u2014
#TODO <emphasis>
#TODO hyphens
#TODO .
#TODO \xbf
#TODO <emphasis>
#TODO punctuation
#TODO ? \xa1
#TODO <emphasis>
#TODO examples
#TODO !\xa0
#TODO <emphasis>
#TODO no-break-space
#TODO \u00a0.
#TODO """],
#TODO # Whitespace characters:
#TODO # \u180e*MONGOLIAN VOWEL SEPARATOR*\u180e, fails in Python 2.6
#TODO [u"""\
#TODO text separated by
#TODO *newline*
#TODO or *space* or one of
#TODO \xa0*NO-BREAK SPACE*\xa0,
#TODO \u1680*OGHAM SPACE MARK*\u1680,
#TODO \u2000*EN QUAD*\u2000,
#TODO \u2001*EM QUAD*\u2001,
#TODO \u2002*EN SPACE*\u2002,
#TODO \u2003*EM SPACE*\u2003,
#TODO \u2004*THREE-PER-EM SPACE*\u2004,
#TODO \u2005*FOUR-PER-EM SPACE*\u2005,
#TODO \u2006*SIX-PER-EM SPACE*\u2006,
#TODO \u2007*FIGURE SPACE*\u2007,
#TODO \u2008*PUNCTUATION SPACE*\u2008,
#TODO \u2009*THIN SPACE*\u2009,
#TODO \u200a*HAIR SPACE*\u200a,
#TODO \u202f*NARROW NO-BREAK SPACE*\u202f,
#TODO \u205f*MEDIUM MATHEMATICAL SPACE*\u205f,
#TODO \u3000*IDEOGRAPHIC SPACE*\u3000,
#TODO \u2028*LINE SEPARATOR*\u2028
#TODO """,
#TODO u"""\
#TODO <document source="test data">
#TODO <paragraph>
#TODO text separated by
#TODO <emphasis>
#TODO newline
#TODO \n\
#TODO or \n\
#TODO <emphasis>
#TODO space
#TODO or one of
#TODO \xa0
#TODO <emphasis>
#TODO NO-BREAK SPACE
#TODO \xa0,
#TODO \u1680
#TODO <emphasis>
#TODO OGHAM SPACE MARK
#TODO \u1680,
#TODO \u2000
#TODO <emphasis>
#TODO EN QUAD
#TODO \u2000,
#TODO \u2001
#TODO <emphasis>
#TODO EM QUAD
#TODO \u2001,
#TODO \u2002
#TODO <emphasis>
#TODO EN SPACE
#TODO \u2002,
#TODO \u2003
#TODO <emphasis>
#TODO EM SPACE
#TODO \u2003,
#TODO \u2004
#TODO <emphasis>
#TODO THREE-PER-EM SPACE
#TODO \u2004,
#TODO \u2005
#TODO <emphasis>
#TODO FOUR-PER-EM SPACE
#TODO \u2005,
#TODO \u2006
#TODO <emphasis>
#TODO SIX-PER-EM SPACE
#TODO \u2006,
#TODO \u2007
#TODO <emphasis>
#TODO FIGURE SPACE
#TODO \u2007,
#TODO \u2008
#TODO <emphasis>
#TODO PUNCTUATION SPACE
#TODO \u2008,
#TODO \u2009
#TODO <emphasis>
#TODO THIN SPACE
#TODO \u2009,
#TODO \u200a
#TODO <emphasis>
#TODO HAIR SPACE
#TODO \u200a,
#TODO \u202f
#TODO <emphasis>
#TODO NARROW NO-BREAK SPACE
#TODO \u202f,
#TODO \u205f
#TODO <emphasis>
#TODO MEDIUM MATHEMATICAL SPACE
#TODO \u205f,
#TODO \u3000
#TODO <emphasis>
#TODO IDEOGRAPHIC SPACE
#TODO \u3000,
#TODO <paragraph>
#TODO <emphasis>
#TODO LINE SEPARATOR
#TODO """],
#TODO [u"""\
#TODO inline markup separated by non-ASCII whitespace
#TODO \xa0**NO-BREAK SPACE**\xa0, \xa0``NO-BREAK SPACE``\xa0, \xa0`NO-BREAK SPACE`\xa0,
#TODO \u2000**EN QUAD**\u2000, \u2000``EN QUAD``\u2000, \u2000`EN QUAD`\u2000,
#TODO \u202f**NARROW NBSP**\u202f, \u202f``NARROW NBSP``\u202f, \u202f`NARROW NBSP`\u202f,
#TODO """,
#TODO u"""\
#TODO <document source="test data">
#TODO <paragraph>
#TODO inline markup separated by non-ASCII whitespace
#TODO \xa0
#TODO <strong>
#TODO NO-BREAK SPACE
#TODO \xa0, \xa0
#TODO <literal>
#TODO NO-BREAK SPACE
#TODO \xa0, \xa0
#TODO <title_reference>
#TODO NO-BREAK SPACE
#TODO \xa0,
#TODO \u2000
#TODO <strong>
#TODO EN QUAD
#TODO \u2000, \u2000
#TODO <literal>
#TODO EN QUAD
#TODO \u2000, \u2000
#TODO <title_reference>
#TODO EN QUAD
#TODO \u2000,
#TODO \u202f
#TODO <strong>
#TODO NARROW NBSP
#TODO \u202f, \u202f
#TODO <literal>
#TODO NARROW NBSP
#TODO \u202f, \u202f
#TODO <title_reference>
#TODO NARROW NBSP
#TODO \u202f,
#TODO """],
#TODO [u"""\
#TODO no inline markup due to whitespace inside and behind: *
#TODO newline
#TODO *
#TODO * space * or one of
#TODO *\xa0NO-BREAK SPACE\xa0*
#TODO *\u1680OGHAM SPACE MARK\u1680*
#TODO *\u2000EN QUAD\u2000*
#TODO *\u2001EM QUAD\u2001*
#TODO *\u2002EN SPACE\u2002*
#TODO *\u2003EM SPACE\u2003*
#TODO *\u2004THREE-PER-EM SPACE\u2004*
#TODO *\u2005FOUR-PER-EM SPACE\u2005*
#TODO *\u2006SIX-PER-EM SPACE\u2006*
#TODO *\u2007FIGURE SPACE\u2007*
#TODO *\u2008PUNCTUATION SPACE\u2008*
#TODO *\u2009THIN SPACE\u2009*
#TODO *\u200aHAIR SPACE\u200a*
#TODO *\u202fNARROW NO-BREAK SPACE\u202f*
#TODO *\u205fMEDIUM MATHEMATICAL SPACE\u205f*
#TODO *\u3000IDEOGRAPHIC SPACE\u3000*
#TODO *\u2028LINE SEPARATOR\u2028*
#TODO """,
#TODO u"""\
#TODO <document source="test data">
#TODO <paragraph>
#TODO no inline markup due to whitespace inside and behind: *
#TODO newline
#TODO *
#TODO * space * or one of
#TODO *\xa0NO-BREAK SPACE\xa0*
#TODO *\u1680OGHAM SPACE MARK\u1680*
#TODO *\u2000EN QUAD\u2000*
#TODO *\u2001EM QUAD\u2001*
#TODO *\u2002EN SPACE\u2002*
#TODO *\u2003EM SPACE\u2003*
#TODO *\u2004THREE-PER-EM SPACE\u2004*
#TODO *\u2005FOUR-PER-EM SPACE\u2005*
#TODO *\u2006SIX-PER-EM SPACE\u2006*
#TODO *\u2007FIGURE SPACE\u2007*
#TODO *\u2008PUNCTUATION SPACE\u2008*
#TODO *\u2009THIN SPACE\u2009*
#TODO *\u200aHAIR SPACE\u200a*
#TODO *\u202fNARROW NO-BREAK SPACE\u202f*
#TODO *\u205fMEDIUM MATHEMATICAL SPACE\u205f*
#TODO *\u3000IDEOGRAPHIC SPACE\u3000*
#TODO *
#TODO LINE SEPARATOR
#TODO *"""],
#TODO [u"""\
#TODO no inline markup because of non-ASCII whitespace following /preceding the markup
#TODO **\xa0NO-BREAK SPACE\xa0** ``\xa0NO-BREAK SPACE\xa0`` `\xa0NO-BREAK SPACE\xa0`
#TODO **\u2000EN QUAD\u2000** ``\u2000EN QUAD\u2000`` `\u2000EN QUAD\u2000`
#TODO **\u202fNARROW NBSP\u202f** ``\u202fNARROW NBSP\u202f`` `\u202fNARROW NBSP\u202f`
#TODO """,
#TODO u"""\
#TODO <document source="test data">
#TODO <paragraph>
#TODO no inline markup because of non-ASCII whitespace following /preceding the markup
#TODO **\xa0NO-BREAK SPACE\xa0** ``\xa0NO-BREAK SPACE\xa0`` `\xa0NO-BREAK SPACE\xa0`
#TODO **\u2000EN QUAD\u2000** ``\u2000EN QUAD\u2000`` `\u2000EN QUAD\u2000`
#TODO **\u202fNARROW NBSP\u202f** ``\u202fNARROW NBSP\u202f`` `\u202fNARROW NBSP\u202f`\
#TODO """],
#TODO # « * » ‹ * › « * » ‹ * › « * » ‹ * › French,
#TODO [u"""\
#TODO "Quoted" markup start-string (matched openers & closers) -> no markup:
#TODO
#TODO '*' "*" (*) <*> [*] {*}
#TODO ⁅*⁆
#TODO
#TODO Some international quoting styles:
#TODO ‘*’ “*” English, ...,
#TODO „*“ ‚*‘ »*« ›*‹ German, Czech, ...,
#TODO „*” «*» Romanian,
#TODO “*„ ‘*‚ Greek,
#TODO 「*」 『*』traditional Chinese,
#TODO ”*” ’*’ »*» ›*› Swedish, Finnish,
#TODO „*” ‚*’ Polish,
#TODO „*” »*« ’*’ Hungarian,
#TODO
#TODO But this is „*’ emphasized »*‹.
#TODO """,
#TODO u"""\
#TODO <document source="test data">
#TODO <paragraph>
#TODO "Quoted" markup start-string (matched openers & closers) -> no markup:
#TODO <paragraph>
#TODO '*' "*" (*) <*> [*] {*}
#TODO ⁅*⁆
#TODO <paragraph>
#TODO Some international quoting styles:
#TODO ‘*’ “*” English, ...,
#TODO „*“ ‚*‘ »*« ›*‹ German, Czech, ...,
#TODO „*” «*» Romanian,
#TODO “*„ ‘*‚ Greek,
#TODO 「*」 『*』traditional Chinese,
#TODO ”*” ’*’ »*» ›*› Swedish, Finnish,
#TODO „*” ‚*’ Polish,
#TODO „*” »*« ’*’ Hungarian,
#TODO <paragraph>
#TODO But this is „
#TODO <emphasis>
#TODO ’ emphasized »
#TODO ‹.
#TODO """],
#TODO ]
def load_tests(loader, tests, pattern):
return suite()
if __name__ == '__main__':
import unittest
unittest.main(defaultTest='suite')
| 24.478487 | 104 | 0.622269 | 3,905 | 32,997 | 5.174904 | 0.116261 | 0.034442 | 0.020784 | 0.02504 | 0.809333 | 0.766033 | 0.747724 | 0.711797 | 0.673793 | 0.650534 | 0 | 0.049105 | 0.184108 | 32,997 | 1,347 | 105 | 24.496659 | 0.696531 | 0.432009 | 0 | 0.528785 | 0 | 0.010661 | 0.788696 | 0.049531 | 0 | 0 | 0 | 0.002227 | 0 | 1 | 0.004264 | false | 0 | 0.008529 | 0.002132 | 0.017058 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7e6bab37377032729c13ab8a4e2e959758e37c70 | 95 | py | Python | drivers/mil_passive_sonar/src/mil_passive_sonar/__init__.py | Burtt/mil_common | 0952376fe74fb84d1db2bbe89d014db57550183e | [
"MIT"
] | 27 | 2020-02-17T21:54:09.000Z | 2022-03-18T17:49:23.000Z | drivers/mil_passive_sonar/src/mil_passive_sonar/__init__.py | Burtt/mil_common | 0952376fe74fb84d1db2bbe89d014db57550183e | [
"MIT"
] | 325 | 2019-09-11T14:13:56.000Z | 2022-03-31T00:38:30.000Z | drivers/mil_passive_sonar/src/mil_passive_sonar/__init__.py | Burtt/mil_common | 0952376fe74fb84d1db2bbe89d014db57550183e | [
"MIT"
] | 24 | 2019-09-16T00:29:45.000Z | 2022-03-06T10:56:38.000Z | #autogenerated by ROS python message generators
from tx_interface import TxHydrophonesClient
| 31.666667 | 48 | 0.863158 | 11 | 95 | 7.363636 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126316 | 95 | 2 | 49 | 47.5 | 0.975904 | 0.484211 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7e910e7459c08f69ef9fc25935e53a96e65e35f0 | 329 | py | Python | mba_obfuscator/mba_obfuscator/__init__.py | nhpcc502/MBA-Obfuscator | 8574ef8537f884ed7bd38da7b7bc630e8e8fc8f6 | [
"MIT"
] | 12 | 2021-07-11T23:14:41.000Z | 2022-03-15T02:47:07.000Z | mba_obfuscator/mba_obfuscator/__init__.py | nhpcc502/MBA-Obfuscator | 8574ef8537f884ed7bd38da7b7bc630e8e8fc8f6 | [
"MIT"
] | null | null | null | mba_obfuscator/mba_obfuscator/__init__.py | nhpcc502/MBA-Obfuscator | 8574ef8537f884ed7bd38da7b7bc630e8e8fc8f6 | [
"MIT"
] | 2 | 2022-01-10T14:46:13.000Z | 2022-03-04T19:14:57.000Z | """MBA generation module, containing:
- truthtable_generate.py: functions to generate entire 2/3/4-variable truth table.
- lMBA_generate.py: module of linear MBA expression generation
- pMBA_generate.py: module of polynomial MBA expression generation
- npMBA_generate.py: module of non-polynomial MBA expression generation
"""
| 36.555556 | 82 | 0.802432 | 45 | 329 | 5.777778 | 0.533333 | 0.153846 | 0.184615 | 0.207692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010381 | 0.121581 | 329 | 8 | 83 | 41.125 | 0.889273 | 0.972644 | 0 | null | 1 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0e25544150bf9db20dd9d3c4350df546fb29ea0b | 1,526 | py | Python | playground/optimization/ott2butKAMA2-Gcloud-done-23it/routes.py | ysdede/jesse_strategies | ade9f4ba42cec11207c766d267b9d8feb8bce648 | [
"CC0-1.0"
] | 38 | 2021-09-18T15:33:28.000Z | 2022-02-21T17:29:08.000Z | playground/optimization/ott2butKAMA2-Gcloud-done-23it/routes.py | ysdede/jesse_strategies | ade9f4ba42cec11207c766d267b9d8feb8bce648 | [
"CC0-1.0"
] | 4 | 2022-01-02T14:46:12.000Z | 2022-02-16T18:39:41.000Z | playground/optimization/ott2butKAMA2-Gcloud-done-23it/routes.py | ysdede/jesse_strategies | ade9f4ba42cec11207c766d267b9d8feb8bce648 | [
"CC0-1.0"
] | 11 | 2021-10-19T06:21:43.000Z | 2022-02-21T17:29:10.000Z | routes = [
('FTX Futures', 'CRO-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'ASD-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'IOTA-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'NEO-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'ZRX-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'BADGER-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'RUNE-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'XLM-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'XEM-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'MER-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'ETH-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'SKL-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'PUNDIX-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'OXY-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'SC-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'ALCX-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'MEDIA-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'TRU-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'FIL-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'RSR-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'WAVES-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'CONV-USD', '5m', 'OttKamaRm', 'WL1T,O'),
('FTX Futures', 'BNB-USD', '5m', 'OttKamaRm', 'WL1T,O'),
# ('Binance Futures', 'SKL-USDT', '5m', 'Ott2butKAMA', '(╯°□°)╯︵ ┻━┻'),
]
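# The tuples above differ only in the symbol; a minimal sketch of generating
# them instead (the symbol list is copied from the tuples above, the rest of
# each tuple is taken verbatim from them):
#   symbols = ['CRO-USD', 'ASD-USD', 'IOTA-USD']  # ...and the remaining symbols above
#   routes = [('FTX Futures', s, '5m', 'OttKamaRm', 'WL1T,O') for s in symbols]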
extra_candles = []
| 50.866667 | 75 | 0.525557 | 196 | 1,526 | 4.132653 | 0.19898 | 0.283951 | 0.397531 | 0.511111 | 0.811111 | 0.787654 | 0.787654 | 0 | 0 | 0 | 0 | 0.037647 | 0.164482 | 1,526 | 29 | 76 | 52.62069 | 0.590588 | 0.045216 | 0 | 0 | 0 | 0 | 0.562199 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0e802632d6ffb9d1f4a771203d23a1aef2caaa38 | 176 | py | Python | blender_async/__init__.py | akloster/blender-asyncio | 3e94de86905127f2d1041f1b70d71c5f5b8f14c8 | [
"Apache-2.0"
] | 54 | 2015-03-17T12:54:21.000Z | 2021-12-19T04:01:25.000Z | blender_async/__init__.py | akloster/blender-asyncio | 3e94de86905127f2d1041f1b70d71c5f5b8f14c8 | [
"Apache-2.0"
] | 3 | 2016-07-07T10:21:46.000Z | 2017-06-22T15:14:24.000Z | blender_async/__init__.py | akloster/blender-asyncio | 3e94de86905127f2d1041f1b70d71c5f5b8f14c8 | [
"Apache-2.0"
] | 11 | 2015-03-17T18:46:09.000Z | 2021-12-04T10:12:05.000Z | from blender_async.bridge import get_event_loop, BlenderListener
from blender_async.dialogs import AsyncDialog, open_file_dialog, open_dialog
from .handlers import app_handler
| 44 | 76 | 0.880682 | 25 | 176 | 5.88 | 0.68 | 0.14966 | 0.217687 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085227 | 176 | 3 | 77 | 58.666667 | 0.913043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7ed440140b08de157371b50962c1e286abe802b8 | 93 | py | Python | cash_ml/classifier.py | jesse-toftum/cash_ml | 316121a41359f8d18358c17f9be2ab90ad69bcb2 | [
"MIT"
] | 4 | 2018-12-05T14:46:31.000Z | 2019-07-03T12:39:39.000Z | cash_ml/classifier.py | jesse-toftum/cash_ml | 316121a41359f8d18358c17f9be2ab90ad69bcb2 | [
"MIT"
] | 4 | 2018-12-16T18:16:26.000Z | 2019-01-11T00:10:02.000Z | cash_ml/classifier.py | jesse-toftum/cash_ml | 316121a41359f8d18358c17f9be2ab90ad69bcb2 | [
"MIT"
] | 3 | 2018-12-05T14:40:13.000Z | 2019-11-17T00:40:15.000Z | from cash_ml.predictor_base import PredictorBase
class Classifier(PredictorBase):
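    """Classifier variant of the predictor; all behavior is inherited from PredictorBase."""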
pass
| 15.5 | 48 | 0.817204 | 11 | 93 | 6.727273 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139785 | 93 | 5 | 49 | 18.6 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
7d0b3becd8c2fe49854a24a98b8d6f7e316f9dda | 36 | py | Python | calibratesdr/dabplus/__init__.py | arnaudlb/CalibrateSDR | 3493da7808bdec23aa89ad88b1149a811ace143a | [
"MIT"
] | 24 | 2020-12-30T02:11:28.000Z | 2022-02-21T20:18:44.000Z | calibratesdr/dabplus/__init__.py | jyrj/CalibrateSDR | e9b1e9dd0f4d15dd19af6c8cc6e9b04d3227d3a7 | [
"MIT"
] | 5 | 2020-12-29T09:47:08.000Z | 2021-08-30T10:33:58.000Z | calibratesdr/dabplus/__init__.py | jyrj/CalibrateSDR | e9b1e9dd0f4d15dd19af6c8cc6e9b04d3227d3a7 | [
"MIT"
] | 15 | 2020-12-31T13:34:29.000Z | 2021-11-19T13:57:55.000Z | from calibratesdr.dabplus import dab | 36 | 36 | 0.888889 | 5 | 36 | 6.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.969697 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7d247eacfaf00da0e8a2645b30650bc2750008ea | 49 | py | Python | code/check_costs/res/aws_kms_key.py | derBroBro/TerraDepot | 185830a7cb9ba3ec068c3965640b9fd055f942a0 | [
"MIT"
] | 61 | 2020-01-19T21:28:05.000Z | 2022-03-16T16:57:47.000Z | code/check_costs/res/aws_kms_key.py | derBroBro/terraform-http-backend | 185830a7cb9ba3ec068c3965640b9fd055f942a0 | [
"MIT"
] | 2 | 2021-08-02T17:09:41.000Z | 2021-08-02T17:10:07.000Z | code/check_costs/res/aws_kms_key.py | derBroBro/terraform-http-backend | 185830a7cb9ba3ec068c3965640b9fd055f942a0 | [
"MIT"
] | 5 | 2020-01-19T23:13:57.000Z | 2022-01-07T18:08:04.000Z | def run(resource):
    # flat rate: 1 USD per key per month
return 1.0
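# A hypothetical caller (not shown in this file) might dispatch cost modules
# by resource type; the import path below is an illustrative assumption only:
#   import importlib
#   monthly_cost = importlib.import_module('res.aws_kms_key').run(resource)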
| 12.25 | 18 | 0.591837 | 8 | 49 | 3.625 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 0.285714 | 49 | 3 | 19 | 16.333333 | 0.742857 | 0.163265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
7d4ec81db6ff730371fed829866a2b406860d80a | 25 | py | Python | libswitch/commands/__init__.py | liuyenting/libswitch | 38ed7debd7631a8d5deccef6a7165ffddd6e4a60 | [
"MIT"
] | null | null | null | libswitch/commands/__init__.py | liuyenting/libswitch | 38ed7debd7631a8d5deccef6a7165ffddd6e4a60 | [
"MIT"
] | null | null | null | libswitch/commands/__init__.py | liuyenting/libswitch | 38ed7debd7631a8d5deccef6a7165ffddd6e4a60 | [
"MIT"
] | null | null | null | from .cisco import Cisco
| 12.5 | 24 | 0.8 | 4 | 25 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 25 | 1 | 25 | 25 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
adc295ebe596258b0dc59c82360324a91fe75274 | 22,499 | py | Python | tests/lastfm/test_views.py | garrettc/django-ditto | fcf15beb8f9b4d61634efd4a88064df12ee16a6f | [
"MIT"
] | 54 | 2016-08-15T17:32:41.000Z | 2022-02-27T03:32:05.000Z | tests/lastfm/test_views.py | garrettc/django-ditto | fcf15beb8f9b4d61634efd4a88064df12ee16a6f | [
"MIT"
] | 229 | 2015-07-23T12:50:47.000Z | 2022-03-24T10:33:20.000Z | tests/lastfm/test_views.py | garrettc/django-ditto | fcf15beb8f9b4d61634efd4a88064df12ee16a6f | [
"MIT"
] | 8 | 2015-09-10T17:10:35.000Z | 2022-03-25T13:05:01.000Z | from django.urls import reverse
from django.test import TestCase
from freezegun import freeze_time
from ditto.core.utils import datetime_from_str
from ditto.lastfm.factories import (
AccountFactory,
AlbumFactory,
ArtistFactory,
ScrobbleFactory,
TrackFactory,
)
# from ditto.lastfm.models import *
class AlbumDetailViewTests(TestCase):
def setUp(self):
self.artist = ArtistFactory(slug="Lou+Reed")
self.album = AlbumFactory(slug="New+York", artist=self.artist)
def test_templates(self):
"Uses the correct templates"
response = self.client.get(
reverse(
"lastfm:album_detail",
kwargs={
"artist_slug": self.artist.slug,
"album_slug": self.album.slug,
},
)
)
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/album_detail.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
response = self.client.get(
reverse(
"lastfm:album_detail",
kwargs={
"artist_slug": self.artist.slug,
"album_slug": self.album.slug,
},
)
)
self.assertIn("album", response.context)
self.assertEqual(self.album.pk, response.context["album"].pk)
def test_404s(self):
"Responds with 404 if we request an album that doesn't exist."
response = self.client.get(
reverse(
"lastfm:album_detail",
kwargs={"artist_slug": self.artist.slug, "album_slug": "Transformer"},
)
)
        self.assertEqual(response.status_code, 404)
class AlbumListViewTests(TestCase):
def test_templates(self):
"Uses the correct templates"
response = self.client.get(reverse("lastfm:album_list"))
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/album_list.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
AccountFactory.create_batch(2)
AlbumFactory.create_batch(3)
response = self.client.get(reverse("lastfm:album_list"))
self.assertIn("account_list", response.context)
self.assertEqual(len(response.context["account_list"]), 2)
self.assertIn("album_list", response.context)
self.assertEqual(len(response.context["album_list"]), 3)
self.assertIn("valid_days", response.context)
self.assertEqual(
response.context["valid_days"], ["7", "30", "90", "180", "365", "all"]
)
self.assertIn("current_days", response.context)
self.assertEqual(response.context["current_days"], "all")
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_default_days(self):
"Has correct scrobble count context when all days are viewed, the default."
artist = ArtistFactory()
album = AlbumFactory(artist=artist)
ScrobbleFactory(
artist=artist,
album=album,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
album=album,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get(reverse("lastfm:album_list"))
self.assertEqual(response.context["album_list"][0].scrobble_count, 2)
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_all_days(self):
"Has correct scrobble count context when all days are viewed."
artist = ArtistFactory()
album = AlbumFactory(artist=artist)
ScrobbleFactory(
artist=artist,
album=album,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
album=album,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get("%s?days=all" % reverse("lastfm:album_list"))
self.assertEqual(response.context["album_list"][0].scrobble_count, 2)
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_7_days(self):
"Has correct scrobble count context when restricted number of days are viewed."
artist = ArtistFactory()
album = AlbumFactory(artist=artist)
ScrobbleFactory(
artist=artist,
album=album,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
album=album,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get("%s?days=7" % reverse("lastfm:album_list"))
self.assertEqual(response.context["album_list"][0].scrobble_count, 1)
class ArtistAlbumsViewTests(TestCase):
def setUp(self):
self.artist = ArtistFactory(slug="Lou+Reed")
def test_templates(self):
"Uses the correct templates"
response = self.client.get(
reverse("lastfm:artist_albums", kwargs={"artist_slug": self.artist.slug})
)
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/artist_albums.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
response = self.client.get(
reverse("lastfm:artist_albums", kwargs={"artist_slug": self.artist.slug})
)
self.assertIn("artist", response.context)
self.assertEqual(self.artist.pk, response.context["artist"].pk)
def test_404s(self):
"Responds with 404 if we request an artist that doesn't exist."
response = self.client.get(
reverse("lastfm:artist_albums", kwargs={"artist_slug": "Looper"})
)
        self.assertEqual(response.status_code, 404)
class ArtistDetailViewTests(TestCase):
def setUp(self):
self.artist = ArtistFactory(slug="Lou+Reed")
def test_templates(self):
"Uses the correct templates"
response = self.client.get(
reverse("lastfm:artist_detail", kwargs={"artist_slug": self.artist.slug})
)
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/artist_detail.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
response = self.client.get(
reverse("lastfm:artist_detail", kwargs={"artist_slug": self.artist.slug})
)
self.assertIn("artist", response.context)
self.assertEqual(self.artist.pk, response.context["artist"].pk)
def test_404s(self):
"Responds with 404 if we request an artist that doesn't exist."
response = self.client.get(
reverse("lastfm:artist_detail", kwargs={"artist_slug": "Looper"})
)
        self.assertEqual(response.status_code, 404)
class ArtistListViewTests(TestCase):
def test_templates(self):
"Uses the correct templates"
response = self.client.get(reverse("lastfm:artist_list"))
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/artist_list.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
AccountFactory.create_batch(2)
ArtistFactory.create_batch(3)
response = self.client.get(reverse("lastfm:artist_list"))
self.assertIn("account_list", response.context)
self.assertEqual(len(response.context["account_list"]), 2)
self.assertIn("artist_list", response.context)
self.assertEqual(len(response.context["artist_list"]), 3)
self.assertIn("valid_days", response.context)
self.assertEqual(
response.context["valid_days"], ["7", "30", "90", "180", "365", "all"]
)
self.assertIn("current_days", response.context)
self.assertEqual(response.context["current_days"], "all")
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_default_days(self):
"Has correct scrobble count context when all days are viewed, the default."
artist = ArtistFactory()
track = TrackFactory(artist=artist)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get(reverse("lastfm:artist_list"))
self.assertEqual(response.context["artist_list"][0].scrobble_count, 2)
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_all_days(self):
"Has correct scrobble count context when all days are viewed."
artist = ArtistFactory()
track = TrackFactory(artist=artist)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get("%s?days=all" % reverse("lastfm:artist_list"))
self.assertEqual(response.context["artist_list"][0].scrobble_count, 2)
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_7_days(self):
"Has correct scrobble count context when restricted number of days are viewed."
artist = ArtistFactory()
track = TrackFactory(artist=artist)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get("%s?days=7" % reverse("lastfm:artist_list"))
self.assertEqual(response.context["artist_list"][0].scrobble_count, 1)
class HomeViewTests(TestCase):
def test_templates(self):
"Uses the correct templates"
response = self.client.get(reverse("lastfm:home"))
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/home.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
accounts = AccountFactory.create_batch(3)
ScrobbleFactory(account=accounts[0])
ScrobbleFactory(account=accounts[1])
response = self.client.get(reverse("lastfm:home"))
self.assertIn("account_list", response.context)
self.assertEqual(len(response.context["account_list"]), 3)
self.assertIn("counts", response.context)
self.assertIn("scrobbles", response.context["counts"])
self.assertEqual(response.context["counts"]["scrobbles"], 2)
class ScrobbleListViewTests(TestCase):
def test_templates(self):
"Uses the correct templates"
response = self.client.get(reverse("lastfm:scrobble_list"))
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/scrobble_list.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
AccountFactory.create_batch(3)
response = self.client.get(reverse("lastfm:scrobble_list"))
self.assertIn("account_list", response.context)
self.assertEqual(len(response.context["account_list"]), 3)
self.assertIn("scrobble_list", response.context)
class TrackDetailViewTests(TestCase):
def setUp(self):
self.artist = ArtistFactory(slug="Lou+Reed")
self.track = TrackFactory(slug="Hold+On", artist=self.artist)
def test_templates(self):
"Uses the correct templates"
response = self.client.get(
reverse(
"lastfm:track_detail",
kwargs={
"artist_slug": self.artist.slug,
"track_slug": self.track.slug,
},
)
)
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/track_detail.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
response = self.client.get(
reverse(
"lastfm:track_detail",
kwargs={
"artist_slug": self.artist.slug,
"track_slug": self.track.slug,
},
)
)
self.assertIn("track", response.context)
self.assertEqual(self.track.pk, response.context["track"].pk)
def test_404s(self):
"Responds with 404 if we request a track that doesn't exist."
response = self.client.get(
reverse(
"lastfm:track_detail",
kwargs={"artist_slug": self.artist.slug, "track_slug": "Viscious"},
)
)
        self.assertEqual(response.status_code, 404)
class TrackListViewTests(TestCase):
def test_templates(self):
"Uses the correct templates"
response = self.client.get(reverse("lastfm:track_list"))
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/track_list.html")
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context(self):
"Sends the correct data to the templates"
AccountFactory.create_batch(2)
TrackFactory.create_batch(3)
response = self.client.get(reverse("lastfm:track_list"))
self.assertIn("account_list", response.context)
self.assertEqual(len(response.context["account_list"]), 2)
self.assertIn("track_list", response.context)
self.assertEqual(len(response.context["track_list"]), 3)
self.assertIn("valid_days", response.context)
self.assertEqual(
response.context["valid_days"], ["7", "30", "90", "180", "365", "all"]
)
self.assertIn("current_days", response.context)
self.assertEqual(response.context["current_days"], "all")
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_default_days(self):
"Has correct scrobble count context when all days are viewed, the default."
artist = ArtistFactory()
track = TrackFactory(artist=artist)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get(reverse("lastfm:track_list"))
self.assertEqual(response.context["track_list"][0].scrobble_count, 2)
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_all_days(self):
"Has correct scrobble count context when all days are viewed."
artist = ArtistFactory()
track = TrackFactory(artist=artist)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get("%s?days=all" % reverse("lastfm:track_list"))
self.assertEqual(response.context["track_list"][0].scrobble_count, 2)
@freeze_time("2016-10-05 12:00:00", tz_offset=-8)
def test_7_days(self):
"Has correct scrobble count context when restricted number of days are viewed."
artist = ArtistFactory()
track = TrackFactory(artist=artist)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2012-10-01 12:00:00"),
)
ScrobbleFactory(
artist=artist,
track=track,
post_time=datetime_from_str("2016-10-01 12:00:00"),
)
response = self.client.get("%s?days=7" % reverse("lastfm:track_list"))
self.assertEqual(response.context["track_list"][0].scrobble_count, 1)
class UserCommonTests(object):
"""Parent for all user-specific views.
Doesn't inherit from TestCase because we don't want the tests in this class
to run, only in its child classes.
Child classes should inherit like:
class MyChildTestCase(UserCommonTests, TestCase):
in that order, so that setUp() runs.
"""
    # e.g. 'user_album_list':
view_name = "DEFINE IN CHILD CLASSES"
def setUp(self):
bob = AccountFactory(username="bob")
terry = AccountFactory(username="terry")
self.artist1 = ArtistFactory()
self.track1 = TrackFactory(artist=self.artist1)
self.album1 = AlbumFactory(artist=self.artist1)
self.artist2 = ArtistFactory()
self.track2 = TrackFactory(artist=self.artist2)
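        # Scrobble totals from the batches below: bob 2 + 5 = 7, terry 3 + 7 = 10.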
ScrobbleFactory.create_batch(
2, account=bob, track=self.track1, artist=self.artist1, album=self.album1
)
ScrobbleFactory.create_batch(
5, account=bob, track=self.track2, artist=self.artist2
)
ScrobbleFactory.create_batch(
3, account=terry, track=self.track1, artist=self.artist1, album=self.album1
)
ScrobbleFactory.create_batch(
7, account=terry, track=self.track2, artist=self.artist2
)
def test_templates(self):
"Uses the correct templates"
response = self.client.get(
reverse("lastfm:%s" % self.view_name, kwargs={"username": "bob"})
)
        self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "lastfm/%s.html" % self.view_name)
self.assertTemplateUsed(response, "lastfm/base.html")
self.assertTemplateUsed(response, "ditto/base.html")
def test_context_counts(self):
"""Sends the correct count data to the templates.
All user_* views should have these same counts in their context.
"""
response = self.client.get(
reverse("lastfm:%s" % self.view_name, kwargs={"username": "bob"})
)
self.assertIn("counts", response.context)
self.assertEqual(response.context["counts"]["albums"], 1)
self.assertEqual(response.context["counts"]["artists"], 2)
self.assertEqual(response.context["counts"]["scrobbles"], 7)
self.assertEqual(response.context["counts"]["tracks"], 2)
def test_404s(self):
"Responds with 404 if we request a user that doesn't exist."
response = self.client.get(
reverse("lastfm:%s" % self.view_name, kwargs={"username": "thelma"})
)
        self.assertEqual(response.status_code, 404)
class UserDetailViewTestCase(UserCommonTests, TestCase):
view_name = "user_detail"
class UserAlbumListViewTestCase(UserCommonTests, TestCase):
view_name = "user_album_list"
def test_context_albums(self):
"Sends the correct album data to the templates"
response = self.client.get(
reverse("lastfm:%s" % self.view_name, kwargs={"username": "bob"})
)
self.assertIn("album_list", response.context)
albums = response.context["album_list"]
self.assertEqual(len(albums), 1)
self.assertEqual(albums[0], self.album1)
self.assertEqual(albums[0].scrobble_count, 2)
class UserArtistListViewTestCase(UserCommonTests, TestCase):
view_name = "user_artist_list"
    def test_context_artists(self):
        "Sends the correct artist data to the templates"
response = self.client.get(
reverse("lastfm:%s" % self.view_name, kwargs={"username": "bob"})
)
self.assertIn("artist_list", response.context)
artists = response.context["artist_list"]
self.assertEqual(len(artists), 2)
self.assertEqual(artists[0], self.artist2)
self.assertEqual(artists[1], self.artist1)
self.assertEqual(artists[0].scrobble_count, 5)
self.assertEqual(artists[1].scrobble_count, 2)
class UserScrobbleListViewTestCase(UserCommonTests, TestCase):
view_name = "user_scrobble_list"
def test_context_scrobbles(self):
"Sends the correct scrobble data to the templates"
response = self.client.get(
reverse("lastfm:%s" % self.view_name, kwargs={"username": "bob"})
)
self.assertIn("scrobble_list", response.context)
scrobbles = response.context["scrobble_list"]
self.assertEqual(len(scrobbles), 7)
class UserTrackListViewTestCase(UserCommonTests, TestCase):
view_name = "user_track_list"
def test_context_tracks(self):
"Sends the correct track data to the templates"
response = self.client.get(
reverse("lastfm:%s" % self.view_name, kwargs={"username": "bob"})
)
self.assertIn("track_list", response.context)
tracks = response.context["track_list"]
self.assertEqual(len(tracks), 2)
self.assertEqual(tracks[0], self.track2)
self.assertEqual(tracks[0].scrobble_count, 5)
self.assertEqual(tracks[1], self.track1)
self.assertEqual(tracks[1].scrobble_count, 2)
| 38.133898 | 87 | 0.632828 | 2,540 | 22,499 | 5.48937 | 0.070866 | 0.0667 | 0.049057 | 0.057233 | 0.835545 | 0.812522 | 0.775228 | 0.766191 | 0.75199 | 0.737288 | 0 | 0.03475 | 0.247922 | 22,499 | 589 | 88 | 38.198642 | 0.789256 | 0.098004 | 0 | 0.65996 | 0 | 0 | 0.209 | 0.008636 | 0 | 0 | 0 | 0 | 0.235412 | 1 | 0.086519 | false | 0 | 0.01006 | 0 | 0.138833 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
add0b09893a25a75855e1c0f73884293c1553204 | 96 | py | Python | core/utils/states/__init__.py | AKurmazov/hoteluni_bot | f6c25ecc92cc71326cd1c7417f79092874d457a9 | [
"MIT"
] | 2 | 2020-09-09T11:24:42.000Z | 2021-05-23T18:46:54.000Z | core/utils/states/__init__.py | AKurmazov/hoteluni_bot | f6c25ecc92cc71326cd1c7417f79092874d457a9 | [
"MIT"
] | 6 | 2019-08-24T07:35:32.000Z | 2020-03-23T17:53:51.000Z | core/utils/states/__init__.py | AKurmazov/hoteluni_bot | f6c25ecc92cc71326cd1c7417f79092874d457a9 | [
"MIT"
] | 3 | 2019-09-02T09:34:37.000Z | 2021-11-26T18:27:45.000Z | from .choose_language import *
from .cleaning_reminder import *
from .mailing_everyone import *
| 24 | 32 | 0.8125 | 12 | 96 | 6.25 | 0.666667 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 96 | 3 | 33 | 32 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
70aaf376e9234a7e97cdc61c77a505c4e542177c | 51,831 | py | Python | tests/unit/test_config.py | lantz/faucet2 | fc97d51924283679b35da2b59fedb58202a218e9 | [
"Apache-2.0"
] | null | null | null | tests/unit/test_config.py | lantz/faucet2 | fc97d51924283679b35da2b59fedb58202a218e9 | [
"Apache-2.0"
] | null | null | null | tests/unit/test_config.py | lantz/faucet2 | fc97d51924283679b35da2b59fedb58202a218e9 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
"""Test config parsing"""
import logging
import re
import shutil
import tempfile
import os
import unittest
from faucet import config_parser as cp
LOGNAME = '/dev/null'
class TestConfig(unittest.TestCase): # pytype: disable=module-attr
"""Test config parsing raises correct exception."""
tmpdir = None
def setUp(self):
logging.disable(logging.CRITICAL)
self.tmpdir = tempfile.mkdtemp()
def tearDown(self):
logging.disable(logging.NOTSET)
shutil.rmtree(self.tmpdir)
def conf_file_name(self):
"""Return path to test config file in test directory."""
return os.path.join(self.tmpdir, 'faucet.yaml')
def create_config_file(self, config):
"""Returns file path to file containing the config parameter."""
conf_file_name = self.conf_file_name()
with open(conf_file_name, 'wb') as conf_file:
if isinstance(config, bytes):
conf_file.write(config)
else:
conf_file.write(config.encode('utf-8'))
return conf_file_name
def run_function_with_config(self, config, function, before_function=None):
"""Return False with error if provided function raises InvalidConfigError."""
        # TODO: check that acls_in works now that acl_in is deprecated.
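        # e.g. rewrite 'acl_in: office-vlan-protect' to 'acls_in: [office-vlan-protect]'.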
        if isinstance(config, str) and 'acl_in' in config and 'acls_in' not in config:
config = re.sub('(acl_in: )(.*)', 'acls_in: [\\2]', config)
conf_file = self.create_config_file(config)
if before_function:
before_function()
try:
function(conf_file, LOGNAME)
except cp.InvalidConfigError as err:
return (False, err)
return (True, None)
def check_config_failure(self, config, function, before_function=None):
"""Ensure config parsing reported as failed."""
config_success, config_err = self.run_function_with_config(
config, function, before_function)
self.assertEqual(config_success, False, config_err)
def check_config_success(self, config, function, before_function=None):
"""Ensure config parsing reported succeeded."""
config_success, config_err = self.run_function_with_config(
config, function, before_function)
self.assertEqual(config_success, True, config_err)
def test_dupe_vid(self):
"""Test that VLANs cannot have same VID."""
config = """
vlans:
office:
vid: 100
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
2:
native_vlan: guest
"""
self.check_config_failure(config, cp.dp_parser)
def test_unhashable_key(self):
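        """Test that config with an unhashable key is rejected."""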
config = """
vlans:
? office:
vid: 100
guest:
vid: 200
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
2:
native_vlan: office
3:
native_vlan: guest
4:
native_vlan: office
5:
tagged_vlans: [office]
sw2:
dp_id: 0x2
interfaces:
1:
native_vlan: office
2:
native_vlan: guest
24:
tagged_vlans: [office, guest]
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_contains_only_int(self):
"""Test that config is invalid when only an int"""
config = """5"""
self.check_config_failure(config, cp.dp_parser)
def test_config_contains_only_float(self):
"""Test that config is invalid when only a float"""
config = """5.5"""
self.check_config_failure(config, cp.dp_parser)
def test_config_contains_only_str(self):
"""Test config is invalid when only a string"""
config = """aaaa"""
self.check_config_failure(config, cp.dp_parser)
def test_config_only_boolean(self):
"""Test config is invalid when only a boolean"""
config = """False"""
self.check_config_failure(config, cp.dp_parser)
def test_config_only_datetime(self):
"""Test that config is invalid when only a datetime object"""
config = """1967-07-31"""
self.check_config_failure(config, cp.dp_parser)
def test_config_contains_only_dash(self):
"""Test that config is invalid when only only a -"""
config = """-"""
self.check_config_failure(config, cp.dp_parser)
def test_config_contains_only_array(self):
"""Test that config is invalid when only only [2, 2]"""
config = """[2, 2]"""
self.check_config_failure(config, cp.dp_parser)
def test_config_only_empty_array(self):
"""Test that config is invalid when only only []"""
config = """[]"""
self.check_config_failure(config, cp.dp_parser)
def test_unconfigured_acl(self):
"""Test that config is invalid when there are unconfigured acls"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
acl_in: access-port-protect
tagged_vlans: [office]
"""
self.check_config_failure(config, cp.dp_parser)
def test_unconfigured_vlan_acl(self):
"""Test that config is invalid when only there are unconfigured acls"""
config = """
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
tagged_vlans: [office]
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_routes_are_empty(self):
"""Test that config is invalid when vlan routes are empty"""
config = """
vlans:
office:
vid: 100
routes:
- route:
ip_dst:
ip_gw:
dps:
sw1:
dp_id: 0x1
interfaces:
5:
tagged_vlans: [office]
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_routes_not_strings(self):
"""Test config is invalid when vlan routes are not strings"""
config = """
vlans:
office:
vid: 100
routes:
- route:
ip_dst: 5.5
ip_gw: []
dps:
sw1:
dp_id: 0x1
interfaces:
5:
tagged_vlans: [office]
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_vips_not_strings(self):
"""Test that config is invalid when faucet_vips does not contain strings"""
config = """
vlans:
office:
vid: 100
faucet_vips: [False, 5.5, []]
dps:
sw1:
dp_id: 0x1
interfaces:
5:
tagged_vlans: [office]
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_faucet_invalid_vips(self):
"""Test that config is rejected if faucet_vips does not contain valid ip addresses"""
config = """
vlans:
office:
vid: 100
faucet_vips: ['aaaaa', '', '123421342']
dps:
sw1:
dp_id: 0x1
interfaces:
5:
tagged_vlans: [office]
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_vlans_is_empty(self):
"""Test that config is rejected when vlans is empty"""
config = """
vlans:
dps:
sw1:
dp_id: 0x1
hardware: "Open vSwitch"
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_dps_is_empty(self):
"""Test that config is rejected when dps is empty"""
config = """
vlans:
office:
vid: 100
dps:
"""
self.check_config_failure(config, cp.dp_parser)
def test_including_invalid_files(self):
"""Test that config is rejected when including invalid files"""
config = """
include: [-, False, 1967-06-07, 5.5, [5], {'5': 5}, testing]
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
5:
tagged_vlans: [office]
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_vlans_on_stack(self):
"""Test that config is rejected vlans on a stack interface."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
hardware: "Open vSwitch"
stack:
priority: 1
interfaces:
1:
native_vlan: office
stack:
dp: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
hardware: "Open vSwitch"
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_stack(self):
"""Test valid stacking config."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
hardware: "Open vSwitch"
stack:
priority: 1
interfaces:
1:
stack:
dp: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
hardware: "Open vSwitch"
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_config_stack_and_non_stack(self):
"""Test stack and non-stacking config."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
hardware: "Open vSwitch"
stack:
priority: 1
interfaces:
1:
stack:
dp: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
hardware: "Open vSwitch"
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
sw3:
dp_id: 0x3
hardware: "Open vSwitch"
interfaces:
1:
native_vlan: office
2:
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_config_stack_islands(self):
"""Test that stack islands don't exist."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
hardware: "Open vSwitch"
stack:
priority: 1
interfaces:
1:
stack:
dp: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
hardware: "Open vSwitch"
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
sw3:
dp_id: 0x3
hardware: "Open vSwitch"
interfaces:
1:
stack:
dp: sw4
port: 1
2:
native_vlan: office
sw4:
dp_id: 0x4
hardware: "Open vSwitch"
interfaces:
1:
stack:
dp: sw3
port: 1
2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_port_number(self):
"""Test port number is valid."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
testing:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_override_port(self):
"""Test override port is valid."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
testing:
number: 1
native_vlan: office
override_output_port: output_port
output_port:
number: 2
output_only: True
"""
self.check_config_success(config, cp.dp_parser)
def test_one_port_dp(self):
"""Test port number is valid."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
testing:
number: 1
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_dp_id_too_big(self):
"""Test DP ID is valid."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0xfffffffffffffffffffffffffffffffff
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_vid(self):
"""Test VID is valid."""
config = """
vlans:
office:
vid: 10000
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_routers_empty(self):
"""Test with empty router config."""
config = """
routers:
router-1:
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_valid_mac(self):
"""Test with valid MAC."""
config = """
vlans:
office:
vid: 100
faucet_mac: '11:22:33:44:55:66'
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_invalid_mac(self):
"""Test with invalid MAC."""
config = """
vlans:
office:
vid: 100
faucet_mac: '11:22:33:44:55:66:77:88'
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_empty_mac(self):
"""Test with empty MAC."""
config = """
vlans:
office:
vid: 100
faucet_mac: ''
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_empty_vid(self):
"""Test empty VID."""
config = """
vlans:
office:
vid:
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_empty_interfaces(self):
"""Test empty interfaces."""
config = """
vlans:
office:
vid:
dps:
sw1:
dp_id: 0x1
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_interfaces(self):
"""Test invalid interfaces."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces: {'5': 5}
"""
self.check_config_failure(config, cp.dp_parser)
def test_unresolved_mirror_ports(self):
"""Test invalid mirror port name."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: mirror_all
acls:
mirror_all:
- rule:
actions:
mirror: UNRESOLVED
allow: 1
"""
self.check_config_failure(config, cp.dp_parser)
def test_resolved_mirror_port(self):
"""Test can use name reference to mirrored port."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
mirrored_port:
number: 1
native_vlan: office
2:
mirror: mirrored_port
"""
self.check_config_success(config, cp.dp_parser)
def test_vlans_on_mirror_ports(self):
"""Test invalid VLANs configured on a mirror port."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
2:
native_vlan: office
mirror: 1
"""
self.check_config_failure(config, cp.dp_parser)
def test_unresolved_output_ports(self):
"""Test invalid output port name."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: mirror_all
acls:
mirror_all:
- rule:
actions:
output:
port: UNRESOLVED
allow: 1
"""
self.check_config_failure(config, cp.dp_parser)
def test_unresolved_actions_output_ports(self):
"""Test invalid output port name with actions"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: output_unresolved
acls:
output_unresolved:
- rule:
actions:
output:
set_fields:
- eth_dst: '01:00:00:00:00:00'
port: UNRESOLVED
"""
self.check_config_failure(config, cp.dp_parser)
def test_unknown_output_ports(self):
"""Test invalid mirror ACL port."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: mirror_all
acls:
mirror_all:
- rule:
actions:
output:
port: 2
allow: 1
"""
self.check_config_failure(config, cp.dp_parser)
def test_port_range_valid_config(self):
"""Test if port range config applied correctly"""
config = """
vlans:
office:
vid: 100
guest:
vid: 200
dps:
sw1:
dp_id: 0x1
interface_ranges:
1-4,6,port8:
native_vlan: office
max_hosts: 2
permanent_learn: True
port10-11:
native_vlan: guest
max_hosts: 2
interfaces:
1:
max_hosts: 4
description: "video conf"
"""
conf_file = self.create_config_file(config)
_, dps = cp.dp_parser(conf_file, LOGNAME)
dp = dps[0]
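        # '1-4,6,port8' plus 'port10-11' expands to ports {1, 2, 3, 4, 6, 8, 10, 11}.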
self.assertEqual(len(dp.ports), 8)
self.assertTrue(all([p.permanent_learn for p in dp.ports.values() if p.number < 9]))
self.assertTrue(all([p.max_hosts == 2 for p in dp.ports.values() if p.number > 1]))
self.assertTrue(dp.ports[1].max_hosts == 4)
self.assertEqual(dp.ports[1].description, "video conf")
def test_single_range_valid_config(self):
"""Test if port range with single port config applied correctly"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interface_ranges:
1:
native_vlan: office
"""
conf_file = self.create_config_file(config)
_, dps = cp.dp_parser(conf_file, LOGNAME)
dp = dps[0]
self.assertEqual(len(dp.ports), 1)
def test_port_range_invalid_config(self):
"""Test invalid characters used in interface_ranges."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interface_ranges:
abc:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_no_actions(self):
"""Test ACL with invalid actions section."""
config = """
acls:
office-vlan-protect:
- rule:
dl_type: 0x800
actions:
0 allow: 0
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_invalid_ipv4(self):
"""Test invalid IPv4 address in ACL."""
config = """
acls:
office-vlan-protect:
- rule:
dl_type: 0x800
ipv4_src: q0.0.200.0/24
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: access-port-protect
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_invalid_ipv6(self):
"""Test invalid IPv6 address in ACL."""
config = """
acls:
office-vlan-protect:
- rule:
dl_type: 0x800
ipv6_src: zyx
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: access-port-protect
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_invalid_mask(self):
"""Test invalid IPv4 mask in ACL."""
config = """
acls:
office-vlan-protect:
- rule:
dl_type: 0x800
ipv4_src: 10/0.200.0/24
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: access-port-protect
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_invalid_udp_port(self):
"""Test invalid UDP port in ACL."""
config = """
acls:
access-port-protect:
- rule:
udp_src: v7
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: access-port-protect
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_invalid_rule_name(self):
"""Test invalid name for rule in ACL."""
config = """
acls:
access-port-protect:
- xrule:
udp_src: v7
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: access-port-protect
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_and_acls_vlan_invalid(self):
"""Test cannot have acl_in and acls_in together."""
config = """
acls:
access-port-protect:
- rule:
udp_src: 80
office-vlan-protect:
- rule:
dl_type: 0x800
ipv4_src: 10.0.200.0/24
vlans:
office:
vid: 100
acl_in: office-vlan-protect
acls_in: [access-port-protect]
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_inconsistent_exact_match(self):
"""Test that ACLs have consistent exact_match."""
config = """
acls:
acl_a:
exact_match: False
rules:
- rule:
udp_src: 80
acl_b:
exact_match: True
rules:
- rule:
udp_src: 81
vlans:
office:
vid: 100
acls_in: [acl_a, acl_b]
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_acl_and_acls_port_invalid(self):
"""Test cannot have acl_in and acls_in together on the same port."""
config = """
acls:
access-port-protect:
- rule:
udp_src: 80
office-vlan-protect:
- rule:
dl_type: 0x800
ipv4_src: 10.0.200.0/24
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acl_in: office-vlan-protect
acls_in: [access-port-protect]
"""
self.check_config_failure(config, cp.dp_parser)
def test_acls_vlan_valid(self):
"""Test ACLs can be combined on VLAN."""
config = """
acls:
access-port-protect:
- rule:
udp_src: 80
office-vlan-protect:
- rule:
dl_type: 0x800
ipv4_src: 10.0.200.0/24
vlans:
office:
vid: 100
acls_in: [access-port-protect, office-vlan-protect]
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_acls_port_valid(self):
"""Test ACLs can be combined on a port."""
config = """
acls:
access-port-protect:
- rule:
udp_src: 80
office-vlan-protect:
- rule:
dl_type: 0x800
ipv4_src: 10.0.200.0/24
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
acls_in: [access-port-protect, office-vlan-protect]
"""
self.check_config_success(config, cp.dp_parser)
def test_invalid_char(self):
"""Test config file with invalid characters."""
config = b'\x63\xe1'
self.check_config_failure(config, cp.dp_parser)
def test_perm_denied(self):
"""Test config file has no read permission."""
def unreadable():
"""Make config unreadable."""
os.chmod(self.conf_file_name(), 0)
config = ''
self.check_config_failure(config, cp.dp_parser, before_function=unreadable)
def test_missing_route_config(self):
"""Test missing IP gateway for route."""
config = """
vlans:
office:
vid: 100
routes:
- route:
ip_dst: '192.168.0.0/24'
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_dp_conf(self):
"""Test invalid DP header config."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
description: "host1 container"
0 native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_duplicate_keys_conf(self):
"""Test duplicate top level keys."""
config = """
vlans:
office:
vid: 100
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
testing:
number: 1
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_dp_id_not_a_string(self):
"""Test dp_id is not a string"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: &x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_key(self):
"""Test invalid key"""
config = """
acls:
? office-vlan-protect:
- rule:
actions:
allow: 1
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_acl_formation(self):
"""Test missing ACL name."""
config = """
acls:
# office-vlan-protect:
- rule:
actions:
allow: 1
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_route_value(self):
"""Test routes value forming a dictionary"""
config = """
vlans:
office:
vid: 100
routes:
- - route:
ip_dst: '192.168.0.0/24'
ip_gw: '10.0.100.2'
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_mirror_port(self):
"""Test referencing invalid mirror port"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
mirror: 1"
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_include_values(self):
"""Test include directive contains invalid values"""
config = """
include:
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_ipv4_src_is_empty(self):
"""Test acl ipv4_src is empty"""
config = """
acls:
office-vlan-protect:
- rule:
dl_type: 0x800
ipv4_src:
actions:
allow: 0
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_empty_eth_dst(self):
"""Test eth_dst/dl_dst is empty"""
config = """
vlans:
100:
acls:
101:
- rule:
dl_dst:
actions:
output:
port: 1
dps:
switch1:
dp_id: 0xcafef00d
interfaces:
1:
native_vlan: 100
acl_in: 101
"""
self.check_config_failure(config, cp.dp_parser)
def test_router_vlan_invalid_type(self):
"""Test when router vlans forms a dict"""
config = """
vlans:
100:
acls:
101:
- rule:
dl_dst: "0e:00:00:00:02:02"
actions:
mirror:
port: 1
dps:
switch1:
dp_id: 0xcafef00d
interfaces:
1:
native_vlan: 100
acl_in: 101
"""
self.check_config_failure(config, cp.dp_parser)
def test_mirror_port_invalid_type(self):
"""Test when mirror port forms a dict"""
config = """
vlans:
100:
acls:
101:
- rule:
dl_dst: "0e:00:00:00:02:02"
actions:
mirror:
port: 1
dps:
switch1:
dp_id: 0xcafef00d
interfaces:
1:
native_vlan: 100
acl_in: 101
"""
self.check_config_failure(config, cp.dp_parser)
def test_referencing_unconfigured_dp_in_stack(self):
"""Test when referencing a nonexistent dp in a stack"""
config = """
vlans:
office:
vid: 100
dps:
3w1:
dp_id: 0x1
stack:
priority: 1
interfaces:
1:
stack:
dp: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_referencing_unconfigured_port_in_stack(self):
"""Test when referencing a nonexistent port for dp in a stack"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
stack:
priority: 1
interfaces:
9:
stack:
dp: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_not_referencing_a_port_in_the_stack(self):
"""Test when not referencing a port in a stack"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
stack:
priority: 1
interfaces:
1:
stack:
dp: sw2
0ort: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_not_referencing_a_dp_in_the_stack(self):
"""Test when not referencing a dp in a stack"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
stack:
priority: 1
interfaces:
1:
stack:
$p: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_no_rules_in_acl(self):
"""Test when no rules are present in acl"""
config = """
acls:
mirror_destination: {}
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_empty_ipv6_src(self):
"""Test when ipv6_src is empty"""
config = """
acls:
office-vlan-protect:
- rule:
dl_type: 0x800
ipv6_src:
actions:
allow: 0
vlans:
office:
vid: 100
acl_in: office-vlan-protect
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_port_number_is_wrong_type(self):
"""Test when port number is a dict"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
stack:
priority: 1
interfaces:
1:
number:
dp: sw2
port: 1
2:
native_vlan: office
sw2:
dp_id: 0x2
interfaces:
1:
stack:
dp: sw1
port: 1
2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_invalid_date_time_object(self):
"""Test when config is just an invalid datetime object"""
config = """
1976-87-04
"""
self.check_config_failure(config, cp.dp_parser)
def test_config_is_only_bad_float(self):
"""Test when config is this specific case of characters"""
config = """
._
"""
self.check_config_failure(config, cp.dp_parser)
def test_stack_port_is_list(self):
"""Test when stack port is a list"""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
stack:
priority: 1
interfaces:
1:
stack:
dp: sw2
port: []# 2:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_bad_vlan_reference(self):
"""Test when tagged vlans is a dict"""
config = """
vlans:
office:
vid: 100
guest:
vid: 200
dps:
sw2:
dp_id: 0x2
interfaces:
24:
tagged_vlans: [office: guest]
"""
self.check_config_failure(config, cp.dp_parser)
def test_bad_set_fields(self):
"""Test unknown set_field."""
config = """
acls:
bad_acl:
rules:
- rule:
actions:
output:
set_fields:
- nosuchfield: "xyz"
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
acl_in: bad_acl
"""
self.check_config_failure(config, cp.dp_parser)
def test_good_set_fields(self):
"""Test good set_fields."""
config = """
acls:
good_acl:
rules:
- rule:
actions:
output:
set_fields:
- eth_dst: "0e:00:00:00:00:01"
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
acl_in: good_acl
"""
self.check_config_success(config, cp.dp_parser)
def test_bad_match_fields(self):
"""Test bad match fields."""
config = """
acls:
bad_acl:
rules:
- rule:
notsuch: "match"
actions:
output:
set_fields:
- eth_dst: "0e:00:00:00:00:01"
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
acl_in: bad_acl
"""
self.check_config_failure(config, cp.dp_parser)
def test_push_pop_vlans_acl(self):
"""Test push and pop VLAN ACL fields."""
config = """
acls:
good_acl:
rules:
- rule:
actions:
output:
pop_vlans: 1
vlan_vids:
- { vid: 200, eth_type: 0x8100 }
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
acl_in: good_acl
"""
self.check_config_success(config, cp.dp_parser)
def test_dp_acls(self):
"""Test DP ACLs."""
config = """
acls:
good_acl:
rules:
- rule:
actions:
output:
port: 1
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
dp_acls: [good_acl]
interfaces:
1:
native_vlan: 100
"""
self.check_config_success(config, cp.dp_parser)
def test_force_port_vlan(self):
"""Test push force_port_vlan."""
config = """
acls:
good_acl:
rules:
- rule:
actions:
allow: 1
force_port_vlan: 1
output:
swap_vid: 101
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
tagged_vlans: [100]
acl_in: good_acl
"""
self.check_config_success(config, cp.dp_parser)
def test_failover_acl(self):
"""Test failover ACL fields."""
config = """
acls:
good_acl:
rules:
- rule:
actions:
output:
failover:
group_id: 1
ports: [1]
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
acl_in: good_acl
"""
self.check_config_success(config, cp.dp_parser)
def test_unreferenced_acl(self):
"""Test an unresolveable port in an ACL that is not referenced is OK."""
config = """
acls:
unreferenced_acl:
rules:
- rule:
actions:
output:
port: 99
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
"""
self.check_config_success(config, cp.dp_parser)
def test_bad_cookie(self):
"""Test bad cookie value."""
config = """
acls:
bad_cookie_acl:
rules:
- rule:
cookie: 999999
actions:
output:
port: 1
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
acl_in: bad_cookie_acl
"""
self.check_config_failure(config, cp.dp_parser)
def test_routers_unreferenced(self):
"""Test with unreferenced router config."""
config = """
routers:
router-1:
vlans: [office, guest]
vlans:
office:
vid: 100
guest:
vid: 200
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_routers_overlapping_vips(self):
"""Test with unreferenced router config."""
config = """
routers:
router-1:
vlans: [office, guest]
vlans:
office:
vid: 100
faucet_vips: ["10.0.0.1/24"]
guest:
vid: 200
faucet_vips: ["10.0.0.2/24"]
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
2:
native_vlan: guest
"""
self.check_config_failure(config, cp.dp_parser)
def test_same_vlan_tagged_untagged(self):
"""Test cannot have the same VLAN tagged and untagged on same port."""
config = """
vlans:
guest:
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
tagged_vlans: [100]
"""
self.check_config_failure(config, cp.dp_parser)
def test_share_bgp_routing_VLAN(self):
"""Test cannot share VLAN with BGP across DPs."""
config = """
vlans:
routing:
vid: 100
faucet_vips: ["10.0.0.254/24"]
bgp_server_addresses: ["127.0.0.1"]
bgp_as: 1
bgp_routerid: "1.1.1.1"
bgp_neighbor_addresses: ["127.0.0.1"]
bgp_neighbor_as: 2
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: routing
sw2:
dp_id: 0x2
interfaces:
1:
native_vlan: routing
"""
self.check_config_failure(config, cp.dp_parser)
def test_multi_bgp(self):
"""Test multiple BGP VLANs can be configured."""
config = """
vlans:
routing1:
vid: 100
faucet_vips: ["10.0.0.254/24"]
bgp_server_addresses: ["127.0.0.1"]
bgp_as: 100
bgp_routerid: "1.1.1.1"
bgp_neighbor_addresses: ["127.0.0.1"]
bgp_neighbor_as: 100
routing2:
vid: 200
faucet_vips: ["10.0.0.253/24"]
bgp_server_addresses: ["127.0.0.1"]
bgp_as: 200
bgp_routerid: "1.1.1.1"
bgp_neighbor_addresses: ["127.0.0.2"]
bgp_neighbor_as: 200
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: routing1
2:
native_vlan: routing2
"""
self.check_config_success(config, cp.dp_parser)
def test_bgp_server_invalid(self):
"""Test invalid BGP server address."""
bgp_config = """
vlans:
100:
description: "100"
bgp_port: 9179
bgp_server_addresses: ['256.0.0.1']
bgp_as: 1
bgp_routerid: '1.1.1.1'
bgp_neighbor_addresses: ['127.0.0.1']
bgp_neighbor_as: 2
dps:
switch1:
dp_id: 0xcafef00d
hardware: 'Open vSwitch'
interfaces:
1:
native_vlan: 100
"""
self.check_config_failure(bgp_config, cp.dp_parser)
def test_bgp_neighbor_invalid(self):
"""Test invalid BGP server address."""
bgp_config = """
vlans:
100:
description: "100"
bgp_port: 9179
bgp_server_addresses: ['127.0.0.1']
bgp_as: 1
bgp_routerid: '1.1.1.1'
bgp_neighbor_addresses: ['256.0.0.1']
bgp_neighbor_as: 2
dps:
switch1:
dp_id: 0xcafef00d
hardware: 'Open vSwitch'
interfaces:
1:
native_vlan: 100
"""
self.check_config_failure(bgp_config, cp.dp_parser)
def test_unknown_vlan_key(self):
"""Test unknown VLAN key."""
config = """
vlans:
unknown_key:
name: office
vid: 100
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_unknown_dp_key(self):
"""Test unknown DP key."""
config = """
dps:
unknown_key:
name: sw1
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_unknown_port_key(self):
"""Test unknown port key."""
config = """
dps:
sw1:
dp_id: 0x1
interfaces:
unknown_key:
name: port1
number: 3
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_meter_config(self):
"""Test valid meter config."""
config = """
meters:
lossymeter:
meter_id: 1
entry:
flags: "KBPS"
bands:
[
{
type: "DROP",
rate: 1000
}
]
acls:
lossyacl:
- rule:
actions:
meter: lossymeter
allow: 1
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: 100
acl_in: lossyacl
"""
self.check_config_success(config, cp.dp_parser)
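# Context note (added here; it summarizes standard OpenFlow 1.3 meter semantics
# rather than anything FAUCET-specific): a meter with flags "KBPS" and a DROP
# band at rate 1000 drops packets once the matched traffic exceeds roughly
# 1000 kbit/s. The ACL above attaches that meter to every packet it allows on
# port 1, so the port is effectively rate limited.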
def test_dp_lldp_minimal_invalid(self):
"""Test minimal invalid DP config."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
lldp_beacon:
system_name: test_system
interfaces:
testing:
number: 1
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_dp_lldp_minimal_valid(self):
"""Test minimal valid DP config."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
lldp_beacon:
send_interval: 10
max_per_interval: 10
interfaces:
testing:
number: 1
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_port_lldp_minimal_valid(self):
"""Test minimal valid LLDP config."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
lldp_beacon:
send_interval: 10
max_per_interval: 10
interfaces:
testing:
number: 1
native_vlan: office
lldp_beacon:
enable: true
"""
self.check_config_success(config, cp.dp_parser)
def test_all_lldp_valid(self):
"""Test a fully specified valid LLDP config."""
config = """
vlans:
office:
vid: 100
dps:
sw1:
dp_id: 0x1
lldp_beacon:
system_name: test_system
send_interval: 10
max_per_interval: 10
interfaces:
testing:
number: 1
native_vlan: office
lldp_beacon:
enable: true
system_name: port_system
port_descr: port_description
org_tlvs:
- {oui: 0x12bb, subtype: 2, info: "01406500"}
"""
self.check_config_success(config, cp.dp_parser)
def test_interface_ranges_lldp(self):
"""Verify lldp config works when using interface ranges"""
config = """
vlans:
office:
vid: 100
guest:
vid: 200
dps:
sw1:
dp_id: 0x1
lldp_beacon:
send_interval: 10
max_per_interval: 10
interface_ranges:
'1-2':
lldp_beacon:
enable: True
system_name: port_system
org_tlvs:
- {oui: 0x12bb, subtype: 2, info: "01406500"}
interfaces:
1:
native_vlan: office
2:
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_multi_acl_dp(self):
"""Test multiple ACLs with multiple DPs, where one ACL does mirroring."""
config = """
dps:
SWPRI2:
dp_id: 0x223d5a07ff
interfaces:
11:
acl_in: non_mirroring_acl
native_vlan: 197
SWSEC0B:
dp_id: 0xe01aea107a69
interfaces:
30:
native_vlan: 197
acl_in: mirroring_acl
47:
native_vlan: 197
vlans:
197:
acls:
mirroring_acl:
- rule:
actions:
allow: 1
mirror: 47
non_mirroring_acl:
- rule:
actions:
allow: 1
"""
self.check_config_success(config, cp.dp_parser)
def test_vlan_route_dictionary_valid(self):
"""Test new vlan route format as dictionary is valid"""
config = """
vlans:
office:
vid: 100
routes:
- {ip_gw: '10.0.0.1', ip_dst: '10.99.99.0/24'}
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_success(config, cp.dp_parser)
def test_vlan_route_missing_value_invalid(self):
"""Test new vlan route format fails when missing value"""
config = """
vlans:
office:
vid: 100
routes:
- {}
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
def test_vlan_route_values_invalid(self):
"""Test new vlan route format fails when values are invalid"""
config = """
vlans:
office:
vid: 100
routes:
- {ip_gw: [],ip_gw: 5.5}
dps:
sw1:
dp_id: 0x1
interfaces:
1:
native_vlan: office
"""
self.check_config_failure(config, cp.dp_parser)
if __name__ == "__main__":
unittest.main() # pytype: disable=module-attr
| 22.984922 | 93 | 0.504177 | 5,722 | 51,831 | 4.344984 | 0.072527 | 0.018663 | 0.044244 | 0.069504 | 0.768764 | 0.752353 | 0.734816 | 0.709476 | 0.667404 | 0.633135 | 0 | 0.049194 | 0.403079 | 51,831 | 2,254 | 94 | 22.99512 | 0.754388 | 0.089715 | 0 | 0.843548 | 0 | 0.000997 | 0.639462 | 0.008262 | 0 | 0 | 0.010473 | 0.000444 | 0.003986 | 1 | 0.058794 | false | 0 | 0.003488 | 0 | 0.065272 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
70c3298555393b1bdd44aa30a87f94499e678a0b | 29,839 | py | Python | Core/mailing.py | hanif-ali/BlogBar | e24aa0835ac8869680bd904ad050fd7437ed97c7 | [
"PostgreSQL"
] | null | null | null | Core/mailing.py | hanif-ali/BlogBar | e24aa0835ac8869680bd904ad050fd7437ed97c7 | [
"PostgreSQL"
] | null | null | null | Core/mailing.py | hanif-ali/BlogBar | e24aa0835ac8869680bd904ad050fd7437ed97c7 | [
"PostgreSQL"
] | 1 | 2020-10-25T18:11:22.000Z | 2020-10-25T18:11:22.000Z | import smtplib, ssl
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
import os
import random
# from Core.Usermanagement import get_user_type_and_langage  # TODO: circular import -- defined locally below for now
from Core.dbconn import get_database_connection
footer = """
<footer style="font-family: Calibri;"><br><br><br>
<img src="https://blogbar.eu/static/img/Mailingfooterlogo.jpg" width="300px;"><br><br><br>
<p>
BlogBar Digital Network UG (haftungsbeschränkt)<br>
Krausstr. 1<br>
63897 Miltenberg
</p>
<p>
<a href="www.blogbar.eu">www.blogbar.eu</a><br>
<a href="mailto:cheers@blogbar.eu">cheers@blogbar.eu</a><br>
Tel. 0176 8747 9127
</p>
<p>
Firmensitz: Miltenberg<br>
Registergericht: Amtsgericht Aschaffenburg HRB 15208<br>
Umsatzsteuer-Identifikationsnummer gem. § 27a UStG: DE325297281<br>
Geschäftsführer: Axel Sommer, Carolin Wolz<br>
</p>
</footer>
"""
def get_user_type_and_langage(mail: str) -> (int, str, str):
"""
:param mail: e-mail address of the user to look up
:return: (int, str, str) ==> (kind, language_abbr, name)
"""
dbconnection = get_database_connection()
cursor = dbconnection.cursor()
cursor.execute("""SELECT KIND, LANG, NAME FROM sign_up_view WHERE MAIL=%s;""", (
mail,
))
result = cursor.fetchall()[0]
result = (result[0], result[1], result[2])
cursor.close()
dbconnection.close()
return result
def get_random_string() -> str:
"""Return three random lowercase letters (note: not cryptographically secure)."""
# "q" and "u" appear twice in this list, so they are drawn slightly more often
alphabet = ["a", "b", "c", "d", "e", "f", "g", "h", "i", "j", "k", "l", "m", "n", "o", "p", "q", "q", "r", "s",
"t", "u", "u", "v", "w", "x", "y", "z"]
string = ""
for i in range(0, 3):
string += alphabet[random.randint(0, len(alphabet) - 1)]
return string
def generate_token_pn():
"""Build a token of 15 random 3-letter groups followed by two random 0-255 numbers."""
token = ""
for i in range(0, 15):
token += get_random_string()
for i in range(0, 2):
token += str(random.randint(0, 255))
return token
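# Hedged alternative (an addition, not part of the original BlogBar code):
# tokens that gate password resets or e-mail confirmation should normally come
# from a cryptographically secure source. random.randint() above is driven by a
# non-cryptographic PRNG and is predictable in principle; the stdlib `secrets`
# module avoids that. The function name below is ours, chosen for illustration.
import secrets

def generate_token_secure():
    """Drop-in sketch: ~43 URL-safe characters carrying 256 bits of entropy."""
    return secrets.token_urlsafe(32)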
def generate_pwd_reset_token(mail_addr: str) -> str:
dbconnection = get_database_connection()
cursor = dbconnection.cursor()
token = generate_token_pn()
try:
# Create a fresh token row; if one already exists for this address (presumably
# via a unique constraint on email), fall back to updating it in place --
# a manual upsert.
cursor.execute("""
INSERT INTO pwd_reset_tokens(token, email) VALUES (%s, %s)""", (token, mail_addr))
except:
cursor.execute("""
UPDATE pwd_reset_tokens SET token = %s WHERE email = %s""", (token, mail_addr))
dbconnection.commit()
cursor.close()
dbconnection.close()
return token
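# Illustrative sketch (an assumption, not present in the original module): the
# counterpart lookup a reset view would need. The function name and the
# one-time-use DELETE are design choices made for the example; only the
# pwd_reset_tokens table and get_database_connection() come from this file.
def consume_pwd_reset_token_sketch(token: str):
    """Return the e-mail bound to a token and invalidate it, or None."""
    dbconnection = get_database_connection()
    cursor = dbconnection.cursor()
    cursor.execute("""SELECT email FROM pwd_reset_tokens WHERE token = %s""", (token,))
    row = cursor.fetchone()
    if row is not None:
        # one-time use: drop the token as soon as it has been redeemed
        cursor.execute("""DELETE FROM pwd_reset_tokens WHERE token = %s""", (token,))
        dbconnection.commit()
    cursor.close()
    dbconnection.close()
    return row[0] if row is not None else None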
def send_double_opt_in_request(receiver_mail, first_name, kind: int, language: str):
"""Send the double-opt-in confirmation mail (kind 2 = business account, else influencer)."""
sender_email = "noreply@blogbar.eu"
password = "BlogBar2103#"
message = MIMEMultipart("alternative")
message[
"Subject"] = "BlogBar: E-Mail-Adresse bestätigen!" if language == "de" else "BlogBar: Confirm E-Mail-Address!"
message["From"] = sender_email
message["To"] = receiver_mail
confirm_key = generate_token_pn()
dbconnection = get_database_connection()
cursor = dbconnection.cursor()
cursor.execute("""DELETE FROM confirm_keys WHERE email = %s""", (receiver_mail,))
cursor.execute("""INSERT INTO confirm_keys(email, token) VALUES (%s, %s)""", (receiver_mail, confirm_key))
print("INSERTED FOR: {mail}".format(
mail=receiver_mail,
))
dbconnection.commit()
cursor.close()
dbconnection.close()
# Create the plain-text and HTML version of your message
if kind == 2:
if language == 'de':
text = """\
Hallo {first_name},
vielen Dank für Ihre Registrierung bei BlogBar. Wir freuen uns, Sie bei der Suche nach passenden Influencern unterstützen zu können.
Bevor es losgehen kann, bitte bestätigen Sie Ihre E-Mailadresse durch folgenden Link: https://blogbar.eu/de/confirm?key={key}
Im Anschluss können Sie sich direkt einloggen und das passende Paket für Ihre Anforderungen buchen.
Bitte beachten Sie: Sofern Sie den Link innerhalb von 3 Tagen nicht bestätigen, wird die Anmeldung zurückgesetzt und Ihre Daten gelöscht.
Bei Fragen stehen wir gerne zur Verfügung.
Ihr Team von BlogBar""".format(first_name=first_name, key=confirm_key)
html = """\
<html>
<body style="font-family: Calibri;">
<p>Hallo {first_name},<br><br>
vielen Dank für Ihre Registrierung bei BlogBar. Wir freuen uns, Sie bei der Suche nach passenden Influencern unterstützen zu können.
Bevor es losgehen kann, bitte bestätigen Sie Ihre E-Mailadresse durch folgenden Link:<br>
<a href="https://blogbar.eu/de/confirm?key={key}">Jetzt bestätigen</a>
<br>
<br>
Bitte beachten Sie: Sofern Sie den Link innerhalb von 3 Tagen nicht bestätigen, wird die Anmeldung zurückgesetzt und Ihre Daten gelöscht.
Bei Fragen stehen wir gerne zur Verfügung.
<br>
<br>
Wir freuen uns, dass Sie unserer Plattform beigetreten sind.<br><br>
Ihr Team von BlogBar!
</p>
</body>
{footer}
</html>
""".format(first_name=first_name,
key=confirm_key,
footer=footer)
else:
text = """\
Hello {first_name},
Thank you for registering at BlogBar. We are happy to assist you in your search for suitable influencers.
Before you can start, please confirm your e-mail address with the following link: https://blogbar.eu/de/confirm?key={key}
Then you are able to log in directly and book the right package for your requirements.
Please note: If you do not confirm the link within 3 days, the registration will be reset and your data deleted.
If you have any questions, please do not hesitate to contact us.
Your team from BlogBar""".format(first_name=first_name, key="4567898765456789098765678909876567890")
html = """\
<html>
<body style="font-family: Calibri;">
<p>Hello {first_name},<br><br>
Thank you for registering at BlogBar. We are happy to assist you in your search for suitable influencers.
Before you can start, please confirm your e-mail address with the following link:<br>
<a href="https://blogbar.eu/de/confirm?key={key}">Confirm now!</a>
<br>
<br>
Then you are able to log in directly and book the right package for your requirements.
Please note: If you do not confirm the link within 3 days, the registration will be reset and your data deleted.
<br>
If you have any questions, please do not hesitate to contact us.
<br>
<br>
Your team from BlogBar!
{footer}
</p>
</body>
</html>
""".format(first_name=first_name,
key=confirm_key,
footer=footer)
else:
if language == 'de':
text = """\
Hallo {first_name},
wir freuen uns sehr, dass du ab sofort Teil unserer Community bist.
Bevor es losgehen kann, bitte bestätige deine E-Mailadresse durch folgenden Link: https://blogbar.eu/de/confirm?key={key}
Im Anschluss kannst du dich direkt einloggen und deine Daten jederzeit anpassen.
Bitte beachten: Sofern du den Link innerhalb von 3 Tagen nicht bestätigst, wird die Anmeldung zurückgesetzt und deine Daten gelöscht.
Bei Fragen stehen wir gerne zur Verfügung.
Dein Team von BlogBar""".format(first_name=first_name, key="4567898765456789098765678909876567890")
html = """\
<html>
<body style="font-family: Calibri;">
<p>Hallo {first_name},<br><br>
wir freuen uns sehr, dass du ab sofort Teil unserer Community bist.
Bevor es losgehen kann, bitte bestätige deine E-Mailadresse durch folgenden Link:<br>
<a href="https://blogbar.eu/de/confirm?key={key}">Bestätigen</a>
<br>
<br>
Im Anschluss kannst du dich direkt einloggen und deine Daten jederzeit anpassen.
Bitte beachten: Sofern du den Link innerhalb von 3 Tagen nicht bestätigst, wird die Anmeldung zurückgesetzt und deine Daten gelöscht.
Bei Fragen stehen wir gerne zur Verfügung.
<br>
<br>
Wir freuen uns, dass du unserer Plattform beigetreten bist.<br><br>
</p>
<p>Dein Team von BlogBar</p>
</body>
{footer}
</html>
""".format(first_name=first_name,
key=confirm_key,
footer=footer)
else:
text = """\
Hello {first_name},
we are very happy that you are part of our community now. Before you can start, please confirm your e-mail address with the following link:https://blogbar.eu/de/confirm?key={key}
Afterwards you can log in directly and change your data every time.
Please note: If you do not confirm the link within 3 days, the registration will be reset and your data deleted.
If you have any questions, please do not hesitate to contact us.
Your team from BlogBar""".format(first_name=first_name,
key="4567898765456789098765678909876567890")
html = """\
<html>
<body style="font-family: Calibri;">
<p>Hello {first_name},<br><br>
we are very happy that you are part of our community now.
Before you can start, please confirm your e-mail address with the following link:
<br>
<a href="https://blogbar.eu/de/confirm?key={key}">Confirm by clicking this link</a>
<br>
<br>
Afterwards you can log in directly and change your data every time.
Please note: If you do not confirm the link within 3 days, the registration will be reset and your data deleted.
If you have any questions, please do not hesitate to contact us.
</p>
<br><br>
<p>Your Team from BlogBar</p>
</body>
{footer}
</html>
""".format(first_name=first_name, key=confirm_key, footer=footer)
# Turn these into plain/html MIMEText objects
part1 = MIMEText(text, "plain")
part2 = MIMEText(html, "html")
# Add HTML/plain-text parts to MIMEMultipart message
# The email client will try to render the last part first
message.attach(part1)
message.attach(part2)
# Create secure connection with server and send email
context = ssl.create_default_context()
with smtplib.SMTP_SSL("smtp.ionos.de", 465, context=context) as server:
server.login(sender_email, password)
try:
server.sendmail(
sender_email, receiver_mail, message.as_string()
)
except smtplib.SMTPRecipientsRefused:
# Recipient address rejected by the SMTP server (e.g. mailbox does not exist)
pass
def send_pwd_reset_token(receiver_mail):
"""E-mail a freshly generated password-reset link to the user."""
kind, language, name = get_user_type_and_langage(receiver_mail)
sender_email = "noreply@blogbar.eu"
password = "BlogBar2103#"
message = MIMEMultipart("alternative")
message["Subject"] = "BlogBar: Passwort zurücksetzen" if language == "de" else "BlogBar: Reset your Password"
message["From"] = sender_email
message["To"] = receiver_mail
token = generate_pwd_reset_token(receiver_mail)
if language == "de":
if kind == 1:
text = """
Hallo {name},
Mit folgendem Link kannst du dein Passwort zurücksetzen:
https://blogbar.eu/de/pwd_reset?token={key}
Viele Grüße,
Dein Team von BlogBar
""".format(key=token, name=name)
html = """\
<html>
<body style="font-family: Calibri;">
<p>
Hallo {name},<br><br>
Mit folgendem Link kannst du dein Passwort zurücksetzen:<br>
<a href="https://blogbar.eu/de/pwd_reset?token={key}">Passwort zurücksetzten</a>
</p>
<p><br>Viele Grüße,<br><br>Dein Team von BlogBar</br></p>
</body>
{footer}
</html>
""".format(key=token, footer=footer, name=name)
else:
text = """
Hallo Frau / Herr {name},
Mit folgendem Link können Sie Ihr Passwort zurücksetzen:
https://blogbar.eu/de/pwd_reset?token={key}
Viele Grüße,
Ihr Team von BlogBar
""".format(key=token, name=name)
html = """\
<html>
<body style="font-family: Calibri;">
<p>
Hallo Frau / Herr {name},<br><br>
Mit folgendem Link können Sie Ihr Passwort zurücksetzen:<br>
<a href="https://blogbar.eu/de/pwd_reset?token={key}">Passwort zurücksetzten</a>
</p>
<p>Viele Grüße,<br><br>Ihr Team von BlogBar</p>
</body>
{footer}
</html>
""".format(key=token, footer=footer, name=name)
else:
if kind == 1:
text = """
Hello {name},
With the following link you can reset your password:
https://blogbar.eu/de/pwd_reset?token={key}
Best regards,
Your team from BlogBar
""".format(key=token, name=name)
html = """\
<html>
<body style="font-family: Calibri;">
<p>
Hello {name},<br><br>
With the following link you can reset your password:<br>
<a href="https://blogbar.eu/de/pwd_reset?token={key}">Reset password</a>
</p>
<p>Best regards, <br><br>Your team from BlogBar</p>
</body>
{footer}
</html>
""".format(key=token, footer=footer, name=name)
else:
text = """
Hello Mrs / Mr {name},
With the following link you can reset your password:
https://blogbar.eu/de/pwd_reset?token={key}
Best regards,
Your team from BlogBar
""".format(key=token, name=name)
html = """\
<html>
<body style="font-family: Calibri;">
<p>
Hello Mrs / Mr {name},<br><br>
With the following link you can reset your password:<br>
<a href="https://blogbar.eu/de/pwd_reset?token={key}">Reset password</a>
</p>
<p>Best regards, <br><br>Your team from BlogBar</p>
</body>
{footer}
</html>
""".format(key=token, footer=footer, name=name)
# Turn these into plain/html MIMEText objects
part1 = MIMEText(text, "plain")
part2 = MIMEText(html, "html")
# Add HTML/plain-text parts to MIMEMultipart message
# The email client will try to render the last part first
message.attach(part1)
message.attach(part2)
# Create secure connection with server and send email
context = ssl.create_default_context()
try:
with smtplib.SMTP_SSL("smtp.ionos.de", 465, context=context) as server:
server.login(sender_email, password)
server.sendmail(
sender_email, receiver_mail, message.as_string()
)
except:
pass
def send_log_in_alert(influencer_identifier: int):
"""Notify an influencer by mail that a login to their account just happened."""
dbconnection = get_database_connection()
cursor = dbconnection.cursor()
cursor.execute("""SELECT * FROM influencer WHERE influencer_identifier = %s;""", (influencer_identifier,))
data = cursor.fetchone()
keys = cursor.column_names
cursor.close()
dbconnection.close()
influencer = {}
for index in range(0, len(keys)):
influencer[keys[index]] = data[index]
sender_email = "noreply@blogbar.eu"
password = "BlogBar2103#"
message = MIMEMultipart("alternative")
message["Subject"] = "Sicherheitshinweis: Neue Anmeldung in Deinem BlogBar-Account"
message["From"] = sender_email
message["To"] = influencer["email"]
text = """\
Hi {first_name},
es wurde sich vor kurzem in Dein BlogBar-Konto eingeloggt.
Falls Du es selbst warst, kannst du diese Mail ignorieren.
Falls Du das Gefühl haben solltest, eine fremde Person ist im Besitz deiner Zugangsdaten, solltest du diese unter
den Kontoeinstellungen sofort ändern.""".format(first_name=influencer["first_name"])
html = """\
<html>
<body style="font-family: Calibri;">
<p>Hi {first_name},<br>
<br>
es wurde sich vor kurzem in Dein BlogBar-Konto eingeloggt.<br>
Falls Du es selbst warst, kannst du diese Mail ignorieren.
</p>
<p>Falls Du das Gefühl haben solltest, eine fremde Person ist im Besitz deiner Zugangsdaten, solltest du diese unter
den Kontoeinstellungen sofort ändern: <a href="https://blogbar.eu/de/login"> Hier anmelden </a></p>
<p>Du kannst dich bei Fragen zur weiteren Vorgehensweise gerne bei unserem Support-Team melden.<br><br>
Dein Team von BlogBar!</p>
</body>
{footer}
</html>
""".format(first_name=influencer["first_name"], footer=footer)
# Turn these into plain/html MIMEText objects
part1 = MIMEText(text, "plain")
part2 = MIMEText(html, "html")
# Add HTML/plain-text parts to MIMEMultipart message
# The email client will try to render the last part first
message.attach(part1)
message.attach(part2)
context = ssl.create_default_context()
try:
with smtplib.SMTP_SSL("smtp.ionos.de", 465, context=context) as server:
server.login(sender_email, password)
server.sendmail(
sender_email, influencer["email"], message.as_string()
)
except:
pass
def send_goodbye_message(mail, first_name, language: str):
"""Confirm account deletion to a user after they unsubscribe."""
sender_email = "noreply@blogbar.eu"
password = "BlogBar2103#"
message = MIMEMultipart("alternative")
message[
"Subject"] = "Auf Wiedersehen: Deine Abmeldung war erfolgreich." if language == "de" else "Goodbye: Your Profile-Deletion-Request was successful"
message["From"] = sender_email
message["To"] = mail
if language == 'de':
text = """\
Hi {first_name},
du hast dich erfolgreich abgemeldet.
Deine personenbezogenen Daten wurden vollständig und dauerhaft gelöscht.
Schade, dass du gehst!
Hat dir ein Feature gefehlt? Teile uns gerne den Grund für deine Abmeldung mit: cheers@blogbar.eu
Dein BlogBar-Team""".format(first_name=first_name)
html = """\
<html>
<body style="font-family: Calibri;">
Hi {first_name},<br><br>
<p>du hast dich erfolgreich abgemeldet.</p>
<p>Deine personenbezogenen Daten wurden <strong>vollständig und dauerhaft</strong> gelöscht.</p>
<p>Schade, dass du gehst!<br>
Hat dir ein Feature gefehlt? Teile uns gerne den Grund für deine Abmeldung mit: <a href="mailto:cheers@blogbar.eu">cheers@blogbar.eu</a></p>
<br>
Dein Team von BlogBar!
</body>
{footer}
</html>
""".format(first_name=first_name, footer=footer)
else:
text = """\
Hello {first_name},
So sad that you want to unsubscribe from BlogBar. Of course we will comply with your request and confirm your unsubscription.
Your data will be completely deleted immediately.
We thank you for your trust and wish a lot of success in the future!
Your team from BlogBar""".format(first_name=first_name)
html = """\
<html>
<body style="font-family: Calibri;">
Hello {first_name},<br><br>
<p>So sad that you want to unsubscribe from BlogBar. Of course we will comply with your request and confirm your unsubscription.</p>
<p>Your data will be completely deleted immediately.</p>
<p>We thank you for your trust and wish a lot of success in the future!</p>
<br>
Your team from BlogBar!
</body>
{footer}
</html>
""".format(first_name=first_name, footer=footer)
# Turn these into plain/html MIMEText objects
part1 = MIMEText(text, "plain")
part2 = MIMEText(html, "html")
# Add HTML/plain-text parts to MIMEMultipart message
# The email client will try to render the last part first
message.attach(part1)
message.attach(part2)
context = ssl.create_default_context()
try:
with smtplib.SMTP_SSL("smtp.ionos.de", 465, context=context) as server:
server.login(sender_email, password)
server.sendmail(
sender_email, mail, message.as_string()
)
except:
pass
def send_booked_package_confirmation_mail(company_name, contact_person, contact_email, booked_package, expire_date,
token, booked_date, booked_package_duration_in_month,
booked_package_total_amount,
language_abbr: str):
"""Confirm a premium-package booking and link to the downloadable invoice."""
sender_email = "noreply@blogbar.eu"
password = "BlogBar2103#"
message = MIMEMultipart("alternative")
if language_abbr == "de":
message[
"Subject"] = "Buchungsbestätigung: Die Buchung deines {months}-Monats-{level}-Premiumpaketes war erfolgreich".format(
months=booked_package_duration_in_month,
level=str(booked_package).upper()
)
else:
message[
"Subject"] = "Booking-Confirmation: The Booking of your {months}-Months-{level}-premium-package was successful".format(
months=booked_package_duration_in_month,
level=str(booked_package).upper()
)
message["From"] = sender_email
message["To"] = contact_email
# message['Cc'] = "cheers@blogbar.eu"
if language_abbr == 'de':
text = """\
Hallo {contact_person},
die Buchung Ihres {level}-Paketes war erfolgreich und wird Ihnen für die nächsten {months} Monate
erweiterte Funktionen ermöglichen.
Ihr Paket wird voraussichtlich am {expiration_date} auslaufen. Das Premiumpaket kann nach Ablauf erneut
gebucht werden.
Eine Rechnung kann unter folgendem Link heruntergeladen werden:
https://blogbar.eu/invoice/{token}
Ihr BlogBar-Team""".format(contact_person=contact_person,
level=str(booked_package).upper(),
months=int(booked_package_duration_in_month),
expiration_date=expire_date,
token=token)
html = """\
<html>
<body style="font-family: Calibri;">
<p>Hallo {contact_person},</p>
<p>die Buchung Ihres {level}-Paketes war erfolgreich und wird Ihnen für die nächsten {months} Monate
erweiterte Funktionen ermöglichen.</p>
<p>Ihr Paket wird voraussichtlich am {expiration_date} auslaufen. Das Premiumpaket kann nach Ablauf erneut
gebucht werden.<p>
<p>Eine Rechnung kann unter folgendem Link heruntergeladen werden: <br>
<a href="https://blogbar.eu/invoice/{token}"> Rechnung hier herunterladen</a></p>
<p>Ihr BlogBar-Team</p>
</body>
{footer}
</html>
""".format(contact_person=contact_person,
level=str(booked_package).upper(),
months=int(booked_package_duration_in_month),
expiration_date=expire_date,
token=token, footer=footer)
else:
text = """\
Hello {contact_person},
Thank you for booking our premium package!
You can download the invoice using this link:
https://blogbar.eu/invoice/{token}
Your package is activated. You can start directly.
Click here for the login: https://blogbar.eu/en/login
We wish you much success!
If you have any questions, please do not hesitate to contact us.
Your team from BlogBar""".format(contact_person=contact_person,
level=str(booked_package).upper(),
months=int(booked_package_duration_in_month),
expiration_date=expire_date,
token=token)
html = """\
<html>
<body style="font-family: Calibri;">
<p>Hello {contact_person},</p>
<p>Thank you for booking our premium package!</p>
<p>You can download the invoice using this link:<br>
<a href="https://blogbar.eu/invoice/{token}"> Rechnung hier herunterladen</a></p>
<p>Your package is activated. You can start directly.<p>
<p> Click <a href="https://blogbar.eu/en/login">here</a> for the login.</p>
<p>We wish you much success!<br>
If you have any questions, please do not hesitate to contact us.
</p>
<p>Your team from BlogBar</p>
</body>
{footer}
</html>
""".format(contact_person=contact_person,
level=str(booked_package).upper(),
months=int(booked_package_duration_in_month),
expiration_date=expire_date,
token=token, footer=footer)
# Turn these into plain/html MIMEText objects
part1 = MIMEText(text, "plain")
part2 = MIMEText(html, "html")
# Add HTML/plain-text parts to MIMEMultipart message
# The email client will try to render the last part first
message.attach(part1)
message.attach(part2)
context = ssl.create_default_context()
try:
with smtplib.SMTP_SSL("smtp.ionos.de", 465, context=context) as server:
server.login(sender_email, password)
server.sendmail(
sender_email, contact_email, message.as_string()
)
except:
# Log this in a logging script
print("ERROR OCCURED BY SENDING MAIL")
| 42.205092 | 197 | 0.536177 | 3,205 | 29,839 | 4.913885 | 0.153822 | 0.025144 | 0.021335 | 0.017271 | 0.786209 | 0.749698 | 0.727475 | 0.701886 | 0.671344 | 0.659216 | 0 | 0.012361 | 0.371025 | 29,839 | 706 | 198 | 42.264873 | 0.826726 | 0.038976 | 0 | 0.655856 | 0 | 0.066667 | 0.659484 | 0.026263 | 0 | 0 | 0 | 0.001416 | 0 | 1 | 0.016216 | false | 0.057658 | 0.010811 | 0 | 0.034234 | 0.012613 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
70da0e9f6246689fefc62ed115a202860dfffe89 | 46 | py | Python | pyleecan/Methods/Simulation/LUTdq/get_orders_dqh.py | thalesmaoa/pyleecan | c4fdc6362fdeba3d0766d5d1df3ff9c97c3f9fa3 | [
"Apache-2.0"
] | 1 | 2021-11-10T11:52:57.000Z | 2021-11-10T11:52:57.000Z | pyleecan/Methods/Simulation/LUTdq/get_orders_dqh.py | thalesmaoa/pyleecan | c4fdc6362fdeba3d0766d5d1df3ff9c97c3f9fa3 | [
"Apache-2.0"
] | null | null | null | pyleecan/Methods/Simulation/LUTdq/get_orders_dqh.py | thalesmaoa/pyleecan | c4fdc6362fdeba3d0766d5d1df3ff9c97c3f9fa3 | [
"Apache-2.0"
] | null | null | null | def get_orders_dqh(self):
pass
# TODO
| 11.5 | 25 | 0.630435 | 7 | 46 | 3.857143 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.282609 | 46 | 3 | 26 | 15.333333 | 0.818182 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 0 | 1 | 0.5 | false | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
cb2809d0e063785007abbea102979c281a58237a | 30 | py | Python | cosmosis/samplers/pmaxlike/__init__.py | ktanidis2/Modified_CosmoSIS_for_galaxy_number_count_angular_power_spectra | 07e5d308c6a8641a369a3e0b8d13c4104988cd2b | [
"BSD-2-Clause"
] | 2 | 2021-06-18T14:11:59.000Z | 2022-02-23T19:19:36.000Z | cosmosis/samplers/pmaxlike/__init__.py | ktanidis2/Modified_CosmoSIS_for_galaxy_number_count_angular_power_spectra | 07e5d308c6a8641a369a3e0b8d13c4104988cd2b | [
"BSD-2-Clause"
] | 2 | 2021-11-02T12:44:24.000Z | 2022-03-30T15:09:48.000Z | cosmosis/samplers/pmaxlike/__init__.py | ktanidis2/Modified_CosmoSIS_for_galaxy_number_count_angular_power_spectra | 07e5d308c6a8641a369a3e0b8d13c4104988cd2b | [
"BSD-2-Clause"
] | 2 | 2022-03-25T21:26:27.000Z | 2022-03-29T06:37:46.000Z | from . import pmaxlike_sampler | 30 | 30 | 0.866667 | 4 | 30 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 30 | 1 | 30 | 30 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cb3d5914ca0173ed8c5a48c6f03bc11c49c1b1cc | 46 | py | Python | service/espnet/test_model.py | sciai-ai/espnet-tts-serving | ff0cb885615ef130f67772d6f32f62dd23444ca6 | [
"BSD-3-Clause"
] | null | null | null | service/espnet/test_model.py | sciai-ai/espnet-tts-serving | ff0cb885615ef130f67772d6f32f62dd23444ca6 | [
"BSD-3-Clause"
] | null | null | null | service/espnet/test_model.py | sciai-ai/espnet-tts-serving | ff0cb885615ef130f67772d6f32f62dd23444ca6 | [
"BSD-3-Clause"
] | null | null | null | import io
from service.espnet import model
| 7.666667 | 32 | 0.782609 | 7 | 46 | 5.142857 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.195652 | 46 | 5 | 33 | 9.2 | 0.972973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cbbea6fcf4222531b20c0e295772d34f3b7d616e | 75 | py | Python | tests/test_some_stuff.py | tryexceptpass/githubapp | a34b3c91f1bcfe7c6ea11351c67867ac54dd44dc | [
"MIT"
] | 2 | 2020-06-18T13:36:07.000Z | 2020-12-11T21:10:42.000Z | tests/test_some_stuff.py | tryexceptpass/githubapp | a34b3c91f1bcfe7c6ea11351c67867ac54dd44dc | [
"MIT"
] | null | null | null | tests/test_some_stuff.py | tryexceptpass/githubapp | a34b3c91f1bcfe7c6ea11351c67867ac54dd44dc | [
"MIT"
] | 1 | 2019-11-16T10:48:30.000Z | 2019-11-16T10:48:30.000Z |
def test_stuff1():
assert True
def test_stuff2():
assert False
| 8.333333 | 18 | 0.653333 | 10 | 75 | 4.7 | 0.7 | 0.297872 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036364 | 0.266667 | 75 | 8 | 19 | 9.375 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
cbe11b73c74965fdb94fc2c5c9940703cc0e6641 | 579 | py | Python | python/94.py | kylekanos/project-euler-1 | af7089356a4cea90f8ef331cfdc65e696def6140 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | python/94.py | kylekanos/project-euler-1 | af7089356a4cea90f8ef331cfdc65e696def6140 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | python/94.py | kylekanos/project-euler-1 | af7089356a4cea90f8ef331cfdc65e696def6140 | [
"BSD-2-Clause-FreeBSD"
] | 1 | 2019-09-17T00:55:58.000Z | 2019-09-17T00:55:58.000Z | #!/usr/bin/env python
# Project Euler 94: almost equilateral triangles (a, a, a +/- 1) with integral
# side lengths and integral area, and perimeter <= 10**9.
# Note: I used the quadratic diophantine equation solver for this one; the
# (p, q, k, r, s, l) constants drive the recurrence that steps through
# successive solutions of the underlying Pell-like equation.
import math

# First family: sides (x, x - 1, x - 1), perimeter 3x - 2. Heron's formula
# gives 16 * Area^2 = x**2 * (3*x**2 - 8*x + 4), so the area is integral
# exactly when that quantity is a perfect square divisible by 16.
p = -2
q = -1
k = 4
r = -3
s = -2
l = 4
x = 2
y = 0
total = 0
while True:
    peri = 3 * x - 2
    if peri > 10**9:
        break
    eqn = x * x * (3 * x * x - 8 * x + 4)
    # math.isqrt keeps the square test exact; eqn exceeds float precision here
    if x > 2 and eqn % 16 == 0 and math.isqrt(eqn) ** 2 == eqn:
        total += peri
    t = x
    x = p * x + q * y + k
    y = r * t + s * y + l

# Second family: sides (x, x + 1, x + 1), perimeter 3x + 2, with
# 16 * Area^2 = x**2 * (3*x**2 + 8*x + 4).
p = -2
q = -1
k = -4
r = -3
s = -2
l = -4
x = -2
y = 0
while True:
    peri = 3 * x + 2
    if peri > 10**9:
        break
    eqn = x * x * (3 * x * x + 8 * x + 4)
    if x > 1 and eqn % 16 == 0 and math.isqrt(eqn) ** 2 == eqn:
        total += peri
    t = x
    x = p * x + q * y + k
    y = r * t + s * y + l

print(total)
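# Cross-check sketch (an addition for illustration, not part of the original
# solution): brute-force the same count for a small perimeter bound straight
# from Heron's formula, to sanity-check the Pell-recurrence walk above. It
# reuses the `math` import added earlier.
def brute_force_total(limit):
    total = 0
    a = 2
    while 3 * a - 2 <= limit:
        for b in (a - 1, a + 1):  # sides (a, b, b), almost equilateral
            peri = a + 2 * b
            if peri > limit:
                continue
            sixteen_area_sq = a * a * (4 * b * b - a * a)  # 16 * Area^2
            root = math.isqrt(max(sixteen_area_sq, 0))
            if sixteen_area_sq > 0 and sixteen_area_sq % 16 == 0 and root * root == sixteen_area_sq:
                total += peri
        a += 1
    return total

# Below perimeter 100 the only qualifying triangles are (5, 5, 6) and
# (16, 17, 17), so the brute force should agree with 16 + 50 = 66.
assert brute_force_total(100) == 66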
| 12.319149 | 64 | 0.519862 | 149 | 579 | 2.020134 | 0.295302 | 0.039867 | 0.019934 | 0.026578 | 0.707641 | 0.707641 | 0.707641 | 0.707641 | 0.707641 | 0.707641 | 0 | 0.10514 | 0.260794 | 579 | 46 | 65 | 12.586957 | 0.598131 | 0.150259 | 0 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.029412 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1dcfb1ad9aed1d370ff05312ec7c41b0f4420506 | 1,585 | py | Python | care/facility/migrations/0178_auto_20200916_1534.py | gigincg/care | 07be6a7982b5c46a854e3435a52662f32800c8ae | [
"MIT"
] | 189 | 2020-03-17T17:18:58.000Z | 2022-02-22T09:49:45.000Z | care/facility/migrations/0178_auto_20200916_1534.py | gigincg/care | 07be6a7982b5c46a854e3435a52662f32800c8ae | [
"MIT"
] | 598 | 2020-03-19T21:22:09.000Z | 2022-03-30T05:08:37.000Z | care/facility/migrations/0178_auto_20200916_1534.py | gigincg/care | 07be6a7982b5c46a854e3435a52662f32800c8ae | [
"MIT"
] | 159 | 2020-03-19T18:45:56.000Z | 2022-03-17T13:23:12.000Z | # Generated by Django 2.2.11 on 2020-09-16 10:04
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('facility', '0177_auto_20200916_1448'),
]
operations = [
migrations.AddField(
model_name='historicalpatientregistration',
name='date_of_test',
field=models.DateTimeField(blank=True, null=True, verbose_name="Patient's test Date"),
),
migrations.AddField(
model_name='historicalpatientregistration',
name='srf_id',
field=models.CharField(blank=True, default='', max_length=200),
),
migrations.AddField(
model_name='historicalpatientregistration',
name='test_type',
field=models.IntegerField(choices=[(10, 'UNK'), (20, 'ANTIGEN'), (30, 'RTPCR'), (40, 'CBNAT'), (50, 'TRUNAT')], default=10),
),
migrations.AddField(
model_name='patientregistration',
name='date_of_test',
field=models.DateTimeField(blank=True, null=True, verbose_name="Patient's test Date"),
),
migrations.AddField(
model_name='patientregistration',
name='srf_id',
field=models.CharField(blank=True, default='', max_length=200),
),
migrations.AddField(
model_name='patientregistration',
name='test_type',
field=models.IntegerField(choices=[(10, 'UNK'), (20, 'ANTIGEN'), (30, 'RTPCR'), (40, 'CBNAT'), (50, 'TRUNAT')], default=10),
),
]
| 36.022727 | 136 | 0.587382 | 155 | 1,585 | 5.870968 | 0.387097 | 0.118681 | 0.151648 | 0.178022 | 0.832967 | 0.832967 | 0.615385 | 0.615385 | 0.615385 | 0.615385 | 0 | 0.053587 | 0.270032 | 1,585 | 43 | 137 | 36.860465 | 0.73293 | 0.029022 | 0 | 0.810811 | 1 | 0 | 0.207547 | 0.071568 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027027 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
38295d04042da069a5e36c93cf515933f7fbad6a | 31 | py | Python | papertrack/__init__.py | fantastic001/papertrack | f5a13ccabf1836a60020eb65e2e998b6ba33d8bd | [
"MIT"
] | null | null | null | papertrack/__init__.py | fantastic001/papertrack | f5a13ccabf1836a60020eb65e2e998b6ba33d8bd | [
"MIT"
] | null | null | null | papertrack/__init__.py | fantastic001/papertrack | f5a13ccabf1836a60020eb65e2e998b6ba33d8bd | [
"MIT"
] | null | null | null |
from papertrack.core import * | 15.5 | 30 | 0.774194 | 4 | 31 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16129 | 31 | 2 | 30 | 15.5 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
69b5ddaae3c792eb6d535661dd65a26a5c8f344a | 10,140 | py | Python | test_autoarray/instruments/test_euclid.py | jonathanfrawley/PyAutoArray_copy | c21e8859bdb20737352147b9904797ac99985b73 | [
"MIT"
] | 5 | 2019-09-26T02:18:25.000Z | 2021-12-11T16:29:20.000Z | test_autoarray/instruments/test_euclid.py | jonathanfrawley/PyAutoArray_copy | c21e8859bdb20737352147b9904797ac99985b73 | [
"MIT"
] | 3 | 2020-03-30T14:25:57.000Z | 2021-12-21T17:10:55.000Z | test_autoarray/instruments/test_euclid.py | jonathanfrawley/PyAutoArray_copy | c21e8859bdb20737352147b9904797ac99985b73 | [
"MIT"
] | 4 | 2020-03-03T11:35:41.000Z | 2022-01-21T17:37:35.000Z | import os
import numpy as np
import autoarray as aa
path = "{}/".format(os.path.dirname(os.path.realpath(__file__)))
class TestArray2DEuclid:
def test__euclid_array_for_four_quandrants__loads_data_and_dimensions(
self, euclid_data
):
euclid_array = aa.euclid.Array2DEuclid.top_left(array_electrons=euclid_data)
assert euclid_array.shape_native == (2086, 2128)
assert (euclid_array.native == np.zeros((2086, 2128))).all()
euclid_array = aa.euclid.Array2DEuclid.top_right(array_electrons=euclid_data)
assert euclid_array.shape_native == (2086, 2128)
assert (euclid_array.native == np.zeros((2086, 2128))).all()
euclid_array = aa.euclid.Array2DEuclid.bottom_left(array_electrons=euclid_data)
assert euclid_array.shape_native == (2086, 2128)
assert (euclid_array.native == np.zeros((2086, 2128))).all()
euclid_array = aa.euclid.Array2DEuclid.bottom_right(array_electrons=euclid_data)
assert euclid_array.shape_native == (2086, 2128)
assert (euclid_array.native == np.zeros((2086, 2128))).all()
class TestLayout2DEuclid:
def test__euclid_layout_for_four_quandrants__loads_data_and_dimensions(
self, euclid_data
):
layout = aa.euclid.Layout2DEuclid.top_left(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=51,
serial_overscan_size=29,
parallel_overscan_size=20,
)
assert layout.original_roe_corner == (0, 0)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2066, 2086, 51, 2099)
assert layout.serial_prescan == (0, 2086, 0, 51)
assert layout.serial_overscan == (20, 2086, 2099, 2128)
layout = aa.euclid.Layout2DEuclid.top_left(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=41,
serial_overscan_size=10,
parallel_overscan_size=15,
)
assert layout.original_roe_corner == (0, 0)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2071, 2086, 41, 2118)
assert layout.serial_prescan == (0, 2086, 0, 41)
assert layout.serial_overscan == (15, 2086, 2118, 2128)
layout = aa.euclid.Layout2DEuclid.top_right(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=51,
serial_overscan_size=29,
parallel_overscan_size=20,
)
assert layout.original_roe_corner == (0, 1)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2066, 2086, 51, 2099)
assert layout.serial_prescan == (0, 2086, 0, 51)
assert layout.serial_overscan == (20, 2086, 2099, 2128)
layout = aa.euclid.Layout2DEuclid.top_right(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=41,
serial_overscan_size=10,
parallel_overscan_size=15,
)
assert layout.original_roe_corner == (0, 1)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2071, 2086, 41, 2118)
assert layout.serial_prescan == (0, 2086, 0, 41)
assert layout.serial_overscan == (15, 2086, 2118, 2128)
layout = aa.euclid.Layout2DEuclid.bottom_left(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=51,
serial_overscan_size=29,
parallel_overscan_size=20,
)
assert layout.original_roe_corner == (1, 0)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2066, 2086, 51, 2099)
assert layout.serial_prescan == (0, 2086, 0, 51)
assert layout.serial_overscan == (0, 2066, 2099, 2128)
layout = aa.euclid.Layout2DEuclid.bottom_left(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=41,
serial_overscan_size=10,
parallel_overscan_size=15,
)
assert layout.original_roe_corner == (1, 0)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2071, 2086, 41, 2118)
assert layout.serial_prescan == (0, 2086, 0, 41)
assert layout.serial_overscan == (0, 2071, 2118, 2128)
layout = aa.euclid.Layout2DEuclid.bottom_right(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=51,
serial_overscan_size=29,
parallel_overscan_size=20,
)
assert layout.original_roe_corner == (1, 1)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2066, 2086, 51, 2099)
assert layout.serial_prescan == (0, 2086, 0, 51)
assert layout.serial_overscan == (0, 2066, 2099, 2128)
layout = aa.euclid.Layout2DEuclid.bottom_right(
parallel_size=2086,
serial_size=2128,
serial_prescan_size=41,
serial_overscan_size=10,
parallel_overscan_size=15,
)
assert layout.original_roe_corner == (1, 1)
assert layout.shape_2d == (2086, 2128)
assert layout.parallel_overscan == (2071, 2086, 41, 2118)
assert layout.serial_prescan == (0, 2086, 0, 41)
assert layout.serial_overscan == (0, 2071, 2118, 2128)
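# Interpretative note (added here; inferred from the assertions above, not from
# the Euclid VIS documentation): after each quadrant is rotated to a common
# orientation, the serial prescan occupies the first serial_prescan_size
# columns at full height, the serial overscan the last serial_overscan_size
# columns, and the parallel overscan the top parallel_overscan_size rows
# between them; only the serial overscan's row extent changes with the
# original roe corner.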
def test__left_side__chooses_correct_layout_given_input(self, euclid_data):
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text1", quadrant_id="E"
)
assert layout.original_roe_corner == (1, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text2", quadrant_id="E"
)
assert layout.original_roe_corner == (1, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text3", quadrant_id="E"
)
assert layout.original_roe_corner == (1, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text1", quadrant_id="F"
)
assert layout.original_roe_corner == (1, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text2", quadrant_id="F"
)
assert layout.original_roe_corner == (1, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text3", quadrant_id="F"
)
assert layout.original_roe_corner == (1, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text1", quadrant_id="G"
)
assert layout.original_roe_corner == (0, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text2", quadrant_id="G"
)
assert layout.original_roe_corner == (0, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text3", quadrant_id="G"
)
assert layout.original_roe_corner == (0, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text1", quadrant_id="H"
)
assert layout.original_roe_corner == (0, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text2", quadrant_id="H"
)
assert layout.original_roe_corner == (0, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text3", quadrant_id="H"
)
assert layout.original_roe_corner == (0, 0)
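        # For the left-side CCDs above, quadrants E/F resolve to the bottom
        # row of ROE corners and G/H to the top row; the next test checks the
        # mirrored mapping for the right-side CCDs.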
def test__right_side__chooses_correct_layout_given_input(self, euclid_data):
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text4", quadrant_id="E"
)
assert layout.original_roe_corner == (0, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text5", quadrant_id="E"
)
assert layout.original_roe_corner == (0, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text6", quadrant_id="E"
)
assert layout.original_roe_corner == (0, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text4", quadrant_id="F"
)
assert layout.original_roe_corner == (0, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text5", quadrant_id="F"
)
assert layout.original_roe_corner == (0, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text6", quadrant_id="F"
)
assert layout.original_roe_corner == (0, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text4", quadrant_id="G"
)
assert layout.original_roe_corner == (1, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text5", quadrant_id="G"
)
assert layout.original_roe_corner == (1, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text6", quadrant_id="G"
)
assert layout.original_roe_corner == (1, 0)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text4", quadrant_id="H"
)
assert layout.original_roe_corner == (1, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text5", quadrant_id="H"
)
assert layout.original_roe_corner == (1, 1)
layout = aa.euclid.Layout2DEuclid.from_ccd_and_quadrant_id(
ccd_id="text6", quadrant_id="H"
)
assert layout.original_roe_corner == (1, 1)
| 33.913043 | 89 | 0.606509 | 1,199 | 10,140 | 4.805671 | 0.06839 | 0.133287 | 0.077751 | 0.155502 | 0.967372 | 0.967372 | 0.960778 | 0.960778 | 0.960778 | 0.960778 | 0 | 0.091835 | 0.293393 | 10,140 | 298 | 90 | 34.026846 | 0.712352 | 0 | 0 | 0.678899 | 0 | 0 | 0.014936 | 0 | 0 | 0 | 0 | 0 | 0.330275 | 1 | 0.018349 | false | 0 | 0.013761 | 0 | 0.041284 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0e01bdcda641bada1356037c6a78bd854d14c335 | 4,925 | py | Python | houdini/houdini_client/forms.py | TrianglePlusPlus/houdini | 292b1fb395fc34dbefa8f891cc94bb811f5805bb | [
"MIT"
] | 2 | 2017-09-25T00:30:22.000Z | 2021-02-04T22:11:54.000Z | houdini/houdini_client/forms.py | TrianglePlusPlus/houdini | 292b1fb395fc34dbefa8f891cc94bb811f5805bb | [
"MIT"
] | 11 | 2016-12-29T22:05:57.000Z | 2020-06-05T17:23:10.000Z | houdini/houdini_client/forms.py | TrianglePlusPlus/houdini | 292b1fb395fc34dbefa8f891cc94bb811f5805bb | [
"MIT"
] | null | null | null | from django import forms
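# Plain Django forms for the client auth flows; each form's __init__ attaches
# Bootstrap's "form-control" class and a placeholder to its widgets.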
class LoginForm(forms.Form):
email = forms.CharField(label='email', max_length=100, widget=forms.EmailInput())
password = forms.CharField(label='password', max_length=100, widget=forms.PasswordInput())
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields['email'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Email'})
self.fields['password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Password'})
class RegisterForm(forms.Form):
first_name = forms.CharField(label='first name', max_length=32)
middle_name = forms.CharField(label='middle name', max_length=32, required=False)
last_name = forms.CharField(label='last name', max_length=32)
email = forms.CharField(label='email', max_length=100, widget=forms.EmailInput())
confirm_email = forms.CharField(label='confirm email', max_length=100, widget=forms.EmailInput())
password = forms.CharField(label='password', max_length=100, widget=forms.PasswordInput())
confirm_password = forms.CharField(label='confirm password', max_length=100, widget=forms.PasswordInput())
def clean(self):
email = self.cleaned_data.get('email')
confirm_email = self.cleaned_data.get('confirm_email')
if email and email != confirm_email:
self.add_error('confirm_email', "Emails don't match")
password = self.cleaned_data.get('password')
confirm_password = self.cleaned_data.get('confirm_password')
if password and password != confirm_password:
self.add_error('confirm_password', "Passwords don't match")
return self.cleaned_data
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields['first_name'].widget.attrs.update({'class': 'form-control', 'placeholder': 'First Name'})
self.fields['middle_name'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Middle Name'})
self.fields['last_name'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Last Name'})
self.fields['email'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Email'})
self.fields['confirm_email'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Confirm Email'})
self.fields['password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Password'})
self.fields['confirm_password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Confirm Password'})
class PasswordChangeForm(forms.Form):
password = forms.CharField(label='current password', max_length=100, widget=forms.PasswordInput())
new_password = forms.CharField(label='new password', max_length=100, widget=forms.PasswordInput())
confirm_new_password = forms.CharField(label='confirm new password', max_length=100, widget=forms.PasswordInput())
def clean(self):
new_password = self.cleaned_data.get('new_password')
confirm_new_password = self.cleaned_data.get('confirm_new_password')
if new_password and new_password != confirm_new_password:
self.add_error('confirm_new_password', "Passwords don't match")
return self.cleaned_data
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields['password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Password'})
self.fields['new_password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'New Password'})
self.fields['confirm_new_password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Confirm New Password'})
class PasswordResetForm(forms.Form):
email = forms.CharField(label='email', max_length=100, widget=forms.EmailInput())
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields['email'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Email'})
class PasswordSetForm(forms.Form):
new_password = forms.CharField(label='new password', max_length=100, widget=forms.PasswordInput())
confirm_new_password = forms.CharField(label='confirm new password', max_length=100, widget=forms.PasswordInput())
def clean(self):
new_password = self.cleaned_data.get('new_password')
confirm_new_password = self.cleaned_data.get('confirm_new_password')
if new_password and new_password != confirm_new_password:
self.add_error('confirm_new_password', "Passwords don't match")
return self.cleaned_data
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields['new_password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'New Password'})
self.fields['confirm_new_password'].widget.attrs.update({'class': 'form-control', 'placeholder': 'Confirm New Password'})
| 54.722222 | 129 | 0.696447 | 590 | 4,925 | 5.60339 | 0.094915 | 0.106473 | 0.087114 | 0.099819 | 0.821234 | 0.778887 | 0.768905 | 0.755596 | 0.692982 | 0.675136 | 0 | 0.009981 | 0.145584 | 4,925 | 89 | 130 | 55.337079 | 0.775665 | 0 | 0 | 0.632353 | 0 | 0 | 0.237563 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0.455882 | 0.014706 | 0 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
38631bfe78c237d24c860462f8c70950abffa309 | 213 | py | Python | async_scheduler/exceptions.py | mr55p-dev/python-async-scheduler | c27004fe2603d264223969f39f996da3e22f8443 | [
"MIT"
] | 1 | 2022-01-21T12:52:38.000Z | 2022-01-21T12:52:38.000Z | async_scheduler/exceptions.py | mr55p-dev/python-async-scheduler | c27004fe2603d264223969f39f996da3e22f8443 | [
"MIT"
] | null | null | null | async_scheduler/exceptions.py | mr55p-dev/python-async-scheduler | c27004fe2603d264223969f39f996da3e22f8443 | [
"MIT"
] | null | null | null | class SchedulerRunningError(BaseException):
pass
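# All of the scheduler's exceptions are stateless markers distinguished only
# by their type.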
class SchedulerExecutionError(TypeError):
pass
class DuplicateUserError(BaseException):
pass
class PrototypeFunctionError(BaseException):
pass | 15.214286 | 44 | 0.788732 | 16 | 213 | 10.5 | 0.5 | 0.303571 | 0.261905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15493 | 213 | 14 | 45 | 15.214286 | 0.933333 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
38aae207c415593e0998f01044c15f7d86b758b8 | 1,341 | py | Python | ad_api/sp_products/neg_targeting.py | 854350999/python-amazon-ad-api | f7deb695b539296260c6558529b4be39a426967e | [
"MIT"
] | null | null | null | ad_api/sp_products/neg_targeting.py | 854350999/python-amazon-ad-api | f7deb695b539296260c6558529b4be39a426967e | [
"MIT"
] | null | null | null | ad_api/sp_products/neg_targeting.py | 854350999/python-amazon-ad-api | f7deb695b539296260c6558529b4be39a426967e | [
"MIT"
] | null | null | null |
from ..client import Client
class NegProductTargets(Client):
def create_neg_targets(self, params):
self.uri_path = "/v2/sp/negativeTargets"
self.method = "post"
self.data = params
return self.execute()
def update_neg_targets(self, params):
self.uri_path = "/v2/sp/negativeTargets"
self.method = "put"
self.data = params
return self.execute()
def get_neg_targets(self, params):
self.method = "get"
self.uri_path = "/v2/sp/negativeTargets"
self.data = params
return self.execute()
def get_neg_targets_by_id(self, target_id):
self.method = "get"
self.uri_path = "/v2/sp/negativeTargets/{}".format(target_id)
return self.execute()
def delete_neg_targets_by_id(self, target_id):
self.uri_path = "/v2/sp/negativeTargets/{}".format(target_id)
self.method = "delete"
return self.execute()
def get_neg_targets_extended(self, params):
self.method = "get"
self.uri_path = "/v2/sp/negativeTargets/extended"
self.data = params
return self.execute()
def get_neg_targets_extended_by_id(self, target_id):
self.method = "get"
self.uri_path = "/v2/sp/negativeTargets/extended/{}".format(target_id)
return self.execute()
| 29.152174 | 78 | 0.632364 | 167 | 1,341 | 4.868263 | 0.179641 | 0.086101 | 0.094711 | 0.111931 | 0.880689 | 0.880689 | 0.821648 | 0.779828 | 0.687577 | 0.616236 | 0 | 0.006951 | 0.249068 | 1,341 | 45 | 79 | 29.8 | 0.800397 | 0 | 0 | 0.588235 | 0 | 0 | 0.153846 | 0.135176 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205882 | false | 0 | 0.029412 | 0 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
38c0fcc97d6adcbd841473b6f9cada85d6805be6 | 35 | py | Python | brewpi/recipes/models/__init__.py | trottmpq/brewpi | 5090f32d30e27cafe1594beba7bb8cd0aac8bfc6 | [
"MIT"
] | null | null | null | brewpi/recipes/models/__init__.py | trottmpq/brewpi | 5090f32d30e27cafe1594beba7bb8cd0aac8bfc6 | [
"MIT"
] | 9 | 2020-11-14T18:27:41.000Z | 2022-02-20T18:30:47.000Z | brewpi/recipes/models/__init__.py | trottmpq/brewpi | 5090f32d30e27cafe1594beba7bb8cd0aac8bfc6 | [
"MIT"
] | null | null | null | from .recipe import Recipe # noqa
| 17.5 | 34 | 0.742857 | 5 | 35 | 5.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 35 | 1 | 35 | 35 | 0.928571 | 0.114286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2a25f25e0a28e7a4ee4f470629b55fd00de911a5 | 90 | py | Python | pyntcloud/geometry/__init__.py | bernssolg/pyntcloud-master | 84cf000b7a7f69a2c1b36f9624f05f65160bf992 | [
"MIT"
] | 1,142 | 2016-10-10T08:55:30.000Z | 2022-03-30T04:46:16.000Z | pyntcloud/geometry/__init__.py | bernssolg/pyntcloud-master | 84cf000b7a7f69a2c1b36f9624f05f65160bf992 | [
"MIT"
] | 195 | 2016-10-10T08:30:37.000Z | 2022-02-17T12:51:17.000Z | pyntcloud/geometry/__init__.py | bernssolg/pyntcloud-master | 84cf000b7a7f69a2c1b36f9624f05f65160bf992 | [
"MIT"
] | 215 | 2017-02-28T00:50:29.000Z | 2022-03-22T17:01:31.000Z |
"""
HAKUNA MATATA
"""
from .models.plane import Plane
from .models.sphere import Sphere
| 11.25 | 33 | 0.733333 | 12 | 90 | 5.5 | 0.583333 | 0.30303 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155556 | 90 | 7 | 34 | 12.857143 | 0.868421 | 0.144444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2a968a5ce77433e32f974f19fbff0ed5dd4298b4 | 121 | py | Python | examples/basic/project.py | furious-luke/polecat | 7be5110f76dc42b15c922c1bb7d49220e916246d | [
"MIT"
] | 4 | 2019-08-10T12:56:12.000Z | 2020-01-21T09:51:20.000Z | examples/basic/project.py | furious-luke/polecat | 7be5110f76dc42b15c922c1bb7d49220e916246d | [
"MIT"
] | 71 | 2019-04-09T05:39:21.000Z | 2020-05-16T23:09:24.000Z | examples/basic/project.py | furious-luke/polecat | 7be5110f76dc42b15c922c1bb7d49220e916246d | [
"MIT"
] | null | null | null | from polecat.project import Project as BaseProject
from .models import * # noqa
class Project(BaseProject):
pass
| 15.125 | 50 | 0.752066 | 15 | 121 | 6.066667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190083 | 121 | 7 | 51 | 17.285714 | 0.928571 | 0.033058 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
aa775443356ac016e62e362a706f1b6fed62d454 | 96 | py | Python | common/code/snippets/ctf/misc/byte_xor.py | nevesnunes/env | 7a5e3816334337e04a87e1a2e4dc322215901744 | [
"MIT"
] | 4 | 2020-04-07T14:45:02.000Z | 2021-12-28T22:43:16.000Z | common/code/snippets/ctf/misc/byte_xor.py | nevesnunes/env | 7a5e3816334337e04a87e1a2e4dc322215901744 | [
"MIT"
] | null | null | null | common/code/snippets/ctf/misc/byte_xor.py | nevesnunes/env | 7a5e3816334337e04a87e1a2e4dc322215901744 | [
"MIT"
] | 2 | 2020-04-08T03:12:06.000Z | 2021-03-04T20:33:03.000Z | #!/usr/bin/env python3
print("".join([chr(0x5f ^ x) for x in bytearray(b"\x41\x41\x41\x41")]))
| 24 | 71 | 0.635417 | 18 | 96 | 3.388889 | 0.777778 | 0.295082 | 0.295082 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127907 | 0.104167 | 96 | 3 | 72 | 32 | 0.581395 | 0.21875 | 0 | 0 | 0 | 0 | 0.216216 | 0 | 0 | 0 | 0.054054 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
aa7813201cfc04b332cb83c10a6eaea2dd3d7ead | 30,241 | py | Python | pybind/slxos/v16r_1_00b/policy_map/class_/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/policy_map/class_/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/policy_map/class_/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import police
import set_
import span
import map_
import shape
import scheduler
import priority_mapping_table
class class_(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-qos-mqc - based on the path /policy-map/class. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__cl_name','__police','__set_','__span','__map_','__sflow_profile','__shape','__scheduler','__priority_mapping_table',)
_yang_name = 'class'
_rest_name = 'class'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__span = YANGDynClass(base=span.span, is_container='container', presence=False, yang_name="span", rest_name="span", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'span sesion <id>'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
self.__scheduler = YANGDynClass(base=scheduler.scheduler, is_container='container', presence=False, yang_name="scheduler", rest_name="scheduler", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Traffic Class Scheduler', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
self.__sflow_profile = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="sflow-profile", rest_name="sflow-profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'Apply sflow profile'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='sflow:profile-name-type', is_config=True)
self.__priority_mapping_table = YANGDynClass(base=priority_mapping_table.priority_mapping_table, is_container='container', presence=False, yang_name="priority-mapping-table", rest_name="priority-mapping-table", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure cee priority mapping table', u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
self.__shape = YANGDynClass(base=shape.shape, is_container='container', presence=False, yang_name="shape", rest_name="shape", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Shaping rate', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
self.__cl_name = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="cl-name", rest_name="cl-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Policy Map Class Name (Max Size -64)'}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='map-name-type', is_config=True)
self.__police = YANGDynClass(base=police.police, is_container='container', presence=False, yang_name="police", rest_name="police", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-compact-syntax': None, u'info': u'Policy Map Class Police Instance', u'cli-sequence-commands': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
self.__set_ = YANGDynClass(base=set_.set_, is_container='container', presence=False, yang_name="set", rest_name="set", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'set cos,traffic-class or dscp value', u'cli-compact-syntax': None, u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
self.__map_ = YANGDynClass(base=map_.map_, is_container='container', presence=False, yang_name="map", rest_name="map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure QoS Map', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'policy-map', u'class']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'policy-map', u'class']
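  # Generated accessor pattern: each YANG child below gets a matching
  # _get_<name>/_set_<name>/_unset_<name> triple, which is wired into a
  # read/write property at the bottom of the class.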
def _get_cl_name(self):
"""
Getter method for cl_name, mapped from YANG variable /policy_map/class/cl_name (map-name-type)
"""
return self.__cl_name
def _set_cl_name(self, v, load=False):
"""
Setter method for cl_name, mapped from YANG variable /policy_map/class/cl_name (map-name-type)
If this variable is read-only (config: false) in the
source YANG file, then _set_cl_name is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cl_name() directly.
"""
parent = getattr(self, "_parent", None)
if parent is not None and load is False:
raise AttributeError("Cannot set keys directly when" +
" within an instantiated list")
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="cl-name", rest_name="cl-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Policy Map Class Name (Max Size -64)'}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='map-name-type', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cl_name must be of a type compatible with map-name-type""",
'defined-type': "brocade-qos-mqc:map-name-type",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="cl-name", rest_name="cl-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Policy Map Class Name (Max Size -64)'}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='map-name-type', is_config=True)""",
})
self.__cl_name = t
if hasattr(self, '_set'):
self._set()
def _unset_cl_name(self):
self.__cl_name = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="cl-name", rest_name="cl-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Policy Map Class Name (Max Size -64)'}}, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='map-name-type', is_config=True)
def _get_police(self):
"""
Getter method for police, mapped from YANG variable /policy_map/class/police (container)
"""
return self.__police
def _set_police(self, v, load=False):
"""
Setter method for police, mapped from YANG variable /policy_map/class/police (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_police is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_police() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=police.police, is_container='container', presence=False, yang_name="police", rest_name="police", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-compact-syntax': None, u'info': u'Policy Map Class Police Instance', u'cli-sequence-commands': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """police must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=police.police, is_container='container', presence=False, yang_name="police", rest_name="police", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-compact-syntax': None, u'info': u'Policy Map Class Police Instance', u'cli-sequence-commands': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)""",
})
self.__police = t
if hasattr(self, '_set'):
self._set()
def _unset_police(self):
self.__police = YANGDynClass(base=police.police, is_container='container', presence=False, yang_name="police", rest_name="police", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-compact-syntax': None, u'info': u'Policy Map Class Police Instance', u'cli-sequence-commands': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
def _get_set_(self):
"""
Getter method for set_, mapped from YANG variable /policy_map/class/set (container)
"""
return self.__set_
def _set_set_(self, v, load=False):
"""
Setter method for set_, mapped from YANG variable /policy_map/class/set (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_set_ is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_set_() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=set_.set_, is_container='container', presence=False, yang_name="set", rest_name="set", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'set cos,traffic-class or dscp value', u'cli-compact-syntax': None, u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """set_ must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=set_.set_, is_container='container', presence=False, yang_name="set", rest_name="set", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'set cos,traffic-class or dscp value', u'cli-compact-syntax': None, u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)""",
})
self.__set_ = t
if hasattr(self, '_set'):
self._set()
def _unset_set_(self):
self.__set_ = YANGDynClass(base=set_.set_, is_container='container', presence=False, yang_name="set", rest_name="set", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'set cos,traffic-class or dscp value', u'cli-compact-syntax': None, u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
def _get_span(self):
"""
Getter method for span, mapped from YANG variable /policy_map/class/span (container)
YANG Description: span sesion <id>
"""
return self.__span
def _set_span(self, v, load=False):
"""
Setter method for span, mapped from YANG variable /policy_map/class/span (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_span is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_span() directly.
YANG Description: span sesion <id>
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=span.span, is_container='container', presence=False, yang_name="span", rest_name="span", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'span sesion <id>'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """span must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=span.span, is_container='container', presence=False, yang_name="span", rest_name="span", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'span sesion <id>'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)""",
})
self.__span = t
if hasattr(self, '_set'):
self._set()
def _unset_span(self):
self.__span = YANGDynClass(base=span.span, is_container='container', presence=False, yang_name="span", rest_name="span", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'span sesion <id>'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
def _get_map_(self):
"""
Getter method for map_, mapped from YANG variable /policy_map/class/map (container)
"""
return self.__map_
def _set_map_(self, v, load=False):
"""
Setter method for map_, mapped from YANG variable /policy_map/class/map (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_map_ is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_map_() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=map_.map_, is_container='container', presence=False, yang_name="map", rest_name="map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure QoS Map', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """map_ must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=map_.map_, is_container='container', presence=False, yang_name="map", rest_name="map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure QoS Map', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)""",
})
self.__map_ = t
if hasattr(self, '_set'):
self._set()
def _unset_map_(self):
self.__map_ = YANGDynClass(base=map_.map_, is_container='container', presence=False, yang_name="map", rest_name="map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure QoS Map', u'cli-incomplete-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
def _get_sflow_profile(self):
"""
Getter method for sflow_profile, mapped from YANG variable /policy_map/class/sflow_profile (sflow:profile-name-type)
YANG Description: This applies sflow profile.
"""
return self.__sflow_profile
def _set_sflow_profile(self, v, load=False):
"""
Setter method for sflow_profile, mapped from YANG variable /policy_map/class/sflow_profile (sflow:profile-name-type)
If this variable is read-only (config: false) in the
source YANG file, then _set_sflow_profile is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_sflow_profile() directly.
YANG Description: This applies sflow profile.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="sflow-profile", rest_name="sflow-profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'Apply sflow profile'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='sflow:profile-name-type', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """sflow_profile must be of a type compatible with sflow:profile-name-type""",
'defined-type': "sflow:profile-name-type",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="sflow-profile", rest_name="sflow-profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'Apply sflow profile'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='sflow:profile-name-type', is_config=True)""",
})
self.__sflow_profile = t
if hasattr(self, '_set'):
self._set()
def _unset_sflow_profile(self):
self.__sflow_profile = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'[a-zA-Z]{1}([-a-zA-Z0-9_]{0,63})'}), is_leaf=True, yang_name="sflow-profile", rest_name="sflow-profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'Apply sflow profile'}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='sflow:profile-name-type', is_config=True)
def _get_shape(self):
"""
Getter method for shape, mapped from YANG variable /policy_map/class/shape (container)
"""
return self.__shape
def _set_shape(self, v, load=False):
"""
Setter method for shape, mapped from YANG variable /policy_map/class/shape (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_shape is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_shape() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=shape.shape, is_container='container', presence=False, yang_name="shape", rest_name="shape", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Shaping rate', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """shape must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=shape.shape, is_container='container', presence=False, yang_name="shape", rest_name="shape", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Shaping rate', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)""",
})
self.__shape = t
if hasattr(self, '_set'):
self._set()
def _unset_shape(self):
self.__shape = YANGDynClass(base=shape.shape, is_container='container', presence=False, yang_name="shape", rest_name="shape", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Shaping rate', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
def _get_scheduler(self):
"""
Getter method for scheduler, mapped from YANG variable /policy_map/class/scheduler (container)
"""
return self.__scheduler
def _set_scheduler(self, v, load=False):
"""
Setter method for scheduler, mapped from YANG variable /policy_map/class/scheduler (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_scheduler is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_scheduler() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=scheduler.scheduler, is_container='container', presence=False, yang_name="scheduler", rest_name="scheduler", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Traffic Class Scheduler', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """scheduler must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=scheduler.scheduler, is_container='container', presence=False, yang_name="scheduler", rest_name="scheduler", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Traffic Class Scheduler', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)""",
})
self.__scheduler = t
if hasattr(self, '_set'):
self._set()
def _unset_scheduler(self):
self.__scheduler = YANGDynClass(base=scheduler.scheduler, is_container='container', presence=False, yang_name="scheduler", rest_name="scheduler", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure Traffic Class Scheduler', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
def _get_priority_mapping_table(self):
"""
Getter method for priority_mapping_table, mapped from YANG variable /policy_map/class/priority_mapping_table (container)
"""
return self.__priority_mapping_table
def _set_priority_mapping_table(self, v, load=False):
"""
Setter method for priority_mapping_table, mapped from YANG variable /policy_map/class/priority_mapping_table (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_priority_mapping_table is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_priority_mapping_table() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=priority_mapping_table.priority_mapping_table, is_container='container', presence=False, yang_name="priority-mapping-table", rest_name="priority-mapping-table", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure cee priority mapping table', u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """priority_mapping_table must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=priority_mapping_table.priority_mapping_table, is_container='container', presence=False, yang_name="priority-mapping-table", rest_name="priority-mapping-table", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure cee priority mapping table', u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)""",
})
self.__priority_mapping_table = t
if hasattr(self, '_set'):
self._set()
def _unset_priority_mapping_table(self):
self.__priority_mapping_table = YANGDynClass(base=priority_mapping_table.priority_mapping_table, is_container='container', presence=False, yang_name="priority-mapping-table", rest_name="priority-mapping-table", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure cee priority mapping table', u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-qos-mqc', defining_module='brocade-qos-mqc', yang_type='container', is_config=True)
cl_name = __builtin__.property(_get_cl_name, _set_cl_name)
police = __builtin__.property(_get_police, _set_police)
set_ = __builtin__.property(_get_set_, _set_set_)
span = __builtin__.property(_get_span, _set_span)
map_ = __builtin__.property(_get_map_, _set_map_)
sflow_profile = __builtin__.property(_get_sflow_profile, _set_sflow_profile)
shape = __builtin__.property(_get_shape, _set_shape)
scheduler = __builtin__.property(_get_scheduler, _set_scheduler)
priority_mapping_table = __builtin__.property(_get_priority_mapping_table, _set_priority_mapping_table)
_pyangbind_elements = {'cl_name': cl_name, 'police': police, 'set_': set_, 'span': span, 'map_': map_, 'sflow_profile': sflow_profile, 'shape': shape, 'scheduler': scheduler, 'priority_mapping_table': priority_mapping_table, }
| 71.491726 | 551 | 0.728151 | 4,245 | 30,241 | 4.95689 | 0.050177 | 0.040871 | 0.050566 | 0.033409 | 0.854577 | 0.839274 | 0.827773 | 0.815132 | 0.812375 | 0.798688 | 0 | 0.002356 | 0.129857 | 30,241 | 422 | 552 | 71.661137 | 0.797294 | 0.137727 | 0 | 0.459459 | 0 | 0.057915 | 0.387473 | 0.173971 | 0 | 0 | 0 | 0 | 0 | 1 | 0.11583 | false | 0 | 0.057915 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
aa9c10893a3ee9fca24dcb663af861b6bfab9ded | 166 | py | Python | genomisc/plots/__init__.py | timodonnell/genomisc | f28e80bac86c4d2d251dc26e8f325aa683960cdd | [
"Apache-2.0"
] | 1 | 2016-06-08T19:51:01.000Z | 2016-06-08T19:51:01.000Z | genomisc/plots/__init__.py | timodonnell/genomisc | f28e80bac86c4d2d251dc26e8f325aa683960cdd | [
"Apache-2.0"
] | 1 | 2015-05-28T14:51:37.000Z | 2015-05-28T14:51:37.000Z | genomisc/plots/__init__.py | timodonnell/genomisc | f28e80bac86c4d2d251dc26e8f325aa683960cdd | [
"Apache-2.0"
] | null | null | null | from __future__ import absolute_import
from . import util
from . import read_evidence_pie_charts
from . import mds
__all__ = ["util", "read_evidence_pie_charts", "mds"]
| 20.75 | 47 | 0.807229 | 24 | 166 | 4.958333 | 0.458333 | 0.252101 | 0.252101 | 0.352941 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144578 | 166 | 7 | 48 | 23.714286 | 0.838028 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.8 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
aab21a3bb337fc285597ceaada4c058f20737228 | 71 | py | Python | image_classifier/train_tools/models/__init__.py | dlrudco/CoEM | d12410a7028d39c5cd28ce531b0dff5169265c38 | [
"Apache-2.0"
] | null | null | null | image_classifier/train_tools/models/__init__.py | dlrudco/CoEM | d12410a7028d39c5cd28ce531b0dff5169265c38 | [
"Apache-2.0"
] | 1 | 2020-12-28T08:48:43.000Z | 2020-12-28T08:53:22.000Z | image_classifier/train_tools/models/__init__.py | dlrudco/CoEM | d12410a7028d39c5cd28ce531b0dff5169265c38 | [
"Apache-2.0"
] | null | null | null | from .gd_models import *
from .VAE_MMGAN import *
from .models import * | 23.666667 | 24 | 0.760563 | 11 | 71 | 4.727273 | 0.545455 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15493 | 71 | 3 | 25 | 23.666667 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2a9f7bf71fdf0c2d212aa41cc895b43074bd5da7 | 9,713 | py | Python | app.py | Mr-TalhaIlyas/Plant-Disease-Detector-Flask-Server-Deep-Learning-Backend | d40206d5f3a46213e62dbd0fbda08083eae36fee | [
"MIT"
] | null | null | null | app.py | Mr-TalhaIlyas/Plant-Disease-Detector-Flask-Server-Deep-Learning-Backend | d40206d5f3a46213e62dbd0fbda08083eae36fee | [
"MIT"
] | null | null | null | app.py | Mr-TalhaIlyas/Plant-Disease-Detector-Flask-Server-Deep-Learning-Backend | d40206d5f3a46213e62dbd0fbda08083eae36fee | [
"MIT"
] | null | null | null | from flask import Flask, render_template, request, send_file, Response
import cv2
import numpy as np
import json
import requests
import base64
import matplotlib.pyplot as plt
import time
import os
app = Flask(__name__)
app.config['static'] = os.path.join('static', 'images')
app.config['upload_folder'] = os.path.join('static', 'results')
secret_file = "api_address.json"
with open(secret_file) as f:
addresses = json.loads(f.read())
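# api_address.json maps model names (e.g. "TOMATO_SERVER") to the URLs of the
# inference servers that the routes below forward requests to.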
@app.route('/')
def home():
return render_template('index.html')
########################################################
@app.route('/upload/strawberries')
def upload_straw():
return render_template('uploadstrawberries.html')
@app.route('/upload/tomatoes')
def upload_tomato():
return render_template('uploadtomatoes.html')
@app.route('/upload/paprika')
def upload_pap():
return render_template('uploadpaprika.html')
@app.route('/upload/melon')
def upload_mel():
return render_template('uploadmelon.html')
###########################################################
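# Inference routes: each handler base64-encodes the uploaded image into a JSON
# payload, POSTs it to the matching model server, decodes the returned float
# array back into an image, and saves the rendered figure under static/results/.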
@app.route('/tomato', methods=['GET', 'POST'])
def result_toma():
try:
# Set content_type to header.
content_type = 'application/json'
headers = {'content-type': content_type}
        # read the uploaded image bytes and decode them into a BGR array.
        img_file = request.files['file'].stream.read()
        img = cv2.imdecode(np.frombuffer(img_file, np.uint8), cv2.IMREAD_COLOR)
# map to json.
send = base64.b64encode(np.array(img))
request_json = json.dumps({'input_img': send.decode(),
'info': {
'height': img.shape[0],
'width': img.shape[1],
'channel': img.shape[2]
}
})
# http request.
# print(request_json)
response = requests.post(addresses["TOMATO_SERVER"], data=request_json, headers=headers)
# print(response)
# ['data', 'time', 'is_gpu']
response_json = response.json()
print(response_json)
cost = response_json['time']
is_gpu = response_json['is_gpu']
# change to numpy array.
r = base64.decodebytes(response_json['data'].encode())
        response_dat = np.frombuffer(r, dtype=np.float64)
response_dat = response_dat.reshape((response_json['info']['height'],
response_json['info']['width'],
response_json['info']['channel']))
        # decoded numpy image.
plt.figure(figsize=(int(7 * (response_json['info']['width'] / response_json['info']['height'])), 7))
plt.imshow(response_dat)
timenow = str(time.time())
fname = os.path.join('static', 'results', 'tomato', timenow + '.png')
plt.axis('off')
plt.savefig(fname)
return render_template('tom_result.html', outimg=fname, timenow=timenow, cost=cost, is_gpu=is_gpu)
    except Exception:
        return render_template('uploadtomatoes.html', alertflag="The image is too large or the file is invalid.")
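# The strawberry, paprika and melon handlers below repeat the same
# request/response round trip against their respective model servers.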
@app.route('/strawberry', methods=['GET', 'POST'])
def result_straw():
try:
# Set content_type to header.
content_type = 'application/json'
headers = {'content-type': content_type}
        # read the uploaded image bytes and decode them into a BGR array.
        img_file = request.files['file'].stream.read()
        img = cv2.imdecode(np.frombuffer(img_file, np.uint8), cv2.IMREAD_COLOR)
# map to json.
send = base64.b64encode(np.array(img))
request_json = json.dumps({'input_img': send.decode(),
'info': {
'height': img.shape[0],
'width': img.shape[1],
'channel': img.shape[2]
}
})
# http request.
response = requests.post(addresses["STRAWBERRY_SERVER"], data=request_json, headers=headers)
# print(response)
# ['data', 'time', 'is_gpu']
response_json = response.json()
cost = response_json['time']
is_gpu = response_json['is_gpu']
# change to numpy array.
r = base64.decodebytes(response_json['data'].encode())
        response_dat = np.frombuffer(r, np.float64)
response_dat = response_dat.reshape((response_json['info']['height'],
response_json['info']['width'],
response_json['info']['channel']))
        # decoded numpy image.
plt.figure(figsize=(7, 7))
plt.imshow(response_dat)
timenow = str(time.time())
fname = os.path.join('static', 'results', 'strawberry', timenow + '.png')
plt.axis('off')
plt.savefig(fname)
return render_template('str_result.html', outimg=fname, timenow=timenow, cost=cost, is_gpu=is_gpu)
    except Exception:
        return render_template('uploadstrawberries.html', alertflag="The image is too large or the file is invalid.")
@app.route('/paprika', methods=['GET', 'POST'])
def result_pap():
try:
# Set content_type to header.
content_type = 'application/json'
headers = {'content-type': content_type}
        # read the uploaded image bytes and decode them into a BGR array.
        img_file = request.files['file'].stream.read()
        img = cv2.imdecode(np.frombuffer(img_file, np.uint8), cv2.IMREAD_COLOR)
# map to json.
send = base64.b64encode(np.array(img))
request_json = json.dumps({'input_img': send.decode(),
'info': {
'height': img.shape[0],
'width': img.shape[1],
'channel': img.shape[2]
}
})
# http request.
response = requests.post(addresses["PAPRIKA_SERVER"], data=request_json, headers=headers)
# print(response)
# ['data', 'time', 'is_gpu']
response_json = response.json()
cost = response_json['time']
is_gpu = response_json['is_gpu']
# change to numpy array.
r = base64.decodebytes(response_json['data'].encode())
        response_dat = np.frombuffer(r, dtype=np.float64)
response_dat = response_dat.reshape((response_json['info']['height'],
response_json['info']['width'],
response_json['info']['channel']))
        # decoded numpy image.
plt.figure(figsize=(7, 7))
plt.imshow(response_dat)
timenow = str(time.time())
fname = os.path.join('static', 'results', 'paprika', timenow + '.png')
plt.axis('off')
plt.savefig(fname)
return render_template('pap_result.html', outimg=fname, timenow=timenow, cost=cost, is_gpu=is_gpu)
    except Exception:
        return render_template('uploadpaprika.html', alertflag="The image is too large or the file is invalid.")
@app.route('/melon', methods=['GET', 'POST'])
def result_mel():
""" 아직 모델이 없는 관계로 파프리카 모델 임시 사용 """
try:
# Set content_type to header.
content_type = 'application/json'
headers = {'content-type': content_type}
        # read the uploaded image bytes and decode them into a BGR array.
        img_file = request.files['file'].stream.read()
        img = cv2.imdecode(np.frombuffer(img_file, np.uint8), cv2.IMREAD_COLOR)
# map to json.
send = base64.b64encode(np.array(img))
request_json = json.dumps({'input_img': send.decode(),
'info': {
'height': img.shape[0],
'width': img.shape[1],
'channel': img.shape[2]
}
})
# http request.
response = requests.post(addresses["PAPRIKA_SERVER"], data=request_json, headers=headers)
# print(response)
# ['data', 'time', 'is_gpu']
response_json = response.json()
cost = response_json['time']
is_gpu = response_json['is_gpu']
# change to numpy array.
r = base64.decodebytes(response_json['data'].encode())
        response_dat = np.frombuffer(r, dtype=np.float64)
response_dat = response_dat.reshape((response_json['info']['height'],
response_json['info']['width'],
response_json['info']['channel']))
        # decoded numpy image.
plt.figure(figsize=(7, 7))
plt.imshow(response_dat)
timenow = str(time.time())
fname = os.path.join('static', 'results', 'paprika', timenow + '.png')
plt.axis('off')
plt.savefig(fname)
return render_template('mel_result.html', outimg=fname, timenow=timenow, cost=cost, is_gpu=is_gpu)
    except Exception:
        return render_template('uploadmelon.html', alertflag="The image is too large or the file is invalid.")
################################################################################
@app.route('/delete_file', methods=['POST'])
def delete():
if request.method == "POST":
filename = request.form['filename']
if os.path.isfile(filename):
os.remove(filename)
return Response(status="ok")
################################################################################
if __name__ == '__main__':
app.run(debug=True, host='0.0.0.0', port=5000)
| 33.843206 | 108 | 0.531041 | 1,016 | 9,713 | 4.927165 | 0.155512 | 0.083899 | 0.044746 | 0.027167 | 0.832801 | 0.768278 | 0.768278 | 0.760887 | 0.760887 | 0.760887 | 0 | 0.009978 | 0.308659 | 9,713 | 286 | 109 | 33.961538 | 0.735518 | 0.077113 | 0 | 0.616279 | 0 | 0 | 0.135376 | 0.005318 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05814 | false | 0 | 0.052326 | 0.02907 | 0.19186 | 0.005814 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2ad3cc7ffcbf3439fa867a0bb88663b4552006e7 | 72 | py | Python | cohorts_proj/api/adapters/__init__.py | zferic/harmonization-website | f6a081481df3a3a62cb075fbb63ad0470b0d4e06 | [
"MIT"
] | 1 | 2020-09-20T02:32:01.000Z | 2020-09-20T02:32:01.000Z | cohorts_proj/api/adapters/__init__.py | zferic/harmonization-website | f6a081481df3a3a62cb075fbb63ad0470b0d4e06 | [
"MIT"
] | 20 | 2020-04-17T14:01:41.000Z | 2022-03-12T00:30:23.000Z | cohorts_proj/api/adapters/__init__.py | zferic/harmonization-website | f6a081481df3a3a62cb075fbb63ad0470b0d4e06 | [
"MIT"
] | 3 | 2020-10-08T00:24:51.000Z | 2021-06-02T20:07:30.000Z | import api.adapters.unm
import api.adapters.neu
import api.adapters.dar
| 18 | 23 | 0.833333 | 12 | 72 | 5 | 0.5 | 0.45 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 72 | 3 | 24 | 24 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
2d9903bba2660fdfcb83313ca41e054d8b6fe166 | 90,535 | py | Python | tests/unit_test/api/api_processor_test.py | rit1200/kairon | 674a491f6deeae4800825ca93e0726e4fb6e0866 | [
"Apache-2.0"
] | 9 | 2020-04-22T12:49:29.000Z | 2020-06-13T22:23:20.000Z | tests/unit_test/api/api_processor_test.py | rit1200/kairon | 674a491f6deeae4800825ca93e0726e4fb6e0866 | [
"Apache-2.0"
] | 18 | 2020-04-20T12:39:20.000Z | 2020-05-21T05:10:51.000Z | tests/unit_test/api/api_processor_test.py | rit1200/kairon | 674a491f6deeae4800825ca93e0726e4fb6e0866 | [
"Apache-2.0"
] | 13 | 2020-04-21T12:12:40.000Z | 2020-05-13T07:27:44.000Z | import asyncio
import datetime
import os
from urllib.parse import urljoin
import jwt
import responses
from fastapi import HTTPException
from fastapi_sso.sso.base import OpenID
from mongoengine import connect
from mongoengine.errors import ValidationError, DoesNotExist
import pytest
from mongomock.object_id import ObjectId
from pydantic import SecretStr
from pytest_httpx import HTTPXMock
from starlette.datastructures import Headers, URL
from starlette.requests import Request
from starlette.responses import RedirectResponse
from kairon.shared.auth import Authentication, LoginSSOFactory
from kairon.shared.account.data_objects import Feedback, BotAccess, User
from kairon.shared.account.processor import AccountProcessor
from kairon.shared.authorization.processor import IntegrationProcessor
from kairon.shared.data.constant import ACTIVITY_STATUS, ACCESS_ROLES, TOKEN_TYPE, INTEGRATION_STATUS
from kairon.shared.data.data_objects import Configs, Rules, Responses
from kairon.shared.sso.clients.facebook import FacebookSSO
from kairon.shared.sso.clients.google import GoogleSSO
from kairon.shared.utils import Utility
from kairon.exceptions import AppException
from stress_test.data_objects import Bot
os.environ["system_file"] = "./tests/testing_data/system.yaml"
def pytest_configure():
return {'bot': None, 'account': None}
class TestAccountProcessor:
@pytest.fixture(autouse=True, scope='class')
def init_connection(self):
Utility.load_environment()
connect(**Utility.mongoengine_connection(Utility.environment['database']["url"]))
AccountProcessor.load_system_properties()
def test_add_account(self):
account_response = AccountProcessor.add_account("paypal", "testAdmin")
account = AccountProcessor.get_account(account_response["_id"])
assert account_response
pytest.account = account_response["_id"]
assert account_response["_id"] == account["_id"]
assert account_response["name"] == account["name"]
account_response = AccountProcessor.add_account("ebay", "testAdmin")
account = AccountProcessor.get_account(account_response["_id"])
assert account_response
assert account_response["_id"] == account["_id"]
assert account_response["name"] == account["name"]
def test_add_duplicate_account(self):
with pytest.raises(Exception):
AccountProcessor.add_account("paypal", "testAdmin")
def test_add_duplicate_account_case_insentive(self):
with pytest.raises(Exception):
AccountProcessor.add_account("PayPal", "testAdmin")
def test_add_blank_account(self):
with pytest.raises(AppException):
AccountProcessor.add_account("", "testAdmin")
def test_add_empty_account(self):
with pytest.raises(AppException):
AccountProcessor.add_account(" ", "testAdmin")
def test_add_none_account(self):
with pytest.raises(AppException):
AccountProcessor.add_account(None, "testAdmin")
def test_list_bots_none(self):
assert not list(AccountProcessor.list_bots(1000))
def test_add_bot(self):
bot_response = AccountProcessor.add_bot("test", pytest.account, "fshaikh@digite.com", True)
bot = Bot.objects(name="test").get().to_mongo().to_dict()
assert bot['_id'].__str__() == bot_response['_id'].__str__()
config = Configs.objects(bot=bot['_id'].__str__()).get().to_mongo().to_dict()
assert config['language']
assert config['pipeline'][5]['name'] == 'FallbackClassifier'
assert config['pipeline'][5]['threshold'] == 0.7
assert config['policies'][2]['name'] == 'RulePolicy'
assert config['policies'][2]['core_fallback_action_name'] == "action_default_fallback"
assert config['policies'][2]['core_fallback_threshold'] == 0.3
assert Rules.objects(bot=bot['_id'].__str__()).get()
assert Responses.objects(name__iexact='utter_please_rephrase', bot=bot['_id'].__str__(), status=True).get()
assert Responses.objects(name='utter_default', bot=bot['_id'].__str__(), status=True).get()
pytest.bot = bot_response['_id'].__str__()
def test_list_bots(self):
bot = list(AccountProcessor.list_bots(pytest.account))
assert bot[0]['name'] == 'test'
assert bot[0]['_id']
def test_get_bot(self):
bot_response = AccountProcessor.get_bot(pytest.bot)
assert bot_response
assert bot_response["account"] == pytest.account
def test_add_duplicate_bot(self):
with pytest.raises(Exception):
AccountProcessor.add_bot("test", pytest.account, "testAdmin")
def test_add_duplicate_bot_case_insensitive(self):
with pytest.raises(Exception):
AccountProcessor.add_bot("TEST", pytest.account, "testAdmin")
def test_add_blank_bot(self):
with pytest.raises(AppException):
AccountProcessor.add_bot(" ", pytest.account, "testAdmin")
def test_add_empty_bot(self):
with pytest.raises(AppException):
AccountProcessor.add_bot("", pytest.account, "testAdmin")
def test_add_none_bot(self):
with pytest.raises(AppException):
AccountProcessor.add_bot(None, pytest.account, "testAdmin")
def test_add_none_user(self):
with pytest.raises(AppException):
AccountProcessor.add_bot('test', pytest.account, None)
def test_add_user(self):
user = AccountProcessor.add_user(
email="fshaikh@digite.com",
first_name="Fahad Ali",
last_name="Shaikh",
password="Welcome@1",
account=pytest.account,
user="testAdmin",
)
assert user
assert user["password"] != "12345"
assert user["status"]
def test_add_bot_for_existing_user(self):
bot_response = AccountProcessor.add_bot("test_version_2", pytest.account, "fshaikh@digite.com", False)
bot = Bot.objects(name="test_version_2").get().to_mongo().to_dict()
assert bot['_id'].__str__() == bot_response['_id'].__str__()
assert len(AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned']) == 2
config = Configs.objects(bot=bot['_id'].__str__()).get().to_mongo().to_dict()
assert config['language']
assert config['pipeline'][5]['name'] == 'FallbackClassifier'
assert config['pipeline'][5]['threshold'] == 0.7
assert config['policies'][2]['name'] == 'RulePolicy'
assert config['policies'][2]['core_fallback_action_name'] == "action_default_fallback"
assert config['policies'][2]['core_fallback_threshold'] == 0.3
assert Rules.objects(bot=bot['_id'].__str__()).get()
assert Responses.objects(name='utter_default', bot=bot['_id'].__str__(), status=True).get()
def test_add_member_already_exists(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
with pytest.raises(AppException, match='User is already a collaborator'):
AccountProcessor.allow_bot_and_generate_invite_url(bot_id, "fshaikh@digite.com", 'testAdmin',
pytest.account, ACCESS_ROLES.DESIGNER.value)
def test_add_member_bot_not_exists(self):
with pytest.raises(DoesNotExist, match='Bot does not exists!'):
AccountProcessor.allow_bot_and_generate_invite_url('bot_not_exists', "fshaikh@digite.com", 'testAdmin', pytest.account)
def test_list_bot_accessors_1(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
accessors = list(AccountProcessor.list_bot_accessors(bot_id))
assert len(accessors) == 1
assert accessors[0]['accessor_email'] == 'fshaikh@digite.com'
assert accessors[0]['role'] == 'owner'
assert accessors[0]['bot']
assert accessors[0]['bot_account'] == pytest.account
assert accessors[0]['user'] == "fshaikh@digite.com"
assert accessors[0]['timestamp']
def test_update_bot_access_modify_bot_owner_access(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
with pytest.raises(AppException, match='Ownership modification denied'):
AccountProcessor.update_bot_access(bot_id, "fshaikh@digite.com", 'testAdmin',
ACCESS_ROLES.OWNER.value, ACTIVITY_STATUS.INACTIVE.value)
with pytest.raises(AppException, match='Ownership modification denied'):
AccountProcessor.update_bot_access(bot_id, "fshaikh@digite.com", 'testAdmin',
ACCESS_ROLES.ADMIN.value, ACTIVITY_STATUS.ACTIVE.value)
def test_update_bot_access_user_not_exists(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
BotAccess(bot=bot_id, accessor_email="udit.pandey@digite.com", user='test',
role='designer', status='invite_not_accepted', bot_account=pytest.account).save()
with pytest.raises(DoesNotExist, match='User does not exist!'):
AccountProcessor.update_bot_access(bot_id, "udit.pandey@digite.com",
ACCESS_ROLES.ADMIN.value, ACTIVITY_STATUS.INACTIVE.value)
def test_update_bot_access_invite_not_accepted(self, monkeypatch):
monkeypatch.setitem(Utility.email_conf["email"], "enable", True)
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
User(email='udit.pandey@digite.com', first_name='udit', last_name='pandey', password='124556779', account=10,
user='udit.pandey@digite.com').save()
with pytest.raises(AppException, match='User is yet to accept the invite'):
AccountProcessor.update_bot_access(bot_id, "udit.pandey@digite.com",
ACCESS_ROLES.ADMIN.value, ACTIVITY_STATUS.INACTIVE.value)
assert BotAccess.objects(bot=bot_id, accessor_email="udit.pandey@digite.com", user='test',
role='designer', status='invite_not_accepted', bot_account=pytest.account).get()
def test_list_active_invites(self):
invite = list(AccountProcessor.list_active_invites("udit.pandey@digite.com"))
assert invite[0]['accessor_email'] == 'udit.pandey@digite.com'
assert invite[0]['role'] == 'designer'
assert invite[0]['bot_name'] == 'test_version_2'
def test_accept_bot_access_invite_user_not_exists(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
token = Utility.generate_token("pandey.udit867@gmail.com")
with pytest.raises(DoesNotExist, match='User does not exist!'):
AccountProcessor.validate_request_and_accept_bot_access_invite(token, bot_id)
def test_update_bot_access_user_not_allowed(self):
AccountProcessor.add_account('pandey.udit867@gmail.com', 'pandey.udit867@gmail.com')
User(email='pandey.udit867@gmail.com', first_name='udit', last_name='pandey', password='124556779', account=10,
user='pandey.udit867@gmail.com').save()
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
with pytest.raises(AppException, match='User not yet invited to collaborate'):
AccountProcessor.update_bot_access(bot_id, "pandey.udit867@gmail.com",
ACCESS_ROLES.ADMIN.value, ACTIVITY_STATUS.INACTIVE.value)
def test_accept_bot_access_invite(self, monkeypatch):
def _mock_get_user(*args, **kwargs):
return None
monkeypatch.setattr(AccountProcessor, 'get_user_details', _mock_get_user)
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
token = Utility.generate_token("udit.pandey@digite.com")
AccountProcessor.validate_request_and_accept_bot_access_invite(token, bot_id)
assert BotAccess.objects(bot=bot_id, accessor_email="udit.pandey@digite.com", user='test',
role='designer', status='active', bot_account=pytest.account).get()
def test_list_active_invites_none(self):
invite = list(AccountProcessor.list_active_invites("udit.pandey@digite.com"))
assert invite == []
def test_update_bot_access(self):
account_bot_info = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]
assert account_bot_info['role'] == 'owner'
bot_id = account_bot_info['_id']
assert ('test_version_2', 'fshaikh@digite.com') == AccountProcessor.update_bot_access(
bot_id, "udit.pandey@digite.com", 'testAdmin', ACCESS_ROLES.ADMIN.value, ACTIVITY_STATUS.ACTIVE.value
)
bot_access = BotAccess.objects(bot=bot_id, accessor_email="udit.pandey@digite.com").get()
assert bot_access.role == ACCESS_ROLES.ADMIN.value
assert bot_access.status == ACTIVITY_STATUS.ACTIVE.value
shared_bot_info = AccountProcessor.get_accessible_bot_details(4, "udit.pandey@digite.com")['shared'][0]
assert shared_bot_info['role'] == 'admin'
assert shared_bot_info['_id'] == bot_id
with pytest.raises(AppException, match='Ownership modification denied'):
AccountProcessor.update_bot_access(bot_id, "udit.pandey@digite.com", 'testAdmin',
ACCESS_ROLES.OWNER.value, ACTIVITY_STATUS.ACTIVE.value)
bot_access = BotAccess.objects(bot=bot_id, accessor_email="udit.pandey@digite.com").get()
assert bot_access.role == ACCESS_ROLES.ADMIN.value
assert bot_access.status == ACTIVITY_STATUS.ACTIVE.value
def test_accept_bot_access_invite_user_not_allowed(self, monkeypatch):
def _mock_get_user(*args, **kwargs):
return None
monkeypatch.setattr(AccountProcessor, 'get_user_details', _mock_get_user)
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
token = Utility.generate_token("pandey.udit867@gmail.com")
with pytest.raises(AppException, match='No pending invite found for this bot and user'):
AccountProcessor.validate_request_and_accept_bot_access_invite(token, bot_id)
def test_accept_bot_access_invite_token_expired(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
token = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6InBhbmRleS51ZGl0ODY3QGdtYWlsLmNvbSIsImV4cCI6MTUxNjIzOTAyMn0.dP8a4rHXb9dBrPFKfKD3_tfKu4NdwfSz213F15qej18'
with pytest.raises(AppException, match='Invalid token'):
AccountProcessor.validate_request_and_accept_bot_access_invite(token, bot_id)
def test_accept_bot_access_invite_invalid_bot(self):
token = Utility.generate_token("fshaikh@digite.com")
with pytest.raises(DoesNotExist, match='Bot does not exists!'):
AccountProcessor.validate_request_and_accept_bot_access_invite(token, '61cb4e2f7c7ac78d2fa8fab7')
def test_list_bot_accessors_2(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
accessors = list(AccountProcessor.list_bot_accessors(bot_id))
assert accessors[0]['accessor_email'] == 'fshaikh@digite.com'
assert accessors[0]['role'] == 'owner'
assert accessors[0]['bot']
assert accessors[0]['bot_account'] == pytest.account
assert accessors[0]['user'] == "fshaikh@digite.com"
assert accessors[0]['timestamp']
assert accessors[1]['accessor_email'] == 'udit.pandey@digite.com'
assert accessors[1]['role'] == 'admin'
assert accessors[1]['bot']
assert accessors[1]['bot_account'] == pytest.account
assert accessors[1]['user'] == 'testAdmin'
assert accessors[1]['accept_timestamp']
assert accessors[1]['timestamp']
def test_invite_user_as_owner(self):
with pytest.raises(AppException, match='There can be only 1 owner per bot'):
AccountProcessor.allow_bot_and_generate_invite_url('test', 'user@demo.ai', 'admin@demo.ai', 2, ACCESS_ROLES.OWNER.value)
def test_transfer_ownership(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
AccountProcessor.transfer_ownership(pytest.account, bot_id, "fshaikh@digite.com", 'udit.pandey@digite.com')
accessors = list(AccountProcessor.list_bot_accessors(bot_id))
assert accessors[0]['accessor_email'] == 'fshaikh@digite.com'
assert accessors[0]['role'] == 'admin'
assert accessors[0]['bot']
assert accessors[0]['bot_account'] == 10
assert accessors[0]['user'] == "fshaikh@digite.com"
assert accessors[1]['accessor_email'] == 'udit.pandey@digite.com'
assert accessors[1]['role'] == 'owner'
assert accessors[1]['bot']
assert accessors[1]['bot_account'] == 10
assert accessors[1]['user'] == "fshaikh@digite.com"
assert AccountProcessor.get_bot_and_validate_status(bot_id)['account'] == 10
AccountProcessor.transfer_ownership(pytest.account, bot_id, 'udit.pandey@digite.com', "fshaikh@digite.com")
accessors = list(AccountProcessor.list_bot_accessors(bot_id))
assert accessors[0]['accessor_email'] == 'fshaikh@digite.com'
assert accessors[0]['role'] == 'owner'
assert accessors[0]['bot']
assert accessors[0]['bot_account'] == pytest.account
assert accessors[0]['user'] == 'udit.pandey@digite.com'
assert accessors[1]['accessor_email'] == 'udit.pandey@digite.com'
assert accessors[1]['role'] == 'admin'
assert accessors[1]['bot']
assert accessors[1]['bot_account'] == pytest.account
assert accessors[1]['user'] == 'udit.pandey@digite.com'
assert AccountProcessor.get_bot_and_validate_status(bot_id)['account'] == pytest.account
def test_transfer_ownership_to_non_member(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
User(email='udit@demo.ai', first_name='udit', last_name='pandey', password='124556779', account=10,
user='udit@demo.ai').save()
with pytest.raises(AppException, match='User not yet invited to collaborate'):
AccountProcessor.transfer_ownership(pytest.account, bot_id, "fshaikh@digite.com", 'udit@demo.ai')
def test_remove_bot_access_not_a_member(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
with pytest.raises(AppException, match='User not a collaborator to this bot'):
AccountProcessor.remove_bot_access(bot_id, accessor_email='pandey.udit867@gmail.com')
def test_remove_bot_access(self):
bot_id = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")['account_owned'][1]['_id']
assert not AccountProcessor.remove_bot_access(bot_id, accessor_email='udit.pandey@digite.com')
assert len(list(AccountProcessor.list_bot_accessors(bot_id))) == 1
def test_remove_bot_from_all_accessors(self):
bot_id = str(ObjectId())
BotAccess(bot=bot_id, accessor_email="udit.pandey@digite.com", user='test',
role='designer', status='active', bot_account=10).save()
BotAccess(bot=bot_id, accessor_email="pandey.udit867@gmail.com", user='test',
role='designer', status='invite_not_accepted', bot_account=10).save()
BotAccess(bot=bot_id, accessor_email="pandey.udit@gmail.com", user='test',
role='designer', status='inactive', bot_account=10).save()
BotAccess(bot=bot_id, accessor_email="udit867@gmail.com", user='test',
role='designer', status='deleted', bot_account=10).save()
assert len(list(AccountProcessor.list_bot_accessors(bot_id))) == 3
AccountProcessor.remove_bot_access(bot_id)
assert len(list(AccountProcessor.list_bot_accessors(bot_id))) == 0
def test_list_bots_2(self):
bot = list(AccountProcessor.list_bots(pytest.account))
assert bot[0]['name'] == 'test'
assert bot[0]['_id']
assert bot[1]['name'] == 'test_version_2'
assert bot[1]['_id']
def test_update_bot_name(self):
AccountProcessor.update_bot('test_bot', pytest.bot)
bot = list(AccountProcessor.list_bots(pytest.account))
assert bot[0]['name'] == 'test_bot'
assert bot[0]['_id']
def test_update_bot_not_exists(self):
with pytest.raises(AppException):
AccountProcessor.update_bot('test_bot', '5f256412f98b97335c168ef0')
def test_update_bot_empty_name(self):
with pytest.raises(AppException):
AccountProcessor.update_bot(' ', '5f256412f98b97335c168ef0')
def test_delete_bot(self):
bot = list(AccountProcessor.list_bots(pytest.account))
pytest.deleted_bot = bot[1]['_id']
AccountProcessor.delete_bot(pytest.deleted_bot)
with pytest.raises(DoesNotExist):
Bot.objects(id=pytest.deleted_bot, status=True).get()
bots = AccountProcessor.get_accessible_bot_details(pytest.account, "fshaikh@digite.com")
assert len(bots['account_owned']) == 1
assert pytest.deleted_bot not in [bot['_id'] for bot in bots['account_owned']]
assert pytest.deleted_bot not in [bot['_id'] for bot in bots['shared']]
def test_delete_bot_not_exists(self):
with pytest.raises(AppException):
AccountProcessor.delete_bot(pytest.deleted_bot)
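# Account deletion scenarios: deleting an account should soft-delete its bots,
# its users, and any shared-bot access entries it holds.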
def test_delete_account_for_account_bots(self):
account = {
"account": "Test_Delete_Account",
"email": "ritika@digite.com",
"first_name": "Test_Delete_First",
"last_name": "Test_Delete_Last",
"password": SecretStr("Welcome@1"),
}
loop = asyncio.new_event_loop()
user_detail, mail, link = loop.run_until_complete(AccountProcessor.account_setup(account_setup=account))
pytest.deleted_account = user_detail['account'].__str__()
AccountProcessor.add_bot("delete_account_bot_1", pytest.deleted_account, "ritika@digite.com", False)
AccountProcessor.add_bot("delete_account_bot_2", pytest.deleted_account, "ritika@digite.com", False)
account_bots_before_delete = list(AccountProcessor.list_bots(pytest.deleted_account))
assert len(account_bots_before_delete) == 3
AccountProcessor.delete_account(pytest.deleted_account)
for bot in account_bots_before_delete:
with pytest.raises(DoesNotExist):
Bot.objects(id=bot['_id'], account=pytest.deleted_account, status=True).get()
def test_delete_account_for_shared_bot(self):
account = {
"account": "Test_Delete_Account",
"email": "ritika@digite.com",
"first_name": "Test_Delete_First",
"last_name": "Test_Delete_Last",
"password": SecretStr("Welcome@1"),
}
loop = asyncio.new_event_loop()
user_detail, mail, link = loop.run_until_complete(
AccountProcessor.account_setup(account_setup=account))
# Add a bot under a different account and grant this user shared access.
bot_response = AccountProcessor.add_bot("delete_account_shared_bot", 30, "udit.pandey@digite.com", False)
bot_id = bot_response['_id'].__str__()
BotAccess(bot=bot_id, accessor_email="ritika@digite.com", user='testAdmin',
role='designer', status='active', bot_account=30).save()
pytest.deleted_account = user_detail['account'].__str__()
accessors_before_delete = list(AccountProcessor.list_bot_accessors(bot_id))
assert len(accessors_before_delete) == 2
assert accessors_before_delete[0]['accessor_email'] == 'udit.pandey@digite.com'
assert accessors_before_delete[1]['accessor_email'] == 'ritika@digite.com'
AccountProcessor.delete_account(pytest.deleted_account)
accessors_after_delete = list(AccountProcessor.list_bot_accessors(bot_id))
assert len(accessors_after_delete) == 1
assert accessors_after_delete[0]['accessor_email'] == 'udit.pandey@digite.com'
assert accessors_after_delete[0]['bot_account'] == 30
assert Bot.objects(id=bot_id, account=30, status=True).get()
def test_delete_account_for_account(self):
account = {
"account": "Test_Delete_Account",
"email": "ritika@digite.com",
"first_name": "Test_Delete_First",
"last_name": "Test_Delete_Last",
"password": SecretStr("Welcome@1")
}
loop = asyncio.new_event_loop()
user_detail, mail, link = loop.run_until_complete(
AccountProcessor.account_setup(account_setup=account))
pytest.deleted_account = user_detail['account'].__str__()
AccountProcessor.delete_account(pytest.deleted_account)
assert AccountProcessor.get_account(pytest.deleted_account)
assert not AccountProcessor.get_account(pytest.deleted_account).get('status')
with pytest.raises(AppException, match="Account does not exist!"):
AccountProcessor.delete_account(pytest.deleted_account)
def test_delete_account_for_user(self):
account = {
"account": "Test_Delete_Account",
"email": "ritika@digite.com",
"first_name": "Test_Delete_First",
"last_name": "Test_Delete_Last",
"password": SecretStr("Welcome@1")
}
loop = asyncio.new_event_loop()
user_detail, mail, link = loop.run_until_complete(
AccountProcessor.account_setup(account_setup=account))
pytest.deleted_account = user_detail['account'].__str__()
# Add multiple users to the same account.
user = {
"account": pytest.deleted_account,
"email": "ritika.G@digite.com",
"first_name": "Test_Delete_First1",
"last_name": "Test_Delete_Last1",
"password": "Welcome@2",
"user": "testAdmin"
}
AccountProcessor.add_user(**user)
assert User.objects(email__iexact="ritika@digite.com", status=True).get()
assert User.objects(email__iexact="ritika.G@digite.com", status=True).get()
AccountProcessor.delete_account(pytest.deleted_account)
assert User.objects(email__iexact="ritika@digite.com", status=False)
assert User.objects(email__iexact="ritika.G@digite.com", status=False)
def test_delete_account_again_add(self):
account = {
"account": "Test_Delete_Account",
"email": "ritika@digite.com",
"first_name": "Test_Delete_First",
"last_name": "Test_Delete_Last",
"password": SecretStr("Welcome@1"),
}
loop = asyncio.new_event_loop()
user_detail, mail, link = loop.run_until_complete(
AccountProcessor.account_setup(account_setup=account))
pytest.deleted_account = user_detail['account'].__str__()
AccountProcessor.delete_account(pytest.deleted_account)
loop = asyncio.new_event_loop()
user_detail, mail, link = loop.run_until_complete(
AccountProcessor.account_setup(account_setup=account))
new_account_id = user_detail['account'].__str__()
assert new_account_id
assert AccountProcessor.get_account(new_account_id).get('status')
assert len(list(AccountProcessor.list_bots(new_account_id))) == 1
def test_add_user_duplicate(self):
with pytest.raises(Exception):
AccountProcessor.add_user(
email="fshaikh@digite.com",
first_name="Fahad Ali",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_duplicate_case_insensitive(self):
with pytest.raises(Exception):
AccountProcessor.add_user(
email="FShaikh@digite.com",
first_name="Fahad Ali",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_empty_email(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="",
first_name="Fahad Ali",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_blank_email(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email=" ",
first_name="Fahad Ali",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_invalid_email(self):
with pytest.raises(ValidationError):
AccountProcessor.add_user(
email="demo",
first_name="Fahad Ali",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_none_email(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email=None,
first_name="Fahad Ali",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_empty_firstname(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_blank_firstname(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name=" ",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_none_firstname(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="",
last_name="Shaikh",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_empty_lastname(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="Fahad Ali",
last_name="",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_none_lastname(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="Fahad Ali",
last_name=None,
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_blank_lastname(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="Fahad Ali",
last_name=" ",
password="Welcome@1",
account=1,
user="testAdmin",
)
def test_add_user_empty_password(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="Fahad Ali",
last_name="Shaikh",
password="",
account=1,
user="testAdmin",
)
def test_add_user_blank_password(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="Fahad Ali",
last_name="Shaikh",
password=" ",
account=1,
user="testAdmin",
)
def test_add_user_none_password(self):
with pytest.raises(AppException):
AccountProcessor.add_user(
email="demo@demo.ai",
first_name="Fahad Ali",
last_name="Shaikh",
password=None,
account=1,
user="testAdmin",
)
def test_get_user(self):
user = AccountProcessor.get_user("fshaikh@digite.com")
assert all(
user[key] is False if key == "is_integration_user" else user[key]
for key in user.keys()
)
def test_get_user_not_exists(self):
with pytest.raises(DoesNotExist, match='User does not exist!'):
AccountProcessor.get_user("udit.pandey_kairon@digite.com")
def test_get_user_details(self):
user = AccountProcessor.get_user_details("fshaikh@digite.com")
assert all(
user[key] is False if key == "is_integration_user" else user[key]
for key in user.keys()
)
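# The fixtures below monkeypatch AccountProcessor lookups to simulate
# inactive user, bot and account records.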
@pytest.fixture
def mock_user_inactive(self, monkeypatch):
def user_response(*args, **kwargs):
return {
"email": "demo@demo.ai",
"status": False,
"bot": "support",
"account": 2,
"is_integration_user": False
}
def bot_response(*args, **kwargs):
return {"name": "support", "status": True}
def account_response(*args, **kwargs):
return {"name": "paytm", "status": True}
monkeypatch.setattr(AccountProcessor, "get_user", user_response)
monkeypatch.setattr(AccountProcessor, "get_bot", bot_response)
monkeypatch.setattr(AccountProcessor, "get_account", account_response)
def test_get_user_details_user_inactive(self, mock_user_inactive):
with pytest.raises(ValidationError):
user_details = AccountProcessor.get_user_details("demo@demo.ai")
assert all(
user_details[key] is False
if key == "is_integration_user"
else user_details[key]
for key in user_details.keys()
)
@pytest.fixture
def mock_bot_inactive(self, monkeypatch):
def user_response(*args, **kwargs):
return {
"email": "demo@demo.ai",
"status": True,
"bot": "support",
"account": 2,
"is_integration_user": False
}
def bot_response(*args, **kwargs):
return {"name": "support", "status": False}
def account_response(*args, **kwargs):
return {"name": "paytm", "status": True}
monkeypatch.setattr(AccountProcessor, "get_user", user_response)
monkeypatch.setattr(AccountProcessor, "get_bot", bot_response)
monkeypatch.setattr(AccountProcessor, "get_account", account_response)
def test_get_user_details_bot_inactive(self, mock_bot_inactive, monkeypatch):
monkeypatch.setitem(Utility.email_conf["email"], 'enable', True)
with pytest.raises(AppException) as e:
AccountProcessor.get_user_details("demo@demo.ai")
assert 'Please verify your mail' in str(e)
@pytest.fixture
def mock_account_inactive(self, monkeypatch):
def user_response(*args, **kwargs):
return {
"email": "demo@demo.ai",
"status": True,
"bot": "support",
"account": 2,
"is_integration_user": False
}
def bot_response(*args, **kwargs):
return {"name": "support", "status": True}
def account_response(*args, **kwargs):
return {"name": "paytm", "status": False}
monkeypatch.setattr(AccountProcessor, "get_user", user_response)
monkeypatch.setattr(AccountProcessor, "get_bot", bot_response)
monkeypatch.setattr(AccountProcessor, "get_account", account_response)
def test_get_user_details_account_inactive(self, mock_account_inactive):
with pytest.raises(ValidationError):
user_details = AccountProcessor.get_user_details("demo@demo.ai")
assert all(
user_details[key] is False
if key == "is_integration_user"
else user_details[key]
for key in AccountProcessor.get_user_details(
user_details["email"]
).keys()
)
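# Account setup validation: incomplete payloads should raise AppException
# before any account or bot is created.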
def test_account_setup_empty_values(self):
account = {}
with pytest.raises(AppException):
loop = asyncio.new_event_loop()
loop.run_until_complete(AccountProcessor.account_setup(account_setup=account))
def test_account_setup_missing_account(self):
account = {
"bot": "Test",
"email": "demo@ac.in",
"first_name": "Test_First",
"last_name": "Test_Last",
"password": "welcome@1",
}
with pytest.raises(AppException):
loop = asyncio.new_event_loop()
loop.run_until_complete(AccountProcessor.account_setup(account_setup=account))
def test_account_setup_user_info(self):
account = {
"account": "Test_Account",
"bot": "Test",
"first_name": "Test_First",
"last_name": "Test_Last",
"password": SecretStr("Welcome@1"),
}
with pytest.raises(AppException):
loop = asyncio.new_event_loop()
loop.run_until_complete(AccountProcessor.account_setup(account_setup=account))
def test_account_setup(self):
account = {
"account": "Test_Account",
"email": "demo@ac.in",
"first_name": "Test_First",
"last_name": "Test_Last",
"password": SecretStr("Welcome@1"),
}
loop = asyncio.new_event_loop()
actual, mail, link = loop.run_until_complete(AccountProcessor.account_setup(account_setup=account))
assert actual["_id"]
assert actual["account"]
assert actual["first_name"]
bot_id = Bot.objects(account=actual['account'], user="demo@ac.in").get()
assert BotAccess.objects(bot_account=actual['account'], accessor_email=account['email'], bot=str(bot_id.id),
status=ACTIVITY_STATUS.ACTIVE.value, role=ACCESS_ROLES.OWNER.value,
user=account['email']).get()
def test_default_account_setup(self):
loop = asyncio.new_event_loop()
actual, mail, link = loop.run_until_complete(AccountProcessor.default_account_setup())
assert actual
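# Stub for Utility.trigger_smtp so mail-sending code paths run without a real SMTP server.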
async def mock_smtp(self, *args, **kwargs):
return None
def test_validate_and_send_mail(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
loop.run_until_complete(Utility.validate_and_send_mail('demo@ac.in', subject='test', body='test'))
assert True
def test_send_false_email_id(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(Utility.validate_and_send_mail('..', subject='test', body="test"))
def test_send_empty_mail_subject(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(Utility.validate_and_send_mail('demo@ac.in', subject=' ', body='test'))
def test_send_empty_mail_body(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(Utility.validate_and_send_mail('demo@ac.in', subject='test', body=' '))
def test_format_and_send_mail_invalid_type(self):
loop = asyncio.new_event_loop()
assert not loop.run_until_complete(Utility.format_and_send_mail('training_failure', 'demo@ac.in', 'udit'))
def test_valid_token(self):
token = Utility.generate_token('integ1@gmail.com')
mail = Utility.verify_token(token)
assert mail
def test_invalid_token(self):
with pytest.raises(Exception):
Utility.verify_token('..')
def test_new_user_confirm(self, monkeypatch):
AccountProcessor.add_user(
email="integ2@gmail.com",
first_name="inteq",
last_name="2",
password='Welcome@1',
account=1,
user="testAdmin",
)
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
token = Utility.generate_token('integ2@gmail.com')
loop = asyncio.new_event_loop()
loop.run_until_complete(AccountProcessor.confirm_email(token))
assert True
def test_user_already_confirmed(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
token = Utility.generate_token('integ2@gmail.com')
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.confirm_email(token))
def test_user_not_confirmed(self):
with pytest.raises(Exception):
AccountProcessor.is_user_confirmed('sd')
def test_user_confirmed(self):
AccountProcessor.is_user_confirmed('integ2@gmail.com')
assert True
def test_send_empty_token(self):
with pytest.raises(Exception):
Utility.verify_token(' ')
def test_reset_link_with_mail(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
result = loop.run_until_complete(AccountProcessor.send_reset_link('integ2@gmail.com'))
assert result[0] == 'integ2@gmail.com'
assert result[1] == 'inteq'
assert 'kairon.digite.com/reset_password/' in result[2]
Utility.email_conf["email"]["enable"] = False
def test_reset_link_with_mail_limit_exceeded(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setitem(Utility.environment['user'], 'reset_password_request_limit', 2)
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
result = loop.run_until_complete(AccountProcessor.send_reset_link('integ2@gmail.com'))
assert result[0] == 'integ2@gmail.com'
assert result[1] == 'inteq'
assert 'kairon.digite.com/reset_password/' in result[2]
with pytest.raises(AppException, match='Password reset limit exhausted for today.'):
loop.run_until_complete(AccountProcessor.send_reset_link('integ2@gmail.com'))
Utility.email_conf["email"]["enable"] = False
def test_reset_link_with_empty_mail(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_reset_link(''))
Utility.email_conf["email"]["enable"] = False
def test_reset_link_with_unregistered_mail(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_reset_link('sasha.41195@gmail.com'))
Utility.email_conf["email"]["enable"] = False
def test_reset_link_with_unconfirmed_mail(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_reset_link('integration@demo.ai'))
Utility.email_conf["email"]["enable"] = False
def test_overwrite_password_with_invalid_token(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.overwrite_password('fgh', "asdfghj@1"))
def test_overwrite_password_with_empty_password_string(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.overwrite_password(
'eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJtYWlsX2lkIjoiaW50ZWcxQGdtYWlsLmNvbSJ9.Ycs1ROb1w6MMsx2WTA4vFu3-jRO8LsXKCQEB3fkoU20',
" "))
def test_overwrite_password_with_valid_entries(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
token = Utility.generate_token('integ2@gmail.com')
loop = asyncio.new_event_loop()
loop.run_until_complete(AccountProcessor.overwrite_password(token, "Welcome@3"))
assert True
def test_overwrite_password_limit_exceeded(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
token = Utility.generate_token('integ2@gmail.com')
loop = asyncio.new_event_loop()
with pytest.raises(AppException, match='Password reset limit exhausted. Please come back in *'):
loop.run_until_complete(AccountProcessor.overwrite_password(token, "Welcome@3"))
def test_reset_link_not_within_cooldown_period(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(AppException, match='Password reset limit exhausted. Please come back in *'):
loop.run_until_complete(AccountProcessor.send_reset_link('integ2@gmail.com'))
Utility.email_conf["email"]["enable"] = False
def test_send_confirmation_link_with_valid_id(self, monkeypatch):
AccountProcessor.add_user(
email="integ3@gmail.com",
first_name="inteq",
last_name="3",
password='Welcome@1',
account=1,
user="testAdmin",
)
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
loop.run_until_complete(AccountProcessor.send_confirmation_link('integ3@gmail.com'))
Utility.email_conf["email"]["enable"] = False
assert True
def test_send_confirmation_link_with_confirmed_id(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_confirmation_link('integ1@gmail.com'))
Utility.email_conf["email"]["enable"] = False
def test_send_confirmation_link_with_invalid_id(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_confirmation_link(''))
Utility.email_conf["email"]["enable"] = False
def test_send_confirmation_link_with_unregistered_id(self, monkeypatch):
Utility.email_conf["email"]["enable"] = True
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_confirmation_link('sasha.41195@gmail.com'))
Utility.email_conf["email"]["enable"] = False
def test_reset_link_with_mail_not_enabled(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_reset_link('integ1@gmail.com'))
def test_send_confirmation_link_with_mail_not_enabled(self, monkeypatch):
monkeypatch.setattr(Utility, 'trigger_smtp', self.mock_smtp)
loop = asyncio.new_event_loop()
with pytest.raises(Exception):
loop.run_until_complete(AccountProcessor.send_confirmation_link('integration@demo.ai'))
def test_create_authentication_token_with_expire_time(self, monkeypatch):
start_date = datetime.datetime.now()
token = Authentication.create_access_token(data={"sub": "test"}, token_expire=180)
secret_key = Utility.environment['security']["secret_key"]
algorithm = Utility.environment['security']["algorithm"]
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert round((datetime.datetime.fromtimestamp(payload.get('exp')) - start_date).total_seconds() / 60) == 180
assert payload.get('sub') == 'test'
start_date = datetime.datetime.now()
token = Authentication.create_access_token(data={"sub": "test"})
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert round((datetime.datetime.fromtimestamp(payload.get('exp')) - start_date).total_seconds() / 60) == 10080
monkeypatch.setitem(Utility.environment['security'], 'token_expire', None)
start_date = datetime.datetime.now()
token = Authentication.create_access_token(data={"sub": "test"})
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert round((datetime.datetime.fromtimestamp(payload.get('exp')) - start_date).total_seconds() / 60) == 15
start_date = datetime.datetime.now()
token = Authentication.create_access_token(data={"sub": "test"}, token_type='INVALID_TYPE')
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert round((datetime.datetime.fromtimestamp(payload.get('exp')) - start_date).total_seconds() / 60) == 15
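# Integration tokens: JWTs carrying the bot, a role, and optional expiry and
# endpoint restrictions; LOGIN-type tokens cannot be generated through this API.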
def test_generate_integration_token_login_token(self):
bot = 'test'
user = 'test_user'
with pytest.raises(NotImplementedError):
Authentication.generate_integration_token(bot, user, token_type=TOKEN_TYPE.LOGIN.value, role='chat')
def test_generate_integration_token(self):
bot = 'test'
user = 'test_user'
secret_key = Utility.environment['security']["secret_key"]
algorithm = Utility.environment['security']["algorithm"]
token = Authentication.generate_integration_token(bot, user, name='integration_token', role='chat')
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert payload.get('bot') == bot
assert payload.get('sub') == user
assert payload.get('iat')
assert payload.get('type') == TOKEN_TYPE.INTEGRATION.value
assert payload.get('role') == 'chat'
assert not payload.get('exp')
def test_generate_integration_token_different_bot(self):
bot = 'test_1'
user = 'test_user'
secret_key = Utility.environment['security']["secret_key"]
algorithm = Utility.environment['security']["algorithm"]
token = Authentication.generate_integration_token(bot, user, name='integration_token', role='tester')
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert payload.get('bot') == bot
assert payload.get('sub') == user
assert payload.get('iat')
assert payload.get('type') == TOKEN_TYPE.INTEGRATION.value
assert not payload.get('exp')
assert payload.get('role') == 'tester'
def test_generate_integration_token_with_expiry(self):
bot = 'test'
user = 'test_user'
secret_key = Utility.environment['security']["secret_key"]
algorithm = Utility.environment['security']["algorithm"]
token = Authentication.generate_integration_token(bot, user, expiry=15, name='integration_token_with_expiry', role='designer')
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert payload.get('bot') == bot
assert payload.get('sub') == user
assert payload.get('iat')
assert payload.get('type') == TOKEN_TYPE.INTEGRATION.value
assert payload.get('role') == 'designer'
iat = datetime.datetime.fromtimestamp(payload.get('iat'), tz=datetime.timezone.utc)
exp = datetime.datetime.fromtimestamp(payload.get('exp'), tz=datetime.timezone.utc)
assert round((exp-iat).total_seconds() / 60) == 15
def test_generate_integration_token_with_access_limit(self):
bot = 'test1'
user = 'test_user'
secret_key = Utility.environment['security']["secret_key"]
algorithm = Utility.environment['security']["algorithm"]
start_date = datetime.datetime.now(tz=datetime.timezone.utc)
access_limit = ['/api/bot/endpoint']
token = Authentication.generate_integration_token(bot, user, expiry=15, access_limit=access_limit, name='integration_token_with_access_limit', role='admin')
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert payload.get('bot') == bot
assert payload.get('sub') == user
assert payload.get('iat')
pytest.integration_iat = payload.get('iat')
assert payload.get('access-limit') == access_limit
assert payload.get('type') == TOKEN_TYPE.INTEGRATION.value
assert payload.get('role') == 'admin'
iat = datetime.datetime.fromtimestamp(payload.get('iat'), tz=datetime.timezone.utc)
exp = datetime.datetime.fromtimestamp(payload.get('exp'), tz=datetime.timezone.utc)
assert round((exp - iat).total_seconds() / 60) == 15
def test_generate_integration_token_name_exists(self, monkeypatch):
bot = 'test'
user = 'test_user'
monkeypatch.setitem(Utility.environment['security'], 'integrations_per_user', 3)
with pytest.raises(AppException, match='Integration token with this name has already been initiated'):
Authentication.generate_integration_token(bot, user, name='integration_token', role='chat')
def test_generate_integration_token_limit_exceeded(self):
bot = 'test'
user = 'test_user'
with pytest.raises(AppException, match='Integrations limit reached!'):
Authentication.generate_integration_token(bot, user, name='integration_token1', role='chat')
def test_generate_integration_token_dynamic(self):
bot = 'test'
user = 'test_user'
secret_key = Utility.environment['security']["secret_key"]
algorithm = Utility.environment['security']["algorithm"]
start_date = datetime.datetime.now(tz=datetime.timezone.utc)
access_limit = ['/api/bot/endpoint']
token = Authentication.generate_integration_token(bot, user, expiry=15, access_limit=access_limit, token_type=TOKEN_TYPE.DYNAMIC.value)
payload = jwt.decode(token, secret_key, algorithms=[algorithm])
assert payload.get('bot') == bot
assert payload.get('sub') == user
assert payload.get('iat')
assert payload.get('type') == TOKEN_TYPE.DYNAMIC.value
assert payload.get('role') == 'chat'
assert payload.get('access-limit') == access_limit
iat = datetime.datetime.fromtimestamp(payload.get('iat'), tz=datetime.timezone.utc)
exp = datetime.datetime.fromtimestamp(payload.get('exp'), tz=datetime.timezone.utc)
assert round((exp - iat).total_seconds() / 60) == 15
def test_generate_integration_token_without_name(self, monkeypatch):
bot = 'test'
user = 'test_user'
monkeypatch.setitem(Utility.environment['security'], 'integrations_per_user', 3)
with pytest.raises(ValidationError, match='name is required to add integration'):
Authentication.generate_integration_token(bot, user, expiry=15)
def test_list_integrations(self):
bot = 'test'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations[0]['name'] == 'integration_token'
assert integrations[0]['user'] == 'test_user'
assert integrations[0]['iat']
assert integrations[0]['status'] == 'active'
assert integrations[0]['role'] == 'chat'
assert integrations[1]['name'] == 'integration_token_with_expiry'
assert integrations[1]['user'] == 'test_user'
assert integrations[1]['iat']
assert integrations[1]['expiry']
assert integrations[1]['status'] == 'active'
assert integrations[0]['role'] == 'chat'
bot = 'test1'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations[0]['name'] == 'integration_token_with_access_limit'
assert integrations[0]['user'] == 'test_user'
assert integrations[0]['iat']
assert integrations[0]['expiry']
assert integrations[0]['access_list'] == ['/api/bot/endpoint']
assert integrations[0]['status'] == 'active'
assert integrations[0]['role'] == 'admin'
def test_update_integration_token_without_name(self):
bot = 'test'
user = 'test_user'
with pytest.raises(AppException, match="Integration does not exists"):
Authentication.update_integration_token(None, bot, user)
def test_update_integration_token_not_exists(self):
bot = 'test'
user = 'test_user'
with pytest.raises(AppException, match="Integration does not exists"):
Authentication.update_integration_token('integration_not_exists', bot, user)
def test_validate_integration_token(self):
bot = 'test1'
user = 'test_user'
name = 'integration_token_with_access_limit'
payload = {'name': name, 'bot': bot, 'sub': user, 'iat': pytest.integration_iat, 'access_limit': ['/api/bot/endpoint'], 'role': 'admin'}
assert not Authentication.validate_integration_token(payload)
def test_validate_integration_token_not_exists(self):
bot = 'test1'
user = 'test_user'
name = 'integration_not_exists'
payload = {'name': name, 'bot': bot, 'sub': user, 'iat': pytest.integration_iat}
with pytest.raises(HTTPException):
Authentication.validate_integration_token(payload)
def test_validate_integration_token_accessing_different_bot(self):
bot = 'test1'
bot_2 = 'test2'
user = 'test_user'
name = 'integration_not_exists'
payload = {'name': name, 'bot': bot, 'sub': user, 'iat': pytest.integration_iat}
with pytest.raises(HTTPException):
Authentication.validate_bot_request(bot, bot_2)
def test_list_integrations_after_update(self):
bot = 'test'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations[0]['name'] == 'integration_token'
assert integrations[0]['user'] == 'test_user'
assert integrations[0]['iat']
assert integrations[0]['status'] == 'active'
assert integrations[0]['role'] == 'chat'
assert integrations[1]['name'] == 'integration_token_with_expiry'
assert integrations[1]['user'] == 'test_user'
assert integrations[1]['iat']
assert integrations[1]['expiry']
assert integrations[1]['status'] == 'active'
assert integrations[1]['role'] == 'designer'
bot = 'test1'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations[0]['name'] == 'integration_token_with_access_limit'
assert integrations[0]['user'] == 'test_user'
assert integrations[0]['iat']
assert integrations[0]['expiry']
assert integrations[0]['access_list'] == ['/api/bot/endpoint']
assert integrations[0]['status'] == 'active'
assert integrations[0]['role'] == 'admin'
def test_update_integration_delete_integration_token_different_bot(self):
bot = 'test_1'
user = 'test_user'
token = Authentication.update_integration_token('integration_token', bot, user,
int_status=INTEGRATION_STATUS.DELETED.value)
assert not token
def test_update_integration_disable_integration_token(self):
bot = 'test1'
user = 'test_user'
token = Authentication.update_integration_token('integration_token_with_access_limit', bot, user, int_status=INTEGRATION_STATUS.INACTIVE.value)
assert not token
def test_list_integrations_after_disable(self):
bot = 'test'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations[0]['name'] == 'integration_token'
assert integrations[0]['user'] == 'test_user'
assert integrations[0]['iat']
assert integrations[0]['status'] == 'active'
assert integrations[0]['role'] == 'chat'
assert integrations[1]['name'] == 'integration_token_with_expiry'
assert integrations[1]['user'] == 'test_user'
assert integrations[1]['iat']
assert integrations[1]['expiry']
assert integrations[1]['status'] == 'active'
assert integrations[1]['role'] == 'designer'
bot = 'test1'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations[0]['name'] == 'integration_token_with_access_limit'
assert integrations[0]['user'] == 'test_user'
assert integrations[0]['iat']
assert integrations[0]['expiry']
assert integrations[0]['access_list'] == ['/api/bot/endpoint']
assert integrations[0]['status'] == 'inactive'
def test_validate_disabled_integration_token(self):
bot = 'test1'
user = 'test_user'
name = 'integration_token_with_access_limit'
payload = {'name': name, 'bot': bot, 'sub': user, 'iat': pytest.integration_iat, 'access_limit': ['/api/bot/endpoint/new']}
with pytest.raises(HTTPException):
Authentication.validate_integration_token(payload)
def test_update_integration_delete_integration_token(self):
bot = 'test1'
user = 'test_user'
token = Authentication.update_integration_token('integration_token_with_access_limit', bot, user, int_status=INTEGRATION_STATUS.DELETED.value)
assert not token
def test_list_integrations_after_deletion(self):
bot = 'test'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations[0]['name'] == 'integration_token'
assert integrations[0]['user'] == 'test_user'
assert integrations[0]['iat']
assert integrations[0]['status'] == 'active'
assert integrations[0]['role'] == 'chat'
assert integrations[1]['name'] == 'integration_token_with_expiry'
assert integrations[1]['user'] == 'test_user'
assert integrations[1]['iat']
assert integrations[1]['expiry']
assert integrations[1]['status'] == 'active'
assert integrations[1]['role'] == 'designer'
bot = 'test1'
integrations = list(IntegrationProcessor.get_integrations(bot))
assert integrations == []
def test_validate_deleted_integration_token(self):
bot = 'test1'
user = 'test_user'
name = 'integration_token_with_access_limit'
payload = {'name': name, 'bot': bot, 'sub': user, 'iat': pytest.integration_iat, 'access_limit': ['/api/bot/endpoint/new']}
with pytest.raises(HTTPException):
Authentication.validate_integration_token(payload)
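# Feedback and UI-config persistence checks.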
def test_add_feedback(self):
AccountProcessor.add_feedback(4.5, 'test', feedback='product is good')
feedback = Feedback.objects(user='test').get()
assert feedback['rating'] == 4.5
assert feedback['scale'] == 5.0
assert feedback['feedback'] == 'product is good'
assert feedback['timestamp']
def test_add_feedback_2(self):
AccountProcessor.add_feedback(5.0, 'test_user', scale=10, feedback='i love kairon')
feedback = Feedback.objects(user='test_user').get()
assert feedback['rating'] == 5.0
assert feedback['scale'] == 10
assert feedback['feedback'] == 'i love kairon'
assert feedback['timestamp']
def test_add_feedback_3(self):
AccountProcessor.add_feedback(5.0, 'test')
feedback = list(Feedback.objects(user='test'))
assert feedback[1]['rating'] == 5.0
assert feedback[1]['scale'] == 5.0
assert not feedback[1]['feedback']
assert feedback[1]['timestamp']
def test_get_ui_config_none(self):
assert AccountProcessor.get_ui_config('test') == {}
def test_add_ui_config(self):
config = {'has_stepper': True, 'has_tour': False}
assert not AccountProcessor.update_ui_config(config, 'test')
config = {'has_stepper': True, 'has_tour': False, 'theme': 'black'}
assert not AccountProcessor.update_ui_config(config, 'test_user')
def test_add_ui_config_duplicate(self):
config = {'has_stepper': True, 'has_tour': False, 'theme': 'white'}
assert not AccountProcessor.update_ui_config(config, 'test')
def test_get_saved_ui_config(self):
config = {'has_stepper': True, 'has_tour': False, 'theme': 'white'}
assert AccountProcessor.get_ui_config('test') == config
config = {'has_stepper': True, 'has_tour': False, 'theme': 'black'}
assert AccountProcessor.get_ui_config('test_user') == config
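# SSO tests: each provider must be explicitly enabled in the environment, and
# provider responses are monkeypatched so no network calls are made.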
@pytest.mark.asyncio
async def test_sso_login_google_not_enabled(self):
with pytest.raises(AppException, match='google login is not enabled'):
await Authentication.get_redirect_url("google")
request = Request({'type': 'http',
'headers': Headers({}).raw,
'query_string': 'code=AQDKEbWXmRjtjiPdGUxXSTuye8ggMZvN9A_cXf1Bw9j_FLSe_Tuwsf_EP-LmmHVAQqTIhqL1Yj7mnsnBbsQdSPLC_4QmJ1GJqM--mbDR0l7UAKVxWdtqy8YAK60Ws02EhjydiIKJ7duyccCa7vXZN01XPAanHak2vvp1URPMvmIMgjEcMyI-IJR0k9PR5NHCEKUmdqeeFBkyFbTtjizGvjYee7kFt7T6_-6DT3q9_1fPvC9VRVPa7ppkJOD0n6NW4smjtpLrEckjO5UF3ekOCNfISYrRdIU8LSMv0RU3i0ALgK2CDyp7rSzOwrkpw6780Ix-QtgFOF4T7scDYR7ZqG6HY5vljBt_lUE-ZWjv-zT_QHhv08Dm-9AoeC_yGNx1Wb8&state=f7ad9a88-be24-4d88-a3bd-3f02b4b12a18&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none'})
with pytest.raises(AppException, match='google login is not enabled'):
await Authentication.verify_and_process(request, "google")
@pytest.mark.asyncio
async def test_sso_login_facebook_not_enabled(self):
with pytest.raises(AppException, match='facebook login is not enabled'):
await Authentication.get_redirect_url("facebook")
request = Request({'type': 'http',
'headers': Headers({}).raw,
'query_string': 'code=AQDKEbWXmRjtjiPdGUxXSTuye8ggMZvN9A_cXf1Bw9j_FLSe_Tuwsf_EP-LmmHVAQqTIhqL1Yj7mnsnBbsQdSPLC_4QmJ1GJqM--mbDR0l7UAKVxWdtqy8YAK60Ws02EhjydiIKJ7duyccCa7vXZN01XPAanHak2vvp1URPMvmIMgjEcMyI-IJR0k9PR5NHCEKUmdqeeFBkyFbTtjizGvjYee7kFt7T6_-6DT3q9_1fPvC9VRVPa7ppkJOD0n6NW4smjtpLrEckjO5UF3ekOCNfISYrRdIU8LSMv0RU3i0ALgK2CDyp7rSzOwrkpw6780Ix-QtgFOF4T7scDYR7ZqG6HY5vljBt_lUE-ZWjv-zT_QHhv08Dm-9AoeC_yGNx1Wb8&state=f7ad9a88-be24-4d88-a3bd-3f02b4b12a18&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none'})
with pytest.raises(AppException, match='facebook login is not enabled'):
await Authentication.verify_and_process(request, "facebook")
@pytest.mark.asyncio
async def test_sso_login_linkedin_not_enabled(self):
with pytest.raises(AppException, match='linkedin login is not enabled'):
await Authentication.get_redirect_url("linkedin")
request = Request({'type': 'http',
'headers': Headers({}).raw,
'query_string': 'code=AQDKEbWXmRjtjiPdGUxXSTuye8ggMZvN9A_cXf1Bw9j_FLSe_Tuwsf_EP-LmmHVAQqTIhqL1Yj7mnsnBbsQdSPLC_4QmJ1GJqM--mbDR0l7UAKVxWdtqy8YAK60Ws02EhjydiIKJ7duyccCa7vXZN01XPAanHak2vvp1URPMvmIMgjEcMyI-IJR0k9PR5NHCEKUmdqeeFBkyFbTtjizGvjYee7kFt7T6_-6DT3q9_1fPvC9VRVPa7ppkJOD0n6NW4smjtpLrEckjO5UF3ekOCNfISYrRdIU8LSMv0RU3i0ALgK2CDyp7rSzOwrkpw6780Ix-QtgFOF4T7scDYR7ZqG6HY5vljBt_lUE-ZWjv-zT_QHhv08Dm-9AoeC_yGNx1Wb8&state=f7ad9a88-be24-4d88-a3bd-3f02b4b12a18&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none'})
with pytest.raises(AppException, match='linkedin login is not enabled'):
await Authentication.verify_and_process(request, "linkedin")
@pytest.mark.asyncio
async def test_verify_and_process_google(self, monkeypatch):
Utility.environment['sso']['google']['enable'] = True
async def _mock_google_response(*args, **kwargs):
return OpenID(
id='116918187277293076263',
email='monisha.ks@digite.com',
first_name='Monisha',
last_name='KS',
display_name='Monisha KS',
picture='https://lh3.googleusercontent.com/a/AATXAJxqb5pnbXi5Yryt_9TPdPiB8mQe8Lk613-4ytus=s96-c',
provider='google')
def _mock_user_details(*args, **kwargs):
return {"email": "monisha.ks@digite.com"}
monkeypatch.setattr(AccountProcessor, "get_user", _mock_user_details)
monkeypatch.setattr(AccountProcessor, "get_user_details", _mock_user_details)
monkeypatch.setattr(GoogleSSO, "verify_and_process", _mock_google_response)
request = Request({'type': 'http',
'headers': Headers({}).raw,
'query_string': 'code=AQDKEbWXmRjtjiPdGUxXSTuye8ggMZvN9A_cXf1Bw9j_FLSe_Tuwsf_EP-LmmHVAQqTIhqL1Yj7mnsnBbsQdSPLC_4QmJ1GJqM--mbDR0l7UAKVxWdtqy8YAK60Ws02EhjydiIKJ7duyccCa7vXZN01XPAanHak2vvp1URPMvmIMgjEcMyI-IJR0k9PR5NHCEKUmdqeeFBkyFbTtjizGvjYee7kFt7T6_-6DT3q9_1fPvC9VRVPa7ppkJOD0n6NW4smjtpLrEckjO5UF3ekOCNfISYrRdIU8LSMv0RU3i0ALgK2CDyp7rSzOwrkpw6780Ix-QtgFOF4T7scDYR7ZqG6HY5vljBt_lUE-ZWjv-zT_QHhv08Dm-9AoeC_yGNx1Wb8&state=f7ad9a88-be24-4d88-a3bd-3f02b4b12a18&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none'})
existing_user, user, token = await Authentication.verify_and_process(request, "google")
assert Utility.decode_limited_access_token(token)["sub"] == "monisha.ks@digite.com"
assert user['email'] == 'monisha.ks@digite.com'
assert user['first_name'] == 'Monisha'
assert user['last_name'] == 'KS'
assert Utility.check_empty_string(user.get('password'))
assert existing_user
@pytest.mark.asyncio
async def test_verify_and_process_user_doesnt_exist_google(self, monkeypatch):
async def _mock_google_response(*args, **kwargs):
return OpenID(
id='116918187277293076263',
email='monisha.ks@digite.com',
first_name='Monisha',
last_name='KS',
display_name='Monisha KS',
picture='https://lh3.googleusercontent.com/a/AATXAJxqb5pnbXi5Yryt_9TPdPiB8mQe8Lk613-4ytus=s96-c',
provider='google')
monkeypatch.setattr(GoogleSSO, "verify_and_process", _mock_google_response)
request = Request({'type': 'http',
'headers': Headers({}).raw,
'query_string': 'code=AQDKEbWXmRjtjiPdGUxXSTuye8ggMZvN9A_cXf1Bw9j_FLSe_Tuwsf_EP-LmmHVAQqTIhqL1Yj7mnsnBbsQdSPLC_4QmJ1GJqM--mbDR0l7UAKVxWdtqy8YAK60Ws02EhjydiIKJ7duyccCa7vXZN01XPAanHak2vvp1URPMvmIMgjEcMyI-IJR0k9PR5NHCEKUmdqeeFBkyFbTtjizGvjYee7kFt7T6_-6DT3q9_1fPvC9VRVPa7ppkJOD0n6NW4smjtpLrEckjO5UF3ekOCNfISYrRdIU8LSMv0RU3i0ALgK2CDyp7rSzOwrkpw6780Ix-QtgFOF4T7scDYR7ZqG6HY5vljBt_lUE-ZWjv-zT_QHhv08Dm-9AoeC_yGNx1Wb8&state=f7ad9a88-be24-4d88-a3bd-3f02b4b12a18&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none'})
existing_user, user, token = await Authentication.verify_and_process(request, "google")
assert user['email'] == 'monisha.ks@digite.com'
assert user['first_name'] == 'Monisha'
assert user['last_name'] == 'KS'
assert not Utility.check_empty_string(user.get('password').get_secret_value())
assert user.get('account') == user.get('email')
assert not existing_user
user = AccountProcessor.get_user_details('monisha.ks@digite.com')
assert all(
user[key] is False if key == "is_integration_user" else user[key]
for key in user.keys()
)
assert len(list(AccountProcessor.list_bots(user['account']))) == 1
assert not AccountProcessor.is_user_confirmed(user['email'])
@pytest.mark.asyncio
async def test_invalid_ssostate_google(*args, **kwargs):
request = Request({'type': 'http',
'headers': Headers({}).raw,
'query_string': 'code=AQDKEbWXmRjtjiPdGUxXSTuye8ggMZvN9A_cXf1Bw9j_FLSe_Tuwsf_EP-LmmHVAQqTIhqL1Yj7mnsnBbsQdSPLC_4QmJ1GJqM--mbDR0l7UAKVxWdtqy8YAK60Ws02EhjydiIKJ7duyccCa7vXZN01XPAanHak2vvp1URPMvmIMgjEcMyI-IJR0k9PR5NHCEKUmdqeeFBkyFbTtjizGvjYee7kFt7T6_-6DT3q9_1fPvC9VRVPa7ppkJOD0n6NW4smjtpLrEckjO5UF3ekOCNfISYrRdIU8LSMv0RU3i0ALgK2CDyp7rSzOwrkpw6780Ix-QtgFOF4T7scDYR7ZqG6HY5vljBt_lUE-ZWjv-zT_QHhv08Dm-9AoeC_yGNx1Wb8&state=f7ad9a88-be24-4d88-a3bd-3f02b4b12a18&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none'})
with pytest.raises(AppException) as e:
await Authentication.verify_and_process(request, "google")
assert 'Failed to verify with google' in str(e)
@pytest.mark.asyncio
async def test_get_redirect_url_google(self, monkeypatch):
discovery_url = 'https://discovery.url.localhost/o/oauth2/v2/auth?response_type=code&client_id'
async def _mock_get_discovery_doc(*args, **kwargs):
return {'authorization_endpoint': discovery_url}
monkeypatch.setattr(GoogleSSO, 'get_discovery_document', _mock_get_discovery_doc)
assert isinstance(await Authentication.get_redirect_url("google"), RedirectResponse)
@pytest.mark.asyncio
async def test_verify_and_process_facebook(self, monkeypatch):
Utility.environment['sso']['facebook']['enable'] = True
async def _mock_facebook_response(*args, **kwargs):
return OpenID(
id='107921368422696',
email='monisha.ks@digite.com',
first_name='Moni',
last_name='Shareddy',
display_name='Monisha Shareddy',
picture='https://scontent-bom1-2.xx.fbcdn.net/v/t1.30497-1/cp0/c15.0.50.50a/p50x50/84628273_176159830277856_972693363922829312_n.jpg?_nc_cat=1&ccb=1-5&_nc_sid=12b3be&_nc_ohc=reTAAmyXfF0AX9vbxxH&_nc_ht=scontent-bom1-2.xx&edm=AP4hL3IEAAAA&oh=00_AT_6IOixo-clV4B1Gthr_UabmxEzz50ri6yAhhXJzlbFeQ&oe=61F21F38',
provider='facebook')
def _mock_user_details(*args, **kwargs):
return {"email": "monisha.ks@digite.com"}
monkeypatch.setattr(AccountProcessor, "get_user", _mock_user_details)
monkeypatch.setattr(AccountProcessor, "get_user_details", _mock_user_details)
monkeypatch.setattr(FacebookSSO, "verify_and_process", _mock_facebook_response)
request = Request({'type': 'http',
'headers': Headers({}).raw,
'query_string': 'code=AQDEkezmJoa3hfyVOafJkHbXG5OJNV3dZQ4gElP3WS71LJbErkK6ljLq31C0B3xRw2dv2G4Fh9mA2twjBVrQZfv_j0MYBS8xq0DEAg08YTZ2Kd1mPJ2HVDF5GnrhZcl2V1qpcO0pGzVQAFMLVRKVWxmirya0uqm150ZLHL_xN9NZjCvk1DRnOXKYXXZtaaU-HgO22Rxxzo90hTtW4mLBl7Vg55SRmic6p1r3KAkyfnAVTLSNPhaX2I9KUgeUjQ6EwGz3NtwjxKLPnsC1yPZqQMGBS6u2lHt-BOjj80iJmukbLH_35Xzn6Mv6xVSjqGwTjNEnn6N5dyT-3_X_vmYTlcGpr8LOn6tTf7kz_ysauexbGxn883m_thFV3Ozb9oP9u78)]'})
existing_user, user, token = await Authentication.verify_and_process(request, "facebook")
assert Utility.decode_limited_access_token(token)["sub"] == "monisha.ks@digite.com"
assert user['email'] == 'monisha.ks@digite.com'
assert user['first_name'] == 'Moni'
assert user['last_name'] == 'Shareddy'
assert Utility.check_empty_string(user.get('password'))
assert existing_user
@pytest.mark.asyncio
async def test_verify_and_process_user_doesnt_exist_facebook(self, monkeypatch):
async def _mock_facebook_response(*args, **kwargs):
return OpenID(
id='107921368422696',
email='monishaks@digite.com',
first_name='Moni',
last_name='Shareddy',
display_name='Monisha Shareddy',
picture='https://scontent-bom1-2.xx.fbcdn.net/v/t1.30497-1/cp0/c15.0.50.50a/p50x50/84628273_176159830277856_972693363922829312_n.jpg?_nc_cat=1&ccb=1-5&_nc_sid=12b3be&_nc_ohc=reTAAmyXfF0AX9vbxxH&_nc_ht=scontent-bom1-2.xx&edm=AP4hL3IEAAAA&oh=00_AT_6IOixo-clV4B1Gthr_UabmxEzz50ri6yAhhXJzlbFeQ&oe=61F21F38',
provider='facebook')
monkeypatch.setattr(FacebookSSO, "verify_and_process", _mock_facebook_response)
request = Request({'type': 'http',
'headers': Headers({'cookie': "ssostate=a257c5b8-4293-49db-a773-2c6fd78df016"}).raw,
'query_string': 'code=AQB4u0qDPLiqREyHXEmGydCw-JBg-vU1VL9yfR1PLuGijlyGsZs7CoYe98XhQ-jkQu_jYj-DMefRL_AcAvhenbBEuQ5Bhd18B9gOfDwe0JvB-Y5TAm21MrhVZtDxSm9VTSZVaPrwsWeN0dQYr2OgG9I0qPoM-OBEsOdJRYpCn-nKBKFGAbXb6AR7KTHhQtRDHHrylLe0QcSz2p1FjlLVWOrBh-A3o5xmvsaXaRtwYfYdJuxOBz2W7DlVw9m6qP9fx4gAzkp-j1sNKmiZjuHBsHJvKQsBG7xCw7etZh5Uie49R-WtP87-yic_CMYulju5bYRWTMd-549QWwjMW8lIQkPXStGwbU0JaOy9BHKmB6iUSrp0jIyo1RYdBo6Ji81Jyms&state=a257c5b8-4293-49db-a773-2c6fd78df016'})
existing_user, user, token = await Authentication.verify_and_process(request, "facebook")
assert user['email'] == 'monishaks@digite.com'
assert user['first_name'] == 'Moni'
assert user['last_name'] == 'Shareddy'
assert not Utility.check_empty_string(user.get('password').get_secret_value())
assert user.get('account') == user.get('email')
assert not existing_user
user = AccountProcessor.get_user_details('monishaks@digite.com')
assert all(
user[key] is False if key == "is_integration_user" else user[key]
for key in user.keys()
)
assert len(list(AccountProcessor.list_bots(user['account']))) == 1
assert not AccountProcessor.is_user_confirmed(user['email'])
@pytest.mark.asyncio
async def test_get_redirect_url_facebook(self):
assert isinstance(await Authentication.get_redirect_url("facebook"), RedirectResponse)
@pytest.mark.asyncio
async def test_invalid_ssostate_facebook(*args, **kwargs):
request = Request({'type': 'http',
'headers': Headers({'cookie': "ssostate=a257c5b8-4293-49db-a773-2c6fd78df016"}).raw,
'query_string': 'code=AQB4u0qDPLiqREyHXEmGydCw-JBg-vU1VL9yfR1PLuGijlyGsZs7CoYe98XhQ-jkQu_jYj-DMefRL_AcAvhenbBEuQ5Bhd18B9gOfDwe0JvB-Y5TAm21MrhVZtDxSm9VTSZVaPrwsWeN0dQYr2OgG9I0qPoM-OBEsOdJRYpCn-nKBKFGAbXb6AR7KTHhQtRDHHrylLe0QcSz2p1FjlLVWOrBh-A3o5xmvsaXaRtwYfYdJuxOBz2W7DlVw9m6qP9fx4gAzkp-j1sNKmiZjuHBsHJvKQsBG7xCw7etZh5Uie49R-WtP87-yic_CMYulju5bYRWTMd-549QWwjMW8lIQkPXStGwbU0JaOy9BHKmB6iUSrp0jIyo1RYdBo6Ji81Jyms&state=a257c5b8-4293-49db-a773-2c6fd78df016'})
with pytest.raises(AppException) as e:
await Authentication.verify_and_process(request, "facebook")
assert 'Failed to verify with facebook' in str(e)
@pytest.mark.asyncio
async def test_get_redirect_url_linkedin(self):
Utility.environment['sso']['linkedin']['enable'] = True
response = await Authentication.get_redirect_url("linkedin")
assert isinstance(response, RedirectResponse)
@pytest.mark.asyncio
async def test_sso_linkedin_login_error(self, httpx_mock: HTTPXMock):
httpx_mock.add_response(
method=responses.POST,
url=await LoginSSOFactory.get_client('linkedin').sso_client.token_endpoint,
json={'access_token': '1234567890'},
)
httpx_mock.add_response(
method=responses.GET,
url=await LoginSSOFactory.get_client('linkedin').sso_client.userinfo_endpoint,
json={'first_name': 'udit', 'last_name': 'pandey', 'profile_url': '1234::mkfnwuefhbwi'},
)
httpx_mock.add_response(
method=responses.GET,
url=await LoginSSOFactory.get_client('linkedin').sso_client.useremail_endpoint,
json={'emailAddress': '1234567890'},
)
scope = {
"type": "http",
"http_version": "1.1",
"method": "GET",
"scheme": "http",
"path": "/",
'query_string': b'code=4/0AX4XfWh-AOKSPocewBBm0KAE_5j1qGNNWJAdbRcZ8OYKUU1KlwGqx_kOz6yzlZN-jUBi0Q&state={LoginSSOFactory.linkedin_sso.state}&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none',
"headers": Headers({
'cookie': f"ssostate={LoginSSOFactory.get_client('linkedin').sso_client.state}",
'host': 'www.example.org',
'accept': 'application/json',
}).raw,
"client": ("134.56.78.4", 1453),
"server": ("www.example.org", 443),
}
request = Request(scope=scope)
request._url = URL(scope=scope)
with pytest.raises(AppException, match='User was not verified with linkedin'):
await Authentication.verify_and_process(request, "linkedin")
@pytest.mark.asyncio
async def test_sso_linkedin_login_success(self, httpx_mock: HTTPXMock, monkeypatch):
httpx_mock.add_response(
method=responses.POST,
url=await LoginSSOFactory.get_client('linkedin').sso_client.token_endpoint,
json={'access_token': '1234567890'},
)
httpx_mock.add_response(
method=responses.GET,
url=await LoginSSOFactory.get_client('linkedin').sso_client.userinfo_endpoint,
json={'localizedFirstName': 'monisha', 'localizedLastName': 'reddy'},
)
httpx_mock.add_response(
method=responses.GET,
url=await LoginSSOFactory.get_client('linkedin').sso_client.useremail_endpoint,
json={'elements': [{'handle~': {'emailAddress': 'monisha.ks@digite.com'}}]}
)
scope = {
"type": "http",
"http_version": "1.1",
"method": "GET",
"scheme": "https",
"path": "/",
'query_string': b'code=4/0AX4XfWh-AOKSPocewBBm0KAE_5j1qGNNWJAdbRcZ8OYKUU1KlwGqx_kOz6yzlZN-jUBi0Q&state={LoginSSOFactory.linkedin_sso.state}&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none',
"headers": Headers({
'cookie': f"ssostate={LoginSSOFactory.get_client('linkedin').sso_client.state}",
'host': 'www.example.org',
'accept': 'application/json',
}).raw,
"client": ("134.56.78.4", 1453),
"server": ("www.example.org", 443),
}
def _mock_user_details(*args, **kwargs):
return {"email": "monisha.ks@digite.com"}
monkeypatch.setattr(AccountProcessor, "get_user", _mock_user_details)
monkeypatch.setattr(AccountProcessor, "get_user_details", _mock_user_details)
request = Request(scope=scope)
request._url = URL(scope=scope)
existing_user, user, token = await Authentication.verify_and_process(request, "linkedin")
assert Utility.decode_limited_access_token(token)["sub"] == "monisha.ks@digite.com"
assert user['email'] == 'monisha.ks@digite.com'
assert user['first_name'] == 'monisha'
assert user['last_name'] == 'reddy'
assert Utility.check_empty_string(user.get('password'))
assert existing_user
@pytest.mark.asyncio
async def test_sso_linkedin_login_new_user(self, httpx_mock: HTTPXMock, monkeypatch):
httpx_mock.add_response(
method=responses.POST,
url=await LoginSSOFactory.get_client('linkedin').sso_client.token_endpoint,
json={'access_token': '1234567890'},
)
httpx_mock.add_response(
method=responses.GET,
url=await LoginSSOFactory.get_client('linkedin').sso_client.userinfo_endpoint,
json={'localizedFirstName': 'monisha', 'localizedLastName': 'reddy'},
)
httpx_mock.add_response(
method=responses.GET,
url=await LoginSSOFactory.get_client('linkedin').sso_client.useremail_endpoint,
json={'elements': [{'handle~': {'emailAddress': 'monisha.ks.ks@digite.com'}}]}
)
scope = {
"type": "http",
"http_version": "1.1",
"method": "GET",
"scheme": "https",
"path": "/",
'query_string': b'code=4/0AX4XfWh-AOKSPocewBBm0KAE_5j1qGNNWJAdbRcZ8OYKUU1KlwGqx_kOz6yzlZN-jUBi0Q&state={LoginSSOFactory.linkedin_sso.state}&scope=email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid&authuser=0&hd=digite.com&prompt=none',
"headers": Headers({
'cookie': f"ssostate={LoginSSOFactory.get_client('linkedin').sso_client.state}",
'host': 'www.example.org',
'accept': 'application/json',
}).raw,
"client": ("134.56.78.4", 1453),
"server": ("www.example.org", 443),
}
request = Request(scope=scope)
request._url = URL(scope=scope)
existing_user, user, token = await Authentication.verify_and_process(request, "linkedin")
assert Utility.decode_limited_access_token(token)["sub"] == "monisha.ks.ks@digite.com"
assert user['email'] == 'monisha.ks.ks@digite.com'
assert user['first_name'] == 'monisha'
assert user['last_name'] == 'reddy'
assert not Utility.check_empty_string(user.get('password').get_secret_value())
assert not existing_user
user = AccountProcessor.get_user_details('monisha.ks@digite.com')
assert all(
user[key] is False if key == "is_integration_user" else user[key]
for key in user.keys()
)
user = AccountProcessor.get_user_details('monishaks@digite.com')
assert all(
user[key] is False if key == "is_integration_user" else user[key]
for key in user.keys()
)
assert len(list(AccountProcessor.list_bots(user['account']))) == 1
assert not AccountProcessor.is_user_confirmed(user['email'])
def test_sso_login_client_linkedin(self):
assert LoginSSOFactory.get_client('linkedin').sso_client.client_secret == Utility.environment['sso']['linkedin']['client_secret']
assert LoginSSOFactory.get_client('linkedin').sso_client.client_id == Utility.environment['sso']['linkedin']['client_id']
assert LoginSSOFactory.get_client('linkedin').sso_client.redirect_uri == urljoin(Utility.environment['sso']['redirect_url'], 'linkedin')
def test_sso_login_client_gmail(self):
assert LoginSSOFactory.get_client('google').sso_client.client_secret == Utility.environment['sso']['google']['client_secret']
assert LoginSSOFactory.get_client('google').sso_client.client_id == Utility.environment['sso']['google']['client_id']
assert LoginSSOFactory.get_client('google').sso_client.redirect_uri == urljoin(Utility.environment['sso']['redirect_url'], 'google')
def test_sso_login_client_facebook(self):
assert LoginSSOFactory.get_client('facebook').sso_client.client_secret == Utility.environment['sso']['facebook']['client_secret']
assert LoginSSOFactory.get_client('facebook').sso_client.client_id == Utility.environment['sso']['facebook']['client_id']
assert LoginSSOFactory.get_client('facebook').sso_client.redirect_uri == urljoin(Utility.environment['sso']['redirect_url'], 'facebook')
| 51.091986 | 642 | 0.667267 | 9,928 | 90,535 | 5.829674 | 0.056608 | 0.018626 | 0.02571 | 0.026124 | 0.871106 | 0.833561 | 0.805069 | 0.772863 | 0.743767 | 0.718835 | 0 | 0.023444 | 0.213188 | 90,535 | 1,771 | 643 | 51.120836 | 0.789047 | 0.00053 | 0 | 0.643969 | 0 | 0.009079 | 0.230005 | 0.08544 | 0 | 0 | 0 | 0 | 0.212062 | 1 | 0.102464 | false | 0.035019 | 0.018158 | 0.009728 | 0.13489 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2da2c3b2fab2c0331669259af318bd33b090db3e | 1,019 | py | Python | tests/_gutt/airfly/test_utils.py | ryanchao2012/airfly | 230ddd88885defc67485fa0c51f66c4a67ae98a9 | [
"MIT"
] | 7 | 2021-09-27T11:38:48.000Z | 2022-02-01T06:06:24.000Z | tests/_gutt/airfly/test_utils.py | ryanchao2012/airfly | 230ddd88885defc67485fa0c51f66c4a67ae98a9 | [
"MIT"
] | null | null | null | tests/_gutt/airfly/test_utils.py | ryanchao2012/airfly | 230ddd88885defc67485fa0c51f66c4a67ae98a9 | [
"MIT"
] | null | null | null | # Smoke tests: each test only checks that the named helper can be imported
# from airfly.utils and is truthy.
def test_blacking():
    from airfly.utils import blacking
    assert blacking


def test_isorting():
    from airfly.utils import isorting
    assert isorting


def test_collect_objects():
    from airfly.utils import collect_objects
    assert collect_objects


def test_load_module_by_name():
    from airfly.utils import load_module_by_name
    assert load_module_by_name


def test_qualname():
    from airfly.utils import qualname
    assert qualname


def test__escape_any_commandline_parser():
    from airfly.utils import _escape_any_commandline_parser
    assert _escape_any_commandline_parser


def test__collect_from_package():
    from airfly.utils import _collect_from_package
    assert _collect_from_package


def test__collect_from_module():
    from airfly.utils import _collect_from_module
    assert _collect_from_module


def test_makefile():
    from airfly.utils import makefile
    assert makefile


def test__writefile():
    from airfly.utils import _writefile
    assert _writefile
| 17.271186 | 59 | 0.776251 | 133 | 1,019 | 5.533835 | 0.172932 | 0.095109 | 0.203804 | 0.285326 | 0.125 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184495 | 1,019 | 58 | 60 | 17.568966 | 0.88568 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2db48973cfee2bbbfd71adb5c0a93521dc9d9332 | 46 | py | Python | pytools/modules/musicplayer/__init__.py | maopucheng/pytools | 7d42b0fb1ef539559d931db7b70ef6725d32617a | [
"MIT"
] | 757 | 2018-08-25T07:59:26.000Z | 2021-12-20T12:44:11.000Z | pytools/modules/musicplayer/__init__.py | junyang-zhou/pytools | eca4dbace589ba74a95628d1c285e75e20ea7d1e | [
"MIT"
] | 7 | 2020-02-19T00:42:44.000Z | 2021-09-04T07:42:51.000Z | pytools/modules/musicplayer/__init__.py | junyang-zhou/pytools | eca4dbace589ba74a95628d1c285e75e20ea7d1e | [
"MIT"
] | 485 | 2018-08-25T13:53:51.000Z | 2021-12-21T05:11:08.000Z | '''Initialization'''
from .musicplayer import MusicPlayer | 23 | 36 | 0.76087 | 5 | 46 | 7 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 46 | 2 | 36 | 23 | 0.833333 | 0.065217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2dd39304f13e644f36288e71cafe6109d82f5266 | 31 | py | Python | program-16may-slack-data/code.py | Milindghag2yahoo/program-16may-slack | f9eb755acd6e2110fc89086d0a7d87d6a3f22ac7 | [
"MIT"
] | null | null | null | program-16may-slack-data/code.py | Milindghag2yahoo/program-16may-slack | f9eb755acd6e2110fc89086d0a7d87d6a3f22ac7 | [
"MIT"
] | null | null | null | program-16may-slack-data/code.py | Milindghag2yahoo/program-16may-slack | f9eb755acd6e2110fc89086d0a7d87d6a3f22ac7 | [
"MIT"
] | null | null | null | # --------------
print(7+10)
| 6.2 | 16 | 0.258065 | 3 | 31 | 2.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 0.16129 | 31 | 4 | 17 | 7.75 | 0.192308 | 0.451613 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
2def14345f75eed3599d0811808c76efc3d9b096 | 2,164 | py | Python | epytope/Data/pssms/tepitopepan/mat/DRB1_1612_9.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 7 | 2021-02-01T18:11:28.000Z | 2022-01-31T19:14:07.000Z | epytope/Data/pssms/tepitopepan/mat/DRB1_1612_9.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 22 | 2021-01-02T15:25:23.000Z | 2022-03-14T11:32:53.000Z | epytope/Data/pssms/tepitopepan/mat/DRB1_1612_9.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 4 | 2021-05-28T08:50:38.000Z | 2022-03-14T11:45:32.000Z | DRB1_1612_9 = {0: {'A': -999.0, 'E': -999.0, 'D': -999.0, 'G': -999.0, 'F': -0.004754, 'I': -0.99525, 'H': -999.0, 'K': -999.0, 'M': -0.99525, 'L': -0.99525, 'N': -999.0, 'Q': -999.0, 'P': -999.0, 'S': -999.0, 'R': -999.0, 'T': -999.0, 'W': -0.004754, 'V': -0.99525, 'Y': -0.004754}, 1: {'A': 0.0, 'E': 0.1, 'D': -1.3, 'G': 0.5, 'F': 0.8, 'I': 1.1, 'H': 0.8, 'K': 1.1, 'M': 1.1, 'L': 1.0, 'N': 0.8, 'Q': 1.2, 'P': -0.5, 'S': -0.3, 'R': 2.2, 'T': 0.0, 'W': -0.1, 'V': 2.1, 'Y': 0.9}, 2: {'A': 0.0, 'E': -1.2, 'D': -1.3, 'G': 0.2, 'F': 0.8, 'I': 1.5, 'H': 0.2, 'K': 0.0, 'M': 1.4, 'L': 1.0, 'N': 0.5, 'Q': 0.0, 'P': 0.3, 'S': 0.2, 'R': 0.7, 'T': 0.0, 'W': 0.0, 'V': 0.5, 'Y': 0.8}, 3: {'A': 0.0, 'E': -1.3316, 'D': -1.3227, 'G': -1.2381, 'F': 0.90261, 'I': 0.6293, 'H': 0.052757, 'K': -0.32254, 'M': 1.0076, 'L': 0.80581, 'N': -0.077356, 'Q': -0.39629, 'P': -1.3, 'S': -0.60532, 'R': -0.28082, 'T': -0.58521, 'W': 0.13917, 'V': 0.17743, 'Y': 0.28781}, 4: {'A': 0.0, 'E': 0.0, 'D': 0.0, 'G': 0.0, 'F': 0.0, 'I': 0.0, 'H': 0.0, 'K': 0.0, 'M': 0.0, 'L': 0.0, 'N': 0.0, 'Q': 0.0, 'P': 0.0, 'S': 0.0, 'R': 0.0, 'T': 0.0, 'W': 0.0, 'V': 0.0, 'Y': 0.0}, 5: {'A': 0.0, 'E': -1.2696, 'D': -1.2197, 'G': -0.12422, 'F': -0.78557, 'I': 0.28466, 'H': -0.42328, 'K': 0.13265, 'M': -0.38108, 'L': 0.090958, 'N': 0.25559, 'Q': -0.66847, 'P': 0.08304, 'S': 0.36721, 'R': 0.71489, 'T': 0.44212, 'W': -0.83077, 'V': 0.37124, 'Y': -0.44413}, 6: {'A': 0.0, 'E': -1.2305, 'D': -1.7332, 'G': -0.88111, 'F': -0.13739, 'I': 0.086368, 'H': -0.24659, 'K': -0.56958, 'M': 0.5618, 'L': 0.54264, 'N': -0.16143, 'Q': -0.40519, 'P': -0.71731, 'S': -0.53219, 'R': -0.25152, 'T': -0.80543, 'W': -0.39042, 'V': -0.32505, 'Y': -0.31744}, 7: {'A': 0.0, 'E': 0.0, 'D': 0.0, 'G': 0.0, 'F': 0.0, 'I': 0.0, 'H': 0.0, 'K': 0.0, 'M': 0.0, 'L': 0.0, 'N': 0.0, 'Q': 0.0, 'P': 0.0, 'S': 0.0, 'R': 0.0, 'T': 0.0, 'W': 0.0, 'V': 0.0, 'Y': 0.0}, 8: {'A': 0.0, 'E': -1.8514, 'D': -1.86, 'G': -0.76974, 'F': -0.37372, 'I': 0.68078, 'H': -1.0629, 'K': -1.6348, 'M': 0.10043, 'L': 0.46076, 'N': -1.1859, 'Q': -1.5258, 'P': -1.0812, 'S': -0.24476, 'R': -0.95816, 'T': -0.22457, 'W': -1.3819, 'V': 0.28332, 'Y': -0.8679}} | 2,164 | 2,164 | 0.395102 | 525 | 2,164 | 1.624762 | 0.2 | 0.114889 | 0.028136 | 0.037515 | 0.225088 | 0.143025 | 0.143025 | 0.143025 | 0.133646 | 0.133646 | 0 | 0.374724 | 0.162662 | 2,164 | 1 | 2,164 | 2,164 | 0.096026 | 0 | 0 | 0 | 0 | 0 | 0.078984 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
932446a0b7454e77dfd41e2488cbf0b980d21c8d | 2,244 | py | Python | examples/templates/keyframes_example.py | jamiemott/TemPy | ebadf32b309a4d713524ef6f902c2556e8ee7108 | [
"Apache-2.0"
] | null | null | null | examples/templates/keyframes_example.py | jamiemott/TemPy | ebadf32b309a4d713524ef6f902c2556e8ee7108 | [
"Apache-2.0"
] | null | null | null | examples/templates/keyframes_example.py | jamiemott/TemPy | ebadf32b309a4d713524ef6f902c2556e8ee7108 | [
"Apache-2.0"
] | null | null | null | from tempy.tags import *
from tempy.elements import Css

animationCSS = Css(
    {
        "@keyframes bounce": {
            "0%": {
                "background-color": "blue;"
            },
            "25%": {
                "background-color": "red;",
                "transform": "translateX(600px);",
                "border-radius": "100%;"
            },
            "50%": {
                "background-color": "red;",
                "transform": "translate(600px, 600px);",
                "border-radius": "0%;"
            },
            "75%": {
                "background-color": "red;",
                "transform": "translateY(600px);",
                "border-radius": "100%;"
            },
            "100%": {
                "background-color": "green;"
            }
        },
        ".animation": {
            "width": "100px;",
            "height": "100px;",
            "animation": "bounce;",
            "animation-duration": "4s;",
            "animation-iteration-count": "infinite;"
        }
    }
)

'''
This is the CSS output produced by the definition above:

@keyframes bounce {
    0% {
        background-color: blue;
    }
    25% {
        background-color: red;
        transform: translateX(600px);
        border-radius: 100%;
    }
    50% {
        background-color: red;
        transform: translate(600px, 600px);
        border-radius: 0%;
    }
    75% {
        background-color: red;
        transform: translateY(600px);
        border-radius: 100%;
    }
    100% {
        background-color: green;
    }
}
.animation {
    width: 100px;
    height: 100px;
    animation: bounce;
    animation-duration: 4s;
    animation-iteration-count: infinite;
}
'''

animationDiv = Div(klass="animation")
text = Text("This is animation demo")
page = Html()(
    Head(
        Meta(charset="UTF-8"),
        Meta(name="viewport", content="width=device-width, initial-scale=1.0")
    ),
    Body()(
        text,
        animationDiv
    )
)

# page.render()
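# A usage sketch: TemPy elements expose render(), which serializes the element
# and its children to an HTML string, so uncommenting the line above (or the
# variant below) produces the final document. Note that animationCSS is built
# but never attached to the page in this example; it would still need to be
# placed in the tree (e.g. inside Head) for the animation styles to apply.
#
#   with open('keyframes_example.html', 'w') as f:
#       f.write(page.render())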
| 24.129032 | 78 | 0.411765 | 161 | 2,244 | 5.73913 | 0.372671 | 0.178571 | 0.116883 | 0.175325 | 0.718615 | 0.718615 | 0.718615 | 0.718615 | 0.718615 | 0.718615 | 0 | 0.060533 | 0.447861 | 2,244 | 92 | 79 | 24.391304 | 0.68523 | 0.005793 | 0 | 0.104167 | 0 | 0 | 0.346415 | 0.018868 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.041667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9327034d6be252cb4e13a9cbde9cc9ec155992a7 | 315 | py | Python | run.py | Dharshan2004/games-day | 94eddc844b4bb062b7e4e0e06469b855e5f0907d | [
"MIT"
] | null | null | null | run.py | Dharshan2004/games-day | 94eddc844b4bb062b7e4e0e06469b855e5f0907d | [
"MIT"
] | null | null | null | run.py | Dharshan2004/games-day | 94eddc844b4bb062b7e4e0e06469b855e5f0907d | [
"MIT"
] | null | null | null | import os
# os.system() returns the command's exit status; abort at the first failing
# Docker step so the success message below is only printed for a clean build.
for cmd in ('docker login -u "dashy2004" -p "12345678qwerty123" repo.treescale.com',
            'docker build -t games-day .',
            'docker tag games-day repo.treescale.com/dashy2004/games-day:latest',
            'docker push repo.treescale.com/dashy2004/games-day:latest'):
    if os.system(cmd) != 0:
        raise SystemExit('Command failed: ' + cmd)
print('The build passed yay!') | 52.5 | 82 | 0.768254 | 49 | 315 | 4.938776 | 0.469388 | 0.132231 | 0.231405 | 0.206612 | 0.322314 | 0.322314 | 0.322314 | 0 | 0 | 0 | 0 | 0.079038 | 0.07619 | 315 | 6 | 83 | 52.5 | 0.752577 | 0 | 0 | 0 | 0 | 0 | 0.759494 | 0.28481 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.166667 | 0.166667 | 0 | 0.166667 | 0.166667 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
9340b3f60c2bf4ee8d82c2fecf3e77454020f743 | 86 | py | Python | integration_test/test_version.py | DongweiYe/muscle3 | 0c2fcf5f62995b8639fc84ce1b983c8a8e6248d0 | [
"Apache-2.0"
] | 11 | 2018-03-12T10:43:46.000Z | 2020-06-01T10:58:56.000Z | integration_test/test_version.py | DongweiYe/muscle3 | 0c2fcf5f62995b8639fc84ce1b983c8a8e6248d0 | [
"Apache-2.0"
] | 85 | 2018-03-03T15:10:56.000Z | 2022-03-18T14:05:14.000Z | integration_test/test_version.py | DongweiYe/muscle3 | 0c2fcf5f62995b8639fc84ce1b983c8a8e6248d0 | [
"Apache-2.0"
] | 6 | 2018-03-12T10:47:11.000Z | 2022-02-03T13:44:07.000Z | import libmuscle
def test_version() -> None:
    assert libmuscle.__version__ != ''
| 14.333333 | 38 | 0.697674 | 9 | 86 | 6.111111 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186047 | 86 | 5 | 39 | 17.2 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9345fa9f17d4213c2620a6be1c90ca4af5e685fe | 89,156 | py | Python | google/cloud/bare_metal_solution_v2/services/bare_metal_solution/async_client.py | renovate-bot/pythonbaremetalsolution | 4e5d230a5468b444c346bd0216db7db02ce60e0c | [
"Apache-2.0"
] | null | null | null | google/cloud/bare_metal_solution_v2/services/bare_metal_solution/async_client.py | renovate-bot/pythonbaremetalsolution | 4e5d230a5468b444c346bd0216db7db02ce60e0c | [
"Apache-2.0"
] | null | null | null | google/cloud/bare_metal_solution_v2/services/bare_metal_solution/async_client.py | renovate-bot/pythonbaremetalsolution | 4e5d230a5468b444c346bd0216db7db02ce60e0c | [
"Apache-2.0"
] | 1 | 2022-03-24T16:06:23.000Z | 2022-03-24T16:06:23.000Z | # -*- coding: utf-8 -*-
# Copyright 2022 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from collections import OrderedDict
import functools
import re
from typing import Dict, Mapping, Optional, Sequence, Tuple, Type, Union
import pkg_resources
from google.api_core.client_options import ClientOptions
from google.api_core import exceptions as core_exceptions
from google.api_core import gapic_v1
from google.api_core import retry as retries
from google.auth import credentials as ga_credentials # type: ignore
from google.oauth2 import service_account # type: ignore
try:
OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
except AttributeError: # pragma: NO COVER
OptionalRetry = Union[retries.Retry, object] # type: ignore
from google.api_core import operation # type: ignore
from google.api_core import operation_async # type: ignore
from google.cloud.bare_metal_solution_v2.services.bare_metal_solution import pagers
from google.cloud.bare_metal_solution_v2.types import baremetalsolution
from google.protobuf import field_mask_pb2 # type: ignore
from google.protobuf import timestamp_pb2 # type: ignore
from .transports.base import BareMetalSolutionTransport, DEFAULT_CLIENT_INFO
from .transports.grpc_asyncio import BareMetalSolutionGrpcAsyncIOTransport
from .client import BareMetalSolutionClient
class BareMetalSolutionAsyncClient:
"""Performs management operations on Bare Metal Solution servers.
The ``baremetalsolution.googleapis.com`` service provides management
capabilities for Bare Metal Solution servers. To access the API
methods, you must assign Bare Metal Solution IAM roles containing
the desired permissions to your staff in your Google Cloud project.
You must also enable the Bare Metal Solution API. Once enabled, the
methods act upon specific servers in your Bare Metal Solution
environment.
"""
_client: BareMetalSolutionClient
DEFAULT_ENDPOINT = BareMetalSolutionClient.DEFAULT_ENDPOINT
DEFAULT_MTLS_ENDPOINT = BareMetalSolutionClient.DEFAULT_MTLS_ENDPOINT
instance_path = staticmethod(BareMetalSolutionClient.instance_path)
parse_instance_path = staticmethod(BareMetalSolutionClient.parse_instance_path)
lun_path = staticmethod(BareMetalSolutionClient.lun_path)
parse_lun_path = staticmethod(BareMetalSolutionClient.parse_lun_path)
network_path = staticmethod(BareMetalSolutionClient.network_path)
parse_network_path = staticmethod(BareMetalSolutionClient.parse_network_path)
snapshot_schedule_policy_path = staticmethod(
BareMetalSolutionClient.snapshot_schedule_policy_path
)
parse_snapshot_schedule_policy_path = staticmethod(
BareMetalSolutionClient.parse_snapshot_schedule_policy_path
)
volume_path = staticmethod(BareMetalSolutionClient.volume_path)
parse_volume_path = staticmethod(BareMetalSolutionClient.parse_volume_path)
volume_snapshot_path = staticmethod(BareMetalSolutionClient.volume_snapshot_path)
parse_volume_snapshot_path = staticmethod(
BareMetalSolutionClient.parse_volume_snapshot_path
)
common_billing_account_path = staticmethod(
BareMetalSolutionClient.common_billing_account_path
)
parse_common_billing_account_path = staticmethod(
BareMetalSolutionClient.parse_common_billing_account_path
)
common_folder_path = staticmethod(BareMetalSolutionClient.common_folder_path)
parse_common_folder_path = staticmethod(
BareMetalSolutionClient.parse_common_folder_path
)
common_organization_path = staticmethod(
BareMetalSolutionClient.common_organization_path
)
parse_common_organization_path = staticmethod(
BareMetalSolutionClient.parse_common_organization_path
)
common_project_path = staticmethod(BareMetalSolutionClient.common_project_path)
parse_common_project_path = staticmethod(
BareMetalSolutionClient.parse_common_project_path
)
common_location_path = staticmethod(BareMetalSolutionClient.common_location_path)
parse_common_location_path = staticmethod(
BareMetalSolutionClient.parse_common_location_path
)
@classmethod
def from_service_account_info(cls, info: dict, *args, **kwargs):
"""Creates an instance of this client using the provided credentials
info.
Args:
info (dict): The service account private key info.
args: Additional arguments to pass to the constructor.
kwargs: Additional arguments to pass to the constructor.
Returns:
BareMetalSolutionAsyncClient: The constructed client.
"""
return BareMetalSolutionClient.from_service_account_info.__func__(BareMetalSolutionAsyncClient, info, *args, **kwargs) # type: ignore
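# A usage sketch for the factory above; the environment variable holding the
# service account key JSON is an assumption:
#
#   import json
#   import os
#
#   info = json.loads(os.environ["SERVICE_ACCOUNT_JSON"])
#   client = BareMetalSolutionAsyncClient.from_service_account_info(info)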
@classmethod
def from_service_account_file(cls, filename: str, *args, **kwargs):
"""Creates an instance of this client using the provided credentials
file.
Args:
filename (str): The path to the service account private key json
file.
args: Additional arguments to pass to the constructor.
kwargs: Additional arguments to pass to the constructor.
Returns:
BareMetalSolutionAsyncClient: The constructed client.
"""
return BareMetalSolutionClient.from_service_account_file.__func__(BareMetalSolutionAsyncClient, filename, *args, **kwargs) # type: ignore
from_service_account_json = from_service_account_file
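# The file-based variants cover the common case of a key file downloaded from
# the Cloud Console; the path below is an assumption:
#
#   client = BareMetalSolutionAsyncClient.from_service_account_file(
#       "service-account.json"
#   )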
@classmethod
def get_mtls_endpoint_and_cert_source(
cls, client_options: Optional[ClientOptions] = None
):
"""Return the API endpoint and client cert source for mutual TLS.
The client cert source is determined in the following order:
(1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
client cert source is None.
(2) if `client_options.client_cert_source` is provided, use the provided one; if the
default client cert source exists, use the default one; otherwise the client cert
source is None.
The API endpoint is determined in the following order:
(1) if `client_options.api_endpoint` is provided, use the provided one.
(2) if the `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable is "always", use the
default mTLS endpoint; if the environment variable is "never", use the default API
endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
use the default API endpoint.
More details can be found at https://google.aip.dev/auth/4114.
Args:
client_options (google.api_core.client_options.ClientOptions): Custom options for the
client. Only the `api_endpoint` and `client_cert_source` properties may be used
in this method.
Returns:
Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
client cert source to use.
Raises:
google.auth.exceptions.MutualTLSChannelError: If any errors happen.
"""
return BareMetalSolutionClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
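# A usage sketch of the helper above, for callers that want to inspect the
# endpoint/cert decision before constructing a client (options illustrative):
#
#   options = ClientOptions()
#   endpoint, cert_source = (
#       BareMetalSolutionAsyncClient.get_mtls_endpoint_and_cert_source(options)
#   )
#   # `endpoint` is the mTLS endpoint only when a client certificate source is
#   # available and the environment variables described above permit it.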
@property
def transport(self) -> BareMetalSolutionTransport:
"""Returns the transport used by the client instance.
Returns:
BareMetalSolutionTransport: The transport used by the client instance.
"""
return self._client.transport
get_transport_class = functools.partial(
type(BareMetalSolutionClient).get_transport_class, type(BareMetalSolutionClient)
)
def __init__(
self,
*,
credentials: ga_credentials.Credentials = None,
transport: Union[str, BareMetalSolutionTransport] = "grpc_asyncio",
client_options: ClientOptions = None,
client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
) -> None:
"""Instantiates the bare metal solution client.
Args:
credentials (Optional[google.auth.credentials.Credentials]): The
authorization credentials to attach to requests. These
credentials identify the application to the service; if none
are specified, the client will attempt to ascertain the
credentials from the environment.
transport (Union[str, ~.BareMetalSolutionTransport]): The
transport to use. If set to None, a transport is chosen
automatically.
client_options (ClientOptions): Custom options for the client. It
won't take effect if a ``transport`` instance is provided.
(1) The ``api_endpoint`` property can be used to override the
default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
environment variable can also be used to override the endpoint:
"always" (always use the default mTLS endpoint), "never" (always
use the default regular endpoint) and "auto" (auto switch to the
default mTLS endpoint if client certificate is present, this is
the default value). However, the ``api_endpoint`` property takes
precedence if provided.
(2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
is "true", then the ``client_cert_source`` property can be used
to provide client certificate for mutual TLS transport. If
not provided, the default SSL client certificate will be used if
present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
set, no client certificate will be used.
Raises:
google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
creation failed for any reason.
"""
self._client = BareMetalSolutionClient(
credentials=credentials,
transport=transport,
client_options=client_options,
client_info=client_info,
)
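# Construction sketches for the options described above; the explicit endpoint
# override is an assumption, not something this client requires:
#
#   client = BareMetalSolutionAsyncClient()  # default credentials and endpoint
#   client = BareMetalSolutionAsyncClient(
#       client_options=ClientOptions(
#           api_endpoint="baremetalsolution.googleapis.com"
#       )
#   )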
async def list_instances(
self,
request: Union[baremetalsolution.ListInstancesRequest, dict] = None,
*,
parent: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> pagers.ListInstancesAsyncPager:
r"""List servers in a given project and location.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_list_instances():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.ListInstancesRequest(
parent="parent_value",
)
# Make the request
page_result = client.list_instances(request=request)
# Handle the response
async for response in page_result:
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.ListInstancesRequest, dict]):
The request object. Message for requesting the list of
servers.
parent (:class:`str`):
Required. Parent value for
ListInstancesRequest.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.services.bare_metal_solution.pagers.ListInstancesAsyncPager:
Response message for the list of
servers.
Iterating over this object will yield
results and resolve additional pages
automatically.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([parent])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.ListInstancesRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.list_instances,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# This method is paged; wrap the response in a pager, which provides
# an `__aiter__` convenience method.
response = pagers.ListInstancesAsyncPager(
method=rpc,
request=request,
response=response,
metadata=metadata,
)
# Done; return the response.
return response
async def get_instance(
self,
request: Union[baremetalsolution.GetInstanceRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.Instance:
r"""Get details about a single server.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_get_instance():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.GetInstanceRequest(
name="name_value",
)
# Make the request
response = await client.get_instance(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.GetInstanceRequest, dict]):
The request object. Message for requesting server
information.
name (:class:`str`):
Required. Name of the resource.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.Instance:
A server.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.GetInstanceRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.get_instance,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def reset_instance(
self,
request: Union[baremetalsolution.ResetInstanceRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> operation_async.AsyncOperation:
r"""Perform an ungraceful, hard reset on a server.
Equivalent to shutting the power off and then turning it
back on.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_reset_instance():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.ResetInstanceRequest(
name="name_value",
)
# Make the request
operation = client.reset_instance(request=request)
print("Waiting for operation to complete...")
response = await operation.result()
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.ResetInstanceRequest, dict]):
The request object. Message requesting to reset a
server.
name (:class:`str`):
Required. Name of the resource.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.api_core.operation_async.AsyncOperation:
An object representing a long-running operation.
The result type for the operation will be
:class:`google.cloud.bare_metal_solution_v2.types.ResetInstanceResponse`
Response message from resetting a server.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.ResetInstanceRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.reset_instance,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Wrap the response in an operation future.
response = operation_async.from_gapic(
response,
self._client._transport.operations_client,
baremetalsolution.ResetInstanceResponse,
metadata_type=baremetalsolution.OperationMetadata,
)
# Done; return the response.
return response
async def list_volumes(
self,
request: Union[baremetalsolution.ListVolumesRequest, dict] = None,
*,
parent: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> pagers.ListVolumesAsyncPager:
r"""List storage volumes in a given project and location.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_list_volumes():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.ListVolumesRequest(
parent="parent_value",
)
# Make the request
page_result = client.list_volumes(request=request)
# Handle the response
async for response in page_result:
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.ListVolumesRequest, dict]):
The request object. Message for requesting a list of
storage volumes.
parent (:class:`str`):
Required. Parent value for
ListVolumesRequest.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.services.bare_metal_solution.pagers.ListVolumesAsyncPager:
Response message containing the list
of storage volumes.
Iterating over this object will yield
results and resolve additional pages
automatically.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([parent])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.ListVolumesRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.list_volumes,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# This method is paged; wrap the response in a pager, which provides
# an `__aiter__` convenience method.
response = pagers.ListVolumesAsyncPager(
method=rpc,
request=request,
response=response,
metadata=metadata,
)
# Done; return the response.
return response
async def get_volume(
self,
request: Union[baremetalsolution.GetVolumeRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.Volume:
r"""Get details of a single storage volume.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_get_volume():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.GetVolumeRequest(
name="name_value",
)
# Make the request
response = await client.get_volume(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.GetVolumeRequest, dict]):
The request object. Message for requesting storage
volume information.
name (:class:`str`):
Required. Name of the resource.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.Volume:
A storage volume.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.GetVolumeRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.get_volume,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def update_volume(
self,
request: Union[baremetalsolution.UpdateVolumeRequest, dict] = None,
*,
volume: baremetalsolution.Volume = None,
update_mask: field_mask_pb2.FieldMask = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> operation_async.AsyncOperation:
r"""Update details of a single storage volume.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_update_volume():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.UpdateVolumeRequest(
)
# Make the request
operation = client.update_volume(request=request)
print("Waiting for operation to complete...")
response = await operation.result()
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.UpdateVolumeRequest, dict]):
The request object. Message for updating a volume.
volume (:class:`google.cloud.bare_metal_solution_v2.types.Volume`):
Required. The volume to update.
The ``name`` field is used to identify the volume to
update. Format:
projects/{project}/locations/{location}/volumes/{volume}
This corresponds to the ``volume`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
The list of fields to update. The only currently
supported fields are ``snapshot_auto_delete_behavior``
and ``snapshot_schedule_policy_name``.
This corresponds to the ``update_mask`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.api_core.operation_async.AsyncOperation:
An object representing a long-running operation.
The result type for the operation will be
:class:`google.cloud.bare_metal_solution_v2.types.Volume`
A storage volume.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([volume, update_mask])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.UpdateVolumeRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if volume is not None:
request.volume = volume
if update_mask is not None:
request.update_mask = update_mask
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.update_volume,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata(
(("volume.name", request.volume.name),)
),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Wrap the response in an operation future.
response = operation_async.from_gapic(
response,
self._client._transport.operations_client,
baremetalsolution.Volume,
metadata_type=baremetalsolution.OperationMetadata,
)
# Done; return the response.
return response
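# Usage sketch (illustrative; assumes ``volume`` was fetched earlier and the
# policy name is a hypothetical placeholder):
#
#     from google.protobuf import field_mask_pb2
#
#     volume.snapshot_schedule_policy_name = (
#         "projects/my-project/locations/global/snapshotSchedulePolicies/nightly"
#     )
#     operation = await client.update_volume(
#         volume=volume,
#         update_mask=field_mask_pb2.FieldMask(
#             paths=["snapshot_schedule_policy_name"],
#         ),
#     )
#     updated = await operation.result()
#
# Only the mask paths listed in the docstring above are currently applied.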
async def list_networks(
self,
request: Union[baremetalsolution.ListNetworksRequest, dict] = None,
*,
parent: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> pagers.ListNetworksAsyncPager:
r"""List network in a given project and location.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_list_networks():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.ListNetworksRequest(
parent="parent_value",
)
# Make the request
page_result = await client.list_networks(request=request)
# Handle the response
async for response in page_result:
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.ListNetworksRequest, dict]):
The request object. Message for requesting a list of
networks.
parent (:class:`str`):
Required. Parent value for
ListNetworksRequest.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.services.bare_metal_solution.pagers.ListNetworksAsyncPager:
Response message containing the list
of networks.
Iterating over this object will yield
results and resolve additional pages
automatically.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([parent])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.ListNetworksRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.list_networks,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# This method is paged; wrap the response in a pager, which provides
# an `__aiter__` convenience method.
response = pagers.ListNetworksAsyncPager(
method=rpc,
request=request,
response=response,
metadata=metadata,
)
# Done; return the response.
return response
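# Usage sketch (illustrative; the parent is a hypothetical placeholder):
#
#     pager = await client.list_networks(
#         parent="projects/my-project/locations/global",
#     )
#     async for network in pager:  # additional pages resolve lazily
#         print(network.name)
#
# To work with whole pages rather than items, iterate ``pager.pages`` instead.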
async def get_network(
self,
request: Union[baremetalsolution.GetNetworkRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.Network:
r"""Get details of a single network.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_get_network():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.GetNetworkRequest(
name="name_value",
)
# Make the request
response = await client.get_network(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.GetNetworkRequest, dict]):
The request object. Message for requesting network
information.
name (:class:`str`):
Required. Name of the resource.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.Network:
A Network.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.GetNetworkRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.get_network,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def list_snapshot_schedule_policies(
self,
request: Union[
baremetalsolution.ListSnapshotSchedulePoliciesRequest, dict
] = None,
*,
parent: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> pagers.ListSnapshotSchedulePoliciesAsyncPager:
r"""List snapshot schedule policies in a given project
and location.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_list_snapshot_schedule_policies():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.ListSnapshotSchedulePoliciesRequest(
parent="parent_value",
)
# Make the request
page_result = await client.list_snapshot_schedule_policies(request=request)
# Handle the response
async for response in page_result:
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.ListSnapshotSchedulePoliciesRequest, dict]):
The request object. Message for requesting a list of
snapshot schedule policies.
parent (:class:`str`):
Required. The parent project
containing the Snapshot Schedule
Policies.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.services.bare_metal_solution.pagers.ListSnapshotSchedulePoliciesAsyncPager:
Response message containing the list
of snapshot schedule policies.
Iterating over this object will yield
results and resolve additional pages
automatically.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([parent])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.ListSnapshotSchedulePoliciesRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.list_snapshot_schedule_policies,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# This method is paged; wrap the response in a pager, which provides
# an `__aiter__` convenience method.
response = pagers.ListSnapshotSchedulePoliciesAsyncPager(
method=rpc,
request=request,
response=response,
metadata=metadata,
)
# Done; return the response.
return response
async def get_snapshot_schedule_policy(
self,
request: Union[baremetalsolution.GetSnapshotSchedulePolicyRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.SnapshotSchedulePolicy:
r"""Get details of a single snapshot schedule policy.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_get_snapshot_schedule_policy():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.GetSnapshotSchedulePolicyRequest(
name="name_value",
)
# Make the request
response = await client.get_snapshot_schedule_policy(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.GetSnapshotSchedulePolicyRequest, dict]):
The request object. Message for requesting snapshot
schedule policy information.
name (:class:`str`):
Required. Name of the resource.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.SnapshotSchedulePolicy:
A snapshot schedule policy.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.GetSnapshotSchedulePolicyRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.get_snapshot_schedule_policy,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def create_snapshot_schedule_policy(
self,
request: Union[
baremetalsolution.CreateSnapshotSchedulePolicyRequest, dict
] = None,
*,
parent: str = None,
snapshot_schedule_policy: baremetalsolution.SnapshotSchedulePolicy = None,
snapshot_schedule_policy_id: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.SnapshotSchedulePolicy:
r"""Create a snapshot schedule policy in the specified
project.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_create_snapshot_schedule_policy():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.CreateSnapshotSchedulePolicyRequest(
parent="parent_value",
snapshot_schedule_policy_id="snapshot_schedule_policy_id_value",
)
# Make the request
response = await client.create_snapshot_schedule_policy(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.CreateSnapshotSchedulePolicyRequest, dict]):
The request object. Message for creating a snapshot
schedule policy in a project.
parent (:class:`str`):
Required. The parent project and
location containing the
SnapshotSchedulePolicy.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
snapshot_schedule_policy (:class:`google.cloud.bare_metal_solution_v2.types.SnapshotSchedulePolicy`):
Required. The SnapshotSchedulePolicy
to create.
This corresponds to the ``snapshot_schedule_policy`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
snapshot_schedule_policy_id (:class:`str`):
Required. Snapshot policy ID.
This corresponds to the ``snapshot_schedule_policy_id`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.SnapshotSchedulePolicy:
A snapshot schedule policy.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any(
[parent, snapshot_schedule_policy, snapshot_schedule_policy_id]
)
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.CreateSnapshotSchedulePolicyRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
if snapshot_schedule_policy is not None:
request.snapshot_schedule_policy = snapshot_schedule_policy
if snapshot_schedule_policy_id is not None:
request.snapshot_schedule_policy_id = snapshot_schedule_policy_id
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.create_snapshot_schedule_policy,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
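# Usage sketch (illustrative; the IDs and the ``description`` field are
# placeholders -- consult the SnapshotSchedulePolicy type for the full
# schema):
#
#     policy = await client.create_snapshot_schedule_policy(
#         parent="projects/my-project/locations/global",
#         snapshot_schedule_policy=bare_metal_solution_v2.SnapshotSchedulePolicy(
#             description="Nightly snapshots",
#         ),
#         snapshot_schedule_policy_id="nightly",
#     )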
async def update_snapshot_schedule_policy(
self,
request: Union[
baremetalsolution.UpdateSnapshotSchedulePolicyRequest, dict
] = None,
*,
snapshot_schedule_policy: baremetalsolution.SnapshotSchedulePolicy = None,
update_mask: field_mask_pb2.FieldMask = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.SnapshotSchedulePolicy:
r"""Update a snapshot schedule policy in the specified
project.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_update_snapshot_schedule_policy():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.UpdateSnapshotSchedulePolicyRequest(
)
# Make the request
response = await client.update_snapshot_schedule_policy(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.UpdateSnapshotSchedulePolicyRequest, dict]):
The request object. Message for updating a snapshot
schedule policy in a project.
snapshot_schedule_policy (:class:`google.cloud.bare_metal_solution_v2.types.SnapshotSchedulePolicy`):
Required. The snapshot schedule policy to update.
The ``name`` field is used to identify the snapshot
schedule policy to update. Format:
projects/{project}/locations/global/snapshotSchedulePolicies/{policy}
This corresponds to the ``snapshot_schedule_policy`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
Required. The list of fields to
update.
This corresponds to the ``update_mask`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.SnapshotSchedulePolicy:
A snapshot schedule policy.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([snapshot_schedule_policy, update_mask])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.UpdateSnapshotSchedulePolicyRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if snapshot_schedule_policy is not None:
request.snapshot_schedule_policy = snapshot_schedule_policy
if update_mask is not None:
request.update_mask = update_mask
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.update_snapshot_schedule_policy,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata(
(
(
"snapshot_schedule_policy.name",
request.snapshot_schedule_policy.name,
),
)
),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def delete_snapshot_schedule_policy(
self,
request: Union[
baremetalsolution.DeleteSnapshotSchedulePolicyRequest, dict
] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> None:
r"""Delete a named snapshot schedule policy.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_delete_snapshot_schedule_policy():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.DeleteSnapshotSchedulePolicyRequest(
name="name_value",
)
# Make the request
await client.delete_snapshot_schedule_policy(request=request)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.DeleteSnapshotSchedulePolicyRequest, dict]):
The request object. Message for deleting a snapshot
schedule policy in a project.
name (:class:`str`):
Required. The name of the snapshot
schedule policy to delete.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.DeleteSnapshotSchedulePolicyRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.delete_snapshot_schedule_policy,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
async def create_volume_snapshot(
self,
request: Union[baremetalsolution.CreateVolumeSnapshotRequest, dict] = None,
*,
parent: str = None,
volume_snapshot: baremetalsolution.VolumeSnapshot = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.VolumeSnapshot:
r"""Create a storage volume snapshot in a containing
volume.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_create_volume_snapshot():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.CreateVolumeSnapshotRequest(
parent="parent_value",
)
# Make the request
response = await client.create_volume_snapshot(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.CreateVolumeSnapshotRequest, dict]):
The request object. Message for creating a volume
snapshot.
parent (:class:`str`):
Required. The volume to snapshot.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
volume_snapshot (:class:`google.cloud.bare_metal_solution_v2.types.VolumeSnapshot`):
Required. The volume snapshot to
create. Only the description field may
be specified.
This corresponds to the ``volume_snapshot`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.VolumeSnapshot:
Snapshot registered for a given
storage volume.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([parent, volume_snapshot])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.CreateVolumeSnapshotRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
if volume_snapshot is not None:
request.volume_snapshot = volume_snapshot
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.create_volume_snapshot,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def restore_volume_snapshot(
self,
request: Union[baremetalsolution.RestoreVolumeSnapshotRequest, dict] = None,
*,
volume_snapshot: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> operation_async.AsyncOperation:
r"""Restore a storage volume snapshot to its containing
volume.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_restore_volume_snapshot():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.RestoreVolumeSnapshotRequest(
volume_snapshot="volume_snapshot_value",
)
# Make the request
operation = await client.restore_volume_snapshot(request=request)
print("Waiting for operation to complete...")
response = await operation.result()
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.RestoreVolumeSnapshotRequest, dict]):
The request object. Message for restoring a volume
snapshot.
volume_snapshot (:class:`str`):
Required. Name of the resource.
This corresponds to the ``volume_snapshot`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.api_core.operation_async.AsyncOperation:
An object representing a long-running operation.
The result type for the operation will be
:class:`google.cloud.bare_metal_solution_v2.types.VolumeSnapshot`
Snapshot registered for a given storage volume.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([volume_snapshot])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.RestoreVolumeSnapshotRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if volume_snapshot is not None:
request.volume_snapshot = volume_snapshot
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.restore_volume_snapshot,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata(
(("volume_snapshot", request.volume_snapshot),)
),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Wrap the response in an operation future.
response = operation_async.from_gapic(
response,
self._client._transport.operations_client,
baremetalsolution.VolumeSnapshot,
metadata_type=baremetalsolution.OperationMetadata,
)
# Done; return the response.
return response
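# Usage sketch (illustrative; the snapshot name is a hypothetical
# placeholder). Awaiting ``operation.result()`` waits for the long-running
# restore to finish and raises if the operation failed:
#
#     operation = await client.restore_volume_snapshot(
#         volume_snapshot=(
#             "projects/my-project/locations/global/"
#             "volumes/my-volume/snapshots/my-snapshot"
#         ),
#     )
#     restored = await operation.result()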
async def delete_volume_snapshot(
self,
request: Union[baremetalsolution.DeleteVolumeSnapshotRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> None:
r"""Deletes a storage volume snapshot for a given volume.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_delete_volume_snapshot():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.DeleteVolumeSnapshotRequest(
name="name_value",
)
# Make the request
await client.delete_volume_snapshot(request=request)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.DeleteVolumeSnapshotRequest, dict]):
The request object. Message for deleting a named
volume snapshot.
name (:class:`str`):
Required. The name of the snapshot to
delete.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.DeleteVolumeSnapshotRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.delete_volume_snapshot,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
async def get_volume_snapshot(
self,
request: Union[baremetalsolution.GetVolumeSnapshotRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.VolumeSnapshot:
r"""Get details of a single storage volume snapshot.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_get_volume_snapshot():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.GetVolumeSnapshotRequest(
name="name_value",
)
# Make the request
response = await client.get_volume_snapshot(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.GetVolumeSnapshotRequest, dict]):
The request object. Message for requesting storage
volume snapshot information.
name (:class:`str`):
Required. Name of the resource.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.VolumeSnapshot:
Snapshot registered for a given
storage volume.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.GetVolumeSnapshotRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.get_volume_snapshot,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def list_volume_snapshots(
self,
request: Union[baremetalsolution.ListVolumeSnapshotsRequest, dict] = None,
*,
parent: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> pagers.ListVolumeSnapshotsAsyncPager:
r"""List storage volume snapshots for given storage
volume.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_list_volume_snapshots():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.ListVolumeSnapshotsRequest(
parent="parent_value",
)
# Make the request
page_result = await client.list_volume_snapshots(request=request)
# Handle the response
async for response in page_result:
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.ListVolumeSnapshotsRequest, dict]):
The request object. Message for requesting a list of
storage volume snapshots.
parent (:class:`str`):
Required. Parent value for
ListVolumeSnapshotsRequest.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.services.bare_metal_solution.pagers.ListVolumeSnapshotsAsyncPager:
Response message containing the list
of storage volume snapshots.
Iterating over this object will yield
results and resolve additional pages
automatically.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([parent])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.ListVolumeSnapshotsRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.list_volume_snapshots,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# This method is paged; wrap the response in a pager, which provides
# an `__aiter__` convenience method.
response = pagers.ListVolumeSnapshotsAsyncPager(
method=rpc,
request=request,
response=response,
metadata=metadata,
)
# Done; return the response.
return response
async def get_lun(
self,
request: Union[baremetalsolution.GetLunRequest, dict] = None,
*,
name: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> baremetalsolution.Lun:
r"""Get details of a single storage logical unit
number (LUN).
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_get_lun():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.GetLunRequest(
name="name_value",
)
# Make the request
response = await client.get_lun(request=request)
# Handle the response
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.GetLunRequest, dict]):
The request object. Message for requesting storage LUN
information.
name (:class:`str`):
Required. Name of the resource.
This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.types.Lun:
A storage volume logical unit number
(LUN).
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([name])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.GetLunRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if name is not None:
request.name = name
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.get_lun,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# Done; return the response.
return response
async def list_luns(
self,
request: Union[baremetalsolution.ListLunsRequest, dict] = None,
*,
parent: str = None,
retry: OptionalRetry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> pagers.ListLunsAsyncPager:
r"""List storage volume luns for given storage volume.
.. code-block:: python
from google.cloud import bare_metal_solution_v2
async def sample_list_luns():
# Create a client
client = bare_metal_solution_v2.BareMetalSolutionAsyncClient()
# Initialize request argument(s)
request = bare_metal_solution_v2.ListLunsRequest(
parent="parent_value",
)
# Make the request
page_result = await client.list_luns(request=request)
# Handle the response
async for response in page_result:
print(response)
Args:
request (Union[google.cloud.bare_metal_solution_v2.types.ListLunsRequest, dict]):
The request object. Message for requesting a list of
storage volume LUNs.
parent (:class:`str`):
Required. Parent value for
ListLunsRequest.
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
Returns:
google.cloud.bare_metal_solution_v2.services.bare_metal_solution.pagers.ListLunsAsyncPager:
Response message containing the list
of storage volume LUNs.
Iterating over this object will yield
results and resolve additional pages
automatically.
"""
# Create or coerce a protobuf request object.
# Quick check: If we got a request object, we should *not* have
# gotten any keyword arguments that map to the request.
has_flattened_params = any([parent])
if request is not None and has_flattened_params:
raise ValueError(
"If the `request` argument is set, then none of "
"the individual field arguments should be set."
)
request = baremetalsolution.ListLunsRequest(request)
# If we have keyword arguments corresponding to fields on the
# request, apply these.
if parent is not None:
request.parent = parent
# Wrap the RPC method; this adds retry and timeout information,
# and friendly error handling.
rpc = gapic_v1.method_async.wrap_method(
self._client._transport.list_luns,
default_timeout=None,
client_info=DEFAULT_CLIENT_INFO,
)
# Certain fields should be provided within the metadata header;
# add these here.
metadata = tuple(metadata) + (
gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
)
# Send the request.
response = await rpc(
request,
retry=retry,
timeout=timeout,
metadata=metadata,
)
# This method is paged; wrap the response in a pager, which provides
# an `__aiter__` convenience method.
response = pagers.ListLunsAsyncPager(
method=rpc,
request=request,
response=response,
metadata=metadata,
)
# Done; return the response.
return response
async def __aenter__(self):
return self
async def __aexit__(self, exc_type, exc, tb):
await self.transport.close()
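# Usage sketch (illustrative): because of the ``__aenter__``/``__aexit__``
# pair above, the client can be used as an async context manager so the
# underlying transport is always closed:
#
#     async with BareMetalSolutionAsyncClient() as client:
#         volume = await client.get_volume(name="...")
#     # the transport is closed here, even on error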
try:
DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
gapic_version=pkg_resources.get_distribution(
"google-cloud-bare-metal-solution",
).version,
)
except pkg_resources.DistributionNotFound:
DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo()
__all__ = ("BareMetalSolutionAsyncClient",)
| 38.64586 | 146 | 0.608338 | 9,300 | 89,156 | 5.69172 | 0.048495 | 0.031171 | 0.037897 | 0.03733 | 0.822436 | 0.790792 | 0.768254 | 0.744016 | 0.727826 | 0.719551 | 0 | 0.003248 | 0.32663 | 89,156 | 2,306 | 147 | 38.662619 | 0.878454 | 0.166495 | 0 | 0.618347 | 0 | 0 | 0.056739 | 0.002462 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005663 | false | 0 | 0.02265 | 0 | 0.08607 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
936cfea886aad7780fadb2998330a8da8a941879 | 2,491 | py | Python | controller/geotechnic/surcharge_load.py | QuantumNovice/civil-engineering-toolbox | b759df2ed32614fa237ed7e4fccaf79f78c3eee4 | [
"BSD-3-Clause"
] | 32 | 2015-11-12T08:36:26.000Z | 2021-12-28T19:48:04.000Z | controller/geotechnic/surcharge_load.py | QuantumNovice/civil-engineering-toolbox | b759df2ed32614fa237ed7e4fccaf79f78c3eee4 | [
"BSD-3-Clause"
] | 2 | 2020-09-17T05:47:07.000Z | 2021-09-05T10:50:24.000Z | controller/geotechnic/surcharge_load.py | QuantumNovice/civil-engineering-toolbox | b759df2ed32614fa237ed7e4fccaf79f78c3eee4 | [
"BSD-3-Clause"
] | 12 | 2016-04-27T06:51:48.000Z | 2021-09-05T10:30:04.000Z | from src import view
from model.geotechnic import surcharge_load
from model.utils import plot
import cherrypy
class Surcharge_Load:
def __init__(self):
pass
def point(self, **var):
# Prepare view & model object
template = view.lookup.get_template('geotechnic/surcharge_point.mako')
model = surcharge_load.Surcharge_Load()
# Prepare url params & cookie as default value
param = cherrypy.request.params
cookie = cherrypy.request.cookie
# Get url parameter or set default variable (if None)
q = float(param.get('q') or 200)
x_load = float(param.get('x_load') or 1.2)
H = float(param.get('H') or 12)
start = float(param.get('start') or -10)
end = float(param.get('end') or 10)
surcharge_type = float(param.get('type') or 2)  # renamed: avoid shadowing the built-in ``type``
# Calculate
x, y, z = model.point(q, x_load, H, start, end, surcharge_type)
plt = plot.Plot()
img = plt.pcolor(x, y, z)
# Prepare data to view
data = {
'q': q,
'x_load': x_load,  # m
'H': H,  # m
'start': start,  # m
'end': end,  # m
'type': surcharge_type,
'plot_image': img,
}
return template.render(**data)
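# Request sketch for ``point`` above (the mount point is hypothetical and
# depends on how the CherryPy tree is configured):
#
#     GET /geotechnic/surcharge_load/point?q=200&x_load=1.2&H=12&type=2
#
# Any parameter omitted from the query string falls back to the default
# shown in the handler.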
def strip(self, **var):
# Prepare view & model object
template = view.lookup.get_template('geotechnic/surcharge_strip.mako')
model = surcharge_load.Surcharge_Load()
# Prepare url params & cookie as default value
param = cherrypy.request.params
cookie = cherrypy.request.cookie
# Get url parameter or set default variable (if None)
q = float(param.get('q') or 200)
x_load = float(param.get('x_load') or 1.2)
width = float(param.get('width') or 1)
H = float(param.get('H') or 5)
start = float(param.get('start') or -10)
end = float(param.get('end') or 10)
surcharge_type = float(param.get('type') or 2)  # renamed: avoid shadowing the built-in ``type``
# Calculate
x, y, z = model.strip(q, x_load, width, H, start, end, surcharge_type)
plt = plot.Plot()
img = plt.pcolor(x, y, z)
data = {
'q': q,
'x_load': x_load,  # m
'width': width,  # m
'H': H,  # m
'start': start,  # m
'end': end,  # m
'type': surcharge_type,
'plot_image': img,
}
return template.render(**data) | 32.350649 | 79 | 0.521076 | 317 | 2,491 | 4.012618 | 0.205047 | 0.102201 | 0.132862 | 0.028302 | 0.815252 | 0.815252 | 0.788522 | 0.788522 | 0.762579 | 0.762579 | 0 | 0.014944 | 0.355279 | 2,491 | 77 | 80 | 32.350649 | 0.777086 | 0.120835 | 0 | 0.631579 | 0 | 0 | 0.082061 | 0.02958 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.017544 | 0.070175 | 0 | 0.175439 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9383bfea669c65e5b4f600e7509cb5fe58ee630d | 261 | py | Python | tests/conftest.py | trustlines-protocol/semi-atomic-swap | bbaec14f49929e6e2fe6691f0c7bee77a25a2e00 | [
"MIT"
] | null | null | null | tests/conftest.py | trustlines-protocol/semi-atomic-swap | bbaec14f49929e6e2fe6691f0c7bee77a25a2e00 | [
"MIT"
] | 1 | 2021-06-30T18:25:07.000Z | 2021-06-30T18:25:07.000Z | tests/conftest.py | trustlines-protocol/semi-atomic-swap | bbaec14f49929e6e2fe6691f0c7bee77a25a2e00 | [
"MIT"
] | null | null | null | import pytest
@pytest.fixture(scope="session")
def tl_swap_contract(deploy_contract):
return deploy_contract("TLSwap")
@pytest.fixture(scope="session")
def tl_currency_network_contract(deploy_contract):
return deploy_contract("TestCurrencyNetwork")
| 21.75 | 50 | 0.800766 | 31 | 261 | 6.451613 | 0.483871 | 0.28 | 0.18 | 0.25 | 0.72 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091954 | 261 | 11 | 51 | 23.727273 | 0.843882 | 0 | 0 | 0.285714 | 0 | 0 | 0.149425 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0.285714 | 0.714286 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
fa861a7d16bfeff49ea1c20ed2d43d5d37cafd18 | 5,682 | py | Python | tests/unit/test_rapu.py | dswiecki/karapace | b3cc47ee5cd14ed7113748e7fb36744c808f7131 | [
"Apache-2.0"
] | null | null | null | tests/unit/test_rapu.py | dswiecki/karapace | b3cc47ee5cd14ed7113748e7fb36744c808f7131 | [
"Apache-2.0"
] | null | null | null | tests/unit/test_rapu.py | dswiecki/karapace | b3cc47ee5cd14ed7113748e7fb36744c808f7131 | [
"Apache-2.0"
] | null | null | null | from karapace.rapu import HTTPRequest, REST_ACCEPT_RE, REST_CONTENT_TYPE_RE
async def test_header_get():
req = HTTPRequest(url="", query="", headers={}, path_for_stats="", method="GET")
assert "Content-Type" not in req.headers
for v in ["Content-Type", "content-type", "CONTENT-tYPE", "coNTENT-TYpe"]:
assert req.get_header(v) == "application/json"
def test_rest_accept_re():
# incomplete headers
assert not REST_ACCEPT_RE.match("")
assert not REST_ACCEPT_RE.match("application/")
assert not REST_ACCEPT_RE.match("application/vnd.kafka")
assert not REST_ACCEPT_RE.match("application/vnd.kafka.json")
# Unsupported serialization formats
assert not REST_ACCEPT_RE.match("application/vnd.kafka+avro")
assert not REST_ACCEPT_RE.match("application/vnd.kafka+protobuf")
assert not REST_ACCEPT_RE.match("application/vnd.kafka+binary")
assert REST_ACCEPT_RE.match("application/json").groupdict() == {
"embedded_format": None,
"api_version": None,
"serialization_format": None,
"general_format": "json",
}
assert REST_ACCEPT_RE.match("application/*").groupdict() == {
"embedded_format": None,
"api_version": None,
"serialization_format": None,
"general_format": "*",
}
assert REST_ACCEPT_RE.match("application/vnd.kafka+json").groupdict() == {
"embedded_format": None,
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
# Embedded format
assert REST_ACCEPT_RE.match("application/vnd.kafka.avro+json").groupdict() == {
"embedded_format": "avro",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
assert REST_ACCEPT_RE.match("application/vnd.kafka.json+json").groupdict() == {
"embedded_format": "json",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
assert REST_ACCEPT_RE.match("application/vnd.kafka.binary+json").groupdict() == {
"embedded_format": "binary",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
assert REST_ACCEPT_RE.match("application/vnd.kafka.jsonschema+json").groupdict() == {
"embedded_format": "jsonschema",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
# API version
assert REST_ACCEPT_RE.match("application/vnd.kafka.v1+json").groupdict() == {
"embedded_format": None,
"api_version": "v1",
"serialization_format": "json",
"general_format": None,
}
assert REST_ACCEPT_RE.match("application/vnd.kafka.v2+json").groupdict() == {
"embedded_format": None,
"api_version": "v2",
"serialization_format": "json",
"general_format": None,
}
def test_content_type_re():
# incomplete headers
assert not REST_CONTENT_TYPE_RE.match("")
assert not REST_CONTENT_TYPE_RE.match("application/")
assert not REST_CONTENT_TYPE_RE.match("application/vnd.kafka")
assert not REST_CONTENT_TYPE_RE.match("application/vnd.kafka.json")
# Unsupported serialization formats
assert not REST_CONTENT_TYPE_RE.match("application/vnd.kafka+avro")
assert not REST_CONTENT_TYPE_RE.match("application/vnd.kafka+protobuf")
assert not REST_CONTENT_TYPE_RE.match("application/vnd.kafka+binary")
# Unspecified format
assert not REST_CONTENT_TYPE_RE.match("application/*")
assert REST_CONTENT_TYPE_RE.match("application/json").groupdict() == {
"embedded_format": None,
"api_version": None,
"serialization_format": None,
"general_format": "json",
}
assert REST_CONTENT_TYPE_RE.match("application/octet-stream").groupdict() == {
"embedded_format": None,
"api_version": None,
"serialization_format": None,
"general_format": "octet-stream",
}
assert REST_CONTENT_TYPE_RE.match("application/vnd.kafka+json").groupdict() == {
"embedded_format": None,
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
# Embedded format
assert REST_CONTENT_TYPE_RE.match("application/vnd.kafka.avro+json").groupdict() == {
"embedded_format": "avro",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
assert REST_CONTENT_TYPE_RE.match("application/vnd.kafka.json+json").groupdict() == {
"embedded_format": "json",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
assert REST_CONTENT_TYPE_RE.match("application/vnd.kafka.binary+json").groupdict() == {
"embedded_format": "binary",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
assert REST_CONTENT_TYPE_RE.match("application/vnd.kafka.jsonschema+json").groupdict() == {
"embedded_format": "jsonschema",
"api_version": None,
"serialization_format": "json",
"general_format": None,
}
# API version
assert REST_CONTENT_TYPE_RE.match("application/vnd.kafka.v1+json").groupdict() == {
"embedded_format": None,
"api_version": "v1",
"serialization_format": "json",
"general_format": None,
}
assert REST_CONTENT_TYPE_RE.match("application/vnd.kafka.v2+json").groupdict() == {
"embedded_format": None,
"api_version": "v2",
"serialization_format": "json",
"general_format": None,
}
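# Informal summary of the header grammar exercised above (a sketch derived
# only from these assertions; the regexes themselves are authoritative):
#
#   application/json                                    (Accept and Content-Type)
#   application/*                                       (Accept)
#   application/octet-stream                            (Content-Type)
#   application/vnd.kafka+json                          (both)
#   application/vnd.kafka.{avro|json|binary|jsonschema}+json
#   application/vnd.kafka.{v1|v2}+json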
| 37.137255 | 95 | 0.648363 | 630 | 5,682 | 5.57619 | 0.084127 | 0.065756 | 0.158839 | 0.143467 | 0.920581 | 0.920581 | 0.878451 | 0.84657 | 0.843154 | 0.743524 | 0 | 0.001782 | 0.209785 | 5,682 | 152 | 96 | 37.381579 | 0.780624 | 0.031327 | 0 | 0.534351 | 0 | 0 | 0.380779 | 0.130688 | 0 | 0 | 0 | 0 | 0.267176 | 1 | 0.015267 | false | 0 | 0.007634 | 0 | 0.022901 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fa933a23b867d5f45abff41b916539c667d12023 | 7,273 | py | Python | atlas/foundations_core_rest_api_components/src/test/filters/test_null_filter.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 296 | 2020-03-16T19:55:00.000Z | 2022-01-10T19:46:05.000Z | atlas/foundations_core_rest_api_components/src/test/filters/test_null_filter.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 57 | 2020-03-17T11:15:57.000Z | 2021-07-10T14:42:27.000Z | atlas/foundations_core_rest_api_components/src/test/filters/test_null_filter.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 38 | 2020-03-17T21:06:05.000Z | 2022-02-08T03:19:34.000Z | import unittest
from mock import patch
from foundations_core_rest_api_components.filters.null_filter import NullFilter
class TestNullFilter(unittest.TestCase):
class MockJobInfo(object):
def __init__(self, **kwargs):
for key, value in kwargs.items():
setattr(self, key, value)
def test_start_time_is_null(self):
from datetime import datetime
params = {
'start_time_isnull': 'true',
}
jobs_start_times = [
datetime(2018, 8, 1, 10, 30, 0, 0, None),
None,
datetime(2018, 10, 10, 15, 30, 0, 0, None),
None,
datetime(2018, 12, 1, 10, 30, 0, 0, None),
]
result = [self.MockJobInfo(job_id=index+1, start_time=start_time.isoformat() if start_time else None)
for index, start_time in enumerate(jobs_start_times)]
null_filter = NullFilter()
new_result = null_filter(result, params)
self.assertEqual(len(new_result), 2)
new_result_ids = [job.job_id for job in new_result]
expected_new_result_ids = [2, 4]
self.assertEqual(expected_new_result_ids, new_result_ids)
def test_start_time_is_not_null(self):
from datetime import datetime
params = {
'start_time_isnull': 'false',
}
jobs_start_times = [
datetime(2018, 8, 1, 10, 30, 0, 0, None),
None,
datetime(2018, 10, 10, 15, 30, 0, 0, None),
None,
datetime(2018, 12, 1, 10, 30, 0, 0, None),
]
result = [self.MockJobInfo(job_id=index+1, start_time=start_time.isoformat() if start_time else None)
for index, start_time in enumerate(jobs_start_times)]
null_filter = NullFilter()
new_result = null_filter(result, params)
self.assertEqual(len(new_result), 3)
new_result_ids = [job.job_id for job in new_result]
expected_new_result_ids = [1, 3, 5]
self.assertEqual(expected_new_result_ids, new_result_ids)
def test_start_time_bad_filter_value(self):
from datetime import datetime
params = {
'start_time_isnull': 'random',
}
jobs_start_times = [
datetime(2018, 8, 1, 10, 30, 0, 0, None),
None,
datetime(2018, 10, 10, 15, 30, 0, 0, None),
None,
datetime(2018, 12, 1, 10, 30, 0, 0, None),
]
result = [self.MockJobInfo(job_id=index+1, start_time=start_time.isoformat() if start_time else None)
for index, start_time in enumerate(jobs_start_times)]
null_filter = NullFilter()
new_result = null_filter(result, params)
self.assertEqual(len(new_result), 5)
new_result_ids = [job.job_id for job in new_result]
expected_new_result_ids = [1, 2, 3, 4, 5]
self.assertEqual(expected_new_result_ids, new_result_ids)
def test_input_parameters_argument_is_null(self):
params = {
'argument1_isnull': 'true',
}
input_parameters_list = [
[{'name': 'argument0', 'type': 'string', 'value': 'red leave'},
{'name': 'argument1', 'type': 'number', 'value': '3.14'}],
[{'name': 'argument0', 'type': 'string', 'value': 'green grass'},
{'name': 'argument1', 'type': 'number', 'value': None}],
[{'name': 'argument0', 'type': 'string', 'value': 'more stuff'},
{'name': 'argument1', 'type': 'number', 'value': float('nan')}]
]
result = [self.MockJobInfo(job_id=index+1, input_params=input_parameters)
for index, input_parameters in enumerate(input_parameters_list)]
null_filter = NullFilter()
new_result = null_filter(result, params)
self.assertEqual(len(new_result), 2)
new_result_job_ids = [job.job_id for job in new_result]
expected_new_result_ids = [2, 3]
self.assertEqual(expected_new_result_ids, new_result_job_ids)
def test_input_parameters_argument_is_not_null(self):
params = {
'argument1_isnull': 'false',
}
input_parameters_list = [
[{'name': 'argument0', 'type': 'string', 'value': 'red leave'},
{'name': 'argument1', 'type': 'number', 'value': '3.14'}],
[{'name': 'argument0', 'type': 'string', 'value': 'green grass'},
{'name': 'argument1', 'type': 'number', 'value': None}],
[{'name': 'argument0', 'type': 'string', 'value': 'more stuff'},
{'name': 'argument1', 'type': 'number', 'value': float('nan')}]
]
result = [self.MockJobInfo(job_id=index+1, input_params=input_parameters)
for index, input_parameters in enumerate(input_parameters_list)]
null_filter = NullFilter()
new_result = null_filter(result, params)
self.assertEqual(len(new_result), 1)
new_result_job_ids = [job.job_id for job in new_result]
expected_new_result_ids = [1]
self.assertEqual(expected_new_result_ids, new_result_job_ids)
def test_output_metrics_is_null(self):
params = {
'metric1_isnull': 'true',
}
output_metrics_list = [
[{'name': 'metric0', 'type': 'string', 'value': 'more stuff'},
{'name': 'metric1', 'type': 'string', 'value': float('nan')}],
[{'name': 'metric0', 'type': 'string', 'value': 'red leave'},
{'name': 'metric1', 'type': 'string', 'value': 'vague'}],
[{'name': 'metric0', 'type': 'string', 'value': 'green grass'},
{'name': 'metric1', 'type': 'string', 'value': None}]
]
result = [self.MockJobInfo(job_id=index+1, output_metrics=output_metrics)
for index, output_metrics in enumerate(output_metrics_list)]
null_filter = NullFilter()
new_result = null_filter(result, params)
self.assertEqual(len(new_result), 2)
new_result_job_ids = [job.job_id for job in new_result]
expected_new_result_ids = [1, 3]
self.assertEqual(expected_new_result_ids, new_result_job_ids)
def test_output_metrics_is_not_null(self):
params = {
'metric1_isnull': 'false',
}
output_metrics_list = [
[{'name': 'metric0', 'type': 'string', 'value': 'more stuff'},
{'name': 'metric1', 'type': 'string', 'value': float('nan')}],
[{'name': 'metric0', 'type': 'string', 'value': 'red leave'},
{'name': 'metric1', 'type': 'string', 'value': 'vague'}],
[{'name': 'metric0', 'type': 'string', 'value': 'green grass'},
{'name': 'metric1', 'type': 'string', 'value': None}]
]
result = [self.MockJobInfo(job_id=index+1, output_metrics=output_metrics)
for index, output_metrics in enumerate(output_metrics_list)]
null_filter = NullFilter()
new_result = null_filter(result, params)
self.assertEqual(len(new_result), 1)
new_result_job_ids = [job.job_id for job in new_result]
expected_new_result_ids = [2]
self.assertEqual(expected_new_result_ids, new_result_job_ids)
| 37.489691 | 109 | 0.582978 | 861 | 7,273 | 4.655052 | 0.108014 | 0.11003 | 0.05988 | 0.06986 | 0.931387 | 0.896956 | 0.896956 | 0.884481 | 0.884481 | 0.871756 | 0 | 0.034516 | 0.278977 | 7,273 | 193 | 110 | 37.683938 | 0.729786 | 0 | 0 | 0.695946 | 0 | 0 | 0.12952 | 0 | 0 | 0 | 0 | 0 | 0.094595 | 1 | 0.054054 | false | 0 | 0.040541 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8784635d0d4fcd3bec557fbdefb255adbc6076c1 | 36 | py | Python | src/catvehicle/log/__init__.py | jmscslgroup/catvehicle | f58cc64103538bc93cd42dd30f59c4b937151f88 | [
"BSD-4-Clause-UC"
] | 26 | 2019-06-18T03:31:49.000Z | 2022-02-16T08:45:31.000Z | src/catvehicle/log/__init__.py | jmscslgroup/catvehicle | f58cc64103538bc93cd42dd30f59c4b937151f88 | [
"BSD-4-Clause-UC"
] | 9 | 2020-11-13T13:11:16.000Z | 2022-01-30T08:40:25.000Z | src/catvehicle/log/__init__.py | jmscslgroup/catvehicle | f58cc64103538bc93cd42dd30f59c4b937151f88 | [
"BSD-4-Clause-UC"
] | 12 | 2019-07-25T13:57:06.000Z | 2022-03-02T08:16:40.000Z | from .log import configure_logworker | 36 | 36 | 0.888889 | 5 | 36 | 6.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.939394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
87d1d17bd4f90e9ec61289bb0ca3361409c351a9 | 47 | py | Python | 01_environment_mgmt/01_conflicting_dependencies/_preparation/zp3_v2/zp3/__init__.py | iwanbolzern/python-course | a4e9088adab09b4e92710eb350392578c0977a71 | [
"MIT"
] | null | null | null | 01_environment_mgmt/01_conflicting_dependencies/_preparation/zp3_v2/zp3/__init__.py | iwanbolzern/python-course | a4e9088adab09b4e92710eb350392578c0977a71 | [
"MIT"
] | null | null | null | 01_environment_mgmt/01_conflicting_dependencies/_preparation/zp3_v2/zp3/__init__.py | iwanbolzern/python-course | a4e9088adab09b4e92710eb350392578c0977a71 | [
"MIT"
] | null | null | null | def say_goodbye():
print('Goodbye World!')
| 15.666667 | 27 | 0.659574 | 6 | 47 | 5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170213 | 47 | 2 | 28 | 23.5 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0.297872 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
35a954c10bbe480c0a02205fd3c704410c015810 | 2,689 | py | Python | pythonfuzz/main.py | avineshwar/pythonfuzz | 5ee7f86a571040fdccff872f00b2cd810a319de7 | [
"Apache-2.0"
] | null | null | null | pythonfuzz/main.py | avineshwar/pythonfuzz | 5ee7f86a571040fdccff872f00b2cd810a319de7 | [
"Apache-2.0"
] | null | null | null | pythonfuzz/main.py | avineshwar/pythonfuzz | 5ee7f86a571040fdccff872f00b2cd810a319de7 | [
"Apache-2.0"
] | null | null | null | import argparse
import warnings
from pythonfuzz import fuzzer
warnings.filterwarnings('ignore')
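# Decorator class: wraps a user-supplied fuzz target so that calling the
# decorated function parses the command-line options and starts the fuzzer.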
class PythonFuzz(object):
def __init__(self, func):
self.function = func
def __call__(self, *args, **kwargs):
parser = argparse.ArgumentParser(description='Coverage-guided fuzzer for python packages')
parser.add_argument('dirs', type=str, nargs='*',
help="one or more directories/files to use as seed corpus. the first directory will be used to save the generated test-cases")
parser.add_argument('--exact-artifact-path', type=str, help='set exact artifact path for crashes/ooms')
parser.add_argument('--regression',
type=bool,
default=False,
help='run the fuzzer through a set of files for regression or reproduction')
parser.add_argument('--rss-limit-mb', type=int, default=2048, help='Memory usage in MB')
parser.add_argument('--max-input-size', type=int, default=4096, help='Max input size in bytes')
parser.add_argument('--timeout', type=int, default=30,
help='If input takes longer than this timeout, the process is treated as a failure case')
args = parser.parse_args()
f = fuzzer.Fuzzer(self.function, args.dirs, args.exact_artifact_path,
args.rss_limit_mb, args.timeout, args.regression, args.max_input_size)
f.start()
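# Standalone entry point: takes the fuzz target as a path on the command line
# instead of a decorated function (note the longer 120s default timeout and
# the absence of the --max-input-size option).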
def main():
parser = argparse.ArgumentParser(description='Coverage-guided fuzzer for python packages')
parser.add_argument('target', type=str, help='path to fuzz target')
parser.add_argument('dirs', type=str, nargs='*',
help="one or more directories/files to use as seed corpus. the first directory will be used to save the generated test-cases")
parser.add_argument('--exact-artifact-path', type=str, help='set exact artifact path for crashes/ooms')
parser.add_argument('--regression',
type=bool,
default=False,
help='run the fuzzer through a set of files for regression or reproduction')
parser.add_argument('--rss-limit-mb', type=int, default=2048, help='Memory usage in MB')
parser.add_argument('--timeout', type=int, default=120,
help='If input takes longer than this timeout, the process is treated as a failure case')
args = parser.parse_args()
f = fuzzer.Fuzzer(args.target, args.dirs, args.exact_artifact_path,
args.rss_limit_mb, args.timeout, args.regression)
f.start()
#
# if __name__ == '__main__':
# main()
| 51.711538 | 154 | 0.640759 | 341 | 2,689 | 4.935484 | 0.293255 | 0.064171 | 0.121212 | 0.046346 | 0.811646 | 0.811646 | 0.811646 | 0.776589 | 0.776589 | 0.776589 | 0 | 0.008449 | 0.251766 | 2,689 | 51 | 155 | 52.72549 | 0.828032 | 0.01376 | 0 | 0.585366 | 0 | 0.04878 | 0.345921 | 0.015861 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073171 | false | 0 | 0.073171 | 0 | 0.170732 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
35b11cec243758de01d96a357c4668f15b7d8c9c | 28 | py | Python | examples/plugins/workbench/AcmeLabUsingEggs/src/acme.acmelab/acme/acmelab/api.py | robmcmullen/envisage | 57338fcb0ea69c75bc3c86de18a5967d8e78c6c1 | [
"BSD-3-Clause"
] | 51 | 2015-05-12T01:34:15.000Z | 2022-03-20T19:11:22.000Z | examples/plugins/workbench/AcmeLabUsingEggs/src/acme.acmelab/acme/acmelab/api.py | robmcmullen/envisage | 57338fcb0ea69c75bc3c86de18a5967d8e78c6c1 | [
"BSD-3-Clause"
] | 347 | 2015-02-27T19:51:09.000Z | 2022-03-21T16:03:01.000Z | examples/plugins/workbench/AcmeLabUsingEggs/src/acme.acmelab/acme/acmelab/api.py | robmcmullen/envisage | 57338fcb0ea69c75bc3c86de18a5967d8e78c6c1 | [
"BSD-3-Clause"
] | 11 | 2015-02-11T04:32:54.000Z | 2021-09-13T10:50:05.000Z | from acmelab import Acmelab
| 14 | 27 | 0.857143 | 4 | 28 | 6 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 28 | 1 | 28 | 28 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ea0c62780e56c4a34ac13d47552bbb5b92566ff7 | 96 | py | Python | venv/lib/python3.8/site-packages/numpy/f2py/tests/util.py | GiulianaPola/select_repeats | 17a0d053d4f874e42cf654dd142168c2ec8fbd11 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/numpy/f2py/tests/util.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/numpy/f2py/tests/util.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/86/17/cf/46b6fe8077121ab3852a0da9898322e25ade7cb473eebe020fdec09a98 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.40625 | 0 | 96 | 1 | 96 | 96 | 0.489583 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ea1006e30e9d87f3d10e1fa690eabc53e83f9f89 | 74 | py | Python | cx_Freeze/samples/relimport/pkg1/pkg2/sub3.py | lexa/cx_Freeze | f1f35d19e8e7e821733f86b4da7814c40be3bfd9 | [
"PSF-2.0"
] | 358 | 2020-07-02T13:00:02.000Z | 2022-03-29T10:03:57.000Z | cx_Freeze/samples/relimport/pkg1/pkg2/sub3.py | lexa/cx_Freeze | f1f35d19e8e7e821733f86b4da7814c40be3bfd9 | [
"PSF-2.0"
] | 372 | 2020-07-02T20:47:57.000Z | 2022-03-31T19:35:05.000Z | cx_Freeze/samples/relimport/pkg1/pkg2/sub3.py | lexa/cx_Freeze | f1f35d19e8e7e821733f86b4da7814c40be3bfd9 | [
"PSF-2.0"
] | 78 | 2020-07-09T14:24:03.000Z | 2022-03-22T19:06:52.000Z | print("importing pkg1.pkg2.sub3")
from . import sub5
from .. import sub6
| 14.8 | 33 | 0.72973 | 11 | 74 | 4.909091 | 0.818182 | 0.37037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079365 | 0.148649 | 74 | 4 | 34 | 18.5 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0.324324 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.333333 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ea504d07ee789f985698ef9fe496b69432309d9a | 3,968 | py | Python | test/server/test_idle.py | chenliangomc/pymap | 42581712631e9e9787e9dd094a22f5cc607f804d | [
"MIT"
] | null | null | null | test/server/test_idle.py | chenliangomc/pymap | 42581712631e9e9787e9dd094a22f5cc607f804d | [
"MIT"
] | null | null | null | test/server/test_idle.py | chenliangomc/pymap | 42581712631e9e9787e9dd094a22f5cc607f804d | [
"MIT"
] | null | null | null |
import pytest # type: ignore
from .base import TestBase
pytestmark = pytest.mark.asyncio
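# Each test scripts an IMAP session against a mock transport: the push_*
# calls queue the expected client lines and server replies in order, and
# self.run() drives the exchange and verifies it.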
class TestIdle(TestBase):
async def test_idle(self):
self.transport.push_login()
self.transport.push_select(b'INBOX', 4, 1, 105)
self.transport.push_readline(
b'idle1 IDLE\r\n')
self.transport.push_write(
b'+ Idling.\r\n')
self.transport.push_readexactly(b'')
self.transport.push_readline(
b'DONE\r\n')
self.transport.push_write(
b'idle1 OK IDLE completed.\r\n')
self.transport.push_logout()
await self.run()
async def test_idle_invalid(self):
self.transport.push_login()
self.transport.push_select(b'INBOX', 4, 1, 105)
self.transport.push_readline(
b'idle1 IDLE\r\n')
self.transport.push_write(
b'+ Idling.\r\n')
self.transport.push_readexactly(b'')
self.transport.push_readline(
b'WHAT\r\n')
self.transport.push_write(
b'idle1 BAD Expected "DONE".\r\n')
self.transport.push_logout()
await self.run()
async def test_idle_noselect(self):
self.transport.push_login()
self.transport.push_readline(
b'idle1 IDLE\r\n')
self.transport.push_write(
b'idle1 BAD IDLE: Must select a mailbox first.\r\n')
self.transport.push_logout()
await self.run()
async def test_concurrent_idle_append(self):
concurrent = self.new_transport()
event1, event2 = self.new_events(2)
concurrent.push_login()
concurrent.push_select(b'INBOX', 4, 1, 105)
concurrent.push_readline(
b'idle1 IDLE\r\n')
concurrent.push_write(
b'+ Idling.\r\n', set=event1)
concurrent.push_readexactly(b'')
concurrent.push_readline(
b'DONE\r\n', wait=event2)
concurrent.push_write(
b'* 5 EXISTS\r\n')
concurrent.push_write(
b'* 2 RECENT\r\n')
concurrent.push_write(
b'* 5 FETCH (FLAGS (\\Recent \\Seen))\r\n')
concurrent.push_write(
b'idle1 OK IDLE completed.\r\n')
concurrent.push_logout()
self.transport.push_login()
self.transport.push_readline(
b'append1 APPEND INBOX (\\Seen) {9}\r\n', wait=event1)
self.transport.push_write(
b'+ Literal string\r\n')
self.transport.push_readexactly(
b'testing\r\n')
self.transport.push_readline(
b'\r\n')
self.transport.push_write(
b'append1 OK [APPENDUID ', None, b' 105] APPEND completed.\r\n',
set=event2)
self.transport.push_logout()
await self.run(concurrent)
async def test_concurrent_idle_expunge(self):
concurrent = self.new_transport()
event1, event2 = self.new_events(2)
concurrent.push_login()
concurrent.push_select(b'INBOX', 4, 1, 105)
concurrent.push_readline(
b'idle1 IDLE\r\n')
concurrent.push_write(
b'+ Idling.\r\n', set=event1)
concurrent.push_readexactly(b'')
concurrent.push_readline(
b'DONE\r\n', wait=event2)
concurrent.push_write(
b'* 1 EXPUNGE\r\n')
concurrent.push_write(
b'idle1 OK IDLE completed.\r\n')
concurrent.push_logout()
self.transport.push_login()
self.transport.push_select(b'INBOX', 4, 0, 105, wait=event1)
self.transport.push_readline(
b'store1 STORE 1 +FLAGS.SILENT (\\Deleted)\r\n')
self.transport.push_write(
b'store1 OK STORE completed.\r\n')
self.transport.push_readline(
b'close1 CLOSE\r\n')
self.transport.push_write(
b'close1 OK CLOSE completed.\r\n', set=event2)
self.transport.push_logout()
await self.run(concurrent)
| 32.793388 | 76 | 0.590222 | 502 | 3,968 | 4.527888 | 0.155378 | 0.194457 | 0.254289 | 0.105587 | 0.839859 | 0.791025 | 0.764188 | 0.692917 | 0.684558 | 0.639683 | 0 | 0.022711 | 0.289819 | 3,968 | 120 | 77 | 33.066667 | 0.783889 | 0.003024 | 0 | 0.721154 | 0 | 0 | 0.167974 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.019231 | 0 | 0.028846 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5776bab8b7cf52b8fe70de272b8ef6e01d617e1c | 43 | py | Python | stoicheia/__init__.py | LDSLab/stoicheia | e6e6241639b419b7528232a8cb528c9985c7fb7d | [
"MIT"
] | null | null | null | stoicheia/__init__.py | LDSLab/stoicheia | e6e6241639b419b7528232a8cb528c9985c7fb7d | [
"MIT"
] | 1 | 2020-02-17T21:09:39.000Z | 2020-02-17T21:09:39.000Z | stoicheia/__init__.py | LDSLab/stoicheia | e6e6241639b419b7528232a8cb528c9985c7fb7d | [
"MIT"
] | null | null | null | from .stoicheia import Catalog, Axis, Patch | 43 | 43 | 0.813953 | 6 | 43 | 5.833333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 43 | 1 | 43 | 43 | 0.921053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5791dd85dcee24b4419f5bb15cd815bebedc7884 | 204 | py | Python | api/post/admin.py | vishalda/blog-django-react | cc7eca5656958be242a80ba79b9814b210806d27 | [
"MIT"
] | null | null | null | api/post/admin.py | vishalda/blog-django-react | cc7eca5656958be242a80ba79b9814b210806d27 | [
"MIT"
] | null | null | null | api/post/admin.py | vishalda/blog-django-react | cc7eca5656958be242a80ba79b9814b210806d27 | [
"MIT"
] | null | null | null | from django.contrib import admin
from . import models
# Register your models here.
admin.site.register(models.BlogPost)
admin.site.register(models.BlogPostComment)
admin.site.register(models.BlogCategory) | 34 | 43 | 0.833333 | 27 | 204 | 6.296296 | 0.481481 | 0.158824 | 0.3 | 0.405882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.073529 | 204 | 6 | 44 | 34 | 0.899471 | 0.127451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
57f1d166a7013fe9b55118448f37e302043948d9 | 173 | py | Python | openwisp_network_topology/tests/test_users_integration.py | DaffyTheDuck/openwisp-network-topology | a8c9212f0d9cca76f83b41af0e3fc89330f408bb | [
"BSD-3-Clause"
] | null | null | null | openwisp_network_topology/tests/test_users_integration.py | DaffyTheDuck/openwisp-network-topology | a8c9212f0d9cca76f83b41af0e3fc89330f408bb | [
"BSD-3-Clause"
] | null | null | null | openwisp_network_topology/tests/test_users_integration.py | DaffyTheDuck/openwisp-network-topology | a8c9212f0d9cca76f83b41af0e3fc89330f408bb | [
"BSD-3-Clause"
] | null | null | null | from openwisp_users.tests.test_admin import TestUsersAdmin
class TestUsersIntegration(TestUsersAdmin):
"""
tests integration with openwisp_users
"""
pass
| 17.3 | 58 | 0.751445 | 17 | 173 | 7.470588 | 0.764706 | 0.204724 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184971 | 173 | 9 | 59 | 19.222222 | 0.900709 | 0.213873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 6 |
aa4210534b798deb67ed73830732b547493f402a | 18,732 | py | Python | pybind/slxos/v16r_1_00b/system_monitor_mail/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/system_monitor_mail/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/system_monitor_mail/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import fru
import sfp
import security
import interface
import relay
class system_monitor_mail(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-system-monitor - based on the path /system-monitor-mail. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__fru','__sfp','__security','__interface','__relay',)
_yang_name = 'system-monitor-mail'
_rest_name = 'system-monitor-mail'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__interface = YANGDynClass(base=interface.interface, is_container='container', presence=False, yang_name="interface", rest_name="interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure interface mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
self.__security = YANGDynClass(base=security.security, is_container='container', presence=False, yang_name="security", rest_name="security", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure security mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
self.__relay = YANGDynClass(base=YANGListType("host_ip",relay.relay, yang_name="relay", rest_name="relay", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='host-ip', extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}), is_container='list', yang_name="relay", rest_name="relay", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='list', is_config=True)
self.__sfp = YANGDynClass(base=sfp.sfp, is_container='container', presence=False, yang_name="sfp", rest_name="sfp", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure sfp mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
self.__fru = YANGDynClass(base=fru.fru, is_container='container', presence=False, yang_name="fru", rest_name="fru", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure FRU mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'system-monitor-mail']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'system-monitor-mail']
def _get_fru(self):
"""
Getter method for fru, mapped from YANG variable /system_monitor_mail/fru (container)
"""
return self.__fru
def _set_fru(self, v, load=False):
"""
Setter method for fru, mapped from YANG variable /system_monitor_mail/fru (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_fru is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_fru() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=fru.fru, is_container='container', presence=False, yang_name="fru", rest_name="fru", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure FRU mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """fru must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=fru.fru, is_container='container', presence=False, yang_name="fru", rest_name="fru", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure FRU mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)""",
})
self.__fru = t
if hasattr(self, '_set'):
self._set()
def _unset_fru(self):
self.__fru = YANGDynClass(base=fru.fru, is_container='container', presence=False, yang_name="fru", rest_name="fru", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure FRU mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
def _get_sfp(self):
"""
Getter method for sfp, mapped from YANG variable /system_monitor_mail/sfp (container)
"""
return self.__sfp
def _set_sfp(self, v, load=False):
"""
Setter method for sfp, mapped from YANG variable /system_monitor_mail/sfp (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_sfp is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_sfp() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=sfp.sfp, is_container='container', presence=False, yang_name="sfp", rest_name="sfp", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure sfp mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """sfp must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=sfp.sfp, is_container='container', presence=False, yang_name="sfp", rest_name="sfp", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure sfp mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)""",
})
self.__sfp = t
if hasattr(self, '_set'):
self._set()
def _unset_sfp(self):
self.__sfp = YANGDynClass(base=sfp.sfp, is_container='container', presence=False, yang_name="sfp", rest_name="sfp", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure sfp mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
def _get_security(self):
"""
Getter method for security, mapped from YANG variable /system_monitor_mail/security (container)
"""
return self.__security
def _set_security(self, v, load=False):
"""
Setter method for security, mapped from YANG variable /system_monitor_mail/security (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_security is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_security() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=security.security, is_container='container', presence=False, yang_name="security", rest_name="security", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure security mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """security must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=security.security, is_container='container', presence=False, yang_name="security", rest_name="security", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure security mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)""",
})
self.__security = t
if hasattr(self, '_set'):
self._set()
def _unset_security(self):
self.__security = YANGDynClass(base=security.security, is_container='container', presence=False, yang_name="security", rest_name="security", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure security mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
def _get_interface(self):
"""
Getter method for interface, mapped from YANG variable /system_monitor_mail/interface (container)
"""
return self.__interface
def _set_interface(self, v, load=False):
"""
Setter method for interface, mapped from YANG variable /system_monitor_mail/interface (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_interface is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_interface() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=interface.interface, is_container='container', presence=False, yang_name="interface", rest_name="interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure interface mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """interface must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=interface.interface, is_container='container', presence=False, yang_name="interface", rest_name="interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure interface mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)""",
})
self.__interface = t
if hasattr(self, '_set'):
self._set()
def _unset_interface(self):
self.__interface = YANGDynClass(base=interface.interface, is_container='container', presence=False, yang_name="interface", rest_name="interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure interface mail settings'}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='container', is_config=True)
def _get_relay(self):
"""
Getter method for relay, mapped from YANG variable /system_monitor_mail/relay (list)
"""
return self.__relay
def _set_relay(self, v, load=False):
"""
Setter method for relay, mapped from YANG variable /system_monitor_mail/relay (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_relay is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_relay() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("host_ip",relay.relay, yang_name="relay", rest_name="relay", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='host-ip', extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}), is_container='list', yang_name="relay", rest_name="relay", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """relay must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("host_ip",relay.relay, yang_name="relay", rest_name="relay", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='host-ip', extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}), is_container='list', yang_name="relay", rest_name="relay", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='list', is_config=True)""",
})
self.__relay = t
if hasattr(self, '_set'):
self._set()
def _unset_relay(self):
self.__relay = YANGDynClass(base=YANGListType("host_ip",relay.relay, yang_name="relay", rest_name="relay", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='host-ip', extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}), is_container='list', yang_name="relay", rest_name="relay", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Configure relay ip mail settings', u'cli-suppress-mode': None, u'callpoint': u'relay-ip-server', u'cli-suppress-key-abbreviation': None, u'cli-suppress-list-no': None}}, namespace='urn:brocade.com:mgmt:brocade-system-monitor', defining_module='brocade-system-monitor', yang_type='list', is_config=True)
fru = __builtin__.property(_get_fru, _set_fru)
sfp = __builtin__.property(_get_sfp, _set_sfp)
security = __builtin__.property(_get_security, _set_security)
interface = __builtin__.property(_get_interface, _set_interface)
relay = __builtin__.property(_get_relay, _set_relay)
_pyangbind_elements = {'fru': fru, 'sfp': sfp, 'security': security, 'interface': interface, 'relay': relay, }
| 69.895522 | 941 | 0.731102 | 2,545 | 18,732 | 5.172102 | 0.070334 | 0.047102 | 0.051052 | 0.032819 | 0.832105 | 0.812809 | 0.804604 | 0.790245 | 0.790245 | 0.77809 | 0 | 0.000369 | 0.131593 | 18,732 | 267 | 942 | 70.157303 | 0.808815 | 0.120703 | 0 | 0.439306 | 0 | 0.028902 | 0.391554 | 0.168372 | 0 | 0 | 0 | 0 | 0 | 1 | 0.104046 | false | 0 | 0.075145 | 0 | 0.300578 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a4b341a2043525c76c7480a6c0f213cc332fcd4f | 31 | py | Python | pycli_todo/__init__.py | luciafabio/pyto-do | 497fc6acb1b31a51e46a85e7dc6f9635895aeabb | [
"MIT"
] | null | null | null | pycli_todo/__init__.py | luciafabio/pyto-do | 497fc6acb1b31a51e46a85e7dc6f9635895aeabb | [
"MIT"
] | null | null | null | pycli_todo/__init__.py | luciafabio/pyto-do | 497fc6acb1b31a51e46a85e7dc6f9635895aeabb | [
"MIT"
] | null | null | null | from .pytodo import entry_point | 31 | 31 | 0.870968 | 5 | 31 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 31 | 1 | 31 | 31 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
35245bbc1f6cbc437b2e5bb90d938a53c30c9188 | 2,843 | py | Python | src/flbase/models/MLP.py | Yutong-Dai/FLBase | 1ff11799a5248f4c25a5b85e0588f567c7f33454 | [
"MIT"
] | null | null | null | src/flbase/models/MLP.py | Yutong-Dai/FLBase | 1ff11799a5248f4c25a5b85e0588f567c7f33454 | [
"MIT"
] | null | null | null | src/flbase/models/MLP.py | Yutong-Dai/FLBase | 1ff11799a5248f4c25a5b85e0588f567c7f33454 | [
"MIT"
] | null | null | null | '''
# File: CNNMnist.py
# Project: models
# Created Date: 2021-12-16 5:11
# Author: Yutong Dai yutongdai95@gmail.com
# -----
# Last Modified: 2022-04-22 9:18
# Modified By: Yutong Dai yutongdai95@gmail.com
#
# This code is published under the MIT License.
# -----
# HISTORY:
# Date By Comments
# ---------- --- ----------------------------------------------------------
'''
from ..model import Model
import torch  # needed for torch.norm in MLP.forward
import torch.nn as nn
import torch.nn.functional as F
"""
Ref: https://github.com/AshwinRJ/Federated-Learning-PyTorch/blob/master/src/models.py
"""
class MLP(Model):
def __init__(self, config):
super().__init__(config, None, None)
dim = config['dim']
self.fc1 = nn.Linear(2, 64)
self.fc2 = nn.Linear(64, dim * 4)
self.fc3 = nn.Linear(dim * 4, dim * 2)
self.feature_embedding = nn.Linear(dim * 2, dim)
def forward(self, x, normalize=False):
x = F.relu(self.fc1(x))
x = F.relu(self.fc2(x))
x = F.relu(self.fc3(x))
embedding = self.feature_embedding(x)
if normalize:
normalized_embedding = embedding / torch.norm(embedding, dim=1).view(-1,1)
return normalized_embedding
else:
return embedding
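# MLPCE shares the same trunk but ends in a 2-way linear head returning raw
# logits (the CE suffix presumably indicates cross-entropy training).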
class MLPCE(Model):
def __init__(self, config):
super().__init__(config, None, None)
dim = config['dim']
self.fc1 = nn.Linear(2, 64)
self.fc2 = nn.Linear(64, dim * 4)
self.fc3 = nn.Linear(dim * 4, dim * 2)
self.W = nn.Linear(dim * 2, 2)
def forward(self, x):
x = F.relu(self.fc1(x))
x = F.relu(self.fc2(x))
x = F.relu(self.fc3(x))
logits = self.W(x)
return logits
# class MLP(nn.Module):
# def __init__(self, dim):
# super().__init__()
# self.fc1 = nn.Linear(2, 64)
# self.fc2 = nn.Linear(64, dim * 4)
# self.fc3 = nn.Linear(dim * 4, dim * 2)
# self.feature_embedding = nn.Linear(dim * 2, dim)
# def forward(self, x, normalize=False):
# x = F.relu(self.fc1(x))
# x = F.relu(self.fc2(x))
# x = F.relu(self.fc3(x))
# embedding = self.feature_embedding(x)
# if normalize:
# normalized_embedding = embedding / torch.norm(embedding, dim=1).view(-1,1)
# return normalized_embedding
# else:
# return embedding
# class MLPCE(nn.Module):
# def __init__(self, dim):
# super().__init__()
# self.fc1 = nn.Linear(2, 64)
# self.fc2 = nn.Linear(64, dim * 4)
# self.fc3 = nn.Linear(dim * 4, dim * 2)
# self.W = nn.Linear(dim * 2, 2)
# def forward(self, x):
# x = F.relu(self.fc1(x))
# x = F.relu(self.fc2(x))
# x = F.relu(self.fc3(x))
# logits = self.W(x)
# return logits | 30.569892 | 88 | 0.537109 | 388 | 2,843 | 3.832474 | 0.226804 | 0.086079 | 0.04842 | 0.080699 | 0.801614 | 0.763954 | 0.763954 | 0.763954 | 0.763954 | 0.763954 | 0 | 0.04642 | 0.287724 | 2,843 | 93 | 89 | 30.569892 | 0.687901 | 0.510025 | 0 | 0.514286 | 0 | 0 | 0.004702 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.114286 | false | 0 | 0.085714 | 0 | 0.342857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3525013fc0fc808b74dfaf05ba21692023729240 | 37 | py | Python | tests/arithmetic/PLUS.py | past-one/rubymine-is2018 | 9b7f38bbcab0b59434e074632e9e2fb9dbd56d29 | [
"Apache-2.0"
] | null | null | null | tests/arithmetic/PLUS.py | past-one/rubymine-is2018 | 9b7f38bbcab0b59434e074632e9e2fb9dbd56d29 | [
"Apache-2.0"
] | null | null | null | tests/arithmetic/PLUS.py | past-one/rubymine-is2018 | 9b7f38bbcab0b59434e074632e9e2fb9dbd56d29 | [
"Apache-2.0"
] | null | null | null | if +1 + 1 + 2 == 4: # true
pass
| 12.333333 | 27 | 0.378378 | 7 | 37 | 2 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 0.432432 | 37 | 2 | 28 | 18.5 | 0.47619 | 0.108108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
52cd56979894d05a967668623e7db94177a987e9 | 65 | py | Python | models/configuration.py | konohitowa/catebot | 3fcbddd69cf5f875653bed2ef698e4237bc202a1 | [
"MIT",
"Unlicense"
] | 19 | 2015-06-05T23:26:37.000Z | 2022-02-15T14:10:17.000Z | models/configuration.py | konohitowa/catebot | 3fcbddd69cf5f875653bed2ef698e4237bc202a1 | [
"MIT",
"Unlicense"
] | 30 | 2015-05-01T03:23:01.000Z | 2021-11-03T17:18:39.000Z | compute/dbconn/dbconn/models/configuration.py | djfurman/well-managed-deployments | b61c9adb7212bb2f2a03f007568760ec5a36af72 | [
"BSD-3-Clause"
] | 5 | 2016-03-09T19:15:41.000Z | 2018-09-04T12:49:59.000Z | from orator import Model
class Configuration(Model):
pass
| 9.285714 | 27 | 0.738462 | 8 | 65 | 6 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.215385 | 65 | 6 | 28 | 10.833333 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |