hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
8571b015e01d4bc3adbd31be1bd0b1a22b468acd | 47 | py | Python | proud/__init__.py | RemuLang/proud | 8e833863a0cc915d8c6e588aa812b1c6d1842347 | [
"MIT"
] | 16 | 2020-01-13T12:36:33.000Z | 2022-02-18T16:08:50.000Z | proud/__init__.py | RemuLang/proud | 8e833863a0cc915d8c6e588aa812b1c6d1842347 | [
"MIT"
] | 5 | 2020-01-14T01:28:34.000Z | 2020-01-20T12:43:26.000Z | proud/__init__.py | RemuLang/proud | 8e833863a0cc915d8c6e588aa812b1c6d1842347 | [
"MIT"
] | 1 | 2022-02-18T16:08:54.000Z | 2022-02-18T16:08:54.000Z | # import proud.core_lang.modular_compiler.setup | 47 | 47 | 0.87234 | 7 | 47 | 5.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 47 | 1 | 47 | 47 | 0.866667 | 0.957447 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
8576706a90dcdb02f7a3d45bd64cb0fd34a6b9e3 | 20 | py | Python | c2dh_nerd/ned/__init__.py | theorm/c2dh-nerd | 790c945a75d25735e826fbbebb0441a00627dc1c | [
"MIT"
] | null | null | null | c2dh_nerd/ned/__init__.py | theorm/c2dh-nerd | 790c945a75d25735e826fbbebb0441a00627dc1c | [
"MIT"
] | null | null | null | c2dh_nerd/ned/__init__.py | theorm/c2dh-nerd | 790c945a75d25735e826fbbebb0441a00627dc1c | [
"MIT"
] | null | null | null | from .ned import NED | 20 | 20 | 0.8 | 4 | 20 | 4 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 20 | 1 | 20 | 20 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
a43948f803e06940ff624f03da3616c8341ac470 | 90 | py | Python | katas/kyu_7/show_multiples_of_two_numbers_within_a_range.py | the-zebulan/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 40 | 2016-03-09T12:26:20.000Z | 2022-03-23T08:44:51.000Z | katas/kyu_7/show_multiples_of_two_numbers_within_a_range.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | null | null | null | katas/kyu_7/show_multiples_of_two_numbers_within_a_range.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 36 | 2016-11-07T19:59:58.000Z | 2022-03-31T11:18:27.000Z | def multiples(s1, s2, s3):
return [a for a in xrange(1, s3) if not(a % s1 or a % s2)]
| 30 | 62 | 0.588889 | 20 | 90 | 2.65 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104478 | 0.255556 | 90 | 2 | 63 | 45 | 0.686567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
a454a54682a568dd6bd70392706923b11b4bbdc5 | 213 | py | Python | daily-problems/Day3/problem3_tests.py | jeffreycshelton/ghp-challenges | c19e6b9964818674502bb096fc87f32146af1ad0 | [
"MIT"
] | null | null | null | daily-problems/Day3/problem3_tests.py | jeffreycshelton/ghp-challenges | c19e6b9964818674502bb096fc87f32146af1ad0 | [
"MIT"
] | null | null | null | daily-problems/Day3/problem3_tests.py | jeffreycshelton/ghp-challenges | c19e6b9964818674502bb096fc87f32146af1ad0 | [
"MIT"
] | null | null | null | import unittest
from problem3 import numSubarrayBoundedMax
class TestProblem2(unittest.TestCase):
def tests(self):
return self.assertEqual(numSubarrayBoundedMax([2, 1, 4, 3], 2, 3), 3)
| 21.3 | 77 | 0.690141 | 24 | 213 | 6.125 | 0.708333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053892 | 0.215962 | 213 | 9 | 78 | 23.666667 | 0.826347 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | false | 0 | 0.4 | 0.2 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 5 |
a49901cb967fbc59d2d263b16ee2c3c86b187b4b | 859 | py | Python | optimize_images/exceptions.py | Serempre/optimize-images | e64176fa2d09d6cfee6eed5ef46de9a8a5f3332f | [
"MIT"
] | 196 | 2018-06-24T22:45:31.000Z | 2022-03-30T12:04:19.000Z | optimize_images/exceptions.py | Serempre/optimize-images | e64176fa2d09d6cfee6eed5ef46de9a8a5f3332f | [
"MIT"
] | 35 | 2018-07-02T15:17:03.000Z | 2022-01-09T18:49:54.000Z | optimize_images/exceptions.py | Serempre/optimize-images | e64176fa2d09d6cfee6eed5ef46de9a8a5f3332f | [
"MIT"
] | 42 | 2018-08-08T18:12:19.000Z | 2021-12-03T09:39:06.000Z | class OIKeyboardInterrupt(KeyboardInterrupt):
"""Exception raised when the user interrupts stops the execution using a
keyboard interrupt (typically CTRL-C).
Attributes:
message -- explanation of the error
"""
def __init__(self, message=""):
self.message = message
super().__init__(self.message)
class OIImagesNotFoundError(FileNotFoundError):
"""Exception raised when there were no images found.
Attributes:
message -- explanation of the error
"""
def __init__(self, message=""):
self.message = message
super().__init__(self.message)
class OIInvalidPathError(ValueError):
"""Exception raised when there were no images found.
Attributes:
message -- explanation of the error
"""
def __init__(self, message=""):
self.message = message
| 24.542857 | 76 | 0.664726 | 88 | 859 | 6.261364 | 0.409091 | 0.15971 | 0.136116 | 0.163339 | 0.642468 | 0.642468 | 0.642468 | 0.642468 | 0.642468 | 0.642468 | 0 | 0 | 0.240978 | 859 | 34 | 77 | 25.264706 | 0.845092 | 0.427241 | 0 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
f10c884479989f95ed2218e71596e8d7e5927788 | 41 | py | Python | test/login.py | daxiong1hao/test001 | 43cfc4a0728f0e1c8fb7c741ca864bed92b28403 | [
"MIT"
] | null | null | null | test/login.py | daxiong1hao/test001 | 43cfc4a0728f0e1c8fb7c741ca864bed92b28403 | [
"MIT"
] | null | null | null | test/login.py | daxiong1hao/test001 | 43cfc4a0728f0e1c8fb7c741ca864bed92b28403 | [
"MIT"
] | null | null | null | num = 10
age = 18
he = 200
hight = 175
| 5.857143 | 11 | 0.560976 | 8 | 41 | 2.875 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.37037 | 0.341463 | 41 | 6 | 12 | 6.833333 | 0.481481 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
f11422d67ff34fc3cce9403c517350a7e15397b1 | 899 | py | Python | app/shared/filters/dt.py | neurothrone/project-dot | 20889075611bed645689a76a30257f96e4b55988 | [
"MIT"
] | null | null | null | app/shared/filters/dt.py | neurothrone/project-dot | 20889075611bed645689a76a30257f96e4b55988 | [
"MIT"
] | null | null | null | app/shared/filters/dt.py | neurothrone/project-dot | 20889075611bed645689a76a30257f96e4b55988 | [
"MIT"
] | null | null | null | from datetime import datetime
class Day:
@staticmethod
def part_of_day(datetime_: datetime) -> str:
hour = datetime_.hour
if 5 < hour < 12:
return "Morning"
elif 12 <= hour < 17:
return "Afternoon"
elif 17 <= hour < 21:
return "Evening"
else:
return "Night"
def time_of_day_message(datetime_: datetime) -> str:
return f"Good {Day.part_of_day(datetime_)}"
def datetime_format(datetime_: datetime) -> str:
return datetime_.strftime("%Y-%m-%d %H:%M:%S")
def date_format(datetime_: datetime) -> str:
return datetime_.strftime("%Y-%m-%d")
def time_format(datetime_: datetime) -> str:
return datetime_.strftime("%H:%M:%S")
def time_difference_to_now(_datetime: datetime) -> str:
now = datetime.utcnow()
difference = now - _datetime
return str(difference.seconds // 60)
| 23.657895 | 55 | 0.624027 | 112 | 899 | 4.785714 | 0.366071 | 0.179104 | 0.212687 | 0.186567 | 0.274254 | 0.274254 | 0.274254 | 0.186567 | 0.186567 | 0.186567 | 0 | 0.019288 | 0.250278 | 899 | 37 | 56 | 24.297297 | 0.775964 | 0 | 0 | 0 | 0 | 0 | 0.104561 | 0.031146 | 0 | 0 | 0 | 0 | 0 | 1 | 0.24 | false | 0 | 0.04 | 0.16 | 0.68 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
f12db44f9dbbddb3b8ca98cfbc5e4781d3f555c8 | 60 | py | Python | src/__init__.py | pczarnowska/generalized-fairness-metrics | 06a4cea1e017e1340ecb14617b629215a8b014cf | [
"Apache-2.0"
] | 3 | 2021-10-30T12:34:32.000Z | 2022-02-24T10:27:23.000Z | src/__init__.py | pczarnowska/generalized-fairness-metrics | 06a4cea1e017e1340ecb14617b629215a8b014cf | [
"Apache-2.0"
] | 8 | 2021-08-18T19:13:53.000Z | 2022-02-02T16:06:08.000Z | src/__init__.py | pczarnowska/generalized-fairness-metrics | 06a4cea1e017e1340ecb14617b629215a8b014cf | [
"Apache-2.0"
] | 4 | 2021-08-13T15:28:28.000Z | 2022-03-29T05:25:00.000Z | from .models.readers.flexible_reader import FlexibleReader
| 20 | 58 | 0.866667 | 7 | 60 | 7.285714 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 60 | 2 | 59 | 30 | 0.927273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
f1301660ad545b063829f8a3ab12eee5e35605ca | 68 | py | Python | cgnp_patchy/lib/nanoparticles/__init__.py | cjspindel/cgnp_patchy | 12d401c90795ecddb9c4ea0433dc26c4d31d80b6 | [
"MIT"
] | null | null | null | cgnp_patchy/lib/nanoparticles/__init__.py | cjspindel/cgnp_patchy | 12d401c90795ecddb9c4ea0433dc26c4d31d80b6 | [
"MIT"
] | null | null | null | cgnp_patchy/lib/nanoparticles/__init__.py | cjspindel/cgnp_patchy | 12d401c90795ecddb9c4ea0433dc26c4d31d80b6 | [
"MIT"
] | null | null | null | from cgnp_patchy.lib.nanoparticles.Nanoparticle import Nanoparticle
| 34 | 67 | 0.897059 | 8 | 68 | 7.5 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 68 | 1 | 68 | 68 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
f144bdcec3a9940f0ed31db33331682882048c50 | 149 | py | Python | docker_images/stanza/app/pipelines/__init__.py | huggingface/api-inference-community | 5edcd6aecbb14fefc74755ac929ab9cf29ac841a | [
"Apache-2.0"
] | 2 | 2022-03-24T19:41:23.000Z | 2022-03-25T10:41:26.000Z | docker_images/flair/app/pipelines/__init__.py | huggingface/api-inference-community | 5edcd6aecbb14fefc74755ac929ab9cf29ac841a | [
"Apache-2.0"
] | 6 | 2022-03-16T12:51:45.000Z | 2022-03-17T08:40:35.000Z | docker_images/stanza/app/pipelines/__init__.py | huggingface/api-inference-community | 5edcd6aecbb14fefc74755ac929ab9cf29ac841a | [
"Apache-2.0"
] | null | null | null | from app.pipelines.base import Pipeline, PipelineException # isort:skip
from app.pipelines.token_classification import TokenClassificationPipeline
| 37.25 | 74 | 0.865772 | 16 | 149 | 8 | 0.75 | 0.109375 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087248 | 149 | 3 | 75 | 49.666667 | 0.941176 | 0.067114 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
74d65a753172bda27ddb38c48d28bf4638915ead | 55 | py | Python | msi_zarr_analysis/preprocessing/utils.py | maxime915/msi_zarr_analysis | 6be3144318952e3e0a25d2e8be1eb0e2a1595f95 | [
"Apache-2.0"
] | null | null | null | msi_zarr_analysis/preprocessing/utils.py | maxime915/msi_zarr_analysis | 6be3144318952e3e0a25d2e8be1eb0e2a1595f95 | [
"Apache-2.0"
] | null | null | null | msi_zarr_analysis/preprocessing/utils.py | maxime915/msi_zarr_analysis | 6be3144318952e3e0a25d2e8be1eb0e2a1595f95 | [
"Apache-2.0"
] | null | null | null | "utils: functions frequently used for preprocessing"
| 13.75 | 52 | 0.8 | 6 | 55 | 7.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145455 | 55 | 3 | 53 | 18.333333 | 0.93617 | 0.909091 | 0 | 0 | 0 | 0 | 0.943396 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
2d1b644a6780dbb812ed38829a0bc2250c3333e1 | 66 | py | Python | tests/core/map/__init__.py | waigore/empyres4x | aa3bbf94ea0bca280152e92d485d5825a4b352ca | [
"Apache-2.0"
] | null | null | null | tests/core/map/__init__.py | waigore/empyres4x | aa3bbf94ea0bca280152e92d485d5825a4b352ca | [
"Apache-2.0"
] | null | null | null | tests/core/map/__init__.py | waigore/empyres4x | aa3bbf94ea0bca280152e92d485d5825a4b352ca | [
"Apache-2.0"
] | null | null | null | from .testmapgen import TestMapGen
from .testwalk import TestWalk
| 22 | 34 | 0.848485 | 8 | 66 | 7 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 66 | 2 | 35 | 33 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
741c966c08c00ac18959b7a9e9b8bcd13dd90da8 | 98 | py | Python | bikeshed/enum/__init__.py | tidoust/bikeshed | 39652c9ea7fd5d09639da5e1182cd481b0c5ae70 | [
"CC0-1.0"
] | null | null | null | bikeshed/enum/__init__.py | tidoust/bikeshed | 39652c9ea7fd5d09639da5e1182cd481b0c5ae70 | [
"CC0-1.0"
] | null | null | null | bikeshed/enum/__init__.py | tidoust/bikeshed | 39652c9ea7fd5d09639da5e1182cd481b0c5ae70 | [
"CC0-1.0"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import division, unicode_literals
from .Enum import Enum
| 19.6 | 49 | 0.734694 | 13 | 98 | 5.153846 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012048 | 0.153061 | 98 | 4 | 50 | 24.5 | 0.795181 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
7459c9b0286a1d570fc351aa5690e1aa06d40ccf | 514 | py | Python | openapi_core/contrib/django/compat.py | correl/openapi-core | 5dcb0dba7642f9959bb7af5e40f3fdeb1b2cc85f | [
"BSD-3-Clause"
] | 1 | 2021-07-14T14:43:22.000Z | 2021-07-14T14:43:22.000Z | openapi_core/contrib/django/compat.py | ashb/openapi-core | c4fab4c4b3830bff5b90c3a0ac8dc62e67ee78a3 | [
"BSD-3-Clause"
] | null | null | null | openapi_core/contrib/django/compat.py | ashb/openapi-core | c4fab4c4b3830bff5b90c3a0ac8dc62e67ee78a3 | [
"BSD-3-Clause"
] | null | null | null | """OpenAPI core contrib django compat module"""
from openapi_core.contrib.django.backports import (
HttpHeaders, request_current_scheme_host,
)
def get_headers(req):
# in Django 1 headers is not defined
return req.headers if hasattr(req, 'headers') else \
HttpHeaders(req.META)
def get_current_scheme_host(req):
# in Django 1 _current_scheme_host is not defined
return req._current_scheme_host if hasattr(req, '_current_scheme_host') \
else request_current_scheme_host(req)
| 30.235294 | 77 | 0.749027 | 72 | 514 | 5.069444 | 0.388889 | 0.213699 | 0.279452 | 0.131507 | 0.115068 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004706 | 0.173152 | 514 | 16 | 78 | 32.125 | 0.854118 | 0.243191 | 0 | 0 | 0 | 0 | 0.070681 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.222222 | 0.555556 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
745f7484ae58b61b2a60624fe2d84f15f3c773d4 | 107 | py | Python | include/tclap-1.4.0-rc1/tests/test36.py | SpaceKatt/cpp-cli-poc | 02ffefea2fc6e999fa2b27d08a8b3be6830b1b97 | [
"BSL-1.0"
] | 62 | 2021-09-21T18:58:02.000Z | 2022-03-07T02:17:43.000Z | third_party/tclap-1.4.0-rc1/tests/test36.py | Vertexwahn/FlatlandRT | 37d09fde38b25eff5f802200b43628efbd1e3198 | [
"Apache-2.0"
] | null | null | null | third_party/tclap-1.4.0-rc1/tests/test36.py | Vertexwahn/FlatlandRT | 37d09fde38b25eff5f802200b43628efbd1e3198 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
import simple_test
simple_test.test("test6", ["-n", "homer", "6", ], expect_fail=True)
| 17.833333 | 67 | 0.663551 | 16 | 107 | 4.25 | 0.8125 | 0.294118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020833 | 0.102804 | 107 | 5 | 68 | 21.4 | 0.6875 | 0.149533 | 0 | 0 | 0 | 0 | 0.144444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
74928ec0933eeb172b3b1ee9a181a16939e4233b | 38 | py | Python | data_postp/__init__.py | jonasrothfuss/DeepEpisodicMemory | 1095315a5d75a4840ef4017af70432e2dd535e4c | [
"MIT"
] | 13 | 2017-02-03T17:17:04.000Z | 2021-01-27T09:29:53.000Z | data_postp/__init__.py | jonasrothfuss/DeepEpisodicMemory | 1095315a5d75a4840ef4017af70432e2dd535e4c | [
"MIT"
] | 1 | 2019-01-07T23:53:51.000Z | 2019-01-07T23:53:51.000Z | data_postp/__init__.py | jonasrothfuss/DeepEpisodicMemory | 1095315a5d75a4840ef4017af70432e2dd535e4c | [
"MIT"
] | 8 | 2017-02-03T17:17:41.000Z | 2021-01-13T10:53:39.000Z | from .similarity_computations import * | 38 | 38 | 0.868421 | 4 | 38 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 38 | 1 | 38 | 38 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
777f4448b7ea9bc400b3d438a6ff04bc6deccefd | 59 | py | Python | iysql/util/__init__.py | influx-code/lysql | 47b7a791c27dcf108d971a0fe1213381214ff76b | [
"Apache-2.0"
] | 84 | 2018-10-24T11:35:22.000Z | 2021-11-04T10:13:53.000Z | iysql/util/__init__.py | influx-code/lysql | 47b7a791c27dcf108d971a0fe1213381214ff76b | [
"Apache-2.0"
] | 15 | 2018-10-25T08:40:14.000Z | 2021-07-02T08:35:48.000Z | iysql/util/__init__.py | influx-code/lysql | 47b7a791c27dcf108d971a0fe1213381214ff76b | [
"Apache-2.0"
] | 23 | 2018-10-27T13:35:31.000Z | 2020-03-09T06:33:58.000Z | from .util import execute_commond_get_stdout, get_databases | 59 | 59 | 0.898305 | 9 | 59 | 5.444444 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067797 | 59 | 1 | 59 | 59 | 0.890909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
77b7acd1fca1bd9e0ca4b58a26a48ed9184b546a | 11,982 | py | Python | tests/test_elasticecshandler.py | innovmetric/python-elasticsearch-ecs-logger | a7993c1db6b06c5205c676d7e5d56b4804f5f9e2 | [
"Apache-2.0"
] | 4 | 2020-12-30T08:25:07.000Z | 2021-07-01T07:36:33.000Z | tests/test_elasticecshandler.py | innovmetric/python-elasticsearch-ecs-logger | a7993c1db6b06c5205c676d7e5d56b4804f5f9e2 | [
"Apache-2.0"
] | null | null | null | tests/test_elasticecshandler.py | innovmetric/python-elasticsearch-ecs-logger | a7993c1db6b06c5205c676d7e5d56b4804f5f9e2 | [
"Apache-2.0"
] | 6 | 2020-12-30T01:56:35.000Z | 2022-03-22T02:54:46.000Z | import logging
import os
import sys
import time
import unittest
sys.path.insert(0, os.path.abspath('.'))
from elasticecslogging.handlers import ElasticECSHandler
class ElasticECSHandlerTestCase(unittest.TestCase):
DEFAULT_ES_SERVER = 'localhost'
DEFAULT_ES_PORT = 9200
def getESHost(self):
return os.getenv('TEST_ES_SERVER', ElasticECSHandlerTestCase.DEFAULT_ES_SERVER)
def getESPort(self):
try:
return int(os.getenv('TEST_ES_PORT', ElasticECSHandlerTestCase.DEFAULT_ES_PORT))
except ValueError:
return ElasticECSHandlerTestCase.DEFAULT_ES_PORT
def setUp(self):
self.log = logging.getLogger("MyTestCase")
test_handler = logging.StreamHandler(stream=sys.stderr)
self.log.addHandler(test_handler)
def tearDown(self):
del self.log
def test_ping(self):
handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
auth_type=ElasticECSHandler.AuthType.NO_AUTH,
es_index_name="pythontest",
use_ssl=False,
raise_on_indexing_exceptions=True)
es_test_server_is_up = handler.test_es_source()
self.assertEqual(True, es_test_server_is_up)
def test_buffered_log_insertion_flushed_when_buffer_full(self):
handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
auth_type=ElasticECSHandler.AuthType.NO_AUTH,
use_ssl=False,
buffer_size=2,
flush_frequency_in_sec=1000,
es_index_name="pythontest",
raise_on_indexing_exceptions=True)
es_test_server_is_up = handler.test_es_source()
self.log.info("ES services status is: {0!s}".format(es_test_server_is_up))
self.assertEqual(True, es_test_server_is_up)
log = logging.getLogger("PythonTest")
log.setLevel(logging.DEBUG)
log.addHandler(handler)
log.warning("First Message")
log.info("Seccond Message")
self.assertEqual(0, len(handler._buffer))
handler.close()
def test_es_log_with_additional_env_fields(self):
handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
auth_type=ElasticECSHandler.AuthType.NO_AUTH,
use_ssl=False,
es_index_name="pythontest",
es_additional_fields={'App': 'Test', 'Nested': {'One': '1', 'Two': '2'}},
es_additional_fields_in_env={'App': 'ENV_APP', 'Environment': 'ENV_ENV',
'Nested': {'One': 'ENV_ONE'}},
raise_on_indexing_exceptions=True)
es_test_server_is_up = handler.test_es_source()
self.log.info("ES services status is: {0!s}".format(es_test_server_is_up))
self.assertEqual(True, es_test_server_is_up)
log = logging.getLogger("PythonTest")
log.addHandler(handler)
log.warning("Test1 without environment variables set.")
self.assertEqual(1, len(handler._buffer))
self.assertEqual('Test', handler._buffer[0]['App'])
self.assertEqual('1', handler._buffer[0]['Nested']['One'])
self.assertEqual('2', handler._buffer[0]['Nested']['Two'])
self.assertNotIn('Environment', handler._buffer[0])
handler.flush()
self.assertEqual(0, len(handler._buffer))
os.environ['ENV_APP'] = 'Test2'
os.environ['ENV_ENV'] = 'Dev'
os.environ['ENV_ONE'] = 'One'
log.warning("Test2 with environment variables set.")
self.assertEqual(1, len(handler._buffer))
self.assertEqual('Test2', handler._buffer[0]['App'])
self.assertEqual('Dev', handler._buffer[0]['Environment'])
self.assertEqual('One', handler._buffer[0]['Nested']['One'])
self.assertEqual('2', handler._buffer[0]['Nested']['Two'])
del os.environ['ENV_APP']
del os.environ['ENV_ENV']
del os.environ['ENV_ONE']
handler.flush()
self.assertEqual(0, len(handler._buffer))
def test_es_log_extra_argument_insertion(self):
self.log.info("About to test elasticsearch insertion")
handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
auth_type=ElasticECSHandler.AuthType.NO_AUTH,
use_ssl=False,
es_index_name="pythontest",
es_additional_fields={'App': 'Test', 'Environment': 'Dev',
'Nested': {'One': '1', 'Two': '2'}},
raise_on_indexing_exceptions=True)
es_test_server_is_up = handler.test_es_source()
self.log.info("ES services status is: {0!s}".format(es_test_server_is_up))
self.assertEqual(True, es_test_server_is_up)
log = logging.getLogger("PythonTest")
log.addHandler(handler)
log.warning("Extra arguments Message", extra={"Arg1": 300, "Arg2": 400})
log.warning("Another Log")
self.assertEqual(2, len(handler._buffer))
self.assertEqual(300, handler._buffer[0]['Arg1'])
self.assertEqual(400, handler._buffer[0]['Arg2'])
self.assertEqual('Test', handler._buffer[0]['App'])
self.assertEqual('Dev', handler._buffer[0]['Environment'])
self.assertEqual('1', handler._buffer[0]['Nested']['One'])
self.assertEqual('2', handler._buffer[0]['Nested']['Two'])
handler.flush()
self.assertEqual(0, len(handler._buffer))
def test_es_log_exception_insertion(self):
handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
auth_type=ElasticECSHandler.AuthType.NO_AUTH,
use_ssl=False,
es_index_name="pythontest",
raise_on_indexing_exceptions=True)
es_test_server_is_up = handler.test_es_source()
self.log.info("ES services status is: {0!s}".format(es_test_server_is_up))
self.assertEqual(True, es_test_server_is_up)
log = logging.getLogger("PythonTest")
log.addHandler(handler)
try:
_ = 21/0
except ZeroDivisionError:
log.exception('Division Error')
self.assertEqual(1, len(handler._buffer))
handler.flush()
self.assertEqual(0, len(handler._buffer))
def test_buffered_log_insertion_after_interval_expired(self):
handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
auth_type=ElasticECSHandler.AuthType.NO_AUTH,
use_ssl=False,
flush_frequency_in_sec=0.1,
es_index_name="pythontest",
raise_on_indexing_exceptions=True)
es_test_server_is_up = handler.test_es_source()
self.log.info("ES services status is: {0!s}".format(es_test_server_is_up))
self.assertEqual(True, es_test_server_is_up)
log = logging.getLogger("PythonTest")
log.addHandler(handler)
log.warning("Warning Message")
self.assertEqual(1, len(handler._buffer))
time.sleep(1)
self.assertEqual(0, len(handler._buffer))
def test_fast_insertion_of_hundred_logs(self):
handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
auth_type=ElasticECSHandler.AuthType.NO_AUTH,
use_ssl=False,
buffer_size=500,
flush_frequency_in_sec=0.5,
es_index_name="pythontest",
raise_on_indexing_exceptions=True)
log = logging.getLogger("PythonTest")
log.setLevel(logging.DEBUG)
log.addHandler(handler)
for i in range(100):
log.info("Logging line {0:d}".format(i), extra={'LineNum': i})
handler.flush()
self.assertEqual(0, len(handler._buffer))
    def test_index_name_frequency_functions(self):
        index_name = "pythontest"

        handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
                                    auth_type=ElasticECSHandler.AuthType.NO_AUTH,
                                    es_index_name=index_name,
                                    use_ssl=False,
                                    index_name_frequency=ElasticECSHandler.IndexNameFrequency.DAILY,
                                    raise_on_indexing_exceptions=True)
        self.assertEqual(
            ElasticECSHandler._get_daily_index_name(index_name),
            handler._index_name_func.__func__(index_name)
        )

        handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
                                    auth_type=ElasticECSHandler.AuthType.NO_AUTH,
                                    es_index_name=index_name,
                                    use_ssl=False,
                                    index_name_frequency=ElasticECSHandler.IndexNameFrequency.WEEKLY,
                                    raise_on_indexing_exceptions=True)
        self.assertEqual(
            ElasticECSHandler._get_weekly_index_name(index_name),
            handler._index_name_func.__func__(index_name)
        )

        handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
                                    auth_type=ElasticECSHandler.AuthType.NO_AUTH,
                                    es_index_name=index_name,
                                    use_ssl=False,
                                    index_name_frequency=ElasticECSHandler.IndexNameFrequency.MONTHLY,
                                    raise_on_indexing_exceptions=True)
        self.assertEqual(
            ElasticECSHandler._get_monthly_index_name(index_name),
            handler._index_name_func.__func__(index_name)
        )

        handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
                                    auth_type=ElasticECSHandler.AuthType.NO_AUTH,
                                    es_index_name=index_name,
                                    use_ssl=False,
                                    index_name_frequency=ElasticECSHandler.IndexNameFrequency.YEARLY,
                                    raise_on_indexing_exceptions=True)
        self.assertEqual(
            ElasticECSHandler._get_yearly_index_name(index_name),
            handler._index_name_func.__func__(index_name)
        )

        handler = ElasticECSHandler(hosts=[{'host': self.getESHost(), 'port': self.getESPort()}],
                                    auth_type=ElasticECSHandler.AuthType.NO_AUTH,
                                    es_index_name=index_name,
                                    use_ssl=False,
                                    index_name_frequency=ElasticECSHandler.IndexNameFrequency.NEVER,
                                    raise_on_indexing_exceptions=True)
        self.assertEqual(
            ElasticECSHandler._get_never_index_name(index_name),
            handler._index_name_func.__func__(index_name)
        )
if __name__ == '__main__':
    unittest.main()
# src/data/image_url.py (from haoweini/spotify_stream, MIT license)
def path_to_image_html(path):
    return '<img src="' + path + '" width="60" >'
# echo/__init__.py (from crvernon/echo, BSD-2-Clause license)
from .slurm import *
from .worker import *
from .fetch import *
from .config import *

__version__ = "0.0.0"
# tests/unit/offshoot/test_offshoot.py (from SerpentAI/offshoot, Apache-2.0 license)
# -*- coding: utf-8 -*-
import pytest

import offshoot

from pluggable import TestPluggable

from plugins.TestPlugin.plugin import TestPlugin
from plugins.TestPlugin2.plugin import TestPlugin2
from plugins.TestInvalidPlugin.plugin import TestInvalidPlugin

import yaml
import subprocess
import types

import os
import os.path

import inspect
import json


# Tests
def test_setup():
    os.symlink("tests/unit/offshoot/plugins", "plugins")
    os.symlink("tests/unit/offshoot/config", "config")
    os.symlink("tests/unit/offshoot/libraries", "libraries")


def test_importing_the_module_should_expose_a_config_variable():
    assert hasattr(offshoot, "config")
    assert isinstance(offshoot.config, dict)


def test_importing_the_module_should_expose_a_lambda_to_map_pluggable_classes():
    assert hasattr(offshoot, "pluggable_classes")
    assert isinstance(offshoot.pluggable_classes, types.LambdaType)


def test_importing_the_module_should_expose_the_functions_from_base():
    assert hasattr(offshoot, "validate_plugin_file")
    assert hasattr(offshoot, "installed_plugins")
    assert hasattr(offshoot, "discover")

    assert isinstance(offshoot.validate_plugin_file, types.FunctionType)
    assert isinstance(offshoot.installed_plugins, types.FunctionType)
    assert isinstance(offshoot.discover, types.FunctionType)


def test_base_should_provide_a_function_to_get_a_default_and_complete_configuration():
    assert hasattr(offshoot, "default_configuration")
    assert isinstance(offshoot.default_configuration, types.FunctionType)

    config = offshoot.default_configuration()

    assert isinstance(config, dict)
    assert "modules" in config
    assert "file_paths" in config
    assert "allow" in config
    assert "sandbox_configuration_keys" in config
def test_base_should_be_able_to_load_a_configuration_from_an_existing_file():
    config = offshoot.default_configuration()

    config["modules"].append("test")
    config["sandbox_configuration_keys"] = False
    config["allow"]["callbacks"] = False

    with open("offshoot.test.yml", "w") as f:
        yaml.dump(config, f)

    loaded_config = offshoot.load_configuration("offshoot.test.yml")

    assert isinstance(loaded_config, dict)
    assert "test" in loaded_config["modules"]
    assert loaded_config["sandbox_configuration_keys"] is False
    assert loaded_config["allow"]["callbacks"] is False

    subprocess.call(["rm", "-f", "offshoot.test.yml"])


def test_base_should_load_the_default_configuration_if_the_configuration_file_does_not_exist():
    config = offshoot.load_configuration("offshoot.test.yml")

    assert isinstance(config, dict)
    assert len(config["modules"]) == 0
    assert config["sandbox_configuration_keys"] is True
    assert config["allow"]["callbacks"] is True


def test_base_should_be_able_to_generate_a_configuration_file():
    offshoot.generate_configuration_file()

    assert os.path.isfile("offshoot.yml")

    config = offshoot.load_configuration("offshoot.yml")

    assert isinstance(config, dict)
    assert len(config["modules"]) == 0
    assert config["sandbox_configuration_keys"] is True
    assert config["allow"]["callbacks"] is True

    subprocess.call(["rm", "-f", "offshoot.yml"])


def test_base_should_be_able_to_extract_pluggable_classes_according_to_the_configuration():
    config = offshoot.default_configuration()

    assert isinstance(offshoot.map_pluggable_classes(config), dict)
    assert len(offshoot.map_pluggable_classes(config)) == 0

    config["modules"].append("pluggable")

    pluggable_classes = offshoot.map_pluggable_classes(config)

    assert isinstance(pluggable_classes, dict)
    assert "TestPluggable" in pluggable_classes
    assert "TestPluggableInvalid" not in pluggable_classes
    assert inspect.isclass(pluggable_classes["TestPluggable"])
def test_base_should_be_able_to_validate_a_plugin_file_according_to_its_pluggable():
    config = offshoot.default_configuration()
    config["modules"].append("pluggable")

    pluggable_classes = offshoot.map_pluggable_classes(config)

    validation_result = offshoot.validate_plugin_file(
        "tests/unit/offshoot/plugins/TestPlugin/files/test_plugin_pluggable.py",
        "TestPluggable",
        pluggable_classes["TestPluggable"].method_directives()
    )

    assert validation_result[0] is False
    assert "expected methods are missing" in validation_result[1][0]

    validation_result = offshoot.validate_plugin_file(
        "tests/unit/offshoot/plugins/TestPlugin/files/test_plugin_pluggable_expected.py",
        "TestPluggable",
        pluggable_classes["TestPluggable"].method_directives()
    )

    assert validation_result[0] is True
    assert len(validation_result[1]) == 0

    validation_result = offshoot.validate_plugin_file(
        "tests/unit/offshoot/plugins/TestPlugin/files/test_plugin_pluggable_accepted.py",
        "TestPluggable",
        pluggable_classes["TestPluggable"].method_directives()
    )

    assert validation_result[0] is False
    assert "expected methods are missing" in validation_result[1][0]

    validation_result = offshoot.validate_plugin_file(
        "tests/unit/offshoot/plugins/TestPlugin/files/test_plugin_pluggable_forbidden.py",
        "TestPluggable",
        pluggable_classes["TestPluggable"].method_directives()
    )

    assert validation_result[0] is False

    expected_messages = {
        "expected methods are missing",
        "method should not appear in the class"
    }

    for validation_message in validation_result[1]:
        for expected_message in list(expected_messages):
            if expected_message in validation_message:
                expected_messages.remove(expected_message)

    assert len(expected_messages) == 0
def test_base_should_be_able_to_return_a_list_of_installed_plugins():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.uninstall()
    assert len(offshoot.installed_plugins()) == 0

    TestPlugin.install()
    assert len(offshoot.installed_plugins()) == 1
    assert "TestPlugin - 0.1.0" in offshoot.installed_plugins()

    TestPlugin.uninstall()
    assert len(offshoot.installed_plugins()) == 0

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_base_should_be_able_to_discover_installed_plugins_for_a_specified_pluggable():
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    offshoot.config["modules"].append("pluggable")

    TestPlugin.install()

    class_mapping = offshoot.discover("TestPluggable", globals())

    assert isinstance(class_mapping, dict)
    assert len(class_mapping) == 0

    assert inspect.isclass(TestPluginPluggableExpected)

    TestPlugin.uninstall()

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_base_should_be_able_to_discover_installed_plugins_for_a_specified_pluggable_with_no_scope_passed_along():
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    offshoot.config["modules"].append("pluggable")

    TestPlugin.install()

    class_mapping = offshoot.discover("TestPluggable")

    assert isinstance(class_mapping, dict)
    assert len(class_mapping) == 1
    assert "TestPluginPluggableExpected" in class_mapping
    assert inspect.isclass(class_mapping["TestPluginPluggableExpected"])

    TestPlugin.uninstall()

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
def test_base_should_be_able_to_discover_installed_plugins_for_a_specified_pluggable_with_selection_passed_along():
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    offshoot.config["modules"].append("pluggable")

    TestPlugin.install()

    class_mapping = offshoot.discover("TestPluggable", selection="123")

    assert isinstance(class_mapping, dict)
    assert len(class_mapping) == 0

    class_mapping = offshoot.discover("TestPluggable", selection=["123"])

    assert isinstance(class_mapping, dict)
    assert len(class_mapping) == 0

    class_mapping = offshoot.discover("TestPluggable", selection=["TestPluginPluggableExpected"])

    assert isinstance(class_mapping, dict)
    assert len(class_mapping) == 1
    assert "TestPluginPluggableExpected" in class_mapping
    assert inspect.isclass(class_mapping["TestPluginPluggableExpected"])

    TestPlugin.uninstall()

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_base_should_be_able_to_determine_if_a_file_implements_a_specified_pluggable():
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    offshoot.config["modules"].append("pluggable")

    TestPlugin.install()

    manifest = offshoot.Manifest()
    files = manifest.plugin_files_for_pluggable("TestPluggable")

    valid, plugin_class = offshoot.file_contains_pluggable(files[0][0], "TestPluggable")
    assert valid is True
    assert plugin_class == "TestPluginPluggableExpected"

    valid, plugin_class = offshoot.file_contains_pluggable("INVALID.txt", "TestPluggable")
    assert valid is False
    assert plugin_class is None

    valid, plugin_class = offshoot.file_contains_pluggable(files[0][0], "InvalidPluggable")
    assert valid is False
    assert plugin_class is None

    TestPlugin.uninstall()

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
def test_base_magic_decorators_should_not_do_anything():
    func = "I AM FUNCTION"

    assert offshoot.accepted(func) == "I AM FUNCTION"
    assert offshoot.expected(func) == "I AM FUNCTION"
    assert offshoot.forbidden(func) == "I AM FUNCTION"


def test_manifest_should_create_manifest_file_if_it_does_not_exist_on_initialization():
    os.remove("offshoot.manifest.json")

    offshoot.Manifest()

    assert os.path.isfile("offshoot.manifest.json")

    with open("offshoot.manifest.json", "r") as f:
        assert "plugins" in json.loads(f.read())


def test_manifest_should_be_able_to_list_installed_plugins_along_with_their_metadata():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.install()

    manifest = offshoot.Manifest()
    plugins = manifest.list_plugins()

    assert "TestPlugin" in plugins
    assert plugins["TestPlugin"].get("name") == "TestPlugin"
    assert plugins["TestPlugin"].get("version") == "0.1.0"
    assert "files" in plugins["TestPlugin"]
    assert "config" in plugins["TestPlugin"]
    assert "libraries" in plugins["TestPlugin"]

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
def test_manifest_should_be_able_to_determine_if_a_specific_plugin_is_installed():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.install()

    manifest = offshoot.Manifest()
    assert manifest.contains_plugin("TestPlugin")

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_manifest_should_be_able_to_add_a_plugin_and_its_metadata():
    os.remove("offshoot.manifest.json")

    manifest = offshoot.Manifest()
    manifest.add_plugin("TestPlugin")

    plugins = manifest.list_plugins()

    assert "TestPlugin" in plugins
    assert plugins["TestPlugin"].get("name") == "TestPlugin"
    assert plugins["TestPlugin"].get("version") == "0.1.0"
    assert "files" in plugins["TestPlugin"]
    assert "config" in plugins["TestPlugin"]
    assert "libraries" in plugins["TestPlugin"]


def test_manifest_should_be_able_to_remove_a_plugin_and_its_metadata():
    manifest = offshoot.Manifest()

    plugins = manifest.list_plugins()

    assert "TestPlugin" in plugins
    assert plugins["TestPlugin"].get("name") == "TestPlugin"
    assert plugins["TestPlugin"].get("version") == "0.1.0"
    assert "files" in plugins["TestPlugin"]
    assert "config" in plugins["TestPlugin"]
    assert "libraries" in plugins["TestPlugin"]

    manifest.remove_plugin("TestPlugin")

    assert len(manifest.list_plugins()) == 0

    os.remove("offshoot.manifest.json")
def test_manifest_should_be_able_to_return_all_file_names_containing_a_specific_pluggable():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.install()

    manifest = offshoot.Manifest()
    plugin_files = manifest.plugin_files_for_pluggable("TestPluggable")

    assert len(plugin_files) == 1
    assert plugin_files[0][0] == "plugins/TestPlugin/files/test_plugin_pluggable_expected.py"
    assert plugin_files[0][1] == "TestPluggable"

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_pluggable_should_be_able_to_return_its_method_directives():
    method_directives = TestPluggable.method_directives()

    assert "expected" in method_directives
    assert "accepted" in method_directives
    assert "forbidden" in method_directives

    assert "expected_function" in method_directives["expected"]
    assert "accepted_function" in method_directives["accepted"]
    assert "forbidden_function" in method_directives["forbidden"]


def test_pluggable_should_be_able_to_determine_the_methods_tagged_with_a_specific_decorator():
    assert "expected_function" in TestPluggable.methods_with_decorator("expected")
    assert "accepted_function" not in TestPluggable.methods_with_decorator("expected")
    assert "forbidden_function" not in TestPluggable.methods_with_decorator("expected")

    assert "expected_function" not in TestPluggable.methods_with_decorator("accepted")
    assert "accepted_function" in TestPluggable.methods_with_decorator("accepted")
    assert "forbidden_function" not in TestPluggable.methods_with_decorator("accepted")

    assert "expected_function" not in TestPluggable.methods_with_decorator("forbidden")
    assert "accepted_function" not in TestPluggable.methods_with_decorator("forbidden")
    assert "forbidden_function" in TestPluggable.methods_with_decorator("forbidden")
def test_pluggable_should_trigger_a_callback_on_file_install(mocker):
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False

    mocker.spy(TestPluggable, "on_file_install")

    TestPlugin.install()
    assert TestPluggable.on_file_install.call_count == 1

    TestPlugin.uninstall()

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True


def test_pluggable_should_trigger_a_callback_on_file_uninstall(mocker):
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False

    mocker.spy(TestPluggable, "on_file_uninstall")

    TestPlugin.install()
    TestPlugin.uninstall()

    assert TestPluggable.on_file_uninstall.call_count == 1

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True


def test_plugin_respects_configuration_allow_flags_on_install(mocker):
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    mocker.spy(TestPlugin, "install_files")
    mocker.spy(TestPlugin, "install_configuration")
    mocker.spy(TestPlugin, "install_libraries")
    mocker.spy(TestPlugin, "on_install")

    TestPlugin.install()

    assert TestPlugin.install_files.call_count == 0
    assert TestPlugin.install_configuration.call_count == 0
    assert TestPlugin.install_libraries.call_count == 0
    assert TestPlugin.on_install.call_count == 0

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True

    TestPlugin.install()

    assert TestPlugin.install_files.call_count == 1
    assert TestPlugin.install_configuration.call_count == 1
    assert TestPlugin.install_libraries.call_count == 1
    assert TestPlugin.on_install.call_count == 1

    TestPlugin.uninstall()
def test_plugin_adds_a_manifest_entry_on_install(monkeypatch):
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    global add_plugin_called
    add_plugin_called = False

    class MockManifest:
        def add_plugin(self, name):
            global add_plugin_called
            add_plugin_called = True

    manifest = MockManifest()
    monkeypatch.setattr(offshoot.Manifest, "add_plugin", manifest.add_plugin)

    TestPlugin.install()
    assert add_plugin_called

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_respects_configuration_allow_flags_on_uninstall(mocker):
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    mocker.spy(TestPlugin, "uninstall_files")
    mocker.spy(TestPlugin, "uninstall_configuration")
    mocker.spy(TestPlugin, "uninstall_libraries")
    mocker.spy(TestPlugin, "on_uninstall")

    TestPlugin.install()
    TestPlugin.uninstall()

    assert TestPlugin.uninstall_files.call_count == 0
    assert TestPlugin.uninstall_configuration.call_count == 0
    assert TestPlugin.uninstall_libraries.call_count == 0
    assert TestPlugin.on_uninstall.call_count == 0

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True

    TestPlugin.install()
    TestPlugin.uninstall()

    assert TestPlugin.uninstall_files.call_count == 1
    assert TestPlugin.uninstall_configuration.call_count == 1
    assert TestPlugin.uninstall_libraries.call_count == 1
    assert TestPlugin.on_uninstall.call_count == 1
def test_plugin_removes_a_manifest_entry_on_uninstall(monkeypatch):
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    global remove_plugin_called
    remove_plugin_called = False

    class MockManifest:
        def remove_plugin(self, name):
            global remove_plugin_called
            remove_plugin_called = True

    manifest = MockManifest()
    monkeypatch.setattr(offshoot.Manifest, "remove_plugin", manifest.remove_plugin)

    TestPlugin.install()
    TestPlugin.uninstall()

    assert remove_plugin_called

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_files_are_validated_against_the_pluggable_specification_on_file_install(mocker):
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    mocker.spy(TestPlugin, "_validate_file_for_pluggable")

    TestPlugin.install()

    TestPlugin._validate_file_for_pluggable.assert_called_once_with(
        "plugins/TestPlugin/files/test_plugin_pluggable_expected.py",
        "TestPluggable"
    )

    TestPlugin.uninstall()

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_files_are_uninstalled_on_file_install_error(mocker, monkeypatch):
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    mocker.spy(TestPluggable, "on_file_uninstall")

    global remove_plugin_called
    remove_plugin_called = False

    class MockManifest:
        def remove_plugin(self, name):
            global remove_plugin_called
            remove_plugin_called = True

    manifest = MockManifest()
    monkeypatch.setattr(offshoot.Manifest, "remove_plugin", manifest.remove_plugin)

    with pytest.raises(offshoot.PluginError):
        TestInvalidPlugin.install()

    assert TestPluggable.on_file_uninstall.call_count == 1
    assert remove_plugin_called

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
def test_plugin_file_install_callbacks_are_sent_on_file_install(mocker):
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    mocker.spy(TestPluggable, "on_file_install")

    TestPlugin.install()

    assert TestPluggable.on_file_install.call_count == 1
    TestPluggable.on_file_install.assert_called_once_with(
        path="test_plugin_pluggable_expected.py",
        pluggable="TestPluggable"
    )

    TestPlugin.uninstall()

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_file_uninstall_callbacks_are_sent_on_file_uninstall(mocker):
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    mocker.spy(TestPluggable, "on_file_uninstall")

    TestPlugin.install()
    TestPlugin.uninstall()

    assert TestPluggable.on_file_uninstall.call_count == 1
    TestPluggable.on_file_uninstall.assert_called_once_with(
        path="test_plugin_pluggable_expected.py",
        pluggable="TestPluggable"
    )

    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_an_error_should_be_raised_on_configuration_install_if_the_configuration_directory_is_absent():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    os.remove("config")

    with pytest.raises(offshoot.PluginError):
        TestPlugin.install()

    os.symlink("tests/unit/offshoot/config", "config")

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
def test_plugin_nothing_should_happen_if_the_plugin_has_no_configuration_on_configuration_install():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.config = None
    assert TestPlugin.install_configuration() is None

    TestPlugin.config = {}
    assert TestPlugin.install_configuration() is None

    TestPlugin.config = {
        "is_test": True
    }

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_the_configuration_keys_should_be_written_as_is_if_the_configuration_file_does_not_exist_on_configuration_install():
    if os.path.isfile(offshoot.config["file_paths"]["config"]):
        os.remove(offshoot.config["file_paths"]["config"])

    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False
    offshoot.config["sandbox_configuration_keys"] = False

    TestPlugin.install()

    with open(offshoot.config["file_paths"]["config"], "r") as f:
        config = yaml.safe_load(f)

    assert "is_test" in config
    assert config["is_test"] is True

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
    offshoot.config["sandbox_configuration_keys"] = True


def test_plugin_the_configuration_keys_should_be_merged_properly_with_an_existing_configuration_file_on_configuration_install():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False
    offshoot.config["sandbox_configuration_keys"] = False

    with open(offshoot.config["file_paths"]["config"], "r") as f:
        config = yaml.safe_load(f)

    config["is_extra"] = True

    with open(offshoot.config["file_paths"]["config"], "w") as f:
        yaml.dump(config, f)

    TestPlugin.install()

    with open(offshoot.config["file_paths"]["config"], "r") as f:
        config = yaml.safe_load(f)

    assert "is_test" in config
    assert config["is_test"] is True
    assert "is_extra" in config
    assert config["is_extra"] is True

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
    offshoot.config["sandbox_configuration_keys"] = True
def test_plugin_the_configuration_keys_should_be_sandboxed_if_the_option_is_set_on_configuration_install():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.install()

    with open(offshoot.config["file_paths"]["config"], "r") as f:
        config = yaml.safe_load(f)

    assert "TestPlugin" in config
    assert "is_test" in config["TestPlugin"]
    assert config["TestPlugin"]["is_test"] is True

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_an_error_should_be_raised_on_configuration_uninstall_if_the_configuration_directory_is_absent():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    os.remove("config")

    with pytest.raises(offshoot.PluginError):
        TestPlugin.uninstall()

    os.symlink("tests/unit/offshoot/config", "config")

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_nothing_should_happen_if_the_plugin_has_no_configuration_on_configuration_uninstall():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.config = None
    assert TestPlugin.uninstall_configuration() is None

    TestPlugin.config = {}
    assert TestPlugin.uninstall_configuration() is None

    TestPlugin.config = {
        "is_test": True
    }

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_nothing_should_happen_if_the_plugin_configuration_file_does_not_exist_on_configuration_uninstall():
    if os.path.isfile(offshoot.config["file_paths"]["config"]):
        os.remove(offshoot.config["file_paths"]["config"])

    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    assert TestPlugin.uninstall_configuration() is None

    TestPlugin.install()
    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
def test_plugin_the_configuration_keys_should_be_removed_properly_on_configuration_uninstall():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False
    offshoot.config["sandbox_configuration_keys"] = False

    with open(offshoot.config["file_paths"]["config"], "r") as f:
        config = yaml.safe_load(f)

    config["is_extra"] = True

    with open(offshoot.config["file_paths"]["config"], "w") as f:
        yaml.dump(config, f)

    TestPlugin.install()
    TestPlugin.uninstall()

    with open(offshoot.config["file_paths"]["config"], "r") as f:
        config = yaml.safe_load(f)

    assert "is_test" not in config
    assert "is_extra" in config
    assert config["is_extra"] is True

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True
    offshoot.config["sandbox_configuration_keys"] = True


def test_plugin_the_configuration_keys_should_be_removed_properly_if_the_sandbox_keys_option_is_set_on_configuration_uninstall():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.install()
    TestPlugin.uninstall()

    with open(offshoot.config["file_paths"]["config"], "r") as f:
        config = yaml.safe_load(f)

    assert "TestPlugin" not in config
    assert "is_extra" in config
    assert config["is_extra"] is True

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_an_error_should_be_raised_on_libraries_install_if_the_libraries_directory_is_absent():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["callbacks"] = False

    offshoot.config["file_paths"]["libraries"] = "libraries/requirements.plugins.txt"

    os.remove("libraries")

    with pytest.raises(offshoot.PluginError):
        TestPlugin.install()

    os.symlink("tests/unit/offshoot/libraries", "libraries")

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["callbacks"] = True

    offshoot.config["file_paths"]["libraries"] = "requirements.plugins.txt"
def test_plugin_nothing_should_happen_if_the_plugin_has_no_libraries_on_libraries_install():
offshoot.config["allow"]["files"] = False
offshoot.config["allow"]["config"] = False
offshoot.config["allow"]["callbacks"] = False
TestPlugin.libraries = None
assert TestPlugin.install_libraries() is None
TestPlugin.libraries = []
assert TestPlugin.install_libraries() is None
TestPlugin.libraries = ["requests"]
offshoot.config["allow"]["files"] = True
offshoot.config["allow"]["config"] = True
offshoot.config["allow"]["callbacks"] = True
def test_plugin_should_add_a_libraries_block_on_libraries_install(mocker):
offshoot.config["allow"]["files"] = False
offshoot.config["allow"]["config"] = False
offshoot.config["allow"]["callbacks"] = False
mocker.spy(TestPlugin, "_write_plugin_requirement_blocks_to")
TestPlugin.install()
TestPlugin._write_plugin_requirement_blocks_to.assert_called_once_with(
offshoot.config["file_paths"]["libraries"]
)
TestPlugin.uninstall()
offshoot.config["allow"]["files"] = True
offshoot.config["allow"]["config"] = True
offshoot.config["allow"]["callbacks"] = True
def test_plugin_an_error_should_be_raised_on_libraries_uninstall_if_the_libraries_directory_is_absent():
offshoot.config["allow"]["files"] = False
offshoot.config["allow"]["config"] = False
offshoot.config["allow"]["callbacks"] = False
offshoot.config["file_paths"]["libraries"] = "libraries/requirements.plugins.txt"
os.remove("libraries")
with pytest.raises(offshoot.PluginError):
TestPlugin.uninstall()
os.symlink("tests/unit/offshoot/libraries", "libraries")
offshoot.config["allow"]["files"] = True
offshoot.config["allow"]["config"] = True
offshoot.config["allow"]["callbacks"] = True
offshoot.config["file_paths"]["libraries"] = "requirements.plugins.txt"
def test_plugin_nothing_should_happen_if_the_plugin_has_no_libraries_on_libraries_uninstall():
offshoot.config["allow"]["files"] = True
offshoot.config["allow"]["config"] = False
offshoot.config["allow"]["callbacks"] = False
TestPlugin.libraries = None
assert TestPlugin.uninstall_libraries() is None
TestPlugin.libraries = []
assert TestPlugin.uninstall_libraries() is None
TestPlugin.libraries = ["requests"]
offshoot.config["allow"]["files"] = True
offshoot.config["allow"]["config"] = True
offshoot.config["allow"]["callbacks"] = True
def test_plugin_should_remove_a_libraries_block_on_libraries_uninstall(mocker):
offshoot.config["allow"]["files"] = False
offshoot.config["allow"]["config"] = False
offshoot.config["allow"]["callbacks"] = False
mocker.spy(TestPlugin, "_remove_plugin_requirement_block_from")
TestPlugin.install()
TestPlugin.uninstall()
TestPlugin._remove_plugin_requirement_block_from.assert_called_once_with(
offshoot.config["file_paths"]["libraries"]
)
offshoot.config["allow"]["files"] = True
offshoot.config["allow"]["config"] = True
offshoot.config["allow"]["callbacks"] = True
def test_plugin_should_be_able_to_validate_a_file_against_a_pluggable_specification(mocker):
mocker.spy(offshoot, "validate_plugin_file")
TestPlugin._validate_file_for_pluggable(
"plugins/TestPlugin/files/test_plugin_pluggable_expected.py",
"TestPluggable"
)
offshoot.validate_plugin_file.assert_called_once_with(
"plugins/TestPlugin/files/test_plugin_pluggable_expected.py",
"TestPluggable",
TestPluggable.method_directives()
)
def test_plugin_should_be_able_to_generate_a_requirements_block_for_its_libraries():
assert TestPlugin._generate_plugin_requirement_block() == ["### TestPlugin Requirements ###", "requests", "######"]
assert TestInvalidPlugin._generate_plugin_requirement_block() == ["### TestInvalidPlugin Requirements ###", "requests", "######"]
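The block format asserted above (a `### <Plugin> Requirements ###` header, one line per library, and a `######` footer) can be sketched on its own. This is an illustrative helper, not offshoot's actual implementation:

```python
# Illustrative sketch of offshoot's requirement-block format: each plugin's
# libraries are stored in the shared requirements file as a named block
# delimited by a header and a "######" footer. generate_requirement_block
# is an invented name, not part of offshoot itself.

def generate_requirement_block(plugin_name, libraries):
    return ["### %s Requirements ###" % plugin_name] + list(libraries) + ["######"]


block = generate_requirement_block("TestPlugin", ["requests"])
print(block)  # ['### TestPlugin Requirements ###', 'requests', '######']
```

This matches the list compared against `_generate_plugin_requirement_block()` in the test above.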
def test_plugin_should_be_able_to_extract_all_requirement_blocks_from_its_libraries_file():
    TestPlugin.install_libraries()
    TestInvalidPlugin.install_libraries()

    requirement_blocks = TestPlugin._extract_plugin_requirement_blocks_from(offshoot.config["file_paths"]["libraries"])

    assert "TestPlugin Requirements" in requirement_blocks
    assert "TestInvalidPlugin Requirements" in requirement_blocks

    assert requirement_blocks["TestPlugin Requirements"] == TestPlugin._generate_plugin_requirement_block()
    assert requirement_blocks["TestInvalidPlugin Requirements"] == TestInvalidPlugin._generate_plugin_requirement_block()

    TestPlugin.uninstall_libraries()
    TestInvalidPlugin.uninstall_libraries()


def test_plugin_should_be_able_to_write_all_requirement_blocks_to_its_libraries_file():
    TestPlugin._write_plugin_requirement_blocks_to(offshoot.config["file_paths"]["libraries"])
    TestInvalidPlugin._write_plugin_requirement_blocks_to(offshoot.config["file_paths"]["libraries"])

    requirement_blocks = TestPlugin._extract_plugin_requirement_blocks_from(offshoot.config["file_paths"]["libraries"])

    assert "TestPlugin Requirements" in requirement_blocks
    assert "TestInvalidPlugin Requirements" in requirement_blocks

    assert requirement_blocks["TestPlugin Requirements"] == TestPlugin._generate_plugin_requirement_block()
    assert requirement_blocks["TestInvalidPlugin Requirements"] == TestInvalidPlugin._generate_plugin_requirement_block()


def test_plugin_should_be_able_to_remove_a_requirement_block_from_its_libraries_file():
    TestPlugin._remove_plugin_requirement_block_from(offshoot.config["file_paths"]["libraries"])

    requirement_blocks = TestPlugin._extract_plugin_requirement_blocks_from(offshoot.config["file_paths"]["libraries"])

    assert "TestPlugin Requirements" not in requirement_blocks
    assert "TestInvalidPlugin Requirements" in requirement_blocks

    TestInvalidPlugin._remove_plugin_requirement_block_from(offshoot.config["file_paths"]["libraries"])

    requirement_blocks = TestPlugin._extract_plugin_requirement_blocks_from(offshoot.config["file_paths"]["libraries"])

    assert "TestPlugin Requirements" not in requirement_blocks
    assert "TestInvalidPlugin Requirements" not in requirement_blocks


def test_plugin_global_on_install_callback_should_be_called_after_a_successful_installation(mocker):
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False

    mocker.spy(TestPlugin, "on_install")

    TestPlugin.install()

    assert TestPlugin.on_install.call_count == 1

    TestPlugin.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True


def test_plugin_global_on_uninstall_callback_should_be_called_after_a_successful_uninstallation(mocker):
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False

    mocker.spy(TestPlugin, "on_uninstall")

    TestPlugin.uninstall()

    assert TestPlugin.on_uninstall.call_count == 1

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True


def test_plugin_an_error_should_be_raised_if_a_plugin_dependency_is_not_installed():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    with pytest.raises(offshoot.PluginError):
        TestPlugin2.install()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_plugin_should_pass_plugin_dependency_verification_if_all_dependencies_are_present():
    offshoot.config["allow"]["files"] = False
    offshoot.config["allow"]["config"] = False
    offshoot.config["allow"]["libraries"] = False
    offshoot.config["allow"]["callbacks"] = False

    TestPlugin.install()
    TestPlugin2.install()

    TestPlugin.uninstall()
    TestPlugin2.uninstall()

    offshoot.config["allow"]["files"] = True
    offshoot.config["allow"]["config"] = True
    offshoot.config["allow"]["libraries"] = True
    offshoot.config["allow"]["callbacks"] = True


def test_teardown():
    os.remove("plugins")
    os.remove("config")
    os.remove("libraries")

    if os.path.isfile("offshoot.manifest.json"):
        os.remove("offshoot.manifest.json")

    if os.path.isfile("requirements.plugins.txt"):
        os.remove("requirements.plugins.txt")
| 33.529657 | 133 | 0.72879 | 4,564 | 40,135 | 6.142419 | 0.048642 | 0.142827 | 0.165371 | 0.073839 | 0.846651 | 0.798994 | 0.761896 | 0.712706 | 0.690768 | 0.647821 | 0 | 0.002328 | 0.143964 | 40,135 | 1,196 | 134 | 33.557692 | 0.813633 | 0.000673 | 0 | 0.693082 | 0 | 0 | 0.195188 | 0.043012 | 0 | 0 | 0 | 0 | 0.228931 | 1 | 0.079245 | false | 0.003774 | 0.020126 | 0 | 0.103145 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
2499c7b29cb6f24c8565de4e330ab8705d23850b | 1,563 | py | Python | data/config.py | PrincetonUniversity/dpcca | 15ad7439f3a34ebe0c4a818ab877db1f19f7e4c1 | [
"BSD-3-Clause"
] | 18 | 2019-09-27T02:38:07.000Z | 2022-03-13T06:35:36.000Z | data/config.py | PrincetonUniversity/dpcca | 15ad7439f3a34ebe0c4a818ab877db1f19f7e4c1 | [
"BSD-3-Clause"
] | null | null | null | data/config.py | PrincetonUniversity/dpcca | 15ad7439f3a34ebe0c4a818ab877db1f19f7e4c1 | [
"BSD-3-Clause"
] | 6 | 2019-10-09T21:11:26.000Z | 2021-02-25T14:57:36.000Z | """============================================================================
Configuration for dataset.
============================================================================"""
class Config(object):

    def get_image_net(self):
        """Return neural network used for learning from images.
        """
        raise NotImplementedError()

# ------------------------------------------------------------------------------

    def get_genes_net(self, linear):
        """Return neural network used for learning from genes.
        """
        raise NotImplementedError()

# ------------------------------------------------------------------------------

    def get_dataset(self):
        """Return dataset instance.
        """
        raise NotImplementedError()

# ------------------------------------------------------------------------------

    def save_samples(self, directory, model, desc, x1, x2, labels):
        """Save samples from learned likelihood.
        """
        raise NotImplementedError()

# ------------------------------------------------------------------------------

    def save_comparison(self, directory, x, x_recon, desc, is_x1=None):
        """Save comparison of data and reconstructed data.
        """
        raise NotImplementedError()

# ------------------------------------------------------------------------------

    def visualize_dataset(self, directory):
        """Optionally visualize dataset. This is useful in some cases to verify
        the data looks like what we expect.
        """
        pass
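`Config` is an abstract base: every hook raises `NotImplementedError` except `visualize_dataset`, which defaults to a no-op. A hypothetical subclass (names and return values invented for illustration; the abstract hooks are repeated in trimmed form so the snippet runs on its own) shows how a concrete dataset configuration fills them in:

```python
# Hypothetical example only: MnistConfig and its return value are invented,
# not part of the dpcca codebase. Config is a trimmed copy of the abstract
# base above so this snippet is self-contained.

class Config(object):
    def get_dataset(self):
        raise NotImplementedError()

    def visualize_dataset(self, directory):
        pass  # optional hook: the default is a no-op


class MnistConfig(Config):
    def get_dataset(self):
        # a real subclass would construct and return its dataset instance
        return "mnist-dataset-placeholder"


cfg = MnistConfig()
print(cfg.get_dataset())       # the overridden hook runs
cfg.visualize_dataset("/tmp")  # the inherited no-op runs without error
```

Calling an unimplemented hook on the base class still raises `NotImplementedError`, which is how missing configuration surfaces early.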
| 33.255319 | 80 | 0.399872 | 108 | 1,563 | 5.694444 | 0.537037 | 0.195122 | 0.219512 | 0.074797 | 0.123577 | 0.123577 | 0.123577 | 0 | 0 | 0 | 0 | 0.002368 | 0.189379 | 1,563 | 46 | 81 | 33.978261 | 0.483031 | 0.602687 | 0 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.461538 | false | 0.076923 | 0 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
24ada0cebef57c30a5f9a2fc31b167804a4495de | 439 | py | Python | app/routes.py | conatel-i-d/sm-api | 1a57e8303ae5f33ae4c8ac8247449fac5b0c848d | [
"MIT"
] | 1 | 2020-09-20T07:44:33.000Z | 2020-09-20T07:44:33.000Z | app/routes.py | conatel-i-d/sm-api | 1a57e8303ae5f33ae4c8ac8247449fac5b0c848d | [
"MIT"
] | 2 | 2019-12-10T13:00:36.000Z | 2021-04-30T21:04:42.000Z | app/routes.py | conatel-i-d/sm-api | 1a57e8303ae5f33ae4c8ac8247449fac5b0c848d | [
"MIT"
] | null | null | null | def register_routes(api, app, root="api"):
    from app.switch import register_routes as attach_switch
    from app.nics import register_routes as attach_nics
    from app.macs import register_routes as attach_macs
    from app.jobs import register_routes as attach_jobs
    from app.logs import register_routes as attach_logs

    attach_switch(api, app)
    attach_nics(api, app)
    attach_macs(api, app)
    attach_jobs(api, app)
    attach_logs(api, app)
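Importing each sub-module's `register_routes` inside the function body, rather than at module level, defers those imports until registration time, which helps avoid circular imports between the app package and its route modules. A self-contained sketch of the same pattern with stand-in attach functions (the real module pulls `app.switch`, `app.nics`, etc.):

```python
# Sketch of the deferred-registration pattern with stand-in modules; the
# attach_* bodies and the dict-based `app` are invented for illustration.

def attach_switch(api, app):
    app.setdefault("routes", []).append("switch")


def attach_nics(api, app):
    app.setdefault("routes", []).append("nics")


def register_routes(api, app, root="api"):
    # in the real module these would be deferred imports, e.g.
    # `from app.switch import register_routes as attach_switch`
    attach_switch(api, app)
    attach_nics(api, app)


app = {}
register_routes(api=None, app=app)
print(app["routes"])  # ['switch', 'nics']
```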
| 36.583333 | 57 | 0.790433 | 72 | 439 | 4.597222 | 0.208333 | 0.253776 | 0.302115 | 0.332326 | 0.422961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145786 | 439 | 11 | 58 | 39.909091 | 0.882667 | 0 | 0 | 0 | 0 | 0 | 0.006834 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.454545 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
24af40d83a309680001d912dc44f19e3f2935b51 | 240 | py | Python | popups/context_processors.py | stochastic-technologies/django-popups | 87d445d3a2f2b0eb102452e1169b06b9933148ed | [
"BSD-3-Clause"
] | 1 | 2015-03-16T10:53:28.000Z | 2015-03-16T10:53:28.000Z | popups/context_processors.py | stochastic-technologies/django-popups | 87d445d3a2f2b0eb102452e1169b06b9933148ed | [
"BSD-3-Clause"
] | null | null | null | popups/context_processors.py | stochastic-technologies/django-popups | 87d445d3a2f2b0eb102452e1169b06b9933148ed | [
"BSD-3-Clause"
] | null | null | null | class Popups:
    def __init__(self, request):
        self.request = request

    def __getitem__(self, key):
        return self.request.session.get("popups_" + key, True)


def popups(request):
    return {"show_popup": Popups(request)}
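The lookup semantics can be checked standalone: `Popups` reads `"popups_<key>"` flags out of the session and defaults to `True` for keys never stored. `FakeRequest` below is a stand-in for Django's request object, and `Popups` is repeated from above so the snippet runs on its own:

```python
# Standalone demo of the session-lookup semantics. FakeRequest is a stand-in
# for Django's request object (sessions are dict-like, so .get() works);
# Popups is repeated verbatim from the module above.

class Popups:
    def __init__(self, request):
        self.request = request

    def __getitem__(self, key):
        return self.request.session.get("popups_" + key, True)


class FakeRequest:
    def __init__(self, session):
        self.session = session


popups = Popups(FakeRequest({"popups_welcome": False}))
print(popups["welcome"])  # False: a dismissal flag stored in the session
print(popups["signup"])   # True: unseen popups default to shown
```

In a Django project the `popups` function would be listed as a template context processor so templates can do lookups like `show_popup.welcome`.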
| 21.818182 | 62 | 0.658333 | 29 | 240 | 5.103448 | 0.482759 | 0.222973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216667 | 240 | 10 | 63 | 24 | 0.787234 | 0 | 0 | 0 | 0 | 0 | 0.07113 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0 | 0.285714 | 0.857143 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
24cff642ffb2a6f97547b5ce8cca2963f1cb4fa4 | 301 | py | Python | get_adapters.py | cclauss/adapter_pattern | c1a7ffdd9e8ae85b719a12a87b5fceec0384663a | [
"Apache-2.0"
] | 1 | 2017-11-13T22:28:01.000Z | 2017-11-13T22:28:01.000Z | get_adapters.py | cclauss/adapter_pattern | c1a7ffdd9e8ae85b719a12a87b5fceec0384663a | [
"Apache-2.0"
] | null | null | null | get_adapters.py | cclauss/adapter_pattern | c1a7ffdd9e8ae85b719a12a87b5fceec0384663a | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
import adapters

print(dir(adapters.AdapterInterface))
print(adapters.AdapterInterface.get_adapters())
# print(adapters.get_adapters())

for key, value in adapters.AdapterInterface.get_adapters().items():
    print(key, value)
    print(value.is_available())
    print(value())
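The `adapters` module itself is not shown in this record. A minimal sketch of an interface satisfying the calls the script makes — `get_adapters()` returning a name-to-class mapping, `is_available()`, and instantiating each adapter class — might look like the following; the registry mechanism and `DummyAdapter` are assumptions, not the repo's actual code:

```python
# Minimal sketch of an adapter registry consistent with the calls used in
# get_adapters.py. The decorator-based registry and DummyAdapter are
# illustrative assumptions, not the adapter_pattern repo's implementation.

class AdapterInterface:
    _registry = {}

    @classmethod
    def register(cls, name):
        def decorator(adapter_cls):
            cls._registry[name] = adapter_cls
            return adapter_cls
        return decorator

    @classmethod
    def get_adapters(cls):
        return dict(cls._registry)

    @classmethod
    def is_available(cls):
        return True  # a concrete adapter would probe its backend here


@AdapterInterface.register("dummy")
class DummyAdapter(AdapterInterface):
    pass


for key, value in AdapterInterface.get_adapters().items():
    print(key, value.is_available())  # dummy True
```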
| 25.083333 | 67 | 0.754153 | 37 | 301 | 6.027027 | 0.459459 | 0.32287 | 0.242152 | 0.313901 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003704 | 0.10299 | 301 | 11 | 68 | 27.363636 | 0.822222 | 0.172757 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.142857 | 0 | 0.142857 | 0.714286 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
24f61eb74cfe516fb7e8e2328b556d9150f9ee83 | 62 | py | Python | chainer_chemistry/functions/__init__.py | delta2323/chainerchem | 364dd2b26aec2d0b25d5e2b30a9510a9d44814af | [
"MIT"
] | null | null | null | chainer_chemistry/functions/__init__.py | delta2323/chainerchem | 364dd2b26aec2d0b25d5e2b30a9510a9d44814af | [
"MIT"
] | null | null | null | chainer_chemistry/functions/__init__.py | delta2323/chainerchem | 364dd2b26aec2d0b25d5e2b30a9510a9d44814af | [
"MIT"
] | 1 | 2019-05-23T12:25:57.000Z | 2019-05-23T12:25:57.000Z | from chainer_chemistry.functions.matmul import matmul # NOQA
| 31 | 61 | 0.83871 | 8 | 62 | 6.375 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112903 | 62 | 1 | 62 | 62 | 0.927273 | 0.064516 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
704df83049676452e5ceb406dd0f330cf1db8402 | 39 | py | Python | documentation/test_python/inspect_underscored/inspect_underscored/_submodule.py | Ryan-rsm-McKenzie/m.css | ae6f9b06986b721f13177774769f26460ebc44ea | [
"MIT"
] | 367 | 2017-09-12T19:27:54.000Z | 2022-03-20T16:24:13.000Z | documentation/test_python/inspect_underscored/inspect_underscored/_submodule.py | Ryan-rsm-McKenzie/m.css | ae6f9b06986b721f13177774769f26460ebc44ea | [
"MIT"
] | 217 | 2017-10-27T12:21:02.000Z | 2022-03-27T09:04:44.000Z | documentation/test_python/inspect_underscored/inspect_underscored/_submodule.py | Ryan-rsm-McKenzie/m.css | ae6f9b06986b721f13177774769f26460ebc44ea | [
"MIT"
] | 103 | 2017-10-23T09:23:17.000Z | 2022-02-23T13:42:59.000Z | """Documented underscored submodule"""
| 19.5 | 38 | 0.769231 | 3 | 39 | 10 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 39 | 1 | 39 | 39 | 0.833333 | 0.820513 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
7097ec025283c8329a1fcdd09130d5353d841d06 | 5,017 | py | Python | tests/test_document.py | Informasjonsforvaltning/modelldcatnotordf | 995129ff9f6fb95f9a9d875b27f3aa14bac9b7f1 | [
"Apache-2.0"
] | 1 | 2020-11-29T18:36:21.000Z | 2020-11-29T18:36:21.000Z | tests/test_document.py | Informasjonsforvaltning/modelldcatnotordf | 995129ff9f6fb95f9a9d875b27f3aa14bac9b7f1 | [
"Apache-2.0"
] | 142 | 2020-10-07T08:52:55.000Z | 2021-11-18T15:09:31.000Z | tests/test_document.py | Informasjonsforvaltning/modelldcatnotordf | 995129ff9f6fb95f9a9d875b27f3aa14bac9b7f1 | [
"Apache-2.0"
] | null | null | null | """Test cases for the document module."""
import pytest
from pytest_mock import MockFixture
from rdflib import Graph
from skolemizer.testutils import skolemization
from modelldcatnotordf.document import FoafDocument
from tests.testutils import assert_isomorphic
def test_instantiate_document() -> None:
    """It does not raise an exception."""
    try:
        _ = FoafDocument()
    except Exception:
        pytest.fail("Unexpected Exception ..")


def test_to_graph_should_return_identifier_set_at_constructor() -> None:
    """It returns a title graph isomorphic to spec."""
    document = FoafDocument("http://example.com/documents/1")

    src = """
    @prefix dct: <http://purl.org/dc/terms/> .
    @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
    @prefix dcat: <http://www.w3.org/ns/dcat#> .
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .

    <http://example.com/documents/1> a foaf:Document
    .
    """
    g1 = Graph().parse(data=document.to_rdf(), format="turtle")
    g2 = Graph().parse(data=src, format="turtle")

    assert_isomorphic(g1, g2)


def test_to_graph_should_return_title_and_identifier() -> None:
    """It returns a title graph isomorphic to spec."""
    """It returns an identifier graph isomorphic to spec."""
    document = FoafDocument()
    document.identifier = "http://example.com/documents/1"
    document.title = {"nb": "Tittel 1", "en": "Title 1"}

    src = """
    @prefix dct: <http://purl.org/dc/terms/> .
    @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
    @prefix dcat: <http://www.w3.org/ns/dcat#> .
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .

    <http://example.com/documents/1> a foaf:Document;
        dct:title "Title 1"@en, "Tittel 1"@nb ;
    .
    """
    g1 = Graph().parse(data=document.to_rdf(), format="turtle")
    g2 = Graph().parse(data=src, format="turtle")

    assert_isomorphic(g1, g2)


def test_to_graph_should_return_document_skolemized(mocker: MockFixture) -> None:
    """It returns a title graph isomorphic to spec."""
    """It returns an identifier graph isomorphic to spec."""
    document = FoafDocument()
    document.title = {"nb": "Tittel 1", "en": "Title 1"}

    mocker.patch(
        "skolemizer.Skolemizer.add_skolemization", return_value=skolemization,
    )

    src = """
    @prefix dct: <http://purl.org/dc/terms/> .
    @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
    @prefix dcat: <http://www.w3.org/ns/dcat#> .
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .

    <http://example.com/.well-known/skolem/284db4d2-80c2-11eb-82c3-83e80baa2f94>
        a foaf:Document;
        dct:title "Title 1"@en, "Tittel 1"@nb ;
    .
    """
    g1 = Graph().parse(data=document.to_rdf(), format="turtle")
    g2 = Graph().parse(data=src, format="turtle")

    assert_isomorphic(g1, g2)


def test_to_graph_should_return_language() -> None:
    """It returns an identifier graph isomorphic to spec."""
    document = FoafDocument()
    document.identifier = "http://example.com/documents/1"
    document.language = "http://example.com/languages/1"

    src = """
    @prefix dct: <http://purl.org/dc/terms/> .
    @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
    @prefix dcat: <http://www.w3.org/ns/dcat#> .
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .

    <http://example.com/documents/1> a foaf:Document;
        dct:language "http://example.com/languages/1"^^dct:LinguisticSystem
    .
    """
    g1 = Graph().parse(data=document.to_rdf(), format="turtle")
    g2 = Graph().parse(data=src, format="turtle")

    assert_isomorphic(g1, g2)


def test_to_graph_should_return_format_and_see_also() -> None:
    """It returns an identifier graph isomorphic to spec."""
    document = FoafDocument()
    document.identifier = "http://example.com/documents/1"
    document.format = "https://www.iana.org/assignments/media-types/application/pdf"
    document.rdfs_see_also = "http://example.com/link"

    src = """
    @prefix dct: <http://purl.org/dc/terms/> .
    @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
    @prefix dcat: <http://www.w3.org/ns/dcat#> .
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .

    <http://example.com/documents/1> a foaf:Document;
        rdfs:seeAlso <http://example.com/link> ;
        dct:format
            "https://www.iana.org/assignments/media-types/application/pdf"^^dct:MediaType
    .
    """
    g1 = Graph().parse(data=document.to_rdf(), format="turtle")
    g2 = Graph().parse(data=src, format="turtle")

    assert_isomorphic(g1, g2)
| 35.330986 | 85 | 0.626071 | 666 | 5,017 | 4.636637 | 0.166667 | 0.034003 | 0.043718 | 0.05829 | 0.787241 | 0.779793 | 0.744495 | 0.744495 | 0.73057 | 0.717617 | 0 | 0.038088 | 0.199322 | 5,017 | 141 | 86 | 35.58156 | 0.730645 | 0.060594 | 0 | 0.646465 | 0 | 0.121212 | 0.571961 | 0.008543 | 0 | 0 | 0 | 0 | 0.060606 | 1 | 0.060606 | false | 0 | 0.060606 | 0 | 0.121212 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
70a351f2700c65dbc0218ead4c0a53ab85c47469 | 37,539 | py | Python | menpo/landmark/labels/human/face.py | yutiansut/menpo | 62af28606bc55985ab764f8ad38d239d1572bf1e | [
"BSD-3-Clause"
] | null | null | null | menpo/landmark/labels/human/face.py | yutiansut/menpo | 62af28606bc55985ab764f8ad38d239d1572bf1e | [
"BSD-3-Clause"
] | null | null | null | menpo/landmark/labels/human/face.py | yutiansut/menpo | 62af28606bc55985ab764f8ad38d239d1572bf1e | [
"BSD-3-Clause"
] | 1 | 2020-05-01T09:55:57.000Z | 2020-05-01T09:55:57.000Z | from collections import OrderedDict
import numpy as np
from ..base import (
    validate_input, connectivity_from_array, pcloud_and_lgroup_from_ranges,
    connectivity_from_range, labeller_func)


@labeller_func(group_label='face_ibug_68')
def face_ibug_68_to_face_ibug_68(pcloud):
    r"""
    Apply the IBUG 68-point semantic labels.

    The semantic labels are as follows:

      - jaw
      - left_eyebrow
      - right_eyebrow
      - nose
      - left_eye
      - right_eye
      - mouth

    References
    ----------
    .. [1] http://www.multipie.org/
    .. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
    """
    from menpo.shape import LabelledPointUndirectedGraph

    n_expected_points = 68
    validate_input(pcloud, n_expected_points)

    jaw_indices = np.arange(0, 17)
    lbrow_indices = np.arange(17, 22)
    rbrow_indices = np.arange(22, 27)
    upper_nose_indices = np.arange(27, 31)
    lower_nose_indices = np.arange(31, 36)
    leye_indices = np.arange(36, 42)
    reye_indices = np.arange(42, 48)
    outer_mouth_indices = np.arange(48, 60)
    inner_mouth_indices = np.arange(60, 68)

    jaw_connectivity = connectivity_from_array(jaw_indices)
    lbrow_connectivity = connectivity_from_array(lbrow_indices)
    rbrow_connectivity = connectivity_from_array(rbrow_indices)
    nose_connectivity = np.vstack([
        connectivity_from_array(upper_nose_indices),
        connectivity_from_array(lower_nose_indices)])
    leye_connectivity = connectivity_from_array(leye_indices, close_loop=True)
    reye_connectivity = connectivity_from_array(reye_indices, close_loop=True)
    mouth_connectivity = np.vstack([
        connectivity_from_array(outer_mouth_indices, close_loop=True),
        connectivity_from_array(inner_mouth_indices, close_loop=True)])

    all_connectivity = np.vstack([
        jaw_connectivity, lbrow_connectivity, rbrow_connectivity,
        nose_connectivity, leye_connectivity, reye_connectivity,
        mouth_connectivity
    ])

    mapping = OrderedDict()
    mapping['jaw'] = jaw_indices
    mapping['left_eyebrow'] = lbrow_indices
    mapping['right_eyebrow'] = rbrow_indices
    mapping['nose'] = np.hstack((upper_nose_indices, lower_nose_indices))
    mapping['left_eye'] = leye_indices
    mapping['right_eye'] = reye_indices
    mapping['mouth'] = np.hstack((outer_mouth_indices, inner_mouth_indices))

    new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
        pcloud.points, all_connectivity, mapping)

    return new_pcloud, mapping
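`connectivity_from_array` is a menpo internal. For intuition, an equivalent illustrative reimplementation (not menpo's actual code) chains each run of consecutive landmark indices into undirected edges, and `close_loop=True` appends a last-to-first edge, which is why the eyes and mouth come out as closed rings while the jaw and brows stay open chains:

```python
# Illustrative reimplementation of the edge-chaining behaviour that
# connectivity_from_array provides in menpo (not menpo's actual code).
import numpy as np


def connectivity_sketch(indices, close_loop=False):
    # Pair each index with its successor: (i0,i1), (i1,i2), ...
    edges = np.stack([indices[:-1], indices[1:]], axis=1)
    if close_loop:
        # Optionally add the edge that closes the ring.
        edges = np.vstack([edges, [[indices[-1], indices[0]]]])
    return edges


leye = np.arange(36, 40)  # shortened eye ring, for display only
print(connectivity_sketch(leye))                      # edges (36,37), (37,38), (38,39)
print(connectivity_sketch(leye, close_loop=True)[-1])  # [39 36]
```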
@labeller_func(group_label='face_ibug_68')
def face_ibug_68_mirrored_to_face_ibug_68(pcloud):
r"""
Apply the IBUG 68-point semantic labels, on a pointcloud that has been
mirrored around the vertical axis (flipped around the Y-axis). Thus, on
the flipped image the jaw etc would be the wrong way around. This
rectifies that and returns a new PointCloud whereby all the points
are oriented correctly.
The semantic labels applied are as follows:
- jaw
- left_eyebrow
- right_eyebrow
- nose
- left_eye
- right_eye
- mouth
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
new_pcloud, old_map = face_ibug_68_to_face_ibug_68(pcloud,
return_mapping=True)
lms_map = np.hstack([old_map['jaw'][::-1],
old_map['right_eyebrow'][::-1],
old_map['left_eyebrow'][::-1],
old_map['nose'][:4],
old_map['nose'][4:][::-1],
np.roll(old_map['right_eye'][::-1], 4),
np.roll(old_map['left_eye'][::-1], 4),
np.roll(old_map['mouth'][:12][::-1], 7),
np.roll(old_map['mouth'][12:][::-1], 5)])
return new_pcloud.from_vector(pcloud.points[lms_map]), old_map
@labeller_func(group_label='face_ibug_66')
def face_ibug_68_to_face_ibug_66(pcloud):
r"""
Apply the IBUG 66-point semantic labels, but ignoring the 2 points
describing the inner mouth corners).
The semantic labels applied are as follows:
- jaw
- left_eyebrow
- right_eyebrow
- nose
- left_eye
- right_eye
- mouth
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import LabelledPointUndirectedGraph
n_expected_points = 68
validate_input(pcloud, n_expected_points)
jaw_indices = np.arange(0, 17)
lbrow_indices = np.arange(17, 22)
rbrow_indices = np.arange(22, 27)
upper_nose_indices = np.arange(27, 31)
lower_nose_indices = np.arange(31, 36)
leye_indices = np.arange(36, 42)
reye_indices = np.arange(42, 48)
outer_mouth_indices = np.arange(48, 60)
inner_mouth_indices = np.hstack((48, np.arange(60, 63),
54, np.arange(63, 66)))
jaw_connectivity = connectivity_from_array(jaw_indices)
lbrow_connectivity = connectivity_from_array(lbrow_indices)
rbrow_connectivity = connectivity_from_array(rbrow_indices)
nose_connectivity = np.vstack([
connectivity_from_array(upper_nose_indices),
connectivity_from_array(lower_nose_indices)])
leye_connectivity = connectivity_from_array(leye_indices, close_loop=True)
reye_connectivity = connectivity_from_array(reye_indices, close_loop=True)
mouth_connectivity = np.vstack([
connectivity_from_array(outer_mouth_indices, close_loop=True),
connectivity_from_array(inner_mouth_indices, close_loop=True)])
all_connectivity = np.vstack([
jaw_connectivity, lbrow_connectivity, rbrow_connectivity,
nose_connectivity, leye_connectivity, reye_connectivity,
mouth_connectivity])
mapping = OrderedDict()
mapping['jaw'] = jaw_indices
mapping['left_eyebrow'] = lbrow_indices
mapping['right_eyebrow'] = rbrow_indices
mapping['nose'] = np.hstack([upper_nose_indices, lower_nose_indices])
mapping['left_eye'] = leye_indices
mapping['right_eye'] = reye_indices
mapping['mouth'] = np.hstack([outer_mouth_indices, inner_mouth_indices])
# Ignore the two inner mouth points
ind = np.hstack((np.arange(60), np.arange(61, 64), np.arange(65, 68)))
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
pcloud.points[ind], all_connectivity, mapping)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_51')
def face_ibug_68_to_face_ibug_51(pcloud):
r"""
Apply the IBUG 51-point semantic labels, but removing the annotations
corresponding to the jaw region.
The semantic labels applied are as follows:
- left_eyebrow
- right_eyebrow
- nose
- left_eye
- right_eye
- mouth
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import LabelledPointUndirectedGraph
n_expected_points = 68
validate_input(pcloud, n_expected_points)
lbrow_indices = np.arange(0, 5)
rbrow_indices = np.arange(5, 10)
upper_nose_indices = np.arange(10, 14)
lower_nose_indices = np.arange(14, 19)
leye_indices = np.arange(19, 25)
reye_indices = np.arange(25, 31)
outer_mouth_indices = np.arange(31, 43)
inner_mouth_indices = np.arange(43, 51)
lbrow_connectivity = connectivity_from_array(lbrow_indices)
rbrow_connectivity = connectivity_from_array(rbrow_indices)
nose_connectivity = np.vstack([
connectivity_from_array(upper_nose_indices),
connectivity_from_array(lower_nose_indices)])
leye_connectivity = connectivity_from_array(leye_indices, close_loop=True)
reye_connectivity = connectivity_from_array(reye_indices, close_loop=True)
mouth_connectivity = np.vstack([
connectivity_from_array(outer_mouth_indices, close_loop=True),
connectivity_from_array(inner_mouth_indices, close_loop=True)])
all_connectivity = np.vstack([
lbrow_connectivity, rbrow_connectivity, nose_connectivity,
leye_connectivity, reye_connectivity, mouth_connectivity])
mapping = OrderedDict()
mapping['left_eyebrow'] = lbrow_indices
mapping['right_eyebrow'] = rbrow_indices
mapping['nose'] = np.hstack([upper_nose_indices, lower_nose_indices])
mapping['left_eye'] = leye_indices
mapping['right_eye'] = reye_indices
mapping['mouth'] = np.hstack([outer_mouth_indices, inner_mouth_indices])
# Ignore the two inner mouth points
ind = np.arange(17, 68)
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
pcloud.points[ind], all_connectivity, mapping)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_49')
def face_ibug_49_to_face_ibug_49(pcloud):
r"""
Apply the IBUG 49-point semantic labels.
The semantic labels applied are as follows:
- left_eyebrow
- right_eyebrow
- nose
- left_eye
- right_eye
- mouth
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import LabelledPointUndirectedGraph
n_expected_points = 49
validate_input(pcloud, n_expected_points)
lbrow_indices = np.arange(0, 5)
rbrow_indices = np.arange(5, 10)
upper_nose_indices = np.arange(10, 14)
lower_nose_indices = np.arange(14, 19)
leye_indices = np.arange(19, 25)
reye_indices = np.arange(25, 31)
outer_mouth_indices = np.arange(31, 43)
inner_mouth_indices = np.hstack((31, np.arange(43, 46),
37, np.arange(46, 49)))
lbrow_connectivity = connectivity_from_array(lbrow_indices)
rbrow_connectivity = connectivity_from_array(rbrow_indices)
nose_connectivity = np.vstack([
connectivity_from_array(upper_nose_indices),
connectivity_from_array(lower_nose_indices)])
leye_connectivity = connectivity_from_array(leye_indices, close_loop=True)
reye_connectivity = connectivity_from_array(reye_indices, close_loop=True)
mouth_connectivity = np.vstack([
connectivity_from_array(outer_mouth_indices, close_loop=True),
connectivity_from_array(inner_mouth_indices, close_loop=True)])
all_connectivity = np.vstack([
lbrow_connectivity, rbrow_connectivity, nose_connectivity,
leye_connectivity, reye_connectivity, mouth_connectivity])
mapping = OrderedDict()
mapping['left_eyebrow'] = lbrow_indices
mapping['right_eyebrow'] = rbrow_indices
mapping['nose'] = np.hstack([upper_nose_indices, lower_nose_indices])
mapping['left_eye'] = leye_indices
mapping['right_eye'] = reye_indices
mapping['mouth'] = np.hstack([outer_mouth_indices, inner_mouth_indices])
    # All 49 points are kept, so no re-indexing is needed
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
pcloud.points, all_connectivity, mapping)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_49')
def face_ibug_68_to_face_ibug_49(pcloud):
r"""
    Apply the IBUG 49-point semantic labels by removing the annotations
    corresponding to the jaw region and the 2 points describing the inner
    mouth corners.
The semantic labels applied are as follows:
- left_eyebrow
- right_eyebrow
- nose
- left_eye
- right_eye
- mouth
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import LabelledPointUndirectedGraph
n_expected_points = 68
validate_input(pcloud, n_expected_points)
lbrow_indices = np.arange(0, 5)
rbrow_indices = np.arange(5, 10)
upper_nose_indices = np.arange(10, 14)
lower_nose_indices = np.arange(14, 19)
leye_indices = np.arange(19, 25)
reye_indices = np.arange(25, 31)
outer_mouth_indices = np.arange(31, 43)
inner_mouth_indices = np.hstack((31, np.arange(43, 46),
37, np.arange(46, 49)))
lbrow_connectivity = connectivity_from_array(lbrow_indices)
rbrow_connectivity = connectivity_from_array(rbrow_indices)
nose_connectivity = np.vstack([
connectivity_from_array(upper_nose_indices),
connectivity_from_array(lower_nose_indices)])
leye_connectivity = connectivity_from_array(leye_indices, close_loop=True)
reye_connectivity = connectivity_from_array(reye_indices, close_loop=True)
mouth_connectivity = np.vstack([
connectivity_from_array(outer_mouth_indices, close_loop=True),
connectivity_from_array(inner_mouth_indices, close_loop=True)])
all_connectivity = np.vstack([
lbrow_connectivity, rbrow_connectivity, nose_connectivity,
leye_connectivity, reye_connectivity, mouth_connectivity])
mapping = OrderedDict()
mapping['left_eyebrow'] = lbrow_indices
mapping['right_eyebrow'] = rbrow_indices
mapping['nose'] = np.hstack([upper_nose_indices, lower_nose_indices])
mapping['left_eye'] = leye_indices
mapping['right_eye'] = reye_indices
mapping['mouth'] = np.hstack([outer_mouth_indices, inner_mouth_indices])
    # Ignore the jaw region and the two inner mouth corner points (60 and 64)
ind = np.hstack((np.arange(17, 60), np.arange(61, 64), np.arange(65, 68)))
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
pcloud.points[ind], all_connectivity, mapping)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_68_trimesh')
def face_ibug_68_to_face_ibug_68_trimesh(pcloud):
r"""
Apply the IBUG 68-point semantic labels, with trimesh connectivity.
The semantic labels applied are as follows:
- tri
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import TriMesh
n_expected_points = 68
validate_input(pcloud, n_expected_points)
tri_list = np.array([[47, 29, 28], [44, 43, 23], [38, 20, 21],
[47, 28, 42], [49, 61, 60], [40, 41, 37],
[37, 19, 20], [28, 40, 39], [38, 21, 39],
[36, 1, 0], [48, 59, 4], [49, 60, 48],
[67, 59, 60], [13, 53, 14], [61, 51, 62],
[57, 8, 7], [52, 51, 33], [61, 67, 60],
[52, 63, 51], [66, 56, 57], [35, 30, 29],
[53, 52, 35], [37, 36, 17], [18, 37, 17],
[37, 38, 40], [38, 37, 20], [19, 37, 18],
[38, 39, 40], [28, 29, 40], [41, 36, 37],
[27, 39, 21], [41, 31, 1], [30, 32, 31],
[33, 51, 50], [33, 30, 34], [31, 40, 29],
[36, 0, 17], [31, 2, 1], [31, 41, 40],
[ 1, 36, 41], [31, 49, 2], [ 2, 49, 3],
[60, 59, 48], [ 3, 49, 48], [31, 32, 50],
[48, 4, 3], [59, 5, 4], [58, 67, 66],
[ 5, 59, 58], [58, 59, 67], [ 7, 6, 58],
[66, 57, 58], [13, 54, 53], [ 7, 58, 57],
[ 6, 5, 58], [50, 61, 49], [62, 67, 61],
[31, 50, 49], [32, 33, 50], [30, 33, 32],
[34, 52, 33], [35, 52, 34], [53, 63, 52],
[62, 63, 65], [62, 51, 63], [66, 65, 56],
[63, 53, 64], [62, 66, 67], [62, 65, 66],
[57, 56, 9], [65, 63, 64], [ 8, 57, 9],
[ 9, 56, 10], [10, 56, 11], [11, 56, 55],
[11, 55, 12], [56, 65, 55], [55, 64, 54],
[55, 65, 64], [55, 54, 12], [64, 53, 54],
[12, 54, 13], [45, 46, 44], [35, 34, 30],
[14, 53, 35], [15, 46, 45], [27, 28, 39],
[27, 42, 28], [35, 29, 47], [30, 31, 29],
[15, 35, 46], [15, 14, 35], [43, 22, 23],
[27, 21, 22], [24, 44, 23], [44, 47, 43],
[43, 47, 42], [46, 35, 47], [26, 45, 44],
[46, 47, 44], [25, 44, 24], [25, 26, 44],
[16, 15, 45], [16, 45, 26], [22, 42, 43],
[50, 51, 61], [27, 22, 42]])
new_pcloud = TriMesh(pcloud.points, trilist=tri_list)
mapping = OrderedDict()
mapping['tri'] = np.arange(new_pcloud.n_points)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_66_trimesh')
def face_ibug_68_to_face_ibug_66_trimesh(pcloud):
r"""
Apply the IBUG 66-point semantic labels, with trimesh connectivity.
The semantic labels applied are as follows:
- tri
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import TriMesh
# Apply face_ibug_68_to_face_ibug_66
new_pcloud = face_ibug_68_to_face_ibug_66(pcloud)
# This is in terms of the 66 points
tri_list = np.array([[47, 29, 28], [44, 43, 23], [38, 20, 21],
[47, 28, 42], [40, 41, 37], [51, 62, 61],
[37, 19, 20], [28, 40, 39], [38, 21, 39],
[36, 1, 0], [48, 59, 4], [49, 60, 48],
[13, 53, 14], [60, 51, 61], [51, 51, 62],
[52, 51, 33], [49, 50, 60], [57, 7, 8],
[64, 56, 57], [35, 30, 29], [52, 62, 53],
[53, 52, 35], [37, 36, 17], [18, 37, 17],
[37, 38, 40], [38, 37, 20], [19, 37, 18],
[38, 39, 40], [28, 29, 40], [41, 36, 37],
[27, 39, 21], [41, 31, 1], [30, 32, 31],
[33, 51, 50], [33, 30, 34], [31, 40, 29],
[36, 0, 17], [31, 2, 1], [31, 41, 40],
[ 1, 36, 41], [31, 49, 2], [ 2, 49, 3],
[ 3, 49, 48], [31, 32, 50], [62, 53, 54],
[48, 4, 3], [59, 5, 4], [58, 65, 64],
[ 5, 59, 58], [58, 59, 65], [ 7, 6, 58],
[64, 57, 58], [13, 54, 53], [ 7, 58, 57],
[ 6, 5, 58], [63, 55, 54], [65, 59, 48],
[31, 50, 49], [32, 33, 50], [30, 33, 32],
[34, 52, 33], [35, 52, 34], [48, 60, 65],
[64, 63, 56], [60, 65, 61], [65, 64, 61],
[57, 56, 9], [ 8, 57, 9], [64, 63, 61],
[ 9, 56, 10], [10, 56, 11], [11, 56, 55],
[11, 55, 12], [56, 63, 55], [51, 52, 62],
[55, 54, 12], [63, 54, 62], [61, 62, 63],
[12, 54, 13], [45, 46, 44], [35, 34, 30],
[14, 53, 35], [15, 46, 45], [27, 28, 39],
[27, 42, 28], [35, 29, 47], [30, 31, 29],
[15, 35, 46], [15, 14, 35], [43, 22, 23],
[27, 21, 22], [24, 44, 23], [44, 47, 43],
[43, 47, 42], [46, 35, 47], [26, 45, 44],
[46, 47, 44], [25, 44, 24], [25, 26, 44],
[16, 15, 45], [16, 45, 26], [22, 42, 43],
[50, 60, 51], [27, 22, 42]])
new_pcloud = TriMesh(new_pcloud.points, trilist=tri_list, copy=False)
mapping = OrderedDict()
mapping['tri'] = np.arange(new_pcloud.n_points)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_51_trimesh')
def face_ibug_68_to_face_ibug_51_trimesh(pcloud):
r"""
    Apply the IBUG 51-point semantic labels, with trimesh connectivity.
The semantic labels applied are as follows:
- tri
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import TriMesh
# Apply face_ibug_68_to_face_ibug_51
new_pcloud = face_ibug_68_to_face_ibug_51(pcloud)
# This is in terms of the 51 points
tri_list = np.array([[30, 12, 11], [27, 26, 6], [21, 3, 4],
[30, 11, 25], [32, 44, 43], [23, 24, 20],
[20, 2, 3], [11, 23, 22], [21, 4, 22],
[32, 43, 31], [50, 42, 43], [44, 34, 45],
[35, 34, 16], [44, 50, 43], [35, 46, 34],
[49, 39, 40], [18, 13, 12], [36, 35, 18],
[20, 19, 0], [ 1, 20, 0], [20, 21, 23],
[21, 20, 3], [ 2, 20, 1], [21, 22, 23],
[11, 12, 23], [24, 19, 20], [10, 22, 4],
[13, 15, 14], [16, 34, 33], [16, 13, 17],
[14, 23, 12], [14, 24, 23], [43, 42, 31],
[14, 15, 33], [41, 50, 49], [41, 42, 50],
[49, 40, 41], [33, 44, 32], [45, 50, 44],
[14, 33, 32], [15, 16, 33], [13, 16, 15],
[17, 35, 16], [18, 35, 17], [36, 46, 35],
[45, 46, 48], [45, 34, 46], [49, 48, 39],
[46, 36, 47], [45, 49, 50], [45, 48, 49],
[48, 46, 47], [39, 48, 38], [38, 47, 37],
[38, 48, 47], [47, 36, 37], [28, 29, 27],
[18, 17, 13], [10, 11, 22], [10, 25, 11],
[18, 12, 30], [13, 14, 12], [26, 5, 6],
[10, 4, 5], [ 7, 27, 6], [27, 30, 26],
[26, 30, 25], [29, 18, 30], [ 9, 28, 27],
[29, 30, 27], [ 8, 27, 7], [ 8, 9, 27],
[ 5, 25, 26], [33, 34, 44], [10, 5, 25]])
new_pcloud = TriMesh(new_pcloud.points, trilist=tri_list, copy=False)
mapping = OrderedDict()
mapping['tri'] = np.arange(new_pcloud.n_points)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_49_trimesh')
def face_ibug_68_to_face_ibug_49_trimesh(pcloud):
r"""
Apply the IBUG 49-point semantic labels, with trimesh connectivity.
The semantic labels applied are as follows:
- tri
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import TriMesh
# Apply face_ibug_68_to_face_ibug_49
new_pcloud = face_ibug_68_to_face_ibug_49(pcloud)
# This is in terms of the 49 points
tri_list = np.array([[47, 29, 28], [44, 43, 23], [38, 20, 21],
[47, 28, 42], [40, 41, 37], [51, 62, 61],
[37, 19, 20], [28, 40, 39], [38, 21, 39],
[36, 1, 0], [48, 59, 4], [49, 60, 48],
[13, 53, 14], [60, 51, 61], [51, 51, 62],
[52, 51, 33], [49, 50, 60], [57, 7, 8],
[64, 56, 57], [35, 30, 29], [52, 62, 53],
[53, 52, 35], [37, 36, 17], [18, 37, 17],
[37, 38, 40], [38, 37, 20], [19, 37, 18],
[38, 39, 40], [28, 29, 40], [41, 36, 37],
[27, 39, 21], [41, 31, 1], [30, 32, 31],
[33, 51, 50], [33, 30, 34], [31, 40, 29],
[36, 0, 17], [31, 2, 1], [31, 41, 40],
[ 1, 36, 41], [31, 49, 2], [ 2, 49, 3],
[ 3, 49, 48], [31, 32, 50], [62, 53, 54],
[48, 4, 3], [59, 5, 4], [58, 65, 64],
[ 5, 59, 58], [58, 59, 65], [ 7, 6, 58],
[64, 57, 58], [13, 54, 53], [ 7, 58, 57],
[ 6, 5, 58], [63, 55, 54], [65, 59, 48],
[31, 50, 49], [32, 33, 50], [30, 33, 32],
[34, 52, 33], [35, 52, 34], [48, 60, 65],
[64, 63, 56], [60, 65, 61], [65, 64, 61],
[57, 56, 9], [ 8, 57, 9], [64, 63, 61],
[ 9, 56, 10], [10, 56, 11], [11, 56, 55],
[11, 55, 12], [56, 63, 55], [51, 52, 62],
[55, 54, 12], [63, 54, 62], [61, 62, 63],
[12, 54, 13], [45, 46, 44], [35, 34, 30],
[14, 53, 35], [15, 46, 45], [27, 28, 39],
[27, 42, 28], [35, 29, 47], [30, 31, 29],
[15, 35, 46], [15, 14, 35], [43, 22, 23],
[27, 21, 22], [24, 44, 23], [44, 47, 43],
[43, 47, 42], [46, 35, 47], [26, 45, 44],
[46, 47, 44], [25, 44, 24], [25, 26, 44],
[16, 15, 45], [16, 45, 26], [22, 42, 43],
[50, 60, 51], [27, 22, 42]])
new_pcloud = TriMesh(new_pcloud.points, trilist=tri_list, copy=False)
mapping = OrderedDict()
mapping['tri'] = np.arange(new_pcloud.n_points)
return new_pcloud, mapping
@labeller_func(group_label='face_ibug_65')
def face_ibug_68_to_face_ibug_65(pcloud):
r"""
    Apply the IBUG 68-point semantic labels, but ignore the 3 points that are
    coincident for a closed mouth (at the bottom of the inner mouth).
The semantic labels applied are as follows:
- jaw
- left_eyebrow
- right_eyebrow
- nose
- left_eye
- right_eye
- mouth
References
----------
.. [1] http://www.multipie.org/
.. [2] http://ibug.doc.ic.ac.uk/resources/300-W/
"""
from menpo.shape import LabelledPointUndirectedGraph
# Apply face_ibug_68_to_face_ibug_68
new_pcloud, mapping = face_ibug_68_to_face_ibug_68(pcloud,
return_mapping=True)
# The coincident points are considered the final 3 landmarks (bottom of
# the inner mouth points). We remove all the edges for the inner mouth
# which are the last 8.
edges = new_pcloud.edges[:-8]
# Re-add the inner mouth without the bottom 3 points
edges = np.vstack([edges,
connectivity_from_range((60, 65), close_loop=True)])
# Luckily, OrderedDict maintains the original ordering despite updates
outer_mouth_indices = np.arange(48, 60)
inner_mouth_indices = np.arange(60, 65)
mapping['mouth'] = np.hstack([outer_mouth_indices, inner_mouth_indices])
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
new_pcloud.points[:-3], edges, mapping)
return new_pcloud, mapping
@labeller_func(group_label='face_imm_58')
def face_imm_58_to_face_imm_58(pcloud):
r"""
Apply the 58-point semantic labels from the IMM dataset.
The semantic labels applied are as follows:
- jaw
- left_eye
- right_eye
- left_eyebrow
- right_eyebrow
- mouth
- nose
References
----------
.. [1] http://www2.imm.dtu.dk/~aam/
"""
n_expected_points = 58
validate_input(pcloud, n_expected_points)
labels = OrderedDict([
('jaw', (0, 13, False)),
('left_eye', (13, 21, True)),
('right_eye', (21, 29, True)),
        ('left_eyebrow', (29, 34, False)),
('right_eyebrow', (34, 39, False)),
('mouth', (39, 47, True)),
('nose', (47, 58, False))
])
return pcloud_and_lgroup_from_ranges(pcloud, labels)
@labeller_func(group_label='face_lfpw_29')
def face_lfpw_29_to_face_lfpw_29(pcloud):
r"""
Apply the 29-point semantic labels from the original LFPW dataset.
The semantic labels applied are as follows:
- chin
- left_eye
- right_eye
- left_eyebrow
- right_eyebrow
- mouth
- nose
References
----------
.. [1] http://homes.cs.washington.edu/~neeraj/databases/lfpw/
"""
from menpo.shape import LabelledPointUndirectedGraph
n_expected_points = 29
validate_input(pcloud, n_expected_points)
chin_indices = np.array([28])
outer_leye_indices = np.array([8, 12, 10, 13])
pupil_leye_indices = np.array([16])
outer_reye_indices = np.array([11, 14, 9, 15])
pupil_reye_indices = np.array([17])
lbrow_indices = np.array([0, 4, 2, 5])
rbrow_indices = np.array([3, 6, 1, 7])
outer_mouth_indices = np.array([22, 24, 23, 27])
inner_mouth_indices = np.array([22, 25, 23, 26])
nose_indices = np.array([18, 20, 19, 21])
chin_connectivity = connectivity_from_array(chin_indices, close_loop=True)
leye_connectivity = connectivity_from_array(outer_leye_indices,
close_loop=True)
reye_connectivity = connectivity_from_array(outer_reye_indices,
close_loop=True)
lbrow_connectivity = connectivity_from_array(lbrow_indices,
close_loop=True)
rbrow_connectivity = connectivity_from_array(rbrow_indices,
close_loop=True)
mouth_connectivity = np.vstack([
connectivity_from_array(outer_mouth_indices, close_loop=True),
connectivity_from_array(inner_mouth_indices, close_loop=True)])
nose_connectivity = connectivity_from_array(nose_indices, close_loop=True)
all_connectivity = np.vstack([
chin_connectivity, leye_connectivity, reye_connectivity,
lbrow_connectivity, rbrow_connectivity, mouth_connectivity,
nose_connectivity])
mapping = OrderedDict()
mapping['chin'] = chin_indices
mapping['left_eye'] = np.hstack((outer_leye_indices, pupil_leye_indices))
mapping['right_eye'] = np.hstack((outer_reye_indices, pupil_reye_indices))
mapping['left_eyebrow'] = lbrow_indices
mapping['right_eyebrow'] = rbrow_indices
mapping['mouth'] = np.hstack((outer_mouth_indices, inner_mouth_indices))
mapping['nose'] = nose_indices
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
pcloud.points, all_connectivity, mapping)
return new_pcloud, mapping
def _build_upper_eyelid():
top_indices = np.arange(0, 7)
middle_indices = np.arange(12, 17)
upper_eyelid_indices = np.hstack((top_indices, middle_indices))
upper_eyelid_connectivity = list(zip(top_indices, top_indices[1:]))
upper_eyelid_connectivity += [(0, 12)]
upper_eyelid_connectivity += list(zip(middle_indices, middle_indices[1:]))
upper_eyelid_connectivity += [(16, 6)]
return upper_eyelid_indices, upper_eyelid_connectivity
@labeller_func(group_label='eye_ibug_open_38')
def eye_ibug_open_38_to_eye_ibug_open_38(pcloud):
r"""
Apply the IBUG 38-point open eye semantic labels.
The semantic labels applied are as follows:
- upper_eyelid
- lower_eyelid
- iris
- pupil
- sclera
"""
from menpo.shape import LabelledPointUndirectedGraph
n_expected_points = 38
validate_input(pcloud, n_expected_points)
upper_el_indices, upper_el_connectivity = _build_upper_eyelid()
iris_range = (22, 30)
pupil_range = (30, 38)
sclera_top = np.arange(12, 17)
sclera_bottom = np.arange(17, 22)
sclera_indices = np.hstack((0, sclera_top, 6, sclera_bottom))
lower_el_top = np.arange(17, 22)
lower_el_bottom = np.arange(7, 12)
lower_el_indices = np.hstack((6, lower_el_top, 0, lower_el_bottom))
iris_connectivity = connectivity_from_range(iris_range, close_loop=True)
pupil_connectivity = connectivity_from_range(pupil_range, close_loop=True)
sclera_connectivity = list(zip(sclera_top, sclera_top[1:]))
sclera_connectivity += [(0, 21)]
sclera_connectivity += list(zip(sclera_bottom, sclera_bottom[1:]))
sclera_connectivity += [(6, 17)]
lower_el_connectivity = list(zip(lower_el_top, lower_el_top[1:]))
lower_el_connectivity += [(6, 7)]
lower_el_connectivity += list(zip(lower_el_bottom, lower_el_bottom[1:]))
lower_el_connectivity += [(11, 0)]
all_connectivity = np.asarray(upper_el_connectivity +
lower_el_connectivity +
iris_connectivity.tolist() +
pupil_connectivity.tolist() +
sclera_connectivity)
mapping = OrderedDict()
mapping['upper_eyelid'] = upper_el_indices
mapping['lower_eyelid'] = lower_el_indices
mapping['pupil'] = np.arange(*pupil_range)
mapping['iris'] = np.arange(*iris_range)
mapping['sclera'] = sclera_indices
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
pcloud.points, all_connectivity, mapping)
return new_pcloud, mapping
@labeller_func(group_label='eye_ibug_close_17')
def eye_ibug_close_17_to_eye_ibug_close_17(pcloud):
r"""
    Apply the IBUG 17-point closed-eye semantic labels.
The semantic labels applied are as follows:
- upper_eyelid
- lower_eyelid
"""
from menpo.shape import LabelledPointUndirectedGraph
n_expected_points = 17
validate_input(pcloud, n_expected_points)
upper_indices, upper_connectivity = _build_upper_eyelid()
middle_indices = np.arange(12, 17)
bottom_indices = np.arange(6, 12)
lower_indices = np.hstack((bottom_indices, 0, middle_indices))
lower_connectivity = list(zip(bottom_indices, bottom_indices[1:]))
lower_connectivity += [(0, 12)]
lower_connectivity += list(zip(middle_indices, middle_indices[1:]))
lower_connectivity += [(11, 0)]
all_connectivity = np.asarray(upper_connectivity + lower_connectivity)
mapping = OrderedDict()
mapping['upper_eyelid'] = upper_indices
mapping['lower_eyelid'] = lower_indices
new_pcloud = LabelledPointUndirectedGraph.init_from_indices_mapping(
pcloud.points, all_connectivity, mapping)
return new_pcloud, mapping
@labeller_func(group_label='eye_ibug_open_38_trimesh')
def eye_ibug_open_38_to_eye_ibug_open_38_trimesh(pcloud):
r"""
Apply the IBUG 38-point open eye semantic labels, with trimesh connectivity.
The semantic labels applied are as follows:
- tri
"""
from menpo.shape import TriMesh
n_expected_points = 38
validate_input(pcloud, n_expected_points)
tri_list = np.array([[29, 36, 28], [22, 13, 23], [12, 1, 2],
[29, 30, 37], [13, 3, 14], [13, 12, 2],
[19, 8, 9], [25, 33, 24], [36, 37, 33],
[24, 32, 31], [33, 37, 31], [35, 34, 27],
[35, 36, 33], [ 3, 13, 2], [14, 24, 23],
[33, 32, 24], [15, 25, 14], [25, 26, 34],
[22, 30, 29], [31, 37, 30], [24, 31, 23],
[32, 33, 31], [22, 12, 13], [ 0, 1, 12],
[14, 23, 13], [31, 30, 23], [28, 19, 20],
[21, 11, 0], [12, 21, 0], [20, 11, 21],
[20, 10, 11], [21, 29, 20], [21, 12, 22],
[30, 22, 23], [29, 21, 22], [27, 19, 28],
[29, 37, 36], [29, 28, 20], [36, 35, 28],
[20, 19, 10], [10, 19, 9], [28, 35, 27],
[19, 19, 8], [17, 16, 6], [18, 7, 8],
[25, 34, 33], [18, 27, 17], [18, 19, 27],
[18, 17, 7], [27, 26, 17], [17, 6, 7],
[14, 25, 24], [34, 35, 33], [17, 26, 16],
[27, 34, 26], [ 3, 15, 14], [15, 26, 25],
[ 4, 15, 3], [16, 26, 15], [16, 4, 5],
[16, 15, 4], [16, 5, 6], [8, 18, 19]])
new_pcloud = TriMesh(pcloud.points, trilist=tri_list, copy=False)
mapping = OrderedDict()
mapping['tri'] = np.arange(new_pcloud.n_points)
return new_pcloud, mapping
@labeller_func(group_label='eye_ibug_close_17_trimesh')
def eye_ibug_close_17_to_eye_ibug_close_17_trimesh(pcloud):
r"""
    Apply the IBUG 17-point closed-eye semantic labels, with trimesh
    connectivity.
The semantic labels applied are as follows:
- tri
"""
from menpo.shape import TriMesh
n_expected_points = 17
validate_input(pcloud, n_expected_points)
tri_list = np.array([[10, 11, 13], [ 3, 13, 2], [ 4, 14, 3],
[15, 5, 16], [12, 11, 0], [13, 14, 10],
[13, 12, 2], [14, 13, 3], [ 0, 1, 12],
[ 2, 12, 1], [13, 11, 12], [ 9, 10, 14],
[15, 9, 14], [ 7, 8, 15], [ 5, 6, 16],
[15, 14, 4], [ 7, 15, 16], [ 8, 9, 15],
[15, 4, 5], [16, 6, 7]])
new_pcloud = TriMesh(pcloud.points, trilist=tri_list, copy=False)
mapping = OrderedDict()
mapping['tri'] = np.arange(new_pcloud.n_points)
return new_pcloud, mapping
@labeller_func(group_label='tongue_ibug_19')
def tongue_ibug_19_to_tongue_ibug_19(pcloud):
r"""
Apply the IBUG 19-point tongue semantic labels.
The semantic labels applied are as follows:
- outline
- bisector
"""
n_expected_points = 19
validate_input(pcloud, n_expected_points)
labels = OrderedDict([
('outline', (0, 13, False)),
('bisector', (13, 19, False))
])
return pcloud_and_lgroup_from_ranges(pcloud, labels)
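`pcloud_and_lgroup_from_ranges`, used by the simpler labellers above, interprets each label as a `(start, stop, close_loop)` tuple over the point indices. A rough, hypothetical re-implementation of just the index-mapping part (menpo's real helper also builds the labelled graph, which is omitted here):

```python
import numpy as np
from collections import OrderedDict

def indices_from_ranges(labels):
    # Map each label name to the np.arange covered by its (start, stop) range;
    # the close_loop flag only affects connectivity, so it is ignored here.
    return OrderedDict((name, np.arange(start, stop))
                       for name, (start, stop, _close_loop) in labels.items())

labels = OrderedDict([
    ('outline', (0, 13, False)),
    ('bisector', (13, 19, False)),
])
mapping = indices_from_ranges(labels)
```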
560a9ce6a3efe6195f981492a11e735948a71f10 | 1693 | py | Python | riglib/positioner/calib.py | srsummerson/bmi_python | 7eca891f078ce15b2b9cf85f1309346d6fb9fccb | ["Apache-2.0"]
## ******* Raw Data ******
##
## In [15]: pos.continuous_move(10000, 10000, 10000)
## c move: 5184, 3359, 4011
##
## Limit switches read: 110101
##
##
## In [16]: pos.continuous_move(-10000, -10000, -10000)
## c move: 5060, 3367, 3985
##
## Limit switches read: 111011
##
##
## In [17]: pos.continuous_move(10000, 10000, 10000)
## c move: 5183, 3359, 3994
##
## Limit switches read: 110101
##
##
## In [18]: pos.continuous_move(-10000, -10000, -10000)
## c move: 5044, 3367, 4000
##
## Limit switches read: 111010
##
##
## In [19]: pos.continuous_move(10000, 10000, 10000)
## c move: 5183, 3359, 3994
##
## Limit switches read: 110101
##
##
## In [20]: pos.continuous_move(-10000, -10000, -10000)
## c move: 5044, 3367, 742
##
## Limit switches read: 111011
##
##
## In [21]: pos.continuous_move(-10000, -10000, -10000)
## c move: 2, 0, 3248
##
## Limit switches read: 110111
##
##
## In [22]: pos.continuous_move(10000, 10000, 10000)
## c move: 5184, 3358, 3995
##
## Limit switches read: 111101
##
##
## In [23]: pos.continuous_move(-10000, -10000, -10000)
## c move: 5044, 3367, 490
##
## Limit switches read: 111010
##
##
## In [24]: pos.continuous_move(-10000, -10000, -10000)
## c move: 4, 0, 3488
##
## Limit switches read: 110111
############################## End Raw Data #####################################
import numpy as np
n_steps_min_to_max = np.vstack([
(5184, 3359, 4011),
(5183, 3359, 3994),
(5183, 3359, 3994),
(5184, 3358, 3995),
])
n_steps_max_to_min = np.vstack([
(5060, 3367, 3985),
(5044, 3367, 4000),
(5044, 3367, 742+3248),
(5044, 3367, 490+3488)
])
| 21.987013 | 82 | 0.556409 | 219 | 1,693 | 4.219178 | 0.255708 | 0.21645 | 0.183983 | 0.238095 | 0.650433 | 0.515152 | 0.515152 | 0.515152 | 0.395022 | 0.306277 | 0 | 0.328582 | 0.21264 | 1,693 | 76 | 83 | 22.276316 | 0.364591 | 0.649144 | 0 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.076923 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
563193f73cc55e1bb29b78cc79305a40012d5432 | 3771 | py | Python | S4/S4 Library/simulation/statistics/ranked_statistic_display_info.py | NeonOcean/Environment | ca658cf66e8fd6866c22a4a0136d415705b36d26 | ["CC-BY-4.0"]
from sims4.tuning.tunable import HasTunableReference, TunableReference, TunableMapping, TunableEnumEntry, TunableTuple, OptionalTunable
from sims4.resources import Types
from sims4.tuning.tunable_base import ExportModes, GroupNames
from sims4.tuning.instances import HashedTunedInstanceMetaclass
import services
from sims4.localization import TunableLocalizedString
from bucks.bucks_enums import BucksType
class RankedStatisticDisplayInfo(HasTunableReference, metaclass=HashedTunedInstanceMetaclass, manager=services.get_instance_manager(Types.USER_INTERFACE_INFO)):
INSTANCE_TUNABLES = {'perks_panel_bucks_perk_info': TunableMapping(description='\n A mapping of bucks type to info for that bucks type.\n ', key_type=TunableEnumEntry(description='\n The buck type this data corresponds to.\n ', tunable_type=BucksType, default=BucksType.INVALID), value_type=TunableTuple(description='\n Info used to display Bucks and BucksPerks in the Perks Panel.\n ', bucks_text=OptionalTunable(TunableLocalizedString(description="\n Text containing a currency type token, to display the \n bucks balance in the Perks Panel.\n If not present, the points balance for this buck type \n won't be shown in the Perks Panel.\n ")), bucks_tooltip=OptionalTunable(TunableLocalizedString(description='\n Tooltip on the "Help" icon related for this buck type \n in the Perks Panel. Explains how this currency works.\n Can be none if no additional information is needed for\n this buck type. \n ')), perk_currency_label=OptionalTunable(TunableLocalizedString(description='\n Title text to display currency type on the Motive Panel.\n Such as "Power Points:" in "Power Points: 10"\n Can be none if we don\'t need to show this buck\'s balance on the Motive Panel.\n ')), perk_remove_tooltip=OptionalTunable(TunableLocalizedString(description="\n Tooltip on the removal button on the perk cell in the Perks Panel.\n Can be none if perks don't need a special removal tooltip.\n ")), export_class_name='TunableRankedStatBucksInfoTuple'), tuple_name='RankedStatBucksToBucksInfoTuple', export_modes=ExportModes.All, tuning_group=GroupNames.UI), 'perks_panel_revert_tooltip': OptionalTunable(TunableLocalizedString(description='\n Tooltip shown on the revert button in the Perks Panel.\n If not present, the revert button is unused and hidden.\n '), export_modes=ExportModes.All, tuning_group=GroupNames.UI), 'perks_panel_disabled_confirm_tooltip': TunableLocalizedString(description='\n The tooltip shown on the confirm button in the perks panel when it is disabled.\n ', 
export_modes=ExportModes.All, tuning_group=GroupNames.UI), 'perks_panel_title': TunableLocalizedString(description='\n The title shown in the Perks Panel for this statistic. \n ', export_modes=ExportModes.All, tuning_group=GroupNames.UI), 'perks_panel_subtitle': TunableLocalizedString(description='\n The subtitle shown in the Perks Panel for this statistic.\n ', export_modes=ExportModes.All, tuning_group=GroupNames.UI), 'ranked_statistic_reference': TunableReference(description='\n The ranked statistic gameplay tuning reference ID.\n ', manager=services.get_instance_manager(Types.STATISTIC), class_restrictions=('RankedStatistic',), export_modes=ExportModes.All, tuning_group=GroupNames.UI)}
56493e0056f92a6346e090c4fc56113e224d9c9b | 208 | py | Python | upgrade-insecure-requests/support/redirect-cors.py | ziransun/wpt | ab8f451eb39eb198584d547f5d965ef54df2a86a | ["BSD-3-Clause"]
def main(request, response):
response.status = 302
location = request.GET.first("location")
response.headers.set("Location", location)
response.headers.set("Access-Control-Allow-Origin", "*")
| 34.666667 | 60 | 0.701923 | 24 | 208 | 6.083333 | 0.625 | 0.219178 | 0.315068 | 0.356164 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01676 | 0.139423 | 208 | 5 | 61 | 41.6 | 0.798883 | 0 | 0 | 0 | 0 | 0 | 0.211538 | 0.129808 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
565bdeddcf9e926a7ab8f50486c7b4ac1c847272 | 488 | py | Python | armagarch/errors.py | iankhr/armagarch | 5d292b54cde992cca47024aaeb8d4120f0665a7d | ["MIT"]
# -*- coding: utf-8 -*-
"""
Created on Fri Jun 26 12:03:12 2020
This files contains some list of custom errors
@author: Ian Khrashchevskyi
"""
class Error(Exception):
pass
class InputError(Error):
def __init__(self, message):
self.message = message
class ProcedureError(Error):
def __init__(self, message):
self.message = message
class HessianError(Error):
def __init__(self, message):
self.message = message
| 19.52 | 47 | 0.631148 | 57 | 488 | 5.192982 | 0.578947 | 0.222973 | 0.121622 | 0.162162 | 0.449324 | 0.449324 | 0.449324 | 0.449324 | 0.310811 | 0 | 0 | 0.036415 | 0.268443 | 488 | 24 | 48 | 20.333333 | 0.792717 | 0.276639 | 0 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0.090909 | 0 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
565c8befd5ad55853add52ef01fc9345d15863b1 | 726 | py | Python | informacje/views.py | kbilak/Domki-Za-Lasem-Morze | 29c88775801bc7e119ea4d2120cafc70150dd9d9 | [
"Apache-2.0"
] | null | null | null | informacje/views.py | kbilak/Domki-Za-Lasem-Morze | 29c88775801bc7e119ea4d2120cafc70150dd9d9 | [
"Apache-2.0"
] | null | null | null | informacje/views.py | kbilak/Domki-Za-Lasem-Morze | 29c88775801bc7e119ea4d2120cafc70150dd9d9 | [
"Apache-2.0"
] | null | null | null | from django.shortcuts import render, get_object_or_404
from .forms import *
from .models import *
def index(request):
return render(request, 'views/index.html')
def gallery(request):
galleries = Galeria.objects.all()
return render(request, 'views/gallery.html', {'galleries': galleries})
def gallery_photo(request, slug, id):
photo = get_object_or_404(GaleriaZdjęcia, galeria__slug=slug, id=id)
return render(request, 'views/photo.html', {'photo': photo})
def contact(request):
    form = KontaktForm()
return render(request, 'views/contact.html', {'form': form})
def offer(request):
return render(request, 'views/offer.html')
def about(request):
return render(request, 'views/about.html')
| 29.04 | 74 | 0.720386 | 95 | 726 | 5.410526 | 0.326316 | 0.140078 | 0.22179 | 0.280156 | 0.180934 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009646 | 0.143251 | 726 | 24 | 75 | 30.25 | 0.81672 | 0 | 0 | 0 | 0 | 0 | 0.162534 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0.166667 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
567599955ecedf15f22f0d46710e253bacf01de6 | 152 | py | Python | test/test.py | napoler/mlp-pytorch | ade6ee7be0c3f0b679e93462bf4613b2db5ae936 | [
"Apache-2.0"
] | null | null | null | test/test.py | napoler/mlp-pytorch | ade6ee7be0c3f0b679e93462bf4613b2db5ae936 | [
"Apache-2.0"
] | null | null | null | test/test.py | napoler/mlp-pytorch | ade6ee7be0c3f0b679e93462bf4613b2db5ae936 | [
"Apache-2.0"
] | null | null | null |
#encoding=utf-8
from __future__ import unicode_literals
import sys
# add the parent directory to the import path
sys.path.append("../")
# import the local module
import Demo
demo = Demo.Demo()
demo.fun()
| 11.692308 | 39 | 0.730263 | 22 | 152 | 4.818182 | 0.681818 | 0.301887 | 0.339623 | 0.301887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007576 | 0.131579 | 152 | 12 | 40 | 12.666667 | 0.795455 | 0.184211 | 0 | 0 | 0 | 0 | 0.02521 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
568bba42427a8112eea1465c90b6f83ffda71a58 | 55 | py | Python | pythonClient/eXchange/lib/PaxHeaders.76732/cut.py | snowdensb/theexchange-python3-clients | f6724e00494dc1a705f8cb872425416d09b062cf | [
"MIT"
] | 16 | 2019-06-20T23:17:53.000Z | 2022-03-10T05:02:26.000Z | pythonClient/eXchange/lib/PaxHeaders.76732/cut.py | snowdensb/theexchange-python3-clients | f6724e00494dc1a705f8cb872425416d09b062cf | [
"MIT"
] | null | null | null | pythonClient/eXchange/lib/PaxHeaders.76732/cut.py | snowdensb/theexchange-python3-clients | f6724e00494dc1a705f8cb872425416d09b062cf | [
"MIT"
] | 14 | 2019-05-23T18:24:21.000Z | 2021-02-19T22:52:25.000Z | 26 atime=1555436580.38539
27 ctime=1554400727.995542
| 18.333333 | 27 | 0.818182 | 8 | 55 | 5.625 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.714286 | 0.109091 | 55 | 2 | 28 | 27.5 | 0.204082 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
56911cd3693b1f55d4cce406213c567a3963d1a9 | 2,726 | py | Python | peck/entry_writer.py | jgbarsness/journal_mngr | 831bee79037eea3dba712c6f475e6e9db41fbb9e | [
"MIT"
] | 1 | 2020-12-14T18:35:51.000Z | 2020-12-14T18:35:51.000Z | peck/entry_writer.py | jgbarsness/journal_mngr | 831bee79037eea3dba712c6f475e6e9db41fbb9e | [
"MIT"
] | 3 | 2020-10-05T22:58:46.000Z | 2020-12-17T21:15:26.000Z | peck/entry_writer.py | jgbarsness/journal_mngr | 831bee79037eea3dba712c6f475e6e9db41fbb9e | [
"MIT"
] | null | null | null | import peck.info_and_paths as c
from os import chmod
from abc import ABC, abstractmethod
import peck.ab_entry
from stat import S_IREAD
class EntryWriter(ABC):
'writes entries to file'
@abstractmethod
def write(self, obj: peck.ab_entry.AEntry):
pass
class FullWrite(EntryWriter):
'writes a full entry to file'
def write(self, obj):
entries = open(c.COLLECTION_TITLE, 'a+')
entries.writelines([obj.recorded_datetime, '\n',
c.DATESTAMP_UNDERLINE, '\n',
obj.title, '\n\n',
c.FIRST_MARKER, '\n',
obj.first, '\n\n',
c.SECOND_MARKER, '\n',
obj.second, '\n' + c.END_MARKER + '\n\n'])
entries.close()
chmod(c.COLLECTION_TITLE, S_IREAD)
class FirstWrite(EntryWriter):
'a first-section-only write'
def write(self, obj):
entries = open(c.COLLECTION_TITLE, 'a+')
entries.writelines([obj.recorded_datetime, '\n',
c.DATESTAMP_UNDERLINE, '\n',
obj.title, '\n\n',
c.FIRST_MARKER, '\n',
obj.first, '\n' + c.END_MARKER + '\n\n'])
entries.close()
chmod(c.COLLECTION_TITLE, S_IREAD)
class SecondWrite(EntryWriter):
'a second-section-only write'
def write(self, obj):
entries = open(c.COLLECTION_TITLE, 'a+')
entries.writelines([obj.recorded_datetime, '\n',
c.DATESTAMP_UNDERLINE, '\n',
obj.title, '\n\n',
c.SECOND_MARKER + '\n',
obj.second, '\n' + c.END_MARKER + '\n\n'])
entries.close()
chmod(c.COLLECTION_TITLE, S_IREAD)
class TitleWrite(EntryWriter):
'a write with a title only'
def write(self, obj):
entries = open(c.COLLECTION_TITLE, 'a+')
entries.writelines([obj.recorded_datetime, '\n',
c.DATESTAMP_UNDERLINE, '\n',
obj.title, '\n',
c.END_MARKER + '\n\n'])
entries.close()
chmod(c.COLLECTION_TITLE, S_IREAD)
class TagWrite(EntryWriter):
'a write with a tag'
def write(self, obj):
entries = open(c.COLLECTION_TITLE, 'a+')
entries.writelines([obj.recorded_datetime, '\n',
c.DATESTAMP_UNDERLINE, '\n',
'(' + obj.tag + ')\n',
obj.title, '\n',
c.END_MARKER + '\n\n'])
entries.close()
chmod(c.COLLECTION_TITLE, S_IREAD)
| 32.070588 | 70 | 0.501834 | 299 | 2,726 | 4.444816 | 0.180602 | 0.021068 | 0.120391 | 0.06772 | 0.749436 | 0.716328 | 0.716328 | 0.716328 | 0.716328 | 0.716328 | 0 | 0 | 0.370506 | 2,726 | 84 | 71 | 32.452381 | 0.774476 | 0.055026 | 0 | 0.621212 | 0 | 0 | 0.085473 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0.015152 | 0.075758 | 0 | 0.257576 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
569469523c18d884ee50a60c213d6347e4404070 | 16 | py | Python | modules/MMM-Testpython/FCF.py | ENTITYSmartMirror/finaltotal | 44ffd8812dc4c7bbfd4f7b651faefba73158e525 | [
"MIT"
] | null | null | null | modules/MMM-Testpython/FCF.py | ENTITYSmartMirror/finaltotal | 44ffd8812dc4c7bbfd4f7b651faefba73158e525 | [
"MIT"
] | null | null | null | modules/MMM-Testpython/FCF.py | ENTITYSmartMirror/finaltotal | 44ffd8812dc4c7bbfd4f7b651faefba73158e525 | [
"MIT"
] | null | null | null | print("Male_28") | 16 | 16 | 0.75 | 3 | 16 | 3.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0 | 16 | 1 | 16 | 16 | 0.5625 | 0 | 0 | 0 | 0 | 0 | 0.411765 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
3b225b4c5b9aa65a54ed76a2f898647e3fd7c0b0 | 423 | py | Python | oom/sysmon.py | cjonesii/fdc4_oom | 8e5d6040cccb14b77c685c92f528f40636ccecd1 | [
"MIT"
] | null | null | null | oom/sysmon.py | cjonesii/fdc4_oom | 8e5d6040cccb14b77c685c92f528f40636ccecd1 | [
"MIT"
] | null | null | null | oom/sysmon.py | cjonesii/fdc4_oom | 8e5d6040cccb14b77c685c92f528f40636ccecd1 | [
"MIT"
] | null | null | null | # /////////////////////////////////////////////////////////////////////
#
# sysmon.py :
#
# Copyright Adnacom 2022
#
# ////////////////////////////////////////////////////////////////////
import requests
import json
from oomjsonshim import *
def sysmon_get_keyvalue(key):
    js = requests.get(url.remote,
                      json={'cmd': 'sgp'})
    return js.text
def sysmon_get_data():
    return "Hola chika"
3b4d9326e78540a3238fd6cd9705d79813a403ed | 94 | py | Python | pypeta/__init__.py | JaylanLiu/pepeta | cc6e81e61ccc6b8b2276c14132954c5a4abb6763 | [
"MIT"
] | 1 | 2021-03-19T02:14:13.000Z | 2021-03-19T02:14:13.000Z | pypeta/__init__.py | JaylanLiu/pepeta | cc6e81e61ccc6b8b2276c14132954c5a4abb6763 | [
"MIT"
] | null | null | null | pypeta/__init__.py | JaylanLiu/pepeta | cc6e81e61ccc6b8b2276c14132954c5a4abb6763 | [
"MIT"
] | 1 | 2021-11-15T00:17:01.000Z | 2021-11-15T00:17:01.000Z | from .pypeta import *
from .utils import *
from .hgvs_variant import *
__version__ = '0.6.9'
| 15.666667 | 27 | 0.712766 | 14 | 94 | 4.428571 | 0.714286 | 0.322581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038462 | 0.170213 | 94 | 5 | 28 | 18.8 | 0.75641 | 0 | 0 | 0 | 0 | 0 | 0.053191 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
3b57fb2f799c3516ba3241a9244527acc59d5454 | 238 | py | Python | mypath.py | jaecheoljung/dreamsboat-naver.com | 509b7f62212f1cd9c36ddee4f6a68db66a871f81 | [
"MIT"
] | 1 | 2020-08-29T21:16:04.000Z | 2020-08-29T21:16:04.000Z | mypath.py | jaecheoljung/action-localization-pytorch | 509b7f62212f1cd9c36ddee4f6a68db66a871f81 | [
"MIT"
] | null | null | null | mypath.py | jaecheoljung/action-localization-pytorch | 509b7f62212f1cd9c36ddee4f6a68db66a871f81 | [
"MIT"
] | null | null | null | class Path(object):
@staticmethod
def db_dir(database):
root_dir = './aps_cut'
output_dir = './data'
return root_dir, output_dir
@staticmethod
def model_dir():
return './c3d-pretrained.pth' | 23.8 | 37 | 0.60084 | 28 | 238 | 4.857143 | 0.642857 | 0.220588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005848 | 0.281513 | 238 | 10 | 37 | 23.8 | 0.789474 | 0 | 0 | 0.222222 | 0 | 0 | 0.146444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0 | 0.111111 | 0.555556 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
3b60c8184a776a0fef61378d6be1d3a304182feb | 66 | py | Python | movierecommender/__init__.py | 123sgrimm/movierecommender | 137181fcdca70aae951e571817ac73ed367f230a | [
"MIT"
] | null | null | null | movierecommender/__init__.py | 123sgrimm/movierecommender | 137181fcdca70aae951e571817ac73ed367f230a | [
"MIT"
] | null | null | null | movierecommender/__init__.py | 123sgrimm/movierecommender | 137181fcdca70aae951e571817ac73ed367f230a | [
"MIT"
] | null | null | null | print('hi this is the init file of the movierecommender package')
| 33 | 65 | 0.787879 | 11 | 66 | 4.727273 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151515 | 66 | 1 | 66 | 66 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0.848485 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
3b663d2afed50a353ed6ddeae3c8e354dbca47ff | 874 | py | Python | functions.py | learning-foundation/python-oo-ds | 58c212da4562f65f99c8df24bff7667744ea552b | [
"MIT"
] | null | null | null | functions.py | learning-foundation/python-oo-ds | 58c212da4562f65f99c8df24bff7667744ea552b | [
"MIT"
] | null | null | null | functions.py | learning-foundation/python-oo-ds | 58c212da4562f65f99c8df24bff7667744ea552b | [
"MIT"
] | null | null | null | def average_velocity(distance, time):
if (time <= 0):
return distance
velocity = div(distance, time)
return velocity
def sum(number_1, number_2):
return number_1 + number_2
def subtraction(number_1, number_2):
return number_1 - number_2
def calc(number_1, number_2):
return number_1 + number_2, number_1 - number_2, number_1 * number_2, number_1 / number_2
def div(number_1, number_2):
return number_1 / number_2
def test_args_kwargs(arg1, arg2, arg3):
print('arg1:', arg1)
print('arg2:', arg2)
print('arg3:', arg3)
def test_kwargs(**kwargs):
for key, value in kwargs.items():
print(key, value)
args = ('one', 2, 3)
# test_args_kwargs(*args)
kwargs = {'arg3': 3, 'arg2': 'two', 'arg1': 'one'}
# test_args_kwargs(**kwargs)
kwargs = {'arg3': 3, 'arg2': 'two', 'arg1': 'one', 'arg4': 4}
test_kwargs(**kwargs) | 24.971429 | 93 | 0.654462 | 132 | 874 | 4.098485 | 0.242424 | 0.142329 | 0.264325 | 0.284658 | 0.443623 | 0.438078 | 0.438078 | 0.345656 | 0.345656 | 0.308688 | 0 | 0.062235 | 0.191076 | 874 | 35 | 94 | 24.971429 | 0.70297 | 0.057208 | 0 | 0 | 0 | 0 | 0.07056 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.291667 | false | 0 | 0 | 0.166667 | 0.541667 | 0.166667 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
8e574c75991e10889b3959582ec4313dd13ad431 | 39 | py | Python | crashreports/__init__.py | FairphoneMirrors/hiccup-server | 8b80109740ea663d23ca46bb272c8fd95f873f1e | [
"Apache-2.0"
] | null | null | null | crashreports/__init__.py | FairphoneMirrors/hiccup-server | 8b80109740ea663d23ca46bb272c8fd95f873f1e | [
"Apache-2.0"
] | 1 | 2019-10-21T18:00:57.000Z | 2019-10-21T18:00:57.000Z | crashreports/__init__.py | FairphoneMirrors/hiccup-server | 8b80109740ea663d23ca46bb272c8fd95f873f1e | [
"Apache-2.0"
] | null | null | null | """Hiccup crashreports application."""
| 19.5 | 38 | 0.74359 | 3 | 39 | 9.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 39 | 1 | 39 | 39 | 0.805556 | 0.820513 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
8e64b35524ce684d8180a451bcbdd23b581219da | 25 | py | Python | py2md/__init__.py | Xero64/py2md | d35f7163863bb3012d7285207c7e4378df20ffa5 | [
"MIT"
] | 4 | 2019-06-30T11:29:00.000Z | 2021-09-04T19:27:40.000Z | py2md/__init__.py | Xero64/py2md | d35f7163863bb3012d7285207c7e4378df20ffa5 | [
"MIT"
] | null | null | null | py2md/__init__.py | Xero64/py2md | d35f7163863bb3012d7285207c7e4378df20ffa5 | [
"MIT"
] | null | null | null | from .py2md import Py2MD
| 12.5 | 24 | 0.8 | 4 | 25 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 0.16 | 25 | 1 | 25 | 25 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
8ea273bf02449f69775b1fbe9b6df965bdaf25de | 207 | py | Python | PhdThesis/FeasibilityStudies/feasibility_studies/__init__.py | pariterre/ViolinOptimalControl | b7037d051a73f2c6cf5815e9d2269ea8c2e11993 | [
"MIT"
] | null | null | null | PhdThesis/FeasibilityStudies/feasibility_studies/__init__.py | pariterre/ViolinOptimalControl | b7037d051a73f2c6cf5815e9d2269ea8c2e11993 | [
"MIT"
] | 1 | 2020-04-16T02:21:49.000Z | 2020-04-16T02:21:49.000Z | PhdThesis/FeasibilityStudies/feasibility_studies/__init__.py | pariterre/ViolinOptimalControl | b7037d051a73f2c6cf5815e9d2269ea8c2e11993 | [
"MIT"
] | 1 | 2019-11-18T16:31:16.000Z | 2019-11-18T16:31:16.000Z | from .fatigue_integrator import FatigueIntegrator, StudyConfiguration
from .fatigue_models import FatigueModels
from .target_function import TargetFunctions
from .fatigue_parameters import FatigueParameters
| 41.4 | 69 | 0.89372 | 21 | 207 | 8.619048 | 0.619048 | 0.18232 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082126 | 207 | 4 | 70 | 51.75 | 0.952632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
8ea77e73aef60427b4bfa4d250cc804f92cb8b25 | 109 | py | Python | corehq/couchapps/__init__.py | dslowikowski/commcare-hq | ad8885cf8dab69dc85cb64f37aeaf06106124797 | [
"BSD-3-Clause"
] | 1 | 2015-02-10T23:26:39.000Z | 2015-02-10T23:26:39.000Z | corehq/couchapps/__init__.py | SEL-Columbia/commcare-hq | 992ee34a679c37f063f86200e6df5a197d5e3ff6 | [
"BSD-3-Clause"
] | null | null | null | corehq/couchapps/__init__.py | SEL-Columbia/commcare-hq | 992ee34a679c37f063f86200e6df5a197d5e3ff6 | [
"BSD-3-Clause"
] | null | null | null | from corehq.preindex import CouchAppsPreindexPlugin
CouchAppsPreindexPlugin.register('couchapps', __file__)
| 27.25 | 55 | 0.87156 | 9 | 109 | 10.111111 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06422 | 109 | 3 | 56 | 36.333333 | 0.892157 | 0 | 0 | 0 | 0 | 0 | 0.082569 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
8eb9b1572dd4185c3774dfcc2affd9bbb5e1df92 | 4,399 | py | Python | data_steward/test/unit_test/data_steward/cdr_cleaner/cleaning_rules/person_id_validator_test.py | berneskaracay/curation | 713e8ac606822a6f639ed3a74c9c0c07ba19f7c0 | [
"MIT"
] | null | null | null | data_steward/test/unit_test/data_steward/cdr_cleaner/cleaning_rules/person_id_validator_test.py | berneskaracay/curation | 713e8ac606822a6f639ed3a74c9c0c07ba19f7c0 | [
"MIT"
] | null | null | null | data_steward/test/unit_test/data_steward/cdr_cleaner/cleaning_rules/person_id_validator_test.py | berneskaracay/curation | 713e8ac606822a6f639ed3a74c9c0c07ba19f7c0 | [
"MIT"
] | null | null | null | # Python imports
import copy
import unittest
# Third party imports
# Project imports
import constants.bq_utils as bq_consts
import constants.cdr_cleaner.clean_cdr as clean_consts
import cdr_cleaner.cleaning_rules.person_id_validator as validator
import resources
class PersonIDValidatorTest(unittest.TestCase):
@classmethod
def setUpClass(cls):
print('**************************************************************')
print(cls.__name__)
print('**************************************************************')
def setUp(self):
self.mapped_tables = copy.copy(validator.MAPPED_VALIDATION_TABLES)
self.unmapped_tables = copy.copy(validator.UNMAPPED_VALIDATION_TABLES)
self.all_tables = copy.copy(self.mapped_tables)
self.all_tables.extend(self.unmapped_tables)
def test_get_person_id_validation_queries(self):
# pre conditions
# test
results = validator.get_person_id_validation_queries('foo', 'bar')
# post conditions
self.assertEqual(len(results), ((len(self.all_tables) * 2) - 1))
existing_and_consenting = validator.EXISTING_AND_VALID_CONSENTING_RECORDS
existing_in_person_table = validator.SELECT_EXISTING_PERSON_IDS
expected = []
for table in self.mapped_tables:
field_names = ['entry.' + field['name'] for field in resources.fields_for(table)]
fields = ', '.join(field_names)
expected.append(
{
clean_consts.QUERY: existing_and_consenting.format(
project='foo', dataset='bar', mapping_dataset='bar', table=table, fields=fields
),
clean_consts.DESTINATION_TABLE: table,
clean_consts.DESTINATION_DATASET: 'bar',
clean_consts.DISPOSITION: bq_consts.WRITE_TRUNCATE,
}
)
for table in self.all_tables:
field_names = ['entry.' + field['name'] for field in resources.fields_for(table)]
fields = ', '.join(field_names)
expected.append(
{
clean_consts.QUERY: existing_in_person_table.format(
project='foo', dataset='bar', table=table, fields=fields
),
clean_consts.DESTINATION_TABLE: table,
clean_consts.DESTINATION_DATASET: 'bar',
clean_consts.DISPOSITION: bq_consts.WRITE_TRUNCATE,
}
)
self.assertEqual(expected, results)
def test_get_person_id_validation_queries_deid(self):
# pre conditions
# test
results = validator.get_person_id_validation_queries('foo', 'bar_deid')
# post conditions
self.assertEqual(len(results), ((len(self.all_tables) * 2) - 1))
existing_and_consenting = validator.EXISTING_AND_VALID_CONSENTING_RECORDS
existing_in_person_table = validator.SELECT_EXISTING_PERSON_IDS
expected = []
for table in self.mapped_tables:
field_names = ['entry.' + field['name'] for field in resources.fields_for(table)]
fields = ', '.join(field_names)
expected.append(
{
clean_consts.QUERY: existing_and_consenting.format(
project='foo', dataset='bar_deid', mapping_dataset='bar', table=table, fields=fields
),
clean_consts.DESTINATION_TABLE: table,
clean_consts.DESTINATION_DATASET: 'bar_deid',
clean_consts.DISPOSITION: bq_consts.WRITE_TRUNCATE,
}
)
for table in self.all_tables:
field_names = ['entry.' + field['name'] for field in resources.fields_for(table)]
fields = ', '.join(field_names)
expected.append(
{
clean_consts.QUERY: existing_in_person_table.format(
project='foo', dataset='bar_deid', table=table, fields=fields
),
clean_consts.DESTINATION_TABLE: table,
clean_consts.DESTINATION_DATASET: 'bar_deid',
clean_consts.DISPOSITION: bq_consts.WRITE_TRUNCATE,
}
)
self.assertEqual(expected, results)
| 37.598291 | 108 | 0.58286 | 436 | 4,399 | 5.575688 | 0.178899 | 0.076923 | 0.072398 | 0.034554 | 0.78733 | 0.78733 | 0.784039 | 0.755245 | 0.755245 | 0.755245 | 0 | 0.001308 | 0.304615 | 4,399 | 116 | 109 | 37.922414 | 0.793397 | 0.027734 | 0 | 0.578313 | 0 | 0 | 0.058824 | 0.02906 | 0 | 0 | 0 | 0 | 0.048193 | 1 | 0.048193 | false | 0 | 0.072289 | 0 | 0.13253 | 0.036145 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
8ec1c98cccc97ee1104afca8bdae350a6b411b3b | 256 | py | Python | pipelines/helpers/tmp.py | CloudXLR8R/kubeflow-examples | 965b334fa86a72084745e26a3e276a25449d0a18 | [
"Apache-2.0"
] | 5 | 2021-06-04T20:55:46.000Z | 2021-06-05T09:37:12.000Z | pipelines/helpers/tmp.py | CloudXLR8R/kubeflow-examples | 965b334fa86a72084745e26a3e276a25449d0a18 | [
"Apache-2.0"
] | null | null | null | pipelines/helpers/tmp.py | CloudXLR8R/kubeflow-examples | 965b334fa86a72084745e26a3e276a25449d0a18 | [
"Apache-2.0"
] | 1 | 2021-12-17T04:04:50.000Z | 2021-12-17T04:04:50.000Z | from tempfile import mkstemp, mkdtemp
def get_tmp_dir(prefix, suffix=""):
return mkdtemp(prefix=f"{prefix}_", suffix=suffix)
def get_tmp_filename(prefix, suffix=""):
_, filepath = mkstemp(prefix=f"{prefix}_", suffix=suffix)
return filepath
| 23.272727 | 61 | 0.71875 | 33 | 256 | 5.363636 | 0.454545 | 0.271186 | 0.101695 | 0.214689 | 0.282486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144531 | 256 | 10 | 62 | 25.6 | 0.808219 | 0 | 0 | 0 | 0 | 0 | 0.070313 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0.166667 | 0.833333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
8ee1eced0aa6c64f441ecd6911caac3e511cb87e | 216 | py | Python | concrete/common/data_types/__init__.py | iciac/concrete-numpy | debf888e9281263b731cfc4b31feb5de7ec7f47a | [
"FTL"
] | 96 | 2022-01-12T15:07:50.000Z | 2022-03-16T04:00:09.000Z | concrete/common/data_types/__init__.py | iciac/concrete-numpy | debf888e9281263b731cfc4b31feb5de7ec7f47a | [
"FTL"
] | 10 | 2022-02-04T16:26:37.000Z | 2022-03-25T14:08:01.000Z | concrete/common/data_types/__init__.py | iciac/concrete-numpy | debf888e9281263b731cfc4b31feb5de7ec7f47a | [
"FTL"
] | 8 | 2022-01-12T15:07:55.000Z | 2022-03-05T00:46:16.000Z | """Module for data types code and data structures."""
from . import dtypes_helpers, floats, integers
from .floats import Float, Float16, Float32, Float64
from .integers import Integer, SignedInteger, UnsignedInteger
| 43.2 | 61 | 0.796296 | 27 | 216 | 6.333333 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031746 | 0.125 | 216 | 4 | 62 | 54 | 0.873016 | 0.217593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
d9059e5ac6229965fa7739e04c089d4e22aa0faf | 38 | py | Python | Location.py | Wally869/TextAdventure | 80d0a32440d57560993c7502a64b51b2b569dff2 | [
"MIT"
] | null | null | null | Location.py | Wally869/TextAdventure | 80d0a32440d57560993c7502a64b51b2b569dff2 | [
"MIT"
] | null | null | null | Location.py | Wally869/TextAdventure | 80d0a32440d57560993c7502a64b51b2b569dff2 | [
"MIT"
] | 1 | 2022-03-20T21:19:17.000Z | 2022-03-20T21:19:17.000Z |
class Location(object):
pass
| 4.75 | 23 | 0.605263 | 4 | 38 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.315789 | 38 | 7 | 24 | 5.428571 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
d977db8a490be6d22ada8076541810bc6a541504 | 464 | py | Python | tests/system/conftest.py | fkie-cad/woodblock | ac4a590744021540fc7388765629bf3367f89e2e | [
"MIT"
] | 8 | 2019-08-14T08:57:21.000Z | 2022-02-18T01:35:24.000Z | tests/system/conftest.py | fkie-cad/woodblock | ac4a590744021540fc7388765629bf3367f89e2e | [
"MIT"
] | 1 | 2020-01-24T23:38:36.000Z | 2020-02-27T14:00:59.000Z | tests/system/conftest.py | fkie-cad/woodblock | ac4a590744021540fc7388765629bf3367f89e2e | [
"MIT"
] | 2 | 2019-08-22T15:30:53.000Z | 2020-01-24T23:11:34.000Z | import pathlib
import pytest
import woodblock.file
HERE = pathlib.Path(__file__).absolute().parent
DATA_FILES = HERE.parent / 'data'
@pytest.fixture
def test_data_path():
return DATA_FILES
@pytest.fixture
def test_corpus_path(test_data_path):
return test_data_path / 'corpus'
@pytest.fixture
def config_path(test_data_path):
return test_data_path / 'configs'
@pytest.fixture(autouse=True)
def reset_corpus():
woodblock.file.corpus(None)
| 16 | 47 | 0.760776 | 65 | 464 | 5.123077 | 0.338462 | 0.12012 | 0.18018 | 0.162162 | 0.204204 | 0.204204 | 0.204204 | 0.204204 | 0 | 0 | 0 | 0 | 0.137931 | 464 | 28 | 48 | 16.571429 | 0.8325 | 0 | 0 | 0.176471 | 0 | 0 | 0.036638 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.176471 | 0.176471 | 0.588235 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
d9845cbfcdf67b074166004ec09c7d7562fcb872 | 155 | py | Python | tool/save_file.py | SOLINSIGHT/solinsight | b0398c48e33a1f43a2ec4528477cc07e0b692bd6 | [
"Apache-2.0"
] | null | null | null | tool/save_file.py | SOLINSIGHT/solinsight | b0398c48e33a1f43a2ec4528477cc07e0b692bd6 | [
"Apache-2.0"
] | 1 | 2021-12-18T08:44:43.000Z | 2021-12-18T08:44:43.000Z | tool/save_file.py | SOLINSIGHT/solinsight | b0398c48e33a1f43a2ec4528477cc07e0b692bd6 | [
"Apache-2.0"
] | null | null | null | # Save the file (file name, content)
def save_to_file(path_file_name, contents):
    with open(path_file_name, 'w') as fh:
        fh.write(contents)
| 22.142857 | 43 | 0.683871 | 25 | 155 | 4 | 0.56 | 0.24 | 0.24 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.187097 | 155 | 6 | 44 | 25.833333 | 0.793651 | 0.219355 | 0 | 0 | 0 | 0 | 0.008475 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
d98dc7d08ce94494570839e222ec7e9f497d0034 | 67 | py | Python | ArcFace/__init__.py | liuwuliuyun/CFDS | 9e86ace020b1191365e0ee5aca4c8614ebc4911b | [
"MIT"
] | 1 | 2018-12-22T02:19:19.000Z | 2018-12-22T02:19:19.000Z | ArcFace/__init__.py | liuwuliuyun/CFDS | 9e86ace020b1191365e0ee5aca4c8614ebc4911b | [
"MIT"
] | null | null | null | ArcFace/__init__.py | liuwuliuyun/CFDS | 9e86ace020b1191365e0ee5aca4c8614ebc4911b | [
"MIT"
] | null | null | null | from .extractor import extractor
from .yliu_aligner import aligner
| 22.333333 | 33 | 0.850746 | 9 | 67 | 6.222222 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119403 | 67 | 2 | 34 | 33.5 | 0.949153 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
d98eadcede2c3d9a7aa06fb6bc465e04505f16b0 | 275 | py | Python | tests/unit/locationcontext/test_mod_init.py | manojn97/kubernetes-driver | 06f554e7e74927d528bce51807ed52b4a3a77aa4 | [
"Apache-2.0"
] | 2 | 2021-07-19T07:11:36.000Z | 2021-08-20T00:42:31.000Z | tests/unit/locationcontext/test_mod_init.py | manojn97/kubernetes-driver | 06f554e7e74927d528bce51807ed52b4a3a77aa4 | [
"Apache-2.0"
] | 39 | 2020-08-21T07:39:31.000Z | 2022-03-21T09:26:08.000Z | tests/unit/locationcontext/test_mod_init.py | manojn97/kubernetes-driver | 06f554e7e74927d528bce51807ed52b4a3a77aa4 | [
"Apache-2.0"
] | 11 | 2020-09-29T06:03:50.000Z | 2022-03-07T06:29:36.000Z | import unittest
import kubedriver.locationcontext as locationcontext
class TestImports(unittest.TestCase):
def test_factory(self):
imported = locationcontext.LocationContextFactory
def test_context(self):
imported = locationcontext.LocationContext
| 25 | 57 | 0.781818 | 25 | 275 | 8.52 | 0.6 | 0.065728 | 0.253521 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163636 | 275 | 10 | 58 | 27.5 | 0.926087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.714286 | 0 | 1.142857 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
79b32a47780cbd671cb94c23a20303a7c3bfbfc6 | 26 | py | Python | app/__init__.py | cerealcake/flask-riak-mcollective-ldap3 | 00c1bd4dc9ed77ffff0f6ca56c4db74542d436cf | [
"MIT"
] | null | null | null | app/__init__.py | cerealcake/flask-riak-mcollective-ldap3 | 00c1bd4dc9ed77ffff0f6ca56c4db74542d436cf | [
"MIT"
] | null | null | null | app/__init__.py | cerealcake/flask-riak-mcollective-ldap3 | 00c1bd4dc9ed77ffff0f6ca56c4db74542d436cf | [
"MIT"
] | 1 | 2019-10-20T06:58:16.000Z | 2019-10-20T06:58:16.000Z | # riak mcollective ldap3

# === Modules/utility/__init__.py | iheb-brini/fitness-lab | MIT ===
from .tools import (try_gpu, try_all_gpus, summary)
from .classes import (Timer, Accumulator)

# === link/linktools/android/__init__.py | ice-black-tea/Zelda | Apache-2.0 ===
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
@author : Hu Ji
@file : __init__.py
@time : 2020/03/01
@site :
@software: PyCharm
,----------------, ,---------,
,-----------------------, ," ,"|
," ,"| ," ," |
+-----------------------+ | ," ," |
| .-----------------. | | +---------+ |
| | | | | | -==----'| |
| | $ sudo rm -rf / | | | | | |
| | | | |/----|`---= | |
| | | | | ,/|==== ooo | ;
| | | | | // |(((( [33]| ,"
| `-----------------' |," .;'| |(((( | ,"
+-----------------------+ ;; | | |,"
/_)______________(_/ //' | +---------+
___________________________/___ `,
/ oooooooooooooooo .o. oooo /, \,"-----------
/ ==ooooooooooooooo==.o. ooo= // ,`\--{)B ,"
/_==__==========__==_ooo__ooo=_/' /___________,"
"""
from .adb import Adb, Device, AdbError
from .argparser import AdbArgumentParser
from .struct import Package, Permission, Component, Activity, Service, Receiver, Provider, IntentFilter

# === codegene/__init__.py | technosvitman/sm_gene | MIT ===
from .CodeGenerator import CodeGenerator
from .Source import Source
from .Header import Header
from .Plantuml import Plantuml

# === experiments/experiments_toy/convergence/nmf_icm.py | ThomasBrouwer/BNMTF | Apache-2.0 ===
"""
Recover the toy dataset using ICM.
We can plot the MSE, R2 and Rp as it converges, on the entire dataset.
We have I=100, J=80, K=10, and no test data.
We give flatter priors (1/10) than what was used to generate the data (1).
"""
import sys, os
project_location = os.path.dirname(__file__)+"/../../../../"
sys.path.append(project_location)
from BNMTF.code.models.nmf_icm import nmf_icm
import numpy, matplotlib.pyplot as plt
##########
input_folder = project_location+"BNMTF/data_toy/bnmf/"
iterations = 200
init_UV = 'random'
I, J, K = 100,80,10
minimum_TN = 0.1
alpha, beta = 1., 1. #1., 1.
lambdaU = numpy.ones((I,K))/10.
lambdaV = numpy.ones((J,K))/10.
priors = { 'alpha':alpha, 'beta':beta, 'lambdaU':lambdaU, 'lambdaV':lambdaV }
# Load in data
R = numpy.loadtxt(input_folder+"R.txt")
M = numpy.ones((I,J))
# Run the VB algorithm
NMF = nmf_icm(R,M,K,priors)
NMF.initialise(init_UV)
NMF.run(iterations,minimum_TN=minimum_TN)
# Plot the tau values to check convergence
plt.plot(NMF.all_tau)
# Extract the performances across all iterations
print("icm_all_performances = %s" % NMF.all_performances)
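The `all_performances` dict recorded below tracks three fits between the data matrix R and the model's reconstruction: MSE, R^2, and Rp (Pearson correlation). A standalone sketch of those metrics (an illustrative helper, not part of the BNMTF codebase):

```python
import numpy


def performance_metrics(R, R_pred, M):
    """MSE, R^2 and Rp (Pearson correlation) over the observed entries of R."""
    observed = M == 1
    x, y = R[observed], R_pred[observed]
    mse = numpy.mean((x - y) ** 2)
    r2 = 1.0 - numpy.sum((x - y) ** 2) / numpy.sum((x - numpy.mean(x)) ** 2)
    rp = numpy.corrcoef(x, y)[0, 1]
    return mse, r2, rp


# Sanity check: a perfect reconstruction gives MSE -> 0, R^2 -> 1, Rp -> 1.
R_demo = numpy.arange(12.0).reshape(3, 4)
mse, r2, rp = performance_metrics(R_demo, R_demo.copy(), numpy.ones((3, 4)))
```

With a mask M of ones, as in this experiment, the metrics are computed over the entire matrix.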
'''
icm_all_performances = {'R^2': [-3.011325221921278, 0.6122247892013752, 0.6466573083285352, 0.7009971539050444, 0.7516170229311595, 0.7907776368331421, 0.8234330007573292, 0.8439394309755051, 0.863190988228737, 0.875907208593979, 0.8863902136389366, 0.8956711204838093, 0.9030691260197806, 0.9093572010424538, 0.915006474554024, 0.9208683743857938, 0.9297717022302099, 0.9382762694219999, 0.9440805949318114, 0.9481637834640294, 0.9512339963214808, 0.9536824976757972, 0.9557067102933176, 0.9574117113766332, 0.9588800821133757, 0.9601711935524039, 0.9613289552855884, 0.9623773397863895, 0.9633363247611982, 0.9642260298165124, 0.9650591336929771, 0.9658476771519857, 0.9665981939045396, 0.9673144124559565, 0.9679974803398015, 0.9686491553083643, 0.969270689248866, 0.9698621475376464, 0.9704231637295726, 0.9709522041051154, 0.9714462525541149, 0.9719082527778845, 0.9723401535497302, 0.9727429011312829, 0.9731153722352602, 0.9734587356715676, 0.973774679558256, 0.9740650713542474, 0.974331384589147, 0.9745760048935016, 0.9748003226577837, 0.9750056187155984, 0.9751933204372832, 0.9753656793342897, 0.975524450646072, 0.9756705826874299, 0.9758048782658982, 0.9759279983494072, 0.9760414435441985, 0.9761464198666829, 0.9762439737494745, 0.9763349709416392, 0.9764200978273005, 0.9765000143023518, 0.9765752982876535, 0.9766464029077505, 0.9767137870977944, 0.976777763158665, 0.9768385275485062, 0.976896216756862, 0.9769508565927314, 0.9770027176525754, 0.9770520143237956, 0.9770989747053823, 0.9771437967041209, 0.9771867068689287, 0.9772279068848454, 0.9772675330411751, 0.9773057059437851, 0.977342545050111, 0.9773781524421333, 0.9774125855589668, 0.9774458879203755, 0.9774780881680812, 0.9775092418374954, 0.9775394190363488, 0.9775686807741273, 0.9775970857638983, 0.9776246782141919, 0.9776514555564746, 0.9776773801685651, 0.9777025450581172, 0.977726974765106, 0.9777507057351827, 0.977773769986438, 0.9777961961130616, 0.9778180098685153, 0.9778391179450548, 0.9778594233471256, 
0.9778791331558111, 0.9778982737730811, 0.9779168751767531, 0.9779349635705069, 0.9779525633727618, 0.9779697029026608, 0.9779863975071169, 0.9780026731155237, 0.9780185484716732, 0.9780340431606896, 0.978049172458006, 0.9780639490962261, 0.9780783664143862, 0.9780924240341047, 0.9781061480430513, 0.9781195515938678, 0.9781326453125075, 0.9781454390967911, 0.9781579418980331, 0.9781701632363203, 0.9781820969824855, 0.9781937497906589, 0.9782051380898764, 0.978216273617541, 0.97822716702872, 0.9782378275386251, 0.9782482636447294, 0.9782584834847775, 0.9782684969305622, 0.9782783083736516, 0.9782879238019593, 0.9782973496940097, 0.9783065918326681, 0.9783156553899555, 0.9783245452786685, 0.9783332660090225, 0.9783418221871046, 0.9783502214956838, 0.9783584688467293, 0.9783665679674812, 0.9783745221792527, 0.9783823342804722, 0.9783900075324913, 0.9783975455379338, 0.9784049516509242, 0.9784122290694917, 0.9784193810552732, 0.9784264109568401, 0.9784333215540182, 0.9784401168331144, 0.9784467972138723, 0.9784533657094477, 0.9784598251188316, 0.9784661780178132, 0.9784724286406051, 0.9784785765210536, 0.9784846246945589, 0.978490574938382, 0.9784964292070676, 0.9785021897245386, 0.9785078585313441, 0.9785134375766608, 0.9785189319154892, 0.9785243441381586, 0.9785296707472255, 0.9785349069305831, 0.9785400576798384, 0.9785451209562008, 0.9785501068063127, 0.9785550178068111, 0.9785598562371592, 0.9785646239133954, 0.9785693226602352, 0.9785739542582257, 0.9785785205438549, 0.9785830231596911, 0.978587463615462, 0.9785918433520977, 0.9785961637047367, 0.978600426176705, 0.9786046321511963, 0.9786087828566364, 0.9786128794436941, 0.9786169229892924, 0.978620914517937, 0.9786248550197832, 0.9786287454611107, 0.9786325867887052, 0.9786363799307498, 0.9786401256273268, 0.9786438249257021, 0.9786474789102443, 0.9786510881686012, 0.9786546542740576, 0.9786581782334028, 0.9786616608794534, 0.9786651029885662, 0.9786685052959115, 0.9786718685109778, 0.978675193270265, 
0.9786784797118755], 'MSE': [157.73430399888665, 15.248190959207893, 13.894227084683257, 11.757462487642595, 9.7669757114233438, 8.2270925465674658, 6.9430104002824189, 6.1366515739550778, 5.3796371669360417, 4.8796068632731462, 4.4673915944769593, 4.1024455228758825, 3.8115393535582398, 3.5642781412850848, 3.3421360370804134, 3.1116329891045242, 2.7615341706047785, 2.4271155152741462, 2.1988764187606522, 2.0383162882317287, 1.9175886330538321, 1.8213080685030361, 1.7417114886432217, 1.674667022209924, 1.6169273916991174, 1.5661579943171295, 1.5206322064347553, 1.4794073766320301, 1.441697936689287, 1.4067127385564253, 1.373953225716426, 1.3429459284272729, 1.3134339265191106, 1.2852706068008652, 1.2584108456791596, 1.2327855243918189, 1.208345415925679, 1.1850879495364341, 1.1630275346685524, 1.142224480613619, 1.1227973875874049, 1.1046305026523751, 1.0876471956715863, 1.0718102575155173, 1.0571638583593252, 1.0436620379762187, 1.0312384157784562, 1.0198195590930468, 1.0093475254311981, 0.99972850644951539, 0.99090782887636197, 0.98283512746183466, 0.97545427480308333, 0.96867673642031549, 0.96243349236169251, 0.95668725276248079, 0.95140644942993569, 0.94656509162268587, 0.94210417213786168, 0.93797626770544396, 0.93414023024819359, 0.9305620165711973, 0.92721463651135805, 0.92407214148219363, 0.92111180633934875, 0.91831581320949329, 0.91566611573320567, 0.91315043353948344, 0.91076104145351322, 0.90849257240031112, 0.90634401151215838, 0.9043047183293389, 0.90236626265840725, 0.90051967513550812, 0.89875717362198382, 0.89706984970632575, 0.89544977268044479, 0.89389158334325391, 0.89239053920155542, 0.89094194292108075, 0.8895417804094069, 0.88818779303172968, 0.88687826949332238, 0.88561208353744347, 0.88438705138716822, 0.88320041624929679, 0.88204977909400717, 0.88093283119181676, 0.879847834164179, 0.87879488901158076, 0.87777547512257825, 0.87678593522762871, 0.87582530435987571, 0.87489214939457238, 0.87398521130859863, 0.87310336571426961, 
0.87224559992771622, 0.87141558301871114, 0.87061712906998723, 0.86984209518892353, 0.8690894431908508, 0.86835799428075899, 0.86764671807118854, 0.86695465437512453, 0.8662806896203562, 0.86562422033184516, 0.8649842269076814, 0.86435997229992823, 0.86375068637758001, 0.86315576847634679, 0.86257471792169738, 0.86200779663765459, 0.86145501950304237, 0.86091536069217323, 0.86038830301368052, 0.85987342863123994, 0.85937034834035375, 0.85887871015525485, 0.85839813972478196, 0.8579288780615355, 0.85747066351018941, 0.85702285004420875, 0.85658497612258366, 0.85615662276416882, 0.85573742760618043, 0.85532705650891405, 0.85492518947799712, 0.85453143833240564, 0.85414563038582314, 0.85376753018006835, 0.8533968829650258, 0.85303346134546332, 0.85267706194591431, 0.85232749158420296, 0.85198457290854068, 0.85164812479725416, 0.85131784514663356, 0.85099354080704281, 0.85067506521502889, 0.85036228776911549, 0.85005509842882376, 0.8497533689504887, 0.8494569576708636, 0.84916573269825701, 0.84887956827898092, 0.8485983361581636, 0.84832190466246238, 0.84805016448289794, 0.84778295886893829, 0.8475202713153247, 0.84726198333801117, 0.84700798487439988, 0.84675817463867775, 0.84651238613211321, 0.84627063768491595, 0.84603280993833896, 0.84579883300649417, 0.84556863003229732, 0.84534211356611799, 0.84511920336899038, 0.84489982279490938, 0.84468377306992015, 0.84447095233647151, 0.84426149812187212, 0.84405559964896826, 0.84385306063656762, 0.84365396125449776, 0.84345790644623908, 0.84326479489265405, 0.84307453696012746, 0.84288706123796631, 0.8427022959742555, 0.84252017115390265, 0.84234061456207177, 0.84216356160928918, 0.84198895292968789, 0.84181673186265349, 0.84164684590755723, 0.84147923595023777, 0.8413138476004226, 0.84115063255365174, 0.8409895455630545, 0.84083054428159665, 0.84067358842341477, 0.84051863905338364, 0.8403656581752259, 0.84021460855919117, 0.84006545370704999, 0.83991816451709866, 0.83977269980849656, 0.83962901694143854, 
0.83948709280829625, 0.83934686554397742, 0.83920829555927523, 0.839071350105973, 0.83893599865597379, 0.83880221230074248, 0.83866996314159414, 0.83853922614992127, 0.83840999589523302], 'Rp': [1.6650962751468002e-15, 0.7882267556563668, 0.81082005528312073, 0.84464755790440516, 0.87567154425592608, 0.89567605777854831, 0.911986474659361, 0.92304281921242981, 0.93245470293748689, 0.93857273971410227, 0.94411945168132871, 0.94835723957754248, 0.95195145202385845, 0.95508993701257561, 0.95790030948654403, 0.96112138648840095, 0.9655802028904521, 0.96976645853939103, 0.97255329793700007, 0.97450727939084725, 0.97599219525555536, 0.97717828980609389, 0.97815684773681111, 0.97897956277920639, 0.97968676041250824, 0.98030924782559825, 0.98086951941481348, 0.98137679317533166, 0.98184100134932395, 0.98227194231377801, 0.98267497979445817, 0.98305735022283247, 0.98342179300312593, 0.98376909949134639, 0.9841001154676412, 0.98441616710411484, 0.9847182706846539, 0.98500512484919034, 0.98527698390593077, 0.98553408169324486, 0.98577449886363677, 0.98599917346746624, 0.98620932631755143, 0.98640550554917072, 0.9865868263181865, 0.9867536070100219, 0.98690681008646364, 0.98704761771464311, 0.98717649401586283, 0.98729477309369285, 0.98740323712604328, 0.98750289398249202, 0.98759388723536434, 0.9876777408145121, 0.98775516856171364, 0.98782635223676929, 0.98789172050533436, 0.98795168375928677, 0.98800700316719514, 0.98805814337109588, 0.9881055964379204, 0.98814992588862671, 0.98819142608497657, 0.98823040016646768, 0.98826719659451212, 0.98830201978397603, 0.98833499476344389, 0.98836627741257366, 0.98839598124796169, 0.98842416526715737, 0.98845086882148236, 0.98847619957360466, 0.98850027249993688, 0.98852321409437904, 0.98854514246087655, 0.98856613598959342, 0.9885862900991409, 0.98860570614489096, 0.98862437069829079, 0.98864237583615411, 0.98865978949058841, 0.98867666989755565, 0.98869300426966844, 0.9887088025123586, 0.98872408501692555, 0.98873888627440343, 
0.98875323786819735, 0.98876716833191103, 0.98878070763457426, 0.98879385628954841, 0.98880659388404946, 0.9888189381784126, 0.98883091673148737, 0.98884254993184817, 0.9888538545464316, 0.98886484435794753, 0.98887553158327124, 0.98888588614974082, 0.98889585916406963, 0.98890551656138648, 0.98891488565733354, 0.98892398781654889, 0.98893283506035368, 0.98894144940568662, 0.9889498420376277, 0.98895801680383988, 0.98896598275884429, 0.98897376288288286, 0.98898135704831713, 0.98898876886754861, 0.98899600495164453, 0.98900306515083236, 0.98900994598910519, 0.98901666055021531, 0.98902321701864448, 0.98902962089292512, 0.98903587709749796, 0.98904199029598217, 0.98904796555866281, 0.98905380683161292, 0.98905950874391446, 0.98906507915584196, 0.98907052519077776, 0.98907585069839266, 0.989081061329811, 0.98908616169840469, 0.98909115619310806, 0.98909605100892106, 0.98910084784303032, 0.98910554868545331, 0.9891101563168726, 0.98911467358897287, 0.98911910312636953, 0.98912344735955438, 0.98912770868844768, 0.98913189183769146, 0.98913599967536969, 0.98914003338198853, 0.98914399398336661, 0.98914788290824585, 0.98915170287281684, 0.98915545495386625, 0.98915914071032474, 0.98916276166326567, 0.98916631935387, 0.98916981549582716, 0.98917325157574698, 0.98917662884985613, 0.9891799512710493, 0.98918321660701081, 0.98918642658930078, 0.98918958320466677, 0.98919268791997528, 0.98919574438331481, 0.98919874928050555, 0.98920170497979909, 0.98920461252591718, 0.98920747295473088, 0.98921028736744432, 0.9892130567333659, 0.9892157819823445, 0.98921846634463872, 0.98922111229721765, 0.98922371515115415, 0.98922627523133499, 0.98922879624153026, 0.98923127856514548, 0.98923372177022806, 0.98923612780812831, 0.98923849797795116, 0.98924083352487735, 0.98924313552631349, 0.98924540519104887, 0.98924764297266665, 0.98924984937654337, 0.98925202506649701, 0.98925417065134291, 0.98925628688189982, 0.98925837456179899, 0.98926043422646315, 0.98926246637546478, 
0.98926447159299391, 0.98926645045616168, 0.98926840351631695, 0.98927033129934649, 0.9892722343101118, 0.98927411303670243, 0.98927596795333295, 0.98927779952148565, 0.98927960946775573, 0.98928139939093485, 0.98928316617733236, 0.98928491243112571, 0.9892866382350306, 0.98928834382235475, 0.98929002950129463, 0.98929169564596531, 0.98929334257207657, 0.98929497053647408, 0.98929658161386502]}
'''

# === src/pyproject2sphinx/__init__.py | ciresnave/pyproject2sphinx | MIT ===
"""Pyproject2Sphinx."""

# === moneta/repository/migrations/0005_auto_20171104_1300.py | pombredanne/Moneta | CECILL-B ===
# -*- coding: utf-8 -*-
# Generated by Django 1.11.2 on 2017-11-04 12:00
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('repository', '0004_auto_20171104_1257'),
    ]

    operations = [
migrations.AlterField(
model_name='archivestate',
name='author',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL, verbose_name='Author'),
),
migrations.AlterField(
model_name='archivestate',
name='creation',
field=models.DateTimeField(auto_now_add=True, db_index=True, verbose_name='Creation date'),
),
migrations.AlterField(
model_name='archivestate',
name='modification',
field=models.DateTimeField(auto_now=True, db_index=True, verbose_name='Modification date'),
),
migrations.AlterField(
model_name='archivestate',
name='name',
field=models.CharField(db_index=True, max_length=100, verbose_name='Name'),
),
migrations.AlterField(
model_name='archivestate',
name='repository',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='repository.Repository', verbose_name='Repository'),
),
migrations.AlterField(
model_name='archivestate',
name='slug',
field=models.SlugField(max_length=100, verbose_name='Slug'),
),
migrations.AlterField(
model_name='element',
name='author',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL, verbose_name='Author'),
),
migrations.AlterField(
model_name='element',
name='creation',
field=models.DateTimeField(auto_now_add=True, db_index=True, verbose_name='Creation date'),
),
migrations.AlterField(
model_name='element',
name='extension',
field=models.CharField(blank=True, db_index=True, default='', help_text='Automatically generated on creation', max_length=20, verbose_name='File extension'),
),
migrations.AlterField(
model_name='element',
name='filename',
field=models.CharField(blank=True, db_index=True, default='', help_text='Automatically generated on creation', max_length=255, verbose_name='filename'),
),
migrations.AlterField(
model_name='element',
name='filesize',
field=models.IntegerField(blank=True, db_index=True, default=0, help_text='Automatically generated on creation', verbose_name='File size'),
),
migrations.AlterField(
model_name='element',
name='full_name',
field=models.CharField(blank=True, db_index=True, max_length=255, verbose_name='Complete name'),
),
migrations.AlterField(
model_name='element',
name='full_name_normalized',
field=models.CharField(blank=True, db_index=True, help_text='Complete name without special chars nor accents', max_length=255, verbose_name='Normalized complete name'),
),
migrations.AlterField(
model_name='element',
name='long_description',
field=models.TextField(blank=True, verbose_name='Long description'),
),
migrations.AlterField(
model_name='element',
name='md5',
field=models.CharField(blank=True, db_index=True, default='', help_text='Automatically generated on creation', max_length=120, verbose_name='MD5 sum'),
),
migrations.AlterField(
model_name='element',
name='modification',
field=models.DateTimeField(auto_now=True, db_index=True, verbose_name='Modification date'),
),
migrations.AlterField(
model_name='element',
name='name',
field=models.CharField(db_index=True, max_length=100, verbose_name='Name'),
),
migrations.AlterField(
model_name='element',
name='official_link',
field=models.URLField(blank=True, max_length=255, verbose_name='URL to the web page'),
),
migrations.AlterField(
model_name='element',
name='repository',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='repository.Repository', verbose_name='Repository'),
),
migrations.AlterField(
model_name='element',
name='sha1',
field=models.CharField(blank=True, db_index=True, default='', help_text='Automatically generated on creation', max_length=120, verbose_name='SHA1 sum'),
),
migrations.AlterField(
model_name='element',
name='sha256',
field=models.CharField(blank=True, db_index=True, default='', help_text='Automatically generated on creation', max_length=120, verbose_name='SHA256 sum'),
),
migrations.AlterField(
model_name='element',
name='short_description',
field=models.CharField(blank=True, max_length=500, verbose_name='Short description'),
),
migrations.AlterField(
model_name='element',
name='slug',
field=models.SlugField(max_length=100, verbose_name='Slug'),
),
migrations.AlterField(
model_name='element',
name='states',
field=models.ManyToManyField(db_index=True, to='repository.ArchiveState', verbose_name='Archive states'),
),
migrations.AlterField(
model_name='elementsignature',
name='creation',
field=models.DateTimeField(auto_now_add=True, db_index=True, verbose_name='Creation date'),
),
migrations.AlterField(
model_name='elementsignature',
name='element',
field=models.ForeignKey(default=0, on_delete=django.db.models.deletion.CASCADE, to='repository.Element', verbose_name='Element'),
),
migrations.AlterField(
model_name='elementsignature',
name='method',
field=models.CharField(choices=[('gpg', 'GnuPG'), ('x509', 'x509 (openSSL/libreSSL)')], db_index=True, max_length=10, verbose_name='signature method'),
),
migrations.AlterField(
model_name='repository',
name='admin_group',
field=models.ManyToManyField(blank=True, db_index=True, related_name='repository_admin', to='auth.Group', verbose_name='Admin groups'),
),
migrations.AlterField(
model_name='repository',
name='archive_type',
field=models.CharField(db_index=True, max_length=100, verbose_name='Repository type'),
),
migrations.AlterField(
model_name='repository',
name='author',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL, verbose_name='Author'),
),
migrations.AlterField(
model_name='repository',
name='creation',
field=models.DateTimeField(auto_now_add=True, db_index=True, verbose_name='Creation date'),
),
migrations.AlterField(
model_name='repository',
name='is_private',
field=models.BooleanField(db_index=True, default=False, verbose_name='Should readers be authenticated?'),
),
migrations.AlterField(
model_name='repository',
name='modification',
field=models.DateTimeField(auto_now=True, db_index=True, verbose_name='Modification date'),
),
migrations.AlterField(
model_name='repository',
name='name',
field=models.CharField(db_index=True, max_length=100, verbose_name='Name'),
),
migrations.AlterField(
model_name='repository',
name='on_index',
field=models.BooleanField(db_index=True, default=True, verbose_name='Display on public index?'),
),
migrations.AlterField(
model_name='repository',
name='reader_group',
field=models.ManyToManyField(blank=True, db_index=True, related_name='repository_reader', to='auth.Group', verbose_name='Readers groups'),
),
migrations.AlterField(
model_name='repository',
name='slug',
field=models.SlugField(max_length=100, verbose_name='Slug'),
),
]

# === lab_safety_web/user/views.py | Spico197/lab-safety-iot | MIT ===
import hashlib
import datetime
# import pyecharts
from django.http import HttpResponseNotFound
from django.shortcuts import render, redirect, render_to_response
from django.template import RequestContext
from django.core.paginator import Paginator
from pyecharts import Line
from user.models import *
from device.models import *
from data.models import *
# pyecharts.configure(
# global_theme='roma'
# )
def sha256(password):
return hashlib.sha256(password.encode()).hexdigest()
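Note that `sha256` above is a single unsalted hash, so identical passwords always produce identical digests; for production Django apps the salted `make_password`/`check_password` helpers are the usual choice. The helper's observable behaviour, sketched with only `hashlib` (the literal password is illustrative):

```python
import hashlib


def sha256_digest(password):
    # same construction as the view helper: utf-8 encode, hash, hex-encode
    return hashlib.sha256(password.encode()).hexdigest()


digest = sha256_digest("example-password")
deterministic = sha256_digest("example-password") == digest  # True: no salt involved
```

The determinism is what makes the `users[0].password == password` comparison in `login()` work, and also what makes the scheme vulnerable to rainbow-table lookups.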
def page_not_found(request, exception, template_name="error-404.html"):
# response = render_to_response('error-404.html')
# response.status_code = 404
# return response
# return render(request, 'error-404.html')
return render(request, template_name, status=404)
def page_error(request):
return render(request, 'error-500.html', status=500)
def logout(request):
log = UserLogModel()
log.data_time = datetime.datetime.now()
log.phone_number = request.session.get('phone_number', '')
log.action = '0'
if request.META.get('HTTP_X_FORWARDED_FOR', ''):
ip = request.META['HTTP_X_FORWARDED_FOR']
else:
ip = request.META['REMOTE_ADDR']
log.ip = ip
log.save()
if request.session.get('phone_number', ''):
request.session.flush()
return redirect('/login/')
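The `HTTP_X_FORWARDED_FOR`/`REMOTE_ADDR` fallback in `logout()` is repeated verbatim in every branch of `login()` below; it can be factored into one helper. A sketch (`get_client_ip` is not part of the original module; it takes the `request.META` dict directly so it is testable without Django):

```python
def get_client_ip(meta):
    """Prefer the proxy-supplied X-Forwarded-For header, else the socket address.

    X-Forwarded-For may carry a comma-separated chain of addresses; the first
    entry is the original client.  Only trust the header when the app actually
    sits behind a proxy you control, since clients can forge it.
    """
    forwarded = meta.get('HTTP_X_FORWARDED_FOR', '')
    if forwarded:
        return forwarded.split(',')[0].strip()
    return meta.get('REMOTE_ADDR', '')
```

Each `log.ip = ip` block then collapses to `log.ip = get_client_ip(request.META)`.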
def login(request):
context = {
        'title': "欢迎来到实验室用电信息统计平台"  # "Welcome to the lab electricity usage statistics platform"
}
if request.method == "POST":
# print(request.POST)
phone_number = request.POST.get('phone_number', '0')
password = sha256(request.POST.get('password'))
# print(UserModel.objects.all())
users = UserModel.objects.filter(phone_number=phone_number)
# print(users)
if users:
if users[0].password == password:
                context['title'] = "登录成功"  # "Login successful"
request.session['phone_number'] = phone_number
user = UserModel.objects.get(phone_number=phone_number)
user.last_login_time = datetime.datetime.now()
user.save()
log = UserLogModel()
log.data_time = datetime.datetime.now()
log.phone_number = request.session.get('phone_number', '')
log.action = '1'
if request.META.get('HTTP_X_FORWARDED_FOR', ''):
ip = request.META['HTTP_X_FORWARDED_FOR']
else:
ip = request.META['REMOTE_ADDR']
log.ip = ip
log.save()
# print(request.POST.get('remember_me', 'off'))
if request.POST.get('remember_me', 'off') == 'on':
request.session.set_expiry(3600*24*7)
# return dashboard(request)
return redirect('/dashboard/')
else:
log = UserLogModel()
log.data_time = datetime.datetime.now()
log.phone_number = request.POST.get('phone_number', '')
log.action = '2'
if request.META.get('HTTP_X_FORWARDED_FOR', ''):
ip = request.META['HTTP_X_FORWARDED_FOR']
else:
ip = request.META['REMOTE_ADDR']
log.ip = ip
log.save()
                context['title'] = "密码错误"  # "Wrong password"
else:
log = UserLogModel()
log.data_time = datetime.datetime.now()
log.phone_number = request.POST.get('phone_number', '')
log.action = '2'
if request.META.get('HTTP_X_FORWARDED_FOR', ''):
ip = request.META['HTTP_X_FORWARDED_FOR']
else:
ip = request.META['REMOTE_ADDR']
log.ip = ip
log.save()
            context['title'] = "用户不存在"  # "User does not exist"
elif request.session.get('phone_number', ''):
return redirect('/dashboard/')
else:
        context['title'] = "欢迎来到实验室用电信息统计平台"  # "Welcome to the lab electricity usage statistics platform"
return render(request, 'login_staradmin.html', context=context)
def dashboard(request):
phone_number = request.session.get('phone_number', '')
if phone_number == '':
return redirect('/')
user = UserModel.objects.get(phone_number=phone_number)
time_delta = datetime.datetime.now() - datetime.datetime(2019, 1, 1)
if user.is_admin:
data = DataModel.objects.order_by('data_time')[0:51]
user_number = UserModel.objects.count()
device_number = DeviceModel.objects.count()
data_number = DataModel.objects.count()
if user.device_id:
device = DeviceModel.objects.get(device_id=user.device_id)
status = device.status
else:
            device = '管理员设备'  # "administrator device"
status = True
else:
user_number = UserModel.objects.filter(phone_number=phone_number).count()
device_number = DeviceModel.objects.filter(device_id=user.device_id).count()
data_number = DataModel.objects.filter(device_id=user.device_id).count()
device = DeviceModel.objects.get(device_id=user.device_id)
data = DataModel.objects.filter(device_id=device.device_id).order_by('data_time')[0:51]
status = device.status
status = 'offline' if status is False else 'online'
attr = [str(dat.data_time).split('.')[0] for dat in data]
current_values = [float(dat.current_value) for dat in data]
active_power_values = [float(dat.active_power_value) for dat in data]
total_active_power_values = [float(dat.total_active_power_value) for dat in data]
    line = Line("近50条数据", "电流、有功功率和有功总电量")  # "Last 50 data points" / "current, active power and total active energy"
    line.add("电流", attr, current_values, is_fill=True, mark_point=["max", "min"])  # current
    line.add("有功功率", attr, active_power_values, mark_point=["max", "min"])  # active power
    line.add("有功总电量", attr, total_active_power_values, mark_point=["max", "min"])  # total active energy
    context_data = {
        'name': user.name,
        'device_id': user.device_id,
        'device_status': status,
        'instruction': '若系统使用时遇到问题,请及时向系统管理员进行反馈。谢谢大家的配合~',
        'user_number': user_number,
        'device_number': device_number,
        'running_days': time_delta.days,
        'data_number': data_number,
        'current_peak_value': max([dat.current_value for dat in data]) if len(data) > 0 else 0,
        'active_power_peak_value': max([dat.active_power_value for dat in data]) if len(data) > 0 else 0,
        'total_active_power_peak_value': max([dat.total_active_power_value for dat in data]) if len(data) > 0 else 0,
        'dashboard_chart': line.render_embed(),
    }
    return render(request, 'dashboard_homepage.html', context=context_data)
def user(request):
    phone_number = request.session.get('phone_number', '')
    if phone_number == '':
        return redirect('/')
    user = UserModel.objects.get(phone_number=phone_number)
    if user.is_admin:
        users = UserModel.objects.all()
        if user.device_id:
            device = DeviceModel.objects.get(device_id=user.device_id)
            status = device.status
        else:
            device = '管理员设备'
            status = True
    else:
        users = UserModel.objects.filter(phone_number=phone_number)
        device = DeviceModel.objects.get(device_id=user.device_id)
        status = device.status
    status = 'offline' if status is False else 'online'
    limit = 11
    paginator = Paginator(users, limit)
    page = request.GET.get('page', 1)
    loaded = paginator.page(page)
    context_data = {
        'name': user.name,
        # user.device_id is used here because `device` may be the
        # placeholder string '管理员设备' for an admin without a device.
        'device_id': user.device_id,
        'device_status': status,
        'users': loaded,
    }
    return render(request, 'user_base.html', context=context_data)
def log(request):
    phone_number = request.session.get('phone_number', '')
    if phone_number == '':
        return redirect('/')
    user = UserModel.objects.get(phone_number=phone_number)
    if user.is_admin:
        logs = UserLogModel.objects.order_by('-data_time')
        if user.device_id:
            device = DeviceModel.objects.get(device_id=user.device_id)
            status = device.status
        else:
            device = '管理员设备'
            status = True
    else:
        logs = UserLogModel.objects.filter(phone_number=user.phone_number).order_by('-data_time')
        device = DeviceModel.objects.get(device_id=user.device_id)
        status = device.status
    status = 'offline' if status is False else 'online'
    limit = 10
    paginator = Paginator(logs, limit)
    page = request.GET.get('page', 1)
    loaded = paginator.page(page)
    context_data = {
        'name': user.name,
        'device_id': user.device_id,
        'device_status': status,
        'logs': loaded,
    }
    return render(request, 'user_log.html', context=context_data)
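The page number handed to `paginator.page(page)` above comes straight from the query string, and Django's `Paginator.page` raises for non-numeric or out-of-range values. The slicing arithmetic itself is simple; here is a dependency-free sketch (the `paginate` helper is my own, not Django's API) that clamps bad input instead of raising:

```python
import math


def paginate(items, page, per_page):
    """Return (page_items, num_pages) for a 1-indexed page number.

    Mirrors the slicing Django's Paginator performs, but clamps
    non-numeric or out-of-range page values instead of raising.
    """
    num_pages = max(1, math.ceil(len(items) / per_page))
    try:
        page = int(page)
    except (TypeError, ValueError):
        page = 1
    page = min(max(page, 1), num_pages)
    start = (page - 1) * per_page
    return items[start:start + per_page], num_pages


print(paginate(list(range(25)), 2, 10))  # ([10, ..., 19], 3)
```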
def new(request):
    phone_number = request.session.get('phone_number', '')
    if phone_number == '':
        return redirect('/')
    user = UserModel.objects.get(phone_number=phone_number)
    if user.is_admin:
        if user.device_id:
            device = DeviceModel.objects.get(device_id=user.device_id)
            status = device.status
        else:
            device = '管理员设备'
            status = True
    else:
        device = DeviceModel.objects.get(device_id=user.device_id)
        status = device.status
    status = 'offline' if status is False else 'online'
    context_data = {
        'name': user.name,
        'device_id': user.device_id,
        'device_status': status,
        'return_instruction': '',
    }
    if request.method == "GET":
        return render(request, 'user_new.html', context=context_data)
    if not user.is_admin:
        context_data['return_instruction'] = "只有管理员才有权限新增用户!"
        return render(request, 'user_base.html', context=context_data)
    if request.POST.get('phone_number', '') == '':
        context_data['return_instruction'] = "请注意,电话号码不能为空"
        return render(request, 'user_base.html', context=context_data)
    try:
        device = DeviceModel.objects.get(device_id=request.POST.get('device_id', ''))
    except DeviceModel.DoesNotExist:
        context_data['return_instruction'] = "设备号不存在!"
        return render(request, 'user_base.html', context=context_data)
    user_new = UserModel()
    user_new.phone_number = request.POST.get('phone_number', '')
    user_new.device_id = request.POST.get('device_id', '')
    user_new.password = sha256(request.POST.get('password', ''))
    user_new.name = request.POST.get('name', '')
    user_new.class_number = request.POST.get('class_number', '')
    user_new.id_number = request.POST.get('id_number', '')
    user_new.is_admin = request.POST.get('is_admin', '') == '是'
    user_new.comment = request.POST.get('comment', '')
    user_new.date_joined = datetime.datetime.now()
    user_new.save()
    context_data['return_instruction'] = "保存成功"
    return render(request, 'user_new.html', context=context_data)
def edit(request):
    phone_number = request.session.get('phone_number', '')
    if phone_number == '':
        return redirect('/')
    user = UserModel.objects.get(phone_number=phone_number)
    if user.is_admin:
        if user.device_id:
            device = DeviceModel.objects.get(device_id=user.device_id)
            status = device.status
        else:
            device = '管理员设备'
            status = True
    else:
        device = DeviceModel.objects.get(device_id=user.device_id)
        status = device.status
    status = 'offline' if status is False else 'online'
    if request.method == "GET":
        edit_user = UserModel.objects.get(phone_number=request.GET.get('phone_number', ''))
        context_data = {
            'name': user.name,
            'device_id': user.device_id,
            'device_status': status,
            'return_instruction': '',
            'edit_phone_number': request.GET.get('phone_number', ''),
            'edit_device_id': edit_user.device_id,
            'edit_name': edit_user.name,
            'edit_class_number': edit_user.class_number,
            'edit_id_number': edit_user.id_number,
            'edit_comment': edit_user.comment,
        }
        if not user.is_admin:
            context_data['return_instruction'] = "只有管理员才有权限编辑用户!"
            return render(request, 'user_base.html', context=context_data)
        else:
            return render(request, 'user_edit.html', context=context_data)
    if request.POST:
        edit_user = UserModel.objects.get(phone_number=request.POST.get('edit_phone_number', ''))
        if request.POST.get('device_id', ''):
            try:
                DeviceModel.objects.get(device_id=request.POST.get('device_id', ''))
            except DeviceModel.DoesNotExist:
                context_data = {'return_instruction': "设备不存在!"}
                return render(request, 'user_edit.html', context=context_data)
            edit_user.device_id = request.POST.get('device_id', '')
        edit_user.name = request.POST.get('name', '')
        edit_user.class_number = request.POST.get('class_number', '')
        edit_user.id_number = request.POST.get('id_number', '')
        edit_user.is_admin = request.POST.get('is_admin', '') == '是'
        edit_user.comment = request.POST.get('comment', '')
        if request.POST.get('password', ''):
            edit_user.password = sha256(request.POST.get('password', ''))
        edit_user.save()
        context_data = {
            'name': user.name,
            'device_id': user.device_id,
            'device_status': status,
            # on POST the edited user's number arrives as 'edit_phone_number'
            'edit_phone_number': request.POST.get('edit_phone_number', ''),
            'edit_device_id': edit_user.device_id,
            'edit_name': edit_user.name,
            'edit_class_number': edit_user.class_number,
            'edit_id_number': edit_user.id_number,
            'edit_comment': edit_user.comment,
            'return_instruction': "编辑成功",
        }
        return render(request, 'user_edit.html', context=context_data)
def delete(request):
    phone_number = request.session.get('phone_number', '')
    if phone_number == '':
        return redirect('/')
    user = UserModel.objects.get(phone_number=phone_number)
    if user.is_admin:
        if user.device_id:
            device = DeviceModel.objects.get(device_id=user.device_id)
            status = device.status
        else:
            device = '管理员设备'
            status = True
    else:
        device = DeviceModel.objects.get(device_id=user.device_id)
        status = device.status
    status = 'offline' if status is False else 'online'
    context_data = {
        'name': user.name,
        'device_id': user.device_id,
        'device_status': status,
        'return_instruction': '',
    }
    if not user.is_admin:
        context_data['return_instruction'] = "只有管理员才有权限删除用户!"
        return render(request, 'user_base.html', context=context_data)
    if request.GET.get('phone_number', '') == '':
        context_data['return_instruction'] = "请注意,手机号不能为空"
        return render(request, 'user_base.html', context=context_data)
    UserModel.objects.filter(phone_number=request.GET.get('phone_number', '')).delete()
    context_data['return_instruction'] = "删除成功"
    return redirect('/user/user/')
def search(request):
    phone_number = request.session.get('phone_number', '')
    if phone_number == '':
        return redirect('/')
    user = UserModel.objects.get(phone_number=phone_number)
    if user.is_admin:
        users = UserModel.objects.filter(phone_number=request.GET.get('phone_number', ''))
        if user.device_id:
            device = DeviceModel.objects.get(device_id=user.device_id)
            status = device.status
        else:
            device = '管理员设备'
            status = True
    else:
        device = DeviceModel.objects.get(device_id=user.device_id)
        status = device.status
        if request.GET.get('phone_number', '') != user.phone_number:
            users = []
        else:
            users = UserModel.objects.filter(phone_number=user.phone_number)
    status = 'offline' if status is False else 'online'
    limit = 11
    paginator = Paginator(users, limit)
    page = request.GET.get('page', 1)
    loaded = paginator.page(page)
    context_data = {
        'name': user.name,
        'device_id': user.device_id,
        'device_status': status,
        'users': loaded,
    }
    return render(request, 'user_base.html', context=context_data)
def log_search(request):
    req_action = request.GET.get('action', '')
    req_phone_number = request.GET.get('phone_number', '')
    if req_phone_number == '':
        return redirect('/')
    filter_terms = {}
    if req_action:
        filter_terms['action'] = req_action
    if req_phone_number:
        filter_terms['phone_number'] = req_phone_number
    phone_number = request.session.get('phone_number', '')
    user = UserModel.objects.get(phone_number=phone_number)
    if user.is_admin:
        logs = UserLogModel.objects.filter(**filter_terms).order_by('-data_time')
        if user.device_id:
            device = DeviceModel.objects.get(device_id=user.device_id)
            status = device.status
        else:
            device = '管理员设备'
            status = True
    else:
        logs = UserLogModel.objects.filter(phone_number=user.phone_number).order_by('-data_time')
        device = DeviceModel.objects.get(device_id=user.device_id)
        status = device.status
        if req_phone_number != user.phone_number:
            logs = []
            context_data = {
                'name': user.name,
                'device_id': user.device_id,
                'device_status': status,
                'logs': logs,
                'return_instruction': '您无权查看其它用户的日志',
            }
            return render(request, 'user_log.html', context=context_data)
    status = 'offline' if status is False else 'online'
    limit = 10
    paginator = Paginator(logs, limit)
    page = request.GET.get('page', 1)
    loaded = paginator.page(page)
    context_data = {
        'name': user.name,
        'device_id': user.device_id,
        'device_status': status,
        'logs': loaded,
    }
    return render(request, 'user_log.html', context=context_data)
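log_search builds its queryset filter dynamically: a key goes into `filter_terms` only when the corresponding query parameter arrived non-empty, and the dict is then splatted into `filter(**filter_terms)`. A dependency-free sketch of the same pattern (plain functions over dict records instead of a Django manager; the names are mine):

```python
def build_filter_terms(params):
    """Keep only the allowed keys that arrived non-empty, mirroring the
    filter_terms construction in log_search()."""
    allowed = ('action', 'phone_number')
    return {k: params[k] for k in allowed if params.get(k)}


def filter_logs(logs, **terms):
    """Apply the collected terms as equality filters, like filter(**terms)."""
    return [log for log in logs
            if all(log.get(k) == v for k, v in terms.items())]


print(build_filter_terms({'action': '2', 'phone_number': '', 'page': '1'}))  # {'action': '2'}
```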
# malaya_speech/train/model/revsic_multispeakerglowtts/__init__.py
# (from ishine/malaya-speech, MIT license)
from .model import Model
from .config import Config
# unbuffered.py (from kalkun/segmentor, BSD-3-Clause license)
# http://stackoverflow.com/questions/107705/disable-output-buffering
class Unbuffered(object):
    def __init__(self, stream):
        self.stream = stream

    def write(self, data):
        self.stream.write(data)
        self.stream.flush()

    def __getattr__(self, attr):
        return getattr(self.stream, attr)
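A self-contained demo of the Unbuffered wrapper above: every `write()` is followed by an immediate `flush()`, so output never sits in a buffer, and any other attribute access is delegated to the wrapped stream. Here an `io.StringIO` stands in for `sys.stdout`:

```python
import io


class Unbuffered:
    def __init__(self, stream):
        self.stream = stream

    def write(self, data):
        self.stream.write(data)
        self.stream.flush()

    def __getattr__(self, attr):
        # delegate everything else (getvalue, encoding, ...) to the stream
        return getattr(self.stream, attr)


buf = io.StringIO()
out = Unbuffered(buf)
print("hello", file=out)  # each write() is flushed immediately
assert buf.getvalue() == "hello\n"
```

The usual deployment is `sys.stdout = Unbuffered(sys.stdout)` near program start.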
# pymkv/__init__.py (from GoldbergData/pymkv, MIT license)
from .MKVAttachment import *
from .MKVTrack import *
from .MKVFile import *
from .Timestamp import *
from .Verifications import *
__version__ = '1.0.1'
# file_separator/__init__.py (from yoshikawat64m/python-packages, MIT license)
from .src import separate_by_class
__all__ = ('separate_by_class', )
# mean deviation method and standard deviation.py
# (from rashant/Engineering-Maths-Algorithms, MIT license)
import math
import pandas
print('***************************************************************************************************************************************')
print(' Mean deviation ')
print('***************************************************************************************************************************************')
x = [5, 15, 25, 35, 45, 55, 65, 75]
f = [5, 10, 20, 40, 30, 20, 10, 5]
xf = []
for i in range(len(x)):
    xf.append(x[i] * f[i])
sigmaxf = sum(xf)
sigmaf = sum(f)
mean = math.floor(sigmaxf / sigmaf)
x_m = []
for i in range(len(x)):
    x_m.append(abs(x[i] - mean))
fxm = []
for i in range(len(x)):
    fxm.append(x_m[i] * f[i])
fxm_sum = sum(fxm)
deviation = round(fxm_sum / sigmaf, 2)
df = pandas.DataFrame({'X': x, 'f': f, 'xf': xf, f'|x-{mean}|': x_m, f'f|x-{mean}|': fxm})
print(df)
print('sigmaf= ', sigmaf)
print('sigmaxf= ', sigmaxf)
print('sigma fx-mean= ', fxm_sum)
print('mean= ', mean)
print('deviation= ', deviation)
print()
print('***************************************************************************************************************************************')
print(' standard deviation ')
print('***************************************************************************************************************************************')
xf = []
for i in range(len(x)):
    xf.append(x[i] * f[i])
sigmaxf = sum(xf)
sigmaf = sum(f)
mean = round(sigmaxf / sigmaf, 2)
x_m = []
for i in range(len(x)):
    x_m.append((x[i] - mean) ** 2)
fxm = []
for i in range(len(x)):
    fxm.append(x_m[i] * f[i])
fxm_sum = sum(fxm)
deviation = round((fxm_sum / sigmaf) ** 0.5, 2)
df = pandas.DataFrame({'X': x, 'f': f, 'xf': xf, f'|x-{mean}|': x_m, f'f(x-{mean})^2': fxm})
print(df)
print('sigmaf= ', sigmaf)
print('sigmaxf= ', sigmaxf)
print('sigma fx-mean square= ', fxm_sum)
print('mean= ', mean)
print('deviation= ', deviation)
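The loop-and-column construction above implements the grouped-data mean deviation MD = sum(f_i * |x_i - mean|) / sum(f_i), with the mean floored as in the script. The same figure can be cross-checked with a compact helper (my own sketch, not part of the original script):

```python
import math


def grouped_mean_deviation(x, f):
    """Mean deviation about the floored mean for grouped data:
    mean = floor(sum(f_i * x_i) / sum(f_i)),
    MD   = sum(f_i * |x_i - mean|) / sum(f_i), rounded to 2 places."""
    sigmaf = sum(f)
    mean = math.floor(sum(xi * fi for xi, fi in zip(x, f)) / sigmaf)
    return round(sum(fi * abs(xi - mean) for xi, fi in zip(x, f)) / sigmaf, 2)


print(grouped_mean_deviation([5, 15, 25, 35, 45, 55, 65, 75],
                             [5, 10, 20, 40, 30, 20, 10, 5]))  # 12.79
```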
# user_login.py (from beppern389/football, Apache-2.0 license)
# My new code...
# new code
# main.py (from StefanSchmelz/whoGetsTheMillion, MIT license)
from engine.Collection import Collection
from engine.Loader import Loader
from surface.Game import GameUi
g = GameUi()
# gym_arc/envs/__init__.py (from MitrofanovDmitry/gym-arc, MIT license)
# from gym_arc.envs.env_arc import ARCEnvironment
# CODEDABBLE API/stringComputeApi/api/models.py
# (from xiphoidProcess1457/CodeDabble, MIT license)
# from django.db import models
# Create your models here.
# class Courses(models.Model):
# title = models.CharField(max_length=70, blank=False, default='')
# description = models.CharField(max_length=200,blank=False, default='')
# unit_tests.py (from SanjoSolutions/gui_interaction, Unlicense)
from main import are_bounding_boxes_nearby
print(are_bounding_boxes_nearby((0, 0, 2, 2), (1, 0, 2, 2)))
print(are_bounding_boxes_nearby((0, 0, 2, 2), (1, 1, 1, 1)))
print(are_bounding_boxes_nearby((1, 1, 1, 1), (0, 0, 2, 2)))
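The real `are_bounding_boxes_nearby` lives in `main` and is not shown here. For illustration only, a plausible re-implementation, assuming boxes are `(x, y, width, height)` tuples and "nearby" is read as "the boxes overlap or touch":

```python
def are_bounding_boxes_nearby(a, b):
    """Hypothetical sketch: True when boxes (x, y, w, h) overlap or touch."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return (ax <= bx + bw and bx <= ax + aw and
            ay <= by + bh and by <= ay + ah)


print(are_bounding_boxes_nearby((0, 0, 2, 2), (1, 0, 2, 2)))  # True
```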
# Exercise_File_B-Common_Python_Data_Structures.py
# (from ilankham/wuss-2019-half-day-class, MIT license)
# Everything is better with friends: Executing SAS® code in Python scripts with
# SASPy, and turbocharging your SAS programming with open-source tooling
#
# Half-day class, Western Users of SAS Software (WUSS) 2019
###############################################################################
# Exercises 3-5: Common Python Data Structures #
###############################################################################
# Lines 12-13 load modules needed for exercises and should be left as-is
from class_setup import print_with_title
from pandas import DataFrame
###############################################################################
# #
# Exercise 3. [Python] Define a list object #
# #
# Instructions: Uncomment the code immediately below, and then execute #
# #
###############################################################################
# hello_world_list = ['Hello', 'list']
# print_with_title(hello_world_list, 'The value of hello_world_list:')
# print_with_title(type(hello_world_list), 'The type of hello_world_list:')
# Notes:
#
# 1. A list object named hello_world_list with two values is created, and the
# following are printed:
# * the value of the list
# * its type (which is <class 'list'>)
#
# 2. Lists are the most fundamental Python data structure and are related to
# SAS data-step arrays. Values in lists are always kept in insertion order,
# meaning the order they appear in the list's definition, and they can be
# individually accessed using numerical indexes within bracket notation:
# * hello_world_list[0] returns 'Hello'
# * hello_world_list[1] returns 'list'
#
# 3. This example illustrates another way Python syntax differs from SAS:
# * The left-most element of a list is always at index 0. Unlike SAS,
# customized indexing is only available for more sophisticated data
# structures in Python (e.g., a dictionary, as in the next example).
#
# 4. For additional practice, try any or all of the following:
# * Print out the initial element of the list.
# * Print out the final element of the list.
# * Create a list of length five, and print its middle elements.
###############################################################################
# #
# Exercise 4. [Python] Define a dict object #
# #
# Instructions: Uncomment the code immediately below, and then execute #
# #
###############################################################################
# hello_world_dict = {
# 'salutation' : ['Hello' , 'dict'],
# 'valediction' : ['Goodbye' , 'list'],
# 'part of speech' : ['interjection', 'noun'],
# }
# print_with_title(hello_world_dict, 'The value of hello_world_dict:')
# print_with_title(type(hello_world_dict), 'The type of hello_world_dict:')
# Notes:
# 1. A dictionary (dict for short) object named hello_world_dict with three
# key-value pairs is created, and the following are printed:
# * the value of the dictionary
# * its type (which is <class 'dict'>)
#
# 2. Dictionaries are another fundamental Python data structure and are related
# to SAS formats and data-step hash tables. Dictionaries are more generally
# called associative arrays or maps because they map keys (appearing before
# the colons) to values (appearing after the colons). In other words, the
# value associated with each key can be accessed using bracket notation:
# * hello_world_dict['salutation'] returns ['Hello', 'dict']
# * hello_world_dict['valediction'] returns ['Goodbye', 'list']
# * hello_world_dict['part of speech'] returns ['interjection', 'noun']
#
# 3. Whenever indexable data structures are nested in Python, indexing methods
# can be combined. E.g., hello_world_dict['salutation'][0] is the same as
# ['Hello', 'dict'][0], which returns 'Hello'.
#
# 4. When using older versions of Python, the print order of key-value pairs
# may not match insertion order, meaning the order key-value pairs are
# listed when the dictionary is created. However, as of Python 3.7 (released
# in June 2018), insertion order is preserved.
#
# 5. For additional practice, try any or all of the following:
# * Print out the list with key 'salutation'.
# * Print out the initial element in the list associated with key
# 'valediction'.
# * Print out the final element in the list associated with key 'part of
# speech'.
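For reference, the "additional practice" items in the notes above reduce to plain bracket indexing. A quick sketch using the same hello_world_dict defined in Exercise 4:

```python
hello_world_dict = {
    'salutation': ['Hello', 'dict'],
    'valediction': ['Goodbye', 'list'],
    'part of speech': ['interjection', 'noun'],
}

print(hello_world_dict['salutation'])          # the whole list for that key
print(hello_world_dict['valediction'][0])      # 'Goodbye'
print(hello_world_dict['part of speech'][-1])  # 'noun' (negative index = from the end)
```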
###############################################################################
# #
# Exercise 5. [Python w/ pandas] Define a DataFrame object #
# #
# Instructions: Uncomment the code immediately below, and then execute #
# #
###############################################################################
# hello_world_df = DataFrame(
# {
# 'salutation': ['Hello' , 'DataFrame'],
# 'valediction': ['Goodbye' , 'dict'],
# 'part of speech': ['exclamation', 'noun'],
# }
# )
# print_with_title(hello_world_df, 'The value of hello_world_df:')
# print_with_title(hello_world_df.shape, 'The shape of hello_world_df:')
# hello_world_df.info()
# print_with_title('', linebreaks_before=0, linebreaks_after=0)
# Notes:
#
# 1. A DataFrame (df for short) object named hello_world_df with dimensions 2x3
# (2 rows by 3 columns) is created, and the following are printed:
# * the value of the DataFrame
# * the number of rows and columns in hello_world_df
# * some information about it, which is obtained by hello_world_df calling
# its info method (meaning a function whose definition is nested in it)
#
# 2. Since DataFrames are not built into Python, we had to import their
# definition from the pandas module at the beginning of this file. Like
# their R counterpart, DataFrames are two-dimensional arrays of values that
# can be thought of like SAS datasets. However, while SAS datasets are
# typically only accessed from disk and processed row-by-row, DataFrames are
# loaded into memory all at once. This means values in DataFrames can be
# randomly accessed, but it also means the size of DataFrames can't grow
# beyond available memory.
#
# 3. The dimensions of the DataFrame are determined as follows:
# * The keys 'salutation', 'valediction', and 'part of speech' of the
# dictionary passed to the DataFrame constructor function become column
# labels.
#    * Because each key maps to a list of length two, each column will be
#      two elements tall (with an error occurring if the lists are of
#      non-uniform length).
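# The dimension rules in note 3 can be sketched as follows (a minimal
# example; the throwaway keys 'a' and 'b' are purely illustrative):

```python
from pandas import DataFrame

# Each key becomes a column label, and each two-element list becomes a
# two-row column, so the result is 2 rows by 3 columns:
df = DataFrame(
    {
        'salutation': ['Hello', 'DataFrame'],
        'valediction': ['Goodbye', 'dict'],
        'part of speech': ['exclamation', 'noun'],
    }
)
print(df.shape)          # (2, 3)
print(list(df.columns))  # ['salutation', 'valediction', 'part of speech']

# Lists of differing lengths are rejected by the constructor:
try:
    DataFrame({'a': [1, 2], 'b': [1, 2, 3]})
except ValueError as err:
    print('ValueError:', err)
```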
#
# 4. This example gives one option for building a DataFrame, but the
# constructor function can also accept many other object types, including
# another DataFrame.
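# Two of the other accepted input types mentioned in note 4, sketched as a
# short example (the variable names here are illustrative):

```python
from pandas import DataFrame

original = DataFrame({'salutation': ['Hello', 'DataFrame']})

# Passing an existing DataFrame to the constructor yields a new
# DataFrame with the same rows and columns:
from_df = DataFrame(original)
print(from_df.equals(original))  # True

# A list of dictionaries also works; each dictionary becomes one row:
from_rows = DataFrame([
    {'salutation': 'Hello', 'valediction': 'Goodbye'},
    {'salutation': 'DataFrame', 'valediction': 'dict'},
])
print(from_rows.shape)  # (2, 2)
```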
#
# 5. For additional practice, try any or all of the following (keeping in mind
# that DataFrames can be indexed like dictionaries):
# * Print out the column with key 'salutation'.
# * Print out the initial element in the column with key 'valediction'.
# * Print out the final element in the column with key 'part of speech'.
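# One possible set of answers to the practice items above (a sketch;
# hello_world_df is re-created here so the snippet is self-contained):

```python
from pandas import DataFrame

hello_world_df = DataFrame(
    {
        'salutation': ['Hello', 'DataFrame'],
        'valediction': ['Goodbye', 'dict'],
        'part of speech': ['exclamation', 'noun'],
    }
)

# Dictionary-style indexing with a column label returns the whole column:
print(hello_world_df['salutation'])

# Chaining with .iloc then selects individual elements by position:
print(hello_world_df['valediction'].iloc[0])      # Goodbye
print(hello_world_df['part of speech'].iloc[-1])  # noun
```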

# File: proximal/__init__.py (repo: kyleaj/ProxImaL, license: MIT)
__version__ = "0.1.7"
from .prox_fns import *
from .lin_ops import *
from .algorithms import *

# File: allauth/socialaccount/providers/stripe_express/urls.py (repo: Immortalin/django-allauth, license: MIT)
from allauth.socialaccount.providers.oauth2.urls import default_urlpatterns
from .provider import StripeExpressProvider
urlpatterns = default_urlpatterns(StripeExpressProvider)

# File: tests/test_slab.py (repo: dpguoming/pyserum, license: MIT)
"""Unit tests for market."""
import base64
from pyserum._layouts.slab import ORDER_BOOK_LAYOUT, SLAB_HEADER_LAYOUT, SLAB_LAYOUT, SLAB_NODE_LAYOUT
from pyserum.market._internal.slab import Slab
from .binary_file_path import ASK_ORDER_BIN_PATH
HEX_DATA = "0900000000000000020000000000000008000000000000000400000000000000010000001e00000000000040952fe4da5c1f3c860200000004000000030000000d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d7b0000000000000000000000000000000200000002000000000000a0ca17726dae0f1e43010000001111111111111111111111111111111111111111111111111111111111111111410100000000000000000000000000000200000001000000d20a3f4eeee073c3f60fe98e010000000d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d7b000000000000000000000000000000020000000300000000000040952fe4da5c1f3c8602000000131313131313131313131313131313131313131313131313131313131313131340e20100000000000000000000000000010000001f0000000500000000000000000000000000000005000000060000000d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d7b0000000000000000000000000000000200000004000000040000000000000000000000000000001717171717171717171717171717171717171717171717171717171717171717020000000000000000000000000000000100000020000000000000a0ca17726dae0f1e430100000001000000020000000d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d0d7b000000000000000000000000000000040000000000000004000000000000000000000000000000171717171717171717171717171717171717171717171717171717171717171702000000000000000000000000000000030000000700000005000000000000000000000000000000171717171717171717171717171717171717171717171717171717171717171702000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000" # noqa: E501 # pylint: disable=line-too-long
DATA = bytes.fromhex(HEX_DATA)
def test_parse_order_book():
"""Test order book parsing."""
with open(ASK_ORDER_BIN_PATH, "r") as input_file:
base64_res = input_file.read()
data = base64.decodebytes(base64_res.encode("ascii"))
res = ORDER_BOOK_LAYOUT.parse(data)
assert res.account_flags.initialized
assert not res.account_flags.market
assert res.account_flags.asks
assert res.slab_layout.header.bump_index == 239
assert res.slab_layout.header.free_list_length == 210
assert res.slab_layout.header.free_list_head == 174
assert res.slab_layout.header.root == 0
assert res.slab_layout.header.leaf_count == 15
assert len(res.slab_layout.nodes) == 239
assert res.slab_layout.nodes[0].tag == 1
assert res.slab_layout.nodes[0].node.prefix_len == 51
def test_parse_header():
"""Test parse slab headers."""
# We only parse the data for the header which is the first 32 bytes.
header = SLAB_HEADER_LAYOUT.parse(DATA[:32])
assert header.bump_index == 9
assert header.free_list_length == 2
assert header.free_list_head == 8
def test_parse_node():
"""Test the parsing logic for a SLAB node."""
# We only parse the data for the first node. The header is of length 32 bytes.
# And the slab node layout requires 72 bytes (4 bytes for tag and 68 bytes for node data).
slab_node = SLAB_NODE_LAYOUT.parse(DATA[32 : 32 + 72]) # noqa: E203
assert slab_node.tag == 1
assert slab_node.node.prefix_len == 30
assert slab_node.node.children == [4, 3]
def test_parse_slab():
slab = SLAB_LAYOUT.parse(DATA)
assert len(slab.nodes) == 9
assert slab.nodes[1].tag == 2
assert slab.nodes[1].node.fee_tier == 0
assert slab.nodes[1].node.quantity == 321
def test_slab_get():
slab = Slab.from_bytes(DATA)
assert slab.get(123456789012345678901234567890).owner_slot == 1
assert slab.get(100000000000000000000000000000).owner_slot == 2
assert slab.get(200000000000000000000000000000).owner_slot == 3
assert slab.get(4).owner_slot == 4
assert slab.get(0) is None
assert slab.get(3) is None
assert slab.get(5) is None
assert slab.get(6) is None
assert slab.get(200000000000000000000000000001) is None
assert slab.get(100000000000000000000000000001) is None
assert slab.get(123456789012345678901234567889) is None
assert slab.get(123456789012345678901234567891) is None
assert slab.get(99999999999999999999999999999) is None
def test_length_of_slab_iterator():
slab = Slab.from_bytes(DATA)
assert sum(1 for _ in slab.items()) == 4
def test_iterate_in_ascending_order():
slab = Slab.from_bytes(DATA)
prev = None
for node in slab.items():
curr_key = node.key
if prev:
assert curr_key > prev
prev = curr_key
def test_iterate_in_descending_order():
slab = Slab.from_bytes(DATA)
prev = None
for node in slab.items(descending=True):
curr_key = node.key
if prev:
assert curr_key < prev
prev = curr_key

# File: tests/never_starting_service.py (repo: butla/service-test-py, license: 0BSD)
import time
time.sleep(100)

# File: ex43_classes.py (repo: jpch89/lpthw, license: MIT)
# -*- coding: utf-8 -*-
# @Author: jpch89
# @Email: jpch89@outlook.com
# @Time: 2018/7/23 10:42
class Scene(object):
def enter(self):
pass
class Engine(object):
def __init__(self, scene_map):
pass
def play(self):
pass
class Death(Scene):
def enter(self):
pass
class CentralCorridor(Scene):
def enter(self):
pass
class LaserWeaponArmory(Scene):
def enter(self):
pass
class TheBridge(Scene):
def enter(self):
pass
class EscapePod(Scene):
def enter(self):
pass
class Map(object):
def __init__(self, start_scene):
pass
def next_scene(self, scene_name):
pass
def opening_scene(self):
pass
a_map = Map('central_corridor')
a_game = Engine(a_map)
a_game.play()

# File: 00_test02_devel.py (repo: rkjin/algorithm, license: Apache-2.0)
import os
ggg

# File: iridauploader/progress/exceptions/__init__.py (repo: COMBAT-SARS-COV-2/irida-uploader, license: Apache-2.0)
from iridauploader.progress.exceptions.directory_error import DirectoryError

# File: examples/setup.py (repo: ameroueh/performance, license: MIT)
from distutils.core import setup
from Cython.Build import cythonize
setup(ext_modules=cythonize("add_integers_cython.pyx"))

# File: python/testData/resolve/PreferInitForAttributes.py (repo: jnthn/intellij-community, license: Apache-2.0)
class Foo:
def __init__(self):
self.xyzzy = 1
def bar(self):
if self.xyzzy:
self.xyzzy = None
else:
self.xyzzy += 1
def baz(self):
print(self.xyzzy)
# <ref>

# File: checklib/checks.py (repo: cedadev/compliance-check-lib, license: BSD-3-Clause)
# Top-level location of all checks so that they can easily be located
from .register.nc_coords_checks_register import *
from .register.nc_var_checks_register import *
from .register.nc_file_checks_register import *
from .register.format_checks_register import *
from .register.file_checks_register import *
ALL_CHECKS = [check for check in dir() if check.endswith("Check")]

# File: biclustering/__init__.py (repo: lucasbrunialti/biclustering-experiments, license: BSD-2-Clause)
from biclustering import Bicluster, MSR, DeltaBiclustering
__all__ = ['Bicluster', 'MSR', 'DeltaBiclustering']

# File: config.py (repo: TeleUR/Funrobot, license: MIT)
token = '250324006:AAFDAxe4nVlgI3nFkUhVBWHf1xTo1bRwwpc ' # Add Your Token
is_sudo = '242361127' # add Your ID

# File: swig/single_test.py (repo: han-xie/code_notes, license: MIT)
import mod_cal
a = mod_cal.add(1, 2)
print(a)
mod_cal.cpp_print(str("hello world"))

# File: references/process/merge/tests/__init__.py (repo: arXiv/arxiv-references, license: MIT)
"""Tests for :mod:`references.process.merge`."""

# File: user_interface/ui_errors.py (repo: pablomodernell/lorawan_conformance_testing, license: MIT)
class UiErrors(Exception):
pass
class UiParsingError(UiErrors):
pass
class SessionConfigurationBodyError(UiErrors):
pass
class InputFieldError(UiErrors):
pass
class UnsupportedFieldTypeError(InputFieldError):
pass
class InputFormBody(UiErrors):
pass | 13.428571 | 49 | 0.758865 | 24 | 282 | 8.916667 | 0.375 | 0.21028 | 0.238318 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177305 | 282 | 21 | 50 | 13.428571 | 0.922414 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
304f0376421ca2f280a7051d987aa655b7c4cf25 | 41 | py | Python | install_words.py | tonybaloney/wntf | bacc2252d35447619600101103eb75469f51d532 | [
"0BSD"
] | 15 | 2016-05-06T07:08:55.000Z | 2022-03-02T16:05:49.000Z | install_words.py | tonybaloney/wntf | bacc2252d35447619600101103eb75469f51d532 | [
"0BSD"
] | null | null | null | install_words.py | tonybaloney/wntf | bacc2252d35447619600101103eb75469f51d532 | [
"0BSD"
] | 2 | 2016-05-20T05:15:52.000Z | 2016-09-01T00:34:08.000Z | from nltk import download
download('all') | 20.5 | 25 | 0.804878 | 6 | 41 | 5.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 41 | 2 | 26 | 20.5 | 0.891892 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
30550644bcf126a52c8d84ab92125451b64da02b | 185 | py | Python | protostar/commands/install/installation_exceptions.py | software-mansion/protostar | e9f701d3b02dde78e5292a4698ca3c6c7d39b485 | [
"MIT"
] | 11 | 2022-01-31T14:27:32.000Z | 2022-03-28T18:24:45.000Z | protostar/commands/install/installation_exceptions.py | software-mansion/protostar | e9f701d3b02dde78e5292a4698ca3c6c7d39b485 | [
"MIT"
] | 105 | 2022-01-31T15:25:29.000Z | 2022-03-31T12:28:13.000Z | protostar/commands/install/installation_exceptions.py | software-mansion/protostar | e9f701d3b02dde78e5292a4698ca3c6c7d39b485 | [
"MIT"
] | 1 | 2022-03-28T16:18:28.000Z | 2022-03-28T16:18:28.000Z | from protostar.protostar_exception import ProtostarException
class InstallationException(ProtostarException):
pass
class InvalidLocalRepository(InstallationException):
pass
| 18.5 | 60 | 0.843243 | 14 | 185 | 11.071429 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118919 | 185 | 9 | 61 | 20.555556 | 0.95092 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.4 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
306516d547761f3374cd1f74150ad9802477b076 | 142 | py | Python | script/exemplo.py | yujinishioka/computacional-thinking-python | 38abfc00d94c45cc5a7d4303e57cb8f0cab4272a | [
"MIT"
] | 1 | 2022-03-08T21:54:49.000Z | 2022-03-08T21:54:49.000Z | script/exemplo.py | yujinishioka/computacional-thinking-python | 38abfc00d94c45cc5a7d4303e57cb8f0cab4272a | [
"MIT"
] | null | null | null | script/exemplo.py | yujinishioka/computacional-thinking-python | 38abfc00d94c45cc5a7d4303e57cb8f0cab4272a | [
"MIT"
] | null | null | null | valor = input("Informe um valor: ")
numA = int(valor)
valor = input("Informe um valor: ")
numB = int(valor)
soma = numA + numB
print(soma)

# File: ical/exception.py (repo: ReanGD/py_calendar, license: Apache-2.0)
class ICalException(Exception):
pass

# File: dabl/tests/test_common.py (repo: adekunleba/dabl, license: BSD-3-Clause)
from sklearn.utils.estimator_checks import check_estimator
from dabl.preprocessing import EasyPreprocessor
import pytest
@pytest.mark.skip(reason="haven't implemented numpy array type checks yet")
def test_preprocessor():
return check_estimator(EasyPreprocessor)

# File: src/test/tinc/tinctest/test/discovery/mockstorage/uao/test_functional_uao.py (repo: rodel-talampas/gpdb, licenses: PostgreSQL, Apache-2.0)
import tinctest
class UAOFunctionalTests(tinctest.TINCTestCase):
def test_functional_uao1(self):
pass
def test_functional_uao2(self):
pass

# File: basic_database_connection/__init__.py (repo: ixmael/basic_database_connection, license: MIT)
from .local_connection import DatabaseLocalConnection
from .remote_connection import DatabaseRemoteConnection

# File: federated_aggregations/paillier/__init__.py (repo: tf-encrypted/federated-aggregations, license: Apache-2.0)
from .factory import local_paillier_executor_factory
230d6da29f62be5248c7d4d96cd83eee2104ed12 | 37 | py | Python | tests/__init__.py | haokui/octobus | 66ba4aaf24cc43cee0f1fec226df09a451b513c1 | [
"BSD-3-Clause"
] | 1 | 2021-04-30T09:32:51.000Z | 2021-04-30T09:32:51.000Z | tests/__init__.py | haokui/octobus | 66ba4aaf24cc43cee0f1fec226df09a451b513c1 | [
"BSD-3-Clause"
] | null | null | null | tests/__init__.py | haokui/octobus | 66ba4aaf24cc43cee0f1fec226df09a451b513c1 | [
"BSD-3-Clause"
] | null | null | null | """Unit test package for octobus."""
| 18.5 | 36 | 0.675676 | 5 | 37 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 37 | 1 | 37 | 37 | 0.78125 | 0.810811 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
23102b678df8e602e36604a0fcf3d519dd848f7e | 49 | py | Python | src/KinsectsTab/__init__.py | AndrewGrim/MonsterHunterWorldDatabase | a904647f5499926e46a64d884a2ffebe38dd5407 | [
"MIT"
] | 1 | 2020-02-17T00:16:01.000Z | 2020-02-17T00:16:01.000Z | src/KinsectsTab/__init__.py | AndrewGrim/MonsterHunterWorldDatabase | a904647f5499926e46a64d884a2ffebe38dd5407 | [
"MIT"
] | null | null | null | src/KinsectsTab/__init__.py | AndrewGrim/MonsterHunterWorldDatabase | a904647f5499926e46a64d884a2ffebe38dd5407 | [
"MIT"
] | 1 | 2020-06-26T06:54:00.000Z | 2020-06-26T06:54:00.000Z | from .Kinsect import *
from .KinsectsTab import * | 24.5 | 26 | 0.77551 | 6 | 49 | 6.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 49 | 2 | 26 | 24.5 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
23592c8e854ea9309275e23714ff4077cd23b568 | 568 | py | Python | wwwhisper_auth/email_re.py | wrr/wwwhisper | 38a55dd9c828fbb1b5a8234ea3ddf2242e684983 | [
"MIT"
] | 54 | 2015-01-19T23:49:39.000Z | 2021-02-18T01:14:51.000Z | wwwhisper_auth/email_re.py | wrr/wwwhisper | 38a55dd9c828fbb1b5a8234ea3ddf2242e684983 | [
"MIT"
] | 13 | 2015-01-26T14:51:10.000Z | 2020-11-10T04:15:36.000Z | wwwhisper_auth/email_re.py | wrr/wwwhisper | 38a55dd9c828fbb1b5a8234ea3ddf2242e684983 | [
"MIT"
] | 11 | 2015-07-25T02:13:12.000Z | 2021-07-10T14:11:46.000Z | """Regexp to validate emails, as used by BrowserID.
From node-validator, Copyright (c) 2010 Chris O'Hara:
https://github.com/chriso/node-validator/blob/master/lib/validators.js
https://github.com/chriso/node-validator/blob/master/LICENSE
"""
EMAIL_VALIDATION_RE = r"^(?:[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+\.)*[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+@(?:(?:(?:[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!\.)){0,61}[a-zA-Z0-9]?\.)+[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!$)){0,61}[a-zA-Z0-9]?)|(?:\[(?:(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\.){3}(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\]))$"
| 63.111111 | 319 | 0.498239 | 95 | 568 | 2.957895 | 0.484211 | 0.064057 | 0.106762 | 0.128114 | 0.548043 | 0.548043 | 0.548043 | 0.548043 | 0.241993 | 0.241993 | 0 | 0.082721 | 0.042254 | 568 | 8 | 320 | 71 | 0.433824 | 0.420775 | 0 | 0 | 0 | 1 | 0.913043 | 0.913043 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
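The `EMAIL_VALIDATION_RE` pattern in the row above can be exercised with a short sketch. The pattern is copied verbatim from the `wwwhisper_auth/email_re.py` content shown; the `is_valid_email` helper and its anchored use of `re.match` are illustrative assumptions, not part of the original file:

```python
import re

# Pattern copied from the wwwhisper_auth/email_re.py content above.
EMAIL_VALIDATION_RE = r"^(?:[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+\.)*[\w\!\#\$\%\&\'\*\+\-\/\=\?\^\`\{\|\}\~]+@(?:(?:(?:[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!\.)){0,61}[a-zA-Z0-9]?\.)+[a-zA-Z0-9](?:[a-zA-Z0-9\-](?!$)){0,61}[a-zA-Z0-9]?)|(?:\[(?:(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\.){3}(?:[01]?\d{1,2}|2[0-4]\d|25[0-5])\]))$"

def is_valid_email(address: str) -> bool:
    """Return True if the address matches the BrowserID email regexp.

    The pattern is already anchored with ^...$, so re.match suffices.
    (Helper name is hypothetical; the original module only exports the regexp.)
    """
    return re.match(EMAIL_VALIDATION_RE, address) is not None

print(is_valid_email("alice@example.com"))  # True
print(is_valid_email("not-an-email"))       # False
```

Note the pattern accepts both hostname domains and bracketed IPv4 literals (e.g. `user@[127.0.0.1]`), which is why the alternation after `@` has two branches.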