hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4a643003204335c304b6c59e07d835634e01ee56 | 70 | py | Python | ensemble/control/operational/__init__.py | licit-lab/ensemble | 7a78ef0d69610d4fcfc5e008f931ade15e35acbf | [
"Linux-OpenIB"
] | null | null | null | ensemble/control/operational/__init__.py | licit-lab/ensemble | 7a78ef0d69610d4fcfc5e008f931ade15e35acbf | [
"Linux-OpenIB"
] | null | null | null | ensemble/control/operational/__init__.py | licit-lab/ensemble | 7a78ef0d69610d4fcfc5e008f931ade15e35acbf | [
"Linux-OpenIB"
] | null | null | null | from .operational import CACC
from .reference import ReferenceHeadway
| 23.333333 | 39 | 0.857143 | 8 | 70 | 7.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 70 | 2 | 40 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
4a6e715f800bbdca692049fd50a3e919c2ee4f79 | 77 | py | Python | tastyworks/models/option_leg.py | krsvojte/tastyworks_api | 692c4e434d6c5daa2d91fcb09a3aabb8d5d24c6f | [
"Apache-2.0"
] | null | null | null | tastyworks/models/option_leg.py | krsvojte/tastyworks_api | 692c4e434d6c5daa2d91fcb09a3aabb8d5d24c6f | [
"Apache-2.0"
] | null | null | null | tastyworks/models/option_leg.py | krsvojte/tastyworks_api | 692c4e434d6c5daa2d91fcb09a3aabb8d5d24c6f | [
"Apache-2.0"
] | null | null | null | from tastyworks.models.model import Model
class OptionLeg(Model):
pass
| 12.833333 | 41 | 0.766234 | 10 | 77 | 5.9 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168831 | 77 | 5 | 42 | 15.4 | 0.921875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 5 |
4a77b32a89a8779d3d60c0d4f86b7439ed12dbb5 | 58 | py | Python | Storylines/__init__.py | Komanawa-Solutions-Ltd/SLMACC-2020-CSRA | 914b6912c5f5b522107aa9406fb3d823e61c2ebe | [
"Apache-2.0"
] | null | null | null | Storylines/__init__.py | Komanawa-Solutions-Ltd/SLMACC-2020-CSRA | 914b6912c5f5b522107aa9406fb3d823e61c2ebe | [
"Apache-2.0"
] | null | null | null | Storylines/__init__.py | Komanawa-Solutions-Ltd/SLMACC-2020-CSRA | 914b6912c5f5b522107aa9406fb3d823e61c2ebe | [
"Apache-2.0"
] | null | null | null | """
Author: Matt Hanson
Created: 24/12/2020 9:20 AM
""" | 14.5 | 28 | 0.62069 | 10 | 58 | 3.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.234043 | 0.189655 | 58 | 4 | 29 | 14.5 | 0.531915 | 0.810345 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
4a784c0f8e7f7443d434a2bdb4573883e5ce59b1 | 18 | py | Python | contrib/tools/python/src/Lib/plat-mac/Carbon/CF.py | HeyLey/catboost | f472aed90604ebe727537d9d4a37147985e10ec2 | [
"Apache-2.0"
] | 6,989 | 2017-07-18T06:23:18.000Z | 2022-03-31T15:58:36.000Z | python/src/Lib/plat-mac/Carbon/CF.py | weiqiangzheng/sl4a | d3c17dca978cbeee545e12ea240a9dbf2a6999e9 | [
"Apache-2.0"
] | 1,978 | 2017-07-18T09:17:58.000Z | 2022-03-31T14:28:43.000Z | python/src/Lib/plat-mac/Carbon/CF.py | weiqiangzheng/sl4a | d3c17dca978cbeee545e12ea240a9dbf2a6999e9 | [
"Apache-2.0"
] | 1,228 | 2017-07-18T09:03:13.000Z | 2022-03-29T05:57:40.000Z | from _CF import *
| 9 | 17 | 0.722222 | 3 | 18 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 18 | 1 | 18 | 18 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
4a853645dcb5c0aa82c121aedb488d24a6f29abc | 763 | py | Python | pymtl3/passes/backends/yosys/YosysTranslationImportPass.py | kevinyuan/pymtl3 | 5949e6a4acc625c0ccbbb25be3af1d0db683df3c | [
"BSD-3-Clause"
] | 152 | 2020-06-03T02:34:11.000Z | 2022-03-30T04:16:45.000Z | pymtl3/passes/backends/yosys/YosysTranslationImportPass.py | kevinyuan/pymtl3 | 5949e6a4acc625c0ccbbb25be3af1d0db683df3c | [
"BSD-3-Clause"
] | 139 | 2019-05-29T00:37:09.000Z | 2020-05-17T16:49:26.000Z | pymtl3/passes/backends/yosys/YosysTranslationImportPass.py | kevinyuan/pymtl3 | 5949e6a4acc625c0ccbbb25be3af1d0db683df3c | [
"BSD-3-Clause"
] | 22 | 2020-05-18T13:42:05.000Z | 2022-03-11T08:37:51.000Z | #=========================================================================
# YosysTranslationImportPass.py
#=========================================================================
# Translate and import components in the given hierarchy.
#
# Author : Peitian Pan
# Date : Aug 6, 2019
from pymtl3.passes.backends.verilog.VerilogTranslationImportPass import (
VerilogTranslationImportPass,
)
from .import_.YosysVerilatorImportPass import YosysVerilatorImportPass
from .translation.YosysTranslationPass import YosysTranslationPass
class YosysTranslationImportPass( VerilogTranslationImportPass ):
@staticmethod
def get_translation_pass():
return YosysTranslationPass
@staticmethod
def get_import_pass():
return YosysVerilatorImportPass
| 29.346154 | 74 | 0.657929 | 52 | 763 | 9.557692 | 0.615385 | 0.120724 | 0.072435 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008798 | 0.10616 | 763 | 25 | 75 | 30.52 | 0.719941 | 0.359109 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0.75 | 0.583333 | 0.166667 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 5 |
4ab3ad698c27091b8dd736206d3bcd87696ca955 | 80 | py | Python | 7_kyu/Find_all_occurrences_of_an_element_in_an_array.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | 7_kyu/Find_all_occurrences_of_an_element_in_an_array.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | 7_kyu/Find_all_occurrences_of_an_element_in_an_array.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | def find_all(array, n):
return [ i for (i,v) in enumerate(array) if v == n ] | 40 | 56 | 0.6125 | 16 | 80 | 3 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225 | 80 | 2 | 56 | 40 | 0.774194 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
435b46245c3d72285a5946834ad8d2e77d8596fa | 155 | py | Python | example/example/apps/main/admin.py | vsoch/contributions_django | df79923e35ebbe94e1e2776073e77e8f9aea0837 | [
"Apache-2.0"
] | 3 | 2020-08-04T08:42:40.000Z | 2022-01-10T15:44:15.000Z | example/example/apps/main/admin.py | vsoch/contributions_django | df79923e35ebbe94e1e2776073e77e8f9aea0837 | [
"Apache-2.0"
] | 3 | 2020-08-04T07:20:03.000Z | 2022-01-10T15:43:52.000Z | example/example/apps/main/admin.py | vsoch/contributions-django | df79923e35ebbe94e1e2776073e77e8f9aea0837 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from django.contrib import admin
from .models import Event
@admin.register(Event)
class EventAdmin(admin.ModelAdmin):
pass
| 14.090909 | 35 | 0.716129 | 20 | 155 | 5.55 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007692 | 0.16129 | 155 | 10 | 36 | 15.5 | 0.846154 | 0.135484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 5 |
436498522b2bb3007aee450a046b1ed00f5d17a9 | 234 | py | Python | Exercises/spell.py | VasuGoel/mitx-6.00.1x | dcb9e7c88cc4fd5c43c0f7b7f2fe5aa2eb49dede | [
"MIT"
] | 3 | 2019-07-05T18:05:38.000Z | 2019-09-04T12:12:40.000Z | Exercises/spell.py | VasuGoel/mitx-6.00.1x | dcb9e7c88cc4fd5c43c0f7b7f2fe5aa2eb49dede | [
"MIT"
] | null | null | null | Exercises/spell.py | VasuGoel/mitx-6.00.1x | dcb9e7c88cc4fd5c43c0f7b7f2fe5aa2eb49dede | [
"MIT"
] | 4 | 2019-08-20T02:51:37.000Z | 2019-09-04T12:12:42.000Z | class Accio(Spell):
def __init__(self):
Spell.__init__(self, 'Accio', 'Summoning Charm')
def getDescription(self):
return 'This charm summons an object to the caster, potentially over a significant distance.'
| 33.428571 | 101 | 0.696581 | 29 | 234 | 5.344828 | 0.758621 | 0.103226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.209402 | 234 | 6 | 102 | 39 | 0.837838 | 0 | 0 | 0 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
436728e374e7f33411da4e87a42c0dc84b80ad7a | 37 | py | Python | src/omuse/units/quantities.py | ipelupessy/omuse | 83850925beb4b8ba6050c7fa8a1ef2371baf6fbb | [
"Apache-2.0"
] | 12 | 2020-03-25T10:02:00.000Z | 2021-11-18T00:28:35.000Z | src/omuse/units/quantities.py | ipelupessy/omuse | 83850925beb4b8ba6050c7fa8a1ef2371baf6fbb | [
"Apache-2.0"
] | 45 | 2020-03-03T16:07:16.000Z | 2022-03-14T09:01:07.000Z | src/omuse/units/quantities.py | ipelupessy/omuse | 83850925beb4b8ba6050c7fa8a1ef2371baf6fbb | [
"Apache-2.0"
] | 8 | 2020-03-03T13:28:50.000Z | 2021-05-26T09:20:02.000Z | from amuse.units.quantities import *
| 18.5 | 36 | 0.810811 | 5 | 37 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108108 | 37 | 1 | 37 | 37 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
4377e86b1daf2f65ff99c83dd5dbbe3989454f43 | 9,443 | py | Python | test/test_rsa_envelope.py | yop-platform/yop-python-sdk | 8eab98a97b9564d6ddd58d189790c8cea95229ed | [
"Apache-2.0"
] | 3 | 2020-12-02T12:55:07.000Z | 2022-02-28T15:23:01.000Z | test/test_rsa_envelope.py | yop-platform/yop-python-sdk | 8eab98a97b9564d6ddd58d189790c8cea95229ed | [
"Apache-2.0"
] | 1 | 2021-06-02T06:50:15.000Z | 2021-08-23T12:39:17.000Z | test/test_rsa_envelope.py | yop-platform/yop-python-sdk | 8eab98a97b9564d6ddd58d189790c8cea95229ed | [
"Apache-2.0"
] | 2 | 2021-05-07T09:48:43.000Z | 2021-05-08T05:57:46.000Z | # -*- coding: utf-8 -*-
from sys import version_info as pyVersion
import utils.yop_security_utils as yop_security_utils
from client.yop_client_config import YopClientConfig
from security.encryptor.rsaencryptor import RsaEncryptor
import sys
sys.path.append("./")
clientConfig = YopClientConfig(config_file='config/yop_sdk_config_rsa_qa.json')
# isv_private_key = clientConfig.get_credentials().get_priKey()
# yop_public_key = yop_security_utils.parse_pub_key()
isv_private_key = yop_security_utils.parse_rsa_pri_key(
'MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQChnRCVXUj+zLgJyeskNhflb7Go67dmkOSoXCYIklAYpcRC1+3Mbh4Ju3Qb0y3X92XPir3tk8aXurkuk93PFig582BSPKixBSASXLLGiwBiK3pyBuJoabHvIAZuAFXYDPaXQJE2HAtrADYGcbWvsdc94umihQU67XImjgOiBj0ZgeIhsjcsCEFgmzAw4xNW1dqza8grzcvBmmqyYUig5yS5IywciHtPYG+0FiLzjT+o6LoaMoaF/IAYq8EkbPC8Jra3IFqvZsR8Krr5DipCs87NRb3SILjUCTZyaaeKMf73m1grnYyP9CcBlwaQydKxnM0tDu3Xnm8FKiIK95Oz31U1AgMBAAECggEAMhOr9sw/+QvQHuBdJwxH3UT9xLy9SF+vKmfbNR65CNocdSXZPlYEorld4d1OwDOdbXCtJzd5+rvV85PH0AoqjsJV30WCc8+Fv4rPrmuVw9V7DGgLsZTGmLTQqTcbYmWp5vYPyLdp5k7bbqW/SWCOtFNiV4RmOXsnusCYaZULS6KPXpjpnpt4shesK1SdVC2uCO7eUKf4aN8kKSaGA6rKK9aiBwuiwnqAc5Z0++HxnTu3zjbrfmUVchFCLPU31zXlfzFfZW5hUu7whUcj7914V8q5AkuEEoqpFxAcirsH0ZQE7xtPFez+00OZU42NtyqlieUSE8zYglxVRjZ9pa4GwQKBgQDkNYmzCISJXbAqcNJCY9Og/tMBaZNrYh3Z51eENxxARoVD3jj462hsZMLd+ZH6AuerT7xEHgd4QxtLDVzMpXF4wwlZydu8DrTr7KZPliHFTWPs1ntOnsCeWhFnIHcKVnay9YkiMT+WqTwpGXTkpo68oI+vu3EKY9DSPp+di0CeEQKBgQC1S2WREQHOGnzQL2vqPZA45BL0k6F/KPgb2gTF4sRTkOZr4mnb/vmTt1E+xRKwTBCzsnzk4x2DFgMtYr9ZODtg9egmfd7BdjFzXw7f+ACoDE4SHtl2YqHqbnt6qDAwY6Ahz+0hmjG/2xD4lK5h0yDDh2iwoQnAKdQnaam3ZsLw5QKBgQDSsot7/LVBjnqD9L5sBXbzAdMXTr6JOoGNGga3T5qJzZJk4tt/FvnGehFgmHeqeNwkUu3jhkYnRu4AEUpIt8dYU9piR/jUXE+2Mzwp5tcvLxC/LheSswfsLAQ9TsAZj1LwT7pZE1c+ZungmFxQb2cByMxg15K6oQW/14nPDy6NwQKBgBV/MjToWllxBJm+9cHZuO82BBViKAUm+3x59pTsVbE+/kOOnlTKwBdG5mhV/+hNrLFSGcMeNxKjGo9YJS5UH55YqkVeKXqxJB31CJOAGbvTcbJuXATQnzhoD1Y0+TnTplo8CHcyjHGebT28i4zn9vuYY86F2d0iWJivy8MGeVkNAoGADUAug8jLk19hyejyvHK+nuQJ3gnbdSaUE3uTF5YbVxMhXuKq/Q8lkx74p8mc5x+NkzwrcpSL7YQ9Px4BopU9kvYHVChNVMVCnYRJ16MmEws2N9L4qdsmVKX8j0n+55DZsTPZfhPxP/Yuj6indAq7om+wbkuVTuPDhtDyHcfjHd8=')
isv_public_key = yop_security_utils.parse_rsa_pub_key(
'MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAoZ0QlV1I/sy4CcnrJDYX5W+xqOu3ZpDkqFwmCJJQGKXEQtftzG4eCbt0G9Mt1/dlz4q97ZPGl7q5LpPdzxYoOfNgUjyosQUgElyyxosAYit6cgbiaGmx7yAGbgBV2Az2l0CRNhwLawA2BnG1r7HXPeLpooUFOu1yJo4DogY9GYHiIbI3LAhBYJswMOMTVtXas2vIK83LwZpqsmFIoOckuSMsHIh7T2BvtBYi840/qOi6GjKGhfyAGKvBJGzwvCa2tyBar2bEfCq6+Q4qQrPOzUW90iC41Ak2cmmnijH+95tYK52Mj/QnAZcGkMnSsZzNLQ7t155vBSoiCveTs99VNQIDAQAB')
class Test(object):
def test_envelope_self(self, client):
"""
Encrypts and decrypts the given envelope.
Args:
self: write your description
client: write your description
"""
if 'sm' == client.cert_type or pyVersion.major == 2:
# TODO python2 下面ECB pad有问题
return
content = '{"orderId": "SP213142141", "status": "SUCCESS", "uniqueOrderNo": "", "parentMerchantNo": "10085864877", "merchantNo": "10085864877"}'
encryptor = RsaEncryptor(
clientConfig.get_credentials().get_priKey(),
clientConfig.get_yop_public_key().get('default'))
enc = encryptor.envelope_encrypt(content, isv_private_key, isv_public_key)
print('enc:{}'.format(enc))
plain = encryptor.envelope_decrypt(enc, isv_private_key, isv_public_key)
print('plain:{}'.format(plain))
assert plain == content
def test_envelope_notify_16(self, client):
"""
Decrypt response with isv_private_key and yop_public_key
Args:
self: write your description
client: write your description
"""
if 'sm' == client.cert_type:
return
# 16/32 pad 一致的情况
content = '{"date":"20181014000000","aaa":"","boolean":true,"SIZE":-14,"name":"易宝支付","dou":12.134}'
response = 'EJSkBOwrHduycpFOvg9rJAoC_HE4_ZBLCiZJuvJGm2fgqr7TU9L56qjNkU3bWdZRwtQBMulMq6JokW4ZNglNIAYBysJrHHXF68BP1ohuFC5kfJXzvya4UXBdHFHgtT7vJUsvxUCOANwR36NOhK1kzmGiuLDiaXGtXquo5p-H9JDSIXY7ZcDf6P0WdZ8BG2_TR34sbTGDW73m-4vnw3lCPWGhxlgnW_6CxRVWpl-iXIfMBl52DcPCa9i1-HhLb1-_g8Rf6-Trm4ahMi-dNJok71XK-gNIYbJRNhdMfFfT2cC_tXjK76zfEu94LkHbFJZkflmlH6iVy6y3aPpJL49_cg$FBa72nweVtKsfXawN9BTR6AOEeSxWygcUyP_WKGvqKvVF6vOeAY7P4NYTTgojtnL9H4Pr6zmKbXgJI0GKHRjfSHoLTDf5z2qxIfD1Cd8f443PUpL7PCpPEduNSTuIx2Je5uhCtJ6Sdglp5pw8kRDNx2E2Mz0fgbBaCuatLtJmr62aiUQAlfDVoXbdcFv-5lES00KAP9S1nU8phBnQhJt2V76x-alH_rq13Pf3F_Xo6wZDAFhzrmlWVlh3jmbMDGwsBSWf1j0iIZpbsS3Vd4-UO6RO_52Hb1ZsMhZl3fMzzBIx1-Qc7w2pWlsVrYbymWGlNZukeir0RMT9I72VUqGVoMh9U5Qnw7DZssvwyjLPLjmx54vDHTxE3EXV70heccs0p5wI7gomeO8u-Szpx4CiBkeMUTtaXdtDmqrxnVnx6C4wVFSeXMSHn9RF35GrRZlOSSWVNKh1AMC57m0cJXk0NfTvl9eDkx5TmHBIrbrZ3Xfi_ZHILUjwc4A87KOeSwJW-C3OWnAN1QsDCgxZsaPdGh8O0y3Y6Wr1sEWUMV5nwH4eS_a2G6rZUswN3LJ6iro2lhLrcteIYK4QYzh8nRu5g$AES$SHA256'
encryptor = RsaEncryptor(
isv_private_key,
list(clientConfig.get_yop_public_key().get('RSA2048').values())[0])
plain = encryptor.envelope_decrypt(response) # , isv_private_key, yop_public_key)
print('plain:{}'.format(plain))
assert plain == content
def test_envelope_notify_32(self, client):
"""
16/32 pad 的 的 RsaEncrypt
Args:
self: write your description
client: write your description
"""
if 'sm' == client.cert_type:
return
# 16/32 pad 不一致的情况
content = '{"date":"20181014000000","aaa":"","boolean":true,"SIZE":-14,"name":"易宝支付","dou":12.34}'
response = 'AT320grqREB__reG5un3QbzBWQ0QfHzdHkHJW4_GmMzY4Qeg-ud_xLhCucVVxLGZQlKiJoH6BS26fVz47r49S6o_6OTUPMoxCdy-mqhPx0mF3LAvmQNV8v_bKingnMM--LiW6z8OE9i_e8-sJ2YrlmhI99AAVqpC-9Kdn_Mx1x0-i6ojv7TO0foUN3RjPomvX_43Vc2xJdJIWmYx18Z5aOrlz1Z8MiEXntx-W7cBI9veC9k3tb_Jkoc1QyyhyAU6za4sCjQq5InvhUUljN62dgHbEIiS8Jp3YOOy3bf0xy9eotKsybpGr63ZPpukdAYUGJVPfAqOXxMdokbxLGvSag$6DjZHtBf9YTJcPeQ6HmNEGSMuRMVRn1idDS18MweFPUrkCd-iBq06b6HDh0oG4ZSqW2ef_64MSOMKCnNWtFreuXpvVtlYikCZV7Tr_vYagaJ8tq3TGMlpZPuUGKmKSGpXS4V5zrzN9OTWUEQejqVoxQhPriKERXsl34ceS9Em_AHqPgWzeV5Cgfsn2it6SnOGMVTRNMB-G61m7J7rVjDlGLatMzGOPpjraj2iwTmkwFD7PQOhlHg4rPr17vdsJQOBvhAtGCrTp5pqLABRDOQefQCQ3P2z_BPg_GC3F44Osd_EzZ-PIXAha3ULFWM6g6iY8g88h0aCEZBrt0qmfjzDnl6seyuaeah6wC1hCVYmC6tWUfbW_2QIfkRMtxwGsfF7QXo2IplfkYbEdfIZoNejpBCA72RLKU6ApNiBSKQGvV4MT8IhbHlyMNjm4zCiAc5hvqy15z2pMM1CaKpKtU8jxq03H0sZvzr749dCPh5-O0BgoZST_D3eOlu7usjlKstxmIcBUYymbv0XgsNOEbzqzpQgc1-IkMdG_k17H33JXmi9Kxp1ezRjER9sqoG_oZEzjL7KP_PIFvUtAeJ5e4pdA$AES$SHA256'
clientConfig = YopClientConfig(config_file='config/yop_sdk_config_rsa_prod.json')
encryptor = RsaEncryptor(
isv_private_key,
list(clientConfig.get_yop_public_key().get('RSA2048').values())[0])
plain = encryptor.envelope_decrypt(response) # , isv_private_key, yop_public_key)
print('plain:{}'.format(plain))
assert plain == content
def test_envelope_notify_unpad(self, client):
"""
Decrypts the response and verifies that it looks like an envelope with a public key.
Args:
self: write your description
client: write your description
"""
if 'sm' == client.cert_type:
return
content = '{"date":"20181014000000","aaa":"","boolean":true,"SIZE":-14,"name":"易宝支付","dou":12.134}'
response = 'ftjjTgrMrdS3aovlxxRa1GQiEuh4VUw8UhfhtV1yahhfvmpoVpn56wjMSW_Hr5BcXFylw08oBxRkHw0QgltrNGuzcmoyXhcP3LJUYcypMjf24YVryOtxdaAqsuTdhcsb6nBpvPfWTPvQiQ0QNmB_cCZSqaHNlj7iq3MtGnO6e5xFZNs4aUc9C1NtWNZc-NMTJaKV9XYVUD0UTmRZeGio8kD7BkcNGMwmLv32FEUzqFFVKXURXRikFVWLZ_d0gw1k8ewvblJM_OFL7SZYMy8uPxFLKX8qGTWZhOdgnZX1T3Pzkf-C34yA8oVcUaSkCFMgsSt9SfK8y-mvOgamylNdow$oIQRJRwIeMNYBbI1EQKqlfly0KD1PV-oyN4IXkVl6iJ_xkjorLoLgykDcCtkVPJaPid-GIei1Px3uBE3TeGxxm06D1uzlGTpPgoaBB71dQ-efWGIJ2OLB7L25Y4IQlMXcQG11tR7GDGu7EvVI9SP52_mQxYqxh7dZBPz-Mqs11H3HqJSN7sJvh727ksRAuIv5TRI2EONoahvRENZ2diqC1sU0Tsl2FvDcKkkDU-e5O6jP8sYAbY0KBrCS_Cf40jay9MrM2knKHU6e0EvPRlkx9jqtzhUOpFS_wvbVdHRkKMFvGANhqa3rWdrMPOzKRiRRmLFUd5htoeYcdod002EJ_ltDRG-cueLXddKiJVCmWdOTGjY2OYQhVZ8N1-Y9eiRVsi0twKkPKHh0AnEPiR6KLhu0ds_pNWcheFox1N4KYqrRvObPc1AR5-4uzfIhOsaqiAmuBlMGX45LH9IvY0qwUun0oFHwtIt_PAyCQiCb47kaBf-zAbbPss0dEsGjGMZT1gO2TzBE1D6rBN3GHH_ye8WgefbGXgSZaKlvVy0mdoVNEtwyeJkFswpMNc7UlXT42yZkZIBLSziORZEFUmhrPNh5hIsxLxRR5-JKG0C8yiAjZNxjUkXCZ9UUSWwoVphrLJtyF2eee5RecMMyfYE8lO3jU47DF66Ol3c1WO2FWcIdgkHomhEMttDxsAn3kbrEwUCrfn8Bf4rvNv-R1j2aM9lkk30Amc7sMcjRTBXNfuuwCGtCNXRBONWqVclhIW8V2EuRFztPx3o7aby7zDMMrHhs3i-zT9tFQvwj2us3oxD_lM9hKCH2RVrNa-BCwxXqM7a6do5Yp-0vi0g_IvVkOBX0JFm9cAmix4Dx1yxa7SRukxOLT-EVefpLTsdXfrxZSSTxjdjk9jancfIeETmFyPsJdIJgETYLzrYF6SqOw-lyklsX1kM6_uxj0keKC3ZLkJXic5NhDM2yDQSsbD-Ru36FYyjqg5-VuUBKXubkEtUH4LYIlXQzrrDZ92wurlxrJnlVMrHpDI7Y1LQW1QXDVduETqo8A1-dlGDWmL1EWHiwI9fx0vWpaHe9j5ByilnxJli82LkstPjAA5Do5_my1adhD1UrljLv7BpISicU8MRHIt7t533nvKLLGvlCDLhHP15BI9lSerbL4NOaArJoQ$AES$SHA256'
encryptor = RsaEncryptor(
isv_private_key,
list(clientConfig.get_yop_public_key().get(u'RSA2048').values())[0])
try:
plain = encryptor.envelope_decrypt(response) # , isv_private_key, yop_public_key)
print('plain:{}'.format(plain))
assert plain == content
except Exception as e:
assert repr(e).find(u'isv private key is illegal!') > 0
| 78.041322 | 1,631 | 0.82855 | 613 | 9,443 | 12.510604 | 0.45677 | 0.015256 | 0.020342 | 0.008345 | 0.202112 | 0.192985 | 0.177337 | 0.171339 | 0.171339 | 0.156213 | 0 | 0.112602 | 0.10844 | 9,443 | 120 | 1,632 | 78.691667 | 0.798313 | 0.085884 | 0 | 0.47541 | 0 | 0.131148 | 0.712626 | 0.684766 | 0 | 1 | 0 | 0.008333 | 0.081967 | 1 | 0.065574 | false | 0 | 0.081967 | 0 | 0.229508 | 0.081967 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
43787198ff65ab51e595fb0211df259666d7d9ea | 176 | py | Python | src/commands/schedule.py | YusukeKambara/japan_horse_racing | 05c2e06fe265c5744b908b8575df260db18a115b | [
"MIT"
] | null | null | null | src/commands/schedule.py | YusukeKambara/japan_horse_racing | 05c2e06fe265c5744b908b8575df260db18a115b | [
"MIT"
] | 1 | 2021-12-13T20:32:18.000Z | 2021-12-13T20:32:18.000Z | src/commands/schedule.py | YusukeKambara/japan_horse_racing | 05c2e06fe265c5744b908b8575df260db18a115b | [
"MIT"
] | null | null | null | from datasource.jra import io as jra
def get(year, is_place):
"""Getting the schedule on JRA with argument's year
"""
return jra.get_race_name_list(year, is_place) | 29.333333 | 55 | 0.721591 | 30 | 176 | 4.066667 | 0.733333 | 0.098361 | 0.180328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 176 | 6 | 56 | 29.333333 | 0.853147 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
439e532c53b4ff7e06e32deb579ddc086ac43dec | 103 | py | Python | CodeWars/7 Kyu/Dropcaps.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | CodeWars/7 Kyu/Dropcaps.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | CodeWars/7 Kyu/Dropcaps.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | def drop_cap(str_):
return ' '.join( w.capitalize() if len(w) > 2 else w for w in str_.split(' ') ) | 51.5 | 83 | 0.61165 | 19 | 103 | 3.157895 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012195 | 0.203884 | 103 | 2 | 83 | 51.5 | 0.719512 | 0 | 0 | 0 | 0 | 0 | 0.019231 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
43a7b23a842546385f05e87d0bff51fbcff29607 | 52 | py | Python | potencia.py | isaberamos/Programinhas | d0bfa5099edaf05b9a5f055bf7ce5432588cdc3d | [
"MIT"
] | 1 | 2021-12-28T21:37:33.000Z | 2021-12-28T21:37:33.000Z | potencia.py | isaberamos/Programinhas | d0bfa5099edaf05b9a5f055bf7ce5432588cdc3d | [
"MIT"
] | null | null | null | potencia.py | isaberamos/Programinhas | d0bfa5099edaf05b9a5f055bf7ce5432588cdc3d | [
"MIT"
] | null | null | null | i = 0
while i <= 50:
print(2**i)
i = i + i
| 8.666667 | 15 | 0.384615 | 11 | 52 | 1.818182 | 0.545455 | 0.3 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 0.423077 | 52 | 5 | 16 | 10.4 | 0.533333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
43d35c459826c7b318bee9af225644e44614f455 | 77 | py | Python | fornax/stages/checkout/__init__.py | lwencel-priv/fornax | 0f66a6284975bc5a2cfc3d38bc01ef6ad492e40e | [
"MIT"
] | null | null | null | fornax/stages/checkout/__init__.py | lwencel-priv/fornax | 0f66a6284975bc5a2cfc3d38bc01ef6ad492e40e | [
"MIT"
] | null | null | null | fornax/stages/checkout/__init__.py | lwencel-priv/fornax | 0f66a6284975bc5a2cfc3d38bc01ef6ad492e40e | [
"MIT"
] | null | null | null | """Prepare environment stage package."""
from .checkout import CheckoutStage
| 25.666667 | 40 | 0.792208 | 8 | 77 | 7.625 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103896 | 77 | 2 | 41 | 38.5 | 0.884058 | 0.441558 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
43db2b53ac31fa4cb8130d521359c6152193e3d3 | 38 | py | Python | examples/contextvar_project_example/__main__.py | David-Lor/Loguru-Context-examples-lightning-talk | 79ba2c3f8e98685403a8ae87df82c31ce2ee57b0 | [
"Apache-2.0"
] | null | null | null | examples/contextvar_project_example/__main__.py | David-Lor/Loguru-Context-examples-lightning-talk | 79ba2c3f8e98685403a8ae87df82c31ce2ee57b0 | [
"Apache-2.0"
] | null | null | null | examples/contextvar_project_example/__main__.py | David-Lor/Loguru-Context-examples-lightning-talk | 79ba2c3f8e98685403a8ae87df82c31ce2ee57b0 | [
"Apache-2.0"
] | null | null | null | from my_package.run import run
run()
| 9.5 | 30 | 0.763158 | 7 | 38 | 4 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 38 | 3 | 31 | 12.666667 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
78ddad1a03e6654ebaaf86cb2be6b9c8db879432 | 119 | py | Python | Codefights/arcade/python-arcade/level-5/31.Construct-Shell/Python/solution1.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | 7 | 2017-09-20T16:40:39.000Z | 2021-08-31T18:15:08.000Z | Codefights/arcade/python-arcade/level-5/31.Construct-Shell/Python/solution1.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | Codefights/arcade/python-arcade/level-5/31.Construct-Shell/Python/solution1.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | # Python3
# 有限制修改區域
def constructShell(n):
return [[0] * (i if i <= n else (2 * n - i)) for i in range(1, 2 * n)]
| 19.833333 | 74 | 0.546218 | 22 | 119 | 2.954545 | 0.681818 | 0.061538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057471 | 0.268908 | 119 | 5 | 75 | 23.8 | 0.689655 | 0.12605 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
78e42a9402677cbfec57cd576b03d11ff46c2946 | 244 | py | Python | am/api/views/__init__.py | access-missouri/am-django-project | 2457b8089900c61c73000c1d7479b7a72f6d1855 | [
"BSD-2-Clause"
] | 4 | 2018-05-01T20:31:49.000Z | 2021-12-20T19:30:40.000Z | am/api/views/__init__.py | access-missouri/am-django-project | 2457b8089900c61c73000c1d7479b7a72f6d1855 | [
"BSD-2-Clause"
] | 22 | 2017-04-13T15:02:09.000Z | 2021-02-02T21:48:41.000Z | am/api/views/__init__.py | access-missouri/am-django-project | 2457b8089900c61c73000c1d7479b7a72f6d1855 | [
"BSD-2-Clause"
] | 1 | 2018-07-02T20:08:43.000Z | 2018-07-02T20:08:43.000Z | # -*- coding: utf-8 -*-
from .bill_api_views import BillAPIViewset, BillSearchAPIView
from .person_api_views import PersonAPIViewset, PersonSearchAPIView
from .finance_entity_api_views import FinanceEntityAPIViewset, FinanceEntitySearchAPIView | 48.8 | 89 | 0.856557 | 25 | 244 | 8.08 | 0.68 | 0.118812 | 0.207921 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004464 | 0.081967 | 244 | 5 | 89 | 48.8 | 0.897321 | 0.086066 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
6028993e28887b8e461224024fcdf703dd0dd77d | 59 | py | Python | Purchase_Pred_Stage_2/graph_model/models/__init__.py | nlprecroy/purchase_prediction_PR | 0550a30a74aae2598a15099c80aff24cbab6d752 | [
"MIT"
] | 2 | 2021-03-15T09:26:30.000Z | 2021-11-03T09:35:11.000Z | Purchase_Pred_Stage_2/graph_model/models/__init__.py | nlprecroy/purchase_prediction_PR_2020 | 0550a30a74aae2598a15099c80aff24cbab6d752 | [
"MIT"
] | null | null | null | Purchase_Pred_Stage_2/graph_model/models/__init__.py | nlprecroy/purchase_prediction_PR_2020 | 0550a30a74aae2598a15099c80aff24cbab6d752 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from .purchase import Purchase
| 14.75 | 31 | 0.610169 | 7 | 59 | 5.142857 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021739 | 0.220339 | 59 | 3 | 32 | 19.666667 | 0.76087 | 0.355932 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
603e525d4241de55d55c3c80cf88bd13acbb4ae8 | 11,765 | py | Python | openstack_dashboard/test/integration_tests/tests/test_instances.py | wilk/horizon | bdf7e692227367a928325acdd31088971d3c4ff4 | [
"Apache-2.0"
] | 1 | 2019-08-07T08:46:03.000Z | 2019-08-07T08:46:03.000Z | openstack_dashboard/test/integration_tests/tests/test_instances.py | wilk/horizon | bdf7e692227367a928325acdd31088971d3c4ff4 | [
"Apache-2.0"
] | 7 | 2017-06-26T14:34:33.000Z | 2020-06-30T22:10:50.000Z | openstack_dashboard/test/integration_tests/tests/test_instances.py | lostmap/horizon-prod | 78769447761aaf1790d0e55d50b02f020a58da20 | [
"Apache-2.0"
] | 6 | 2015-05-25T00:31:26.000Z | 2022-03-21T22:36:25.000Z | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from openstack_dashboard.test.integration_tests import decorators
from openstack_dashboard.test.integration_tests import helpers
from openstack_dashboard.test.integration_tests.regions import messages
class TestInstances(helpers.TestCase):
INSTANCE_NAME = helpers.gen_random_resource_name('instance',
timestamp=False)
@property
def instances_page(self):
return self.home_pg.go_to_project_compute_instancespage()
@decorators.skip_because(bugs=['1774697'])
def test_create_delete_instance(self):
"""tests the instance creation and deletion functionality:
* creates a new instance in Project > Compute > Instances page
* verifies the instance appears in the instances table as active
* deletes the newly created instance via proper page (depends on user)
* verifies the instance does not appear in the table after deletion
"""
instances_page = self.home_pg.go_to_project_compute_instancespage()
instances_page.create_instance(self.INSTANCE_NAME)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertFalse(
instances_page.find_message_and_dismiss(messages.ERROR))
self.assertTrue(instances_page.is_instance_active(self.INSTANCE_NAME))
instances_page = self.instances_page
instances_page.delete_instance(self.INSTANCE_NAME)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertFalse(
instances_page.find_message_and_dismiss(messages.ERROR))
self.assertTrue(instances_page.is_instance_deleted(self.INSTANCE_NAME))
@decorators.skip_because(bugs=['1774697'])
def test_instances_pagination(self):
"""This test checks instance pagination
Steps:
1) Login to Horizon Dashboard as regular user
2) Navigate to user settings page
3) Change 'Items Per Page' value to 1
4) Go to Project > Compute > Instances page
5) Create 2 instances
6) Go to appropriate page (depends on user)
7) Check that only 'Next' link is available, only one instance is
available (and it has correct name) on the first page
8) Click 'Next' and check that on the second page only one instance is
available (and it has correct name), there is no 'Next' link on page
9) Go to user settings page and restore 'Items Per Page'
10) Delete created instances via proper page (depends on user)
"""
items_per_page = 1
instance_count = 2
instance_list = ["{0}-{1}".format(self.INSTANCE_NAME, item)
for item in range(1, instance_count + 1)]
first_page_definition = {'Next': True, 'Prev': False,
'Count': items_per_page,
'Names': [instance_list[1]]}
second_page_definition = {'Next': False, 'Prev': True,
'Count': items_per_page,
'Names': [instance_list[0]]}
settings_page = self.home_pg.go_to_settings_usersettingspage()
settings_page.change_pagesize(items_per_page)
self.assertTrue(
settings_page.find_message_and_dismiss(messages.SUCCESS))
instances_page = self.home_pg.go_to_project_compute_instancespage()
instances_page.create_instance(self.INSTANCE_NAME,
instance_count=instance_count)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertTrue(instances_page.is_instance_active(instance_list[1]))
instances_page = self.instances_page
instances_page.instances_table.assert_definition(
first_page_definition, sorting=True)
instances_page.instances_table.turn_next_page()
instances_page.instances_table.assert_definition(
second_page_definition, sorting=True)
instances_page = self.instances_page
instances_page.instances_table.assert_definition(
first_page_definition, sorting=True)
settings_page = self.home_pg.go_to_settings_usersettingspage()
settings_page.change_pagesize()
self.assertTrue(
settings_page.find_message_and_dismiss(messages.SUCCESS))
instances_page = self.instances_page
instances_page.delete_instances(instance_list)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertTrue(instances_page.are_instances_deleted(instance_list))
@decorators.skip_because(bugs=['1774697'])
def test_instances_pagination_and_filtration(self):
"""This test checks instance pagination and filtration
Steps:
1) Login to Horizon Dashboard as regular user
2) Go to to user settings page
3) Change 'Items Per Page' value to 1
4) Go to Project > Compute > Instances page
5) Create 2 instances
6) Go to appropriate page (depends on user)
7) Check filter by Name of the first and the second instance in order
to have one instance in the list (and it should have correct name)
and no 'Next' link is available
8) Check filter by common part of Name of in order to have one instance
in the list (and it should have correct name) and 'Next' link is
available on the first page and is not available on the second page
9) Go to user settings page and restore 'Items Per Page'
10) Delete created instances via proper page (depends on user)
"""
items_per_page = 1
instance_count = 2
instance_list = ["{0}-{1}".format(self.INSTANCE_NAME, item)
for item in range(1, instance_count + 1)]
first_page_definition = {'Next': True, 'Prev': False,
'Count': items_per_page,
'Names': [instance_list[1]]}
second_page_definition = {'Next': False, 'Prev': False,
'Count': items_per_page,
'Names': [instance_list[0]]}
filter_first_page_definition = {'Next': False, 'Prev': False,
'Count': items_per_page,
'Names': [instance_list[1]]}
settings_page = self.home_pg.go_to_settings_usersettingspage()
settings_page.change_pagesize(items_per_page)
self.assertTrue(
settings_page.find_message_and_dismiss(messages.SUCCESS))
instances_page = self.home_pg.go_to_project_compute_instancespage()
instances_page.create_instance(self.INSTANCE_NAME,
instance_count=instance_count)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertTrue(instances_page.is_instance_active(instance_list[1]))
instances_page = self.instances_page
instances_page.instances_table.set_filter_value('name')
instances_page.instances_table.filter(instance_list[1])
instances_page.instances_table.assert_definition(
filter_first_page_definition, sorting=True)
instances_page.instances_table.filter(instance_list[0])
instances_page.instances_table.assert_definition(
second_page_definition, sorting=True)
instances_page.instances_table.filter(self.INSTANCE_NAME)
instances_page.instances_table.assert_definition(
first_page_definition, sorting=True)
instances_page.instances_table.filter('')
settings_page = self.home_pg.go_to_settings_usersettingspage()
settings_page.change_pagesize()
self.assertTrue(
settings_page.find_message_and_dismiss(messages.SUCCESS))
instances_page = self.instances_page
instances_page.delete_instances(instance_list)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertTrue(instances_page.are_instances_deleted(instance_list))
@decorators.skip_because(bugs=['1774697'])
def test_filter_instances(self):
"""This test checks filtering of instances by Instance Name
Steps:
1) Login to Horizon dashboard as regular user
2) Go to Project > Compute > Instances
3) Create 2 instances
4) Go to appropriate page (depends on user)
5) Use filter by Instance Name
6) Check that filtered table has one instance only (which name is equal
to filter value) and no other instances in the table
7) Check that filtered table has both instances (search by common part
of instance names)
8) Set nonexistent instance name. Check that 0 rows are displayed
9) Clear filter and delete instances via proper page (depends on user)
"""
instance_count = 2
instance_list = ["{0}-{1}".format(self.INSTANCE_NAME, item)
for item in range(1, instance_count + 1)]
instances_page = self.home_pg.go_to_project_compute_instancespage()
instances_page.create_instance(self.INSTANCE_NAME,
instance_count=instance_count)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertTrue(instances_page.is_instance_active(instance_list[0]))
instances_page = self.instances_page
instances_page.instances_table.set_filter_value('name')
instances_page.instances_table.filter(instance_list[0])
self.assertTrue(instances_page.is_instance_present(instance_list[0]))
for instance in instance_list[1:]:
self.assertFalse(instances_page.is_instance_present(instance))
instances_page.instances_table.filter(self.INSTANCE_NAME)
for instance in instance_list:
self.assertTrue(instances_page.is_instance_present(instance))
nonexistent_instance_name = "{0}_test".format(self.INSTANCE_NAME)
instances_page.instances_table.filter(nonexistent_instance_name)
self.assertEqual(instances_page.instances_table.rows, [])
instances_page.instances_table.filter('')
instances_page.delete_instances(instance_list)
self.assertTrue(
instances_page.find_message_and_dismiss(messages.SUCCESS))
self.assertTrue(instances_page.are_instances_deleted(instance_list))
class TestAdminInstances(helpers.AdminTestCase, TestInstances):
INSTANCE_NAME = helpers.gen_random_resource_name('instance',
timestamp=False)
@property
def instances_page(self):
return self.home_pg.go_to_admin_compute_instancespage()
@decorators.skip_because(bugs=['1774697'])
def test_instances_pagination_and_filtration(self):
super(TestAdminInstances, self).\
test_instances_pagination_and_filtration()
| 47.06 | 79 | 0.674288 | 1,404 | 11,765 | 5.391026 | 0.144587 | 0.120227 | 0.072665 | 0.064209 | 0.790329 | 0.76417 | 0.733122 | 0.704584 | 0.668252 | 0.64751 | 0 | 0.012209 | 0.255079 | 11,765 | 249 | 80 | 47.248996 | 0.851438 | 0.248959 | 0 | 0.791946 | 0 | 0 | 0.02115 | 0 | 0 | 0 | 0 | 0 | 0.214765 | 1 | 0.04698 | false | 0 | 0.020134 | 0.013423 | 0.107383 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
6074e799d0754bc3d5f49e7998ab37c6fcea6188 | 146 | py | Python | office365/projectserver/publishedProject.py | wreiner/Office365-REST-Python-Client | 476bbce4f5928a140b4f5d33475d0ac9b0783530 | [
"MIT"
] | null | null | null | office365/projectserver/publishedProject.py | wreiner/Office365-REST-Python-Client | 476bbce4f5928a140b4f5d33475d0ac9b0783530 | [
"MIT"
] | null | null | null | office365/projectserver/publishedProject.py | wreiner/Office365-REST-Python-Client | 476bbce4f5928a140b4f5d33475d0ac9b0783530 | [
"MIT"
] | null | null | null | from office365.runtime.client_object import ClientObject
class PublishedProject(ClientObject):
def create_project_site(self):
pass
| 18.25 | 56 | 0.780822 | 16 | 146 | 6.9375 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02459 | 0.164384 | 146 | 7 | 57 | 20.857143 | 0.885246 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
607c03d10ad66517d580d8d317822b55f75ce377 | 10,708 | py | Python | archive/DUCS-MCA-Batch-2017-2020/DU-PG-2017-1st-sem/PGmca.py | jatin69/du-result-fetcher | 637a2823570ef9ae4f1d44abf6aa0dd74bc1baae | [
"MIT"
] | 6 | 2019-06-29T06:59:55.000Z | 2020-06-09T13:30:52.000Z | archive/DUCS-MCA-Batch-2017-2020/DU-PG-2017-1st-sem/PGmca.py | jatin69/du-result-fetcher | 637a2823570ef9ae4f1d44abf6aa0dd74bc1baae | [
"MIT"
] | 4 | 2018-07-10T19:52:59.000Z | 2019-06-29T06:35:51.000Z | archive/DUCS-MCA-Batch-2017-2020/DU-PG-2017-1st-sem/PGmca.py | jatin69/du-result-fetcher | 637a2823570ef9ae4f1d44abf6aa0dd74bc1baae | [
"MIT"
] | 7 | 2018-07-15T02:40:53.000Z | 2018-10-04T21:42:30.000Z | """
Fetches students of a college based on roll no of one student.
"""
# roll no of a college student
CONST_college_roll_no = "1724501"
# constant POST parameters per college
CONST_VIEWSTATE = """/wEPDwUJMjIwMjE3NzMyD2QWAgIDD2QWFgIBD2QWAgIFDw8WAh4EVGV4dAUzUmVzdWx0cyAoMy1ZZWFyIFNlbWVzdGVyIEV4YW1pbmF0aW9uIE5vdi1EZWMgMjAxNyApZGQCBQ8PFgIfAAUPIChOb3YtRGVjIDIwMTcpZGQCCw8PFgIeB1Zpc2libGVoZGQCEQ8QDxYGHg1EYXRhVGV4dEZpZWxkBQlDT0xMX05BTUUeDkRhdGFWYWx1ZUZpZWxkBQlDT0xMX0NPREUeC18hRGF0YUJvdW5kZ2QQFVMSPC0tLS0tU2VsZWN0LS0tLS0+IkFjaGFyeWEgTmFyZW5kcmEgRGV2IENvbGxlZ2UtKDAwMSkZQWRpdGkgTWFoYXZpZHlhbGF5YS0oMDAyKUNBcnlhYmhhdHRhIENvbGxlZ2UgW0Zvcm1lcmx5IFJhbSBMYWwgQW5hbmQgQ29sbGVnZSAoRXZlbmluZyldLSgwNTkpJUF0bWEgUmFtIFNhbmF0YW4gRGhhcmFtIENvbGxlZ2UtKDAwMykeQmhhZ2luaSBOaXZlZGl0YSBDb2xsZWdlLSgwMDcpFUJoYXJhdGkgQ29sbGVnZS0oMDA4KTBCaGFza2FyYWNoYXJ5YSBDb2xsZWdlIG9mIEFwcGxpZWQgU2NpZW5jZXMtKDAwOSkXQ0FNUFVTIExBVyBDRU5UUkUtKDMwOSkfQ2x1c3RlciBJbm5vdmF0aW9uIENlbnRyZS0oMzEyKSNDb2xsZWdlIE9mIFZvY2F0aW9uYWwgU3R1ZGllcy0oMDEzKRhEYXVsYXQgUmFtIENvbGxlZ2UtKDAxNCkiRGVlbiBEYXlhbCBVcGFkaHlheWEgQ29sbGVnZS0oMDE1KSZEZWxoaSBDb2xsZWdlIE9mIEFydHMgJiBDb21tZXJjZS0oMDE2KSBEZWxoaSBTY2hvb2wgb2YgSm91cm5hbGlzbS0oMzE2KXhEZXBhcnRtZW50IG9mIENvbXB1dGVyIFNjaWVuY2UgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAtKDIzNCkdRGVwYXJ0bWVudCBvZiBFZHVjYXRpb24tKDI0MykdRGVwYXJ0bWVudCBvZiBHZW9ncmFwaHktKDIyOSkwRGVwYXJ0bWVudCBvZiBHZXJtYW5pYyBhbmQgUm9tYW5jZSBTdHVkaWVzLSgyMDQpM0RlcGFydG1lbnQgb2YgTGlicmFyeSBhbmQgSW5mb3JtYXRpb24gU2NpZW5jZS0oMjA2KRlEZXBhcnRtZW50IG9mIE11c2ljLSgyNDApeERlcGFydG1lbnQgb2YgU29jaWFsIFdvcmsgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIC0oMjMzKR5EZXNoYmFuZGh1IENvbGxlZ2UgKERheSktKDAxOSkjRHIuIEJoaW0gUmFvIEFtYmVka2FyIENvbGxlZ2UtKDAxMClIRHVyZ2FiYWkgRGVzaG11a2ggQ29sbGVnZSBvZiBTcGVjaWFsIEVkdWNhdGlvbiAoVmlzdWFsIEltcGFpcm1lbnQpLSgzMTQpHkR5YWwgU2luZ2ggQ29sbGVnZSAoRGF5KS0oMDIxKR5EeWFsIFNpbmdoIENvbGxlZ2UgKEV2ZSktKDAyMikjRmFjdWx0eSBvZiBNYW5hZ2VtZW50IFN0dWRpZXMtKDEwOSkTR2FyZ2kgQ29sbGVnZS0oMDI0KRZIYW5zIFJhaiBDb2xsZWdlLSgwMjUpE0hpbmR1IENvbGxlZ2UtKDAyNikbSS5QLkNvbGxlZ2UgRm9yIFdvbWV
uLSgwMjkpPEluZGlyYSBHYW5kaGkgSW5zdGl0dXRlIG9mIFBoeS4gRWR1LiAmIFNwb3J0cyBTY2llbmNlcy0oMDI4KSFJbnN0aXR1dGUgT2YgSG9tZSBFY29ub21pY3MtKDAzMCkhSmFua2kgRGV2aSBNZW1vcmlhbCBDb2xsZWdlLSgwMzEpGkplc3VzICYgTWFyeSBDb2xsZWdlLSgwMzIpEkp1YmlsZWUgSGFsbC0oMzA2KRVLYWxpbmRpIENvbGxlZ2UtKDAzMykZS2FtbGEgTmVocnUgQ29sbGVnZS0oMDM0KRpLZXNoYXYgTWFoYXZpZHlhbGF5YS0oMDM1KRhLaXJvcmkgTWFsIENvbGxlZ2UtKDAzNikYTGFkeSBJcndpbiBDb2xsZWdlLSgwMzgpJExhZHkgU3JpIFJhbSBDb2xsZWdlIEZvciBXb21lbi0oMDM5KRhMYWtzaG1pYmFpIENvbGxlZ2UtKDA0MCkSTEFXIENFTlRSRS1JLSgzMTApE0xBVyBDRU5UUkUtSUktKDMxMSkeTWFoYXJhamEgQWdyYXNlbiBDb2xsZWdlLSgwNDEpK01haGFyc2hpIFZhbG1pa2kgQ29sbGVnZSBvZiBFZHVjYXRpb24tKDMxNSkWTWFpdHJleWkgQ29sbGVnZS0oMDQzKSNNYXRhIFN1bmRyaSBDb2xsZWdlIEZvciBXb21lbi0oMDQ0KRNNaXJhbmRhIEhvdXNlLSgwNDcpIk1vdGkgTGFsIE5laHJ1IENvbGxlZ2UgKERheSktKDA0OCkiTW90aSBMYWwgTmVocnUgQ29sbGVnZSAoRXZlKS0oMDQ5KR5QLkcuRC5BLlYuIENvbGxlZ2UgKERheSktKDA1MykeUC5HLkQuQS5WLiBDb2xsZWdlIChFdmUpLSgwNTQpFlJhamRoYW5pIENvbGxlZ2UtKDA1NSkhUmFtIExhbCBBbmFuZCBDb2xsZWdlIChEYXkpLSgwNTgpF1JhbWFudWphbiBDb2xsZWdlLSgwMjApFFJhbWphcyBDb2xsZWdlLSgwNTYpHVMuRy5ULkIuIEtoYWxzYSBDb2xsZWdlLSgwNjgpHVNhdHlhd2F0aSBDb2xsZWdlIChEYXkpLSgwNjIpHVNhdHlhd2F0aSBDb2xsZWdlIChFdmUpLSgwNjMpHVNjaG9vbCBvZiBPcGVuIExlYXJuaW5nLShTT0wpKFNoYWhlZWQgQmhhZ2F0IFNpbmdoIENvbGxlZ2UgKERheSktKDA2NCkoU2hhaGVlZCBCaGFnYXQgU2luZ2ggQ29sbGVnZSAoRXZlKS0oMDY1KTtTaGFoZWVkIFJhamd1cnUgQ29sbGVnZSBvZiBBcHBsaWVkIFNjaWVuY2VzIGZvciBXb21lbi0oMDY2KTFTaGFoZWVkIFN1a2hkZXYgQ29sbGVnZSBvZiBCdXNpbmVzcyBTdHVkaWVzLSgwNjcpFVNoaXZhamkgQ29sbGVnZS0oMDcxKR1TaHlhbSBMYWwgQ29sbGVnZSAoRGF5KS0oMDczKR1TaHlhbSBMYWwgQ29sbGVnZSAoRXZlKS0oMDc0KSVTaHlhbWEgUHJhc2FkIE11a2hlcmplZSBDb2xsZWdlLSgwNzUpIVNyaSBBdXJvYmluZG8gQ29sbGVnZSAoRGF5KS0oMDc2KSFTcmkgQXVyb2JpbmRvIENvbGxlZ2UgKEV2ZSktKDA3NykvU3JpIEd1cnUgR29iaW5kIFNpbmdoIENvbGxlZ2UgT2YgQ29tbWVyY2UtKDA3OCknU3JpIEd1cnUgTmFuYWsgRGV2IEtoYWxzYSBDb2xsZWdlLSgwNjkpIVNyaSBSYW0gQ29sbGVnZSBPZiBDb21tZXJjZS0oMDcyKR5TcmkgVmVua2F0ZXN3YXJhIENvbGxlZ2UtKDA3OSkaU3QuIFN0ZXBoZW5zIENvbGxlZ2UtKDA4MCkgU3dhbWkgU2hyYWRkaGFuYW5kIENvbGx
lZ2UtKDA4MSkZVW5pdmVyc2l0eSBvZiBEZWxoaS0oMTAwKRlWaXZla2FuYW5kYSBDb2xsZWdlLSgwODQpIFpha2lyIEh1c2FpbiBDb2xsZWdlIChFdmUpLSgwODYpJlpha2lyIEh1c2FpbiBEZWxoaSBDb2xsZWdlIChEYXkpLSgwODUpFVMSPC0tLS0tU2VsZWN0LS0tLS0+AzAwMQMwMDIDMDU5AzAwMwMwMDcDMDA4AzAwOQMzMDkDMzEyAzAxMwMwMTQDMDE1AzAxNgMzMTYDMjM0AzI0MwMyMjkDMjA0AzIwNgMyNDADMjMzAzAxOQMwMTADMzE0AzAyMQMwMjIDMTA5AzAyNAMwMjUDMDI2AzAyOQMwMjgDMDMwAzAzMQMwMzIDMzA2AzAzMwMwMzQDMDM1AzAzNgMwMzgDMDM5AzA0MAMzMTADMzExAzA0MQMzMTUDMDQzAzA0NAMwNDcDMDQ4AzA0OQMwNTMDMDU0AzA1NQMwNTgDMDIwAzA1NgMwNjgDMDYyAzA2MwNTT0wDMDY0AzA2NQMwNjYDMDY3AzA3MQMwNzMDMDc0AzA3NQMwNzYDMDc3AzA3OAMwNjkDMDcyAzA3OQMwODADMDgxAzEwMAMwODQDMDg2AzA4NRQrA1NnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2dnZ2RkAhkPEGRkFgECAWQCIQ8QDxYGHwIFCUVYQU1fRkxBRx8DBQlFWEFNX0ZMQUcfBGdkEBUFEzwtLS0tLVNlbGVlY3QtLS0tLT4EQ0JDUw5VR19TRU1FU1RFUl8zWQ5VR19TRU1FU1RFUl80WQ5QR19TRU1FU1RFUl8yWRUFEzwtLS0tLVNlbGVlY3QtLS0tLT4EQ0JDUw5VR19TRU1FU1RFUl8zWQ5VR19TRU1FU1RFUl80WQ5QR19TRU1FU1RFUl8yWRQrAwVnZ2dnZxYBAgRkAikPEGRkFgECAmQCMQ8QDxYGHwIFC0NPVVJTRV9OQU1FHwMFC0NPVVJTRV9DT0RFHwRnZBAVDBI8LS0tLS1TZWxlY3QtLS0tLT4tKEMuSS5DKSAtIE0uU2MuIChNQVRIRU1BVElDUyBFRFVDQVRJT04pLSg5MTMpNChQLkcpLSBNQVNURVIgT0YgQ09NUFVURVIgQVBQTElDQVRJT04gKE0uQy5BLiktKDgyMykpKFAuRyktKE4uQy5XLkUuQikgTS5TYy4gTUFUSEVNQVRJQ1MtKDgxMykYKFAuRyktTS5TYy4gQk9UQU5ZLSg4MTgpHChQLkcpLU0uU2MuIENIRU1JU1RSWSAtKDgxNCkiKFAuRyktTS5TYy4gQ09NUFVURVIgU0NJRU5DRS0oODIyKR0oUC5HKS1NLlNjLiBNQVRIRU1BVElDUy0oODE1KRkoUC5HKS1NLlNjLiBQSFlTSUNTLSg4MTYpHChQLkcpLU0uU2MuIFNUQVRJU1RJQ1MtKDgyMCkZKFAuRyktTS5TYy4gWk9PTE9HWS0oODE5KSUoUC5HKS1NLlNjLk9QRVJBVElPTkFMIFJFU0VBUkNILSg4MjEpFQwSPC0tLS0tU2VsZWN0LS0tLS0+AzkxMwM4MjMDODEzAzgxOAM4MTQDODIyAzgxNQM4MTYDODIwAzgxOQM4MjEUKwMMZ2dnZ2dnZ2dnZ2dnFgECAmQCOQ8QDxYGHwIFBFBBUlQfAwUEUEFSVB8EZ2QQFQUSPC0tLS0tU2VsZWN0LS0tLS0+AUkCSUkDSUlJAklWFQUSPC0tLS0tU2VsZWN0LS0tLS0+AUkCSUkDSUlJAklWFCsDBWdnZ2dnFgECAWQCPw8QDxYGHwIFA1NFTR8DBQNTRU0fBGdkEBUDEjwtLS0tLVNlbGVjdC0tLS0tPgFJAklJFQMSPC0tLS0tU2VsZWN0LS0tLS0+AUk
CSUkUKwMDZ2dnZGQCVw8PFgIfAWhkZGQW21NlztastCsQYpVXWAJ74oe2Rg=="""
CONST_EVENTVALIDATION = """/wEWfwKvm/XHCAKrw9qnCgKCqc+VAQLKlPHFDgLA/IyBCALl6+6qAgLopo/rCQL+gsG3BALi5fmADwLXj7neBwLoppvrCQLvppvrCQLY6+KqAgL+gsW3BAKTuKfBCQK017nqAwLJzpv3BQLMzpv3BQKRuN/ACQL8gvG3BALuppPrCQKRuKPBCQLPzuf3BQKtxdr3BgL8gv23BALopp/rCQKvxa70BgKWuKfBCQLA/ISBCALl6+aqAgLpppvrCQKTuNvACQK0173qAwLJzp/3BQLoppPrCQLXj7HeBwKvxab0BgLA/LiBCALl65qqAgLMzuf3BQL+gv23BAKTuN/ACQK017HqAwLJzpP3BQLXj7XeBwLoppfrCQKvxdr3BgKixa70BgLH/ICBCALA/LyBCAKr17nqAwL+gvG3BAKTuNPACQLi5emADwLXj6neBwLopovrCQL+gvW3BAKTuNfACQK016nqAwLXj63eBwKvxaL0BgLJzov3BQLXj6HeBwLl65aqAgL+gum3BAKuwq+LCAKTuMvACQK0163qAwLJzo/3BQLi5eGADwLA/KiBCAL+gu23BAKTuM/ACQK016HqAwLJzoP3BQLi5eWADwLXj6XeBwLopoPrCQLl64qqAgLopofrCQKvxYr0BgLA/OyBCAKsxar0BgKTuIPBCQLJzsf3BQK01+XqAwKjosWyCQLs3ZO5CgLR2dWyDALpvqyACALYz62KDAKAjfS0DwKAxvzOBwL25u3uBwL/5u3uBwKEw+/uBwLM9PumDwLLnPhJAoTjrsIDAo7DtdIEAqbz5rkMAv2OmvwHAuXzrHECqoz6+gMC55nOiAkC5pnyiAkC5pnOiAkCv5e24QoC+6Cs/gQCjfPtlQ8C3M+y1Q4CsdaQyAgC192pywsCkL6U1AQCqOSPvgUCgYSW4wICzvvA6AEC5uu8jQ4C5uvQkw4C1enQ5gUC5uvEkw4C3L7XvQ0C9K6r2AIC9K7HxgICtPew8AwCxISEgAsCpZ/ziwJV+y6cNPsG+dkUfCxxcet8+GG4Sw=="""
import requests
from bs4 import BeautifulSoup
r = requests.get('http://duexam2.du.ac.in/RSLT_ND2017/Students/List_Of_Students.aspx?StdType=REG&ExamFlag=PG_SEMESTER_2Y&CourseCode=823&CourseName=(P.G)-%20MASTER%20OF%20COMPUTER%20APPLICATION%20(M.C.A.)&Part=I&Sem=I')
soup = BeautifulSoup(r.text, 'html.parser')
students_table = soup.find("table", {"rules": "all"})
data = []
all_students = students_table.find_all('tr')
for student in all_students:
cols = student.find_all('td')
cols = [ele.text.strip() for ele in cols]
data.append([ele for ele in cols if ele]) # Get rid of empty values
data[0] = ['sno','srollno','sname','sfathername']
data.pop(0)
#print(data)
#roll_no_pattern = CONST_college_roll_no[:-2] + '...'
#import re
#dduc = []
#for student in data:
# if re.match(roll_no_pattern,student[1]):
# dduc.append(student)
dduc = data
#print(*dduc,sep="\n")
# for these students make html
# works on same college level only
#dduc = [['1', '17015570016', 'NAVEEN KUMAR ROHILLA', 'MR. NARESH KUMAR'], ['2','17015570022','SACHIN YADAV','dd'] ]
college_sgpa_list = []
#for i in range(45):
# dduc.pop()
for VAR_stud in dduc:
VAR_rollno = VAR_stud[1]
VAR_sname = VAR_stud[2]
payload = {
'ddlcollege' : '234',
'ddlexamtype' : 'Semester',
'ddlexamflag' : 'PG_SEMESTER_2Y',
'ddlstream' : 'SC',
'ddlcourse' : '823',
'ddlpart' : 'I',
'ddlsem': 'I',
'txtrollno' : VAR_rollno,
'txtname' : VAR_sname,
'btnsearch': 'Print Score Cart/Transcript',
'__EVENTTARGET' : '',
'__EVENTARGUMENT' : '',
'__LASTFOCUS':'',
'__VIEWSTATE': CONST_VIEWSTATE,
'__EVENTVALIDATION': CONST_EVENTVALIDATION
}
r = requests.post("http://duexam2.du.ac.in/RSLT_ND2017/Students/Combine_GradeCard.aspx", data=payload)
#print(r.text)
soup = BeautifulSoup(r.text, 'html.parser')
for img in soup.find_all('img'):
img.decompose()
#soup.find('span', attrs={'id':'lblstatement'}).decompose()
#soup.find('span', attrs={'id':'lbl_sub_head3'}).decompose()
#soup.find('span', attrs={'id':'lbldisclaimer'}).decompose()
sgpa_table = soup.find("table", {"id": "gvrslt"})
    if sgpa_table is None:
continue
sgpa_row = sgpa_table.find_all('td')
temp = []
for cols in sgpa_row:
cols = [ele for ele in cols]
temp.append([ele for ele in cols if ele!=[]]) # Get rid of empty values
#print([VAR_rollno, VAR_sname, int(temp[1][0]) ])
college_sgpa_list.append([VAR_rollno, VAR_sname, float(temp[1][0]) ])
'''
# writing result to html file
VAR_filename = "dduc_marks/" + VAR_rollno + '__' + VAR_sname + '__' + '.html'
with open(VAR_filename, "w") as file:
file.write(str(soup))
'''
college_sgpa_list.sort(key = lambda x : x[2])
#print(college_sgpa_list)
#import sys
#sys.stdout = open('oMCA_first_sem.txt', 'w')
print('{3:5} {0:15} {1:25} {2:5} {4:10}'.format("Roll No.","Name","Marks","S.No","Percentage"))
for i,marks in enumerate(college_sgpa_list):
print('{3:5} {0:15} {1:25} {2:5} {4:10}'.format(marks[0],marks[1],marks[2], i+1, float(marks[2]/5)))
#print('{0:15} {1:25} {2:5}'.format("Roll No.","Name","Marks"))
#for marks in college_sgpa_list:
# print('{0:15} {1:25} {2:5}'.format(marks[0],marks[1],marks[2]))
| 83.65625 | 6,064 | 0.869443 | 559 | 10,708 | 16.50805 | 0.40966 | 0.005202 | 0.009753 | 0.005202 | 0.051474 | 0.046597 | 0.030559 | 0.026658 | 0.014954 | 0.014954 | 0 | 0.09003 | 0.063317 | 10,708 | 127 | 6,065 | 84.314961 | 0.83001 | 0.098618 | 0 | 0.036364 | 0 | 0.072727 | 0.823542 | 0.753001 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.036364 | 0 | 0.036364 | 0.036364 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
60837d74e5441012e1bb1611521a65f305243490 | 59 | py | Python | iac-aws-cdk/machine-learning/sagemaker/constructs/__init__.py | alpha2phi/serverless | 23cc98a8de0970fa873ffc783ba1a72c7b54eecd | [
"MIT"
] | null | null | null | iac-aws-cdk/machine-learning/sagemaker/constructs/__init__.py | alpha2phi/serverless | 23cc98a8de0970fa873ffc783ba1a72c7b54eecd | [
"MIT"
] | null | null | null | iac-aws-cdk/machine-learning/sagemaker/constructs/__init__.py | alpha2phi/serverless | 23cc98a8de0970fa873ffc783ba1a72c7b54eecd | [
"MIT"
] | null | null | null | from .sm_notebook_construct import SageMakerNotebookStruct
| 29.5 | 58 | 0.915254 | 6 | 59 | 8.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067797 | 59 | 1 | 59 | 59 | 0.945455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
609074438ae5075f3b9c7dd215ffa41da869d48a | 87 | py | Python | {{cookiecutter.project_name}}/main.py | dlech/cookiecutter-ev3dev-lang-python | 176eb290c50308cb2ba0a5c8c3a35ff509f082c0 | [
"MIT"
] | null | null | null | {{cookiecutter.project_name}}/main.py | dlech/cookiecutter-ev3dev-lang-python | 176eb290c50308cb2ba0a5c8c3a35ff509f082c0 | [
"MIT"
] | null | null | null | {{cookiecutter.project_name}}/main.py | dlech/cookiecutter-ev3dev-lang-python | 176eb290c50308cb2ba0a5c8c3a35ff509f082c0 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
from ev3dev2.sound import Sound
sound = Sound()
sound.beep()
| 10.875 | 31 | 0.712644 | 13 | 87 | 4.769231 | 0.692308 | 0.483871 | 0.483871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040541 | 0.149425 | 87 | 7 | 32 | 12.428571 | 0.797297 | 0.241379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
60ece9ca2c6c98f88ae82b3f0c767ef92df80ac5 | 161 | py | Python | autograder/tests/__init__.py | stanfordpython/autograder | 00d487541957f32ef54a8709bcc99e7221e319f8 | [
"MIT"
] | null | null | null | autograder/tests/__init__.py | stanfordpython/autograder | 00d487541957f32ef54a8709bcc99e7221e319f8 | [
"MIT"
] | 1 | 2020-08-03T20:31:02.000Z | 2020-08-03T20:31:02.000Z | autograder/tests/__init__.py | stanfordpython/autograder | 00d487541957f32ef54a8709bcc99e7221e319f8 | [
"MIT"
] | 3 | 2020-09-20T13:22:47.000Z | 2021-09-07T01:40:37.000Z | # Load all tests into current namespace
from .BaseTest import BaseTest
from .IOTest import IOTest
from .FileIOTest import FileIOTest
from .ArgTest import ArgTest | 32.2 | 39 | 0.832298 | 22 | 161 | 6.090909 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136646 | 161 | 5 | 40 | 32.2 | 0.964029 | 0.229814 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
60f4a526fa08ef87b11cb71bceccdb68829aee47 | 132 | py | Python | tests/context.py | Watchful1/PointsBot | 56550b82bd12ff41f1e3a92bf6c2da7562654fea | [
"MIT"
] | 4 | 2020-03-10T15:06:23.000Z | 2021-07-27T19:11:33.000Z | tests/context.py | Watchful1/PointsBot | 56550b82bd12ff41f1e3a92bf6c2da7562654fea | [
"MIT"
] | 3 | 2020-12-28T23:47:33.000Z | 2021-11-02T18:56:52.000Z | tests/context.py | Watchful1/PointsBot | 56550b82bd12ff41f1e3a92bf6c2da7562654fea | [
"MIT"
] | 2 | 2020-12-13T20:37:51.000Z | 2021-07-31T02:57:09.000Z | import sys
from os.path import abspath, dirname, join
sys.path.insert(0, abspath(join(dirname(__file__), '..')))
import pointsbot
| 18.857143 | 58 | 0.742424 | 19 | 132 | 4.947368 | 0.631579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008547 | 0.113636 | 132 | 6 | 59 | 22 | 0.794872 | 0 | 0 | 0 | 0 | 0 | 0.015152 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
88028105284317f7b69d4a003fd6ab14de57c2fb | 53 | py | Python | access_parser/__init__.py | kamnsv/access_parser | 9a33c7900dcab05a3b19ee20caca04ca2dc8f76c | [
"Apache-2.0"
] | 28 | 2020-07-02T14:58:26.000Z | 2021-01-22T16:32:22.000Z | access_parser/__init__.py | kamnsv/access_parser | 9a33c7900dcab05a3b19ee20caca04ca2dc8f76c | [
"Apache-2.0"
] | 5 | 2021-05-08T19:51:58.000Z | 2021-11-05T10:37:11.000Z | access_parser/__init__.py | kamnsv/access_parser | 9a33c7900dcab05a3b19ee20caca04ca2dc8f76c | [
"Apache-2.0"
] | 9 | 2021-03-15T04:24:12.000Z | 2022-03-10T06:59:16.000Z | from access_parser.access_parser import AccessParser
| 26.5 | 52 | 0.90566 | 7 | 53 | 6.571429 | 0.714286 | 0.521739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 53 | 1 | 53 | 53 | 0.938776 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
8807ae5abcd177a8def1856c4657808eb32586c3 | 1,548 | py | Python | example/test/core/geometry/blobby/affine/id.py | dmilos/IceRay | 4e01f141363c0d126d3c700c1f5f892967e3d520 | [
"MIT-0"
] | 2 | 2020-09-04T12:27:15.000Z | 2022-01-17T14:49:40.000Z | example/test/core/geometry/blobby/affine/id.py | dmilos/IceRay | 4e01f141363c0d126d3c700c1f5f892967e3d520 | [
"MIT-0"
] | null | null | null | example/test/core/geometry/blobby/affine/id.py | dmilos/IceRay | 4e01f141363c0d126d3c700c1f5f892967e3d520 | [
"MIT-0"
] | 1 | 2020-09-04T12:27:52.000Z | 2020-09-04T12:27:52.000Z | import math
import IceRayCpp
def USphere():
    cargo = {}
    cargo['this'] = IceRayCpp.GeometryBlobbySystem()
    cargo['c1'] = IceRayCpp.GeometryBlobbyUSphere( 0.5 )
    cargo['1'] = IceRayCpp.GeometryBlobbyAffine( cargo['c1'] )
    cargo['rtss'] = IceRayCpp.GeometryRTSSList()
    cargo['this'].add( cargo['1'] )
    cargo['this'].rtss( cargo['rtss'] )
    return cargo
def sphere(P_core=0.5):
    cargo = {}
    cargo['this'] = IceRayCpp.GeometryBlobbySystem()
    cargo['c1'] = IceRayCpp.GeometryBlobbySphere( IceRayCpp.MathTypeCoord3D().load(0,0,0), P_core, 1 )
    cargo['1'] = IceRayCpp.GeometryBlobbyAffine( cargo['c1'] )
    cargo['rtss'] = IceRayCpp.GeometryRTSSList()
    cargo['this'].add( cargo['1'] )
    cargo['this'].rtss( cargo['rtss'] )
    return cargo
def UWaterZ():
    cargo = {}
    cargo['this'] = IceRayCpp.GeometryBlobbySystem()
    cargo['c1'] = IceRayCpp.GeometryBlobbyUWaterZ( 0.5 )
    cargo['1'] = IceRayCpp.GeometryBlobbyAffine( cargo['c1'] )
    cargo['rtss'] = IceRayCpp.GeometryRTSSList()
    cargo['this'].add( cargo['1'] )
    cargo['this'].rtss( cargo['rtss'] )
    return cargo
def UCylinderZ():
    cargo = {}
    cargo['this'] = IceRayCpp.GeometryBlobbySystem()
    cargo['c1'] = IceRayCpp.GeometryBlobbyUCylinderZ( 0.5 )
    cargo['1'] = IceRayCpp.GeometryBlobbyAffine( cargo['c1'] )
    cargo['rtss'] = IceRayCpp.GeometryRTSSList()
    cargo['this'].add( cargo['1'] )
    cargo['this'].rtss( cargo['rtss'] )
    return cargo
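The four constructors above repeat the same wrap-and-register steps and differ only in the blobby primitive they build. Since IceRayCpp is a compiled extension, the shared pattern can be sketched with plain stand-in objects (the dict shapes below are illustrations, not IceRayCpp's API):

```python
# Sketch of the shared "cargo" construction pattern used by USphere/sphere/etc.
# Stand-ins replace IceRayCpp types: a dict plays GeometryBlobbySystem, a tuple
# plays GeometryBlobbyAffine, and a list plays GeometryRTSSList.

def make_blobby(primitive):
    """Wrap a blobby primitive in an affine node and register it in a system."""
    cargo = {}
    cargo['this'] = {'children': [], 'rtss': None}  # stands in for GeometryBlobbySystem
    cargo['c1'] = primitive
    cargo['1'] = ('affine', primitive)              # stands in for GeometryBlobbyAffine
    cargo['rtss'] = []                              # stands in for GeometryRTSSList
    cargo['this']['children'].append(cargo['1'])
    cargo['this']['rtss'] = cargo['rtss']
    return cargo

cargo = make_blobby(('usphere', 0.5))
```

With a helper like this, each public constructor reduces to a one-line call with a different primitive.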
| 24.1875 | 103 | 0.613049 | 155 | 1,548 | 6.109677 | 0.180645 | 0.114044 | 0.059134 | 0.097149 | 0.801478 | 0.801478 | 0.801478 | 0.801478 | 0.55227 | 0.55227 | 0 | 0.02377 | 0.211886 | 1,548 | 63 | 104 | 24.571429 | 0.752459 | 0 | 0 | 0.736842 | 0 | 0 | 0.070034 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.052632 | 0 | 0.263158 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
714b05decdc20e05ddb525953d7601dc8ba027a2 | 7,164 | py | Python | tools/mo/unit_tests/mo/front/tf/CTCLossReplacement_test.py | pazamelin/openvino | b7e8ef910d7ed8e52326d14dc6fd53b71d16ed48 | [
"Apache-2.0"
] | 1 | 2021-04-20T08:14:51.000Z | 2021-04-20T08:14:51.000Z | tools/mo/unit_tests/mo/front/tf/CTCLossReplacement_test.py | pazamelin/openvino | b7e8ef910d7ed8e52326d14dc6fd53b71d16ed48 | [
"Apache-2.0"
] | 58 | 2020-11-06T12:13:45.000Z | 2022-03-28T13:20:11.000Z | tools/mo/unit_tests/mo/front/tf/CTCLossReplacement_test.py | pazamelin/openvino | b7e8ef910d7ed8e52326d14dc6fd53b71d16ed48 | [
"Apache-2.0"
] | 2 | 2021-07-14T07:40:50.000Z | 2021-07-27T01:40:03.000Z | # Copyright (C) 2018-2021 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
import numpy as np
import unittest
from argparse import Namespace
from openvino.tools.mo.front.tf.CTCLossReplacement import CTCLossReplacement
from openvino.tools.mo.front.common.partial_infer.utils import int64_array
from openvino.tools.mo.utils.ir_engine.compare_graphs import compare_graphs
from unit_tests.utils.graph import build_graph, const
class CTCLossFrontReplacementTest(unittest.TestCase):
nodes_attributes = {
'logits': {'shape': int64_array([2, 6, 100]), 'type': 'Parameter', 'kind': 'op', 'op': 'Parameter'},
'seq_mask': {'shape': int64_array([2]), 'data_type': np.int32, 'kind': 'op', 'op': 'Parameter'},
'transpose': {'kind': 'op', 'op': 'Transpose'},
'ctc_greedy_decoder': {'kind': 'op', 'op': 'CTCGreedyDecoderSeqLen', 'merge_repeated': True,
'output_sparse_format': True},
'cast': {'kind': 'op', 'op': 'Cast'},
'sparse_to_dense': {'kind': 'op', 'op': 'SparseToDense'},
'tf_ctc_loss_true_logits': {'kind': 'op', 'op': 'CTCLoss', 'preprocess_collapse_repeated': False,
'ctc_merge_repeated': True, 'unique': False, 'logits_time_major': True},
'tf_ctc_loss_false_logits': {'kind': 'op', 'op': 'CTCLoss', 'preprocess_collapse_repeated': False,
'ctc_merge_repeated': True, 'unique': False, 'logits_time_major': False},
'ctc_loss': {'kind': 'op', 'op': 'CTCLoss', 'preprocess_collapse_repeated': False,
'ctc_merge_repeated': True, 'unique': False},
**const('default_value', int64_array(-1)),
'last': {'type': None, 'value': None, 'kind': 'op', 'op': 'Result'},
'transpose2': {'kind': 'op', 'op': 'Transpose'},
**const('transpose2_axis', int64_array([1, 0, 2])),
'new_ctc_greedy_decoder': {'kind': 'op', 'op': 'CTCGreedyDecoderSeqLen', 'merge_repeated': True},
}
    def test_CTCLossReplacement_true_logits(self):
graph = build_graph(self.nodes_attributes,
[('logits', 'transpose', {'out': 0, 'in': 0}),
('transpose', 'ctc_greedy_decoder', {'out': 0, 'in': 0}),
('seq_mask', 'ctc_greedy_decoder', {'out': 0, 'in': 1}),
('transpose', 'tf_ctc_loss_true_logits', {'out': 0, 'in': 0}),
('seq_mask', 'tf_ctc_loss_true_logits', {'out': 0, 'in': 3}),
('ctc_greedy_decoder', 'sparse_to_dense', {'out': 0, 'in': 0}),
('ctc_greedy_decoder', 'sparse_to_dense', {'out': 2, 'in': 1}),
('ctc_greedy_decoder', 'sparse_to_dense', {'out': 1, 'in': 2}),
('default_value', 'sparse_to_dense', {'out': 0, 'in': 3}),
('ctc_greedy_decoder', 'cast', {'out': 1, 'in': 0}),
('ctc_greedy_decoder', 'tf_ctc_loss_true_logits', {'out': 0, 'in': 1}),
('cast', 'tf_ctc_loss_true_logits', {'out': 0, 'in': 2}),
('tf_ctc_loss_true_logits', 'last', {'out': 0, 'in': 0})],
nodes_with_edges_only=True)
graph.graph['cmd_params'] = Namespace(data_type='FP32')
graph.stage = 'front'
CTCLossReplacement().find_and_replace_pattern(graph)
graph_ref = build_graph(self.nodes_attributes,
[('logits', 'transpose', {'out': 0, 'in': 0}),
('transpose', 'transpose2', {'out': 0, 'in': 0}),
('transpose2_axis', 'transpose2', {'out': 0, 'in': 1}),
('transpose2', 'new_ctc_greedy_decoder', {'out': 0, 'in': 0}),
('seq_mask', 'new_ctc_greedy_decoder', {'out': 0, 'in': 1}),
('transpose2', 'ctc_loss', {'out': 0, 'in': 0}),
('new_ctc_greedy_decoder', 'ctc_loss', {'out': 0, 'in': 2}),
('new_ctc_greedy_decoder', 'ctc_loss', {'out': 1, 'in': 3}),
('seq_mask', 'ctc_loss', {'out': 0, 'in': 1}),
('ctc_loss', 'last', {'out': 0, 'in': 0})],
nodes_with_edges_only=True)
(flag, resp) = compare_graphs(graph, graph_ref, 'last', check_op_attrs=True)
self.assertTrue(flag, resp)
    def test_CTCLossReplacement_false_logits(self):
graph = build_graph(self.nodes_attributes,
[('logits', 'transpose', {'out': 0, 'in': 0}),
('transpose', 'ctc_greedy_decoder', {'out': 0, 'in': 0}),
('seq_mask', 'ctc_greedy_decoder', {'out': 0, 'in': 1}),
('transpose', 'tf_ctc_loss_false_logits', {'out': 0, 'in': 0}),
('seq_mask', 'tf_ctc_loss_false_logits', {'out': 0, 'in': 3}),
('ctc_greedy_decoder', 'sparse_to_dense', {'out': 0, 'in': 0}),
('ctc_greedy_decoder', 'sparse_to_dense', {'out': 2, 'in': 1}),
('ctc_greedy_decoder', 'sparse_to_dense', {'out': 1, 'in': 2}),
('default_value', 'sparse_to_dense', {'out': 0, 'in': 3}),
('ctc_greedy_decoder', 'cast', {'out': 1, 'in': 0}),
('ctc_greedy_decoder', 'tf_ctc_loss_false_logits', {'out': 0, 'in': 1}),
('cast', 'tf_ctc_loss_false_logits', {'out': 0, 'in': 2}),
('tf_ctc_loss_false_logits', 'last', {'out': 0, 'in': 0})],
nodes_with_edges_only=True)
graph.graph['cmd_params'] = Namespace(data_type='FP32')
graph.stage = 'front'
CTCLossReplacement().find_and_replace_pattern(graph)
graph_ref = build_graph(self.nodes_attributes,
[('logits', 'transpose', {'out': 0, 'in': 0}),
('transpose', 'transpose2', {'out': 0, 'in': 0}),
('transpose2_axis', 'transpose2', {'out': 0, 'in': 1}),
('transpose2', 'new_ctc_greedy_decoder', {'out': 0, 'in': 0}),
('seq_mask', 'new_ctc_greedy_decoder', {'out': 0, 'in': 1}),
('transpose', 'ctc_loss', {'out': 0, 'in': 0}),
('new_ctc_greedy_decoder', 'ctc_loss', {'out': 0, 'in': 2}),
('new_ctc_greedy_decoder', 'ctc_loss', {'out': 1, 'in': 3}),
('seq_mask', 'ctc_loss', {'out': 0, 'in': 1}),
('ctc_loss', 'last', {'out': 0, 'in': 0})],
nodes_with_edges_only=True)
(flag, resp) = compare_graphs(graph, graph_ref, 'last', check_op_attrs=True)
self.assertTrue(flag, resp)
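Both tests hinge on `compare_graphs` declaring the transformed graph equivalent to a hand-built reference. The core idea can be sketched without openvino: two graphs match when their normalized edge sets agree (this is an illustration of the concept, not openvino's implementation):

```python
# Minimal sketch of graph comparison by normalized edge sets, in the spirit of
# compare_graphs above. Edges are (src, dst, attrs) triples as in build_graph.

def edges_match(edges_a, edges_b):
    """Compare two edge lists irrespective of edge order."""
    norm = lambda edges: sorted((s, d, tuple(sorted(a.items()))) for s, d, a in edges)
    return norm(edges_a) == norm(edges_b)

g1 = [('logits', 'transpose', {'out': 0, 'in': 0}),
      ('transpose', 'ctc_loss', {'out': 0, 'in': 0})]
g2 = [('transpose', 'ctc_loss', {'out': 0, 'in': 0}),
      ('logits', 'transpose', {'out': 0, 'in': 0})]
```

The real `compare_graphs` additionally walks backwards from the `'last'` node and checks op attributes when `check_op_attrs=True`.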
| 66.333333 | 110 | 0.48632 | 759 | 7,164 | 4.289855 | 0.14888 | 0.046683 | 0.070025 | 0.042998 | 0.761057 | 0.730958 | 0.730958 | 0.730958 | 0.703624 | 0.687654 | 0 | 0.02908 | 0.332775 | 7,164 | 107 | 111 | 66.953271 | 0.652092 | 0.010748 | 0 | 0.553191 | 0 | 0 | 0.305195 | 0.085827 | 0 | 0 | 0 | 0 | 0.021277 | 1 | 0.021277 | false | 0 | 0.074468 | 0 | 0.117021 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
715da6c9f9e21404952d47964f4c2e9afe49563e | 711 | py | Python | example_config.py | vmalloc/dwight | 7268e7dfaf8e16bb1c62a26ca4bd1d2715b97882 | [
"BSD-3-Clause"
] | 7 | 2015-05-03T10:07:24.000Z | 2022-02-04T15:37:02.000Z | example_config.py | vmalloc/dwight | 7268e7dfaf8e16bb1c62a26ca4bd1d2715b97882 | [
"BSD-3-Clause"
] | null | null | null | example_config.py | vmalloc/dwight | 7268e7dfaf8e16bb1c62a26ca4bd1d2715b97882 | [
"BSD-3-Clause"
] | 4 | 2017-11-16T12:05:24.000Z | 2021-03-20T06:27:54.000Z | ROOT_IMAGE = "http://server/ubuntu_precise64.squashfs"
INCLUDES = [
Include("/etc/passwd", "/etc/passwd"),
Include("/etc/group", "/etc/group"),
Include("/mounts/fetched_from_local_path", "/local/path"),
Include("/mounts/fetched_from_git", "git://server/git/git_repository"),
Include("/mounts/fetched_from_git_branch", "git://server/git/git_repository", branch="branch"),
Include("/mounts/fetched_from_http", "http://server/fetched_from_http.squashfs"),
Include("/mounts/fetched_from_hg", "http+hg://server:8000/repository"),
Include("/mounts/fetched_from_hg_branch", "http+hg://server:8000/repository", branch="branch"),
]
ENVIRON = {
"PATH" : "$PATH:some_path_here"
}
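The config above calls `Include(...)` without importing it, so dwight presumably injects that name when evaluating the file. A minimal sketch of how such a plain-Python config DSL can be evaluated (the `Include` field names here are guesses, not dwight's actual definition):

```python
from collections import namedtuple

# Hypothetical stand-in for dwight's Include object; field names are assumptions.
Include = namedtuple('Include', ['mount_point', 'source', 'branch'])
Include.__new__.__defaults__ = (None,)  # make the branch argument optional

config_text = 'INCLUDES = [Include("/etc/passwd", "/etc/passwd")]'
namespace = {'Include': Include}
exec(config_text, namespace)  # the config file is ordinary Python, run with Include injected
```

After `exec`, the loader reads `namespace['INCLUDES']`, `namespace['ROOT_IMAGE']`, and so on, just like ordinary module attributes.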
| 41.823529 | 99 | 0.696203 | 89 | 711 | 5.303371 | 0.280899 | 0.163136 | 0.254237 | 0.305085 | 0.440678 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015699 | 0.104079 | 711 | 16 | 100 | 44.4375 | 0.725275 | 0 | 0 | 0 | 0 | 0 | 0.644163 | 0.407876 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
71a9ee922b4c21d3d6187fc5f08733c5f7adc3c6 | 40 | py | Python | mfr/extensions/zip/__init__.py | yacchin1205/RDM-modular-file-renderer | 5bd18175a681d21e7be7fe0238132335a1cd8ded | [
"Apache-2.0"
] | 36 | 2015-08-31T20:24:22.000Z | 2021-12-17T17:02:44.000Z | mfr/extensions/zip/__init__.py | yacchin1205/RDM-modular-file-renderer | 5bd18175a681d21e7be7fe0238132335a1cd8ded | [
"Apache-2.0"
] | 190 | 2015-01-02T06:22:01.000Z | 2022-01-19T11:27:03.000Z | mfr/extensions/zip/__init__.py | yacchin1205/RDM-modular-file-renderer | 5bd18175a681d21e7be7fe0238132335a1cd8ded | [
"Apache-2.0"
] | 47 | 2015-01-27T15:45:22.000Z | 2021-01-27T22:43:03.000Z | from .render import ZipRenderer # noqa
| 20 | 39 | 0.775 | 5 | 40 | 6.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175 | 40 | 1 | 40 | 40 | 0.939394 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
e085c2357f51183537312d4ef2d8c86fbd83c15b | 140 | py | Python | blog/django/myblog/static_pages/views.py | Anshul-GH/django | f0a0d3c8a6cf98dbef3ec9c5a056645ddd0f4ee6 | [
"MIT"
] | null | null | null | blog/django/myblog/static_pages/views.py | Anshul-GH/django | f0a0d3c8a6cf98dbef3ec9c5a056645ddd0f4ee6 | [
"MIT"
] | null | null | null | blog/django/myblog/static_pages/views.py | Anshul-GH/django | f0a0d3c8a6cf98dbef3ec9c5a056645ddd0f4ee6 | [
"MIT"
] | null | null | null | from django.shortcuts import render
# Create your views here.
def home_view(request):
    return render(request, 'static_pages/home.html')
| 23.333333 | 52 | 0.771429 | 20 | 140 | 5.3 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135714 | 140 | 5 | 53 | 28 | 0.876033 | 0.164286 | 0 | 0 | 0 | 0 | 0.191304 | 0.191304 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 5 |
e0965be3aa33742a559849b435c1a5bab3eb44f7 | 506 | py | Python | falsefriends/linear_trans.py | bryant1410/false-friends | 4c5c02a028e91c52f30f217fb0a610801a06e00d | [
"Apache-2.0"
] | 4 | 2016-05-13T22:39:27.000Z | 2020-02-17T17:21:57.000Z | falsefriends/linear_trans.py | bryant1410/false-friends | 4c5c02a028e91c52f30f217fb0a610801a06e00d | [
"Apache-2.0"
] | 2 | 2017-04-20T10:59:31.000Z | 2021-12-13T19:48:40.000Z | falsefriends/linear_trans.py | bryant1410/false-friends | 4c5c02a028e91c52f30f217fb0a610801a06e00d | [
"Apache-2.0"
] | 1 | 2019-04-09T07:04:51.000Z | 2019-04-09T07:04:51.000Z | # -*- coding: utf-8 -*
import numpy as np
# noinspection PyPep8Naming
def linear_transformation(origin_vectors, destination_vectors, backwards=False):
if backwards:
origin_vectors, destination_vectors = destination_vectors, origin_vectors
    return np.linalg.lstsq(origin_vectors, destination_vectors, rcond=None)[0]
def save_linear_transformation(file_name, transformation):
np.savez(file_name, transformation)
def load_linear_transformation(file_name):
return np.load(file_name)['arr_0']
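`linear_transformation` solves the least-squares problem min‖XA − Y‖ for the matrix A mapping origin vectors to destination vectors. When the destinations really were produced by a known matrix, `lstsq` recovers it exactly, which makes the behavior easy to check:

```python
import numpy as np

# Recover a known 2x2 transformation from sample vector pairs.
# np.linalg.lstsq solves min_A ||origin @ A - destination||, matching
# linear_transformation above (with backwards=False).
rng = np.random.default_rng(0)
true_A = np.array([[2.0, 0.0], [0.0, 3.0]])
origin = rng.normal(size=(10, 2))          # 10 sample vectors, full column rank
destination = origin @ true_A
recovered = np.linalg.lstsq(origin, destination, rcond=None)[0]
```

The `backwards` flag simply swaps the roles of origin and destination, solving for the inverse mapping instead.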
| 26.631579 | 81 | 0.782609 | 62 | 506 | 6.096774 | 0.451613 | 0.137566 | 0.26455 | 0.246032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009091 | 0.130435 | 506 | 18 | 82 | 28.111111 | 0.85 | 0.090909 | 0 | 0 | 0 | 0 | 0.010941 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.111111 | 0.111111 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
e0a48517ce47d3eb76ad5fc8b9cd6463b6bd56cd | 6,697 | py | Python | classification/torch/resnet.py | junyang-zh/ml-experiment | a6f43e3b00541fda1277b2ba39cec5ea454072e2 | [
"MIT"
] | null | null | null | classification/torch/resnet.py | junyang-zh/ml-experiment | a6f43e3b00541fda1277b2ba39cec5ea454072e2 | [
"MIT"
] | null | null | null | classification/torch/resnet.py | junyang-zh/ml-experiment | a6f43e3b00541fda1277b2ba39cec5ea454072e2 | [
"MIT"
] | null | null | null | import torch.nn as nn
class BasicBlock(nn.Module):
expansion = 1
def __init__(self, in_channels, out_channels, stride=1, downsample=None):
super(BasicBlock, self).__init__()
self.conv1 = nn.Conv2d(in_channels, out_channels, 3, stride, padding=1, bias=False)
self.bn1 = nn.BatchNorm2d(out_channels)
self.conv2 = nn.Conv2d(out_channels, out_channels, 3, 1, padding=1, bias=False)
self.bn2 = nn.BatchNorm2d(out_channels)
self.relu = nn.ReLU(inplace=True)
self.downsample = downsample
def forward(self, input):
residual = input
x = self.conv1(input)
x = self.bn1(x)
x = self.relu(x)
x = self.conv2(x)
x = self.bn2(x)
if self.downsample:
residual = self.downsample(residual)
x += residual
x = self.relu(x)
return x
class BottleNeck(nn.Module):
expansion = 4
def __init__(self, in_channels, out_channels, stride=1, downsample=None):
super(BottleNeck, self).__init__()
self.conv1 = nn.Conv2d(in_channels, out_channels, 1, bias=False)
self.bn1 = nn.BatchNorm2d(out_channels)
self.conv2 = nn.Conv2d(out_channels, out_channels, 3, stride, padding=1, bias=False)
self.bn2 = nn.BatchNorm2d(out_channels)
self.conv3 = nn.Conv2d(out_channels, out_channels*self.expansion, 1, bias=False)
self.bn3 = nn.BatchNorm2d(out_channels*self.expansion)
self.relu = nn.ReLU(inplace=True)
self.downsample = downsample
def forward(self, input):
residual = input
x = self.conv1(input)
x = self.bn1(x)
x = self.relu(x)
x = self.conv2(x)
x = self.bn2(x)
x = self.relu(x)
x = self.conv3(x)
x = self.bn3(x)
if self.downsample:
residual = self.downsample(residual)
x += residual
x = self.relu(x)
return x
class ResNet(nn.Module):
# 224*224
def __init__(self, block, num_layer, n_classes=1000, input_channels=3):
super(ResNet, self).__init__()
self.in_channels = 64
self.conv1 = nn.Conv2d(input_channels, 64, kernel_size=7, stride=2, padding=3, bias=False)
self.bn1 = nn.BatchNorm2d(64)
self.maxpool = nn.MaxPool2d(3, stride=2, padding=1)
self.relu = nn.ReLU(inplace=True)
self.layer1 = self._make_layer(block, 64, num_layer[0])
self.layer2 = self._make_layer(block, 128, num_layer[1], 2)
self.layer3 = self._make_layer(block, 256, num_layer[2], 2)
self.layer4 = self._make_layer(block, 512, num_layer[3], 2)
self.avgpool = nn.AvgPool2d(kernel_size=7, stride=1)
self.fc = nn.Linear(block.expansion*512, n_classes)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 1.0)
nn.init.constant_(m.bias, 0.0)
def _make_layer(self, block, out_channels, num_block, stride=1):
downsample = None
if stride != 1 or self.in_channels != out_channels*block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.in_channels, out_channels*block.expansion, 1, stride=stride, bias=False),
nn.BatchNorm2d(out_channels*block.expansion)
)
layers = []
layers.append(block(self.in_channels, out_channels, stride, downsample))
self.in_channels = out_channels*block.expansion
for _ in range(1, num_block):
layers.append(block(self.in_channels, out_channels))
return nn.Sequential(*layers)
def forward(self, input):
x = self.conv1(input)
x = self.bn1(x)
x = self.relu(x)
x = self.maxpool(x)
x = self.layer1(x)
x = self.layer2(x)
x = self.layer3(x)
x = self.layer4(x)
x = self.avgpool(x)
x = x.view(x.size(0), -1)
x = self.fc(x)
return x
def resnet18(**kwargs):
return ResNet(BasicBlock, [2, 2, 2, 2], **kwargs)
def resnet34(**kwargs):
return ResNet(BasicBlock, [3, 4, 6, 3], **kwargs)
def resnet50(**kwargs):
return ResNet(BottleNeck, [3, 4, 6, 3], **kwargs)
def resnet101(**kwargs):
return ResNet(BottleNeck, [3, 4, 23, 3], **kwargs)
def resnet152(**kwargs):
return ResNet(BottleNeck, [3, 8, 36, 3], **kwargs)
class ResNet28(nn.Module):
# 28*28
def __init__(self, block, num_layer, n_classes=10, input_channels=1):
super(ResNet28, self).__init__()
self.in_channels = 32
self.conv1 = nn.Conv2d(input_channels, 32, kernel_size=1, stride=1, bias=False)
self.bn1 = nn.BatchNorm2d(32)
self.relu = nn.ReLU(inplace=True)
self.layer1 = self._make_layer(block, 32, num_layer[0])
self.layer2 = self._make_layer(block, 64, num_layer[1], 2)
self.layer3 = self._make_layer(block, 128, num_layer[2], 2)
self.avgpool = nn.AvgPool2d(kernel_size=7, stride=1)
self.drop = nn.Dropout(0.5)
self.fc = nn.Linear(block.expansion*128, n_classes)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 1.0)
nn.init.constant_(m.bias, 0.0)
def _make_layer(self, block, out_channels, num_block, stride=1):
downsample = None
if stride != 1 or self.in_channels != out_channels*block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.in_channels, out_channels*block.expansion, 1, stride=stride, bias=False),
nn.BatchNorm2d(out_channels*block.expansion)
)
layers = []
layers.append(block(self.in_channels, out_channels, stride, downsample))
self.in_channels = out_channels*block.expansion
for _ in range(1, num_block):
layers.append(block(self.in_channels, out_channels))
return nn.Sequential(*layers)
def forward(self, input):
x = self.conv1(input)
x = self.bn1(x)
x = self.relu(x)
x = self.layer1(x)
x = self.layer2(x)
x = self.layer3(x)
x = self.avgpool(x)
x = x.view(x.size(0), -1)
x = self.drop(x)
x = self.fc(x)
return x
def resnet28_13(**kwargs):
return ResNet28(BasicBlock, [1, 2, 1], **kwargs)
def resnet28_17(**kwargs):
return ResNet28(BasicBlock, [2, 3, 3], **kwargs) | 37 | 104 | 0.603106 | 912 | 6,697 | 4.281798 | 0.117325 | 0.043534 | 0.033803 | 0.075288 | 0.829193 | 0.799488 | 0.734187 | 0.710115 | 0.693726 | 0.674776 | 0 | 0.048085 | 0.267135 | 6,697 | 181 | 105 | 37 | 0.747555 | 0.001941 | 0 | 0.62987 | 0 | 0 | 0.003292 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.11039 | false | 0 | 0.006494 | 0.045455 | 0.24026 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
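`ResNet28` ends with `AvgPool2d(kernel_size=7)` because its two stride-2 stages shrink the 28×28 input to exactly 7×7. That bookkeeping can be verified without torch:

```python
# Spatial-size arithmetic for ResNet28: conv1 and layer1 keep the map size,
# layer2 and layer3 each downsample by 2 (stride-2 conv with matching padding
# halves the map), so 28 -> 28 -> 28 -> 14 -> 7.

def spatial_sizes(input_size, strides):
    sizes = [input_size]
    for s in strides:
        sizes.append(sizes[-1] // s)
    return sizes

sizes = spatial_sizes(28, [1, 1, 2, 2])  # conv1, layer1, layer2, layer3
```

The full-size `ResNet` follows the same logic for 224×224 inputs: stride-2 conv1 and maxpool plus three stride-2 stages give 224 → 7, again matching its 7×7 average pool.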
e0c51dc729e90ef6757f7362ef86b195d7d234c2 | 128 | py | Python | users/admin.py | TEOTD/Travel | 013deb247429cc4fbff910d1d068a7b87bebe324 | [
"MIT"
] | null | null | null | users/admin.py | TEOTD/Travel | 013deb247429cc4fbff910d1d068a7b87bebe324 | [
"MIT"
] | null | null | null | users/admin.py | TEOTD/Travel | 013deb247429cc4fbff910d1d068a7b87bebe324 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Profile, comment
admin.site.register(Profile)
admin.site.register(comment)
| 21.333333 | 35 | 0.828125 | 18 | 128 | 5.888889 | 0.555556 | 0.169811 | 0.320755 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085938 | 128 | 5 | 36 | 25.6 | 0.905983 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
e0e0ecc5d7b9e1d014b718a6862b6c994df7c3f7 | 76 | py | Python | brochure/values/enterprise.py | GreenLightSoftware/brochure | 925b55650d321f10b2cb4d2dcd8e11854634382c | [
"MIT"
] | null | null | null | brochure/values/enterprise.py | GreenLightSoftware/brochure | 925b55650d321f10b2cb4d2dcd8e11854634382c | [
"MIT"
] | null | null | null | brochure/values/enterprise.py | GreenLightSoftware/brochure | 925b55650d321f10b2cb4d2dcd8e11854634382c | [
"MIT"
] | null | null | null | from typing import NamedTuple
class Enterprise(NamedTuple):
name: str
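A `NamedTuple` value object like `Enterprise` gets immutability, structural equality, and a readable repr for free. A brief self-contained usage sketch (redeclaring the class so the snippet stands alone):

```python
from typing import NamedTuple

class Enterprise(NamedTuple):
    name: str

# Instances are immutable and compare by value, not identity.
acme = Enterprise(name="Acme")
```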
| 12.666667 | 29 | 0.763158 | 9 | 76 | 6.444444 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184211 | 76 | 5 | 30 | 15.2 | 0.935484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
e0e2111f71240eae76c706339679b95f015a89ac | 20 | py | Python | test_banch/br_test.py | Ambareezh/pyneta | a64a2c213847bdec0af4064730c2c6f1d47575c7 | [
"Apache-2.0"
] | null | null | null | test_banch/br_test.py | Ambareezh/pyneta | a64a2c213847bdec0af4064730c2c6f1d47575c7 | [
"Apache-2.0"
] | null | null | null | test_banch/br_test.py | Ambareezh/pyneta | a64a2c213847bdec0af4064730c2c6f1d47575c7 | [
"Apache-2.0"
] | null | null | null | print("Heloolllll")
| 10 | 19 | 0.75 | 2 | 20 | 7.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 20 | 1 | 20 | 20 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
e0f3f67e069ee59ede9660c7ec1abb780a869dff | 180 | py | Python | student_app/admin.py | abhinav042/COMP3297 | 3c37a4330d94a45cdabaa26bc17147ae73118cfb | [
"MIT"
] | null | null | null | student_app/admin.py | abhinav042/COMP3297 | 3c37a4330d94a45cdabaa26bc17147ae73118cfb | [
"MIT"
] | 8 | 2020-06-05T17:52:40.000Z | 2022-03-11T23:18:13.000Z | student_app/admin.py | abhinav042/COMP3297 | 3c37a4330d94a45cdabaa26bc17147ae73118cfb | [
"MIT"
] | 1 | 2018-09-03T00:46:07.000Z | 2018-09-03T00:46:07.000Z | from django.contrib import admin
from student_app.models import Student, Transaction_S
# Register your models here.
admin.site.register(Student)
admin.site.register(Transaction_S)
| 25.714286 | 52 | 0.838889 | 26 | 180 | 5.692308 | 0.538462 | 0.162162 | 0.22973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 180 | 6 | 53 | 30 | 0.902439 | 0.144444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
1caa30b8f419d4b20641a0be3fafced8f356ce9f | 201 | py | Python | tic_tweet_toe/services/__init__.py | c17r/tic-tweet-toe | 6915c201c61f4c5e470f5cf127ace1819b2746f4 | [
"MIT"
] | null | null | null | tic_tweet_toe/services/__init__.py | c17r/tic-tweet-toe | 6915c201c61f4c5e470f5cf127ace1819b2746f4 | [
"MIT"
] | 1 | 2016-06-14T03:20:44.000Z | 2016-06-14T03:20:44.000Z | tic_tweet_toe/services/__init__.py | c17r/tic-tweet-toe | 6915c201c61f4c5e470f5cf127ace1819b2746f4 | [
"MIT"
] | null | null | null | from storage_service import Storage
from twitter_service import Twitter, \
TwitterServiceAuthenticationException, \
TwitterServiceNotConfigured
from stoppable_process import StoppableProcess
| 25.125 | 46 | 0.850746 | 17 | 201 | 9.882353 | 0.588235 | 0.154762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129353 | 201 | 7 | 47 | 28.714286 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
1cb5e47e1665ed1ead976d1a512381eb2d3697c6 | 74 | py | Python | python/ql/test/library-tests/web/client/requests/test.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 4,036 | 2020-04-29T00:09:57.000Z | 2022-03-31T14:16:38.000Z | python/ql/test/library-tests/web/client/requests/test.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 2,970 | 2020-04-28T17:24:18.000Z | 2022-03-31T22:40:46.000Z | python/ql/test/library-tests/web/client/requests/test.py | ScriptBox99/github-codeql | 2ecf0d3264db8fb4904b2056964da469372a235c | [
"MIT"
] | 794 | 2020-04-29T00:28:25.000Z | 2022-03-30T08:21:46.000Z | import requests
requests.get('example.com')
requests.post('example.com')
| 14.8 | 28 | 0.77027 | 10 | 74 | 5.7 | 0.6 | 0.350877 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067568 | 74 | 4 | 29 | 18.5 | 0.826087 | 0 | 0 | 0 | 0 | 0 | 0.297297 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
1cccb05df34e3938df42b35abeca8269cae02270 | 1,198 | py | Python | velkozz_web_api/utils/MVC_utils.py | velkoz-data-ingestion/velkozz_web_api | 519a6a90e5fdf5bab8ba2daf637768c5fd424a12 | [
"MIT"
] | null | null | null | velkozz_web_api/utils/MVC_utils.py | velkoz-data-ingestion/velkozz_web_api | 519a6a90e5fdf5bab8ba2daf637768c5fd424a12 | [
"MIT"
] | null | null | null | velkozz_web_api/utils/MVC_utils.py | velkoz-data-ingestion/velkozz_web_api | 519a6a90e5fdf5bab8ba2daf637768c5fd424a12 | [
"MIT"
] | null | null | null |
# TODO This is not viable due to different parameter names. Either find a way around this or give up and just write the code manually!
def filter_query_via_params(request, queryset):
    """The method is used by the list and create (GET and POST) methods in
    ViewSets to filter django model querysets via request parameters.

    The method extracts the necessary parameters from the ingested request
    object (assuming specific naming conventions) and filters the ingested
    queryset based on these specific params (again assuming specific naming
    conventions). This method is meant to contain the logic for the most common
    queryset filtering that takes place in the project, not to serve an
    all-encompassing purpose.

    Currently the method extracts and filters the following arguments:

    * Start-Date -->
    * End-Date -->

    Args:
        request (HttpRequest): The request object made to the REST API that
            is passed through the ViewSet that needs to be filtered.

        queryset (QuerySet): The initial queryset of all django objects from
            the database.

    Returns:
        QuerySet: The filtered queryset of the data model.

    """
pass | 39.933333 | 134 | 0.718698 | 168 | 1,198 | 5.107143 | 0.607143 | 0.031469 | 0.039627 | 0.076923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.24374 | 1,198 | 30 | 135 | 39.933333 | 0.94702 | 0.890651 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
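The start/end-date filtering the docstring describes can be sketched without Django, using plain dicts in place of a QuerySet (the parameter and field names below are assumptions for illustration):

```python
from datetime import date

# Hedged sketch of Start-Date / End-Date filtering over plain records instead
# of a Django QuerySet; with a real queryset this would be .filter(...) calls.

def filter_by_date_range(records, start_date=None, end_date=None):
    if start_date is not None:
        records = [r for r in records if r['timestamp'] >= start_date]
    if end_date is not None:
        records = [r for r in records if r['timestamp'] <= end_date]
    return records

rows = [{'timestamp': date(2021, 1, d)} for d in (1, 15, 31)]
kept = filter_by_date_range(rows, start_date=date(2021, 1, 10),
                            end_date=date(2021, 1, 20))
```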
1c49b9b3ba4bd5a9767f7b527ecc21f6732caf9f | 687 | py | Python | Content/Data Structures/Matrix.py | MovsisyanM/Data-Structures-And-Algos-Revisit | 3bb128a4a5476914c164b1a3c1b533a8eace8604 | [
"MIT"
] | 3 | 2020-12-24T16:49:14.000Z | 2021-08-10T17:19:16.000Z | Content/Data Structures/Matrix.py | MovsisyanM/Data-Structures-And-Algos-Revisit | 3bb128a4a5476914c164b1a3c1b533a8eace8604 | [
"MIT"
] | null | null | null | Content/Data Structures/Matrix.py | MovsisyanM/Data-Structures-And-Algos-Revisit | 3bb128a4a5476914c164b1a3c1b533a8eace8604 | [
"MIT"
] | 1 | 2020-12-25T15:37:36.000Z | 2020-12-25T15:37:36.000Z | class Matrix:
"""No, not the movie.
A 2d array with many methods to make it act like a matrix"""
def __init__(self, size, fill_with=0):
assert (size >= 1), "Matrix size too small, must be positive integer"
this.size = math.floor(size)
this.mem = [[fill_with] * this.size] * this.size
def __getitem__(self, key):
return copy.copy(this.mem[key])
# this is where the fun begins!
def __mul__(self, matrix):
pass
# No need to worry about matrix mult. compatability since all of them are created squares
# TODO
# arr = np.random.rand(50) * 50
# InsertionSort(arr)
# print(IsSorted(arr))
# Code block by Movsisyan
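A construction like `[[0] * n] * n` for the backing storage of a 2-D structure such as `Matrix` repeats a reference to one row list, so writing a single cell appears to write a whole column. A quick demonstration of the pitfall and the fix:

```python
# Classic 2-D list pitfall: the outer * n repeats a reference to ONE row.
n = 3
aliased = [[0] * n] * n
aliased[0][0] = 1            # mutates every row, since all rows are the same list

# Building each row in a comprehension gives n independent rows.
independent = [[0] * n for _ in range(n)]
independent[0][0] = 1        # mutates only the first row
```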
| 25.444444 | 97 | 0.640466 | 102 | 687 | 4.176471 | 0.686275 | 0.056338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013645 | 0.253275 | 687 | 26 | 98 | 26.423077 | 0.816764 | 0.427948 | 0 | 0 | 0 | 0 | 0.124668 | 0 | 0 | 0 | 0 | 0.038462 | 0.111111 | 1 | 0.333333 | false | 0.111111 | 0 | 0.111111 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 5 |
1c6136aa6e2cfcace5dd83d2f05b1775dbe0f8e8 | 5,411 | py | Python | gluebox/tests/utils/test_git.py | mwhahaha/gluebox | f8f2ac0f434418fc24143c3a7691517e0742574a | [
"Apache-2.0"
] | null | null | null | gluebox/tests/utils/test_git.py | mwhahaha/gluebox | f8f2ac0f434418fc24143c3a7691517e0742574a | [
"Apache-2.0"
] | null | null | null | gluebox/tests/utils/test_git.py | mwhahaha/gluebox | f8f2ac0f434418fc24143c3a7691517e0742574a | [
"Apache-2.0"
] | null | null | null | #
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
from oslotest import base
import gluebox.utils.git as git
class TestGit(base.BaseTestCase):
"""Test cases for git utils"""
@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_checkout(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
call_mock.return_value=0
git.checkout('https://blah/', '/test', 'foo')
call_mock.assert_called_once_with('git clone https://blah/ -b foo '
'/test', shell=True)
@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_checkout_with_topic(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
call_mock.return_value=0
git.checkout('https://blah/', '/test', topic='test-topic')
calls = [
mock.call('git clone https://blah/ -b master /test', shell=True),
mock.call('git checkout -b test-topic', shell=True),
]
call_mock.assert_has_calls(calls)
@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_checkout_with_review(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
call_mock.return_value=0
git.checkout('https://blah/', '/test', git_review='12345')
calls = [
mock.call('git clone https://blah/ -b master /test', shell=True),
mock.call('git review -d 12345', shell=True),
]
call_mock.assert_has_calls(calls)
@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_commit(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
call_mock.return_value=0
git.commit('/test', message='foo')
calls = [
mock.call('git add *', shell=True),
mock.call('git commit -m "foo"', shell=True),
]
call_mock.assert_has_calls(calls)
@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_commit_message_file(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
        call_mock.return_value = 0
git.commit('/test', message_file='/file.txt')
calls = [
mock.call('git add *', shell=True),
mock.call('git commit -F /file.txt', shell=True),
]
call_mock.assert_has_calls(calls)

@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_commit_fixup(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
call_mock.return_value = 0
git.commit('/test', fixup=True)
calls = [
mock.call('git add *', shell=True),
mock.call('git commit --amend --no-edit', shell=True),
]
call_mock.assert_has_calls(calls)

@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_review(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
call_mock.return_value = 0
git.review('/test')
call_mock.assert_called_once_with('git review master', shell=True)

@mock.patch('subprocess.call')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_review_with_topic(self, getcwd_mock, chdir_mock, call_mock):
getcwd_mock.return_value = '/tmp'
call_mock.return_value = 0
git.review('/test', topic='foo')
call_mock.assert_called_once_with('git review master -t foo',
shell=True)

@mock.patch('subprocess.Popen')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_get_hash(self, getcwd_mock, chdir_mock, popen_mock):
getcwd_mock.return_value = '/tmp'
proc_mock = mock.MagicMock(returncode=0)
        proc_mock.communicate.side_effect = [('12345', None)]
popen_mock.return_value = proc_mock
val = git.get_hash('/test')
getcwd_mock.assert_called_once()
popen_mock.assert_called_once_with(['git', 'rev-parse', 'origin/master'],
stderr=-1,
stdout=-1)
self.assertEqual(val, '12345')

@mock.patch('subprocess.Popen')
@mock.patch('os.chdir')
@mock.patch('os.getcwd')
def test_get_hash_parse_failed(self, getcwd_mock, chdir_mock, popen_mock):
getcwd_mock.return_value = '/tmp'
proc_mock = mock.MagicMock(returncode=1)
proc_mock.communicate.side_effect = [('12345', None)]
popen_mock.return_value = proc_mock
self.assertRaises(Exception, git.get_hash, '/test')
from utlis.rank import setrank, isrank, remrank, remsudos, setsudo, GPranks, setasudo, remasudo
from utlis.send import send_msg, BYusers,sendM,GetLink,Glang
from utlis.tg import Bot,Ckuser
from config import *
from pyrogram import ReplyKeyboardMarkup, InlineKeyboardMarkup, InlineKeyboardButton
import threading, requests, time, random, re, json, datetime, os
import importlib
from collections import defaultdict
from utlis.send import run
from os import listdir
from os.path import isfile, join


def setsudos(redis, userID):
    """Add userID to the bot sudo set; return "sudos" if already present."""
    try:
        if redis.sismember("{}Nbot:sudos".format(BOT_ID), userID):
            return "sudos"
        return redis.sadd("{}Nbot:sudos".format(BOT_ID), userID)
    except Exception:
        return "sudos"


def sudo(client, message, redis):
    type = message.chat.type
    userID = message.from_user.id
    chatID = message.chat.id
    rank = isrank(redis, userID, chatID)
    text = message.text
    title = message.chat.title
    userFN = message.from_user.first_name
    c = importlib.import_module("lang.arcmd")
    r = importlib.import_module("lang.arreply")
if redis.hexists("{}Nbot:stepSUDO".format(BOT_ID),userID):
tx = redis.hget("{}Nbot:stepSUDO".format(BOT_ID),userID)
if text :
redis.hset("{}Nbot:TXreplys".format(BOT_ID),tx,text)
redis.hdel("{}Nbot:stepSUDO".format(BOT_ID),userID)
Bot("sendMessage",{"chat_id":chatID,"text":r.SRtext.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
if message.sticker:
ID = message.sticker.file_id
redis.hset("{}Nbot:STreplys".format(BOT_ID),tx,ID)
redis.hdel("{}Nbot:stepSUDO".format(BOT_ID),userID)
Bot("sendMessage",{"chat_id":chatID,"text":r.SRst.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
if message.animation:
ID = message.animation.file_id
redis.hset("{}Nbot:GFreplys".format(BOT_ID),tx,ID)
redis.hdel("{}Nbot:stepSUDO".format(BOT_ID),userID)
Bot("sendMessage",{"chat_id":chatID,"text":r.SRgf.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
if message.voice:
ID = message.voice.file_id
redis.hset("{}Nbot:VOreplys".format(BOT_ID),tx,ID)
redis.hdel("{}Nbot:stepSUDO".format(BOT_ID),userID)
Bot("sendMessage",{"chat_id":chatID,"text":r.SRvo.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
if message.photo:
ID = message.photo.file_id
redis.hset("{}Nbot:PHreplys".format(BOT_ID),tx,ID)
redis.hdel("{}Nbot:stepSUDO".format(BOT_ID),userID)
Bot("sendMessage",{"chat_id":chatID,"text":r.SRph.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
if message.document:
ID = message.document.file_id
redis.hset("{}Nbot:DOreplys".format(BOT_ID),tx,ID)
redis.hdel("{}Nbot:stepSUDO".format(BOT_ID),userID)
Bot("sendMessage",{"chat_id":chatID,"text":r.SRfi.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
    if text and type in ("supergroup", "group"):
if rank == "sudo":
if text == "وضع مجموعه المطور":
redis.set("{}Nbot:sudogp".format(BOT_ID),chatID)
Bot("sendMessage",{"chat_id":chatID,"text":f"✅꒐ تم تحديد المجموعه لاستلام الاشعارات \n{title} {chatID}","reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.leaveChatS, text) and redis.get("{}Nbot:leaveaddbot".format(BOT_ID)) :
Bot("leaveChat",{"chat_id":chatID})
redis.srem("{}Nbot:groups".format(BOT_ID),chatID)
redis.sadd("{}Nbot:disabledgroups".format(BOT_ID),chatID)
NextDay_Date = datetime.datetime.today() + datetime.timedelta(days=1)
redis.hset("{}Nbot:disabledgroupsTIME".format(BOT_ID),chatID,str(NextDay_Date))
text = text.replace("مسح ","")
if re.search(c.malk, text) and Ckuser(message):
arrays = redis.get("{}Nbot:{}:malk".format(BOT_ID,chatID))
if arrays:
b = BYusers({arrays},chatID,redis,client)
                    if b != "":
Bot("sendMessage",{"chat_id":chatID,"text":r.showlist.format(text,b),"reply_to_message_id":message.message_id,"parse_mode":"markdown"})
else:
Bot("sendMessage",{"chat_id":chatID,"text":r.listempty.format(text),"reply_to_message_id":message.message_id,"parse_mode":"markdown"})
else:
Bot("sendMessage",{"chat_id":chatID,"text":r.listempty.format(text),"reply_to_message_id":message.message_id,"parse_mode":"markdown"})
if re.search(c.setmalk, text) and Ckuser(message):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.setmalk2,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
setcr = setrank(redis,"malk",userId,chatID,"one")
                    if setcr == "malk":
                        send_msg("UD",client, message,r.DsetRK,"",getUser,redis)
                    elif setcr:
                        send_msg("UD",client, message,r.setRK,"",getUser,redis)
except Exception as e:
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.remmalk, text) and Ckuser(message):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.remmalk2,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
setcr = remrank(redis,"malk",userId,chatID,"one")
if setcr:
send_msg("UD",client, message,r.remRK,"",getUser,redis)
elif not setcr:
send_msg("UD",client, message,r.DremRK,"",getUser,redis)
except Exception as e:
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
    if text and type in ("private", "supergroup", "group"):
if rank == "sudo":
if re.search("^رفع نسخه احتياطيه$|^رفع نسخة احتياطية$", text):
msgID = Bot("sendMessage",{"chat_id":chatID,"text":"انتظر قليلاً يتم تحميل الملف ℹ️","reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})["result"]["message_id"]
fileName = message.reply_to_message.download()
JsonDate = json.load(open(fileName))
if int(JsonDate["BOT_ID"]) != int(BOT_ID):
Bot("editMessageText",{"chat_id":chatID,"text":"عذراً هذه الملف ليس لي ⚠️","message_id":msgID,"disable_web_page_preview":True,"parse_mode":"html"})
return 0
co = len(JsonDate["group"])
Bot("editMessageText",{"chat_id":chatID,"text":f"تم ايجاد {co} مجموعه في الملف ℹ️","message_id":msgID,"disable_web_page_preview":True,"parse_mode":"html"})
for chatid in JsonDate["group"].keys():
redis.sadd("{}Nbot:groups".format(BOT_ID),chatid)
for rk in JsonDate["group"][chatid].keys():
if rk == "malk":
setrank(redis,rk,JsonDate["group"][chatid][rk],chatid,"one")
else:
for userId in JsonDate["group"][chatid][rk]:
setrank(redis,rk,userId,chatid,"array")
Bot("editMessageText",{"chat_id":chatID,"text":f"تم رفع المجموعات ✅","message_id":msgID,"disable_web_page_preview":True,"parse_mode":"html"})
if re.search("^جلب نسخه احتياطيه$|^جلب نسخة احتياطية$", text):
JsonSave = defaultdict(list)
JsonSave["BOT_ID"] = BOT_ID
JsonSave["group"] = {}
gps = redis.smembers("{}Nbot:groups".format(BOT_ID))
for gp in gps:
JsonSave["group"][gp] = {}
malk_userid = redis.get("{}Nbot:{}:malk".format(BOT_ID,gp))
if malk_userid:
JsonSave["group"][gp]["malk"] = int(malk_userid)
ranks_ar = {"acreator","creator","owner","admin"}
for rk in ranks_ar:
get = redis.smembers(f"{BOT_ID}Nbot:{gp}:{rk}")
if get:
JsonSave["group"][gp][rk] = {}
user_ids = []
for userid in get:
user_ids.append(int(userid))
JsonSave["group"][gp][rk] = user_ids
with open(f'{userID}.json', 'w') as fp:
json.dump(JsonSave, fp)
da = datetime.datetime.now().strftime("%Y-%m-%d")
message.reply_document(f'{userID}.json',caption=f"عدد المجموعات 💬 : {len(gps)}\nتاريخ النسخه 📆 : {da}\n⎯ ⎯ ⎯ ⎯")
if text == "حذف مجموعه المطور":
redis.delete("{}Nbot:sudogp".format(BOT_ID))
Bot("sendMessage",{"chat_id":chatID,"text":f"✅꒐ تم تحويل الاشعارات الى الخاص","reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search("^تحويل الاساسي$|^تحويل الاساسي @(.*)$|^تحويل الاساسي [0-9]+$", text):
if re.search("@",text):
user = text.split("@")[1]
if re.search("^تحويل الاساسي [0-9]+$" ,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
setsudo(redis,userId)
                    data = open("./config.py").read().replace(f"SUDO = {userID}", f"SUDO = {userId}")
                    open("./config.py","w+").write(data)
Bot("sendMessage",{"chat_id":chatID,"text":f"✅꒐ تم تحويل المطور الاساسي الى {userFn} {userId}","reply_to_message_id":message.message_id,"parse_mode":"html"})
except Exception as e:
print(e)
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.STreplyBOT, text):
tx = text.replace(c.RPreplyBOT,"")
if redis.hexists("{}Nbot:TXreplys".format(BOT_ID,chatID),tx):
Bot("sendMessage",{"chat_id":chatID,"text":r.Yrp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
elif redis.hexists("{}Nbot:STreplys".format(BOT_ID,chatID),tx):
Bot("sendMessage",{"chat_id":chatID,"text":r.Yrp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
elif redis.hexists("{}Nbot:GFreplys".format(BOT_ID,chatID),tx):
Bot("sendMessage",{"chat_id":chatID,"text":r.Yrp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
elif redis.hexists("{}Nbot:VOreplys".format(BOT_ID,chatID),tx):
Bot("sendMessage",{"chat_id":chatID,"text":r.Yrp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
else:
redis.hset("{}Nbot:stepSUDO".format(BOT_ID),userID,tx)
kb = InlineKeyboardMarkup([[InlineKeyboardButton(r.MoreInfo, url="t.me/CCCCCD")]])
Bot("sendMessage",{"chat_id":chatID,"text":r.Sendreply % tx,"reply_to_message_id":message.message_id,"parse_mode":"html","reply_markup":kb})
if re.search(c.DLreplyBOT, text):
tx = text.replace(c.RPdreplyBOT,"")
if redis.hexists("{}Nbot:TXreplys".format(BOT_ID,chatID),tx):
redis.hdel("{}Nbot:TXreplys".format(BOT_ID,chatID),tx)
Bot("sendMessage",{"chat_id":chatID,"text":r.Drp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
elif redis.hexists("{}Nbot:STreplys".format(BOT_ID,chatID),tx):
redis.hdel("{}Nbot:STreplys".format(BOT_ID,chatID),tx)
Bot("sendMessage",{"chat_id":chatID,"text":r.Drp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
elif redis.hexists("{}Nbot:GFreplys".format(BOT_ID,chatID),tx):
redis.hdel("{}Nbot:GFreplys".format(BOT_ID,chatID),tx)
Bot("sendMessage",{"chat_id":chatID,"text":r.Drp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
                elif redis.hexists("{}Nbot:VOreplys".format(BOT_ID,chatID),tx):
                    redis.hdel("{}Nbot:VOreplys".format(BOT_ID,chatID),tx)
Bot("sendMessage",{"chat_id":chatID,"text":r.Drp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
else:
Bot("sendMessage",{"chat_id":chatID,"text":r.Norp.format(tx),"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.ReplylistBOT, text):
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STword,callback_data=json.dumps(["showreplylistBOT","",userID])),InlineKeyboardButton(c.STgifs,callback_data=json.dumps(["showGFreplylistBOT","",userID])),],[InlineKeyboardButton(c.STvoice,callback_data=json.dumps(["showVOreplylistBOT","",userID])),InlineKeyboardButton(c.STsticker,callback_data=json.dumps(["showSTreplylistBOT","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.blocklist.format(text,title),"reply_to_message_id":message.message_id,"parse_mode":"html","reply_markup":reply_markup})
        if rank in ("sudo", "asudo"):
if text == c.remfiles:
onlyfiles = [f for f in listdir("files") if isfile(join("files", f))]
array = []
if not onlyfiles:
Bot("sendMessage",{"chat_id":chatID,"text":r.NOaddfiles2,"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
return False
for f in onlyfiles:
array.append([InlineKeyboardButton(f,callback_data=json.dumps(["delF",f,userID]))])
array.append([InlineKeyboardButton(c.remallfiles,callback_data=json.dumps(["delFa","",userID]))])
kb = InlineKeyboardMarkup(array)
Bot("sendMessage",{"chat_id":chatID,"text":r.dlFiles,"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True,"reply_markup":kb})
if text == c.files:
onlyfiles = [f for f in listdir("files") if isfile(join("files", f))]
filesR = redis.smembers("{}Nbot:botfiles".format(BOT_ID))
array = []
if not onlyfiles:
Bot("sendMessage",{"chat_id":chatID,"text":r.NOaddfiles2,"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
return False
for f in onlyfiles:
if f in filesR:
s = r.true
else:
s = r.false
array.append([InlineKeyboardButton(f+" "+s,callback_data=json.dumps(["au",f,userID]))])
kb = InlineKeyboardMarkup(array)
Bot("sendMessage",{"chat_id":chatID,"text":r.Files,"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True,"reply_markup":kb})
if text == c.ADDfiles:
url = "https://raw.githubusercontent.com/mfmvip/TokyoV2-files/master/files"
req = requests.get(url).text
if not re.search(".py",req):
Bot("sendMessage",{"chat_id":chatID,"text":r.NOaddfiles,"reply_to_message_id":message.message_id,"disable_web_page_preview":True,"parse_mode":"html"})
return False
files = req.split("\n")
array = []
for f in files:
array.append([InlineKeyboardButton(f,callback_data=json.dumps(["dlf",f,userID]))])
kb = InlineKeyboardMarkup(array)
Bot("sendMessage",{"chat_id":chatID,"text":r.addFiles,"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True,"reply_markup":kb})
if text == c.Ubot:
Files_H = ["inline.py","all.py","callback.py","delete.py","edit.py","gpcmd.py","locks.py","msg.py","nf.py","ranks.py","sudo.py"]
Files_L = ["arreply.py","arcmd.py"]
Files_U = ["tg.py","locks.py","rank.py","send.py"]
Files_B = ["bot.py","setup.py"]
for fnh in Files_H:
url = "https://raw.githubusercontent.com/mfmvip/TokyoV2/master/handlers/"+fnh
out = requests.get(url).text
f = open("./handlers/"+fnh,"w+")
f.write(out)
f.close()
for fnu in Files_U:
url = "https://raw.githubusercontent.com/mfmvip/TokyoV2/master/utlis/"+fnu
out = requests.get(url).text
f = open("./utlis/"+fnu,"w+")
f.write(out)
f.close()
for fnb in Files_B:
url = "https://raw.githubusercontent.com/mfmvip/TokyoV2/master/"+fnb
out = requests.get(url).text
f = open("./"+fnb,"w+")
f.write(out)
f.close()
for fnu in Files_L:
url = "https://raw.githubusercontent.com/mfmvip/TokyoV2/master/lang/"+fnu
out = requests.get(url).text
f = open("./lang/"+fnu,"w+")
f.write(out)
f.close()
Bot("sendMessage",{"chat_id":chatID,"text":r.Wres,"reply_to_message_id":message.message_id,"parse_mode":"html"})
run(redis,chatID)
if re.search(c.setSudoC, text):
tx = text.replace(c.RsetSudoC,"")
v = Bot("sendMessage",{"chat_id":chatID,"text":tx,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
redis.set("{}Nbot:SHOWsudos".format(BOT_ID),tx)
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShow,"reply_to_message_id":message.message_id,"parse_mode":"html"})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.sudosList, text) and Ckuser(message):
text = text.replace("مسح ","")
arrays = redis.smembers("{}Nbot:sudos".format(BOT_ID,chatID))
b = BYusers(arrays,chatID,redis,client)
kb = InlineKeyboardMarkup([[InlineKeyboardButton(r.delList.format(text), callback_data=json.dumps(["delList","sudos",userID]))]])
                if b != "":
Bot("sendMessage",{"chat_id":chatID,"text":r.showlist.format(text,b),"reply_to_message_id":message.message_id,"parse_mode":"markdown","reply_markup":kb})
else:
Bot("sendMessage",{"chat_id":chatID,"text":r.listempty.format(text),"reply_to_message_id":message.message_id,"parse_mode":"markdown"})
if re.search(c.setsudos, text) and Ckuser(message):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.setsudos2,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
setcr = setsudos(redis,userId)
                    if setcr == "sudos":
                        send_msg("UD",client, message,r.DsetRK,"",getUser,redis)
                    elif setcr:
                        send_msg("UD",client, message,r.setRK,"",getUser,redis)
except Exception as e:
print(e)
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.remsudos, text) and Ckuser(message):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.remsudos2,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
setcr = remsudos(redis,userId)
if setcr:
send_msg("UD",client, message,r.remRK,"",getUser,redis)
elif not setcr:
send_msg("UD",client, message,r.DremRK,"",getUser,redis)
except Exception as e:
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.asudoList, text) and Ckuser(message):
text = text.replace("مسح ","")
arrays = redis.smembers("{}Nbot:asudo".format(BOT_ID,chatID))
b = BYusers(arrays,chatID,redis,client)
kb = InlineKeyboardMarkup([[InlineKeyboardButton(r.delList.format(text), callback_data=json.dumps(["delList","sudos",userID]))]])
                if b != "":
Bot("sendMessage",{"chat_id":chatID,"text":r.showlist.format(text,b),"reply_to_message_id":message.message_id,"parse_mode":"markdown","reply_markup":kb})
else:
Bot("sendMessage",{"chat_id":chatID,"text":r.listempty.format(text),"reply_to_message_id":message.message_id,"parse_mode":"markdown"})
if re.search(c.setasudo, text) and Ckuser(message):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.setasudo2,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
setcr = setasudo(redis,userId)
                    if setcr == "sudos":
                        send_msg("UD",client, message,r.DsetRK,"",getUser,redis)
                    elif setcr:
                        send_msg("UD",client, message,r.setRK,"",getUser,redis)
except Exception as e:
print(e)
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.remasudo, text) and Ckuser(message):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.remasudo2,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
setcr = remasudo(redis,userId)
if setcr:
send_msg("UD",client, message,r.remRK,"",getUser,redis)
elif not setcr:
send_msg("UD",client, message,r.DremRK,"",getUser,redis)
except Exception as e:
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.banall, text):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.ban2all,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
BY = "<a href=\"tg://user?id={}\">{}</a>".format(userId,userFn)
Getrank = isrank(redis,userId,chatID)
GetGprank = GPranks(userId,chatID)
if Getrank == "bot":return False
if Getrank == "sudos" or Getrank == "sudo":
Bot("sendMessage",{"chat_id":chatID,"text":r.cTsudo,"reply_to_message_id":message.message_id,"parse_mode":"html"})
return False
if redis.sismember("{}Nbot:bans".format(BOT_ID),userId):
Bot("sendMessage",{"chat_id":chatID,"text":r.Dbanall.format(BY),"reply_to_message_id":message.message_id,"parse_mode":"html"})
else:
redis.sadd("{}Nbot:bans".format(BOT_ID),userId)
Bot("sendMessage",{"chat_id":chatID,"text":r.banall.format(BY),"reply_to_message_id":message.message_id,"parse_mode":"html"})
if (GetGprank == "member" or GetGprank == "restricted"):
Bot("kickChatMember",{"chat_id":chatID,"user_id":userId})
except Exception as e:
print(e)
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.unbanall, text):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.unban2all,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
Getrank = isrank(redis,userId,chatID)
GetGprank = GPranks(userId,chatID)
if Getrank == "bot":return False
if redis.sismember("{}Nbot:bans".format(BOT_ID),userId):
redis.srem("{}Nbot:bans".format(BOT_ID),userId)
send_msg("BNN",client, message,r.unbanall,"bans",getUser,redis)
else:
send_msg("BNN",client, message,r.Dunbanall,"bans",getUser,redis)
except Exception as e:
print(e)
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.TKall, text):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.TK2all,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
Getrank = isrank(redis,userId,chatID)
GetGprank = GPranks(userId,chatID)
if Getrank == "bot":return False
if Getrank == "sudos" or Getrank == "sudo":
Bot("sendMessage",{"chat_id":chatID,"text":r.cTsudo,"reply_to_message_id":message.message_id,"parse_mode":"html"})
return False
if redis.sismember("{}Nbot:restricteds".format(BOT_ID),userId):
send_msg("BNN",client, message,r.Drestrictedall,"restricteds",getUser,redis)
else:
send_msg("BNN",client, message,r.restrictedall,"restricteds",getUser,redis)
redis.sadd("{}Nbot:restricteds".format(BOT_ID),userId)
if (GetGprank == "member"):
Bot("restrictChatMember",{"chat_id": chatID,"user_id": userId,"can_send_messages": 0,"can_send_media_messages": 0,"can_send_other_messages": 0,
"can_send_polls": 0,"can_change_info": 0,"can_add_web_page_previews": 0,"can_pin_messages": 0,"can_invite_users": 0,})
except Exception as e:
print(e)
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.unTKall, text):
if re.search("@",text):
user = text.split("@")[1]
if re.search(c.unTK2all,text):
user = text.split(" ")[2]
if message.reply_to_message:
user = message.reply_to_message.from_user.id
if 'user' not in locals():return False
try:
getUser = client.get_users(user)
userId = getUser.id
userFn = getUser.first_name
Getrank = isrank(redis,userId,chatID)
GetGprank = GPranks(userId,chatID)
if Getrank == "bot":return False
if redis.sismember("{}Nbot:restricteds".format(BOT_ID),userId):
send_msg("BNN",client, message,r.unrestrictedall,"restricteds",getUser,redis)
redis.srem("{}Nbot:restricteds".format(BOT_ID),userId)
else:
send_msg("BNN",client, message,r.Dunrestrictedall,"restricteds",getUser,redis)
except Exception as e:
print(e)
Bot("sendMessage",{"chat_id":chatID,"text":r.userNocc,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if re.search(c.Alllist, text) and Ckuser(message):
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STbanall,callback_data=json.dumps(["showbanall","",userID])),InlineKeyboardButton(c.STtkall,callback_data=json.dumps(["showtkall","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.banlist,"reply_to_message_id":message.message_id,"parse_mode":"html","reply_markup":reply_markup})
if re.search(c.stats, text) and Ckuser(message):
pr = redis.scard("{}Nbot:privates".format(BOT_ID))
gp = redis.scard("{}Nbot:groups".format(BOT_ID))
kb = InlineKeyboardMarkup([[InlineKeyboardButton(r.CKgps,callback_data=json.dumps(["ckGPs","",userID]))]])
Bot("sendMessage",{"chat_id":chatID,"text":r.showstats.format(gp,pr),"reply_to_message_id":message.message_id,"parse_mode":"html","reply_markup":kb})
if re.search(c.fwdall, text) and message.reply_to_message:
Bot("forwardMessage",{"chat_id":chatID,"from_chat_id":chatID,"message_id":message.reply_to_message.message_id})
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["fwdtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["fwdtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
if re.search(c.showGPS, text) and Ckuser(message):
IDS = redis.smembers("{}Nbot:groups".format(BOT_ID))
GPslist = ""
i = 1
for ID in IDS:
get = Bot("getChat",{"chat_id":ID})
if get["ok"]:
Title = (get["result"]["title"] or "None")
Link = (redis.hget("{}Nbot:links".format(BOT_ID),ID) or GetLink(ID) or "none")
name = "[{}]({})".format(Title,Link)
N = r.ShowGPN.format(i,name,ID)
GPslist = GPslist+"\n\n"+N
                i += 1
sendM("NO",GPslist,message)
            if text == c.Laudo:
R = text.split(" ")[1]
get = redis.get("{}Nbot:autoaddbot".format(BOT_ID))
BY = "<a href=\"tg://user?id={}\">{}</a>".format(userID,userFN)
                if get:
Bot("sendMessage",{"chat_id":chatID,"text":r.ADDed.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
else:
save = redis.set("{}Nbot:autoaddbot".format(BOT_ID),1)
Bot("sendMessage",{"chat_id":chatID,"text":r.ADD.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
            if text == c.Uauto:
R = text.split(" ")[1]
BY = "<a href=\"tg://user?id={}\">{}</a>".format(userID,userFN)
get = redis.get("{}Nbot:autoaddbot".format(BOT_ID))
                if get:
save = redis.delete("{}Nbot:autoaddbot".format(BOT_ID))
Bot("sendMessage",{"chat_id":chatID,"text":r.unADD.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
else:
Bot("sendMessage",{"chat_id":chatID,"text":r.unADDed.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
            if text == c.Lleave:
R = text.split(" ")[1]
get = redis.get("{}Nbot:leaveaddbot".format(BOT_ID))
BY = "<a href=\"tg://user?id={}\">{}</a>".format(userID,userFN)
                if get:
Bot("sendMessage",{"chat_id":chatID,"text":r.ADDed.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
else:
save = redis.set("{}Nbot:leaveaddbot".format(BOT_ID),1)
Bot("sendMessage",{"chat_id":chatID,"text":r.ADD.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
            if text == c.Uleave:
R = text.split(" ")[1]
BY = "<a href=\"tg://user?id={}\">{}</a>".format(userID,userFN)
get = redis.get("{}Nbot:leaveaddbot".format(BOT_ID))
                if get:
save = redis.delete("{}Nbot:leaveaddbot".format(BOT_ID))
Bot("sendMessage",{"chat_id":chatID,"text":r.unADD.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
else:
Bot("sendMessage",{"chat_id":chatID,"text":r.unADDed.format(BY,R),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
if re.search(c.Setauto, text):
N = text.split(" ")[2]
redis.set("{}Nbot:autoaddbotN".format(BOT_ID),int(N))
Bot("sendMessage",{"chat_id":chatID,"text":r.SetAuto.format(N),"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
if re.search(c.leaveChat,text):
ch = text.split(" ")[1]
Bot("leaveChat",{"chat_id":ch})
redis.srem("{}Nbot:groups".format(BOT_ID),ch)
redis.sadd("{}Nbot:disabledgroups".format(BOT_ID),ch)
NextDay_Date = datetime.datetime.today() + datetime.timedelta(days=1)
redis.hset("{}Nbot:disabledgroupsTIME".format(BOT_ID),ch,str(NextDay_Date))
Bot("sendMessage",{"chat_id":chatID,"text":r.DoneleaveChat,"reply_to_message_id":message.message_id,"parse_mode":"html","disable_web_page_preview":True})
if re.search(c.sendall, text) and message.reply_to_message and Ckuser(message):
if message.reply_to_message.text:
v = Bot("sendMessage",{"chat_id":chatID,"text":message.reply_to_message.text,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.photo:
ID = message.reply_to_message.photo.file_id
CP = message.reply_to_message.caption
v = Bot("sendphoto",{"chat_id":chatID,"photo":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.voice:
ID = message.reply_to_message.voice.file_id
CP = message.reply_to_message.caption
v = Bot("sendvoice",{"chat_id":chatID,"voice":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.audio:
ID = message.reply_to_message.audio.file_id
CP = message.reply_to_message.caption
v = Bot("sendaudio",{"chat_id":chatID,"audio":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.document:
ID = message.reply_to_message.document.file_id
CP = message.reply_to_message.caption
v = Bot("senddocument",{"chat_id":chatID,"document":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.sticker:
ID = message.reply_to_message.sticker.file_id
CP = message.reply_to_message.caption
v = Bot("sendsticker",{"chat_id":chatID,"sticker":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.animation:
ID = message.reply_to_message.animation.file_id
CP = message.reply_to_message.caption
v = Bot("sendanimation",{"chat_id":chatID,"animation":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.video:
ID = message.reply_to_message.video.file_id
CP = message.reply_to_message.caption
v = Bot("sendvideo",{"chat_id":chatID,"video":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
if message.reply_to_message.video_note:
ID = message.reply_to_message.video_note.file_id
CP = message.reply_to_message.caption
v = Bot("sendVideoNote",{"chat_id":chatID,"video_note":ID,"caption":CP,"reply_to_message_id":message.message_id,"parse_mode":"html"})
if v["ok"]:
reply_markup=InlineKeyboardMarkup([[InlineKeyboardButton(c.STgroup,callback_data=json.dumps(["sendtogroups","",userID])),InlineKeyboardButton(c.STprivates,callback_data=json.dumps(["sendtoprivates","",userID])),]])
Bot("sendMessage",{"chat_id":chatID,"text":r.sendto,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html","reply_markup":reply_markup})
elif v["ok"] == False:
Bot("sendMessage",{"chat_id":chatID,"text":r.DsetSudosShowE,"reply_to_message_id":message.reply_to_message.message_id,"parse_mode":"html"})
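The broadcast keyboard above packs its callback payload with `json.dumps(["sendtogroups", "", userID])`; the matching callback handler is outside this excerpt. A minimal sketch of the round-trip with hypothetical helper names, assuming the handler decodes the payload with `json.loads` (Telegram caps `callback_data` at 64 bytes):

```python
import json

def pack_callback(action, extra, user_id):
    # Telegram limits callback_data to 64 bytes, so the payload must stay small.
    data = json.dumps([action, extra, user_id])
    assert len(data.encode()) <= 64, "callback_data must fit in 64 bytes"
    return data

def unpack_callback(data):
    # Inverse of pack_callback: recover (action, extra, user_id).
    action, extra, user_id = json.loads(data)
    return action, extra, user_id

packed = pack_callback("sendtogroups", "", 123456789)
print(unpack_callback(packed))  # ('sendtogroups', '', 123456789)
```

Keeping the user id inside the payload is what lets the callback handler verify that only the sudo user who started the broadcast can press the buttons.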
98bd987b83deb37fd9cdd5939523a464e2a6bdc2 | 284 | py | Python | python/graphscope/experimental/nx/tests/algorithms/forward/test_isolate.py | wenyuanyu/GraphScope@a40ccaf70557e608d8b091eb25ab04477f99ce21 | ["Apache-2.0"] | stars: 2 | issues: 1 | forks: 1

import networkx.algorithms.tests.test_isolate
import pytest
from graphscope.experimental.nx.utils.compat import import_as_graphscope_nx
import_as_graphscope_nx(networkx.algorithms.tests.test_isolate,
                        decorators=pytest.mark.usefixtures("graphscope_session"))
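`import_as_graphscope_nx` re-exports networkx's upstream test module so the same tests run against graphscope's backend; the real helper lives in `graphscope.experimental.nx.utils.compat`. A rough, hypothetical sketch of the idea:

```python
import types

def import_tests_from(src_module, target_globals, decorators=None):
    # Hypothetical sketch: copy test functions/classes from an upstream module
    # into the caller's namespace, optionally wrapping each callable with a
    # decorator (e.g. pytest.mark.usefixtures(...)) so they run against
    # another backend's fixtures.
    for name, obj in vars(src_module).items():
        if name.startswith(("test_", "Test")):
            if decorators is not None and callable(obj):
                obj = decorators(obj)
            target_globals[name] = obj

# Demo with a fake "upstream" test module.
fake = types.ModuleType("fake_tests")
fake.test_ok = lambda: True
ns = {}
import_tests_from(fake, ns)
print(sorted(ns))  # ['test_ok']
```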
98cb07f0a81bf256ae86a6ac8d42bcf4a93589cc | 60 | py | Python | correlateim/__init__.py | DeMarcoLab/correlateim@6cd2b715b13520a7fb249970b320d822f25d9b3b | ["MIT"] | stars: 1 | issues: 6 | forks: 2

from correlateim._version import __version__  # noqa: F401
98dc35031826b11e6fca31d52470402ae23c4110 | 240 | py | Python | test-suite/exonum-py-tests/exonum_tests/__init__.py | mobilipia/milestone-core@53134d0816264f9c60d3b86e28947669c3ccc609 | ["Apache-2.0"] | issues: 2 (forks repo: slowli/exonum@5a0ca3f79cd0ac65fda701f768a3cdd5a4f682e7)

"""Entry point of Python Exonum integration tests module"""
from exonum_tests.api import *
from exonum_tests.crypto_advanced import *
# Skip deploy tests. The cryptocurrency-advanced is included service.
# from exonum_tests.deploy import *
98e2b1d618d3ffaaf63cb7e9cace1dcbceee3cf6 | 49 | py | Python | UTMDriver/generic/documents/tickets/__init__.py | maxpoint2point/UTMDriver@75f5687fc0191eb2bf3005774cbde6e4935a6d04 | ["Apache-2.0"] | stars: 1 (issues/forks path: UTMDriver/generic/queries/utm/__init__.py)

# Copyright (c) maxpoint2point@gmail.com 2020.
98eda2df323f2572720781c0641dfa55b7d133e0 | 119 | py | Python | prebuilder/__main__.py | prebuilder/prebuilder.py@e5ee9dce0e46a3bc022dfaa0ce9f1be0563e2bdc | ["Unlicense"] | stars: 4 | forks: 1

if __name__ == "__main__":
    # Path comes from pathlib; parseDebhelperDebianDir must be provided by the
    # surrounding prebuilder package (its import is missing in the original).
    import sys
    from pathlib import Path
    from pprint import pprint
    pprint(parseDebhelperDebianDir(Path(sys.argv[1])))
c71a970d9afb43a621ed55f39e3f38b2888913ca | 96 | py | Python | venv/lib/python3.8/site-packages/debugpy/_vendored/pydevd/_pydevd_frame_eval/vendored/bytecode/flags.py | GiulianaPola/select_repeats@17a0d053d4f874e42cf654dd142168c2ec8fbd11 | ["MIT"] | stars: 2 | issues: 19 (issues/forks repo: DesmoSearch/Desmobot@b70b45df3485351f471080deb5c785c4bc5c4beb)

/home/runner/.cache/pip/pool/ac/dc/3f/9aa3a8523c0517af211ec81e0cef2637b76326b5aafb0f6ab7b2905bb4
c7518503c41244bbb918226b2c8da620de4e03fb | 83 | py | Python | eng/__init__.py | Rabmelon/tiSPHi | 8ffb0e505edd01cb31cb049bfe54f1f2b99cf121 | [
"MIT"
] | 5 | 2022-01-03T12:14:34.000Z | 2022-02-11T01:22:52.000Z | eng/__init__.py | Rabmelon/taichiCourse01_tiSPHi | 8ffb0e505edd01cb31cb049bfe54f1f2b99cf121 | [
"MIT"
] | null | null | null | eng/__init__.py | Rabmelon/taichiCourse01_tiSPHi | 8ffb0e505edd01cb31cb049bfe54f1f2b99cf121 | [
"MIT"
] | null | null | null | from . import particle_system, sph_solver, guishow, gguishow, wcsph, DPsph, muIsph
c76fa0769976105c1f5e6548820ceeac4d4f325c | 4799 | py | Python | iseq/test/test_hmmer3_compat.py | EBI-Metagenomics/iseq@3c28fc92e5af05c91c6669d7f1a28d1ce857f3f1 | ["MIT"]

import os
from math import isfinite
from pathlib import Path

import pytest
from fasta_reader import read_fasta
from hmmer_reader import open_hmmer
from imm import Sequence
from imm.testing import assert_allclose

from iseq.example import example_filepath
from iseq.hmmer3 import create_profile
from iseq.hmmer_model import HMMERModel


@pytest.mark.slow
def test_hmmer3_pfam_viterbi_scores_compat(tmp_path):
    os.chdir(tmp_path)
    db_filepath = example_filepath("Pfam-A.hmm")
    target_filepath = example_filepath("A0ALD9.fasta")
    iseq_scores = loadtxt(example_filepath("Pfam-A_iseq_viterbi_scores.txt"))

    with read_fasta(target_filepath) as fasta:
        target = list(fasta)[0]

    actual_scores = []
    for hmmprof in open_hmmer(db_filepath):
        prof = create_profile(HMMERModel(hmmprof), hmmer3_compat=True)
        seq = Sequence.create(target.sequence.encode(), prof.alphabet)
        search_results = prof.search(seq)
        score = search_results.results[0].alt_viterbi_score
        actual_scores.append(score)

    iseq_scores = loadtxt(example_filepath("Pfam-A_iseq_viterbi_scores.txt"))
    assert_allclose(actual_scores, iseq_scores)

    hmmer3_scores = loadtxt(example_filepath("Pfam-A_hmmer3.3_viterbi_scores.txt"))
    ok = [i for i, score in enumerate(hmmer3_scores) if isfinite(score)]
    actual_scores = [actual_scores[i] for i in ok]
    hmmer3_scores = [hmmer3_scores[i] for i in ok]
    assert_allclose(actual_scores, hmmer3_scores, 3e-2)


def test_hmmer3_viterbi_dna_scores_compat():
    hmmfile = example_filepath("2OG-FeII_Oxy_3-nt.hmm")
    hmmprof = open_hmmer(hmmfile).read_model()
    prof = create_profile(HMMERModel(hmmprof), hmmer3_compat=True)

    for align in ["local", "unilocal", "glocal", "uniglocal"]:
        fastafile = f"2OG-FeII_Oxy_3-nt_{align}.fasta"
        hmmer3_vitfile = f"hmmer3_2OG-FeII_Oxy_3-nt_{align}.fasta.viterbi"
        iseq_vitfile = f"iseq_2OG-FeII_Oxy_3-nt_{align}.fasta.viterbi"

        hmmer3_scores = loadtxt(example_filepath(hmmer3_vitfile))
        iseq_scores = loadtxt(example_filepath(iseq_vitfile))
        # HMMER3 viterbi filter has very low accuracy (2 bytes of integer arithmetic)
        # while we use 8 bytes of floating point arithmetic. Therefore we have
        # to allow for relatively high viterbi score differences.
        ok = [i for i, s in enumerate(hmmer3_scores) if isfinite(s) and abs(s) > 1]
        hmmer3_scores = [hmmer3_scores[i] for i in ok]
        iseq_scores = [iseq_scores[i] for i in ok]
        assert_allclose(iseq_scores, hmmer3_scores, rtol=7e-2)

        actual_scores = []
        with read_fasta(example_filepath(fastafile)) as fasta:
            for target in fasta:
                seq = Sequence.create(target.sequence.encode(), prof.alphabet)
                search_results = prof.search(seq)
                score = search_results.results[0].alt_viterbi_score
                actual_scores.append(score)

        iseq_scores = loadtxt(example_filepath(iseq_vitfile))
        assert_allclose(actual_scores, iseq_scores, rtol=1e-4)


def test_hmmer3_viterbi_amino_scores_compat():
    hmmfile = example_filepath("2OG-FeII_Oxy_3.hmm")
    hmmprof = open_hmmer(hmmfile).read_model()
    prof = create_profile(HMMERModel(hmmprof), hmmer3_compat=True)

    for align in ["local", "unilocal", "glocal", "uniglocal"]:
        fastafile = f"2OG-FeII_Oxy_3_{align}.fasta"
        hmmer3_vitfile = f"hmmer3_2OG-FeII_Oxy_3_{align}.fasta.viterbi"
        iseq_vitfile = f"iseq_2OG-FeII_Oxy_3_{align}.fasta.viterbi"

        hmmer3_scores = loadtxt(example_filepath(hmmer3_vitfile))
        iseq_scores = loadtxt(example_filepath(iseq_vitfile))
        # HMMER3 viterbi filter has very low accuracy (2 bytes of integer arithmetic)
        # while we use 8 bytes of floating point arithmetic. Therefore we have
        # to allow for relatively high viterbi score differences.
        ok = [i for i, s in enumerate(hmmer3_scores) if isfinite(s)]
        hmmer3_scores = [hmmer3_scores[i] for i in ok]
        iseq_scores = [iseq_scores[i] for i in ok]
        assert_allclose(iseq_scores, hmmer3_scores, rtol=1e-2)

        actual_scores = []
        with read_fasta(example_filepath(fastafile)) as fasta:
            for target in fasta:
                seq = Sequence.create(target.sequence.encode(), prof.alphabet)
                search_results = prof.search(seq)
                score = search_results.results[0].alt_viterbi_score
                actual_scores.append(score)

        iseq_scores = loadtxt(example_filepath(iseq_vitfile))
        assert_allclose(actual_scores, iseq_scores, rtol=1e-5)


def loadtxt(filepath: Path):
    arr = []
    with open(filepath, "r") as file:
        for line in file:
            arr.append(float(line))
    return arr
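`loadtxt` reads one score per line with `float()`, which also parses `inf`/`-inf`; that is why the tests above filter scores with `isfinite` before comparing. A self-contained check of that behavior (temp file only, no iseq dependency):

```python
import os
import tempfile
from pathlib import Path

def loadtxt(filepath: Path):
    # Same shape as the helper above: one float per line.
    arr = []
    with open(filepath, "r") as file:
        for line in file:
            arr.append(float(line))
    return arr

# float() parses "-inf" too, modeling a score the tests must filter out.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("1.5\n-inf\n3.25\n")
scores = loadtxt(Path(tmp.name))
os.unlink(tmp.name)
print(scores)  # [1.5, -inf, 3.25]
```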
c784d06c4b55dbc74c2615618c49d8226619178a | 39 | py | Python | tests/__init__.py | yuvipanda/binderbot@e9a209ee2a2ac46faa062744671157997181485f | ["BSD-3-Clause"] | stars: 8 | issues: 28 | forks: 4

"""Unit test package for binderbot."""
c78a8d2f48dc60ff1bff2e05be9e83a2489dc63f | 7571 | py | Python | tests/unit/utils/test_auth.py | Maxcutex/personal_ecommerce@be09fb20eae1b225523acde06f8e75effcc3676f | ["MIT"] | issues: 2

from tests.base_test_case import BaseTestCase
from app.utils.auth import Auth
from unittest.mock import patch
from collections import namedtuple


class TestAuth(BaseTestCase):

    def setUp(self):
        self.BaseSetUp()

    def test_get_user_method_return_dict_of_user_data_if_valid_header_present(self):
        with self.app.test_request_context(path='/api/v1/vendors', method='GET', headers=self.headers()) as request:
            user_data = Auth._get_user()
            self.assertIsInstance(user_data, dict)
            self.assertIsNotNone(user_data)
            self.assertJSONKeysPresent(user_data, 'customerId', 'name', 'email')

    def test_user_method_return_list_of_user_data_based_on_supplied_keys(self):
        with self.app.test_request_context(path='/api/v1/vendors', method='GET', headers=self.headers()) as request:
            decoded = Auth.decode_token(self.get_valid_token())
            values = Auth.user('customerId', 'name', 'email')
            customer_id, name, email = values
            self.assertIsInstance(values, list)
            self.assertEquals(decoded['UserInfo']['customerId'], customer_id)
            self.assertEquals(decoded['UserInfo']['name'], name)
            self.assertEquals(decoded['UserInfo']['email'], email)

    def test_get_token_throws_exception_when_auth_header_missing(self):
        try:
            Auth.get_token()
            assert False
        except Exception as e:
            assert True

    def test_get_token_return_token_if_valid_header_present(self):
        with self.app.test_request_context(path='/api/v1/vendors', method='GET', headers=self.headers()) as request:
            token = Auth.get_token()
            self.assertIsInstance(token, str)
            self.assertIsNotNone(token)

    def test_decode_token_throws_exception_on_invalid_token(self):
        try:
            Auth.decode_token(self.get_invalid_token())
            assert False
        except Exception as e:
            assert True

    def test_decode_token_returns_dict_on_valid_token(self):
        token = Auth.decode_token(self.get_valid_token())
        if type(token) is dict:
            assert True
        else:
            assert False

    def test_get_location_throws_exception_when_location_header_missing(self):
        try:
            Auth.get_location()
            assert False
        except Exception as e:
            assert True

    def test_get_location_header_returns_int_value_when_location_header_present(self):
        with self.app.test_request_context(path='/api/v1/vendors', method='GET', headers=self.headers()) as request:
            location = Auth.get_location()
            self.assertIsInstance(location, int)
            self.assertIsNotNone(location)

    @patch('app.repositories.role_repo.RoleRepo.get')
    @patch('app.repositories.user_role_repo.UserRoleRepo.find_first')
    @patch('app.utils.auth.Auth.user')
    def test_has_role_method_handles_succeeds(self, mock_auth_user, mock_find_first, mock_role_repo):
        def mock_get(*args):
            get_obj = namedtuple('mock', 'name')
            return get_obj('admin')

        class MockRole:
            role_id = 1

        mock_auth_user.return_value = {'customerId': 1}
        mock_find_first.return_value = MockRole
        mock_role_repo.return_value = mock_get()

        response = Auth.has_role('admin')(lambda n: n)('test')
        self.assertEqual(response, 'test')

    @patch('app.utils.auth.Auth.user')
    def test_has_role_method_handles_user_not_found(self, mock_auth_user):
        mock_auth_user.return_value = None

        response = Auth.has_role('admin')(lambda n: n)('test')
        self.assertEqual(response[0].get_json()['msg'], 'Missing User ID in token')

    @patch('app.repositories.user_role_repo.UserRoleRepo.find_first')
    @patch('app.utils.auth.Auth.user')
    def test_has_role_method_handles_role_not_fond(self, mock_auth_user, mock_find_first):
        mock_auth_user.return_value = {'customerId': 1}
        mock_find_first.return_value = None

        response = Auth.has_role('admin')(lambda n: n)('test')
        self.assertEqual(response[0].get_json()['msg'], 'Access Error - No Role Granted')

    @patch('app.repositories.role_repo.RoleRepo.get')
    @patch('app.repositories.user_role_repo.UserRoleRepo.find_first')
    @patch('app.utils.auth.Auth.user')
    def test_has_role_method_handles_unmatching_roles(self, mock_auth_user, mock_find_first, mock_role_repo):
        def mock_get(*args):
            get_obj = namedtuple('mock', 'name')
            return get_obj('user')

        class MockRole:
            role_id = 1

        mock_auth_user.return_value = {'customerId': 1}
        mock_find_first.return_value = MockRole
        mock_role_repo.return_value = mock_get()

        response = Auth.has_role('admin')(lambda n: n)('test')
        self.assertEqual(response[0].get_json()['msg'], 'Access Error - This role does not have the access rights')

    @patch('app.utils.auth.Auth.user')
    def test_has_permission_method_handles_missing_user_id(self, mock_auth_user):
        mock_auth_user.return_value = None

        response = Auth.has_permission('permission')('permission')()
        self.assertEqual(response[0].get_json()['msg'], 'Missing User ID in token')

    @patch('app.repositories.user_role_repo.UserRoleRepo.find_first')
    @patch('app.utils.auth.Auth.user')
    def test_has_permission_method_handles_role_not_fond(self, mock_auth_user, mock_find_first):
        mock_auth_user.return_value = {'customerId': 1}
        mock_find_first.return_value = None

        response = Auth.has_permission('admin')(lambda n: n)('test')
        self.assertEqual(response[0].get_json()['msg'], 'Access Error - No Role Granted')

    @patch('app.repositories.role_repo.RoleRepo.get')
    @patch('app.repositories.user_role_repo.UserRoleRepo.find_first')
    @patch('app.utils.auth.Auth.user')
    def test_has_permission_method_handles_missing_permissions(self, mock_auth_user, mock_find_first, mock_role_repo):
        def mock_get(*args):
            get_obj = namedtuple('mock', 'name')
            return get_obj('user')

        class MockRole:
            role_id = 1

        mock_auth_user.return_value = {'customerId': 1}
        mock_find_first.return_value = MockRole
        mock_role_repo.return_value = mock_get()

        response = Auth.has_permission('admin')(lambda n: n)('test')
        self.assertEqual(response[0].get_json()['msg'], 'Access Error - No Permission Granted')

    @patch('app.repositories.permission_repo.PermissionRepo.filter_by')
    @patch('app.repositories.role_repo.RoleRepo.get')
    @patch('app.repositories.user_role_repo.UserRoleRepo.find_first')
    @patch('app.utils.auth.Auth.user')
    def test_has_permission_method_handles_permission_denied(self, mock_auth_user, mock_find_first, mock_role_repo, mock_filter_by):
        def mock_get(*args):
            get_obj = namedtuple('mock', 'name')
            return get_obj('user')

        class MockRole:
            role_id = 1

        class MockPermission:
            keyword = 'test'

        class MockFilter:
            items = [MockPermission]

        mock_auth_user.return_value = {'customerId': 1}
        mock_find_first.return_value = MockRole
        mock_role_repo.return_value = mock_get()
        mock_filter_by.return_value = MockFilter

        response = Auth.has_permission('admin')(lambda n: n)('test')
        self.assertEqual(response[0].get_json()['msg'], 'Access Error - Permission Denied')

    @patch('app.repositories.permission_repo.PermissionRepo.filter_by')
    @patch('app.repositories.role_repo.RoleRepo.get')
    @patch('app.repositories.user_role_repo.UserRoleRepo.find_first')
    @patch('app.utils.auth.Auth.user')
    def test_has_permission_method_succeeds(self, mock_auth_user, mock_find_first, mock_role_repo, mock_filter_by):
        def mock_get(*args):
            get_obj = namedtuple('mock', 'name')
            return get_obj('user')

        class MockRole:
            role_id = 1

        class MockPermission:
            keyword = 'admin'

        class MockFilter:
            items = [MockPermission]

        mock_auth_user.return_value = {'customerId': 1}
        mock_find_first.return_value = MockRole
        mock_role_repo.return_value = mock_get()
        mock_filter_by.return_value = MockFilter

        response = Auth.has_permission('admin')(lambda n: n)('test')
        self.assertEqual(response, 'test')
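Several tests above build repository stubs with `namedtuple('mock', 'name')`. The pattern in isolation: a one-field record type stands in for an ORM row when the code under test only reads `.name`.

```python
from collections import namedtuple

# A one-field record type is a lightweight stand-in for an ORM row.
RoleStub = namedtuple('mock', 'name')

role = RoleStub('admin')
print(role.name)           # admin
print(role == ('admin',))  # True: namedtuples compare like plain tuples
```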
c790218b01f6f3912c4a67af7b5398f23e2ca1eb | 209 | py | Python | pyriemann/__init__.py | chrinide/pyRiemann@0f5eb0224673976a53e5d2201a13a187c5bc1e8d | ["BSD-3-Clause"] | stars: 2

from . import classification
from . import tangentspace
from . import channelselection
from . import estimation
from . import spatialfilters
from . import clustering
from . import stats
__version__ = "0.2.5"
c7cdca1bf77ac36b0fdf0653768e4b59f4256495 | 300 | py | Python | machine.py | ahmadabdulnasir/FaceAttendance@01c72bf66c8419336b089712aefac477848f1f0a | ["MIT"] | stars: 1 | issues: 1 (issues/forks repo: Salafi9/FaceAttendance)

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
__author__ = 'Ahmad Abdulnasir Shu'aib <me@ahmadabdulnasir.com.ng>'
__homepage__ = https://ahmadabdulnasir.com.ng
__copyright__ = 'Copyright (c) 2019, salafi'
__version__ = "0.01t"
"""
def boot():
    pass


if __name__ == "__main__":
    boot()
c7e3581f68ed3cf9cd0084fc847445c879656e3b | 108 | py | Python | daily_email/main.py | ariannedee/daily_email_1 | 714f5f54dfc12ff2b48c29a5300149e5b2c39641 | [
"Unlicense"
] | 2 | 2022-02-25T19:03:57.000Z | 2022-03-29T16:04:17.000Z | daily_email/main.py | ariannedee/daily_email_1 | 714f5f54dfc12ff2b48c29a5300149e5b2c39641 | [
"Unlicense"
] | null | null | null | daily_email/main.py | ariannedee/daily_email_1 | 714f5f54dfc12ff2b48c29a5300149e5b2c39641 | [
"Unlicense"
] | 3 | 2022-02-24T18:31:09.000Z | 2022-02-24T18:47:36.000Z | from send_email import send_text_email
send_text_email(subject='An email', content='This is a test email')
| 27 | 67 | 0.805556 | 19 | 108 | 4.315789 | 0.631579 | 0.195122 | 0.317073 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 108 | 3 | 68 | 36 | 0.854167 | 0 | 0 | 0 | 0 | 0 | 0.259259 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
40213a7c13249348f6d3ce2b40525b81b24d0723 | 115 | py | Python | handlers/users/__init__.py | Gerleff/4plus1bot | 4d672ff7410d1b388d92bd932d46953cb05f34b7 | [
"Apache-2.0"
] | null | null | null | handlers/users/__init__.py | Gerleff/4plus1bot | 4d672ff7410d1b388d92bd932d46953cb05f34b7 | [
"Apache-2.0"
] | null | null | null | handlers/users/__init__.py | Gerleff/4plus1bot | 4d672ff7410d1b388d92bd932d46953cb05f34b7 | [
"Apache-2.0"
] | null | null | null | from .help import dp
from .start import dp
from .admin_panel import dp
from .purchase import dp
__all__ = ["dp"]
| 14.375 | 27 | 0.73913 | 19 | 115 | 4.210526 | 0.473684 | 0.4 | 0.45 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.182609 | 115 | 7 | 28 | 16.428571 | 0.851064 | 0 | 0 | 0 | 0 | 0 | 0.017391 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.8 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
40387018820639da966cdf7493a913884511fb52 | 73 | py | Python | examples/list.sort/ex2.py | mcorne/python-by-example | 15339c0909c84b51075587a6a66391100971c033 | [
"MIT"
] | null | null | null | examples/list.sort/ex2.py | mcorne/python-by-example | 15339c0909c84b51075587a6a66391100971c033 | [
"MIT"
] | null | null | null | examples/list.sort/ex2.py | mcorne/python-by-example | 15339c0909c84b51075587a6a66391100971c033 | [
"MIT"
] | null | null | null | # l.sort(reverse=True)
l = [5, 2, 3, 1, 4]
l.sort(reverse=True)
print(l)
| 14.6 | 22 | 0.60274 | 16 | 73 | 2.75 | 0.625 | 0.227273 | 0.545455 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080645 | 0.150685 | 73 | 4 | 23 | 18.25 | 0.629032 | 0.273973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
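The `ex2.py` snippet above sorts in place with `list.sort(reverse=True)`. A minimal companion sketch contrasting this with the built-in `sorted()`, which returns a new list and leaves the original untouched (note that `list.sort()` itself returns `None`):

```python
# sorted() produces a new list; list.sort() reorders in place.
l = [5, 2, 3, 1, 4]
s = sorted(l, reverse=True)    # new list, l unchanged at this point
result = l.sort(reverse=True)  # in-place; the call returns None
```

A common pitfall is writing `l = l.sort()`, which silently replaces the list with `None`.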
4049e338357999cd55db247fe3f68d39480d5d35 | 192 | py | Python | School-Management-System/teachers/admin.py | GisaKaze/Python-Quarantine-Projects | 29fabcb7e4046e6f3e9a19403e6d2490fe4b9fc4 | [
"MIT"
] | null | null | null | School-Management-System/teachers/admin.py | GisaKaze/Python-Quarantine-Projects | 29fabcb7e4046e6f3e9a19403e6d2490fe4b9fc4 | [
"MIT"
] | null | null | null | School-Management-System/teachers/admin.py | GisaKaze/Python-Quarantine-Projects | 29fabcb7e4046e6f3e9a19403e6d2490fe4b9fc4 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import *
# Register your models here.
admin.site.register(TeacherDeptInfo)
admin.site.register(TeacherSubInfo)
admin.site.register(TeacherInfo)
| 21.333333 | 36 | 0.817708 | 24 | 192 | 6.541667 | 0.541667 | 0.171975 | 0.324841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 192 | 8 | 37 | 24 | 0.902299 | 0.135417 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
407b8e7ab855bc9d001f20f090555493dff320e3 | 21 | py | Python | gtts_prepare/__init__.py | shengyang998/WeatherCN | a35b2b15e1e0839cd0fd6ea6f6a0b8650e470c7a | [
"MIT"
] | null | null | null | gtts_prepare/__init__.py | shengyang998/WeatherCN | a35b2b15e1e0839cd0fd6ea6f6a0b8650e470c7a | [
"MIT"
] | null | null | null | gtts_prepare/__init__.py | shengyang998/WeatherCN | a35b2b15e1e0839cd0fd6ea6f6a0b8650e470c7a | [
"MIT"
] | null | null | null | from . import prepare | 21 | 21 | 0.809524 | 3 | 21 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 21 | 1 | 21 | 21 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
40e8c71255e007ce45b621f67d9f1277cc98287b | 95 | py | Python | tests/files/autotest/tests_module/a2.py | ishikawa/modipyd | 30bd7df9e9babb29b848dac6a46b1c909ab1e180 | [
"MIT"
] | 1 | 2016-05-08T13:21:04.000Z | 2016-05-08T13:21:04.000Z | tests/files/autotest/tests_module/a2.py | ishikawa/modipyd | 30bd7df9e9babb29b848dac6a46b1c909ab1e180 | [
"MIT"
] | null | null | null | tests/files/autotest/tests_module/a2.py | ishikawa/modipyd | 30bd7df9e9babb29b848dac6a46b1c909ab1e180 | [
"MIT"
] | null | null | null | from tests_module import TestCase
class TestA(TestCase):
def test_it(self):
pass
| 13.571429 | 33 | 0.694737 | 13 | 95 | 4.923077 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.242105 | 95 | 6 | 34 | 15.833333 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
40f2ca0e17d0fe1660af46101310240eb7c19ae9 | 5,843 | py | Python | path_to_output.py | Luxxii/ProtGraphREST | 905013a7e96ed17c44a4c09285901af995d75418 | [
"BSD-2-Clause"
] | 1 | 2021-02-04T18:11:15.000Z | 2021-02-04T18:11:15.000Z | path_to_output.py | Luxxii/ProtGraphREST | 905013a7e96ed17c44a4c09285901af995d75418 | [
"BSD-2-Clause"
] | null | null | null | path_to_output.py | Luxxii/ProtGraphREST | 905013a7e96ed17c44a4c09285901af995d75418 | [
"BSD-2-Clause"
] | 1 | 2021-02-05T09:14:48.000Z | 2021-02-05T09:14:48.000Z | import json
import falcon
import igraph
from graph_utils import (check_path_incorrect, get_aminoacids, get_graph_path,
get_pep_and_header_def)
from models import Path
from models_utils import load_model
from prot_graph_exception import ProtGraphException
def _check_header(req):
""" checks if the header for POST is set """
if int(req.headers["CONTENT-LENGTH"]) != 0 and "CONTENT-TYPE" not in req.headers:
raise ProtGraphException(
falcon.HTTP_400,
json.dumps({"message": "Content-Type needs to be set"}, indent=4)
)
if req.headers["CONTENT-TYPE"] != "application/json":
raise ProtGraphException(
falcon.HTTP_400,
json.dumps({"message": "Content-Type needs to be set to application/json"}, indent=4)
)
def _concat_paths(*paths: Path):
""" Concatenate all paths, into one list """
paths_list = []
for path in paths:
paths_list.extend(path.paths)
if len(path.path) != 0:
paths_list.append(path.path)
return paths_list
def _check_paths_length(paths):
""" Throws an exception if no path is provided. """
if len(paths) == 0:
raise ProtGraphException(
falcon.HTTP_400,
json.dumps({"message": "A path needs to be provided"}, indent=4)
)
class PathToPeptide(object):
def __init__(self, base_dir):
self.base_dir = base_dir
def _return_content(self, resp, peptides, as_json=True):
if as_json:
resp.set_header("content-type", "application/json")
resp.body = json.dumps(peptides, ensure_ascii=False)
else:
resp.set_header("content-type", "text/plain")
resp.body = "\n".join(peptides)
resp.status = falcon.HTTP_200
def _get_peptides(self, resp, prot_graph_path, paths):
_check_paths_length(paths)
# Load graph
graph = igraph.read(prot_graph_path)
# For each path retrieve the peptide sequence:
peptides = []
for path in paths:
check_path_incorrect(graph, path)
peptides.append(get_aminoacids(graph, path[1:-1]))
return peptides
def on_get(self, req, resp, accession):
# Get Protein depending on accession and path
prot_graph_path = get_graph_path(self.base_dir, accession)
path_obj = load_model(Path, req.params)
paths = _concat_paths(path_obj)
# Get peptides
peptides = self._get_peptides(resp, prot_graph_path, paths)
# Return the content depending on return type
self._return_content(resp, peptides, path_obj.returns == "json")
def on_post(self, req, resp, accession):
# Check headers
_check_header(req)
# Get Protein depending on accession and path
prot_graph_path = get_graph_path(self.base_dir, accession)
path_obj_query, path_obj_body = load_model(Path, req.params, req.media)
paths = _concat_paths(path_obj_query, path_obj_body)
# Get peptides
peptides = self._get_peptides(resp, prot_graph_path, paths)
# Return the content depending on return type
self._return_content(resp, peptides, "json" in [path_obj_query.returns, path_obj_body.returns])
class PathToFasta(object):
def __init__(self, base_dir):
self.base_dir = base_dir
def _return_content(self, resp, peptides, as_json=True):
if as_json:
content = []
for idx, (pep, header) in enumerate(peptides):
content.append(
{
"head": ">pg|ID_" + str(idx) + "|" + header,
"seq": pep
}
)
resp.set_header("content-type", "application/json")
resp.body = json.dumps(content, ensure_ascii=False)
else:
content = ""
for idx, (pep, header) in enumerate(peptides):
content += ">pg|ID_" + str(idx) + "|" + header
content += "\n" + '\n'.join(pep[i:i+60] for i in range(0, len(pep), 60)) + "\n"
resp.set_header("content-type", "text/plain")
resp.body = content
resp.status = falcon.HTTP_200
def _get_peptides(self, resp, prot_graph_path, paths):
_check_paths_length(paths)
# Load graph
graph = igraph.read(prot_graph_path)
# For each path retrieve the peptide sequence:
peptides = []
for path in paths:
check_path_incorrect(graph, path)
pep, header = get_pep_and_header_def(path, graph)
peptides.append((pep, header))
return peptides
def on_get(self, req, resp, accession):
# Get Protein depending on accession and path
prot_graph_path = get_graph_path(self.base_dir, accession)
path_obj = load_model(Path, req.params)
paths = _concat_paths(path_obj)
# Get peptides
peptides = self._get_peptides(resp, prot_graph_path, paths)
# Return the content depending on return type
self._return_content(
resp, peptides, path_obj.returns == "json"
)
def on_post(self, req, resp, accession):
# Check headers
_check_header(req)
# Get Protein depending on accession and path
prot_graph_path = get_graph_path(self.base_dir, accession)
path_obj_query, path_obj_body = load_model(Path, req.params, req.media)
paths = _concat_paths(path_obj_query, path_obj_body)
# Get peptides
peptides = self._get_peptides(resp, prot_graph_path, paths)
# Return the content depending on return type
self._return_content(
resp, peptides, "json" in [path_obj_query.returns, path_obj_body.returns]
)
| 34.169591 | 103 | 0.620058 | 733 | 5,843 | 4.695771 | 0.154161 | 0.052295 | 0.045322 | 0.029634 | 0.739686 | 0.71993 | 0.71993 | 0.71993 | 0.704823 | 0.618826 | 0 | 0.006691 | 0.283758 | 5,843 | 170 | 104 | 34.370588 | 0.815771 | 0.113811 | 0 | 0.557522 | 0 | 0 | 0.063594 | 0 | 0.00885 | 0 | 0 | 0 | 0 | 1 | 0.115044 | false | 0 | 0.061947 | 0 | 0.221239 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
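The `_concat_paths` helper in `path_to_output.py` merges path objects supplied via query parameters and request body into one flat list. A minimal sketch of that merging behavior, using a hypothetical `Path` stand-in (the real `models.Path` is external; here it is assumed only to expose `.paths`, a list of paths, and `.path`, a single possibly-empty path):

```python
class Path:
    # Hypothetical stand-in for models.Path used by path_to_output.py.
    def __init__(self, paths=None, path=None):
        self.paths = paths or []
        self.path = path or []

def concat_paths(*paths):
    # Mirrors _concat_paths: extend with each object's .paths,
    # then append its single .path if non-empty.
    paths_list = []
    for p in paths:
        paths_list.extend(p.paths)
        if len(p.path) != 0:
            paths_list.append(p.path)
    return paths_list

query = Path(paths=[[0, 1, 2]], path=[3, 4])
body = Path(paths=[[5, 6]])
merged = concat_paths(query, body)  # [[0, 1, 2], [3, 4], [5, 6]]
```

This is why `on_post` can pass both `path_obj_query` and `path_obj_body` and treat the result as a single list of paths.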
40fa93c9ac74dbe7b1f4277d5fcbdd66f8b0a6db | 52,580 | py | Python | tests/unit_tests/test_tethys_cli/test_docker_commands.py | rileyhales/tethys | 740698a95767d88612b9ce9c562ca614a5385a62 | [
"BSD-2-Clause"
] | null | null | null | tests/unit_tests/test_tethys_cli/test_docker_commands.py | rileyhales/tethys | 740698a95767d88612b9ce9c562ca614a5385a62 | [
"BSD-2-Clause"
] | null | null | null | tests/unit_tests/test_tethys_cli/test_docker_commands.py | rileyhales/tethys | 740698a95767d88612b9ce9c562ca614a5385a62 | [
"BSD-2-Clause"
] | null | null | null | import importlib
import unittest
from unittest import mock
import tethys_cli.docker_commands as cli_docker_commands
class TestDockerCommands(unittest.TestCase):
def setUp(self):
self.mock_dc = mock.MagicMock(name='docker_client')
dc_patcher = mock.patch('tethys_cli.docker_commands.docker.from_env', return_value=self.mock_dc)
self.mock_from_env = dc_patcher.start()
self.addCleanup(dc_patcher.stop)
input_patcher = mock.patch('tethys_cli.docker_commands.input', return_value=mock.MagicMock(name='input'))
self.mock_input = input_patcher.start()
self.addCleanup(input_patcher.stop)
def tearDown(self):
pass
def test_curses_import_error(self):
with mock.patch.dict('sys.modules', {'curses': None}):
importlib.reload(cli_docker_commands)
def test_get_docker_client(self):
dc = cli_docker_commands.ContainerMetadata.get_docker_client()
self.assertIs(dc, self.mock_dc)
def test_container_metadata_get_containers(self):
all_containers = cli_docker_commands.ContainerMetadata.get_containers()
self.assertEqual(4, len(all_containers))
for container in all_containers:
self.assertIsInstance(container, cli_docker_commands.ContainerMetadata)
def test_container_metadata_get_containers_cached(self):
mock_all_containers = mock.MagicMock()
cli_docker_commands.ContainerMetadata.all_containers = mock_all_containers
def cleanup():
cli_docker_commands.ContainerMetadata.all_containers = None
self.addCleanup(cleanup)
all_containers = cli_docker_commands.ContainerMetadata.get_containers()
self.assertIs(all_containers, mock_all_containers)
cli_docker_commands.ContainerMetadata.all_containers = None
def test_container_metadata_get_containers_with_containers_arg(self):
containers_input = cli_docker_commands.docker_container_inputs
for container_input in containers_input:
containers = [container_input]
all_containers = cli_docker_commands.ContainerMetadata.get_containers(containers=containers)
self.assertEqual(1, len(all_containers), 'Container input "{}" was not found in {}.'.format(
container_input, all_containers))
self.assertEqual(container_input, all_containers[0].input)
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.is_installed', new_callable=mock.PropertyMock)
def test_container_metadata_get_containers_with_installed(self, mock_is_installed):
mock_is_installed.return_value = True
all_containers = cli_docker_commands.ContainerMetadata.get_containers(installed=True)
self.assertEqual(4, len(all_containers))
all_containers = cli_docker_commands.ContainerMetadata.get_containers(installed=False)
self.assertEqual(0, len(all_containers))
mock_is_installed.return_value = False
all_containers = cli_docker_commands.ContainerMetadata.get_containers(installed=True)
self.assertEqual(0, len(all_containers))
all_containers = cli_docker_commands.ContainerMetadata.get_containers(installed=False)
self.assertEqual(4, len(all_containers))
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.container', new_callable=mock.PropertyMock)
def test_container_metadata_is_installed(self, mock_container):
mock_container.return_value = None
cm = cli_docker_commands.PostGisContainerMetadata()
self.assertFalse(cm.is_installed)
mock_container.return_value = mock.Mock()
self.assertTrue(cm.is_installed)
def test_container_metadata_container(self):
mock_container = mock.MagicMock()
self.mock_dc.containers.get.return_value = mock_container
cm = cli_docker_commands.PostGisContainerMetadata()
self.assertEqual(mock_container, cm.container)
def test_container_metadata_container_exception(self):
self.mock_dc.containers.get.side_effect = cli_docker_commands.DockerNotFound('test')
cm = cli_docker_commands.PostGisContainerMetadata()
self.assertIsNone(cm.container)
def test_container_metadata_container_cached(self):
mock_container = mock.MagicMock()
cm = cli_docker_commands.PostGisContainerMetadata()
cm._container = mock_container
self.assertEqual(mock_container, cm.container)
def test_container_metadata_init(self):
cli_docker_commands.PostGisContainerMetadata()
def test_container_metadata_init_with_docker_client_arg(self):
mock_dc = mock.MagicMock()
c = cli_docker_commands.PostGisContainerMetadata(mock_dc)
self.assertIs(c._docker_client, mock_dc)
def test_port_bindings_postgis(self):
cm = cli_docker_commands.PostGisContainerMetadata()
self.assertDictEqual({5432: 5435}, cm.port_bindings)
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.is_cluster', new_callable=mock.PropertyMock)
def test_port_bindings_geoserver(self, mock_is_cluster):
mock_is_cluster.return_value = True
cm = cli_docker_commands.GeoServerContainerMetadata()
self.assertDictEqual({
8181: 8181,
8081: 8081,
8082: 8082,
8083: 8083,
8084: 8084,
}, cm.port_bindings)
def test_port_bindings_geoserver_non_clustered(self):
cm = cli_docker_commands.GeoServerContainerMetadata()
self.assertDictEqual({8080: 8181}, cm.port_bindings)
def test_port_bindings_52_north(self):
cm = cli_docker_commands.N52WpsContainerMetadata()
self.assertDictEqual({8080: 8282}, cm.port_bindings)
def test_port_bindings_thredds(self):
cm = cli_docker_commands.ThreddsContainerMetadata()
self.assertDictEqual({8080: 8383}, cm.port_bindings)
def test_cm_ip_postgis(self):
cm = cli_docker_commands.PostGisContainerMetadata()
expected_msg = "\nPostGIS/Database:" \
"\n Host: 127.0.0.1" \
"\n Port: 5435" \
"\n Endpoint: postgresql://<username>:<password>@127.0.0.1:5435/<database>"
self.assertEqual(expected_msg, cm.ip)
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.is_cluster', new_callable=mock.PropertyMock)
def test_cm_ip_geoserver(self, mock_is_cluster):
mock_is_cluster.return_value = True
cm = cli_docker_commands.GeoServerContainerMetadata()
expected_msg = "\nGeoServer:" \
"\n Host: 127.0.0.1" \
"\n Primary Port: 8181" \
"\n Node Ports: 8081, 8082, 8083, 8084" \
"\n Endpoint: http://127.0.0.1:8181/geoserver/rest"
self.assertEqual(expected_msg, cm.ip)
def test_cm_ip_geoserver_non_clustered(self):
cm = cli_docker_commands.GeoServerContainerMetadata()
expected_msg = "\nGeoServer:" \
"\n Host: 127.0.0.1" \
"\n Port: 8181" \
"\n Endpoint: http://127.0.0.1:8181/geoserver/rest"
self.assertEqual(expected_msg, cm.ip)
def test_cm_ip_52_north(self):
cm = cli_docker_commands.N52WpsContainerMetadata()
expected_msg = "\n52 North WPS:" \
"\n Host: 127.0.0.1" \
"\n Port: 8282" \
"\n Endpoint: http://127.0.0.1:8282/wps/WebProcessingService"
self.assertEqual(expected_msg, cm.ip)
def test_cm_ip_thredds(self):
cm = cli_docker_commands.ThreddsContainerMetadata()
expected_msg = "\nTHREDDS:" \
"\n Host: 127.0.0.1" \
"\n Port: 8383" \
"\n Endpoint: http://127.0.0.1:8383/thredds/"
self.assertEqual(expected_msg, cm.ip)
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.port_bindings', new_callable=mock.PropertyMock)
def test_cm_default_container_options_postgis(self, mock_port_bindings_prop):
mock_port_bindings = mock.Mock()
mock_port_bindings_prop.return_value = mock_port_bindings
expected_options = dict(
name='tethys_postgis',
image='mdillon/postgis:latest',
environment=dict(
POSTGRES_PASSWORD='mysecretpassword',
),
host_config=dict(
port_bindings=mock_port_bindings
),
)
container = cli_docker_commands.PostGisContainerMetadata()
self.assertDictEqual(expected_options, container.default_container_options())
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_verified_password')
@mock.patch('tethys_cli.docker_commands.PostGisContainerMetadata.default_container_options')
def test_cm_get_container_options_postgis(self, mock_default_options, mock_getpass, mock_pretty_output):
mock_default_options.return_value = dict(environment=dict(POSTGRES_PASSWORD='mysecretpassword'))
mock_getpass.side_effect = [
'pass', # POSTGRES_PASSWORD
]
expected_environment = dict(
POSTGRES_PASSWORD='pass',
)
container = cli_docker_commands.PostGisContainerMetadata()
self.assertDictEqual(expected_environment, container.get_container_options(defaults=False)['environment'])
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('Tethys uses the mdillon/postgis image on Docker Hub. '
'See: https://hub.docker.com/r/mdillon/postgis/', po_call_args[0][0][0])
mock_default_options.assert_called()
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.port_bindings',
new_callable=mock.PropertyMock)
def test_cm_default_container_options_geoserver(self, mock_port_bindings_prop):
mock_port_bindings = mock.Mock()
mock_port_bindings_prop.return_value = mock_port_bindings
expected_options = dict(
name='tethys_geoserver',
image='ciwater/geoserver:2.8.2-clustered',
environment=dict(
ENABLED_NODES='1',
REST_NODES='1',
MAX_TIMEOUT='60',
NUM_CORES='4',
MAX_MEMORY='1024',
MIN_MEMORY='1024',
),
volumes=[
'/var/log/supervisor:rw',
'/var/geoserver/data:rw',
'/var/geoserver:rw',
],
host_config=dict(
port_bindings=mock_port_bindings
),
)
container = cli_docker_commands.GeoServerContainerMetadata()
self.assertDictEqual(expected_options, container.default_container_options())
@mock.patch('tethys_cli.docker_commands.Mount')
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_valid_numeric_input')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_valid_choice_input')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_valid_directory_input')
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.default_container_options')
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.is_cluster', new_callable=mock.PropertyMock)
def test_cm_get_container_options_geoserver_numprocessors_bind(
self, mock_is_cluster, mock_default_options, mock_dir_input, mock_choice_input,
mock_numeric_input, mock_pretty_output, mock_mount
):
mock_is_cluster.return_value = True
mock_default_options.return_value = dict(environment={}, host_config={})
mock_mount.return_value = mock.Mock()
mock_numeric_input.side_effect = [
'1', # Number of GeoServer Instances Enabled
'1', # Number of GeoServer Instances with REST API Enabled
'2', # Number of Processors
'60', # Maximum request timeout in seconds
'1024', # Maximum memory to allocate to each GeoServer instance in MB
'0', # Minimum memory to allocate to each GeoServer instance in MB
]
mock_choice_input.side_effect = [
'c', # Would you like to specify number of Processors (c) OR set limits (e)
'y', # Bind the GeoServer data directory to the host?
]
mock_dir_input.side_effect = [
'/tmp' # Specify location to bind data directory
]
expected_options = dict(
environment=dict(
ENABLED_NODES='1',
REST_NODES='1',
MAX_TIMEOUT='60',
NUM_CORES='2',
MAX_MEMORY='1024',
MIN_MEMORY='0',
),
host_config=dict(
mounts=[mock_mount.return_value]
)
)
container = cli_docker_commands.GeoServerContainerMetadata()
self.assertDictEqual(expected_options, container.get_container_options(defaults=False))
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(2, len(po_call_args))
self.assertIn('The GeoServer docker can be configured to run in a clustered mode', po_call_args[0][0][0])
self.assertIn('GeoServer can be configured with limits to certain types of requests', po_call_args[1][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_valid_numeric_input')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_valid_choice_input')
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.default_container_options')
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.is_cluster', new_callable=mock.PropertyMock)
def test_cm_get_container_options_geoserver_limits_no_bind(
self, mock_is_cluster, mock_default_options, mock_choice_input,
mock_numeric_input, mock_pretty_output
):
mock_is_cluster.return_value = True
mock_default_options.return_value = dict(environment={}, host_config={})
mock_numeric_input.side_effect = [
'1', # Number of GeoServer Instances Enabled
'1', # Number of GeoServer Instances with REST API Enabled
'100', # Maximum number of simultaneous OGC web service requests
'8', # Maximum number of simultaneous GetMap requests
'16', # Maximum number of simultaneous GeoWebCache tile renders
'60', # Maximum request timeout in seconds
'1024', # Maximum memory to allocate to each GeoServer instance in MB
'0', # Minimum memory to allocate to each GeoServer instance in MB
]
mock_choice_input.side_effect = [
'e', # Would you like to specify number of Processors (c) OR set limits (e)
'n', # Bind the GeoServer data directory to the host?
]
expected_options = dict(
environment=dict(
ENABLED_NODES='1',
REST_NODES='1',
MAX_TIMEOUT='60',
MAX_MEMORY='1024',
MIN_MEMORY='0',
MAX_OWS_GLOBAL='100',
MAX_WMS_GETMAP='8',
MAX_OWS_GWC='16',
),
host_config={}
)
container = cli_docker_commands.GeoServerContainerMetadata()
actual_options = container.get_container_options(defaults=False)
self.assertDictEqual(expected_options, actual_options)
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(2, len(po_call_args))
self.assertIn('The GeoServer docker can be configured to run in a clustered mode', po_call_args[0][0][0])
self.assertIn('GeoServer can be configured with limits to certain types of requests', po_call_args[1][0][0])
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.default_container_options')
def test_cm_get_container_options_geoserver_defaults(self, mock_default_options):
mock_default_options.return_value = mock.Mock()
container = cli_docker_commands.GeoServerContainerMetadata()
self.assertEqual(mock_default_options.return_value, container.get_container_options(defaults=False))
mock_default_options.assert_called()
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.is_cluster', new_callable=mock.PropertyMock)
@mock.patch('tethys_cli.docker_commands.GeoServerContainerMetadata.default_container_options')
def test_cm_get_container_options_geoserver_no_cluster(self, mock_default_options, mock_is_cluster):
mock_default_options.return_value = dict()
mock_is_cluster.return_value = False
container = cli_docker_commands.GeoServerContainerMetadata()
self.assertEqual(mock_default_options.return_value, container.get_container_options(defaults=False))
mock_default_options.assert_called()
mock_is_cluster.assert_called()
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.port_bindings', new_callable=mock.PropertyMock)
def test_cm_default_container_options_52_north(self, mock_port_bindings_prop):
mock_port_bindings = mock.Mock()
mock_port_bindings_prop.return_value = mock_port_bindings
expected_options = dict(
name='tethys_wps',
image='ciwater/n52wps:3.3.1',
environment=dict(
NAME='NONE',
POSITION='NONE',
ADDRESS='NONE',
CITY='NONE',
STATE='NONE',
COUNTRY='NONE',
POSTAL_CODE='NONE',
EMAIL='NONE',
PHONE='NONE',
FAX='NONE',
USERNAME='wps',
PASSWORD='wps'
),
host_config=dict(
port_bindings=mock_port_bindings
),
)
container = cli_docker_commands.N52WpsContainerMetadata()
self.assertDictEqual(expected_options, container.default_container_options())
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_verified_password')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_input_with_default')
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.default_container_options')
def test_cm_get_container_options_52_north(self, mock_default_options,
mock_input, mock_getpass, mock_pretty_output):
mock_default_options.return_value = dict(environment={})
mock_input.side_effect = [
'Name', # Name
'Pos', # Position
'Addr', # Address
'City', # City
'State', # State
'Cty', # Country
'Code', # Postal Code
'foo@foo.com', # Email
'123456789', # Phone
'123456788', # Fax
'fooadmin' # Admin username
]
mock_getpass.side_effect = ['wps'] # Admin Password
expected_options = dict(
environment={
'NAME': 'Name',
'POSITION': 'Pos',
'ADDRESS': 'Addr',
'CITY': 'City',
'STATE': 'State',
'COUNTRY': 'Cty',
'POSTAL_CODE': 'Code',
'EMAIL': 'foo@foo.com',
'PHONE': '123456789',
'FAX': '123456788',
'USERNAME': 'fooadmin',
'PASSWORD': 'wps'
},
)
container = cli_docker_commands.N52WpsContainerMetadata()
self.assertDictEqual(expected_options, container.get_container_options(defaults=False))
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('Provide contact information for the 52 North Web Processing Service', po_call_args[0][0][0])
mock_default_options.assert_called()
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.port_bindings', new_callable=mock.PropertyMock)
def test_cm_default_container_options_thredds(self, mock_port_bindings_prop):
mock_port_bindings = mock.Mock()
mock_port_bindings_prop.return_value = mock_port_bindings
expected_options = dict(
name='tethys_thredds',
image='unidata/thredds-docker:4.6.13',
environment=dict(
TDM_PW='CHANGEME!',
TDS_HOST='http://localhost',
THREDDS_XMX_SIZE='4G',
THREDDS_XMS_SIZE='1G',
TDM_XMX_SIZE='6G',
TDM_XMS_SIZE='1G'
),
volumes=[
'/usr/local/tomcat/content/thredds:rw'
],
host_config=dict(
port_bindings=mock_port_bindings
),
)
container = cli_docker_commands.ThreddsContainerMetadata()
self.assertDictEqual(expected_options, container.default_container_options())
@mock.patch('tethys_cli.docker_commands.Mount')
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_valid_directory_input')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_valid_choice_input')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_verified_password')
@mock.patch('tethys_cli.docker_commands.UserInputHelper.get_input_with_default')
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.default_container_options')
def test_cm_get_container_options_thredds(self, mock_default_options, mock_input, mock_getpass, mock_choice_input,
mock_dir_input, mock_pretty_output, mock_mount):
mock_default_options.return_value = dict(environment={}, host_config={})
mock_getpass.side_effect = ['please-dont-use-default-passwords'] # TDM Password
mock_input.side_effect = [
'https://example.com', # TDS Host
'6G', # THREDDS XMX
'2G', # THREDDS XMS
'8G', # TDM XMX
'3G', # TDM XMS
]
mock_choice_input.side_effect = [
'y', # Bind the THREDDS data directory to the host?
]
mock_dir_input.side_effect = [
'/tmp' # Specify location to bind data directory
]
expected_options = dict(
environment=dict(
TDM_PW='please-dont-use-default-passwords',
TDS_HOST='https://example.com',
THREDDS_XMX_SIZE='6G',
THREDDS_XMS_SIZE='2G',
TDM_XMX_SIZE='8G',
TDM_XMS_SIZE='3G'
),
host_config=dict(
mounts=[mock_mount.return_value]
),
volumes=[
'/usr/local/tomcat/content/thredds:rw'
]
)
container = cli_docker_commands.ThreddsContainerMetadata()
self.assertDictEqual(expected_options, container.get_container_options(defaults=False))
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('Provide configuration options for the THREDDS container or or press enter to accept the '
'defaults shown in square brackets: ', po_call_args[0][0][0])
mock_default_options.assert_called()
@mock.patch('tethys_cli.docker_commands.log_pull_stream')
@mock.patch('tethys_cli.docker_commands.PostGisContainerMetadata.image', new_callable=mock.PropertyMock)
def test_cm_pull(self, mock_image, mock_pull_stream):
container = cli_docker_commands.PostGisContainerMetadata()
container.pull()
self.mock_dc.api.pull.assert_called_with(mock_image.return_value, stream=True)
mock_pull_stream.assert_called()
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.PostGisContainerMetadata.get_container_options')
def test_cm_create(self, mock_get_options, mock_pretty_output):
mock_get_options.return_value = dict(host_config={})
container = cli_docker_commands.PostGisContainerMetadata()
container.create()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('\nInstalling the PostGIS/Database Docker container...', po_call_args[0][0][0])
mock_get_options.assert_called_with(False)
self.mock_dc.api.create_host_config.assert_called_with()
self.mock_dc.api.create_container.assert_called()
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_cm_start(self, mock_pretty_output):
container = cli_docker_commands.PostGisContainerMetadata()
msg = container.start()
self.assertIsNone(msg)
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('Starting PostGIS/Database container...', po_call_args[0][0][0])
self.mock_dc.containers.get().start.assert_called()
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_cm_start_exception(self, mock_pretty_output):
self.mock_dc.containers.get().start.side_effect = Exception
container = cli_docker_commands.PostGisContainerMetadata()
msg = container.start()
self.assertIn('There was an error while attempting to start container', msg)
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('Starting PostGIS/Database container...', po_call_args[0][0][0])
self.mock_dc.containers.get().start.assert_called()
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_cm_stop(self, mock_pretty_output):
container = cli_docker_commands.PostGisContainerMetadata()
msg = container.stop()
self.assertIsNone(msg)
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('Stopping PostGIS/Database container...', po_call_args[0][0][0])
self.mock_dc.containers.get().stop.assert_called()
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_cm_stop_excpetion(self, mock_pretty_output):
self.mock_dc.containers.get().stop.side_effect = Exception
container = cli_docker_commands.PostGisContainerMetadata()
msg = container.stop()
self.assertIn('There was an error while attempting to stop container', msg)
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('Stopping PostGIS/Database container...', po_call_args[0][0][0])
self.mock_dc.containers.get().stop.assert_called()
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_cm_remove(self, mock_pretty_output):
container = cli_docker_commands.PostGisContainerMetadata()
container.remove()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertIn('Removing PostGIS/Database container...', po_call_args[0][0][0])
self.mock_dc.containers.get().remove.assert_called()
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.get_containers')
def test_get_docker_container_statuses(self, mock_get_containers):
mock_containers = [mock.MagicMock() for _ in range(3)]
mock_get_containers.return_value = mock_containers
self.mock_dc.containers.list.return_value = mock_containers
ret = cli_docker_commands.get_docker_container_statuses()
self.assertEqual(3, len(ret))
for status in ret.values():
self.assertTrue(status)
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.get_containers')
def test_get_docker_container_statuses_off(self, mock_get_containers):
mock_containers = [mock.MagicMock() for _ in range(3)]
mock_get_containers.return_value = mock_containers
self.mock_dc.containers.list.side_effect = [
mock_containers,
mock_containers[:2],
]
ret = cli_docker_commands.get_docker_container_statuses()
self.assertEqual(3, len(ret))
self.assertTrue(ret[mock_containers[0]])
self.assertTrue(ret[mock_containers[1]])
self.assertFalse(ret[mock_containers[2]])
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.get_containers')
def test_get_docker_container_statuses_not_installed(self, mock_get_containers):
mock_containers = [mock.MagicMock() for _ in range(3)]
mock_get_containers.return_value = mock_containers
self.mock_dc.containers.list.side_effect = [
mock_containers[:2],
mock_containers[:1],
]
ret = cli_docker_commands.get_docker_container_statuses()
self.assertEqual(3, len(ret))
self.assertTrue(ret[mock_containers[0]])
self.assertFalse(ret[mock_containers[1]])
self.assertIsNone(ret[mock_containers[2]])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_pull_docker_images(self, mock_pretty_output):
mock_container = mock.Mock()
cli_docker_commands.pull_docker_images([mock_container])
mock_container.pull.assert_called_once()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('Pulling Docker images...', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_pull_docker_images_already_pulled(self, mock_pretty_output):
cli_docker_commands.pull_docker_images([])
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('Docker images already pulled.', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
def test_install_docker_containers(self, mock_pretty_output):
mock_container = mock.Mock()
cli_docker_commands.install_docker_containers([mock_container])
mock_container.create.assert_called_once_with(False)
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('Finished installing Docker containers.', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.install_docker_containers')
@mock.patch('tethys_cli.docker_commands.pull_docker_images')
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.get_containers')
def test_docker_init(self, mock_get_containers, mock_pull, mock_install):
mock_get_containers.return_value = mock.Mock()
cli_docker_commands.docker_init()
mock_get_containers.assert_called_with(None, installed=False)
mock_pull.assert_called_with(mock_get_containers.return_value)
mock_install.assert_called_with(mock_get_containers.return_value, defaults=False)
cli_docker_commands.docker_init(force=True)
mock_get_containers.assert_called_with(None, installed=None)
mock_pull.assert_called_with(mock_get_containers.return_value)
mock_install.assert_called_with(mock_get_containers.return_value, defaults=False)
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.get_docker_container_statuses')
def test_docker_start(self, mock_dc_status, mock_pretty_output):
mock_container = mock.Mock()
mock_container.start.return_value = '{} starting'
mock_container.display_name = 'Mock'
mock_dc_status.return_value = mock.Mock()
mock_dc_status.return_value.items.return_value = [
(mock_container, None), (mock_container, True), (mock_container, False)
]
cli_docker_commands.docker_start()
mock_container.start.assert_called()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(3, len(po_call_args))
self.assertIn('Mock container not installed.', po_call_args[0][0][0])
self.assertIn('Mock container already running.', po_call_args[1][0][0])
self.assertIn('Mock starting', po_call_args[2][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.get_docker_container_statuses')
def test_docker_stop(self, mock_dc_status, mock_pretty_output):
mock_container = mock.Mock()
mock_container.stop.return_value = '{} stopping'
mock_container.display_name = 'Mock'
mock_dc_status.return_value = mock.Mock()
mock_dc_status.return_value.items.return_value = [
(mock_container, None), (mock_container, False), (mock_container, True)
]
cli_docker_commands.docker_stop()
mock_container.stop.assert_called()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(3, len(po_call_args))
self.assertEqual('Mock container not installed.', po_call_args[0][0][0])
self.assertEqual('Mock container already stopped.', po_call_args[1][0][0])
self.assertEqual('Mock stopping', po_call_args[2][0][0])
@mock.patch('tethys_cli.docker_commands.docker_start')
@mock.patch('tethys_cli.docker_commands.docker_stop')
def test_docker_restart(self, mock_stop, mock_start):
containers = mock.Mock()
cli_docker_commands.docker_restart(containers)
mock_stop.assert_called_with(containers=containers)
mock_start.assert_called_with(containers=containers)
@mock.patch('tethys_cli.docker_commands.ContainerMetadata.get_containers')
@mock.patch('tethys_cli.docker_commands.docker_stop')
def test_docker_remove(self, mock_stop, mock_get_containers):
containers = mock.Mock()
mock_get_containers.return_value = [containers]
cli_docker_commands.docker_remove(containers)
mock_stop.assert_called_with(containers=containers)
mock_get_containers.assert_called_with(containers, installed=True)
containers.remove.assert_called()
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.get_docker_container_statuses')
def test_docker_status(self, mock_dc_status, mock_pretty_output):
mock_container = mock.Mock()
mock_container.display_name = 'Mock'
mock_dc_status.return_value = mock.Mock()
mock_dc_status.return_value.items.return_value = [
(mock_container, None), (mock_container, True), (mock_container, False)
]
cli_docker_commands.docker_status()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(3, len(po_call_args))
self.assertEqual('Mock: Not Installed', po_call_args[0][0][0])
self.assertEqual('Mock: Running', po_call_args[1][0][0])
self.assertEqual('Mock: Not Running', po_call_args[2][0][0])
@mock.patch('tethys_cli.docker_commands.docker_init')
@mock.patch('tethys_cli.docker_commands.docker_remove')
def test_docker_update(self, mock_remove, mock_init):
containers = mock.Mock()
cli_docker_commands.docker_update(containers)
mock_remove.assert_called_with(containers=containers)
mock_init.assert_called_with(containers=containers, defaults=False, force=True)
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.get_docker_container_statuses')
def test_docker_ip(self, mock_dc_status, mock_pretty_output):
mock_container = mock.Mock()
mock_container.configure_mock(ip='{name}: ip data')
mock_container.display_name = 'Mock'
mock_dc_status.return_value = mock.Mock()
mock_dc_status.return_value.items.return_value = [
(mock_container, None), (mock_container, False), (mock_container, True)
]
cli_docker_commands.docker_ip()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(3, len(po_call_args))
self.assertEqual('Mock: Not Installed', po_call_args[0][0][0])
self.assertEqual('Mock: Not Running', po_call_args[1][0][0])
self.assertEqual('Mock: ip data', po_call_args[2][0][0])
@mock.patch('tethys_cli.docker_commands.docker_restart')
@mock.patch('tethys_cli.docker_commands.docker_ip')
@mock.patch('tethys_cli.docker_commands.docker_remove')
@mock.patch('tethys_cli.docker_commands.docker_update')
@mock.patch('tethys_cli.docker_commands.docker_status')
@mock.patch('tethys_cli.docker_commands.docker_stop')
@mock.patch('tethys_cli.docker_commands.docker_start')
@mock.patch('tethys_cli.docker_commands.docker_init')
def test_docker_command(self, mock_init, mock_start, mock_stop, mock_status,
mock_update, mock_remove, mock_ip, mock_restart):
args = mock.Mock(command='init', containers=mock.Mock(), defaults=mock.Mock())
cli_docker_commands.docker_command(args)
mock_init.assert_called_with(containers=args.containers, defaults=args.defaults)
args.command = 'start'
cli_docker_commands.docker_command(args)
mock_start.assert_called_with(containers=args.containers)
args.command = 'stop'
cli_docker_commands.docker_command(args)
mock_stop.assert_called_with(containers=args.containers)
args.command = 'status'
cli_docker_commands.docker_command(args)
mock_status.assert_called_with(containers=args.containers)
args.command = 'update'
cli_docker_commands.docker_command(args)
mock_update.assert_called_with(containers=args.containers, defaults=args.defaults)
args.command = 'remove'
cli_docker_commands.docker_command(args)
mock_remove.assert_called_with(containers=args.containers)
args.command = 'ip'
cli_docker_commands.docker_command(args)
mock_ip.assert_called_with(containers=args.containers)
args.command = 'restart'
cli_docker_commands.docker_command(args)
mock_restart.assert_called_with(containers=args.containers)
def test_uih_get_input_with_default(self):
self.mock_input.side_effect = ['test', '']
result = cli_docker_commands.UserInputHelper.get_input_with_default(prompt='prompt', default='test_default')
self.mock_input.assert_called_with('prompt [test_default]:')
self.assertEqual(result, 'test')
result = cli_docker_commands.UserInputHelper.get_input_with_default(prompt='prompt', default='test_default')
self.mock_input.assert_called_with('prompt [test_default]:')
self.assertEqual(result, 'test_default')
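The two assertions above pin down the helper's contract: the default is shown in square brackets and empty input falls back to it. A minimal runnable sketch of that behaviour (the real helper is `UserInputHelper.get_input_with_default` in `tethys_cli.docker_commands`; `input_fn` is a hypothetical injection point added here for testability):

```python
def get_input_with_default(prompt, default, input_fn=input):
    """Prompt with the default in brackets; empty input keeps the default.

    Sketch of the behaviour asserted by the tests above, not the real
    implementation.
    """
    value = input_fn('{} [{}]:'.format(prompt, default))
    return value if value else default

# Empty answer falls back to the default; any answer overrides it.
print(get_input_with_default('prompt', 'test_default', input_fn=lambda _: ''))      # → test_default
print(get_input_with_default('prompt', 'test_default', input_fn=lambda _: 'test'))  # → test
```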
@mock.patch('tethys_cli.docker_commands.getpass')
def test_uih_get_verified_password_default(self, mock_getpass):
mock_getpass.getpass.side_effect = ['']
passwd = cli_docker_commands.UserInputHelper.get_verified_password(prompt='prompt', default='default_pass')
self.assertEqual(passwd, 'default_pass')
mock_getpass.getpass.assert_called_with('prompt [default_pass]: ')
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.getpass')
def test_uih_get_verified_password(self, mock_getpass, mock_pretty_output):
mock_getpass.getpass.side_effect = [
'pass',
'foo',
'pass',
'pass',
]
passwd = cli_docker_commands.UserInputHelper.get_verified_password(prompt='prompt', default='default_pass')
self.assertEqual(passwd, 'pass')
gp_call_args = mock_getpass.getpass.call_args_list
self.assertEqual(4, len(gp_call_args))
self.assertEqual('prompt [default_pass]: ', gp_call_args[0][0][0])
self.assertEqual('Confirm Password: ', gp_call_args[1][0][0])
self.assertEqual('prompt [default_pass]: ', gp_call_args[2][0][0])
self.assertEqual('Confirm Password: ', gp_call_args[3][0][0])
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('Passwords do not match, please try again.', po_call_args[0][0][0])
def test_uih_get_valid_numeric_input_default(self):
self.mock_input.side_effect = ['']
num = cli_docker_commands.UserInputHelper.get_valid_numeric_input(prompt='prompt', min_val=1, max_val=10)
self.assertEqual(num, 1)
self.mock_input.assert_called_with('prompt (max 10) [1]: ')
def test_uih_get_valid_numeric_input_valid(self):
self.mock_input.side_effect = ['5']
num = cli_docker_commands.UserInputHelper.get_valid_numeric_input(prompt='prompt', min_val=1, max_val=10)
self.assertEqual(num, 5)
self.mock_input.assert_called_with('prompt (max 10) [1]: ')
def test_uih_get_valid_numeric_input_invalid(self):
self.mock_input.side_effect = ['five', '11', '10.0', '10']
num = cli_docker_commands.UserInputHelper.get_valid_numeric_input(prompt='prompt', min_val=1, max_val=10)
self.assertEqual(num, 10)
input_call_args = self.mock_input.call_args_list
self.assertEqual(4, len(input_call_args))
self.assertEqual('prompt (max 10) [1]: ', input_call_args[0][0][0])
self.assertEqual('Please enter an integer number\nprompt (max 10) [1]: ', input_call_args[1][0][0])
self.assertEqual('Number must be between 1 and 10\nprompt (max 10) [1]: ', input_call_args[2][0][0])
self.assertEqual('Please enter an integer number\nprompt (max 10) [1]: ', input_call_args[3][0][0])
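The expected call args above spell out the retry loop: non-integer input re-prompts with "Please enter an integer number", out-of-range input with "Number must be between ...", and empty input returns the minimum. A sketch reproducing exactly those strings (hypothetical `input_fn` parameter; the real helper is `UserInputHelper.get_valid_numeric_input`):

```python
def get_valid_numeric_input(prompt, min_val, max_val, input_fn=input):
    """Loop until an integer in [min_val, max_val] is entered.

    Prompt and error strings are copied from the expected call args in the
    tests above; this is a behavioural sketch, not the real implementation.
    """
    base = '{} (max {}) [{}]: '.format(prompt, max_val, min_val)
    msg = base
    while True:
        value = input_fn(msg)
        if not value:
            return min_val  # empty input accepts the default (minimum)
        try:
            num = int(value)
        except ValueError:
            msg = 'Please enter an integer number\n' + base
            continue
        if min_val <= num <= max_val:
            return num
        msg = 'Number must be between {} and {}\n'.format(min_val, max_val) + base

answers = iter(['five', '11', '10'])
print(get_valid_numeric_input('prompt', 1, 10, input_fn=lambda m: next(answers)))  # → 10
```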
def test_uih_get_valid_choice_input_default(self):
self.mock_input.side_effect = ['']
c = cli_docker_commands.UserInputHelper.get_valid_choice_input(prompt='prompt', choices=['a', 'b'], default='a')
self.assertEqual(c, 'a')
self.mock_input.assert_called_with('prompt [A/b]: ')
def test_uih_get_valid_choice_input_invalid(self):
self.mock_input.side_effect = [
'c',
'd',
'e',
'B',
]
c = cli_docker_commands.UserInputHelper.get_valid_choice_input(prompt='prompt', choices=['a', 'b'], default='a')
self.assertEqual(c, 'b')
input_call_args = self.mock_input.call_args_list
self.assertEqual(4, len(input_call_args))
self.assertEqual('prompt [A/b]: ', input_call_args[0][0][0])
self.assertEqual('Please provide a valid option\nprompt [A/b]: ', input_call_args[1][0][0])
self.assertEqual('Please provide a valid option\nprompt [A/b]: ', input_call_args[2][0][0])
self.assertEqual('Please provide a valid option\nprompt [A/b]: ', input_call_args[3][0][0])
@mock.patch('tethys_cli.docker_commands.os.path.isdir')
def test_uih_get_valid_directory_input_default(self, mock_os_path_isdir):
mock_os_path_isdir.return_value = True
self.mock_input.side_effect = ['']
c = cli_docker_commands.UserInputHelper.get_valid_directory_input(prompt='prompt', default='/tmp')
self.assertEqual(c, '/tmp')
self.mock_input.assert_called_with('prompt [/tmp]: ')
@mock.patch('tethys_cli.docker_commands.os.makedirs')
@mock.patch('tethys_cli.docker_commands.os.path.isdir')
def test_uih_get_valid_directory_input_makedirs(self, mock_os_path_isdir, mock_os_makedirs):
value = '/non/existing/path'
self.mock_input.side_effect = [value[1:]]
mock_os_path_isdir.return_value = False
c = cli_docker_commands.UserInputHelper.get_valid_directory_input(prompt='prompt', default='/tmp')
self.assertEqual(c, value)
self.mock_input.assert_called_with('prompt [/tmp]: ')
mock_os_path_isdir.assert_called_with(value)
mock_os_makedirs.assert_called_with(value)
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.os.makedirs')
@mock.patch('tethys_cli.docker_commands.os.path.isdir')
def test_uih_get_valid_directory_input_oserror(self, mock_os_path_isdir, mock_os_makedirs, mock_pretty_output):
mock_os_path_isdir.side_effect = [False, True]
mock_os_makedirs.side_effect = OSError
self.mock_input.side_effect = ['/invalid/path', '/foo/tmp']
c = cli_docker_commands.UserInputHelper.get_valid_directory_input(prompt='prompt', default='/tmp')
self.assertEqual(c, '/foo/tmp')
mock_pretty_output.assert_called_with('OSError(): /invalid/path')
input_call_args = self.mock_input.call_args_list
self.assertEqual(2, len(input_call_args))
self.assertEqual('prompt [/tmp]: ', input_call_args[0][0][0])
self.assertEqual('Please provide a valid directory\nprompt [/tmp]: ', input_call_args[1][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.curses')
@mock.patch('tethys_cli.docker_commands.platform.system')
def test_log_pull_stream_linux_with_id_bad_status(self, mock_platform_system, mock_curses, mock_pretty_output):
mock_stream = [b'{ "id":"358464", "status":"foo", "progress":"bar" }']
mock_platform_system.return_value = 'Linux'
mock_curses.initscr().getmaxyx.return_value = 1, 80
cli_docker_commands.log_pull_stream(mock_stream)
mock_curses.initscr().addstr.assert_any_call(0, 0, u'foo '
u' ')
mock_curses.initscr().addstr.assert_called_with(1, 0, '--- '
' ')
mock_curses.initscr().refresh.assert_called_once()
mock_curses.noecho.assert_called_once()
mock_curses.cbreak.assert_called_once()
mock_curses.echo.assert_called_once()
mock_curses.nocbreak.assert_called_once()
mock_curses.endwin.assert_called_once()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.curses')
@mock.patch('tethys_cli.docker_commands.platform.system')
def test_log_pull_stream_linux_with_id_progress_status(self, mock_platform_system, mock_curses, mock_pretty_output):
mock_stream = [b'{ "id":"358464", "status":"Downloading", "progress":"bar" }']
mock_platform_system.return_value = 'Linux'
mock_curses.initscr().getmaxyx.return_value = 1, 80
cli_docker_commands.log_pull_stream(mock_stream)
mock_curses.initscr().addstr.assert_called_with(0, 0, '358464: Downloading bar '
' ')
mock_curses.initscr().refresh.assert_called_once()
mock_curses.noecho.assert_called_once()
mock_curses.cbreak.assert_called_once()
mock_curses.echo.assert_called_once()
mock_curses.nocbreak.assert_called_once()
mock_curses.endwin.assert_called_once()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.curses')
@mock.patch('tethys_cli.docker_commands.platform.system')
def test_log_pull_stream_linux_with_id_status(self, mock_platform_system, mock_curses, mock_pretty_output):
mock_stream = [b'{ "id":"358464", "status":"Downloading", "progress":"bar" }\r\n'
b'{ "id":"358464", "status":"Pulling fs layer", "progress":"baz" }']
mock_platform_system.return_value = 'Linux'
mock_curses.initscr().getmaxyx.return_value = 1, 80
cli_docker_commands.log_pull_stream(mock_stream)
mock_curses.initscr().addstr.assert_called_with(0, 0, '358464: Downloading bar '
' ')
mock_curses.initscr().refresh.assert_called()
mock_curses.noecho.assert_called_once()
mock_curses.cbreak.assert_called_once()
mock_curses.echo.assert_called_once()
mock_curses.nocbreak.assert_called_once()
mock_curses.endwin.assert_called_once()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.curses')
@mock.patch('tethys_cli.docker_commands.platform.system')
def test_log_pull_stream_linux_with_no_id(self, mock_platform_system, mock_curses, mock_pretty_output):
mock_stream = [b'{ "status":"foo", "progress":"bar" }']
mock_platform_system.return_value = 'Linux'
mock_curses.initscr().getmaxyx.return_value = 1, 80
cli_docker_commands.log_pull_stream(mock_stream)
mock_curses.initscr().addstr.assert_not_called()
mock_curses.initscr().refresh.assert_not_called()
mock_curses.noecho.assert_called_once()
mock_curses.cbreak.assert_called_once()
mock_curses.echo.assert_called_once()
mock_curses.nocbreak.assert_called_once()
mock_curses.endwin.assert_called_once()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual(u'foo', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.curses')
@mock.patch('tethys_cli.docker_commands.platform.system')
def test_log_pull_stream_linux_with_curses_error(self, mock_platform_system, mock_curses, mock_pretty_output):
import curses
mock_stream = [b'{ "id":"358464", "status":"Downloading", "progress":"bar" }\r\n']
mock_platform_system.return_value = 'Linux'
mock_curses.initscr().getmaxyx.return_value = 1, 80
mock_curses.error = curses.error # Since curses is mocked, need to reinstate this as a curses error
mock_curses.initscr().addstr.side_effect = curses.error # Raise Curses Error
cli_docker_commands.log_pull_stream(mock_stream)
mock_curses.initscr().addstr.assert_called_with(0, 0, '358464: Downloading bar '
' ')
mock_curses.initscr().refresh.assert_called_once()
mock_curses.noecho.assert_called_once()
mock_curses.cbreak.assert_called_once()
mock_curses.echo.assert_called_once()
mock_curses.nocbreak.assert_called_once()
mock_curses.endwin.assert_called_once()
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('', po_call_args[0][0][0])
@mock.patch('tethys_cli.docker_commands.write_pretty_output')
@mock.patch('tethys_cli.docker_commands.platform.system')
def test_log_pull_stream_windows(self, mock_platform_system, mock_pretty_output):
mock_stream = [b'{ "id":"358464", "status":"Downloading", "progress":"bar" }']
mock_platform_system.return_value = 'Windows'
cli_docker_commands.log_pull_stream(mock_stream)
po_call_args = mock_pretty_output.call_args_list
self.assertEqual(1, len(po_call_args))
self.assertEqual('358464:Downloading bar', po_call_args[0][0][0])
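The `log_pull_stream` tests above feed byte chunks of Docker's pull-progress stream: JSON objects separated by `\r\n`, each with a `status` and optionally an `id` and `progress`. A minimal sketch of just the parsing step (the real function additionally drives a curses UI on Linux):

```python
import json

def parse_pull_stream(stream):
    """Turn Docker pull-progress JSON lines into display strings.

    A sketch of the stream format exercised by the tests above; not the
    real log_pull_stream, which also renders via curses.
    """
    messages = []
    for chunk in stream:
        for line in chunk.split(b'\r\n'):
            if not line:
                continue
            data = json.loads(line)
            if 'id' in data:
                # Per-layer progress lines carry an id and a progress bar.
                messages.append('{}: {} {}'.format(
                    data['id'], data['status'], data.get('progress', '')))
            else:
                # Lines without an id are plain status messages.
                messages.append(data['status'])
    return messages

stream = [b'{ "id":"358464", "status":"Downloading", "progress":"bar" }\r\n'
          b'{ "status":"Pull complete" }']
print(parse_pull_stream(stream))  # → ['358464: Downloading bar', 'Pull complete']
```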
| 48.018265 | 120 | 0.68121 | 6,336 | 52,580 | 5.29435 | 0.061869 | 0.052854 | 0.099836 | 0.072678 | 0.835743 | 0.795081 | 0.759964 | 0.700015 | 0.655985 | 0.621106 | 0 | 0.015957 | 0.216927 | 52,580 | 1,094 | 121 | 48.062157 | 0.798757 | 0.024515 | 0 | 0.526486 | 0 | 0.001081 | 0.195286 | 0.111536 | 0 | 0 | 0 | 0 | 0.247568 | 1 | 0.081081 | false | 0.041081 | 0.007568 | 0 | 0.08973 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
904f23897493d9a3b10e11ea6508b8f088880817 | 374 | py | Python | boa/blockchain/vm/Neo/Input.py | chisleu/neo-boa | 799bb37f7e7862215f94f479cfe74a4dd4b8cba2 | [
"MIT"
] | 2 | 2017-11-27T08:45:34.000Z | 2021-03-08T03:08:56.000Z | boa/blockchain/vm/Neo/Input.py | chisleu/neo-boa | 799bb37f7e7862215f94f479cfe74a4dd4b8cba2 | [
"MIT"
] | 2 | 2018-02-13T07:30:09.000Z | 2021-06-01T22:02:52.000Z | boa/blockchain/vm/Neo/Input.py | localhuman/neo-boa | 799bb37f7e7862215f94f479cfe74a4dd4b8cba2 | [
"MIT"
] | null | null | null |
class TransactionInput():
@property
def Hash(self):
"""
:return: the hash of the previous transaction referenced by this input
"""
return GetHash(self)
@property
def Index(self):
"""
:return: the index of the output in the previous transaction
"""
return GetIndex(self)
def GetHash(input):
"""
:param input: the TransactionInput whose hash to return
"""
pass
def GetIndex(input):
"""
:param input: the TransactionInput whose output index to return
"""
pass
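`TransactionInput` delegates each property to a module-level function whose body is left as `pass` (the compiler substitutes the real interop call). The same delegation shape, as an ordinary runnable toy with hypothetical names:

```python
def GetWidth(box):
    # Stand-in for an interop/syscall implementation; in neo-boa the
    # module-level function body is supplied by the compiler.
    return box._width

class Box:
    def __init__(self, width):
        self._width = width

    @property
    def Width(self):
        # Same property-to-function delegation as TransactionInput.Hash.
        return GetWidth(self)

print(Box(3).Width)  # → 3
```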
| 10.685714 | 29 | 0.44385 | 30 | 374 | 5.533333 | 0.433333 | 0.13253 | 0.192771 | 0.228916 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.414439 | 374 | 34 | 30 | 11 | 0.757991 | 0.120321 | 0 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.363636 | false | 0.181818 | 0 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
905bf34a0f9bec7cbb40dda862f87edbb9f0c8ba | 3,584 | py | Python | tests/unit/test_bucket.py | Tingesplatform/tokenomics | b541a54b1709a1282c5133e731b26a6ca05ac901 | [
"MIT"
] | 2 | 2021-01-12T04:44:06.000Z | 2022-03-15T23:34:41.000Z | tests/unit/test_bucket.py | Tingesplatform/tokenomics | b541a54b1709a1282c5133e731b26a6ca05ac901 | [
"MIT"
] | 5 | 2019-01-14T07:17:34.000Z | 2021-06-01T23:38:58.000Z | tests/unit/test_bucket.py | Tingesplatform/tokenomics | b541a54b1709a1282c5133e731b26a6ca05ac901 | [
"MIT"
] | 2 | 2019-01-11T17:07:48.000Z | 2022-03-03T17:26:41.000Z | from datetime import datetime, timedelta
import pytest
from models.dao import Bucket
@pytest.fixture
def coupled_buckets(dai_stablecoin):
parent_bucket = Bucket(
name='Parent bucket',
withdraw_begin=datetime.now(),
token=dai_stablecoin,
max_volume=1000)
child_bucket = Bucket(
name='Child bucket',
withdraw_begin=datetime.now(),
token=dai_stablecoin,
max_volume=1000)
parent_bucket.set_overflow_bucket(child_bucket)
return (parent_bucket, child_bucket)
@pytest.mark.freeze_time
def test_bucket_creation(dai_stablecoin):
bucket = Bucket(
name='Test bucket',
withdraw_begin=datetime.now(),
token=dai_stablecoin,
max_volume=1000
)
assert bucket.name == 'Test bucket'
assert bucket.withdraw_begin == datetime.now()
assert bucket.token == dai_stablecoin
assert bucket.max_volume == 1000
assert bucket.overflow_bkt is None
assert bucket.set_overflow_bucket.caller_name == 'Governance'
assert bucket.flush.caller_name == 'Governance'
assert bucket.withdraw.caller_name == 'Tap'
def test_set_overflow_bucket(dai_stablecoin):
parent_bucket = Bucket(
name='Parent bucket',
withdraw_begin=datetime.now(),
token=dai_stablecoin,
max_volume=1000
)
child_bucket = Bucket(
name='Child bucket',
withdraw_begin=datetime.now(),
token=dai_stablecoin,
max_volume=1000
)
parent_bucket.set_overflow_bucket(child_bucket)
assert parent_bucket.overflow_bkt == child_bucket
def test_successful_flush(dai_stablecoin, coupled_buckets):
parent, child = coupled_buckets
dai_stablecoin.mint(parent, 1000)
parent.flush()
assert dai_stablecoin.balance_of(parent) == 1000
assert dai_stablecoin.balance_of(child) == 0
def test_overflow_flush(dai_stablecoin, coupled_buckets):
parent, child = coupled_buckets
dai_stablecoin.mint(parent, 1500)
parent.flush()
assert dai_stablecoin.balance_of(parent) == 1000
assert dai_stablecoin.balance_of(child) == 500
def test_child_overflow_flush(dai_stablecoin, coupled_buckets):
parent, child = coupled_buckets
dai_stablecoin.mint(parent, 5000)
parent.flush()
assert dai_stablecoin.balance_of(parent) == 1000
assert dai_stablecoin.balance_of(child) == 4000
def test_successful_withdraw(account, dai_stablecoin):
bucket = Bucket(
name='Test bucket',
withdraw_begin=datetime.now() - timedelta(days=1),
token=dai_stablecoin,
max_volume=1000
)
dai_stablecoin.mint(bucket, 1000)
bucket.withdraw(account, 700)
assert dai_stablecoin.balance_of(bucket) == 300
assert dai_stablecoin.balance_of(account) == 700
def test_withdraw_before_begin(account, dai_stablecoin):
bucket = Bucket(
name='Test bucket',
withdraw_begin=datetime.now() + timedelta(days=1),
token=dai_stablecoin,
max_volume=1000)
dai_stablecoin.mint(bucket, 1000)
bucket.withdraw(account, 700)
assert dai_stablecoin.balance_of(bucket) == 1000
assert dai_stablecoin.balance_of(account) == 0
def test_withdraw_too_much(account, dai_stablecoin):
bucket = Bucket(
name='Test bucket',
withdraw_begin=datetime.now() - timedelta(days=1), # withdrawals already open, so only the amount blocks
token=dai_stablecoin,
max_volume=1000)
dai_stablecoin.mint(bucket, 1000)
bucket.withdraw(account, 5000)
assert dai_stablecoin.balance_of(bucket) == 1000
assert dai_stablecoin.balance_of(account) == 0
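The flush tests above encode the overflow rule: a bucket keeps up to `max_volume` and passes the excess to its overflow bucket. A toy model of just that rule, using a plain balance dict (the real `Bucket` moves tokens through its token contract):

```python
def flush(balances, bucket, max_volume, overflow):
    """Move any balance above max_volume from bucket to its overflow bucket.

    A toy sketch of the behaviour the flush tests above assert; hypothetical
    signature, not the real Bucket.flush.
    """
    excess = balances.get(bucket, 0) - max_volume
    if excess > 0:
        balances[bucket] -= excess
        balances[overflow] = balances.get(overflow, 0) + excess
    return balances

# Mirrors test_child_overflow_flush: 5000 minted, parent keeps 1000.
print(flush({'parent': 5000}, 'parent', 1000, 'child'))  # → {'parent': 1000, 'child': 4000}
```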
| 26.746269 | 65 | 0.703962 | 434 | 3,584 | 5.543779 | 0.131336 | 0.194514 | 0.094763 | 0.129676 | 0.788446 | 0.733998 | 0.719451 | 0.719451 | 0.719451 | 0.719451 | 0 | 0.038192 | 0.203683 | 3,584 | 133 | 66 | 26.947368 | 0.804835 | 0 | 0 | 0.608247 | 0 | 0 | 0.035714 | 0 | 0 | 0 | 0 | 0 | 0.216495 | 1 | 0.092784 | false | 0 | 0.030928 | 0 | 0.134021 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
905cf2a26bb73fc6bcff464092fedf0ed4f029d2 | 975 | py | Python | vslomp/display/screen/utils.py | CharlesWillis3/vslomp | f0f77a2be199ce6e3e35efc0653ca4670f837216 | [
"MIT"
] | null | null | null | vslomp/display/screen/utils.py | CharlesWillis3/vslomp | f0f77a2be199ce6e3e35efc0653ca4670f837216 | [
"MIT"
] | null | null | null | vslomp/display/screen/utils.py | CharlesWillis3/vslomp | f0f77a2be199ce6e3e35efc0653ca4670f837216 | [
"MIT"
] | null | null | null | import importlib
from typing import Protocol, Sequence, Tuple, Type, cast
from PIL import Image
class EPDMonochromeProtocol(Protocol):
def init(self) -> None:
raise NotImplementedError
def getbuffer(self, image: Image.Image) -> Sequence[int]:
raise NotImplementedError
def display(self, image: Sequence[int]) -> None:
raise NotImplementedError
def Clear(self) -> None:
raise NotImplementedError
def sleep(self) -> None:
raise NotImplementedError
def Dev_exit(self) -> None:
raise NotImplementedError
def get_screen(name: str) -> Tuple[EPDMonochromeProtocol, Tuple[int, int]]:
epd_module = importlib.import_module("waveshare_epd." + name)
screen_size = (
cast(int, getattr(epd_module, "EPD_WIDTH")),
cast(int, getattr(epd_module, "EPD_HEIGHT")),
)
epd_class = cast(Type[EPDMonochromeProtocol], getattr(epd_module, "EPD"))
return (epd_class(), screen_size)
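`get_screen` resolves a driver module by name and pulls its width, height, and EPD class out with `getattr`, casting for the type checker. The same lookup pattern can be exercised without the `waveshare_epd` hardware package by standing in a fake module object (everything below is hypothetical — the attribute values are illustrative, not real panel dimensions to rely on):

```python
import types
from typing import cast

# Hypothetical stand-in for a waveshare_epd driver module.
fake_module = types.SimpleNamespace(
    EPD_WIDTH=122,
    EPD_HEIGHT=250,
    EPD=type('EPD', (), {'init': lambda self: None}),
)

# Same getattr-on-module pattern as get_screen above.
width = cast(int, getattr(fake_module, 'EPD_WIDTH'))
height = cast(int, getattr(fake_module, 'EPD_HEIGHT'))
epd = getattr(fake_module, 'EPD')()

print((width, height))  # → (122, 250)
```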
| 26.351351 | 77 | 0.683077 | 109 | 975 | 5.981651 | 0.348624 | 0.220859 | 0.248466 | 0.23773 | 0.294479 | 0.079755 | 0 | 0 | 0 | 0 | 0 | 0 | 0.211282 | 975 | 36 | 78 | 27.083333 | 0.847854 | 0 | 0 | 0.25 | 0 | 0 | 0.036923 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.291667 | false | 0 | 0.166667 | 0 | 0.541667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
906d9374934e97ca9382e4c86a9af66f377ac882 | 20,621 | py | Python | supar/parsers/sdp.py | attardi/parser | 1978ba94ba649ad0a723d71bb2ca225c7e705702 | [
"MIT"
] | null | null | null | supar/parsers/sdp.py | attardi/parser | 1978ba94ba649ad0a723d71bb2ca225c7e705702 | [
"MIT"
] | null | null | null | supar/parsers/sdp.py | attardi/parser | 1978ba94ba649ad0a723d71bb2ca225c7e705702 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import os
import torch
import torch.nn as nn
from supar.models import (BiaffineSemanticDependencyModel,
VISemanticDependencyModel)
from supar.parsers.parser import Parser
from supar.utils import Config, Dataset, Embedding
from supar.utils.common import BOS, PAD, UNK
from supar.utils.field import ChartField, Field, SubwordField
from supar.utils.logging import get_logger, progress_bar
from supar.utils.metric import ChartMetric
from supar.utils.transform import CoNLL
logger = get_logger(__name__)
class BiaffineSemanticDependencyParser(Parser):
    r"""
    The implementation of Biaffine Semantic Dependency Parser :cite:`dozat-etal-2018-simpler`.
    """

    NAME = 'biaffine-semantic-dependency'
    MODEL = BiaffineSemanticDependencyModel

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

        self.WORD, self.CHAR, self.BERT = self.transform.FORM
        self.LEMMA = self.transform.LEMMA
        self.TAG = self.transform.POS
        self.LABEL = self.transform.PHEAD
    def train(self, train, dev, test, buckets=32, batch_size=5000, update_steps=1, verbose=True, **kwargs):
        r"""
        Args:
            train/dev/test (list[list] or str):
                Filenames of the train/dev/test datasets.
            buckets (int):
                The number of buckets that sentences are assigned to. Default: 32.
            batch_size (int):
                The number of tokens in each batch. Default: 5000.
            update_steps (int):
                Gradient accumulation steps. Default: 1.
            verbose (bool):
                If ``True``, increases the output verbosity. Default: ``True``.
            kwargs (dict):
                A dict holding unconsumed arguments for updating training configs.
        """

        return super().train(**Config().update(locals()))
    def evaluate(self, data, buckets=8, batch_size=5000, verbose=True, **kwargs):
        r"""
        Args:
            data (str):
                The data for evaluation; both a list of instances and a filename are allowed.
            buckets (int):
                The number of buckets that sentences are assigned to. Default: 8.
            batch_size (int):
                The number of tokens in each batch. Default: 5000.
            verbose (bool):
                If ``True``, increases the output verbosity. Default: ``True``.
            kwargs (dict):
                A dict holding unconsumed arguments for updating evaluation configs.

        Returns:
            The loss scalar and evaluation results.
        """

        return super().evaluate(**Config().update(locals()))
    def predict(self, data, pred=None, lang=None, buckets=8, batch_size=5000, verbose=True, **kwargs):
        r"""
        Args:
            data (list[list] or str):
                The data for prediction; both a list of instances and a filename are allowed.
            pred (str):
                If specified, the predicted results will be saved to the file. Default: ``None``.
            lang (str):
                Language code (e.g., ``en``) or language name (e.g., ``English``) for the text to tokenize.
                ``None`` if tokenization is not required.
                Default: ``None``.
            buckets (int):
                The number of buckets that sentences are assigned to. Default: 8.
            batch_size (int):
                The number of tokens in each batch. Default: 5000.
            prob (bool):
                If ``True``, outputs the probabilities. Default: ``False``.
            verbose (bool):
                If ``True``, increases the output verbosity. Default: ``True``.
            kwargs (dict):
                A dict holding unconsumed arguments for updating prediction configs.

        Returns:
            A :class:`~supar.utils.Dataset` object that stores the predicted results.
        """

        return super().predict(**Config().update(locals()))
    @classmethod
    def load(cls, path, reload=False, src=None, **kwargs):
        r"""
        Loads a parser with data fields and pretrained model parameters.

        Args:
            path (str):
                - a string with the shortcut name of a pretrained model defined in ``supar.MODEL``
                  to load from cache or download, e.g., ``'biaffine-sdp-en'``.
                - a local path to a pretrained model, e.g., ``./<path>/model``.
            reload (bool):
                Whether to discard the existing cache and force a fresh download. Default: ``False``.
            src (str):
                Specifies where to download the model.
                ``'github'``: github release page.
                ``'hlt'``: hlt homepage, only accessible from 9:00 to 18:00 (UTC+8).
                Default: None.
            kwargs (dict):
                A dict holding unconsumed arguments for updating training configs and initializing the model.

        Examples:
            >>> from supar import Parser
            >>> parser = Parser.load('biaffine-sdp-en')
            >>> parser = Parser.load('./dm.biaffine.sdp.lstm.char')
        """

        return super().load(path, reload, src, **kwargs)
    def _train(self, loader):
        self.model.train()

        bar, metric = progress_bar(loader), ChartMetric()

        for i, (words, *feats, labels) in enumerate(bar, 1):
            word_mask = words.ne(self.args.pad_index)
            mask = word_mask if len(words.shape) < 3 else word_mask.any(-1)
            mask = mask.unsqueeze(1) & mask.unsqueeze(2)
            mask[:, 0] = 0
            s_edge, s_label = self.model(words)  # Attardi: , feats)
            loss = self.model.loss(s_edge, s_label, labels, mask)
            loss = loss / self.args.update_steps
            loss.backward()
            nn.utils.clip_grad_norm_(self.model.parameters(), self.args.clip)
            if i % self.args.update_steps == 0:
                self.optimizer.step()
                self.scheduler.step()
                self.optimizer.zero_grad()

            label_preds = self.model.decode(s_edge, s_label)
            metric(label_preds.masked_fill(~mask, -1), labels.masked_fill(~mask, -1))
            bar.set_postfix_str(f"lr: {self.scheduler.get_last_lr()[0]:.4e} - loss: {loss:.4f} - {metric}")
        logger.info(f"{bar.postfix}")
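The training loop above divides each batch loss by `update_steps` and only steps the optimizer every `update_steps` batches, so gradients from several small batches accumulate into one effective update. The accounting can be sketched without torch (names here are illustrative, not supar's API):

```python
update_steps = 4
applied = []        # batch indices at which an optimizer step fires
accumulated = 0.0   # stands in for the accumulated gradient

for i, batch_loss in enumerate([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0], 1):
    accumulated += batch_loss / update_steps  # loss scaled, as in _train
    if i % update_steps == 0:                 # optimizer.step() + zero_grad()
        applied.append(i)
        accumulated = 0.0
```

With eight batches and `update_steps=4`, updates fire after batches 4 and 8 only, each carrying the mean of four scaled losses.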
    @torch.no_grad()
    def _evaluate(self, loader):
        self.model.eval()

        total_loss, metric = 0, ChartMetric()

        for words, *feats, labels in loader:
            word_mask = words.ne(self.args.pad_index)
            mask = word_mask if len(words.shape) < 3 else word_mask.any(-1)
            mask = mask.unsqueeze(1) & mask.unsqueeze(2)
            mask[:, 0] = 0
            s_edge, s_label = self.model(words)  # Attardi: , feats)
            loss = self.model.loss(s_edge, s_label, labels, mask)
            total_loss += loss.item()

            label_preds = self.model.decode(s_edge, s_label)
            metric(label_preds.masked_fill(~mask, -1), labels.masked_fill(~mask, -1))
        total_loss /= len(loader)

        return total_loss, metric
    @torch.no_grad()
    def _predict(self, loader):
        self.model.eval()

        preds = {'labels': [], 'probs': [] if self.args.prob else None}
        for words, *feats in progress_bar(loader):
            word_mask = words.ne(self.args.pad_index)
            mask = word_mask if len(words.shape) < 3 else word_mask.any(-1)
            mask = mask.unsqueeze(1) & mask.unsqueeze(2)
            mask[:, 0] = 0
            lens = mask[:, 1].sum(-1).tolist()
            s_edge, s_label = self.model(words, feats)
            label_preds = self.model.decode(s_edge, s_label).masked_fill(~mask, -1)
            preds['labels'].extend(chart[1:i, :i].tolist() for i, chart in zip(lens, label_preds))
            if self.args.prob:
                preds['probs'].extend([prob[1:i, :i].cpu() for i, prob in zip(lens, s_edge.softmax(-1).unbind())])
        preds['labels'] = [CoNLL.build_relations([[self.LABEL.vocab[i] if i >= 0 else None for i in row] for row in chart])
                           for chart in preds['labels']]

        return preds
    @classmethod
    def build(cls, path, min_freq=7, fix_len=20, **kwargs):
        r"""
        Build a brand-new Parser, including initialization of all data fields and model parameters.

        Args:
            path (str):
                The path of the model to be saved.
            min_freq (str):
                The minimum frequency needed to include a token in the vocabulary. Default: 7.
            fix_len (int):
                The max length of all subword pieces. The excess part of each piece will be truncated.
                Required if using CharLSTM/BERT.
                Default: 20.
            kwargs (dict):
                A dict holding the unconsumed arguments.
        """

        args = Config(**locals())
        args.device = 'cuda' if torch.cuda.is_available() else 'cpu'
        os.makedirs(os.path.dirname(path) or './', exist_ok=True)
        if os.path.exists(path) and not args.build:
            parser = cls.load(**args)
            parser.model = cls.MODEL(**parser.args)
            parser.model.load_pretrained(parser.WORD.embed).to(args.device)
            return parser

        logger.info("Building the fields")
        WORD = Field('words', pad=PAD, unk=UNK, bos=BOS, lower=True)
        TAG, CHAR, LEMMA, BERT = None, None, None, None
        if args.encoder != 'lstm':
            from transformers import (AutoTokenizer, GPT2Tokenizer,
                                      GPT2TokenizerFast)
            t = AutoTokenizer.from_pretrained(args.bert)
            WORD = SubwordField('words',
                                pad=t.pad_token,
                                unk=t.unk_token,
                                bos=t.bos_token or t.cls_token,
                                fix_len=args.fix_len,
                                tokenize=t.tokenize,
                                fn=None if not isinstance(t, (GPT2Tokenizer, GPT2TokenizerFast)) else lambda x: ' '+x)
            WORD.vocab = t.get_vocab()
        else:
            WORD = Field('words', pad=PAD, unk=UNK, bos=BOS, lower=True)
            if 'tag' in args.feat:
                TAG = Field('tags', bos=BOS)
            if 'char' in args.feat:
                CHAR = SubwordField('chars', pad=PAD, unk=UNK, bos=BOS, fix_len=args.fix_len)
            if 'lemma' in args.feat:
                LEMMA = Field('lemmas', pad=PAD, unk=UNK, bos=BOS, lower=True)
            if 'bert' in args.feat:
                from transformers import (AutoTokenizer, GPT2Tokenizer,
                                          GPT2TokenizerFast)
                t = AutoTokenizer.from_pretrained(args.bert)
                BERT = SubwordField('bert',
                                    pad=t.pad_token,
                                    unk=t.unk_token,
                                    bos=t.bos_token or t.cls_token,
                                    fix_len=args.fix_len,
                                    tokenize=t.tokenize,
                                    fn=None if not isinstance(t, (GPT2Tokenizer, GPT2TokenizerFast)) else lambda x: ' '+x)
                BERT.vocab = t.get_vocab()
        LABEL = ChartField('labels', fn=CoNLL.get_labels)
        transform = CoNLL(FORM=(WORD, CHAR, BERT), LEMMA=LEMMA, POS=TAG, PHEAD=LABEL)

        train = Dataset(transform, args.train)
        if args.encoder == 'lstm':
            WORD.build(train, args.min_freq, (Embedding.load(args.embed, args.unk) if args.embed else None))
            if TAG is not None:
                TAG.build(train)
            if CHAR is not None:
                CHAR.build(train)
            if LEMMA is not None:
                LEMMA.build(train)
        LABEL.build(train)
        args.update({
            'n_words': len(WORD.vocab) if args.encoder != 'lstm' else WORD.vocab.n_init,
            'n_labels': len(LABEL.vocab),
            'n_tags': len(TAG.vocab) if TAG is not None else None,
            'n_chars': len(CHAR.vocab) if CHAR is not None else None,
            'char_pad_index': CHAR.pad_index if CHAR is not None else None,
            'n_lemmas': len(LEMMA.vocab) if LEMMA is not None else None,
            'bert_pad_index': BERT.pad_index if BERT is not None else None,
            'pad_index': WORD.pad_index,
            'unk_index': WORD.unk_index,
            'bos_index': WORD.bos_index
        })
        logger.info(f"{transform}")

        logger.info("Building the model")
        model = cls.MODEL(**args).load_pretrained(WORD.embed if hasattr(WORD, 'embed') else None).to(args.device)
        logger.info(f"{model}\n")

        return cls(args, model, transform)
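`train`, `evaluate`, and `predict` all funnel their arguments through `Config().update(locals())`, merging keyword defaults, explicit arguments, and trailing `**kwargs` into one flat config. A rough stand-in for that idiom (this is a sketch, not supar's actual `Config` class):

```python
class MiniConfig(dict):
    """Rough stand-in for supar.utils.Config, for illustration only."""

    def update(self, args):
        args = dict(args)
        args.update(args.pop('kwargs', {}))  # flatten the trailing **kwargs
        args.pop('self', None)               # drop the bound method's self
        super().update(args)
        return self


def evaluate(data, buckets=8, batch_size=5000, **kwargs):
    # locals() captures every argument, named or not, in one dict
    return MiniConfig().update(locals())


cfg = evaluate('dm.conllu', batch_size=3000, prob=True)
```

Here `cfg` carries the explicit `data` and `batch_size`, the untouched default `buckets=8`, and the extra `prob=True` lifted out of `**kwargs`, which is why unconsumed keyword arguments can still reach the training configs downstream.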
class VISemanticDependencyParser(BiaffineSemanticDependencyParser):
    r"""
    The implementation of Semantic Dependency Parser using Variational Inference :cite:`wang-etal-2019-second`.
    """

    NAME = 'vi-semantic-dependency'
    MODEL = VISemanticDependencyModel

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

        self.WORD, self.CHAR, self.BERT = self.transform.FORM
        self.LEMMA = self.transform.LEMMA
        self.TAG = self.transform.POS
        self.LABEL = self.transform.PHEAD
    def train(self, train, dev, test, buckets=32, batch_size=5000, update_steps=1, verbose=True, **kwargs):
        r"""
        Args:
            train/dev/test (list[list] or str):
                Filenames of the train/dev/test datasets.
            buckets (int):
                The number of buckets that sentences are assigned to. Default: 32.
            batch_size (int):
                The number of tokens in each batch. Default: 5000.
            update_steps (int):
                Gradient accumulation steps. Default: 1.
            verbose (bool):
                If ``True``, increases the output verbosity. Default: ``True``.
            kwargs (dict):
                A dict holding unconsumed arguments for updating training configs.
        """

        return super().train(**Config().update(locals()))
    def evaluate(self, data, buckets=8, batch_size=5000, verbose=True, **kwargs):
        r"""
        Args:
            data (str):
                The data for evaluation; both a list of instances and a filename are allowed.
            buckets (int):
                The number of buckets that sentences are assigned to. Default: 8.
            batch_size (int):
                The number of tokens in each batch. Default: 5000.
            verbose (bool):
                If ``True``, increases the output verbosity. Default: ``True``.
            kwargs (dict):
                A dict holding unconsumed arguments for updating evaluation configs.

        Returns:
            The loss scalar and evaluation results.
        """

        return super().evaluate(**Config().update(locals()))
    def predict(self, data, pred=None, lang=None, buckets=8, batch_size=5000, verbose=True, **kwargs):
        r"""
        Args:
            data (list[list] or str):
                The data for prediction; both a list of instances and a filename are allowed.
            pred (str):
                If specified, the predicted results will be saved to the file. Default: ``None``.
            lang (str):
                Language code (e.g., ``en``) or language name (e.g., ``English``) for the text to tokenize.
                ``None`` if tokenization is not required.
                Default: ``None``.
            buckets (int):
                The number of buckets that sentences are assigned to. Default: 8.
            batch_size (int):
                The number of tokens in each batch. Default: 5000.
            prob (bool):
                If ``True``, outputs the probabilities. Default: ``False``.
            verbose (bool):
                If ``True``, increases the output verbosity. Default: ``True``.
            kwargs (dict):
                A dict holding unconsumed arguments for updating prediction configs.

        Returns:
            A :class:`~supar.utils.Dataset` object that stores the predicted results.
        """

        return super().predict(**Config().update(locals()))
    @classmethod
    def load(cls, path, reload=False, src=None, **kwargs):
        r"""
        Loads a parser with data fields and pretrained model parameters.

        Args:
            path (str):
                - a string with the shortcut name of a pretrained model defined in ``supar.MODEL``
                  to load from cache or download, e.g., ``'vi-sdp-en'``.
                - a local path to a pretrained model, e.g., ``./<path>/model``.
            reload (bool):
                Whether to discard the existing cache and force a fresh download. Default: ``False``.
            src (str):
                Specifies where to download the model.
                ``'github'``: github release page.
                ``'hlt'``: hlt homepage, only accessible from 9:00 to 18:00 (UTC+8).
                Default: None.
            kwargs (dict):
                A dict holding unconsumed arguments for updating training configs and initializing the model.

        Examples:
            >>> from supar import Parser
            >>> parser = Parser.load('vi-sdp-en')
            >>> parser = Parser.load('./dm.vi.sdp.lstm.char')
        """

        return super().load(path, reload, src, **kwargs)
    def _train(self, loader):
        self.model.train()

        bar, metric = progress_bar(loader), ChartMetric()

        for i, (words, *feats, labels) in enumerate(bar, 1):
            word_mask = words.ne(self.args.pad_index)
            mask = word_mask if len(words.shape) < 3 else word_mask.any(-1)
            mask = mask.unsqueeze(1) & mask.unsqueeze(2)
            mask[:, 0] = 0
            s_edge, s_sib, s_cop, s_grd, s_label = self.model(words, feats)
            loss, s_edge = self.model.loss(s_edge, s_sib, s_cop, s_grd, s_label, labels, mask)
            loss = loss / self.args.update_steps
            loss.backward()
            nn.utils.clip_grad_norm_(self.model.parameters(), self.args.clip)
            if i % self.args.update_steps == 0:
                self.optimizer.step()
                self.scheduler.step()
                self.optimizer.zero_grad()

            label_preds = self.model.decode(s_edge, s_label)
            metric(label_preds.masked_fill(~mask, -1), labels.masked_fill(~mask, -1))
            bar.set_postfix_str(f"lr: {self.scheduler.get_last_lr()[0]:.4e} - loss: {loss:.4f} - {metric}")
        logger.info(f"{bar.postfix}")
    @torch.no_grad()
    def _evaluate(self, loader):
        self.model.eval()

        total_loss, metric = 0, ChartMetric()

        for words, *feats, labels in loader:
            word_mask = words.ne(self.args.pad_index)
            mask = word_mask if len(words.shape) < 3 else word_mask.any(-1)
            mask = mask.unsqueeze(1) & mask.unsqueeze(2)
            mask[:, 0] = 0
            s_edge, s_sib, s_cop, s_grd, s_label = self.model(words, feats)
            loss, s_edge = self.model.loss(s_edge, s_sib, s_cop, s_grd, s_label, labels, mask)
            total_loss += loss.item()

            label_preds = self.model.decode(s_edge, s_label)
            metric(label_preds.masked_fill(~mask, -1), labels.masked_fill(~mask, -1))
        total_loss /= len(loader)

        return total_loss, metric
    @torch.no_grad()
    def _predict(self, loader):
        self.model.eval()

        preds = {'labels': [], 'probs': [] if self.args.prob else None}
        for words, *feats in progress_bar(loader):
            word_mask = words.ne(self.args.pad_index)
            mask = word_mask if len(words.shape) < 3 else word_mask.any(-1)
            mask = mask.unsqueeze(1) & mask.unsqueeze(2)
            mask[:, 0] = 0
            lens = mask[:, 1].sum(-1).tolist()
            s_edge, s_sib, s_cop, s_grd, s_label = self.model(words, feats)
            s_edge = self.model.inference((s_edge, s_sib, s_cop, s_grd), mask)
            label_preds = self.model.decode(s_edge, s_label).masked_fill(~mask, -1)
            preds['labels'].extend(chart[1:i, :i].tolist() for i, chart in zip(lens, label_preds))
            if self.args.prob:
                preds['probs'].extend([prob[1:i, :i].cpu() for i, prob in zip(lens, s_edge.unbind())])
        preds['labels'] = [CoNLL.build_relations([[self.LABEL.vocab[i] if i >= 0 else None for i in row] for row in chart])
                           for chart in preds['labels']]

        return preds
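Every loop in these parsers builds the same pairwise mask: a per-token validity mask, an outer product over position pairs, then the row for position 0 (the BOS token) zeroed out by `mask[:, 0] = 0`. A torch-free sketch of that construction for a single sentence (values are illustrative):

```python
pad_index = 0
words = [5, 9, 2, 0]  # BOS id, two real tokens, one pad
word_mask = [w != pad_index for w in words]   # words.ne(pad_index)

n = len(words)
# mask.unsqueeze(1) & mask.unsqueeze(2): a pair (i, j) is valid only
# when both positions hold real (non-pad) tokens
mask = [[word_mask[i] and word_mask[j] for j in range(n)] for i in range(n)]
# mask[:, 0] = 0: the BOS position never participates as an edge endpoint
mask[0] = [False] * n
```

Pairs touching the pad at position 3 are already invalid from the outer product, and the extra assignment removes the whole BOS row, which is why decoded charts are later sliced as `chart[1:i, :i]`.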
| 43.688559 | 123 | 0.565637 | 2,538 | 20,621 | 4.504728 | 0.124113 | 0.01968 | 0.008922 | 0.014694 | 0.78772 | 0.774163 | 0.766465 | 0.76078 | 0.759293 | 0.756582 | 0 | 0.012826 | 0.319432 | 20,621 | 471 | 124 | 43.781316 | 0.801838 | 0.319335 | 0 | 0.644898 | 0 | 0.008163 | 0.040736 | 0.009751 | 0 | 0 | 0 | 0 | 0 | 1 | 0.069388 | false | 0 | 0.053061 | 0 | 0.204082 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
907ee492f656d699b9126f3df76d06e9349109c5 | 876 | py | Python | server/openapi_server/models/__init__.py | paulheider/date-annotator-example | ecc2082750034f81bad702955d67dfaf38b832c6 | [
"Apache-2.0"
] | 4 | 2021-06-10T01:39:56.000Z | 2022-01-28T15:47:45.000Z | server/openapi_server/models/__init__.py | paulheider/date-annotator-example | ecc2082750034f81bad702955d67dfaf38b832c6 | [
"Apache-2.0"
] | 61 | 2020-11-26T00:25:03.000Z | 2022-03-31T20:31:10.000Z | server/openapi_server/models/__init__.py | paulheider/date-annotator-example | ecc2082750034f81bad702955d67dfaf38b832c6 | [
"Apache-2.0"
] | 4 | 2021-03-03T19:58:12.000Z | 2021-08-07T03:10:32.000Z | # coding: utf-8
# flake8: noqa
from __future__ import absolute_import
# import models into model package
from openapi_server.models.error import Error
from openapi_server.models.health_check import HealthCheck
from openapi_server.models.license import License
from openapi_server.models.note import Note
from openapi_server.models.text_annotation import TextAnnotation
from openapi_server.models.text_date_annotation import TextDateAnnotation
from openapi_server.models.text_date_annotation_all_of import TextDateAnnotationAllOf
from openapi_server.models.text_date_annotation_request import TextDateAnnotationRequest
from openapi_server.models.text_date_annotation_response import TextDateAnnotationResponse
from openapi_server.models.tool import Tool
from openapi_server.models.tool_dependencies import ToolDependencies
from openapi_server.models.tool_type import ToolType
| 48.666667 | 90 | 0.891553 | 115 | 876 | 6.504348 | 0.330435 | 0.176471 | 0.272727 | 0.368984 | 0.363636 | 0.219251 | 0.219251 | 0 | 0 | 0 | 0 | 0.002466 | 0.074201 | 876 | 17 | 91 | 51.529412 | 0.919852 | 0.067352 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
909669b75f8a4593d22fe2703062286a049454bc | 148 | py | Python | just/wechat/__init__.py | kenhancoder/just | 98f2f997ec61593de4b6d02534b492cee920fda9 | [
"BSD-3-Clause"
] | 1 | 2016-05-24T15:16:41.000Z | 2016-05-24T15:16:41.000Z | just/wechat/__init__.py | kenhancoder/just | 98f2f997ec61593de4b6d02534b492cee920fda9 | [
"BSD-3-Clause"
] | null | null | null | just/wechat/__init__.py | kenhancoder/just | 98f2f997ec61593de4b6d02534b492cee920fda9 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""The wechat module."""
from . import views # noqa
from views import wechat_entity
__all__ = ['views', 'wechat_entity']
| 18.5 | 36 | 0.655405 | 19 | 148 | 4.789474 | 0.631579 | 0.263736 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00813 | 0.168919 | 148 | 7 | 37 | 21.142857 | 0.731707 | 0.310811 | 0 | 0 | 0 | 0 | 0.189474 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
291517f43040e7f169cdb17a9b464218a029fed6 | 205 | py | Python | intermediate/my_package/public_function.py | duckherder/python-reminders | 23f650142b0745dbd7a51445aba186d85933300b | [
"MIT"
] | null | null | null | intermediate/my_package/public_function.py | duckherder/python-reminders | 23f650142b0745dbd7a51445aba186d85933300b | [
"MIT"
] | null | null | null | intermediate/my_package/public_function.py | duckherder/python-reminders | 23f650142b0745dbd7a51445aba186d85933300b | [
"MIT"
] | null | null | null | """my public function module"""
__all__ = ['my_public_function']
def my_private_function():
    print("hello from private function!")


def my_public_function():
    print("hello from public function!")
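With `__all__ = ['my_public_function']`, a wildcard import pulls in only the listed name, leaving `my_private_function` behind. A self-contained demo that builds an equivalent module from source and simulates `import *` (the module name here is made up):

```python
import types

source = '''
__all__ = ['my_public_function']

def my_private_function():
    return "private"

def my_public_function():
    return "public"
'''

mod = types.ModuleType('my_package_demo')  # hypothetical module name
exec(source, mod.__dict__)

# simulate `from my_package_demo import *`: only names in __all__ are copied
ns = {name: getattr(mod, name) for name in mod.__all__}
```

Note that `__all__` only governs `import *`; `from my_package_demo import my_private_function` would still work.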
| 18.636364 | 41 | 0.717073 | 26 | 205 | 5.269231 | 0.384615 | 0.408759 | 0.350365 | 0.321168 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15122 | 205 | 10 | 42 | 20.5 | 0.787356 | 0.121951 | 0 | 0 | 0 | 0 | 0.41954 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0 | 0.4 | 0.4 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
292f3dfc5fbc2de09e55da9d6795ae00c7980ac5 | 213 | py | Python | tests/__init__.py | pjogrady/pymeta3 | 954f396d763b72b5ebb2683299f22f3c8640afd7 | [
"MIT"
] | 5 | 2015-01-09T12:09:49.000Z | 2021-09-25T13:35:09.000Z | tests/__init__.py | pjogrady/pymeta3 | 954f396d763b72b5ebb2683299f22f3c8640afd7 | [
"MIT"
] | null | null | null | tests/__init__.py | pjogrady/pymeta3 | 954f396d763b72b5ebb2683299f22f3c8640afd7 | [
"MIT"
] | 3 | 2015-10-09T18:01:41.000Z | 2019-06-24T17:39:39.000Z | from .test_builder import PythonWriterTests
from .test_pymeta import (HandyWrapper, MakeGrammarTest, NullOptimizerTest,
OMetaTestCase, PyExtractorTest, SelfHostingTest)
from .test_runtime import RuntimeTests
| 42.6 | 76 | 0.849765 | 20 | 213 | 8.9 | 0.7 | 0.134831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103286 | 213 | 4 | 77 | 53.25 | 0.931937 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
2934155729e9d28a538c759301660e01c83a4591 | 165 | py | Python | tests/web_platform/css_grid_1/grid_model/test_grid_inline_first_letter.py | fletchgraham/colosseum | 77be4896ee52b8f5956a3d77b5f2ccd2c8608e8f | [
"BSD-3-Clause"
] | null | null | null | tests/web_platform/css_grid_1/grid_model/test_grid_inline_first_letter.py | fletchgraham/colosseum | 77be4896ee52b8f5956a3d77b5f2ccd2c8608e8f | [
"BSD-3-Clause"
] | null | null | null | tests/web_platform/css_grid_1/grid_model/test_grid_inline_first_letter.py | fletchgraham/colosseum | 77be4896ee52b8f5956a3d77b5f2ccd2c8608e8f | [
"BSD-3-Clause"
] | 1 | 2020-01-16T01:56:41.000Z | 2020-01-16T01:56:41.000Z | from tests.utils import W3CTestCase
class TestGridInlineFirstLetter(W3CTestCase):
    vars().update(W3CTestCase.find_tests(__file__, 'grid-inline-first-letter-'))
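Calling `vars().update(...)` inside a class body merges a dict of generated methods into the class namespace as it is being created; `W3CTestCase.find_tests` is assumed to return such a dict of test methods. A minimal sketch of the same trick with a hypothetical generator:

```python
def make_tests(prefix):
    """Hypothetical stand-in for find_tests: one generated method per case."""
    def check(self):
        return prefix
    return {f'test_{prefix}_basic': check}


class GridTests:
    # vars() is the class namespace here, so this injects the generated
    # methods during class creation, exactly like the W3C test classes
    vars().update(make_tests('grid_inline'))


t = GridTests()
```

This works because a class body executes in a real dict namespace, unlike a function body, so mutating `vars()` there persists onto the finished class.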
| 27.5 | 80 | 0.8 | 18 | 165 | 7.055556 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019868 | 0.084848 | 165 | 5 | 81 | 33 | 0.821192 | 0 | 0 | 0 | 0 | 0 | 0.152439 | 0.152439 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
2955d376e774b70800e9ca16d4699d8c3b20018c | 109 | py | Python | ghubunix/models/config.py | BBloggsbott/ghub-unix | acbcbbc81a90bcec89eb19d0b530983eb5e99a7d | [
"Apache-2.0"
] | null | null | null | ghubunix/models/config.py | BBloggsbott/ghub-unix | acbcbbc81a90bcec89eb19d0b530983eb5e99a7d | [
"Apache-2.0"
] | null | null | null | ghubunix/models/config.py | BBloggsbott/ghub-unix | acbcbbc81a90bcec89eb19d0b530983eb5e99a7d | [
"Apache-2.0"
] | null | null | null | from pydantic import BaseModel
class Config(BaseModel):
    """Data model for Config"""

    username: str
| 13.625 | 31 | 0.697248 | 13 | 109 | 5.846154 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.211009 | 109 | 7 | 32 | 15.571429 | 0.883721 | 0.192661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
46591b500763e85cae6abf2f9fef72e1c63ea4b2 | 23 | py | Python | project/test.py | carlalmadureira/SonarQube-and-Scanner-Docker | 96c37ce06b735f2a2a51dc74136d04cb5f77d132 | [
"Unlicense"
] | null | null | null | project/test.py | carlalmadureira/SonarQube-and-Scanner-Docker | 96c37ce06b735f2a2a51dc74136d04cb5f77d132 | [
"Unlicense"
] | null | null | null | project/test.py | carlalmadureira/SonarQube-and-Scanner-Docker | 96c37ce06b735f2a2a51dc74136d04cb5f77d132 | [
"Unlicense"
] | null | null | null | print('this is a test') | 23 | 23 | 0.695652 | 5 | 23 | 3.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 23 | 1 | 23 | 23 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
46affd72cc75af76d833c9bea32915680ed52163 | 175 | py | Python | Labs/vision/python_code/__init__.py | aidenchg/symmetrical-bassoon | 8487dced605f845b23200b387d03bde02642b988 | [
"MIT"
] | 2 | 2020-07-22T05:14:54.000Z | 2020-11-28T11:26:08.000Z | Labs/vision/python_code/__init__.py | aidenchg/symmetrical-bassoon | 8487dced605f845b23200b387d03bde02642b988 | [
"MIT"
] | null | null | null | Labs/vision/python_code/__init__.py | aidenchg/symmetrical-bassoon | 8487dced605f845b23200b387d03bde02642b988 | [
"MIT"
] | 6 | 2020-07-18T09:16:35.000Z | 2020-11-28T12:22:15.000Z | from .vision import show_image_caption, show_image_analysis, show_bounding_boxes
from .faces import show_faces, show_face_attributes, show_similar_faces, show_recognized_faces | 87.5 | 94 | 0.891429 | 26 | 175 | 5.5 | 0.538462 | 0.13986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068571 | 175 | 2 | 94 | 87.5 | 0.877301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
46b4f812cdaa809bb11e24ae84373ad049499485 | 1,239 | py | Python | test/test_installed.py | hfmark/palettable | e119c8450a2e7a333d9a5998d5fb8569408f6687 | [
"MIT"
] | null | null | null | test/test_installed.py | hfmark/palettable | e119c8450a2e7a333d9a5998d5fb8569408f6687 | [
"MIT"
] | null | null | null | test/test_installed.py | hfmark/palettable | e119c8450a2e7a333d9a5998d5fb8569408f6687 | [
"MIT"
] | null | null | null | """
Test installed palettable to make sure everything is accessible.
"""
import palettable
from palettable.palette import Palette
def test_colorbrewer():
    assert isinstance(palettable.colorbrewer.diverging.PuOr_6, Palette)
    assert isinstance(palettable.colorbrewer.qualitative.Pastel1_9, Palette)
    assert isinstance(palettable.colorbrewer.sequential.PuBuGn_9, Palette)


def test_cubehelix():
    assert isinstance(palettable.cubehelix.classic_16, Palette)


def test_tableau():
    assert isinstance(palettable.tableau.ColorBlind_10, Palette)


def test_wes_anderson():
    assert isinstance(palettable.wesanderson.Aquatic1_5, Palette)


def test_woods_hole():
    assert isinstance(palettable.woodshole.whoi_6, Palette)


def test_matplotlib():
    assert isinstance(palettable.matplotlib.Viridis_8, Palette)


def test_mycarta():
    assert isinstance(palettable.mycarta.CubeYF_8, Palette)


def test_cmocean():
    assert isinstance(palettable.cmocean.sequential.Amp_8, Palette)
    assert isinstance(palettable.cmocean.diverging.Balance_8, Palette)


def test_cartocolors():
    assert isinstance(palettable.cartocolors.sequential.Mint_7, Palette)
    assert isinstance(palettable.cartocolors.diverging.Earth_7, Palette)
| 25.8125 | 76 | 0.795803 | 143 | 1,239 | 6.727273 | 0.356643 | 0.216216 | 0.351351 | 0.137214 | 0.091476 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015554 | 0.117837 | 1,239 | 47 | 77 | 26.361702 | 0.864593 | 0.051655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.541667 | 1 | 0.375 | true | 0 | 0.083333 | 0 | 0.458333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
31838007e426898d7d7ecec6627ab458c63f3ed3 | 150 | py | Python | zcrmsdk/src/com/zoho/crm/api/share_records/delete_action_handler.py | zoho/zohocrm-python-sdk-2.0 | 3a93eb3b57fed4e08f26bd5b311e101cb2995411 | [
"Apache-2.0"
] | null | null | null | zcrmsdk/src/com/zoho/crm/api/share_records/delete_action_handler.py | zoho/zohocrm-python-sdk-2.0 | 3a93eb3b57fed4e08f26bd5b311e101cb2995411 | [
"Apache-2.0"
] | null | null | null | zcrmsdk/src/com/zoho/crm/api/share_records/delete_action_handler.py | zoho/zohocrm-python-sdk-2.0 | 3a93eb3b57fed4e08f26bd5b311e101cb2995411 | [
"Apache-2.0"
] | null | null | null | from abc import ABC, abstractmethod
class DeleteActionHandler(ABC):
    def __init__(self):
        """Creates an instance of DeleteActionHandler"""
        pass
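The class above subclasses `ABC` but declares no `@abstractmethod`, so it remains instantiable; it is the decorator, not the base class, that blocks instantiation. A sketch of the distinction with made-up handler names:

```python
from abc import ABC, abstractmethod


class ActionHandler(ABC):
    """Hypothetical base: one required operation."""

    @abstractmethod
    def handle(self):
        ...


class DeleteHandler(ActionHandler):
    def handle(self):
        return 'deleted'


try:
    ActionHandler()            # has an abstract method: TypeError
    base_instantiable = True
except TypeError:
    base_instantiable = False

result = DeleteHandler().handle()  # concrete subclass works normally
```

Marking the required operations with `@abstractmethod` is what turns a base class like this into an enforced contract for subclasses.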
| 16.666667 | 50 | 0.76 | 17 | 150 | 6.470588 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153333 | 150 | 8 | 51 | 18.75 | 0.866142 | 0.28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
3194636d32e306fdee9b58c99acce195a9d8086e | 68 | py | Python | model/__init__.py | georgistojanov/abc | 598a89c5939d97fa14aaae2e3b0a4d3d5495b467 | [
"MIT"
] | 26 | 2017-12-12T01:03:50.000Z | 2022-01-10T03:33:24.000Z | model/__init__.py | georgistojanov/abc | 598a89c5939d97fa14aaae2e3b0a4d3d5495b467 | [
"MIT"
] | null | null | null | model/__init__.py | georgistojanov/abc | 598a89c5939d97fa14aaae2e3b0a4d3d5495b467 | [
"MIT"
] | 8 | 2017-11-16T09:20:53.000Z | 2021-06-28T21:06:18.000Z | from . import hasher, generator, discriminator, hash_counter, utils
| 34 | 67 | 0.808824 | 8 | 68 | 6.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 68 | 1 | 68 | 68 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
31cff2ae49e468f4831ec74cb712774ad190b093 | 51 | py | Python | lstchain/io/__init__.py | yukihok/cta-lstchain | 58cb71d97435c2808ab84490dff6c158e9e8d4b2 | [
"BSD-3-Clause"
] | null | null | null | lstchain/io/__init__.py | yukihok/cta-lstchain | 58cb71d97435c2808ab84490dff6c158e9e8d4b2 | [
"BSD-3-Clause"
] | null | null | null | lstchain/io/__init__.py | yukihok/cta-lstchain | 58cb71d97435c2808ab84490dff6c158e9e8d4b2 | [
"BSD-3-Clause"
] | null | null | null | from .config import *
from .lstcontainers import *
| 17 | 28 | 0.764706 | 6 | 51 | 6.5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156863 | 51 | 2 | 29 | 25.5 | 0.906977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
31dcb4d27f785a4ab49d6dc9579a8f3851566fdd | 24 | py | Python | ppk/__main__.py | paolodragone/pickle-peeker | 564bda2ab9cab8bbd1514a18cf0e9b6056163d3e | [
"MIT"
] | null | null | null | ppk/__main__.py | paolodragone/pickle-peeker | 564bda2ab9cab8bbd1514a18cf0e9b6056163d3e | [
"MIT"
] | null | null | null | ppk/__main__.py | paolodragone/pickle-peeker | 564bda2ab9cab8bbd1514a18cf0e9b6056163d3e | [
"MIT"
] | null | null | null |
import ppk
ppk.main()
| 4.8 | 10 | 0.666667 | 4 | 24 | 4 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208333 | 24 | 4 | 11 | 6 | 0.842105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
31e8140de53f462f3e8243472de0fe600642359b | 424 | py | Python | cinemanio/sites/exceptions.py | cinemanio/backend | c393dc8c2d59dc99aa2c3314d3372b6e2bf5497f | [
"MIT"
] | 4 | 2018-07-05T07:00:04.000Z | 2021-02-03T22:02:13.000Z | cinemanio/sites/exceptions.py | cinemanio/backend | c393dc8c2d59dc99aa2c3314d3372b6e2bf5497f | [
"MIT"
] | 19 | 2018-09-03T23:27:49.000Z | 2020-02-12T00:09:02.000Z | cinemanio/sites/exceptions.py | cinemanio/backend | c393dc8c2d59dc99aa2c3314d3372b6e2bf5497f | [
"MIT"
] | null | null | null | class PossibleDuplicate(Exception):
"""
Raised, when possible duplicate found
"""
pass
class NothingFound(Exception):
"""
Raised, when no any search results found
"""
pass
class WrongValue(Exception):
"""
Raised, when trying to assign wrong value
"""
pass
class SiteIDDoesNotExist(Exception):
"""
Raised, when trying to sync unexisted site link
"""
pass
| 15.703704 | 51 | 0.629717 | 43 | 424 | 6.209302 | 0.581395 | 0.224719 | 0.284644 | 0.187266 | 0.202247 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.275943 | 424 | 26 | 52 | 16.307692 | 0.869707 | 0.396226 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
9ed5817f1337c325de4abcdc10476eeb3d0c6028 | 132 | py | Python | gnn_agglomeration/data_transforms/__init__.py | bentaculum/gnn_agglomeration | e5b8a693ba78433b8a3721ac3102630c2a79a1e4 | [
"MIT"
] | 2 | 2021-05-19T01:56:52.000Z | 2021-07-08T20:50:38.000Z | gnn_agglomeration/data_transforms/__init__.py | benjamin9555/gnn_agglomeration | e5b8a693ba78433b8a3721ac3102630c2a79a1e4 | [
"MIT"
] | 14 | 2019-07-17T19:23:09.000Z | 2021-02-02T22:01:49.000Z | gnn_agglomeration/data_transforms/__init__.py | benjamin9555/gnn_agglomeration | e5b8a693ba78433b8a3721ac3102630c2a79a1e4 | [
"MIT"
] | 2 | 2019-07-17T20:14:03.000Z | 2019-07-27T16:20:52.000Z | from .augment_hemibrain import AugmentHemibrain # noqa
from .unit_edge_attr_gaussian_noise import UnitEdgeAttrGaussianNoise # noqa | 66 | 76 | 0.871212 | 15 | 132 | 7.333333 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098485 | 132 | 2 | 76 | 66 | 0.92437 | 0.068182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
9ed781a2a7686ba6692d6faef4022b22c4e0934a | 374 | py | Python | dist/py/lightlevel.py | microsoft/jacdac | 2c6548b7e55ac34141e5152c664ca268e873cf09 | [
"CC-BY-4.0",
"MIT"
] | 31 | 2020-07-24T14:49:32.000Z | 2022-03-20T12:20:56.000Z | dist/py/lightlevel.py | QPC-database/jacdac | 74e9f7cebdb1db4c24f211aceb657b5125d0fd40 | [
"CC-BY-4.0",
"MIT"
] | 747 | 2020-07-31T22:05:45.000Z | 2022-03-31T23:27:35.000Z | dist/py/lightlevel.py | QPC-database/jacdac | 74e9f7cebdb1db4c24f211aceb657b5125d0fd40 | [
"CC-BY-4.0",
"MIT"
] | 17 | 2020-07-31T10:49:01.000Z | 2022-03-15T03:21:43.000Z | # Autogenerated file for Light level
# Add missing from ... import const
_JD_SERVICE_CLASS_LIGHT_LEVEL = const(0x17dc9a1c)
_JD_LIGHT_LEVEL_VARIANT_PHOTO_RESISTOR = const(0x1)
_JD_LIGHT_LEVEL_VARIANT_LEDMATRIX = const(0x2)
_JD_LIGHT_LEVEL_VARIANT_AMBIENT = const(0x3)
_JD_LIGHT_LEVEL_REG_LIGHT_LEVEL = const(JD_REG_READING)
_JD_LIGHT_LEVEL_REG_VARIANT = const(JD_REG_VARIANT) | 46.75 | 55 | 0.858289 | 58 | 374 | 4.913793 | 0.413793 | 0.280702 | 0.210526 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031977 | 0.080214 | 374 | 8 | 56 | 46.75 | 0.796512 | 0.181818 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
9edce4d1149dd3bf5e131bd32966f53019277de0 | 25 | py | Python | test.py | saurabhbhatt/demo | d39696ed186aeb4f4e9cae00e4807dc75714d9c7 | [
"Apache-2.0"
] | null | null | null | test.py | saurabhbhatt/demo | d39696ed186aeb4f4e9cae00e4807dc75714d9c7 | [
"Apache-2.0"
] | null | null | null | test.py | saurabhbhatt/demo | d39696ed186aeb4f4e9cae00e4807dc75714d9c7 | [
"Apache-2.0"
] | null | null | null | print({"dfkfdgkjnhdrfr"}) | 25 | 25 | 0.76 | 2 | 25 | 9.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 25 | 1 | 25 | 25 | 0.76 | 0 | 0 | 0 | 0 | 0 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
9eeef7decf2e6eb483d8bf17bd9537922e49269a | 32 | py | Python | eureka/__init__.py | nixiaocang/work | 1fdab83c8749e36da8c17c0d6c20f67212c35ce5 | [
"MIT"
] | null | null | null | eureka/__init__.py | nixiaocang/work | 1fdab83c8749e36da8c17c0d6c20f67212c35ce5 | [
"MIT"
] | 2 | 2021-02-08T20:22:38.000Z | 2021-04-30T20:39:07.000Z | eureka/__init__.py | nixiaocang/work | 1fdab83c8749e36da8c17c0d6c20f67212c35ce5 | [
"MIT"
] | null | null | null | from .client import EurekaClient | 32 | 32 | 0.875 | 4 | 32 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 32 | 1 | 32 | 32 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
9ef515c46ca3ac6e5a59c70c58a2c1d468d94c5d | 201 | py | Python | venv/lib/python3.6/site-packages/gym/wrappers/sign_reward.py | amousist/cartpole | aec01e9c2d28eda6019fe8bb94804a78f2d7fbc0 | [
"MIT"
] | 3 | 2020-06-02T11:23:57.000Z | 2021-09-02T12:02:20.000Z | gym/wrappers/sign_reward.py | huangjiancong1/gym_baxter | 7534d9504b4678a3b09a4e17466f54eaeaf23ccc | [
"Apache-2.0"
] | null | null | null | gym/wrappers/sign_reward.py | huangjiancong1/gym_baxter | 7534d9504b4678a3b09a4e17466f54eaeaf23ccc | [
"Apache-2.0"
] | null | null | null | import numpy as np
from gym import RewardWrapper
class SignReward(RewardWrapper):
r""""Bin reward to {-1, 0, +1} by its sign. """
def reward(self, reward):
return np.sign(reward)
| 20.1 | 54 | 0.646766 | 29 | 201 | 4.482759 | 0.724138 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019481 | 0.233831 | 201 | 9 | 55 | 22.333333 | 0.824675 | 0.19403 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0.166667 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 5 |
9efa1d8f972a09a5b3df86bc765d2dc6c1e2462a | 114 | py | Python | enthought/block_canvas/function_tools/general_expression.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 3 | 2016-12-09T06:05:18.000Z | 2018-03-01T13:00:29.000Z | enthought/block_canvas/function_tools/general_expression.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 1 | 2020-12-02T00:51:32.000Z | 2020-12-02T08:48:55.000Z | enthought/block_canvas/function_tools/general_expression.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | null | null | null | # proxy module
from __future__ import absolute_import
from blockcanvas.function_tools.general_expression import *
| 28.5 | 59 | 0.868421 | 14 | 114 | 6.571429 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096491 | 114 | 3 | 60 | 38 | 0.893204 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
7338bf03ea8eea95b93a51bb4c6dead30877473a | 126 | py | Python | bin/ssa-end-to-end-testing/modules/assertions/endpoint/assertions.py | pageinsec/security_content | 930a6db140207e3c34d63ee16a8cba099a9035b1 | [
"Apache-2.0"
] | 348 | 2021-01-28T12:14:43.000Z | 2022-03-30T21:39:55.000Z | bin/ssa-end-to-end-testing/modules/assertions/endpoint/assertions.py | pageinsec/security_content | 930a6db140207e3c34d63ee16a8cba099a9035b1 | [
"Apache-2.0"
] | 611 | 2020-11-04T21:35:28.000Z | 2022-03-31T14:06:08.000Z | bin/ssa-end-to-end-testing/modules/assertions/endpoint/assertions.py | pageinsec/security_content | 930a6db140207e3c34d63ee16a8cba099a9035b1 | [
"Apache-2.0"
] | 115 | 2021-01-27T19:16:18.000Z | 2022-03-29T21:30:57.000Z | # Add your custom assertion functions for `endpoint` tests in this file
def dummy_endpoint_func(output=[]):
return True
| 21 | 71 | 0.753968 | 18 | 126 | 5.166667 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.174603 | 126 | 5 | 72 | 25.2 | 0.894231 | 0.547619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
735029bebbce01269fc915af1cbd80737928e629 | 271 | py | Python | python/dinamica_estocastica/borracho.py | lmonsalve22/Learning-to-Code | 2e32eba3fbd0bd63cc539e1e6d372ca346b765c9 | [
"MIT"
] | null | null | null | python/dinamica_estocastica/borracho.py | lmonsalve22/Learning-to-Code | 2e32eba3fbd0bd63cc539e1e6d372ca346b765c9 | [
"MIT"
] | null | null | null | python/dinamica_estocastica/borracho.py | lmonsalve22/Learning-to-Code | 2e32eba3fbd0bd63cc539e1e6d372ca346b765c9 | [
"MIT"
] | null | null | null | import random
class Borracho:
def __init__(self, nombre):
self.nombre = nombre
class BorrachoTradicional(Borracho):
def __init__(self, nombre):
super().__init__(nombre)
def camina(self):
return random.choice([(0,1), (0, -1), (1, 0), (-1, 0)])
| 18.066667 | 58 | 0.638376 | 35 | 271 | 4.6 | 0.428571 | 0.186335 | 0.186335 | 0.236025 | 0.310559 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036866 | 0.199262 | 271 | 14 | 59 | 19.357143 | 0.705069 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.111111 | 0.111111 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
7357f470dc6978b8bf53a4cc16c18bcdb16bfbc5 | 165 | py | Python | security/admin.py | dev-dpoudel/healthmap | faa6b26b98ec18379ef2757f854d856fd7a9dfc1 | [
"MIT"
] | 1 | 2021-05-25T02:18:46.000Z | 2021-05-25T02:18:46.000Z | security/admin.py | dev-dpoudel/healthmap | faa6b26b98ec18379ef2757f854d856fd7a9dfc1 | [
"MIT"
] | null | null | null | security/admin.py | dev-dpoudel/healthmap | faa6b26b98ec18379ef2757f854d856fd7a9dfc1 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Blocklist, Incidence
# Register your models here.
admin.site.register(Blocklist)
admin.site.register(Incidence)
| 27.5 | 40 | 0.824242 | 22 | 165 | 6.181818 | 0.545455 | 0.132353 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09697 | 165 | 5 | 41 | 33 | 0.912752 | 0.157576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
7dfb03fe443d4990010441954705f526b0384c63 | 2,926 | py | Python | monk/pytorch/losses/return_loss.py | abhi-kumar/monk_kaggle_bengali_ai | 12a6c654446e887706c1a8bed82fccf8a98ce356 | [
"Apache-2.0"
] | null | null | null | monk/pytorch/losses/return_loss.py | abhi-kumar/monk_kaggle_bengali_ai | 12a6c654446e887706c1a8bed82fccf8a98ce356 | [
"Apache-2.0"
] | 9 | 2020-01-28T21:40:39.000Z | 2022-02-10T01:24:06.000Z | monk/pytorch/losses/return_loss.py | abhi-kumar/monk_kaggle_bengali_ai | 12a6c654446e887706c1a8bed82fccf8a98ce356 | [
"Apache-2.0"
] | null | null | null | from pytorch.losses.imports import *
from system.imports import *
@accepts(dict, post_trace=True)
@TraceFunction(trace_args=False, trace_rv=False)
def load_loss(system_dict):
name = system_dict["local"]["criterion"];
if(name == "softmaxcrossentropy"):
system_dict["local"]["criterion"] = torch.nn.CrossEntropyLoss(
weight=system_dict["hyper-parameters"]["loss"]["params"]["weight"],
size_average=system_dict["hyper-parameters"]["loss"]["params"]["size_average"],
ignore_index=system_dict["hyper-parameters"]["loss"]["params"]["ignore_index"],
reduce=system_dict["hyper-parameters"]["loss"]["params"]["reduce"],
reduction=system_dict["hyper-parameters"]["loss"]["params"]["reduction"]);
elif(name == "nll"):
system_dict["local"]["criterion"] = torch.nn.NLLLoss(
weight=system_dict["hyper-parameters"]["loss"]["params"]["weight"],
size_average=system_dict["hyper-parameters"]["loss"]["params"]["size_average"],
ignore_index=system_dict["hyper-parameters"]["loss"]["params"]["ignore_index"],
reduce=system_dict["hyper-parameters"]["loss"]["params"]["reduce"],
reduction=system_dict["hyper-parameters"]["loss"]["params"]["reduction"]);
elif(name == "poissonnll"):
system_dict["local"]["criterion"] = torch.nn.PoissonNLLLoss(
log_input=system_dict["hyper-parameters"]["loss"]["params"]["log_input"],
full=system_dict["hyper-parameters"]["loss"]["params"]["log_input"],
size_average=system_dict["hyper-parameters"]["loss"]["params"]["log_input"],
eps=system_dict["hyper-parameters"]["loss"]["params"]["log_input"],
reduce=system_dict["hyper-parameters"]["loss"]["params"]["reduce"],
reduction=system_dict["hyper-parameters"]["loss"]["params"]["reduction"]);
elif(name == "binarycrossentropy"):
system_dict["local"]["criterion"] = torch.nn.BCELoss(
weight=system_dict["hyper-parameters"]["loss"]["params"]["weight"],
size_average=system_dict["hyper-parameters"]["loss"]["params"]["size_average"],
reduce=system_dict["hyper-parameters"]["loss"]["params"]["reduce"],
reduction=system_dict["hyper-parameters"]["loss"]["params"]["reduction"]);
elif(name == "binarycrossentropywithlogits"):
system_dict["local"]["criterion"] = torch.nn.BCEWithLogitsLoss(
weight=system_dict["hyper-parameters"]["loss"]["params"]["weight"],
size_average=system_dict["hyper-parameters"]["loss"]["params"]["size_average"],
reduce=system_dict["hyper-parameters"]["loss"]["params"]["reduce"],
reduction=system_dict["hyper-parameters"]["loss"]["params"]["reduction"],
pos_weight=system_dict["hyper-parameters"]["loss"]["params"]["pos_weight"]);
return system_dict; | 57.372549 | 92 | 0.632604 | 304 | 2,926 | 5.901316 | 0.167763 | 0.183946 | 0.20903 | 0.348384 | 0.803233 | 0.803233 | 0.716834 | 0.69398 | 0.591973 | 0.591973 | 0 | 0 | 0.160629 | 2,926 | 51 | 93 | 57.372549 | 0.730456 | 0 | 0 | 0.452381 | 0 | 0 | 0.351554 | 0.009566 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02381 | false | 0 | 0.047619 | 0 | 0.095238 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
b40eab66d0fa603a818b31cdb8e687577831209e | 3,369 | py | Python | test/test_CsvTokenReader.py | ytyaru/Python.AccessToken.20210820093132 | 0d0aa0e00dccb7258b6c2967f2c9b72b61f34fe9 | [
"CC0-1.0"
] | null | null | null | test/test_CsvTokenReader.py | ytyaru/Python.AccessToken.20210820093132 | 0d0aa0e00dccb7258b6c2967f2c9b72b61f34fe9 | [
"CC0-1.0"
] | null | null | null | test/test_CsvTokenReader.py | ytyaru/Python.AccessToken.20210820093132 | 0d0aa0e00dccb7258b6c2967f2c9b72b61f34fe9 | [
"CC0-1.0"
] | null | null | null | #!/usr/bin/env python3
# coding: utf8
import os, sys, pathlib
sys.path.append(os.path.dirname(os.path.dirname(__file__)))
from src.token import CsvTokenReader
import unittest
from unittest.mock import MagicMock, patch, mock_open
import copy
import toml
class TestCsvTokenReader(unittest.TestCase):
def setUp(self):
self.rows = [
['test.com', 'test-user', 'read', 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'],
]
def test_path(self):
self.assertEqual(os.path.basename(CsvTokenReader().Path), 'token.tsv')
@patch('os.path.isfile', return_value=False)
def test_get_not_exist_file(self, mock_lib):
self.assertEqual(CsvTokenReader().get('', ''), None)
@patch('src.token.CsvTokenReader._CsvTokenReader__get_rows')
def test_get_hit_one_of_one(self, mock_lib):
mock_lib.return_value = self.rows
actual = CsvTokenReader().get(self.rows[0][0], self.rows[0][1])
mock_lib.assert_called_once()
self.assertEqual(actual, self.rows[0][3])
@patch('src.token.CsvTokenReader._CsvTokenReader__get_rows')
def test_get_hit_one_of_two(self, mock_lib):
mock_lib.return_value = [
self.rows[0],
[self.rows[0][0]+'2', self.rows[0][1]+'2', self.rows[0][2], self.rows[0][3]+'2'],
]
actual = CsvTokenReader().get(mock_lib.return_value[1][0], mock_lib.return_value[1][1])
mock_lib.assert_called_once()
self.assertEqual(actual, mock_lib.return_value[1][3])
@patch('src.token.CsvTokenReader._CsvTokenReader__get_rows')
def test_get_hit_two_of_two(self, mock_lib):
mock_lib.return_value = [
self.rows[0],
[self.rows[0][0], self.rows[0][1], ['write'], 'yyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy'],
]
actual = CsvTokenReader().get(mock_lib.return_value[1][0], mock_lib.return_value[1][1])
mock_lib.assert_called_once()
self.assertEqual(actual, mock_lib.return_value[0][3])
@patch('src.token.CsvTokenReader._CsvTokenReader__get_rows')
def test_get_not_hit_one(self, mock_lib):
mock_lib.return_value = self.rows
for case in [
(([self.rows[0][0]+'2', self.rows[0][1]], None), None),
(([self.rows[0][0], self.rows[0][1]+'2'], None), None),
(([self.rows[0][0]+'2', self.rows[0][1]], ['write']), None),
]:
with self.subTest(args=case[0][0], kwargs=case[0][1], expected=case[1]):
actual = CsvTokenReader().get(*case[0][0], scopes=case[0][1])
self.assertEqual(actual, None)
@patch('src.token.CsvTokenReader._CsvTokenReader__get_rows')
def test_get_not_hit_two(self, mock_lib):
mock_lib.return_value = [
self.rows[0],
[self.rows[0][0], self.rows[0][1], ['write'], 'yyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy'],
]
for case in [
(([self.rows[0][0]+'2', self.rows[0][1]], None), None),
(([self.rows[0][0], self.rows[0][1]+'2'], None), None),
(([self.rows[0][0]+'2', self.rows[0][1]], ['follow']), None),
]:
with self.subTest(args=case[0][0], kwargs=case[0][1], expected=case[1]):
actual = CsvTokenReader().get(*case[0][0], scopes=case[0][1])
self.assertEqual(actual, None)
if __name__ == "__main__":
unittest.main()
| 46.150685 | 102 | 0.619768 | 457 | 3,369 | 4.352298 | 0.166302 | 0.116642 | 0.117647 | 0.099548 | 0.720965 | 0.711916 | 0.711916 | 0.711916 | 0.702363 | 0.674208 | 0 | 0.033582 | 0.204512 | 3,369 | 72 | 103 | 46.791667 | 0.708582 | 0.010092 | 0 | 0.5 | 0 | 0 | 0.135614 | 0.111011 | 0 | 0 | 0 | 0 | 0.147059 | 1 | 0.117647 | false | 0 | 0.088235 | 0 | 0.220588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
b42cd328c4ca7cc38bba360aa77214c5d80dc31b | 127 | py | Python | Web Files/submitsystem/test files/hello_world.py | benhawk1/447-Project---Submit-System | 878cbf515c335b95cd420011e0fdbda1d4945aa2 | [
"Apache-2.0"
] | null | null | null | Web Files/submitsystem/test files/hello_world.py | benhawk1/447-Project---Submit-System | 878cbf515c335b95cd420011e0fdbda1d4945aa2 | [
"Apache-2.0"
] | 25 | 2020-10-06T16:37:40.000Z | 2020-12-01T22:20:27.000Z | Web Files/submitsystem/test files/hello_world.py | benhawk1/447-Project---Submit-System | 878cbf515c335b95cd420011e0fdbda1d4945aa2 | [
"Apache-2.0"
] | 2 | 2020-10-15T18:41:34.000Z | 2020-11-02T00:49:01.000Z | """
Nicholas Proulx
nproulx1@umbc.edu
CMSC 123
Print Hello World
"""
if __name__ == "__main__":
print("hello world")
| 11.545455 | 26 | 0.661417 | 16 | 127 | 4.75 | 0.8125 | 0.263158 | 0.394737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039604 | 0.204724 | 127 | 10 | 27 | 12.7 | 0.712871 | 0.472441 | 0 | 0 | 0 | 0 | 0.322034 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
b4304d30182752d9a2549decf73638c516d1084e | 150 | py | Python | pbs_ci/__init__.py | NCAR/pbs-ci | 152c495ce6246b8812b9349e7fdaa4e92bc5366d | [
"Apache-2.0"
] | 1 | 2019-08-08T19:41:36.000Z | 2019-08-08T19:41:36.000Z | pbs_ci/__init__.py | NCAR/pbs-ci | 152c495ce6246b8812b9349e7fdaa4e92bc5366d | [
"Apache-2.0"
] | 6 | 2019-01-19T08:50:42.000Z | 2019-01-23T07:05:36.000Z | pbs_ci/__init__.py | NCAR/pbs-ci | 152c495ce6246b8812b9349e7fdaa4e92bc5366d | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
"""Top-level module for pbs_ci."""
from ._version import get_versions
__version__ = get_versions()["version"]
del get_versions
| 21.428571 | 39 | 0.753333 | 22 | 150 | 4.727273 | 0.727273 | 0.317308 | 0.346154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106667 | 150 | 6 | 40 | 25 | 0.776119 | 0.326667 | 0 | 0 | 0 | 0 | 0.073684 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
c30a50c6b6e8142f10f489489070592744ee3e2d | 152 | py | Python | tool/UUID.py | blackstorm/python3-tornado-blog | 71198d37cb91ad07a7a7088e67d617d117361f54 | [
"MIT"
] | 8 | 2016-11-04T09:03:42.000Z | 2019-04-22T02:14:40.000Z | tool/UUID.py | blackstorm/python3-tornado-blog | 71198d37cb91ad07a7a7088e67d617d117361f54 | [
"MIT"
] | null | null | null | tool/UUID.py | blackstorm/python3-tornado-blog | 71198d37cb91ad07a7a7088e67d617d117361f54 | [
"MIT"
] | 6 | 2016-06-13T00:11:51.000Z | 2022-01-18T11:35:18.000Z | import base64
import uuid
def get_uuid():
id = base64.b64encode(uuid.uuid4().bytes + uuid.uuid4().bytes)
print("使用的UUID为"+str(id))
return id | 25.333333 | 66 | 0.684211 | 22 | 152 | 4.681818 | 0.590909 | 0.174757 | 0.271845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062992 | 0.164474 | 152 | 6 | 67 | 25.333333 | 0.748032 | 0 | 0 | 0 | 0 | 0 | 0.052288 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.666667 | 0.166667 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
c33b5d3a699a15d5cc45cd794c12e38ac33a8374 | 192 | py | Python | gym_chrome_dino/envs/__init__.py | rey-allan/gym-chrome-dino | 3890ff876e1fe8a67027255c5b6e3c225102bc56 | [
"MIT"
] | null | null | null | gym_chrome_dino/envs/__init__.py | rey-allan/gym-chrome-dino | 3890ff876e1fe8a67027255c5b6e3c225102bc56 | [
"MIT"
] | 1 | 2022-02-09T01:36:32.000Z | 2022-02-09T01:36:32.000Z | gym_chrome_dino/envs/__init__.py | rey-allan/gym-chrome-dino | 3890ff876e1fe8a67027255c5b6e3c225102bc56 | [
"MIT"
] | 1 | 2022-02-02T05:09:22.000Z | 2022-02-02T05:09:22.000Z | #!/usr/bin/env python
#
# Copyright (C) 2019 Matt Struble
# Licensed under the MIT License - https://opensource.org/licenses/MIT
from gym_chrome_dino.envs.chrome_dino_env import ChromeDinoEnv | 32 | 70 | 0.791667 | 29 | 192 | 5.103448 | 0.862069 | 0.135135 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023392 | 0.109375 | 192 | 6 | 71 | 32 | 0.842105 | 0.630208 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
c362531ca88388bd4186c8b3b513a248aa4ccc97 | 3,943 | py | Python | epytope/Data/pssms/arb/mat/A_0203_8.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 7 | 2021-02-01T18:11:28.000Z | 2022-01-31T19:14:07.000Z | epytope/Data/pssms/arb/mat/A_0203_8.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 22 | 2021-01-02T15:25:23.000Z | 2022-03-14T11:32:53.000Z | epytope/Data/pssms/arb/mat/A_0203_8.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 4 | 2021-05-28T08:50:38.000Z | 2022-03-14T11:45:32.000Z | A_0203_8 = {0: {'A': 0.09260914508442766, 'C': -0.055842410332572236, 'E': -0.3237738512971216, 'D': -0.3237738512971216, 'G': -4.0, 'F': 0.5917842377663532, 'I': 0.7378292207178623, 'H': -0.432906376367205, 'K': -0.432906376367205, 'M': 0.5432476943183139, 'L': 0.35845112187102407, 'N': 0.16009500333529633, 'Q': -0.01855040550629643, 'P': -4.0, 'S': 0.639648423026295, 'R': -0.5603995474769126, 'T': -0.5058245686981333, 'W': 0.4454052228054611, 'V': -0.1818823898945458, 'Y': 0.6611209064731303}, 1: {'A': -0.2672333895677309, 'C': -0.4387840922483413, 'E': -0.4034109306495243, 'D': -0.47264480959625693, 'G': -4.0, 'F': -0.060389530160822776, 'I': 0.8134150443683076, 'H': -0.6981830595546019, 'K': -0.6981830595546019, 'M': 0.8134150443683076, 'L': 1.284933344625203, 'N': -0.2843470728996994, 'Q': -0.2843470728996994, 'P': -0.8509836054074027, 'S': -0.4387840922483413, 'R': -0.6602253929302401, 'T': -0.5051824995083194, 'W': -0.060389530160822776, 'V': -0.27167556428897965, 'Y': -0.060389530160822776}, 2: {'A': 0.5944967243530065, 'C': -0.1331071980408997, 'E': -0.44991829196421135, 'D': -0.44991829196421135, 'G': -4.0, 'F': 0.0480091971149035, 'I': -0.060910697021088235, 'H': -0.2960414820391224, 'K': -0.16214418685980708, 'M': 0.31870745030596187, 'L': 0.17249942369917873, 'N': -0.21323348425458497, 'Q': -0.21323348425458497, 'P': 0.013500177056832884, 'S': 0.3392036140548539, 'R': -0.28684854054720765, 'T': -0.4047432021454702, 'W': 0.09213686661979076, 'V': 0.024167506418157222, 'Y': 0.23320436941780093}, 3: {'A': -0.45119238630881936, 'C': -0.017753603886303048, 'E': -0.3024559078237844, 'D': 0.1979379544185194, 'G': 0.5187730968633795, 'F': 0.40301061188302617, 'I': 0.044071190879106645, 'H': -0.5790212159745092, 'K': -0.7190773351082351, 'M': 0.07306025317323778, 'L': 0.5521037532261205, 'N': 0.6683477977546922, 'Q': -0.4731044582458323, 'P': 0.25790621987206236, 'S': -0.18512153706813617, 'R': 
-0.2736458903737848, 'T': -0.017753603886303048, 'W': 0.40301061188302617, 'V': -0.4024734320254284, 'Y': 0.40301061188302617}, 4: {'A': -0.21166324861165306, 'C': 0.4038774903975255, 'E': -4.0, 'D': -4.0, 'G': -4.0, 'F': 0.128146179675162, 'I': 0.39616299739384064, 'H': -0.4961438373508598, 'K': -0.22592258747230348, 'M': -0.26897097310492857, 'L': 0.5110235159547518, 'N': -0.5744385554381883, 'Q': -0.5744385554381883, 'P': -0.24348070301476793, 'S': 0.2714059323833559, 'R': -0.4327101711686107, 'T': 0.31407213371612497, 'W': 0.0700533530880546, 'V': 1.1224851556845803, 'Y': 0.04810907958553632}, 5: {'A': 0.06126330113553939, 'C': -0.24726893217963136, 'E': -0.2524981572064032, 'D': -0.05947400252871723, 'G': -0.14531564904288058, 'F': 0.3031427936619408, 'I': 0.1734039731710738, 'H': -0.20065830621643388, 'K': -0.13128090933290837, 'M': 0.6084101278060998, 'L': 0.7123749318815351, 'N': -0.7355694854078886, 'Q': -0.549397255101923, 'P': -4.0, 'S': -0.15250637237103232, 'R': -0.1597094699718068, 'T': -0.5021838995482352, 'W': 0.06509043782539037, 'V': 0.6084101278060998, 'Y': 0.06509043782539037}, 6: {'A': 0.18885660137286675, 'C': 0.32103563700866794, 'E': 0.7470030835769148, 'D': -0.13994263832237933, 'G': 0.4958040595465809, 'F': -0.5158972753996656, 'I': -0.2603316309203583, 'H': -0.3451104237986978, 'K': -0.3451104237986978, 'M': -0.43856789104712984, 'L': -0.44549036746130627, 'N': -0.2194109175293774, 'Q': -0.45461463096632054, 'P': 0.9108004345404269, 'S': -0.1377162864029968, 'R': -0.28237775462120024, 'T': 0.8038535015561439, 'W': -0.5158972753996656, 'V': -0.43856789104712984, 'Y': -0.5158972753996656}, 7: {'A': 0.01986027046527329, 'C': -4.0, 'E': -4.0, 'D': -4.0, 'G': -4.0, 'F': -4.0, 'I': -0.47798394667218896, 'H': -4.0, 'K': -4.0, 'M': 0.46422218178701863, 'L': 0.27049422758067765, 'N': -4.0, 'Q': -4.0, 'P': -4.0, 'S': -4.0, 'R': -4.0, 'T': -4.0, 'W': -4.0, 'V': 0.8112761915974741, 'Y': -4.0}, -1: {'slope': 0.17023328482584466, 'intercept': 
-0.6108292920939158}} | 3,943 | 3,943 | 0.676135 | 498 | 3,943 | 5.349398 | 0.303213 | 0.017267 | 0.005631 | 0.007508 | 0.017267 | 0.007508 | 0.007508 | 0.007508 | 0.007508 | 0.007508 | 0 | 0.689665 | 0.084707 | 3,943 | 1 | 3,943 | 3,943 | 0.04849 | 0 | 0 | 0 | 0 | 0 | 0.044118 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
5ef4660ba59d8bdc67cc40853b6535d6ff55fdb3 | 164 | py | Python | detector/yolov5/__init__.py | gmt710/AlphaPose_yolovx | 5534e7167eb4c8f6ba7c74f59fe77c3d120258b2 | [
"Apache-2.0"
] | 24 | 2021-07-22T11:53:38.000Z | 2022-03-24T11:09:04.000Z | detector/yolov5/__init__.py | gmt710/AlphaPose_yolovx | 5534e7167eb4c8f6ba7c74f59fe77c3d120258b2 | [
"Apache-2.0"
] | 7 | 2021-08-08T10:17:22.000Z | 2022-03-23T06:16:50.000Z | detector/yolov5/__init__.py | gmt710/AlphaPose_yolovx | 5534e7167eb4c8f6ba7c74f59fe77c3d120258b2 | [
"Apache-2.0"
] | 9 | 2021-09-13T11:41:44.000Z | 2022-01-29T10:04:41.000Z | # -*- coding: UTF-8 -*-
'''
@author: mengting gu
@contact: 1065504814@qq.com
@time: 2021/3/4 下午3:50
@file: __init__.py.py
@desc:
'''
from detector.yolov5 import *
| 16.4 | 29 | 0.652439 | 25 | 164 | 4.12 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148936 | 0.140244 | 164 | 9 | 30 | 18.222222 | 0.58156 | 0.756098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
6f0651d98ef61681cb5717c39326a93b068e01f2 | 539 | py | Python | Exercicios/Mundo1/ex009.py | mpaullos/cursoemvideo-python | 80732626b6b5471ec7fea6dc01d83931e5cfd8fb | [
"MIT"
] | null | null | null | Exercicios/Mundo1/ex009.py | mpaullos/cursoemvideo-python | 80732626b6b5471ec7fea6dc01d83931e5cfd8fb | [
"MIT"
] | null | null | null | Exercicios/Mundo1/ex009.py | mpaullos/cursoemvideo-python | 80732626b6b5471ec7fea6dc01d83931e5cfd8fb | [
"MIT"
] | null | null | null | tab =int (input ('Digite um número para ver sua tabuada: '))
print('='*12)
print('{} x {:2} = {}'.format (tab,1,tab*1))
print('{} x {:2} = {}'.format (tab,2,tab*2))
print('{} x {:2} = {}'.format (tab,3,tab*3))
print('{} x {:2} = {}'.format (tab,4,tab*4))
print('{} x {:2} = {}'.format (tab,5,tab*5))
print('{} x {:2} = {}'.format (tab,6,tab*6))
print('{} x {:2} = {}'.format (tab,7,tab*7))
print('{} x {:2} = {}'.format (tab,8,tab*8))
print('{} x {:2} = {}'.format (tab,9,tab*9))
print('{} x {} = {}'.format (tab,10,tab*10))
print ('='*12) | 41.461538 | 60 | 0.495362 | 93 | 539 | 2.870968 | 0.258065 | 0.224719 | 0.235955 | 0.438202 | 0.539326 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075107 | 0.135436 | 539 | 13 | 61 | 41.461538 | 0.497854 | 0 | 0 | 0.153846 | 0 | 0 | 0.331481 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.923077 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
6f5e904869486bcaff87c9a3d6652c582ca9dcd8 | 194 | py | Python | account/permissions.py | VahidZareCE/school | 06ea0925d4885308412844a3a9ee0418db07e9c4 | [
"MIT"
] | null | null | null | account/permissions.py | VahidZareCE/school | 06ea0925d4885308412844a3a9ee0418db07e9c4 | [
"MIT"
] | null | null | null | account/permissions.py | VahidZareCE/school | 06ea0925d4885308412844a3a9ee0418db07e9c4 | [
"MIT"
] | null | null | null | from rest_framework.permissions import BasePermission
class IsTeacher(BasePermission):
def has_permission(self, request, view):
return bool(request.user and request.user.is_teacher) | 38.8 | 61 | 0.793814 | 24 | 194 | 6.291667 | 0.833333 | 0.145695 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134021 | 194 | 5 | 61 | 38.8 | 0.89881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |