hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d4b1f5f594632728fc31e8160bee1a40c0651549 | 455 | py | Python | qrogue/management/__init__.py | 7Magic7Mike7/Qrogue | 70bd5671a77981c1d4b633246321ba44f13c21ff | [
"MIT"
] | 4 | 2021-12-14T19:13:43.000Z | 2022-02-16T13:25:38.000Z | qrogue/management/__init__.py | 7Magic7Mike7/Qrogue | 70bd5671a77981c1d4b633246321ba44f13c21ff | [
"MIT"
] | null | null | null | qrogue/management/__init__.py | 7Magic7Mike7/Qrogue | 70bd5671a77981c1d4b633246321ba44f13c21ff | [
"MIT"
] | 1 | 2022-01-04T18:35:51.000Z | 2022-01-04T18:35:51.000Z | # exporting
from .pause import Pausing
from .save_data import SaveData
from .story import StoryNarration
from .map_management import MapManager
from .qrogue_pycui import QrogueCUI
# importing
# +util
# +util.game_simulator
# +util.key_logger
# +game.logic.actors
# +game.logic.actors.controllables
# +game.world.dungeon_generator
# +game.world.map
# +game.world.navigation
# +game.world.tiles
# +graphics.rendering
# +graphics.popups
# +graphics.widgets
| 21.666667 | 38 | 0.784615 | 58 | 455 | 6.051724 | 0.586207 | 0.102564 | 0.08547 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107692 | 455 | 20 | 39 | 22.75 | 0.864532 | 0.564835 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
d4b280f6a1a3945115782f70def7492512a19fba | 171 | py | Python | course1/secret_code.py | dbrandenburg/python-oreilley-certification | 44af77d093100971e32d48b309f8d6e6d1b78364 | [
"Apache-2.0"
] | null | null | null | course1/secret_code.py | dbrandenburg/python-oreilley-certification | 44af77d093100971e32d48b309f8d6e6d1b78364 | [
"Apache-2.0"
] | null | null | null | course1/secret_code.py | dbrandenburg/python-oreilley-certification | 44af77d093100971e32d48b309f8d6e6d1b78364 | [
"Apache-2.0"
] | null | null | null | #!/usr/local/bin/python3
message = str(input("Message: "))
secret = ""
for character in reversed(message):
secret = secret + str(chr(ord(character)+1))
print(secret) | 21.375 | 48 | 0.684211 | 23 | 171 | 5.086957 | 0.695652 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013514 | 0.134503 | 171 | 8 | 49 | 21.375 | 0.777027 | 0.134503 | 0 | 0 | 0 | 0 | 0.060811 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d4b7c4ef17ff633f479396b3772e1f7318aaeee8 | 129 | py | Python | testfiles/pyobfu10.py | Oliv95/Language-based-security-project | 02f11982249294a54a8bbb29a9dc106f88944e10 | [
"MIT"
] | null | null | null | testfiles/pyobfu10.py | Oliv95/Language-based-security-project | 02f11982249294a54a8bbb29a9dc106f88944e10 | [
"MIT"
] | 1 | 2017-03-23T15:25:11.000Z | 2017-03-24T12:52:35.000Z | testfiles/pyobfu10.py | Oliv95/Language-based-security-project | 02f11982249294a54a8bbb29a9dc106f88944e10 | [
"MIT"
] | null | null | null | oo000 = 0
for ii in range ( 11 ) :
oo000 += ii
if 51 - 51: IiI1i11I
print ( oo000 )
# dd678faae9ac167bc83abf78e5cb2f3f0688d3a3
| 18.428571 | 42 | 0.705426 | 16 | 129 | 5.6875 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.382353 | 0.209302 | 129 | 6 | 43 | 21.5 | 0.509804 | 0.310078 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d4c013499a0c2e5d3b4fafb8025e0916cc00d9d6 | 9,667 | py | Python | test/solution_tests/CHK/test_checkout.py | DPNT-Sourcecode/CHK-qvhh01 | 3eccdc858e46242af5838aa84cc553c423db8af8 | [
"Apache-2.0"
] | null | null | null | test/solution_tests/CHK/test_checkout.py | DPNT-Sourcecode/CHK-qvhh01 | 3eccdc858e46242af5838aa84cc553c423db8af8 | [
"Apache-2.0"
] | null | null | null | test/solution_tests/CHK/test_checkout.py | DPNT-Sourcecode/CHK-qvhh01 | 3eccdc858e46242af5838aa84cc553c423db8af8 | [
"Apache-2.0"
] | null | null | null | from lib.solutions.CHK import checkout_solution
class TestSkus():
def test_skus_should_created(self):
skus_a = checkout_solution.Skus(product_name='A', price=10.0)
assert skus_a.get_product_name() == 'A'
assert skus_a.get_price() == 10.0
assert skus_a.get_number_of_items() == 1
assert len(skus_a.get_discounts()) == 0
def test_skus_should_created_with_discount_info(self):
discount = checkout_solution.Discount(discount_purchase=3, discount_receive=130)
skus_a = checkout_solution.Skus(product_name='A', price=50)
skus_a.add_discount(discount)
assert skus_a.get_product_name() == 'A'
assert skus_a.get_price() == 50
assert skus_a.get_number_of_items() == 1
discounts = skus_a.get_discounts()
assert len(discounts) == 1
assert discounts[0].get_discount_purchase() == 3
assert discounts[0].get_discount_receive() == 130
class TestBasket():
def test_basket_items(self):
discount = checkout_solution.Discount(discount_purchase=3, discount_receive=130)
skus_a = checkout_solution.Skus(product_name='A', price=50)
skus_a.add_discount(discount)
basket = checkout_solution.Basket()
assert len(basket.get_items()) == 0
basket.add_item(skus_a)
assert len(basket.get_items()) == 1
def test_when_added_the_same_skus_in_basket(self):
discount = checkout_solution.Discount(discount_purchase=3, discount_receive=130)
discount1 = checkout_solution.Discount(discount_purchase=2, discount_receive=45)
skus_a = checkout_solution.Skus(product_name='A', price=50)
skus_a.add_discount(discount)
skus_b = checkout_solution.Skus(product_name='B', price=30)
skus_b.add_discount(discount1)
skus_c = checkout_solution.Skus(product_name='C', price=20)
basket = checkout_solution.Basket()
basket.add_item(skus_a)
basket.add_item(skus_b)
basket.add_item(skus_c)
items_list = basket.get_items()
assert len(items_list) == 3
assert items_list[1].get_number_of_items() == 1
basket.add_item(skus_b)
items_list = basket.get_items()
assert len(items_list) == 3
assert items_list[1].get_number_of_items() == 2
# def test_checkout_without_discount(self):
# skus_a = checkout_solution.Skus(product_name='A', price=50)
# skus_b = checkout_solution.Skus(product_name='B', price=30)
# skus_c = checkout_solution.Skus(product_name='C', price=20)
# basket = checkout_solution.Basket()
# basket.add_item(skus_a)
# basket.add_item(skus_b)
# basket.add_item(skus_c)
# assert basket.get_total() == 100
# def test_checkout_with_discount_not_applied(self):
# discount = checkout_solution.Discount(discount_purchase=3, discount_receive=130)
# skus_a = checkout_solution.Skus(product_name='A', price=50)
# skus_a.add_discount(discount)
# skus_b = checkout_solution.Skus(product_name='B', price=30)
# skus_c = checkout_solution.Skus(product_name='C', price=20)
# basket = checkout_solution.Basket()
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_b)
# basket.add_item(skus_c)
# assert basket.get_total() == 150
# def test_checkout_with_discount_applied(self):
# discount = checkout_solution.Discount(discount_purchase=3, discount_receive=130)
# skus_a = checkout_solution.Skus(product_name='A', price=50)
# skus_a.add_discount(discount)
# skus_b = checkout_solution.Skus(product_name='B', price=30)
# skus_c = checkout_solution.Skus(product_name='C', price=20)
# basket = checkout_solution.Basket()
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_b)
# basket.add_item(skus_c)
# assert basket.get_total() == 180
# def test_checkout_with_discount_applied_but_not_on_all_items(self):
# discount = checkout_solution.Discount(discount_purchase=3, discount_receive=130)
# skus_a = checkout_solution.Skus(product_name='A', price=50)
# skus_a.add_discount(discount)
# skus_b = checkout_solution.Skus(product_name='B', price=30)
# skus_c = checkout_solution.Skus(product_name='C', price=20)
# basket = checkout_solution.Basket()
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_b)
# basket.add_item(skus_c)
# assert basket.get_total() == 230
# def test_checkout_with_discount_applied_but_not_on_all_items_(self):
# discount = checkout_solution.Discount(discount_purchase=2, discount_receive=80)
# skus_a = checkout_solution.Skus(product_name='A', price=50)
# skus_a.add_discount(discount)
# skus_b = checkout_solution.Skus(product_name='B', price=30)
# skus_c = checkout_solution.Skus(product_name='C', price=20)
# basket = checkout_solution.Basket()
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_a)
# basket.add_item(skus_b)
# basket.add_item(skus_c)
# assert basket.get_total() == 260
def test_checkout_from_string(self):
basket = checkout_solution.Basket()
assert basket.checkout('ABCD') == 115
def test_checkout_from_string_discount(self):
assert checkout_solution.checkout('ABCADA') == 195
def test_checkout_from_string_discount2(self):
assert checkout_solution.checkout('ABCADAAAA') == 315
def test_checkout_from_string_discount3(self):
assert checkout_solution.checkout('ABCADAAA') == 265
def test_checkout_from_string_discount4(self):
assert checkout_solution.checkout('ABCDE') == 155
def test_checkout_from_string_discount5(self):
assert checkout_solution.checkout('ABCEDE') == 165
def test_checkout_from_string_discount6(self):
assert checkout_solution.checkout('BBEE') == 110 # to check
def test_checkout_from_string_discount7(self):
assert checkout_solution.checkout('EE') == 80
def test_checkout_from_string_discount8(self):
assert checkout_solution.checkout('AAAAAAAA') == 330
def test_checkout_from_string_discount9(self):
assert checkout_solution.checkout('ABCDEABCDE') == 280
def test_checkout_from_string_discount10(self):
assert checkout_solution.checkout('CCADDEEBBA') == 280
def test_checkout_from_string_discount11(self):
assert checkout_solution.checkout('EEEEBB') == 160
def test_checkout_from_string_discount12(self):
assert checkout_solution.checkout('FF') == 20
def test_checkout_from_string_discount13(self):
assert checkout_solution.checkout('FFF') == 20
def test_checkout_from_string_discount14(self):
assert checkout_solution.checkout('NNNM') == 120
def test_checkout_from_string_discount15(self):
assert checkout_solution.checkout('NNN') == 120
def test_checkout_from_string_discount16(self):
assert checkout_solution.checkout('NNNNNNMM') == 240
def test_checkout_from_string_discount17(self):
assert checkout_solution.checkout('NNNNNNM') == 240
def test_checkout_from_string_discount18(self):
assert checkout_solution.checkout('M') == 15
def test_checkout_from_string_discount19(self):
assert checkout_solution.checkout('ABCDEFGHIJKLMNOPQRSTUVWXYZ') == 965
def test_checkout_from_string_discount20(self):
assert checkout_solution.checkout('ABCDEFGHIJKLMNOPQRSTUVWXYZABCDEFGHIJKLMNOPQRSTUVWXYZ') == 1880
def test_checkout_from_string_discount21(self):
assert checkout_solution.checkout('SSS') == 45
def test_checkout_from_string_discount22(self):
assert checkout_solution.checkout('TTT') == 45
def test_checkout_from_string_discount23(self):
assert checkout_solution.checkout('ZZZ') == 45
def test_checkout_from_string_discount24(self):
assert checkout_solution.checkout('STY') == 45
def test_checkout_from_string_discount25(self):
assert checkout_solution.checkout('STYX') == 62
def test_checkout_from_string_discount26(self):
assert checkout_solution.checkout('XXXX') == 62
def test_checkout_from_string_discount27(self):
assert checkout_solution.checkout('XXXST') == 79
def test_checkout_from_string_discount28(self):
assert checkout_solution.checkout('S') == 20
def test_checkout_from_string_discount29(self):
assert checkout_solution.checkout('Z') == 21
def test_checkout_from_string_discount30(self):
assert checkout_solution.checkout('ZZZ') == 45
def test_checkout_from_string_discount31(self):
assert checkout_solution.checkout('ZZZX') == 62
def test_checkout_from_invalid_string(self):
assert checkout_solution.checkout('') == 0
assert checkout_solution.checkout('a') == -1
| 42.213974 | 106 | 0.66877 | 1,181 | 9,667 | 5.108383 | 0.127011 | 0.188298 | 0.09448 | 0.103928 | 0.844853 | 0.560583 | 0.492458 | 0.491464 | 0.470247 | 0.470247 | 0 | 0.033526 | 0.231716 | 9,667 | 228 | 107 | 42.399123 | 0.77878 | 0.290162 | 0 | 0.226087 | 0 | 0 | 0.034572 | 0.011879 | 0 | 0 | 0 | 0 | 0.434783 | 1 | 0.321739 | false | 0 | 0.008696 | 0 | 0.347826 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d4c5458ad84947b0d5f3819073069584c5373685 | 2,004 | py | Python | function/python/brightics/common/statistics.py | sharon1321/studio | c5ce7f6db5503f5020b2aa0c6f2e6acfc61c90c5 | [
"Apache-2.0"
] | null | null | null | function/python/brightics/common/statistics.py | sharon1321/studio | c5ce7f6db5503f5020b2aa0c6f2e6acfc61c90c5 | [
"Apache-2.0"
] | null | null | null | function/python/brightics/common/statistics.py | sharon1321/studio | c5ce7f6db5503f5020b2aa0c6f2e6acfc61c90c5 | [
"Apache-2.0"
] | null | null | null | """
Copyright 2019 Samsung SDS
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import pandas as pd
import numpy as np
import scipy.stats
# NOTE: all parameter 'a' is assumed as array-like
def max(a): return np.max(a)
def min(a): return np.min(a)
def range(a): return np.max(a) - np.min(a)
def sum(a): return np.sum(a)
def mean(a): return np.mean(a)
def var(a): return np.var(a)
def var_samp(a): return np.var(a, ddof=1)
def std(a): return np.std(a)
def skewness(a): return scipy.stats.skew(a)
def kurtosis(a): return scipy.stats.kurtosis(a)
def median(a): return np.median(a)
def percentile(a, q): return np.percentile(a, q)
def trimmed_mean(a, proportiontocut): return scipy.stats.trim_mean(a, proportiontocut)
def iqr(a): return scipy.stats.iqr(a)
def q1(a): return np.percentile(a, 25)
def q3(a): return np.percentile(a, 75)
def mode(a):
a = np.array(a)
a = a[np.where(~pd.isnull(a))]
vals, cnts = np.unique(a, return_counts=True)
return vals[np.where(cnts==np.max(cnts))]
def num_row(a): return len(a)
def num_value(a): return np.count_nonzero(~pd.isnull(a))
def num_nan(a): return np.count_nonzero([x is np.nan for x in a])
def num_nullonly(a): return np.count_nonzero([x is None for x in a])
def num_null(a): return np.count_nonzero(pd.isnull(a))
def num_distinct(a): return np.count_nonzero(np.unique(a))
| 21.094737 | 87 | 0.662176 | 341 | 2,004 | 3.847507 | 0.357771 | 0.112043 | 0.109756 | 0.053354 | 0.197409 | 0.11128 | 0.091463 | 0.054878 | 0.054878 | 0.054878 | 0 | 0.009622 | 0.222056 | 2,004 | 94 | 88 | 21.319149 | 0.831944 | 0.2999 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.766667 | false | 0 | 0.1 | 0.733333 | 0.9 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
d4c8f3b29546c3a956f7a3f5db89baea535074d2 | 136 | py | Python | haasomeapi/enums/EnumFlashSpreadOptions.py | iamcos/haasomeapi | eac1640cc13e1e7649b8a8d6ed88184722c907c8 | [
"MIT"
] | null | null | null | haasomeapi/enums/EnumFlashSpreadOptions.py | iamcos/haasomeapi | eac1640cc13e1e7649b8a8d6ed88184722c907c8 | [
"MIT"
] | null | null | null | haasomeapi/enums/EnumFlashSpreadOptions.py | iamcos/haasomeapi | eac1640cc13e1e7649b8a8d6ed88184722c907c8 | [
"MIT"
] | null | null | null | from enum import Enum
class EnumFlashSpreadOptions(Enum):
FIXED_AMOUNT = 0
PERCENTAGE = 1
PERCENTAGE_WITH_BOOST = 2
EXPONENTIAL = 3 | 19.428571 | 35 | 0.786765 | 18 | 136 | 5.777778 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035088 | 0.161765 | 136 | 7 | 36 | 19.428571 | 0.877193 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
d4da2f97bc53ac1186d79a1c715aa6bdf9fa9071 | 182 | py | Python | db_replication/config.py | agarwal-nitesh/db_replication | c412d9176fc2b633bd70a3ae2789f7fef288f1dd | [
"MIT"
] | 2 | 2018-08-17T15:15:01.000Z | 2018-09-18T04:16:09.000Z | db_replication/config.py | therudite/db_replication | c412d9176fc2b633bd70a3ae2789f7fef288f1dd | [
"MIT"
] | null | null | null | db_replication/config.py | therudite/db_replication | c412d9176fc2b633bd70a3ae2789f7fef288f1dd | [
"MIT"
] | 2 | 2018-10-03T09:37:36.000Z | 2019-07-31T02:13:58.000Z | APP_PORT = 7991
LOGFILE_PATH = "/var/log/application/db_replication.log"
MASTER_MYSQL = {
'host': '127.0.0.1',
'port': 3306,
'user': 'root',
'password': 'password'
}
| 20.222222 | 56 | 0.615385 | 24 | 182 | 4.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094595 | 0.186813 | 182 | 8 | 57 | 22.75 | 0.635135 | 0 | 0 | 0 | 0 | 0 | 0.43956 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.125 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
d4e8cfa2de53db832050bab74b279ea0eccc32d4 | 319 | py | Python | signal_notification/apps.py | pprolancer/django-signal-notification | 1d3f6fea1e58a97ac183d94fa6eca127e508f9a3 | [
"Apache-2.0"
] | 1 | 2020-12-01T14:02:11.000Z | 2020-12-01T14:02:11.000Z | signal_notification/apps.py | pprolancer/django-signal-notification | 1d3f6fea1e58a97ac183d94fa6eca127e508f9a3 | [
"Apache-2.0"
] | null | null | null | signal_notification/apps.py | pprolancer/django-signal-notification | 1d3f6fea1e58a97ac183d94fa6eca127e508f9a3 | [
"Apache-2.0"
] | null | null | null | from django.apps import AppConfig
class SignalNotificationConfig(AppConfig):
name = 'signal_notification'
def ready(self):
from signal_notification.notify_manager import get_registered_notify_manager, get_registered_handlers
get_registered_handlers()
get_registered_notify_manager()
| 26.583333 | 109 | 0.780564 | 34 | 319 | 6.941176 | 0.529412 | 0.220339 | 0.161017 | 0.220339 | 0.233051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169279 | 319 | 11 | 110 | 29 | 0.890566 | 0 | 0 | 0 | 0 | 0 | 0.059748 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
be2a7f575bd5ee29e2edf3e8bfb1890ab5b0caf8 | 1,310 | py | Python | manga_py/providers/readmanga_eu.py | sonvt1710/manga-py | 848a78e93b890af0c92056a1a9fc7f6ce5707cf6 | [
"MIT"
] | 7 | 2019-07-31T13:50:59.000Z | 2019-08-24T03:04:13.000Z | manga_py/providers/readmanga_eu.py | sonvt1710/manga-py | 848a78e93b890af0c92056a1a9fc7f6ce5707cf6 | [
"MIT"
] | 18 | 2019-07-26T22:27:06.000Z | 2019-08-25T13:45:13.000Z | manga_py/providers/readmanga_eu.py | sonvt1710/manga-py | 848a78e93b890af0c92056a1a9fc7f6ce5707cf6 | [
"MIT"
] | 2 | 2019-08-07T12:38:25.000Z | 2019-08-08T06:53:31.000Z | from manga_py.provider import Provider
from .helpers.std import Std
class ReadMangaEu(Provider, Std):
def get_chapter_index(self) -> str:
idx = self.re.search('/manga/\d+/[^/]+/([^/]+)', self.chapter)
return '-'.join(idx.group(1).split('.'))
def get_content(self):
name = self._get_name('/(manga/\d+/[^/]+)')
return self.http_get('{}/{}'.format(self.domain, name))
def get_manga_name(self) -> str:
return self._get_name('/manga/\d+/([^/]+)')
def get_chapters(self):
selector = '#chapters_b a[href*="/manga/"]'
return self._elements(selector)
def parse_files(self, parser):
images_class = '.mainContent img.ebook_img'
return self._images_helper(parser, images_class)
def get_files(self):
parser = self.html_fromstring(self.chapter)
pages = parser.cssselect('#jumpto > option + option')
images = self.parse_files(parser)
for i in pages:
url = self.normalize_uri(i.get('value'))
parser = self.html_fromstring(url)
images += self.parse_files(parser)
return images
def get_cover(self):
return self._cover_from_content('.ebook_cover')
def book_meta(self) -> dict:
# todo meta
pass
main = ReadMangaEu
| 29.111111 | 70 | 0.608397 | 162 | 1,310 | 4.722222 | 0.388889 | 0.047059 | 0.028758 | 0.04183 | 0.112418 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001004 | 0.239695 | 1,310 | 44 | 71 | 29.772727 | 0.767068 | 0.00687 | 0 | 0 | 0 | 0 | 0.127021 | 0.018476 | 0 | 0 | 0 | 0.022727 | 0 | 1 | 0.258065 | false | 0.032258 | 0.064516 | 0.064516 | 0.580645 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
0787716fc352460cee4d15a876e7ae80344f20ee | 2,334 | py | Python | sesion_05/products/views.py | bernest/modulo-django-desarrollo-web-cdmx-20-05pt | 33f971f032f7d3902a49a993d46e3ecefb21d59b | [
"MIT"
] | null | null | null | sesion_05/products/views.py | bernest/modulo-django-desarrollo-web-cdmx-20-05pt | 33f971f032f7d3902a49a993d46e3ecefb21d59b | [
"MIT"
] | null | null | null | sesion_05/products/views.py | bernest/modulo-django-desarrollo-web-cdmx-20-05pt | 33f971f032f7d3902a49a993d46e3ecefb21d59b | [
"MIT"
] | null | null | null | """Products app views"""
# Stardard library
# Django imports
# from django.shortcuts import get_object_or_404
# Third party apps
# from rest_framework.response import Response
# from rest_framework.views import APIView
# from rest_framework import status, generics
from rest_framework import viewsets
# Local apps
from .models import Product
from .serializers import ProductSerializer
# Viewsets
class ProductViewset(viewsets.ModelViewSet):
"""Product CRUD views"""
queryset = Product.objects.all()
serializer_class = ProductSerializer
# Generic View
# class ProductList(generics.ListCreateAPIView):
# """Gets and creates products"""
# queryset = Product.objects.all()
# serializer_class = ProductSerializer
# class ProductRetrieve(generics.RetrieveUpdateDestroyAPIView):
# """Retrieves, updates and deletes a single product"""
# queryset = Product.objects.all()
# serializer_class = ProductSerializer
# API Views
# class ProductsView(APIView):
# """Shows and creates products"""
# def get(self, request):
# products = Product.objects.all()
# serializer = ProductSerializer(products, many=True)
# return Response(serializer.data)
# def post(self, request):
# serializer = ProductSerializer(data=request.data)
# if serializer.is_valid():
# serializer.save()
# return Response(serializer.data, status=status.HTTP_201_CREATED)
# return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
# class ProductView(APIView):
# """Show, updates and deletes products"""
# def get(self, request, pk):
# product = get_object_or_404(Product, pk=pk)
# serializer = ProductSerializer(product)
# return Response(serializer.data)
# def put(self, request, pk):
# product = get_object_or_404(Product, pk=pk)
# serializer = ProductSerializer(product, data=request.data)
# if serializer.is_valid():
# serializer.save()
# return Response(serializer.data)
# return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
# def delete(self, request, pk):
# product = get_object_or_404(Product, pk=pk)
# product.delete()
# return Response(status=status.HTTP_204_NO_CONTENT)
| 27.785714 | 80 | 0.691088 | 251 | 2,334 | 6.298805 | 0.318725 | 0.061986 | 0.091082 | 0.035421 | 0.471221 | 0.407337 | 0.407337 | 0.299178 | 0.299178 | 0.299178 | 0 | 0.012966 | 0.206941 | 2,334 | 83 | 81 | 28.120482 | 0.841167 | 0.841902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
07a115ea01d6ea3d9d0dff6259320b91ecfba474 | 132 | py | Python | desafio16.py | Ramossvitor/PYTHON | 72f1986afd80f908b66238729f689e0f1f166c6b | [
"MIT"
] | null | null | null | desafio16.py | Ramossvitor/PYTHON | 72f1986afd80f908b66238729f689e0f1f166c6b | [
"MIT"
] | null | null | null | desafio16.py | Ramossvitor/PYTHON | 72f1986afd80f908b66238729f689e0f1f166c6b | [
"MIT"
] | null | null | null | import math
num1 = float(input('Digite um numero: '))
print('O numero {} tem a parte inteira {:.0f}'.format(num1, math.trunc(num1))) | 44 | 78 | 0.689394 | 21 | 132 | 4.333333 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0.121212 | 132 | 3 | 78 | 44 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0.421053 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
07da308db478ad4656c1ade3bfa40966b789cec8 | 218 | py | Python | fbExceptionMeansPortNotPermitted.py | SkyLined/mTCPIPConnection | 52f6152a83a163c9f5a45c3fd6edc840c8e72a3b | [
"CC-BY-4.0"
] | null | null | null | fbExceptionMeansPortNotPermitted.py | SkyLined/mTCPIPConnection | 52f6152a83a163c9f5a45c3fd6edc840c8e72a3b | [
"CC-BY-4.0"
] | null | null | null | fbExceptionMeansPortNotPermitted.py | SkyLined/mTCPIPConnection | 52f6152a83a163c9f5a45c3fd6edc840c8e72a3b | [
"CC-BY-4.0"
] | null | null | null | import os, socket;
def fbExceptionMeansPortNotPermitted(oException):
return (
False
) if os.name == "nt" else (
(isinstance(oException, PermissionError) and oException.errno == 13) # Permission denied
); | 27.25 | 92 | 0.711009 | 22 | 218 | 7.045455 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011173 | 0.178899 | 218 | 8 | 93 | 27.25 | 0.854749 | 0.077982 | 0 | 0 | 0 | 0 | 0.01 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0.142857 | 0.428571 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
07e6b33c99899b49f42db08fa5fa9726787d498b | 43 | py | Python | virt_backup/__init__.py | Anthony25/virt-backup | 610c46f24da2325efa25f52fbb1a14f166184684 | [
"BSD-2-Clause-FreeBSD"
] | 23 | 2016-11-13T00:41:47.000Z | 2019-05-16T07:37:48.000Z | virt_backup/__init__.py | Anthony25/virt-backup | 610c46f24da2325efa25f52fbb1a14f166184684 | [
"BSD-2-Clause-FreeBSD"
] | 20 | 2016-08-03T00:21:46.000Z | 2019-06-11T21:52:07.000Z | virt_backup/__init__.py | Anthony25/virt-backup | 610c46f24da2325efa25f52fbb1a14f166184684 | [
"BSD-2-Clause-FreeBSD"
] | 7 | 2017-10-09T12:50:22.000Z | 2019-03-26T00:49:14.000Z | APP_NAME = "virt-backup"
VERSION = "0.5.4"
| 14.333333 | 24 | 0.651163 | 8 | 43 | 3.375 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081081 | 0.139535 | 43 | 2 | 25 | 21.5 | 0.648649 | 0 | 0 | 0 | 0 | 0 | 0.372093 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
07ece62124e0aea3509bdde0925f7902f4130aa9 | 3,863 | py | Python | iscram/tests/integration/test_services.py | tkieras/iscram | 7d710e16db2df46e9860f02efa22ab641b8729d7 | [
"MIT"
] | null | null | null | iscram/tests/integration/test_services.py | tkieras/iscram | 7d710e16db2df46e9860f02efa22ab641b8729d7 | [
"MIT"
] | null | null | null | iscram/tests/integration/test_services.py | tkieras/iscram | 7d710e16db2df46e9860f02efa22ab641b8729d7 | [
"MIT"
] | null | null | null | from typing import Dict
from pytest import approx
from iscram.domain.model import SystemGraph
from iscram.adapters.repository import FakeRepository
from iscram.service_layer import services
from iscram.tests.conftest import get_sg_from_file
def test_get_birnbaum_structural_importance(minimal: SystemGraph):
bst_imps = services.get_birnbaum_structural_importances(minimal, {"SCALE_METRICS": "NONE"})
assert len(bst_imps) != 0
def test_get_birnbaum_importance(minimal: SystemGraph):
data = {
"nodes": {
"x1": {"risk": 0.5},
"x2": {"risk": 0.5},
"x3": {"risk": 0.5}
}
}
b_imps = services.get_birnbaum_importances(minimal, data, data_src="data")
assert len(b_imps) != 0
def test_select_attribute_no_suppliers(minimal: SystemGraph):
data = {"nodes": {}}
result = services.get_birnbaum_importances_select(minimal, data, {"domestic": False}, "data")
assert "domestic" in result
def test_select_attribute_suppliers(diamond_suppliers: SystemGraph):
data = {"nodes": {}}
result = services.get_birnbaum_importances_select(diamond_suppliers, data, {"domestic": False}, "data")
assert "domestic" in result
assert result["domestic"][False] == 0
def test_service_birnbaum_structural_importance_with_scaling(full_example_system: SystemGraph):
b_imps = services.get_birnbaum_structural_importances(full_example_system, {"SCALE_METRICS": "PROPORTIONAL"})
assert len(b_imps) != 0
assert sum(b_imps.values()) == approx(1)
def test_service_birnbaum_structural_importance_with_no_scaling(full_example_system: SystemGraph):
b_imps = services.get_birnbaum_structural_importances(full_example_system, {"SCALE_METRICS": "NONE"})
assert len(b_imps) != 0
assert sum(b_imps.values()) == approx(1)
def test_service_birnbaum_importance_with_scaling(full_example_system: SystemGraph, full_example_data_1: Dict):
b_imps = services.get_birnbaum_importances(full_example_system, full_example_data_1, "data", {"SCALE_METRICS": "PROPORTIONAL"})
assert len(b_imps) != 0
assert sum(b_imps.values()) == approx(1)
def test_service_birnbaum_importance_with_no_scaling(full_example_system: SystemGraph, full_example_data_1: Dict):
b_imps = services.get_birnbaum_importances(full_example_system, full_example_data_1, "data", {"SCALE_METRICS": "NONE"})
assert len(b_imps) != 0
assert sum(b_imps.values()) < 1
def test_service_birnbaum_importance_default_scaling(full_example_system: SystemGraph, full_example_data_1: Dict):
b_imps = services.get_birnbaum_importances(full_example_system, full_example_data_1, "data")
assert len(b_imps) != 0
assert max(b_imps.values()) == approx(1)
assert min(b_imps.values()) == approx(0)
def test_service_fractional_importance_traits_with_scaling(full_example_system: SystemGraph, full_example_data_1: Dict):
f_imps = services.get_fractional_importance_traits(full_example_system, full_example_data_1)
assert len(f_imps) != 0
assert sum(sum(f_imps[k].values()) for k in f_imps) == approx(1)
def test_service_get_attribute_sensitivity(full_example_system: SystemGraph, full_example_data_1: Dict):
result = services.get_attribute_sensitivity(full_example_system, full_example_data_1, "data")
assert result is not None
def test_speed_importances_rand_tree_500():
sg = get_sg_from_file("rand_system_graph_tree_500.json")
assert services.get_birnbaum_structural_importances(sg) is not None
def test_speed_importances_rand_tree_100():
sg = get_sg_from_file("rand_system_graph_tree_100.json")
assert services.get_birnbaum_structural_importances(sg) is not None
def test_speed_importances_rand_tree_50():
sg = get_sg_from_file("rand_system_graph_tree_50.json")
assert services.get_birnbaum_structural_importances(sg) is not None
| 40.663158 | 131 | 0.768833 | 534 | 3,863 | 5.149813 | 0.153558 | 0.096 | 0.086545 | 0.058182 | 0.734182 | 0.717818 | 0.665455 | 0.625455 | 0.587636 | 0.455636 | 0 | 0.015188 | 0.130727 | 3,863 | 94 | 132 | 41.095745 | 0.803752 | 0 | 0 | 0.246154 | 0 | 0 | 0.076107 | 0.023816 | 0 | 0 | 0 | 0 | 0.338462 | 1 | 0.215385 | false | 0 | 0.461538 | 0 | 0.676923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
07f7686f93b2c90e0b5be3fe57a5046fc51868dd | 62 | py | Python | python/testData/intentions/convertVariadicParamPositionalContainerInPy3_after.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/intentions/convertVariadicParamPositionalContainerInPy3_after.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/intentions/convertVariadicParamPositionalContainerInPy3_after.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | def foo(baz=None, *args, bar, **kwargs<caret>):
return bar | 31 | 47 | 0.645161 | 10 | 62 | 4 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16129 | 62 | 2 | 48 | 31 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
07fe3afab0b4cdc9f958037890ce461a43b89378 | 3,830 | py | Python | battlecalculator.py | owenonline/Battle-Calculator-Python | 91ba40e19ebe417e18aab66a2ba98a1fdc96e7a5 | [
"MIT"
] | null | null | null | battlecalculator.py | owenonline/Battle-Calculator-Python | 91ba40e19ebe417e18aab66a2ba98a1fdc96e7a5 | [
"MIT"
] | null | null | null | battlecalculator.py | owenonline/Battle-Calculator-Python | 91ba40e19ebe417e18aab66a2ba98a1fdc96e7a5 | [
"MIT"
] | null | null | null | import math
print("keep all values greater than 0")
forceA=float(input("Enter number of non artillery forces for side A: "))
forceB=float(input("Enter number of non artillery forces for side B: "))
deteffectivenessA=float(input("Enter the organic detection effectiveness (%) of side A: "))
deteffectivenessB=float(input("Enter the organic detection effectiveness (%) of side B: "))
deseffectivenessA =float(input("Enter the destruction effectiveness (%) of side A: "))
deseffectivenessB =float(input("Enter the destruction effectiveness (%) of side B: "))
encounterA =float(input("Enter the max number of side b's units side a can encounter (higher intelligence. set to 1 for none): "))
encounterB =float(input("Enter the max number of side a's units side b can encounter (higher intelligence. set to 1 for none): "))
capitulateA=float(input("Enter side a's capitulation percent: "))/100
capitulateB=float(input("Enter side b's capitulation percent: "))/100
combatA=deteffectivenessA*deseffectivenessA
combatB=deteffectivenessB*deseffectivenessB
remainingA=forceA
remainingB=forceB
t=0
linsqr=""
winner=""
def output(winner, remainingA, remainingB):
if (winner=="b"):
print("Winner: side ",winner,"\nRemaining units on Side A: ",remainingA,"\nRemaining units on side B: ",remainingB)
else:
print("A wins. Swap sides and run equation again to see casualty report")
if(encounterA==1 and encounterB==1):
if((combatB/combatA)>math.pow((forceA/forceB),2)):
output("b",forceA*capitulateA,math.sqrt((combatA/combatB)*(math.pow(remainingA,2)-math.pow(forceA,2))+math.pow(forceB,2)))
else:
output("a",0,0)
elif(encounterA==1 and encounterB!=forceA and encounterB!=1):
if(((combatB*encounterB)/combatA)>math.pow((forceA/forceB),2)):
output("b",forceA*capitulateA,math.sqrt((combatA / (combatB*encounterB)) * (math.pow((forceA * capitulateA), 2) - math.pow(forceA, 2)) + math.pow(forceB, 2)))
else:
output("a",0,0)
elif(encounterA != 1 and encounterA != forceB and encounterB == 1):
if((combatB / (combatA*encounterA)) > math.pow((forceA / forceB), 2)):
output("b",forceA*capitulateA,math.sqrt(((combatA*encounterA) / combatB) * (math.pow((forceA * capitulateA), 2) - math.pow(forceA, 2)) + math.pow(forceB, 2)))
else:
output("a",0,0)
elif(encounterA == forceB and encounterB == forceA):
if((combatB / combatA) > (forceA / forceB)):
output("b",forceA*capitulateA,(combatA / combatB) * ((forceA * capitulateA) - forceA) + forceB)
else:
output("a",0,0)
elif(encounterA == forceB and encounterB != forceA and encounterB != 1):
if(((combatB*encounterB) / combatA) > (math.pow(forceA,2) / forceB)):
output("b",forceA*capitulateA,(combatA / (combatB*encounterB)) * (math.pow((forceA * capitulateA), 2) - math.pow(forceA, 2)) + forceB)
else:
output("a",0,0)
elif(encounterA != forceB and encounterA != 1 and encounterB == forceA):
if((combatB / (combatA*encounterA)) > (forceA / math.pow(forceB,2))):
output("b",forceA*capitulateA,math.sqrt(((combatA * encounterA) / combatB) * ((forceA * capitulateA) - forceA) + math.pow(forceB, 2)))
else:
output("a",0,0)
elif(encounterA==forceB and encounterB==1):
if((combatB / combatA) > (math.pow(forceA, 2) / forceB)):
output("b",forceA*capitulateA,(combatA / combatB) * (math.pow((forceA * capitulateA), 2) - math.pow(forceA, 2)) + forceB)
else:
output("a",0,0)
elif(encounterA == 1 and encounterB == forceA):
if((combatB / combatA) > (forceA / math.pow(forceB, 2))):
output("b",forceA*capitulateA,math.sqrt((combatA / combatB) * ((forceA * capitulateA) - forceA) + math.pow(forceB, 2)))
else:
output("a",0,0) | 52.465753 | 168 | 0.66188 | 495 | 3,830 | 5.121212 | 0.157576 | 0.06075 | 0.071795 | 0.07574 | 0.743984 | 0.734911 | 0.734911 | 0.708876 | 0.590533 | 0.514793 | 0 | 0.018354 | 0.174935 | 3,830 | 73 | 169 | 52.465753 | 0.783861 | 0 | 0 | 0.265625 | 0 | 0.03125 | 0.205906 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.015625 | false | 0 | 0.015625 | 0 | 0.03125 | 0.046875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
5806c1e5cf0aea0b7fce83ab4406884654dee558 | 1,493 | py | Python | simple_rl/mdp/StateClass.py | david-abel/mdps | d8fe6007efb4840377f085a4e35ba89aaa2cdf6d | [
"Apache-2.0"
] | 230 | 2016-08-04T12:59:11.000Z | 2022-03-15T04:14:40.000Z | simple_rl/mdp/StateClass.py | david-abel/mdps | d8fe6007efb4840377f085a4e35ba89aaa2cdf6d | [
"Apache-2.0"
] | 36 | 2016-08-31T19:31:36.000Z | 2021-11-17T03:58:24.000Z | simple_rl/mdp/StateClass.py | david-abel/mdps | d8fe6007efb4840377f085a4e35ba89aaa2cdf6d | [
"Apache-2.0"
] | 95 | 2016-08-31T19:10:45.000Z | 2022-03-15T04:15:39.000Z | # Python imports
from collections.abc import Sequence
import numpy as np
''' StateClass.py: Contains the State Class. '''
class State(Sequence):
''' Abstract State class '''
def __init__(self, data=None, is_terminal=False):
self.data = data if data is not None else []
self._is_terminal = is_terminal
def features(self):
'''
Summary
Used by function approximators to represent the state.
Override this method in State subclasses to have function
approximators use a different set of features.
Returns:
(iterable)
'''
return np.array(self.data).flatten()
def get_data(self):
return self.data
def get_num_feats(self):
return len(self.features())
def is_terminal(self):
return self._is_terminal
def set_terminal(self, is_term=True):
self._is_terminal = is_term
def __hash__(self):
if type(self.data).__module__ == np.__name__:
# Numpy arrays
return hash(str(self.data))
elif self.data.__hash__ is None:
return hash(tuple(self.data))
else:
return hash(self.data)
def __str__(self):
return "s." + str(self.data)
def __eq__(self, other):
if isinstance(other, State):
return self.data == other.data
return False
def __getitem__(self, index):
return self.data[index]
def __len__(self):
return len(self.data)
| 24.883333 | 70 | 0.600804 | 181 | 1,493 | 4.674033 | 0.39779 | 0.122931 | 0.049645 | 0.037825 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.304756 | 1,493 | 59 | 71 | 25.305085 | 0.815029 | 0.170797 | 0 | 0 | 0 | 0 | 0.001787 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.060606 | 0.181818 | 0.787879 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
ed34a529e9f5ee833d3f64f7b2e9895870374308 | 1,822 | py | Python | engine/error/ecs.py | TWoolhouse/Libraries | 26079ed387cb800cb97f20980720ae094008c7bf | [
"MIT"
] | 1 | 2020-10-11T15:34:56.000Z | 2020-10-11T15:34:56.000Z | engine/error/ecs.py | TWoolhouse/Libraries | 26079ed387cb800cb97f20980720ae094008c7bf | [
"MIT"
] | null | null | null | engine/error/ecs.py | TWoolhouse/Libraries | 26079ed387cb800cb97f20980720ae094008c7bf | [
"MIT"
] | null | null | null | from .base import ECSError
__all__ = ["ComponentTypeError", "EntityLimitError", "GetComponentError", "InitializeComponent", "TerminateComponent"]
class ComponentTypeError(ECSError):
def __init__(self, component, expected=None):
self.component = component
self.expected = expected
def __str__(self) -> str:
if self.expected is None:
return f"{self.component} is not the correct type"
return f"Expected: '{self.expected}' Got {self.component}"
class EntityLimitError(ECSError):
def __init__(self, world, limit):
self.world = world
self.limit = limit
def __str__(self) -> str:
return "Entity Limit: {} reached in World: {}".format(self.limit, self.world)
class GetComponentError(ECSError):
def __init__(self, entity, component):
self.entity = entity
self.component = component
def __str__(self) -> str:
return "{}<{}> Does not Exist".format(self.entity, self.component.__name__)
class InitializeComponent(ECSError):
def __init__(self, entity, component, value):
self.entity = entity
self.component = component
self.value = value
def __str__(self) -> str:
return "{}<{}> Failed to Initialize with '{}'".format(self.entity, self.component.__class__.__name__, self.value)
class TerminateComponent(ECSError):
def __init__(self, entity, component, value):
self.entity = entity
self.component = component
self.value = value
def __str__(self) -> str:
return "{}<{}> Failed to Terminate with '{}'".format(self.entity, self.component.__class__.__name__, self.value)
class ParentError(ECSError):
def __init__(self, entity):
self.entity = entity
def __str__(self) -> str:
return "{} Has no parent".format(self.entity)
| 30.366667 | 121 | 0.658617 | 198 | 1,822 | 5.69697 | 0.232323 | 0.097518 | 0.079787 | 0.101064 | 0.470745 | 0.37234 | 0.308511 | 0.308511 | 0.308511 | 0.308511 | 0 | 0 | 0.221186 | 1,822 | 59 | 122 | 30.881356 | 0.794926 | 0 | 0 | 0.439024 | 0 | 0 | 0.177278 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.292683 | false | 0 | 0.02439 | 0.121951 | 0.634146 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
ed505c7d14017fdb369e422b7cdd39c34df7dba9 | 505 | py | Python | djangocms_baseplugins/teaser_section/models.py | benzkji/djangocms-baseplugins | 7f041a030ed93dcdec70e4ca777b841846b8f2f2 | [
"MIT"
] | 2 | 2019-04-14T01:31:22.000Z | 2020-03-05T13:06:57.000Z | djangocms_baseplugins/teaser_section/models.py | benzkji/djangocms-baseplugins | 7f041a030ed93dcdec70e4ca777b841846b8f2f2 | [
"MIT"
] | 32 | 2017-04-04T09:28:06.000Z | 2021-08-18T16:23:02.000Z | djangocms_baseplugins/teaser_section/models.py | bnzk/djangocms-baseplugins | 7f041a030ed93dcdec70e4ca777b841846b8f2f2 | [
"MIT"
] | null | null | null | from django.utils.translation import ugettext_lazy as _
from djangocms_baseplugins.baseplugin.models import AbstractBasePlugin
from djangocms_baseplugins.baseplugin.utils import check_migration_modules_needed
check_migration_modules_needed('teaser_section')
class TeaserSectionBase(AbstractBasePlugin):
class Meta:
abstract = True
def __str__(self):
text = str(_("Teaser Section"))
return self.add_hidden_flag(text)
class TeaserSection(TeaserSectionBase):
pass
| 24.047619 | 81 | 0.786139 | 55 | 505 | 6.890909 | 0.618182 | 0.068602 | 0.126649 | 0.17942 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152475 | 505 | 20 | 82 | 25.25 | 0.885514 | 0 | 0 | 0 | 0 | 0 | 0.055446 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0.083333 | 0.25 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
ed5a6f0ffb1e37470c7a57935d081456a13620ca | 195 | py | Python | python/8Kyu/Define a card suit.py | athasv/Codewars-data | 5e106466e709fd776f23585ad9f652d0d65b48d3 | [
"MIT"
] | null | null | null | python/8Kyu/Define a card suit.py | athasv/Codewars-data | 5e106466e709fd776f23585ad9f652d0d65b48d3 | [
"MIT"
] | null | null | null | python/8Kyu/Define a card suit.py | athasv/Codewars-data | 5e106466e709fd776f23585ad9f652d0d65b48d3 | [
"MIT"
] | null | null | null | def define_suit(card):
if card.endswith("C"): return "clubs"
if card.endswith("D"): return "diamonds"
if card.endswith("H"): return "hearts"
if card.endswith("S"): return "spades" | 39 | 44 | 0.651282 | 28 | 195 | 4.5 | 0.535714 | 0.190476 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169231 | 195 | 5 | 45 | 39 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0.147959 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
ed6562c094a29d670635f02a4ce1ef60f2cc7c74 | 201 | py | Python | valorant_api/exceptions.py | MinshuG/valorant-api | 9c9a9dd4eb5401e3f80dc4daabddf2e343558260 | [
"MIT"
] | 3 | 2021-02-26T01:48:32.000Z | 2022-02-08T03:04:50.000Z | valorant_api/exceptions.py | MinshuG/valorant-api | 9c9a9dd4eb5401e3f80dc4daabddf2e343558260 | [
"MIT"
] | 1 | 2021-07-05T00:03:25.000Z | 2021-07-28T11:41:57.000Z | valorant_api/exceptions.py | MinshuG/valorant-api | 9c9a9dd4eb5401e3f80dc4daabddf2e343558260 | [
"MIT"
] | 3 | 2021-02-14T11:00:34.000Z | 2021-07-19T17:36:08.000Z | class ValorantApi(Exception):
pass
class InvalidOrMissingParameter(ValorantApi): # 400
pass
class NotFound(ValorantApi): # 404
pass
class AttributeExistsError(ValorantApi):
pass
| 14.357143 | 52 | 0.736318 | 18 | 201 | 8.222222 | 0.5 | 0.182432 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.19403 | 201 | 13 | 53 | 15.461538 | 0.876543 | 0.034826 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
ed814c8f249324f766a297bbb9d70d4e0a146f19 | 453 | py | Python | webhook_actions/adapters/repo_impl.py | Senth/webhook-actions | 73b9ccd48a281901a211dd45d9e17bd9b9844e43 | [
"MIT"
] | null | null | null | webhook_actions/adapters/repo_impl.py | Senth/webhook-actions | 73b9ccd48a281901a211dd45d9e17bd9b9844e43 | [
"MIT"
] | 2 | 2021-07-01T19:10:05.000Z | 2021-07-01T19:31:47.000Z | webhook_actions/adapters/repo_impl.py | Senth/webhook-actions | 73b9ccd48a281901a211dd45d9e17bd9b9844e43 | [
"MIT"
] | null | null | null | from ..app.run.run_repo import RunRepo
from ..core.entities.action import Action
from ..gateways.script_gateway import ScriptGateway
class RepoImpl(RunRepo):
def __init__(self) -> None:
super().__init__()
self.script_gateway = ScriptGateway()
def run(self, action: Action) -> bool:
return self.script_gateway.run(action)
def exists(self, action: Action) -> bool:
return self.script_gateway.exists(action)
| 28.3125 | 51 | 0.699779 | 56 | 453 | 5.428571 | 0.410714 | 0.171053 | 0.167763 | 0.131579 | 0.282895 | 0.282895 | 0.282895 | 0.282895 | 0 | 0 | 0 | 0 | 0.189845 | 453 | 15 | 52 | 30.2 | 0.828338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.272727 | 0.181818 | 0.818182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
ed8c03239f7424096760f3c141ae35c8eb7bc0b4 | 205 | py | Python | setup.py | adrianschlatter/ppf.jabref | cb0c054db9eaa1bce542f6a559d90e13ca0d829a | [
"MIT"
] | 2 | 2021-06-15T09:59:16.000Z | 2021-06-25T22:16:11.000Z | setup.py | adrianschlatter/ppf.jabref | cb0c054db9eaa1bce542f6a559d90e13ca0d829a | [
"MIT"
] | 18 | 2015-12-10T20:54:28.000Z | 2016-03-20T16:20:29.000Z | setup.py | adrianschlatter/ppf.jabref | cb0c054db9eaa1bce542f6a559d90e13ca0d829a | [
"MIT"
] | 1 | 2022-01-05T05:22:39.000Z | 2022-01-05T05:22:39.000Z | # -*- coding: utf-8 -*-
"""
A setuptools based setup module.
See:
https://packaging.python.org/en/latest/distributing.html
https://github.com/pypa/sampleproject
"""
import setuptools
setuptools.setup()
| 15.769231 | 56 | 0.726829 | 26 | 205 | 5.730769 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005435 | 0.102439 | 205 | 12 | 57 | 17.083333 | 0.804348 | 0.760976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
71e8c4ed12dfbb8999df239c19f770a1e734e65b | 454 | py | Python | scripts/generate_ramachandran_statistics.py | leimao/Ramachandran | 8080697cced0b33792493de8d784467734433ca5 | [
"Apache-2.0"
] | 2 | 2021-02-15T07:11:26.000Z | 2021-02-23T09:25:54.000Z | scripts/generate_ramachandran_statistics.py | leimao/Ramachandran | 8080697cced0b33792493de8d784467734433ca5 | [
"Apache-2.0"
] | null | null | null | scripts/generate_ramachandran_statistics.py | leimao/Ramachandran | 8080697cced0b33792493de8d784467734433ca5 | [
"Apache-2.0"
] | null | null | null | import ramachandran.statistics
ramachandran.statistics.count_torsion_angles_from_directory(dir_path="./pdbx_collections_resolution_1p0", save_file_path="./data/probability.npz", b_factor_threshold=30, resolution=90, num_processes=12)
ramachandran.statistics.compute_gaussian_kde_densities_from_directory(dir_path="./pdbx_collections_resolution_1p0", save_file_path="./data/gaussian_density.npz", b_factor_threshold=30, resolution=360, num_processes=12) | 90.8 | 218 | 0.867841 | 61 | 454 | 6 | 0.540984 | 0.180328 | 0.087432 | 0.10929 | 0.519126 | 0.519126 | 0.349727 | 0.349727 | 0.349727 | 0.349727 | 0 | 0.038549 | 0.028634 | 454 | 5 | 218 | 90.8 | 0.791383 | 0 | 0 | 0 | 0 | 0 | 0.252747 | 0.252747 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
71f8daad5a201c0ad6ab4d685f07bba1f782ced2 | 460 | py | Python | scarf/__init__.py | gitter-badger/scarf-1 | 9fb230857ec9c33d5c4c23b5852a4b17a79fe60f | [
"BSD-3-Clause"
] | null | null | null | scarf/__init__.py | gitter-badger/scarf-1 | 9fb230857ec9c33d5c4c23b5852a4b17a79fe60f | [
"BSD-3-Clause"
] | null | null | null | scarf/__init__.py | gitter-badger/scarf-1 | 9fb230857ec9c33d5c4c23b5852a4b17a79fe60f | [
"BSD-3-Clause"
] | null | null | null | import warnings
from dask.array import PerformanceWarning
from importlib_metadata import version
warnings.filterwarnings("ignore", category=DeprecationWarning)
warnings.filterwarnings("ignore", category=PerformanceWarning)
try:
__version__ = version('scarf-toolkit')
except ImportError:
print("Scarf is not installed", flush=True)
from .datastore import *
from .readers import *
from .writers import *
from .meld_assay import *
from .utils import *
| 25.555556 | 62 | 0.793478 | 52 | 460 | 6.903846 | 0.576923 | 0.111421 | 0.155989 | 0.200557 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121739 | 460 | 17 | 63 | 27.058824 | 0.888614 | 0 | 0 | 0 | 0 | 0 | 0.102174 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.642857 | 0 | 0.642857 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
9c268ee22e2e511b9e7808f4526ded04a216e8f5 | 1,191 | py | Python | jetengine/query/i_contains.py | kpdemetriou/jetengine | 1b931efb4a4b0ad7a473773aa6fcf0677c71f122 | [
"BSD-3-Clause"
] | 5 | 2019-06-24T08:57:42.000Z | 2021-12-17T22:58:08.000Z | jetengine/query/i_contains.py | kpdemetriou/jetengine | 1b931efb4a4b0ad7a473773aa6fcf0677c71f122 | [
"BSD-3-Clause"
] | 1 | 2020-02-24T15:04:13.000Z | 2020-07-31T11:47:44.000Z | jetengine/query/i_contains.py | kpdemetriou/jetengine | 1b931efb4a4b0ad7a473773aa6fcf0677c71f122 | [
"BSD-3-Clause"
] | 3 | 2019-11-07T13:57:40.000Z | 2021-12-17T22:58:00.000Z | from jetengine.query.base import QueryOperator
class IContainsOperator(QueryOperator):
"""
Query operator used to return all documents which specified field contains a string equal to a passed value.
It is not case sensitive.
For more information on `$regex` go to https://docs.mongodb.org/manual/reference/operator/query/regex/
Usage:
.. testsetup:: icontains_query_operator
from datetime import datetime
import tornado.ioloop
from jetengine import *
.. testcode:: icontains_query_operator
class User(Document):
first_name = StringField()
query = Q(first_name__icontains='NaR')
query_result = query.to_query(User)
# Due to dict ordering
print('{"first_name": {"$options": "%s", "$regex": "%s"}}' % (
query_result['first_name']['$options'],
query_result['first_name']['$regex'],
))
The resulting regex is:
.. testoutput:: icontains_query_operator
{"first_name": {"$options": "i", "$regex": "NaR"}}
"""
def to_query(self, field_name, value):
return {field_name: {"$regex": r"%s" % value, "$options": "i"}}
| 25.891304 | 112 | 0.625525 | 136 | 1,191 | 5.323529 | 0.5 | 0.074586 | 0.09116 | 0.055249 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.245172 | 1,191 | 45 | 113 | 26.466667 | 0.805339 | 0.743073 | 0 | 0 | 0 | 0 | 0.080952 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
9c399fcd0c0fa289f5191739732e8da5be0c76a3 | 981 | py | Python | crescent/resources/rds/__init__.py | mpolatcan/zepyhrus | 2fd0b1b9b21613b5876a51fe8b5f9e3afbec1b67 | [
"Apache-2.0"
] | 1 | 2020-03-26T19:20:03.000Z | 2020-03-26T19:20:03.000Z | crescent/resources/rds/__init__.py | mpolatcan/zepyhrus | 2fd0b1b9b21613b5876a51fe8b5f9e3afbec1b67 | [
"Apache-2.0"
] | null | null | null | crescent/resources/rds/__init__.py | mpolatcan/zepyhrus | 2fd0b1b9b21613b5876a51fe8b5f9e3afbec1b67 | [
"Apache-2.0"
] | null | null | null | from .arn import ArnFactory
from .db_cluster import DBClusterFactory
from .db_cluster_pg import DBClusterParameterGroupFactory
from .db_instance import DBInstanceFactory
from .db_pg import DBParameterGroupFactory
from .db_security_group import DBSecurityGroupFactory
from .db_security_group_ingress import DBSecurityGroupIngressFactory
from .db_subnet_group import DBSubnetGroupFactory
from .event_subscription import EventSubscriptionFactory
from .option_group import OptionGroupFactory
from .constants import *
class Rds:
Arn = ArnFactory
DBCluster = DBClusterFactory
DBClusterParameterGroup = DBClusterParameterGroupFactory
DBInstance = DBInstanceFactory
DBParameterGroup = DBClusterParameterGroupFactory
DBSecurityGroup = DBSecurityGroupFactory
DBSecurityGroupIngress = DBSecurityGroupIngressFactory
DBSubnetGroup = DBSubnetGroupFactory
EventSubscription = EventSubscriptionFactory
OptionGroup = OptionGroupFactory
__all__ = ["Rds"]
| 35.035714 | 68 | 0.846075 | 81 | 981 | 10.024691 | 0.45679 | 0.051724 | 0.03202 | 0.046798 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125382 | 981 | 27 | 69 | 36.333333 | 0.946387 | 0 | 0 | 0 | 0 | 0 | 0.003058 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.478261 | 0 | 0.956522 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
9c496278f4fb9e5afbb236dd88748c06807c00b3 | 4,559 | py | Python | autobahn/wamp/gen/wamp/proto/CalleeFeatures.py | rapyuta-robotics/autobahn-python | c08e9e352d526a7fd0885bb94706366a432ada1a | [
"MIT"
] | 1,670 | 2015-10-12T15:46:22.000Z | 2022-03-30T22:12:53.000Z | autobahn/wamp/gen/wamp/proto/CalleeFeatures.py | rapyuta-robotics/autobahn-python | c08e9e352d526a7fd0885bb94706366a432ada1a | [
"MIT"
] | 852 | 2015-10-16T22:11:03.000Z | 2022-03-27T07:57:01.000Z | autobahn/wamp/gen/wamp/proto/CalleeFeatures.py | rapyuta-robotics/autobahn-python | c08e9e352d526a7fd0885bb94706366a432ada1a | [
"MIT"
] | 790 | 2015-10-15T08:46:12.000Z | 2022-03-30T12:22:13.000Z | # automatically generated by the FlatBuffers compiler, do not modify
# namespace: proto
import flatbuffers
class CalleeFeatures(object):
__slots__ = ['_tab']
@classmethod
def GetRootAsCalleeFeatures(cls, buf, offset):
n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset)
x = CalleeFeatures()
x.Init(buf, n + offset)
return x
# CalleeFeatures
    def Init(self, buf, pos):
        self._tab = flatbuffers.table.Table(buf, pos)

    # CalleeFeatures
    def CallerIdentification(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def CallTrustlevels(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def CallTimeout(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def CallCanceling(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def ProgressiveCallResults(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def RegistrationRevocation(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def PatternBasedRegistration(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def SharedRegistration(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def PayloadTransparency(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False

    # CalleeFeatures
    def PayloadEncryptionCryptobox(self):
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22))
        if o != 0:
            return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos))
        return False
def CalleeFeaturesStart(builder): builder.StartObject(10)
def CalleeFeaturesAddCallerIdentification(builder, callerIdentification): builder.PrependBoolSlot(0, callerIdentification, 0)
def CalleeFeaturesAddCallTrustlevels(builder, callTrustlevels): builder.PrependBoolSlot(1, callTrustlevels, 0)
def CalleeFeaturesAddCallTimeout(builder, callTimeout): builder.PrependBoolSlot(2, callTimeout, 0)
def CalleeFeaturesAddCallCanceling(builder, callCanceling): builder.PrependBoolSlot(3, callCanceling, 0)
def CalleeFeaturesAddProgressiveCallResults(builder, progressiveCallResults): builder.PrependBoolSlot(4, progressiveCallResults, 0)
def CalleeFeaturesAddRegistrationRevocation(builder, registrationRevocation): builder.PrependBoolSlot(5, registrationRevocation, 0)
def CalleeFeaturesAddPatternBasedRegistration(builder, patternBasedRegistration): builder.PrependBoolSlot(6, patternBasedRegistration, 0)
def CalleeFeaturesAddSharedRegistration(builder, sharedRegistration): builder.PrependBoolSlot(7, sharedRegistration, 0)
def CalleeFeaturesAddPayloadTransparency(builder, payloadTransparency): builder.PrependBoolSlot(8, payloadTransparency, 0)
def CalleeFeaturesAddPayloadEncryptionCryptobox(builder, payloadEncryptionCryptobox): builder.PrependBoolSlot(9, payloadEncryptionCryptobox, 0)
def CalleeFeaturesEnd(builder): return builder.EndObject()
# ---
# hexsha: 9c4b6f4acceda1c831def88da89d9cb55a9c600d | size: 195 | lang: Python | stars: 6
# path: tests/modules/ffi/base.py | repo: mswart/topaz | license: BSD-3-Clause
# ---
from tests.base import BaseTopazTest

class BaseFFITest(BaseTopazTest):
    def ask(self, space, question):
        w_answer = space.execute(question)
        return self.unwrap(space, w_answer)
# ---
# hexsha: 9c555899471a0a157d92ed764433713f75557236 | size: 25 | lang: Python | stars: 3,748
# path: tests/data/config/g.py | repo: jinliwei1997/mmcv | license: Apache-2.0
# ---
filename = 'reserved.py'
# ---
# hexsha: 92beadce1b8da861c78be46e403c35e8898f7212 | size: 155 | lang: Python
# path: setup.py | repo: CorvusEtiam/financemgr | license: MIT
# ---
from setuptools import setup

setup(
    name="financemgr",
    version="1.0.0",
    packages=["financemgr"],
    install_requires=["sqlalchemy"],
)
# ---
# hexsha: 92d8dc7401e779f6222fdbfe1120027187b2590c | size: 240 | lang: Python | stars: 5
# path: tournaments/digitDifferenceSort/digitDifferenceSort.py | repo: gurfinkel/codeSignal | license: MIT
# ---
def digitDifferenceSort(a):
    def dg(n):
        s = list(map(int, str(n)))
        return max(s) - min(s)
    ans = [(a[i], i) for i in range(len(a))]
    A = sorted(ans, key=lambda x: (dg(x[0]), -x[1]))
    return [c[0] for c in A]
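A quick illustration of the comparator above: the primary key is the spread between a number's largest and smallest digit, and ties are broken by the negated original index, so among equal spreads the later element comes first. This sketch duplicates the function so it runs stand-alone; the input values are invented:

```python
def digitDifferenceSort(a):
    # spread between the largest and smallest digit of n
    def dg(n):
        s = list(map(int, str(n)))
        return max(s) - min(s)
    ans = [(a[i], i) for i in range(len(a))]
    # primary key: digit spread ascending; secondary key: original order reversed
    A = sorted(ans, key=lambda x: (dg(x[0]), -x[1]))
    return [c[0] for c in A]

# spreads: 7 -> 0, 23 -> 1, 887 -> 1, 243 -> 2, 152 -> 4;
# 887 precedes 23 because the later index wins ties
print(digitDifferenceSort([152, 23, 7, 887, 243]))  # → [7, 887, 23, 243, 152]
```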
# ---
# hexsha: 92d99276dc4dc6c2e4ea78587a0fe9b9d2a89c3f | size: 204 | lang: Python | stars: 4
# path: 1c_GT_Computer_Networks/Project-3/topo2.py | repo: yevheniyc/Python | license: MIT
# ---
# Topology with a single loop
# A --- B --- C
# |     |
# D --- E

topo = { 'A' : ['B', 'D'],
         'B' : ['A', 'C', 'E'],
         'C' : ['B'],
         'D' : ['A', 'E'],
         'E' : ['B', 'D'] }
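The adjacency list above describes an undirected graph whose loop is A-B-E-D-A. Hop counts from any node follow from a plain breadth-first search; this sketch is illustrative and not part of the original file:

```python
from collections import deque

topo = {'A': ['B', 'D'],
        'B': ['A', 'C', 'E'],
        'C': ['B'],
        'D': ['A', 'E'],
        'E': ['B', 'D']}

def hops(graph, start):
    # BFS hop count from `start` to every reachable node
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

print(hops(topo, 'A'))  # → {'A': 0, 'B': 1, 'D': 1, 'C': 2, 'E': 2}
```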
# ---
# hexsha: 92e191138d5ca707b62326d13003dd3309e107c9 | size: 611 | lang: Python
# path: abc_anno/attempt01.py | repo: Preocts/python_play_carton | license: MIT
# ---
from __future__ import annotations
from abc import ABC
from abc import abstractmethod


class Parent(ABC):
    @abstractmethod
    def method_string(self) -> str:
        raise NotImplementedError()


class ChildOne(Parent):
    def method_string(self) -> str:
        return "hello"


class ChildTwo(Parent):
    def method_string(self) -> str:
        return "Goodbye"


def getChild(number: int) -> Parent:
    return ChildTwo() if number else ChildOne()


def getChildren(number: int) -> list[Parent]:
    if number:
        return [ChildTwo() for _ in range(10)]
    return [ChildOne() for _ in range(10)]
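A short usage sketch of the factory pattern above: callers only see the `Parent` interface and the factory picks the concrete subclass. The classes are redefined here so the example runs stand-alone; the return values are just the literals each subclass returns:

```python
from abc import ABC, abstractmethod

class Parent(ABC):
    @abstractmethod
    def method_string(self) -> str: ...

class ChildOne(Parent):
    def method_string(self) -> str:
        return "hello"

class ChildTwo(Parent):
    def method_string(self) -> str:
        return "Goodbye"

def getChild(number: int) -> Parent:
    # truthy selector picks ChildTwo, falsy picks ChildOne
    return ChildTwo() if number else ChildOne()

print(getChild(0).method_string())  # → hello
print(getChild(5).method_string())  # → Goodbye
```

Because `Parent` carries an abstract method, instantiating it directly raises `TypeError`, which is what forces every subclass to provide `method_string`.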
# ---
# hexsha: 92f05cd494aaa5de32fd28356ef02ad1b50368d6 | size: 1,241 | lang: Python
# path: python/python_100_days/django_test/hellodjango/hrs/views.py | repo: bluewaitor/playground | license: MIT
# ---
from io import StringIO
from django.http import HttpResponse

# Department names/locations are Chinese data values:
# 财务部 = Finance (北京 = Beijing), 研发部 = R&D (成都 = Chengdu), 销售部 = Sales (上海 = Shanghai)
depts_list = [
    {'no': 10, 'name': '财务部', 'location': '北京'},
    {'no': 20, 'name': '研发部', 'location': '成都'},
    {'no': 30, 'name': '销售部', 'location': '上海'},
]


def index(request):
    output = StringIO()
    output.write('<html>\n')
    output.write('<head>\n')
    output.write('\t<meta charset="utf-8">\n')
    output.write('\t<title>首页</title>')
    output.write('</head>\n')
    output.write('<body>\n')
    output.write('\t<h1>部门信息</h1>\n')
    output.write('\t<hr>\n')
    output.write('\t<table>\n')
    output.write('\t\t<tr>\n')
    output.write('\t\t\t<th width=120>部门编号</th>\n')
    output.write('\t\t\t<th width=180>部门名称</th>\n')
    output.write('\t\t\t<th width=180>所在地</th>\n')
    output.write('\t\t</tr>\n')
    for dept in depts_list:
        output.write('\t\t<tr>\n')
        output.write(f'\t\t\t<td align=center>{dept["no"]}</td>\n')
        output.write(f'\t\t\t<td align=center>{dept["name"]}</td>\n')
        output.write(f'\t\t\t<td align=center>{dept["location"]}</td>\n')
    output.write('\t\t</tr>\n')
    output.write('\t</table>\n')
    output.write('</body>\n')
    output.write('</html>\n')
    return HttpResponse(output.getvalue())
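The view above is plain string assembly, so the row-generation step can be exercised without Django. In this sketch, `render_rows` is a hypothetical helper mirroring the loop in `index()`, not part of the original view:

```python
from io import StringIO

def render_rows(depts):
    # emit one <tr> per department, mirroring the loop in index()
    output = StringIO()
    for dept in depts:
        output.write('\t\t<tr>\n')
        output.write(f'\t\t\t<td align=center>{dept["no"]}</td>\n')
        output.write(f'\t\t\t<td align=center>{dept["name"]}</td>\n')
        output.write(f'\t\t\t<td align=center>{dept["location"]}</td>\n')
        output.write('\t\t</tr>\n')
    return output.getvalue()

html = render_rows([{'no': 10, 'name': 'Finance', 'location': 'Beijing'}])
print('<td align=center>10</td>' in html)  # → True
```

Hand-built HTML like this is fine for an exercise; in a real Django app the same table would normally go through the template engine, which also escapes the values.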
# ---
# hexsha: 92f77563985e2b27e675a89f3ea7485f58764eb1 | size: 6,092 | lang: Python
# path: src/features/fixtures/after_fixtures.py | repo: Kamal1224/dataworks-behavioural-framework | license: 0BSD
# ---
import os
from behave import fixture
from botocore.exceptions import ClientError

from helpers import (
    aws_helper,
    console_printer,
)

@fixture
def clean_up_role_and_s3_objects(context, timeout=30, **kwargs):
    console_printer.print_info("Executing 'clean_up_role_and_s3_objects' fixture")
    aws_helper.remove_role(
        context.analytical_test_e2e_role, context.analytical_test_e2e_policies
    )
    aws_helper.clear_session()
    aws_helper.set_details_for_role_assumption(
        context.aws_role, context.aws_session_timeout_seconds
    )
    if context.analytical_test_data_s3_location.get("path"):
        aws_helper.remove_file_from_s3_and_wait_for_consistency(
            context.published_bucket,
            os.path.join(
                context.analytical_test_data_s3_location["path"],
                context.analytical_test_data_s3_location["file_name"],
            ),
        )
    if context.analytical_test_data_s3_location.get("paths"):
        for path in context.analytical_test_data_s3_location["paths"]:
            aws_helper.remove_file_from_s3_and_wait_for_consistency(
                context.published_bucket,
                os.path.join(
                    path, context.analytical_test_data_s3_location["file_name"]
                ),
            )

@fixture
def clean_up_s3_object(context, timeout=30, **kwargs):
    console_printer.print_info("Executing 'clean_up_s3_object' fixture")
    if aws_helper.check_if_s3_object_exists(
        context.published_bucket, context.analytical_test_data_s3_location.get("path")
    ):
        aws_helper.remove_file_from_s3_and_wait_for_consistency(
            context.published_bucket,
            os.path.join(
                context.analytical_test_data_s3_location["path"],
                context.analytical_test_data_s3_location["file_name"],
            ),
        )

@fixture
def terminate_adg_cluster(context, timeout=30, **kwargs):
    console_printer.print_info("Executing 'terminate_adg_cluster' fixture")
    if "adg_cluster_id" in context and context.adg_cluster_id is not None:
        try:
            aws_helper.terminate_emr_cluster(context.adg_cluster_id)
        except ClientError as error:
            console_printer.print_warning_text(
                f"Error occurred when terminating ADG cluster with id of '{context.adg_cluster_id}': '{error}'"
            )
    else:
        console_printer.print_info(
            "No cluster id found for ADG so not terminating any cluster"
        )

@fixture
def terminate_clive_cluster(context, timeout=30, **kwargs):
    console_printer.print_info("Executing 'terminate_clive_cluster' fixture")
    if "clive_cluster_id" in context and context.clive_cluster_id is not None:
        try:
            aws_helper.terminate_emr_cluster(context.clive_cluster_id)
        except ClientError as error:
            console_printer.print_warning_text(
                f"Error occurred when terminating clive cluster with id of '{context.clive_cluster_id}': '{error}'"
            )
    else:
        console_printer.print_info(
            "No cluster id found for clive so not terminating any cluster"
        )

@fixture
def terminate_pdm_cluster(context, timeout=30, **kwargs):
    console_printer.print_info("Executing 'terminate_pdm_cluster' fixture")
    if "pdm_cluster_id" in context and context.pdm_cluster_id is not None:
        try:
            aws_helper.terminate_emr_cluster(context.pdm_cluster_id)
        except ClientError as error:
            console_printer.print_warning_text(
                f"Error occurred when terminating PDM cluster with id of '{context.pdm_cluster_id}': '{error}'"
            )
    else:
        console_printer.print_info(
            "No cluster id found for PDM so not terminating any cluster"
        )

@fixture
def terminate_kickstart_cluster(context, timeout=30, **kwargs):
    console_printer.print_info("Executing 'terminate_kickstart_adg_cluster' fixture")
    if (
        "kickstart_adg_cluster_id" in context
        and context.kickstart_adg_cluster_id is not None
    ):
        try:
            aws_helper.terminate_emr_cluster(context.kickstart_adg_cluster_id)
        except ClientError as error:
            console_printer.print_warning_text(
                f"Error occurred when terminating kickstart cluster with id of '{context.kickstart_adg_cluster_id}': '{error}'"
            )
    else:
        console_printer.print_info(
            "No cluster id found for kickstart ADG so not terminating any cluster"
        )

@fixture
def terminate_mongo_latest_cluster(context, timeout=30, **kwargs):
    console_printer.print_info("Executing 'terminate_mongo_latest_cluster' fixture")
    if (
        "mongo_latest_cluster_id" in context
        and context.mongo_latest_cluster_id is not None
    ):
        try:
            aws_helper.terminate_emr_cluster(context.mongo_latest_cluster_id)
        except ClientError as error:
            console_printer.print_warning_text(
                f"Error occurred when terminating mongo latest cluster with id of '{context.mongo_latest_cluster_id}': '{error}'"
            )
    else:
        console_printer.print_info(
            "No cluster id found for mongo latest so not terminating any cluster"
        )

@fixture
def terminate_ingest_replica_cluster(context):
    console_printer.print_info("Executing 'terminate_ingest_replica_cluster' fixture")
    if (
        "ingest_replica_emr_cluster_id" in context
        and context.ingest_replica_emr_cluster_id is not None
    ):
        try:
            aws_helper.terminate_emr_cluster(context.ingest_replica_emr_cluster_id)
        except Exception as e:
            console_printer.print_warning_text(
                f"Unable to terminate cluster due to error: {e}"
            )
    else:
        console_printer.print_warning_text(
            "No ingest-replica cluster identified to terminate"
        )
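The six `terminate_*` fixtures above differ only in the context attribute they read and the label they log, so the pattern could be factored into a factory. This is a sketch under the assumption that `aws_helper.terminate_emr_cluster` and the `console_printer` helpers behave as used above; stub implementations stand in for them here so the example is self-contained:

```python
# Stubs standing in for the real helpers so the sketch runs stand-alone.
class _Printer:
    def __init__(self):
        self.messages = []
    def print_info(self, msg):
        self.messages.append(msg)
    print_warning_text = print_info

console_printer = _Printer()
terminated = []

def terminate_emr_cluster(cluster_id):
    # stand-in for aws_helper.terminate_emr_cluster
    terminated.append(cluster_id)

def make_terminate_fixture(attr, label):
    # returns a fixture body parameterized by context attribute and log label
    def fixture_body(context, timeout=30, **kwargs):
        cluster_id = getattr(context, attr, None)
        if cluster_id is None:
            console_printer.print_info(
                f"No cluster id found for {label} so not terminating any cluster"
            )
            return
        try:
            terminate_emr_cluster(cluster_id)
        except Exception as error:
            console_printer.print_warning_text(
                f"Error occurred when terminating {label} cluster "
                f"with id of '{cluster_id}': '{error}'"
            )
    return fixture_body

class Ctx:
    adg_cluster_id = "j-123"

make_terminate_fixture("adg_cluster_id", "ADG")(Ctx())
print(terminated)  # → ['j-123']
```

The trade-off is that behave's `@fixture` registration and the `"attr" in context` idiom would still need wiring per fixture, which may be why the original keeps them spelled out.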
# ---
# hexsha: 1311539a03deed286033435fe7cf044124e57cc0 | size: 29,170 | lang: Python
# path: IfxPy/tests/test_000_PrepareDb.py | repo: deokershesh/IfxPy | license: Apache-2.0
# ---
#
#
#
import unittest, sys, os
import IfxPy
# need to add this line below to each file to make the connect parameters available to all the test files
import config
from testfunctions import IfxPyTestFunctions

name = 'name'
picture = 'picture'

class IfxPyTestCase(unittest.TestCase):

    def test_000_PrepareDb(self):
        obj = IfxPyTestFunctions()
        obj.assert_expect(self.run_test_000)

    def run_test_000(self):
        # Make a connection
        conn = IfxPy.connect(config.ConnStr, config.user, config.password)

        # Get the server type
        server = IfxPy.server_info(conn)

        # Drop the animal table, in case it exists
        drop = 'DROP TABLE animals'
        try:
            result = IfxPy.exec_immediate(conn, drop)
        except:
            pass

        # Create the animal table
        create = 'CREATE TABLE animals (id INTEGER, breed VARCHAR(32), name CHAR(16), weight DECIMAL(7,2))'
        result = IfxPy.exec_immediate(conn, create)

        # Populate the animal table
        animals = (
            (0, 'cat', 'Pook', 3.2),
            (1, 'dog', 'Peaches', 12.3),
            (2, 'horse', 'Smarty', 350.0),
            (3, 'gold fish', 'Bubbles', 0.1),
            (4, 'budgerigar', 'Gizmo', 0.2),
            (5, 'goat', 'Rickety Ride', 9.7),
            (6, 'llama', 'Sweater', 150)
        )
        insert = 'INSERT INTO animals (id, breed, name, weight) VALUES (?, ?, ?, ?)'
        stmt = IfxPy.prepare(conn, insert)
        if stmt:
            for animal in animals:
                result = IfxPy.execute(stmt, animal)

        # Drop the test view, in case it exists
        drop = 'DROP VIEW anime_cat'
        try:
            result = IfxPy.exec_immediate(conn, drop)
        except:
            pass

        # Create the test view
        IfxPy.exec_immediate(conn, """CREATE VIEW anime_cat AS
            SELECT name, breed FROM animals
            WHERE id = 0""")

        # Drop the animal_pics table, in case it exists
        drop = 'DROP TABLE animal_pics'
        try:
            result = IfxPy.exec_immediate(conn, drop)
        except:
            pass

        # Create the animal_pics table
        create = 'CREATE TABLE animal_pics (name VARCHAR(32), picture BLOB)'
        result = IfxPy.exec_immediate(conn, create)

        # Populate the animal_pics table
        animals = (
            ('Spook', 'spook.png'),
            ('Helmut', 'pic1.jpg'),
        )
        insert = 'INSERT INTO animal_pics (name, picture) VALUES (?, ?)'
        stmt = IfxPy.prepare(conn, insert)
        if not stmt:
            print("Attempt to prepare statement failed.")
            return 0
        for animal in animals:
            name = animal[0]
            fileHandle = open(os.path.dirname(os.path.abspath(__file__)) + '/' + animal[1], 'rb')
            picture = fileHandle.read()
            if not picture:
                print("Could not retrieve picture '%s'." % animal[1])
                return 0
            IfxPy.bind_param(stmt, 1, name, IfxPy.SQL_PARAM_INPUT)
            IfxPy.bind_param(stmt, 2, picture, IfxPy.SQL_PARAM_INPUT)
            # result = IfxPy.execute(stmt)

        # Drop the department table, in case it exists
        drop = 'DROP TABLE department'
        try:
            result = IfxPy.exec_immediate(conn, drop)
        except:
            pass

        # Create the department table
        create = 'CREATE TABLE department (deptno CHAR(3) NOT NULL, deptname VARCHAR(29) NOT NULL, mgrno CHAR(6), admrdept CHAR(3) NOT NULL, location CHAR(16))'
        result = IfxPy.exec_immediate(conn, create)

        # Populate the department table
        department = (
            ('A00', 'SPIFFY COMPUTER SERVICE DIV.', '000010', 'A00', None),
            ('B01', 'PLANNING', '000020', 'A00', None),
            ('C01', 'INFORMATION CENTER', '000030', 'A00', None),
            ('D01', 'DEVELOPMENT CENTER', None, 'A00', None),
            ('D11', 'MANUFACTURING SYSTEMS', '000060', 'D01', None),
            ('D21', 'ADMINISTRATION SYSTEMS', '000070', 'D01', None),
            ('E01', 'SUPPORT SERVICES', '000050', 'A00', None),
            ('E11', 'OPERATIONS', '000090', 'E01', None),
            ('E21', 'SOFTWARE SUPPORT', '000100', 'E01', None)
        )
        insert = 'INSERT INTO department (deptno, deptname, mgrno, admrdept, location) VALUES (?, ?, ?, ?, ?)'
        stmt = IfxPy.prepare(conn, insert)
        if stmt:
            for dept in department:
                result = IfxPy.execute(stmt, dept)

        # Drop the emp_act table, in case it exists
        drop = 'DROP TABLE emp_act'
        try:
            result = IfxPy.exec_immediate(conn, drop)
        except:
            pass

        # Create the emp_act table
        create = 'CREATE TABLE emp_act (empno CHAR(6) NOT NULL, projno CHAR(6) NOT NULL, actno SMALLINT NOT NULL, emptime DECIMAL(5,2), emstdate DATE, emendate DATE)'
        result = IfxPy.exec_immediate(conn, create)

        # Populate the emp_act table
        emp_act = (
            ('000010', 'MA2100', 10, 0.50, '1982-01-01', '1982-11-01'),
            ('000010', 'MA2110', 10, 1.00, '1982-01-01', '1983-02-01'),
            ('000010', 'AD3100', 10, 0.50, '1982-01-01', '1982-07-01'),
            ('000020', 'PL2100', 30, 1.00, '1982-01-01', '1982-09-15'),
            ('000030', 'IF1000', 10, 0.50, '1982-06-01', '1983-01-01'),
            ('000030', 'IF2000', 10, 0.50, '1982-01-01', '1983-01-01'),
            ('000050', 'OP1000', 10, 0.25, '1982-01-01', '1983-02-01'),
            ('000050', 'OP2010', 10, 0.75, '1982-01-01', '1983-02-01'),
            ('000070', 'AD3110', 10, 1.00, '1982-01-01', '1983-02-01'),
            ('000090', 'OP1010', 10, 1.00, '1982-01-01', '1983-02-01'),
            ('000100', 'OP2010', 10, 1.00, '1982-01-01', '1983-02-01'),
            ('000110', 'MA2100', 20, 1.00, '1982-01-01', '1982-03-01'),
            ('000130', 'IF1000', 90, 1.00, '1982-01-01', '1982-10-01'),
            ('000130', 'IF1000', 100, 0.50, '1982-10-01', '1983-01-01'),
            ('000140', 'IF1000', 90, 0.50, '1982-10-01', '1983-01-01'),
            ('000140', 'IF2000', 100, 1.00, '1982-01-01', '1982-03-01'),
            ('000140', 'IF2000', 100, 0.50, '1982-03-01', '1982-07-01'),
            ('000140', 'IF2000', 110, 0.50, '1982-03-01', '1982-07-01'),
            ('000140', 'IF2000', 110, 0.50, '1982-10-01', '1983-01-01'),
            ('000150', 'MA2112', 60, 1.00, '1982-01-01', '1982-07-15'),
            ('000150', 'MA2112', 180, 1.00, '1982-07-15', '1983-02-01'),
            ('000160', 'MA2113', 60, 1.00, '1982-07-15', '1983-02-01'),
            ('000170', 'MA2112', 60, 1.00, '1982-01-01', '1983-06-01'),
            ('000170', 'MA2112', 70, 1.00, '1982-06-01', '1983-02-01'),
            ('000170', 'MA2113', 80, 1.00, '1982-01-01', '1983-02-01'),
            ('000180', 'MA2113', 70, 1.00, '1982-04-01', '1982-06-15'),
            ('000190', 'MA2112', 70, 1.00, '1982-02-01', '1982-10-01'),
            ('000190', 'MA2112', 80, 1.00, '1982-10-01', '1983-10-01'),
            ('000200', 'MA2111', 50, 1.00, '1982-01-01', '1982-06-15'),
            ('000200', 'MA2111', 60, 1.00, '1982-06-15', '1983-02-01'),
            ('000210', 'MA2113', 80, 0.50, '1982-10-01', '1983-02-01'),
            ('000210', 'MA2113', 180, 0.50, '1982-10-01', '1983-02-01'),
            ('000220', 'MA2111', 40, 1.00, '1982-01-01', '1983-02-01'),
            ('000230', 'AD3111', 60, 1.00, '1982-01-01', '1982-03-15'),
            ('000230', 'AD3111', 60, 0.50, '1982-03-15', '1982-04-15'),
            ('000230', 'AD3111', 70, 0.50, '1982-03-15', '1982-10-15'),
            ('000230', 'AD3111', 80, 0.50, '1982-04-15', '1982-10-15'),
            ('000230', 'AD3111', 180, 1.00, '1982-10-15', '1983-01-01'),
            ('000240', 'AD3111', 70, 1.00, '1982-02-15', '1982-09-15'),
            ('000240', 'AD3111', 80, 1.00, '1982-09-15', '1983-01-01'),
            ('000250', 'AD3112', 60, 1.00, '1982-01-01', '1982-02-01'),
            ('000250', 'AD3112', 60, 0.50, '1982-02-01', '1982-03-15'),
            ('000250', 'AD3112', 60, 0.50, '1982-12-01', '1983-01-01'),
            ('000250', 'AD3112', 60, 1.00, '1983-01-01', '1983-02-01'),
            ('000250', 'AD3112', 70, 0.50, '1982-02-01', '1982-03-15'),
            ('000250', 'AD3112', 70, 1.00, '1982-03-15', '1982-08-15'),
            ('000250', 'AD3112', 70, 0.25, '1982-08-15', '1982-10-15'),
            ('000250', 'AD3112', 80, 0.25, '1982-08-15', '1982-10-15'),
            ('000250', 'AD3112', 80, 0.50, '1982-10-15', '1982-12-01'),
            ('000250', 'AD3112', 180, 0.50, '1982-08-15', '1983-01-01'),
            ('000260', 'AD3113', 70, 0.50, '1982-06-15', '1982-07-01'),
            ('000260', 'AD3113', 70, 1.00, '1982-07-01', '1983-02-01'),
            ('000260', 'AD3113', 80, 1.00, '1982-01-01', '1982-03-01'),
            ('000260', 'AD3113', 80, 0.50, '1982-03-01', '1982-04-15'),
            ('000260', 'AD3113', 180, 0.50, '1982-03-01', '1982-04-15'),
            ('000260', 'AD3113', 180, 1.00, '1982-04-15', '1982-06-01'),
            ('000260', 'AD3113', 180, 0.50, '1982-06-01', '1982-07-01'),
            ('000270', 'AD3113', 60, 0.50, '1982-03-01', '1982-04-01'),
            ('000270', 'AD3113', 60, 1.00, '1982-04-01', '1982-09-01'),
            ('000270', 'AD3113', 60, 0.25, '1982-09-01', '1982-10-15'),
            ('000270', 'AD3113', 70, 0.75, '1982-09-01', '1982-10-15'),
            ('000270', 'AD3113', 70, 1.00, '1982-10-15', '1983-02-01'),
            ('000270', 'AD3113', 80, 1.00, '1982-01-01', '1982-03-01'),
            ('000270', 'AD3113', 80, 0.50, '1982-03-01', '1982-04-01'),
            ('000280', 'OP1010', 130, 1.00, '1982-01-01', '1983-02-01'),
            ('000290', 'OP1010', 130, 1.00, '1982-01-01', '1983-02-01'),
            ('000300', 'OP1010', 130, 1.00, '1982-01-01', '1983-02-01'),
            ('000310', 'OP1010', 130, 1.00, '1982-01-01', '1983-02-01'),
            ('000320', 'OP2011', 140, 0.75, '1982-01-01', '1983-02-01'),
            ('000320', 'OP2011', 150, 0.25, '1982-01-01', '1983-02-01'),
            ('000330', 'OP2012', 140, 0.25, '1982-01-01', '1983-02-01'),
            ('000330', 'OP2012', 160, 0.75, '1982-01-01', '1983-02-01'),
            ('000340', 'OP2013', 140, 0.50, '1982-01-01', '1983-02-01'),
            ('000340', 'OP2013', 170, 0.50, '1982-01-01', '1983-02-01'),
            ('000020', 'PL2100', 30, 1.00, '1982-01-01', '1982-09-15')
        )
        insert = 'INSERT INTO emp_act (empno, projno, actno, emptime, emstdate, emendate) VALUES (?, ?, ?, ?, ?, ?)'
        stmt = IfxPy.prepare(conn, insert)
        if stmt:
            for emp in emp_act:
                result = IfxPy.execute(stmt, emp)

        # Drop the employee table, in case it exists
        drop = 'DROP TABLE employee'
        try:
            result = IfxPy.exec_immediate(conn, drop)
        except:
            pass

        # Create the employee table
        create = 'CREATE TABLE employee (empno CHAR(6) NOT NULL, firstnme VARCHAR(12) NOT NULL, midinit CHAR(1) NOT NULL, lastname VARCHAR(15) NOT NULL, workdept CHAR(3), phoneno CHAR(4), hiredate DATE, job CHAR(8), edlevel SMALLINT NOT NULL, sex CHAR(1), birthdate DATE, salary DECIMAL(9,2), bonus DECIMAL(9,2), comm DECIMAL(9,2))'
        result = IfxPy.exec_immediate(conn, create)

        # Populate the employee table
        employee = (
            ('000010', 'CHRISTINE', 'I', 'HAAS', 'A00', '3978', '1965-01-01', 'PRES', 18, 'F', '1933-08-24', 52750.00, 1000, 4220),
            ('000020', 'MICHAEL', 'L', 'THOMPSON', 'B01', '3476', '1973-10-10', 'MANAGER', 18, 'M', '1948-02-02', 41250.00, 800, 3300),
            ('000030', 'SALLY', 'A', 'KWAN', 'C01', '4738', '1975-04-05', 'MANAGER', 20, 'F', '1941-05-11', 38250.00, 800, 3060),
            ('000050', 'JOHN', 'B', 'GEYER', 'E01', '6789', '1949-08-17', 'MANAGER', 16, 'M', '1925-09-15', 40175.00, 800, 3214),
            ('000060', 'IRVING', 'F', 'STERN', 'D11', '6423', '1973-09-14', 'MANAGER', 16, 'M', '1945-07-07', 32250.00, 500, 2580),
            ('000070', 'EVA', 'D', 'PULASKI', 'D21', '7831', '1980-09-30', 'MANAGER', 16, 'F', '1953-05-26', 36170.00, 700, 2893),
            ('000090', 'EILEEN', 'W', 'HENDERSON', 'E11', '5498', '1970-08-15', 'MANAGER', 16, 'F', '1941-05-15', 29750.00, 600, 2380),
            ('000100', 'THEODORE', 'Q', 'SPENSER', 'E21', '0972', '1980-06-19', 'MANAGER', 14, 'M', '1956-12-18', 26150.00, 500, 2092),
            ('000110', 'VINCENZO', 'G', 'LUCCHESSI', 'A00', '3490', '1958-05-16', 'SALESREP', 19, 'M', '1929-11-05', 46500.00, 900, 3720),
            ('000120', 'SEAN', '', 'OCONNELL', 'A00', '2167', '1963-12-05', 'CLERK', 14, 'M', '1942-10-18', 29250.00, 600, 2340),
            ('000130', 'DOLORES', 'M', 'QUINTANA', 'C01', '4578', '1971-07-28', 'ANALYST', 16, 'F', '1925-09-15', 23800.00, 500, 1904),
            ('000140', 'HEATHER', 'A', 'NICHOLLS', 'C01', '1793', '1976-12-15', 'ANALYST', 18, 'F', '1946-01-19', 28420.00, 600, 2274),
            ('000150', 'BRUCE', '', 'ADAMSON', 'D11', '4510', '1972-02-12', 'DESIGNER', 16, 'M', '1947-05-17', 25280.00, 500, 2022),
            ('000160', 'ELIZABETH', 'R', 'PIANKA', 'D11', '3782', '1977-10-11', 'DESIGNER', 17, 'F', '1955-04-12', 22250.00, 400, 1780),
            ('000170', 'MASATOSHI', 'J', 'YOSHIMURA', 'D11', '2890', '1978-09-15', 'DESIGNER', 16, 'M', '1951-01-05', 24680.00, 500, 1974),
            ('000180', 'MARILYN', 'S', 'SCOUTTEN', 'D11', '1682', '1973-07-07', 'DESIGNER', 17, 'F', '1949-02-21', 21340.00, 500, 1707),
            ('000190', 'JAMES', 'H', 'WALKER', 'D11', '2986', '1974-07-26', 'DESIGNER', 16, 'M', '1952-06-25', 20450.00, 400, 1636),
            ('000200', 'DAVID', '', 'BROWN', 'D11', '4501', '1966-03-03', 'DESIGNER', 16, 'M', '1941-05-29', 27740.00, 600, 2217),
            ('000210', 'WILLIAM', 'T', 'JONES', 'D11', '0942', '1979-04-11', 'DESIGNER', 17, 'M', '1953-02-23', 18270.00, 400, 1462),
            ('000220', 'JENNIFER', 'K', 'LUTZ', 'D11', '0672', '1968-08-29', 'DESIGNER', 18, 'F', '1948-03-19', 29840.00, 600, 2387),
            ('000230', 'JAMES', 'J', 'JEFFERSON', 'D21', '2094', '1966-11-21', 'CLERK', 14, 'M', '1935-05-30', 22180.00, 400, 1774),
            ('000240', 'SALVATORE', 'M', 'MARINO', 'D21', '3780', '1979-12-05', 'CLERK', 17, 'M', '1954-03-31', 28760.00, 600, 2301),
            ('000250', 'DANIEL', 'S', 'SMITH', 'D21', '0961', '1969-10-30', 'CLERK', 15, 'M', '1939-11-12', 19180.00, 400, 1534),
            ('000260', 'SYBIL', 'P', 'JOHNSON', 'D21', '8953', '1975-09-11', 'CLERK', 16, 'F', '1936-10-05', 17250.00, 300, 1380),
            ('000270', 'MARIA', 'L', 'PEREZ', 'D21', '9001', '1980-09-30', 'CLERK', 15, 'F', '1953-05-26', 27380.00, 500, 2190),
            ('000280', 'ETHEL', 'R', 'SCHNEIDER', 'E11', '8997', '1967-03-24', 'OPERATOR', 17, 'F', '1936-03-28', 26250.00, 500, 2100),
            ('000290', 'JOHN', 'R', 'PARKER', 'E11', '4502', '1980-05-30', 'OPERATOR', 12, 'M', '1946-07-09', 15340.00, 300, 1227),
            ('000300', 'PHILIP', 'X', 'SMITH', 'E11', '2095', '1972-06-19', 'OPERATOR', 14, 'M', '1936-10-27', 17750.00, 400, 1420),
            ('000310', 'MAUDE', 'F', 'SETRIGHT', 'E11', '3332', '1964-09-12', 'OPERATOR', 12, 'F', '1931-04-21', 15900.00, 300, 1272),
            ('000320', 'RAMLAL', 'V', 'MEHTA', 'E21', '9990', '1965-07-07', 'FIELDREP', 16, 'M', '1932-08-11', 19950.00, 400, 1596),
            ('000330', 'WING', '', 'LEE', 'E21', '2103', '1976-02-23', 'FIELDREP', 14, 'M', '1941-07-18', 25370.00, 500, 2030),
            ('000340', 'JASON', 'R', 'GOUNOT', 'E21', '5698', '1947-05-05', 'FIELDREP', 16, 'M', '1926-05-17', 23840.00, 500, 1907)
        )
        insert = 'INSERT INTO employee (empno, firstnme, midinit, lastname, workdept, phoneno, hiredate, job, edlevel, sex, birthdate, salary, bonus, comm) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)'
        stmt = IfxPy.prepare(conn, insert)
        if stmt:
            for emp in employee:
                result = IfxPy.execute(stmt, emp)
# Drop the emp_photo table, in case it exists
drop = 'DROP TABLE emp_photo'
try:
result = IfxPy.exec_immediate(conn, drop)
except:
pass
# Create the emp_photo table
create = 'CREATE TABLE emp_photo (empno CHAR(6) NOT NULL, photo_format VARCHAR(10) NOT NULL, picture BLOB, PRIMARY KEY(empno, photo_format))'
try:
result = IfxPy.exec_immediate(conn, create)
except:
pass
# Populate the emp_photo table
emp_photo = (\
('000130', 'jpg', 'pic1.jpg'),\
('000130', 'png', 'spook.png'),\
('000140', 'jpg', 'pic1.jpg'),\
('000140', 'png', 'spook.png'),\
('000150', 'jpg', 'pic1.jpg'),\
('000150', 'png', 'spook.png'),\
('000190', 'jpg', 'pic1.jpg'),\
('000190', 'png', 'spook.png')\
)
insert = 'INSERT INTO emp_photo (empno, photo_format, picture) VALUES (?, ?, ?)'
stmt = IfxPy.prepare(conn, insert)
if stmt:
    for photo in emp_photo:
        empno = photo[0]
        photo_format = photo[1]
        # Read the image file that sits alongside this script as binary data
        fileHandler = open(os.path.dirname(os.path.abspath(__file__)) + '/' + photo[2], 'rb')
        picture = fileHandler.read()
        fileHandler.close()
        IfxPy.bind_param(stmt, 1, empno, IfxPy.SQL_PARAM_INPUT)
        IfxPy.bind_param(stmt, 2, photo_format, IfxPy.SQL_PARAM_INPUT)
        IfxPy.bind_param(stmt, 3, picture, IfxPy.SQL_PARAM_INPUT)
        # result = IfxPy.execute(stmt)
# Drop the org table, in case it exists
drop = 'DROP TABLE org'
try:
    result = IfxPy.exec_immediate(conn, drop)
except:
    pass
# Create the org table
create = 'CREATE TABLE org (deptnumb SMALLINT NOT NULL, deptname VARCHAR(14), manager SMALLINT, division VARCHAR(10), location VARCHAR(13))'
result = IfxPy.exec_immediate(conn, create)
# Populate the org table
org = (\
(10, 'Head Office', 160, 'Corporate', 'New York'),\
(15, 'New England', 50, 'Eastern', 'Boston'),\
(20, 'Mid Atlantic', 10, 'Eastern', 'Washington'),\
(38, 'South Atlantic', 30, 'Eastern', 'Atlanta'),\
(42, 'Great Lakes', 100, 'Midwest', 'Chicago'),\
(51, 'Plains', 140, 'Midwest', 'Dallas'),\
(66, 'Pacific', 270, 'Western', 'San Francisco'),\
(84, 'Mountain', 290, 'Western', 'Denver')\
)
insert = 'INSERT INTO org (deptnumb, deptname, manager, division, location) VALUES (?, ?, ?, ?, ?)'
stmt = IfxPy.prepare(conn, insert)
if stmt:
    for orgpart in org:
        result = IfxPy.execute(stmt, orgpart)
# Drop the project table, in case it exists
drop = 'DROP TABLE project'
try:
    result = IfxPy.exec_immediate(conn, drop)
except:
    pass
# Create the project table
create = 'CREATE TABLE project (projno CHAR(6) NOT NULL, projname VARCHAR(24) NOT NULL, deptno CHAR(3) NOT NULL, respemp CHAR(6) NOT NULL, prstaff DECIMAL(5,2), prstdate DATE, prendate DATE, majproj CHAR(6))'
result = IfxPy.exec_immediate(conn, create)
# Populate the project table
project = (\
('AD3100', 'ADMIN SERVICES', 'D01', '000010', 6.5, '1982-01-01', '1983-02-01', ''),\
('AD3110', 'GENERAL ADMIN SYSTEMS', 'D21', '000070', 6, '1982-01-01', '1983-02-01', 'AD3100'),\
('AD3111', 'PAYROLL PROGRAMMING', 'D21', '000230', 2, '1982-01-01', '1983-02-01', 'AD3110'),\
('AD3112', 'PERSONNEL PROGRAMMING', 'D21', '000250', 1, '1982-01-01', '1983-02-01', 'AD3110'),\
('AD3113', 'ACCOUNT PROGRAMMING', 'D21', '000270', 2, '1982-01-01', '1983-02-01', 'AD3110'),\
('IF1000', 'QUERY SERVICES', 'C01', '000030', 2, '1982-01-01', '1983-02-01', None),\
('IF2000', 'USER EDUCATION', 'C01', '000030', 1, '1982-01-01', '1983-02-01', None),\
('MA2100', 'WELD LINE AUTOMATION', 'D01', '000010', 12, '1982-01-01', '1983-02-01', None),\
('MA2110', 'W L PROGRAMMING', 'D11', '000060', 9, '1982-01-01', '1983-02-01', 'MA2100'),\
('MA2111', 'W L PROGRAM DESIGN', 'D11', '000220', 2, '1982-01-01', '1982-12-01', 'MA2110'),\
('MA2112', 'W L ROBOT DESIGN', 'D11', '000150', 3, '1982-01-01', '1982-12-01', 'MA2110'),\
('MA2113', 'W L PROD CONT PROGS', 'D11', '000160', 3, '1982-02-15', '1982-12-01', 'MA2110'),\
('OP1000', 'OPERATION SUPPORT', 'E01', '000050', 6, '1982-01-01', '1983-02-01', None),\
('OP1010', 'OPERATION', 'E11', '000090', 5, '1982-01-01', '1983-02-01', 'OP1000'),\
('OP2000', 'GEN SYSTEMS SERVICES', 'E01', '000050', 5, '1982-01-01', '1983-02-01', None),\
('OP2010', 'SYSTEMS SUPPORT', 'E21', '000100', 4, '1982-01-01', '1983-02-01', 'OP2000'),\
('OP2011', 'SCP SYSTEMS SUPPORT', 'E21', '000320', 1, '1982-01-01', '1983-02-01', 'OP2010'),\
('OP2012', 'APPLICATIONS SUPPORT', 'E21', '000330', 1, '1982-01-01', '1983-02-01', 'OP2010'),\
('OP2013', 'DB/DC SUPPORT', 'E21', '000340', 1, '1982-01-01', '1983-02-01', 'OP2010'),\
('PL2100', 'WELD LINE PLANNING', 'B01', '000020', 1, '1982-01-01', '1982-09-15', 'MA2100')\
)
insert = 'INSERT INTO project (projno, projname, deptno, respemp, prstaff, prstdate, prendate, majproj) VALUES (?, ?, ?, ?, ?, ?, ?, ?)'
stmt = IfxPy.prepare(conn, insert)
if stmt:
    for proj in project:
        result = IfxPy.execute(stmt, proj)
# Drop the sales table, in case it exists
drop = 'DROP TABLE sales'
try:
    result = IfxPy.exec_immediate(conn, drop)
except:
    pass
# Create the sales table
create = 'CREATE TABLE sales (sales_date DATE, sales_person VARCHAR(15), region VARCHAR(15), sales INT)'
result = IfxPy.exec_immediate(conn, create)
# Populate the sales table
sales = (\
('1995-12-31', 'LUCCHESSI', 'Ontario-South', 1),\
('1995-12-31', 'LEE', 'Ontario-South', 3),\
('1995-12-31', 'LEE', 'Quebec', 1),\
('1995-12-31', 'LEE', 'Manitoba', 2),\
('1995-12-31', 'GOUNOT', 'Quebec', 1),\
('1996-03-29', 'LUCCHESSI', 'Ontario-South', 3),\
('1996-03-29', 'LUCCHESSI', 'Quebec', 1),\
('1996-03-29', 'LEE', 'Ontario-South', 2),\
('1996-03-29', 'LEE', 'Ontario-North', 2),\
('1996-03-29', 'LEE', 'Quebec', 3),\
('1996-03-29', 'LEE', 'Manitoba', 5),\
('1996-03-29', 'GOUNOT', 'Ontario-South', 3),\
('1996-03-29', 'GOUNOT', 'Quebec', 1),\
('1996-03-29', 'GOUNOT', 'Manitoba', 7),\
('1996-03-30', 'LUCCHESSI', 'Ontario-South', 1),\
('1996-03-30', 'LUCCHESSI', 'Quebec', 2),\
('1996-03-30', 'LUCCHESSI', 'Manitoba', 1),\
('1996-03-30', 'LEE', 'Ontario-South', 7),\
('1996-03-30', 'LEE', 'Ontario-North', 3),\
('1996-03-30', 'LEE', 'Quebec', 7),\
('1996-03-30', 'LEE', 'Manitoba', 4),\
('1996-03-30', 'GOUNOT', 'Ontario-South', 2),\
('1996-03-30', 'GOUNOT', 'Quebec', 18),\
('1996-03-30', 'GOUNOT', 'Manitoba', 1),\
('1996-03-31', 'LUCCHESSI', 'Manitoba', 1),\
('1996-03-31', 'LEE', 'Ontario-South', 14),\
('1996-03-31', 'LEE', 'Ontario-North', 3),\
('1996-03-31', 'LEE', 'Quebec', 7),\
('1996-03-31', 'LEE', 'Manitoba', 3),\
('1996-03-31', 'GOUNOT', 'Ontario-South', 2),\
('1996-03-31', 'GOUNOT', 'Quebec', 1),\
('1996-04-01', 'LUCCHESSI', 'Ontario-South', 3),\
('1996-04-01', 'LUCCHESSI', 'Manitoba', 1),\
('1996-04-01', 'LEE', 'Ontario-South', 8),\
('1996-04-01', 'LEE', 'Ontario-North', None),\
('1996-04-01', 'LEE', 'Quebec', 8),\
('1996-04-01', 'LEE', 'Manitoba', 9),\
('1996-04-01', 'GOUNOT', 'Ontario-South', 3),\
('1996-04-01', 'GOUNOT', 'Ontario-North', 1),\
('1996-04-01', 'GOUNOT', 'Quebec', 3),\
('1996-04-01', 'GOUNOT', 'Manitoba', 7)\
)
insert = 'INSERT INTO sales (sales_date, sales_person, region, sales) VALUES (?, ?, ?, ?)'
stmt = IfxPy.prepare(conn, insert)
if stmt:
    for sale in sales:
        result = IfxPy.execute(stmt, sale)
# Drop the stored procedure, in case it exists
drop = 'DROP PROCEDURE match_animal'
try:
    result = IfxPy.exec_immediate(conn, drop)
except:
    pass
# Create the stored procedure
if (server.DBMS_NAME[0:3] == 'Inf'):
    result = IfxPy.exec_immediate(conn, """
CREATE PROCEDURE match_animal(first_name VARCHAR(128), INOUT second_name VARCHAR(128), OUT animal_weight DOUBLE PRECISION )
DEFINE match_name INT;
LET match_name = 0;
FOREACH c1 FOR
SELECT COUNT(*) INTO match_name FROM animals
WHERE name IN (second_name)
IF (match_name > 0)
THEN LET second_name = 'TRUE';
END IF;
END FOREACH;
FOREACH c2 FOR
SELECT SUM(weight) INTO animal_weight FROM animals
WHERE name in (first_name, second_name)
END FOREACH;
END PROCEDURE;""")
else:
    result = IfxPy.exec_immediate(conn, """
CREATE PROCEDURE match_animal(IN first_name VARCHAR(128), INOUT second_name VARCHAR(128), OUT animal_weight DOUBLE)
DYNAMIC RESULT SETS 1
LANGUAGE SQL
BEGIN
DECLARE match_name INT DEFAULT 0;
DECLARE c1 CURSOR FOR
SELECT COUNT(*) FROM animals
WHERE name IN (second_name);
DECLARE c2 CURSOR FOR
SELECT SUM(weight) FROM animals
WHERE name in (first_name, second_name);
DECLARE c3 CURSOR WITH RETURN FOR
SELECT name, breed, weight FROM animals
WHERE name BETWEEN first_name AND second_name
ORDER BY name;
OPEN c1;
FETCH c1 INTO match_name;
IF (match_name > 0)
THEN SET second_name = 'TRUE';
END IF;
CLOSE c1;
OPEN c2;
FETCH c2 INTO animal_weight;
CLOSE c2;
OPEN c3;
END""")
result = None
# Drop the staff table, in case it exists
drop = 'DROP TABLE staff'
try:
    result = IfxPy.exec_immediate(conn, drop)
except:
    pass
# Create the staff table
create = 'CREATE TABLE staff (id SMALLINT NOT NULL, name VARCHAR(9), dept SMALLINT, job CHAR(5), years SMALLINT, salary DECIMAL(7,2), comm DECIMAL(7,2))'
result = IfxPy.exec_immediate(conn, create)
# Populate the staff table
staff = (\
(10, 'Sanders', 20, 'Mgr', 7, 18357.50, None),\
(20, 'Pernal', 20, 'Sales', 8, 18171.25, 612.45),\
(30, 'Marenghi', 38, 'Mgr', 5, 17506.75, None),\
(40, 'OBrien', 38, 'Sales', 6, 18006.00, 846.55),\
(50, 'Hanes', 15, 'Mgr', 10, 20659.80, None),\
(60, 'Quigley', 38, 'Sales', None, 16808.30, 650.25),\
(70, 'Rothman', 15, 'Sales', 7, 16502.83, 1152.00),\
(80, 'James', 20, 'Clerk', None, 13504.60, 128.20),\
(90, 'Koonitz', 42, 'Sales', 6, 18001.75, 1386.70),\
(100, 'Plotz', 42, 'Mgr' , 7, 18352.80, None),\
(110, 'Ngan', 15, 'Clerk', 5, 12508.20, 206.60),\
(120, 'Naughton', 38, 'Clerk', None, 12954.75, 180.00),\
(130, 'Yamaguchi', 42, 'Clerk', 6, 10505.90, 75.60),\
(140, 'Fraye', 51, 'Mgr' , 6, 21150.00, None),\
(150, 'Williams', 51, 'Sales', 6, 19456.50, 637.65),\
(160, 'Molinare', 10, 'Mgr' , 7, 22959.20, None),\
(170, 'Kermisch', 15, 'Clerk', 4, 12258.50, 110.10),\
(180, 'Abrahams', 38, 'Clerk', 3, 12009.75, 236.50),\
(190, 'Sneider', 20, 'Clerk', 8, 14252.75, 126.50),\
(200, 'Scoutten', 42, 'Clerk', None, 11508.60, 84.20),\
(210, 'Lu', 10, 'Mgr' , 10, 20010.00, None),\
(220, 'Smith', 51, 'Sales', 7, 17654.50, 992.80),\
(230, 'Lundquist', 51, 'Clerk', 3, 13369.80, 189.65),\
(240, 'Daniels', 10, 'Mgr' , 5, 19260.25, None),\
(250, 'Wheeler', 51, 'Clerk', 6, 14460.00, 513.30),\
(260, 'Jones', 10, 'Mgr' , 12, 21234.00, None),\
(270, 'Lea', 66, 'Mgr' , 9, 18555.50, None),\
(280, 'Wilson', 66, 'Sales', 9, 18674.50, 811.50),\
(290, 'Quill', 84, 'Mgr' , 10, 19818.00, None),\
(300, 'Davis', 84, 'Sales', 5, 15454.50, 806.10),\
(310, 'Graham', 66, 'Sales', 13, 21000.00, 200.30),\
(320, 'Gonzales', 66, 'Sales', 4, 16858.20, 844.00),\
(330, 'Burke', 66, 'Clerk', 1, 10988.00, 55.50),\
(340, 'Edwards', 84, 'Sales', 7, 17844.00, 1285.00),\
(350, 'Gafney', 84, 'Clerk', 5, 13030.50, 188.00)\
)
insert = 'INSERT INTO staff (id, name, dept, job, years, salary, comm) VALUES (?, ?, ?, ?, ?, ?, ?)'
stmt = IfxPy.prepare(conn, insert)
if stmt:
    for emp in staff:
        result = IfxPy.execute(stmt, emp)
try:
    result = IfxPy.exec_immediate(conn, 'DROP TABLE t_string')
except:
    pass
result = IfxPy.exec_immediate(conn, 'CREATE TABLE t_string(a INTEGER, b DOUBLE PRECISION, c VARCHAR(100))')
print("Preparation complete")
#__END__
#__LUW_EXPECTED__
#Preparation complete
#__ZOS_EXPECTED__
#Preparation complete
#__SYSTEMI_EXPECTED__
#Preparation complete
#__IDS_EXPECTED__
#Preparation complete
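The preparation script above repeats one idempotent pattern for every table: drop it inside a try/except in case it exists, recreate it, then bulk-insert rows through a single prepared statement with `?` placeholders. A minimal sketch of that same flow, with the standard library's `sqlite3` standing in for the IfxPy/Informix connection (the sqlite3 swap and the two sample rows are assumptions for portability):

```python
import sqlite3

# In-memory database stands in for the Informix connection.
conn = sqlite3.connect(":memory:")

# Idempotent setup: drop the table if it exists, then recreate it.
try:
    conn.execute("DROP TABLE org")
except sqlite3.OperationalError:
    pass
conn.execute("CREATE TABLE org (deptnumb INTEGER, deptname TEXT)")

# One prepared INSERT reused for every row, as in the IfxPy loop above.
rows = [(10, "Head Office"), (15, "New England")]
conn.executemany("INSERT INTO org (deptnumb, deptname) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM org").fetchone()[0]
print(count)  # 2
```

`executemany` plays the role of the explicit `prepare`/`execute` loop; both keep SQL text and data separate, so values never need quoting or escaping.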
# File: sermin/exceptions.py -- repo radiac/sermin, license BSD-3-Clause
"""
Sermin exceptions
"""
class RunError(Exception):
    """
    Sermin error during a blueprint run
    """
    pass


class ShellError(RunError):
    """
    Sermin shell command failed
    """
    pass
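Because `ShellError` subclasses `RunError`, a single handler for the base class also catches shell failures. A self-contained sketch (the classes are redefined here to mirror `sermin/exceptions.py`; `run_step` is a hypothetical caller):

```python
# Mirrors the hierarchy in sermin/exceptions.py so the sketch runs standalone.
class RunError(Exception):
    """Sermin error during a blueprint run"""
    pass


class ShellError(RunError):
    """Sermin shell command failed"""
    pass


def run_step():
    # Hypothetical blueprint step that fails at the shell level.
    raise ShellError("command exited non-zero")


try:
    run_step()
except RunError as exc:          # catches ShellError too, via inheritance
    caught = type(exc).__name__

print(caught)  # ShellError
```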
# File: sisu/__init__.py -- repo balouf/sisu, license BSD-3-Clause
"""Top-level package for Structured and Interactive Summarization."""
__author__ = """Mélanie Cambus, Marc-Olivier Buob, Fabien Mathieu"""
__email__ = 'fabien.mathieu@normalesup.org'
__version__ = '0.2.0'
# File: dynamic_menu/translation.py -- repo lessss4/oil-and-rope, license MIT
from modeltranslation.translator import TranslationOptions, register
from . import models
@register(models.DynamicMenu)
class DynamicMenuTranslationOptions(TranslationOptions):
    """
    Configures how model must be translated.
    """

    fields = ('name', 'description', )
# File: PreprocessData/all_class_files/LocationFeatureSpecification.py -- repo wkid-neu/Schema, license MIT
from PreprocessData.all_class_files.PropertyValue import PropertyValue
import global_data
class LocationFeatureSpecification(PropertyValue):
    def __init__(self, additionalType=None, alternateName=None, description=None, disambiguatingDescription=None, identifier=None, image=None, mainEntityOfPage=None, name=None, potentialAction=None, sameAs=None, url=None, maxValue=None, minValue=None, propertyID=None, unitCode=None, unitText=None, value=None, valueReference=None, hoursAvailable=None, validFrom=None, validThrough=None):
        PropertyValue.__init__(self, additionalType, alternateName, description, disambiguatingDescription, identifier, image, mainEntityOfPage, name, potentialAction, sameAs, url, maxValue, minValue, propertyID, unitCode, unitText, value, valueReference)
        self.hoursAvailable = hoursAvailable
        self.validFrom = validFrom
        self.validThrough = validThrough

    def set_hoursAvailable(self, hoursAvailable):
        self.hoursAvailable = hoursAvailable

    def get_hoursAvailable(self):
        return self.hoursAvailable

    def set_validFrom(self, validFrom):
        self.validFrom = validFrom

    def get_validFrom(self):
        return self.validFrom

    def set_validThrough(self, validThrough):
        self.validThrough = validThrough

    def get_validThrough(self):
        return self.validThrough

    def __setattr__(self, key, value_list):
        if type(value_list).__name__ == "NoneType" or key == "node_id":
            self.__dict__[key] = value_list
            return
        for value in value_list:
            str_value = type(value).__name__
            if str_value not in global_data.get_table()[key]:
                raise ValueError("Illegal type!")
        self.__dict__[key] = value_list
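The `__setattr__` override above intercepts every attribute assignment, checks the value's type name against an allow-table, and writes directly into `self.__dict__` to avoid re-triggering itself. A standalone sketch of the same validation idea (the `ALLOWED` table here is invented for illustration; the real class consults `global_data.get_table()`):

```python
# Invented allow-table: attribute name -> set of accepted type names.
ALLOWED = {'hoursAvailable': {'str'}, 'validFrom': {'str', 'NoneType'}}


class Validated:
    def __setattr__(self, key, value):
        if type(value).__name__ not in ALLOWED.get(key, set()):
            raise ValueError("Illegal type!")
        # Write through __dict__ so __setattr__ is not invoked again.
        self.__dict__[key] = value


obj = Validated()
obj.hoursAvailable = "Mo-Fr 09:00-17:00"   # 'str' is allowed -> accepted

try:
    obj.hoursAvailable = 42                # 'int' is not allowed -> rejected
except ValueError:
    rejected = True
```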
# File: newfile/sub/sub1.py -- repo sicilyChen/python, license Apache-2.0
import os
path = os.getcwd()
lst_files = os.walk(path)
for dirpath, dirname, filename in lst_files:
    print(dirpath)
    print(dirname)
    print(filename)
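`os.walk` yields one `(dirpath, dirnames, filenames)` tuple per directory, top-down — the second and third items are lists, not single names. A sketch of that on a throwaway tree instead of the current directory (the temp layout is invented for the demonstration):

```python
import os
import tempfile

# Build a small throwaway tree: root/sub/a.txt
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
open(os.path.join(root, "sub", "a.txt"), "w").close()

found = []
for dirpath, dirnames, filenames in os.walk(root):
    # dirnames/filenames are lists of entries in dirpath
    found.extend(filenames)

print(found)  # ['a.txt']
```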
# File: AUG16/02.py -- repo Razdeep/PythonSnippets, license MIT
# implicit conversion
num_int = 123
num_float = 1.23
result = num_int + num_float
print('datatype of num_int is', type(num_int))
print('datatype of num_float is', type(num_float))
print('datatype of result is', type(result))
# it automatically converts int to float to avoid data loss
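The implicit int-to-float promotion shown above has an explicit counterpart: `int()` and `float()` convert deliberately, and `int()` truncates the fractional part toward zero, so data can be lost on purpose:

```python
num_float = 1.23
num_int = int(num_float)    # explicit conversion: fractional part dropped
back = float(num_int)       # explicit conversion the other way

print(num_int)              # 1
print(back)                 # 1.0
print(type(123 + 1.23))     # <class 'float'> -- implicit promotion again
```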
# File: skimage/filter/edges.py -- repo jaberg/scikits-image, license BSD-3-Clause
"""edges.py - Sobel edge filter
Originally part of CellProfiler, code licensed under both GPL and BSD licenses.
Website: http://www.cellprofiler.org
Copyright (c) 2003-2009 Massachusetts Institute of Technology
Copyright (c) 2009-2011 Broad Institute
All rights reserved.
Original author: Lee Kamentsky
"""
import numpy as np
from skimage import img_as_float
from scipy.ndimage import convolve, binary_erosion, generate_binary_structure
EROSION_SELEM = generate_binary_structure(2, 2)
def _mask_filter_result(result, mask):
    """Return result after masking.

    Input masks are eroded so that mask areas in the original image don't
    affect values in the result.
    """
    if mask is None:
        result[0, :] = 0
        result[-1, :] = 0
        result[:, 0] = 0
        result[:, -1] = 0
        return result
    else:
        mask = binary_erosion(mask, EROSION_SELEM, border_value=0)
        return result * mask
def sobel(image, mask=None):
    """Calculate the absolute magnitude Sobel to find edges.

    Parameters
    ----------
    image : array_like, dtype=float
        Image to process.
    mask : array_like, dtype=bool, optional
        An optional mask to limit the application to a certain area.
        Note that pixels surrounding masked regions are also masked to
        prevent masked regions from affecting the result.

    Returns
    -------
    output : ndarray
        The Sobel edge map.

    Notes
    -----
    Take the square root of the sum of the squares of the horizontal and
    vertical Sobels to get a magnitude that's somewhat insensitive to
    direction.

    Note that ``scipy.ndimage.sobel`` returns a directional Sobel which
    has to be further processed to perform edge detection.
    """
    return np.sqrt(hsobel(image, mask)**2 + vsobel(image, mask)**2)
def hsobel(image, mask=None):
    """Find the horizontal edges of an image using the Sobel transform.

    Parameters
    ----------
    image : array_like, dtype=float
        Image to process.
    mask : array_like, dtype=bool, optional
        An optional mask to limit the application to a certain area.
        Note that pixels surrounding masked regions are also masked to
        prevent masked regions from affecting the result.

    Returns
    -------
    output : ndarray
        The Sobel edge map.

    Notes
    -----
    We use the following kernel and return the absolute value of the
    result at each point::

       1   2   1
       0   0   0
      -1  -2  -1

    """
    image = img_as_float(image)
    result = np.abs(convolve(image,
                             np.array([[ 1, 2, 1],
                                       [ 0, 0, 0],
                                       [-1,-2,-1]]).astype(float) / 4.0))
    return _mask_filter_result(result, mask)
def vsobel(image, mask=None):
    """Find the vertical edges of an image using the Sobel transform.

    Parameters
    ----------
    image : array_like, dtype=float
        Image to process
    mask : array_like, dtype=bool, optional
        An optional mask to limit the application to a certain area
        Note that pixels surrounding masked regions are also masked to
        prevent masked regions from affecting the result.

    Returns
    -------
    output : ndarray
        The Sobel edge map.

    Notes
    -----
    We use the following kernel and return the absolute value of the
    result at each point::

      1   0  -1
      2   0  -2
      1   0  -1

    """
    image = img_as_float(image)
    result = np.abs(convolve(image,
                             np.array([[1, 0, -1],
                                       [2, 0, -2],
                                       [1, 0, -1]]).astype(float) / 4.0))
    return _mask_filter_result(result, mask)
def prewitt(image, mask=None):
    """Find the edge magnitude using the Prewitt transform.

    Parameters
    ----------
    image : array_like, dtype=float
        Image to process.
    mask : array_like, dtype=bool, optional
        An optional mask to limit the application to a certain area.
        Note that pixels surrounding masked regions are also masked to
        prevent masked regions from affecting the result.

    Returns
    -------
    output : ndarray
        The Prewitt edge map.

    Notes
    -----
    Return the square root of the sum of squares of the horizontal
    and vertical Prewitt transforms.
    """
    return np.sqrt(hprewitt(image, mask)**2 + vprewitt(image, mask)**2)
def hprewitt(image, mask=None):
    """Find the horizontal edges of an image using the Prewitt transform.

    Parameters
    ----------
    image : array_like, dtype=float
        Image to process.
    mask : array_like, dtype=bool, optional
        An optional mask to limit the application to a certain area.
        Note that pixels surrounding masked regions are also masked to
        prevent masked regions from affecting the result.

    Returns
    -------
    output : ndarray
        The Prewitt edge map.

    Notes
    -----
    We use the following kernel and return the absolute value of the
    result at each point::

       1   1   1
       0   0   0
      -1  -1  -1

    """
    image = img_as_float(image)
    result = np.abs(convolve(image,
                             np.array([[ 1, 1, 1],
                                       [ 0, 0, 0],
                                       [-1,-1,-1]]).astype(float) / 3.0))
    return _mask_filter_result(result, mask)
def vprewitt(image, mask=None):
    """Find the vertical edges of an image using the Prewitt transform.

    Parameters
    ----------
    image : array_like, dtype=float
        Image to process.
    mask : array_like, dtype=bool, optional
        An optional mask to limit the application to a certain area.
        Note that pixels surrounding masked regions are also masked to
        prevent masked regions from affecting the result.

    Returns
    -------
    output : ndarray
        The Prewitt edge map.

    Notes
    -----
    We use the following kernel and return the absolute value of the
    result at each point::

      1   0  -1
      1   0  -1
      1   0  -1

    """
    image = img_as_float(image)
    result = np.abs(convolve(image,
                             np.array([[1, 0, -1],
                                       [1, 0, -1],
                                       [1, 0, -1]]).astype(float) / 3.0))
    return _mask_filter_result(result, mask)
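The kernels documented in `hsobel`/`vsobel` can be exercised without NumPy at all. This pure-Python sketch applies both 3x3 kernels at one interior pixel of a toy image and combines them the way `sobel` does (the toy image is invented; the `/4` normalization and masking steps are omitted):

```python
import math

# Horizontal and vertical Sobel kernels from the docstrings above.
KH = [[1, 2, 1], [0, 0, 0], [-1, -2, -1]]
KV = [[1, 0, -1], [2, 0, -2], [1, 0, -1]]


def apply_kernel(img, kernel, y, x):
    # Weighted sum of the 3x3 neighbourhood centred on (y, x).
    return sum(kernel[j][i] * img[y - 1 + j][x - 1 + i]
               for j in range(3) for i in range(3))


# Toy image: a horizontal step edge between row 1 and row 2.
img = [[0, 0, 0],
       [0, 0, 0],
       [1, 1, 1]]

h = apply_kernel(img, KH, 1, 1)      # responds to the horizontal edge
v = apply_kernel(img, KV, 1, 1)      # zero: there is no vertical edge
magnitude = math.sqrt(h**2 + v**2)   # combined as in sobel()
print(h, v, magnitude)               # -4 0 4.0
```

Taking the absolute value of `h` (as `hsobel` does with `np.abs`) makes the response independent of which side of the edge is bright.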
# File: libs/PureCloudPlatformClientV2/models/reporting_export_job_response.py -- repo rocketbot-cl/genesysCloud, license MIT
# coding: utf-8
"""
Copyright 2016 SmartBear Software
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Ref: https://github.com/swagger-api/swagger-codegen
"""
from pprint import pformat
from six import iteritems
import re
import json
from ..utils import sanitize_for_serialization
class ReportingExportJobResponse(object):
    """
    NOTE: This class is auto generated by the swagger code generator program.

    Do not edit the class manually.
    """
    def __init__(self):
        """
        ReportingExportJobResponse - a model defined in Swagger

        :param dict swaggerTypes: The key is attribute name
                                  and the value is attribute type.
        :param dict attributeMap: The key is attribute name
                                  and the value is json key in definition.
        """
        self.swagger_types = {
            'id': 'str',
            'name': 'str',
            'run_id': 'str',
            'status': 'str',
            'time_zone': 'str',
            'export_format': 'str',
            'interval': 'str',
            'download_url': 'str',
            'view_type': 'str',
            'export_error_messages_type': 'str',
            'period': 'str',
            'filter': 'ViewFilter',
            'read': 'bool',
            'created_date_time': 'datetime',
            'modified_date_time': 'datetime',
            'locale': 'str',
            'percentage_complete': 'float',
            'has_format_durations': 'bool',
            'has_split_filters': 'bool',
            'exclude_empty_rows': 'bool',
            'has_split_by_media': 'bool',
            'has_summary_row': 'bool',
            'csv_delimiter': 'str',
            'selected_columns': 'list[SelectedColumns]',
            'has_custom_participant_attributes': 'bool',
            'recipient_emails': 'list[str]',
            'email_statuses': 'dict(str, str)',
            'email_error_description': 'str',
            'enabled': 'bool',
            'self_uri': 'str'
        }

        self.attribute_map = {
            'id': 'id',
            'name': 'name',
            'run_id': 'runId',
            'status': 'status',
            'time_zone': 'timeZone',
            'export_format': 'exportFormat',
            'interval': 'interval',
            'download_url': 'downloadUrl',
            'view_type': 'viewType',
            'export_error_messages_type': 'exportErrorMessagesType',
            'period': 'period',
            'filter': 'filter',
            'read': 'read',
            'created_date_time': 'createdDateTime',
            'modified_date_time': 'modifiedDateTime',
            'locale': 'locale',
            'percentage_complete': 'percentageComplete',
            'has_format_durations': 'hasFormatDurations',
            'has_split_filters': 'hasSplitFilters',
            'exclude_empty_rows': 'excludeEmptyRows',
            'has_split_by_media': 'hasSplitByMedia',
            'has_summary_row': 'hasSummaryRow',
            'csv_delimiter': 'csvDelimiter',
            'selected_columns': 'selectedColumns',
            'has_custom_participant_attributes': 'hasCustomParticipantAttributes',
            'recipient_emails': 'recipientEmails',
            'email_statuses': 'emailStatuses',
            'email_error_description': 'emailErrorDescription',
            'enabled': 'enabled',
            'self_uri': 'selfUri'
        }

        self._id = None
        self._name = None
        self._run_id = None
        self._status = None
        self._time_zone = None
        self._export_format = None
        self._interval = None
        self._download_url = None
        self._view_type = None
        self._export_error_messages_type = None
        self._period = None
        self._filter = None
        self._read = None
        self._created_date_time = None
        self._modified_date_time = None
        self._locale = None
        self._percentage_complete = None
        self._has_format_durations = None
        self._has_split_filters = None
        self._exclude_empty_rows = None
        self._has_split_by_media = None
        self._has_summary_row = None
        self._csv_delimiter = None
        self._selected_columns = None
        self._has_custom_participant_attributes = None
        self._recipient_emails = None
        self._email_statuses = None
        self._email_error_description = None
        self._enabled = None
        self._self_uri = None
    @property
    def id(self):
        """
        Gets the id of this ReportingExportJobResponse.
        The globally unique identifier for the object.

        :return: The id of this ReportingExportJobResponse.
        :rtype: str
        """
        return self._id

    @id.setter
    def id(self, id):
        """
        Sets the id of this ReportingExportJobResponse.
        The globally unique identifier for the object.

        :param id: The id of this ReportingExportJobResponse.
        :type: str
        """
        self._id = id

    @property
    def name(self):
        """
        Gets the name of this ReportingExportJobResponse.

        :return: The name of this ReportingExportJobResponse.
        :rtype: str
        """
        return self._name

    @name.setter
    def name(self, name):
        """
        Sets the name of this ReportingExportJobResponse.

        :param name: The name of this ReportingExportJobResponse.
        :type: str
        """
        self._name = name
    @property
    def run_id(self):
        """
        Gets the run_id of this ReportingExportJobResponse.
        The unique run id of the export schedule execute

        :return: The run_id of this ReportingExportJobResponse.
        :rtype: str
        """
        return self._run_id

    @run_id.setter
    def run_id(self, run_id):
        """
        Sets the run_id of this ReportingExportJobResponse.
        The unique run id of the export schedule execute

        :param run_id: The run_id of this ReportingExportJobResponse.
        :type: str
        """
        self._run_id = run_id

    @property
    def status(self):
        """
        Gets the status of this ReportingExportJobResponse.
        The current status of the export request

        :return: The status of this ReportingExportJobResponse.
        :rtype: str
        """
        return self._status

    @status.setter
    def status(self, status):
        """
        Sets the status of this ReportingExportJobResponse.
        The current status of the export request

        :param status: The status of this ReportingExportJobResponse.
        :type: str
        """
        allowed_values = ["SUBMITTED", "RUNNING", "CANCELLING", "CANCELLED", "COMPLETED", "COMPLETED_WITH_PARTIAL_RESULTS", "FAILED"]
        if status.lower() not in map(str.lower, allowed_values):
            # print("Invalid value for status -> " + status)
            self._status = "outdated_sdk_version"
        else:
            self._status = status
@property
def time_zone(self):
"""
Gets the time_zone of this ReportingExportJobResponse.
The requested timezone of the exported data. Time zones are represented as a string of the zone name as found in the IANA time zone database. For example: UTC, Etc/UTC, or Europe/London
:return: The time_zone of this ReportingExportJobResponse.
:rtype: str
"""
return self._time_zone
@time_zone.setter
def time_zone(self, time_zone):
"""
Sets the time_zone of this ReportingExportJobResponse.
The requested timezone of the exported data. Time zones are represented as a string of the zone name as found in the IANA time zone database. For example: UTC, Etc/UTC, or Europe/London
:param time_zone: The time_zone of this ReportingExportJobResponse.
:type: str
"""
self._time_zone = time_zone
@property
def export_format(self):
"""
Gets the export_format of this ReportingExportJobResponse.
The requested format of the exported data
:return: The export_format of this ReportingExportJobResponse.
:rtype: str
"""
return self._export_format
@export_format.setter
def export_format(self, export_format):
"""
Sets the export_format of this ReportingExportJobResponse.
The requested format of the exported data
:param export_format: The export_format of this ReportingExportJobResponse.
:type: str
"""
allowed_values = ["CSV", "PDF"]
if export_format.lower() not in map(str.lower, allowed_values):
# print("Invalid value for export_format -> " + export_format)
self._export_format = "outdated_sdk_version"
else:
self._export_format = export_format
@property
def interval(self):
"""
Gets the interval of this ReportingExportJobResponse.
The time period used to limit the exported data. Intervals are represented as an ISO-8601 string. For example: YYYY-MM-DDThh:mm:ss/YYYY-MM-DDThh:mm:ss
:return: The interval of this ReportingExportJobResponse.
:rtype: str
"""
return self._interval
@interval.setter
def interval(self, interval):
"""
Sets the interval of this ReportingExportJobResponse.
The time period used to limit the exported data. Intervals are represented as an ISO-8601 string. For example: YYYY-MM-DDThh:mm:ss/YYYY-MM-DDThh:mm:ss
:param interval: The interval of this ReportingExportJobResponse.
:type: str
"""
self._interval = interval
@property
def download_url(self):
"""
Gets the download_url of this ReportingExportJobResponse.
The URL to download the request if its status is completed
:return: The download_url of this ReportingExportJobResponse.
:rtype: str
"""
return self._download_url
@download_url.setter
def download_url(self, download_url):
"""
Sets the download_url of this ReportingExportJobResponse.
The URL to download the request if its status is completed
:param download_url: The download_url of this ReportingExportJobResponse.
:type: str
"""
self._download_url = download_url
@property
def view_type(self):
"""
Gets the view_type of this ReportingExportJobResponse.
The type of view export job to be created
:return: The view_type of this ReportingExportJobResponse.
:rtype: str
"""
return self._view_type
@view_type.setter
def view_type(self, view_type):
"""
Sets the view_type of this ReportingExportJobResponse.
The type of view export job to be created
:param view_type: The view_type of this ReportingExportJobResponse.
:type: str
"""
allowed_values = ["QUEUE_PERFORMANCE_SUMMARY_VIEW", "QUEUE_PERFORMANCE_DETAIL_VIEW", "INTERACTION_SEARCH_VIEW", "AGENT_PERFORMANCE_SUMMARY_VIEW", "AGENT_PERFORMANCE_DETAIL_VIEW", "AGENT_STATUS_SUMMARY_VIEW", "AGENT_STATUS_DETAIL_VIEW", "AGENT_EVALUATION_SUMMARY_VIEW", "AGENT_EVALUATION_DETAIL_VIEW", "AGENT_QUEUE_DETAIL_VIEW", "AGENT_INTERACTION_DETAIL_VIEW", "ABANDON_INSIGHTS_VIEW", "SKILLS_PERFORMANCE_VIEW", "SURVEY_FORM_PERFORMANCE_SUMMARY_VIEW", "SURVEY_FORM_PERFORMANCE_DETAIL_VIEW", "DNIS_PERFORMANCE_SUMMARY_VIEW", "DNIS_PERFORMANCE_DETAIL_VIEW", "WRAP_UP_PERFORMANCE_SUMMARY_VIEW", "AGENT_WRAP_UP_PERFORMANCE_DETAIL_VIEW", "QUEUE_ACTIVITY_SUMMARY_VIEW", "QUEUE_ACTIVITY_DETAIL_VIEW", "AGENT_QUEUE_ACTIVITY_SUMMARY_VIEW", "QUEUE_AGENT_DETAIL_VIEW", "QUEUE_INTERACTION_DETAIL_VIEW", "AGENT_SCHEDULE_DETAIL_VIEW", "IVR_PERFORMANCE_SUMMARY_VIEW", "IVR_PERFORMANCE_DETAIL_VIEW", "ANSWER_INSIGHTS_VIEW", "HANDLE_INSIGHTS_VIEW", "TALK_INSIGHTS_VIEW", "HOLD_INSIGHTS_VIEW", "ACW_INSIGHTS_VIEW", "WAIT_INSIGHTS_VIEW", "AGENT_WRAP_UP_PERFORMANCE_INTERVAL_DETAIL_VIEW", "FLOW_OUTCOME_SUMMARY_VIEW", "FLOW_OUTCOME_PERFORMANCE_DETAIL_VIEW", "FLOW_OUTCOME_PERFORMANCE_INTERVAL_DETAIL_VIEW", "FLOW_DESTINATION_SUMMARY_VIEW", "FLOW_DESTINATION_DETAIL_VIEW", "API_USAGE_VIEW", "SCHEDULED_CALLBACKS_VIEW", "CONTENT_SEARCH_VIEW", "LANDING_PAGE", "DASHBOARD_SUMMARY", "DASHBOARD_DETAIL", "JOURNEY_ACTION_MAP_SUMMARY_VIEW", "JOURNEY_OUTCOME_SUMMARY_VIEW", "JOURNEY_SEGMENT_SUMMARY_VIEW", "AGENT_DEVELOPMENT_DETAIL_VIEW", "AGENT_DEVELOPMENT_DETAIL_ME_VIEW", "AGENT_DEVELOPMENT_SUMMARY_VIEW", "AGENT_PERFORMANCE_ME_VIEW", "AGENT_STATUS_ME_VIEW", "AGENT_EVALUATION_ME_VIEW", "AGENT_SCORECARD_VIEW", "AGENT_SCORECARD_ME_VIEW", "AGENT_GAMIFICATION_LEADERSHIP_VIEW"]
if view_type.lower() not in map(str.lower, allowed_values):
# print("Invalid value for view_type -> " + view_type)
self._view_type = "outdated_sdk_version"
else:
self._view_type = view_type
@property
def export_error_messages_type(self):
"""
Gets the export_error_messages_type of this ReportingExportJobResponse.
The error message in case the export request failed
:return: The export_error_messages_type of this ReportingExportJobResponse.
:rtype: str
"""
return self._export_error_messages_type
@export_error_messages_type.setter
def export_error_messages_type(self, export_error_messages_type):
"""
Sets the export_error_messages_type of this ReportingExportJobResponse.
The error message in case the export request failed
:param export_error_messages_type: The export_error_messages_type of this ReportingExportJobResponse.
:type: str
"""
allowed_values = ["FAILED_CONVERTING_EXPORT_JOB", "FAILED_NO_DATA_EXPORT_JOB_FOUND", "FAILED_GETTING_DATA_FROM_SERVICE", "FAILED_GENERATING_TEMP_FILE", "FAILED_SAVING_FILE_TO_S3", "FAILED_NOTIFYING_SKYWALKER_OF_DOWNLOAD", "FAILED_BUILDING_DOWNLOAD_URL_FROM_SKYWALKER_RESPONSE", "FAILED_CONVERTING_EXPORT_JOB_TO_QUEUE_PERFORMANCE_JOB", "EXPORT_TYPE_NOT_IMPLEMENTED", "REACHED_MAXIMUM_ATTEMPT_OF_RETRY", "FAILED_LONG_RUNNING_EXPORT", "TOO_MANY_REQUESTS_FROM_AN_ORGANIZATION", "FAILED_AS_EXPORT_FILE_SIZE_IS_GREATER_THAN_10MB", "NOT_AUTHORIZED_TO_VIEW_EXPORT"]
if export_error_messages_type.lower() not in map(str.lower, allowed_values):
# print("Invalid value for export_error_messages_type -> " + export_error_messages_type)
self._export_error_messages_type = "outdated_sdk_version"
else:
self._export_error_messages_type = export_error_messages_type
@property
def period(self):
"""
Gets the period of this ReportingExportJobResponse.
The Period of the request in which to break down the intervals. Periods are represented as an ISO-8601 string. For example: P1D or P1DT12H
:return: The period of this ReportingExportJobResponse.
:rtype: str
"""
return self._period
@period.setter
def period(self, period):
"""
Sets the period of this ReportingExportJobResponse.
The Period of the request in which to break down the intervals. Periods are represented as an ISO-8601 string. For example: P1D or P1DT12H
:param period: The period of this ReportingExportJobResponse.
:type: str
"""
self._period = period
@property
def filter(self):
"""
Gets the filter of this ReportingExportJobResponse.
Filters to apply to create the view
:return: The filter of this ReportingExportJobResponse.
:rtype: ViewFilter
"""
return self._filter
@filter.setter
def filter(self, filter):
"""
Sets the filter of this ReportingExportJobResponse.
Filters to apply to create the view
:param filter: The filter of this ReportingExportJobResponse.
:type: ViewFilter
"""
self._filter = filter
@property
def read(self):
"""
Gets the read of this ReportingExportJobResponse.
Indicates if the request has been marked as read
:return: The read of this ReportingExportJobResponse.
:rtype: bool
"""
return self._read
@read.setter
def read(self, read):
"""
Sets the read of this ReportingExportJobResponse.
Indicates if the request has been marked as read
:param read: The read of this ReportingExportJobResponse.
:type: bool
"""
self._read = read
@property
def created_date_time(self):
"""
Gets the created_date_time of this ReportingExportJobResponse.
The created date/time of the request. Date time is represented as an ISO-8601 string. For example: yyyy-MM-ddTHH:mm:ss[.mmm]Z
:return: The created_date_time of this ReportingExportJobResponse.
:rtype: datetime
"""
return self._created_date_time
@created_date_time.setter
def created_date_time(self, created_date_time):
"""
Sets the created_date_time of this ReportingExportJobResponse.
The created date/time of the request. Date time is represented as an ISO-8601 string. For example: yyyy-MM-ddTHH:mm:ss[.mmm]Z
:param created_date_time: The created_date_time of this ReportingExportJobResponse.
:type: datetime
"""
self._created_date_time = created_date_time
@property
def modified_date_time(self):
"""
Gets the modified_date_time of this ReportingExportJobResponse.
The last modified date/time of the request. Date time is represented as an ISO-8601 string. For example: yyyy-MM-ddTHH:mm:ss[.mmm]Z
:return: The modified_date_time of this ReportingExportJobResponse.
:rtype: datetime
"""
return self._modified_date_time
@modified_date_time.setter
def modified_date_time(self, modified_date_time):
"""
Sets the modified_date_time of this ReportingExportJobResponse.
The last modified date/time of the request. Date time is represented as an ISO-8601 string. For example: yyyy-MM-ddTHH:mm:ss[.mmm]Z
:param modified_date_time: The modified_date_time of this ReportingExportJobResponse.
:type: datetime
"""
self._modified_date_time = modified_date_time
@property
def locale(self):
"""
Gets the locale of this ReportingExportJobResponse.
The locale used for localization of the exported data, e.g. en-us, es-mx
:return: The locale of this ReportingExportJobResponse.
:rtype: str
"""
return self._locale
@locale.setter
def locale(self, locale):
"""
Sets the locale of this ReportingExportJobResponse.
The locale used for localization of the exported data, e.g. en-us, es-mx
:param locale: The locale of this ReportingExportJobResponse.
:type: str
"""
self._locale = locale
@property
def percentage_complete(self):
"""
Gets the percentage_complete of this ReportingExportJobResponse.
The percentage of the job that has completed processing
:return: The percentage_complete of this ReportingExportJobResponse.
:rtype: float
"""
return self._percentage_complete
@percentage_complete.setter
def percentage_complete(self, percentage_complete):
"""
Sets the percentage_complete of this ReportingExportJobResponse.
The percentage of the job that has completed processing
:param percentage_complete: The percentage_complete of this ReportingExportJobResponse.
:type: float
"""
self._percentage_complete = percentage_complete
@property
def has_format_durations(self):
"""
Gets the has_format_durations of this ReportingExportJobResponse.
Indicates if durations are formatted in hh:mm:ss format instead of ms
:return: The has_format_durations of this ReportingExportJobResponse.
:rtype: bool
"""
return self._has_format_durations
@has_format_durations.setter
def has_format_durations(self, has_format_durations):
"""
Sets the has_format_durations of this ReportingExportJobResponse.
Indicates if durations are formatted in hh:mm:ss format instead of ms
:param has_format_durations: The has_format_durations of this ReportingExportJobResponse.
:type: bool
"""
self._has_format_durations = has_format_durations
@property
def has_split_filters(self):
"""
Gets the has_split_filters of this ReportingExportJobResponse.
Indicates if filters will be split in aggregate detail exports
:return: The has_split_filters of this ReportingExportJobResponse.
:rtype: bool
"""
return self._has_split_filters
@has_split_filters.setter
def has_split_filters(self, has_split_filters):
"""
Sets the has_split_filters of this ReportingExportJobResponse.
Indicates if filters will be split in aggregate detail exports
:param has_split_filters: The has_split_filters of this ReportingExportJobResponse.
:type: bool
"""
self._has_split_filters = has_split_filters
@property
def exclude_empty_rows(self):
"""
Gets the exclude_empty_rows of this ReportingExportJobResponse.
Excludes empty rows from the exports
:return: The exclude_empty_rows of this ReportingExportJobResponse.
:rtype: bool
"""
return self._exclude_empty_rows
@exclude_empty_rows.setter
def exclude_empty_rows(self, exclude_empty_rows):
"""
Sets the exclude_empty_rows of this ReportingExportJobResponse.
Excludes empty rows from the exports
:param exclude_empty_rows: The exclude_empty_rows of this ReportingExportJobResponse.
:type: bool
"""
self._exclude_empty_rows = exclude_empty_rows
@property
def has_split_by_media(self):
"""
Gets the has_split_by_media of this ReportingExportJobResponse.
Indicates if media type will be split in aggregate detail exports
:return: The has_split_by_media of this ReportingExportJobResponse.
:rtype: bool
"""
return self._has_split_by_media
@has_split_by_media.setter
def has_split_by_media(self, has_split_by_media):
"""
Sets the has_split_by_media of this ReportingExportJobResponse.
Indicates if media type will be split in aggregate detail exports
:param has_split_by_media: The has_split_by_media of this ReportingExportJobResponse.
:type: bool
"""
self._has_split_by_media = has_split_by_media
@property
def has_summary_row(self):
"""
Gets the has_summary_row of this ReportingExportJobResponse.
Indicates if a summary row needs to be present in exports
:return: The has_summary_row of this ReportingExportJobResponse.
:rtype: bool
"""
return self._has_summary_row
@has_summary_row.setter
def has_summary_row(self, has_summary_row):
"""
Sets the has_summary_row of this ReportingExportJobResponse.
Indicates if a summary row needs to be present in exports
:param has_summary_row: The has_summary_row of this ReportingExportJobResponse.
:type: bool
"""
self._has_summary_row = has_summary_row
@property
def csv_delimiter(self):
"""
Gets the csv_delimiter of this ReportingExportJobResponse.
The user-supplied CSV delimiter for the export request, either 'comma' or 'semicolon'
:return: The csv_delimiter of this ReportingExportJobResponse.
:rtype: str
"""
return self._csv_delimiter
@csv_delimiter.setter
def csv_delimiter(self, csv_delimiter):
"""
Sets the csv_delimiter of this ReportingExportJobResponse.
The user-supplied CSV delimiter for the export request, either 'comma' or 'semicolon'
:param csv_delimiter: The csv_delimiter of this ReportingExportJobResponse.
:type: str
"""
allowed_values = ["SEMICOLON", "COMMA"]
if csv_delimiter.lower() not in map(str.lower, allowed_values):
# print("Invalid value for csv_delimiter -> " + csv_delimiter)
self._csv_delimiter = "outdated_sdk_version"
else:
self._csv_delimiter = csv_delimiter
@property
def selected_columns(self):
"""
Gets the selected_columns of this ReportingExportJobResponse.
The ordered list of columns selected by the user from the export view
:return: The selected_columns of this ReportingExportJobResponse.
:rtype: list[SelectedColumns]
"""
return self._selected_columns
@selected_columns.setter
def selected_columns(self, selected_columns):
"""
Sets the selected_columns of this ReportingExportJobResponse.
The ordered list of columns selected by the user from the export view
:param selected_columns: The selected_columns of this ReportingExportJobResponse.
:type: list[SelectedColumns]
"""
self._selected_columns = selected_columns
@property
def has_custom_participant_attributes(self):
"""
Gets the has_custom_participant_attributes of this ReportingExportJobResponse.
Indicates if custom participant attributes will be exported
:return: The has_custom_participant_attributes of this ReportingExportJobResponse.
:rtype: bool
"""
return self._has_custom_participant_attributes
@has_custom_participant_attributes.setter
def has_custom_participant_attributes(self, has_custom_participant_attributes):
"""
Sets the has_custom_participant_attributes of this ReportingExportJobResponse.
Indicates if custom participant attributes will be exported
:param has_custom_participant_attributes: The has_custom_participant_attributes of this ReportingExportJobResponse.
:type: bool
"""
self._has_custom_participant_attributes = has_custom_participant_attributes
@property
def recipient_emails(self):
"""
Gets the recipient_emails of this ReportingExportJobResponse.
The list of email recipients for the exports
:return: The recipient_emails of this ReportingExportJobResponse.
:rtype: list[str]
"""
return self._recipient_emails
@recipient_emails.setter
def recipient_emails(self, recipient_emails):
"""
Sets the recipient_emails of this ReportingExportJobResponse.
The list of email recipients for the exports
:param recipient_emails: The recipient_emails of this ReportingExportJobResponse.
:type: list[str]
"""
self._recipient_emails = recipient_emails
@property
def email_statuses(self):
"""
Gets the email_statuses of this ReportingExportJobResponse.
The status of individual email addresses as a map
:return: The email_statuses of this ReportingExportJobResponse.
:rtype: dict(str, str)
"""
return self._email_statuses
@email_statuses.setter
def email_statuses(self, email_statuses):
"""
Sets the email_statuses of this ReportingExportJobResponse.
The status of individual email addresses as a map
:param email_statuses: The email_statuses of this ReportingExportJobResponse.
:type: dict(str, str)
"""
self._email_statuses = email_statuses
@property
def email_error_description(self):
"""
Gets the email_error_description of this ReportingExportJobResponse.
The optional error message in case the export fails to email
:return: The email_error_description of this ReportingExportJobResponse.
:rtype: str
"""
return self._email_error_description
@email_error_description.setter
def email_error_description(self, email_error_description):
"""
Sets the email_error_description of this ReportingExportJobResponse.
The optional error message in case the export fails to email
:param email_error_description: The email_error_description of this ReportingExportJobResponse.
:type: str
"""
self._email_error_description = email_error_description
@property
def enabled(self):
"""
Gets the enabled of this ReportingExportJobResponse.
:return: The enabled of this ReportingExportJobResponse.
:rtype: bool
"""
return self._enabled
@enabled.setter
def enabled(self, enabled):
"""
Sets the enabled of this ReportingExportJobResponse.
:param enabled: The enabled of this ReportingExportJobResponse.
:type: bool
"""
self._enabled = enabled
@property
def self_uri(self):
"""
Gets the self_uri of this ReportingExportJobResponse.
The URI for this object
:return: The self_uri of this ReportingExportJobResponse.
:rtype: str
"""
return self._self_uri
@self_uri.setter
def self_uri(self, self_uri):
"""
Sets the self_uri of this ReportingExportJobResponse.
The URI for this object
:param self_uri: The self_uri of this ReportingExportJobResponse.
:type: str
"""
self._self_uri = self_uri
def to_dict(self):
"""
Returns the model properties as a dict
"""
result = {}
for attr, _ in iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
return result
def to_json(self):
"""
Returns the model as raw JSON
"""
return json.dumps(sanitize_for_serialization(self.to_dict()))
def to_str(self):
"""
Returns the string representation of the model
"""
return pformat(self.to_dict())
def __repr__(self):
"""
For `print` and `pprint`
"""
return self.to_str()
def __eq__(self, other):
"""
Returns true if both objects are equal
"""
return isinstance(other, ReportingExportJobResponse) and self.__dict__ == other.__dict__
def __ne__(self, other):
"""
Returns true if both objects are not equal
"""
return not self == other
| 35.491713 | 1,758 | 0.654795 | 3,613 | 32,120 | 5.565181 | 0.106006 | 0.035808 | 0.190978 | 0.069627 | 0.632914 | 0.539911 | 0.48038 | 0.394738 | 0.345452 | 0.310837 | 0 | 0.00236 | 0.274284 | 32,120 | 904 | 1,759 | 35.530973 | 0.860232 | 0.441469 | 0 | 0.114804 | 0 | 0 | 0.234155 | 0.137277 | 0 | 0 | 0 | 0 | 0 | 1 | 0.202417 | false | 0 | 0.015106 | 0 | 0.329305 | 0.003021 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
1394f3e40b1b497fdf5622c1d3ecd798e00e4cb4 | 236 | py | Python | modelconvert/utils/humanize.py | x3dom/pipeline | 5c631065b25744cc50f0f284c01e2e1707ed566c | [
"Apache-2.0"
] | 16 | 2015-03-14T19:47:27.000Z | 2020-06-26T14:02:53.000Z | modelconvert/utils/humanize.py | x3dom/pipeline | 5c631065b25744cc50f0f284c01e2e1707ed566c | [
"Apache-2.0"
] | null | null | null | modelconvert/utils/humanize.py | x3dom/pipeline | 5c631065b25744cc50f0f284c01e2e1707ed566c | [
"Apache-2.0"
] | 8 | 2015-03-16T17:36:45.000Z | 2022-01-19T03:38:21.000Z | # -*- coding: utf-8 -*-
def bytes(num):
for x in ['bytes','KB','MB','GB']:
if num < 1024.0 and num > -1024.0:
return "{0:.1f} {1}".format(num, x)
num /= 1024.0
return "{0:.1f} {1}".format(num, 'TB')
| 26.222222 | 47 | 0.457627 | 38 | 236 | 2.842105 | 0.552632 | 0.194444 | 0.222222 | 0.259259 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0 | 0 | 0.130178 | 0.283898 | 236 | 8 | 48 | 29.5 | 0.508876 | 0.088983 | 0 | 0 | 0 | 0 | 0.164319 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
13b10d26ffa93906cc33cfb44730f1718d345af4 | 756 | py | Python | pages/result_sk.py | sunjeet-khokhar/tau-playwright-workshop | b2133164f8432e8b2564a7bfa91e4be9a9db62ca | [
"MIT"
] | 2 | 2021-12-07T19:22:26.000Z | 2021-12-12T04:36:20.000Z | pages/result_sk.py | sunjeet-khokhar/tau-playwright-workshop | b2133164f8432e8b2564a7bfa91e4be9a9db62ca | [
"MIT"
] | null | null | null | pages/result_sk.py | sunjeet-khokhar/tau-playwright-workshop | b2133164f8432e8b2564a7bfa91e4be9a9db62ca | [
"MIT"
] | null | null | null | """
This module contains DuckDuckGoResultPage,
the page object for the DuckDuckGo result page.
"""
from playwright.sync_api import Page
class ResultsPage:
SEARCH_FIELD = "[id='search_form_input']"
RESULT_TITLES = "[data-testid=result-title-a]"
def __init__(self,page : Page):
self.page = page
def get_search_field_value(self):
return(self.page.input_value(self.SEARCH_FIELD))
def wait_for_nth_result_to_load(self,num):
self.page.locator(self.RESULT_TITLES + " >> nth=" + str(num)).wait_for()
def get_inner_text_of_all_results(self):
return(self.page.locator(self.RESULT_TITLES).all_inner_texts())
def get_title_of_page(self):
return(self.page.title()) | 28 | 73 | 0.675926 | 102 | 756 | 4.696078 | 0.441176 | 0.100209 | 0.087683 | 0.112735 | 0.129436 | 0.129436 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21164 | 756 | 27 | 74 | 28 | 0.803691 | 0.119048 | 0 | 0 | 0 | 0 | 0.08953 | 0.078907 | 0 | 0 | 0 | 0 | 0 | 1 | 0.357143 | false | 0 | 0.071429 | 0.214286 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
13bfab6439a60ba360c1a8f0e7c0b8119f3c57ee | 524 | py | Python | api_object_spec/exceptions.py | AWinterman/sphinx-grammar | cfa8c6943ce344bc2d8a0ce8374ac5f95ee4ce88 | [
"Apache-2.0"
] | 1 | 2015-04-17T23:09:56.000Z | 2015-04-17T23:09:56.000Z | api_object_spec/exceptions.py | AWinterman/api_object_specification | cfa8c6943ce344bc2d8a0ce8374ac5f95ee4ce88 | [
"Apache-2.0"
] | null | null | null | api_object_spec/exceptions.py | AWinterman/api_object_specification | cfa8c6943ce344bc2d8a0ce8374ac5f95ee4ce88 | [
"Apache-2.0"
] | 1 | 2015-04-12T23:21:00.000Z | 2015-04-12T23:21:00.000Z | class ParsimoniousError(Exception):
def __init__(self, exception):
"""
A class for wrapping parsimonious errors to make them a bit more sensible to users of this library.
:param exception: The original parsimonious exception
:return: self
"""
self.exception = exception
def __unicode__(self):
return u'Encountered an error parsing your api specification. The error was: \n {}'.format(self.exception)
def __str__(self):
return str(unicode(self)) | 32.75 | 114 | 0.666031 | 62 | 524 | 5.435484 | 0.612903 | 0.106825 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.255725 | 524 | 16 | 115 | 32.75 | 0.864103 | 0.320611 | 0 | 0 | 0 | 0 | 0.230284 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0 | 0.285714 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
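`__str__` here calls `unicode()`, which only exists on Python 2; on Python 3 that line would raise `NameError`. A hedged sketch of a Python-3-compatible version of the same wrapper:

```python
class ParsimoniousError(Exception):
    """Wraps a parsimonious error with a friendlier message (Py3 variant)."""

    def __init__(self, exception):
        self.exception = exception
        super().__init__(str(exception))

    def __str__(self):
        # No __unicode__ needed: str is already text on Python 3.
        return ('Encountered an error parsing your api specification. '
                'The error was: \n {}'.format(self.exception))

err = ParsimoniousError(ValueError("rule <foo> did not match"))
print(str(err))
```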
13cefb35ef0212a7f5e64c8c64d3c614b68213a2 | 508 | py | Python | pyrevolve/angle/robogen/body_parts/__init__.py | braj29/robo_swimmers | b3c3fa91976884095eb6b5e67844167598ec573d | [
"Apache-1.1"
] | null | null | null | pyrevolve/angle/robogen/body_parts/__init__.py | braj29/robo_swimmers | b3c3fa91976884095eb6b5e67844167598ec573d | [
"Apache-1.1"
] | null | null | null | pyrevolve/angle/robogen/body_parts/__init__.py | braj29/robo_swimmers | b3c3fa91976884095eb6b5e67844167598ec573d | [
"Apache-1.1"
] | null | null | null | from __future__ import absolute_import
#from .active_hinge import ActiveHinge
#from .core_component import CoreComponent
#from .hinge import Hinge
#from .light_sensor import LightSensor
#from .touch_sensor import TouchSensor
#from .fixed_brick import FixedBrick
#from .parametric_bar_joint import ParametricBarJoint
#from .wheel import *
#from .cardan import *
#from .active_cardan import *
#from .active_rotator import *
#from .active_wheel import *
#from .active_wheg import *
__author__ = 'Elte Hupkes'
| 28.222222 | 53 | 0.809055 | 65 | 508 | 6.015385 | 0.446154 | 0.153453 | 0.204604 | 0.112532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122047 | 508 | 17 | 54 | 29.882353 | 0.876682 | 0.814961 | 0 | 0 | 0 | 0 | 0.135802 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
13d46475fd1bb38aab4f091a942cf7711e68beed | 275 | py | Python | src/yellowdog_client/model/numeric_attribute_value.py | yellowdog/yellowdog-sdk-python-public | da69a7d6e45c92933e34fefcaef8b5d98dcd6036 | [
"Apache-2.0"
] | null | null | null | src/yellowdog_client/model/numeric_attribute_value.py | yellowdog/yellowdog-sdk-python-public | da69a7d6e45c92933e34fefcaef8b5d98dcd6036 | [
"Apache-2.0"
] | null | null | null | src/yellowdog_client/model/numeric_attribute_value.py | yellowdog/yellowdog-sdk-python-public | da69a7d6e45c92933e34fefcaef8b5d98dcd6036 | [
"Apache-2.0"
] | null | null | null | from dataclasses import dataclass, field
from .attribute_value import AttributeValue
@dataclass
class NumericAttributeValue(AttributeValue):
type: str = field(default="co.yellowdog.platform.model.NumericAttributeValue", init=False)
attribute: str
value: float
| 25 | 94 | 0.792727 | 29 | 275 | 7.482759 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130909 | 275 | 10 | 95 | 27.5 | 0.90795 | 0 | 0 | 0 | 0 | 0 | 0.178182 | 0.178182 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.285714 | 0 | 0.857143 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
13dcbbb2cb2b60e3091e77302ea29346b4575a0f | 93 | py | Python | strings.py | slack333/Python | eb6b5191f7a5191e20fa0b46ea06c366b81addd4 | [
"MIT"
] | null | null | null | strings.py | slack333/Python | eb6b5191f7a5191e20fa0b46ea06c366b81addd4 | [
"MIT"
] | null | null | null | strings.py | slack333/Python | eb6b5191f7a5191e20fa0b46ea06c366b81addd4 | [
"MIT"
] | null | null | null | mistring = 'Curso de python3'
print(mistring [0:10])
palabra ="hola"
print (len(palabra))
| 13.285714 | 29 | 0.688172 | 13 | 93 | 4.923077 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050633 | 0.150538 | 93 | 6 | 30 | 15.5 | 0.759494 | 0 | 0 | 0 | 0 | 0 | 0.215054 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
13ec652213421b675715ccd69356e1ccb55dbf04 | 245 | py | Python | ML_Basic/Linear.py | ductnn/Python-tu | 8d0c16a7986cf573dbf7324375967a6bce45a7a9 | [
"MIT"
] | 1 | 2020-05-18T11:40:40.000Z | 2020-05-18T11:40:40.000Z | ML_Basic/Linear.py | ductnn/Python-tu | 8d0c16a7986cf573dbf7324375967a6bce45a7a9 | [
"MIT"
] | null | null | null | ML_Basic/Linear.py | ductnn/Python-tu | 8d0c16a7986cf573dbf7324375967a6bce45a7a9 | [
"MIT"
] | null | null | null | import pandas as pd
import numpy as np
import matplotlib as plt
X = np.array([[150, 152, 160, 163, 170, 175, 180, 190]]).T
Y = np.array([[40, 42, 44, 45, 55, 58, 60, 75]])
one = np.ones((X.shape[0], 1))
Xbar = np.concatenate((one, X), axis=1)
| 24.5 | 58 | 0.620408 | 49 | 245 | 3.102041 | 0.734694 | 0.092105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212871 | 0.17551 | 245 | 9 | 59 | 27.222222 | 0.539604 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
13f401b979b675cba08b0bd4129cdcad53d6f54a | 1,101 | py | Python | csep/utils/constants.py | pjmaechling/pycsep | 32502fea072291d09859071a2797e38036963805 | [
"BSD-3-Clause"
] | 26 | 2020-12-16T06:23:50.000Z | 2022-03-22T10:23:12.000Z | csep/utils/constants.py | pjmaechling/pycsep | 32502fea072291d09859071a2797e38036963805 | [
"BSD-3-Clause"
] | 132 | 2020-09-18T13:33:10.000Z | 2022-02-25T17:10:54.000Z | csep/utils/constants.py | pjmaechling/pycsep | 32502fea072291d09859071a2797e38036963805 | [
"BSD-3-Clause"
] | 11 | 2020-09-10T11:35:48.000Z | 2022-03-30T14:40:14.000Z | # Time Constants
import numpy
SECONDS_PER_ASTRONOMICAL_YEAR = 31557600
SECONDS_PER_DAY = 60*60*24
SECONDS_PER_HOUR = 60*60
SECONDS_PER_WEEK = SECONDS_PER_DAY*7
SECONDS_PER_MONTH = SECONDS_PER_WEEK*4
DAYS_PER_ASTRONOMICAL_YEAR = 365.25
MW_5_EQS_PER_YEAR = 10
# Magnitude Bins
min_mw = 2.5
max_mw = 8.95
dmw = 0.1
CSEP_MW_BINS = numpy.array([ 2.5, 2.6, 2.7, 2.8, 2.9, 3. , 3.1, 3.2, 3.3,
3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. , 4.1, 4.2,
4.3, 4.4, 4.5, 4.6, 4.7, 4.8, 4.9, 5. , 5.1,
5.2, 5.3, 5.4, 5.5, 5.6, 5.7, 5.8, 5.9, 6. ,
6.1, 6.2, 6.3, 6.4, 6.5, 6.6, 6.7, 6.8, 6.9,
7. , 7.1, 7.2, 7.3, 7.4, 7.5, 7.6, 7.7, 7.8,
7.9, 8. , 8.1, 8.2, 8.3, 8.4, 8.5, 8.6, 8.7,
8.8, 8.9, 9. , 9.1, 9.2, 9.3, 9.4, 9.5, 9.6,
9.7, 9.8, 9.9, 10. ])
| 40.777778 | 90 | 0.381471 | 208 | 1,101 | 1.894231 | 0.1875 | 0.177665 | 0.096447 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.301003 | 0.456857 | 1,101 | 26 | 91 | 42.346154 | 0.35786 | 0.02634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05 | 0 | 0.05 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b92233a63e2f3dbb2dd7a196f89359cf6eaa4fc6 | 169 | py | Python | odin_securities/queries/deletes/position.py | JamesBrofos/Odin-Securities | a07d3a21bcd3f78513ef394d4e8b620b7ca7fad8 | [
"MIT"
] | 13 | 2017-02-04T08:41:10.000Z | 2020-06-09T12:43:09.000Z | odin_securities/queries/deletes/position.py | JamesBrofos/Odin-Securities | a07d3a21bcd3f78513ef394d4e8b620b7ca7fad8 | [
"MIT"
] | 1 | 2020-11-15T05:32:18.000Z | 2020-11-15T05:32:18.000Z | odin_securities/queries/deletes/position.py | JamesBrofos/Odin-Securities | a07d3a21bcd3f78513ef394d4e8b620b7ca7fad8 | [
"MIT"
] | 9 | 2017-02-05T21:51:44.000Z | 2020-03-23T10:55:11.000Z | from ...connection_cursor import cur
def position(sid, pid):
    # Parameterized query; interpolating sid/pid with str.format is open to
    # SQL injection. %s placeholders assume a psycopg2-style DB-API driver.
    cur.execute(
        "DELETE FROM positions WHERE symbol_id=%s AND portfolio_id=%s",
        (sid, pid),
    )
| 24.142857 | 79 | 0.692308 | 23 | 169 | 4.956522 | 0.782609 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 169 | 6 | 80 | 28.166667 | 0.797203 | 0 | 0 | 0 | 0 | 0 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
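Deleting a position row should use a parameterized query rather than interpolating `sid` and `pid` into the SQL text. A self-contained sketch of that pattern, with stdlib sqlite3 (paramstyle `?`) standing in for the project's cursor — table and column names match the query above, everything else is illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE positions (symbol_id INTEGER, portfolio_id INTEGER)")
cur.executemany("INSERT INTO positions VALUES (?, ?)", [(1, 1), (1, 2), (2, 1)])

def position(sid, pid):
    # placeholders let the driver escape the values; never format them in yourself
    cur.execute("DELETE FROM positions WHERE symbol_id=? AND portfolio_id=?", (sid, pid))

position(1, 1)
remaining = cur.execute("SELECT COUNT(*) FROM positions").fetchone()[0]
```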
b92fe17f4fefffef5f797b1036daa4a7a5e7633d | 240 | py | Python | usuarios/admin.py | SricardoSdSouza/Exercicios_python | 216f21936cd1eda0481c222c659482afb011b9b4 | [
"MIT"
] | null | null | null | usuarios/admin.py | SricardoSdSouza/Exercicios_python | 216f21936cd1eda0481c222c659482afb011b9b4 | [
"MIT"
] | null | null | null | usuarios/admin.py | SricardoSdSouza/Exercicios_python | 216f21936cd1eda0481c222c659482afb011b9b4 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Usuario
@admin.register(Usuario)
class UsuarioAdmin(admin.ModelAdmin):
    list_display = ('nome', 'email', 'senha')
search_fields = ('nome', 'email')
readonly_fields = ('senha',)
| 26.666667 | 43 | 0.7125 | 28 | 240 | 6 | 0.678571 | 0.107143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141667 | 240 | 8 | 44 | 30 | 0.815534 | 0 | 0 | 0 | 0 | 0 | 0.116667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.857143 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
b945eb3831b48953ad88b0cb0377e893c5b1cc73 | 8,630 | py | Python | runtime/bamboo-pipeline/pipeline/tests/eri/imp/test_context.py | DomineCore/bamboo-engine | fb4583e70f9e1e87d9d48c2393db8d8104306f37 | [
"MIT"
] | 55 | 2021-09-07T11:50:35.000Z | 2022-03-23T13:19:38.000Z | runtime/bamboo-pipeline/pipeline/tests/eri/imp/test_context.py | DomineCore/bamboo-engine | fb4583e70f9e1e87d9d48c2393db8d8104306f37 | [
"MIT"
] | 64 | 2021-09-07T12:04:12.000Z | 2022-03-29T03:47:18.000Z | runtime/bamboo-pipeline/pipeline/tests/eri/imp/test_context.py | DomineCore/bamboo-engine | fb4583e70f9e1e87d9d48c2393db8d8104306f37 | [
"MIT"
] | 20 | 2021-09-07T11:52:08.000Z | 2022-03-28T08:05:22.000Z | # -*- coding: utf-8 -*-
"""
Tencent is pleased to support the open source community by making 蓝鲸智云PaaS平台社区版 (BlueKing PaaS Community
Edition) available.
Copyright (C) 2017-2021 THL A29 Limited, a Tencent company. All rights reserved.
Licensed under the MIT License (the "License"); you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://opensource.org/licenses/MIT
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.
"""
import json
from django.test import TransactionTestCase
from bamboo_engine.eri import ContextValue, ContextValueType
from pipeline.eri.imp.context import ContextMixin
from pipeline.eri.models import ContextValue as DBContextValue
from pipeline.eri.models import ContextOutputs
from bamboo_engine.utils.string import unique_id
class ContextMixinTestCase(TransactionTestCase):
def setUp(self):
self.mixin = ContextMixin()
self.pipeline_id = unique_id("p")
self.outputs = ["a", "b", "c", "d"]
DBContextValue.objects.create(
pipeline_id=self.pipeline_id,
key="${var_1}",
type=ContextValueType.PLAIN.value,
serializer=ContextMixin.JSON_SERIALIZER,
value=json.dumps("123"),
references="[]",
)
DBContextValue.objects.create(
pipeline_id=self.pipeline_id,
key="${var_2}",
type=ContextValueType.PLAIN.value,
serializer=ContextMixin.JSON_SERIALIZER,
value=json.dumps(123),
references="[]",
)
DBContextValue.objects.create(
pipeline_id=self.pipeline_id,
key="${var_3}",
type=ContextValueType.SPLICE.value,
serializer=ContextMixin.JSON_SERIALIZER,
value=json.dumps("${var_1}_${var_2}"),
references='["${var_1}", "${var_2}"]',
)
DBContextValue.objects.create(
pipeline_id=self.pipeline_id,
key="${var_4}",
type=ContextValueType.COMPUTE.value,
serializer=ContextMixin.JSON_SERIALIZER,
value=json.dumps({"attr1": "a", "attr2": "${var_3}"}),
references='["${var_1}", "${var_2}", "${var_3}"]',
code="cv",
)
ContextOutputs.objects.create(pipeline_id=self.pipeline_id, outputs=json.dumps(self.outputs))
def test_get_context_values(self):
context_values = self.mixin.get_context_values(self.pipeline_id, {"${var_1}", "${var_2}"})
self.assertEqual(len(context_values), 2)
self.assertEqual(context_values[0].key, "${var_1}")
self.assertEqual(context_values[0].type, ContextValueType.PLAIN)
self.assertEqual(context_values[0].value, "123")
self.assertIsNone(context_values[0].code)
self.assertEqual(context_values[1].key, "${var_2}")
self.assertEqual(context_values[1].type, ContextValueType.PLAIN)
self.assertEqual(context_values[1].value, 123)
self.assertIsNone(context_values[1].code)
context_values = self.mixin.get_context_values(
self.pipeline_id, {"${var_1}", "${var_2}", "${var_3}", "${var_4}"}
)
self.assertEqual(len(context_values), 4)
self.assertEqual(context_values[0].key, "${var_1}")
self.assertEqual(context_values[0].type, ContextValueType.PLAIN)
self.assertEqual(context_values[0].value, "123")
self.assertIsNone(context_values[0].code)
self.assertEqual(context_values[1].key, "${var_2}")
self.assertEqual(context_values[1].type, ContextValueType.PLAIN)
self.assertEqual(context_values[1].value, 123)
self.assertIsNone(context_values[1].code)
self.assertEqual(context_values[2].key, "${var_3}")
self.assertEqual(context_values[2].type, ContextValueType.SPLICE)
self.assertEqual(context_values[2].value, "${var_1}_${var_2}")
self.assertIsNone(context_values[2].code)
self.assertEqual(context_values[3].key, "${var_4}")
self.assertEqual(context_values[3].type, ContextValueType.COMPUTE)
self.assertEqual(context_values[3].value, {"attr1": "a", "attr2": "${var_3}"})
self.assertEqual(context_values[3].code, "cv")
def test_get_context_key_references(self):
references = self.mixin.get_context_key_references(self.pipeline_id, {"${var_1}", "${var_2}"})
self.assertEqual(references, set())
references = self.mixin.get_context_key_references(
self.pipeline_id, {"${var_1}", "${var_2}", "${var_3}", "${var_4}"}
)
self.assertEqual(references, {"${var_1}", "${var_2}", "${var_3}"})
def test_get_context(self):
context_values = self.mixin.get_context(self.pipeline_id)
self.assertEqual(len(context_values), 4)
self.assertEqual(context_values[0].key, "${var_1}")
self.assertEqual(context_values[0].type, ContextValueType.PLAIN)
self.assertEqual(context_values[0].value, "123")
self.assertIsNone(context_values[0].code)
self.assertEqual(context_values[1].key, "${var_2}")
self.assertEqual(context_values[1].type, ContextValueType.PLAIN)
self.assertEqual(context_values[1].value, 123)
self.assertIsNone(context_values[1].code)
self.assertEqual(context_values[2].key, "${var_3}")
self.assertEqual(context_values[2].type, ContextValueType.SPLICE)
self.assertEqual(context_values[2].value, "${var_1}_${var_2}")
self.assertIsNone(context_values[2].code)
self.assertEqual(context_values[3].key, "${var_4}")
self.assertEqual(context_values[3].type, ContextValueType.COMPUTE)
self.assertEqual(context_values[3].value, {"attr1": "a", "attr2": "${var_3}"})
self.assertEqual(context_values[3].code, "cv")
def test_get_context_outputs(self):
outputs = self.mixin.get_context_outputs(self.pipeline_id)
self.assertEqual(outputs, set(self.outputs))
def test_upsert_plain_context_values(self):
update = {
"${var_3}": ContextValue(key="${var_3}", type=ContextValueType.PLAIN, value="123_123"),
"${var_4}": ContextValue(key="${var_4}", type=ContextValueType.PLAIN, value="compute_val"),
"${var_5}": ContextValue(key="${var_5}", type=ContextValueType.PLAIN, value="5_val"),
"${var_6}": ContextValue(key="${var_6}", type=ContextValueType.PLAIN, value="6_val"),
}
self.mixin.upsert_plain_context_values(self.pipeline_id, update)
context_values = self.mixin.get_context(self.pipeline_id)
self.assertEqual(len(context_values), 6)
context_values = {cv.key: cv for cv in context_values}
self.assertEqual(context_values["${var_1}"].key, "${var_1}")
self.assertEqual(context_values["${var_1}"].type, ContextValueType.PLAIN)
self.assertEqual(context_values["${var_1}"].value, "123")
self.assertIsNone(context_values["${var_1}"].code)
self.assertEqual(context_values["${var_2}"].key, "${var_2}")
self.assertEqual(context_values["${var_2}"].type, ContextValueType.PLAIN)
self.assertEqual(context_values["${var_2}"].value, 123)
self.assertIsNone(context_values["${var_2}"].code)
self.assertEqual(context_values["${var_3}"].key, "${var_3}")
self.assertEqual(context_values["${var_3}"].type, ContextValueType.PLAIN)
self.assertEqual(context_values["${var_3}"].value, "123_123")
self.assertIsNone(context_values["${var_3}"].code)
self.assertEqual(context_values["${var_4}"].key, "${var_4}")
self.assertEqual(context_values["${var_4}"].type, ContextValueType.PLAIN)
self.assertEqual(context_values["${var_4}"].value, "compute_val")
self.assertIsNone(context_values["${var_4}"].code)
self.assertEqual(context_values["${var_5}"].key, "${var_5}")
self.assertEqual(context_values["${var_5}"].type, ContextValueType.PLAIN)
self.assertEqual(context_values["${var_5}"].value, "5_val")
self.assertIsNone(context_values["${var_5}"].code)
self.assertEqual(context_values["${var_6}"].key, "${var_6}")
self.assertEqual(context_values["${var_6}"].type, ContextValueType.PLAIN)
self.assertEqual(context_values["${var_6}"].value, "6_val")
self.assertIsNone(context_values["${var_6}"].code)
| 51.987952 | 115 | 0.665353 | 1,047 | 8,630 | 5.268386 | 0.140401 | 0.186186 | 0.19942 | 0.253807 | 0.756526 | 0.687273 | 0.607868 | 0.551668 | 0.470631 | 0.465192 | 0 | 0.026693 | 0.183893 | 8,630 | 165 | 116 | 52.30303 | 0.756496 | 0.083662 | 0 | 0.439716 | 0 | 0 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0.503546 | 1 | 0.042553 | false | 0 | 0.049645 | 0 | 0.099291 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b95c4434d81a8b10d888760ed145c9be69fb04e5 | 487 | bzl | Python | third_party/javax_servlet.bzl | or-shachar/exodus | b28bf8f4c2330de47b8aac21c2bf3c40fda7cbdc | [
"MIT"
] | null | null | null | third_party/javax_servlet.bzl | or-shachar/exodus | b28bf8f4c2330de47b8aac21c2bf3c40fda7cbdc | [
"MIT"
] | null | null | null | third_party/javax_servlet.bzl | or-shachar/exodus | b28bf8f4c2330de47b8aac21c2bf3c40fda7cbdc | [
"MIT"
] | null | null | null | load("//:import_external.bzl", import_external = "safe_wix_scala_maven_import_external")
def dependencies():
import_external(
name = "javax_servlet_javax_servlet_api",
artifact = "javax.servlet:javax.servlet-api:3.1.0",
jar_sha256 = "af456b2dd41c4e82cf54f3e743bc678973d9fe35bd4d3071fa05c7e5333b8482",
srcjar_sha256 = "5c6d640f01e8e7ffdba21b2b75c0f64f0c30fd1fc3372123750c034cb363012a",
neverlink = 1,
generated_linkable_rule_name = "linkable",
)
| 37.461538 | 89 | 0.765914 | 44 | 487 | 8.090909 | 0.613636 | 0.157303 | 0.095506 | 0.134831 | 0.151685 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214797 | 0.13963 | 487 | 12 | 90 | 40.583333 | 0.634845 | 0 | 0 | 0 | 0 | 0 | 0.537988 | 0.521561 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | true | 0 | 0.2 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b974619f3e40824f6fdd20ffcb04d383c271a688 | 655 | py | Python | installer/__main__.py | baptoutiego/Jarvis | 79c04845988d5f935338659899730ed882a6d76c | [
"MIT"
] | 2,605 | 2017-03-10T22:44:36.000Z | 2022-03-31T15:33:17.000Z | installer/__main__.py | baptoutiego/Jarvis | 79c04845988d5f935338659899730ed882a6d76c | [
"MIT"
] | 729 | 2017-03-11T00:06:46.000Z | 2022-03-31T22:04:44.000Z | installer/__main__.py | baptoutiego/Jarvis | 79c04845988d5f935338659899730ed882a6d76c | [
"MIT"
] | 1,181 | 2017-03-10T23:24:55.000Z | 2022-03-31T03:59:46.000Z | import traceback
try:
from helper import log_init, log_close
from unix_windows import IS_WIN
log_init()
import os
os.chdir(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import steps.a_setup_virtualenv
import steps.b_pip
import steps.c_nltk
if not IS_WIN:
# TODO Optional requirements on windows
import steps.d_optional
import steps.e_launcher
except SystemExit:
# Expected Error
pass
except BaseException:
print("\n\n")
    print("An unexpected error occurred. Please open an issue on GitHub!")
print("here is the error:")
print('')
traceback.print_exc()
| 22.586207 | 74 | 0.691603 | 92 | 655 | 4.73913 | 0.576087 | 0.126147 | 0.059633 | 0.068807 | 0.073395 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.227481 | 655 | 28 | 75 | 23.392857 | 0.86166 | 0.079389 | 0 | 0 | 0 | 0 | 0.138333 | 0 | 0 | 0 | 0 | 0.035714 | 0 | 1 | 0 | true | 0.047619 | 0.428571 | 0 | 0.428571 | 0.238095 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
b978a6f5a6fe9e2ce254c96f0fb1e22f10d6f1df | 2,214 | py | Python | qpylib/qpylib.py | isstabb/qpylib | e4eb906deefb71a99218557e1b89c2842ed279e7 | [
"Apache-2.0"
] | null | null | null | qpylib/qpylib.py | isstabb/qpylib | e4eb906deefb71a99218557e1b89c2842ed279e7 | [
"Apache-2.0"
] | null | null | null | qpylib/qpylib.py | isstabb/qpylib | e4eb906deefb71a99218557e1b89c2842ed279e7 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 IBM Corporation All Rights Reserved.
#
# SPDX-License-Identifier: Apache-2.0
import os
from .live_qpylib import LiveQpylib
from .sdk_qpylib import SdkQpylib
def is_sdk():
sdk_env = os.getenv('QRADAR_APPFW_SDK', 'no').lower() == 'true'
return sdk_env
def strategy():
if is_sdk():
return SdkQpylib()
return LiveQpylib()
# ==== Logging ====
def log(message, level='info'):
strategy().log(message, level)
def create_log():
strategy().create_log()
def set_log_level(log_level='info'):
strategy().set_log_level(log_level)
# ==== App details ====
def get_app_id():
return strategy().get_app_id()
def get_app_name():
return strategy().get_app_name()
def get_manifest_json():
return strategy().get_manifest_json()
def get_store_path(relative_path=''):
return strategy().get_store_path(relative_path)
def get_root_path(relative_path=''):
return strategy().get_root_path(relative_path)
def get_app_base_url():
return strategy().get_app_base_url()
def q_url_for(endpoint, **values):
return strategy().q_url_for(endpoint, **values)
def get_console_address():
return strategy().get_console_address()
# ==== REST ====
def REST(rest_type, request_url, headers=None, data=None, params=None,
json_body=None, version=None, verify=None, timeout=60):
return strategy().REST(rest_type, request_url, headers=headers,
data=data, params=params, json_body=json_body,
version=version, verify=verify,
timeout=timeout)
# ==== JSON ====
def to_json_dict(python_obj):
return strategy().to_json_dict(python_obj)
def register_jsonld_type(context):
return strategy().register_jsonld_type(context)
def get_offense_rendering(offense_id, render_type):
return strategy().get_offense_rendering(offense_id, render_type)
def get_asset_rendering(asset_id, render_type):
return strategy().get_asset_rendering(asset_id, render_type)
def render_json_ld_type(jld_type, data, jld_id=None):
return strategy().render_json_ld_type(jld_type, data, jld_id)
def register_jsonld_endpoints():
return strategy().register_jsonld_endpoints()
| 26.674699 | 73 | 0.70822 | 302 | 2,214 | 4.857616 | 0.281457 | 0.143149 | 0.104294 | 0.0409 | 0.370143 | 0.245399 | 0.141786 | 0.043626 | 0.043626 | 0 | 0 | 0.004331 | 0.165763 | 2,214 | 82 | 74 | 27 | 0.78993 | 0.070912 | 0 | 0 | 0 | 0 | 0.014641 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.06 | 0.3 | 0.82 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
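Every qpylib function above resolves its implementation per call through `strategy()`, switching on an environment flag. A minimal standalone sketch of that strategy-style dispatch — the env var name comes from the code above, the two classes here are stand-ins:

```python
import os

class LiveQpylib:          # stand-in for the real live implementation
    def get_app_name(self):
        return "live-app"

class SdkQpylib:           # stand-in for the SDK implementation
    def get_app_name(self):
        return "sdk-app"

def is_sdk():
    return os.getenv("QRADAR_APPFW_SDK", "no").lower() == "true"

def strategy():
    # resolved on every call, so flipping the env var switches behaviour at runtime
    return SdkQpylib() if is_sdk() else LiveQpylib()

os.environ["QRADAR_APPFW_SDK"] = "true"
name = strategy().get_app_name()
```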
b9b38f24812d55a99bca08c8f55559df0b56ce6f | 337 | py | Python | office365/sharepoint/logger/logFileInfoCollection.py | rikeshtailor/Office365-REST-Python-Client | ca7bfa1b22212137bb4e984c0457632163e89a43 | [
"MIT"
] | 544 | 2016-08-04T17:10:16.000Z | 2022-03-31T07:17:20.000Z | office365/sharepoint/logger/logFileInfoCollection.py | rikeshtailor/Office365-REST-Python-Client | ca7bfa1b22212137bb4e984c0457632163e89a43 | [
"MIT"
] | 438 | 2016-10-11T12:24:22.000Z | 2022-03-31T19:30:35.000Z | office365/sharepoint/logger/logFileInfoCollection.py | rikeshtailor/Office365-REST-Python-Client | ca7bfa1b22212137bb4e984c0457632163e89a43 | [
"MIT"
] | 202 | 2016-08-22T19:29:40.000Z | 2022-03-30T20:26:15.000Z | from office365.sharepoint.base_entity_collection import BaseEntityCollection
from office365.sharepoint.logger.logFileInfo import LogFileInfo
class LogFileInfoCollection(BaseEntityCollection):
def __init__(self, context, resource_path=None):
super(LogFileInfoCollection, self).__init__(context, LogFileInfo, resource_path)
| 37.444444 | 88 | 0.836795 | 33 | 337 | 8.181818 | 0.606061 | 0.096296 | 0.17037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019737 | 0.097923 | 337 | 8 | 89 | 42.125 | 0.868421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
b9b54112c80322313abe422ce82fcb58613b51b9 | 390 | py | Python | tests/test_adams_bashforth.py | Ambistic/CellTissue | c7fce7bb9443a4dfc3b632d8f40aa598388f9d80 | [
"MIT"
] | 4 | 2020-04-15T07:53:10.000Z | 2021-10-03T13:44:50.000Z | tests/test_adams_bashforth.py | Ambistic/CellTissue | c7fce7bb9443a4dfc3b632d8f40aa598388f9d80 | [
"MIT"
] | null | null | null | tests/test_adams_bashforth.py | Ambistic/CellTissue | c7fce7bb9443a4dfc3b632d8f40aa598388f9d80 | [
"MIT"
] | 2 | 2020-07-09T10:32:47.000Z | 2022-02-03T00:59:10.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
import numpy as np
import cbmos.solvers.adams_bashforth as ab
@np.vectorize
def func(t, y):
return -50*y
def jacobian(y, fa):
return -50*np.eye(len(y))
def test_no_overstep():
t_span = (0, 1)
y0 = np.array([1, 1])
# fixed time step
sol = ab.solve_ivp(func, t_span, y0, dt=0.03)
assert sol.t[-1] == t_span[1]
| 16.956522 | 49 | 0.610256 | 71 | 390 | 3.253521 | 0.619718 | 0.064935 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055738 | 0.217949 | 390 | 22 | 50 | 17.727273 | 0.701639 | 0.151282 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.25 | false | 0 | 0.166667 | 0.166667 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
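The test above exercises cbmos's Adams-Bashforth solver on the stiff decay problem y' = -50y. For reference, a minimal standalone two-step Adams-Bashforth integrator for the same problem — an illustrative sketch, not the cbmos implementation:

```python
import math

def ab2(f, t0, y0, dt, n_steps):
    """Two-step Adams-Bashforth; the first step is bootstrapped with forward Euler."""
    t, y = t0, y0
    f_prev = f(t, y)
    y = y + dt * f_prev                              # Euler bootstrap step
    for k in range(1, n_steps):
        t = t0 + k * dt
        f_curr = f(t, y)
        y = y + dt * (1.5 * f_curr - 0.5 * f_prev)   # AB2 update
        f_prev = f_curr
    return y

# integrate y' = -50y, y(0) = 1 up to t = 0.1; the exact solution there is exp(-5)
y_end = ab2(lambda t, y: -50.0 * y, 0.0, 1.0, 0.001, 100)
```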
b9ba1039cd18bc089f999a86b50508a9999fde55 | 311 | py | Python | exercises/is_lower_case.py | reysmerwvr/python-playgrounds | 1e039639d96044986ba5cc894a210180cc2b08e0 | [
"MIT"
] | null | null | null | exercises/is_lower_case.py | reysmerwvr/python-playgrounds | 1e039639d96044986ba5cc894a210180cc2b08e0 | [
"MIT"
] | null | null | null | exercises/is_lower_case.py | reysmerwvr/python-playgrounds | 1e039639d96044986ba5cc894a210180cc2b08e0 | [
"MIT"
] | null | null | null | # is_lower_case
#
# Checks if a string is lower case.
#
# Convert the given string to lower case, using str.lower() method and compare it to the original.
def is_lower_case(string):
return string == string.lower()
is_lower_case('abc') # True
is_lower_case('a3@$') # True
is_lower_case('Ab4') # False
| 20.733333 | 98 | 0.707395 | 51 | 311 | 4.117647 | 0.490196 | 0.3 | 0.314286 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007813 | 0.176849 | 311 | 14 | 99 | 22.214286 | 0.8125 | 0.514469 | 0 | 0 | 0 | 0 | 0.070423 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0.2 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
b9bd49a4c15d8b8a6692b1a6f3ae4252ac2abd25 | 10,757 | py | Python | tests/test_tweens.py | nikitagromov/nefertari | 1e3829bba4008a8014a3a5f23521a082bfd06ecd | [
"Apache-2.0"
] | 34 | 2015-03-27T16:00:38.000Z | 2016-01-26T02:15:47.000Z | tests/test_tweens.py | nikitagromov/nefertari | 1e3829bba4008a8014a3a5f23521a082bfd06ecd | [
"Apache-2.0"
] | 17 | 2015-04-17T12:24:23.000Z | 2015-12-09T03:46:10.000Z | tests/test_tweens.py | nikitagromov/nefertari | 1e3829bba4008a8014a3a5f23521a082bfd06ecd | [
"Apache-2.0"
] | 10 | 2015-03-30T06:07:38.000Z | 2015-11-30T06:32:56.000Z | import six
import pytest
from mock import Mock, patch
from nefertari import tweens
def mock_timer():
mock_timer.time = 0
def time_func():
mock_timer.time += 1
return mock_timer.time
return time_func
class DummyConfigurator(object):
def __init__(self):
self.subscribed = []
def add_subscriber(self, wrapped, ifaces):
self.subscribed.append((wrapped, ifaces))
class TestTweens(object):
@patch('nefertari.tweens.time')
@patch('nefertari.tweens.log')
def test_request_timing(self, mock_log, mock_time):
mock_time.time = mock_timer()
request = Mock(method='GET', url='http://example.com')
registry = Mock()
registry.settings = {'request_timing.slow_request_threshold': 1000}
handler = lambda request: request
timing = tweens.request_timing(handler, registry)
timing(request)
mock_log.debug.assert_called_once_with(
'GET (http://example.com) request took 1 seconds')
assert not mock_log.warning.called
@patch('nefertari.tweens.time')
@patch('nefertari.tweens.log')
def test_request_timing_slow_request(self, mock_log, mock_time):
mock_time.time = mock_timer()
request = Mock(method='GET', url='http://example.com')
registry = Mock()
registry.settings = {'request_timing.slow_request_threshold': 0}
handler = lambda request: request
timing = tweens.request_timing(handler, registry)
timing(request)
mock_log.warning.assert_called_once_with(
'GET (http://example.com) request took 1 seconds')
assert not mock_log.debug.called
def test_get_tunneling(self):
class GET(dict):
def mixed(self):
return self
request = Mock(GET=GET({'_m': 'POST', 'foo': 'bar'}), method='GET')
get_tunneling = tweens.get_tunneling(lambda x: x, None)
get_tunneling(request)
assert request.GET == {"foo": "bar"}
assert request.method == 'POST'
assert request.content_type == 'application/json'
assert request.body == six.b('{"foo": "bar"}')
def test_get_tunneling_reserved_params_dropped(self):
from nefertari import RESERVED_PARAMS
class GET(dict):
def mixed(self):
return self
reserved = RESERVED_PARAMS[0]
get_data = GET({
'_m': 'POST',
'foo': 'bar',
reserved: 'boo',
})
request = Mock(GET=get_data, method='GET')
get_tunneling = tweens.get_tunneling(lambda x: x, None)
get_tunneling(request)
assert request.GET == {'foo': 'bar', reserved: 'boo'}
assert request.method == 'POST'
assert request.content_type == 'application/json'
assert request.body == six.b('{"foo": "bar"}')
assert request._tunneled_get
def test_get_tunneling_not_allowed_method(self):
class GET(dict):
def mixed(self):
return self
request = Mock(
GET=GET({'_m': 'DELETE', 'foo': 'bar'}), method='GET',
body=None, content_type=None)
get_tunneling = tweens.get_tunneling(lambda x: x, None)
get_tunneling(request)
assert request.GET == {"foo": "bar"}
assert request.method == 'DELETE'
assert request.content_type is None
assert request.body is None
def test_cors_no_origins_no_creds(self):
registry = Mock(settings={
'cors.allow_origins': '',
'cors.allow_credentials': None,
})
handler = lambda x: Mock(headerlist=[])
request = Mock(
headers={'Origin': '127.0.0.1:8080'},
host_url='127.0.0.1:8080')
response = tweens.cors(handler, registry)(request)
assert response.headerlist == []
def test_cors_disallow_creds(self):
registry = Mock(settings={
'cors.allow_origins': '',
'cors.allow_credentials': False,
})
handler = lambda x: Mock(headerlist=[])
request = Mock(
headers={'Origin': '127.0.0.1:8080'},
host_url='127.0.0.1:8080')
response = tweens.cors(handler, registry)(request)
assert response.headerlist == [
('Access-Control-Allow-Credentials', False)]
def test_cors_allow_creds_and_origin(self):
registry = Mock(settings={
'cors.allow_origins': '127.0.0.1:8080,127.0.0.1:8090',
'cors.allow_credentials': True,
})
handler = lambda x: Mock(headerlist=[])
request = Mock(
headers={'Origin': '127.0.0.1:8080'},
host_url='127.0.0.1:8080')
response = tweens.cors(handler, registry)(request)
assert response.headerlist == [
('Access-Control-Allow-Origin', '127.0.0.1:8080'),
('Access-Control-Allow-Credentials', True)]
def test_cors_wrong_origin(self):
registry = Mock(settings={
'cors.allow_origins': '127.0.0.1:8080,127.0.0.1:8090',
'cors.allow_credentials': None,
})
handler = lambda x: Mock(headerlist=[])
request = Mock(
headers={'Origin': '127.0.0.1:8000'},
host_url='127.0.0.1:8000')
response = tweens.cors(handler, registry)(request)
assert response.headerlist == []
def test_cors_source_or_host_url(self):
registry = Mock(settings={
'cors.allow_origins': '127.0.0.1:8080,127.0.0.1:8090',
'cors.allow_credentials': None,
})
handler = lambda x: Mock(headerlist=[])
request = Mock(
headers={'Origin': '127.0.0.1:8080'},
host_url='')
response = tweens.cors(handler, registry)(request)
assert response.headerlist == [
('Access-Control-Allow-Origin', '127.0.0.1:8080')]
request = Mock(
headers={},
host_url='127.0.0.1:8080')
response = tweens.cors(handler, registry)(request)
assert response.headerlist == [
('Access-Control-Allow-Origin', '127.0.0.1:8080')]
def test_cors_allow_origins_star_credentials_true(self):
registry = Mock(settings={
'cors.allow_origins': '*',
'cors.allow_credentials': True,
})
handler = lambda x: Mock(headerlist=[])
with pytest.raises(Exception) as ex:
tweens.cors(handler, registry)
expected = ('Not allowed Access-Control-Allow-Credentials '
'to set to TRUE if origin is *')
assert str(ex.value) == expected
def test_cors_allow_origins_star_credentials_false(self):
registry = Mock(settings={
'cors.allow_origins': '*',
'cors.allow_credentials': None,
})
handler = lambda x: Mock(headerlist=[])
request = Mock(
headers={},
host_url='127.1.2.3:1234')
response = tweens.cors(handler, registry)(request)
assert response.headerlist == [
('Access-Control-Allow-Origin', '127.1.2.3:1234')]
def test_cache_control_header_not_set(self):
handler = lambda x: Mock(headerlist=[('Cache-Control', '')])
response = tweens.cache_control(handler, None)(None)
assert not response.cache_expires.called
def test_cache_control_header_set(self):
handler = lambda x: Mock(headerlist=[])
response = tweens.cache_control(handler, None)(None)
response.cache_expires.assert_called_once_with(0)
def test_ssl_url_scheme(self):
request = Mock(
scheme=None,
environ={'HTTP_X_URL_SCHEME': 'Foo'}
)
tweens.ssl(lambda x: x, None)(request)
assert request.environ['wsgi.url_scheme'] == 'foo'
assert request.scheme == 'foo'
def test_ssl_forwarded_proto(self):
request = Mock(
scheme=None,
environ={'HTTP_X_FORWARDED_PROTO': 'Foo'}
)
tweens.ssl(lambda x: x, None)(request)
assert request.environ['wsgi.url_scheme'] == 'foo'
assert request.scheme == 'foo'
def test_ssl_no_scheme(self):
request = Mock(scheme=None, environ={})
tweens.ssl(lambda x: x, None)(request)
assert request.environ == {}
assert request.scheme is None
def test_enable_selfalias(self):
from pyramid.events import ContextFound
config = DummyConfigurator()
assert config.subscribed == []
tweens.enable_selfalias(config, 'foo')
assert len(config.subscribed) == 1
assert six.callable(config.subscribed[0][0])
assert config.subscribed[0][1] is ContextFound
def test_context_found_subscriber_alias_enabled(self):
config = DummyConfigurator()
tweens.enable_selfalias(config, 'foo')
context_found_subscriber = config.subscribed[0][0]
request = Mock(
user=Mock(username='user12'),
matchdict={'foo': 'self'})
context_found_subscriber(Mock(request=request))
assert request.matchdict['foo'] == 'user12'
def test_context_found_subscriber_no_matchdict(self):
config = DummyConfigurator()
tweens.enable_selfalias(config, 'foo')
context_found_subscriber = config.subscribed[0][0]
request = Mock(
user=Mock(username='user12'),
matchdict=None)
context_found_subscriber(Mock(request=request))
assert request.matchdict is None
def test_context_found_subscriber_not_self(self):
config = DummyConfigurator()
tweens.enable_selfalias(config, 'foo')
context_found_subscriber = config.subscribed[0][0]
request = Mock(
user=Mock(username='user12'),
matchdict={'foo': '1'})
context_found_subscriber(Mock(request=request))
assert request.matchdict['foo'] == '1'
def test_context_found_subscriber_not_authenticated(self):
config = DummyConfigurator()
tweens.enable_selfalias(config, 'foo')
context_found_subscriber = config.subscribed[0][0]
request = Mock(
user=None,
matchdict={'foo': 'self'})
context_found_subscriber(Mock(request=request))
assert request.matchdict['foo'] == 'self'
def test_context_found_subscriber_wrong_id_name(self):
config = DummyConfigurator()
tweens.enable_selfalias(config, 'foo')
context_found_subscriber = config.subscribed[0][0]
request = Mock(
user=Mock(username='user12'),
matchdict={'qoo': 'self'})
context_found_subscriber(Mock(request=request))
assert request.matchdict['qoo'] == 'self'
| 36.713311 | 75 | 0.60528 | 1,223 | 10,757 | 5.145544 | 0.117743 | 0.007945 | 0.015096 | 0.018115 | 0.772763 | 0.738916 | 0.722231 | 0.679008 | 0.661847 | 0.646274 | 0 | 0.031484 | 0.267733 | 10,757 | 292 | 76 | 36.839041 | 0.767424 | 0 | 0 | 0.615686 | 0 | 0 | 0.138143 | 0.054197 | 0 | 0 | 0 | 0 | 0.164706 | 1 | 0.117647 | false | 0 | 0.023529 | 0.011765 | 0.180392 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b9db133e6d1f0808d1473d64f946277690e4e4f6 | 1,739 | py | Python | motor_controls/gpioServo.py | ArifSohaib/AutonomousRobotChallenge | 10f7299022158425c6e7aa40f427105503d38468 | [
"MIT"
] | 13 | 2018-09-05T03:33:23.000Z | 2020-12-15T13:07:33.000Z | motor_controls/gpioServo.py | ArifSohaib/AutonomousRobotChallenge | 10f7299022158425c6e7aa40f427105503d38468 | [
"MIT"
] | 1 | 2019-03-24T20:07:19.000Z | 2019-03-25T21:18:04.000Z | motor_controls/gpioServo.py | ArifSohaib/AutonomousRobotChallenge | 10f7299022158425c6e7aa40f427105503d38468 | [
"MIT"
] | 11 | 2018-11-09T09:22:04.000Z | 2021-10-05T08:00:44.000Z | import RPi.GPIO as GPIO # always needed with RPi.GPIO
from time import sleep
class MotorControls:
def __init__(self, pin1=18, pin2=19):
GPIO.setmode(GPIO.BCM) # choose BCM or BOARD numbering schemes. I use BCM
GPIO.setup(pin1, GPIO.OUT)# set GPIO pin1 as an output. You can use any GPIO port
        GPIO.setup(pin2, GPIO.OUT)  # set GPIO pin2 as an output. You can use any GPIO port
self.motor1 = GPIO.PWM(pin1, 50) # create an object p for PWM on port pin at 50 Hertz
# you can have more than one of these, but they need
# different names for each port
# e.g. p1, p2, motor, servo1 etc.
self.motor2 = GPIO.PWM(pin2, 50)
self.forwardStart()
def forwardStart(self):
# start the PWM on 90 percent duty cycle
self.motor1.start(80)
self.motor2.start(80)
# duty cycle value can be 0.0 to 100.0%, floats are OK
def forward(self):
self.motor1.ChangeDutyCycle(99)
self.motor2.ChangeDutyCycle(99)
def turn1(self):
self.motor1.ChangeDutyCycle(50)
self.motor2.ChangeDutyCycle(90)
def turn2(self):
self.motor1.ChangeDutyCycle(90)
self.motor2.ChangeDutyCycle(50)
def reverse(self):
#TODO
pass
def stop(self):
# stop the PWM output
self.motor2.ChangeDutyCycle(100)
self.motor1.ChangeDutyCycle(100)
def end(self):
self.motor1.stop()
self.motor2.stop()
GPIO.cleanup() # when your program exits, tidy up after yourself
if __name__ == "__main__":
    pass | 35.489796 | 98 | 0.579068 | 227 | 1,739 | 4.38326 | 0.46696 | 0.070352 | 0.056281 | 0.087437 | 0.096482 | 0.096482 | 0.096482 | 0.096482 | 0.096482 | 0.096482 | 0 | 0.05821 | 0.338125 | 1,739 | 49 | 99 | 35.489796 | 0.806255 | 0.306498 | 0 | 0.060606 | 0 | 0 | 0.006717 | 0 | 0 | 0 | 0 | 0.020408 | 0 | 1 | 0.242424 | false | 0.060606 | 0.060606 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
b9e6a8c6e66791b8323d66787eb9649e924049bd | 396 | py | Python | rattr/plugins/analysers/__init__.py | SuadeLabs/rattr | 22b82d31d4cebf0a7107fa1fb496a070b2e1f4ad | [
"MIT"
] | 6 | 2021-11-10T11:13:37.000Z | 2022-01-19T16:15:17.000Z | rattr/plugins/analysers/__init__.py | SuadeLabs/ratter | 22b82d31d4cebf0a7107fa1fb496a070b2e1f4ad | [
"MIT"
] | 13 | 2021-11-10T11:39:12.000Z | 2022-03-01T10:27:49.000Z | rattr/plugins/analysers/__init__.py | SuadeLabs/rattr | 22b82d31d4cebf0a7107fa1fb496a070b2e1f4ad | [
"MIT"
] | null | null | null | from rattr.plugins.analysers.builtins import (
    DelattrAnalyser,
    GetattrAnalyser,
    HasattrAnalyser,
    SetattrAnalyser,
    SortedAnalyser,
)
from rattr.plugins.analysers.collections import DefaultDictAnalyser
DEFAULT_FUNCTION_ANALYSERS = (
    GetattrAnalyser(),
    SetattrAnalyser(),
    HasattrAnalyser(),
    DelattrAnalyser(),
    SortedAnalyser(),
    DefaultDictAnalyser(),
)
| 22 | 67 | 0.739899 | 27 | 396 | 10.777778 | 0.555556 | 0.061856 | 0.109966 | 0.171821 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.174242 | 396 | 17 | 68 | 23.294118 | 0.889908 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 1 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b9eb3f81b2dce8c837e36668e6008ddaa3ca7cd8 | 42,124 | py | Python | descarteslabs/common/proto/workflow/workflow_pb2.py | descarteslabs/descarteslabs-python | efc874d6062603dc424c9646287a9b1f8636e7ac | [
"Apache-2.0"
] | 167 | 2017-03-23T22:16:58.000Z | 2022-03-08T09:19:30.000Z | descarteslabs/common/proto/workflow/workflow_pb2.py | descarteslabs/descarteslabs-python | efc874d6062603dc424c9646287a9b1f8636e7ac | [
"Apache-2.0"
] | 93 | 2017-03-23T22:11:40.000Z | 2021-12-13T18:38:53.000Z | descarteslabs/common/proto/workflow/workflow_pb2.py | descarteslabs/descarteslabs-python | efc874d6062603dc424c9646287a9b1f8636e7ac | [
"Apache-2.0"
] | 46 | 2017-03-25T19:12:14.000Z | 2021-08-15T18:04:29.000Z | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: descarteslabs/common/proto/workflow/workflow.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from descarteslabs.common.proto.typespec import typespec_pb2 as descarteslabs_dot_common_dot_proto_dot_typespec_dot_typespec__pb2
from descarteslabs.common.proto.visualization import visualization_pb2 as descarteslabs_dot_common_dot_proto_dot_visualization_dot_visualization__pb2
from descarteslabs.common.proto.widgets import widgets_pb2 as descarteslabs_dot_common_dot_proto_dot_widgets_dot_widgets__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='descarteslabs/common/proto/workflow/workflow.proto',
package='descarteslabs.workflows',
syntax='proto3',
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n2descarteslabs/common/proto/workflow/workflow.proto\x12\x17\x64\x65scarteslabs.workflows\x1a\x32\x64\x65scarteslabs/common/proto/typespec/typespec.proto\x1a<descarteslabs/common/proto/visualization/visualization.proto\x1a\x30\x64\x65scarteslabs/common/proto/widgets/widgets.proto\"\x92\x04\n\x08Workflow\x12\x0e\n\x02id\x18\x01 \x01(\tR\x02id\x12+\n\x11\x63reated_timestamp\x18\x02 \x01(\x03R\x10\x63reatedTimestamp\x12+\n\x11updated_timestamp\x18\x03 \x01(\x03R\x10updatedTimestamp\x12\x14\n\x05title\x18\t \x01(\tR\x05title\x12 \n\x0b\x64\x65scription\x18\x08 \x01(\tR\x0b\x64\x65scription\x12R\n\x10versioned_grafts\x18\x1a \x03(\x0b\x32\'.descarteslabs.workflows.VersionedGraftR\x0fversionedGrafts\x12\x45\n\x06labels\x18\x1b \x03(\x0b\x32-.descarteslabs.workflows.Workflow.LabelsEntryR\x06labels\x12\x12\n\x04tags\x18\x1d \x03(\tR\x04tags\x12\x12\n\x04user\x18\x17 \x01(\tR\x04user\x12\x10\n\x03org\x18\x18 \x01(\tR\x03org\x12\x14\n\x05\x65mail\x18\x19 \x01(\tR\x05\x65mail\x12\x12\n\x04name\x18\x1c \x01(\tR\x04name\x12*\n\x11wmts_url_template\x18\x1e \x01(\tR\x0fwmtsUrlTemplate\x1a\x39\n\x0bLabelsEntry\x12\x10\n\x03key\x18\x01 \x01(\tR\x03key\x12\x14\n\x05value\x18\x02 \x01(\tR\x05value:\x02\x38\x01\"\xef\x02\n\x15UpsertWorkflowRequest\x12\x0e\n\x02id\x18\x01 \x01(\tR\x02id\x12\x14\n\x05title\x18\x03 \x01(\tR\x05title\x12 \n\x0b\x64\x65scription\x18\x04 \x01(\tR\x0b\x64\x65scription\x12R\n\x10versioned_grafts\x18\x1a \x03(\x0b\x32\'.descarteslabs.workflows.VersionedGraftR\x0fversionedGrafts\x12R\n\x06labels\x18\x1b \x03(\x0b\x32:.descarteslabs.workflows.UpsertWorkflowRequest.LabelsEntryR\x06labels\x12\x12\n\x04tags\x18\x1d \x03(\tR\x04tags\x12\x17\n\x07\x64ry_run\x18\x32 \x01(\x08R\x06\x64ryRun\x1a\x39\n\x0bLabelsEntry\x12\x10\n\x03key\x18\x01 \x01(\tR\x03key\x12\x14\n\x05value\x18\x02 \x01(\tR\x05value:\x02\x38\x01\"$\n\x12GetWorkflowRequest\x12\x0e\n\x02id\x18\x01 \x01(\tR\x02id\"=\n\x11GetVersionRequest\x12\x0e\n\x02id\x18\x01 
\x01(\tR\x02id\x12\x18\n\x07version\x18\x02 \x01(\tR\x07version\"\'\n\x15\x44\x65leteWorkflowRequest\x12\x0e\n\x02id\x18\x01 \x01(\tR\x02id\"c\n\x16SearchWorkflowsRequest\x12\x14\n\x05\x65mail\x18\x01 \x01(\tR\x05\x65mail\x12\x1f\n\x0bname_prefix\x18\x02 \x01(\tR\nnamePrefix\x12\x12\n\x04tags\x18\x1d \x03(\tR\x04tags\"\xe0\x05\n\x0eVersionedGraft\x12\x18\n\x07version\x18\x01 \x01(\tR\x07version\x12\x1c\n\tdocstring\x18\x05 \x01(\tR\tdocstring\x12)\n\x10serialized_graft\x18\x02 \x01(\tR\x0fserializedGraft\x12=\n\x08typespec\x18\x04 \x01(\x0b\x32!.descarteslabs.workflows.TypespecR\x08typespec\x12\x42\n\nparameters\x18\x0b \x03(\x0b\x32\".descarteslabs.workflows.ParameterR\nparameters\x12\x18\n\x07\x63hannel\x18\x03 \x01(\tR\x07\x63hannel\x12%\n\x0e\x63lient_version\x18\n \x01(\tR\rclientVersion\x12K\n\x06labels\x18\x06 \x03(\x0b\x32\x33.descarteslabs.workflows.VersionedGraft.LabelsEntryR\x06labels\x12+\n\x11\x63reated_timestamp\x18\x07 \x01(\x03R\x10\x63reatedTimestamp\x12+\n\x11updated_timestamp\x18\x08 \x01(\x03R\x10updatedTimestamp\x12\x31\n\x14\x64\x65precated_timestamp\x18\t \x01(\x03R\x13\x64\x65precatedTimestamp\x12!\n\x0curl_template\x18\x0c \x01(\tR\x0burlTemplate\x12*\n\x11wmts_url_template\x18\r \x01(\tR\x0fwmtsUrlTemplate\x12\x43\n\x0bviz_options\x18\x0e \x03(\x0b\x32\".descarteslabs.workflows.VizOptionR\nvizOptions\x1a\x39\n\x0bLabelsEntry\x12\x10\n\x03key\x18\x01 \x01(\tR\x03key\x12\x14\n\x05value\x18\x02 \x01(\tR\x05value:\x02\x38\x01\"E\n\x17WmtsUrlTemplateResponse\x12*\n\x11wmts_url_template\x18\x01 
\x01(\tR\x0fwmtsUrlTemplate\"\x07\n\x05\x45mpty2\xf3\x04\n\x0bWorkflowAPI\x12\x65\n\x0eUpsertWorkflow\x12..descarteslabs.workflows.UpsertWorkflowRequest\x1a!.descarteslabs.workflows.Workflow\"\x00\x12_\n\x0bGetWorkflow\x12+.descarteslabs.workflows.GetWorkflowRequest\x1a!.descarteslabs.workflows.Workflow\"\x00\x12\x63\n\nGetVersion\x12*.descarteslabs.workflows.GetVersionRequest\x1a\'.descarteslabs.workflows.VersionedGraft\"\x00\x12i\n\x0fSearchWorkflows\x12/.descarteslabs.workflows.SearchWorkflowsRequest\x1a!.descarteslabs.workflows.Workflow\"\x00\x30\x01\x12\x62\n\x0e\x44\x65leteWorkflow\x12..descarteslabs.workflows.DeleteWorkflowRequest\x1a\x1e.descarteslabs.workflows.Empty\"\x00\x12h\n\x12GetWmtsUrlTemplate\x12\x1e.descarteslabs.workflows.Empty\x1a\x30.descarteslabs.workflows.WmtsUrlTemplateResponse\"\x00\x62\x06proto3'
,
dependencies=[descarteslabs_dot_common_dot_proto_dot_typespec_dot_typespec__pb2.DESCRIPTOR,descarteslabs_dot_common_dot_proto_dot_visualization_dot_visualization__pb2.DESCRIPTOR,descarteslabs_dot_common_dot_proto_dot_widgets_dot_widgets__pb2.DESCRIPTOR,])
_WORKFLOW_LABELSENTRY = _descriptor.Descriptor(
name='LabelsEntry',
full_name='descarteslabs.workflows.Workflow.LabelsEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='descarteslabs.workflows.Workflow.LabelsEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='key', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='descarteslabs.workflows.Workflow.LabelsEntry.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='value', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=717,
serialized_end=774,
)
_WORKFLOW = _descriptor.Descriptor(
name='Workflow',
full_name='descarteslabs.workflows.Workflow',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='descarteslabs.workflows.Workflow.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='id', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='created_timestamp', full_name='descarteslabs.workflows.Workflow.created_timestamp', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='createdTimestamp', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='updated_timestamp', full_name='descarteslabs.workflows.Workflow.updated_timestamp', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='updatedTimestamp', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='title', full_name='descarteslabs.workflows.Workflow.title', index=3,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='title', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='description', full_name='descarteslabs.workflows.Workflow.description', index=4,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='description', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='versioned_grafts', full_name='descarteslabs.workflows.Workflow.versioned_grafts', index=5,
number=26, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='versionedGrafts', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='labels', full_name='descarteslabs.workflows.Workflow.labels', index=6,
number=27, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='labels', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tags', full_name='descarteslabs.workflows.Workflow.tags', index=7,
number=29, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='tags', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user', full_name='descarteslabs.workflows.Workflow.user', index=8,
number=23, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='user', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='org', full_name='descarteslabs.workflows.Workflow.org', index=9,
number=24, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='org', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='email', full_name='descarteslabs.workflows.Workflow.email', index=10,
number=25, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='email', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='descarteslabs.workflows.Workflow.name', index=11,
number=28, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='name', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wmts_url_template', full_name='descarteslabs.workflows.Workflow.wmts_url_template', index=12,
number=30, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='wmtsUrlTemplate', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_WORKFLOW_LABELSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=244,
serialized_end=774,
)
_UPSERTWORKFLOWREQUEST_LABELSENTRY = _descriptor.Descriptor(
name='LabelsEntry',
full_name='descarteslabs.workflows.UpsertWorkflowRequest.LabelsEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='descarteslabs.workflows.UpsertWorkflowRequest.LabelsEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='key', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='descarteslabs.workflows.UpsertWorkflowRequest.LabelsEntry.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='value', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=717,
serialized_end=774,
)
_UPSERTWORKFLOWREQUEST = _descriptor.Descriptor(
name='UpsertWorkflowRequest',
full_name='descarteslabs.workflows.UpsertWorkflowRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='descarteslabs.workflows.UpsertWorkflowRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='id', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='title', full_name='descarteslabs.workflows.UpsertWorkflowRequest.title', index=1,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='title', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='description', full_name='descarteslabs.workflows.UpsertWorkflowRequest.description', index=2,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='description', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='versioned_grafts', full_name='descarteslabs.workflows.UpsertWorkflowRequest.versioned_grafts', index=3,
number=26, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='versionedGrafts', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='labels', full_name='descarteslabs.workflows.UpsertWorkflowRequest.labels', index=4,
number=27, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='labels', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tags', full_name='descarteslabs.workflows.UpsertWorkflowRequest.tags', index=5,
number=29, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='tags', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='dry_run', full_name='descarteslabs.workflows.UpsertWorkflowRequest.dry_run', index=6,
number=50, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='dryRun', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_UPSERTWORKFLOWREQUEST_LABELSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=777,
serialized_end=1144,
)
_GETWORKFLOWREQUEST = _descriptor.Descriptor(
name='GetWorkflowRequest',
full_name='descarteslabs.workflows.GetWorkflowRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='descarteslabs.workflows.GetWorkflowRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='id', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1146,
serialized_end=1182,
)
_GETVERSIONREQUEST = _descriptor.Descriptor(
name='GetVersionRequest',
full_name='descarteslabs.workflows.GetVersionRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='descarteslabs.workflows.GetVersionRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='id', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='version', full_name='descarteslabs.workflows.GetVersionRequest.version', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='version', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1184,
serialized_end=1245,
)
_DELETEWORKFLOWREQUEST = _descriptor.Descriptor(
name='DeleteWorkflowRequest',
full_name='descarteslabs.workflows.DeleteWorkflowRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='descarteslabs.workflows.DeleteWorkflowRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='id', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1247,
serialized_end=1286,
)
_SEARCHWORKFLOWSREQUEST = _descriptor.Descriptor(
name='SearchWorkflowsRequest',
full_name='descarteslabs.workflows.SearchWorkflowsRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='email', full_name='descarteslabs.workflows.SearchWorkflowsRequest.email', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='email', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name_prefix', full_name='descarteslabs.workflows.SearchWorkflowsRequest.name_prefix', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='namePrefix', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tags', full_name='descarteslabs.workflows.SearchWorkflowsRequest.tags', index=2,
number=29, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='tags', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1288,
serialized_end=1387,
)
_VERSIONEDGRAFT_LABELSENTRY = _descriptor.Descriptor(
name='LabelsEntry',
full_name='descarteslabs.workflows.VersionedGraft.LabelsEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='descarteslabs.workflows.VersionedGraft.LabelsEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='key', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='descarteslabs.workflows.VersionedGraft.LabelsEntry.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='value', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=717,
serialized_end=774,
)
_VERSIONEDGRAFT = _descriptor.Descriptor(
name='VersionedGraft',
full_name='descarteslabs.workflows.VersionedGraft',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='version', full_name='descarteslabs.workflows.VersionedGraft.version', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='version', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='docstring', full_name='descarteslabs.workflows.VersionedGraft.docstring', index=1,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='docstring', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='serialized_graft', full_name='descarteslabs.workflows.VersionedGraft.serialized_graft', index=2,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='serializedGraft', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='typespec', full_name='descarteslabs.workflows.VersionedGraft.typespec', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='typespec', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='parameters', full_name='descarteslabs.workflows.VersionedGraft.parameters', index=4,
number=11, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='parameters', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='channel', full_name='descarteslabs.workflows.VersionedGraft.channel', index=5,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='channel', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='client_version', full_name='descarteslabs.workflows.VersionedGraft.client_version', index=6,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='clientVersion', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='labels', full_name='descarteslabs.workflows.VersionedGraft.labels', index=7,
number=6, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='labels', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='created_timestamp', full_name='descarteslabs.workflows.VersionedGraft.created_timestamp', index=8,
number=7, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='createdTimestamp', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='updated_timestamp', full_name='descarteslabs.workflows.VersionedGraft.updated_timestamp', index=9,
number=8, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='updatedTimestamp', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='deprecated_timestamp', full_name='descarteslabs.workflows.VersionedGraft.deprecated_timestamp', index=10,
number=9, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='deprecatedTimestamp', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='url_template', full_name='descarteslabs.workflows.VersionedGraft.url_template', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='urlTemplate', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wmts_url_template', full_name='descarteslabs.workflows.VersionedGraft.wmts_url_template', index=12,
number=13, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='wmtsUrlTemplate', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='viz_options', full_name='descarteslabs.workflows.VersionedGraft.viz_options', index=13,
number=14, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='vizOptions', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_VERSIONEDGRAFT_LABELSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1390,
serialized_end=2126,
)
_WMTSURLTEMPLATERESPONSE = _descriptor.Descriptor(
name='WmtsUrlTemplateResponse',
full_name='descarteslabs.workflows.WmtsUrlTemplateResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='wmts_url_template', full_name='descarteslabs.workflows.WmtsUrlTemplateResponse.wmts_url_template', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, json_name='wmtsUrlTemplate', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2128,
serialized_end=2197,
)
_EMPTY = _descriptor.Descriptor(
name='Empty',
full_name='descarteslabs.workflows.Empty',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2199,
serialized_end=2206,
)
_WORKFLOW_LABELSENTRY.containing_type = _WORKFLOW
_WORKFLOW.fields_by_name['versioned_grafts'].message_type = _VERSIONEDGRAFT
_WORKFLOW.fields_by_name['labels'].message_type = _WORKFLOW_LABELSENTRY
_UPSERTWORKFLOWREQUEST_LABELSENTRY.containing_type = _UPSERTWORKFLOWREQUEST
_UPSERTWORKFLOWREQUEST.fields_by_name['versioned_grafts'].message_type = _VERSIONEDGRAFT
_UPSERTWORKFLOWREQUEST.fields_by_name['labels'].message_type = _UPSERTWORKFLOWREQUEST_LABELSENTRY
_VERSIONEDGRAFT_LABELSENTRY.containing_type = _VERSIONEDGRAFT
_VERSIONEDGRAFT.fields_by_name['typespec'].message_type = descarteslabs_dot_common_dot_proto_dot_typespec_dot_typespec__pb2._TYPESPEC
_VERSIONEDGRAFT.fields_by_name['parameters'].message_type = descarteslabs_dot_common_dot_proto_dot_widgets_dot_widgets__pb2._PARAMETER
_VERSIONEDGRAFT.fields_by_name['labels'].message_type = _VERSIONEDGRAFT_LABELSENTRY
_VERSIONEDGRAFT.fields_by_name['viz_options'].message_type = descarteslabs_dot_common_dot_proto_dot_visualization_dot_visualization__pb2._VIZOPTION
DESCRIPTOR.message_types_by_name['Workflow'] = _WORKFLOW
DESCRIPTOR.message_types_by_name['UpsertWorkflowRequest'] = _UPSERTWORKFLOWREQUEST
DESCRIPTOR.message_types_by_name['GetWorkflowRequest'] = _GETWORKFLOWREQUEST
DESCRIPTOR.message_types_by_name['GetVersionRequest'] = _GETVERSIONREQUEST
DESCRIPTOR.message_types_by_name['DeleteWorkflowRequest'] = _DELETEWORKFLOWREQUEST
DESCRIPTOR.message_types_by_name['SearchWorkflowsRequest'] = _SEARCHWORKFLOWSREQUEST
DESCRIPTOR.message_types_by_name['VersionedGraft'] = _VERSIONEDGRAFT
DESCRIPTOR.message_types_by_name['WmtsUrlTemplateResponse'] = _WMTSURLTEMPLATERESPONSE
DESCRIPTOR.message_types_by_name['Empty'] = _EMPTY
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
Workflow = _reflection.GeneratedProtocolMessageType('Workflow', (_message.Message,), {
'LabelsEntry' : _reflection.GeneratedProtocolMessageType('LabelsEntry', (_message.Message,), {
'DESCRIPTOR' : _WORKFLOW_LABELSENTRY,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.Workflow.LabelsEntry)
})
,
'DESCRIPTOR' : _WORKFLOW,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.Workflow)
})
_sym_db.RegisterMessage(Workflow)
_sym_db.RegisterMessage(Workflow.LabelsEntry)
UpsertWorkflowRequest = _reflection.GeneratedProtocolMessageType('UpsertWorkflowRequest', (_message.Message,), {
'LabelsEntry' : _reflection.GeneratedProtocolMessageType('LabelsEntry', (_message.Message,), {
'DESCRIPTOR' : _UPSERTWORKFLOWREQUEST_LABELSENTRY,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.UpsertWorkflowRequest.LabelsEntry)
})
,
'DESCRIPTOR' : _UPSERTWORKFLOWREQUEST,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.UpsertWorkflowRequest)
})
_sym_db.RegisterMessage(UpsertWorkflowRequest)
_sym_db.RegisterMessage(UpsertWorkflowRequest.LabelsEntry)
GetWorkflowRequest = _reflection.GeneratedProtocolMessageType('GetWorkflowRequest', (_message.Message,), {
'DESCRIPTOR' : _GETWORKFLOWREQUEST,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.GetWorkflowRequest)
})
_sym_db.RegisterMessage(GetWorkflowRequest)
GetVersionRequest = _reflection.GeneratedProtocolMessageType('GetVersionRequest', (_message.Message,), {
'DESCRIPTOR' : _GETVERSIONREQUEST,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.GetVersionRequest)
})
_sym_db.RegisterMessage(GetVersionRequest)
DeleteWorkflowRequest = _reflection.GeneratedProtocolMessageType('DeleteWorkflowRequest', (_message.Message,), {
'DESCRIPTOR' : _DELETEWORKFLOWREQUEST,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.DeleteWorkflowRequest)
})
_sym_db.RegisterMessage(DeleteWorkflowRequest)
SearchWorkflowsRequest = _reflection.GeneratedProtocolMessageType('SearchWorkflowsRequest', (_message.Message,), {
'DESCRIPTOR' : _SEARCHWORKFLOWSREQUEST,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.SearchWorkflowsRequest)
})
_sym_db.RegisterMessage(SearchWorkflowsRequest)
VersionedGraft = _reflection.GeneratedProtocolMessageType('VersionedGraft', (_message.Message,), {
'LabelsEntry' : _reflection.GeneratedProtocolMessageType('LabelsEntry', (_message.Message,), {
'DESCRIPTOR' : _VERSIONEDGRAFT_LABELSENTRY,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.VersionedGraft.LabelsEntry)
})
,
'DESCRIPTOR' : _VERSIONEDGRAFT,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.VersionedGraft)
})
_sym_db.RegisterMessage(VersionedGraft)
_sym_db.RegisterMessage(VersionedGraft.LabelsEntry)
WmtsUrlTemplateResponse = _reflection.GeneratedProtocolMessageType('WmtsUrlTemplateResponse', (_message.Message,), {
'DESCRIPTOR' : _WMTSURLTEMPLATERESPONSE,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.WmtsUrlTemplateResponse)
})
_sym_db.RegisterMessage(WmtsUrlTemplateResponse)
Empty = _reflection.GeneratedProtocolMessageType('Empty', (_message.Message,), {
'DESCRIPTOR' : _EMPTY,
'__module__' : 'descarteslabs.common.proto.workflow.workflow_pb2'
# @@protoc_insertion_point(class_scope:descarteslabs.workflows.Empty)
})
_sym_db.RegisterMessage(Empty)
_WORKFLOW_LABELSENTRY._options = None
_UPSERTWORKFLOWREQUEST_LABELSENTRY._options = None
_VERSIONEDGRAFT_LABELSENTRY._options = None
_WORKFLOWAPI = _descriptor.ServiceDescriptor(
name='WorkflowAPI',
full_name='descarteslabs.workflows.WorkflowAPI',
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=2209,
serialized_end=2836,
methods=[
_descriptor.MethodDescriptor(
name='UpsertWorkflow',
full_name='descarteslabs.workflows.WorkflowAPI.UpsertWorkflow',
index=0,
containing_service=None,
input_type=_UPSERTWORKFLOWREQUEST,
output_type=_WORKFLOW,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='GetWorkflow',
full_name='descarteslabs.workflows.WorkflowAPI.GetWorkflow',
index=1,
containing_service=None,
input_type=_GETWORKFLOWREQUEST,
output_type=_WORKFLOW,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='GetVersion',
full_name='descarteslabs.workflows.WorkflowAPI.GetVersion',
index=2,
containing_service=None,
input_type=_GETVERSIONREQUEST,
output_type=_VERSIONEDGRAFT,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='SearchWorkflows',
full_name='descarteslabs.workflows.WorkflowAPI.SearchWorkflows',
index=3,
containing_service=None,
input_type=_SEARCHWORKFLOWSREQUEST,
output_type=_WORKFLOW,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='DeleteWorkflow',
full_name='descarteslabs.workflows.WorkflowAPI.DeleteWorkflow',
index=4,
containing_service=None,
input_type=_DELETEWORKFLOWREQUEST,
output_type=_EMPTY,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='GetWmtsUrlTemplate',
full_name='descarteslabs.workflows.WorkflowAPI.GetWmtsUrlTemplate',
index=5,
containing_service=None,
input_type=_EMPTY,
output_type=_WMTSURLTEMPLATERESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_WORKFLOWAPI)
DESCRIPTOR.services_by_name['WorkflowAPI'] = _WORKFLOWAPI
# @@protoc_insertion_point(module_scope)
b9f21a9fa90a49f503a076f0448555209b958be3 | 256 | py | Python | pset_pandas_ext/101problems/p18.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 5 | 2019-04-08T20:05:37.000Z | 2019-12-04T20:48:45.000Z | pset_pandas_ext/101problems/p18.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 8 | 2019-04-15T15:16:05.000Z | 2022-02-12T10:33:32.000Z | pset_pandas_ext/101problems/p18.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 2 | 2019-04-10T00:14:42.000Z | 2020-02-26T20:35:21.000Z | """
18. How to convert the first character of each element in a series to uppercase?
"""
"""
Difficulty Level: L2
"""
"""
Change the first character of each word to upper case in each word of ser.
"""
"""
ser = pd.Series(['how', 'to', 'kick', 'ass?'])
"""
| 19.692308 | 80 | 0.636719 | 40 | 256 | 4.075 | 0.625 | 0.06135 | 0.208589 | 0.233129 | 0.282209 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014354 | 0.183594 | 256 | 12 | 81 | 21.333333 | 0.76555 | 0.3125 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
6a13548da0fb5088c8857242c2932e85a88b0f90 | 1,957 | py | Python | demos/biotool/ee/tests/apifunction_test.py | Servir-Mekong/biotool | 80ef1b18e34db637bf11d2ab84782e6a1a2dddd0 | [
"Apache-2.0"
] | 1 | 2016-09-09T14:45:45.000Z | 2016-09-09T14:45:45.000Z | ee/tests/apifunction_test.py | fitoprincipe/ee_client_auth_demo | 61f5843d7421b8c465a56044c00c5fa42ebf20b7 | [
"MIT"
] | null | null | null | ee/tests/apifunction_test.py | fitoprincipe/ee_client_auth_demo | 61f5843d7421b8c465a56044c00c5fa42ebf20b7 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
"""Tests for the ee.apifunction module."""
import types
import unittest
import ee
from ee import apitestcase
class ApiFunctionTest(apitestcase.ApiTestCase):
def testAddFunctions(self):
"""Verifies that addition of static and instance API functions."""
# Check instance vs static functions, and trampling of
# existing functions.
class TestClass(object):
def pre_addBands(self): # pylint: disable-msg=g-bad-name
pass
self.assertFalse(hasattr(TestClass, 'pre_load'))
self.assertFalse(hasattr(TestClass, 'select'))
self.assertFalse(hasattr(TestClass, 'pre_select'))
self.assertTrue(hasattr(TestClass, 'pre_addBands'))
self.assertFalse(hasattr(TestClass, '_pre_addBands'))
ee.ApiFunction.importApi(TestClass, 'Image', 'Image', 'pre_')
self.assertFalse(isinstance(TestClass.pre_load, types.MethodType))
self.assertFalse(hasattr(TestClass, 'select'))
self.assertTrue(isinstance(TestClass.pre_select, types.MethodType))
self.assertTrue(isinstance(TestClass.pre_addBands, types.MethodType))
self.assertFalse(hasattr(TestClass, '_pre_addBands'))
ee.ApiFunction.clearApi(TestClass)
self.assertFalse(hasattr(TestClass, 'pre_load'))
self.assertFalse(hasattr(TestClass, 'select'))
self.assertFalse(hasattr(TestClass, 'pre_select'))
self.assertTrue(hasattr(TestClass, 'pre_addBands'))
self.assertFalse(hasattr(TestClass, '_pre_addBands'))
def testAddFunctions_Inherited(self):
"""Verifies that inherited non-client functions can be overriden."""
class Base(object):
def ClientOverride(self):
pass
class Child(Base):
pass
ee.ApiFunction.importApi(Base, 'Image', 'Image')
ee.ApiFunction.importApi(Child, 'Image', 'Image')
    self.assertEqual(Base.ClientOverride, Child.ClientOverride)
    self.assertNotEqual(Base.addBands, Child.addBands)
if __name__ == '__main__':
unittest.main()
6a3a6e3946bc20209b6b119a6b5ad94fe2ccea86 | 3,866 | py | Python | d42/d42.py | m-e-w/d42-flask | d892cde19c7eaca7f2a1e88d1680f4310dd53b58 | [
"MIT"
] | null | null | null | d42/d42.py | m-e-w/d42-flask | d892cde19c7eaca7f2a1e88d1680f4310dd53b58 | [
"MIT"
] | null | null | null | d42/d42.py | m-e-w/d42-flask | d892cde19c7eaca7f2a1e88d1680f4310dd53b58 | [
"MIT"
] | null | null | null | import requests
from urllib3.exceptions import InsecureRequestWarning
import yaml
import json
class d42api ():
def __init__(self, config_path):
self.config = self._import_config(path=config_path)
requests.packages.urllib3.disable_warnings(
category=InsecureRequestWarning)
def _import_config(self, path):
try:
with open(path, "r") as stream:
try:
cfg = yaml.safe_load(stream)
return cfg
except yaml.YAMLError as exc:
print(exc)
return None
except IOError as exc:
print(exc)
return None
def lookup_enduser(self, id):
print("Querying Device42 for End User matching ID: " + id + " ...")
data = {"output_type": "json",
"query": "select name from view_enduser_v1 where enduser_pk = " + id}
response = requests.post(self.config['host'] + "/services/data/v1.0/query/",
auth=(self.config['username'], self.config['password']), data=data, verify=self.config['verify'])
if(response.json()):
print("Response: '" + response.json()[0]['name'] + "'\n")
return response.json()[0]
else:
print("No match found\n")
return None
def lookup_device(self, id):
print("Querying Device42 for Device matching ID: " + id + " ...")
data = {"output_type": "json",
"query": "select name from view_device_v2 where device_pk = " + id}
response = requests.post(self.config['host'] + "/services/data/v1.0/query/",
auth=(self.config['username'], self.config['password']), data=data, verify=self.config['verify'])
if(response.json()):
print("Response: '" + response.json()[0]['name'] + "'\n")
return response.json()[0]
else:
print("No match found\n")
return None
def lookup_asset(self, id):
print("Querying Device42 for Asset matching ID: " + id + " ...")
data = {"output_type": "json",
"query": "select name from view_asset_v1 where asset_pk = " + id}
response = requests.post(self.config['host'] + "/services/data/v1.0/query/",
auth=(self.config['username'], self.config['password']), data=data, verify=self.config['verify'])
if(response.json()):
print("Response: '" + response.json()[0]['name'] + "'\n")
return response.json()[0]
else:
print("No match found\n")
return None
def update_device_cf(self, name, key, value):
print("Updating Device Custom Field: '" + key +
"' on device: '" + name + "' with value: '" + value + "'")
data = {
'name': name,
'key': key,
'value': value
}
response = requests.put(self.config['host'] + '/api/1.0/device/custom_field/',
auth=(self.config['username'], self.config['password']), data=data, verify=self.config['verify'])
if(response.json()):
print("Response: '" + response.json()['msg'][0] + "'\n")
def update_asset_cf(self, name, key, value):
print("Updating Asset Custom Field: '" + key +
"' on asset: '" + name + "' with value: '" + value + "'")
data = {
'name': name,
'key': key,
'value': value
}
response = requests.put(self.config['host'] + '/api/1.0/custom_fields/asset/',
auth=(self.config['username'], self.config['password']), data=data, verify=self.config['verify'])
if(response.json()):
print("Response: '" + response.json()['msg'][0] + "'\n")
6a4d5f116fb1ae545f23a0c3b60365b69c34907c | 173 | py | Python | problem0453.py | kmarcini/Project-Euler-Python | d644e8e1ec4fac70a9ab407ad5e1f0a75547c8d3 | [
"BSD-3-Clause"
] | null | null | null | problem0453.py | kmarcini/Project-Euler-Python | d644e8e1ec4fac70a9ab407ad5e1f0a75547c8d3 | [
"BSD-3-Clause"
] | null | null | null | problem0453.py | kmarcini/Project-Euler-Python | d644e8e1ec4fac70a9ab407ad5e1f0a75547c8d3 | [
"BSD-3-Clause"
] | null | null | null | ###########################
#
# #453 Lattice Quadrilaterals - Project Euler
# https://projecteuler.net/problem=453
#
# Code by Kevin Marciniak
#
###########################
dbfd340b62e57e0106d36b0ee54e60ab171287c1 | 988 | py | Python | Leak #5 - Lost In Translation/windows/Resources/Ops/PyScripts/lib/ops/data/groups.py | bidhata/EquationGroupLeaks | 1ff4bc115cb2bd5bf2ed6bf769af44392926830c | [
"Unlicense"
] | 9 | 2019-11-22T04:58:40.000Z | 2022-02-26T16:47:28.000Z | Python.Fuzzbunch/Resources/Ops/PyScripts/lib/ops/data/groups.py | 010001111/Vx-Suites | 6b4b90a60512cce48aa7b87aec5e5ac1c4bb9a79 | [
"MIT"
] | null | null | null | Python.Fuzzbunch/Resources/Ops/PyScripts/lib/ops/data/groups.py | 010001111/Vx-Suites | 6b4b90a60512cce48aa7b87aec5e5ac1c4bb9a79 | [
"MIT"
] | 8 | 2017-09-27T10:31:18.000Z | 2022-01-08T10:30:46.000Z |
from ops.data import OpsClass, OpsField, DszObject, DszCommandObject, cmd_definitions
import dsz
if ('groups' not in cmd_definitions):
    dszgroup = OpsClass('group', {
        'groupid': OpsField('groupid', dsz.TYPE_INT),
        'group': OpsField('group', dsz.TYPE_STRING),
        'comment': OpsField('comment', dsz.TYPE_STRING),
        'attributes': OpsClass('attributes', {
            'mask': OpsField('mask', dsz.TYPE_STRING),
            'groupmandatory': OpsField('groupmandatory', dsz.TYPE_BOOL),
            'groupenabled': OpsField('groupenabled', dsz.TYPE_BOOL),
            'grouplogonid': OpsField('grouplogonid', dsz.TYPE_BOOL),
            'groupresource': OpsField('groupresource', dsz.TYPE_BOOL),
            'groupenabledbydefault': OpsField('groupenabledbydefault', dsz.TYPE_BOOL),
            'groupusefordenyonly': OpsField('groupusefordenyonly', dsz.TYPE_BOOL),
            'groupowner': OpsField('groupowner', dsz.TYPE_STRING),
        }, DszObject),
    }, DszObject, single=False)
groupscommand = OpsClass('groups', {'group': dszgroup}, DszCommandObject)
    cmd_definitions['groups'] = groupscommand
e008b253bda8c1f9f79ca66ea55b123cac160a25 | 444 | py | Python | tests/unit/interpreters/test_py_spec.py | snsnlou/tox | 036dfaca03a8202be77ccc3ce70e1f1f17ece57c | [
"MIT"
] | 2,811 | 2016-09-17T18:26:04.000Z | 2022-03-31T14:55:53.000Z | tests/unit/interpreters/test_py_spec.py | snsnlou/tox | 036dfaca03a8202be77ccc3ce70e1f1f17ece57c | [
"MIT"
] | 1,816 | 2016-09-17T20:00:01.000Z | 2022-03-31T10:44:42.000Z | tests/unit/interpreters/test_py_spec.py | snsnlou/tox | 036dfaca03a8202be77ccc3ce70e1f1f17ece57c | [
"MIT"
] | 509 | 2016-09-17T20:43:31.000Z | 2022-03-27T04:15:09.000Z | from tox.interpreters.py_spec import PythonSpec
def test_py_3_10():
spec = PythonSpec.from_name("python3.10")
assert (spec.major, spec.minor) == (3, 10)
def test_debug_python():
spec = PythonSpec.from_name("python3.10-dbg")
assert (spec.major, spec.minor) == (None, None)
def test_parse_architecture():
spec = PythonSpec.from_name("python3.10-32")
assert (spec.major, spec.minor, spec.architecture) == (3, 10, 32)
| 26.117647 | 69 | 0.693694 | 64 | 444 | 4.640625 | 0.359375 | 0.070707 | 0.181818 | 0.222222 | 0.555556 | 0.313131 | 0 | 0 | 0 | 0 | 0 | 0.058667 | 0.155405 | 444 | 16 | 70 | 27.75 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0.3 | 1 | 0.3 | false | 0 | 0.1 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e00a68a24df0d95c1be50891d18fe0d7ee72c620 | 636 | py | Python | mypy/test/teststubinfo.py | AlexWaygood/mypy | eb1b1e007e7fd1e976e6dd0f49a71b662a30a2d6 | [
"PSF-2.0"
] | null | null | null | mypy/test/teststubinfo.py | AlexWaygood/mypy | eb1b1e007e7fd1e976e6dd0f49a71b662a30a2d6 | [
"PSF-2.0"
] | null | null | null | mypy/test/teststubinfo.py | AlexWaygood/mypy | eb1b1e007e7fd1e976e6dd0f49a71b662a30a2d6 | [
"PSF-2.0"
] | null | null | null | import unittest
from mypy.stubinfo import is_legacy_bundled_package
class TestStubInfo(unittest.TestCase):
def test_is_legacy_bundled_packages(self) -> None:
assert not is_legacy_bundled_package('foobar_asdf', 2)
assert not is_legacy_bundled_package('foobar_asdf', 3)
assert is_legacy_bundled_package('pycurl', 2)
assert is_legacy_bundled_package('pycurl', 3)
assert is_legacy_bundled_package('scribe', 2)
assert not is_legacy_bundled_package('scribe', 3)
assert not is_legacy_bundled_package('dataclasses', 2)
assert is_legacy_bundled_package('dataclasses', 3)
| 33.473684 | 62 | 0.742138 | 84 | 636 | 5.22619 | 0.309524 | 0.182232 | 0.341686 | 0.451025 | 0.701595 | 0.624146 | 0.261959 | 0.186788 | 0 | 0 | 0 | 0.015326 | 0.179245 | 636 | 18 | 63 | 35.333333 | 0.825671 | 0 | 0 | 0 | 0 | 0 | 0.106918 | 0 | 0 | 0 | 0 | 0 | 0.666667 | 1 | 0.083333 | false | 0 | 0.166667 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e0269fc05bd285ec872ca9d2e66a7bb56e1184f9 | 61 | py | Python | openml/extensions/pytorch/layers/__init__.py | adriansmares/openml-deeplearning | 44856ae5f79b553acdb3001e52ca981922f3093f | [
"BSD-3-Clause"
] | 1 | 2020-08-25T10:55:41.000Z | 2020-08-25T10:55:41.000Z | openml/extensions/pytorch/layers/__init__.py | adriansmares/openml-deeplearning | 44856ae5f79b553acdb3001e52ca981922f3093f | [
"BSD-3-Clause"
] | 8 | 2019-05-23T08:03:24.000Z | 2019-09-20T10:14:43.000Z | openml_pytorch/layers/__init__.py | openml/openml-pytorch | 3e404b5f3124898bb6b925e5e9e467c9fe9cc0b6 | [
"BSD-3-Clause"
] | 2 | 2019-06-19T11:10:47.000Z | 2019-07-08T10:31:01.000Z | from .functional import Functional
__all__ = ['Functional']
| 15.25 | 34 | 0.770492 | 6 | 61 | 7.166667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131148 | 61 | 3 | 35 | 20.333333 | 0.811321 | 0 | 0 | 0 | 0 | 0 | 0.163934 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
e0272e046947fc51b612971111f80ff01aac447e | 48 | py | Python | dataformat-xml-dom/src/test/resources/org/camunda/spin/python/xml/dom/XmlDomAttributeScriptTest.removeAttribute.py | ingorichtsmeier/camunda-spin | f6f929cb4b49f5be3c06fcecf03008fec9fe25c1 | [
"Apache-2.0"
] | 27 | 2015-02-15T22:01:39.000Z | 2022-03-02T05:41:29.000Z | dataformat-xml-dom/src/test/resources/org/camunda/spin/python/xml/dom/XmlDomAttributeScriptTest.removeAttribute.py | ingorichtsmeier/camunda-spin | f6f929cb4b49f5be3c06fcecf03008fec9fe25c1 | [
"Apache-2.0"
] | 101 | 2015-06-05T06:53:56.000Z | 2022-02-28T19:32:44.000Z | dataformat-xml-dom/src/test/resources/org/camunda/spin/python/xml/dom/XmlDomAttributeScriptTest.removeAttribute.py | ingorichtsmeier/camunda-spin | f6f929cb4b49f5be3c06fcecf03008fec9fe25c1 | [
"Apache-2.0"
] | 25 | 2015-05-26T21:28:42.000Z | 2021-07-06T10:04:01.000Z | element = S(input).attr(attributeName).remove()
| 24 | 47 | 0.75 | 6 | 48 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 48 | 1 | 48 | 48 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e0542032fbea72ce75900f097c37cc6b5a8bfe96 | 244 | py | Python | src/py_dss_vis/Line.py | eniovianna/py_dss_vis | c40c1f66aacff632e837aa922a0450d28db0c43b | [
"MIT"
] | null | null | null | src/py_dss_vis/Line.py | eniovianna/py_dss_vis | c40c1f66aacff632e837aa922a0450d28db0c43b | [
"MIT"
] | null | null | null | src/py_dss_vis/Line.py | eniovianna/py_dss_vis | c40c1f66aacff632e837aa922a0450d28db0c43b | [
"MIT"
] | null | null | null | # -*- encoding: utf-8 -*-
"""
Created by Ênio Viana at 15/10/2021 at 22:19:09
Project: py-dss-vis [out, 2021]
"""
class Line:
def __init__(self):
self._name = "Line"
@property
def name(self):
return self._name
| 15.25 | 48 | 0.577869 | 36 | 244 | 3.75 | 0.75 | 0.118519 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106145 | 0.266393 | 244 | 15 | 49 | 16.266667 | 0.648045 | 0.42623 | 0 | 0 | 0 | 0 | 0.030769 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.166667 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
e066c7f36da59ad2ad3be24e31a8669edd1a912b | 1,360 | py | Python | part-3/1-dictionaries/6-custom_classes_hashing.py | boconlonton/python-deep-dive | c01591a4943c7b77d4d2cd90a8b23423280367a3 | [
"MIT"
] | null | null | null | part-3/1-dictionaries/6-custom_classes_hashing.py | boconlonton/python-deep-dive | c01591a4943c7b77d4d2cd90a8b23423280367a3 | [
"MIT"
] | null | null | null | part-3/1-dictionaries/6-custom_classes_hashing.py | boconlonton/python-deep-dive | c01591a4943c7b77d4d2cd90a8b23423280367a3 | [
"MIT"
] | null | null | null | class Person:
def __init__(self, name, age):
self.name = name
self.age = age
def __repr__(self):
        return f'Person(name={self.name}, age={self.age})'
    def __eq__(self, other):
        if isinstance(other, Person):
            return self.name == other.name and self.age == other.age
        return False
def __hash__(self):
return hash((self.name, self.age))
# __hash__ = None
p1 = Person('John', 78)
p2 = Person('Eric', 75)
persons = {p1: 'John obj', p2: 'Eric obj'}
print(p1 is p2) # False
print(p1 == p2) # True
# print(hash(p1))
print(persons[Person('John', 78)])
class Number:
def __init__(self, x):
self.x = x
def __eq__(self, other):
if isinstance(other, Number):
return self.x == other.x
else:
return False
def __hash__(self):
return hash(self.x)
# Usage of Custom hashes
class Point:
def __init__(self, x, y):
self.x = x
self.y = y
def __repr__(self):
return f'{self.x}, {self.y}'
def __eq__(self, other):
if isinstance(other, Point):
return self.x == other.x and self.y == other.y
else:
return False
def __hash__(self):
return hash((self.x, self.y))
points = {
Point(0, 0): 'origin',
Point(1, 1): 'second pt'
}
print(points[Point(0, 0)])
| 20 | 68 | 0.555147 | 189 | 1,360 | 3.740741 | 0.232804 | 0.063649 | 0.046676 | 0.059406 | 0.381895 | 0.282885 | 0.247525 | 0.115983 | 0.115983 | 0.115983 | 0 | 0.022082 | 0.300735 | 1,360 | 67 | 69 | 20.298507 | 0.721346 | 0.047794 | 0 | 0.311111 | 0 | 0 | 0.07758 | 0.018619 | 0 | 0 | 0 | 0 | 0 | 1 | 0.244444 | false | 0 | 0 | 0.111111 | 0.533333 | 0.088889 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
e069dfc6fb3190477f486b122ea0b54c32459ecb | 597 | py | Python | HW4_GraphProblem/Node.py | cmfeng/CEE505 | 8974562f01461d1f8113e3abb88c704394c2b370 | [
"MIT"
] | null | null | null | HW4_GraphProblem/Node.py | cmfeng/CEE505 | 8974562f01461d1f8113e3abb88c704394c2b370 | [
"MIT"
] | null | null | null | HW4_GraphProblem/Node.py | cmfeng/CEE505 | 8974562f01461d1f8113e3abb88c704394c2b370 | [
"MIT"
] | null | null | null | '''
Created on Apr 24, 2009
@author: Peter
'''
from Vector import *
class Node(object):
def __init__(self, id, v):
self.id = id
self.node = Vector(v)
self.lines = []
def getPosition(self):
return self.node
def getID(self):
return self.id
def attach(self, lineID):
self.lines.append(lineID)
def detach(self, lineID):
self.lines.remove(lineID)
def __str__(self):
        return str(self.id) + str(self.node) + str(self.lines)
def getAttachedLines(self):
        return self.lines
e06ac6c6b4f34bafc88da3eecca3147d137d4907 | 16,730 | py | Python | tests/unit/test_filters.py | shiratamu/pushwoosh-python-lib | da05d7b72729ebfc65a7ab0b08c9009632a38833 | [
"MIT"
] | 18 | 2015-01-08T19:51:42.000Z | 2021-11-12T11:42:18.000Z | tests/unit/test_filters.py | shiratamu/pushwoosh-python-lib | da05d7b72729ebfc65a7ab0b08c9009632a38833 | [
"MIT"
] | 7 | 2015-03-08T09:01:03.000Z | 2017-11-13T05:26:21.000Z | tests/unit/test_filters.py | shiratamu/pushwoosh-python-lib | da05d7b72729ebfc65a7ab0b08c9009632a38833 | [
"MIT"
] | 18 | 2015-02-17T03:40:54.000Z | 2021-11-25T02:26:44.000Z | from datetime import datetime
import unittest
from pypushwoosh import constants
from pypushwoosh.exceptions import PushwooshFilterInvalidOperatorException, PushwooshFilterInvalidOperandException
from pypushwoosh.filter import ApplicationFilter, ApplicationGroupFilter, IntegerTagFilter, StringTagFilter, \
    ListTagFilter, DateTagFilter, DaysTagFilter, IntegerTagFilterByApplication, StringTagFilterByApplication, \
    DateTagFilterByApplication, DaysTagFilterByApplication, BooleanTagFilter, BooleanTagFilterByApplication
HTTP_200_OK = 200
STATUS_OK = 'OK'
class TestApplicationFilter(unittest.TestCase):
    def setUp(self):
        self.test_code = '0000-0000'
        self.prefix = 'A'
        self.pwfilter = ApplicationFilter

    def test_valid_filter(self):
        expected_result = '%s("%s")' % (self.prefix, self.test_code)
        result = self.pwfilter(self.test_code)
        self.assertEqual(str(result), expected_result)

    def test_valid_filter_with_platforms(self):
        expected_result = '%s("%s", ["%s", "%s"])' % (self.prefix,
                                                      self.test_code,
                                                      constants.PLATFORM_NAMES[constants.PLATFORM_IOS],
                                                      constants.PLATFORM_NAMES[constants.PLATFORM_ANDROID])
        result = self.pwfilter(self.test_code, [constants.PLATFORM_IOS, constants.PLATFORM_ANDROID])
        self.assertEqual(str(result), expected_result)

    def test_filter_invalid_platform(self):
        with self.assertRaises(TypeError):
            self.pwfilter(self.test_code, 'Invalid Platform')
class TestApplicationGroupFilter(TestApplicationFilter):
    def setUp(self):
        self.test_code = '0000-0000'
        self.prefix = 'G'
        self.pwfilter = ApplicationGroupFilter
class TestInvalidOperatorForOperand(unittest.TestCase):
    def setUp(self):
        self.pwfilter = IntegerTagFilter
        self.tag_name = 'testInt'

    def filter_with_invalid_operator_for_operand(self, value, operator):
        args = [self.tag_name, operator, value]
        self.assertRaises(PushwooshFilterInvalidOperandException, self.pwfilter, *args)

    def test_invalid_operator_type(self):
        self.filter_with_invalid_operator_for_operand([1, 2], constants.TAG_FILTER_OPERATOR_GTE)
        self.filter_with_invalid_operator_for_operand([1, 2], constants.TAG_FILTER_OPERATOR_LTE)
        self.filter_with_invalid_operator_for_operand(1, constants.TAG_FILTER_OPERATOR_BETWEEN)
        self.filter_with_invalid_operator_for_operand('1', constants.TAG_FILTER_OPERATOR_BETWEEN)
        self.filter_with_invalid_operator_for_operand(1, constants.TAG_FILTER_OPERATOR_IN)
        self.filter_with_invalid_operator_for_operand('1', constants.TAG_FILTER_OPERATOR_IN)
class TestInvalidOperand(unittest.TestCase):
    def filter_with_invalid_operand_type(self, value, operator, tag_name):
        args = [tag_name, operator, value]
        self.assertRaises(PushwooshFilterInvalidOperandException, self.pwfilter, *args)

    def test_invalid_operand_type_int(self):
        tag_name = 'testInt'
        str_value = 'Invalid value for int'
        between_value = ['invalid_min', 'invalid_max']
        in_value = ['1', '2', '3']
        self.pwfilter = IntegerTagFilter
        self.filter_with_invalid_operand_type(str_value, constants.TAG_FILTER_OPERATOR_EQ, tag_name)
        self.filter_with_invalid_operand_type(between_value, constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)
        self.filter_with_invalid_operand_type(in_value, constants.TAG_FILTER_OPERATOR_IN, tag_name)

    def test_invalid_operand_type_str(self):
        tag_name = 'testString'
        list_value_in = [[1, 2], [1, 2]]
        self.pwfilter = StringTagFilter
        self.filter_with_invalid_operand_type(list_value_in, constants.TAG_FILTER_OPERATOR_IN, tag_name)

    def test_invalid_operand_type_list(self):
        tag_name = 'testString'
        list_value_in = [[1, 2], [1, 2]]
        self.pwfilter = ListTagFilter
        self.filter_with_invalid_operand_type(list_value_in, constants.TAG_FILTER_OPERATOR_IN, tag_name)

    def test_invalid_operand_type_date(self):
        tag_name = 'testDate'
        self.pwfilter = DateTagFilter
        self.filter_with_invalid_operand_type(1, constants.TAG_FILTER_OPERATOR_EQ, tag_name)
        self.filter_with_invalid_operand_type([1, 'str'], constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)
        self.filter_with_invalid_operand_type(['str', 'str', 1], constants.TAG_FILTER_OPERATOR_IN, tag_name)

    def test_invalid_operand_type_days(self):
        tag_name = 'testDays'
        str_value = 'Invalid value for days'
        between_value = ['invalid_min', 'invalid_max']
        self.pwfilter = DaysTagFilter
        self.filter_with_invalid_operand_type(str_value, constants.TAG_FILTER_OPERATOR_EQ, tag_name)
        self.filter_with_invalid_operand_type(between_value, constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)
class TestInvalidOperator(unittest.TestCase):
    def filter_with_invalid_operator(self, value, operator, tag_name):
        args = [tag_name, operator, value]
        self.assertRaises(PushwooshFilterInvalidOperatorException, self.pwfilter, *args)

    def test_invalid_operator(self):
        tag_name = 'testInt'
        value = [1, 2, 3]
        self.pwfilter = IntegerTagFilter
        self.filter_with_invalid_operator(value, 'Invalid Operator', tag_name)

    def test_invalid_operator_type_str(self):
        tag_name = 'testString'
        value = 1
        list_value = ['123', 'asd']
        self.pwfilter = StringTagFilter
        self.filter_with_invalid_operator(value, constants.TAG_FILTER_OPERATOR_GTE, tag_name)
        self.filter_with_invalid_operator(value, constants.TAG_FILTER_OPERATOR_LTE, tag_name)
        self.filter_with_invalid_operator(list_value, constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)

    def test_invalid_operator_type_list(self):
        tag_name = 'testList'
        value = 1
        list_value = ['123', 'asd']
        self.pwfilter = ListTagFilter
        self.filter_with_invalid_operator(value, constants.TAG_FILTER_OPERATOR_GTE, tag_name)
        self.filter_with_invalid_operator(value, constants.TAG_FILTER_OPERATOR_LTE, tag_name)
        self.filter_with_invalid_operator(list_value, constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)

    def test_invalid_operator_type_boolean(self):
        tag_name = 'testBool'
        value = 1
        list_value = [0, 1]
        self.pwfilter = BooleanTagFilter
        self.filter_with_invalid_operator(value, constants.TAG_FILTER_OPERATOR_GTE, tag_name)
        self.filter_with_invalid_operator(value, constants.TAG_FILTER_OPERATOR_LTE, tag_name)
        self.filter_with_invalid_operator(list_value, constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)
        self.filter_with_invalid_operator(list_value, constants.TAG_FILTER_OPERATOR_IN, tag_name)
class TestInvalidOperandLength(unittest.TestCase):
    def filter_with_invalid_operator_len(self, value, operator, tag_name):
        args = [tag_name, operator, value]
        self.assertRaises(PushwooshFilterInvalidOperandException, self.pwfilter, *args)

    def test_invalid_len_in(self):
        tag_name = 'testStr'
        value = []
        self.pwfilter = ListTagFilter
        self.filter_with_invalid_operator_len(value, constants.TAG_FILTER_OPERATOR_IN, tag_name)

    def test_invalid_len_between(self):
        tag_name = 'testInt'
        value_lt = [1]
        value_gt = [1, 2, 3]
        self.pwfilter = IntegerTagFilter
        self.filter_with_invalid_operator_len(value_lt, constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)
        self.filter_with_invalid_operator_len(value_gt, constants.TAG_FILTER_OPERATOR_BETWEEN, tag_name)
class TestIntegerTagFilter(unittest.TestCase):
    def setUp(self):
        self.pwfilter = IntegerTagFilter
        self.tag_name = 'testInt'

    def test_valid_filter(self):
        expected_result = 'T("%s", %s, 1)' % (self.tag_name, constants.TAG_FILTER_OPERATOR_GTE)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_GTE, 1)
        self.assertEqual(expected_result, str(result))

    def test_valid_filter_between_operator(self):
        expected_result = 'T("%s", %s, [1, 2])' % (self.tag_name, constants.TAG_FILTER_OPERATOR_BETWEEN)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_BETWEEN, [1, 2])
        self.assertEqual(expected_result, str(result))

    def test_valid_filter_in_operator(self):
        expected_result = 'T("%s", %s, [1, 2, 3])' % (self.tag_name, constants.TAG_FILTER_OPERATOR_IN)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_IN, [1, 2, 3])
        self.assertEqual(expected_result, str(result))
class TestStringTagFilter(unittest.TestCase):
    pwfilter = StringTagFilter
    tag_name = 'testStr'

    def test_valid_filter(self):
        expected_result = 'T("%s", %s, "test value")' % (self.tag_name, constants.TAG_FILTER_OPERATOR_EQ)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, 'test value')
        self.assertEqual(expected_result, str(result))

    def test_valid_filter_in_operator(self):
        expected_result = 'T("%s", %s, ["1", "2"])' % (self.tag_name, constants.TAG_FILTER_OPERATOR_IN)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_IN, ['1', '2'])
        self.assertEqual(expected_result, str(result))
class TestListTagFilter(unittest.TestCase):
    def setUp(self):
        self.pwfilter = ListTagFilter
        self.tag_name = 'testList'

    def test_valid_filter_in_operator(self):
        expected_result = 'T("%s", %s, [1, 2, "2"])' % (self.tag_name, constants.TAG_FILTER_OPERATOR_IN)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_IN, [1, 2, '2'])
        self.assertEqual(expected_result, str(result))
class TestDateTagFilter(unittest.TestCase):
    def setUp(self):
        self.pwfilter = DateTagFilter
        self.tag_name = 'testDate'

    def invalid_date_format(self, operator, value):
        args = [self.tag_name, operator, value]
        self.assertRaises(PushwooshFilterInvalidOperandException, self.pwfilter, *args)

    def test_valid_filter(self):
        values = [
            '2014-12-05 22:22:22',
            '2014-12-05 22:22',
            '2014-12-05',
            '2014-12-05T22:22:22',
            '2014-12-05T22:22',
        ]
        for value in values:
            expected_result = 'T("%s", %s, "%s")' % (self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
            result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
            self.assertEqual(str(result), expected_result)

    def test_valid_filter_between(self):
        value = ['2013-06-22 00:00:00', '2013-06-25']
        expected_result = 'T("%s", %s, ["%s", "%s"])' % (self.tag_name, constants.TAG_FILTER_OPERATOR_BETWEEN, value[0], value[1])
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_BETWEEN, value)
        self.assertEqual(str(result), expected_result)

    def test_valid_datetime_object(self):
        value = datetime.strptime('2013-06-22 00:00:00', '%Y-%m-%d %H:%M:%S')
        expected_result = 'T("%s", %s, "%s")' % (self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        self.assertEqual(str(result), expected_result)

    def test_invalid_date_format(self):
        self.invalid_date_format(constants.TAG_FILTER_OPERATOR_GTE, '2')
        self.invalid_date_format(constants.TAG_FILTER_OPERATOR_BETWEEN, ['2013-06-25', '1'])
class TestDaysTagFilter(unittest.TestCase):
    def setUp(self):
        self.pwfilter = DaysTagFilter
        self.tag_name = 'testDays'

    def test_valid_filter(self):
        value = 1
        expected_result = 'T("%s", %s, %d)' % (self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        self.assertEqual(str(result), expected_result)

    def test_valid_filter_between(self):
        value = [1, 3]
        expected_result = 'T("%s", %s, [%s, %s])' % (self.tag_name, constants.TAG_FILTER_OPERATOR_BETWEEN, value[0], value[1])
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_BETWEEN, value)
        self.assertEqual(str(result), expected_result)

    def test_invalid_days(self):
        args = [self.tag_name, constants.TAG_FILTER_OPERATOR_BETWEEN, [-1, 3]]
        self.assertRaises(PushwooshFilterInvalidOperandException, self.pwfilter, *args)
class TestBooleanTagFilter(unittest.TestCase):
    def setUp(self):
        self.pwfilter = BooleanTagFilter
        self.tag_name = 'testBool'

    def invalid_boolean(self, operator, value):
        args = [self.tag_name, operator, value]
        self.assertRaises(PushwooshFilterInvalidOperandException, self.pwfilter, *args)

    def test_valid_filter_in_operator(self):
        expected_result = 'T("%s", %s, "true")' % (self.tag_name, constants.TAG_FILTER_OPERATOR_EQ)
        result = self.pwfilter(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, 'true')
        self.assertEqual(expected_result, str(result))

    def test_invalid_boolean(self):
        self.invalid_boolean(constants.TAG_FILTER_OPERATOR_EQ, 'invalid value')
        self.invalid_boolean(constants.TAG_FILTER_OPERATOR_EQ, 2)
class TestOperatorsFilter(unittest.TestCase):
    def test_valid_filter(self):
        app_code = '0000-0000'
        tag_name = 'test_string'
        tag_value = 'test value'
        expected_result = '(((A("%s") + T("%s", EQ, "%s")) * A("%s")) \ T("%s", EQ, "%s"))' % \
                          (app_code, tag_name, tag_value, app_code, tag_name, tag_value)
        tag_filter = StringTagFilter(tag_name, constants.TAG_FILTER_OPERATOR_EQ, tag_value)
        app_filter = ApplicationFilter(app_code)
        union_filter = app_filter.union(tag_filter)
        intersect_filter = union_filter.intersect(app_filter)
        subtract_filter = intersect_filter.subtract(tag_filter)
        self.assertEqual(str(subtract_filter), expected_result)
class TestApplicationTagFilter(unittest.TestCase):
    def setUp(self):
        self.tag_name = 'testApplicationTag'
        self.code = '0000-0000'

    def test_valid_filter_int(self):
        value = 1
        expected_result = 'AT("%s", "%s", %s, %d)' % (self.code, self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        result = IntegerTagFilterByApplication(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value, self.code)
        self.assertEqual(str(result), expected_result)

    def test_valid_filter_string(self):
        value = 1
        expected_result = 'AT("%s", "%s", %s, %d)' % (self.code, self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        result = StringTagFilterByApplication(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value, self.code)
        self.assertEqual(str(result), expected_result)

    def test_valid_filter_list(self):
        expected_result = 'AT("%s", "%s", %s, [1, 2])' % (self.code, self.tag_name, constants.TAG_FILTER_OPERATOR_EQ)
        result = DaysTagFilterByApplication(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, [1, 2], self.code)
        self.assertEqual(str(result), expected_result)

    def test_valid_filter_days(self):
        value = 1
        expected_result = 'AT("%s", "%s", %s, %d)' % (self.code, self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        result = DaysTagFilterByApplication(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value, self.code)
        self.assertEqual(str(result), expected_result)

    def test_valid_filter_date(self):
        value = '2014-02-02 00:15:10'
        expected_result = 'AT("%s", "%s", %s, "%s")' % (self.code, self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        result = DateTagFilterByApplication(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value, self.code)
        self.assertEqual(str(result), expected_result)

    def test_valid_filter_boolean(self):
        value = 'False'
        expected_result = 'AT("%s", "%s", %s, "%s")' % (self.code, self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value)
        result = BooleanTagFilterByApplication(self.tag_name, constants.TAG_FILTER_OPERATOR_EQ, value, self.code)
        self.assertEqual(str(result), expected_result) | 44.494681 | 130 | 0.704782 | 2,042 | 16,730 | 5.403036 | 0.067581 | 0.059005 | 0.115834 | 0.167316 | 0.784827 | 0.745944 | 0.705157 | 0.651772 | 0.596846 | 0.592677 | 0 | 0.017452 | 0.188285 | 16,730 | 376 | 131 | 44.494681 | 0.794993 | 0 | 0 | 0.435714 | 0 | 0.003571 | 0.064312 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 1 | 0.185714 | false | 0 | 0.017857 | 0 | 0.260714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
0ed1b9a5ebc494baa4b2025d19ac9bc600628af2 | 381 | py | Python | experimental/sma_database/calculate_smooth.py | Mecanon/morphing_wing | 28d027de4bca755d22345071b8d5abafff1c8778 | [
"MIT"
] | 3 | 2016-02-12T13:17:03.000Z | 2021-02-08T12:44:01.000Z | experimental/sma_database/calculate_smooth.py | Mecanon/morphing_wing | 28d027de4bca755d22345071b8d5abafff1c8778 | [
"MIT"
] | null | null | null | experimental/sma_database/calculate_smooth.py | Mecanon/morphing_wing | 28d027de4bca755d22345071b8d5abafff1c8778 | [
"MIT"
] | 1 | 2021-02-08T12:45:08.000Z | 2021-02-08T12:45:08.000Z | # -*- coding: utf-8 -*-
"""
Created on Fri Jul 15 10:07:53 2016
@author: Pedro Leal
"""
import numpy as np
n_1 = 0.2
n_2 = 0.2
n_3 = 0.2
n_4 = 0.2
Ms_list = np.array([ 68.74, 75.71, 82.33, 84.77, 88.27])
Mf_list = np.array([ 57.74, 65.39, 71.29, 74.07, 77.88])
As_list = np.array([ 78.47, 83.82, 88.81, 91.38, 94.78])
Af_list = np.array([ 88.75, 95.02, 102.15, 105.12, 108.85]) | 22.411765 | 59 | 0.593176 | 91 | 381 | 2.395604 | 0.637363 | 0.036697 | 0.201835 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.345048 | 0.178478 | 381 | 17 | 59 | 22.411765 | 0.351438 | 0.207349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
0ed69e566ad4863c16e2fb22f8ad41f1014f3e10 | 587 | py | Python | doc/examples/scope_class.py | barraponto/pytest-dependency | cab2f65ced816939a9041b9e67169073ef0ee412 | [
"Apache-2.0"
] | 91 | 2017-01-30T16:05:13.000Z | 2022-03-29T12:17:35.000Z | doc/examples/scope_class.py | ftesser/pytest-dependency | 1e01358b272ef7d1b2bae0eb7c9d4e33f657da35 | [
"Apache-2.0"
] | 63 | 2016-04-21T19:30:32.000Z | 2022-03-30T13:17:42.000Z | doc/examples/scope_class.py | ftesser/pytest-dependency | 1e01358b272ef7d1b2bae0eb7c9d4e33f657da35 | [
"Apache-2.0"
] | 29 | 2017-09-24T17:22:02.000Z | 2022-03-30T20:39:49.000Z | import pytest
@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_a():
    assert False


class TestClass1(object):

    @pytest.mark.dependency()
    def test_b(self):
        pass


class TestClass2(object):

    @pytest.mark.dependency()
    def test_a(self):
        pass

    @pytest.mark.dependency(depends=["test_a"])
    def test_c(self):
        pass

    @pytest.mark.dependency(depends=["test_a"], scope='class')
    def test_d(self):
        pass

    @pytest.mark.dependency(depends=["test_b"], scope='class')
    def test_e(self):
        pass
| 17.787879 | 62 | 0.635434 | 75 | 587 | 4.853333 | 0.333333 | 0.192308 | 0.32967 | 0.148352 | 0.508242 | 0.508242 | 0.326923 | 0.21978 | 0 | 0 | 0 | 0.004367 | 0.219762 | 587 | 32 | 63 | 18.34375 | 0.790393 | 0 | 0 | 0.363636 | 0 | 0 | 0.073254 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 1 | 0.272727 | false | 0.227273 | 0.045455 | 0 | 0.409091 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
0edb1d732d37ad24af1ca3668a1c4bccae18cad2 | 6,358 | py | Python | tests/test_rand_weighted_cropd.py | davidiommi/MONAI | c470c1a67b33d7dbbce0f8b8c5ffdad84b76d60f | [
"Apache-2.0"
] | 1 | 2022-01-04T21:38:23.000Z | 2022-01-04T21:38:23.000Z | tests/test_rand_weighted_cropd.py | davidiommi/MONAI | c470c1a67b33d7dbbce0f8b8c5ffdad84b76d60f | [
"Apache-2.0"
] | null | null | null | tests/test_rand_weighted_cropd.py | davidiommi/MONAI | c470c1a67b33d7dbbce0f8b8c5ffdad84b76d60f | [
"Apache-2.0"
] | null | null | null | # Copyright 2020 - 2021 MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
import numpy as np
from monai.transforms.croppad.dictionary import RandWeightedCropd
from tests.utils import NumpyImageTestCase2D, NumpyImageTestCase3D
class TestRandWeightedCrop(NumpyImageTestCase2D):
    def test_rand_weighted_crop_small_roi(self):
        img = self.seg1[0]
        n_samples = 3
        crop = RandWeightedCropd("img", "w", (10, 12), n_samples)
        weight = np.zeros_like(img)
        weight[0, 30, 17] = 1.1
        weight[0, 40, 31] = 1
        weight[0, 80, 21] = 1
        crop.set_random_state(10)
        d = {"img": img, "w": weight}
        result = crop(d)
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["img"].shape, (1, 10, 12))
        np.testing.assert_allclose(np.asarray(crop.centers), [[80, 21], [30, 17], [40, 31]])

    def test_rand_weighted_crop_default_roi(self):
        img = self.imt[0]
        n_samples = 3
        crop = RandWeightedCropd("im", "weight", (10, -1), n_samples, "coords")
        weight = np.zeros_like(img)
        weight[0, 30, 17] = 1.1
        weight[0, 40, 31] = 1
        weight[0, 80, 21] = 1
        crop.set_random_state(10)
        data = {"im": img, "weight": weight, "others": np.nan}
        result = crop(data)
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["im"].shape, (1, 10, 64))
        np.testing.assert_allclose(np.asarray(crop.centers), [[14, 32], [105, 32], [20, 32]])
        np.testing.assert_allclose(result[1]["coords"], [105, 32])

    def test_rand_weighted_crop_large_roi(self):
        img = self.segn[0]
        n_samples = 3
        crop = RandWeightedCropd(("img", "seg"), "weight", (10000, 400), n_samples, "location")
        weight = np.zeros_like(img)
        weight[0, 30, 17] = 1.1
        weight[0, 10, 1] = 1
        crop.set_random_state(10)
        data = {"img": img, "seg": self.imt[0], "weight": weight}
        result = crop(data)
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["img"].shape, (1, 128, 64))
        np.testing.assert_allclose(result[0]["seg"].shape, (1, 128, 64))
        np.testing.assert_allclose(np.asarray(crop.centers), [[64, 32], [64, 32], [64, 32]])
        np.testing.assert_allclose(result[1]["location"], [64, 32])

    def test_rand_weighted_crop_bad_w(self):
        img = self.imt[0]
        n_samples = 3
        crop = RandWeightedCropd(("img", "seg"), "w", (20, 40), n_samples)
        weight = np.zeros_like(img)
        weight[0, 30, 17] = np.inf
        weight[0, 10, 1] = -np.inf
        weight[0, 10, 20] = -np.nan
        crop.set_random_state(10)
        result = crop({"img": img, "seg": self.segn[0], "w": weight})
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["img"].shape, (1, 20, 40))
        np.testing.assert_allclose(result[0]["seg"].shape, (1, 20, 40))
        np.testing.assert_allclose(np.asarray(crop.centers), [[63, 37], [31, 43], [66, 20]])
class TestRandWeightedCrop3D(NumpyImageTestCase3D):
    def test_rand_weighted_crop_small_roi(self):
        img = self.seg1[0]
        n_samples = 3
        crop = RandWeightedCropd("img", "w", (8, 10, 12), n_samples)
        weight = np.zeros_like(img)
        weight[0, 5, 30, 17] = 1.1
        weight[0, 8, 40, 31] = 1
        weight[0, 11, 23, 21] = 1
        crop.set_random_state(10)
        result = crop({"img": img, "w": weight})
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["img"].shape, (1, 8, 10, 12))
        np.testing.assert_allclose(np.asarray(crop.centers), [[11, 23, 21], [5, 30, 17], [8, 40, 31]])

    def test_rand_weighted_crop_default_roi(self):
        img = self.imt[0]
        n_samples = 3
        crop = RandWeightedCropd(("img", "seg"), "w", (10, -1, -1), n_samples)
        weight = np.zeros_like(img)
        weight[0, 7, 17] = 1.1
        weight[0, 13, 31] = 1.1
        weight[0, 24, 21] = 1
        crop.set_random_state(10)
        result = crop({"img": img, "seg": self.segn[0], "w": weight})
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["img"].shape, (1, 10, 64, 80))
        np.testing.assert_allclose(result[0]["seg"].shape, (1, 10, 64, 80))
        np.testing.assert_allclose(np.asarray(crop.centers), [[14, 32, 40], [41, 32, 40], [20, 32, 40]])

    def test_rand_weighted_crop_large_roi(self):
        img = self.segn[0]
        n_samples = 3
        crop = RandWeightedCropd("img", "w", (10000, 400, 80), n_samples)
        weight = np.zeros_like(img)
        weight[0, 30, 17, 20] = 1.1
        weight[0, 10, 1, 17] = 1
        crop.set_random_state(10)
        result = crop({"img": img, "w": weight})
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["img"].shape, (1, 48, 64, 80))
        np.testing.assert_allclose(np.asarray(crop.centers), [[24, 32, 40], [24, 32, 40], [24, 32, 40]])

    def test_rand_weighted_crop_bad_w(self):
        img = self.imt[0]
        n_samples = 3
        crop = RandWeightedCropd(("img", "seg"), "w", (48, 64, 80), n_samples)
        weight = np.zeros_like(img)
        weight[0, 30, 17] = np.inf
        weight[0, 10, 1] = -np.inf
        weight[0, 10, 20] = -np.nan
        crop.set_random_state(10)
        result = crop({"img": img, "seg": self.segn[0], "w": weight})
        self.assertTrue(len(result) == n_samples)
        np.testing.assert_allclose(result[0]["img"].shape, (1, 48, 64, 80))
        np.testing.assert_allclose(result[0]["seg"].shape, (1, 48, 64, 80))
        np.testing.assert_allclose(np.asarray(crop.centers), [[24, 32, 40], [24, 32, 40], [24, 32, 40]])
if __name__ == "__main__":
    unittest.main()
| 43.848276 | 104 | 0.600818 | 924 | 6,358 | 4.006494 | 0.168831 | 0.051864 | 0.089141 | 0.136683 | 0.734468 | 0.728255 | 0.719611 | 0.693409 | 0.676391 | 0.632631 | 0 | 0.092497 | 0.234822 | 6,358 | 144 | 105 | 44.152778 | 0.668448 | 0.088078 | 0 | 0.605042 | 0 | 0 | 0.031623 | 0 | 0 | 0 | 0 | 0 | 0.252101 | 1 | 0.067227 | false | 0 | 0.033613 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
1618c7abe1fc1d5d5b873b04cc79103c6ea2aaa3 | 307 | py | Python | utils/__init__.py | akarapun/elearning | fe116d5815925269819061ea183cbfdb773844cf | [
"MIT"
] | 1 | 2020-03-14T11:00:14.000Z | 2020-03-14T11:00:14.000Z | utils/__init__.py | akarapun/elearning | fe116d5815925269819061ea183cbfdb773844cf | [
"MIT"
] | null | null | null | utils/__init__.py | akarapun/elearning | fe116d5815925269819061ea183cbfdb773844cf | [
"MIT"
] | null | null | null | from .isAllowAccess import isAllowAccess
def validObjectAttr(obj, attrName):
    # Return the named attribute if present; the original subscripted obj
    # with the literal string 'attrName' instead of using getattr.
    if hasattr(obj, attrName):
        return getattr(obj, attrName)
    else:
        return ''


def log(arg):
    print('\n')
    print("log --> ##################")
    print(arg)
    print("##################")
    print('\n')
| 20.466667 | 40 | 0.501629 | 29 | 307 | 5.310345 | 0.517241 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2443 | 307 | 14 | 41 | 21.928571 | 0.663793 | 0 | 0 | 0.166667 | 0 | 0 | 0.18241 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.083333 | 0 | 0.416667 | 0.416667 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
1647d816c96d99861baebc021e235eb197081ece | 3,403 | py | Python | bin/python/analyze-sensitivity-statusQuo.py | narslab/project-groundwork | 1d6ade695712bdd6ce773da7726b6c78a352ccff | [
"MIT"
] | null | null | null | bin/python/analyze-sensitivity-statusQuo.py | narslab/project-groundwork | 1d6ade695712bdd6ce773da7726b6c78a352ccff | [
"MIT"
] | null | null | null | bin/python/analyze-sensitivity-statusQuo.py | narslab/project-groundwork | 1d6ade695712bdd6ce773da7726b6c78a352ccff | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Mon Sep 27 20:00:33 2021
@author: Mahsa
"""
import analyze_sensitivity
import pandas as pd
#sensitivity results for statusQuo strategy
output1=analyze_sensitivity.run_sensitivity_statusQuo('r',10)
output2=analyze_sensitivity.run_sensitivity_statusQuo('r',-10)
output3=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',10,'corridor_length')
output4=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',-10,'corridor_length')
output5=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',10,'corridor_length')
output6=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',-10,'corridor_length')
output7=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',10,'replcost')
output8=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',-10,'replcost')
output9=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',10,'replcost')
output10=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',-10,'replcost')
output11=analyze_sensitivity.run_sensitivity_statusQuo('overhead_proportion',10)
output12=analyze_sensitivity.run_sensitivity_statusQuo('overhead_proportion',-10)
output13=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',10,'lifespan')
output14=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',-10,'lifespan')
output15=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',10,'lifespan')
output16=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',-10,'lifespan')
output17=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',10,'om_percentage_replcost')
output18=analyze_sensitivity.run_sensitivity_statusQuo('overhead_line',-10,'om_percentage_replcost')
output19=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',10,'om_percentage_replcost')
output20=analyze_sensitivity.run_sensitivity_statusQuo('underground_line',-10,'om_percentage_replcost')
output21=analyze_sensitivity.run_sensitivity_statusQuo('length_shape',+10)
output22=analyze_sensitivity.run_sensitivity_statusQuo('length_shape',-10)
output23=analyze_sensitivity.run_sensitivity_statusQuo('length_scale',+10)
output24=analyze_sensitivity.run_sensitivity_statusQuo('length_scale',-10)
output25=analyze_sensitivity.run_sensitivity_statusQuo('age_shape',+10)
output26=analyze_sensitivity.run_sensitivity_statusQuo('age_shape',-10)
output27=analyze_sensitivity.run_sensitivity_statusQuo('age_scale',+10)
output28=analyze_sensitivity.run_sensitivity_statusQuo('age_scale',-10)
dataframe_sensitivity_statusQuo = pd.DataFrame(
    {'Parameter': ['r', 'over_corridor_length', 'under_corridor_length', 'over_replcost', 'under_replcost', 'overhead_proportion', 'over_lifespan', 'under_lifespan', 'over_om_percentage_replcost', 'under_om_percentage_replcost', 'length_shape', 'length_scale', 'age_shape', 'age_scale'],
     '%change in cost resulted from +%10 change in parameter': [output1, output3, output5, output7, output9, output11, output13, output15, output17, output19, output21, output23, output25, output27],
     '%change in cost resulted from -%10 change in parameter': [output2, output4, output6, output8, output10, output12, output14, output16, output18, output20, output22, output24, output26, output28]})
dataframe_sensitivity_statusQuo.to_csv(r'../../results/outcomes/sensitivity-result-statusquo.csv', index = False)
| 70.895833 | 314 | 0.838672 | 403 | 3,403 | 6.717122 | 0.205955 | 0.221648 | 0.217215 | 0.330994 | 0.672331 | 0.672331 | 0.672331 | 0.639823 | 0.439601 | 0.217215 | 0 | 0.051385 | 0.04496 | 3,403 | 47 | 315 | 72.404255 | 0.781538 | 0.034088 | 0 | 0 | 0 | 0 | 0.29051 | 0.066829 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
1659f62c33ec72d7be5aea496bd82fdb74bbf471 | 1,291 | py | Python | setup.py | transfluxus/jsonpath-ng | 3e2dd9ec404de61ed56ce4d8af99c2612fd540e6 | [
"Apache-2.0"
] | 1 | 2021-06-26T01:50:10.000Z | 2021-06-26T01:50:10.000Z | setup.py | transfluxus/jsonpath-ng | 3e2dd9ec404de61ed56ce4d8af99c2612fd540e6 | [
"Apache-2.0"
] | null | null | null | setup.py | transfluxus/jsonpath-ng | 3e2dd9ec404de61ed56ce4d8af99c2612fd540e6 | [
"Apache-2.0"
] | null | null | null | import io
import setuptools
setuptools.setup(
    name='jsonpath-ng',
    version='1.5.2',
    description=(
        'A final implementation of JSONPath for Python that aims to be '
        'standard compliant, including arithmetic and binary comparison '
        'operators and providing clear AST for metaprogramming.'
    ),
    author='Tomas Aparicio',
    author_email='tomas@aparicio.me',
    url='https://github.com/h2non/jsonpath-ng',
    license='Apache 2.0',
    long_description=io.open('README.rst', encoding='utf-8').read(),
    packages=['jsonpath_ng', 'jsonpath_ng.bin', 'jsonpath_ng.ext'],
    entry_points={
        'console_scripts': [
            'jsonpath_ng=jsonpath_ng.bin.jsonpath:entry_point'
        ],
    },
    test_suite='tests',
    install_requires=[
        'ply', 'decorator', 'six'
    ],
    classifiers=[
        'Development Status :: 5 - Production/Stable',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: Apache Software License',
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
    ],
)
import math
def distance_squared(a, b):
    # sum of squared coordinate differences (squared Euclidean distance)
    return sum((a[i] - b[i]) ** 2 for i in range(len(a)))

def nearest_neighbor(data, current_value):
    # data maps points to labels; return the label of the closest point
    return min(map(lambda kv: (distance_squared(current_value, kv[0]), kv[1]), data.items()))[1]
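With the distance actually squared and the helper name matching its call site, the classifier can be exercised end to end. A standalone sketch (the training points and labels here are made up for illustration):

```python
def distance_squared(a, b):
    # sum of squared coordinate differences (squared Euclidean distance)
    return sum((a[i] - b[i]) ** 2 for i in range(len(a)))

def nearest_neighbor(data, current_value):
    # data maps points (tuples) to labels; pick the label of the closest point
    return min((distance_squared(current_value, point), label)
               for point, label in data.items())[1]

training = {(0, 0): 'walk', (10, 10): 'run'}  # illustrative data
print(nearest_neighbor(training, (2, 1)))     # (0, 0) is nearer -> walk
```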
# -*- coding: utf-8 -*-
"""Service definitions that handle business logic."""
from __future__ import unicode_literals
def includeme(config):
config.register_service_factory(
".annotation_json_presentation.annotation_json_presentation_service_factory",
name="annotation_json_presentation",
)
config.register_service_factory(
".annotation_moderation.annotation_moderation_service_factory",
name="annotation_moderation",
)
config.register_service_factory(
".annotation_stats.annotation_stats_factory", name="annotation_stats"
)
config.register_service_factory(
".auth_ticket.auth_ticket_service_factory",
iface="pyramid_authsanity.interfaces.IAuthService",
)
config.register_service_factory(
".auth_token.auth_token_service_factory", name="auth_token"
)
config.register_service_factory(
".annotation_delete.annotation_delete_service_factory", name="annotation_delete"
)
config.register_service_factory(
".delete_group.delete_group_service_factory", name="delete_group"
)
config.register_service_factory(
".delete_user.delete_user_service_factory", name="delete_user"
)
config.register_service_factory(
".developer_token.developer_token_service_factory", name="developer_token"
)
config.register_service_factory(".feature.feature_service_factory", name="feature")
config.register_service_factory(".flag.flag_service_factory", name="flag")
config.register_service_factory(
".flag_count.flag_count_service_factory", name="flag_count"
)
config.register_service_factory(".group.groups_factory", name="group")
config.register_service_factory(
".group_create.group_create_factory", name="group_create"
)
config.register_service_factory(
".group_update.group_update_factory", name="group_update"
)
config.register_service_factory(
".group_links.group_links_factory", name="group_links"
)
config.register_service_factory(
".group_members.group_members_factory", name="group_members"
)
config.register_service_factory(
".groupfinder.groupfinder_service_factory", iface="h.interfaces.IGroupService"
)
config.register_service_factory(".links.links_factory", name="links")
config.register_service_factory(".group_list.group_list_factory", name="group_list")
config.register_service_factory(
".group_scope.group_scope_factory", name="group_scope"
)
config.register_service_factory(
".list_organizations.list_organizations_factory", name="list_organizations"
)
config.register_service_factory(".nipsa.nipsa_factory", name="nipsa")
config.register_service_factory(
".oauth_provider.oauth_provider_service_factory", name="oauth_provider"
)
config.register_service_factory(
".oauth_validator.oauth_validator_service_factory", name="oauth_validator"
)
config.register_service_factory(
".organization.organization_factory", name="organization"
)
config.register_service_factory(
".rename_user.rename_user_factory", name="rename_user"
)
config.register_service_factory(".settings.settings_factory", name="settings")
config.register_service_factory(".user.user_service_factory", name="user")
config.register_service_factory(
".user_unique.user_unique_factory", name="user_unique"
)
config.register_service_factory(
".user_password.user_password_service_factory", name="user_password"
)
config.register_service_factory(
".user_signup.user_signup_service_factory", name="user_signup"
)
config.register_service_factory(
".user_update.user_update_factory", name="user_update"
)
config.add_directive(
"add_annotation_link_generator", ".links.add_annotation_link_generator"
)
config.add_request_method(
".feature.FeatureRequestProperty", name="feature", reify=True
)
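Each dotted name registered above resolves to a factory with the `(context, request)` signature used by pyramid_services; consumers later fetch the built service via `request.find_service(name=...)`. A toy model of that pattern (the `ToyRegistry` class is illustrative, not part of h or Pyramid):

```python
class ToyRegistry:
    """Illustrative stand-in for pyramid_services' registry; not h's code."""
    def __init__(self):
        self._factories = {}

    def register_service_factory(self, factory, name):
        self._factories[name] = factory

    def find_service(self, request, name):
        # Each factory builds its service from the current request
        # (real pyramid_services also caches the instance per request).
        return self._factories[name](context=None, request=request)

def flag_service_factory(context, request):
    # A factory receives (context, request) and returns the service object
    return {"name": "flag", "request": request}

registry = ToyRegistry()
registry.register_service_factory(flag_service_factory, name="flag")
service = registry.find_service(request="current-request", name="flag")
print(service["name"])  # flag
```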
# uncompyle6 version 3.2.4
# Python bytecode 2.7 (62211)
# Decompiled from: Python 2.7.15 (v2.7.15:ca079a3ea3, Apr 30 2018, 16:30:26) [MSC v.1500 64 bit (AMD64)]
# Embedded file name: lib.coginvasion.quest.Objective
from direct.directnotify.DirectNotifyGlobal import directNotify
class Objective:
notify = directNotify.newCategory('Objective')
def __init__(self, quest, location, assignDialog):
self.quest = quest
self.location = location
self.assignDialog = assignDialog
def setAssignDialog(self, dialog):
self.assignDialog = dialog
    def getAssignDialog(self):
        return self.assignDialog
def isOnLocation(self, zoneId):
if not isinstance(self.location, (list, tuple)):
return self.location == zoneId
return zoneId in self.location
def updateQuest(self):
pass
def finished(self):
pass | 30.266667 | 104 | 0.683921 | 108 | 908 | 5.712963 | 0.546296 | 0.077796 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062411 | 0.223568 | 908 | 30 | 105 | 30.266667 | 0.812766 | 0.227974 | 0 | 0.105263 | 0 | 0 | 0.012912 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.315789 | false | 0.105263 | 0.052632 | 0.052632 | 0.631579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
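The only non-trivial logic in this class is `isOnLocation`, which accepts either a single zone id or a collection of them. A stripped-down, standalone check of that branch (the directNotify logger and dialog plumbing are omitted; the zone ids are placeholders):

```python
class Objective:
    # Stand-alone copy of the location logic; the directNotify logger and
    # quest/dialog plumbing from the original class are omitted here.
    def __init__(self, quest, location, assignDialog):
        self.quest = quest
        self.location = location
        self.assignDialog = assignDialog

    def isOnLocation(self, zoneId):
        # location is either one zoneId or a list/tuple of zoneIds
        if not isinstance(self.location, (list, tuple)):
            return self.location == zoneId
        return zoneId in self.location

multi = Objective(quest=None, location=[2000, 3000], assignDialog=None)
single = Objective(quest=None, location=5000, assignDialog=None)
print(multi.isOnLocation(2000), multi.isOnLocation(4000), single.isOnLocation(5000))
# True False True
```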
from . import *
from .base import Base
from .atom import Atom
from .bond import Bond
from .residue import Residue
from .chain import Chain
from .molecule import Molecule
from .complex import Complex
from .workspace import Workspace
from .substructure import Substructure
from . import client
from . import io
import numpy as np
import h5py
import tensorflow as tf
# import keras
import os
import sys
import pickle
# We are going to try out some residual networks
expr_name = sys.argv[0][:-3]
expr_no = '1'
save_dir = os.path.abspath(os.path.join(os.path.expanduser('~/fluoro/code/jupyt/vox_fluoro'), expr_name))
print(save_dir)
os.makedirs(save_dir, exist_ok=True)
# -----------------------------------------------------------------
def cust_mean_squared_error_var(y_true, y_pred):
    # Mean squared error with each output dimension scaled by the variance of
    # the training labels, so all six pose parameters contribute comparably.
    base_dir = os.path.expanduser('~/fluoro/data/compilation')
    with h5py.File(os.path.join(base_dir, 'labels_stats.h5py'), 'r') as stats_file:
        var_v = stats_file['var'][:]
    return tf.keras.backend.mean(tf.keras.backend.square(y_pred - y_true) / var_v)
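This loss rescales the squared error of each output dimension by that dimension's variance in the training labels, so dimensions with different spreads are weighted comparably. The arithmetic, checked with NumPy (the variance values below are made up):

```python
import numpy as np

var_v = np.array([4.0, 1.0])     # per-dimension label variance (illustrative)
y_true = np.array([[0.0, 0.0]])
y_pred = np.array([[2.0, 1.0]])

# mean over all elements of the squared error divided by the variance:
# ((2-0)^2 / 4 + (1-0)^2 / 1) / 2 = (1 + 1) / 2 = 1.0
loss = np.mean((y_pred - y_true) ** 2 / var_v)
print(loss)  # 1.0
```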
# -----------------------------------------------------------------
params = {
# ---
# 3D CONV
# ---
# Entry Layers
'v_conv_0_filters': 30,
'v_conv_0_kernel': 9,
'v_conv_0_strides_0': 2,
'v_conv_0_strides_1': 2,
'v_conv_0_strides_2': 2,
'v_conv_0_pad': 'same',
'v_spatial_drop_rate_0': 0.3,
'v_conv_1_filters': 30,
'v_conv_1_kernel': 5,
'v_conv_1_strides_0': 2,
'v_conv_1_strides_1': 2,
'v_conv_1_strides_2': 3,
'v_conv_1_pad': 'same',
# ---
# Pool After Initial Layers
'v_pool_0_size': 2,
'v_pool_0_pad': 'same',
# ---
# Second Run of Entry Layers
'v_conv_2_filters': 30,
'v_conv_2_kernel': 5,
'v_conv_2_strides_0': 2,
'v_conv_2_strides_1': 2,
'v_conv_2_strides_2': 2,
'v_conv_2_pad': 'same',
# ---
# Run of Residual Layers
# 1
'v_conv_3_filters': 30,
'v_conv_3_kernel': 3,
'v_conv_3_strides_0': 1,
'v_conv_3_strides_1': 1,
'v_conv_3_strides_2': 1,
'v_conv_3_pad': 'same',
'v_spatial_drop_rate_2': 0.3,
'v_conv_4_filters': 30,
'v_conv_4_kernel': 3,
'v_conv_4_strides_0': 1,
'v_conv_4_strides_1': 1,
'v_conv_4_strides_2': 1,
'v_conv_4_pad': 'same',
# 2
'v_conv_5_filters': 30,
'v_conv_5_kernel': 3,
'v_conv_5_strides_0': 1,
'v_conv_5_strides_1': 1,
'v_conv_5_strides_2': 1,
'v_conv_5_pad': 'same',
'v_spatial_drop_rate_3': 0.3,
'v_conv_6_filters': 30,
'v_conv_6_kernel': 3,
'v_conv_6_strides_0': 1,
'v_conv_6_strides_1': 1,
'v_conv_6_strides_2': 1,
'v_conv_6_pad': 'same',
# 3
'v_conv_7_filters': 30,
'v_conv_7_kernel': 3,
'v_conv_7_strides_0': 1,
'v_conv_7_strides_1': 1,
'v_conv_7_strides_2': 1,
'v_conv_7_pad': 'same',
'v_spatial_drop_rate_4': 0.3,
'v_conv_8_filters': 30,
'v_conv_8_kernel': 3,
'v_conv_8_strides_0': 1,
'v_conv_8_strides_1': 1,
'v_conv_8_strides_2': 1,
'v_conv_8_pad': 'same',
# 4
'v_conv_9_filters': 40,
'v_conv_9_kernel': 3,
'v_conv_9_strides_0': 2,
'v_conv_9_strides_1': 2,
'v_conv_9_strides_2': 2,
'v_conv_9_pad': 'same',
'v_spatial_drop_rate_5': 0.3,
'v_conv_10_filters': 40,
'v_conv_10_kernel': 3,
'v_conv_10_strides_0': 1,
'v_conv_10_strides_1': 1,
'v_conv_10_strides_2': 1,
'v_conv_10_pad': 'same',
'v_conv_11_filters': 40,
'v_conv_11_kernel': 3,
'v_conv_11_strides_0': 2,
'v_conv_11_strides_1': 2,
'v_conv_11_strides_2': 2,
'v_conv_11_pad': 'same',
# 5
'v_conv_12_filters': 50,
'v_conv_12_kernel': 2,
'v_conv_12_strides_0': 2,
'v_conv_12_strides_1': 2,
'v_conv_12_strides_2': 2,
'v_conv_12_pad': 'same',
'v_spatial_drop_rate_6': 0.3,
'v_conv_13_filters': 50,
'v_conv_13_kernel': 2,
'v_conv_13_strides_0': 1,
'v_conv_13_strides_1': 1,
'v_conv_13_strides_2': 1,
'v_conv_13_pad': 'same',
'v_conv_14_filters': 50,
'v_conv_14_kernel': 1,
'v_conv_14_strides_0': 2,
'v_conv_14_strides_1': 2,
'v_conv_14_strides_2': 2,
'v_conv_14_pad': 'same',
# 6
'v_conv_15_filters': 50,
'v_conv_15_kernel': 2,
'v_conv_15_strides_0': 2,
'v_conv_15_strides_1': 2,
'v_conv_15_strides_2': 2,
'v_conv_15_pad': 'same',
'v_spatial_drop_rate_7': 0.3,
'v_conv_16_filters': 50,
'v_conv_16_kernel': 2,
'v_conv_16_strides_0': 1,
'v_conv_16_strides_1': 1,
'v_conv_16_strides_2': 1,
'v_conv_16_pad': 'same',
'v_conv_17_filters': 50,
'v_conv_17_kernel': 1,
'v_conv_17_strides_0': 2,
'v_conv_17_strides_1': 2,
'v_conv_17_strides_2': 2,
'v_conv_17_pad': 'same',
# ---
# Final Convs
'v_spatial_drop_rate_8': 0.5,
'v_conv_18_filters': 50,
'v_conv_18_kernel': 2,
'v_conv_18_strides_0': 1,
'v_conv_18_strides_1': 1,
'v_conv_18_strides_2': 1,
'v_conv_18_pad': 'valid',
'dense_1_v_units': 75,
'dense_2_v_units': 50,
# ---
# 2D CONV
# ---
# Entry Fluoro Layers
'conv_0_filters': 30,
'conv_0_kernel': 5,
'conv_0_strides': 2,
'conv_0_pad': 'same',
'spatial_drop_rate_0': 0.3,
'conv_1_filters': 30,
'conv_1_kernel': 5,
'conv_1_strides': 2,
'conv_1_pad': 'same',
# ---
# Pool After Initial Layers
'pool_0_size': 2,
'pool_0_pad': 'same',
# ---
# Run Of Residual Layers
# 1
'conv_2_filters': 30,
'conv_2_kernel': 3,
'conv_2_strides': 1,
'conv_2_pad': 'same',
'spatial_drop_rate_1': 0.3,
'conv_3_filters': 30,
'conv_3_kernel': 3,
'conv_3_strides': 1,
'conv_3_pad': 'same',
# 2
'conv_4_filters': 30,
'conv_4_kernel': 3,
'conv_4_strides': 1,
'conv_4_pad': 'same',
'spatial_drop_rate_2': 0.3,
'conv_5_filters': 30,
'conv_5_kernel': 3,
'conv_5_strides': 1,
'conv_5_pad': 'same',
# 3
'conv_6_filters': 30,
'conv_6_kernel': 3,
'conv_6_strides': 1,
'conv_6_pad': 'same',
'spatial_drop_rate_3': 0.3,
'conv_7_filters': 30,
'conv_7_kernel': 3,
'conv_7_strides': 1,
'conv_7_pad': 'same',
# 4
'conv_8_filters': 30,
'conv_8_kernel': 3,
'conv_8_strides': 1,
'conv_8_pad': 'same',
'spatial_drop_rate_4': 0.3,
'conv_9_filters': 30,
'conv_9_kernel': 3,
'conv_9_strides': 1,
'conv_9_pad': 'same',
# 5
'conv_10_filters': 40,
'conv_10_kernel': 3,
'conv_10_strides': 2,
'conv_10_pad': 'same',
'spatial_drop_rate_5': 0.3,
'conv_11_filters': 40,
'conv_11_kernel': 3,
'conv_11_strides': 1,
'conv_11_pad': 'same',
'conv_12_filters': 40,
'conv_12_kernel': 1,
'conv_12_strides': 2,
'conv_12_pad': 'same',
# 6
'conv_13_filters': 40,
'conv_13_kernel': 3,
'conv_13_strides': 2,
'conv_13_pad': 'same',
'spatial_drop_rate_6': 0.3,
'conv_14_filters': 40,
'conv_14_kernel': 3,
'conv_14_strides': 1,
'conv_14_pad': 'same',
'conv_15_filters': 40,
'conv_15_kernel': 1,
'conv_15_strides': 2,
'conv_15_pad': 'same',
# 7
'conv_16_filters': 40,
'conv_16_kernel': 3,
'conv_16_strides': 2,
'conv_16_pad': 'same',
'spatial_drop_rate_7': 0.3,
'conv_17_filters': 40,
'conv_17_kernel': 3,
'conv_17_strides': 1,
'conv_17_pad': 'same',
'conv_18_filters': 40,
'conv_18_kernel': 1,
'conv_18_strides': 2,
'conv_18_pad': 'same',
# ---
# Final Conv Layers
'spatial_drop_rate_8': 0.3,
'conv_19_filters': 50,
'conv_19_kernel': 2,
'conv_19_strides': 1,
'conv_19_pad': 'valid',
# ---
# Dense Layers
'dense_0_f_units': 50,
'dense_1_f_units': 50,
'dense_comb_1_units': 50,
'dense_comb_2_units': 50,
# Calibration Dense Layers
'dense_1_cali_units': 20,
'dense_2_cali_units': 6,
'dense_comb_v_1_units': 20,
'dense_comb_v_2_units': 6,
# Top Level Dense Units
'dense_1_co_units': 250,
'drop_1_comb_rate': 0.2,
'dense_2_co_units': 150,
'dense_3_co_units': 100,
'drop_2_comb_rate': 0.2,
'dense_4_co_units': 20,
# Main Output
'main_output_units': 6,
'main_output_act': 'linear',
# General Housekeeping
'v_conv_regularizer': None,
'conv_regularizer': None,
'dense_regularizer_1': None,
'dense_regularizer_2': None,
'activation_fn': 'elu',
'v_intra_act_fn': None,
'v_res_act_fn': 'elu',
'c_intra_act_fn': None,
'c_res_act_fn': 'elu',
'res_act_fn': 'elu',
'kern_init': 'glorot_uniform',
'model_opt': tf.keras.optimizers.Adam,
'learning_rate': 0.001,
'model_epochs': 50,
'model_batchsize': 5,
'model_loss': cust_mean_squared_error_var,
'model_metric': cust_mean_squared_error_var
}
# -----------------------------------------------------------------
channel_order = 'channels_last'
img_input_shape = (128, 128, 1)
vox_input_shape = (199, 164, 566, 1)
cali_input_shape = (6,)
# Input Layers
input_vox = tf.keras.Input(shape=vox_input_shape, name='input_vox', dtype='float32')
input_fluoro_1 = tf.keras.Input(shape=img_input_shape, name='input_fluoro_1', dtype='float32')
input_fluoro_2 = tf.keras.Input(shape=img_input_shape, name='input_fluoro_2', dtype='float32')
input_cali = tf.keras.Input(shape=cali_input_shape, name='input_cali', dtype='float32')
# -----------------------------------------------------------------
# ---
# Entry Layers
v_conv_0 = tf.keras.layers.Conv3D(filters=params['v_conv_0_filters'], kernel_size=params['v_conv_0_kernel'], strides=(params['v_conv_0_strides_0'], params['v_conv_0_strides_1'], params['v_conv_0_strides_2']), padding=params['v_conv_0_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(input_vox)
bn_0 = tf.keras.layers.BatchNormalization()(v_conv_0)
v_spat_0 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_0'])(bn_0)
v_conv_1 = tf.keras.layers.Conv3D(filters=params['v_conv_1_filters'], kernel_size=params['v_conv_1_kernel'], strides=(params['v_conv_1_strides_0'], params['v_conv_1_strides_1'], params['v_conv_1_strides_2']), padding=params['v_conv_1_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_0)
# ---
# Pool After Initial Layers
v_pool_0 = tf.keras.layers.MaxPooling3D(pool_size=params['v_pool_0_size'], padding=params['v_pool_0_pad'], data_format=channel_order)(v_conv_1)
# ---
# Second Run of Entry Layers
bn_1 = tf.keras.layers.BatchNormalization()(v_pool_0)
v_conv_2 = tf.keras.layers.Conv3D(filters=params['v_conv_2_filters'], kernel_size=params['v_conv_2_kernel'], strides=(params['v_conv_2_strides_0'], params['v_conv_2_strides_1'], params['v_conv_2_strides_2']), padding=params['v_conv_2_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(bn_1)
# ---
# Run of Residual Layers
bn_2 = tf.keras.layers.BatchNormalization()(v_conv_2)
# 1
v_conv_3 = tf.keras.layers.Conv3D(filters=params['v_conv_3_filters'], kernel_size=params['v_conv_3_kernel'], strides=(params['v_conv_3_strides_0'], params['v_conv_3_strides_1'], params['v_conv_3_strides_2']), padding=params['v_conv_3_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(bn_2)
bn_3 = tf.keras.layers.BatchNormalization()(v_conv_3)
v_spat_2 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_2'])(bn_3)
v_conv_4 = tf.keras.layers.Conv3D(filters=params['v_conv_4_filters'], kernel_size=params['v_conv_4_kernel'], strides=(params['v_conv_4_strides_0'], params['v_conv_4_strides_1'], params['v_conv_4_strides_2']), padding=params['v_conv_4_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_2)
bn_4 = tf.keras.layers.BatchNormalization()(v_conv_4)
v_add_0 = tf.keras.layers.Add()([bn_4, bn_2])
v_act_0 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(v_add_0)
# 2
v_conv_5 = tf.keras.layers.Conv3D(filters=params['v_conv_5_filters'], kernel_size=params['v_conv_5_kernel'], strides=(params['v_conv_5_strides_0'], params['v_conv_5_strides_1'], params['v_conv_5_strides_2']), padding=params['v_conv_5_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_0)
bn_5 = tf.keras.layers.BatchNormalization()(v_conv_5)
v_spat_3 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_3'])(bn_5)
v_conv_6 = tf.keras.layers.Conv3D(filters=params['v_conv_6_filters'], kernel_size=params['v_conv_6_kernel'], strides=(params['v_conv_6_strides_0'], params['v_conv_6_strides_1'], params['v_conv_6_strides_2']), padding=params['v_conv_6_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_3)
bn_6 = tf.keras.layers.BatchNormalization()(v_conv_6)
v_add_1 = tf.keras.layers.Add()([bn_6, v_act_0])
v_act_1 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(v_add_1)
# 3
v_conv_7 = tf.keras.layers.Conv3D(filters=params['v_conv_7_filters'], kernel_size=params['v_conv_7_kernel'], strides=(params['v_conv_7_strides_0'], params['v_conv_7_strides_1'], params['v_conv_7_strides_2']), padding=params['v_conv_7_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_1)
bn_7 = tf.keras.layers.BatchNormalization()(v_conv_7)
v_spat_4 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_4'])(bn_7)
v_conv_8 = tf.keras.layers.Conv3D(filters=params['v_conv_8_filters'], kernel_size=params['v_conv_8_kernel'], strides=(params['v_conv_8_strides_0'], params['v_conv_8_strides_1'], params['v_conv_8_strides_2']), padding=params['v_conv_8_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_4)
bn_8 = tf.keras.layers.BatchNormalization()(v_conv_8)
v_add_2 = tf.keras.layers.Add()([bn_8, v_act_1])
v_act_2 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(v_add_2)
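Residual blocks 1 through 3 above repeat an identical identity pattern: conv, BN, spatial dropout, conv, BN, add the skip connection, then activate. A helper along these lines (the function name and defaults are my own, not from this script) would cut the duplication:

```python
import tensorflow as tf

def residual_block_3d(x, filters, kernel, drop_rate, activation='elu'):
    """Identity residual block: the skip connection requires the input
    to already have `filters` channels and the convs to keep the shape."""
    y = tf.keras.layers.Conv3D(filters, kernel, padding='same',
                               activation=activation)(x)
    y = tf.keras.layers.BatchNormalization()(y)
    y = tf.keras.layers.SpatialDropout3D(drop_rate)(y)
    # second conv is linear; the activation comes after the skip addition
    y = tf.keras.layers.Conv3D(filters, kernel, padding='same',
                               activation=None)(y)
    y = tf.keras.layers.BatchNormalization()(y)
    y = tf.keras.layers.Add()([y, x])
    return tf.keras.layers.Activation(activation)(y)

inp = tf.keras.Input(shape=(8, 8, 8, 30))
out = residual_block_3d(inp, filters=30, kernel=3, drop_rate=0.3)
print(out.shape)  # shape preserved: (None, 8, 8, 8, 30)
```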
# 4
v_conv_9 = tf.keras.layers.Conv3D(filters=params['v_conv_9_filters'], kernel_size=params['v_conv_9_kernel'], strides=(params['v_conv_9_strides_0'], params['v_conv_9_strides_1'], params['v_conv_9_strides_2']), padding=params['v_conv_9_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_2)
bn_9 = tf.keras.layers.BatchNormalization()(v_conv_9)
v_spat_5 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_5'])(bn_9)
v_conv_10 = tf.keras.layers.Conv3D(filters=params['v_conv_10_filters'], kernel_size=params['v_conv_10_kernel'], strides=(params['v_conv_10_strides_0'], params['v_conv_10_strides_1'], params['v_conv_10_strides_2']), padding=params['v_conv_10_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_5)
bn_10 = tf.keras.layers.BatchNormalization()(v_conv_10)
v_conv_11 = tf.keras.layers.Conv3D(filters=params['v_conv_11_filters'], kernel_size=params['v_conv_11_kernel'], strides=(params['v_conv_11_strides_0'], params['v_conv_11_strides_1'], params['v_conv_11_strides_2']), padding=params['v_conv_11_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_2)
bn_11 = tf.keras.layers.BatchNormalization()(v_conv_11)
v_add_3 = tf.keras.layers.Add()([bn_10, bn_11])
v_act_3 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(v_add_3)
# 5
v_conv_12 = tf.keras.layers.Conv3D(filters=params['v_conv_12_filters'], kernel_size=params['v_conv_12_kernel'], strides=(params['v_conv_12_strides_0'], params['v_conv_12_strides_1'], params['v_conv_12_strides_2']), padding=params['v_conv_12_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_3)
bn_12 = tf.keras.layers.BatchNormalization()(v_conv_12)
v_spat_6 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_6'])(bn_12)
v_conv_13 = tf.keras.layers.Conv3D(filters=params['v_conv_13_filters'], kernel_size=params['v_conv_13_kernel'], strides=(params['v_conv_13_strides_0'], params['v_conv_13_strides_1'], params['v_conv_13_strides_2']), padding=params['v_conv_13_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_6)
bn_13 = tf.keras.layers.BatchNormalization()(v_conv_13)
v_conv_14 = tf.keras.layers.Conv3D(filters=params['v_conv_14_filters'], kernel_size=params['v_conv_14_kernel'], strides=(params['v_conv_14_strides_0'], params['v_conv_14_strides_1'], params['v_conv_14_strides_2']), padding=params['v_conv_14_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_3)
bn_14 = tf.keras.layers.BatchNormalization()(v_conv_14)
v_add_4 = tf.keras.layers.Add()([bn_13, bn_14])
v_act_4 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(v_add_4)
# 6
v_conv_15 = tf.keras.layers.Conv3D(filters=params['v_conv_15_filters'], kernel_size=params['v_conv_15_kernel'], strides=(params['v_conv_15_strides_0'], params['v_conv_15_strides_1'], params['v_conv_15_strides_2']), padding=params['v_conv_15_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_4)
bn_15 = tf.keras.layers.BatchNormalization()(v_conv_15)
v_spat_7 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_7'])(bn_15)
v_conv_16 = tf.keras.layers.Conv3D(filters=params['v_conv_16_filters'], kernel_size=params['v_conv_16_kernel'], strides=(params['v_conv_16_strides_0'], params['v_conv_16_strides_1'], params['v_conv_16_strides_2']), padding=params['v_conv_16_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_7)
bn_16 = tf.keras.layers.BatchNormalization()(v_conv_16)
v_conv_17 = tf.keras.layers.Conv3D(filters=params['v_conv_17_filters'], kernel_size=params['v_conv_17_kernel'], strides=(params['v_conv_17_strides_0'], params['v_conv_17_strides_1'], params['v_conv_17_strides_2']), padding=params['v_conv_17_pad'], data_format=channel_order, activation=params['v_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_act_4)
bn_17 = tf.keras.layers.BatchNormalization()(v_conv_17)
v_add_5 = tf.keras.layers.Add()([bn_16, bn_17])
v_act_5 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(v_add_5)
# ---
# Final Conv Layers
bn_18 = tf.keras.layers.BatchNormalization()(v_act_5)
v_spat_8 = tf.keras.layers.SpatialDropout3D(rate=params['v_spatial_drop_rate_8'])(bn_18)
v_conv_18 = tf.keras.layers.Conv3D(filters=params['v_conv_18_filters'], kernel_size=params['v_conv_18_kernel'], strides=(params['v_conv_18_strides_0'], params['v_conv_18_strides_1'], params['v_conv_18_strides_2']), padding=params['v_conv_18_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['v_conv_regularizer'])(v_spat_8)
# ---
# Dense Layers
v_flatten_0 = tf.keras.layers.Flatten()(v_conv_18)
bn_15 = tf.keras.layers.BatchNormalization()(v_flatten_0)
dense_1_v = tf.keras.layers.Dense(units=params['dense_1_v_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_15)
bn_16 = tf.keras.layers.BatchNormalization()(dense_1_v)
dense_2_v = tf.keras.layers.Dense(units=params['dense_2_v_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_16)
# -----------------------------------------------------------------
# ---
# Entry Fluoro Layers
per_image_stand_1 = tf.keras.layers.Lambda(lambda frame: tf.image.per_image_standardization(frame))(input_fluoro_1)
bn_0 = tf.keras.layers.BatchNormalization()(per_image_stand_1)
conv_0_1 = tf.keras.layers.Conv2D(filters=params['conv_0_filters'], kernel_size=params['conv_0_kernel'], strides=params['conv_0_strides'], padding=params['conv_0_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(bn_0)
bn_1 = tf.keras.layers.BatchNormalization()(conv_0_1)
spat_0_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_0'])(bn_1)
conv_1_1 = tf.keras.layers.Conv2D(filters=params['conv_1_filters'], kernel_size=params['conv_1_kernel'], strides=params['conv_1_strides'], padding=params['conv_1_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_0_1)
# ---
# Pool After Initial Layers
pool_0_1 = tf.keras.layers.AveragePooling2D(pool_size=params['pool_0_size'], padding=params['pool_0_pad'])(conv_1_1)
# ---
# Run of Residual Layers
bn_2 = tf.keras.layers.BatchNormalization()(pool_0_1)
# 1
conv_2_1 = tf.keras.layers.Conv2D(filters=params['conv_2_filters'], kernel_size=params['conv_2_kernel'], strides=params['conv_2_strides'], padding=params['conv_2_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(bn_2)
bn_3 = tf.keras.layers.BatchNormalization()(conv_2_1)
spat_1_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_1'])(bn_3)
conv_3_1 = tf.keras.layers.Conv2D(filters=params['conv_3_filters'], kernel_size=params['conv_3_kernel'], strides=params['conv_3_strides'], padding=params['conv_3_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_1_1)
bn_4 = tf.keras.layers.BatchNormalization()(conv_3_1)
add_0 = tf.keras.layers.Add()([bn_4, bn_2])
act_0 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_0)
# 2
conv_4_1 = tf.keras.layers.Conv2D(filters=params['conv_4_filters'], kernel_size=params['conv_4_kernel'], strides=params['conv_4_strides'], padding=params['conv_4_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_0)
bn_5 = tf.keras.layers.BatchNormalization()(conv_4_1)
spat_2_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_2'])(bn_5)
conv_5_1 = tf.keras.layers.Conv2D(filters=params['conv_3_filters'], kernel_size=params['conv_3_kernel'], strides=params['conv_3_strides'], padding=params['conv_3_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_2_1)
bn_6 = tf.keras.layers.BatchNormalization()(conv_5_1)
add_1 = tf.keras.layers.Add()([act_0, bn_6])
act_1 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_1)
# 3
conv_6_1 = tf.keras.layers.Conv2D(filters=params['conv_6_filters'], kernel_size=params['conv_6_kernel'], strides=params['conv_6_strides'], padding=params['conv_6_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_1)
bn_7 = tf.keras.layers.BatchNormalization()(conv_6_1)
spat_3_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_3'])(bn_7)
conv_7_1 = tf.keras.layers.Conv2D(filters=params['conv_7_filters'], kernel_size=params['conv_7_kernel'], strides=params['conv_7_strides'], padding=params['conv_7_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_3_1)
bn_8 = tf.keras.layers.BatchNormalization()(conv_7_1)
add_2 = tf.keras.layers.Add()([act_1, bn_8])
act_2 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_2)
# 4
conv_8_1 = tf.keras.layers.Conv2D(filters=params['conv_8_filters'], kernel_size=params['conv_8_kernel'], strides=params['conv_8_strides'], padding=params['conv_8_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_2)
bn_9 = tf.keras.layers.BatchNormalization()(conv_8_1)
spat_4_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_4'])(bn_9)
conv_9_1 = tf.keras.layers.Conv2D(filters=params['conv_9_filters'], kernel_size=params['conv_9_kernel'], strides=params['conv_9_strides'], padding=params['conv_9_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_4_1)
bn_10 = tf.keras.layers.BatchNormalization()(conv_9_1)
add_3 = tf.keras.layers.Add()([act_2, bn_10])
act_3 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_3)
# 5
conv_10_1 = tf.keras.layers.Conv2D(filters=params['conv_10_filters'], kernel_size=params['conv_10_kernel'], strides=params['conv_10_strides'], padding=params['conv_10_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_3)
bn_11 = tf.keras.layers.BatchNormalization()(conv_10_1)
spat_5_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_5'])(bn_11)
conv_11_1 = tf.keras.layers.Conv2D(filters=params['conv_11_filters'], kernel_size=params['conv_11_kernel'], strides=params['conv_11_strides'], padding=params['conv_11_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_5_1)
bn_12 = tf.keras.layers.BatchNormalization()(conv_11_1)
conv_12_1 = tf.keras.layers.Conv2D(filters=params['conv_12_filters'], kernel_size=params['conv_12_kernel'], strides=params['conv_12_strides'], padding=params['conv_12_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_3)
bn_13 = tf.keras.layers.BatchNormalization()(conv_12_1)
add_4 = tf.keras.layers.Add()([bn_12, bn_13])
act_4 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_4)
# 6
conv_13_1 = tf.keras.layers.Conv2D(filters=params['conv_13_filters'], kernel_size=params['conv_13_kernel'], strides=params['conv_13_strides'], padding=params['conv_13_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_4)
bn_14 = tf.keras.layers.BatchNormalization()(conv_13_1)
spat_6_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_6'])(bn_14)
conv_14_1 = tf.keras.layers.Conv2D(filters=params['conv_14_filters'], kernel_size=params['conv_14_kernel'], strides=params['conv_14_strides'], padding=params['conv_14_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_6_1)
bn_15 = tf.keras.layers.BatchNormalization()(conv_14_1)
conv_15_1 = tf.keras.layers.Conv2D(filters=params['conv_15_filters'], kernel_size=params['conv_15_kernel'], strides=params['conv_15_strides'], padding=params['conv_15_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_4)
bn_16 = tf.keras.layers.BatchNormalization()(conv_15_1)
add_5 = tf.keras.layers.Add()([bn_15, bn_16])
act_5 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_5)
# 7
conv_16_1 = tf.keras.layers.Conv2D(filters=params['conv_16_filters'], kernel_size=params['conv_16_kernel'], strides=params['conv_16_strides'], padding=params['conv_16_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_5)
bn_17 = tf.keras.layers.BatchNormalization()(conv_16_1)
spat_7_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_7'])(bn_17)
conv_17_1 = tf.keras.layers.Conv2D(filters=params['conv_17_filters'], kernel_size=params['conv_17_kernel'], strides=params['conv_17_strides'], padding=params['conv_17_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_7_1)
bn_18 = tf.keras.layers.BatchNormalization()(conv_17_1)
conv_18_1 = tf.keras.layers.Conv2D(filters=params['conv_18_filters'], kernel_size=params['conv_18_kernel'], strides=params['conv_18_strides'], padding=params['conv_18_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_5)
bn_19 = tf.keras.layers.BatchNormalization()(conv_18_1)
add_6 = tf.keras.layers.Add()([bn_18, bn_19])
act_6 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_6)
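# Note on the run above: blocks 1-4 use an identity shortcut (the block input is
# added back unchanged), while blocks 5-7 use a projection shortcut (a separate
# conv on the shortcut path before the Add). Stripped of TF, every block computes
# act(main_path(x) + shortcut(x)); a toy scalar sketch of that dataflow
# (hypothetical, not part of the model itself):

```python
def residual_block(x, main_path, shortcut=lambda t: t, act=lambda t: t):
    # main-path output plus shortcut-path output, then a shared activation
    return act(main_path(x) + shortcut(x))

identity_out = residual_block(3, main_path=lambda t: 2 * t)        # 2*3 + 3 = 9
projection_out = residual_block(3, main_path=lambda t: 2 * t,
                                shortcut=lambda t: t + 1)          # 6 + 4 = 10
```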
# ---
# Final Conv Layers
bn_20 = tf.keras.layers.BatchNormalization()(act_6)
spat_8_1 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_8'])(bn_20)
conv_19_1 = tf.keras.layers.Conv2D(filters=params['conv_19_filters'], kernel_size=params['conv_19_kernel'], strides=params['conv_19_strides'], padding=params['conv_19_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_8_1)
# ---
# Dense Layers
flatten_0 = tf.keras.layers.Flatten()(conv_19_1)
bn_21 = tf.keras.layers.BatchNormalization()(flatten_0)
dense_0_f_1 = tf.keras.layers.Dense(units=params['dense_0_f_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_21)
bn_22 = tf.keras.layers.BatchNormalization()(dense_0_f_1)
dense_1_f_1 = tf.keras.layers.Dense(units=params['dense_1_f_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_22)
# -----------------------------------------------------------------
# ---
# Entry Fluoro Layers
per_image_stand_2 = tf.keras.layers.Lambda(lambda frame: tf.image.per_image_standardization(frame))(input_fluoro_2)
bn_0 = tf.keras.layers.BatchNormalization()(per_image_stand_2)
conv_0_2 = tf.keras.layers.Conv2D(filters=params['conv_0_filters'], kernel_size=params['conv_0_kernel'], strides=params['conv_0_strides'], padding=params['conv_0_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(bn_0)
bn_1 = tf.keras.layers.BatchNormalization()(conv_0_2)
spat_0_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_0'])(bn_1)
conv_1_2 = tf.keras.layers.Conv2D(filters=params['conv_1_filters'], kernel_size=params['conv_1_kernel'], strides=params['conv_1_strides'], padding=params['conv_1_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_0_2)
# ---
# Pool After Initial Layers
pool_0_2 = tf.keras.layers.AveragePooling2D(pool_size=params['pool_0_size'], padding=params['pool_0_pad'])(conv_1_2)
# ---
# Run of Residual Layers
bn_2 = tf.keras.layers.BatchNormalization()(pool_0_2)
# 1
conv_2_2 = tf.keras.layers.Conv2D(filters=params['conv_2_filters'], kernel_size=params['conv_2_kernel'], strides=params['conv_2_strides'], padding=params['conv_2_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(bn_2)
bn_3 = tf.keras.layers.BatchNormalization()(conv_2_2)
spat_1_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_1'])(bn_3)
conv_3_2 = tf.keras.layers.Conv2D(filters=params['conv_3_filters'], kernel_size=params['conv_3_kernel'], strides=params['conv_3_strides'], padding=params['conv_3_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_1_2)
bn_4 = tf.keras.layers.BatchNormalization()(conv_3_2)
add_0 = tf.keras.layers.Add()([bn_4, bn_2])
act_0 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_0)
# 2
conv_4_2 = tf.keras.layers.Conv2D(filters=params['conv_4_filters'], kernel_size=params['conv_4_kernel'], strides=params['conv_4_strides'], padding=params['conv_4_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_0)
bn_5 = tf.keras.layers.BatchNormalization()(conv_4_2)
spat_2_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_2'])(bn_5)
conv_5_2 = tf.keras.layers.Conv2D(filters=params['conv_3_filters'], kernel_size=params['conv_3_kernel'], strides=params['conv_3_strides'], padding=params['conv_3_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_2_2)
bn_6 = tf.keras.layers.BatchNormalization()(conv_5_2)
add_1 = tf.keras.layers.Add()([act_0, bn_6])
act_1 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_1)
# 3
conv_6_2 = tf.keras.layers.Conv2D(filters=params['conv_6_filters'], kernel_size=params['conv_6_kernel'], strides=params['conv_6_strides'], padding=params['conv_6_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_1)
bn_7 = tf.keras.layers.BatchNormalization()(conv_6_2)
spat_3_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_3'])(bn_7)
conv_7_2 = tf.keras.layers.Conv2D(filters=params['conv_7_filters'], kernel_size=params['conv_7_kernel'], strides=params['conv_7_strides'], padding=params['conv_7_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_3_2)
bn_8 = tf.keras.layers.BatchNormalization()(conv_7_2)
add_2 = tf.keras.layers.Add()([act_1, bn_8])
act_2 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_2)
# 4
conv_8_2 = tf.keras.layers.Conv2D(filters=params['conv_8_filters'], kernel_size=params['conv_8_kernel'], strides=params['conv_8_strides'], padding=params['conv_8_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_2)
bn_9 = tf.keras.layers.BatchNormalization()(conv_8_2)
spat_4_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_4'])(bn_9)
conv_9_2 = tf.keras.layers.Conv2D(filters=params['conv_9_filters'], kernel_size=params['conv_9_kernel'], strides=params['conv_9_strides'], padding=params['conv_9_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_4_2)
bn_10 = tf.keras.layers.BatchNormalization()(conv_9_2)
add_3 = tf.keras.layers.Add()([act_2, bn_10])
act_3 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_3)
# 5
conv_10_2 = tf.keras.layers.Conv2D(filters=params['conv_10_filters'], kernel_size=params['conv_10_kernel'], strides=params['conv_10_strides'], padding=params['conv_10_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_3)
bn_11 = tf.keras.layers.BatchNormalization()(conv_10_2)
spat_5_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_5'])(bn_11)
conv_11_2 = tf.keras.layers.Conv2D(filters=params['conv_11_filters'], kernel_size=params['conv_11_kernel'], strides=params['conv_11_strides'], padding=params['conv_11_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_5_2)
bn_12 = tf.keras.layers.BatchNormalization()(conv_11_2)
conv_12_2 = tf.keras.layers.Conv2D(filters=params['conv_12_filters'], kernel_size=params['conv_12_kernel'], strides=params['conv_12_strides'], padding=params['conv_12_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_3)
bn_13 = tf.keras.layers.BatchNormalization()(conv_12_2)
add_4 = tf.keras.layers.Add()([bn_12, bn_13])
act_4 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_4)
# 6
conv_13_2 = tf.keras.layers.Conv2D(filters=params['conv_13_filters'], kernel_size=params['conv_13_kernel'], strides=params['conv_13_strides'], padding=params['conv_13_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_4)
bn_14 = tf.keras.layers.BatchNormalization()(conv_13_2)
spat_6_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_6'])(bn_14)
conv_14_2 = tf.keras.layers.Conv2D(filters=params['conv_14_filters'], kernel_size=params['conv_14_kernel'], strides=params['conv_14_strides'], padding=params['conv_14_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_6_2)
bn_15 = tf.keras.layers.BatchNormalization()(conv_14_2)
conv_15_2 = tf.keras.layers.Conv2D(filters=params['conv_15_filters'], kernel_size=params['conv_15_kernel'], strides=params['conv_15_strides'], padding=params['conv_15_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_4)
bn_16 = tf.keras.layers.BatchNormalization()(conv_15_2)
add_5 = tf.keras.layers.Add()([bn_15, bn_16])
act_5 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_5)
# 7
conv_16_2 = tf.keras.layers.Conv2D(filters=params['conv_16_filters'], kernel_size=params['conv_16_kernel'], strides=params['conv_16_strides'], padding=params['conv_16_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_5)
bn_17 = tf.keras.layers.BatchNormalization()(conv_16_2)
spat_7_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_7'])(bn_17)
conv_17_2 = tf.keras.layers.Conv2D(filters=params['conv_17_filters'], kernel_size=params['conv_17_kernel'], strides=params['conv_17_strides'], padding=params['conv_17_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_7_2)
bn_18 = tf.keras.layers.BatchNormalization()(conv_17_2)
conv_18_2 = tf.keras.layers.Conv2D(filters=params['conv_18_filters'], kernel_size=params['conv_18_kernel'], strides=params['conv_18_strides'], padding=params['conv_18_pad'], data_format=channel_order, activation=params['c_intra_act_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(act_5)
bn_19 = tf.keras.layers.BatchNormalization()(conv_18_2)
add_6 = tf.keras.layers.Add()([bn_18, bn_19])
act_6 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(add_6)
# ---
# Final Conv Layers
bn_20 = tf.keras.layers.BatchNormalization()(act_6)
spat_8_2 = tf.keras.layers.SpatialDropout2D(rate=params['spatial_drop_rate_8'])(bn_20)
conv_19_2 = tf.keras.layers.Conv2D(filters=params['conv_19_filters'], kernel_size=params['conv_19_kernel'], strides=params['conv_19_strides'], padding=params['conv_19_pad'], data_format=channel_order, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['conv_regularizer'])(spat_8_2)
# ---
# Dense Layers
flatten_0 = tf.keras.layers.Flatten()(conv_19_2)
bn_21 = tf.keras.layers.BatchNormalization()(flatten_0)
dense_0_f_2 = tf.keras.layers.Dense(units=params['dense_0_f_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_21)
bn_22 = tf.keras.layers.BatchNormalization()(dense_0_f_2)
dense_1_f_2 = tf.keras.layers.Dense(units=params['dense_1_f_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_22)
# -----------------------------------------------------------------
bn_0 = tf.keras.layers.BatchNormalization()(input_cali)
dense_1_cali = tf.keras.layers.Dense(units=params['dense_1_cali_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_0)
bn_1 = tf.keras.layers.BatchNormalization()(dense_1_cali)
dense_2_cali = tf.keras.layers.Dense(units=params['dense_2_cali_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_1)
bn_2 = tf.keras.layers.BatchNormalization()(dense_2_cali)
# -----------------------------------------------------------------
# ---
# Combine the fluoro inputs together
dense_comb_f_0 = tf.keras.layers.Add()([dense_1_f_1, dense_1_f_2])
dense_comb_act_0 = tf.keras.layers.Activation(activation=params['c_res_act_fn'])(dense_comb_f_0)
bn_0 = tf.keras.layers.BatchNormalization()(dense_comb_act_0)
dense_comb_f_1 = tf.keras.layers.Dense(units=params['dense_comb_1_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_0)
bn_1 = tf.keras.layers.BatchNormalization()(dense_comb_f_1)
dense_comb_f_2 = tf.keras.layers.Dense(units=params['dense_comb_2_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_1)
# -----------------------------------------------------------------
# ---
# Combine the fluoro with the vox
dense_comb_v_0 = tf.keras.layers.Add()([dense_comb_f_2, dense_2_v])
dense_comb_v_act_0 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(dense_comb_v_0)
bn_0 = tf.keras.layers.BatchNormalization()(dense_comb_v_act_0)
dense_comb_v_1 = tf.keras.layers.Dense(units=params['dense_comb_v_1_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_0)
bn_1 = tf.keras.layers.BatchNormalization()(dense_comb_v_1)
dense_comb_v_2 = tf.keras.layers.Dense(units=params['dense_comb_v_2_units'], activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_1)
# -----------------------------------------------------------------
top_comb = tf.keras.layers.Add()([dense_comb_v_2, bn_2])
top_comb_act = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(top_comb)
top_dense_1 = tf.keras.layers.Dense(units=6, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(top_comb_act)
bn_0 = tf.keras.layers.BatchNormalization()(top_dense_1)
top_dense_2 = tf.keras.layers.Dense(units=6, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_0)
add_0 = tf.keras.layers.Add()([top_dense_2, bn_2])
act_0 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(add_0)
top_dense_1 = tf.keras.layers.Dense(units=6, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(act_0)
bn_0 = tf.keras.layers.BatchNormalization()(top_dense_1)
top_dense_2 = tf.keras.layers.Dense(units=6, activation=params['activation_fn'], kernel_initializer=params['kern_init'], activity_regularizer=params['dense_regularizer_1'])(bn_0)
add_0 = tf.keras.layers.Add()([top_dense_2, act_0])
act_0 = tf.keras.layers.Activation(activation=params['v_res_act_fn'])(add_0)
# -----------------------------------------------------------------
# -----------------------------------------------------------------
# Main Output
main_output = tf.keras.layers.Dense(units=params['main_output_units'], activation=params['main_output_act'], kernel_initializer=params['kern_init'], name='main_output')(act_0)
# -----------------------------------------------------------------
# Model Housekeeping
model = tf.keras.Model(inputs=[input_vox, input_fluoro_1, input_fluoro_2, input_cali], outputs=main_output)
model.compile(optimizer=params['model_opt'](lr=params['learning_rate']), loss=params['model_loss'], metrics=[params['model_metric']])
tf.keras.utils.plot_model(model, os.path.abspath(os.path.join(save_dir, expr_name + '_' + expr_no + '.png')), show_shapes=True)
model.summary()
# -----------------------------------------------------------------
vox_file = h5py.File(os.path.expanduser('~/fluoro/data/compilation/voxels_pad.h5py'), 'r')
vox_init = vox_file['vox_dset']
image_file = h5py.File(os.path.expanduser('~/fluoro/data/compilation/images.h5py'), 'r')
image_init = image_file['image_dset']
label_file = h5py.File(os.path.expanduser('~/fluoro/data/compilation/labels.h5py'), 'r')
label_init = label_file['labels_dset']
cali_file = h5py.File(os.path.expanduser('~/fluoro/data/compilation/calibration.h5py'), 'r')
cali_init = cali_file['cali_len3_rot']
def split_train_test(shape, num_of_samples=None, ratio=0.2):
    if num_of_samples is None:
        shuffled_indices = np.random.choice(shape, size=shape, replace=False)
    else:
        shuffled_indices = np.random.choice(shape, size=num_of_samples, replace=False)
    # use the ratio argument instead of a hard-coded 0.2
    test_set_size = int(len(shuffled_indices) * ratio)
    test_indx = shuffled_indices[:test_set_size]
    train_indx = shuffled_indices[test_set_size:]
    return test_indx, train_indx
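# Quick sanity check of the splitting logic above, rewritten with only the
# stdlib so it runs standalone (hypothetical helper name; the real function
# uses np.random.choice for the shuffle):

```python
import random

def split_indices(n, ratio=0.2):
    # one shuffle, then slice: the first chunk becomes the test set
    shuffled = random.sample(range(n), n)
    test_size = int(n * ratio)
    return shuffled[:test_size], shuffled[test_size:]

test_idx, train_idx = split_indices(100)
```

# With n=100 and ratio=0.2 this always yields 20 test and 80 train indices
# that partition range(100) with no overlap.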
num_of_samples = None
test_indxs, train_sup_indxs = split_train_test(len(label_init), num_of_samples=num_of_samples)
val_indxs, train_indxs = split_train_test(len(train_sup_indxs))
val_indxs = train_sup_indxs[val_indxs]
train_indxs = train_sup_indxs[train_indxs]
test_indxs = sorted(list(test_indxs))
val_indxs = sorted(list(val_indxs))
train_indxs = sorted(list(train_indxs))
hist_file = open(os.path.join(save_dir, expr_name + '_hist_objects_' + expr_no + '.pkl'), 'wb')
var_dict = {}
var_dict['test_indxs'] = test_indxs
var_dict['val_indxs'] = val_indxs
var_dict['train_indxs'] = train_indxs
vox_mat_train = vox_init[:]
vox_mat_val = vox_mat_train[val_indxs]
vox_mat_train = vox_mat_train[train_indxs]
vox_file.close()
image_mat_train = image_init[:]
image_mat_val = image_mat_train[val_indxs]
image_mat_train = image_mat_train[train_indxs]
image_file.close()
cali_mat_train = cali_init[:]
cali_mat_val = cali_mat_train[val_indxs]
cali_mat_train = cali_mat_train[train_indxs]
cali_file.close()
label_mat_train = label_init[:]
label_mat_val = label_mat_train[val_indxs]
label_mat_train = label_mat_train[train_indxs]
label_file.close()
# -----------------------------------------------------------------
print('\n\ncompletely loaded...\n\n')
result = model.fit(x={'input_vox': np.expand_dims(vox_mat_train, axis=-1), 'input_fluoro_1': np.expand_dims(image_mat_train[:, 0, :, :], axis=-1), 'input_fluoro_2': np.expand_dims(image_mat_train[:, 1, :, :], axis=-1), 'input_cali': cali_mat_train}, y=label_mat_train, validation_data=([np.expand_dims(vox_mat_val, axis=-1), np.expand_dims(image_mat_val[:, 0, :, :], axis=-1), np.expand_dims(image_mat_val[:, 1, :, :], axis=-1), cali_mat_val], label_mat_val), epochs=params['model_epochs'], batch_size=params['model_batchsize'], shuffle=True, verbose=2)
model.save(os.path.abspath(os.path.join(save_dir, expr_name + '_' + expr_no + '.h5')))
var_dict['result'] = result.history
pickle.dump(var_dict, hist_file)
hist_file.close()
# --- cvfm/database.py (atsgen/tf-vcenter-fabric-manager, Apache-2.0) ---
from builtins import object
import collections
import logging

logger = logging.getLogger(__name__)


class Database(object):
    def __init__(self):
        self._vm_models = {}
        self._dpg_models = {}
        self._supported_dvses = set()
        self._physical_interfaces = collections.defaultdict(list)
        self._host_name_to_vms = collections.defaultdict(list)

    def add_vm_model(self, vm_model):
        self._vm_models[vm_model.name] = vm_model
        self._host_name_to_vms[vm_model.host_name].append(vm_model)
        logger.debug("Saved %s", vm_model)

    def get_vm_model(self, vm_name):
        return self._vm_models.get(vm_name)

    def remove_vm_model(self, vm_name):
        vm_model = self._vm_models.pop(vm_name, None)
        if vm_model:
            self._host_name_to_vms[vm_model.host_name].remove(vm_model)
        return vm_model

    def get_all_vm_models(self):
        return list(self._vm_models.values())

    def get_vm_models_by_dpg_model(self, dpg_model):
        return [
            vm
            for vm in self.get_all_vm_models()
            if vm.has_interface_in_dpg(dpg_model)
        ]

    def get_vm_models_by_host_name(self, host_name):
        return self._host_name_to_vms.get(host_name, [])

    def clear_database(self):
        self._vm_models = {}
        self._dpg_models = {}
        self._supported_dvses = set()
        self._physical_interfaces = collections.defaultdict(list)
        logger.info("Cleared local database.")

    def add_dpg_model(self, dpg_model):
        self._dpg_models[dpg_model.name] = dpg_model
        logger.debug("Saved %s", dpg_model)

    def get_dpg_model(self, dpg_name):
        return self._dpg_models.get(dpg_name)

    def remove_dpg_model(self, dpg_name):
        return self._dpg_models.pop(dpg_name, None)

    def get_all_dpg_models(self):
        return list(self._dpg_models.values())

    def add_supported_dvs(self, dvs_name):
        self._supported_dvses.add(dvs_name)

    def is_dvs_supported(self, dvs_name):
        return dvs_name in self._supported_dvses

    def add_pi_model(self, pi_model):
        key = (pi_model.host_name, pi_model.dvs_name)
        self._physical_interfaces[key].append(pi_model)

    def get_pi_models_for_vpg(self, vpg_model):
        key = (vpg_model.host_name, vpg_model.dvs_name)
        return self._physical_interfaces[key]
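# The class above relies on collections.defaultdict(list) so that looking up an
# unseen (host, dvs) key yields an empty list instead of raising KeyError.
# Minimal illustration of that behaviour with made-up keys (not project data):

```python
import collections

physical_interfaces = collections.defaultdict(list)
physical_interfaces[("host-1", "dvs-1")].append("pi-0")

known = physical_interfaces[("host-1", "dvs-1")]
unknown = physical_interfaces[("host-2", "dvs-1")]  # created on access, empty
```

# Note the side effect: indexing a missing key inserts it, which is why the
# class pairs this with explicit .get(..., []) where insertion is unwanted.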
# --- config.py (JuntaoZhong/pokemon-database-webapp, MIT) ---
user = 'postgres'
database = 'pokemon_db'
password = 'postgres'
# --- OpenGL/GLES2/NVX/blend_equation_advanced_multi_draw_buffers.py (PyOpenGL, via JE-Chen/je_old_repo, MIT) ---
'''OpenGL extension NVX.blend_equation_advanced_multi_draw_buffers
This module customises the behaviour of the
OpenGL.raw.GLES2.NVX.blend_equation_advanced_multi_draw_buffers to provide a more
Python-friendly API
Overview (from the spec)
This extension adds support for using advanced blend equations
introduced with NV_blend_equation_advanced (and standardized
by KHR_blend_equation_advanced) in conjunction with multiple
draw buffers. The NV_blend_equation_advanced extension supports
advanced blending equations only when rendering to a single color
buffer using fragment color zero and throws an INVALID_OPERATION
error when multiple draw buffers are used. This extension removes
this restriction.
The official definition of this extension is available here:
http://www.opengl.org/registry/specs/NVX/blend_equation_advanced_multi_draw_buffers.txt
'''
from OpenGL import platform, constant, arrays
from OpenGL import extensions, wrapper
import ctypes
from OpenGL.raw.GLES2 import _types, _glgets
from OpenGL.raw.GLES2.NVX.blend_equation_advanced_multi_draw_buffers import *
from OpenGL.raw.GLES2.NVX.blend_equation_advanced_multi_draw_buffers import _EXTENSION_NAME
def glInitBlendEquationAdvancedMultiDrawBuffersNVX():
'''Return boolean indicating whether this extension is available'''
from OpenGL import extensions
return extensions.hasGLExtension( _EXTENSION_NAME )
### END AUTOGENERATED SECTION | 43.088235 | 92 | 0.822526 | 195 | 1,465 | 5.969231 | 0.466667 | 0.089347 | 0.14433 | 0.103093 | 0.225086 | 0.225086 | 0.225086 | 0.156357 | 0.156357 | 0.156357 | 0 | 0.003162 | 0.136519 | 1,465 | 34 | 93 | 43.088235 | 0.916996 | 0.69215 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | true | 0 | 0.777778 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
16d877a8588916581ed76bfde53c4b2fa2b3f575 | 226 | py | Python | main.py | Gsak3l/python-image-face-regognition | 8ad7dc8a27d634d794eee69d350ce6ba8029a608 | [
"MIT"
] | null | null | null | main.py | Gsak3l/python-image-face-regognition | 8ad7dc8a27d634d794eee69d350ce6ba8029a608 | [
"MIT"
] | null | null | null | main.py | Gsak3l/python-image-face-regognition | 8ad7dc8a27d634d794eee69d350ce6ba8029a608 | [
"MIT"
] | null | null | null | # what commands to type before starting
# pip install pipenv
# pipenv shell
# pipenv install face_recognition
# face_recognition --tolerance 0.50 --show-distance true --cpus 8 ./images/known ./images/unknown/ | cut -d ',' -f2
| 37.666667 | 115 | 0.738938 | 32 | 226 | 5.15625 | 0.8125 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025907 | 0.146018 | 226 | 5 | 116 | 45.2 | 0.829016 | 0.951327 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
16e158e370621ea54e309d69b61e4ad5cbf2e946 | 236 | py | Python | C_Tut/contact/admin.py | jaydeep11/C-tutorial-web-application | 3ec0225efda834fe93a678d887044906124de59b | [
"MIT"
] | null | null | null | C_Tut/contact/admin.py | jaydeep11/C-tutorial-web-application | 3ec0225efda834fe93a678d887044906124de59b | [
"MIT"
] | null | null | null | C_Tut/contact/admin.py | jaydeep11/C-tutorial-web-application | 3ec0225efda834fe93a678d887044906124de59b | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Contact
# Register your models here.
class ContactAdmin(admin.ModelAdmin):
readonly_fields = ["first_name","last_name","email","message"]
admin.site.register(Contact,ContactAdmin) | 39.333333 | 66 | 0.792373 | 30 | 236 | 6.133333 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09322 | 236 | 6 | 67 | 39.333333 | 0.859813 | 0.110169 | 0 | 0 | 0 | 0 | 0.148325 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
bc4463fc3629ae1399f5b3d71e1d3feaf8a42247 | 1,135 | py | Python | xrpl/models/requests/ledger_data.py | SubCODERS/xrpl-py | 24a02d099002625794f5b6491ec2cafd872cc721 | [
"ISC"
] | 1 | 2022-03-11T07:01:02.000Z | 2022-03-11T07:01:02.000Z | xrpl/models/requests/ledger_data.py | SubCODERS/xrpl-py | 24a02d099002625794f5b6491ec2cafd872cc721 | [
"ISC"
] | 2 | 2022-02-23T22:57:46.000Z | 2022-02-24T11:41:49.000Z | xrpl/models/requests/ledger_data.py | SubCODERS/xrpl-py | 24a02d099002625794f5b6491ec2cafd872cc721 | [
"ISC"
] | 1 | 2022-02-21T07:36:36.000Z | 2022-02-21T07:36:36.000Z | """
The ledger_data method retrieves contents of
the specified ledger. You can iterate through
several calls to retrieve the entire contents
of a single ledger version.
`See ledger data <https://xrpl.org/ledger_data.html>`_
"""
from dataclasses import dataclass, field
from typing import Any, Optional, Union
from xrpl.models.requests.request import Request, RequestMethod
from xrpl.models.utils import require_kwargs_on_init
@require_kwargs_on_init
@dataclass(frozen=True)
class LedgerData(Request):
"""
The ledger_data method retrieves contents of
the specified ledger. You can iterate through
several calls to retrieve the entire contents
of a single ledger version.
`See ledger data <https://xrpl.org/ledger_data.html>`_
"""
method: RequestMethod = field(default=RequestMethod.LEDGER_DATA, init=False)
ledger_hash: Optional[str] = None
ledger_index: Optional[Union[str, int]] = None
binary: bool = False
limit: Optional[int] = None
# marker data shape is actually undefined in the spec, up to the
# implementation of an individual server
marker: Optional[Any] = None
| 33.382353 | 80 | 0.753304 | 157 | 1,135 | 5.350318 | 0.43949 | 0.083333 | 0.030952 | 0.045238 | 0.419048 | 0.419048 | 0.419048 | 0.419048 | 0.419048 | 0.419048 | 0 | 0 | 0.171806 | 1,135 | 33 | 81 | 34.393939 | 0.893617 | 0.477533 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.307692 | 0 | 0.846154 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
bc5ebcddc31fcfc02e140cf52c83b9e9e1f821b9 | 350 | py | Python | python/network/Foundations-of-Python-Network-Programming/foundations-of-python-network-programming-14/source/chapter08/queuecrazy.py | bosserbosser/codetest | 987563900d912e891b53eeda8e2cf36f3c769430 | [
"Apache-2.0"
] | null | null | null | python/network/Foundations-of-Python-Network-Programming/foundations-of-python-network-programming-14/source/chapter08/queuecrazy.py | bosserbosser/codetest | 987563900d912e891b53eeda8e2cf36f3c769430 | [
"Apache-2.0"
] | null | null | null | python/network/Foundations-of-Python-Network-Programming/foundations-of-python-network-programming-14/source/chapter08/queuecrazy.py | bosserbosser/codetest | 987563900d912e891b53eeda8e2cf36f3c769430 | [
"Apache-2.0"
] | null | null | null | # The script printed in the book that calls itself "queuecrazy.py" is
# actually named "queuepi.py", as you can see a page or two later when
# the book demonstrates how to invoke the script at the shell prompt.
# So the script actually lives at the URL:
# https://github.com/brandon-rhodes/fopnp/blob/m/py3/chapter08/queuepi.py
# Sorry about that!
| 38.888889 | 73 | 0.757143 | 61 | 350 | 4.344262 | 0.754098 | 0.101887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010204 | 0.16 | 350 | 8 | 74 | 43.75 | 0.891156 | 0.957143 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
bc8f9b5d14129ebcc89949be7dd00347728a9636 | 269 | py | Python | django/esite/esite/views.py | vollov/django-template | ca904ace18919dbb557961acbb9959ffd48d4d20 | [
"MIT"
] | null | null | null | django/esite/esite/views.py | vollov/django-template | ca904ace18919dbb557961acbb9959ffd48d4d20 | [
"MIT"
] | null | null | null | django/esite/esite/views.py | vollov/django-template | ca904ace18919dbb557961acbb9959ffd48d4d20 | [
"MIT"
] | null | null | null | from django.http import HttpResponse
from django.shortcuts import render
# import the logging library
import logging
# Get an instance of a logger
logger = logging.getLogger(__name__)
def home(request):
return render(request, 'home.html', {'page_title': 'home'}) | 24.454545 | 63 | 0.765799 | 37 | 269 | 5.432432 | 0.675676 | 0.099502 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141264 | 269 | 11 | 63 | 24.454545 | 0.87013 | 0.200743 | 0 | 0 | 0 | 0 | 0.107981 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.5 | 0.166667 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 3 |
bca9285b98a6ae5d1452b8301a7ba4a1315ee54d | 343 | py | Python | src/cmp/cool_lang/ast/attr_declaration_node.py | codestrange/cool-compiler-2020 | 30508965d75a1a1d1362d0b51bef8da3978fd0c2 | [
"MIT"
] | 3 | 2020-01-14T04:47:32.000Z | 2020-09-10T17:57:20.000Z | src/cmp/cool_lang/ast/attr_declaration_node.py | codestrange/cool-compiler-2020 | 30508965d75a1a1d1362d0b51bef8da3978fd0c2 | [
"MIT"
] | 5 | 2020-01-14T06:06:35.000Z | 2020-02-19T01:01:33.000Z | src/cmp/cool_lang/ast/attr_declaration_node.py | codestrange/cool-compiler-2020 | 30508965d75a1a1d1362d0b51bef8da3978fd0c2 | [
"MIT"
] | 3 | 2020-01-14T04:58:24.000Z | 2020-01-14T16:23:41.000Z | from .expresion_node import ExpressionNode
from .feature_declaration_node import FeatureDeclarationNode
class AttrDeclarationNode(FeatureDeclarationNode):
def __init__(self, idx: str, typex: str, expression: ExpressionNode, line: int, column: int):
super(AttrDeclarationNode, self).__init__(idx, typex, expression, line, column)
| 42.875 | 97 | 0.795918 | 36 | 343 | 7.277778 | 0.583333 | 0.076336 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122449 | 343 | 7 | 98 | 49 | 0.870432 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
bcc06ddb6c5cb8aab1de28e9dea70a2678d807ed | 135 | py | Python | table_not_found_error.py | WholeCoder/rlstools | 48b69905317dc495e4a44b66faa25b3978b3f383 | [
"MIT"
] | null | null | null | table_not_found_error.py | WholeCoder/rlstools | 48b69905317dc495e4a44b66faa25b3978b3f383 | [
"MIT"
] | null | null | null | table_not_found_error.py | WholeCoder/rlstools | 48b69905317dc495e4a44b66faa25b3978b3f383 | [
"MIT"
] | null | null | null | from sqlite3 import OperationalError
class TableNotFoundError(OperationalError):
def __init__(self, msg):
self.msg = msg
| 19.285714 | 43 | 0.740741 | 14 | 135 | 6.857143 | 0.714286 | 0.145833 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009174 | 0.192593 | 135 | 6 | 44 | 22.5 | 0.87156 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |