hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4010f4c156cc298232cb7a64e839715db36b91ff | 225 | py | Python | GUI/ErrorDialog.py | ishtjot/susereumutep | 56e20c1777e0c938ac42bd8056f84af9e0b76e46 | [
"Apache-2.0"
] | 2 | 2018-11-07T20:52:53.000Z | 2019-10-20T15:57:01.000Z | GUI/ErrorDialog.py | ishtjot/susereumutep | 56e20c1777e0c938ac42bd8056f84af9e0b76e46 | [
"Apache-2.0"
] | 3 | 2021-12-14T20:57:54.000Z | 2022-01-21T23:50:36.000Z | GUI/ErrorDialog.py | ishtjot/susereumutep | 56e20c1777e0c938ac42bd8056f84af9e0b76e46 | [
"Apache-2.0"
] | 2 | 2018-11-16T04:20:06.000Z | 2019-03-28T23:49:13.000Z | import gi; gi.require_version('Gtk', '3.0')
from gi.repository import Gtk
def ErrorDialog(self, message):
d = Gtk.MessageDialog(self, 0, Gtk.MessageType.WARNING, Gtk.ButtonsType.OK, message)
d.run()
d.destroy()
| 25 | 88 | 0.702222 | 33 | 225 | 4.757576 | 0.636364 | 0.101911 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015707 | 0.151111 | 225 | 8 | 89 | 28.125 | 0.806283 | 0 | 0 | 0 | 0 | 0 | 0.026667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
401822fa5b9ada4f30d9e21d43233199ad6096c2 | 3,423 | py | Python | library_test.py | tdd-laboratory/tdd-homework-zhigunenko-dsr | 573f3d9d90fd7ed863a5111448e083b38788ca33 | [
"Apache-2.0"
] | null | null | null | library_test.py | tdd-laboratory/tdd-homework-zhigunenko-dsr | 573f3d9d90fd7ed863a5111448e083b38788ca33 | [
"Apache-2.0"
] | null | null | null | library_test.py | tdd-laboratory/tdd-homework-zhigunenko-dsr | 573f3d9d90fd7ed863a5111448e083b38788ca33 | [
"Apache-2.0"
] | null | null | null | import unittest
import library
NUM_CORPUS = '''
On the 5th of May every year, Mexicans celebrate Cinco de Mayo. This tradition
began in 1845 (the twenty-second anniversary of the Mexican Revolution), and
is the 1st example of a national independence holiday becoming popular in the
Western Hemisphere. (The Fourth of July didn't see regular celebration in the
US until 15-20 years later.) It is celebrated by 77.9% of the population--
trending toward 80.
'''
class TestCase(unittest.TestCase):
# Helper function
def assert_extract(self, text, extractors, *expected):
actual = [x[1].group(0) for x in library.scan(text, extractors)]
        self.assertEqual(str(actual), str([x for x in expected]))
# First unit test; prove that if we scan NUM_CORPUS looking for mixed_ordinals,
# we find "5th" and "1st".
def test_mixed_ordinals(self):
self.assert_extract(NUM_CORPUS, library.mixed_ordinals, '5th', '1st')
# Second unit test; prove that if we look for integers, we find four of them.
def test_integers(self):
self.assert_extract(NUM_CORPUS, library.integers, '1845', '15', '20', '80')
def test_iso_dates(self):
self.assert_extract("I was born on 2015-07-25.", library.dates_iso8601, '2015-07-25')
# Third unit test; prove that if we look for integers where there are none, we get no results.
def test_no_integers(self):
self.assert_extract("no integers", library.integers)
def test_no_dates1(self):
self.assert_extract("2019-12-32", library.dates_iso8601)
def test_no_dates2(self):
self.assert_extract("2019-13-31", library.dates_iso8601)
def test_human_readable_date01(self):
self.assert_extract('I was born on 25 Jan 2017.', library.dates_format, '25 Jan 2017')
def test_human_readable_date02(self):
self.assert_extract('I was born on 25 Feb 2017.', library.dates_format, '25 Feb 2017')
def test_human_readable_date03(self):
self.assert_extract('I was born on 25 Mar 2017.', library.dates_format, '25 Mar 2017')
def test_human_readable_date04(self):
self.assert_extract('I was born on 25 Apr 2017.', library.dates_format, '25 Apr 2017')
def test_human_readable_date05(self):
self.assert_extract('I was born on 25 May 2017.', library.dates_format, '25 May 2017')
def test_human_readable_date06(self):
self.assert_extract('I was born on 25 Jun 2017.', library.dates_format, '25 Jun 2017')
def test_human_readable_date07(self):
self.assert_extract('I was born on 25 Jul 2017.', library.dates_format, '25 Jul 2017')
def test_human_readable_date08(self):
self.assert_extract('I was born on 25 Aug 2017.', library.dates_format, '25 Aug 2017')
def test_human_readable_date09(self):
self.assert_extract('I was born on 25 Sep 2017.', library.dates_format, '25 Sep 2017')
def test_human_readable_date10(self):
self.assert_extract('I was born on 25 Oct 2017.', library.dates_format, '25 Oct 2017')
def test_human_readable_date11(self):
self.assert_extract('I was born on 25 Nov 2017.', library.dates_format, '25 Nov 2017')
def test_human_readable_date12(self):
self.assert_extract('I was born on 25 Dec 2017.', library.dates_format, '25 Dec 2017')
if __name__ == '__main__':
unittest.main()
| 41.743902 | 98 | 0.697926 | 523 | 3,423 | 4.386233 | 0.284895 | 0.107672 | 0.109852 | 0.164778 | 0.560157 | 0.258936 | 0.249782 | 0.217524 | 0.20401 | 0 | 0 | 0.089875 | 0.203623 | 3,423 | 81 | 99 | 42.259259 | 0.751651 | 0.083845 | 0 | 0 | 0 | 0 | 0.321406 | 0 | 0 | 0 | 0 | 0 | 0.384615 | 1 | 0.365385 | false | 0 | 0.038462 | 0 | 0.423077 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
401a7c953c1cd64b252db8d0802352217ba74422 | 207 | py | Python | djenga/animal_pairs/utils.py | 2ps/djenga | 85ac2c7b0b0e80b55aff43f027814d05b9b0532c | [
"BSD-3-Clause"
] | 6 | 2015-01-18T10:31:13.000Z | 2019-06-14T17:39:58.000Z | djenga/animal_pairs/utils.py | 2ps/djenga | 85ac2c7b0b0e80b55aff43f027814d05b9b0532c | [
"BSD-3-Clause"
] | 12 | 2015-05-03T06:58:00.000Z | 2019-06-26T21:58:16.000Z | djenga/animal_pairs/utils.py | 2ps/djenga | 85ac2c7b0b0e80b55aff43f027814d05b9b0532c | [
"BSD-3-Clause"
] | 1 | 2018-04-27T20:36:29.000Z | 2018-04-27T20:36:29.000Z |
import random
from .dictionary import ADJECTIVES, ANIMALS
__all__ = [ 'animal_pair' ]
def animal_pair():
return u'%s-%s' % (
random.choice(ADJECTIVES),
random.choice(ANIMALS),
)
| 14.785714 | 43 | 0.63285 | 23 | 207 | 5.434783 | 0.608696 | 0.16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.241546 | 207 | 13 | 44 | 15.923077 | 0.796178 | 0 | 0 | 0 | 0 | 0 | 0.07767 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.25 | 0.125 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
4027293246ee585a8ac189984d43f157c096a3b8 | 6,591 | py | Python | screen_translate/Translate.py | fnx4/Screen-Translate | f4564cc21787cbc87e9f9b97c1e23a6572eaf353 | [
"MIT"
] | null | null | null | screen_translate/Translate.py | fnx4/Screen-Translate | f4564cc21787cbc87e9f9b97c1e23a6572eaf353 | [
"MIT"
] | null | null | null | screen_translate/Translate.py | fnx4/Screen-Translate | f4564cc21787cbc87e9f9b97c1e23a6572eaf353 | [
"MIT"
] | null | null | null | import requests
from screen_translate.Mbox import Mbox
from screen_translate.LangCode import *
# ----------------------------------------------------------------
# Imports Library
# Google Translate
try:
from deep_translator import GoogleTranslator
except ConnectionError as e:
print("Error: No Internet Connection. Please Restart With Internet Connected", str(e))
Mbox("Error: No Internet Connection", str(e), 2)
except Exception as e:
print("Error", str(e))
Mbox("Error", str(e), 2)
# ----------------------------------------------------------------
# MyMemory Translate
try:
from deep_translator import MyMemoryTranslator
except ConnectionError as e:
print("Error: No Internet Connection. Please Restart With Internet Connected", str(e))
Mbox("Error: No Internet Connection", str(e), 2)
except Exception as e:
print("Error", str(e))
Mbox("Error", str(e), 2)
# ----------------------------------------------------------------
# Pons
try:
from deep_translator import PonsTranslator
except ConnectionError as e:
print("Error: No Internet Connection. Please Restart With Internet Connected", str(e))
Mbox("Error: No Internet Connection", str(e), 2)
except Exception as e:
print("Error", str(e))
Mbox("Error", str(e), 2)
__all__ = ["google_tl", "memory_tl", "pons_tl", "libre_tl"]
# ----------------------------------------------------------------
# TL Functions
# Google
def google_tl(text, to_lang, from_lang="auto", oldMethod=False):
"""Translate Using Google Translate
    Args:
        text (str): Text to translate
        to_lang (str): Target language
        from_lang (str, optional): Source language. Defaults to "auto".
        oldMethod (bool, optional): Use the legacy translate.googleapis.com endpoint. Defaults to False.
    Returns:
        tuple: (is_Success, result) where result is the translation or an error message
"""
is_Success = False
result = ""
# --- Get lang code ---
try:
to_LanguageCode_Google = google_Lang[to_lang]
from_LanguageCode_Google = google_Lang[from_lang]
except KeyError as e:
print("Error: " + str(e))
return is_Success, "Error Language Code Undefined"
# --- Translate ---
try:
if not oldMethod:
result = GoogleTranslator(source=from_LanguageCode_Google, target=to_LanguageCode_Google).translate(text.strip())
else:
url = 'https://translate.googleapis.com/translate_a/single?client=gtx&sl={}&tl={}&dt=t&q={}'.format(from_LanguageCode_Google, to_LanguageCode_Google, text.replace('\n', ' ').replace(' ', '%20').strip())
result = requests.get(url).json()[0][0][0]
is_Success = True
except Exception as e:
print(str(e))
result = str(e)
Mbox("Error", str(e), 2)
finally:
print("-" * 50)
print("Query: " + text.strip())
print("-" * 50)
print("Translation Get: "+ result)
return is_Success, result
# My Memory
def memory_tl(text, to_lang, from_lang="auto"):
"""Translate Using MyMemoryTranslator
    Args:
        text (str): Text to translate
        to_lang (str): Target language
        from_lang (str, optional): Source language. Defaults to "auto".
    Returns:
        tuple: (is_Success, result) where result is the translation or an error message
"""
is_Success = False
result = ""
# --- Get lang code ---
try:
to_LanguageCode_Memory = myMemory_Lang[to_lang]
from_LanguageCode_Memory = myMemory_Lang[from_lang]
except KeyError as e:
print("Error: " + str(e))
return is_Success, "Error Language Code Undefined"
# --- Translate ---
try:
result = MyMemoryTranslator(source=from_LanguageCode_Memory, target=to_LanguageCode_Memory).translate(text.strip())
is_Success = True
except Exception as e:
print(str(e))
result = str(e)
Mbox("Error", str(e), 2)
finally:
print("-" * 50)
print("Query: " + text.strip())
print("-" * 50)
print("Translation Get: "+ result)
return is_Success, result
# PonsTranslator
def pons_tl(text, to_lang, from_lang):
"""Translate Using PONS
    Args:
        text (str): Text to translate
        to_lang (str): Target language
        from_lang (str): Source language
    Returns:
        tuple: (is_Success, result) where result is the translation or an error message
"""
is_Success = False
result = ""
# --- Get lang code ---
try:
to_LanguageCode_Pons = pons_Lang[to_lang]
from_LanguageCode_Pons = pons_Lang[from_lang]
except KeyError as e:
print("Error: " + str(e))
return is_Success, "Error Language Code Undefined"
# --- Translate ---
try:
result = PonsTranslator(source=from_LanguageCode_Pons, target=to_LanguageCode_Pons).translate(text.strip())
is_Success = True
except Exception as e:
print(str(e))
result = str(e)
Mbox("Error", str(e), 2)
finally:
print("-" * 50)
print("Query: " + text.strip())
print("-" * 50)
print("Translation Get: "+ result)
return is_Success, result
# LibreTranslator
def libre_tl(text, to_lang, from_lang, host="localhost", port="5000"):
"""Translate Using LibreTranslate
    Args:
        text (str): Text to translate
        to_lang (str): Target language
        from_lang (str): Source language
        host (str, optional): Server hostname. Defaults to "localhost".
        port (str, optional): Server port. Defaults to "5000".
    Returns:
        tuple: (is_Success, result) where result is the translation or an error message
"""
is_Success = False
result = ""
# --- Get lang code ---
try:
to_LanguageCode_Libre = libre_Lang[to_lang]
from_LanguageCode_Libre = libre_Lang[from_lang]
except KeyError as e:
print("Error: " + str(e))
return is_Success, "Error Language Code Undefined"
# --- Translate ---
try:
request = {
"q": text,
"source": from_LanguageCode_Libre,
"target": to_LanguageCode_Libre,
"format": "text"
}
adr = "http://" + host + ":" + port + "/translate"
response = requests.post(adr, request).json()
if "error" in response:
result = response["error"]
else:
result = response["translatedText"]
is_Success = True
except Exception as e:
print(str(e))
result = str(e)
Mbox("Error", str(e), 2)
finally:
print("-" * 50)
print("Query: " + text.strip())
print("-" * 50)
print("Translation Get: " + result)
return is_Success, result
| 32.308824 | 214 | 0.577302 | 736 | 6,591 | 5.03125 | 0.146739 | 0.030246 | 0.030246 | 0.035107 | 0.68674 | 0.651364 | 0.621118 | 0.608156 | 0.608156 | 0.608156 | 0 | 0.007193 | 0.261721 | 6,591 | 203 | 215 | 32.46798 | 0.753802 | 0.228949 | 0 | 0.725191 | 0 | 0.007634 | 0.164176 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030534 | false | 0 | 0.045802 | 0 | 0.137405 | 0.229008 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
402fab453bb9e2ac59ef1604d2cf41a8d383046e | 1,270 | py | Python | move.py | Nexowned/SnakeAI | 95b5d4a9d20df124040ff9335ad09409ca9ff607 | [
"Apache-2.0"
] | null | null | null | move.py | Nexowned/SnakeAI | 95b5d4a9d20df124040ff9335ad09409ca9ff607 | [
"Apache-2.0"
] | null | null | null | move.py | Nexowned/SnakeAI | 95b5d4a9d20df124040ff9335ad09409ca9ff607 | [
"Apache-2.0"
] | null | null | null | from enum import Enum
class Move(Enum):
LEFT = -1
STRAIGHT = 0
RIGHT = 1
class Direction(Enum):
NORTH = 0
EAST = 1
SOUTH = 2
WEST = 3
def get_new_direction(self, move):
        return Direction((self.value + move.value) % 4)
def get_xy_manipulation(self):
m = {
Direction.NORTH: (0, -1),
Direction.EAST: (1, 0),
Direction.SOUTH: (0, 1),
Direction.WEST: (-1, 0)
}
return m[self]
def get_xy_moves(self):
m = {
Direction.NORTH: [Direction.NORTH.get_xy_manipulation(), Direction.EAST.get_xy_manipulation(),
Direction.WEST.get_xy_manipulation()],
Direction.EAST: [Direction.NORTH.get_xy_manipulation(), Direction.EAST.get_xy_manipulation(),
Direction.SOUTH.get_xy_manipulation()],
Direction.SOUTH: [Direction.SOUTH.get_xy_manipulation(), Direction.EAST.get_xy_manipulation(),
Direction.WEST.get_xy_manipulation()],
Direction.WEST: [Direction.NORTH.get_xy_manipulation(), Direction.WEST.get_xy_manipulation(),
Direction.SOUTH.get_xy_manipulation()],
}
return m[self]
| 30.238095 | 106 | 0.573228 | 139 | 1,270 | 5.021583 | 0.201439 | 0.100287 | 0.316619 | 0.409742 | 0.581662 | 0.563037 | 0.510029 | 0.510029 | 0.446991 | 0.366762 | 0 | 0.018412 | 0.315748 | 1,270 | 41 | 107 | 30.97561 | 0.78481 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.09375 | false | 0 | 0.03125 | 0.03125 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
403ed8c759891b08ee51690d9702e340ed6f4833 | 2,620 | py | Python | instascrape/exceptions.py | tnychn/instascrape | 7aaf3c1a1786bbe80059ed6e0d93442a19a6f475 | [
"MIT"
] | 80 | 2020-05-28T17:22:14.000Z | 2022-03-25T07:15:51.000Z | instascrape/exceptions.py | AlphaXenon/InstaScrape | 7aaf3c1a1786bbe80059ed6e0d93442a19a6f475 | [
"MIT"
] | 23 | 2020-05-25T12:45:40.000Z | 2022-03-06T05:44:41.000Z | instascrape/exceptions.py | AlphaXenon/InstaScrape | 7aaf3c1a1786bbe80059ed6e0d93442a19a6f475 | [
"MIT"
] | 14 | 2020-06-28T05:52:28.000Z | 2022-03-28T04:27:50.000Z | class InstascrapeError(Exception):
"""Base exception class for all of the exceptions raised by Instascrape."""
class ExtractionError(InstascrapeError):
"""Raised when Instascrape fails to extract specified data from HTTP response."""
def __init__(self, message: str):
super().__init__("Failed to extract data from response. (message: '{0}')".format(message))
class PrivateAccessError(InstascrapeError):
"""Raised when user does not have permission to access specified data, i.e. private profile which the user is not following."""
def __init__(self):
super().__init__("The user profile is private and not being followed by you.")
class RateLimitedError(InstascrapeError):
"""Raised when Instascrape receives a 429 TooManyRequests from Instagram."""
def __init__(self):
super().__init__("(429) Too many requests. Failed to query data. Rate limited by Instagram.")
class NotFoundError(InstascrapeError):
"""Raised when Instascrape receives a 404 Not Found from Instagram."""
def __init__(self, message: str = None):
super().__init__(message or "(404) Nothing found.")
class ConnectionError(InstascrapeError):
"""Raised when Instascrape fails to connect to Instagram server."""
def __init__(self, url: str):
super().__init__("Failed to connect to '{0}'.".format(url))
class LoginError(InstascrapeError):
"""Raised when Instascrape fails to perform authentication, e.g. wrong credentials."""
def __init__(self, message: str):
super().__init__("Failed to log into Instagram. (message: '{0}')".format(message))
class TwoFactorAuthRequired(LoginError):
"""Raised when Instascrape fails to perform authentication due to two-factor authenticattion."""
def __init__(self):
super().__init__("two-factor authentication is required")
class CheckpointChallengeRequired(LoginError):
"""Raised when Instascrape fails to perform authentication due to checkpoint challenge."""
def __init__(self):
super().__init__("checkpoint challenge solving is required")
class AuthenticationRequired(InstascrapeError):
"""Raised when anonymous/unauthenticated (guest) user tries to perform actions that require authentication."""
def __init__(self):
super().__init__("Login is required in order to perform this action.")
class DownloadError(InstascrapeError):
"""Raised when Instascrape fails to download data from Instagram server."""
def __init__(self, message: str, url: str):
super().__init__("Download Failed -> {0} (url: '{1}')".format(message, url))
| 35.890411 | 131 | 0.719084 | 305 | 2,620 | 5.914754 | 0.344262 | 0.055432 | 0.060976 | 0.12306 | 0.429047 | 0.273282 | 0.140244 | 0.113082 | 0.113082 | 0.070953 | 0 | 0.007823 | 0.170611 | 2,620 | 72 | 132 | 36.388889 | 0.822365 | 0.342366 | 0 | 0.225806 | 0 | 0 | 0.263947 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.322581 | false | 0 | 0 | 0 | 0.677419 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
406082b1a0eb7c0edeb83793f334bb861a0e79b1 | 870 | py | Python | pykeys.py | palsayantan/PyKeys | b7e42e9a3a61c470a072096227656179440f1497 | [
"MIT"
] | 1 | 2020-03-25T17:55:25.000Z | 2020-03-25T17:55:25.000Z | pykeys.py | palsayantan/PyKeys | b7e42e9a3a61c470a072096227656179440f1497 | [
"MIT"
] | null | null | null | pykeys.py | palsayantan/PyKeys | b7e42e9a3a61c470a072096227656179440f1497 | [
"MIT"
] | null | null | null | import serial #Serial imported for Serial communication
import time #Required to use delay functions
import pyautogui
ArduinoSerial = serial.Serial('COM11',9600) #Create Serial port object called arduinoSerialData
time.sleep(2) #wait for 2 seconds for the communication to get established
# a(2) | up
# e(6) b(3) d(5) | left enter right
# c(4) | down
while True:
    incoming = ArduinoSerial.readline().decode(errors='ignore')  # read the serial data as a text line
    # print(incoming)
if 'b' in incoming:
pyautogui.typewrite(['space'], 0.2)
if 'e' in incoming:
pyautogui.hotkey('left')
if 'd' in incoming:
pyautogui.hotkey('right')
if 'a' in incoming:
pyautogui.hotkey('up')
if 'c' in incoming:
pyautogui.hotkey('down')
| 27.1875 | 96 | 0.591954 | 108 | 870 | 4.768519 | 0.537037 | 0.097087 | 0.184466 | 0.194175 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026846 | 0.314943 | 870 | 31 | 97 | 28.064516 | 0.837248 | 0.394253 | 0 | 0 | 0 | 0 | 0.061602 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.176471 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4078c4499453a3efc73c41617d322f80482c4509 | 2,323 | py | Python | changeling/pathfinder.py | trashtatur/wonderdraft-profile-manager | 80f9e74781b4c890d41e2d14fa915ee683584b68 | [
"MIT"
] | 3 | 2020-04-13T04:44:39.000Z | 2021-06-06T18:24:59.000Z | changeling/pathfinder.py | trashtatur/wonderdraft-profile-manager | 80f9e74781b4c890d41e2d14fa915ee683584b68 | [
"MIT"
] | null | null | null | changeling/pathfinder.py | trashtatur/wonderdraft-profile-manager | 80f9e74781b4c890d41e2d14fa915ee683584b68 | [
"MIT"
] | null | null | null | import os
from changeling.file_interactions.YMLConfigReader import YMLConfigReader
from changeling.init import Initial
class Pathfinder:
@staticmethod
def get_profile_directory():
return Initial.CHANGELING_PROFILES_PATH
@staticmethod
def __appdata_path():
return os.getenv('APPDATA')
@staticmethod
def get_wonderdraft_userfolder():
return os.path.join(Pathfinder.__appdata_path(), 'Wonderdraft')
@staticmethod
def get_wonderdraft_asset_folder():
return os.path.join(Pathfinder.get_wonderdraft_userfolder(), 'assets')
@staticmethod
def get_wonderdraft_themes_folder():
return os.path.join(Pathfinder.get_wonderdraft_userfolder(), 'themes')
@staticmethod
def get_wonderdraft_brushes_folder():
return os.path.join(Pathfinder.get_wonderdraft_userfolder(), 'brushes')
@staticmethod
def get_deactivated_assets_folder_path():
return os.path.join(
Pathfinder.get_wonderdraft_userfolder(),
YMLConfigReader.get_changeling_manager_directory_name(),
YMLConfigReader.get_deactivated_assets_folder_name()
)
@staticmethod
def get_deactivated_brushes_folder_path():
return os.path.join(
Pathfinder.get_wonderdraft_userfolder(),
YMLConfigReader.get_changeling_manager_directory_name(),
YMLConfigReader.get_deactivated_brushes_folder_name()
)
@staticmethod
def get_deactivated_themes_folder_path():
return os.path.join(
Pathfinder.get_wonderdraft_userfolder(),
YMLConfigReader.get_changeling_manager_directory_name(),
YMLConfigReader.get_deactivated_themes_folder_name()
)
@staticmethod
def get_logger_directory_path():
return os.path.join(
Pathfinder.get_wonderdraft_userfolder(),
YMLConfigReader.get_changeling_manager_directory_name(),
YMLConfigReader.get_logger_directory_name()
)
@staticmethod
def get_changeling_manager_directory_path():
return os.path.join(
Pathfinder.get_wonderdraft_userfolder(),
YMLConfigReader.get_changeling_manager_directory_name(),
)
| 30.973333 | 80 | 0.68489 | 219 | 2,323 | 6.844749 | 0.150685 | 0.112075 | 0.12008 | 0.096064 | 0.634423 | 0.598399 | 0.546364 | 0.546364 | 0.546364 | 0.43429 | 0 | 0 | 0.24365 | 2,323 | 74 | 81 | 31.391892 | 0.853159 | 0 | 0 | 0.464286 | 0 | 0 | 0.016481 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.196429 | true | 0 | 0.053571 | 0.196429 | 0.464286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
407e6e2de736ef89e683c2a12f7c93f7d76b6261 | 237 | py | Python | pyos/os/userCommand.py | dgisolfi/PyOS | 711205f570842a6573b650bde085d9925ee20aaa | [
"MIT"
] | 3 | 2021-01-09T12:44:14.000Z | 2021-01-10T02:23:52.000Z | pyos/os/userCommand.py | dgisolfi/PyOS | 711205f570842a6573b650bde085d9925ee20aaa | [
"MIT"
] | null | null | null | pyos/os/userCommand.py | dgisolfi/PyOS | 711205f570842a6573b650bde085d9925ee20aaa | [
"MIT"
] | 1 | 2021-01-09T12:44:16.000Z | 2021-01-09T12:44:16.000Z | #!/usr/bin/env python3
# Daniel Nicolas Gisolfi
class UserCommand:
""" The users input that will be translated into a shell command """
def __init__(self, cmd:str, args:list):
self.command = cmd
self.args = args
| 26.333333 | 72 | 0.662447 | 33 | 237 | 4.636364 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005525 | 0.236287 | 237 | 8 | 73 | 29.625 | 0.839779 | 0.447257 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
40992ccea87cf895d213378c551f2acaeadea394 | 3,500 | py | Python | tests/test_generic_importer.py | zhangchn/data-importer | faef624a19f97c76a157d8350bb05b819f1cb9f2 | [
"BSD-2-Clause-FreeBSD"
] | 62 | 2015-01-27T09:29:00.000Z | 2021-02-28T09:56:11.000Z | tests/test_generic_importer.py | zhangchn/data-importer | faef624a19f97c76a157d8350bb05b819f1cb9f2 | [
"BSD-2-Clause-FreeBSD"
] | 40 | 2015-01-16T11:57:17.000Z | 2022-03-13T14:13:00.000Z | tests/test_generic_importer.py | zhangchn/data-importer | faef624a19f97c76a157d8350bb05b819f1cb9f2 | [
"BSD-2-Clause-FreeBSD"
] | 34 | 2015-01-27T15:06:56.000Z | 2021-02-28T09:56:14.000Z | # encoding: utf-8
from __future__ import unicode_literals
import mock
import os
import django
from django.test import TestCase
from unittest import skipIf
from data_importer.importers.generic import GenericImporter
from data_importer.readers.xls_reader import XLSReader
from data_importer.readers.xlsx_reader import XLSXReader
from data_importer.readers.csv_reader import CSVReader
from data_importer.readers.xml_reader import XMLReader
from data_importer.core.exceptions import UnsuportedFile
from data_importer.models import FileHistory
from example.models import Invoice
LOCAL_DIR = os.path.dirname(__file__)
class TestGenericImporterSetup(TestCase):
    def setUp(self):
        self.xls_file = os.path.join(LOCAL_DIR, 'data/test.xls')
        self.xlsx_file = os.path.join(LOCAL_DIR, 'data/test.xlsx')
        self.csv_file = os.path.join(LOCAL_DIR, 'data/test.csv')
        self.xml_file = os.path.join(LOCAL_DIR, 'data/test.xml')
        self.unsuported_file = os.path.join(LOCAL_DIR, 'data/test_json_descriptor.json')

    def test_xls_reader_set(self):
        importer = GenericImporter(source=self.xls_file)
        self.assertEqual(importer.get_reader_class(), XLSReader)

    def test_xlsx_reader_set(self):
        importer = GenericImporter(source=self.xlsx_file)
        self.assertEqual(importer.get_reader_class(), XLSXReader)

    def test_csv_reader_set(self):
        importer = GenericImporter(source=self.csv_file)
        self.assertEqual(importer.get_reader_class(), CSVReader)

    def test_xml_reader_set(self):
        importer = GenericImporter(source=self.xml_file)
        self.assertEqual(importer.get_reader_class(), XMLReader)

    def test_getting_source_file_extension(self):
        importer = GenericImporter(source=self.csv_file)
        self.assertEqual(importer.get_source_file_extension(), 'csv')

    @skipIf(django.VERSION < (1, 4), "not supported in this library version")
    def test_unsuported_raise_error_message(self):
        with self.assertRaisesMessage(UnsuportedFile, 'Unsuported File'):
            GenericImporter(source=self.unsuported_file)

    def test_import_with_file_instance(self):
        file_instance = open(self.csv_file)
        importer = GenericImporter(source=file_instance)
        self.assertEqual(importer.get_source_file_extension(), 'csv')

    def test_import_with_model_instance(self):
        file_mock = mock.MagicMock(spec=FileHistory, name='FileHistoryMock')
        file_mock.file_upload = '/media/test.csv'
        importer = GenericImporter(source=file_mock)
        self.assertEqual(importer.get_source_file_extension(), 'csv')
class CustomerDataImporter(GenericImporter):
    class Meta:
        model = Invoice
        ignore_first_line = True
# class TestGenericImporterBehavior(TestCase):
#     def setUp(self):
#         self.xls_file = os.path.join(LOCAL_DIR, 'data/test_invalid_lines.xlsx')
#
#     def test_xlsx_is_not_valid(self):
#         instance = CustomerDataImporter(source=self.xls_file)
#         self.assertFalse(instance.is_valid())
#
#     def test_save_lines_without_errors(self):
#         instance = CustomerDataImporter(source=self.xls_file)
#         instance.save()
#         count_invoices = Invoice.objects.count()
#         self.assertEqual(count_invoices, 6, ('error', count_invoices))
#
#     def test_get_three_errors(self):
#         instance = CustomerDataImporter(source=self.xls_file)
#         instance.is_valid()
#         self.assertEqual(len(instance.errors), 3)
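The tests above pin each file extension to a specific reader class. A minimal sketch of that extension-based dispatch (hypothetical names, not data_importer's actual internals):

```python
# Hypothetical sketch of the dispatch the tests above exercise: pick a reader
# from the source file's extension, and reject unknown extensions with the
# same "Unsuported File" message the library asserts on.
READER_BY_EXTENSION = {
    "xls": "XLSReader",
    "xlsx": "XLSXReader",
    "csv": "CSVReader",
    "xml": "XMLReader",
}

def get_source_file_extension(path):
    # 'data/test.csv' -> 'csv', matching test_getting_source_file_extension
    return path.rsplit(".", 1)[-1].lower()

def get_reader_name(path):
    ext = get_source_file_extension(path)
    if ext not in READER_BY_EXTENSION:
        raise ValueError("Unsuported File")  # spelling matches the library's exception text
    return READER_BY_EXTENSION[ext]

print(get_reader_name("data/test.csv"))  # CSVReader
```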
# File: release/stubs.min/Autodesk/Revit/DB/__init___parts/LayoutRuleFixedNumber.py
# Repo: YKato521/ironpython-stubs (MIT)
class LayoutRuleFixedNumber(LayoutRule, IDisposable):
"""
This class indicate the layout rule of a Beam System is Fixed-Number.
LayoutRuleFixedNumber(numberOfLines: int)
"""
def Dispose(self):
""" Dispose(self: LayoutRuleFixedNumber,A_0: bool) """
pass
def ReleaseManagedResources(self, *args):
""" ReleaseManagedResources(self: APIObject) """
pass
def ReleaseUnmanagedResources(self, *args):
""" ReleaseUnmanagedResources(self: APIObject) """
pass
def __enter__(self, *args):
""" __enter__(self: IDisposable) -> object """
pass
def __exit__(self, *args):
""" __exit__(self: IDisposable,exc_type: object,exc_value: object,exc_back: object) """
pass
def __init__(self, *args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod
def __new__(self, numberOfLines):
""" __new__(cls: type,numberOfLines: int) """
pass
NumberOfLines = property(
lambda self: object(), lambda self, v: None, lambda self: None
)
"""Get or set the number of the beams in a beam system.
Get: NumberOfLines(self: LayoutRuleFixedNumber) -> int
Set: NumberOfLines(self: LayoutRuleFixedNumber)=value
"""
# File: api/views/faces.py
# Repo: heholek/librephotos (MIT)
import uuid
import six
from django.core.cache import cache
from django.db.models import Q
from rest_framework import viewsets
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework_extensions.cache.decorators import cache_response
from api.directory_watcher import scan_faces
from api.drf_optimize import OptimizeRelatedModelViewSetMetaclass
from api.face_classify import train_faces
from api.models import Face
from api.models.person import get_or_create_person
from api.serializers.serializers import FaceListSerializer, FaceSerializer
from api.util import logger
from api.views.caching import (
    CACHE_TTL,
    CustomListKeyConstructor,
    CustomObjectKeyConstructor,
)
from api.views.pagination import HugeResultsSetPagination, StandardResultsSetPagination
class ScanFacesView(APIView):
    def get(self, request, format=None):
        try:
            job_id = uuid.uuid4()
            scan_faces.delay(request.user, job_id)
            return Response({"status": True, "job_id": job_id})
        except BaseException:
            logger.exception("An error occurred")
            return Response({"status": False})
class TrainFaceView(APIView):
    def get(self, request, format=None):
        try:
            job_id = uuid.uuid4()
            train_faces.delay(request.user, job_id)
            return Response({"status": True, "job_id": job_id})
        except BaseException:
            logger.exception("An error occurred")
            return Response({"status": False})
@six.add_metaclass(OptimizeRelatedModelViewSetMetaclass)
class FaceListViewSet(viewsets.ModelViewSet):
    serializer_class = FaceListSerializer
    pagination_class = StandardResultsSetPagination

    @cache_response(CACHE_TTL, key_func=CustomObjectKeyConstructor())
    def retrieve(self, *args, **kwargs):
        return super(FaceListViewSet, self).retrieve(*args, **kwargs)

    @cache_response(CACHE_TTL, key_func=CustomListKeyConstructor())
    def list(self, *args, **kwargs):
        return super(FaceListViewSet, self).list(*args, **kwargs)
@six.add_metaclass(OptimizeRelatedModelViewSetMetaclass)
class FaceInferredListViewSet(viewsets.ModelViewSet):
    serializer_class = FaceListSerializer
    pagination_class = HugeResultsSetPagination

    def get_queryset(self):
        # Todo: optimize query by only prefetching relevant models & fields
        queryset = (
            Face.objects.filter(
                Q(photo__hidden=False)
                & Q(photo__owner=self.request.user)
                & Q(person_label_is_inferred=True)
            )
            .select_related("person")
            .order_by("id")
        )
        return queryset

    @cache_response(CACHE_TTL, key_func=CustomObjectKeyConstructor())
    def retrieve(self, *args, **kwargs):
        return super(FaceInferredListViewSet, self).retrieve(*args, **kwargs)

    @cache_response(CACHE_TTL, key_func=CustomListKeyConstructor())
    def list(self, *args, **kwargs):
        return super(FaceInferredListViewSet, self).list(*args, **kwargs)
@six.add_metaclass(OptimizeRelatedModelViewSetMetaclass)
class FaceLabeledListViewSet(viewsets.ModelViewSet):
    serializer_class = FaceListSerializer
    pagination_class = HugeResultsSetPagination

    def get_queryset(self):
        # Todo: optimize query by only prefetching relevant models & fields
        queryset = (
            Face.objects.filter(
                Q(photo__hidden=False) & Q(photo__owner=self.request.user),
                Q(person_label_is_inferred=False) | Q(person__name="unknown"),
            )
            .select_related("person")
            .order_by("id")
        )
        return queryset

    @cache_response(CACHE_TTL, key_func=CustomObjectKeyConstructor())
    def retrieve(self, *args, **kwargs):
        return super(FaceLabeledListViewSet, self).retrieve(*args, **kwargs)

    @cache_response(CACHE_TTL, key_func=CustomListKeyConstructor())
    def list(self, *args, **kwargs):
        return super(FaceLabeledListViewSet, self).list(*args, **kwargs)
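The `&` and `|` between `Q` objects in the querysets above work because Django's `Q` overloads `__and__`/`__or__` to build a tree of filter conditions. A toy illustration of that operator-overloading pattern (not Django's actual `Q` implementation):

```python
# Toy stand-in for Django's Q: each operator combines two condition nodes
# into a tagged tree instead of evaluating anything immediately.
class ToyQ:
    def __init__(self, **kwargs):
        self.node = ("leaf", kwargs)

    def __and__(self, other):
        combined = ToyQ()
        combined.node = ("AND", self.node, other.node)
        return combined

    def __or__(self, other):
        combined = ToyQ()
        combined.node = ("OR", self.node, other.node)
        return combined

# Mirrors the shape of the FaceLabeledListViewSet filter above
q = ToyQ(photo__hidden=False) & (
    ToyQ(person_label_is_inferred=False) | ToyQ(person__name="unknown")
)
print(q.node[0])  # AND
```

Django walks the resulting tree to emit the corresponding SQL `WHERE` clause; the toy version only builds the tree.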
@six.add_metaclass(OptimizeRelatedModelViewSetMetaclass)
class FaceViewSet(viewsets.ModelViewSet):
    queryset = (
        Face.objects.filter(Q(photo__hidden=False))
        .prefetch_related("person")
        .order_by("id")
    )
    serializer_class = FaceSerializer
    pagination_class = StandardResultsSetPagination

    @cache_response(CACHE_TTL, key_func=CustomObjectKeyConstructor())
    def retrieve(self, *args, **kwargs):
        return super(FaceViewSet, self).retrieve(*args, **kwargs)

    @cache_response(CACHE_TTL, key_func=CustomListKeyConstructor())
    def list(self, *args, **kwargs):
        return super(FaceViewSet, self).list(*args, **kwargs)
@six.add_metaclass(OptimizeRelatedModelViewSetMetaclass)
class FaceInferredViewSet(viewsets.ModelViewSet):
    serializer_class = FaceSerializer
    pagination_class = HugeResultsSetPagination

    def get_queryset(self):
        return Face.objects.filter(
            Q(photo__hidden=False)
            & Q(photo__owner=self.request.user)
            & Q(person_label_is_inferred=True)
        ).order_by("id")

    @cache_response(CACHE_TTL, key_func=CustomObjectKeyConstructor())
    def retrieve(self, *args, **kwargs):
        return super(FaceInferredViewSet, self).retrieve(*args, **kwargs)

    @cache_response(CACHE_TTL, key_func=CustomListKeyConstructor())
    def list(self, *args, **kwargs):
        return super(FaceInferredViewSet, self).list(*args, **kwargs)
@six.add_metaclass(OptimizeRelatedModelViewSetMetaclass)
class FaceLabeledViewSet(viewsets.ModelViewSet):
    serializer_class = FaceSerializer
    pagination_class = HugeResultsSetPagination

    def get_queryset(self):
        return Face.objects.filter(
            Q(photo__hidden=False)
            & Q(photo__owner=self.request.user)
            & Q(person_label_is_inferred=False)
        ).order_by("id")

    @cache_response(CACHE_TTL, key_func=CustomObjectKeyConstructor())
    def retrieve(self, *args, **kwargs):
        return super(FaceLabeledViewSet, self).retrieve(*args, **kwargs)

    @cache_response(CACHE_TTL, key_func=CustomListKeyConstructor())
    def list(self, *args, **kwargs):
        return super(FaceLabeledViewSet, self).list(*args, **kwargs)
class SetFacePersonLabel(APIView):
    def post(self, request, format=None):
        data = dict(request.data)
        person = get_or_create_person(name=data["person_name"])
        faces = Face.objects.in_bulk(data["face_ids"])
        updated = []
        not_updated = []
        for face in faces.values():
            if face.photo.owner == request.user:
                face.person = person
                face.person_label_is_inferred = False
                face.person_label_probability = 1.0
                face.save()
                updated.append(FaceListSerializer(face).data)
            else:
                not_updated.append(FaceListSerializer(face).data)
        cache.clear()
        return Response(
            {
                "status": True,
                "results": updated,
                "updated": updated,
                "not_updated": not_updated,
            }
        )
class DeleteFaces(APIView):
    def post(self, request, format=None):
        data = dict(request.data)
        faces = Face.objects.in_bulk(data["face_ids"])
        deleted = []
        not_deleted = []
        for face in faces.values():
            if face.photo.owner == request.user:
                deleted.append(face.id)
                face.delete()
            else:
                not_deleted.append(face.id)
        cache.clear()
        return Response(
            {
                "status": True,
                "results": deleted,
                "not_deleted": not_deleted,
                "deleted": deleted,
            }
        )
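Every ViewSet in this file is wrapped with `six.add_metaclass(...)`, which rebuilds the decorated class under the given metaclass. A simplified standalone equivalent of what that decorator does (six's real implementation also handles `__slots__`; this sketch does not, and `Optimizing` is a stand-in, not librephotos' `OptimizeRelatedModelViewSetMetaclass`):

```python
# Simplified sketch of six.add_metaclass: re-create the decorated class with
# the requested metaclass, dropping the auto-generated __dict__/__weakref__
# descriptors that cannot be passed to a fresh class namespace.
def add_metaclass(metaclass):
    def wrapper(cls):
        namespace = {
            k: v for k, v in cls.__dict__.items()
            if k not in ("__dict__", "__weakref__")
        }
        return metaclass(cls.__name__, cls.__bases__, namespace)
    return wrapper

class Optimizing(type):
    # toy metaclass: tag every class it creates
    def __new__(mcls, name, bases, ns):
        ns["optimized"] = True
        return super().__new__(mcls, name, bases, ns)

@add_metaclass(Optimizing)
class DemoViewSet:
    pass

print(DemoViewSet.optimized, type(DemoViewSet).__name__)  # True Optimizing
```

This decorator form exists because the Python 2 (`__metaclass__`) and Python 3 (`class C(metaclass=...)`) spellings are incompatible; the decorator works under both.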
# File: jack_sparrow.py
# Repo: malai001/codebank (bzip2-1.0.6)
cu, cd = map(int, input().split())            # gain per move, slip per move
heights = map(int, input().split(' '))
eff = cu - cd                                 # net gain per move
total = 0
for i in heights:
    t = i // eff
    if (i - cu) % eff == 0:
        t = (i - cu) // eff
    if (i % eff) > 0:
        t += 1
    total += t
print(total)
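The loop above counts moves for a climber who gains `cu` per move but slips back `cd`. For reference, a common closed form for this climb problem is the one below; this is my formulation of the standard result, and its edge-case handling is not guaranteed to match the loop's arithmetic exactly:

```python
# Closed form for the climb problem: the first jump covers `up`; if that is
# not enough, every further jump nets only (up - down).
from math import ceil

def jumps(height, up, down):
    if height <= up:
        return 1
    return 1 + ceil((height - up) / (up - down))

print(jumps(10, 5, 1))  # 3
```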
# File: plugins/counter.py
# Repo: ebenezzerr/search-bot (MIT)
import time

crontable = []
outputs = []
crontable.append([120, "say_time"])


def say_time():
    # NOTE: you must add a real channel ID for this to work
    outputs.append(["D0FEYBP9V", time.time()])
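In this plugin convention, each `crontable` entry is `[interval_seconds, function_name]` and the bot host calls the named function whenever the interval elapses. A hedged sketch of how such a host loop could drive the table (my illustration of the pattern, not the actual search-bot host code):

```python
# Hypothetical host-side driver for a crontable: call each named function
# once its interval has elapsed since its last run.
import time

def run_due(crontable, namespace, last_run, now=None):
    now = time.time() if now is None else now
    for interval, func_name in crontable:
        if now - last_run.get(func_name, 0) >= interval:
            namespace[func_name]()
            last_run[func_name] = now

calls = []
table = [[120, "tick"]]
ns = {"tick": lambda: calls.append("tick")}
last = {}
run_due(table, ns, last, now=1000.0)  # first run: due immediately
run_due(table, ns, last, now=1060.0)  # only 60s later: skipped
print(calls)  # ['tick']
```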
# File: nda.py
# Repo: Bangsat-XD/JANDAKU (Apache-2.0)
''' ___ ____ ___ ____ ____ ____ ____ ____ ____ ___ ____
|__] |___ |__] |__| [__ |__/ |___ | | | | \ |___
|__] |___ |__] | | ___] | \ |___ |___ |__| |__/ |___ '''
#!/bin/usr/python
from multiprocessing.pool import ThreadPool
from getpass import getpass
import os, urllib.request, sys, json, time, hashlib, random, shutil, re, threading
from bs4 import BeautifulSoup
try:
    import mechanize
    import requests
except ImportError:
    os.system('pip install mechanize')
    os.system('pip install requests')

if sys.version[0] == '2':
    print('[INFO] This script requires Python 3!')
    sys.exit(1)
sleep = time.sleep
h = '\x1b[32m'
r = '\x1b[1;91m'
k = '\x1b[1;97m'
n = '\033[94m'
W = "\033[0m"
G = '\033[32;1m'
R = '\033[31;1m'
time.sleep(1)
back = 0
lol = []
idd = []
threads = []
berhasil = []
cekpoint = []
gagal = []
idb = []
listgrup = []
id = []
ibb = []
s = (' Thanks to : Indo'+n+'⟬'+R+'X'+n+'⟭'+k+'ploit'+r+' and '+k+'Python Indonesia')
t = (' Token')
m = (' Multibruteforce Facebook')
user_agent_list = [
#Chrome
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36',
'Mozilla/5.0 (Windows NT 5.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.2; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36',
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.157 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36',
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36',
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36',
#Firefox
'Mozilla/4.0 (compatible; MSIE 9.0; Windows NT 6.1)',
'Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko',
'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)',
'Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko',
'Mozilla/5.0 (Windows NT 6.2; WOW64; Trident/7.0; rv:11.0) like Gecko',
'Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko',
'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.0; Trident/5.0)',
'Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko',
'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)',
'Mozilla/5.0 (Windows NT 6.1; Win64; x64; Trident/7.0; rv:11.0) like Gecko',
'Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0)',
'Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Trident/6.0)',
'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729)',
]
proxies_list = [
'http://10.10.1.10:3128',
'http://77.232.139.200:8080',
'http://78.111.125.146:8080',
'http://77.239.133.146:3128',
'http://74.116.59.8:53281',
'http://67.53.121.67:8080',
'http://67.78.143.182:8080',
'http://62.64.111.42:53281',
'http://62.210.251.74:3128',
'http://62.210.105.103:3128',
'http://5.189.133.231:80',
'http://46.101.78.9:8080',
'http://45.55.86.49:8080',
'http://40.87.66.157:80',
'http://45.55.27.246:8080',
'http://45.55.27.246:80',
'http://41.164.32.58:8080',
'http://45.125.119.62:8080',
'http://37.187.116.199:80',
'http://43.250.80.226:80',
'http://43.241.130.242:8080',
'http://38.64.129.242:8080',
'http://41.203.183.50:8080',
'http://36.85.90.8:8080',
'http://36.75.128.3:80',
'http://36.81.255.73:8080',
'http://36.72.127.182:8080',
'http://36.67.230.209:8080',
'http://35.198.198.12:8080',
'http://35.196.159.241:8080',
'http://35.196.159.241:80',
'http://27.122.224.183:80',
'http://223.206.114.195:8080',
'http://221.120.214.174:8080',
'http://223.205.121.223:8080',
'http://222.124.30.138:80',
'http://222.165.205.204:8080',
'http://217.61.15.26:80',
'http://217.29.28.183:8080',
'http://217.121.243.43:8080',
'http://213.47.184.186:8080',
'http://207.148.17.223:8080',
'http://210.213.226.3:8080',
'http://202.70.80.233:8080',
]
sleep(0.2)
os.system('clear')
def tik(s):
    for c in s + '\n':
        sys.stdout.write(c)
        sys.stdout.flush()
        time.sleep(random.random() * 0.1)
    print(k)
def logo():
    os.system('clear')
    print(R, ' ╔═╗ ╔═╗╔╗')
    print(R, ' ║╔╝ ║╔╝║║')
    print(R, ' ╔╝╚╗╔══╗╔═╗╔══╗╔══╗ ╔╝╚╗║╚═╗', G, 'Author : As'+R+'_'+G+'Min')
    print(R, ' ╚╗╔╝║╔╗║║╔╝║╔═╝║║═╣ ╚╗╔╝║╔╗║', G, 'Github : asmin19', k)
    print(' ║║ ║╚╝║║║ ║╚═╗║║═╣ ║║ ║╚╝║', G, 'Version : 3.0', k)
    print(' ╚╝ ╚══╝╚╝ ╚══╝╚══╝ ╚╝ ╚══╝')
    print(r+'###################################################'+k)

a = ('===================================================')
def about():
    logo()
    print('')
    tik(s)
    print('')
    print(R+'...................[INFORMATION]...................'+k)
    print('')
    print('Creator Asmin')
    print('About this tools All about hacking facebook accounts')
    print('Version 3.0')
    print('Special thanks to '+G+' Khoirul Amsori'+k+' and'+G+' Ez Nhana Hna'+k)
    print('Code name As'+r+'_'+k+'Min')
    print('Team Buton '+R+'Sec'+k+'.')
    print('E-mail asmin987asmin@gmail.com')
    print('Github asmin19')
    print('Telegram @asmin19')
    print('WhatsApp +62 852-6834-5036')
    print('Date 20:15 13-07-2019')
    print('Region Baubau,Sulawesi Tenggara, Indonesia')
    print('Support Password xxx123, xxx12345, xxx12, xxx, birthday, sayang, minions, number, and many more')
    print('New Features You can crack with'+R+' Super-Multibruteforce'+k+' from friendlist your friends')
    print(n+'Coming Soon Checker IG')
    print('')
    tik(G+'* contact author to '+R+'BUY'+G+' the script ')
    print(k)
    input('[+] Press [Enter] to return ')
    menu()
def menu():
    print(k)
    logo()
    print(s)
    print(a)
    print(''' [ 01 ] Create Wordlist
 [ 02 ] Bruteforce
 [ 03 ] Multibruteforce Facebook
 [ 04 ] Friends Information
 [ 05 ] Token
 [ 06 ] About
 [ 00 ] Exit
''')
    try:
        asm = input(n+'[#] Asmin'+k+'/' + r+'~' + k + '> ')
        if asm in ['']:
            tik(R+'[!] Please enter your choice ')
            input('[+] Press [Enter] to return ')
            menu()
        elif asm in ['1', '01']:
            about()
        elif asm in ['2', '02']:
            about()
        elif asm in ['3', '03']:
            about()
        elif asm in ['4', '04']:
            about()
        elif asm in ['5', '05']:
            about()
        elif asm in ['6', '06']:
            about()
        elif asm in ['0', '00']:
            exit()
        else:
            tik(R+'[!] Wrong input')
            input(n+'[+] Press [Enter] to return ')
            os.system('clear')
            logo()
            print(s)
            print(a)
            menu()
    except EOFError:
        exit()
    except KeyboardInterrupt:
        exit()

menu()
# File: hqca/acse/__init__.py
# Repo: damazz/HQCA (Apache-2.0)
from hqca.acse._store_acse import *
from hqca.acse.acse import *
from hqca.tomography import *
from hqca.core._quantum_storage import *
from hqca.transforms import *
from hqca.instructions import *
from hqca.processes import *
__all__ = [
    'StorageACSE',
    'RunACSE',
    'QuantumStorage',
    'StandardTomography',
    'QubitTomography',
    'ReducedTomography',
    'ReducedQubitTomography',
    'JordanWigner',
    'Parity',
    'BravyiKitaev',
    'PauliSet',
    'StandardProcess',
]
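The `__all__` list above controls what the star imports re-export: `from hqca.acse import *` binds only the listed names. A self-contained demonstration of that semantics, built with a synthetic module since hqca itself may not be installed:

```python
# What __all__ controls: `from module import *` only binds the listed names.
import sys
import types

demo = types.ModuleType("hqca_demo")
demo.StorageACSE = "public"
demo.helper = "private"
demo.__all__ = ["StorageACSE"]
sys.modules["hqca_demo"] = demo  # make the synthetic module importable

ns = {}
exec("from hqca_demo import *", ns)
print("StorageACSE" in ns, "helper" in ns)  # True False
```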
# File: 07_classes/05_inheritance.py
# Repo: ghimiresdp/python-level1 (MIT)
# © https://sudipghimire.com.np
# %% [markdown]
"""
# Inheritance
- This concept is exactly similar to biological inheritance where child inherits the feature of parent.
- in inheritance, there exists a parent class and child classes which inherits parent's behaviors.
- The base class will be the parent class and the class that is derived from the base class will
be treated as a child class.
Basic Structure
class Parent:
<attributes>
<methods>
class Child(Parent):
<attributes>
<methods>
Rectangle
attributes:
- width
- height
methods:
- perimeter
- area
variants:
square
Animal
attributes:
- legs: 4
- tail
variants:
- Pet
- Wild
- Herbivorous
- Carnivorous
"""
# %% Example Rectangle
class Rectangle:
    width: float = 0
    height: float = 0

    def __init__(self, width, height):
        self.width = width
        self.height = height

    def perimeter(self):
        return 2 * (self.width + self.height)

    def area(self):
        return self.width * self.height

    def diagonal_length(self):
        return (self.width ** 2 + self.height ** 2) ** (1 / 2)


# %% child
class Square(Rectangle):
    def __init__(self, width):
        super().__init__(width=width, height=width)

    def diagonal_length(self):
        return self.width * (2 ** .5)
# %%
room_1 = Rectangle(5, 10)
print(f'Area of room 1: {room_1.area()}')
print(f'Perimeter of room 1: {room_1.perimeter()}')
print(f'Diagonal length of room 1: {room_1.diagonal_length()}')
# %%
room_2 = Square(5)
print(f'Area of room 2: {room_2.area()}')
print(f'Perimeter of room 2: {room_2.perimeter()}')
print(f'Diagonal length of room 2: {room_2.diagonal_length()}')
# %% builtin functions
# isinstance
print(isinstance(room_2, Square))
print(isinstance(room_1, Rectangle))
print(isinstance(room_1, Square))
print(isinstance(room_2, Rectangle))
# %% issubclass
print(issubclass(Square, Rectangle))
print(issubclass(Rectangle, Square))
print(issubclass(Rectangle, object))
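The `isinstance`/`issubclass` results above follow from the method resolution order: attribute lookup on `Square` walks `Square -> Rectangle -> object`, which is why `Square.diagonal_length` overrides the parent's version. A short self-contained check (classes redeclared here so the snippet runs on its own):

```python
# The MRO makes the inheritance chain explicit: Square -> Rectangle -> object.
class Rectangle:
    pass

class Square(Rectangle):
    pass

print([c.__name__ for c in Square.__mro__])  # ['Square', 'Rectangle', 'object']
```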
40dbf60cb13a9e3b4a5ab72770011eb1e12447c5 | 540 | py | Python | myvenv/lib/python3.5/site-packages/allauth/socialaccount/providers/asana/tests.py | tuvapp/tuvappcom | 5ca2be19f4b0c86a1d4a9553711a4da9d3f32841 | [
"MIT"
] | 1 | 2020-07-26T17:21:16.000Z | 2020-07-26T17:21:16.000Z | myvenv/lib/python3.5/site-packages/allauth/socialaccount/providers/asana/tests.py | tuvapp/tuvappcom | 5ca2be19f4b0c86a1d4a9553711a4da9d3f32841 | [
"MIT"
] | 6 | 2020-06-05T18:44:19.000Z | 2022-01-13T00:48:56.000Z | myvenv/lib/python3.5/site-packages/allauth/socialaccount/providers/asana/tests.py | tuvapp/tuvappcom | 5ca2be19f4b0c86a1d4a9553711a4da9d3f32841 | [
"MIT"
] | 1 | 2022-02-01T17:19:28.000Z | 2022-02-01T17:19:28.000Z | from allauth.socialaccount.tests import create_oauth2_tests
from allauth.tests import MockedResponse
from allauth.socialaccount.providers import registry
from .provider import AsanaProvider
class AsanaTests(create_oauth2_tests(
registry.by_id(AsanaProvider.id))):
def get_mocked_response(self):
return MockedResponse(200, """
{"data": {"photo": null, "workspaces": [{"id": 31337, "name": "test.com"},
{"id": 3133777, "name": "Personal Projects"}], "email": "test@test.com",
"name": "Test Name", "id": 43748387}}""")
| 33.75 | 74 | 0.716667 | 64 | 540 | 5.9375 | 0.5625 | 0.086842 | 0.126316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052966 | 0.125926 | 540 | 15 | 75 | 36 | 0.752119 | 0 | 0 | 0 | 0 | 0.181818 | 0.344444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0.090909 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
40e44999fb7f51f18370225d6ced72af64168aea | 2,109 | py | Python | cards/upgrade_rounds.py | MrCoft/EngiMod | 65c90bd9231ac388d8af7849a1835914f1eefc78 | [
"MIT"
] | null | null | null | cards/upgrade_rounds.py | MrCoft/EngiMod | 65c90bd9231ac388d8af7849a1835914f1eefc78 | [
"MIT"
] | null | null | null | cards/upgrade_rounds.py | MrCoft/EngiMod | 65c90bd9231ac388d8af7849a1835914f1eefc78 | [
"MIT"
] | null | null | null | from engi_mod import *
Card(
name = "Upgrade Rounds",
type = "skill",
target = "self",
rarity = "rare",
cost = 1,
flags = dict(
exhaust = "true",
),
desc = 'Upgrade ALL of your Attack cards containing "Round" for the rest of combat. NL Exhaust.',
upgrade_desc = 'Upgrade ALL of your Attack cards containing "Round" for the rest of combat.',
code = """
AbstractDungeon.actionManager.addToBottom(new UpgradeRoundsAction());
""",
upgrade_code = """
upgradeName();
exhaust = false;
rawDescription = UPGRADE_DESCRIPTION;
initializeDescription();
""",
)
Action(
id = "UpgradeRoundsAction",
flags = dict(
duration = "Settings.ACTION_DUR_MED",
actionType = "ActionType.WAIT",
),
code_FULL = r"""
ArrayList<AbstractCard> upgraded;
@Override
public void update() {
if (duration == Settings.ACTION_DUR_MED) {
upgraded = new ArrayList<AbstractCard>();
final AbstractPlayer p = AbstractDungeon.player;
upgradeAllCardsInGroup(p.hand);
upgradeAllCardsInGroup(p.drawPile);
upgradeAllCardsInGroup(p.discardPile);
upgradeAllCardsInGroup(p.exhaustPile);
isDone = true;
if (!upgraded.isEmpty()) {
AbstractDungeon.effectsQueue.add(new UpgradeShineEffect(Settings.WIDTH / 2.0f, Settings.HEIGHT / 2.0f));
SpireUtils.showCards(upgraded);
}
}
}
private void upgradeAllCardsInGroup(final CardGroup cardGroup) {
for (final AbstractCard card : cardGroup.group) {
if (RoundUtils.isRound(card) && card.canUpgrade()) {
if (cardGroup.type == CardGroup.CardGroupType.HAND) {
card.superFlash();
}
card.upgrade();
card.applyPowers();
upgraded.add(card);
}
}
}
""",
)
| 31.954545 | 124 | 0.541015 | 170 | 2,109 | 6.658824 | 0.535294 | 0.081272 | 0.024735 | 0.028269 | 0.162544 | 0.113074 | 0.113074 | 0.113074 | 0.113074 | 0.113074 | 0 | 0.003674 | 0.35467 | 2,109 | 65 | 125 | 32.446154 | 0.828068 | 0 | 0 | 0.116667 | 0 | 0.016667 | 0.828355 | 0.263632 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.016667 | 0 | 0.016667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
40fb57d2e7a001152acf4f7d19add4d0ba65f5d1 | 560 | py | Python | pyquilted/builder/contacts.py | cocoroutine/pyquilted | dd8644043deec17608e00f46e3ac4562b8879603 | [
"MIT"
] | 1 | 2019-02-21T20:10:37.000Z | 2019-02-21T20:10:37.000Z | pyquilted/builder/contacts.py | cocoroutine/pyquilted | dd8644043deec17608e00f46e3ac4562b8879603 | [
"MIT"
] | null | null | null | pyquilted/builder/contacts.py | cocoroutine/pyquilted | dd8644043deec17608e00f46e3ac4562b8879603 | [
"MIT"
] | null | null | null | from pyquilted.quilted.contact_factory import ContactFactory
from pyquilted.quilted.contacts_list import ContactsList
class ContactsBuilder:
"""Contacts data mapper object"""
def __init__(self, contacts_odict):
self.contacts = ContactsList()
self.odict = contacts_odict
def deserialize(self):
for key, val in self.odict.items():
self._add_contact(ContactFactory.create(key, val))
return self.contacts
def _add_contact(self, contact):
if contact:
self.contacts.append(contact)
| 29.473684 | 62 | 0.694643 | 63 | 560 | 5.984127 | 0.47619 | 0.127321 | 0.106101 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.221429 | 560 | 18 | 63 | 31.111111 | 0.864679 | 0.048214 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.153846 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
40ff126aea00f08815dc097acaa5d9c997f935ac | 171 | py | Python | rl/constants.py | dlebech/reinforcement-learning | 98f49ebf67b6bb4fc7a52527e0ebfaeb54486b3d | [
"MIT"
] | null | null | null | rl/constants.py | dlebech/reinforcement-learning | 98f49ebf67b6bb4fc7a52527e0ebfaeb54486b3d | [
"MIT"
] | 4 | 2019-03-17T10:59:04.000Z | 2019-03-24T11:08:06.000Z | rl/constants.py | dlebech/reinforcement-learning | 98f49ebf67b6bb4fc7a52527e0ebfaeb54486b3d | [
"MIT"
] | null | null | null | DEFAULT_ENV_NAME = "CartPole-v1"
DEFAULT_ALGORITHM = "random"
DEFAULT_MAX_EPISODES = 1000
DEFAULT_LEARNING_RATE = 0.001
DEFAULT_GAMMA = 0.95
DEFAULT_UPDATE_FREQUENCY = 20
40ffc076361674499ba934a04524b8c77032b66c | 282 | py | Python | xe_currencies/settings.py | aisayko/django-xe-currencies | MIT

"""
The following settings should be defined in your global settings
XE_DATAFEED_URL should point to the XE datafeed endpoint, parameterized with your registered XE username.
"""
try:
    from django.conf import settings
except ImportError:
    settings = {}

XE_DATAFEED_URL = 'http://www.xe.com/dfs/datafeed2.cgi?xeuser'
dc0e7c7ba3cef17e1053631cf14e61be56327e8f | 226 | py | Python | blog/forms.py | jarnalyrkar/DjangoUnleashed | BSD-2-Clause

from django import forms
from .models import Post


class PostForm(forms.ModelForm):
    class Meta:
        model = Post
        fields = '__all__'

    def clean_slug(self):
        return self.cleaned_data['slug'].lower()
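Django invokes `clean_<fieldname>` hooks during form validation, which is why defining `clean_slug` is enough to lowercase that one field. A toy dispatcher (a sketch of the idea, not Django's actual implementation) makes the mechanism concrete:

```python
class MiniForm:
    """Toy illustration of Django's clean_<field> hook dispatch."""

    def __init__(self, data):
        self.cleaned_data = dict(data)

    def full_clean(self):
        # For each field, look up a clean_<field> method and let it
        # replace the field's cleaned value.
        for field in list(self.cleaned_data):
            hook = getattr(self, 'clean_%s' % field, None)
            if hook:
                self.cleaned_data[field] = hook()
        return self.cleaned_data


class PostMiniForm(MiniForm):
    def clean_slug(self):
        return self.cleaned_data['slug'].lower()


cleaned = PostMiniForm({'slug': 'My-First-Post'}).full_clean()
```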
dc0eb56dab49e282d65cf394851b192e9afad8f1 | 134 | py | Python | study/demo.py | t-ntran/vscode | MIT
def initials(name):
    parts = name.split(' ')
    letters = ''
    for part in parts:
        letters += part[0]
    return letters
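Note that splitting on a single space leaves empty strings behind for double or trailing spaces, and `part[0]` raises IndexError on an empty part. A tolerant variant (redefined alongside the original so this sketch stands alone) uses bare `split()`, which discards empty parts:

```python
def initials(name):
    """Same as above: first letter of each space-separated part."""
    parts = name.split(' ')
    letters = ''
    for part in parts:
        letters += part[0]
    return letters


def initials_safe(name):
    """split() with no argument drops empty strings between whitespace runs."""
    return ''.join(part[0] for part in name.split())
```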
904b20d74e0440aceaed8d8fdaef24bc6d3c3b84 | 243 | py | Python | conf.py | Zephor5/seccode_recognize | Apache-2.0

# coding=utf-8
import os

__author__ = 'zephor'

ROOT = os.path.abspath(os.path.dirname(__file__))
DATA_RAW = os.path.join(ROOT, 'data_raw/')
DATA_PREPROCESSED = os.path.join(ROOT, 'data_prep/')
DATA_NAMED = os.path.join(ROOT, 'data_named/')
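The trailing slashes ride along through `os.path.join`, producing directory-style paths anchored at the project root. A quick check with an illustrative root, assuming POSIX path separators:

```python
import os.path

root = '/project'  # stand-in for the ROOT computed above
data_raw = os.path.join(root, 'data_raw/')
data_prep = os.path.join(root, 'data_prep/')
```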
905346b15aab2cc9b3ce36fa3739561a721434f3 | 37,190 | py | Python | pytransact/test/test_link.py | jacob22/pytransact | ECL-2.0, Apache-2.0

# Copyright 2019 Open End AB
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import py
import os

from bson.objectid import ObjectId

from pytransact import commit, contextbroker, exceptions, mongo
from pytransact import link as Link
from pytransact import query as Query
from pytransact.object.attribute import BlobVal
from pytransact.testsupport import ContextTests, Time

import blm
from blm import fundamental

blmpath = os.path.join(os.path.dirname(__file__), 'blm')


def setup_module(mod):
    import blm.fundamental
    blm.addBlmPath(blmpath)


def teardown_module(mod):
    blm.removeBlmPath(blmpath)
    blm.clear()
class LinkTests(ContextTests):
    def setup_method(self, method):
        super(LinkTests, self).setup_method(method)
        self.time = Time()
        self.clientId = mongo.insert(self.database.clients, {'updates': []})
        self.linkId = 27
        self.user = blm.fundamental.AccessHolder()

    def teardown_method(self, method):
        super(LinkTests, self).teardown_method(method)
        self.time.restore()

    def _getLinkData(self):
        self.sync()
        return mongo.find_one(self.database.links,
                              {'client': self.clientId,
                               'link': self.linkId})

    def _getResult(self):
        result = mongo.find_and_modify(self.database.clients,
                                       {'_id': self.clientId},
                                       {'$set': {'updates': []}})
        if not result['updates']:
            return None
        return result['updates'][0]['args']

    def _getState(self):
        self.sync()
        link = mongo.find_one(self.database.links,
                              {'client': self.clientId,
                               'link': self.linkId})
        if link:
            return link['state']

    def _getParams(self):
        self.sync()
        link = mongo.find_one(self.database.links,
                              {'client': self.clientId,
                               'link': self.linkId})
        if link:
            return link['params']

    def uptodate(self):
        self.sync()
        link = mongo.find_one(self.database.links,
                              {'client': self.clientId,
                               'link': self.linkId})
        return link and not link.get('outdatedBy')

    def outdate(self, toids=[]):
        mongo.update_one(self.database.links,
                         {'client': self.clientId, 'link': self.linkId},
                         {'$set': {'outdatedBy': ObjectId()},
                          '$addToSet': {'outdatedToids': {'$each': toids}}})
        self.sync()
class LinkTransient(Link.Link):
    def _run(self, params, state=None):
        assert state is None
        self.update({'foo': 'bar'})


class LinkPersistent(Link.Link):
    state = {'gurka': 'tomat'}
    result = {'foo': 'bar'}

    def _run(self, params, state=None):
        self._state = state
        self.save(params, self.state)
        self.update(self.result, persistent=True)
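These two fixtures exercise the link contract the tests below rely on: `_run` receives any previously saved state, `save` persists params and state for the next run, and `update` queues a result for the client. A self-contained toy version of that contract (all names here are stand-ins, not pytransact's API):

```python
class ToyLink:
    """Stand-in for pytransact's Link: persists params/state between runs."""
    store = {}    # link id -> (params, state); stands in for the links collection
    updates = []  # stands in for the client's update queue

    def __init__(self, link_id, params=None):
        self.link_id = link_id
        self.params = params

    def save(self, params, state):
        ToyLink.store[self.link_id] = (params, state)

    def update(self, result, persistent=False):
        ToyLink.updates.append({'id': self.link_id, 'args': result})

    def run(self):
        saved = ToyLink.store.get(self.link_id)
        state = saved[1] if saved else None
        # A run without explicit params reuses the previously saved ones.
        params = self.params if self.params is not None else (saved[0] if saved else None)
        self._run(params, state)


class ToyPersistent(ToyLink):
    def _run(self, params, state=None):
        self.seen_state = state  # what a rerun receives back
        self.save(params, {'gurka': 'tomat'})
        self.update({'foo': 'bar'}, persistent=True)


first = ToyPersistent(27, params={'apa': 'bepa'})
first.run()                # saves state, queues an update
rerun = ToyPersistent(27)  # no params: falls back to the saved ones
rerun.run()
```

The first run sees no state; the rerun gets back exactly what `save` persisted, which is the behavior `test_persistent_new` and `test_persistent_client_update` assert against the real classes.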
class TestLinkPersistence(LinkTests):
def setup_method(self, method):
super(TestLinkPersistence, self).setup_method(method)
self.factory = Link.LinkFactory()
self.factory.Transient = LinkTransient
self.factory.Persistent = LinkPersistent
self.pushnewctx(commit.CommitContext, user=self.user)
def test_transient(self):
link = self.factory.create('Transient', self.clientId, self.linkId,
params=dict(toid='apa'))
link.run()
self.sync()
# no persistent links
assert not list(self.database.links.find({'client': self.clientId,
'link': self.linkId}))
# but one (transient) update for the client
clientdata, = list(self.database.clients.find({'_id': self.clientId}))
assert clientdata['updates'] == [{'type': 'update',
'id': self.linkId,
'args': {'foo': 'bar'}}]
def test_persistent_new(self):
assert self.database.links.find({'client': self.clientId,
'link': self.linkId}).count() == 0
link = self.factory.create('Persistent', self.clientId, self.linkId,
params={'apa': 'bepa'})
link.run()
self.sync()
assert link._state is None
linkdata, = list(self.database.links.find({'client': self.clientId,
'link': self.linkId}))
clientdata, = list(self.database.clients.find({'_id': self.clientId}))
linkdata.pop('_id') # don't care
assert linkdata == {'type': 'LinkPersistent',
'client': self.clientId,
'link': self.linkId,
'allowRead': [self.user.id[0]],
'params': {'apa': 'bepa'},
'state': link.state,
'timestamp': self.time,
'outdatedBy': None,
'ancient': False}
assert clientdata['updates'] == [{'type': 'update',
'id': self.linkId,
'args': {'foo': 'bar'}}]
def test_persistent_client_update(self):
# create an existing link
link = self.factory.create('Persistent', self.clientId, self.linkId,
params=dict(toid='27', attrs=['foo', 'bar']))
link.run()
self.sync()
self.database.clients.find_and_modify({'_id': self.clientId},
{'$set': {'updates': []}})
self.time.step()
# now update it
link = self.factory.create('Persistent', self.clientId, self.linkId,
params=dict(toid='27', attrs=['bar', 'baz']))
link.run()
self.sync()
assert link._state == {'gurka': 'tomat'}
linkdata, = list(self.database.links.find({'client': self.clientId,
'link': self.linkId}))
clientdata, = list(self.database.clients.find({'_id': self.clientId}))
linkdata.pop('_id') # don't care
assert linkdata == {'type': 'LinkPersistent',
'client': self.clientId,
'link': self.linkId,
'allowRead': [self.user.id[0]],
'params': dict(toid='27', attrs=['bar', 'baz']),
'state': link.state,
'timestamp': self.time,
'outdatedBy': None,
'ancient': False}
assert clientdata['updates'] == [{'type': 'update',
'id': self.linkId,
'args': {'foo': 'bar'}}]
def test_persistent_db_update(self, monkeypatch):
# create an existing link
link = self.factory.create('Persistent', self.clientId, self.linkId,
params=dict(toid='27', attrs=['foo', 'bar']))
link.run()
mongo.update_one(self.database.clients, {'_id': self.clientId},
{'$set': {'updates': []}})
self.sync()
# no new data, but uptodate should be set
monkeypatch.setattr(LinkPersistent, 'result', None)
link = self.factory.create('Persistent', self.clientId, self.linkId)
link.run()
self.sync()
clientdata = mongo.find_one(self.database.clients, {'_id': self.clientId})
assert clientdata['updates'] == []
linkdata = mongo.find_one(self.database.links, {'client': self.clientId,
'link': self.linkId})
assert linkdata['outdatedBy'] is None
monkeypatch.undo()
monkeypatch.setattr(LinkPersistent, 'state', {'sallad': 'paprika'})
# now update it
link = self.factory.create('Persistent', self.clientId, self.linkId)
link.run()
self.sync()
assert link._state == {'gurka': 'tomat'}
linkdata, = list(self.database.links.find({'client': self.clientId,
'link': self.linkId}))
clientdata, = list(self.database.clients.find({'_id': self.clientId}))
linkdata.pop('_id') # don't care
assert linkdata == {'type': 'LinkPersistent',
'client': self.clientId,
'link': self.linkId,
'state': {'sallad': 'paprika'},
'outdatedBy': None,
'allowRead': [self.user.id[0]],
'timestamp': self.time,
'ancient': False,
# same params as first time:
'params': dict(toid='27', attrs=['foo', 'bar'])}
assert clientdata['updates'] == [{'type': 'update',
'id': self.linkId,
'args': {'foo': 'bar'}}]
def test_remove(self):
link = self.factory.create('Persistent', self.clientId, self.linkId,
params={'apa': 'bepa'})
link.run()
self.sync()
# sanity checks
assert mongo.find_one(self.database.links, {'link': self.linkId})
client = mongo.find_one(self.database.clients, {'_id': self.clientId})
link = self.factory.create('Persistent', self.clientId, self.linkId)
link.remove()
self.sync()
assert not mongo.find_one(self.database.links, {'link': self.linkId})
client = mongo.find_one(self.database.clients, {'_id': self.clientId})
def test_iterate_links(self, monkeypatch):
# create a few links
link0 = self.factory.create('Persistent', self.clientId, self.linkId,
params=dict(toid='27',
attrs=['foo', 'bar']))
link0.run()
link1 = self.factory.create('Persistent', self.clientId, self.linkId+1,
params=dict(toid='28',
attrs=['apa', 'bepa']))
link1.run()
link2 = self.factory.create('Persistent', self.clientId, self.linkId+2,
params=dict(toid='29',
attrs=['gurka', 'tomat']))
link2.run()
self.sync()
monkeypatch.setattr(Link, 'LinkPersistent', LinkPersistent,
raising=False)
links = set((l.clientId, l.linkId) for l in self.factory.iter())
expected = set((l.clientId, l.linkId) for l in [link0, link1, link2])
assert links == expected
otherclient = ObjectId()
link3 = self.factory.create('Persistent', otherclient, self.linkId,
params=(dict(toid='30', attrs=['foo', 'bar'])))
link3.run()
self.sync()
links = set((l.clientId, l.linkId) for l in self.factory.iter(
{'client': otherclient}))
expected = set((l.clientId, l.linkId) for l in [link3])
assert links == expected
class TestLink(LinkTests):
def setup_method(self, method):
super(TestLink, self).setup_method(method)
self.ops = []
self.results = [[42]]
self.calls = []
self.error = None
def runCommit(ops, interested=None):
self.sync()
state = self._getState()
# link must have been saved before we process commit
assert state == 'processing'
self.ops = ops
self.interested = interested
cmt = commit.Commit(handled_by=interested, state='done',
interested=interested, results=self.results,
error=self.error)
cmt.done(self.database)
self.ctx.runCommit = runCommit
def test_CallMethod_blm(self):
link = Link.LinkCallMethod(self.clientId, self.linkId,
{'blmName':'testBlm',
'methodName':'testMethod',
'args':[0,1,2]})
val = BlobVal('42')
val.large_blob = 1
self.results = [[{'foo': val}]]
link.run()
self.sync()
op, = self.ops
assert isinstance(op, commit.CallBlm)
assert op.args == [0,1,2]
assert op.blmName == 'testBlm'
assert op.methodName == 'testMethod'
assert self.interested == (self.clientId, self.linkId)
result = self._getResult()
assert result == {'error': None, 'result': [{'foo': val}]}
assert result['result'][0]['foo'].references == {self.clientId}
# commits are removed after result has been produced
assert mongo.find(self.database.links,
{'client': link.clientId, 'link': link.linkId},
).count() == 0
def test_CallMethod_blm_error(self):
self.results = []
self.error = 'error'
link = Link.LinkCallMethod(self.clientId, self.linkId,
{'blmName':'testBlm',
'methodName':'testMethod',
'args':[0,1,2]})
link.run()
op, = self.ops
assert isinstance(op, commit.CallBlm)
assert op.args == [0,1,2]
assert op.blmName == 'testBlm'
assert op.methodName == 'testMethod'
assert self.interested == (self.clientId, self.linkId)
result = self._getResult()
assert result == {'error': 'error', 'result': []}
self.sync()
assert mongo.find(self.database.links, {'client': link.clientId,
'link': link.linkId}).count() == 0
def test_CallMethod_toi(self):
oid = ObjectId()
link = Link.LinkCallMethod(self.clientId, self.linkId,
{'toid': str(oid),
'methodName':'testMethod',
'args':[0,1,2]})
link.run()
op, = self.ops
assert isinstance(op, commit.CallToi)
assert op.args == [0,1,2]
assert op.toid == oid
assert op.methodName == 'testMethod'
assert self.interested == (self.clientId, self.linkId)
self.sync()
result = self._getResult()
assert result == {'error': None, 'result': [42]}
assert mongo.find(self.database.links, {'client': link.clientId,
'link': link.linkId},
).count() == 0
class TestLinkWithData(LinkTests):
def test_Request(self):
from blm import testblm
oid = ObjectId()
link = Link.LinkRequest(self.clientId, self.linkId,
{'toid': str(oid),
'attrList' : ['attr1', 'attr2']})
link.run()
x = self._getResult()
assert isinstance(x['error'], exceptions.ToiNonexistantError)
assert not self.uptodate()
oid = ObjectId()
toi = self.ctx.createToi(blm.testblm.Test, oid,
{'name':['test_Request']})
self.sync()
link = Link.LinkRequest(self.clientId, self.linkId,
{'toid': str(oid),
'attrList' : ['attr1', 'attr2']})
link.run()
x = self._getResult()
assert x['error'] is None
assert x['toiDiff'].diffAttrs == {'attr1': [], 'attr2': [] }
s = self._getState()
assert s is None
assert not self.uptodate()
oid = ObjectId()
toi = self.ctx.createToi(blm.testblm.Test, oid,
{'name':['test_Request'], 'attr1':['A1']})
self.sync()
link = Link.LinkRequest(self.clientId, self.linkId,
{'toid': str(oid),
'attrList' : ['attr1', 'attr2']})
link.run()
x = self._getResult()
assert x['error'] is None
assert x['toiDiff'].diffAttrs == {'attr1': ['A1'], 'attr2': [] }
s = self._getState()
assert s is None
assert not self.uptodate()
# subscription
link = Link.LinkRequest(self.clientId, self.linkId,
{'toid': str(oid),
'attrList' : ['attr1', 'attr2'],
'subscription': True})
link.run()
self.sync()
x = self._getResult()
assert x['error'] is None
assert x['toiDiff'].diffAttrs == {'attr1': ['A1'], 'attr2': [] }
s = self._getState()
assert s == {'attr1': ['A1'], 'attr2': [] }
assert self.uptodate()
p = self._getParams()
assert p['toid'] == oid
link = Link.LinkRequest(self.clientId, self.linkId)
link.run()
x = self._getResult()
assert x is None
s = self._getState()
assert s == {'attr1': ['A1'], 'attr2': [] }
assert self.uptodate()
# update
self.outdate()
self.sync()
toi.attr2 = ['A2']
link = Link.LinkRequest(self.clientId, self.linkId)
link.run()
self.sync()
x = self._getResult()
assert x['toiDiff'].diffAttrs == {'attr2': [BlobVal('A2')]}
s = self._getState()
assert s == {'attr1': ['A1'], 'attr2': [BlobVal('A2')] }
assert s['attr2'][0].references == {self.clientId, link.link['_id']}
assert self.uptodate()
def test_Query(self):
from blm import testblm
query = blm.testblm.Test._query(name='test_Query')
# empty result
link = Link.LinkQuery(self.clientId, self.linkId, {'criteria': query})
link.run()
x = self._getResult()
assert x['add'] == {}
assert x['del'] == {}
assert x['error'] is None
s = self._getState()
assert s is None
assert not self.uptodate()
# non empty result
oid1 = ObjectId()
toi1 = self.ctx.createToi(blm.testblm.Test, oid1,
{'name':['test_Query'],
'attr1':['A1']})
self.sync()
self.pushnewctx()
link = Link.LinkQuery(self.clientId, self.linkId, {'criteria' : query})
link.run()
x = self._getResult()
assert x['add'] == { str(oid1): 'testblm.Test' }
assert x['del'] == {}
assert x['error'] is None
s = self._getState()
assert s is None
assert not self.uptodate()
# subscription
link = Link.LinkQuery(self.clientId, self.linkId,
{'criteria' : query, 'subscription': True})
link.run()
self.sync()
x = self._getResult()
assert x['add'] == { str(oid1): 'testblm.Test' }
assert x['del'] == {}
assert x['error'] is None
s = self._getState()
assert s == { str(oid1): 'testblm.Test' }
assert self.uptodate()
# update, new toi
self.outdate()
oid2 = ObjectId()
toi2 = self.ctx.createToi(blm.testblm.Test, oid2,
{'name':['test_Query'],
'attr1':['A2']})
self.sync()
self.pushnewctx()
link = Link.LinkQuery(self.clientId, self.linkId)
link.run()
self.sync()
x = self._getResult()
assert x['add'] == { str(oid2): 'testblm.Test' }
assert x['del'] == {}
assert x['error'] is None
s = self._getState()
assert s == { str(oid1): 'testblm.Test',
str(oid2): 'testblm.Test',}
assert self.uptodate()
# update, remove toi
self.outdate()
self.ctx.deleteToi(toi1)
self.sync()
self.pushnewctx()
link = Link.LinkQuery(self.clientId, self.linkId)
link.run()
self.sync()
x = self._getResult()
assert x['add'] == {}
assert x['del'] == { str(oid1): 'testblm.Test' }
assert x['error'] is None
s = self._getState()
assert s == { str(oid2): 'testblm.Test'}
assert self.uptodate()
def test_SortedQuery_basic(self, monkeypatch):
monkeypatch.setattr(BlobVal, 'large_blob', 1)
from blm import testblm
query = blm.testblm.Test._query(name='test_SortedQuery')
query.clear()
# empty result
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria' : query,
'attrList' : ['attr1', 'attr2']})
link.run()
x = self._getResult()
assert x['diffops'] == []
assert x['error'] is None
assert x['toiDiffs'] == {}
s = self._getState()
assert s is None
assert not self.uptodate()
# non empty result
self.outdate()
oid = ObjectId()
toi = self.ctx.createToi(blm.testblm.Test, oid,
{'name':['test_SortedQuery'], 'attr1':['A1']})
self.sync()
self.pushnewctx()
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria' : query,
'attrList' : ['attr1', 'attr2']})
link.run()
x = self._getResult()
assert x['diffops'] == [[0, 0, [toi]]]
assert x['error'] is None
assert len(x['toiDiffs']) == 1
td = x['toiDiffs'][str(oid)]
assert td.diffAttrs == {'attr1': ['A1'], 'attr2': [] }
s = self._getState()
assert s is None
assert not self.uptodate()
# subscription
self.outdate()
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria' : query,
'attrList' : ['attr1', 'attr2'],
'subscription' : True,
'sorting': 'attr1'})
link.run()
self.sync()
x = self._getResult()
assert x['diffops'] == [[0, 0, [toi]]]
assert x['error'] is None
assert len(x['toiDiffs']) == 1
td = x['toiDiffs'][str(oid)]
assert td.diffAttrs == {'attr1': ['A1'], 'attr2': [] }
s = self._getState()
expected = { 'query' : [ str(oid) ],
'tois' : { str(oid) : { 'attr1' : ['A1'], 'attr2' : [] }},
'order': [ str(oid) ],
}
assert s == expected
assert self.uptodate()
# no change
self.outdate()
link = Link.LinkSortedQuery(self.clientId, self.linkId)
link.run()
self.sync()
x = self._getResult()
assert x is None
assert self.uptodate()
# update of toi
self.outdate()
val = BlobVal('A2')
toi.attr2 = [val]
self.pushnewctx()
link = Link.LinkSortedQuery(self.clientId, self.linkId)
link.run()
self.sync()
x = self._getResult()
assert x['diffops'] == []
assert x['error'] is None
assert len(x['toiDiffs']) == 1
td = x['toiDiffs'][str(oid)]
assert td.diffAttrs == {'attr2': [val] }
assert td.diffAttrs['attr2'][0].references == {self.clientId, link.link['_id']}
assert self.uptodate()
assert self.database.blobvals.files.find({'metadata.references.value': [self.clientId]}).count() == 0
# update of query: new toi
oid2 = ObjectId()
val = BlobVal('B2')
toi2 = self.ctx.createToi(blm.testblm.Test, oid2,
{'attr1':['B1'], 'attr2': [val]})
self.outdate([oid2])
self.sync()
link = Link.LinkSortedQuery(self.clientId, self.linkId)
link.run()
x = self._getResult()
assert x['diffops'] == [[1, 1, [toi2]]]
assert x['error'] is None
assert len(x['toiDiffs']) == 1
td = x['toiDiffs'][str(oid2)]
assert td.diffAttrs == {'attr1': ['B1'], 'attr2': [val] }
assert self.uptodate()
# client still references outdated blobval A2
assert self.database.blobvals.files.find({'metadata.references.value': [self.clientId]}).count() == 1
# update: remove toi
self.ctx.deleteToi(toi2)
self.outdate([oid2])
self.sync()
self.pushnewctx()
link = Link.LinkSortedQuery(self.clientId, self.linkId)
link.run()
x = self._getResult()
assert x['diffops'] == [[1, 2, []]]
assert x['error'] is None
assert len(x['toiDiffs']) == 0
assert self.uptodate()
# client still references outdated blobvals A2, B2; JsLink._msg_poll will dereference
assert self.database.blobvals.files.find({'metadata.references.value': [self.clientId]}).count() == 2
def test_SortedQuery_rerun_optimization(self, monkeypatch):
from blm import testblm
oid1 = ObjectId()
toi1 = self.ctx.createToi(blm.testblm.Test, oid1, {'name':['foobar']})
oid2 = ObjectId()
toi2 = self.ctx.createToi(blm.testblm.Test, oid2, {'name':['gazonk']})
oid3 = ObjectId()
toi3 = self.ctx.createToi(blm.testblm.Test, oid3, {'name':['zonka']})
self.sync()
query = blm.testblm.Test._query(name=Query.Like('foo*'))
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria': query,
'attrList': ['name'],
'subscription': True,
'sorting': 'name'})
link.run()
r = self._getResult()
assert list(r['toiDiffs'].keys()) == [str(oid1)] # sanity
self.ctx.changeToi(toi2, {'name': ['foooo']})
mongo.update_one(self.database.links,
{'client': self.clientId, 'link': self.linkId},
{'$set': {'outdatedBy': ObjectId(),
'outdatedToids': [oid2]}})
self.sync()
link = Link.LinkFactory().create(None, self.clientId, self.linkId)
runQuery = self.ctx.runQuery
queries = []
def _runQuery(query):
queries.append(query)
return runQuery(query)
monkeypatch.setattr(self.ctx, 'runQuery', _runQuery)
link.run()
r = self._getResult()
assert r['diffops'] == [[1, 1, [toi2]]]
assert r['toiDiffs'][str(oid2)].diffAttrs == {'name': ['foooo']}
assert queries == [blm.testblm.Test._query(id=[oid2])] # whitebox
def test_SortedQuery_rerun_optimization_on_id(self, monkeypatch):
from blm import testblm
oid1 = ObjectId()
toi1 = self.ctx.createToi(blm.testblm.Test, oid1, {'name':['foobar']})
self.sync()
query = blm.testblm.Test._query(id=Query.In([str(oid1)]))
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria': query,
'attrList': ['name'],
'subscription': True,
'sorting': 'name'})
link.run()
r = self._getResult()
assert list(r['toiDiffs'].keys()) == [str(oid1)] # sanity
self.ctx.changeToi(toi1, {'name': ['foooo']})
mongo.update_one(self.database.links,
{'client': self.clientId, 'link': self.linkId},
{'$set': {'outdatedBy': ObjectId(),
'outdatedToids': [oid1]}})
self.sync()
link = Link.LinkFactory().create(None, self.clientId, self.linkId)
runQuery = self.ctx.runQuery
queries = []
def _runQuery(query):
queries.append(query)
return runQuery(query)
monkeypatch.setattr(self.ctx, 'runQuery', _runQuery)
link.run()
r = self._getResult()
assert r['diffops'] == []
assert r['toiDiffs'][str(oid1)].diffAttrs == {'name': ['foooo']}
assert queries == [blm.testblm.Test._query(id=[oid1])] # whitebox
def test_SortedQuery_fulltext_update(self):
py.test.skip('full text index disabled for now')
from blm import testblm
with commit.CommitContext(self.database, user=self.user) as ctx:
ctx.setMayChange(True)
oid1 = ObjectId()
toi1 = ctx.createToi(blm.testblm.Test, oid1, {'name':['foo']})
ctx.runCommit([])
self.sync()
query = blm.testblm.Test._query(id=Query.Fulltext('bar'))
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria': query,
'attrList': ['name'],
'subscription': True,
'sorting': 'name'})
link.run()
self.sync()
x = self._getResult()
assert x['diffops'] == []
toi1, = blm.testblm.Test._query(id=oid1).run()
with commit.CommitContext(self.database) as ctx:
ctx.setMayChange(True)
ctx.changeToi(toi1, {'name': ['foo bar']})
ctx.runCommit([])
mongo.update_one(self.database.links,
{'client': self.clientId, 'link': self.linkId},
{'$set': {'outdatedBy': ObjectId(),
'outdatedToids': [oid1]}})
self.sync()
link = Link.LinkFactory().create(None, self.clientId, self.linkId)
link.run()
self.sync()
x = self._getResult()
assert x['diffops'] == [[0, 0, [toi1]]]
def test_SortedQuery_updateParameters(self):
from blm import testblm
query = blm.testblm.Test._query(name='test_SortedQuery')
oid1 = ObjectId()
toi1 = self.ctx.createToi(blm.testblm.Test, oid1,
{'name':['test_SortedQuery'],
'attr1':['A1'], 'attr2':['B1']})
oid2 = ObjectId()
toi2 = self.ctx.createToi(blm.testblm.Test, oid2,
{'name':['test_SortedQuery'],
'attr1':['B1'], 'attr2':['A2']})
self.sync()
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria': query,
'attrList': ['attr1', 'attr2'],
'subscription' : True,
'sorting': 'attr1'})
link.run()
x = self._getResult()
assert x['diffops'] == [[0, 0, [toi1, toi2]]]
self.outdate()
link = Link.LinkSortedQuery(self.clientId, self.linkId)
link.updateParameters(params={'sorting': 'attr2'})
x = self._getResult()
self.sync()
assert x['diffops'] == [[0, 0, [toi2]], [1, 2, []]]
assert self.uptodate()
# Make sure we always get an update, even if the result does not change.
# The client expects it.
self.outdate()
link = Link.LinkSortedQuery(self.clientId, self.linkId)
link.updateParameters(params={'sorting': 'attr2'})
x = self._getResult()
self.sync()
assert x['diffops'] == []
assert self.uptodate()
assert self._getLinkData()['outdatedToids'] == []
def test_SortedQuery_timing_error(self, monkeypatch):
# this test is nearly identical to
# test_SortedQuery_rerun_optimization but sneakily modifies
# the link's uptodate state while the link is running
#
# (it is possible that this is now a strict superset of
# test_SortedQuery_rerun_optimization and if so we might want
# to consider merging them)
from blm import testblm
oid1 = ObjectId()
toi1 = self.ctx.createToi(blm.testblm.Test, oid1, {'name':['foobar']})
oid2 = ObjectId()
toi2 = self.ctx.createToi(blm.testblm.Test, oid2, {'name':['gazonk']})
self.sync()
query = blm.testblm.Test._query(name=Query.Like('foo*'))
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria': query,
'attrList': ['name'],
'subscription': True,
'sorting': 'name'})
link.run()
r = self._getResult()
assert list(r['toiDiffs'].keys()) == [str(oid1)] # sanity
cid1 = ObjectId()
self.ctx.changeToi(toi2, {'name': ['foooo']})
mongo.update_one(self.database.links,
{'client': self.clientId, 'link': self.linkId},
{'$set': {'outdatedBy': cid1,
'outdatedToids': [oid2]}})
self.sync()
link = Link.LinkFactory().create(None, self.clientId, self.linkId)
runQuery = self.ctx.runQuery
queries = []
cid2 = ObjectId()
def _runQuery(query):
queries.append(query)
result = list(runQuery(query))
mongo.update_one(self.database.links,
{'client': self.clientId, 'link': self.linkId},
{'$set': {'outdatedBy': cid2},
'$addToSet': {'outdatedToids': oid1}})
return result
monkeypatch.setattr(self.ctx, 'runQuery', _runQuery)
link.run()
r = self._getResult()
assert r['diffops'] == [[1, 1, [toi2]]]
assert r['toiDiffs'][str(oid2)].diffAttrs == {'name': ['foooo']}
assert queries == [blm.testblm.Test._query(id=[oid2])] # whitebox
self.sync()
assert not self.uptodate()
assert self._getLinkData()['outdatedBy'] == cid2
assert set(self._getLinkData()['outdatedToids']) == {oid1, oid2}
monkeypatch.undo()
link = Link.LinkFactory().create(None, self.clientId, self.linkId)
link.run()
self.sync()
assert self.uptodate()
assert self._getLinkData()['outdatedBy'] is None
assert self._getLinkData()['outdatedToids'] == []
def test_SortedQuery_do_not_optimize_ancients(self, monkeypatch):
from blm import testblm
oid1 = ObjectId()
toi1 = self.ctx.createToi(blm.testblm.Test, oid1, {'name':['foobar']})
oid2 = ObjectId()
toi2 = self.ctx.createToi(blm.testblm.Test, oid2, {'name':['gazonk']})
oid3 = ObjectId()
toi3 = self.ctx.createToi(blm.testblm.Test, oid3, {'name':['zonka']})
self.sync()
query = blm.testblm.Test._query(name=Query.Like('foo*'))
link = Link.LinkSortedQuery(self.clientId, self.linkId,
{'criteria': query,
'attrList': ['name'],
'subscription': True,
'sorting': 'name'})
link.run()
r = self._getResult()
assert list(r['toiDiffs'].keys()) == [str(oid1)] # sanity
self.ctx._query_cache.clear() # xxx we didn't need to do this in py2
self.ctx.changeToi(toi2, {'name': ['foooo']})
mongo.update_one(self.database.links,
{'client': self.clientId, 'link': self.linkId},
{'$set': {'outdatedBy': ObjectId(),
'outdatedToids': [],
'ancient': True}})
self.sync()
link = Link.LinkFactory().create(None, self.clientId, self.linkId)
runQuery = self.ctx.runQuery
queries = []
def _runQuery(query):
queries.append(query)
return runQuery(query)
monkeypatch.setattr(self.ctx, 'runQuery', _runQuery)
link.run()
r = self._getResult()
assert r['diffops'] == [[1, 1, [toi2]]]
assert r['toiDiffs'][str(oid2)].diffAttrs == {'name': ['foooo']}
assert queries == [query] # whitebox
| 37.794715 | 109 | 0.504168 | 3,638 | 37,190 | 5.098406 | 0.100605 | 0.056286 | 0.043131 | 0.059306 | 0.734904 | 0.710589 | 0.689239 | 0.659478 | 0.64077 | 0.618719 | 0 | 0.012028 | 0.353912 | 37,190 | 983 | 110 | 37.833164 | 0.759905 | 0.046679 | 0 | 0.699229 | 0 | 0 | 0.09217 | 0.002119 | 0 | 0 | 0 | 0 | 0.200514 | 1 | 0.047558 | false | 0 | 0.025707 | 0 | 0.096401 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9066a6fa5b6c16a19b4190b88ec90f81e3ece57d | 36,366 | py | Python | gigpower/src/gigpower/solution_nr3.py | elaguerta/gigpower | 22e0a6152fa8d7a04f6067f3d500bfee042a98f9 | [
"Unlicense"
] | null | null | null | gigpower/src/gigpower/solution_nr3.py | elaguerta/gigpower | 22e0a6152fa8d7a04f6067f3d500bfee042a98f9 | [
"Unlicense"
] | 4 | 2021-06-15T16:50:13.000Z | 2021-06-15T16:53:23.000Z | gigpower/src/gigpower/solution_nr3.py | msankur/gigpower | 22e0a6152fa8d7a04f6067f3d500bfee042a98f9 | [
"Unlicense"
] | null | null | null | # Elaine Laguerta (github: @elaguerta)
# LBNL GIG
# File created: 19 February 2021
# Create NR3 Solution class, a namespace for calculations used by nr3

from . solution import Solution
from . circuit import Circuit
import numpy as np
from . nr3_lib.compute_NR3FT import compute_NR3FT
from . nr3_lib.compute_NR3JT import compute_NR3JT
from . nr3_lib.map_output import map_output


class SolutionNR3(Solution):
    CONVERGENCE_TOLERANCE = 10**-6

    @classmethod
    def set_zip_values(cls, zip_v):
        """
        Sets zip values for the Solution class.
        param zip_v: List or nd.array with 7 values
        [a_z_p, a_i_p, a_pq_p, a_z_q, a_i_q, a_pq_q, min voltage pu]
        Note that zip values are set on both the Solution class and the
        Circuit class.
        """
        Solution.set_zip_values(zip_v)

    def __init__(self, dss_fp: str, **kwargs):
        super().__init__(dss_fp, **kwargs)  # sets self.circuit
        self._set_orient('cols')
        self._init_XNR()
        self._init_slack_bus_matrices()
        self._init_KVL_matrices()
        self._init_KCL_matrices()
        self._init_KVL_matrices_vregs()
    def _init_XNR(self):
        """
        adapted from
        https://github.com/msankur/LinDist3Flow/blob/vectorized/20180601/PYTHON/lib/basematrices.py
        written by @kathleenchang
        """
        V0, I0 = None, None
        Vslack = self.__class__.VSLACK

        nline = self.circuit.lines.num_elements
        nnode = self.circuit.buses.num_elements
        tf_lines = self.circuit.transformers.get_num_lines_x_ph()
        vr_lines = self.circuit.voltage_regulators.get_num_lines_x_ph()

        # XNR order is bus voltages, line currents, transformer line currents,
        # voltage regulator line currents * 2
        XNR = np.zeros((2*3*(nnode + nline) + 2*tf_lines + 2*2*vr_lines, 1),
                       dtype=float)

        # initialize node voltage portion of XNR
        if V0 is None or len(V0) == 0:
            for ph in range(3):
                for k1 in range(nnode):
                    XNR[2*ph*nnode + 2*k1] = Vslack[ph].real
                    XNR[2*ph*nnode + 2*k1+1] = Vslack[ph].imag
        # If initial V is given (usually from CVX)
        elif len(V0) != 0:
            for ph in range(3):
                for k1 in range(nnode):
                    XNR[2*ph*nnode + 2*k1] = V0[ph, k1].real
                    XNR[2*ph*nnode + 2*k1+1] = V0[ph, k1].imag

        # initialize line current portion of XNR
        if I0 is None or len(I0) == 0:
            XNR[(2*3*nnode):] = 0.0 * np.ones((6*nline + 2*tf_lines + 2*2*vr_lines, 1), dtype=float)
        # If initial I is given
        elif len(I0) != 0:
            for ph in range(3):
                for k1 in range(nline):
                    XNR[(2*3*nnode) + 2*ph*nline + 2*k1] = I0[ph, k1].real
                    XNR[(2*3*nnode) + 2*ph*nline + 2*k1+1] = I0[ph, k1].imag
            # zero the transformer / voltage-regulator current entries
            # (shape fixed: np.zeros takes a single shape tuple)
            XNR[(2*3*nnode + 2*3*nline):] = np.zeros((len(XNR) - 2*3*nnode - 2*3*nline, 1))
        self.XNR = XNR
    def _init_slack_bus_matrices(self):
        """
        Initializes g_SB and b_SB
        adapted from
        https://github.com/msankur/LinDist3Flow/blob/vectorized/20180601/PYTHON/lib/basematrices.py
        written by @kathleenchang
        """
        tf_lines = self.circuit.transformers.get_num_lines_x_ph()
        vr_lines = self.circuit.voltage_regulators.get_num_lines_x_ph()
        nline = self.circuit.lines.num_elements
        nnode = self.circuit.buses.num_elements
        Vslack = self.__class__.VSLACK

        # ------------ Slack Bus ------------------
        self.g_SB = np.zeros((6, 2*3*(nnode+nline) + 2*tf_lines + 2*2*vr_lines),
                             dtype=float)
        sb_idx = [0, 1, 2*nnode, (2*nnode)+1, 4*nnode, (4*nnode)+1]
        for i in range(len(sb_idx)):
            self.g_SB[i, sb_idx[i]] = 1

        self.b_SB = np.zeros((6, 1), dtype=float)
        for i in range(3):
            self.b_SB[2*i, 0] = Vslack[i].real
            self.b_SB[(2*i) + 1] = Vslack[i].imag
    def _init_KVL_matrices(self):
        """
        set self.b_KVL and self.G_KVL
        copied from
        https://github.com/msankur/LinDist3Flow/blob/vectorized/20180601/PYTHON/lib/compute_SBKVL_matrices.py
        written by @kathleenchang
        """
        tf_bus = self.circuit.transformers.get_bus_ph_matrix()
        tf_lines = self.circuit.transformers.get_num_lines_x_ph()
        vr_lines = self.circuit.voltage_regulators.get_num_lines_x_ph()
        nline = self.circuit.lines.num_elements
        nnode = self.circuit.buses.num_elements
        X_matrix = self.circuit.lines.get_X_matrix()
        R_matrix = self.circuit.lines.get_R_matrix()

        # ------- Residuals for KVL across line (m,n) ----------
        self.b_KVL = np.zeros((2*3*(nline) + 2*tf_lines + 2*2*vr_lines, 1),
                              dtype=float)
        G_KVL = np.zeros((2*3*(nline) + 2*tf_lines + 2*2*vr_lines,
                          2*3*(nnode+nline) + 2*tf_lines + 2*2*vr_lines),
                         dtype=float)

        for ph in range(3):
            for line in range(nline):  # line = line index
                line_ele = self.circuit.lines.get_element(line)
                bus1_idx = self.circuit.buses.get_idx(line_ele.tx)
                bus2_idx = self.circuit.buses.get_idx(line_ele.rx)
                bus1_phases = line_ele.phase_matrix
                if bus1_phases[ph] == 1:
                    G_KVL[2*ph*nline + 2*line][2*(nnode)*ph + 2*(bus1_idx)] = 1  # A_m
                    G_KVL[2*ph*nline + 2*line][2*(nnode)*ph + 2*(bus2_idx)] = -1  # A_n
                    G_KVL[2*ph*nline + 2*line+1][2*(nnode)*ph + 2*(bus1_idx) + 1] = 1  # B_m
                    G_KVL[2*ph*nline + 2*line+1][2*(nnode)*ph + 2*(bus2_idx) + 1] = -1  # B_n

                    G_KVL[2*ph*nline + 2*line][2*3*(nnode) + 2*line] = -R_matrix[line][ph*3] * bus1_phases[0]  # C_mn for a
                    G_KVL[2*ph*nline + 2*line][2*3*(nnode) + 2*line + 1] = X_matrix[line][ph*3] * bus1_phases[0]  # D_mn for a

                    G_KVL[2*ph*nline + 2*line][2*3*(nnode) + 2*nline + 2*line] = -R_matrix[line][ph*3 + 1] * bus1_phases[1]  # C_mn for b
                    G_KVL[2*ph*nline + 2*line][2*3*(nnode) + 2*nline + 2*line + 1] = X_matrix[line][ph*3 + 1] * bus1_phases[1]  # D_mn for b

                    G_KVL[2*ph*nline + 2*line][2*3*(nnode) + 4*nline + 2*line] = -R_matrix[line][ph*3 + 2] * bus1_phases[2]  # C_mn for c
                    G_KVL[2*ph*nline + 2*line][2*3*(nnode) + 4*nline + 2*line + 1] = X_matrix[line][ph*3 + 2] * bus1_phases[2]  # D_mn for c

                    G_KVL[2*ph*nline + 2*line+1][2*3*(nnode) + 2*line] = -X_matrix[line][ph*3] * bus1_phases[0]  # C_mn for a
                    G_KVL[2*ph*nline + 2*line+1][2*3*(nnode) + 2*line + 1] = -R_matrix[line][ph*3] * bus1_phases[0]  # D_mn for a

                    G_KVL[2*ph*nline + 2*line+1][2*3*(nnode) + 2*nline + 2*line] = -X_matrix[line][ph*3 + 1] * bus1_phases[1]  # C_mn for b
                    G_KVL[2*ph*nline + 2*line+1][2*3*(nnode) + 2*nline + 2*line + 1] = -R_matrix[line][ph*3 + 1] * bus1_phases[1]  # D_mn for b

                    G_KVL[2*ph*nline + 2*line+1][2*3*(nnode) + 4*nline + 2*line] = -X_matrix[line][ph*3 + 2] * bus1_phases[2]  # C_mn for c
                    G_KVL[2*ph*nline + 2*line+1][2*3*(nnode) + 4*nline + 2*line + 1] = -R_matrix[line][ph*3 + 2] * bus1_phases[2]  # D_mn for c
                else:
                    G_KVL[2*ph*nline + 2*line][2*(nnode)*3 + 2*ph*nline + 2*line] = 1  # C_mn
                    G_KVL[2*ph*nline + 2*line+1][2*(nnode)*3 + 2*ph*nline + 2*line+1] = 1  # D_mn

        # ------- Residuals for Transformer KVL ----------
        line_idx_tf = range(0, tf_lines)
        kvl_count = 0
        for tfbs in range(len(tf_bus[0])):
            for ph in range(0, 3):
                if tf_bus[ph + 2, tfbs] != 0:
                    line = line_idx_tf[kvl_count]
                    G_KVL[2*3*nline + 2*line][2*nnode*ph + 2*int(tf_bus[0, tfbs])] = 1  # A_m
                    G_KVL[2*3*nline + 2*line][2*nnode*ph + 2*int(tf_bus[1, tfbs])] = -1  # A_n
                    G_KVL[2*3*nline + 2*line+1][2*nnode*ph + 2*int(tf_bus[0, tfbs]) + 1] = 1  # B_m
                    G_KVL[2*3*nline + 2*line+1][2*nnode*ph + 2*int(tf_bus[1, tfbs]) + 1] = -1  # B_n
                    kvl_count += 1
        self.G_KVL = G_KVL
    def _init_KCL_matrices(self, der=0, capacitance=0):
        """
        set H, g, b
        copied from 20180601/PYTHON/lib/compute_KCL_matrices.py
        written by @kathleenchang
        """
        tf_bus = self.circuit.transformers.get_bus_ph_matrix()
        vr_bus = self.circuit.voltage_regulators.get_bus_ph_matrix()
        tf_lines = self.circuit.transformers.get_num_lines_x_ph()
        tf_no = self.circuit.transformers.num_elements
        vr_lines = self.circuit.voltage_regulators.get_num_lines_x_ph()
        vr_count = self.circuit.voltage_regulators.num_elements
        nnode = self.circuit.buses.num_elements
        nline = self.circuit.lines.num_elements

        # Line Indices Associated with Voltage Regulators and Transformers
        line_in_idx_vr = range(0, 2*vr_lines, 2)
        line_out_idx_vr = range(1, 2*vr_lines, 2)
        line_idx_tf = range(0, tf_lines)

        load_kw = self.circuit.loads.get_ppu_matrix()
        load_kvar = self.circuit.loads.get_qpu_matrix()
        caparr = self.circuit.get_cappu_matrix()

        # ---------- Residuals for KCL at a bus (m) ----------
        bp = self.circuit.buses.get_phase_matrix_dict()
        dss = self.dss

        # Zip Parameters
        # Load
        beta_S = Circuit.aPQ_p
        beta_I = Circuit.aI_p
        beta_Z = Circuit.aZ_p
        # Capacitors
        gamma_S = Circuit.aPQ_p
        gamma_I = Circuit.aI_p
        gamma_Z = Circuit.aZ_p

        H = np.zeros((2*3*(nnode-1), 2*3*(nnode+nline) + 2*tf_lines +
                      2*2*vr_lines, 2*3*(nnode+nline) + 2*tf_lines +
                      2*2*vr_lines), dtype=float)
        g = np.zeros((2*3*(nnode-1), 1, 2*3*(nnode+nline) + 2*tf_lines +
                      2*2*vr_lines), dtype=float)
        b = np.zeros((2*3*(nnode-1), 1, 1), dtype=float)
        # --------------- Quadratic Terms -----------------
        for ph in range(0, 3):
            if ph == 0:  # set nominal voltage based on phase
                A0 = 1
                B0 = 0
            elif ph == 1:
                A0 = -1/2
                B0 = -1 * np.sqrt(3)/2
            elif ph == 2:
                A0 = -1/2
                B0 = np.sqrt(3)/2

            for k2 in range(1, len(dss.Circuit.AllBusNames())):  # skip slack bus
                dss.Circuit.SetActiveBus(dss.Circuit.AllBusNames()[k2])  # set the bus
                bus_name = dss.Circuit.AllBusNames()[k2]
                in_lines = self.circuit.lines.get_line_list(bus_name, 'in')  # upstream buses
                out_lines = self.circuit.lines.get_line_list(bus_name, 'out')  # downstream buses
                for cplx in range(0, 2):
                    idxbs = dss.Circuit.AllBusNames().index(dss.Circuit.AllBusNames()[k2])
                    if cplx == 0:
                        load_val = load_kw[ph, idxbs]
                        cap_val = 0
                    else:
                        load_val = load_kvar[ph, idxbs]
                        cap_val = caparr[ph][idxbs]

                    # gradient_mag = np.array([A0 * ((A0**2+B0**2) ** (-1/2)), B0 * ((A0**2+B0**2) ** (-1/2))])  # some derivatives
                    hessian_mag = np.array([[-((A0**2)*(A0**2+B0**2)**(-3/2))+(A0**2+B0**2)**(-1/2), -A0*B0*(A0**2+B0**2)**(-3/2)],
                                            [-A0*B0*(A0**2+B0**2)**(-3/2), -((B0**2)*(A0**2+B0**2)**(-3/2))+((A0**2+B0**2)**(-1/2))]], dtype=float)
                    available_phases = bp[dss.Circuit.AllBusNames()[k2]]  # phase array at specific bus
                    if available_phases[ph] == 1:  # quadratic terms
                        H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2][2*(nnode)*ph + 2*k2] = \
                            -load_val * (beta_Z + (0.5 * beta_I * hessian_mag[0][0])) + \
                            cap_val * (gamma_Z + (0.5 * gamma_I * hessian_mag[0][0]))  # TE replace assignment w/ -load_val * beta_Z; a**2
                        H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2 + 1][2*(nnode)*ph + 2*k2 + 1] = \
                            -load_val * (beta_Z + (0.5 * beta_I * hessian_mag[1][1])) + \
                            cap_val * (gamma_Z + (0.5 * gamma_I * hessian_mag[1][1]))  # TE replace assignment w/ -load_val * beta_Z; b**2
                        # H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2][2*(nnode)*ph + 2*k2 + 1] = -load_val * beta_I * hessian_mag[0][1] / 2  # remove for TE
                        # H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2 + 1][2*(nnode)*ph + 2*k2] = -load_val * beta_I * hessian_mag[1][0] / 2  # remove for TE

                    for i in range(len(in_lines)):  # in lines
                        line_idx = self.circuit.lines.get_idx(in_lines[i])
                        if available_phases[ph] == 1:
                            if cplx == 0:  # real residual
                                # A_m and C_lm
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2][2*3*(nnode) + 2*ph*nline + 2*line_idx] = 1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx][2*(nnode)*ph + 2*k2] = 1/2
                                # B_m and D_lm
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1] = 1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1][2*(nnode)*ph + 2*k2 + 1] = 1/2
                            if cplx == 1:  # imaginary residual
                                # A_m, D_lm
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1] = -1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1][2*(nnode)*ph + 2*k2] = -1/2
                                # B_m and C_lm
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode) + 2*ph*nline + 2*line_idx] = 1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx][2*(nnode)*ph + 2*k2 + 1] = 1/2

                    for j in range(len(out_lines)):  # out lines
                        line_idx = self.circuit.lines.get_idx(out_lines[j])
                        if available_phases[ph] == 1:
                            if cplx == 0:  # real residual
                                # A_m and C_mn
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2][2*3*(nnode) + 2*ph*nline + 2*line_idx] = -1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx][2*(nnode)*ph + 2*k2] = -1/2
                                # B_m and D_mn
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1] = -1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1][2*(nnode)*ph + 2*k2 + 1] = -1/2
                            if cplx == 1:  # imaginary residual
                                # A_m and D_mn
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1] = 1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx + 1][2*(nnode)*ph + 2*k2] = 1/2
                                # C_m and B_mn
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode) + 2*ph*nline + 2*line_idx] = -1/2
                                H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*3*(nnode) + 2*ph*nline + 2*line_idx][2*(nnode)*ph + 2*k2 + 1] = -1/2
        # ----------------------- Transformer KCL -----------------------
        tf_no = len(dss.Transformers.AllNames()) - len(dss.RegControls.AllNames())
        count_tf = 0
        count_tf2 = 0
        for i in range(tf_no):
            for ph in range(0, 3):
                k2 = int(tf_bus[1, i])  # in bus index of transformer [out bus: a0] --line-a0-a1-- [in bus: a1]
                if k2 == 0:  # if source bus, need to skip line
                    count_tf += 1
                if k2 != 0 and tf_bus[ph + 2, i] != 0:  # if not source bus, perform KCL
                    line_idx = line_idx_tf[count_tf]
                    # A_m and C_lm
                    H[2*ph*(nnode-1) + (k2-1)*2][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*line_idx] = 1/2
                    H[2*ph*(nnode-1) + (k2-1)*2][2*3*(nnode+nline) + 2*line_idx][2*(nnode)*ph + 2*k2] = 1/2
                    # B_m and D_lm
                    H[2*ph*(nnode-1) + (k2-1)*2][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*line_idx + 1] = 1/2
                    H[2*ph*(nnode-1) + (k2-1)*2][2*3*(nnode+nline) + 2*line_idx + 1][2*(nnode)*ph + 2*k2 + 1] = 1/2
                    # A_m, D_lm
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*line_idx + 1] = -1/2
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*line_idx + 1][2*(nnode)*ph + 2*k2] = -1/2
                    # B_m and C_lm
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*line_idx] = 1/2
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*line_idx][2*(nnode)*ph + 2*k2 + 1] = 1/2
                    count_tf += 1  # go to next line

        for j in range(tf_no):  # fill in H for the outlines
            for ph in range(0, 3):
                k2 = int(tf_bus[0, j])  # out bus index of transformer
                if k2 == 0:
                    count_tf2 += 1
                if k2 != 0 and tf_bus[ph + 2, j] != 0:
                    line_idx = line_idx_tf[count_tf2]
                    # real residual
                    # A_m and C_mn
                    H[2*ph*(nnode-1) + (k2-1)*2][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*line_idx] = -1/2
                    H[2*ph*(nnode-1) + (k2-1)*2][2*3*(nnode+nline) + 2*line_idx][2*(nnode)*ph + 2*k2] = -1/2
                    # B_m and D_mn
                    H[2*ph*(nnode-1) + (k2-1)*2][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*line_idx + 1] = -1/2
                    H[2*ph*(nnode-1) + (k2-1)*2][2*3*(nnode+nline) + 2*line_idx + 1][2*(nnode)*ph + 2*k2 + 1] = -1/2
                    # imaginary residual
                    # A_m and D_mn
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*line_idx + 1] = 1/2
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*line_idx + 1][2*(nnode)*ph + 2*k2] = 1/2
                    # C_m and B_mn
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*line_idx] = -1/2
                    H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*line_idx][2*(nnode)*ph + 2*k2 + 1] = -1/2
                    count_tf2 += 1
        # ----------------------- Voltage Regulator KCL -----------------------
        vr_count = len(dss.RegControls.AllNames())
        count_vr = 0
        count_vr2 = 0
        if vr_count > 0:
            for i in range(vr_count):  # in lines
                for ph in range(0, 3):
                    k2 = int(vr_bus[1, i])
                    if k2 == 0:
                        count_vr += 1
                    if k2 != 0 and vr_bus[ph + 2, i] != 0:
                        line_idx = line_in_idx_vr[count_vr]
                        # real residual
                        # A_m and C_lm
                        H[2*ph*(nnode-1) + (k2-1)*2 + 0][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx] = 1/2
                        H[2*ph*(nnode-1) + (k2-1)*2 + 0][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx][2*(nnode)*ph + 2*k2] = 1/2
                        # B_m and D_lm
                        H[2*ph*(nnode-1) + (k2-1)*2 + 0][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1] = 1/2
                        H[2*ph*(nnode-1) + (k2-1)*2 + 0][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1][2*(nnode)*ph + 2*k2 + 1] = 1/2
                        # imaginary residual
                        # A_m, D_lm
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1] = -1/2
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1][2*(nnode)*ph + 2*k2] = -1/2
                        # B_m and C_lm
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx] = 1/2
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx][2*(nnode)*ph + 2*k2 + 1] = 1/2
                        count_vr += 1

            for j in range(vr_count):  # out lines
                for ph in range(0, 3):
                    k2 = int(vr_bus[0, j])
                    if k2 == 0:
                        count_vr2 += 1
                    if k2 != 0 and vr_bus[ph + 2, j] != 0:
                        line_idx = line_out_idx_vr[count_vr2]
                        # real residual
                        # A_m and C_mn
                        H[2*ph*(nnode-1) + (k2-1)*2][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx] = -1/2
                        H[2*ph*(nnode-1) + (k2-1)*2][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx][2*(nnode)*ph + 2*k2] = -1/2
                        # B_m and D_mn
                        H[2*ph*(nnode-1) + (k2-1)*2][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1] = -1/2
                        H[2*ph*(nnode-1) + (k2-1)*2][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1][2*(nnode)*ph + 2*k2 + 1] = -1/2
                        # imaginary residual
                        # A_m and D_mn
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1] = 1/2
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx + 1][2*(nnode)*ph + 2*k2] = 1/2
                        # C_m and B_mn
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*(nnode)*ph + 2*k2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx] = -1/2
                        H[2*ph*(nnode-1) + (k2-1)*2 + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_idx][2*(nnode)*ph + 2*k2 + 1] = -1/2
                        count_vr2 += 1
        # ----------------------- Linear Term & Constant Term -----------------------
        for ph in range(0, 3):
            for k2 in range(1, len(dss.Circuit.AllBusNames())):
                for cplx in range(0, 2):
                    available_phases = bp[dss.Circuit.AllBusNames()[k2]]  # phase array at specific bus
                    idxbs = dss.Circuit.AllBusNames().index(dss.Circuit.AllBusNames()[k2])
                    if cplx == 0:
                        load_val = load_kw[ph][idxbs]
                    else:
                        load_val = load_kvar[ph][idxbs]

                    # Linear terms
                    g_temp = np.zeros(2*3*(nnode+nline) + 2*tf_lines +
                                      2*2*vr_lines, dtype=float)
                    if available_phases[ph] == 0:  # if phase does not exist
                        g_temp[2*(ph)*nnode + 2*k2 + cplx] = 1
                    g[2*(nnode-1)*ph + 2*(k2-1) + cplx, 0, :] = g_temp

                    # Constant terms
                    if cplx == 0:
                        if der.real != 0:
                            b_factor = der.real
                            b_factor = 0
                        else:
                            b_factor = 0
                    else:
                        if capacitance != 0 or der.imag != 0:
                            b_factor = capacitance - der.imag
                            b_factor = 0
                        else:
                            b_factor = caparr[ph][k2]

                    if available_phases[ph] == 0:  # if phase does not exist
                        b_temp = 0
                    else:
                        b_temp = (-load_val * beta_S) + (b_factor * gamma_S)
                    b[2*(nnode-1)*ph + 2*(k2-1) + cplx][0][0] = b_temp
        self.H, self.g, self.b = H, g, b
    def _init_KVL_matrices_vregs(self):
        """
        set H_reg, G_reg
        copied from 20180601/PYTHON/lib/compute_SBKVL_matrices.py
        written by @kathleenchang
        """
        # ---------- Voltage Regulator -----------
        tf_bus = self.circuit.transformers.get_bus_ph_matrix()
        vr_bus = self.circuit.voltage_regulators.get_bus_ph_matrix()
        tf_lines = self.circuit.transformers.get_num_lines_x_ph()
        tf_no = self.circuit.transformers.num_elements
        vr_lines = self.circuit.voltage_regulators.get_num_lines_x_ph()
        vr_count = self.circuit.voltage_regulators.num_elements
        nnode = self.circuit.buses.num_elements
        nline = self.circuit.lines.num_elements
        gain = self.circuit.voltage_regulators.get_gain_matrix()

        H_reg = np.zeros((2*vr_lines, 2*3*(nnode+nline) + 2*tf_lines +
                          2*2*vr_lines, 2*3*(nnode+nline) +
                          2*tf_lines + 2*2*vr_lines), dtype=float)
        G_reg = np.zeros((2*vr_lines, 2*3*(nnode+nline) + 2*tf_lines +
                          2*2*vr_lines), dtype=float)

        # voltage ratio: V_bus2 - gamma V_bus1 = 0
        line_in_idx = range(0, 2*vr_lines, 2)
        line_out_idx = range(1, 2*vr_lines, 2)

        vr_counter = 0
        for m in range(vr_count):
            bus1_idx = vr_bus[0, m]
            bus2_idx = vr_bus[1, m]
            for ph in range(0, 3):
                if vr_bus[ph + 2, m] != 0:
                    # voltage gain: gamma*A_in = A_out
                    #               gamma*B_in = B_out
                    # Negative not shown below because inserted into gain term
                    G_reg[2*vr_counter][2*nnode*ph + 2*bus1_idx] = gain[m]  # A_in
                    G_reg[2*vr_counter][2*nnode*ph + 2*bus2_idx] = 1  # A_out
                    G_reg[2*vr_counter + 1][2*nnode*ph + 2*bus1_idx + 1] = gain[m]  # B_in
                    G_reg[2*vr_counter + 1][2*nnode*ph + 2*bus2_idx + 1] = 1  # B_out

                    # conservation of power: V_bus1 (I_bus1,out)* - V_bus2 (I_bus2,in)* = 0
                    # A_1 * C_out + B_1 * D_out - (A_2 * C_in + B_2 * D_in)
                    # j(B_1 * C_out - A_1 * D_out) - j(B_2 * C_in - A_2 * D_in)

                    # A_1 C_out
                    H_reg[2*vr_counter][2*nnode*ph + 2*bus1_idx][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter]] = 1
                    H_reg[2*vr_counter][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter]][2*nnode*ph + 2*bus1_idx] = 1
                    # B_1 D_out
                    H_reg[2*vr_counter][2*nnode*ph + 2*bus1_idx + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter] + 1] = 1
                    H_reg[2*vr_counter][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter] + 1][2*nnode*ph + 2*bus1_idx + 1] = 1
                    # A_2 C_in
                    H_reg[2*vr_counter][2*nnode*ph + 2*bus2_idx][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter]] = -1
                    H_reg[2*vr_counter][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter]][2*nnode*ph + 2*bus2_idx] = -1
                    # B_2 D_in
                    H_reg[2*vr_counter][2*nnode*ph + 2*bus2_idx + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter] + 1] = -1
                    H_reg[2*vr_counter][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter] + 1][2*nnode*ph + 2*bus2_idx + 1] = -1
                    # B_1 * C_out
                    H_reg[2*vr_counter + 1][2*nnode*ph + 2*bus1_idx + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter]] = 1
                    H_reg[2*vr_counter + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter]][2*nnode*ph + 2*bus1_idx + 1] = 1
                    # A_1 * D_out
                    H_reg[2*vr_counter + 1][2*nnode*ph + 2*bus1_idx][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter] + 1] = -1
                    H_reg[2*vr_counter + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_out_idx[vr_counter] + 1][2*nnode*ph + 2*bus1_idx] = -1
                    # B_2 * C_in
                    H_reg[2*vr_counter + 1][2*nnode*ph + 2*bus2_idx + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter]] = -1
                    H_reg[2*vr_counter + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter]][2*nnode*ph + 2*bus2_idx + 1] = -1
                    # A_2 * D_in
                    H_reg[2*vr_counter + 1][2*nnode*ph + 2*bus2_idx][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter] + 1] = 1
                    H_reg[2*vr_counter + 1][2*3*(nnode+nline) + 2*tf_lines + 2*line_in_idx[vr_counter] + 1][2*nnode*ph + 2*bus2_idx] = 1
                    vr_counter += 1
        self.H_reg, self.G_reg = H_reg, G_reg
    def change_KCL_matrices(self, der=0, capacitance=0, t=-1):
        H, g, b = self.H, self.g, self.b
        wpu = self.circuit.get_wpu_matrix()
        nnode = self.circuit.buses.num_elements

        # load_kw, load_kvar = nominal_load_values(t)
        # caparr = nominal_cap_arr()
        load_kw = self.circuit.loads.get_ppu_matrix()
        load_kvar = self.circuit.loads.get_qpu_matrix()
        caparr = self.circuit.get_cappu_matrix()

        # ---------- Residuals for KCL at a bus (m) ----------
        # Zip Parameters
        beta_S = Circuit.aPQ_p
        beta_I = Circuit.aI_p
        beta_Z = Circuit.aZ_p
        # Capacitors
        gamma_S = Circuit.aPQ_p
        gamma_I = Circuit.aI_p
        gamma_Z = Circuit.aZ_p

        # Quadratic Terms
        # Time varying load
        if t != -1:
            for ph in range(0, 3):
                for k2 in range(1, nnode):  # skip slack bus
                    bus = self.circuit.buses.get_element(k2)
                    # dss.Circuit.SetActiveBus(dss.Circuit.AllBusNames()[k2])  # set the bus
                    available_phases = bus.phase_matrix  # phase array at specific bus
                    for cplx in range(0, 2):
                        if available_phases[ph] == 1:  # quadratic terms
                            H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2][2 *
                                (nnode)*ph + 2*k2] *= (1 + 0.1*np.sin(2*np.pi*0.01*t))
                            # -load_val * (beta_Z + (0.5 * beta_I * hessian_mag[0][0]))  # TE replace assignment w/ -load_val * beta_Z; a**2
                            H[2*ph*(nnode-1) + (k2-1)*2 + cplx][2*(nnode)*ph + 2*k2 + 1][2 *
                                (nnode)*ph + 2*k2 + 1] *= (1 + 0.1*np.sin(2*np.pi*0.01*t))
                            # -load_val * (beta_Z + (0.5 * beta_I * hessian_mag[1][1]))  # TE replace assignment w/ -load_val * beta_Z; b**2

        # Constant Term
        if t != -1 or der != 0 or capacitance != 0:
            for ph in range(0, 3):
                for k2 in range(1, nnode):
                    bus = self.circuit.buses.get_element(k2)
                    available_phases = bus.phase_matrix
                    if available_phases[ph] == 1:
                        for cplx in range(0, 2):
                            if cplx == 0:
                                load_val = load_kw[ph][k2]
                                if der.real != 0:
                                    b_factor = der.real
                                else:
                                    b_factor = 0
                            else:
                                load_val = load_kvar[ph][k2]
                                if capacitance != 0 or der.imag != 0:
                                    b_factor = capacitance - der.imag
                                else:
                                    b_factor = caparr[ph][k2]

                            if t != -1:
                                print('Should not enter')
                                load_val *= (1 + 0.1*np.sin(2*np.pi*0.01*t))

                            b_temp = (-load_val * beta_S) + \
                                (b_factor * gamma_S)
                            # TODO: resolve numpy warning here
                            b[2*(nnode-1)*ph + 2*(k2-1) +
                              cplx][0][0] = b_temp - wpu[ph][k2]
        return H, b
    def solve(self):
        """
        Solves powerflow once, updates self.XNR with solved XNR
        From src/nr3_python/lib/NR3.py
        Written by @kathleenchang
        """
        # get pointers to basematrices
        XNR, g_SB, b_SB = self.XNR, self.g_SB, self.b_SB
        G_KVL, b_KVL = self.G_KVL, self.b_KVL
        H, g, b = self.H, self.g, self.b
        H_reg, G_reg = self.H_reg, self.G_reg

        # get index and Sbase info from self.circuit
        nline = self.circuit.lines.num_elements
        nnode = self.circuit.buses.num_elements
        Sbase = self.circuit.Sbase
        tf_lines = self.circuit.transformers.get_num_lines_x_ph()
        vr_lines = self.circuit.voltage_regulators.get_num_lines_x_ph()
        tol = self.__class__.CONVERGENCE_TOLERANCE
        maxiter = self.__class__.maxiter

        FT = 1e99
        itercount = 0

        # solve power-flow
        # note: the loop uses a fixed 1e-9 threshold rather than `tol`
        while np.amax(np.abs(FT)) >= 1e-9 and itercount < maxiter:
            # print("Iteration number for Original NR3 %f" % (itercount))
            FT = compute_NR3FT(XNR, g_SB, b_SB, G_KVL, b_KVL, H,
                               g, b, nnode, nline, H_reg, G_reg, vr_lines)
            JT = compute_NR3JT(XNR, g_SB, G_KVL, H, g, nnode,
                               nline, H_reg, G_reg, tf_lines, vr_lines)
            if JT.shape[0] >= JT.shape[1]:
                XNR = XNR - np.linalg.inv(JT.T@JT)@JT.T@FT
            itercount += 1

        XNR_final = XNR
        self.XNR = XNR
        self.converged = True
        self.map_XNR()
    def map_XNR(self):
        """
        Set self.V, self.I, self.Stx, self.Srx, self.i, self.s
        based on the current value of self.XNR
        Written by @kathleenchang
        """
        TXnum = self.circuit.get_tx_idx_matrix()
        RXnum = self.circuit.get_rx_idx_matrix()
        nnode = self.circuit.buses.num_elements
        nline = self.circuit.lines.num_elements
        PH = self.circuit.buses.get_phase_matrix(self._orient)
        spu = self.circuit.get_spu_matrix()
        APQ = self.circuit.get_aPQ_matrix()
        AZ = self.circuit.get_aZ_matrix()
        AI = self.circuit.get_aI_matrix()
        cappu = self.circuit.get_cappu_matrix()
        wpu = self.circuit.get_wpu_matrix()
        XNR = self.XNR

        # TODO: can/should we include transformers and voltage regulators
        # in INR?
        VNR, INR, STXNR, SRXNR, iNR, sNR = map_output(self.circuit, XNR)
        self.V = VNR
        self.Vmag = np.abs(VNR)
        self.I = INR
        self.Stx = STXNR
        self.Srx = SRXNR
        self.i_Node = iNR
        self.sV = sNR
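The update inside `solve()` above is a Gauss-Newton step on the residual vector: `x <- x - (J^T J)^{-1} J^T F`. The small self-contained sketch below (not part of the original module; it uses a toy residual of my own choosing) shows the same step written with `np.linalg.lstsq`, which solves the same least-squares problem without forming the explicit inverse.

```python
import numpy as np


def gauss_newton_step(x, F, J):
    # Solve J @ dx = F in the least-squares sense, then step:
    # equivalent to dx = inv(J.T @ J) @ J.T @ F when J has full column rank.
    dx, *_ = np.linalg.lstsq(J, F, rcond=None)
    return x - dx


# Toy residual: F(x) = [x0**2 - 4, x1 - 1], with a root at (2, 1).
x = np.array([3.0, 0.0])
for _ in range(20):
    F = np.array([x[0]**2 - 4, x[1] - 1])
    J = np.array([[2*x[0], 0.0],
                  [0.0, 1.0]])  # Jacobian of F
    x = gauss_newton_step(x, F, J)

print(x)  # converges to [2. 1.]
```

For square, well-conditioned Jacobians both forms give the same iterates; the least-squares form is simply the numerically safer way to write the step.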
# File: nightson/handlers/eventsownedbyuser.py (repo: vswamy/nightson, license: Apache-2.0)
from __future__ import absolute_import
import httplib

from nightson.handlers.auth import AuthHandler
from nightson.managers.events_owned_by_user_manager import EventsOwnedByUserManager
from tornado import gen
import ujson


class EventsOwnedByUser(AuthHandler):

    @gen.coroutine
    def get(self):
        ''' Returns the events owned by the current user. '''
        events_owned_by_user_manager = EventsOwnedByUserManager(self.request)
        result = yield events_owned_by_user_manager.get_events(self.current_user)
        self.set_status(httplib.OK)
        self.write(ujson.dumps(result))
# File: ogreserver/forms/profile_edit.py (repo: oii/ogreserver, license: BSD-2-Clause-FreeBSD)
from __future__ import absolute_import
from __future__ import unicode_literals

from flask_wtf import FlaskForm
from wtforms import BooleanField, SelectField, validators
from wtforms.fields.html5 import EmailField


class ProfileEditForm(FlaskForm):
    email = EmailField('Email Address', [validators.Required(), validators.Email()])
    preferred_ebook_format = SelectField(
        'Preferred Ebook Format',
        choices=[('-', '-'), ('Kindle AZW Format', '.azw3'), ('ePub Format', '.epub')],
    )
    dont_email_me = BooleanField("Don't email me ever")
    advanced = BooleanField('Display extra ebook metadata')
# File: Primeira Lista/exercicio2_biblioteca.py (repo: marianadick/POO-2, license: MIT)
'''Create the classes needed for a library management system. Librarians will fill in
the system with each book's title, authors, year, publisher, edition and volume. The
library will also have a search system (separate software), so the typical search
attributes (name, author, ...) must stay accessible.'''
class Livro():
def __init__(self, titulo: str, autores: str, ano: str, editora: str, edicao=1, volume=1):
self.__titulo = titulo
self.__autores = autores
self.__ano = ano
self.__editora = editora
self.__edicao = edicao
self.__volume = volume
@property
def titulo(self):
return self.__titulo
@property
def autores(self):
return self.__autores
@property
def ano(self):
return self.__ano
@property
def editora(self):
return self.__editora
@property
def edicao(self):
return self.__edicao
@property
def volume(self):
return self.__volume
class Biblioteca():
def __init__(self, livros: list):
self.__livros = livros
@property
def livros(self):
return self.__livros
def adicionar_livro(self, livro: Livro):
self.livros.append(livro)
def exibir_livros(self):
c = 0
for livro in self.livros:
c += 1
print(f'{c}. {livro.titulo} ({livro.ano}), por {livro.autores}, editora {livro.editora}, ed. {livro.edicao}, vol. {livro.volume}')
def main():
livro1 = Livro('Admirável mundo novo', 'Aldous Huxley', '1932', 'Biblioteca Azul')
    livro2 = Livro('Morte na Mesopotâmia', 'Agatha Christie', '1936', 'Arqueiro', 19)
    livro3 = Livro('Life, the Universe and Everything', 'Douglas Adams', '1982', 'Del Rey', 1, 3)
livro4 = Livro('Cidades de Papel', 'John Green', '2008', 'Intrinseca')
    biblioteca1 = Biblioteca([livro1, livro2, livro3])
    biblioteca1.adicionar_livro(livro4)
    biblioteca1.exibir_livros()
main() | 31.136364 | 142 | 0.639416 | 251 | 2,055 | 5.087649 | 0.414343 | 0.060298 | 0.076742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022668 | 0.248662 | 2,055 | 66 | 143 | 31.136364 | 0.802461 | 0.174696 | 0 | 0.145833 | 0 | 0.020833 | 0.189237 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.145833 | 0.4375 | 0.020833 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
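The docstring above requires that a separate search system can reach the lookup attributes, which is why every field is exposed as a read-only property. A minimal sketch of such a search over those attributes (the `buscar_livros` helper is an illustration added here, not part of the original exercise; `Livro` is re-declared in simplified form so the snippet is self-contained):

```python
class Livro:
    def __init__(self, titulo, autores, ano):
        self.titulo = titulo
        self.autores = autores
        self.ano = ano

def buscar_livros(livros, termo):
    """Return books whose title or authors contain `termo`, case-insensitively."""
    termo = termo.lower()
    return [livro for livro in livros
            if termo in livro.titulo.lower() or termo in livro.autores.lower()]

acervo = [
    Livro('Admirável mundo novo', 'Aldous Huxley', '1932'),
    Livro('Cidades de Papel', 'John Green', '2008'),
]
resultado = buscar_livros(acervo, 'huxley')
print([livro.titulo for livro in resultado])  # ['Admirável mundo novo']
```

The same function works unchanged against the real `Livro` class, since its `@property` accessors expose `titulo` and `autores` under the same names.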
90a1f8bc842a60494f885c4e11d6986f21a8085b | 2,630 | py | Python | V-sem/PatternMatchingUsingPythonProgramming/LAB_ASSIGNMENTS/Assignment1(29SEP2020)/Solution5.py | wasitshafi/JMI | 892d603e2abe58fc8b3744b0db07ee988e4ded5c | [
"MIT"
] | 3 | 2020-03-18T16:27:33.000Z | 2021-06-07T12:39:32.000Z | V-sem/PatternMatchingUsingPythonProgramming/LAB_ASSIGNMENTS/Assignment1(29SEP2020)/Solution5.py | wasitshafi/JMI | 892d603e2abe58fc8b3744b0db07ee988e4ded5c | [
"MIT"
] | null | null | null | V-sem/PatternMatchingUsingPythonProgramming/LAB_ASSIGNMENTS/Assignment1(29SEP2020)/Solution5.py | wasitshafi/JMI | 892d603e2abe58fc8b3744b0db07ee988e4ded5c | [
"MIT"
] | 10 | 2019-11-11T06:49:10.000Z | 2021-06-07T12:41:20.000Z | # Q5. Write a program to implement OOPs concepts in python i.e. inheritance etc.
class Person:
def __init__(self, name, age, address): # constructor
self.name = name
self.age = age
self.address = address
def set_data(self, name, age, address):
self.name = name
self.age = age
self.address = address
def get_data(self):
print('Name : ', self.name)
print('Age : ', self.age)
print('Address : ', self.address)
def get_name(self):
return self.name
def get_age(self):
return self.age
def get_address(self):
return self.address
class Student(Person):
def __init__(self, name, age, address, rollno, course, marksObtained, maximumMarks): # constructor
super().__init__(name, age, address)
self.rollno = rollno
self.course = course
self.marksObtained = marksObtained
self.maximumMarks = maximumMarks
def set_data(self, rollno, course, marksObtained, maximumMarks):
self.rollno = rollno
self.course = course
self.marksObtained = marksObtained
self.maximumMarks = maximumMarks
def get_data(self):
print('Name : ', super().get_name())
print('Age : ', super().get_age())
print('Address : ', super().get_address())
print('Rollno : ', self.rollno)
print('Course : ', self.course)
print('Marks Obtained : ', self.marksObtained)
print('Maximum Marks : ', self.maximumMarks)
print('Percentage : ', '{:.2f}'.format(self.get_percentage()), '%')
def get_rollno(self):
return self.rollno
def get_course(self):
return self.course
def get_marksObtained(self):
return self.marksObtained
    def get_maximumMarks(self):
        return self.maximumMarks
def get_percentage(self):
        return self.marksObtained / self.maximumMarks * 100
print('PERSON DETAILS : ')
name = input('Enter Name of Person : ')
age = int(input('Enter Age of ' + name + ' : '))
address = input('Enter Address of ' + name +' : ')
p1 = Person(name, age, address) # Object of Person Class
print('\nSTUDENT DETAILS : ')
name = input('Enter Name of Student: ')
age = int(input('Enter Age of Student : '))
address = input('Enter Address of Student: ')
rollno = input('Enter Rollno of Student : ')
course = input('Enter Course of Student : ')
marksObtained = float(input('Enter Marks Obtained of Student : '))
maximumMarks = float(input('Enter Maximum Marks of Student : '))
s1 = Student(name, age, address, rollno, course, marksObtained, maximumMarks)
# Printing Person Details
print("\n\nPerson Details are as :")
p1.get_data()
# Printing Student Details
print("\n\nStudent Details are as :")
s1.get_data(); | 29.550562 | 100 | 0.671103 | 329 | 2,630 | 5.276596 | 0.173252 | 0.057604 | 0.056452 | 0.031106 | 0.351382 | 0.321429 | 0.243088 | 0.156682 | 0.156682 | 0.156682 | 0 | 0.004267 | 0.198099 | 2,630 | 89 | 101 | 29.550562 | 0.818872 | 0.06616 | 0 | 0.235294 | 0 | 0 | 0.188571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205882 | false | 0 | 0 | 0.117647 | 0.352941 | 0.220588 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
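One subtlety worth isolating from the class above: instance attributes must be read through `self.`; a bare name like `marksObtained` inside a method is looked up as a local or global and raises `NameError` at call time. A stand-alone illustration of the difference:

```python
class Student:
    def __init__(self, marksObtained, maximumMarks):
        self.marksObtained = marksObtained
        self.maximumMarks = maximumMarks

    def broken_percentage(self):
        # NameError at call time: bare names are not resolved against the instance
        return marksObtained / maximumMarks * 100

    def percentage(self):
        return self.marksObtained / self.maximumMarks * 100

s = Student(81, 90)
print('{:.2f}'.format(s.percentage()))  # 90.00
try:
    s.broken_percentage()
except NameError as e:
    print('broken:', e)
```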
90a3e79347ce4b67f718ac80b27f353b2d1059f9 | 1,371 | py | Python | test/test_pipeline.py | urielcaire/mult | 45c0cba69153442be2cee6309d46d55086445e5c | [
"Apache-2.0"
] | 12 | 2020-10-13T01:27:35.000Z | 2021-11-22T13:42:26.000Z | test/test_pipeline.py | urielcaire/mult | 45c0cba69153442be2cee6309d46d55086445e5c | [
"Apache-2.0"
] | null | null | null | test/test_pipeline.py | urielcaire/mult | 45c0cba69153442be2cee6309d46d55086445e5c | [
"Apache-2.0"
] | 2 | 2020-10-12T13:40:41.000Z | 2022-03-29T05:28:59.000Z | # Copyright 2020 The MuLT Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
import unittest
from pipeline import SMLA
from optimization import *
from lightgbm import LGBMModel
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
class PipelineTest(unittest.TestCase):
def setUp(self):
        self.X, self.y = load_wine(return_X_y=True)
self.X, self.y = self.X[(self.y == 0) | (self.y == 1), :], self.y[(self.y == 0) | (self.y == 1)]
def test_smla(self):
smla = SMLA(LGBMModel, LightGBMOptimizer)
x_train, y_train, x_valid, y_valid = train_test_split(self.X, self.y, stratify=self.y)
smla.fit(x_train, y_train, x_valid, y_valid, early_stopping_rounds=100)
smla.predict(x_valid)
| 34.275 | 104 | 0.680525 | 199 | 1,371 | 4.592965 | 0.512563 | 0.049234 | 0.039387 | 0.043764 | 0.078775 | 0.078775 | 0.052516 | 0.052516 | 0 | 0 | 0 | 0.013251 | 0.174325 | 1,371 | 39 | 105 | 35.153846 | 0.79417 | 0.477024 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.4 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
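The `setUp` above keeps only wine classes 0 and 1 with a NumPy boolean mask; the same two-class selection expressed in plain Python (no sklearn or NumPy required, shown only to make the masking step explicit):

```python
def keep_classes(X, y, classes=(0, 1)):
    """Filter feature rows X and labels y down to the requested classes."""
    pairs = [(row, label) for row, label in zip(X, y) if label in classes]
    X_kept = [row for row, _ in pairs]
    y_kept = [label for _, label in pairs]
    return X_kept, y_kept

X = [[1.0], [2.0], [3.0], [4.0]]
y = [0, 2, 1, 0]
X_kept, y_kept = keep_classes(X, y)
print(X_kept, y_kept)  # [[1.0], [3.0], [4.0]] [0, 1, 0]
```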
90aba2f13a99e3125be8f65c72041aedd9bfda44 | 1,118 | py | Python | examples/ff/ff.py | syccc-ethan/sensitivity-analysis-model | b37eb9e7771752a7f7ce10604cd4277e955a1075 | [
"MIT"
] | null | null | null | examples/ff/ff.py | syccc-ethan/sensitivity-analysis-model | b37eb9e7771752a7f7ce10604cd4277e955a1075 | [
"MIT"
] | null | null | null | examples/ff/ff.py | syccc-ethan/sensitivity-analysis-model | b37eb9e7771752a7f7ce10604cd4277e955a1075 | [
"MIT"
] | 1 | 2021-09-06T05:09:54.000Z | 2021-09-06T05:09:54.000Z | import sys
from SALib.analyze.ff import analyze
from SALib.sample.ff import sample
from SALib.util import read_param_file
sys.path.append('../..')
# Read the parameter range file and generate samples
problem = read_param_file('../../src/SALib/test_functions/params/Ishigami.txt')
# or define manually without a parameter file:
# problem = {
# 'num_vars': 3,
# 'names': ['x1', 'x2', 'x3'],
# 'groups': None,
# 'bounds': [[-3.14159265359, 3.14159265359],
# [-3.14159265359, 3.14159265359],
# [-3.14159265359, 3.14159265359]]
# }
# Generate samples
X = sample(problem)
# Run the "model" -- this will happen offline for external models
Y = X[:, 0] + (0.1 * X[:, 1]) + ((1.2 * X[:, 2]) * (0.2 + X[:, 0]))
# Perform the sensitivity analysis using the model output
analyze(problem, X, Y, second_order=True, print_to_console=True)
# Returns a dictionary with keys 'ME' (main effect) and 'IE' (interaction effect)
# The techniques bulks out the number of parameters with dummy parameters to the
# nearest 2**n. Any results involving dummy parameters should be treated with
# a sceptical eye.
| 33.878788 | 81 | 0.685152 | 162 | 1,118 | 4.67284 | 0.574074 | 0.095112 | 0.085865 | 0.15852 | 0.095112 | 0.095112 | 0.095112 | 0.095112 | 0.095112 | 0.095112 | 0 | 0.093851 | 0.170841 | 1,118 | 32 | 82 | 34.9375 | 0.722762 | 0.623435 | 0 | 0 | 1 | 0 | 0.136476 | 0.124069 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.444444 | 0 | 0.444444 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
90b15c9ef39d724e0618d9a98ba789da0d692fde | 669 | py | Python | src/admin/widgets/datetime.py | aimanow/sft | dce87ffe395ae4bd08b47f28e07594e1889da819 | [
"Apache-2.0"
] | 280 | 2016-07-19T09:59:02.000Z | 2022-03-05T19:02:48.000Z | widgets/datetime.py | YAR-SEN/GodMode2 | d8a79b45c6d8b94f3d2af3113428a87d148d20d0 | [
"WTFPL"
] | 3 | 2016-07-20T05:36:49.000Z | 2018-12-10T16:16:19.000Z | widgets/datetime.py | YAR-SEN/GodMode2 | d8a79b45c6d8b94f3d2af3113428a87d148d20d0 | [
"WTFPL"
] | 20 | 2016-07-20T10:51:34.000Z | 2022-01-12T23:15:22.000Z | from datetime import datetime, date
from wtforms.fields.html5 import DateTimeField
from godmode.widgets.base import BaseWidget
class DatetimeWidget(BaseWidget):
field = DateTimeField(format="%Y-%m-%d %H:%M:%S")
field_kwargs = {"step": 1}
def render_list(self, item):
value = getattr(item, self.name, None)
try:
if isinstance(value, (date, datetime)):
return value.strftime("%d.%m.%Y %H:%M")
if isinstance(value, str) and value.isnumeric():
return datetime.utcfromtimestamp(int(value)).strftime("%d.%m.%Y %H:%M")
except ValueError:
pass
return value
| 26.76 | 87 | 0.612855 | 81 | 669 | 5.037037 | 0.567901 | 0.014706 | 0.083333 | 0.073529 | 0.088235 | 0.088235 | 0.088235 | 0 | 0 | 0 | 0 | 0.004024 | 0.2571 | 669 | 24 | 88 | 27.875 | 0.816901 | 0 | 0 | 0 | 0 | 0 | 0.073244 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0.0625 | 0.1875 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
90c37347ea910d355b3dd1b0f9601c072795a9b6 | 401 | py | Python | Keras/predict.py | maikelronnau/studies | 794cde51ea69ed6eaee8e38d47dcff6d37fba8b7 | [
"MIT"
] | null | null | null | Keras/predict.py | maikelronnau/studies | 794cde51ea69ed6eaee8e38d47dcff6d37fba8b7 | [
"MIT"
] | 2 | 2018-08-30T20:45:12.000Z | 2018-09-14T02:44:06.000Z | Keras/predict.py | maikelronnau/studies | 794cde51ea69ed6eaee8e38d47dcff6d37fba8b7 | [
"MIT"
] | null | null | null | from keras.models import load_model
from keras.utils import np_utils
import dataset_loader
img_width, img_height = 48, 48
train_data_dir = 'Jaffe'
model = load_model('first_try.h5')
images, labels = dataset_loader.load_dataset_images(train_data_dir, img_width, img_height, load_backup=False, export_dataset=False)
labels = np_utils.to_categorical(labels, 7)
print(model.predict_classes(images))
| 23.588235 | 131 | 0.812968 | 63 | 401 | 4.84127 | 0.507937 | 0.059016 | 0.072131 | 0.111475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016713 | 0.104738 | 401 | 16 | 132 | 25.0625 | 0.832869 | 0 | 0 | 0 | 0 | 0 | 0.042394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
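`np_utils.to_categorical` one-hot encodes the 7 emotion labels, while `predict_classes` returns plain class indices. Going from one-hot rows back to indices is just an argmax, sketched here without any Keras dependency:

```python
def argmax(row):
    """Index of the largest entry in a one-hot (or probability) row."""
    return max(range(len(row)), key=row.__getitem__)

one_hot = [[0, 0, 1, 0, 0, 0, 0],   # class 2
           [1, 0, 0, 0, 0, 0, 0]]   # class 0
labels = [argmax(r) for r in one_hot]
print(labels)  # [2, 0]
```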
90c63938f777d28b9e14a8996c89c2206cd45915 | 823 | py | Python | app/database.py | perymerdeka/fastapi-fullstack-boilerplate | c62c0c41a18cbf28e12ece4ca83091d7d864616d | [
"MIT"
] | 70 | 2021-10-03T22:27:18.000Z | 2022-02-24T06:29:13.000Z | app/database.py | perymerdeka/fastapi-fullstack-boilerplate | c62c0c41a18cbf28e12ece4ca83091d7d864616d | [
"MIT"
] | 2 | 2021-10-08T02:46:56.000Z | 2022-01-02T04:26:31.000Z | app/database.py | perymerdeka/fastapi-fullstack-boilerplate | c62c0c41a18cbf28e12ece4ca83091d7d864616d | [
"MIT"
] | 4 | 2021-10-05T04:41:27.000Z | 2022-01-21T07:39:01.000Z | from contextlib import contextmanager
from typing import Callable
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm.session import Session
from .settings import SQLALCHEMY_URL, SQLALCHEMY_USER, SQLALCHEMY_PASSWORD
SQLALCHEMY_DATABASE_URL = SQLALCHEMY_URL
engine = create_engine(
SQLALCHEMY_DATABASE_URL,
connect_args=dict(user=SQLALCHEMY_USER, password=SQLALCHEMY_PASSWORD),
)
SessionLocal: Callable[..., Session] = sessionmaker(
autocommit=False, autoflush=False, bind=engine
)
Base = declarative_base()
@contextmanager
def get_db_session():
"""Starts a database session as a context manager.
"""
session = SessionLocal()
try:
yield session
finally:
session.close() | 26.548387 | 74 | 0.778858 | 94 | 823 | 6.638298 | 0.425532 | 0.089744 | 0.054487 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151883 | 823 | 31 | 75 | 26.548387 | 0.893983 | 0.057108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0.086957 | 0.304348 | 0 | 0.347826 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
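The `get_db_session` context manager above guarantees `session.close()` runs even when the body raises. The same try/finally pattern, demonstrated with a stand-in session object so no database is required (`FakeSession` is a test double invented here, not part of the app):

```python
from contextlib import contextmanager

class FakeSession:
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

@contextmanager
def get_db_session(factory=FakeSession):
    session = factory()
    try:
        yield session
    finally:
        session.close()  # runs on normal exit and on exception

with get_db_session() as s:
    assert not s.closed  # still open inside the block
print(s.closed)  # True -- closed as soon as the block exits
```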
90d20997f3ea5fc55fe460923b29bebe9b6796d3 | 253 | py | Python | Scripts/python/scripts mundo 1/Desafios/Desafio012.py | BrenoNAlmeida/Scripts-Escola | 20d886d0401ef7f40a4a46e307eadbf5b1c0a5eb | [
"Apache-2.0"
] | null | null | null | Scripts/python/scripts mundo 1/Desafios/Desafio012.py | BrenoNAlmeida/Scripts-Escola | 20d886d0401ef7f40a4a46e307eadbf5b1c0a5eb | [
"Apache-2.0"
] | null | null | null | Scripts/python/scripts mundo 1/Desafios/Desafio012.py | BrenoNAlmeida/Scripts-Escola | 20d886d0401ef7f40a4a46e307eadbf5b1c0a5eb | [
"Apache-2.0"
] | null | null | null | p=float(input('\033[32mqual o preço do produto ? R$\033[m'))
d = (p * 5) / 100
v = p - d
print('\033[34mO desconto em 5 porcento do produto será de \033[31mR${}\033[34m'.format(d))
print('O valor do produto com 5 porcento de desconto é de \033[31mR${}'.format(v)) | 50.6 | 91 | 0.6917 | 52 | 253 | 3.365385 | 0.538462 | 0.154286 | 0.102857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152466 | 0.118577 | 253 | 5 | 92 | 50.6 | 0.632287 | 0 | 0 | 0 | 0 | 0 | 0.69685 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
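The 5% discount above is simply `price * 5 / 100`; a quick check of the arithmetic with a concrete price:

```python
p = 100.0           # product price
d = (p * 5) / 100   # 5% discount
v = p - d           # discounted price
print(d, v)         # 5.0 95.0
```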
90d20b4f8add3703f5cb3ecfdf27458a36925d7a | 3,027 | py | Python | external_system/models.py | Wellheor1/l2 | d980210921c545c68fe9d5522bb693d567995024 | [
"MIT"
] | null | null | null | external_system/models.py | Wellheor1/l2 | d980210921c545c68fe9d5522bb693d567995024 | [
"MIT"
] | null | null | null | external_system/models.py | Wellheor1/l2 | d980210921c545c68fe9d5522bb693d567995024 | [
"MIT"
] | null | null | null | from django.db import models
class FsliRefbookTest(models.Model):
"""
    FSLI reference table: https://nsi.rosminzdrav.ru/#!/refbook/1.2.643.5.1.13.13.11.1080
"""
code_fsli = models.CharField(max_length=20, db_index=True, help_text='Уникальный код ФСЛИ')
code_loinc = models.CharField(max_length=20, help_text='Код LOINC')
title = models.CharField(max_length=1000, db_index=True, help_text='Полное наименование')
english_title = models.CharField(max_length=1000, db_index=True, help_text='Английское наименование')
short_title = models.CharField(max_length=1000, db_index=True, help_text='Краткое наименование')
synonym = models.CharField(max_length=255, help_text='Синоним')
analit = models.CharField(max_length=255, help_text='Аналит')
analit_props = models.CharField(max_length=255, help_text='Свойства аналита')
dimension = models.CharField(max_length=255, help_text='Размерность')
unit = models.CharField(max_length=100, help_text='Единица измерения')
sample = models.CharField(max_length=100, help_text='Образец')
time_characteristic_sample = models.CharField(max_length=100, help_text='Временная характеристика образца')
method_type = models.CharField(max_length=500, help_text='Тип метода')
scale_type = models.CharField(max_length=100, help_text='Тип шкалы измерения')
actual = models.CharField(max_length=100, help_text='Статус')
active = models.BooleanField(default=True, help_text='Единица измерения')
test_group = models.CharField(max_length=100, help_text='Группа тестов')
code_nmu = models.CharField(max_length=100, help_text='Код НМУ')
ordering = models.IntegerField(help_text='Порядок сортировки', blank=True, default=None, null=True)
def __str__(self):
return f"{self.code_fsli} – {self.title}"
class InstrumentalResearchRefbook(models.Model):
"""
    NSI reference table: https://nsi.rosminzdrav.ru/#!/refbook/1.2.643.5.1.13.13.11.1471/
"""
code_nsi = models.CharField(default='', max_length=20, db_index=True, help_text='Уникальный код')
title = models.CharField(default='', max_length=1000, db_index=True, help_text='Полное наименование')
method = models.CharField(default='', max_length=300, db_index=True, help_text='Метод')
area = models.CharField(default='', max_length=300, db_index=True, help_text='Область')
localization = models.CharField(default='', max_length=300, db_index=True, help_text='Локализация')
code_nmu = models.CharField(default='', max_length=300, db_index=True, help_text='Код исследования НМУ')
def __str__(self):
return f"{self.code_nsi} – {self.title}-{self.area}-{self.code_nmu}"
class BodySiteRefbook(models.Model):
"""
    Examination body site: 1.2.643.2.69.1.1.1.57/
"""
code = models.CharField(max_length=20, db_index=True, help_text='Код')
title = models.CharField(max_length=1000, db_index=True, help_text='Полное наименование')
def __str__(self):
return f"{self.code} – {self.title}"
| 51.305085 | 111 | 0.73406 | 420 | 3,027 | 5.078571 | 0.257143 | 0.101266 | 0.160338 | 0.213783 | 0.616034 | 0.568214 | 0.565401 | 0.38256 | 0.344116 | 0.344116 | 0 | 0.046573 | 0.127519 | 3,027 | 58 | 112 | 52.189655 | 0.759939 | 0.07334 | 0 | 0.135135 | 0 | 0 | 0.177238 | 0.014498 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081081 | false | 0 | 0.027027 | 0.081081 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
90d7fe10aeb70963477192585a9a24b4c7be7f5c | 6,413 | py | Python | app/endpoints/auth.py | a283910020/yummy-rest | 066f48e0219ac2c64daea0b393f62d921be4394d | [
"MIT"
] | 4 | 2018-11-16T21:43:25.000Z | 2020-11-23T08:26:05.000Z | app/endpoints/auth.py | a283910020/yummy-rest | 066f48e0219ac2c64daea0b393f62d921be4394d | [
"MIT"
] | 12 | 2017-12-22T09:10:51.000Z | 2018-06-17T17:16:16.000Z | app/endpoints/auth.py | a283910020/yummy-rest | 066f48e0219ac2c64daea0b393f62d921be4394d | [
"MIT"
] | 5 | 2018-03-02T08:03:28.000Z | 2020-05-25T14:59:59.000Z | """The API routes"""
from datetime import datetime, timedelta
from werkzeug.security import check_password_hash, generate_password_hash
from flask import jsonify, request, make_response
from flask_restplus import Resource
from flask_jwt import jwt
from app import APP
from app.helpers import decode_access_token
from app.helpers.validators import UserSchema
from app.restplus import API
from app.models import db, User, BlacklistToken
from app.serializers import add_user, login_user, password_reset
# Linting exceptions
# pylint: disable=C0103
# pylint: disable=W0702
# pylint: disable=W0703
# pylint: disable=E1101
# pylint: disable=R0201
auth_ns = API.namespace('auth', description="Authentication/Authorization operations.")
@auth_ns.route('/register')
class RegisterHandler(Resource):
"""
This class handles user account creation.
"""
@API.expect(add_user)
def post(self):
"""
Registers a new user account.
"""
data = request.get_json()
# Instanciate user schema
user_schema = UserSchema()
data, errors = user_schema.load(data)
if errors:
response_obj = dict(
errors=errors
)
return make_response(jsonify(response_obj), 422)
# Check if user exists
email = data['email'].lower()
user = User.query.filter_by(email=email).first()
if not user:
try:
new_user = User(
email=data['email'], username=data['username'], password=data['password']
)
db.session.add(new_user)
db.session.commit()
return make_response(jsonify({'message': 'Registered successfully!'}), 201)
except:
response = {"message": "Username already taken, please choose another."}
return make_response(jsonify(response), 401)
else:
response = jsonify({'message': 'User already exists. Please Log in instead.'})
return make_response(response, 400)
@auth_ns.route('/login')
class LoginHandler(Resource):
"""
This class handles user login
"""
@API.expect(login_user)
def post(self):
"""
User Login/SignIn route
"""
login_info = request.get_json()
if not login_info:
return make_response(jsonify({'message': 'Input payload validation failed'}), 400)
try:
user = User.query.filter_by(email=login_info['email']).first()
if not user:
return make_response(jsonify({"message": 'User does not exist!'}), 404)
if check_password_hash(user.password, login_info['password']):
payload = {
'exp': datetime.utcnow() + timedelta(weeks=3),
'iat': datetime.utcnow(),
'sub': user.public_id
}
token = jwt.encode(
payload,
APP.config['SECRET_KEY'],
algorithm='HS256'
)
print(user.username)
return jsonify({"message": "Logged in successfully.",
"access_token": token.decode('UTF-8'),
"username": user.username
})
return make_response(jsonify({"message": "Incorrect credentials."}), 401)
except Exception as e:
print(e)
return make_response(jsonify({"message": "An error occurred. Please try again."}), 501)
@auth_ns.route('/logout')
class LogoutHandler(Resource):
"""
This class handles user logout
"""
def post(self):
"""
Logout route
"""
access_token = request.headers.get('Authorization')
if access_token:
result = decode_access_token(access_token)
if not isinstance(result, str):
# mark the token as blacklisted
blacklisted_token = BlacklistToken(access_token)
try:
# insert the token
db.session.add(blacklisted_token)
db.session.commit()
response_obj = dict(
status="success",
message="Logged out successfully."
)
return make_response(jsonify(response_obj), 200)
except Exception as e:
resp_obj = {
'status': 'fail',
'message': e
}
return make_response(jsonify(resp_obj), 200)
else:
resp_obj = dict(
status="fail",
message=result
)
return make_response(jsonify(resp_obj), 401)
else:
response_obj = {
'status': 'fail',
'message': 'Provide a valid auth token.'
}
return make_response(jsonify(response_obj), 403)
@auth_ns.route('/reset-password')
class PasswordResetResource(Resource):
"""
This class handles the user password reset request
"""
@API.expect(password_reset)
def post(self):
"""
Reset user password
"""
# Request data
data = request.get_json()
# Get specified user
user = User.query.filter_by(public_id=data['public_id']).first()
if user:
if check_password_hash(user.password, data['current_password']):
user.password = generate_password_hash(data['new_password'])
db.session.commit()
resp_obj = dict(
status="Success!",
message="Password reset successfully!"
)
resp_obj = jsonify(resp_obj)
return make_response(resp_obj, 200)
resp_obj = dict(
status="Fail!",
message="Wrong current password. Try again."
)
resp_obj = jsonify(resp_obj)
return make_response(resp_obj, 401)
resp_obj = dict(
status="Fail!",
message="User doesn't exist, check the Public ID provided!"
)
resp_obj = jsonify(resp_obj)
return make_response(resp_obj, 403)
| 34.111702 | 99 | 0.545143 | 638 | 6,413 | 5.335423 | 0.266458 | 0.056404 | 0.079318 | 0.080787 | 0.264395 | 0.149236 | 0.040541 | 0.040541 | 0.040541 | 0.040541 | 0 | 0.016974 | 0.356931 | 6,413 | 187 | 100 | 34.294118 | 0.808438 | 0.07937 | 0 | 0.240602 | 1 | 0 | 0.129904 | 0.004882 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030075 | false | 0.082707 | 0.082707 | 0 | 0.263158 | 0.015038 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
90d9f70cb591ec552b0cc367503bc3b7d940b765 | 1,778 | py | Python | bluebutton/parsers/ccda.py | r4dat/bluebutton_priv | 4acf99605239b79e187009b7de3ecf43b796d6c4 | [
"Apache-2.0"
] | 8 | 2015-04-06T15:01:01.000Z | 2018-12-05T22:49:09.000Z | bluebutton/parsers/ccda.py | r4dat/bluebutton_priv | 4acf99605239b79e187009b7de3ecf43b796d6c4 | [
"Apache-2.0"
] | 2 | 2015-09-09T13:38:46.000Z | 2015-11-21T10:59:47.000Z | bluebutton/parsers/ccda.py | r4dat/bluebutton_priv | 4acf99605239b79e187009b7de3ecf43b796d6c4 | [
"Apache-2.0"
] | 10 | 2015-06-18T18:43:33.000Z | 2020-06-26T03:25:18.000Z | ###############################################################################
# Copyright 2015 University of Florida. All rights reserved.
# This file is part of the BlueButton.py project.
# Use of this source code is governed by the license found in the LICENSE file.
###############################################################################
from ._ccda.allergies import allergies
from ._ccda.care_plan import care_plan
from ._ccda.demographics import demographics
from ._ccda.document import document
from ._ccda.encounters import encounters
from ._ccda.free_text import free_text
from ._ccda.functional_statuses import functional_statuses
from ._ccda.immunizations import immunizations
from ._ccda.instructions import instructions
from ._ccda.medications import medications
from ._ccda.problems import problems
from ._ccda.procedures import procedures
from ._ccda.results import results
from ._ccda.smoking_status import smoking_status
from ._ccda.vitals import vitals
from ..core import wrappers
def run(ccda):
data = wrappers.ObjectWrapper()
data.document = document(ccda)
data.allergies = allergies(ccda)
data.care_plan = care_plan(ccda)
data.chief_complaint = free_text(ccda, 'chief_complaint')
data.demographics = demographics(ccda)
data.encounters = encounters(ccda)
data.functional_statuses = functional_statuses(ccda)
data.immunizations = immunizations(ccda).administered
data.immunization_declines = immunizations(ccda).declined
data.instructions = instructions(ccda)
data.results = results(ccda)
data.medications = medications(ccda)
data.problems = problems(ccda)
data.procedures = procedures(ccda)
data.smoking_status = smoking_status(ccda)
data.vitals = vitals(ccda)
return data | 39.511111 | 79 | 0.715411 | 204 | 1,778 | 6.073529 | 0.289216 | 0.096852 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002587 | 0.130484 | 1,778 | 45 | 80 | 39.511111 | 0.798836 | 0.103487 | 0 | 0 | 0 | 0 | 0.010468 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028571 | false | 0 | 0.457143 | 0 | 0.514286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
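`wrappers.ObjectWrapper` above is used as an attribute bag that the section parsers populate field by field. The stdlib `types.SimpleNamespace` behaves the same way and is used here as a stand-in (the real class lives in `bluebutton.core.wrappers`):

```python
from types import SimpleNamespace

data = SimpleNamespace()
data.document = {'type': 'ccda'}
data.allergies = []
print(data.document['type'], data.allergies)  # ccda []
```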
90e0a3fe73680b6d78c677a08cb847e35af7aaf8 | 364 | py | Python | src/nlp/boneyard/scripts/find_security_group.py | khappucino/global-traffic-management | 81a80ae86c9119066c92df0727554cb5b61f9899 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | src/nlp/boneyard/scripts/find_security_group.py | khappucino/global-traffic-management | 81a80ae86c9119066c92df0727554cb5b61f9899 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | src/nlp/boneyard/scripts/find_security_group.py | khappucino/global-traffic-management | 81a80ae86c9119066c92df0727554cb5b61f9899 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | import boto3
def find_security_group(ec2_client):
filters = [{'Name':'tag:Name', 'Values':['int-test-backend-server']}]
results = ec2_client.describe_security_groups(Filters=filters)
if len(results) == 0:
print("none")
return None
else:
print(results['SecurityGroups'][0]['GroupId'])
return results['SecurityGroups'][0]['GroupId']
| 30.333333 | 71 | 0.678571 | 44 | 364 | 5.477273 | 0.636364 | 0.074689 | 0.182573 | 0.240664 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019417 | 0.151099 | 364 | 11 | 72 | 33.090909 | 0.760518 | 0 | 0 | 0 | 0 | 0 | 0.239011 | 0.063187 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.4 | 0.2 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
90e94693e3fa4c22c730c0d064afee2f335d0d9a | 860 | py | Python | lstm_dao.py | goiaba/c19-cases-by-day | 1b97c51a2e0c091c8a1baba530a40ec8686e1ee5 | [
"MIT"
] | null | null | null | lstm_dao.py | goiaba/c19-cases-by-day | 1b97c51a2e0c091c8a1baba530a40ec8686e1ee5 | [
"MIT"
] | 6 | 2021-05-21T14:40:12.000Z | 2022-03-12T00:37:27.000Z | lstm_dao.py | goiaba/c19-cases-by-day | 1b97c51a2e0c091c8a1baba530a40ec8686e1ee5 | [
"MIT"
] | 1 | 2020-08-05T23:30:02.000Z | 2020-08-05T23:30:02.000Z | import pandas
from mariadb_handler import MariaDBHandler
class LstmDao(MariaDBHandler):
def __init__(self, host: str, database: str, user: str, password: str):
super(LstmDao, self).__init__(host, database, user, password)
def persist_cases_prediction(self, data: list):
if data:
stmt = """INSERT INTO covid_cases_prediction_by_country (idCountry, predictedCases, datePrediction, entranceDate)
VALUES (%(idCountry)s, %(predictedCases)s, %(datePrediction)s, %(entranceDate)s)"""
self._batch_executor(stmt, data)
def delete_cases_prediction_if_exists(self, entrance_date_str):
stmt = """DELETE FROM covid_cases_prediction_by_country
WHERE entranceDate = %(entranceDate)s"""
self._batch_executor(stmt, [{"entranceDate": entrance_date_str}])
| 45.263158 | 125 | 0.684884 | 94 | 860 | 5.93617 | 0.446809 | 0.107527 | 0.071685 | 0.078853 | 0.225806 | 0.121864 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212791 | 860 | 18 | 126 | 47.777778 | 0.824225 | 0 | 0 | 0 | 0 | 0.071429 | 0.383721 | 0.076744 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0.142857 | 0.142857 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
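`_batch_executor` is inherited from `MariaDBHandler` and is not shown here; it presumably wraps `executemany` with a list of named-parameter dicts. The same batch pattern with stdlib `sqlite3` (which uses `:name` placeholders where MariaDB's connector uses `%(name)s`):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute(
    'CREATE TABLE pred (idCountry INTEGER, predictedCases REAL, entranceDate TEXT)'
)

rows = [
    {'idCountry': 1, 'predictedCases': 120.5, 'entranceDate': '2020-08-01'},
    {'idCountry': 2, 'predictedCases': 98.0, 'entranceDate': '2020-08-01'},
]
# One statement, one dict per row -- the batch shape the DAO passes around
conn.executemany(
    'INSERT INTO pred VALUES (:idCountry, :predictedCases, :entranceDate)',
    rows,
)
conn.commit()
print(conn.execute('SELECT COUNT(*) FROM pred').fetchone()[0])  # 2
```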
90fcfd3bc6e9ddd7a032985039348959e97e84a0 | 417 | py | Python | rest_api_service_fastapi/test_debug.py | aishwaryashand/Task1 | 8c422d600ce44d60d4951e3e567da7e22a1f51fc | [
"Apache-2.0"
] | null | null | null | rest_api_service_fastapi/test_debug.py | aishwaryashand/Task1 | 8c422d600ce44d60d4951e3e567da7e22a1f51fc | [
"Apache-2.0"
] | null | null | null | rest_api_service_fastapi/test_debug.py | aishwaryashand/Task1 | 8c422d600ce44d60d4951e3e567da7e22a1f51fc | [
"Apache-2.0"
] | null | null | null |
def test_debug():
from app_utils import create_access_token
from datetime import timedelta
access_token_expires = timedelta(minutes=10)
access_token = create_access_token(data={"sub": "cuongld"}, expires_delta=access_token_expires)
print(access_token)
from app_utils import decode_access_token
decoded_access_token = decode_access_token(data=access_token)
print(decoded_access_token)
| 37.909091 | 99 | 0.788969 | 56 | 417 | 5.464286 | 0.392857 | 0.395425 | 0.078431 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005602 | 0.143885 | 417 | 10 | 100 | 41.7 | 0.851541 | 0 | 0 | 0 | 0 | 0 | 0.024038 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.444444 | 0.222222 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
29077d6526e6872758f4f7ff19d3e36abd645065 | 3,452 | py | Python | sobi_route_to_gpx.py | loisaidasam/sobi-save-to-strava | cc9958d699dc266fc6f1cdde0bec570da901b531 | [
"MIT"
] | null | null | null | sobi_route_to_gpx.py | loisaidasam/sobi-save-to-strava | cc9958d699dc266fc6f1cdde0bec570da901b531 | [
"MIT"
] | null | null | null | sobi_route_to_gpx.py | loisaidasam/sobi-save-to-strava | cc9958d699dc266fc6f1cdde0bec570da901b531 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import argparse
import datetime
import json
import gpxpy.gpx
import pytz
TIMEZONE_DEFAULT = 'America/New_York'
# This is `gpxpy`'s default `creator` value:
# CREATOR_DEFAULT = 'gpx.py -- https://github.com/tkrajina/gpxpy'
# TODO: Come up with our own, something like this:
# CREATOR_DEFAULT = 'sobi-route-to-gpx.py -- https://github.com/loisaidasam/sobi-route-to-gpx'
CREATOR_DEFAULT = None
def main():
args = parse_args()
with open(args.route_filename, 'r') as fp:
route = json.load(fp)
    timezone = pytz.timezone(args.timezone) if args.timezone else None
gpx = convert(route, timezone)
if args.creator:
gpx.creator = args.creator
    print(gpx.to_xml())
def parse_args():
parser = argparse.ArgumentParser()
parser.add_argument('route_filename',
help="The filename of a JSON file representing a SoBi route")
# TODO: Use `route`'s `start_time` / `finish_time` for auto-detecting timezone
parser.add_argument('--timezone',
default=TIMEZONE_DEFAULT,
help="The timezone the coordinates came from. Default: '%s'" % TIMEZONE_DEFAULT)
parser.add_argument('--creator',
default=CREATOR_DEFAULT,
help="A custom value for the base gpx element's 'creator' attribute")
return parser.parse_args()
def convert(route, timezone):
"""Convert a JSON representation of a SoBi route into a gpxpy.gpx.GPX object
"""
# TODO: Determine good values for `name`/`description`
name = "SoBi route #%s" % route['id']
description = None
# The GPX object
gpx = gpxpy.gpx.GPX()
# The GPXTrack
gpx_track = gpxpy.gpx.GPXTrack(name=name, description=description)
gpx.tracks.append(gpx_track)
# The GPXTrackSegment
gpx_segment = gpxpy.gpx.GPXTrackSegment()
gpx_track.segments.append(gpx_segment)
# Now add all of the coordinate data
coordinates = route_coordinates_generator(route, timezone)
for coordinate in coordinates:
trackpoint = gpxpy.gpx.GPXTrackPoint(**coordinate)
gpx_segment.points.append(trackpoint)
return gpx
def route_coordinates_generator(route, timezone):
"""
{
"path": {
"type": "LineString",
"coordinates": [
[
-84.36963166666666,
33.746605,
1549299522
],
...
}
"""
# Make sure the route looks like we think it should
path = route['path']
# Not sure what other types can be here:
assert path['type'] == "LineString"
for coordinate in path['coordinates']:
yield format_route_path_coordinate(coordinate, timezone)
def format_route_path_coordinate(coordinate, timezone):
"""
Params:
coordinate::tuple(
longitude::float
latitude::float
timestamp::int
A timezone-naive local timestamp
)
timezone::datetime.tzinfo
Returns:
::dict{
latitude::float
longitude::float
time::datetime.datetime
}
"""
longitude, latitude, timestamp = coordinate
time = datetime.datetime.fromtimestamp(timestamp, tz=timezone)
time_utc = time.astimezone(pytz.utc)
return {
'latitude': latitude,
'longitude': longitude,
'time': time_utc,
}
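`format_route_path_coordinate` relies on `datetime.fromtimestamp` treating the number as POSIX epoch seconds once a `tzinfo` is supplied, after which `astimezone(pytz.utc)` is a pure relabeling. A quick stdlib check using the sample timestamp from the docstring above:

```python
from datetime import datetime, timezone

# 1549299522 is the sample timestamp from the route excerpt above
dt_utc = datetime.fromtimestamp(1549299522, tz=timezone.utc)
assert dt_utc.isoformat() == "2019-02-04T16:58:42+00:00"
```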
if __name__ == '__main__':
main()
| 29.008403 | 104 | 0.628331 | 393 | 3,452 | 5.40458 | 0.371501 | 0.022599 | 0.024011 | 0.015066 | 0.094162 | 0.04049 | 0 | 0 | 0 | 0 | 0 | 0.013423 | 0.266222 | 3,452 | 118 | 105 | 29.254237 | 0.825109 | 0.16657 | 0 | 0 | 0 | 0 | 0.131734 | 0 | 0 | 0 | 0 | 0.016949 | 0.017857 | 0 | null | null | 0 | 0.089286 | null | null | 0.017857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2908a64626d13cd6ee7897d652cff7427ee39c1e | 162 | py | Python | twiml/voice/say/say-2/say-2.6.x.py | Tshisuaka/api-snippets | 52b50037d4af0f3b96adf76197964725a1501e96 | [
"MIT"
] | 234 | 2016-01-27T03:04:38.000Z | 2022-02-25T20:13:43.000Z | twiml/voice/say/say-2/say-2.6.x.py | Tshisuaka/api-snippets | 52b50037d4af0f3b96adf76197964725a1501e96 | [
"MIT"
] | 351 | 2016-04-06T16:55:33.000Z | 2022-03-10T18:42:36.000Z | twiml/voice/say/say-2/say-2.6.x.py | Tshisuaka/api-snippets | 52b50037d4af0f3b96adf76197964725a1501e96 | [
"MIT"
] | 494 | 2016-03-30T15:28:20.000Z | 2022-03-28T19:39:36.000Z | from twilio.twiml.voice_response import VoiceResponse, Say
response = VoiceResponse()
response.say('Chapeau!', voice='alice', language='fr-FR')
print(response)
| 23.142857 | 58 | 0.771605 | 20 | 162 | 6.2 | 0.65 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08642 | 162 | 6 | 59 | 27 | 0.837838 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2910c416f2df00ee7511b4db3e260f212a4368f0 | 940 | py | Python | src/main/antlrParser/ExtendedListener/KotlinParserListenerExtended.py | alschmut/code2semantics | af1daf0b8320b534344c5352ae972fb600e21e43 | [
"MIT"
] | 2 | 2020-02-26T22:50:38.000Z | 2020-10-29T10:46:10.000Z | src/main/antlrParser/ExtendedListener/KotlinParserListenerExtended.py | alschmut/linguistic-parser | af1daf0b8320b534344c5352ae972fb600e21e43 | [
"MIT"
] | null | null | null | src/main/antlrParser/ExtendedListener/KotlinParserListenerExtended.py | alschmut/linguistic-parser | af1daf0b8320b534344c5352ae972fb600e21e43 | [
"MIT"
] | null | null | null | from antlrParser.Kotlin.KotlinParserListener import KotlinParserListener
from antlrParser.Kotlin.KotlinParser import KotlinParser
from antlrParser.BaseListener import BaseListener
class KotlinParserListenerExtended(KotlinParserListener, BaseListener):
def enterClassDeclaration(self, ctx:KotlinParser.ClassDeclarationContext):
self.identifiers.set_class_name(ctx.simpleIdentifier().getText(), ctx.start.line)
def enterFunctionDeclaration(self, ctx:KotlinParser.FunctionDeclarationContext):
self.identifiers.set_method_name(ctx.identifier().getText(), ctx.start.line)
def enterMultiVariableDeclaration(self, ctx:KotlinParser.MultiVariableDeclarationContext):
self.identifiers.set_variable_name(ctx.simpleIdentifier().getText(), ctx.start.line)
def enterSimpleIdentifier(self, ctx:KotlinParser.SimpleIdentifierContext):
self.identifiers.set_any_identifier(ctx.getText(), ctx.start.line)
| 55.294118 | 94 | 0.817021 | 88 | 940 | 8.636364 | 0.352273 | 0.036842 | 0.1 | 0.1 | 0.147368 | 0.118421 | 0.118421 | 0.118421 | 0 | 0 | 0 | 0 | 0.095745 | 940 | 16 | 95 | 58.75 | 0.894118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.25 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
292399f9c3b97ec85618deebdfd297911dcc17cc | 316 | py | Python | pyamg/krylov/__init__.py | Alexey-Voronin/pyamg-1 | 59d35010e4bd660aae3526e8a206a42cb1a54bfa | [
"MIT"
] | null | null | null | pyamg/krylov/__init__.py | Alexey-Voronin/pyamg-1 | 59d35010e4bd660aae3526e8a206a42cb1a54bfa | [
"MIT"
] | null | null | null | pyamg/krylov/__init__.py | Alexey-Voronin/pyamg-1 | 59d35010e4bd660aae3526e8a206a42cb1a54bfa | [
"MIT"
] | null | null | null | "Krylov Solvers"
from .info import __doc__
from ._gmres import *
from ._fgmres import *
from ._cg import *
from ._cr import *
from ._cgnr import *
from ._cgne import *
from ._bicgstab import *
from ._steepest_descent import *
from ._minimal_residual import *
__all__ = [s for s in dir() if not s.startswith('_')]
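The last line re-exports every public name the star-imports brought in. A self-contained illustration of that `dir()` filter, using a throwaway module (the `demo` names are made up):

```python
import types

demo = types.ModuleType("demo")
demo.gmres = lambda A, b: None   # stands in for a re-exported solver
demo._helper = object()          # underscore-prefixed names get filtered out

# Same idiom as the package __init__: keep only non-private names
public = [s for s in dir(demo) if not s.startswith('_')]
assert public == ['gmres']
```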
| 19.75 | 53 | 0.727848 | 45 | 316 | 4.666667 | 0.555556 | 0.380952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.174051 | 316 | 15 | 54 | 21.066667 | 0.804598 | 0.044304 | 0 | 0 | 0 | 0 | 0.047468 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.833333 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
29335529b2fd9185e13948c54d9fff565b6a6a39 | 244 | py | Python | tests/input/func_models_float_money_field.py | enefeq/django_linter | d5a27004acc9c7caf8663a8fb519060f210c81ec | [
"MIT"
] | 101 | 2015-07-26T15:52:31.000Z | 2021-04-29T14:01:23.000Z | tests/input/func_models_float_money_field.py | enefeq/django_linter | d5a27004acc9c7caf8663a8fb519060f210c81ec | [
"MIT"
] | 3 | 2015-08-14T14:22:04.000Z | 2017-09-21T12:24:43.000Z | tests/input/func_models_float_money_field.py | enefeq/django_linter | d5a27004acc9c7caf8663a8fb519060f210c81ec | [
"MIT"
] | 4 | 2015-08-14T10:59:48.000Z | 2022-03-22T13:28:58.000Z | """
Check for correct type for money related field
"""
from django.db import models
class Product(models.Model):
name = models.CharField(max_length=255)
price = models.FloatField()
def __unicode__(self):
return self.name
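This fixture is deliberately wrong input for the linter: `FloatField` is a poor fit for money because binary floats cannot represent decimal cents exactly, which is what the check should flag. A stdlib sketch of the failure mode, with `Decimal` standing in for what a `DecimalField` would store:

```python
from decimal import Decimal

# Binary floats drift on decimal cents...
assert 0.10 + 0.20 != 0.30
# ...while Decimal arithmetic is exact
assert Decimal("0.10") + Decimal("0.20") == Decimal("0.30")
```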
| 18.769231 | 46 | 0.70082 | 32 | 244 | 5.1875 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015385 | 0.20082 | 244 | 12 | 47 | 20.333333 | 0.835897 | 0.188525 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0.166667 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
293cd8d37de875148c2235c0060e33f24aa80b50 | 827 | py | Python | tests/test_genutil_statistics_Numeric.py | CDAT/genutil | 8772964ed2b0305590f94139974a300b49d113b7 | [
"BSD-3-Clause"
] | null | null | null | tests/test_genutil_statistics_Numeric.py | CDAT/genutil | 8772964ed2b0305590f94139974a300b49d113b7 | [
"BSD-3-Clause"
] | 22 | 2018-03-22T10:10:32.000Z | 2021-01-01T22:25:41.000Z | tests/test_genutil_statistics_Numeric.py | CDAT/genutil | 8772964ed2b0305590f94139974a300b49d113b7 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
import numpy,genutil
import unittest
class GENUTIL(unittest.TestCase):
def assertArraysEqual(self,A,B):
self.assertTrue(numpy.all(numpy.equal(A,B)))
def testStatisticsNumpy(self):
a=numpy.ones((15,25),'d')
rk = [0.0, 91.66666666666667, 87.5, 83.33333333333333, 79.16666666666667, 75.0, 70.83333333333333, 66.66666666666667, 62.5, 58.333333333333336, 54.166666666666664, 95.83333333333333, 50.0, 41.666666666666664, 37.5, 33.333333333333336, 29.166666666666668, 25.0, 20.833333333333332, 16.666666666666668, 12.5, 8.333333333333334, 4.166666666666667, 45.833333333333336, 100.0]
# rk will be copied over and over
self.assertArraysEqual(genutil.statistics.rank(a,axis=1),rk)
self.assertTrue(numpy.allclose(genutil.statistics.variance(a,axis=0),0.))
| 55.133333 | 379 | 0.729141 | 113 | 827 | 5.336283 | 0.646018 | 0.016584 | 0.063018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.415042 | 0.131802 | 827 | 14 | 380 | 59.071429 | 0.424791 | 0.062878 | 0 | 0 | 0 | 0 | 0.001294 | 0 | 0 | 0 | 0 | 0 | 0.4 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
29421cd3325c30e1b4418e9e8819ef46d692e96d | 552 | py | Python | core-python/Core_Python/exception/ExceptionMethods.py | theumang100/tutorials-1 | 497f54c2adb022c316530319a168fca1c007d4b1 | [
"MIT"
] | 9 | 2020-04-23T05:24:19.000Z | 2022-02-17T16:37:51.000Z | core-python/Core_Python/exception/ExceptionMethods.py | theumang100/tutorials-1 | 497f54c2adb022c316530319a168fca1c007d4b1 | [
"MIT"
] | 5 | 2020-10-01T05:08:37.000Z | 2020-10-12T03:18:10.000Z | core-python/Core_Python/exception/ExceptionMethods.py | theumang100/tutorials-1 | 497f54c2adb022c316530319a168fca1c007d4b1 | [
"MIT"
] | 9 | 2020-04-28T14:06:41.000Z | 2021-10-19T18:32:28.000Z | ''' e. and use all methods '''
''' also try different exception like java file,array,string,numberformat '''
''' user define exception '''
try:
    raise Exception('spam', 'eggs')
except Exception as inst:
print("Type of instance : ",type(inst)) # the exception instance
print("Arguments of instance : ",inst.args) # arguments stored in .args
print("Instance print : ",inst) # __str__ allows args to be printed directly,but may be overridden in exception subclasses
a,b = inst.args # unpack args
print("a : ",a)
print("b : ",b) | 42.461538 | 126 | 0.675725 | 76 | 552 | 4.855263 | 0.592105 | 0.054201 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190217 | 552 | 13 | 127 | 42.461538 | 0.825503 | 0.315217 | 0 | 0 | 0 | 0 | 0.292776 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.555556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
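The header comments promise a user-defined exception but the file never defines one. A minimal sketch in the same spirit (the `KitchenError` class and its fields are invented names):

```python
class KitchenError(Exception):
    """Raised when an ingredient runs out."""
    def __init__(self, ingredient, needed, available):
        super().__init__("out of %s: need %d, have %d" % (ingredient, needed, available))
        self.ingredient = ingredient

try:
    raise KitchenError("eggs", 12, 3)
except KitchenError as inst:
    # args holds the message passed to Exception.__init__ ...
    assert inst.args == ("out of eggs: need 12, have 3",)
    # ...while custom attributes stay available for structured handling
    assert inst.ingredient == "eggs"
```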
29444d17283b76ff2a263a365eb59fa0884b2476 | 6,299 | py | Python | pliers/extractors/__init__.py | anibalsolon/pliers | fcae6d7b6a7360adde8e5d840e920cb3b33e2e7d | [
"BSD-3-Clause"
] | null | null | null | pliers/extractors/__init__.py | anibalsolon/pliers | fcae6d7b6a7360adde8e5d840e920cb3b33e2e7d | [
"BSD-3-Clause"
] | 1 | 2021-10-29T20:26:00.000Z | 2021-11-05T23:16:05.000Z | pliers/extractors/__init__.py | anibalsolon/pliers | fcae6d7b6a7360adde8e5d840e920cb3b33e2e7d | [
"BSD-3-Clause"
] | null | null | null | ''' The `Extractor` hierarchy contains Transformer classes that take a `Stim`
of any type as input and return extracted feature information (rather than
another `Stim` instance).
'''
from .base import Extractor, ExtractorResult, merge_results
from .api import (ClarifaiAPIImageExtractor,
ClarifaiAPIVideoExtractor,
GoogleVisionAPIFaceExtractor,
GoogleVisionAPILabelExtractor,
GoogleVisionAPIPropertyExtractor,
GoogleVisionAPISafeSearchExtractor,
GoogleVisionAPIWebEntitiesExtractor,
GoogleVideoIntelligenceAPIExtractor,
GoogleVideoAPILabelDetectionExtractor,
GoogleVideoAPIShotDetectionExtractor,
GoogleVideoAPIExplicitDetectionExtractor,
GoogleLanguageAPIExtractor,
GoogleLanguageAPIEntityExtractor,
GoogleLanguageAPISentimentExtractor,
GoogleLanguageAPISyntaxExtractor,
GoogleLanguageAPITextCategoryExtractor,
GoogleLanguageAPIEntitySentimentExtractor,
MicrosoftAPIFaceExtractor,
MicrosoftAPIFaceEmotionExtractor,
MicrosoftVisionAPIExtractor,
MicrosoftVisionAPITagExtractor,
MicrosoftVisionAPICategoryExtractor,
MicrosoftVisionAPIImageTypeExtractor,
MicrosoftVisionAPIColorExtractor,
MicrosoftVisionAPIAdultExtractor)
from .audio import (LibrosaFeatureExtractor,
STFTAudioExtractor,
MeanAmplitudeExtractor,
SpectralCentroidExtractor,
SpectralBandwidthExtractor,
SpectralContrastExtractor,
SpectralRolloffExtractor,
PolyFeaturesExtractor,
ZeroCrossingRateExtractor,
ChromaSTFTExtractor,
ChromaCQTExtractor,
ChromaCENSExtractor,
MelspectrogramExtractor,
MFCCExtractor,
TonnetzExtractor,
TempogramExtractor,
RMSExtractor,
SpectralFlatnessExtractor,
OnsetDetectExtractor,
OnsetStrengthMultiExtractor,
TempoExtractor,
BeatTrackExtractor,
HarmonicExtractor,
PercussiveExtractor,
AudiosetLabelExtractor,
MFCCEnergyExtractor)
from .image import (BrightnessExtractor, SaliencyExtractor, SharpnessExtractor,
VibranceExtractor, FaceRecognitionFaceEncodingsExtractor,
FaceRecognitionFaceLandmarksExtractor,
FaceRecognitionFaceLocationsExtractor)
from .misc import MetricExtractor
from .models import (TensorFlowKerasApplicationExtractor,
TFHubImageExtractor, TFHubTextExtractor,
TFHubExtractor)
from .text import (ComplexTextExtractor, DictionaryExtractor,
PredefinedDictionaryExtractor, LengthExtractor,
NumUniqueWordsExtractor, PartOfSpeechExtractor,
WordEmbeddingExtractor, TextVectorizerExtractor,
VADERSentimentExtractor, SpaCyExtractor,
WordCounterExtractor, BertExtractor,
BertSequenceEncodingExtractor, BertLMExtractor,
BertSentimentExtractor)
from .video import (FarnebackOpticalFlowExtractor)
__all__ = [
'Extractor',
'ExtractorResult',
'ClarifaiAPIImageExtractor',
'ClarifaiAPIVideoExtractor',
'STFTAudioExtractor',
'MeanAmplitudeExtractor',
'LibrosaFeatureExtractor',
'SpectralCentroidExtractor',
'SpectralBandwidthExtractor',
'SpectralContrastExtractor',
'SpectralRolloffExtractor',
'PolyFeaturesExtractor',
'ZeroCrossingRateExtractor',
'ChromaSTFTExtractor',
'ChromaCQTExtractor',
'ChromaCENSExtractor',
'MelspectrogramExtractor',
'MFCCExtractor',
'TonnetzExtractor',
'TempogramExtractor',
'GoogleVisionAPIFaceExtractor',
'GoogleVisionAPILabelExtractor',
'GoogleVisionAPIPropertyExtractor',
'GoogleVisionAPISafeSearchExtractor',
'GoogleVisionAPIWebEntitiesExtractor',
'GoogleVideoIntelligenceAPIExtractor',
'GoogleVideoAPILabelDetectionExtractor',
'GoogleVideoAPIShotDetectionExtractor',
'GoogleVideoAPIExplicitDetectionExtractor',
'GoogleLanguageAPIExtractor',
'GoogleLanguageAPIEntityExtractor',
'GoogleLanguageAPISentimentExtractor',
'GoogleLanguageAPISyntaxExtractor',
'GoogleLanguageAPITextCategoryExtractor',
'GoogleLanguageAPIEntitySentimentExtractor',
'BrightnessExtractor',
'SaliencyExtractor',
'SharpnessExtractor',
'VibranceExtractor',
'FaceRecognitionFaceEncodingsExtractor',
'FaceRecognitionFaceLandmarksExtractor',
'FaceRecognitionFaceLocationsExtractor',
'MicrosoftAPIFaceExtractor',
'MicrosoftAPIFaceEmotionExtractor',
'MicrosoftVisionAPIExtractor',
'MicrosoftVisionAPITagExtractor',
'MicrosoftVisionAPICategoryExtractor',
'MicrosoftVisionAPIImageTypeExtractor',
'MicrosoftVisionAPIColorExtractor',
'MicrosoftVisionAPIAdultExtractor',
'TensorFlowKerasApplicationExtractor',
'TFHubExtractor',
'TFHubImageExtractor',
'TFHubTextExtractor',
'ComplexTextExtractor',
'DictionaryExtractor',
'PredefinedDictionaryExtractor',
'LengthExtractor',
'NumUniqueWordsExtractor',
'PartOfSpeechExtractor',
'FarnebackOpticalFlowExtractor',
'WordEmbeddingExtractor',
'TextVectorizerExtractor',
'VADERSentimentExtractor',
'merge_results',
'SpaCyExtractor',
'RMSExtractor',
    'SpectralFlatnessExtractor',
'OnsetDetectExtractor',
'OnsetStrengthMultiExtractor',
'TempoExtractor',
'BeatTrackExtractor',
'HarmonicExtractor',
'PercussiveExtractor',
'BertExtractor',
'BertSequenceEncodingExtractor',
'BertLMExtractor',
'BertSentimentExtractor',
'AudiosetLabelExtractor',
'WordCounterExtractor',
'MetricExtractor'
]
| 39.616352 | 79 | 0.672011 | 215 | 6,299 | 19.660465 | 0.544186 | 0.011356 | 0.04211 | 0.058197 | 0.705938 | 0.705938 | 0.645848 | 0.559735 | 0.370002 | 0.370002 | 0 | 0 | 0.271948 | 6,299 | 158 | 80 | 39.867089 | 0.921718 | 0.027623 | 0 | 0 | 0 | 0 | 0.322871 | 0.239333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.052632 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2944f711a01468344ef9c582eda7d8bc7b2a7cfc | 447 | py | Python | output/models/nist_data/list_pkg/non_negative_integer/schema_instance/nistschema_sv_iv_list_non_negative_integer_enumeration_5_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 1 | 2021-08-14T17:59:21.000Z | 2021-08-14T17:59:21.000Z | output/models/nist_data/list_pkg/non_negative_integer/schema_instance/nistschema_sv_iv_list_non_negative_integer_enumeration_5_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 4 | 2020-02-12T21:30:44.000Z | 2020-04-15T20:06:46.000Z | output/models/nist_data/list_pkg/non_negative_integer/schema_instance/nistschema_sv_iv_list_non_negative_integer_enumeration_5_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | null | null | null | from output.models.nist_data.list_pkg.non_negative_integer.schema_instance.nistschema_sv_iv_list_non_negative_integer_enumeration_5_xsd.nistschema_sv_iv_list_non_negative_integer_enumeration_5 import (
NistschemaSvIvListNonNegativeIntegerEnumeration5,
NistschemaSvIvListNonNegativeIntegerEnumeration5Type,
)
__all__ = [
"NistschemaSvIvListNonNegativeIntegerEnumeration5",
"NistschemaSvIvListNonNegativeIntegerEnumeration5Type",
]
| 44.7 | 201 | 0.888143 | 37 | 447 | 10.027027 | 0.594595 | 0.088949 | 0.145553 | 0.097035 | 0.25876 | 0.25876 | 0.25876 | 0.25876 | 0.25876 | 0 | 0 | 0.014388 | 0.067114 | 447 | 9 | 202 | 49.666667 | 0.8753 | 0 | 0 | 0 | 0 | 0 | 0.223714 | 0.223714 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
29452b76821e88ff9c0d2905709a221e5dea7c87 | 3,611 | py | Python | preloadShips.py | andrevst/fsnd-p4-item-catalog | a76fa2a6ef4f35eeeb93aade4813470e15b1d6d0 | [
"MIT"
] | null | null | null | preloadShips.py | andrevst/fsnd-p4-item-catalog | a76fa2a6ef4f35eeeb93aade4813470e15b1d6d0 | [
"MIT"
] | null | null | null | preloadShips.py | andrevst/fsnd-p4-item-catalog | a76fa2a6ef4f35eeeb93aade4813470e15b1d6d0 | [
"MIT"
] | null | null | null | from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from db_config import Base, Starships, User
engine = create_engine('sqlite:///fleet.db')
# Bind the engine to the metadata of the Base class so that the
# declaratives can be accessed through a DBSession instance
Base.metadata.bind = engine
DBSession = sessionmaker(bind=engine)
# A DBSession() instance establishes all conversations with the database
# and represents a "staging zone" for all the objects loaded into the
# database session object. Any change made against the objects in the
# session won't be persisted into the database until you call
# session.commit(). If you're not happy about the changes, you can
# revert all of them back to the last commit by calling
# session.rollback()
session = DBSession()
# Create dummy user
User1 = User(name="I Robot", email="",
image='')
session.add(User1)
session.commit()
# Create starting starships
starship1 = Starships(user_id=1, name="USS Enterprise", description="""First
                      ship in the series, commanded by Pike and Kirk.
Has Spock.""",
category="NCC-1701")
session.add(starship1)
session.commit()
starship2 = Starships(user_id=1, name="USS Voyager", description="""Primary
                      setting of Star Trek: Voyager (TV series). Commanded
by Captain Kathryn Janeway.
Commanded by Captain Chakotay in
the "Enemy of My Enemy" book series, then commanded by
Captain Afsarah Eden in the novel Unworthy,
then by Captain Chakotay as final commander in the novels
Full Circle and Children of the Storm.""",
category="NCC-74656")
session.add(starship2)
session.commit()
starship3 = Starships(user_id=1, name="USS Defiant", description="""The ship is
lost in an interdimensional rift in "The Tholian Web".
Later, it is revealed she was taken by Mirror Universe
Tholians, captured by the crew of ISS Enterprise (NX-01),
and then used in the service of the Terran Empire in
                      "In a Mirror, Darkly". The Defiant is mentioned,
                      but not seen (except for a wireframe graphic) in
                      Star Trek: Discovery, Season One.""",
category="NCC-1764")
session.add(starship3)
session.commit()
starship4 = Starships(user_id=1, name="ISS Enterprise", description="""Mirror
Universe version of Enterprise.""", category="NCC-1701")
session.add(starship4)
session.commit()
starship5 = Starships(user_id=1, name="USS Discovery", description="""The USS
Discovery was a 23rd century Federation Crossfield-class
starship operated by Starfleet, under the command of
Captain Gabriel Lorca and, later, acting captain Saru""",
category="NCC-1031")
session.add(starship5)
session.commit()
starship6 = Starships(user_id=1, name="ISS Discovery", description="""Terran
Crossfield-class starship that was in service to
Starfleet in the mid-23rd century. The vessel was under
the command of Captain Sylvia Tilly, who rose to the
position after killing the previous captain.""",
category="NCC-1031")
session.add(starship6)
session.commit()
print("Launched Ships!")
| 37.226804 | 79 | 0.63085 | 440 | 3,611 | 5.156818 | 0.440909 | 0.045835 | 0.039665 | 0.042309 | 0.126047 | 0.06082 | 0 | 0 | 0 | 0 | 0 | 0.019969 | 0.292717 | 3,611 | 96 | 80 | 37.614583 | 0.868442 | 0.157297 | 0 | 0.152542 | 0 | 0 | 0.612999 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.050847 | null | null | 0.016949 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
29486c37880cc8b958e4a140985b3fe78d01e5af | 2,262 | py | Python | watson_developer_cloud/language_translation_v2.py | mcwilkes/python-sdk | bc545472ad1d00f77e916773e3a949160e4f48c3 | [
"Apache-2.0"
] | null | null | null | watson_developer_cloud/language_translation_v2.py | mcwilkes/python-sdk | bc545472ad1d00f77e916773e3a949160e4f48c3 | [
"Apache-2.0"
] | null | null | null | watson_developer_cloud/language_translation_v2.py | mcwilkes/python-sdk | bc545472ad1d00f77e916773e3a949160e4f48c3 | [
"Apache-2.0"
] | null | null | null | # Copyright 2015 IBM All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
The v2 Language Translation service
(http://www.ibm.com/smarterplanet/us/en/ibmwatson/developercloud/language-translation.html)
"""
from .watson_developer_cloud_service import WatsonDeveloperCloudService
from .watson_developer_cloud_service import WatsonInvalidArgument
class LanguageTranslationV2(WatsonDeveloperCloudService):
default_url = "https://gateway.watsonplatform.net/language-translation/api"
def __init__(self, url=default_url, **kwargs):
WatsonDeveloperCloudService.__init__(
self, 'language_translation', url, **kwargs)
def identify(self, text):
"""
Identifies the language of given source text
"""
return self.request(method='POST', url='/v2/identify', data=text, headers={'content-type': 'text/plain'},
accept_json=True)
def get_models(self, default=None, source=None, target=None):
"""
Get the available models for translation
"""
params = {'default': default, 'source': source, 'target': target}
return self.request(method='GET', url='/v2/models', params=params, accept_json=True)
def translate(self, text, source=None, target=None, model=None):
"""
Translates text from a source language to a target language
"""
if model is None and (source is None or target is None):
raise WatsonInvalidArgument('Either model or source and target must be specified')
data = {'text': text, 'source': source, 'target': target, 'model': model}
        # The request body can be sent as form data (data=data) or as JSON (json=data)
return self.request(method='POST', url='/v2/translate', json=data).text
| 40.392857 | 113 | 0.69496 | 283 | 2,262 | 5.484099 | 0.448763 | 0.03866 | 0.032861 | 0.044459 | 0.088918 | 0.088918 | 0.041237 | 0 | 0 | 0 | 0 | 0.007186 | 0.200265 | 2,262 | 55 | 114 | 41.127273 | 0.850746 | 0.383731 | 0 | 0 | 0 | 0 | 0.182515 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.611111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
294c87d8b61279c817bbeaa062ea959eaa018803 | 8,811 | py | Python | cfp/models.py | WebCampZg/conference-web | 76ccae83924fdcd040d9280db5cf3a249d668606 | [
"BSD-3-Clause"
] | 4 | 2015-03-03T17:48:14.000Z | 2019-02-27T20:28:42.000Z | cfp/models.py | WebCampZg/conference-web | 76ccae83924fdcd040d9280db5cf3a249d668606 | [
"BSD-3-Clause"
] | 104 | 2015-02-25T18:09:15.000Z | 2019-06-21T10:02:53.000Z | cfp/models.py | WebCampZg/conference-web | 76ccae83924fdcd040d9280db5cf3a249d668606 | [
"BSD-3-Clause"
] | 9 | 2015-03-01T18:59:14.000Z | 2019-06-10T06:48:45.000Z | import uuid
import unicodedata
from django.conf import settings
from django.db import models
from django.utils import timezone
from django.utils.text import slugify
from django.utils.translation import ugettext_lazy as _
from cfp.choices import TALK_DURATIONS
from utils.behaviors import Timestampable
def get_applicant_avatar_path(instance, filename):
return "uploads/applicant_images/{0}/{1}".format(
slugify(instance.user.email),
unicodedata.normalize('NFKD', filename).encode('ascii', 'ignore'))
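The helper above ASCII-folds the filename: NFKD decomposition splits accented characters into a base letter plus combining marks, which the `'ignore'` encode then drops. The same transform in isolation (the `ascii_fold` name and sample filename are hypothetical):

```python
import unicodedata

def ascii_fold(filename):
    # NFKD turns 'é' into 'e' + combining accent; encoding with
    # errors='ignore' discards the accent, leaving plain ASCII
    return unicodedata.normalize('NFKD', filename).encode('ascii', 'ignore').decode()

assert ascii_fold('résumé.png') == 'resume.png'
```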
class CallForPaperManager(models.Manager):
def active(self):
today = timezone.now().date()
return self.filter(end_date__gte=today, begin_date__lte=today)
class CallForPaper(models.Model):
event = models.ForeignKey('events.Event', on_delete=models.CASCADE)
title = models.CharField(max_length=1024)
description = models.TextField()
announcement = models.TextField()
begin_date = models.DateField()
end_date = models.DateField(blank=True, null=True)
def __str__(self):
return self.title
def is_active(self):
today = timezone.now().date()
return today >= self.begin_date and (not self.end_date or today <= self.end_date)
def is_pending(self):
today = timezone.now().date()
return today < self.begin_date
@property
def applications(self):
return self.paperapplication_set.all()
@property
def application_count(self):
return self.paperapplication_set.count()
@property
def duration(self):
return (self.end_date - self.begin_date) if self.end_date else None
    objects = CallForPaperManager()
class Applicant(models.Model):
user = models.OneToOneField(
settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name='applicant')
about = models.TextField(
verbose_name=_('About you'),
help_text=_('Describe yourself in 140 characters or fewer. Plain text only. [Public]'))
biography = models.TextField(
verbose_name=_('Biography'),
help_text=_('Who are you? Where have you worked? What are your professional interests? '
'Up to 10 sentences, use Markdown but avoid headings. [Public]'))
company_name = models.CharField(
max_length=100, blank=True,
verbose_name=_('Company name'),
help_text=_('Name of the company you work for. Optional.'))
speaker_experience = models.TextField(
blank=True, verbose_name=_('Speaker experience'),
        help_text=_('If you\'ve given talks at other events, please list them. '
                    'Videos which show your English speaking skills are very helpful. '
'Use Markdown but avoid headings.'))
image = models.ImageField(
max_length=255, upload_to=get_applicant_avatar_path,
verbose_name=_('Photo'),
help_text=_('Please upload a picture of yourself which we may use for our web site and '
'publications. Make it a square PNG of at least 400x400px. [Public]'))
def __str__(self):
return self.user.get_full_name()
@property
def full_name(self):
return self.__str__()
@property
def email(self):
return self.user.email
@property
def twitter(self):
return self.user.twitter
@property
def github(self):
return self.user.github
class AudienceSkillLevel(models.Model):
name = models.CharField(max_length=50)
def __str__(self):
return self.name
class Meta:
ordering = ['pk', ]
class PaperApplicationQuerySet(models.QuerySet):
def talks(self):
return self.filter(type__in=PaperApplication.TALK_TYPES)
def workshops(self):
return self.filter(type__in=PaperApplication.WORKSHOP_TYPES)


class PaperApplication(Timestampable):
    TYPE_KEYNOTE = 'keynote'
    TYPE_TALK_LONG = 'talk_long'
    TYPE_TALK_SHORT = 'talk_short'
    TYPE_WORKSHOP_HALF = 'workshop_half'
    TYPE_WORKSHOP_FULL = 'workshop_full'

    TYPES = (
        (TYPE_TALK_SHORT, 'Short talk (25 minutes)'),
        (TYPE_TALK_LONG, 'Long talk (45 minutes)'),
        (TYPE_KEYNOTE, 'Keynote (60 minutes)'),
        (TYPE_WORKSHOP_HALF, 'Workshop (half day)'),
        (TYPE_WORKSHOP_FULL, 'Workshop (full day)'),
    )

    # Shorter captions
    TYPE_CAPTIONS = {
        TYPE_TALK_SHORT: 'Short talk',
        TYPE_TALK_LONG: 'Long talk',
        TYPE_KEYNOTE: 'Keynote',
        TYPE_WORKSHOP_HALF: 'Half Workshop',
        TYPE_WORKSHOP_FULL: 'Full Workshop',
    }

    TALK_TYPES = (TYPE_KEYNOTE, TYPE_TALK_LONG, TYPE_TALK_SHORT)
    WORKSHOP_TYPES = (TYPE_WORKSHOP_HALF, TYPE_WORKSHOP_FULL)

    cfp = models.ForeignKey(CallForPaper, on_delete=models.CASCADE)
    applicant = models.ForeignKey(Applicant, on_delete=models.CASCADE, related_name='applications')
    type = models.CharField(
        max_length=50,
        verbose_name=_('Application Type'),
        choices=TYPES,
        default=TYPE_TALK_SHORT)
    title = models.CharField(
        max_length=255,
        verbose_name=_('Title'),
        help_text=_('The title of your talk. Keep it short and catchy. 50 characters max. [Public]'))
    about = models.TextField(
        max_length=140,
        verbose_name=_('What\'s it about'),
        help_text=_('Describe your talk in 140 characters or fewer. Plain text only. [Public]'))
    abstract = models.TextField(
        verbose_name=_('Abstract'),
        help_text=_('You may go in more depth here. Up to 10 sentences, '
                    'use Markdown but avoid headings. [Public]'))
    skill_level = models.ForeignKey(
        AudienceSkillLevel, on_delete=models.CASCADE, verbose_name=_('Audience level'),
        help_text=_('Which skill level is this talk most appropriate for? [Public]'))
    duration = models.CharField(
        _('Talk Duration Slot'),
        choices=TALK_DURATIONS, max_length=255, blank=True, null=True,
        help_text=_('What talk duration slot would you like? Take into account that there are '
                    'only 8 slots for 45 minute talks, and 20 slots for 25 minute talks.'))
    accomodation_required = models.BooleanField(
        _('I require accommodation'), default=False,
        help_text=_('For people outside of the Zagreb area, we provide 3 nights in an apartment.'))
    travel_expenses_required = models.BooleanField(
        _('I require travel expenses'), default=False,
        help_text=_('For people outside of the Zagreb area, we provide up to €200 in travel expenses.'))
    extra_info = models.TextField(
        _('Extra info'), blank=True,
        help_text=_('Anything else that you would like to let us know?'))
    grant_email_contact = models.BooleanField(default=False)
    grant_process_data = models.BooleanField(default=False)
    grant_publish_data = models.BooleanField(default=False)
    grant_publish_video = models.BooleanField(default=False)
    exclude = models.BooleanField(default=False)
    labels = models.ManyToManyField("labels.Label", related_name="applications")

    class Meta:
        ordering = ['title', ]

    objects = PaperApplicationQuerySet.as_manager()
    def __str__(self):
        return f"{self.title} - {self.applicant}"

    @property
    def has_talk(self):
        return hasattr(self, "talk")

    @property
    def has_workshop(self):
        return hasattr(self, "workshop")

    @property
    def has_instance(self):
        return self.has_talk or self.has_workshop

    @property
    def instance(self):
        """
        Return the talk or workshop instance or None if neither exists.
        """
        assert self.is_for_talk or self.is_for_workshop

        if self.is_for_talk and self.has_talk:
            return self.talk

        if self.is_for_workshop and self.has_workshop:
            return self.workshop

    @property
    def is_for_talk(self):
        return self.type in self.TALK_TYPES

    @property
    def is_for_workshop(self):
        return self.type in self.WORKSHOP_TYPES

    @property
    def short_type(self):
        return self.TYPE_CAPTIONS.get(self.type)


class Invite(models.Model):
    """
    Allows a user to submit a talk even if the CFP is not currently active.

    The token is tied to a specific user and CFP. It will not work if the
    logged in user doesn't match the invited user, or if the CFP defined in
    the invite is not tied to the currently active event.
    """
    user = models.ForeignKey(
        settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name='invites')
    cfp = models.ForeignKey(CallForPaper, on_delete=models.CASCADE)
    created_at = models.DateTimeField(auto_now_add=True)
    token = models.UUIDField(default=uuid.uuid4, editable=False, unique=True, db_index=True)
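The talk/workshop classification above is driven entirely by the `TALK_TYPES` / `WORKSHOP_TYPES` tuples and the `TYPE_CAPTIONS` dict. A plain-Python sketch of that logic, with the constants copied from the model so it runs without Django (`classify` is a hypothetical helper, not part of the model):

```python
# Constants copied from PaperApplication; classify() is an illustrative
# stand-in for the is_for_talk / is_for_workshop / short_type properties.
TALK_TYPES = ("keynote", "talk_long", "talk_short")
WORKSHOP_TYPES = ("workshop_half", "workshop_full")
TYPE_CAPTIONS = {
    "talk_short": "Short talk",
    "talk_long": "Long talk",
    "keynote": "Keynote",
    "workshop_half": "Half Workshop",
    "workshop_full": "Full Workshop",
}


def classify(application_type):
    """Return (kind, caption) for an application type string."""
    if application_type in TALK_TYPES:
        kind = "talk"
    elif application_type in WORKSHOP_TYPES:
        kind = "workshop"
    else:
        kind = None
    return kind, TYPE_CAPTIONS.get(application_type)


print(classify("keynote"))        # ('talk', 'Keynote')
print(classify("workshop_half"))  # ('workshop', 'Half Workshop')
```

An unknown type yields `(None, None)`, mirroring how `short_type` uses `dict.get`.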
| 32.876866 | 104 | 0.676881 | 1,104 | 8,811 | 5.201993 | 0.265399 | 0.034825 | 0.039004 | 0.025596 | 0.27268 | 0.157061 | 0.143131 | 0.105172 | 0.087063 | 0.072784 | 0 | 0.008964 | 0.22767 | 8,811 | 267 | 105 | 33 | 0.834827 | 0.03961 | 0 | 0.141361 | 0 | 0 | 0.200333 | 0.003805 | 0 | 0 | 0 | 0 | 0.005236 | 1 | 0.125654 | false | 0 | 0.04712 | 0.104712 | 0.586387 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
294d6be33afc79a62c0b81fe9fea65158406643c | 10606 | py | Python | tests/data.py | lttga/test2 | 8b5de0e3421b38dd45326d133e9500e39732a886 | ["BSD-3-Clause"] | null | null | null | tests/data.py | lttga/test2 | 8b5de0e3421b38dd45326d133e9500e39732a886 | ["BSD-3-Clause"] | null | null | null | tests/data.py | lttga/test2 | 8b5de0e3421b38dd45326d133e9500e39732a886 | ["BSD-3-Clause"] | null | null | null |
TEST_ROOT_TABLES = {
    "tenders": ["/tender"],
    "awards": ["/awards"],
    "contracts": ["/contracts"],
    "planning": ["/planning"],
    "parties": ["/parties"],
}

TEST_COMBINED_TABLES = {
    "documents": [
        "/planning/documents",
        "/tender/documents",
        "/awards/documents",
        "/contracts/documents",
        "/contracts/implementation/documents",
    ],
    "milestones": [
        "/planning/milestones",
        "/tender/milestones",
        "/contracts/milestones",
        "/contracts/implementation/milestones",
    ],
    "amendments": [
        "/planning/amendments",
        "/tender/amendments",
        "/awards/amendments",
        "/contracts/amendments",
        "/contracts/implementation/amendments",
    ],
}

tenders_columns = [
    "/tender/id",
    "/tender/title",
    "/tender/description",
    "/tender/status",
    "/tender/procurementMethod",
    "/tender/procurementMethodDetails",
    "/tender/procurementMethodRationale",
    "/tender/mainProcurementCategory",
    "/tender/additionalProcurementCategories",
    "/tender/awardCriteria",
    "/tender/awardCriteriaDetails",
    "/tender/submissionMethodDetails",
    "/tender/submissionMethod",
    "/tender/hasEnquiries",
    "/tender/eligibilityCriteria",
    "/tender/numberOfTenderers",
    "/tender/contractPeriod/startDate",
    "/tender/contractPeriod/endDate",
    "/tender/contractPeriod/maxExtentDate",
    "/tender/contractPeriod/durationInDays",
    "/tender/awardPeriod/startDate",
    "/tender/awardPeriod/endDate",
    "/tender/awardPeriod/maxExtentDate",
    "/tender/awardPeriod/durationInDays",
    "/tender/enquiryPeriod/startDate",
    "/tender/enquiryPeriod/endDate",
    "/tender/enquiryPeriod/maxExtentDate",
    "/tender/enquiryPeriod/durationInDays",
    "/tender/tenderPeriod/startDate",
    "/tender/tenderPeriod/endDate",
    "/tender/tenderPeriod/maxExtentDate",
    "/tender/tenderPeriod/durationInDays",
    "/tender/minValue/amount",
    "/tender/minValue/currency",
    "/tender/value/amount",
    "/tender/value/currency",
    "/tender/procuringEntity/name",
    "/tender/procuringEntity/id",
]

tenders_arrays = [
    "/tender/items",
    "/tender/tenderers",
    "/tender/items/additionalClassifications",
]

tenders_combined_columns = [
    "/tender/id",
    "/tender/title",
    "/tender/description",
    "/tender/status",
    "/tender/procurementMethod",
    "/tender/procurementMethodDetails",
    "/tender/procurementMethodRationale",
    "/tender/mainProcurementCategory",
    "/tender/additionalProcurementCategories",
    "/tender/awardCriteria",
    "/tender/awardCriteriaDetails",
    "/tender/submissionMethod",
    "/tender/submissionMethodDetails",
    "/tender/hasEnquiries",
    "/tender/eligibilityCriteria",
    "/tender/numberOfTenderers",
    "/tender/tenderers/0/name",
    "/tender/tenderers/0/id",
    "/tender/contractPeriod/startDate",
    "/tender/contractPeriod/endDate",
    "/tender/contractPeriod/maxExtentDate",
    "/tender/contractPeriod/durationInDays",
    "/tender/awardPeriod/startDate",
    "/tender/awardPeriod/endDate",
    "/tender/awardPeriod/maxExtentDate",
    "/tender/awardPeriod/durationInDays",
    "/tender/enquiryPeriod/startDate",
    "/tender/enquiryPeriod/endDate",
    "/tender/enquiryPeriod/maxExtentDate",
    "/tender/enquiryPeriod/durationInDays",
    "/tender/tenderPeriod/startDate",
    "/tender/tenderPeriod/endDate",
    "/tender/tenderPeriod/maxExtentDate",
    "/tender/tenderPeriod/durationInDays",
    "/tender/minValue/amount",
    "/tender/minValue/currency",
    "/tender/value/amount",
    "/tender/value/currency",
    "/tender/items/0/id",
    "/tender/items/0/description",
    "/tender/items/0/quantity",
    "/tender/items/0/unit/scheme",
    "/tender/items/0/unit/id",
    "/tender/items/0/unit/name",
    "/tender/items/0/unit/uri",
    "/tender/items/0/unit/value/amount",
    "/tender/items/0/unit/value/currency",
    "/tender/items/0/additionalClassifications/0/scheme",
    "/tender/items/0/additionalClassifications/0/id",
    "/tender/items/0/additionalClassifications/0/description",
    "/tender/items/0/additionalClassifications/0/uri",
    "/tender/items/0/classification/scheme",
    "/tender/items/0/classification/id",
    "/tender/items/0/classification/description",
    "/tender/items/0/classification/uri",
    "/tender/procuringEntity/name",
    "/tender/procuringEntity/id",
]

awards_columns = [
    "/awards/id",
    "/awards/title",
    "/awards/description",
    "/awards/status",
    "/awards/date",
    "/awards/contractPeriod/startDate",
    "/awards/contractPeriod/endDate",
    "/awards/contractPeriod/maxExtentDate",
    "/awards/contractPeriod/durationInDays",
    "/awards/value/amount",
    "/awards/value/currency",
]

awards_combined_columns = [
    "/awards/id",
    "/awards/title",
    "/awards/description",
    "/awards/status",
    "/awards/date",
    "/awards/contractPeriod/startDate",
    "/awards/contractPeriod/endDate",
    "/awards/contractPeriod/maxExtentDate",
    "/awards/contractPeriod/durationInDays",
    "/awards/items/0/id",
    "/awards/items/0/description",
    "/awards/items/0/quantity",
    "/awards/items/0/unit/scheme",
    "/awards/items/0/unit/id",
    "/awards/items/0/unit/name",
    "/awards/items/0/unit/uri",
    "/awards/items/0/unit/value/amount",
    "/awards/items/0/unit/value/currency",
    "/awards/items/0/additionalClassifications/0/scheme",
    "/awards/items/0/additionalClassifications/0/id",
    "/awards/items/0/additionalClassifications/0/description",
    "/awards/items/0/additionalClassifications/0/uri",
    "/awards/items/0/classification/scheme",
    "/awards/items/0/classification/id",
    "/awards/items/0/classification/description",
    "/awards/items/0/classification/uri",
    "/awards/suppliers/0/name",
    "/awards/suppliers/0/id",
    "/awards/value/amount",
    "/awards/value/currency",
]

awards_arrays = [
    "/awards/suppliers",
    "/awards/items",
    "/awards/items/additionalClassifications",
]

contracts_columns = [
    "/contracts/id",
    "/contracts/awardID",
    "/contracts/title",
    "/contracts/description",
    "/contracts/status",
    "/contracts/dateSigned",
    "/contracts/value/amount",
    "/contracts/value/currency",
    "/contracts/period/startDate",
    "/contracts/period/endDate",
    "/contracts/period/maxExtentDate",
    "/contracts/period/durationInDays",
]

contracts_arrays = [
    "/contracts/items",
    "/contracts/relatedProcesses",
    "/contracts/implementation/transactions",
    "/contracts/items/additionalClassifications",
]

contracts_combined_columns = [
    "/contracts/id",
    "/contracts/awardID",
    "/contracts/title",
    "/contracts/description",
    "/contracts/status",
    "/contracts/dateSigned",
    "/contracts/relatedProcesses/0/id",
    "/contracts/relatedProcesses/0/relationship",
    "/contracts/relatedProcesses/0/title",
    "/contracts/relatedProcesses/0/scheme",
    "/contracts/relatedProcesses/0/identifier",
    "/contracts/relatedProcesses/0/uri",
    "/contracts/implementation/transactions/0/id",
    "/contracts/implementation/transactions/0/source",
    "/contracts/implementation/transactions/0/date",
    "/contracts/implementation/transactions/0/uri",
    "/contracts/implementation/transactions/0/payee/name",
    "/contracts/implementation/transactions/0/payee/id",
    "/contracts/implementation/transactions/0/payer/name",
    "/contracts/implementation/transactions/0/payer/id",
    "/contracts/implementation/transactions/0/value/amount",
    "/contracts/implementation/transactions/0/value/currency",
    "/contracts/items/0/id",
    "/contracts/items/0/description",
    "/contracts/items/0/quantity",
    "/contracts/items/0/unit/scheme",
    "/contracts/items/0/unit/id",
    "/contracts/items/0/unit/name",
    "/contracts/items/0/unit/uri",
    "/contracts/items/0/unit/value/amount",
    "/contracts/items/0/unit/value/currency",
    "/contracts/items/0/additionalClassifications/0/scheme",
    "/contracts/items/0/additionalClassifications/0/id",
    "/contracts/items/0/additionalClassifications/0/description",
    "/contracts/items/0/additionalClassifications/0/uri",
    "/contracts/items/0/classification/scheme",
    "/contracts/items/0/classification/id",
    "/contracts/items/0/classification/description",
    "/contracts/items/0/classification/uri",
    "/contracts/value/amount",
    "/contracts/value/currency",
    "/contracts/period/startDate",
    "/contracts/period/endDate",
    "/contracts/period/maxExtentDate",
    "/contracts/period/durationInDays",
]

planning_columns = [
    "/planning/rationale",
    "/planning/budget/id",
    "/planning/budget/description",
    "/planning/budget/project",
    "/planning/budget/projectID",
    "/planning/budget/uri",
    "/planning/budget/amount/amount",
    "/planning/budget/amount/currency",
]

planning_combined_columns = [
    "/planning/rationale",
    "/planning/budget/id",
    "/planning/budget/description",
    "/planning/budget/project",
    "/planning/budget/projectID",
    "/planning/budget/uri",
    "/planning/budget/amount/amount",
    "/planning/budget/amount/currency",
]
planning_arrays = []
parties_arrays = ["/parties/additionalIdentifiers"]

parties_columns = [
    "/parties/name",
    "/parties/id",
    "/parties/roles",
    "/parties/contactPoint/name",
    "/parties/contactPoint/email",
    "/parties/contactPoint/telephone",
    "/parties/contactPoint/faxNumber",
    "/parties/contactPoint/url",
    "/parties/address/streetAddress",
    "/parties/address/locality",
    "/parties/address/region",
    "/parties/address/postalCode",
    "/parties/address/countryName",
    "/parties/identifier/scheme",
    "/parties/identifier/id",
    "/parties/identifier/legalName",
    "/parties/identifier/uri",
]

parties_combined_columns = [
    "/parties/name",
    "/parties/id",
    "/parties/roles",
    "/parties/roles",
    "/parties/contactPoint/name",
    "/parties/contactPoint/email",
    "/parties/contactPoint/telephone",
    "/parties/contactPoint/faxNumber",
    "/parties/contactPoint/url",
    "/parties/address/streetAddress",
    "/parties/address/locality",
    "/parties/address/region",
    "/parties/address/postalCode",
    "/parties/address/countryName",
    "/parties/additionalIdentifiers/0/scheme",
    "/parties/additionalIdentifiers/0/id",
    "/parties/additionalIdentifiers/0/legalName",
    "/parties/additionalIdentifiers/0/uri",
    "/parties/identifier/scheme",
    "/parties/identifier/id",
    "/parties/identifier/legalName",
    "/parties/identifier/uri",
]
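Each fixture list above is keyed by a JSON-pointer-style path whose first segment picks the root table (`TEST_ROOT_TABLES`). A small hypothetical helper (not part of the fixtures) makes that mapping explicit:

```python
# Hypothetical helper: map a flattened path like "/tender/value/amount"
# to its root table using TEST_ROOT_TABLES-style prefixes.
TEST_ROOT_TABLES = {
    "tenders": ["/tender"],
    "awards": ["/awards"],
    "contracts": ["/contracts"],
    "planning": ["/planning"],
    "parties": ["/parties"],
}


def root_table_for(path):
    for table, prefixes in TEST_ROOT_TABLES.items():
        if any(path == p or path.startswith(p + "/") for p in prefixes):
            return table
    return None


print(root_table_for("/tender/value/amount"))  # tenders
```

The `startswith(p + "/")` guard avoids false prefixes (e.g. a hypothetical `/tenderers` path would not match `/tender`).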
| 33.040498 | 65 | 0.681784 | 947 | 10,606 | 7.610348 | 0.089757 | 0.042459 | 0.024976 | 0.053282 | 0.713889 | 0.559179 | 0.53559 | 0.50562 | 0.496462 | 0.496462 | 0 | 0.009607 | 0.146144 | 10,606 | 320 | 66 | 33.14375 | 0.786219 | 0 | 0 | 0.55873 | 0 | 0 | 0.737507 | 0.634924 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
294e35ba6a817fe585ab45ec79e94eb4b84c6361 | 614 | py | Python | basic_tutorials/hello_world_objects.py | LearnPythonAndMakeGames/BasicPythonTutorialSeries | be129702680aaf1186fb62add13f94002d4baa63 | ["Apache-2.0"] | 7 | 2015-04-16T14:30:47.000Z | 2021-08-18T15:37:12.000Z | basic_tutorials/objects.py | LearnPythonAndMakeGames/BasicPythonTutorialSeries | be129702680aaf1186fb62add13f94002d4baa63 | ["Apache-2.0"] | null | null | null | basic_tutorials/objects.py | LearnPythonAndMakeGames/BasicPythonTutorialSeries | be129702680aaf1186fb62add13f94002d4baa63 | ["Apache-2.0"] | 2 | 2015-04-21T09:57:21.000Z | 2020-01-07T08:41:41.000Z |
def display(message):
    """Standard way to print a message"""
    print message


class HelloWorld(object):
    """Describes your function... display a message"""

    def __init__(self, message):
        self.message = message

    def display(self):
        """Display a message"""
        print self.message


hello_world_object_01 = HelloWorld("hi")
hello_world_object_02 = HelloWorld("Hello, World!")
hello_world_object_03 = HelloWorld("Yo, sup!")

hello_world_object_01.display()
hello_world_object_02.display()
hello_world_object_03.display()

display("hi")
display("Hello, World!")
display("Yo, sup!")
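The tutorial file above is Python 2 (`print message` is a statement). For comparison, the same example in Python 3 syntax, where `print` is a function:

```python
def display(message):
    """Standard way to print a message."""
    print(message)


class HelloWorld:
    """Stores a message and can display it."""

    def __init__(self, message):
        self.message = message

    def display(self):
        """Display the stored message."""
        print(self.message)


hello = HelloWorld("Hello, World!")
hello.display()   # prints "Hello, World!"
display("hi")     # prints "hi"
```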
| 20.466667 | 54 | 0.698697 | 79 | 614 | 5.151899 | 0.303797 | 0.19656 | 0.235872 | 0.088452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023529 | 0.169381 | 614 | 29 | 55 | 21.172414 | 0.77451 | 0 | 0 | 0 | 0 | 0 | 0.091451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.125 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2950f1344c4ddf92d3bd5335e39a69a84076361b | 7195 | py | Python | tests/test_complex_tokeniser.py | delfick/nose-of-yeti | b9fbb32c131074bc61eb5d1da67f0931d9803637 | ["MIT"] | 12 | 2015-02-20T11:21:58.000Z | 2022-01-20T08:33:32.000Z | tests/test_complex_tokeniser.py | delfick/nose-of-yeti | b9fbb32c131074bc61eb5d1da67f0931d9803637 | ["MIT"] | 8 | 2015-12-05T01:26:19.000Z | 2021-06-07T01:22:59.000Z | tests/test_complex_tokeniser.py | delfick/nose-of-yeti | b9fbb32c131074bc61eb5d1da67f0931d9803637 | ["MIT"] | 4 | 2015-05-26T19:49:48.000Z | 2016-05-25T20:33:59.000Z |
import pytest


class Examples:
    small_example = [
        """
        describe "This":
            before_each:
                self.x = 5

            describe "That":
                before_each:
                    self.y = 6

                describe "Meh":
                    after_each:
                        self.y = None

            describe "Blah":pass

            describe "async":
                async before_each:
                    pass

                async after_each:
                    pass

        describe "Another":
            before_each:
                self.z = 8 """,
        """
        class TestThis :
            def setUp (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_before_each ();self .x =5

        class TestThis_That (TestThis ):
            def setUp (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_before_each ();self .y =6

        class TestThis_That_Meh (TestThis_That ):
            def tearDown (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_after_each ();self .y =None

        class TestThis_Blah (TestThis ):pass

        class TestThis_Async (TestThis ):
            async def setUp (self ):
                await __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).async_before_each ();pass

            async def tearDown (self ):
                await __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).async_after_each ();pass

        class TestAnother :
            def setUp (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_before_each ();self .z =8

        TestThis .is_noy_spec =True
        TestThis_That .is_noy_spec =True
        TestThis_That_Meh .is_noy_spec =True
        TestThis_Blah .is_noy_spec =True
        TestThis_Async .is_noy_spec =True
        TestAnother .is_noy_spec =True
        """,
    ]

    big_example = [
        """
        describe "This":
            before_each:
                self.x = 5

            it 'should':
                if x:
                    pass
                else:
                    x += 9

            async it 'supports async its':
                pass

            describe "That":
                before_each:
                    self.y = 6

                describe "Meh":
                    after_each:
                        self.y = None

                    it "should set __testname__ for non alpha names ' $^":
                        pass

                    it 'should':
                        if y:
                            pass
                        else:
                            pass

                    it 'should have args', arg1, arg2:
                        blah |should| be_good()

            describe "Blah":pass

        ignore "root level $pecial-method*+":
            pass

        describe "Another":
            before_each:
                self.z = 8

            it 'should':
                if z:
                    if u:
                        print "hello \
                        there"
                    else:
                        print "no"
                else:
                    pass

        async it 'supports level 0 async its':
            pass
        """,
        """
        class TestThis :
            def setUp (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_before_each ();self .x =5

            def test_should (self ):
                if x :
                    pass
                else :
                    x +=9

            async def test_supports_async_its (self ):
                pass

        class TestThis_That (TestThis ):
            def setUp (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_before_each ();self .y =6

        class TestThis_That_Meh (TestThis_That ):
            def tearDown (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_after_each ();self .y =None

            def test_should_set_testname_for_non_alpha_names (self ):
                pass

            def test_should (self ):
                if y :
                    pass
                else :
                    pass

            def test_should_have_args (self ,arg1 ,arg2 ):
                blah |should |be_good ()

        class TestThis_Blah (TestThis ):pass

        def ignore__root_level_pecial_method ():
            pass

        class TestAnother :
            def setUp (self ):
                __import__ ("noseOfYeti").tokeniser .TestSetup (super ()).sync_before_each ();self .z =8

            def test_should (self ):
                if z :
                    if u :
                        print "hello \
                        there"
                    else :
                        print "no"
                else :
                    pass

        async def test_supports_level_0_async_its ():
            pass

        TestThis .is_noy_spec =True
        TestThis_That .is_noy_spec =True
        TestThis_That_Meh .is_noy_spec =True
        TestThis_Blah .is_noy_spec =True
        TestAnother .is_noy_spec =True

        ignore__root_level_pecial_method .__testname__ ="root level $pecial-method*+"
        TestThis_That_Meh .test_should_set_testname_for_non_alpha_names .__testname__ ="should set __testname__ for non alpha names ' $^"
        """,
    ]

    comment_example = [
        """
        assertTileHues(
            self, tiles[0],
            25.0, 25.0, 25.0, 25.0, 25.0, 25.0, # noqa
            18.75, 18.75, 18.75, 18.75, 18.75, 18.75, # noqa
        )

        it "things":
            assertTileHues(
                self, tiles[1],
                25.0, 25.0, 25.0, 25.0, 25.0, 25.0, # noqa
                18.75, 18.75, 18.75, 18.75, 18.75, 18.75, # noqa
            )

        expected = {
            # something
            ("D2", "<d"): lambda s: ("D2", "<d", s)
            # something else
            ,
            ("B2", None): ("B2", None, None),
        }

        def t(n, f, c):
            return expected[(n, f, c)]
        """,
        """
        assertTileHues (
            self ,tiles [0 ],
            25.0 ,25.0 ,25.0 ,25.0 ,25.0 ,25.0 ,# noqa
            18.75 ,18.75 ,18.75 ,18.75 ,18.75 ,18.75 ,# noqa
        )

        def test_things ():
            assertTileHues (
                self ,tiles [1 ],
                25.0 ,25.0 ,25.0 ,25.0 ,25.0 ,25.0 ,# noqa
                18.75 ,18.75 ,18.75 ,18.75 ,18.75 ,18.75 ,# noqa
            )

        expected ={
            # something
            ("D2","<d"):lambda s :("D2","<d",s )
            # something else
            ,
            ("B2",None ):("B2",None ,None ),
        }

        def t (n ,f ,c ):
            return expected [(n ,f ,c )]
        """,
    ]


class Test_Tokeniser:
    def test_gives_describes_noy_specific_attributes(self):
        pytest.helpers.assert_example(
            [
                'describe "Something testable"',
                """
                class TestSomethingTestable :pass
                TestSomethingTestable .is_noy_spec =True
                """,
            ]
        )


class Test_Tokeniser_Complex:
    def test_works_with_space(self):
        pytest.helpers.assert_example(Examples.small_example)

    def test_works_with_tabs(self):
        pytest.helpers.assert_example(Examples.small_example, convert_to_tabs=True)

    def test_keeps_good_indentation_in_body_with_spaces(self):
        pytest.helpers.assert_example(Examples.big_example)

    def test_keeps_good_indentation_in_body_with_tabs(self):
        pytest.helpers.assert_example(Examples.big_example, convert_to_tabs=True)

    def test_keeps_correct_indentation_with_comments(self):
        pytest.helpers.assert_example(Examples.comment_example, convert_to_tabs=True)
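These tests run each fixture twice, once with spaces and once with `convert_to_tabs=True`. A minimal sketch of what such a conversion could look like (the real `pytest.helpers` implementation in the project may differ): replace each complete leading 4-space group with a tab, leaving partial indents alone.

```python
import re


def convert_to_tabs(source, spaces_per_tab=4):
    """Rewrite leading indentation from spaces to tabs, line by line.

    Illustrative only: assumes indentation comes in whole groups of
    `spaces_per_tab` spaces; anything smaller is left untouched.
    """
    def retab(match):
        return "\t" * (len(match.group(0)) // spaces_per_tab)

    return "\n".join(
        re.sub(r"^(?: {4})+", retab, line) for line in source.split("\n")
    )


print(repr(convert_to_tabs("def f():\n    pass")))  # 'def f():\n\tpass'
```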
| 30.35865 | 133 | 0.522446 | 801 | 7,195 | 4.410737 | 0.143571 | 0.020379 | 0.024908 | 0.033965 | 0.823663 | 0.746957 | 0.692613 | 0.663742 | 0.480328 | 0.463063 | 0 | 0.044248 | 0.371786 | 7,195 | 236 | 134 | 30.487288 | 0.737389 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.181818 | false | 0 | 0.030303 | 0 | 0.393939 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2952244280859ac15f3402162280c9caefc7bc02 | 850 | py | Python | server/src/testing/__init__.py | kentoku24/go-links | d530e1e5f76c966359d9106918fbaecf19604689 | ["Apache-2.0"] | null | null | null | server/src/testing/__init__.py | kentoku24/go-links | d530e1e5f76c966359d9106918fbaecf19604689 | ["Apache-2.0"] | 3 | 2021-10-06T22:22:56.000Z | 2022-02-27T10:52:41.000Z | server/src/testing/__init__.py | tiket/go-links | 451a89965237dcb9082f145647e55c780b9a4daf | ["Apache-2.0"] | null | null | null |
import unittest

import requests
import webtest

from main import init_app_without_routes
from modules.users.helpers import get_or_create_user


class TrottoTestCase(unittest.TestCase):

    def setUp(self):
        self.init_app()

        # always put some data in the database since it seems like in Linux, the datastore emulator
        # can't be reset if there's no data in it (connection refused error)
        get_or_create_user('init@test.trotto.dev', 'test.trotto.dev')

    def tearDown(self):
        # see https://github.com/googleapis/google-cloud-java/issues/1292#issuecomment-250391120
        requests.post('http://localhost:8082/reset')

    def init_app(self):
        app = init_app_without_routes(disable_csrf=True)

        for blueprint in getattr(self, 'blueprints_under_test', []):
            app.register_blueprint(blueprint)

        self.testapp = webtest.TestApp(app)
| 28.333333 | 95 | 0.751765 | 123 | 850 | 5.04878 | 0.634146 | 0.045089 | 0.045089 | 0.064412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023611 | 0.152941 | 850 | 29 | 96 | 29.310345 | 0.838889 | 0.284706 | 0 | 0 | 0 | 0 | 0.137645 | 0.034826 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0 | 0.3125 | 0 | 0.5625 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
295233b1c84edcdd4f60d68634dcb2be1feebfc2 | 934 | py | Python | Project Pattern/pattern_19.py | chandthash/nppy | 228116d4efa6d28a9cdab245c6c8045844e96211 | ["MIT"] | null | null | null | Project Pattern/pattern_19.py | chandthash/nppy | 228116d4efa6d28a9cdab245c6c8045844e96211 | ["MIT"] | null | null | null | Project Pattern/pattern_19.py | chandthash/nppy | 228116d4efa6d28a9cdab245c6c8045844e96211 | ["MIT"] | null | null | null |
def pattern_nineteen(steps):
    ''' Pattern nineteen

    1 2 3 4 5 6 7 8 9 10
    2 4 6 8 10 12 14 16 18 20
    3 6 9 12 15 18 21 24 27 30
    4 8 12 16 20 24 28 32 36 40
    5 10 15 20 25 30 35 40 45 50
    6 12 18 24 30 36 42 48 54 60
    7 14 21 28 35 42 49 56 63 70
    8 16 24 32 40 48 56 64 72 80
    9 18 27 36 45 54 63 72 81 90
    10 20 30 40 50 60 70 80 90 100
    '''
    for i in range(1, steps + 1):
        num_list = []
        for j in range(1, steps + 1):
            num_list.append(str(i * j).ljust(4))  # widen ljust(4) if i * j can have more than 3 digits
        print(' '.join(num_list))


if __name__ == '__main__':
    try:
        pattern_nineteen(10)
    except NameError:
        print('Integer was expected')
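The same multiplication table can be built as data rather than printed, which makes the pattern easy to assert on (a side-by-side sketch, not part of the original file):

```python
def pattern_rows(steps):
    """Return the multiplication table as a list of rows of ints."""
    return [[i * j for j in range(1, steps + 1)] for i in range(1, steps + 1)]


rows = pattern_rows(10)
print(rows[0])   # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(rows[9])   # [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
```

Rendering each row with `' '.join(str(n).ljust(4) for n in row)` reproduces the printed layout of `pattern_nineteen`.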
| 30.129032 | 111 | 0.453961 | 160 | 934 | 2.56875 | 0.46875 | 0.109489 | 0.038929 | 0.06326 | 0.10219 | 0.10219 | 0.10219 | 0 | 0 | 0 | 0 | 0.392034 | 0.489293 | 934 | 30 | 112 | 31.133333 | 0.469602 | 0.501071 | 0 | 0 | 0 | 0 | 0.084302 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0 | 0 | 0.090909 | 0.181818 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
296013c5c6d772ea8eb968f144d8d2ec88d3ff61 | 1434 | py | Python | source/injection/inject.py | nikhilrj/CARDS | 509815f23c11881e6444308fa1014aed6a1af358 | ["MIT"] | null | null | null | source/injection/inject.py | nikhilrj/CARDS | 509815f23c11881e6444308fa1014aed6a1af358 | ["MIT"] | null | null | null | source/injection/inject.py | nikhilrj/CARDS | 509815f23c11881e6444308fa1014aed6a1af358 | ["MIT"] | null | null | null |
from variable import *
import random, time


def runCircle(var):
    var.motors.drive(150, 150, Adafruit_MotorHAT.BACKWARD, Adafruit_MotorHAT.FORWARD)


def runCircle2(var):
    var.motors.drive(150, 150, Adafruit_MotorHAT.FORWARD, Adafruit_MotorHAT.BACKWARD)


def runFWBW(var):
    var.motors.drive(50, 50, Adafruit_MotorHAT.FORWARD, Adafruit_MotorHAT.FORWARD)
    time.sleep(.5)
    var.motors.drive(50, 50, Adafruit_MotorHAT.BACKWARD, Adafruit_MotorHAT.BACKWARD)
    time.sleep(.5)
    var.motors.drive(50, 50, Adafruit_MotorHAT.FORWARD, Adafruit_MotorHAT.FORWARD)
    time.sleep(.5)
    var.motors.drive(50, 50, Adafruit_MotorHAT.BACKWARD, Adafruit_MotorHAT.BACKWARD)
    time.sleep(.5)
    var.motors.drive(50, 50, Adafruit_MotorHAT.RELEASE, Adafruit_MotorHAT.RELEASE)


def injectMemory(dic):
    for i in dic.keys():
        try:
            if isinstance(dic[i], (int, float)):
                if random.random() > 0:  # .95:
                    print 'changing', dic[i]
                    dic[i] ^= (1 << random.randint(0, 32))
            elif isinstance(dic[i], object):
                injectMemory(dic[i].__dict__)
        except Exception, e:
            print e
            pass


global mission
var = mission.member()
print var

# call random function
functions = [var.direction.sensorRead, var.colorSensor.readColor, var.colorSensor.distance, var.motors.drive]
rand = random.randint(0, len(functions)-1)
# functions[rand]()

# print 'running in circle'
# runCircle(var)
# print 'running fwbw'
# runFWBW(var)
# print 'stopped'

injectMemory(var.__dict__)
print var
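The core fault-injection trick in `injectMemory` is a single-bit XOR flip: `dic[i] ^= (1 << random.randint(0, 32))`. A deterministic, pure-Python-3 sketch of that same idea (hypothetical helper, not part of the original Python 2 script):

```python
def flip_bit(value, bit):
    """Flip one bit of an integer, as dic[i] ^= (1 << k) does above.

    XOR with a single-bit mask toggles exactly that bit, so applying
    the same flip twice restores the original value.
    """
    return value ^ (1 << bit)


print(flip_bit(0, 0))   # 1
print(flip_bit(5, 1))   # 7
```

The self-inverse property is what makes XOR popular for this kind of memory-corruption simulation: the fault can be injected and reverted with the same operation.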
| 26.072727 | 109 | 0.744073 | 200 | 1,434 | 5.225 | 0.315 | 0.214354 | 0.107177 | 0.076555 | 0.439234 | 0.394258 | 0.394258 | 0.394258 | 0.319617 | 0.319617 | 0 | 0.03645 | 0.119944 | 1,434 | 54 | 110 | 26.555556 | 0.791601 | 0.088563 | 0 | 0.285714 | 0 | 0 | 0.006159 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.028571 | 0.057143 | null | null | 0.114286 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
297854f948108d783f9da534e88f02f6ecd46220 | 221 | py | Python | tests/test_order.py | duffelhq/duffel-api-python | 583703f34e345ac8fa185ca26441f9168d1b0dac | ["MIT"] | 2 | 2022-02-26T22:14:48.000Z | 2022-03-10T10:04:11.000Z | tests/test_order.py | duffelhq/duffel-api-python | 583703f34e345ac8fa185ca26441f9168d1b0dac | ["MIT"] | 29 | 2022-01-04T12:40:54.000Z | 2022-03-31T23:26:54.000Z | tests/test_order.py | duffelhq/duffel-api-python | 583703f34e345ac8fa185ca26441f9168d1b0dac | ["MIT"] | null | null | null |
from duffel_api.models import Order

from .fixtures import raw_fixture


def test_order_model_parsing():
    name = "get-order-by-id"

    with raw_fixture(name) as fixture:
        assert Order.from_json(fixture["data"])
| 22.1 | 47 | 0.733032 | 33 | 221 | 4.69697 | 0.666667 | 0.116129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171946 | 221 | 9 | 48 | 24.555556 | 0.846995 | 0 | 0 | 0 | 0 | 0 | 0.085973 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
297b385220ca10d6102862b9cdba654b241717d1 | 210 | py | Python | tests/panels/sqlalchemy/models.py | chrismaille/fastapi-debug-toolbar | 76d1e78eda4a23fc2b3e3d3c978ee9d8dbf025ae | ["BSD-3-Clause"] | 36 | 2021-07-22T08:11:31.000Z | 2022-01-31T13:09:26.000Z | tests/panels/sqlalchemy/models.py | chrismaille/fastapi-debug-toolbar | 76d1e78eda4a23fc2b3e3d3c978ee9d8dbf025ae | ["BSD-3-Clause"] | 10 | 2021-07-21T19:39:38.000Z | 2022-02-26T15:35:35.000Z | tests/panels/sqlalchemy/models.py | chrismaille/fastapi-debug-toolbar | 76d1e78eda4a23fc2b3e3d3c978ee9d8dbf025ae | ["BSD-3-Clause"] | 2 | 2021-07-28T09:55:13.000Z | 2022-02-18T11:29:25.000Z |
from sqlalchemy import Column, Integer, String

from .database import Base


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    username = Column(String, unique=True)
| 19.090909 | 46 | 0.72381 | 26 | 210 | 5.653846 | 0.692308 | 0.176871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185714 | 210 | 10 | 47 | 21 | 0.859649 | 0 | 0 | 0 | 0 | 0 | 0.02381 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
2988c3fe79c798f0a13c5f4b8c13e7b444de97e6 | 311 | py | Python | minfraud/__init__.py | lapakiosstore/minfraud-api-python | ccf9d54327c038365008f6c3bcd392027b40a9ab | ["Apache-2.0"] | null | null | null | minfraud/__init__.py | lapakiosstore/minfraud-api-python | ccf9d54327c038365008f6c3bcd392027b40a9ab | ["Apache-2.0"] | null | null | null | minfraud/__init__.py | lapakiosstore/minfraud-api-python | ccf9d54327c038365008f6c3bcd392027b40a9ab | ["Apache-2.0"] | null | null | null |
"""
minfraud
~~~~~~~~
A client API to MaxMind's minFraud Score and Insights web services.
"""
from .errors import MinFraudError, AuthenticationError, \
    HTTPError, InvalidRequestError, InsufficientFundsError
from .webservice import Client
from .version import __version__
__author__ = 'Gregory Oschwald'
| 20.733333 | 67 | 0.771704 | 32 | 311 | 7.25 | 0.78125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141479 | 311 | 14 | 68 | 22.214286 | 0.868914 | 0.276527 | 0 | 0 | 0 | 0 | 0.073733 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
462c81e47c3c867317a1fdf198ad2e3a6909a388 | 1063 | py | Python | Info/CvAltRoot.py | macaurther/DOCUSA | 40586727c351d1b1130c05c2d4648cca3a8bacf5 | ["MIT"] | 93 | 2015-11-20T04:13:36.000Z | 2022-03-24T00:03:08.000Z | Info/CvAltRoot.py | macaurther/DOCUSA | 40586727c351d1b1130c05c2d4648cca3a8bacf5 | ["MIT"] | 206 | 2015-11-09T00:27:15.000Z | 2021-12-04T19:05:18.000Z | Info/CvAltRoot.py | dguenms/Dawn-of-Civilization | 1c4f510af97a869637cddb4c0859759158cea5ce | ["MIT"] | 117 | 2015-11-08T02:43:46.000Z | 2022-02-12T06:29:00.000Z |
## CvAltRoot
##
## Tells BUG where to locate its files when it cannot find them normally.
## This is common when using the /AltRoot Civ4 feature or sometimes on
## non-English operating systems and Windows Vista.
##
## HOW TO USE
##
## 1. Change the text in the quotes below to match the full path to the
## "Beyond the Sword" folder that contains the CivilizationIV.ini file.
## Make sure to use forward slashes (/), even for Windows paths.
##
## Windows XP: "C:/Documents and Settings/[UserName]/My Documents/My Games/Beyond the Sword"
##
## Windows Vista: "C:/Users/[UserName]/Documents/My Games/Beyond the Sword"
##
##    MacOS: "/Users/[UserName]/Documents/Beyond the Sword"
##
## 2. Copy this file to the "Python" folder.
## When BUG is installed as a mod, the folder is "Assets/Python".
## When BUG is installed normally, the folder is "CustomAssets/Python".
##
## Copyright (c) 2008 The BUG Mod.
##
## Author: EmperorFool
rootDir = "C:/Documents and Settings/[UserName]/My Documents/My Games/Beyond the Sword"
| 37.964286 | 96 | 0.687676 | 154 | 1,063 | 4.746753 | 0.532468 | 0.06156 | 0.095759 | 0.090287 | 0.207934 | 0.207934 | 0.166895 | 0.166895 | 0.166895 | 0.166895 | 0 | 0.008187 | 0.195673 | 1,063 | 27 | 97 | 39.37037 | 0.846784 | 0.819379 | 0 | 0 | 0 | 0 | 0.657895 | 0.192982 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4649162d84f4d90a99f721dd5b19cc0cf21fa46b | 2,778 | py | Python | modin/experimental/sql/test/test_sql.py | Rubtsowa/modin | 6550939753c76e896ef2bfd65bb9468d6ad161d7 | [
"ECL-2.0",
"Apache-2.0"
] | 7,258 | 2018-06-21T21:39:15.000Z | 2022-03-31T23:09:20.000Z | modin/experimental/sql/test/test_sql.py | Rubtsowa/modin | 6550939753c76e896ef2bfd65bb9468d6ad161d7 | [
"ECL-2.0",
"Apache-2.0"
] | 4,125 | 2018-06-22T18:04:48.000Z | 2022-03-31T17:13:19.000Z | modin/experimental/sql/test/test_sql.py | Rubtsowa/modin | 6550939753c76e896ef2bfd65bb9468d6ad161d7 | [
"ECL-2.0",
"Apache-2.0"
] | 547 | 2018-06-21T23:23:00.000Z | 2022-03-27T09:04:56.000Z | # Licensed to Modin Development Team under one or more contributor license agreements.
# See the NOTICE file distributed with this work for additional information regarding
# copyright ownership. The Modin Development Team licenses this file to you under the
# Apache License, Version 2.0 (the "License"); you may not use this file except in
# compliance with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed under
# the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific language
# governing permissions and limitations under the License.
import modin.pandas as pd
import io
titanic_snippet = """passenger_id,survived,p_class,name,sex,age,sib_sp,parch,ticket,fare,cabin,embarked
1,0,3,"Braund, Mr. Owen Harris",male,22,1,0,A/5 21171,7.25,,S
2,1,1,"Cumings, Mrs. John Bradley (Florence Briggs Thayer)",female,38,1,0,PC 17599,71.2833,C85,C
3,1,3,"Heikkinen, Miss. Laina",female,26,0,0,STON/O2. 3101282,7.925,,S
4,1,1,"Futrelle, Mrs. Jacques Heath (Lily May Peel)",female,35,1,0,113803,53.1,C123,S
5,0,3,"Allen, Mr. William Henry",male,35,0,0,373450,8.05,,S
6,0,3,"Moran, Mr. James",male,,0,0,330877,8.4583,,Q
7,0,1,"McCarthy, Mr. Timothy J",male,54,0,0,17463,51.8625,E46,S
8,0,3,"Palsson, Master. Gosta Leonard",male,2,3,1,349909,21.075,,S
9,1,3,"Johnson, Mrs. Oscar W (Elisabeth Vilhelmina Berg)",female,27,0,2,347742,11.1333,,S
"""
def test_sql_query():
from modin.experimental.sql import query
df = pd.read_csv(io.StringIO(titanic_snippet))
sql = "SELECT survived, p_class, count(passenger_id) as count FROM (SELECT * FROM titanic WHERE survived = 1) as t1 GROUP BY survived, p_class"
query_result = query(sql, titanic=df)
expected_df = (
df[df.survived == 1]
.groupby(["survived", "p_class"])
.agg({"passenger_id": "count"})
.reset_index()
)
assert query_result.shape == expected_df.shape
values_left = expected_df.dropna().values
values_right = query_result.dropna().values
assert (values_left == values_right).all()
def test_sql_extension():
import modin.experimental.sql # noqa: F401
df = pd.read_csv(io.StringIO(titanic_snippet))
expected_df = df[df["survived"] == 1][["passenger_id", "survived"]]
sql = "SELECT passenger_id, survived WHERE survived = 1"
query_result = df.sql(sql)
assert list(query_result.columns) == ["passenger_id", "survived"]
values_left = expected_df.values
values_right = query_result.values
assert values_left.shape == values_right.shape
assert (values_left == values_right).all()
| 44.806452 | 147 | 0.720302 | 453 | 2,778 | 4.324503 | 0.465784 | 0.030628 | 0.038795 | 0.016335 | 0.118428 | 0.089842 | 0.035733 | 0.035733 | 0 | 0 | 0 | 0.073729 | 0.150468 | 2,778 | 61 | 148 | 45.540984 | 0.756356 | 0.276458 | 0 | 0.102564 | 0 | 0.282051 | 0.498497 | 0.186373 | 0 | 0 | 0 | 0 | 0.128205 | 1 | 0.051282 | false | 0.153846 | 0.102564 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
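For readers following `test_sql_query` above, the grouped count it asserts against can be reproduced in plain Python from a CSV snippet of the same shape, with no modin or pandas dependency (the abbreviated snippet below is illustrative, not the full fixture):

```python
import csv
import io
from collections import Counter

# An abbreviated titanic-style snippet (same columns as the test fixture uses).
titanic_snippet = """passenger_id,survived,p_class,name
1,0,3,"Braund, Mr. Owen Harris"
2,1,1,"Cumings, Mrs. John Bradley"
3,1,3,"Heikkinen, Miss. Laina"
4,1,1,"Futrelle, Mrs. Jacques Heath"
5,0,3,"Allen, Mr. William Henry"
"""

# Count passengers per (survived, p_class), restricted to survivors,
# mirroring the WHERE survived = 1 ... GROUP BY in the SQL above.
counts = Counter(
    (row["survived"], row["p_class"])
    for row in csv.DictReader(io.StringIO(titanic_snippet))
    if row["survived"] == "1"
)
print(dict(counts))  # {('1', '1'): 2, ('1', '3'): 1}
```

This is the same aggregation the test builds with `groupby(...).agg({"passenger_id": "count"})`.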
46820ce20b0a4e34ec50e5d7500956d171b40e1a | 321 | py | Python | models/maskrcnn/metric.py | tingyumao94/groupsoftmax-simpledet | feda05ae2261efcbd0e298792cd66dd383b1bdf6 | [
"Apache-2.0"
] | 153 | 2019-08-26T06:39:32.000Z | 2022-02-17T03:51:23.000Z | models/maskrcnn/metric.py | tingyumao94/groupsoftmax-simpledet | feda05ae2261efcbd0e298792cd66dd383b1bdf6 | [
"Apache-2.0"
] | 4 | 2019-09-11T09:49:04.000Z | 2020-03-04T07:31:42.000Z | models/maskrcnn/metric.py | tingyumao94/groupsoftmax-simpledet | feda05ae2261efcbd0e298792cd66dd383b1bdf6 | [
"Apache-2.0"
] | 26 | 2019-08-27T13:29:18.000Z | 2022-02-09T01:42:25.000Z | import numpy as np
import mxnet as mx
class SigmoidCELossMetric(mx.metric.EvalMetric):
def __init__(self, name, output_names, label_names):
super().__init__(name, output_names, label_names)
def update(self, labels, preds):
self.sum_metric += preds[0].mean().asscalar()
self.num_inst += 1 | 32.1 | 57 | 0.694704 | 44 | 321 | 4.75 | 0.636364 | 0.095694 | 0.143541 | 0.191388 | 0.239234 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007692 | 0.190031 | 321 | 10 | 58 | 32.1 | 0.796154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
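`SigmoidCELossMetric` above follows MXNet's `EvalMetric` accumulate-then-average pattern: `update` adds one batch-mean per call and bumps an instance counter, and the reported value is their ratio. The bookkeeping can be sketched framework-free (the class below is illustrative, not MXNet's API):

```python
class RunningMeanMetric:
    """Accumulates per-batch means and reports their average,
    mirroring the sum_metric / num_inst bookkeeping above."""

    def __init__(self, name):
        self.name = name
        self.sum_metric = 0.0
        self.num_inst = 0

    def update(self, batch_mean):
        # One call per batch: add the batch's mean loss, bump the count.
        self.sum_metric += batch_mean
        self.num_inst += 1

    def get(self):
        # Return (name, averaged value); NaN before any update.
        if self.num_inst == 0:
            return self.name, float("nan")
        return self.name, self.sum_metric / self.num_inst

m = RunningMeanMetric("sigmoid_ce")
m.update(0.5)
m.update(0.3)
print(m.get())  # ('sigmoid_ce', ~0.4)
```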
4684d72d601908f628cc1b9c98635d9d77766834 | 41,066 | py | Python | python/taichi/lang/ops.py | TiGeekMan/taichi | 8fdd3b8e5aabf6b2c8f33ff06994e24d50942c3a | [
"MIT"
] | null | null | null | python/taichi/lang/ops.py | TiGeekMan/taichi | 8fdd3b8e5aabf6b2c8f33ff06994e24d50942c3a | [
"MIT"
] | null | null | null | python/taichi/lang/ops.py | TiGeekMan/taichi | 8fdd3b8e5aabf6b2c8f33ff06994e24d50942c3a | [
"MIT"
] | null | null | null | import builtins
import functools
import math
import operator as _bt_ops_mod # bt for builtin
import traceback
from taichi._lib import core as _ti_core
from taichi.lang import expr, impl
from taichi.lang.exception import TaichiSyntaxError
from taichi.lang.util import cook_dtype, is_taichi_class, taichi_scope
unary_ops = []
def stack_info():
s = traceback.extract_stack()[3:-1]
for i, l in enumerate(s):
if 'taichi_ast_generator' in l:
s = s[i + 1:]
break
raw = ''.join(traceback.format_list(s))
# remove the confusing last line
return '\n'.join(raw.split('\n')[:-5]) + '\n'
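`stack_info` is built from two standard-library helpers: `traceback.extract_stack`, which returns `FrameSummary` objects for the current call stack, and `traceback.format_list`, which renders them in the familiar `File "...", line N, in func` form. Their basic contract can be checked in isolation:

```python
import traceback

def where_am_i():
    # Take the last two frames of the current stack and render them.
    frames = traceback.extract_stack()
    return "".join(traceback.format_list(frames[-2:]))

info = where_am_i()
print(info)
```

The rendered text contains the file name, line number, and enclosing function name for each frame, which is exactly what `stack_info` trims down for error messages.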
def is_taichi_expr(a):
return isinstance(a, expr.Expr)
def wrap_if_not_expr(a):
return expr.Expr(a) if not is_taichi_expr(a) else a
def unary(foo):
@functools.wraps(foo)
def imp_foo(x):
return foo(x)
@functools.wraps(foo)
def wrapped(a):
if is_taichi_class(a):
return a._element_wise_unary(imp_foo)
return imp_foo(a)
return wrapped
binary_ops = []
def binary(foo):
@functools.wraps(foo)
def imp_foo(x, y):
return foo(x, y)
@functools.wraps(foo)
def rev_foo(x, y):
return foo(y, x)
@functools.wraps(foo)
def wrapped(a, b):
if is_taichi_class(a):
return a._element_wise_binary(imp_foo, b)
if is_taichi_class(b):
return b._element_wise_binary(rev_foo, a)
return imp_foo(a, b)
binary_ops.append(wrapped)
return wrapped
ternary_ops = []
def ternary(foo):
@functools.wraps(foo)
def abc_foo(a, b, c):
return foo(a, b, c)
@functools.wraps(foo)
def bac_foo(b, a, c):
return foo(a, b, c)
@functools.wraps(foo)
def cab_foo(c, a, b):
return foo(a, b, c)
@functools.wraps(foo)
def wrapped(a, b, c):
if is_taichi_class(a):
return a._element_wise_ternary(abc_foo, b, c)
if is_taichi_class(b):
return b._element_wise_ternary(bac_foo, a, c)
if is_taichi_class(c):
return c._element_wise_ternary(cab_foo, a, b)
return abc_foo(a, b, c)
ternary_ops.append(wrapped)
return wrapped
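The `unary`/`binary`/`ternary` decorators above all follow one dispatch pattern: if any operand is a Taichi class, defer to that operand's element-wise hook (with arguments reordered so the hook's owner comes first); otherwise apply the scalar operation directly. A stripped-down sketch of the binary case with a toy vector class (names are illustrative, not Taichi's):

```python
import functools

class ToyVec:
    def __init__(self, data):
        self.data = list(data)

    def _element_wise_binary(self, op, other):
        # Broadcast a scalar `other` across the elements.
        return ToyVec(op(x, other) for x in self.data)

def binary(foo):
    @functools.wraps(foo)
    def wrapped(a, b):
        if isinstance(a, ToyVec):
            return a._element_wise_binary(foo, b)
        return foo(a, b)  # plain scalars: call through directly
    return wrapped

@binary
def add(a, b):
    return a + b

print(add(2, 3))                      # 5
print(add(ToyVec([1, 2]), 10).data)   # [11, 12]
```

The real decorators add the reversed (`rev_foo`) and permuted (`bac_foo`, `cab_foo`) variants so dispatch works no matter which operand position holds the Taichi class.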
writeback_binary_ops = []
def writeback_binary(foo):
@functools.wraps(foo)
def imp_foo(x, y):
return foo(x, wrap_if_not_expr(y))
@functools.wraps(foo)
def wrapped(a, b):
if is_taichi_class(a):
return a._element_wise_writeback_binary(imp_foo, b)
if is_taichi_class(b):
raise TaichiSyntaxError(
f'cannot augassign taichi class {type(b)} to scalar expr')
else:
return imp_foo(a, b)
writeback_binary_ops.append(wrapped)
return wrapped
def cast(obj, dtype):
"""Copy and cast a scalar or a matrix to a specified data type.
Must be called in Taichi scope.
Args:
obj (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
dtype (:mod:`~taichi.types.primitive_types`): A primitive type defined in :mod:`~taichi.types.primitive_types`.
Returns:
A copy of `obj`, casted to the specified data type `dtype`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([0, 1, 2], ti.i32)
>>> y = ti.cast(x, ti.f32)
>>> print(y)
>>>
>>> test()
[0.0, 1.0, 2.0]
"""
dtype = cook_dtype(dtype)
if is_taichi_class(obj):
# TODO: unify with element_wise_unary
return obj.cast(dtype)
return expr.Expr(_ti_core.value_cast(expr.Expr(obj).ptr, dtype))
def bit_cast(obj, dtype):
"""Copy and cast a scalar to a specified data type with its underlying
bits preserved. Must be called in taichi scope.
This function is equivalent to `reinterpret_cast` in C++.
Args:
obj (:mod:`~taichi.types.primitive_types`): Input scalar.
dtype (:mod:`~taichi.types.primitive_types`): Target data type, must have \
the same precision bits as the input (hence `f32` -> `f64` is not allowed).
Returns:
A copy of `obj`, casted to the specified data type `dtype`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = 3.14
>>> y = ti.bit_cast(x, ti.i32)
>>> print(y) # 1078523331
>>>
>>> z = ti.bit_cast(y, ti.f32)
>>> print(z) # 3.14
"""
dtype = cook_dtype(dtype)
if is_taichi_class(obj):
raise ValueError('Cannot apply bit_cast on Taichi classes')
else:
return expr.Expr(_ti_core.bits_cast(expr.Expr(obj).ptr, dtype))
def _unary_operation(taichi_op, python_op, a):
if is_taichi_expr(a):
return expr.Expr(taichi_op(a.ptr), tb=stack_info())
return python_op(a)
def _binary_operation(taichi_op, python_op, a, b):
if is_taichi_expr(a) or is_taichi_expr(b):
a, b = wrap_if_not_expr(a), wrap_if_not_expr(b)
return expr.Expr(taichi_op(a.ptr, b.ptr), tb=stack_info())
return python_op(a, b)
def _ternary_operation(taichi_op, python_op, a, b, c):
if is_taichi_expr(a) or is_taichi_expr(b) or is_taichi_expr(c):
a, b, c = wrap_if_not_expr(a), wrap_if_not_expr(b), wrap_if_not_expr(c)
return expr.Expr(taichi_op(a.ptr, b.ptr, c.ptr), tb=stack_info())
return python_op(a, b, c)
@unary
def neg(x):
"""Numerical negative, element-wise.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
Matrix or scalar `y`, so that `y = -x`. `y` has the same type as `x`.
Example::
>>> x = ti.Matrix([1, -1])
        >>> y = ti.neg(x)
>>> y
[-1, 1]
"""
return _unary_operation(_ti_core.expr_neg, _bt_ops_mod.neg, x)
@unary
def sin(x):
"""Trigonometric sine, element-wise.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Angle, in radians.
Returns:
The sine of each element of `x`.
Example::
>>> from math import pi
>>> x = ti.Matrix([-pi/2., 0, pi/2.])
>>> ti.sin(x)
[-1., 0., 1.]
"""
return _unary_operation(_ti_core.expr_sin, math.sin, x)
@unary
def cos(x):
"""Trigonometric cosine, element-wise.
Args:
        x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Angle, in radians.
Returns:
The cosine of each element of `x`.
Example::
>>> from math import pi
>>> x = ti.Matrix([-pi, 0, pi/2.])
>>> ti.cos(x)
[-1., 1., 0.]
"""
return _unary_operation(_ti_core.expr_cos, math.cos, x)
@unary
def asin(x):
"""Trigonometric inverse sine, element-wise.
The inverse of `sin` so that, if `y = sin(x)`, then `x = asin(y)`.
For input `x` not in the domain `[-1, 1]`, this function returns `nan` if \
it's called in taichi scope, or raises exception if it's called in python scope.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
A scalar or a matrix with elements in [-1, 1].
Returns:
The inverse sine of each element in `x`, in radians and in the closed \
interval `[-pi/2, pi/2]`.
Example::
>>> from math import pi
>>> ti.asin(ti.Matrix([-1.0, 0.0, 1.0])) * 180 / pi
[-90., 0., 90.]
"""
return _unary_operation(_ti_core.expr_asin, math.asin, x)
@unary
def acos(x):
"""Trigonometric inverse cosine, element-wise.
The inverse of `cos` so that, if `y = cos(x)`, then `x = acos(y)`.
For input `x` not in the domain `[-1, 1]`, this function returns `nan` if \
it's called in taichi scope, or raises exception if it's called in python scope.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
A scalar or a matrix with elements in [-1, 1].
Returns:
The inverse cosine of each element in `x`, in radians and in the closed \
interval `[0, pi]`. This is a scalar if `x` is a scalar.
Example::
>>> from math import pi
>>> ti.acos(ti.Matrix([-1.0, 0.0, 1.0])) * 180 / pi
[180., 90., 0.]
"""
return _unary_operation(_ti_core.expr_acos, math.acos, x)
@unary
def sqrt(x):
"""Return the non-negative square-root of a scalar or a matrix,
    element-wise. If `x < 0` an exception is raised.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The scalar or matrix whose square-roots are required.
Returns:
The square-root `y` so that `y >= 0` and `y^2 = x`. `y` has the same type as `x`.
Example::
>>> x = ti.Matrix([1., 4., 9.])
>>> y = ti.sqrt(x)
>>> y
[1.0, 2.0, 3.0]
"""
return _unary_operation(_ti_core.expr_sqrt, math.sqrt, x)
@unary
def rsqrt(x):
"""The reciprocal of the square root function.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
A scalar or a matrix.
Returns:
The reciprocal of `sqrt(x)`.
"""
def _rsqrt(x):
return 1 / math.sqrt(x)
return _unary_operation(_ti_core.expr_rsqrt, _rsqrt, x)
@unary
def round(x): # pylint: disable=redefined-builtin
"""Round to the nearest integer, element-wise.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
A scalar or a matrix.
Returns:
The nearest integer of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([-1.5, 1.2, 2.7])
>>> print(ti.round(x))
[-2., 1., 3.]
"""
return _unary_operation(_ti_core.expr_round, builtins.round, x)
@unary
def floor(x):
"""Return the floor of the input, element-wise.
The floor of the scalar `x` is the largest integer `k`, such that `k <= x`.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
The floor of each element in `x`, with float type.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([3.14, -1.5])
>>> y = ti.floor(x)
>>> print(y) # [3.0, -2.0]
"""
return _unary_operation(_ti_core.expr_floor, math.floor, x)
@unary
def ceil(x):
"""Return the ceiling of the input, element-wise.
The ceil of the scalar `x` is the smallest integer `k`, such that `k >= x`.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
The ceiling of each element in `x`, with float dtype.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([3.14, -1.5])
>>> y = ti.ceil(x)
>>> print(y) # [4.0, -1.0]
"""
return _unary_operation(_ti_core.expr_ceil, math.ceil, x)
@unary
def tan(x):
"""Trigonometric tangent function, element-wise.
Equivalent to `ti.sin(x)/ti.cos(x)` element-wise.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
The tangent values of `x`.
Example::
>>> from math import pi
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([-pi, pi/2, pi])
>>> y = ti.tan(x)
>>> print(y)
>>>
>>> test()
[-0.0, -22877334.0, 0.0]
"""
return _unary_operation(_ti_core.expr_tan, math.tan, x)
@unary
def tanh(x):
"""Compute the hyperbolic tangent of `x`, element-wise.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
The corresponding hyperbolic tangent values.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([-1.0, 0.0, 1.0])
>>> y = ti.tanh(x)
>>> print(y)
>>>
>>> test()
[-0.761594, 0.000000, 0.761594]
"""
return _unary_operation(_ti_core.expr_tanh, math.tanh, x)
@unary
def exp(x):
"""Compute the exponential of all elements in `x`, element-wise.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
Element-wise exponential of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([-1.0, 0.0, 1.0])
>>> y = ti.exp(x)
>>> print(y)
>>>
>>> test()
[0.367879, 1.000000, 2.718282]
"""
return _unary_operation(_ti_core.expr_exp, math.exp, x)
@unary
def log(x):
"""Compute the natural logarithm, element-wise.
The natural logarithm `log` is the inverse of the exponential function,
so that `log(exp(x)) = x`. The natural logarithm is logarithm in base `e`.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
The natural logarithm of `x`, element-wise.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([-1.0, 0.0, 1.0])
>>> y = ti.log(x)
>>> print(y)
>>>
>>> test()
[-nan, -inf, 0.000000]
"""
return _unary_operation(_ti_core.expr_log, math.log, x)
@unary
def abs(x): # pylint: disable=W0622
"""Compute the absolute value :math:`|x|` of `x`, element-wise.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input scalar or matrix.
Returns:
The absolute value of each element in `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([-1.0, 0.0, 1.0])
>>> y = ti.abs(x)
>>> print(y)
>>>
>>> test()
[1.0, 0.0, 1.0]
"""
return _unary_operation(_ti_core.expr_abs, builtins.abs, x)
@unary
def bit_not(a):
"""The bit not function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
Returns:
Bitwise not of `a`.
"""
return _unary_operation(_ti_core.expr_bit_not, _bt_ops_mod.invert, a)
@unary
def logical_not(a):
"""The logical not function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
Returns:
`1` iff `a=0`, otherwise `0`.
"""
return _unary_operation(_ti_core.expr_logic_not, lambda x: int(not x), a)
def random(dtype=float):
"""Return a single random float/integer according to the specified data type.
Must be called in taichi scope.
If the required `dtype` is float type, this function returns a random number
sampled from the uniform distribution in the half-open interval [0, 1).
For integer types this function returns a random integer in the
half-open interval [0, 2^32) if a 32-bit integer is required,
or a random integer in the half-open interval [0, 2^64) if a
64-bit integer is required.
Args:
dtype (:mod:`~taichi.types.primitive_types`): Type of the required random value.
Returns:
A random value with type `dtype`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.random(float)
>>> print(x) # 0.090257
>>>
>>> y = ti.random(ti.f64)
>>> print(y) # 0.716101627301
>>>
>>> i = ti.random(ti.i32)
>>> print(i) # -963722261
>>>
>>> j = ti.random(ti.i64)
>>> print(j) # 73412986184350777
"""
dtype = cook_dtype(dtype)
x = expr.Expr(_ti_core.make_rand_expr(dtype))
return impl.expr_init(x)
# NEXT: add matpow(self, power)
@binary
def add(a, b):
"""The add function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
Returns:
sum of `a` and `b`.
"""
return _binary_operation(_ti_core.expr_add, _bt_ops_mod.add, a, b)
@binary
def sub(a, b):
"""The sub function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
Returns:
`a` subtract `b`.
"""
return _binary_operation(_ti_core.expr_sub, _bt_ops_mod.sub, a, b)
@binary
def mul(a, b):
"""The multiply function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
Returns:
`a` multiplied by `b`.
"""
return _binary_operation(_ti_core.expr_mul, _bt_ops_mod.mul, a, b)
@binary
def mod(x1, x2):
"""Returns the element-wise remainder of division.
This is equivalent to the Python modulus operator `x1 % x2` and
has the same sign as the divisor x2.
Args:
x1 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Dividend scalar or matrix.
x2 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Divisor scalar or matrix. When both `x1` and `x2` are matrices they must have the same shape.
Returns:
The element-wise remainder of the quotient `floordiv(x1, x2)`. This is a scalar \
if both `x1` and `x2` are scalars.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([3.0, 4.0, 5.0])
>>> y = 3
>>> z = ti.mod(y, x)
>>> print(z)
>>>
>>> test()
[1.0, 0.0, 4.0]
"""
def expr_python_mod(a, b):
# a % b = a - (a // b) * b
quotient = expr.Expr(_ti_core.expr_floordiv(a, b))
multiply = expr.Expr(_ti_core.expr_mul(b, quotient.ptr))
return _ti_core.expr_sub(a, multiply.ptr)
return _binary_operation(expr_python_mod, _bt_ops_mod.mod, x1, x2)
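The `expr_python_mod` fallback above builds Python-style floor-mod from the identity `a % b = a - (a // b) * b`, where the result takes the sign of the divisor. The identity is easy to check on the host side, including for negative operands:

```python
def py_mod(a, b):
    # Python-style modulus: result takes the sign of the divisor b.
    return a - (a // b) * b

for a, b in [(7, 3), (-7, 3), (7, -3), (-7, -3)]:
    assert py_mod(a, b) == a % b

print(py_mod(-7, 3))  # 2
print(py_mod(7, -3))  # -2
```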
@binary
def pow(x, a): # pylint: disable=W0622
"""First array elements raised to powers from second array :math:`x^a`, element-wise.
    A negative base raised to a non-integral exponent returns `nan`.
    A zero base raised to a negative exponent returns `inf`.
Args:
x (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The bases.
a (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The exponents.
Returns:
        The bases in `x` raised to the exponents in `a`. This is a scalar if both \
        `x` and `a` are scalars.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([-2.0, 0.0, 2.0])
>>> y = -2.2
>>> z = ti.pow(x, y)
>>> print(z)
>>>
>>> test()
[-nan, inf, 0.217638]
"""
return _binary_operation(_ti_core.expr_pow, _bt_ops_mod.pow, x, a)
@binary
def floordiv(a, b):
"""The floor division function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix with elements not equal to zero.
Returns:
The floor function of `a` divided by `b`.
"""
return _binary_operation(_ti_core.expr_floordiv, _bt_ops_mod.floordiv, a,
b)
@binary
def truediv(a, b):
"""True division function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix with elements not equal to zero.
Returns:
The true value of `a` divided by `b`.
"""
return _binary_operation(_ti_core.expr_truediv, _bt_ops_mod.truediv, a, b)
@binary
def max_impl(a, b):
"""The maxnimum function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
Returns:
        The maximum of `a` and `b`.
"""
return _binary_operation(_ti_core.expr_max, builtins.max, a, b)
@binary
def min_impl(a, b):
"""The minimum function.
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): A number or a matrix.
Returns:
The minimum of `a` and `b`.
"""
return _binary_operation(_ti_core.expr_min, builtins.min, a, b)
@binary
def atan2(x1, x2):
"""Element-wise arc tangent of `x1/x2`.
Args:
x1 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
y-coordinates.
x2 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
x-coordinates.
Returns:
Angles in radians, in the range `[-pi, pi]`.
This is a scalar if both `x1` and `x2` are scalars.
Example::
>>> from math import pi
>>> @ti.kernel
>>> def test():
>>> x = ti.Matrix([-1.0, 1.0, -1.0, 1.0])
>>> y = ti.Matrix([-1.0, -1.0, 1.0, 1.0])
>>> z = ti.atan2(y, x) * 180 / pi
>>> print(z)
>>>
>>> test()
[-135.0, -45.0, 135.0, 45.0]
"""
return _binary_operation(_ti_core.expr_atan2, math.atan2, x1, x2)
@binary
def raw_div(x1, x2):
"""Return `x1 // x2` if both `x1`, `x2` are integers, otherwise return `x1/x2`.
Args:
x1 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): Dividend.
x2 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): Divisor.
Returns:
Return `x1 // x2` if both `x1`, `x2` are integers, otherwise return `x1/x2`.
Example::
>>> @ti.kernel
>>> def main():
>>> x = 5
>>> y = 3
>>> print(raw_div(x, y)) # 1
>>> z = 4.0
>>> print(raw_div(x, z)) # 1.25
"""
def c_div(a, b):
if isinstance(a, int) and isinstance(b, int):
return a // b
return a / b
return _binary_operation(_ti_core.expr_div, c_div, x1, x2)
@binary
def raw_mod(x1, x2):
"""Return the remainder of `x1/x2`, element-wise.
This is the C-style `mod` function.
Args:
x1 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The dividend.
x2 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The divisor.
Returns:
The remainder of `x1` divided by `x2`.
Example::
>>> @ti.kernel
>>> def main():
>>> print(ti.mod(-4, 3)) # 2
>>> print(ti.raw_mod(-4, 3)) # -1
"""
def c_mod(x, y):
return x - y * int(float(x) / y)
return _binary_operation(_ti_core.expr_mod, c_mod, x1, x2)
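The `c_mod` fallback truncates the quotient toward zero, so its result takes the sign of the dividend, unlike Python's `%` (whose result takes the sign of the divisor). Comparing the two on the docstring's own example makes the difference concrete:

```python
def c_mod(x, y):
    # C-style remainder: truncate the quotient toward zero.
    return x - y * int(float(x) / y)

print(c_mod(-4, 3))  # -1  (C semantics, as raw_mod above)
print(-4 % 3)        # 2   (Python semantics, as ti.mod)
```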
@binary
def cmp_lt(a, b):
"""Compare two values (less than)
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: True if LHS is strictly smaller than RHS, False otherwise
"""
return _binary_operation(_ti_core.expr_cmp_lt, lambda a, b: -int(a < b), a,
b)
@binary
def cmp_le(a, b):
"""Compare two values (less than or equal to)
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: True if LHS is smaller than or equal to RHS, False otherwise
"""
return _binary_operation(_ti_core.expr_cmp_le, lambda a, b: -int(a <= b),
a, b)
@binary
def cmp_gt(a, b):
"""Compare two values (greater than)
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: True if LHS is strictly larger than RHS, False otherwise
"""
return _binary_operation(_ti_core.expr_cmp_gt, lambda a, b: -int(a > b), a,
b)
@binary
def cmp_ge(a, b):
"""Compare two values (greater than or equal to)
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
bool: True if LHS is greater than or equal to RHS, False otherwise
"""
return _binary_operation(_ti_core.expr_cmp_ge, lambda a, b: -int(a >= b),
a, b)
@binary
def cmp_eq(a, b):
"""Compare two values (equal to)
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: True if LHS is equal to RHS, False otherwise.
"""
return _binary_operation(_ti_core.expr_cmp_eq, lambda a, b: -int(a == b),
a, b)
@binary
def cmp_ne(a, b):
"""Compare two values (not equal to)
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: True if LHS is not equal to RHS, False otherwise
"""
return _binary_operation(_ti_core.expr_cmp_ne, lambda a, b: -int(a != b),
a, b)
@binary
def bit_or(a, b):
"""Computes bitwise-or
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: LHS bitwise-or with RHS
"""
return _binary_operation(_ti_core.expr_bit_or, _bt_ops_mod.or_, a, b)
@binary
def bit_and(a, b):
"""Compute bitwise-and
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: LHS bitwise-and with RHS
"""
return _binary_operation(_ti_core.expr_bit_and, _bt_ops_mod.and_, a, b)
@binary
def bit_xor(a, b):
"""Compute bitwise-xor
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: LHS bitwise-xor with RHS
"""
return _binary_operation(_ti_core.expr_bit_xor, _bt_ops_mod.xor, a, b)
@binary
def bit_shl(a, b):
"""Compute bitwise shift left
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, int]: LHS << RHS
"""
return _binary_operation(_ti_core.expr_bit_shl, _bt_ops_mod.lshift, a, b)
@binary
def bit_sar(a, b):
"""Compute bitwise shift right
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, int]: LHS >> RHS
"""
return _binary_operation(_ti_core.expr_bit_sar, _bt_ops_mod.rshift, a, b)
@taichi_scope
@binary
def bit_shr(x1, x2):
"""Elements in `x1` shifted to the right by number of bits in `x2`.
Both `x1`, `x2` must have integer type.
Args:
x1 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Input data.
x2 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
Number of bits to remove at the right of `x1`.
Returns:
Return `x1` with bits shifted `x2` times to the right.
This is a scalar if both `x1` and `x2` are scalars.
Example::
>>> @ti.kernel
>>> def main():
>>> x = ti.Matrix([7, 8])
>>> y = ti.Matrix([1, 2])
>>> print(ti.bit_shr(x, y))
>>>
>>> main()
[3, 2]
"""
return _binary_operation(_ti_core.expr_bit_shr, _bt_ops_mod.rshift, x1, x2)
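The difference between `bit_sar` (arithmetic right shift) and `bit_shr` (logical right shift) can be illustrated in plain Python for 32-bit values. This is an illustrative sketch only — it does not use Taichi:

```python
MASK32 = 0xFFFFFFFF

def shr32(x, n):
    # Logical right shift: reinterpret x as an unsigned 32-bit value,
    # then shift; vacated high bits are filled with zeros.
    return (x & MASK32) >> n

def sar32(x, n):
    # Arithmetic right shift: Python's >> already replicates the sign
    # bit on negative ints, so it models "sar" directly.
    return x >> n

# -2 is 0xFFFFFFFE as an unsigned 32-bit value.
assert sar32(-2, 1) == -1            # sign bit preserved
assert shr32(-2, 1) == 0x7FFFFFFF    # zeros shifted in from the left
```

For non-negative inputs the two shifts agree, which is why the docstring example `bit_shr([7, 8], [1, 2]) == [3, 2]` matches an ordinary `>>`.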
@binary
def logical_and(a, b):
"""Compute logical_and
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: LHS logical-and RHS (with short-circuit semantics)
"""
return _binary_operation(_ti_core.expr_logical_and, lambda a, b: a and b,
a, b)
@binary
def logical_or(a, b):
"""Compute logical_or
Args:
a (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value LHS
b (Union[:class:`~taichi.lang.expr.Expr`, :class:`~taichi.lang.matrix.Matrix`]): value RHS
Returns:
Union[:class:`~taichi.lang.expr.Expr`, bool]: LHS logical-or RHS (with short-circuit semantics)
"""
return _binary_operation(_ti_core.expr_logical_or, lambda a, b: a or b, a,
b)
@ternary
def select(cond, x1, x2):
"""Return an array drawn from elements in `x1` or `x2`,
depending on the conditions in `cond`.
Args:
cond (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The array of conditions.
x1, x2 (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The arrays where the output elements are taken from.
Returns:
The output at position `k` is the k-th element of `x1` if the k-th element
in `cond` is `True`, otherwise it's the k-th element of `x2`.
Example::
>>> @ti.kernel
>>> def main():
>>> cond = ti.Matrix([0, 1, 0, 1])
>>> x = ti.Matrix([1, 2, 3, 4])
>>> y = ti.Matrix([-1, -2, -3, -4])
>>> print(ti.select(cond, x, y))
>>>
>>> main()
[-1, 2, -3, 4]
"""
# TODO: systematically resolve `-1 = True` problem by introducing u1:
cond = logical_not(logical_not(cond))
def py_select(cond, x1, x2):
return x1 * cond + x2 * (1 - cond)
return _ternary_operation(_ti_core.expr_select, py_select, cond, x1, x2)
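The `py_select` fallback above blends the two inputs arithmetically after normalizing `cond` to 0/1. An element-wise plain-Python version of the same idea (illustrative only, not the Taichi implementation):

```python
def py_select(cond, x1, x2):
    # Per-element blend: picks x1[k] where cond[k] is truthy, else x2[k].
    # bool(c) plays the role of logical_not(logical_not(cond)) above.
    return [a * bool(c) + b * (1 - bool(c)) for c, a, b in zip(cond, x1, x2)]

result = py_select([0, 1, 0, 1], [1, 2, 3, 4], [-1, -2, -3, -4])
assert result == [-1, 2, -3, 4]  # matches the docstring example
```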
@writeback_binary
def atomic_add(x, y):
"""Atomically compute `x + y`, store the result in `x`,
and return the old value of `x`.
`x` must be a writable target; constant expressions or scalars
are not allowed.
Args:
x, y (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The input.
Returns:
The old value of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([0, 0, 0])
>>> y = ti.Vector([1, 2, 3])
>>> z = ti.atomic_add(x, y)
>>> print(x) # [1, 2, 3] the new value of x
>>> print(z) # [0, 0, 0], the old value of x
>>>
>>> ti.atomic_add(1, x) # will raise TaichiSyntaxError
"""
return impl.expr_init(
expr.Expr(_ti_core.expr_atomic_add(x.ptr, y.ptr), tb=stack_info()))
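The "store the result in `x`, return the old value" contract is the classic fetch-and-add. A plain (non-atomic, purely illustrative) Python model of that contract:

```python
def fetch_and_add(store, key, delta):
    # Returns the value held *before* the update, like ti.atomic_add.
    old = store[key]
    store[key] = old + delta
    return old

cells = {"x": 5}
assert fetch_and_add(cells, "x", 3) == 5   # old value comes back
assert cells["x"] == 8                     # new value is stored
```

The same old-value-returning contract holds for all the atomic ops below (`atomic_sub`, `atomic_min`, `atomic_max`, and the bitwise variants).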
@writeback_binary
def atomic_sub(x, y):
"""Atomically subtract `x` by `y`, store the result in `x`,
and return the old value of `x`.
`x` must be a writable target, constant expressions or scalars
are not allowed.
Args:
x, y (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The input.
Returns:
The old value of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([0, 0, 0])
>>> y = ti.Vector([1, 2, 3])
>>> z = ti.atomic_sub(x, y)
>>> print(x) # [-1, -2, -3] the new value of x
>>> print(z) # [0, 0, 0], the old value of x
>>>
>>> ti.atomic_sub(1, x) # will raise TaichiSyntaxError
"""
return impl.expr_init(
expr.Expr(_ti_core.expr_atomic_sub(x.ptr, y.ptr), tb=stack_info()))
@writeback_binary
def atomic_min(x, y):
"""Atomically compute the minimum of `x` and `y`, element-wise.
Store the result in `x`, and return the old value of `x`.
`x` must be a writable target; constant expressions or scalars
are not allowed.
Args:
x, y (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The input.
Returns:
The old value of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = 2
>>> y = 1
>>> z = ti.atomic_min(x, y)
>>> print(x) # 1 the new value of x
>>> print(z) # 2, the old value of x
>>>
>>> ti.atomic_min(1, x) # will raise TaichiSyntaxError
"""
return impl.expr_init(
expr.Expr(_ti_core.expr_atomic_min(x.ptr, y.ptr), tb=stack_info()))
@writeback_binary
def atomic_max(x, y):
"""Atomically compute the maximum of `x` and `y`, element-wise.
Store the result in `x`, and return the old value of `x`.
`x` must be a writable target; constant expressions or scalars
are not allowed.
Args:
x, y (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The input.
Returns:
The old value of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = 1
>>> y = 2
>>> z = ti.atomic_max(x, y)
>>> print(x) # 2 the new value of x
>>> print(z) # 1, the old value of x
>>>
>>> ti.atomic_max(1, x) # will raise TaichiSyntaxError
"""
return impl.expr_init(
expr.Expr(_ti_core.expr_atomic_max(x.ptr, y.ptr), tb=stack_info()))
@writeback_binary
def atomic_and(x, y):
"""Atomically compute the bit-wise AND of `x` and `y`, element-wise.
Store the result in `x`, and return the old value of `x`.
`x` must be a writable target; constant expressions or scalars
are not allowed.
Args:
x, y (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The input. When both are matrices they must have the same shape.
Returns:
The old value of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([-1, 0, 1])
>>> y = ti.Vector([1, 2, 3])
>>> z = ti.atomic_and(x, y)
>>> print(x) # [1, 0, 1] the new value of x
>>> print(z) # [-1, 0, 1], the old value of x
>>>
>>> ti.atomic_and(1, x) # will raise TaichiSyntaxError
"""
return impl.expr_init(
expr.Expr(_ti_core.expr_atomic_bit_and(x.ptr, y.ptr), tb=stack_info()))
@writeback_binary
def atomic_or(x, y):
"""Atomically compute the bit-wise OR of `x` and `y`, element-wise.
Store the result in `x`, and return the old value of `x`.
`x` must be a writable target; constant expressions or scalars
are not allowed.
Args:
x, y (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The input. When both are matrices they must have the same shape.
Returns:
The old value of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([-1, 0, 1])
>>> y = ti.Vector([1, 2, 3])
>>> z = ti.atomic_or(x, y)
>>> print(x) # [-1, 2, 3] the new value of x
>>> print(z) # [-1, 0, 1], the old value of x
>>>
>>> ti.atomic_or(1, x) # will raise TaichiSyntaxError
"""
return impl.expr_init(
expr.Expr(_ti_core.expr_atomic_bit_or(x.ptr, y.ptr), tb=stack_info()))
@writeback_binary
def atomic_xor(x, y):
"""Atomically compute the bit-wise XOR of `x` and `y`, element-wise.
Store the result in `x`, and return the old value of `x`.
`x` must be a writable target; constant expressions or scalars
are not allowed.
Args:
x, y (Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]): \
The input. When both are matrices they must have the same shape.
Returns:
The old value of `x`.
Example::
>>> @ti.kernel
>>> def test():
>>> x = ti.Vector([-1, 0, 1])
>>> y = ti.Vector([1, 2, 3])
>>> z = ti.atomic_xor(x, y)
>>> print(x) # [-2, 2, 2] the new value of x
>>> print(z) # [-1, 0, 1], the old value of x
>>>
>>> ti.atomic_xor(1, x) # will raise TaichiSyntaxError
"""
return impl.expr_init(
expr.Expr(_ti_core.expr_atomic_bit_xor(x.ptr, y.ptr), tb=stack_info()))
@writeback_binary
def assign(a, b):
impl.get_runtime().prog.current_ast_builder().expr_assign(
a.ptr, b.ptr, stack_info())
return a
def max(*args): # pylint: disable=W0622
"""Compute the maximum of the arguments, element-wise.
This function has no effect on a single argument, even if it is array-like.
When there are both scalar and matrix arguments in `args`, the matrices
must have the same shape, and scalars are broadcast to that shape.
Args:
args (List[Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]]): \
The input.
Returns:
Maximum of the inputs.
Example::
>>> @ti.kernel
>>> def foo():
>>> x = ti.Vector([0, 1, 2])
>>> y = ti.Vector([3, 4, 5])
>>> z = ti.max(x, y, 4)
>>> print(z) # [4, 4, 5]
"""
num_args = len(args)
assert num_args >= 1
if num_args == 1:
return args[0]
if num_args == 2:
return max_impl(args[0], args[1])
return max_impl(args[0], max(*args[1:]))
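`max` reduces its arguments pairwise via recursion. The same variadic pattern in plain Python, using an element-wise pairwise maximum as the binary step (illustrative only — the real `max_impl` handles Taichi expressions and scalar broadcasting, which this sketch omits):

```python
def pairwise_max(a, b):
    # Element-wise max of two equal-length sequences.
    return [x if x > y else y for x, y in zip(a, b)]

def variadic_max(*args):
    # Same reduction shape as ti.max: one arg passes through,
    # more args fold right-to-left through the binary step.
    assert len(args) >= 1
    if len(args) == 1:
        return args[0]
    return pairwise_max(args[0], variadic_max(*args[1:]))

assert variadic_max([0, 1, 2], [3, 4, 5], [4, 4, 4]) == [4, 4, 5]
```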
def min(*args): # pylint: disable=W0622
"""Compute the minimum of the arguments, element-wise.
This function has no effect on a single argument, even if it is array-like.
When there are both scalar and matrix arguments in `args`, the matrices
must have the same shape, and scalars are broadcast to that shape.
Args:
args (List[Union[:mod:`~taichi.types.primitive_types`, :class:`~taichi.Matrix`]]): \
The input.
Returns:
Minimum of the inputs.
Example::
>>> @ti.kernel
>>> def foo():
>>> x = ti.Vector([0, 1, 2])
>>> y = ti.Vector([3, 4, 5])
>>> z = ti.min(x, y, 1)
>>> print(z) # [0, 1, 1]
"""
num_args = len(args)
assert num_args >= 1
if num_args == 1:
return args[0]
if num_args == 2:
return min_impl(args[0], args[1])
return min_impl(args[0], min(*args[1:]))
def ti_any(a):
return a.any()
def ti_all(a):
return a.all()
__all__ = [
"acos", "asin", "atan2", "atomic_and", "atomic_or", "atomic_xor",
"atomic_max", "atomic_sub", "atomic_min", "atomic_add", "bit_cast",
"bit_shr", "cast", "ceil", "cos", "exp", "floor", "log", "random",
"raw_mod", "raw_div", "round", "rsqrt", "sin", "sqrt", "tan", "tanh",
"max", "min", "select", "abs", "pow"
]
46946fae7eff0eee78a1b2337b661d1032f438bd | 466 | py | Python | app/blueprints/config.py | twotwo/flask-modern-dev | 054037a60bb0d4f85aa8d07ea32b007b1d26637d | [
"MIT"
] | null | null | null | app/blueprints/config.py | twotwo/flask-modern-dev | 054037a60bb0d4f85aa8d07ea32b007b1d26637d | [
"MIT"
] | null | null | null | app/blueprints/config.py | twotwo/flask-modern-dev | 054037a60bb0d4f85aa8d07ea32b007b1d26637d | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from flask import Blueprint, render_template
# from flask_login import current_user
from app.services import ConfigInfo, JobHandler
config_bp = Blueprint("config", __name__)
@config_bp.route("/", methods=["GET"])
def index():
jobs = JobHandler.get_all()
return render_template("config/index.html", jobs=jobs, config=ConfigInfo.get_all())
@config_bp.route("/about")
def about():
return render_template("config/about.html")
# --- nicos_mlz/toftof/setups/startup.py (jkrueger1/nicos, CC-BY-3.0/Apache-2.0/CC-BY-4.0) ---
description = 'minimal NICOS startup setup'
group = 'lowlevel'
sysconfig = dict(
cache = 'tofhw.toftof.frm2:14869',
)
# --- pegasus/dados/schemas/sql_CNES_SR_postgreSQL.py (SecexSaudeTCU/PegaSUS, MIT) ---
###########################################################################################################################################################################
# CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR CNES_SR #
###########################################################################################################################################################################
import psycopg2
"""
Creates the schema of the CNES_SR (Serviços Especializados — Specialized Services) sub-database in the format expected by the PostgreSQL DBMS.
The schema below should, in fact, also fit the Oracle and SQL Server DBMSs with at most minor modifications.
"""
# Function that creates the tables of the cnes_sr sub-database
def create_tables(connection_data, child_db):
conn = psycopg2.connect(dbname=connection_data[0],
host=connection_data[1],
port=connection_data[2],
user=connection_data[3],
password=connection_data[4])
cursor = conn.cursor()
cursor.execute(f'''/*Specialized Services table (main table of the cnes_sr sub-database)*/
CREATE TABLE IF NOT EXISTS {child_db}.srbr("CNES_ID" VARCHAR(7),
"UF_SR" VARCHAR(2),
"ANO_SR" INTEGER,
"MES_SR" VARCHAR(2),
"CODUFMUN_ID" VARCHAR(6),
"SERVESP_ID" VARCHAR(3),
"CLASSSR_ID" VARCHAR(6),
"SRVUNICO_ID" VARCHAR(3),
"TPGESTAO_ID" VARCHAR(2),
"PFPJ_ID" VARCHAR(2),
"CPF_CNPJ" VARCHAR(14),
"NIVDEP_ID" VARCHAR(2),
"ESFERAA_ID" VARCHAR(2),
"ATIVIDAD_ID" VARCHAR(2),
"RETENCAO_ID" VARCHAR(2),
"NATUREZA_ID" VARCHAR(2),
"CLIENTEL_ID" VARCHAR(2),
"TPUNID_ID" VARCHAR(2),
"TURNOAT_ID" VARCHAR(2),
"NIVHIER_ID" VARCHAR(2),
"TERCEIRO" NUMERIC,
"CNPJ_MAN" VARCHAR(14),
"CARACTER_ID" VARCHAR(2),
"AMB_NSUS" NUMERIC,
"AMB_SUS" NUMERIC,
"HOSP_NSUS" NUMERIC,
"HOSP_SUS" NUMERIC,
"CONTSRVU" NUMERIC,
"CNESTERC" VARCHAR(7),
"NATJUR_ID" VARCHAR(4));
/*Table of health facilities*/
CREATE TABLE IF NOT EXISTS {child_db}.cnes("ID" VARCHAR(7),
"DESCESTAB" VARCHAR(66),
"RSOC_MAN" VARCHAR(66),
"CPF_CNPJ" VARCHAR(14),
"EXCLUIDO" NUMERIC,
"DATAINCL" DATE,
"DATAEXCL" DATE);
/*Table of the municipalities where facilities are located*/
CREATE TABLE IF NOT EXISTS {child_db}.codufmun("ID" VARCHAR(6),
"MUNNOME" VARCHAR(66),
"MUNNOMEX" VARCHAR(66),
"MUNCODDV" VARCHAR(7),
"OBSERV" VARCHAR(66),
"SITUACAO" VARCHAR(66),
"MUNSINP" VARCHAR(66),
"MUNSIAFI" VARCHAR(66),
"UFCOD_ID" VARCHAR(2),
"AMAZONIA" VARCHAR(66),
"FRONTEIRA" VARCHAR(66),
"CAPITAL" VARCHAR(66),
"RSAUDE_ID" VARCHAR(5),
"LATITUDE" FLOAT,
"LONGITUDE" FLOAT,
"ALTITUDE" FLOAT,
"AREA" FLOAT,
"ANOINST" VARCHAR(66),
"ANOEXT" VARCHAR(66),
"SUCESSOR" VARCHAR(66));
/*Table of specialized services*/
CREATE TABLE IF NOT EXISTS {child_db}.servesp("ID" VARCHAR(3),
"DESCRICAO" VARCHAR(100));
/*Table of classifications of specialized services*/
CREATE TABLE IF NOT EXISTS {child_db}.classsr("ID" VARCHAR(6),
"DESCRICAO" VARCHAR(100));
/*Actually the same thing as the servesp table*/
CREATE TABLE IF NOT EXISTS {child_db}.srvunico("ID" VARCHAR(3),
"DESCRICAO" VARCHAR(100));
/*Table of management types*/
CREATE TABLE IF NOT EXISTS {child_db}.tpgestao("ID" VARCHAR(2),
"GESTAO" VARCHAR(66));
/*Table indicating whether the facility is a natural or legal person*/
CREATE TABLE IF NOT EXISTS {child_db}.pfpj("ID" VARCHAR(2),
"PESSOA" VARCHAR(66));
/*Table of the facility's degree of independence*/
CREATE TABLE IF NOT EXISTS {child_db}.nivdep("ID" VARCHAR(2),
"TIPO" VARCHAR(66));
/*Table of administration types*/
CREATE TABLE IF NOT EXISTS {child_db}.esferaa("ID" VARCHAR(2),
"ADMINISTRACAO" VARCHAR(66));
/*Table of teaching activity types, if any*/
CREATE TABLE IF NOT EXISTS {child_db}.atividad("ID" VARCHAR(2),
"ATIVIDADE" VARCHAR(66));
/*Table of the facility's tax withholding types*/
CREATE TABLE IF NOT EXISTS {child_db}.retencao("ID" VARCHAR(2),
"RETENCAO" VARCHAR(66));
/*Table of the facility's nature*/
CREATE TABLE IF NOT EXISTS {child_db}.natureza("ID" VARCHAR(2),
"NATUREZA" VARCHAR(66));
/*Table of patient flow types*/
CREATE TABLE IF NOT EXISTS {child_db}.clientel("ID" VARCHAR(2),
"CLIENTELA" VARCHAR(66));
/*Table of facility types*/
CREATE TABLE IF NOT EXISTS {child_db}.tpunid("ID" VARCHAR(2),
"TIPO" VARCHAR(66));
/*Table of the facility's operating shifts*/
CREATE TABLE IF NOT EXISTS {child_db}.turnoat("ID" VARCHAR(2),
"TURNO" VARCHAR(66));
/*Table of the facility's care levels*/
CREATE TABLE IF NOT EXISTS {child_db}.nivhier("ID" VARCHAR(2),
"NIVEL" VARCHAR(66));
/*Table of facility characterizations*/
CREATE TABLE IF NOT EXISTS {child_db}.caracter("ID" VARCHAR(2),
"CARACTERIZACAO" VARCHAR(66));
/*Table of facilities' legal natures*/
CREATE TABLE IF NOT EXISTS {child_db}.natjur("ID" VARCHAR(4),
"NATUREZA" VARCHAR(100));
/*Table of RFB states*/
CREATE TABLE IF NOT EXISTS {child_db}.ufcod("ID" VARCHAR(2),
"ESTADO" VARCHAR(66),
"SIGLA_UF" VARCHAR(66));
/*Table of IBGE health regions*/
CREATE TABLE IF NOT EXISTS {child_db}.rsaude("ID" VARCHAR(5),
"REGIAO" VARCHAR(66));
/*Table of files*/
CREATE TABLE IF NOT EXISTS {child_db}.arquivos("NOME" VARCHAR(15),
"DIRETORIO" VARCHAR(66),
"DATA_INSERCAO_FTP" DATE,
"DATA_HORA_CARGA" TIMESTAMP,
"QTD_REGISTROS" INTEGER);
''')
conn.commit()
# Close the cursor
cursor.close()
# Close the connection
conn.close()
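`create_tables` unpacks `connection_data` positionally into `psycopg2.connect`. A small sketch of an equivalent, more self-documenting mapping — the field order here is the one assumed by the function above:

```python
def connection_kwargs(connection_data):
    # Order assumed by create_tables: dbname, host, port, user, password.
    keys = ("dbname", "host", "port", "user", "password")
    return dict(zip(keys, connection_data))

kwargs = connection_kwargs(["cnes", "localhost", 5432, "alice", "secret"])
assert kwargs == {"dbname": "cnes", "host": "localhost", "port": 5432,
                  "user": "alice", "password": "secret"}
# psycopg2.connect(**kwargs) would then be equivalent to the positional
# unpacking in create_tables.
```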
# --- bdda/generate_fake_gazemaps.py (hofbi/mv-roi, MIT) ---
"""Generate fake gazemaps required for prediction for a directory of images"""
import sys
from pathlib import Path
import PIL.Image
from tqdm import tqdm
try:
sys.path.append(str(Path(__file__).absolute().parent.parent))
except IndexError:
pass
from bdda.prepare import create_gazemap_from_shapes
from util import config
from util.args import ArgumentParserFactory
from util.files import get_files_with_suffix
def generate_fake_gazemaps(image_file, output_dir):
"""
Generate fake gazemap for image file
:param image_file:
:param output_dir:
:return:
"""
size = PIL.Image.open(image_file).size
fake_gazemap = create_gazemap_from_shapes([], size)
gazemap_path = Path(output_dir).joinpath(image_file.name)
fake_gazemap.save(gazemap_path)
def parse_arguments():
"""
Parse command line arguments
:return:
"""
factory = ArgumentParserFactory(__doc__)
factory.add_input_dir_argument("Path to the image files.")
factory.add_output_dir_argument(
"Path where the fake gazemaps will be put.", Path(__file__).parent
)
return factory.parser.parse_args()
def main():
"""main"""
args = parse_arguments()
Path(args.output_dir).mkdir(parents=True, exist_ok=True)
image_files = get_files_with_suffix(args.input_dir, config.BDDA_IMAGE_SUFFIX)
for image_file in tqdm(image_files, desc="Generating fake gazemaps..."):
generate_fake_gazemaps(image_file, args.output_dir)
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:
pass
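`generate_fake_gazemaps` keeps the input file's name but swaps its directory via `Path.joinpath`. That path logic in isolation, using only the standard library:

```python
from pathlib import Path

def gazemap_path(image_file, output_dir):
    # Same file name as the source image, placed under output_dir,
    # mirroring Path(output_dir).joinpath(image_file.name) above.
    return Path(output_dir).joinpath(Path(image_file).name)

assert gazemap_path("frames/0001.jpg", "out") == Path("out") / "0001.jpg"
```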
# --- cotizacion_mep/mep_calculator.py (dantebarba/cotizacion-mep, MIT) ---
import cotizacion_mep.mongodb_mep_strategy as mongodb_mep_strategy
class MEPCalculator():
def __init__(self, api):
self._api = api
def calculate(self, bonds_list, min_volume=1, min_operations=1, min_volume_usd=1, min_operations_usd=1):
mep_value_pairs = []
for bond in bonds_list:
mep_value_pairs.append(self._api.get_bonds_pair(bond, bond+"D"))
return MEPCalculator.filter_volume(MEPCalculator.lowest_rate(mep_value_pairs), min_volume, min_operations, min_volume_usd, min_operations_usd)
@staticmethod
def filter_volume(mep_value_pairs, min_volume=1, min_operations=1, min_volume_usd=1, min_operations_usd=1):
return list(filter(lambda mep_value: (mep_value.bond_ars().volume > min_volume and mep_value.bond_usd().volume > min_volume_usd and mep_value.bond_ars().cantidad_operaciones > min_operations and mep_value.bond_usd().cantidad_operaciones > min_operations_usd), mep_value_pairs))
@staticmethod
def lowest_rate(mep_value_pairs):
mep_value_pairs.sort(key=lambda pair: pair.mep_value())
return mep_value_pairs
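`lowest_rate` orders the pairs by their MEP value. With Python 3's key-based sorting, a stub pair object is enough to see the ordering — an illustrative stand-in for the real bond-pair objects, not the actual API:

```python
class StubPair:
    """Minimal stand-in exposing only the mep_value() accessor used for sorting."""
    def __init__(self, value):
        self._value = value

    def mep_value(self):
        return self._value

pairs = [StubPair(182.5), StubPair(179.9), StubPair(181.0)]
pairs.sort(key=lambda pair: pair.mep_value())
assert [p.mep_value() for p in pairs] == [179.9, 181.0, 182.5]
```

Note that `list.sort` took a `cmp` argument only in Python 2; in Python 3 a `key` function is the supported form.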
# --- tests/test_series.py (dmayilyan/dominoes, MIT) ---
import collections
import dominoes
import unittest
class TestSeries(unittest.TestCase):
def test_init(self):
s1 = dominoes.Series()
self.assertEqual(len(s1.games), 1)
self.assertEqual(len(s1.games[0].board), 1)
self.assertEqual(s1.games[0].board.left_end(), 6)
self.assertEqual(s1.games[0].board.right_end(), 6)
hand_lengths1 = collections.Counter(len(h) for h in s1.games[0].hands)
self.assertEqual(hand_lengths1[6], 1)
self.assertEqual(hand_lengths1[7], 3)
self.assertTrue(s1.games[0].turn in range(4))
self.assertTrue(bool(s1.games[0].valid_moves))
self.assertIsNone(s1.games[0].result)
self.assertEqual(s1.scores, [0, 0])
self.assertEqual(s1.target_score, 200)
s2 = dominoes.Series(target_score=100)
self.assertEqual(len(s2.games), 1)
self.assertEqual(len(s2.games[0].board), 1)
self.assertEqual(s2.games[0].board.left_end(), 6)
self.assertEqual(s2.games[0].board.right_end(), 6)
hand_lengths2 = collections.Counter(len(h) for h in s2.games[0].hands)
self.assertEqual(hand_lengths2[6], 1)
self.assertEqual(hand_lengths2[7], 3)
self.assertTrue(s2.games[0].turn in range(4))
self.assertTrue(bool(s2.games[0].valid_moves))
self.assertIsNone(s2.games[0].result)
self.assertEqual(s2.scores, [0, 0])
self.assertEqual(s2.target_score, 100)
d = dominoes.Domino(1, 2)
s3 = dominoes.Series(starting_domino=d)
self.assertEqual(len(s3.games), 1)
self.assertEqual(len(s3.games[0].board), 1)
self.assertEqual(s3.games[0].board.left_end(), 1)
self.assertEqual(s3.games[0].board.right_end(), 2)
hand_lengths3 = collections.Counter(len(h) for h in s3.games[0].hands)
self.assertEqual(hand_lengths3[6], 1)
self.assertEqual(hand_lengths3[7], 3)
self.assertTrue(s3.games[0].turn in range(4))
self.assertTrue(bool(s3.games[0].valid_moves))
self.assertIsNone(s3.games[0].result)
self.assertEqual(s3.scores, [0, 0])
self.assertEqual(s3.target_score, 200)
def test_is_over(self):
s = dominoes.Series()
self.assertFalse(s.is_over())
s.scores = [199, 199]
self.assertFalse(s.is_over())
s.scores = [200, 199]
self.assertTrue(s.is_over())
s.scores = [199, 200]
self.assertTrue(s.is_over())
s.scores = [200, 200]
self.assertTrue(s.is_over())
s.scores = [201, 201]
self.assertTrue(s.is_over())
def test_next_game(self):
s = dominoes.Series()
str1 = str(s)
repr1 = repr(s)
self.assertRaises(dominoes.GameInProgressException, s.next_game)
self.assertEqual(len(s.games), 1)
self.assertEqual(len(s.games[0].board), 1)
self.assertTrue(bool(s.games[0].valid_moves))
self.assertIsNone(s.games[0].result)
self.assertEqual(s.scores, [0, 0])
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str1)
self.assertTrue('Team 0 has 0 points.' in str1)
self.assertTrue('Team 1 has 0 points.' in str1)
self.assertEqual(str1, repr1)
scores = [200, 200]
s.scores = scores
str2 = str(s)
repr2 = repr(s)
self.assertRaises(dominoes.SeriesOverException, s.next_game)
self.assertEqual(len(s.games), 1)
self.assertEqual(len(s.games[0].board), 1)
self.assertTrue(bool(s.games[0].valid_moves))
self.assertIsNone(s.games[0].result)
self.assertEqual(s.scores, scores)
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str2)
self.assertTrue('Team 0 has 200 points.' in str2)
self.assertTrue('Team 1 has 200 points.' in str2)
self.assertEqual(str2, repr2)
s.scores = [0, 0]
s.games[0].result = dominoes.Result(0, True, 50)
g1 = s.next_game()
str3 = str(s)
repr3 = repr(s)
self.assertEqual(len(s.games), 2)
self.assertEqual(len(g1.board), 0)
self.assertEqual(g1.turn, 0)
self.assertTrue(bool(g1.valid_moves))
self.assertIsNone(g1.result)
self.assertEqual(s.scores, [50, 0])
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str3)
self.assertTrue('Team 0 has 50 points.' in str3)
self.assertTrue('Team 1 has 0 points.' in str3)
self.assertEqual(str3, repr3)
s.games[1].result = dominoes.Result(1, False, 50)
g2 = s.next_game()
str4 = str(s)
repr4 = repr(s)
self.assertEqual(len(s.games), 3)
self.assertEqual(len(g2.board), 0)
self.assertEqual(g2.turn, 2)
self.assertTrue(bool(g2.valid_moves))
self.assertIsNone(g2.result)
self.assertEqual(s.scores, [100, 0])
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str4)
self.assertTrue('Team 0 has 100 points.' in str4)
self.assertTrue('Team 1 has 0 points.' in str4)
self.assertEqual(str4, repr4)
s.games[2].result = dominoes.Result(2, True, 50)
g3 = s.next_game()
str5 = str(s)
repr5 = repr(s)
self.assertEqual(len(s.games), 4)
self.assertEqual(len(g3.board), 0)
self.assertEqual(g3.turn, 2)
self.assertTrue(bool(g3.valid_moves))
self.assertIsNone(g3.result)
self.assertEqual(s.scores, [150, 0])
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str5)
self.assertTrue('Team 0 has 150 points.' in str5)
self.assertTrue('Team 1 has 0 points.' in str5)
self.assertEqual(str5, repr5)
s.games[3].result = dominoes.Result(3, False, -50)
g4 = s.next_game()
str6 = str(s)
repr6 = repr(s)
self.assertEqual(len(s.games), 5)
self.assertEqual(len(g4.board), 0)
self.assertEqual(g4.turn, 3)
self.assertTrue(bool(g4.valid_moves))
self.assertIsNone(g4.result)
self.assertEqual(s.scores, [150, 50])
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str6)
self.assertTrue('Team 0 has 150 points.' in str6)
self.assertTrue('Team 1 has 50 points.' in str6)
self.assertEqual(str6, repr6)
s.games[4].result = dominoes.Result(2, False, 0)
g5 = s.next_game()
str7 = str(s)
repr7 = repr(s)
self.assertEqual(len(s.games), 6)
self.assertEqual(len(g5.board), 0)
self.assertEqual(g5.turn, 3)
self.assertTrue(bool(g5.valid_moves))
self.assertIsNone(g5.result)
self.assertEqual(s.scores, [150, 50])
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str7)
self.assertTrue('Team 0 has 150 points.' in str7)
self.assertTrue('Team 1 has 50 points.' in str7)
self.assertEqual(str7, repr7)
s.games[5].result = dominoes.Result(0, True, 100)
self.assertIsNone(s.next_game())
str8 = str(s)
repr8 = repr(s)
self.assertEqual(len(s.games), 6)
self.assertEqual(s.scores, [250, 50])
self.assertEqual(s.target_score, 200)
self.assertTrue('Series to 200 points:' in str8)
self.assertTrue('Team 0 has 250 points.' in str8)
self.assertTrue('Team 1 has 50 points.' in str8)
self.assertEqual(str8, repr8)
if __name__ == '__main__':
unittest.main()
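The score-threshold checks in `test_is_over` repeat the same assertion for several inputs. unittest's `subTest` can parameterize that pattern; this sketch exercises the same 200-point rule directly, not the dominoes API:

```python
import unittest

class TestIsOverPattern(unittest.TestCase):
    def test_threshold(self):
        cases = [
            ([199, 199], False),
            ([200, 199], True),
            ([199, 200], True),
            ([201, 201], True),
        ]
        for scores, expected in cases:
            with self.subTest(scores=scores):
                # Mirrors Series.is_over(): any team at/over the target wins.
                self.assertEqual(any(s >= 200 for s in scores), expected)
```

Each failing case is reported individually instead of aborting the test method at the first mismatch.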
# --- certbot_dns_rrpproxy/__init__.py (avarteqgmbh/certbot-dns-rrpproxy, MIT) ---
"""
The `~certbot_dns_rrpproxy.dns_rrpproxy` plugin automates the process of
completing a ``dns-01`` challenge (`~acme.challenges.DNS01`) by creating, and
subsequently removing, TXT records using the RRPproxy API.
Named Arguments
---------------
======================================== =====================================
``--dns-rrpproxy-credentials`` RRPproxy account credentials in INI
format. (Required)
``--dns-rrpproxy-propagation-seconds`` The number of seconds to wait for DNS
to propagate before asking the ACME
server to verify the DNS record.
(Default: 120)
``--dns-rrpproxy-staging`` Whether this is a test run (OTE).
======================================== =====================================
Credentials
-----------
Use of this plugin requires a configuration file containing the login
credentials with the required permissions to update DNS zones.
.. code-block:: ini
:name: credentials.ini
:caption: Example credentials file:
certbot_dns_rrpproxy:dns_rrpproxy_s_login = user
certbot_dns_rrpproxy:dns_rrpproxy_s_pw = password
The path to this file can be provided interactively or using the
``--dns-rrpproxy-credentials`` command-line argument. Certbot records the
path to this file for use during renewal, but does not store the file's contents.
.. caution::
Users who can read this file can use these credentials to issue arbitrary
API calls on your behalf. Users who can cause Certbot to run using these
credentials can complete a ``dns-01`` challenge to acquire new certificates
or revoke existing certificates for associated domains, even if those
domains aren't being managed by this server.
Certbot will emit a warning if it detects that the credentials file can be
accessed by other users on your system. The warning reads "Unsafe permissions
on credentials configuration file", followed by the path to the credentials
file. This warning will be emitted each time Certbot uses the credentials file,
including for renewal, and cannot be silenced except by addressing the issue
(e.g., by using a command like ``chmod 600`` to restrict access to the file).
Examples
--------
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``
certbot certonly \\
--authenticator certbot-dns-rrpproxy:dns-rrpproxy \\
--certbot-dns-rrpproxy:dns-rrpproxy-credentials ~/.secrets/certbot/rrpproxy.ini \\
-d example.com
.. code-block:: bash
:caption: To acquire a single certificate for both ``example.com`` and
``www.example.com``
certbot certonly \\
--authenticator certbot-dns-rrpproxy:dns-rrpproxy \\
--certbot-dns-rrpproxy:dns-rrpproxy-credentials ~/.secrets/certbot/rrpproxy.ini \\
-d example.com \\
-d www.example.com
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``, waiting 120 seconds
for DNS propagation
certbot certonly \\
--authenticator certbot-dns-rrpproxy:dns-rrpproxy \\
--certbot-dns-rrpproxy:dns-rrpproxy-credentials ~/.secrets/certbot/rrpproxy.ini \\
--certbot-dns-rrpproxy:dns-rrpproxy-propagation-seconds 120 \\
-d example.com
.. code-block:: bash
:caption: To acquire a certificate for ``*.example.com`` and ``example.com``
in a staging request
certbot certonly \\
--authenticator certbot-dns-rrpproxy:dns-rrpproxy \\
--certbot-dns-rrpproxy:dns-rrpproxy-credentials ~/.secrets/certbot/rrpproxy.ini \\
--certbot-dns-rrpproxy:dns-rrpproxy-staging \\
-d *.example.com \\
-d example.com
"""
# File: capture.py (repo: Lovekesh-GH/attendance, license: MIT)
import cv2
camera = cv2.VideoCapture(0)
for i in range(15):
return_value, image = camera.read()
print(return_value, image.shape)
cv2.imwrite('iloveyou'+str(i)+'.jpg', image)
camera.release()  # release the capture device explicitly instead of del(camera)
# File: testinvenio/records/api.py (repo: Alzpeta/testinvenio, license: MIT)
# -*- coding: utf-8 -*-
#
# Copyright (C) 2020 alzp.
#
# testInvenio is free software; you can redistribute it and/or modify it
# under the terms of the MIT License; see LICENSE file for more details.
"""Records API."""
from __future__ import absolute_import, print_function
from flask import make_response, abort
from invenio_records_files.api import Record as FilesRecord
#import oarepo_actions
from oarepo_actions.decorators import action
from invenio_access.permissions import Permission, authenticated_user, any_user
from flask_principal import Need, RoleNeed
from werkzeug.wrappers import Response
def can():
return True
class FakeNeed():
def __init__(self):
self.method = None
def can(self):
return True
class OwnerNeed():
def __init__(self, record):
self.record = record
def can(self, record):
if record['Owner'] == 1:
return True
else:
return False
def pf(record=None):
# return Permission(RoleNeed('admin'))
# return Permission(FakeNeed())
# return Permission(OwnerNeed(record["owner"]))
return Permission(any_user)
class Record(FilesRecord):
"""Custom record."""
# @action(url_path="blah", permissions=pf)
# def send_email(self, **kwargs):
# return {"title": self["title"]}
@classmethod
@action(detail=False, url_path="jej/<string:param>")
def blah1(cls, param= None,**kwargs):
return {"DOI": param}
# @classmethod
# @action(detail=False, url_path="test/<int:param>", permissions=pf,
# serializers={'GET': {'text/html': make_response}})
# def test1(cls, param=None, **kwargs):
# print("juch")
# return {param: "jej"}
#
# @classmethod
# @action(detail=False, url_path="a", permissions=pf, serializers={'text/html': make_response}, meth="PUT")
# def a(cls, param=None, **kwargs):
# return "<h1>jej</h1>"
#
# @classmethod
# @action(detail=False, permissions=pf, serializers={'text/html': make_response})
# def test(cls, **kwargs):
# print("jej")
# return Response(status=200)
_schema = "records/record-v1.0.0.json"
# File: dev/sherpa/stats/sum_benchmark.py (repo: gabemery/gammapy, license: BSD-3-Clause)
"""Test the speed and precision of the sum of many values.
When fitting data with many events or bins (say 1e6 or 1e9 or more), the fit statistic is first
computed per event or bin and then summed.
The sum (and thus the fit statistic and fit results) can become incorrect due to rounding errors
(see e.g. http://en.wikipedia.org/wiki/Kahan_summation_algorithm)
There are clever and slower methods to avoid this problem (like the Kahan summation algorithm),
but they are not readily available in stdlib C / C++ or numpy
(there's an open feature request for numpy though: https://github.com/numpy/numpy/issues/2448)
The simplest solution is to use 128 bit precision for the accumulator in numpy:
In [7]: np.sum([1e10, -1e10, 1e-6] * int(1e6))
Out[7]: 1.9073477254638671
In [8]: np.sum([1e10, -1e10, 1e-6] * int(1e6), dtype='float128')
Out[8]: 1.0002404448965787888
Sherpa uses Kahan summation:
$ASCDS_INSTALL/src/pkg/sherpa/sherpa/sherpa/include/sherpa/stats.hh
This is a quick benchmarking of the precision, speed of sum as a function of these parameters:
* Accumulator bit size: 32, 64 or 128
* Number of elements: 1e6, 1e9
* TODO: how to choose values in a meaningful way? Precision results will completely depend on this.
Should be chosen similar to typical / extreme fitting cases with CASH, CSTAT, CHI2 fits
For now we only check the speed.
* TODO: Check against C and Cython implementation
"""
from timeit import Timer
dtypes = ['float32', 'float64', 'float128']
sizes = [int(1e6), int(1e9)]
def setup(size, dtype):
return """
import numpy as np
data = np.zeros({size}, dtype='{dtype}')
"""[1:-1].format(**locals())
def statement(dtype):
return """data.sum(dtype='{dtype}')""".format(**locals())
for data_dtype in dtypes:
for accumulator_dtype in dtypes:
for size in sizes:
timer = Timer(statement(accumulator_dtype), setup(size, data_dtype))
time = min(timer.repeat(repeat=3, number=1))
# Let's use the frequency in GHz of summed elements as our measure of speed
speed = 1e-9 * (size / time)
print('%10s %10s %10d %10.5f' %
(data_dtype, accumulator_dtype, size, speed))
"""
On my 2.6 GHz Intel Core I7 Macbook the speed doesn't depend on data or accumulator dtype at all.
This is weird, because it's a 64 bit machine, so 128 bit addition should be slower.
Also for such a simple computation as sum the limiting factor should be memory loading speed,
so 128 bit data should be slower to process than 64 bit data?
In [53]: run sum_benchmark.py
f32 f32 1000000 0.82793
f32 f32 1000000000 1.12276
f32 f64 1000000 1.12207
f32 f64 1000000000 1.10964
f32 f128 1000000 1.04155
f32 f128 1000000000 1.12900
f64 f32 1000000 1.10609
f64 f32 1000000000 1.12823
f64 f64 1000000 1.10493
f64 f64 1000000000 1.11920
f64 f128 1000000 1.15450
f64 f128 1000000000 1.11794
f128 f32 1000000 1.12087
f128 f32 1000000000 1.12223
f128 f64 1000000 1.09885
f128 f64 1000000000 1.11911
f128 f128 1000000 1.06943
f128 f128 1000000000 1.12578
"""
| 40.493976 | 99 | 0.65903 | 509 | 3,361 | 4.332024 | 0.453831 | 0.044898 | 0.019048 | 0.011791 | 0.019955 | 0.019955 | 0.019955 | 0.019955 | 0 | 0 | 0 | 0.190265 | 0.260339 | 3,361 | 82 | 100 | 40.987805 | 0.696702 | 0.439155 | 0 | 0 | 0 | 0 | 0.18169 | 0.035211 | 0 | 0 | 0 | 0.02439 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0.111111 | 0.333333 | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
d3efd7ae6a6a1f08132710ca2b6ea5099c2da9fa | 50 | py | Python | dj_gui_api_server/version.py | ChihweiLHBird/pharus | 44de192c3de73a2e32bec9239adae1cde7ed9439 | [
"MIT"
] | null | null | null | dj_gui_api_server/version.py | ChihweiLHBird/pharus | 44de192c3de73a2e32bec9239adae1cde7ed9439 | [
"MIT"
] | null | null | null | dj_gui_api_server/version.py | ChihweiLHBird/pharus | 44de192c3de73a2e32bec9239adae1cde7ed9439 | [
"MIT"
] | null | null | null | """DJGUI API metadata."""
__version__ = '0.1.0a0'
# File: test/tests/multinherit_metaclass_picking.py (repo: aisk/pyston, licenses: BSD-2-Clause, Apache-2.0)
# Test a relatively-obscure corner case of multiple inheritance:
# the metaclass is normally the first base's metaclass, but type_new
# will specifically delegate to a later base's metaclass if it is more
# derived.
# First, test what I'm guessing is the common case, where a later
# class is the only one with a non-default metaclass:
class C(object):
pass
class M(type):
def __new__(cls, *args):
print "M.__new__", cls
return type.__new__(cls, *args)
def __init__(self, *args):
print "M.__init__"
return type.__init__(self, *args)
class D(object):
__metaclass__ = M
print type(D)
class E(C, D):
pass
print type(E)
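The same base-picking rule can be checked with Python 3 syntax and assertions (a separate sketch; the test above targets Python 2's `__metaclass__` form):

```python
class M(type):
    pass

class C:            # metaclass is plain `type`
    pass

class D(metaclass=M):
    pass

class E(C, D):      # C is listed first, but D's metaclass is more derived
    pass

# type.__new__ picks the most derived metaclass among all the bases
assert type(E) is M
assert type(C) is type
```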
# Then, test to make sure that it's actually type_new that's doing this, and not
# the class creation machinery. We can check this by using an initial metatype that
# doesn't defer to type_new
class GreedyMeta(type):
def __new__(cls, name, bases, attrs):
print "GreedyMeta.__new__", cls
if 'make_for_real' in attrs:
return type.__new__(cls, name, bases, attrs)
return 12345
class F(object):
__metaclass__ = GreedyMeta
make_for_real = True
print F, type(F)
class G(F, D):
pass
print G
# Constructing the class with the bases in the opposite order will fail,
# since this will end up calling M.__new__ -> type.__new__, and type_new
# does some extra checks, that we skipped with GreedyMeta.
try:
class H(D, F):
pass
except TypeError as e:
print e
| 24.933333 | 84 | 0.686497 | 237 | 1,496 | 4.080169 | 0.434599 | 0.050672 | 0.028956 | 0.026887 | 0.041365 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004352 | 0.231952 | 1,496 | 59 | 85 | 25.355932 | 0.83725 | 0.474599 | 0 | 0.121212 | 0 | 0 | 0.064767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.121212 | 0 | null | null | 0.242424 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
d3f8f1498bed24cf5e2a0c2cfe0dc1f296c85168 | 879 | py | Python | wbia/web/modules/detect/resources.py | WildMeOrg/wildbook-ia | a18d57611e5936bea02a964716466e062415aa1a | [
"Apache-2.0"
] | 20 | 2021-01-19T23:17:21.000Z | 2022-03-21T10:25:56.000Z | wbia/web/modules/detect/resources.py | solomonkimunyu/wildbook-ia | ac433d4f2a47b1d905c421a36c497f787003afc3 | [
"Apache-2.0"
] | 16 | 2021-01-28T23:05:29.000Z | 2022-03-31T20:39:36.000Z | wbia/web/modules/detect/resources.py | solomonkimunyu/wildbook-ia | ac433d4f2a47b1d905c421a36c497f787003afc3 | [
"Apache-2.0"
] | 9 | 2021-02-13T20:19:46.000Z | 2022-03-29T10:47:11.000Z | # -*- coding: utf-8 -*-
# pylint: disable=too-few-public-methods
"""
RESTful API Detect resources
----------------------------
"""
import logging
from flask_restx_patched import Resource
from wbia.web.extensions.api import Namespace
from flask import current_app
from . import parameters
log = logging.getLogger(__name__)
api = Namespace('detect', description='Detect')
@api.route('/')
class Detection(Resource):
"""
Manipulations with detect.
"""
# @api.login_required(oauth_scopes=['detect:read'])
@api.parameters(parameters.RunDetectionParameters())
def get(self, args):
"""
Run detection.
"""
# Start detection job
ibs = current_app.ibs
print(args)
# CONVERT INPUT PARAMETERS INTO NEEDED FOR LEGACY DETECTION CALL
jobid = ibs.start_detect_image_lightnet()
return jobid
# File: Desafios/Ex-052.py (repo: LuckyCards/Curso-Python3, license: MIT)
print(f'\033[33m{"—"*30:^30}\033[m')
print(f'\033[36m{"EXERCISE No. 52":^30}\033[m')
print(f'\033[33m{"—"*30:^30}\033[m')
n = int(input('Enter a number: '))
for x in range(2, n + 1):
if x == n:
        print('Prime')
break
if n % x == 0:
        print('not prime')
break
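A tighter version of the same primality check only needs to test divisors up to the square root of `n` (a sketch, not part of the original exercise):

```python
def is_prime(n):
    """Trial division, but only up to sqrt(n): any composite n has a
    divisor no larger than its square root."""
    if n < 2:
        return False
    for x in range(2, int(n ** 0.5) + 1):
        if n % x == 0:
            return False
    return True

print(is_prime(52))  # False
print(is_prime(53))  # True
```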
# File: test/test_mail.py (repo: cedarjp/django-sendgrid-v5, license: MIT)
import base64
import sys
from django.core.mail import EmailMessage, EmailMultiAlternatives
from django.test import override_settings
from django.test.testcases import SimpleTestCase
from sendgrid_backend.mail import SendgridBackend, SENDGRID_VERSION
if sys.version_info >= (3, 0, 0):
from email.mime.image import MIMEImage
else:
from email.MIMEImage import MIMEImage
class TestMailGeneration(SimpleTestCase):
# Any assertDictEqual failures will show the entire diff instead of just a snippet
maxDiff = None
    @classmethod
    def setUpClass(cls):
        super(TestMailGeneration, cls).setUpClass()
        with override_settings(EMAIL_BACKEND="sendgrid_backend.SendgridBackend",
                               SENDGRID_API_KEY="DUMMY_API_KEY"):
            cls.backend = SendgridBackend()
def test_EmailMessage(self):
msg = EmailMessage(
subject="Hello, World!",
body="Hello, World!",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
cc=["Stephanie Smith <stephanie.smith@example.com>"],
bcc=["Sarah Smith <sarah.smith@example.com>"],
reply_to=["Sam Smith <sam.smith@example.com>"],
)
result = self.backend._build_sg_mail(msg)
expected = {
"personalizations": [{
"to": [{
"email": "john.doe@example.com",
"name": "John Doe"
}, {
"email": "jane.doe@example.com",
}],
"cc": [{
"email": "stephanie.smith@example.com",
"name": "Stephanie Smith"
}],
"bcc": [{
"email": "sarah.smith@example.com",
"name": "Sarah Smith"
}],
"subject": "Hello, World!"
}],
"from": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"mail_settings": {
"sandbox_mode": {
"enable": False
}
},
"reply_to": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"subject": "Hello, World!",
"tracking_settings": {"open_tracking": {"enable": True}},
"content": [{
"type": "text/plain",
"value": "Hello, World!"
}]
}
self.assertDictEqual(result, expected)
def test_EmailMessage_attributes(self):
"""Test that send_at and categories attributes are correctly written through to output."""
msg = EmailMessage(
subject="Hello, World!",
body="Hello, World!",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
)
# Set new attributes as message property
msg.send_at = 1518108670
if SENDGRID_VERSION < '6':
msg.categories = ['mammal', 'dog']
else:
msg.categories = ['dog', 'mammal']
msg.ip_pool_name = 'some-name'
result = self.backend._build_sg_mail(msg)
expected = {
"personalizations": [{
"to": [{
"email": "john.doe@example.com",
"name": "John Doe"
}, {
"email": "jane.doe@example.com",
}],
"subject": "Hello, World!",
"send_at": 1518108670,
}],
"from": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"mail_settings": {
"sandbox_mode": {
"enable": False
}
},
"subject": "Hello, World!",
"tracking_settings": {"open_tracking": {"enable": True}},
"content": [{
"type": "text/plain",
"value": "Hello, World!"
}],
"categories": ['mammal', 'dog'],
"ip_pool_name": "some-name"
}
self.assertDictEqual(result, expected)
def test_EmailMultiAlternatives(self):
msg = EmailMultiAlternatives(
subject="Hello, World!",
body=" ",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
cc=["Stephanie Smith <stephanie.smith@example.com>"],
bcc=["Sarah Smith <sarah.smith@example.com>"],
reply_to=["Sam Smith <sam.smith@example.com>"],
)
msg.attach_alternative("<body<div>Hello World!</div></body>", "text/html")
# Test CSV attachment
msg.attach("file.csv", "1,2,3,4", "text/csv")
result = self.backend._build_sg_mail(msg)
expected = {
"personalizations": [{
"to": [{
"email": "john.doe@example.com",
"name": "John Doe"
}, {
"email": "jane.doe@example.com",
}],
"cc": [{
"email": "stephanie.smith@example.com",
"name": "Stephanie Smith"
}],
"bcc": [{
"email": "sarah.smith@example.com",
"name": "Sarah Smith"
}],
"subject": "Hello, World!"
}],
"from": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"mail_settings": {
"sandbox_mode": {
"enable": False
}
},
"reply_to": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"subject": "Hello, World!",
"tracking_settings": {"open_tracking": {"enable": True}},
"attachments": [{
"content": "MSwyLDMsNA==",
"filename": "file.csv",
"type": "text/csv"
}],
"content": [{
"type": "text/plain",
"value": " ",
}, {
"type": "text/html",
"value": "<body<div>Hello World!</div></body>",
}]
}
self.assertDictEqual(result, expected)
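The `"MSwyLDMsNA=="` attachment content expected above is simply the base64 encoding of the CSV body, which can be verified directly (a side check, independent of the backend):

```python
import base64

# Encode the same CSV payload the test attaches ("1,2,3,4")
encoded = base64.b64encode(b"1,2,3,4").decode("ascii")
print(encoded)  # MSwyLDMsNA==
```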
def test_EmailMultiAlternatives__unicode_attachment(self):
msg = EmailMultiAlternatives(
subject="Hello, World!",
body=" ",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
cc=["Stephanie Smith <stephanie.smith@example.com>"],
bcc=["Sarah Smith <sarah.smith@example.com>"],
reply_to=["Sam Smith <sam.smith@example.com>"],
)
msg.attach_alternative("<body<div>Hello World!</div></body>", "text/html")
# Test CSV attachment
attachments = [
("file.xls", b"\xd0", "application/vnd.ms-excel"),
("file.csv", b"C\xc3\xb4te d\xe2\x80\x99Ivoire", "text/csv"),
]
if SENDGRID_VERSION < '6':
for a in attachments:
msg.attach(*a)
else:
for a in reversed(attachments):
msg.attach(*a)
result = self.backend._build_sg_mail(msg)
expected = {
"personalizations": [{
"to": [{
"email": "john.doe@example.com",
"name": "John Doe"
}, {
"email": "jane.doe@example.com",
}],
"cc": [{
"email": "stephanie.smith@example.com",
"name": "Stephanie Smith"
}],
"bcc": [{
"email": "sarah.smith@example.com",
"name": "Sarah Smith"
}],
"subject": "Hello, World!"
}],
"from": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"mail_settings": {
"sandbox_mode": {
"enable": False
}
},
"reply_to": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"subject": "Hello, World!",
"tracking_settings": {"open_tracking": {"enable": True}},
"attachments": [
{
"content": "0A==",
"filename": "file.xls",
"type": "application/vnd.ms-excel"
},
{
"content": "Q8O0dGUgZOKAmUl2b2lyZQ==",
"filename": "file.csv",
"type": "text/csv"
}
],
"content": [{
"type": "text/plain",
"value": " ",
}, {
"type": "text/html",
"value": "<body<div>Hello World!</div></body>",
}]
}
self.assertDictEqual(result, expected)
def test_reply_to(self):
kwargs = {
"subject": "Hello, World!",
"body": "Hello, World!",
"from_email": "Sam Smith <sam.smith@example.com>",
"to": ["John Doe <john.doe@example.com>"],
"reply_to": ["Sam Smith <sam.smith@example.com>"],
"headers": {"Reply-To": "Stephanie Smith <stephanie.smith@example.com>"}
}
# Test different values in Reply-To header and reply_to prop
msg = EmailMessage(**kwargs)
with self.assertRaises(ValueError):
self.backend._build_sg_mail(msg)
# Test different names (but same email) in Reply-To header and reply_to prop
kwargs["headers"] = {"Reply-To": "Bad Name <sam.smith@example.com>"}
msg = EmailMessage(**kwargs)
with self.assertRaises(ValueError):
self.backend._build_sg_mail(msg)
# Test same name/email in both Reply-To header and reply_to prop
kwargs["headers"] = {"Reply-To": "Sam Smith <sam.smith@example.com>"}
msg = EmailMessage(**kwargs)
result = self.backend._build_sg_mail(msg)
self.assertDictEqual(result["reply_to"], {"email": "sam.smith@example.com", "name": "Sam Smith"})
def test_mime(self):
msg = EmailMultiAlternatives(
subject="Hello, World!",
body=" ",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
)
content = '<body><img src="cid:linux_penguin" /></body>'
msg.attach_alternative(content, "text/html")
with open("test/linux-penguin.png", "rb") as f:
img = MIMEImage(f.read())
img.add_header("Content-ID", "<linux_penguin>")
msg.attach(img)
result = self.backend._build_sg_mail(msg)
self.assertEqual(len(result["content"]), 2)
self.assertDictEqual(result["content"][0], {"type": "text/plain", "value": " "})
self.assertDictEqual(result["content"][1], {"type": "text/html", "value": content})
self.assertEqual(len(result["attachments"]), 1)
self.assertEqual(result["attachments"][0]["content_id"], "linux_penguin")
with open("test/linux-penguin.png", "rb") as f:
            if sys.version_info >= (3, 0, 0):
self.assertEqual(bytearray(result["attachments"][0]["content"], "utf-8"), base64.b64encode(f.read()))
else:
self.assertEqual(result["attachments"][0]["content"], base64.b64encode(f.read()))
self.assertEqual(result["attachments"][0]["type"], "image/png")
def test_templating(self):
msg = EmailMessage(
subject="Hello, World!",
body="Hello, World!",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
)
msg.template_id = "test_template"
result = self.backend._build_sg_mail(msg)
self.assertIn("template_id", result)
        self.assertEqual(result["template_id"], "test_template")
def test_asm(self):
msg = EmailMessage(
subject="Hello, World!",
body="Hello, World!",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
)
msg.asm = {"group_id": 1}
result = self.backend._build_sg_mail(msg)
self.assertIn("asm", result)
self.assertIn("group_id", result["asm"])
del msg.asm["group_id"]
with self.assertRaises(KeyError):
self.backend._build_sg_mail(msg)
msg.asm = {"group_id": 1, "groups_to_display": [2, 3, 4], "bad_key": None}
result = self.backend._build_sg_mail(msg)
self.assertIn("asm", result)
self.assertIn("group_id", result["asm"])
self.assertIn("groups_to_display", result["asm"])
def test_EmailMessage_custom_args(self):
msg = EmailMessage(
subject="Hello, World!",
body="Hello, World!",
from_email="Sam Smith <sam.smith@example.com>",
to=["John Doe <john.doe@example.com>", "jane.doe@example.com"],
cc=["Stephanie Smith <stephanie.smith@example.com>"],
bcc=["Sarah Smith <sarah.smith@example.com>"],
reply_to=["Sam Smith <sam.smith@example.com>"],
)
msg.custom_args = {"arg_1": "Foo", "arg_2": "bar"}
result = self.backend._build_sg_mail(msg)
expected = {
"personalizations": [{
"to": [{
"email": "john.doe@example.com",
"name": "John Doe"
}, {
"email": "jane.doe@example.com",
}],
"cc": [{
"email": "stephanie.smith@example.com",
"name": "Stephanie Smith"
}],
"bcc": [{
"email": "sarah.smith@example.com",
"name": "Sarah Smith"
}],
"subject": "Hello, World!",
"custom_args": {"arg_1": "Foo", "arg_2": "bar"}
}],
"from": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"mail_settings": {
"sandbox_mode": {
"enable": False
}
},
"reply_to": {
"email": "sam.smith@example.com",
"name": "Sam Smith"
},
"subject": "Hello, World!",
"tracking_settings": {"open_tracking": {"enable": True}},
"content": [{
"type": "text/plain",
"value": "Hello, World!"
}],
}
self.assertDictEqual(result, expected)
"""
todo: Implement these
def test_attachments(self):
pass
def test_headers(self):
pass
"""
# File: darts/models/arima.py (repo: MuttData/darts, license: Apache-2.0)
"""
ARIMA and Auto-ARIMA
--------------------
Models for ARIMA (Autoregressive integrated moving average) and auto-ARIMA [1]_.
The implementations are wrapped around `statsmodels <https://github.com/statsmodels/statsmodels>`_
and `pmdarima <https://github.com/alkaline-ml/pmdarima>`_.
References
----------
.. [1] https://wikipedia.org/wiki/Autoregressive_integrated_moving_average
"""
from typing import Optional
from statsmodels.tsa.arima_model import ARMA as staARMA
from statsmodels.tsa.arima_model import ARIMA as staARIMA
from pmdarima import AutoARIMA as PmdAutoARIMA
from .forecasting_model import UnivariateForecastingModel
from ..timeseries import TimeSeries
from ..logging import get_logger
logger = get_logger(__name__)
class ARIMA(UnivariateForecastingModel):
def __init__(self, p: int = 12, d: int = 1, q: int = 0):
""" ARIMA
Parameters
----------
p : int
Order (number of time lags) of the autoregressive model (AR)
d : int
The order of differentiation; i.e., the number of times the data have had past values subtracted. (I)
q : int
The size of the moving average window (MA).
"""
super().__init__()
self.p = p
self.d = d
self.q = q
self.model = None
def __str__(self):
return 'ARIMA({},{},{})'.format(self.p, self.d, self.q)
def fit(self, series: TimeSeries, component_index: Optional[int] = None):
super().fit(series, component_index)
series = self.training_series
m = staARIMA(series.values(),
order=(self.p, self.d, self.q)) if self.d > 0 else staARMA(series.values(), order=(self.p, self.q))
self.model = m.fit(disp=0)
def predict(self, n):
super().predict(n)
forecast = self.model.forecast(steps=n)[0]
return self._build_forecast_series(forecast)
@property
def min_train_series_length(self) -> int:
return 30
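The `d` parameter is the number of rounds of first-order differencing applied before the ARMA fit; the operation itself is tiny (an illustrative sketch, separate from the statsmodels internals):

```python
def difference(series, d=1):
    """Apply d rounds of first-order differencing (the `I` in ARIMA)."""
    out = list(series)
    for _ in range(d):
        # Each round replaces the series with consecutive differences
        out = [b - a for a, b in zip(out, out[1:])]
    return out

print(difference([1, 4, 9, 16, 25], d=1))  # [3, 5, 7, 9]
print(difference([1, 4, 9, 16, 25], d=2))  # [2, 2, 2]
```

Differencing twice flattens a quadratic trend to a constant, which is why `d` is chosen to remove trend before fitting the stationary ARMA part.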
class AutoARIMA(UnivariateForecastingModel):
    def __init__(self, *autoarima_args, **autoarima_kwargs):
        """ Auto-ARIMA

        This implementation is a thin wrapper around
        `pmdarima AutoARIMA model <https://alkaline-ml.com/pmdarima/modules/generated/pmdarima.arima.AutoARIMA.html>`_,
        which provides functionality similar
        to R's `auto.arima <https://www.rdocumentation.org/packages/forecast/versions/7.3/topics/auto.arima>`_.

        This model supports the same parameters as the pmdarima AutoARIMA model.
        See `pmdarima documentation <https://alkaline-ml.com/pmdarima/modules/generated/pmdarima.arima.AutoARIMA.html>`_
        for an extensive documentation and a list of supported parameters.

        Parameters
        ----------
        autoarima_args
            Positional arguments for the pmdarima AutoARIMA model
        autoarima_kwargs
            Keyword arguments for the pmdarima AutoARIMA model
        """
        super().__init__()
        self.model = PmdAutoARIMA(*autoarima_args, **autoarima_kwargs)

    def __str__(self):
        return 'Auto-ARIMA'

    def fit(self, series: TimeSeries, component_index: Optional[int] = None):
        super().fit(series, component_index)
        series = self.training_series
        self.model.fit(series.values())

    def predict(self, n):
        super().predict(n)
        forecast = self.model.predict(n_periods=n)
        return self._build_forecast_series(forecast)

    @property
    def min_train_series_length(self) -> int:
        return 30
"""entry level, run app"""
import os
from api.app import create_app
config_name = os.getenv('APP_SETTINGS') # config_name = "development"
app = create_app(config_name)
from api.manage import migrate
migrate()
if __name__ == '__main__':
    app.run(debug=True)
n, min_val, max_val = int(input('n=')), int(input('minimum=')), int(input('maximum='))
c = i = 0
while True:
    i, sq = i+1, i**n
    if sq in range(min_val, max_val+1): c += 1
    if sq > max_val: break
print(f'{c} values raised to the power {n} lie in the range {min_val}, {max_val}')
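The same counting logic, repackaged as a function so it can be exercised without `input()` (the name `count_powers_in_range` is my own):

```python
def count_powers_in_range(n, min_val, max_val):
    """Count how many i**n (i = 1, 2, ...) fall inside [min_val, max_val]."""
    count, i = 0, 0
    while True:
        i += 1
        power = i ** n
        if min_val <= power <= max_val:
            count += 1
        if power > max_val:
            return count

# Squares between 1 and 100 inclusive: 1, 4, 9, ..., 100
print(count_powers_in_range(2, 1, 100))  # 10
```

Using an inclusive comparison instead of `range(min_val, max_val+1)` gives the same result for integer bounds while avoiding building a range object per iteration.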
# -*- coding: utf-8 -*-
#
# Copyright 2010-2014 The pygit2 contributors
#
# This file is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License, version 2,
# as published by the Free Software Foundation.
#
# In addition to the permissions in the GNU General Public License,
# the authors give you unlimited permission to link the compiled
# version of this file into combinations with other programs,
# and to distribute those combinations without any restriction
# coming from the use of this file. (The General Public License
# restrictions do apply in other respects; for example, they cover
# modification of the file, and distribution when not linked into
# a combined executable.)
#
# This file is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; see the file COPYING. If not, write to
# the Free Software Foundation, 51 Franklin Street, Fifth Floor,
# Boston, MA 02110-1301, USA.
# Import from the future
from __future__ import absolute_import, unicode_literals
# Import from pygit2
from .errors import check_error
from .ffi import ffi, C
from .utils import to_bytes, is_string, to_str
from _pygit2 import Signature, Oid
def wrap_signature(csig):
    if not csig:
        return None

    return Signature(ffi.string(csig.name).decode('utf-8'),
                     ffi.string(csig.email).decode('utf-8'),
                     csig.when.time, csig.when.offset, 'utf-8')
class BlameHunk(object):

    @classmethod
    def _from_c(cls, blame, ptr):
        hunk = cls.__new__(cls)
        hunk._blame = blame
        hunk._hunk = ptr
        return hunk

    @property
    def lines_in_hunk(self):
        """Number of lines"""
        return self._hunk.lines_in_hunk

    @property
    def boundary(self):
        """Tracked to a boundary commit"""
        # Casting directly to bool via cffi does not seem to work
        return int(ffi.cast('int', self._hunk.boundary)) != 0

    @property
    def final_start_line_number(self):
        """Final start line number"""
        return self._hunk.final_start_line_number

    @property
    def final_committer(self):
        """Final committer"""
        return wrap_signature(self._hunk.final_signature)

    @property
    def final_commit_id(self):
        return Oid(raw=bytes(ffi.buffer(ffi.addressof(self._hunk, 'final_commit_id'))[:]))

    @property
    def orig_start_line_number(self):
        """Origin start line number"""
        return self._hunk.orig_start_line_number

    @property
    def orig_committer(self):
        """Original committer"""
        return wrap_signature(self._hunk.orig_signature)

    @property
    def orig_commit_id(self):
        return Oid(raw=bytes(ffi.buffer(ffi.addressof(self._hunk, 'orig_commit_id'))[:]))

    @property
    def orig_path(self):
        """Original path"""
        path = self._hunk.orig_path
        if not path:
            return None

        return ffi.string(path).decode()
class Blame(object):

    @classmethod
    def _from_c(cls, repo, ptr):
        blame = cls.__new__(cls)
        blame._repo = repo
        blame._blame = ptr
        return blame

    def __del__(self):
        C.git_blame_free(self._blame)

    def __len__(self):
        return C.git_blame_get_hunk_count(self._blame)

    def __getitem__(self, index):
        chunk = C.git_blame_get_hunk_byindex(self._blame, index)
        if not chunk:
            raise IndexError

        return BlameHunk._from_c(self, chunk)

    def for_line(self, line_no):
        """for_line(line_no) -> BlameHunk

        Returns the blame hunk data for a given line given its number
        in the current Blame.

        Arguments:

        line_no
            Line number, starts at 1.
        """
        if line_no < 0:
            raise IndexError

        chunk = C.git_blame_get_hunk_byline(self._blame, line_no)
        if not chunk:
            raise IndexError

        return BlameHunk._from_c(self, chunk)
class BlameIterator(object):

    def __init__(self, blame):
        self._count = len(blame)
        self._index = 0
        self._blame = blame

    def __next__(self):
        if self._index >= self._count:
            raise StopIteration

        # Index by position, not by the Blame object itself.
        hunk = self._blame[self._index]
        self._index += 1

        return hunk

    def next(self):
        return self.__next__()
"""File for the results class."""
import statistics
from typing import List, Optional, Sequence
from .ast_processors import StmtInfo
class Results:
    """Class for holding the results of an analysis.

    The Analyser classes return a Result object.
    """

    def __init__(self, total_lines: int, definitions: Sequence[StmtInfo]):
        """Result constructor.

        Args:
            total_lines: Total number of lines
            definitions: Sequence of functions and classes
        """
        self._total_lines: int = total_lines
        self._definitions: List[StmtInfo] = list(definitions)

    @property
    def total_lines(self) -> int:
        """Total number of lines."""
        return self._total_lines

    @property
    def nbr_definitions(self) -> int:
        """Total number of functions and classes."""
        return len(self._definitions)

    @property
    def definitions(self) -> List[StmtInfo]:
        """List of statement info objects for all functions and classes."""
        return self._definitions.copy()

    @property
    def lines_per_function(self) -> Optional[float]:
        """Mean number of lines per definition."""
        if self._definitions:
            return statistics.mean(def_.n_lines for def_ in self._definitions)
        return None

    def __add__(self, other: "Results") -> "Results":
        """Combines two results.

        Args:
            other: another Result object

        Returns:
            Combined results
        """
        return self.__class__(
            total_lines=self.total_lines + other.total_lines,
            definitions=self._definitions + other.definitions,
        )
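Since `StmtInfo` isn't shown here, a minimal stand-in exposing only the `n_lines` attribute that `lines_per_function` relies on is enough to sketch the mean and the `__add__` combination (`FakeStmt` is my own name):

```python
import statistics
from typing import NamedTuple

class FakeStmt(NamedTuple):  # stand-in for StmtInfo
    n_lines: int

defs_a = [FakeStmt(3), FakeStmt(5)]
defs_b = [FakeStmt(10)]

# lines_per_function is just the mean of n_lines over the definitions:
mean_a = statistics.mean(d.n_lines for d in defs_a)

# __add__ concatenates the definition lists and sums total_lines,
# so the mean is recomputed over the combined sequence:
combined = defs_a + defs_b
mean_combined = statistics.mean(d.n_lines for d in combined)
print(mean_a, mean_combined)  # 4 6
```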
from pathlib import Path
from math import log10
input_path = Path('benchmark.txt')
benchmark_output_path = Path('tex/benchmark-result.tex')
n_output_path = Path('tex/n-samples-benchmark.tex')
benchmark_output_text = ''
benchmark_output_text += '\\begin{tabular}{crrrr}\n'
benchmark_output_text += ' \\toprule\n'
benchmark_output_text += ' アルゴリズム & 実行時間 (ms) & 速度 \\tnote{a} & $N_U$ \\tnote{b} \\\\\n'
benchmark_output_text += ' \\midrule\n'
input_text = input_path.read_text().split('\n')
power = int(log10(int(input_text[0])))
n_output_path.write_text(str(power))
for line in input_text[2:-1]:
    algorithm, time, rate, samples = line.split('\t')
    time = float(time)
    rate = float(rate)
    samples = float(samples)
    benchmark_output_text += f' {algorithm} & {time / 1000:.0f} & {rate :.0f} & {samples :.4f} \\\\\n'
benchmark_output_text += ' \\bottomrule\n'
benchmark_output_text += '\\end{tabular}\n'
benchmark_output_path.write_text(benchmark_output_text)
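Two details above are worth unpacking: the exponent written to `n-samples-benchmark.tex` is just the base-10 logarithm of the sample count on the first line of `benchmark.txt`, and each table row is a plain f-string. A small standalone check (the algorithm name and numbers are made up):

```python
from math import log10

n_samples = 100000  # first line of benchmark.txt in this layout
power = int(log10(int(n_samples)))

# One row, formatted like the loop above (trailing \\\\ becomes the
# LaTeX row terminator "\\"):
row = f'{"ziggurat"} & {1234.0 / 1000:.0f} & {81.2:.0f} & {0.5:.4f} \\\\'
print(power, row)
```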
from .preprocessing import EasyPreprocessor, clean, detect_types
from .models import SimpleClassifier, SimpleRegressor, AnyClassifier
from .plot.supervised import plot
from .explain import explain
from . import datasets
__version__ = "0.1.7"
__all__ = ['EasyPreprocessor', 'SimpleClassifier', 'AnyClassifier',
           'SimpleRegressor',
           'explain', 'clean', 'detect_types', 'plot', 'datasets']
from django.apps import AppConfig
class OmopapiConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'OMOPAPI'
from ..models import Item
from .serializers import ItemApiSerialzer
from rest_framework.response import Response
from rest_framework.generics import ListAPIView
from rest_framework.permissions import IsAdminUser
from rest_framework.filters import SearchFilter
from .pagination import MyPageNumberPagination
class ItemListApiView(ListAPIView):
    serializer_class = ItemApiSerialzer
    permission_classes = (IsAdminUser,)
    filter_backends = [SearchFilter,]
    search_fields = ['name', 'id']
    # pagination_class = MyPageNumberPagination

    def get_queryset(self, *args, **kwargs):
        queryset = Item.objects.all().order_by('name')
        query = self.request.GET.get('q')
        if query:
            queryset = queryset.filter(name__startswith = query)
        return queryset
# class ItemRetrieveApiView(RetrieveAPIView):
#     lookup_field = 'pk'
#     serializer_class = ItemApiSerialzer
#     pagination_class = PageNumberPagination
#     permission_classes = (IsAdminUser,)


# class ItemCreateApiView(CreateAPIView):
#     lookup_field = 'pk'
#     serializer_class = ItemApiSerialzer
#     pagination_class = PageNumberPagination
#     permission_classes = (IsAdminUser,)


# class ItemUpdateApiView(UpdateAPIView):
#     lookup_field = 'pk'
#     serializer_class = ItemApiSerialzer
#     pagination_class = PageNumberPagination
#     permission_classes = (IsAdminUser,)


# class ItemDeleteApiView(DestroyAPIView):
#     lookup_field = 'pk'
#     serializer_class = ItemApiSerialzer
#     pagination_class = PageNumberPagination
#     permission_classes = (IsAdminUser,)
from __future__ import print_function
from __future__ import absolute_import
from past.builtins import basestring
import numpy as np
shared_depot = "act_depot"
pointing_fields = [
    {'name': 'az',  # some 2007 tods don't have Enc_Az_Deg_Astro
     'field': 'Enc_Az_Deg',
     'dtype': np.float64,
     'scale': np.pi / 180,
     'shift': np.pi,
     },
    {'name': 'alt',
     'field': 'Enc_El_Deg',
     'dtype': np.float64,
     'scale': np.pi / 180,
     },
    {'name': 'ctime',
     'field': 'C_Time',
     'dtype': np.float64,
     },
    {'name': 'enc_flags',
     'field': 'enc_flags',
     'dtype': np.uint16,
     },
    {'name': 'data_mask',
     'field': 'tes_flags',
     'dtype': np.int32,
     },
]

AR_NAMES = {
    'mbac145': 'AR1',
    'mbac215': 'AR2',
    'mbac280': 'AR3',
    'ar1': 'actpol1',
    'ar2': 'actpol2',
}

ACT_TIME_CONSTANTS = {
    'season2007': {
        'mbac145': ('2007/tcnst_20071027_row.dat', True),
    },
    'season2008': {
        'mbac145': ('2008/ar1_tau_090423.txt', False),
        'mbac215': ('2008/ar2_tau_090423.txt', False),
        'mbac280': ('2008/timeconst_ar3_2008_090802.txt', True),
    },
    'season2009': {
        'mbac145': ('2009/template_ar1_2009_090811.txt.tau', False),
        'mbac215': ('2009/template_ar2_2009_090811.txt.tau', False),
        'mbac280': ('2009/timeconst_ar3_2009_090825.txt', True),
    },
    'season2010': {
        'mbac145': ('2010/template_ar1_2010_101017.txt.tau', False),
        'mbac215': ('2010/template_ar2_2010_101017.txt.tau', False),
        'mbac280': ('2010/template_ar3_2010_100920.txt.tau', False),
    },
}

ACT_FLAT_FIELDS = {
    'season2007': {
        'mbac145': '2007/ff_mbac145_2007_v1.dict',
    },
    'season2008': {
        'mbac145': '2008/ff_mbac145_2008_v1.dict',
        'mbac215': '2008/ff_mbac215_2008_v1.dict',
        'mbac280': '2008/ff_mbac280_2008_v0.dict',
    },
    'season2009': {
        'mbac145': '2009/ff_mbac145_2009_v1.dict',
        'mbac215': '2009/ff_mbac215_2009_v1.dict',
        'mbac280': '2008/ff_mbac280_2008_v0.dict',
    },
    'season2010': {
        'mbac145': '2010/ff_mbac145_2010_v0.dict',
        'mbac215': '2010/ff_mbac215_2010_v0.dict',
        'mbac280': '2008/ff_mbac280_2008_v0.dict',
    },
    '2013': {
        'ar1': '2013/ff_ar1_2013_null.dict',
    },
    '2014': {
        'ar1': '2014/ff_ar1_2014_null.dict',
        'ar2': '2014/ff_ar2_2014_null.dict',
    },
}
ACT_SYNC_PARAMS = {
    'mbac145': 'syncDefault145.par',
    'mbac215': 'syncDefault215.par',
    'mbac280': 'syncDefault280.par',
}
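These tables are keyed by season and array name; each time-constant entry is a `(relative_path, flag)` tuple (the boolean's exact meaning isn't stated in this file). A sketch of the intended lookup, with a cut-down copy of the table (the function name is my own):

```python
# Cut-down copy of ACT_TIME_CONSTANTS, just for the lookup sketch.
TIME_CONSTANTS = {
    'season2008': {
        'mbac145': ('2008/ar1_tau_090423.txt', False),
        'mbac280': ('2008/timeconst_ar3_2008_090802.txt', True),
    },
}

def time_constant_file(season, array):
    """Return the (relative path, flag) entry for a season/array pair, or None."""
    return TIME_CONSTANTS.get(season, {}).get(array)

hit = time_constant_file('season2008', 'mbac145')
miss = time_constant_file('season2008', 'mbac215')  # not present in 2008 here
print(hit, miss)
```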
import re
from ja_timex.extract_filter import DecimalFilter, NumexpFilter, PartialNumFilter
from ja_timex.pattern.place import Pattern
from ja_timex.tag import Extract
def make_extract(target, original, type_name="abstime"):
    return Extract(
        type_name=type_name,
        re_match=re.search(target, original),
        pattern=Pattern(re_pattern=None, parse_func=lambda x: x, option=None),
    )


def test_numexp_filter():
    f = NumexpFilter()

    assert f.filter(make_extract("7.18", "7.18キロメートル"), "7.18キロメートル")
    assert f.filter(make_extract("7.18", "7.18 キロメートル"), "7.18 キロメートル")
    assert f.filter(make_extract("7.18", "7.18cm"), "7.18cm")
    assert f.filter(make_extract("7.18", "7.18mm"), "7.18mm")
    assert f.filter(make_extract("7.18", "7.18%"), "7.18%")
    assert f.filter(make_extract("7.18", "7.18インチ"), "7.18インチ")
    assert f.filter(make_extract("7.18", "7.18GHz"), "7.18GHz")
    assert f.filter(make_extract("7.18", "7.18円"), "7.18円")
    assert f.filter(make_extract("2.4", "2.4GHz"), "2.4GHz")

    assert not f.filter(make_extract("7.18", "7.18は晴れ"), "7.18は晴れ")
    assert not f.filter(make_extract("7.18", "7.18に釣り上げられた10メートルの魚"), "7.18に釣り上げられた10メートルの魚")

    # Not filtered when the match splits into three or more numbers
    assert not f.filter(make_extract("2020.7.18", "2020.7.18"), "2020.7.18")
    assert not f.filter(make_extract("2020.7.18", "2020.7.18円相場は"), "2020.7.18円相場は")  # likewise even when a unit follows


def test_partial_num_filter():
    f = PartialNumFilter()

    # When a digit or "+" directly precedes or follows the match.
    # "-" is excluded because of expressions like 1/12-1/20.
    assert f.filter(make_extract("13/1", "13/13"), "13/13")
    assert f.filter(make_extract("3/13", "13/13"), "13/13")
    assert f.filter(make_extract("13/1", "13/1+2"), "13/1+2")
    assert f.filter(make_extract("3/13", "+3/13"), "+3/13")

    # When the adjacent character is not a digit
    assert not f.filter(make_extract("13/1", "13/1は"), "13/1は")
    assert not f.filter(make_extract("3/13", "は3/13"), "は3/13")

    # Also judged True when a "." precedes the trailing 0.1
    assert f.filter(make_extract("0.1", "127.0.0.1"), "127.0.0.1")


def test_decimal_filter():
    f = DecimalFilter()

    # In expressions like 0.1, 0/1 or 0-1, a leading 0 never denotes the year 0000
    assert f.filter(make_extract("0.18", "0.18"), "0.18")
    assert f.filter(make_extract("0/10", "0/10"), "0/10")
    assert f.filter(make_extract("0-10", "0-10"), "0-10")

    # Excluded for DURATION (a notation like "0.1ヶ月" is possible)
    assert not f.filter(make_extract("0.18", "0.18", "duration"), "0.18")
#!python3.6
#encoding: utf-8
# https://teratail.com/questions/52151
class Deco:
    def __init__(self): self.value = 'value'

    def deco(func):
        def wrapper(self, *args, **kwargs):
            print('----- start -----')
            print('self.value =', self.value) # AttributeError: 'C' object has no attribute 'value'
            ret = func(self, *args, **kwargs)
            print('----- end -----')
            return ret
        return wrapper


class C:
    @Deco.deco
    def test(self, *args, **kwargs):
        for a in args: print(a)
        for k,v in kwargs.items(): print(k, v)
        return 'RETURN'
print(C().test(1, 'a', key1='value1'))
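For contrast, a variant where the decorated class does define `value`, so the wrapper's `self.value` lookup succeeds instead of raising AttributeError (names here are my own):

```python
def deco(func):
    def wrapper(self, *args, **kwargs):
        # `self` is the instance of the decorated class, so the attribute
        # lookup works as long as that class defines `value`.
        return (self.value, func(self, *args, **kwargs))
    return wrapper

class D:
    value = 'v'

    @deco
    def test(self):
        return 'RETURN'

print(D().test())  # ('v', 'RETURN')
```

This is the key point of the original snippet: the wrapper runs with the decorated class's instance, not with a `Deco` instance, so `Deco.__init__` never gets a chance to set `value`.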
import tensorflow as tf
from dynastes.util.test_utils import layer_test
from dynastes.layers.data_augmentation import SpecAugmentLayer
import numpy as np
to_tensor = tf.convert_to_tensor
normal = np.random.normal
class SpecAugmentLayerTest(tf.test.TestCase):

    def test_simple(self):
        layer_test(
            SpecAugmentLayer, kwargs={}, input_shape=(5, 32, 80, 2))
from core.indexes import Index
class Query():

    def __init__(self, query):
        self._query = query

    def run(self, index, n):
        return index.search(self._query, n)


class TextQuery(Query):

    def __init__(self, text):
        super().__init__(text)


class VectorQuery(Query):

    def __init__(self, vector):
        super().__init__(vector)
fname = input("Enter a File name: ")
fh = open(fname)
print(fh.read().upper())
from selenium.webdriver.common.keys import Keys
from selenium import webdriver
from bs4 import BeautifulSoup
from threading import Thread
from threading import Lock
import time
import random
import re
options = webdriver.ChromeOptions()
options.add_argument('--user-data-dir=/Users/pascal/Library/Application Support/Google/Chrome/Default')
driver = webdriver.Chrome(executable_path = '/Users/pascal/Downloads/chromedriver', chrome_options = options)
driver.set_window_size(1920, 1200)
driver.maximize_window()
def getCurrentSource():
    return BeautifulSoup(driver.page_source)

def getCurrentLinks():
    soup = getCurrentSource()
    return [x['href'] for x in soup.findAll('a', {'href' : True}) if x['href'].startswith('http://')]
while True:
    soup = getCurrentSource()
    links = [x['href'] for x in soup.findAll('a', {'href' : True}) if x['href'].startswith('http://')]
    random.shuffle(links)
    driver.get(links[0])
    time.sleep(2)
    driver.find_element_by_tag_name("body").send_keys(Keys.COMMAND + "t")
div = '<div id="popup" style="display: none">some text here</div>'
'<div id="popup" style="display: inline">some text here</div>'
txt="""
var removePop = document.getElementById("elpopup");
if (removePop != null) {
removePop.parentElement.removeChild(removePop);
}
elements = document.querySelectorAll(':hover');
var links = ''
var classesinfo = ''
for(var i = 0, element; (element = elements[i]) !== undefined; i++) {
if (element.hasAttribute("href")) {
links = links + " " + element.getAttribute('href')
}
if (element.hasAttribute("class")) {
classesinfo = classesinfo + " " + element.getAttribute('class')
}
}
if (elements != null && elements.length > 0) {
var ele = elements[elements.length - 1];
var para = document.createElement("div");
var linksNode = document.createTextNode(links);
var classesNode = document.createTextNode(classesinfo);
para.setAttribute('id', 'elpopup')
para.setAttribute('style', 'font-size:10px')
para.appendChild(linksNode);
para.appendChild(classesNode);
ele.appendChild(para);
}
"""
for i in range(20):
    links = getCurrentLinks()
    driver.execute_script(txt)
    time.sleep(0.5)
    print('tick ' + str(i))
path_to_chromedriver = '/Users/pascal/Downloads/chromedriver'
driver = webdriver.Chrome(executable_path=path_to_chromedriver)
driver.maximize_window()
driver.get('http://www.military-today.com/engineering/caterpillar_d9.htm')
# The following JavaScript records clicked-element XPaths and pressed keys
# into a page-global `bla` array. It was pasted bare into this Python file;
# wrapping it in a string (the name `tracker_js` is introduced here) makes
# the module importable, and it can then be injected with execute_script().
tracker_js = """
var bla = [];
document.addEventListener('click', function(e) {
    e = e || window.event;
    var mytarget = e.target || e.srcElement;
    bla.push(getPathTo(mytarget));
}, false);

function getPathTo(element) {
    if (element.id !== '')
        return 'id("' + element.id + '")';
    if (element === document.body)
        return element.tagName;
    var ix = 0;
    var siblings = element.parentNode.childNodes;
    for (var i = 0; i < siblings.length; i++) {
        var sibling = siblings[i];
        if (sibling === element)
            return getPathTo(element.parentNode) + '/' + element.tagName + '[' + (ix + 1) + ']';
        if (sibling.nodeType === 1 && sibling.tagName === element.tagName)
            ix++;
    }
}

var keyboardMap = ["","","","CANCEL","","","HELP","","BACK_SPACE","TAB","","","CLEAR","ENTER","RETURN","","SHIFT","CONTROL","ALT","PAUSE","CAPS_LOCK","KANA","EISU","JUNJA","FINAL","HANJA","","ESCAPE","CONVERT","NONCONVERT","ACCEPT","MODECHANGE","SPACE","PAGE_UP","PAGE_DOWN","END","HOME","LEFT","UP","RIGHT","DOWN","SELECT","PRINT","EXECUTE","PRINTSCREEN","INSERT","DELETE","","0","1","2","3","4","5","6","7","8","9","COLON","SEMICOLON","LESS_THAN","EQUALS","GREATER_THAN","QUESTION_MARK","AT","A","B","C","D","E","F","G","H","I","J","K","L","M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z","WIN","","CONTEXT_MENU","","SLEEP","NUMPAD0","NUMPAD1","NUMPAD2","NUMPAD3","NUMPAD4","NUMPAD5","NUMPAD6","NUMPAD7","NUMPAD8","NUMPAD9","MULTIPLY","ADD","SEPARATOR","SUBTRACT","DECIMAL","DIVIDE","F1","F2","F3","F4","F5","F6","F7","F8","F9","F10","F11","F12","F13","F14","F15","F16","F17","F18","F19","F20","F21","F22","F23","F24","","","","","","","","","NUM_LOCK","SCROLL_LOCK","WIN_OEM_FJ_JISHO","WIN_OEM_FJ_MASSHOU","WIN_OEM_FJ_TOUROKU","WIN_OEM_FJ_LOYA","WIN_OEM_FJ_ROYA","","","","","","","","","","CIRCUMFLEX","EXCLAMATION","DOUBLE_QUOTE","HASH","DOLLAR","PERCENT","AMPERSAND","UNDERSCORE","OPEN_PAREN","CLOSE_PAREN","ASTERISK","PLUS","PIPE","HYPHEN_MINUS","OPEN_CURLY_BRACKET","CLOSE_CURLY_BRACKET","TILDE","","","","","VOLUME_MUTE","VOLUME_DOWN","VOLUME_UP","","","SEMICOLON","EQUALS","COMMA","MINUS","PERIOD","SLASH","BACK_QUOTE","","","","","","","","","","","","","","","","","","","","","","","","","","","OPEN_BRACKET","BACK_SLASH","CLOSE_BRACKET","QUOTE","","META","ALTGR","","WIN_ICO_HELP","WIN_ICO_00","","WIN_ICO_CLEAR","","","WIN_OEM_RESET","WIN_OEM_JUMP","WIN_OEM_PA1","WIN_OEM_PA2","WIN_OEM_PA3","WIN_OEM_WSCTRL","WIN_OEM_CUSEL","WIN_OEM_ATTN","WIN_OEM_FINISH","WIN_OEM_COPY","WIN_OEM_AUTO","WIN_OEM_ENLW","WIN_OEM_BACKTAB","ATTN","CRSEL","EXSEL","EREOF","PLAY","ZOOM","","PA1","WIN_OEM_CLEAR",""];
document.addEventListener("keydown", keyDownTextField, false);
function keyDownTextField(e) {
    var keyCode = e.keyCode;
    bla.push(keyboardMap[keyCode]);
}
"""
# Inject the tracker into the current page
driver.execute_script(tracker_js)
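The JavaScript above accumulates events in a page-global `bla` array; Selenium can read such a global back into Python with an `execute_script` call that returns it. A sketch of that read-back (the array name `bla` comes from the script above; the helper name and the stub driver are illustrative, so the example runs without a browser):

```python
def drain_recorded_events(driver):
    """Read and clear the page-global `bla` event log via execute_script."""
    return driver.execute_script("var out = bla; bla = []; return out;")

class _StubDriver:
    """Minimal stand-in for a Selenium driver (illustrative only)."""
    def __init__(self):
        # Pretend the user clicked a comments link, then pressed "C"
        self._bla = ['id("comments-link-28888894")/A[1]', 'C']

    def execute_script(self, script):
        # Real Selenium would run `script` in the page; the stub just
        # returns and clears its fake event log.
        out, self._bla = self._bla, []
        return out

events = drain_recorded_events(_StubDriver())
print(events)
```

With a real driver, the injected script must have run on the current page first, and a navigation wipes the `bla` global.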
ele = driver.find_element_by_xpath('id("comments-link-28888894")/A[1]')
ele.click()
ele = driver.switch_to_active_element()
ele.send_keys(67)  # send_keys coerces the int to the string "67"
done_urls = set([''])
while True:
    try:
        # NOTE: slugify is defined further down in this file; it must be
        # in scope before this loop runs.
        current_url = slugify(driver.current_url)
        if current_url in done_urls:
            continue
        done_urls.add(current_url)
        with open('/Users/pascal/egoroot/selenium_cache/' + current_url, 'w') as f:
            f.write(driver.page_source)
        time.sleep(3)
    except Exception:
        print('err')
opts = webdriver.ChromeOptions()
opts.add_argument('--user-data-dir=/Users/pascal/Library/Application Support/Google/Chrome/Default')
driver = webdriver.Chrome(executable_path='/Users/pascal/Downloads/chromedriver', chrome_options=opts)
driver.set_window_size(1920, 1200)
driver.maximize_window()
def slugify(value):
    """
    Normalizes string, converts to lowercase, removes non-alpha characters,
    and converts spaces to hyphens.
    """
    # `re` is already imported at module level; raw strings avoid
    # invalid-escape-sequence warnings
    value = re.sub(r'[^\w\s-]', '', value).strip().lower()
    value = re.sub(r'[-\s]+', '-', value)
    return value
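For reference, a standalone copy of `slugify` (same two regex passes as above) applied to the URL crawled earlier in this file shows what the cache filenames look like:

```python
import re

def slugify(value):
    # Drop everything except word chars, whitespace, and hyphens;
    # lower-case; then collapse whitespace/hyphen runs into single hyphens.
    value = re.sub(r'[^\w\s-]', '', value).strip().lower()
    return re.sub(r'[-\s]+', '-', value)

# Slashes, dots, and the scheme's colon are stripped outright, so distinct
# URLs can collide; underscores and existing hyphens survive.
print(slugify('http://www.military-today.com/engineering/caterpillar_d9.htm'))
# → httpwwwmilitary-todaycomengineeringcaterpillar_d9htm
```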
| 41.830065 | 1,915 | 0.648281 | 801 | 6,400 | 5.033708 | 0.444444 | 0.028274 | 0.009921 | 0.023065 | 0.149554 | 0.140873 | 0.128472 | 0.128472 | 0.128472 | 0.128472 | 0 | 0.019625 | 0.124219 | 6,400 | 152 | 1,916 | 42.105263 | 0.699732 | 0 | 0 | 0.094017 | 0 | 0.025641 | 0.425406 | 0.130858 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.076923 | null | null | 0.017094 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
31cab976f3a13d0e2f591c6dedb8a047b7c3172a | 299 | py | Python | src/prefect/tasks/sendgrid/__init__.py | suryatmodulus/prefect | e4ac9f6aa831140c7fba0397f3e5e0884b1b9e42 | [
"Apache-2.0"
] | 3 | 2021-11-09T10:46:58.000Z | 2022-03-11T04:22:35.000Z | src/prefect/tasks/sendgrid/__init__.py | suryatmodulus/prefect | e4ac9f6aa831140c7fba0397f3e5e0884b1b9e42 | [
"Apache-2.0"
] | 8 | 2021-10-11T16:42:59.000Z | 2022-03-31T08:42:24.000Z | src/prefect/tasks/sendgrid/__init__.py | suryatmodulus/prefect | e4ac9f6aa831140c7fba0397f3e5e0884b1b9e42 | [
"Apache-2.0"
] | 1 | 2022-03-11T04:22:40.000Z | 2022-03-11T04:22:40.000Z | """
Tasks for interacting with SendGrid.
"""
try:
from prefect.tasks.sendgrid.sendgrid import SendEmail
except ImportError as exc:
raise ImportError(
'Using `prefect.tasks.sendgrid` requires Prefect to be installed with the "sendgrid" extra.'
) from exc
__all__ = ["SendEmail"]
| 24.916667 | 100 | 0.719064 | 36 | 299 | 5.861111 | 0.638889 | 0.113744 | 0.189573 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.187291 | 299 | 11 | 101 | 27.181818 | 0.868313 | 0.120401 | 0 | 0 | 0 | 0 | 0.388235 | 0.094118 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |