hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ce30803a616ae9ef2a49fa7b7f6a9b1c49fff5c6 | 2,584 | py | Python | tests/check_framework/test_model_field_deprecation.py | bpeschier/django | f54c0ec06e390dc5bce95fdccbcb51d6423da4f9 | [
"BSD-3-Clause"
] | 3 | 2020-05-30T17:08:51.000Z | 2021-12-14T02:55:19.000Z | tests/check_framework/test_model_field_deprecation.py | bpeschier/django | f54c0ec06e390dc5bce95fdccbcb51d6423da4f9 | [
"BSD-3-Clause"
] | 1 | 2021-03-24T12:21:05.000Z | 2021-03-24T12:31:52.000Z | tests/check_framework/test_model_field_deprecation.py | bpeschier/django | f54c0ec06e390dc5bce95fdccbcb51d6423da4f9 | [
"BSD-3-Clause"
] | 15 | 2016-01-08T14:28:41.000Z | 2019-04-19T08:33:31.000Z | from django.core import checks
from django.db import models
from django.test import SimpleTestCase
from .tests import IsolateModelsMixin
class TestDeprecatedField(IsolateModelsMixin, SimpleTestCase):
def test_default_details(self):
class MyField(models.Field):
system_check_deprecated_details = {}
class Model(models.Model):
name = MyField()
model = Model()
self.assertEqual(model.check(), [
checks.Warning(
msg='MyField has been deprecated.',
hint=None,
obj=Model._meta.get_field('name'),
id='fields.WXXX',
)
])
def test_user_specified_details(self):
class MyField(models.Field):
system_check_deprecated_details = {
'msg': 'This field is deprecated and will be removed soon.',
'hint': 'Use something else.',
'id': 'fields.W999',
}
class Model(models.Model):
name = MyField()
model = Model()
self.assertEqual(model.check(), [
checks.Warning(
msg='This field is deprecated and will be removed soon.',
hint='Use something else.',
obj=Model._meta.get_field('name'),
id='fields.W999',
)
])
class TestRemovedField(IsolateModelsMixin, SimpleTestCase):
def test_default_details(self):
class MyField(models.Field):
system_check_removed_details = {}
class Model(models.Model):
name = MyField()
model = Model()
self.assertEqual(model.check(), [
checks.Error(
msg='MyField has been removed except for support in historical migrations.',
hint=None,
obj=Model._meta.get_field('name'),
id='fields.EXXX',
)
])
def test_user_specified_details(self):
class MyField(models.Field):
system_check_removed_details = {
'msg': 'Support for this field is gone.',
'hint': 'Use something else.',
'id': 'fields.E999',
}
class Model(models.Model):
name = MyField()
model = Model()
self.assertEqual(model.check(), [
checks.Error(
msg='Support for this field is gone.',
hint='Use something else.',
obj=Model._meta.get_field('name'),
id='fields.E999',
)
])
| 30.046512 | 92 | 0.533669 | 249 | 2,584 | 5.417671 | 0.24498 | 0.035582 | 0.047443 | 0.068199 | 0.800593 | 0.800593 | 0.788732 | 0.788732 | 0.788732 | 0.788732 | 0 | 0.007286 | 0.362616 | 2,584 | 85 | 93 | 30.4 | 0.811779 | 0 | 0 | 0.6 | 0 | 0 | 0.168344 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 1 | 0.057143 | false | 0 | 0.057143 | 0 | 0.257143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0219d9f72d6070aaab90e3159f9208b58ece5172 | 40 | py | Python | PapaGraph/src/PapaGraph/sample/__init__.py | thalyson004/PapaGraph | 429d0f0861ead8f1971958aacb4bf12ced013600 | [
"MIT"
] | null | null | null | PapaGraph/src/PapaGraph/sample/__init__.py | thalyson004/PapaGraph | 429d0f0861ead8f1971958aacb4bf12ced013600 | [
"MIT"
] | null | null | null | PapaGraph/src/PapaGraph/sample/__init__.py | thalyson004/PapaGraph | 429d0f0861ead8f1971958aacb4bf12ced013600 | [
"MIT"
] | null | null | null | from PapaGraph.sample.handmande import * | 40 | 40 | 0.85 | 5 | 40 | 6.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 40 | 1 | 40 | 40 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
021db2c66644b9600729556073c72e3b267d121f | 58 | py | Python | book-code/numpy-ml/numpy_ml/preprocessing/__init__.py | yangninghua/code_library | b769abecb4e0cbdbbb5762949c91847a0f0b3c5a | [
"MIT"
] | null | null | null | book-code/numpy-ml/numpy_ml/preprocessing/__init__.py | yangninghua/code_library | b769abecb4e0cbdbbb5762949c91847a0f0b3c5a | [
"MIT"
] | null | null | null | book-code/numpy-ml/numpy_ml/preprocessing/__init__.py | yangninghua/code_library | b769abecb4e0cbdbbb5762949c91847a0f0b3c5a | [
"MIT"
] | null | null | null | from . import general
from . import nlp
from . import dsp
| 14.5 | 21 | 0.741379 | 9 | 58 | 4.777778 | 0.555556 | 0.697674 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206897 | 58 | 3 | 22 | 19.333333 | 0.934783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0272bcc4054f83ae4a8afcce79187fa7509180da | 224 | py | Python | playground/step4/jhML/__init__.py | jhson989/jhML | eb8b76d3b47df858e82cd971bb32794e12de4747 | [
"Apache-2.0"
] | null | null | null | playground/step4/jhML/__init__.py | jhson989/jhML | eb8b76d3b47df858e82cd971bb32794e12de4747 | [
"Apache-2.0"
] | null | null | null | playground/step4/jhML/__init__.py | jhson989/jhML | eb8b76d3b47df858e82cd971bb32794e12de4747 | [
"Apache-2.0"
] | null | null | null | from jhML.core import Variable
from jhML.core import Function
from jhML.functions import setup_variable
from jhML.config import using_config
from jhML.config import no_grad
from jhML.config import Config
setup_variable()
| 20.363636 | 41 | 0.839286 | 35 | 224 | 5.257143 | 0.342857 | 0.26087 | 0.228261 | 0.326087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 224 | 10 | 42 | 22.4 | 0.938776 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.857143 | 0 | 0.857143 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5a18e3511fbffc03ebf1419cf7a073b1a46a0e84 | 267 | py | Python | account/forms.py | Zeble1603/cv-django | 329d8d471c92dc0ce5f4bfb2bb5212fc1c8c34b4 | [
"MIT"
] | 1 | 2021-10-19T21:22:38.000Z | 2021-10-19T21:22:38.000Z | account/forms.py | Zeble1603/cv-django | 329d8d471c92dc0ce5f4bfb2bb5212fc1c8c34b4 | [
"MIT"
] | null | null | null | account/forms.py | Zeble1603/cv-django | 329d8d471c92dc0ce5f4bfb2bb5212fc1c8c34b4 | [
"MIT"
] | null | null | null | from django import forms
from django.contrib.auth.forms import AuthenticationForm
from django.contrib.auth.models import User
class CustomLoginForm(AuthenticationForm):
class Meta:
model = User
        fields = ['username', 'password']
| 22.25 | 56 | 0.692884 | 28 | 267 | 6.607143 | 0.571429 | 0.162162 | 0.183784 | 0.227027 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.235955 | 267 | 12 | 57 | 22.25 | 0.906863 | 0 | 0 | 0 | 0 | 0 | 0.059701 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.142857 | 0.428571 | 0 | 0.714286 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
5a65678ea518f85abf324edc56a4740ad72c3fe6 | 85 | py | Python | utils/utils.py | tuzhucheng/sent-sim | ebda09322be1dca3e967b80ffcf6437adb789132 | [
"MIT"
] | 109 | 2017-12-09T04:52:06.000Z | 2022-02-08T17:41:37.000Z | utils/utils.py | tuzhucheng/sent-sim | ebda09322be1dca3e967b80ffcf6437adb789132 | [
"MIT"
] | 5 | 2018-06-05T01:50:02.000Z | 2021-03-14T04:45:02.000Z | utils/utils.py | tuzhucheng/sent-sim | ebda09322be1dca3e967b80ffcf6437adb789132 | [
"MIT"
] | 24 | 2018-04-27T01:52:34.000Z | 2021-12-21T09:21:26.000Z | import torch
def save_checkpoint(state, filename):
torch.save(state, filename)
| 14.166667 | 37 | 0.752941 | 11 | 85 | 5.727273 | 0.636364 | 0.412698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152941 | 85 | 5 | 38 | 17 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0c80f148b42f75fc26a4a88f9a3fab7c28807bb8 | 101 | py | Python | lib_import/exceptions.py | fannydufresne/lib_import | 7105fa5e054d4008eddc79c1b02b397e41f7a45f | [
"Apache-2.0"
] | null | null | null | lib_import/exceptions.py | fannydufresne/lib_import | 7105fa5e054d4008eddc79c1b02b397e41f7a45f | [
"Apache-2.0"
] | 4 | 2021-03-19T10:23:02.000Z | 2022-02-10T11:24:29.000Z | lib_import/exceptions.py | fannydufresne/lib_import | 7105fa5e054d4008eddc79c1b02b397e41f7a45f | [
"Apache-2.0"
] | null | null | null | """Usefull error classes."""
class ImportError(Exception):
"""A generic exception."""
pass
| 14.428571 | 30 | 0.643564 | 10 | 101 | 6.5 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188119 | 101 | 6 | 31 | 16.833333 | 0.792683 | 0.425743 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
0c9e7c80b6e06a7d2b1d111440377b9a7a94204f | 199 | py | Python | tests/test_queries.py | kingoodie/py2gds | 1c264968da13c7330e8181b54a8bcc3bf233dd1a | [
"MIT"
] | null | null | null | tests/test_queries.py | kingoodie/py2gds | 1c264968da13c7330e8181b54a8bcc3bf233dd1a | [
"MIT"
] | 1 | 2021-03-09T12:09:35.000Z | 2021-03-09T12:09:35.000Z | tests/test_queries.py | kingoodie/py2gds | 1c264968da13c7330e8181b54a8bcc3bf233dd1a | [
"MIT"
] | null | null | null | from py2gds.connection import Connection
from py2gds.queries import MatchNode
def test_create_node(graph_connection: Connection, home_page):
assert MatchNode(graph_connection, home_page).run()
| 28.428571 | 62 | 0.829146 | 26 | 199 | 6.115385 | 0.576923 | 0.125786 | 0.226415 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011236 | 0.105528 | 199 | 6 | 63 | 33.166667 | 0.882022 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | false | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0ca317f3cff83ad3323a41d7ea9b2e526efaf5f7 | 102 | py | Python | src/hbcomp/comp/__init__.py | zgoda/hbcomp | 0b787a05f2cd512c44363daaa560ec74cc9d6261 | [
"MIT"
] | null | null | null | src/hbcomp/comp/__init__.py | zgoda/hbcomp | 0b787a05f2cd512c44363daaa560ec74cc9d6261 | [
"MIT"
] | null | null | null | src/hbcomp/comp/__init__.py | zgoda/hbcomp | 0b787a05f2cd512c44363daaa560ec74cc9d6261 | [
"MIT"
] | null | null | null | from flask import Blueprint
comp_bp = Blueprint('comp', __name__)
from . import views # noqa: F401
| 17 | 37 | 0.735294 | 14 | 102 | 5 | 0.714286 | 0.371429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 0.176471 | 102 | 5 | 38 | 20.4 | 0.797619 | 0.098039 | 0 | 0 | 0 | 0 | 0.044444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 6 |
0cb4af222ce2d23fbdbbf25437c3f0ae1c06f04e | 38 | py | Python | stockscrape/__init__.py | rawsashimi1604/stockscrape_v2 | 6f34da0805c50285268f56b45e0a6488ecd524ce | [
"MIT"
] | 2 | 2021-07-25T14:36:43.000Z | 2021-12-08T00:37:45.000Z | stockscrape/__init__.py | rawsashimi1604/stockscrape_v2 | 6f34da0805c50285268f56b45e0a6488ecd524ce | [
"MIT"
] | null | null | null | stockscrape/__init__.py | rawsashimi1604/stockscrape_v2 | 6f34da0805c50285268f56b45e0a6488ecd524ce | [
"MIT"
] | 1 | 2021-07-25T19:24:08.000Z | 2021-07-25T19:24:08.000Z | from stockscrape.ticker import Ticker
| 19 | 37 | 0.868421 | 5 | 38 | 6.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 1 | 38 | 38 | 0.970588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0b291ab519efb13580bec65bd544493aa791d1d4 | 4,842 | py | Python | tests/test_relaxation_algorithm.py | penggan666/index_selection_evaluation | b6daf1f30c24a0675f4e3acfbd17304e5d91cfd6 | [
"MIT"
] | 37 | 2020-03-03T10:59:06.000Z | 2022-03-29T11:51:37.000Z | tests/test_relaxation_algorithm.py | Jiachen-Shi/index_selection_evaluation | fb22b929cbab22377e90a12ae23ea4002d8eab7b | [
"MIT"
] | 19 | 2020-03-10T14:55:56.000Z | 2021-05-20T09:54:32.000Z | tests/test_relaxation_algorithm.py | Jiachen-Shi/index_selection_evaluation | fb22b929cbab22377e90a12ae23ea4002d8eab7b | [
"MIT"
] | 14 | 2020-08-10T03:12:40.000Z | 2022-02-28T06:08:16.000Z | import unittest
from unittest.mock import patch
from selection.algorithms.relaxation_algorithm import RelaxationAlgorithm
from selection.index import Index
from selection.workload import Workload
from tests.mock_connector import (
MockConnector,
column_A_0,
column_A_1,
mock_cache,
query_0,
query_1,
)
class TestRelaxationAlgorithm(unittest.TestCase):
def setUp(self):
self.connector = MockConnector()
@staticmethod
def set_estimated_index_sizes(indexes, store_size=True):
for index in indexes:
if index.estimated_size is None:
index.estimated_size = len(index.columns) * 1000 * 1000
@staticmethod
def set_estimated_index_size(index):
index.estimated_size = len(index.columns) * 1000 * 1000
@patch("selection.algorithms.relaxation_algorithm.get_utilized_indexes")
def test_calculate_indexes_3000MB_2column(self, get_utilized_indexes_mock):
algorithm = RelaxationAlgorithm(
database_connector=self.connector,
parameters={"max_index_width": 2, "budget_MB": 3},
)
algorithm.cost_evaluation.cache = mock_cache
algorithm.cost_evaluation._prepare_cost_calculation = (
self.set_estimated_index_sizes
)
algorithm.cost_evaluation.estimate_size = self.set_estimated_index_size
get_utilized_indexes_mock.return_value = (
{
Index([column_A_0], 1000 * 1000),
Index([column_A_0, column_A_1], 2000 * 1000),
},
None,
)
index_selection = algorithm.calculate_best_indexes(Workload([query_0, query_1]))
self.assertEqual(
set(index_selection),
set([Index([column_A_0]), Index([column_A_0, column_A_1])]),
)
@patch("selection.algorithms.relaxation_algorithm.get_utilized_indexes")
def test_calculate_indexes_2MB_2column(self, get_utilized_indexes_mock):
algorithm = RelaxationAlgorithm(
database_connector=self.connector,
parameters={"max_index_width": 2, "budget_MB": 2},
)
algorithm.cost_evaluation.cache = mock_cache
algorithm.cost_evaluation._prepare_cost_calculation = (
self.set_estimated_index_sizes
)
algorithm.cost_evaluation.estimate_size = self.set_estimated_index_size
get_utilized_indexes_mock.return_value = (
{
Index([column_A_0], 1000 * 1000),
Index([column_A_0, column_A_1], 2000 * 1000),
},
None,
)
index_selection = algorithm.calculate_best_indexes(Workload([query_0, query_1]))
self.assertEqual(set(index_selection), set([Index([column_A_0, column_A_1])]))
@patch("selection.algorithms.relaxation_algorithm.get_utilized_indexes")
def test_calculate_indexes_1MB_2column(self, get_utilized_indexes_mock):
algorithm = RelaxationAlgorithm(
database_connector=self.connector,
parameters={"max_index_width": 2, "budget_MB": 1},
)
algorithm.cost_evaluation.cache = mock_cache
algorithm.cost_evaluation._prepare_cost_calculation = (
self.set_estimated_index_sizes
)
algorithm.cost_evaluation.estimate_size = self.set_estimated_index_size
get_utilized_indexes_mock.return_value = (
{
Index([column_A_0], 1000 * 1000),
Index([column_A_0, column_A_1], 2000 * 1000),
},
None,
)
index_selection = algorithm.calculate_best_indexes(Workload([query_0, query_1]))
# The single column index is dropped first, because of the lower penalty.
# The multi column index is prefixed second.
self.assertEqual(set(index_selection), {Index([column_A_0])})
@patch("selection.algorithms.relaxation_algorithm.get_utilized_indexes")
def test_calculate_indexes_500kB_2column(self, get_utilized_indexes_mock):
algorithm = RelaxationAlgorithm(
database_connector=self.connector,
parameters={"max_index_width": 2, "budget_MB": 0.5},
)
algorithm.cost_evaluation.cache = mock_cache
algorithm.cost_evaluation._prepare_cost_calculation = (
self.set_estimated_index_sizes
)
algorithm.cost_evaluation.estimate_size = self.set_estimated_index_size
get_utilized_indexes_mock.return_value = (
{
Index([column_A_0], 1000 * 1000),
Index([column_A_0, column_A_1], 2000 * 1000),
},
None,
)
index_selection = algorithm.calculate_best_indexes(Workload([query_0, query_1]))
self.assertEqual(set(index_selection), set())
if __name__ == "__main__":
unittest.main()
| 37.828125 | 88 | 0.663362 | 537 | 4,842 | 5.581006 | 0.156425 | 0.046713 | 0.034701 | 0.052052 | 0.818819 | 0.786787 | 0.781448 | 0.781448 | 0.754087 | 0.754087 | 0 | 0.036364 | 0.25031 | 4,842 | 127 | 89 | 38.125984 | 0.789256 | 0.023544 | 0 | 0.477064 | 0 | 0 | 0.074497 | 0.052487 | 0 | 0 | 0 | 0 | 0.036697 | 1 | 0.06422 | false | 0 | 0.055046 | 0 | 0.12844 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0b3fc135803f14c4090abdc3c650c3055228c2cb | 28 | py | Python | test2.py | cryptoandrew/Python | f1a36dbc5fca3061adb42e2f9ef70d30078da6cf | [
"Apache-2.0"
] | null | null | null | test2.py | cryptoandrew/Python | f1a36dbc5fca3061adb42e2f9ef70d30078da6cf | [
"Apache-2.0"
] | null | null | null | test2.py | cryptoandrew/Python | f1a36dbc5fca3061adb42e2f9ef70d30078da6cf | [
"Apache-2.0"
] | null | null | null | print("this is a try run")
| 9.333333 | 26 | 0.642857 | 6 | 28 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214286 | 28 | 2 | 27 | 14 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.62963 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
0b5c9d6d015a66099bc89adc6b146a591d89f099 | 108 | py | Python | myshowsapi/exceptions.py | kreedz/myshowsapi | 55c0504ea6e58ce2144c56ee4a6af333411ff8a3 | [
"MIT"
] | 1 | 2017-07-22T03:57:23.000Z | 2017-07-22T03:57:23.000Z | myshowsapi/exceptions.py | kreedz/myshowsapi | 55c0504ea6e58ce2144c56ee4a6af333411ff8a3 | [
"MIT"
] | null | null | null | myshowsapi/exceptions.py | kreedz/myshowsapi | 55c0504ea6e58ce2144c56ee4a6af333411ff8a3 | [
"MIT"
] | null | null | null | class MyShowsException(Exception):
pass
class MyShowsRetrieveTokenErrorException(Exception):
pass
| 15.428571 | 52 | 0.796296 | 8 | 108 | 10.75 | 0.625 | 0.302326 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 108 | 6 | 53 | 18 | 0.934783 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
0b7190eebe26f69546112cc16c895eea13f55903 | 118 | py | Python | tests/functional/modules/pyi_import_pyqt_uic_port/PyQt5/uic/__init__.py | hawkhai/pyinstaller | 016a24479b34de161792c72dde455a81ad4c78ae | [
"Apache-2.0"
] | 9,267 | 2015-01-01T04:08:45.000Z | 2022-03-31T11:42:38.000Z | tests/functional/modules/pyi_import_pyqt_uic_port/PyQt5/uic/__init__.py | jeremysanders/pyinstaller | 321b24f9a9a5978337735816b36ca6b4a90a2fb4 | [
"Apache-2.0"
] | 5,150 | 2015-01-01T12:09:56.000Z | 2022-03-31T18:06:12.000Z | tests/functional/modules/pyi_import_pyqt_uic_port/PyQt5/uic/__init__.py | jeremysanders/pyinstaller | 321b24f9a9a5978337735816b36ca6b4a90a2fb4 | [
"Apache-2.0"
] | 2,101 | 2015-01-03T10:25:27.000Z | 2022-03-30T11:04:42.000Z | print('this is my faked PyQt5.uic package')
__pyinstaller_fake_module_marker__ = '__pyinstaller_fake_module_marker__'
| 39.333333 | 73 | 0.855932 | 16 | 118 | 5.4375 | 0.75 | 0.344828 | 0.482759 | 0.62069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009174 | 0.076271 | 118 | 2 | 74 | 59 | 0.788991 | 0 | 0 | 0 | 0 | 0 | 0.576271 | 0.288136 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
0b8921ac35cfeb004bf61429a1e767221829644e | 167 | py | Python | src/test/tinc/tinctest/test/discovery/mockstorage/uao/test_smoke_uao.py | rodel-talampas/gpdb | 9c955e350334abbd922102f289f782697eb52069 | [
"PostgreSQL",
"Apache-2.0"
] | 9 | 2018-04-20T03:31:01.000Z | 2020-05-13T14:10:53.000Z | src/test/tinc/tinctest/test/discovery/mockstorage/uao/test_smoke_uao.py | rodel-talampas/gpdb | 9c955e350334abbd922102f289f782697eb52069 | [
"PostgreSQL",
"Apache-2.0"
] | 36 | 2017-09-21T09:12:27.000Z | 2020-06-17T16:40:48.000Z | src/test/tinc/tinctest/test/discovery/mockstorage/uao/test_smoke_uao.py | rodel-talampas/gpdb | 9c955e350334abbd922102f289f782697eb52069 | [
"PostgreSQL",
"Apache-2.0"
] | 32 | 2017-08-31T12:50:52.000Z | 2022-03-01T07:34:53.000Z | import tinctest
class UAOSmokeTests(tinctest.TINCTestCase):
def test_smoke_cardinality1(self):
pass
def test_smoke_cardinality2(self):
pass
| 16.7 | 43 | 0.718563 | 18 | 167 | 6.444444 | 0.666667 | 0.12069 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015385 | 0.221557 | 167 | 9 | 44 | 18.555556 | 0.876923 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0.166667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
0ba78b8b7f043c83e4d467e0147219f957472198 | 882 | gyp | Python | deps/libgdal/gyp-formats/nitf.gyp | jimgambale/node-gdal | dc5c89fb23f1004732106250c8b7d57f380f9b61 | [
"Apache-2.0"
] | 462 | 2015-01-07T23:09:18.000Z | 2022-03-30T03:58:09.000Z | deps/libgdal/gyp-formats/nitf.gyp | jimgambale/node-gdal | dc5c89fb23f1004732106250c8b7d57f380f9b61 | [
"Apache-2.0"
] | 196 | 2015-01-07T11:10:35.000Z | 2022-03-29T08:50:30.000Z | deps/libgdal/gyp-formats/nitf.gyp | jimgambale/node-gdal | dc5c89fb23f1004732106250c8b7d57f380f9b61 | [
"Apache-2.0"
] | 113 | 2015-01-15T02:24:18.000Z | 2021-11-22T06:05:52.000Z | {
"includes": [
"../common.gypi"
],
"targets": [
{
"target_name": "libgdal_nitf_frmt",
"type": "static_library",
"sources": [
"../gdal/frmts/nitf/ecrgtocdataset.cpp",
"../gdal/frmts/nitf/mgrs.c",
"../gdal/frmts/nitf/nitf_gcprpc.cpp",
"../gdal/frmts/nitf/nitfaridpcm.cpp",
"../gdal/frmts/nitf/nitfbilevel.cpp",
"../gdal/frmts/nitf/nitfdataset.cpp",
"../gdal/frmts/nitf/nitfdes.c",
"../gdal/frmts/nitf/nitfdump.c",
"../gdal/frmts/nitf/nitffile.c",
"../gdal/frmts/nitf/nitfimage.c",
"../gdal/frmts/nitf/nitfrasterband.cpp",
"../gdal/frmts/nitf/nitfwritejpeg.cpp",
"../gdal/frmts/nitf/nitfwritejpeg_12.cpp",
"../gdal/frmts/nitf/rpftocdataset.cpp",
"../gdal/frmts/nitf/rpftocfile.cpp"
],
"include_dirs": [
"../gdal/frmts/nitf",
"../gdal/frmts/vrt",
"../gdal/frmts/gtiff/libtiff"
]
}
]
}
| 25.941176 | 46 | 0.596372 | 102 | 882 | 5.088235 | 0.362745 | 0.312139 | 0.400771 | 0.277457 | 0.111753 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002692 | 0.157596 | 882 | 33 | 47 | 26.727273 | 0.695828 | 0 | 0 | 0.060606 | 0 | 0 | 0.738095 | 0.591837 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e7f8e9ac97bbcc4ac010a12e2c6d1f69f685457c | 36 | py | Python | envs/__init__.py | jairotunior/gym_go | c626ce79f4233c949787064884e806172a786a58 | [
"MIT"
] | null | null | null | envs/__init__.py | jairotunior/gym_go | c626ce79f4233c949787064884e806172a786a58 | [
"MIT"
] | null | null | null | envs/__init__.py | jairotunior/gym_go | c626ce79f4233c949787064884e806172a786a58 | [
"MIT"
] | null | null | null | from gym_go.envs.go_env import GoEnv | 36 | 36 | 0.861111 | 8 | 36 | 3.625 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.878788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f03c3a835d2024c9f9c89d1deae577af2ab59e4c | 123 | py | Python | telemetry-pi/main.py | USCSolarCarStrategyTeam/driver-display | a9570f5c7ea98a966f95d6ddc32b88e22e29bd56 | [
"Apache-2.0"
] | null | null | null | telemetry-pi/main.py | USCSolarCarStrategyTeam/driver-display | a9570f5c7ea98a966f95d6ddc32b88e22e29bd56 | [
"Apache-2.0"
] | 2 | 2017-11-04T22:44:41.000Z | 2018-03-23T02:14:20.000Z | telemetry-pi/main.py | USCSolarCarStrategyTeam/telemetry-pi | a9570f5c7ea98a966f95d6ddc32b88e22e29bd56 | [
"Apache-2.0"
] | 5 | 2017-10-07T22:54:09.000Z | 2018-01-27T21:20:51.000Z | from driver.Driver import Driver
from driver.Driver import Dummy2RPMDriver
c = Driver(Dummy2RPMDriver())
print(c.getRPM()) | 24.6 | 41 | 0.804878 | 16 | 123 | 6.1875 | 0.4375 | 0.20202 | 0.323232 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018018 | 0.097561 | 123 | 5 | 42 | 24.6 | 0.873874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0.25 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
f05803668020e4f3d55a55454a1a52684540bf84 | 9,784 | py | Python | tests/test_user.py | tappitikarrass/flask-ap | 88b97bb522474ca3dc056c209640050af74cb5dc | [
"BSD-3-Clause"
] | null | null | null | tests/test_user.py | tappitikarrass/flask-ap | 88b97bb522474ca3dc056c209640050af74cb5dc | [
"BSD-3-Clause"
] | null | null | null | tests/test_user.py | tappitikarrass/flask-ap | 88b97bb522474ca3dc056c209640050af74cb5dc | [
"BSD-3-Clause"
] | null | null | null | from app import (db)
from .conftest import (
get_last_user,
get_token,
get_json_field
)
from .test_data import (
post_user_data_200,
post_user_data_400,
update_user_data_200,
login_creds_200,
login_creds_403,
login_creds_404,
login_creds_alt_200
)
class TestUser:
class TestUserLogin:
def test_login_200(self, client, create_user, delete_user):
response = client.post("/login",
headers={'Authorization': 'Basic ' +
login_creds_200}
)
assert response.status_code == 200
def test_login_401(self, client, create_user, delete_user):
response = client.post("/login")
assert response.status_code == 401
def test_login_403(self, client, create_user, delete_user):
response = client.post("/login",
headers={'Authorization': 'Basic ' +
login_creds_403}
)
assert response.status_code == 403
def test_login_404(self, client, create_user, delete_user):
response = client.post("/login",
headers={'Authorization': 'Basic ' +
login_creds_404}
)
assert response.status_code == 404
class TestUserLogout:
def test_logout_200(self, client, create_user, delete_user):
response = client.delete("/logout",
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 200
def test_logout_401(self, client, create_user, delete_user):
response = client.delete("/logout")
assert response.status_code == 401
def test_logout_422(self, client, create_user, delete_user):
response = client.delete("/logout",
headers={'Authorization': 'Bearer ' +
'invalid_token'}
)
assert response.status_code == 422
class TestUserGetUsers:
def test_get_users_200(self, client, create_admin, delete_user):
response = client.get("/user",
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 200
def test_get_users_403(self, client, create_admin,
delete_user_alt, delete_user):
response = client.get("/user",
headers={'Authorization': 'Bearer ' +
get_token(client, login_creds_alt_200)}
)
assert response.status_code == 403
class TestUserPost:
def test_post_user_200(self, client, delete_user):
response = client.post("/user",
json=post_user_data_200)
assert response.status_code == 200
def test_post_user_400(self, client):
response = client.post("/user",
json=post_user_data_400)
assert response.status_code == 400
class TestUserDelete:
def test_delete_user_200(self, client, create_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.delete(url,
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 200
def test_delete_user_401(self, client, create_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.delete(url)
assert response.status_code == 401
def test_delete_user_403(self, client, create_user, create_user_alt,
delete_user_alt, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.delete(url,
headers={'Authorization': 'Bearer ' +
get_token(client, login_creds_200)}
)
assert response.status_code == 403
def test_delete_user_404(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id + 1)
response = client.delete(url,
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 404
def test_delete_user_422(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.delete(url,
headers={'Authorization': 'Bearer ' +
'invalid_token'}
)
assert response.status_code == 422
class TestUserUpdate:
def test_update_user_200(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.put(url,
json=update_user_data_200,
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 200
def test_update_user_401(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.put(url, json=update_user_data_200)
assert response.status_code == 401
def test_update_user_403(self, client, create_user, create_user_alt,
delete_user_alt, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.put(url,
json=update_user_data_200,
headers={'Authorization': 'Bearer ' +
get_token(client, login_creds_200)}
)
assert response.status_code == 403
def test_update_user_404(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id + 1)
response = client.put(url,
json=update_user_data_200,
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 404
def test_update_user_422(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.put(url,
json=update_user_data_200,
headers={'Authorization': 'Bearer ' +
'invalid_token'}
)
assert response.status_code == 422
class TestUserGetById:
def test_get_user_200(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.get(url,
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 200
def test_get_user_401(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.get(url)
assert response.status_code == 401
def test_get_user_403(self, client, create_user, create_user_alt,
delete_user_alt, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.get(url,
headers={'Authorization': 'Bearer ' +
get_token(client, login_creds_200)}
)
assert response.status_code == 403
def test_get_user_404(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id + 1)
response = client.get(url,
headers={'Authorization': 'Bearer ' +
get_token(client)}
)
assert response.status_code == 404
def test_get_user_422(self, client, create_user, delete_user):
user_id = get_last_user().user_id
url = "/user/" + str(user_id)
response = client.get(url,
headers={'Authorization': 'Bearer ' +
'invalid_token'}
)
assert response.status_code == 422
| 48.92 | 76 | 0.483647 | 925 | 9,784 | 4.781622 | 0.059459 | 0.061045 | 0.067827 | 0.141081 | 0.870224 | 0.860502 | 0.838797 | 0.806692 | 0.755144 | 0.75243 | 0 | 0.039986 | 0.432543 | 9,784 | 199 | 77 | 49.165829 | 0.756664 | 0 | 0 | 0.59596 | 0 | 0 | 0.059689 | 0 | 0 | 0 | 0 | 0 | 0.131313 | 1 | 0.131313 | false | 0 | 0.015152 | 0 | 0.186869 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f07bb016ffe9e9b4645a678d70ba08ed4ec3569c | 148 | py | Python | proj_vis/demo_test.py | SysCV/project-template | e4e73121c61ae3d8e361b5fdfe291d6989dd1a97 | [
"Apache-2.0"
] | 2 | 2021-12-07T06:44:13.000Z | 2022-03-31T02:02:27.000Z | proj_vis/demo_test.py | SysCV/project-template | e4e73121c61ae3d8e361b5fdfe291d6989dd1a97 | [
"Apache-2.0"
] | null | null | null | proj_vis/demo_test.py | SysCV/project-template | e4e73121c61ae3d8e361b5fdfe291d6989dd1a97 | [
"Apache-2.0"
] | 1 | 2021-12-07T06:44:17.000Z | 2021-12-07T06:44:17.000Z | """An example of unit tests based on pytest."""
from .demo import add
def test_add() -> None:
"""Test add."""
assert add([1, 2, 3]) == 6
| 16.444444 | 47 | 0.574324 | 24 | 148 | 3.5 | 0.833333 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035398 | 0.236486 | 148 | 8 | 48 | 18.5 | 0.707965 | 0.344595 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6520562bbba8e4e8365d0d3982e4f1d7a3f9bcad | 2,366 | py | Python | xlib/api/win32/dxgi/structs.py | jkennedyvz/DeepFaceLive | 274c20808da089eb7fc0fc0e8abe649379a29ffe | [
"MIT"
] | 3 | 2021-12-08T08:59:50.000Z | 2022-02-08T02:54:27.000Z | xlib/api/win32/dxgi/structs.py | jkennedyvz/DeepFaceLive | 274c20808da089eb7fc0fc0e8abe649379a29ffe | [
"MIT"
] | 1 | 2022-02-08T01:29:03.000Z | 2022-02-08T01:29:03.000Z | xlib/api/win32/dxgi/structs.py | jkennedyvz/DeepFaceLive | 274c20808da089eb7fc0fc0e8abe649379a29ffe | [
"MIT"
] | 1 | 2021-12-14T09:18:15.000Z | 2021-12-14T09:18:15.000Z | import ctypes as ct
from ctypes import (POINTER, WINFUNCTYPE, byref, c_byte, c_int64,
c_longlong, c_size_t, c_ubyte, c_uint, c_uint32, c_ulong,
c_void_p, c_wchar)
class DXGI_ADAPTER_FLAG(c_uint):
DXGI_ADAPTER_FLAG_NONE = 0
DXGI_ADAPTER_FLAG_REMOTE = 1
DXGI_ADAPTER_FLAG_SOFTWARE = 2
DXGI_ADAPTER_FLAG_FORCE_DWORD = 0xffffffff
class DXGI_ADAPTER_DESC(ct.Structure):
_fields_ = [('Description', c_wchar * 128),
('VendorId', c_uint),
('DeviceId', c_uint),
('SubSysId', c_uint),
('Revision', c_uint),
('DedicatedVideoMemory', c_size_t),
('DedicatedSystemMemory', c_size_t),
('SharedSystemMemory', c_size_t),
('AdapterLuid', c_longlong),
]
def __init__(self):
self.Description : c_wchar * 128 = ''
self.VendorId : c_uint = c_uint()
self.DeviceId : c_uint = c_uint()
self.SubSysId : c_uint = c_uint()
self.Revision : c_uint = c_uint()
self.DedicatedVideoMemory : c_size_t = c_size_t()
self.DedicatedSystemMemory : c_size_t = c_size_t()
self.SharedSystemMemory : c_size_t = c_size_t()
self.AdapterLuid : c_longlong = c_longlong()
super().__init__()
class DXGI_ADAPTER_DESC1(ct.Structure):
_fields_ = [('Description', c_wchar * 128),
('VendorId', c_uint),
('DeviceId', c_uint),
('SubSysId', c_uint),
('Revision', c_uint),
('DedicatedVideoMemory', c_size_t),
('DedicatedSystemMemory', c_size_t),
('SharedSystemMemory', c_size_t),
('AdapterLuid', c_longlong),
('Flags', c_uint),
]
def __init__(self):
self.Description : c_wchar * 128 = ''
self.VendorId : c_uint = c_uint()
self.DeviceId : c_uint = c_uint()
self.SubSysId : c_uint = c_uint()
self.Revision : c_uint = c_uint()
self.DedicatedVideoMemory : c_size_t = c_size_t()
self.DedicatedSystemMemory : c_size_t = c_size_t()
self.SharedSystemMemory : c_size_t = c_size_t()
self.AdapterLuid : c_longlong = c_longlong()
self.Flags : c_uint = c_uint()
super().__init__()
| 38.786885 | 77 | 0.572274 | 272 | 2,366 | 4.518382 | 0.194853 | 0.117982 | 0.092758 | 0.07323 | 0.730675 | 0.730675 | 0.730675 | 0.730675 | 0.730675 | 0.730675 | 0 | 0.012979 | 0.316145 | 2,366 | 60 | 78 | 39.433333 | 0.746601 | 0 | 0 | 0.727273 | 0 | 0 | 0.097633 | 0.017751 | 0 | 0 | 0.004227 | 0 | 0 | 1 | 0.036364 | false | 0 | 0.036364 | 0 | 0.236364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3303b8a37a4e0fc8b241c03d7187d7c2e0c2bca3 | 80 | py | Python | __init__.py | ltoosaint24/lambdata | cbb1d89962ca5cc17eec211ccc6110083a1b888c | [
"MIT"
] | null | null | null | __init__.py | ltoosaint24/lambdata | cbb1d89962ca5cc17eec211ccc6110083a1b888c | [
"MIT"
] | null | null | null | __init__.py | ltoosaint24/lambdata | cbb1d89962ca5cc17eec211ccc6110083a1b888c | [
"MIT"
] | null | null | null |
from lambdata.ltoussaint import HFunction
from lambdata.ltoussaint import fibb
| 20 | 41 | 0.8625 | 10 | 80 | 6.9 | 0.7 | 0.463768 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1125 | 80 | 3 | 42 | 26.666667 | 0.971831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
33525dce0f11cfa2820f3d51d615927225f24409 | 48 | py | Python | swinginghead/parser/__init__.py | isidentical-archive/swinginghead | cabce05378ee8eea71afc98ff931e227cc55f7e8 | [
"MIT"
] | 8 | 2019-04-17T17:18:16.000Z | 2019-04-21T20:37:47.000Z | swinginghead/parser/__init__.py | isidentical/swinginghead | cabce05378ee8eea71afc98ff931e227cc55f7e8 | [
"MIT"
] | null | null | null | swinginghead/parser/__init__.py | isidentical/swinginghead | cabce05378ee8eea71afc98ff931e227cc55f7e8 | [
"MIT"
] | null | null | null | from swinginghead.parser.pgen import get_parser
| 24 | 47 | 0.875 | 7 | 48 | 5.857143 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 48 | 1 | 48 | 48 | 0.931818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3357dda53f765f40952720ba96519c93887741ae | 31 | py | Python | AIShooter/__init__.py | SanteriHetekivi/AIShooter | f73461c9075cf5b0789679782fe21dee54f2775e | [
"Apache-2.0"
] | null | null | null | AIShooter/__init__.py | SanteriHetekivi/AIShooter | f73461c9075cf5b0789679782fe21dee54f2775e | [
"Apache-2.0"
] | null | null | null | AIShooter/__init__.py | SanteriHetekivi/AIShooter | f73461c9075cf5b0789679782fe21dee54f2775e | [
"Apache-2.0"
] | null | null | null | from .classes.Game import Game
| 15.5 | 30 | 0.806452 | 5 | 31 | 5 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3370394524cc24369b586ff5d671193f3d3a18ba | 271 | py | Python | default/modules/git.py | AshlynnInWonderland/zsh-powerline | e6f3326b3e15d8a89a0ea959314ea0ea5768ea86 | [
"MIT"
] | null | null | null | default/modules/git.py | AshlynnInWonderland/zsh-powerline | e6f3326b3e15d8a89a0ea959314ea0ea5768ea86 | [
"MIT"
] | null | null | null | default/modules/git.py | AshlynnInWonderland/zsh-powerline | e6f3326b3e15d8a89a0ea959314ea0ea5768ea86 | [
"MIT"
] | null | null | null | import subprocess as sp
#this code is disgusting and I'm ashamed of it
def returnText():
return " " + sp.check_output(['git','symbolic-ref','--quiet','HEAD'], stdin=None, stderr=sp.PIPE, shell=False, universal_newlines=False).strip().decode('utf-8').split('/')[2]
| 38.714286 | 179 | 0.690037 | 42 | 271 | 4.428571 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008299 | 0.110701 | 271 | 6 | 180 | 45.166667 | 0.759336 | 0.166052 | 0 | 0 | 0 | 0 | 0.151111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
68569fa7285a20cd3c59354e26115e30d8713487 | 99 | py | Python | tests/test_myfuncs.py | Humungus-Fungus/Knito | d10a38856375286c443673b17a50feac7b0e6b3a | [
"MIT"
] | null | null | null | tests/test_myfuncs.py | Humungus-Fungus/Knito | d10a38856375286c443673b17a50feac7b0e6b3a | [
"MIT"
] | null | null | null | tests/test_myfuncs.py | Humungus-Fungus/Knito | d10a38856375286c443673b17a50feac7b0e6b3a | [
"MIT"
] | null | null | null | from pylib import myfuncs
def test_conc():
assert myfuncs.conc("hello ", "world")=="hello world"
| 19.8 | 54 | 0.717172 | 14 | 99 | 5 | 0.714286 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131313 | 99 | 4 | 55 | 24.75 | 0.813953 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
689319af485dcbc48b8242d942e0853dcad086e7 | 13,901 | py | Python | app/python/testing/modules_and_functions/test_enrichment.py | RadjaHachilif/agotool | 2fcc3fd5a156053b528ec927bab79ddaf7af2dec | [
"MIT"
] | 6 | 2016-04-14T11:47:43.000Z | 2022-01-29T14:34:59.000Z | app/python/testing/modules_and_functions/test_enrichment.py | RadjaHachilif/agotool | 2fcc3fd5a156053b528ec927bab79ddaf7af2dec | [
"MIT"
] | 2 | 2019-12-21T12:15:46.000Z | 2021-01-08T12:22:17.000Z | app/python/testing/modules_and_functions/test_enrichment.py | RadjaHachilif/agotool | 2fcc3fd5a156053b528ec927bab79ddaf7af2dec | [
"MIT"
] | 1 | 2021-03-04T10:26:18.000Z | 2021-03-04T10:26:18.000Z | import sys, os
# sys.path.insert(0, os.path.dirname(os.path.abspath(os.path.realpath(__file__))))
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(os.path.realpath(__file__)), "../.."))) # to get to python directory
import pytest
import requests
import ast
import variables, ratio, query, userinput, enrichment, run
def format_for_REST_API(list_of_string):
return "%0d".join(list_of_string)
def test_count_terms_v3(random_foreground_background, pqo_STRING):
"""
this test is for ratio.count_terms_v3,
since it is testing for the presence of secondary IDs
# goterm: 'GO:0007610' has secondary id 'GO:0044708'
:param random_foreground_background:
:param pqo_STRING:
:return:
"""
foreground, background, taxid = random_foreground_background
etype_2_association_dict_foreground = pqo_STRING.get_association_dict_split_by_category(foreground)
go_slim_or_basic = "basic"
for entity_type in variables.entity_types_with_data_in_functions_table:
obo_dag = run.pick_dag_from_entity_type_and_basic_or_slim(entity_type, go_slim_or_basic, pqo_STRING)
assoc_dict = etype_2_association_dict_foreground[entity_type]
for an in (AN for AN in set(foreground) if AN in assoc_dict):
for association in assoc_dict[an]:
association_id = obo_dag[association].id
assert association_id == association
association_2_count_dict_v2, association_2_ANs_dict_v2, ans_counter_v2 = ratio.count_terms_v2(set(background), assoc_dict, obo_dag)
association_2_count_dict_v3, association_2_ANs_dict_v3, ans_counter_v3 = ratio.count_terms_v3(set(background), assoc_dict)
assert association_2_count_dict_v2 == association_2_count_dict_v3
assert association_2_ANs_dict_v2 == association_2_ANs_dict_v3
assert ans_counter_v2 <= ans_counter_v3
def test_EnrichmentStudy_genome(random_foreground_background, pqo_STRING, args_dict):
"""
checking for non empty results dictionary
perc_association_foreground <= 100
perc_asociation_background <= 100
foreground_count <= foreground_n
background_count <= background_n
:return:
"""
go_slim_or_basic = "basic"
o_or_u_or_both = "overrepresented"
multitest_method = "benjamini_hochberg"
output_format = "json"
foreground, background, taxid = random_foreground_background
background_n = pqo_STRING.get_proteome_count_from_taxid(int(taxid))
assert background_n == len(background)
assert len(foreground) <= len(background)
# ui = userinput.REST_API_input(pqo_STRING,
# foreground_string=format_for_REST_API(foreground),
# background_string=format_for_REST_API(background),
# enrichment_method="genome") #, background_n=len(background))
args_dict_temp = args_dict.copy()
args_dict_temp.update({"foreground":format_for_REST_API(foreground),
"background":format_for_REST_API(background),
"enrichment_method":"genome"})
ui = userinput.REST_API_input(pqo_STRING, args_dict_temp)
etype_2_association_dict_foreground = pqo_STRING.get_association_dict_split_by_category(foreground)
# assoc_dict = etype_2_association_dict_foreground[entity_type]
etype_2_association_2_count_dict_background, etype_2_association_2_ANs_dict_background, _ = query.get_association_2_count_ANs_background_split_by_entity(taxid)
for entity_type in variables.entity_types_with_data_in_functions_table:
dag = run.pick_dag_from_entity_type_and_basic_or_slim(entity_type, go_slim_or_basic, pqo_STRING)
assoc_dict = etype_2_association_dict_foreground[entity_type]
if bool(assoc_dict): # not empty dictionary
enrichment_study = enrichment.EnrichmentStudy(ui, assoc_dict, dag,
o_or_u_or_both=o_or_u_or_both,
multitest_method=multitest_method,
entity_type=entity_type,
association_2_count_dict_background=etype_2_association_2_count_dict_background[entity_type],
background_n=background_n)
result = enrichment_study.get_result(output_format)
assert result # not an empty dict
@pytest.mark.STRING_examples
def test_run_STRING_enrichment(pqo_STRING, STRING_examples, args_dict):
"""
checking that
:param pqo_STRING: PersistentQuery Object
:param STRING_examples: tuple (foreground ENSPs, taxid)
:param args_dict: dict (from conftest.py with default values)
:return
:
"""
enrichment_method = "compare_samples"
foreground, taxid = STRING_examples
background = query.get_proteins_of_taxid(taxid)
# background_n = pqo_STRING.get_proteome_count_from_taxid(taxid)
args_dict_temp = args_dict.copy()
args_dict_temp.update({"foreground":format_for_REST_API(foreground),
"background":format_for_REST_API(background),
"intensity":None,
"enrichment_method":enrichment_method})
# ui = userinput.REST_API_input(pqo_STRING, foreground_string=format_for_REST_API(foreground),
# background_string=format_for_REST_API(background), background_intensity=None, enrichment_method=enrichment_method)
ui = userinput.REST_API_input(pqo_STRING, args_dict_temp)
# results_all_function_types = run.run_STRING_enrichment(pqo=pqo_STRING, ui=ui, enrichment_method=enrichment_method,
# limit_2_entity_type=variables.limit_2_entity_types_ALL, output_format="json", FDR_cutoff=None)
args_dict_temp.update({"limit_2_entity_type":variables.limit_2_entity_types_ALL,
"output_format":"json",
"FDR_cutoff":None})
results_all_function_types = run.run_STRING_enrichment(pqo=pqo_STRING, ui=ui, args_dict=args_dict_temp)
assert results_all_function_types != {'message': 'Internal Server Error'}
etypes = variables.entity_types_with_data_in_functions_table
assert len(set(results_all_function_types.keys()).intersection(etypes)) == len(etypes)
for _, result in results_all_function_types.items():
# assert result is not empty
assert result
@pytest.mark.STRING_examples
def test_run_STRING_enrichment_genome(pqo_STRING, STRING_examples, args_dict):
foreground, taxid = STRING_examples
etype_2_association_dict = pqo_STRING.get_association_dict_split_by_category(foreground)
background_n = pqo_STRING.get_proteome_count_from_taxid(taxid)
args_dict_temp = args_dict.copy()
args_dict_temp.update({"foreground":format_for_REST_API(foreground),
"enrichment_method":"genome",
"background_n":background_n})
# ui = userinput.REST_API_input(pqo_STRING, foreground_string=format_for_REST_API(foreground), enrichment_method="genome", background_n=background_n)
ui = userinput.REST_API_input(pqo_STRING, args_dict_temp)
# results_all_function_types = run.run_STRING_enrichment_genome(pqo=pqo_STRING, ui=ui, taxid=taxid, background_n=background_n, output_format="json", FDR_cutoff=None)
args_dict_temp.update({"taxid":taxid,
"output_format":"json",
"FDR_cutoff":None})
results_all_function_types = run.run_STRING_enrichment_genome(pqo=pqo_STRING, ui=ui, background_n=background_n, args_dict=args_dict_temp)
assert results_all_function_types != {'message': 'Internal Server Error'}
# etypes = variables.entity_types_with_data_in_functions_table
# assert len(set(results_all_function_types.keys()).intersection(etypes)) == len(etypes) # incomplete overlap can be due to missing functional annotations for given ENSPs
for etype, result in results_all_function_types.items():
result = ast.literal_eval(result)
number_of_ENSPs_with_association = len(etype_2_association_dict[etype])
# number_of_associations = len(set(val for key, val in etype_2_association_dict[etype].items()))
number_of_associations = len({item for sublist in etype_2_association_dict[etype].values() for item in sublist})
assert len(result) == number_of_associations # number of rows in results --> number of associations
assert len(foreground) >= number_of_ENSPs_with_association # not every ENSP has functional associations
def test_EnrichmentStudy_(random_foreground_background, pqo_STRING, args_dict):
"""
perc_association_foreground <= 100
perc_asociation_background <= 100
foreground_count <= foreground_n
background_count <= background_n
:return:
"""
go_slim_or_basic = "basic"
o_or_u_or_both = "overrepresented"
multitest_method = "benjamini_hochberg"
output_format = "json"
foreground, background, taxid = random_foreground_background
background_n = pqo_STRING.get_proteome_count_from_taxid(int(taxid))
assert background_n == len(background)
assert len(foreground) <= len(background)
args_dict_temp = args_dict.copy()
args_dict_temp.update({"foreground":format_for_REST_API(foreground),
"background":format_for_REST_API(background),
"enrichment_method":"genome",
"background_n":len(background)})
# ui = userinput.REST_API_input(pqo_STRING,
# foreground_string=format_for_REST_API(foreground),
# background_string=format_for_REST_API(background),
# enrichment_method="genome", background_n=len(background))
ui = userinput.REST_API_input(pqo_STRING, args_dict_temp)
etype_2_association_dict_foreground = pqo_STRING.get_association_dict_split_by_category(foreground)
etype_2_association_2_count_dict_background, etype_2_association_2_ANs_dict_background, _ = query.get_association_2_count_ANs_background_split_by_entity(taxid)
for entity_type in variables.entity_types_with_data_in_functions_table:
dag = run.pick_dag_from_entity_type_and_basic_or_slim(entity_type, go_slim_or_basic, pqo_STRING)
assoc_dict = etype_2_association_dict_foreground[entity_type]
if bool(assoc_dict): # not empty dictionary
enrichment_study = enrichment.EnrichmentStudy(ui, assoc_dict, dag,
o_or_u_or_both=o_or_u_or_both,
multitest_method=multitest_method,
entity_type=entity_type,
association_2_count_dict_background=etype_2_association_2_count_dict_background[entity_type],
background_n=background_n)
result = enrichment_study.get_result(output_format)
assert result # not an empty dict
# def test_example_files():
# # assert 1 == 2
# pass
#
@pytest.mark.STRING_examples
def test_REST_API_genome(STRING_examples):
foreground, taxid = STRING_examples
fg = "%0d".join(foreground)
response = requests.post(variables.api_url, params={"output_format": "json",
"foreground": fg,
"enrichment_method": "genome",
"species": taxid})
assert type(response.json()) == dict
# @pytest.mark.STRING_examples
# def test_REST_API_characterize_foreground(STRING_examples):
# foreground, taxid = STRING_examples
# fg = "%0d".join(foreground)
# response = requests.post(variables.api_url, params={"output_format": "json",
# "foreground": fg,
# "enrichment_method": "characterize_foreground",
# "species": taxid})
# assert type(response.json()) == dict
#
# @pytest.mark.STRING_examples
# def test_REST_API_compare_samples(STRING_examples):
# foreground, taxid = STRING_examples
# background = query.get_proteins_of_taxid(taxid)
# fg = "%0d".join(foreground)
# bg = "%0d".join(background)
# response = requests.post(variables.api_url, params={"output_format": "json",
# "foreground": fg,
# "background": bg,
# "enrichment_method": "compare_samples",
# "species": taxid})
# assert type(response.json()) == dict
#
# @pytest.mark.STRING_examples
# def test_REST_API_error(STRING_examples):
# foreground, taxid = STRING_examples
# background = query.get_proteins_of_taxid(taxid)
# fg = "%0d".join(foreground)
# bg = "%0d".join(background)
# response = requests.post(variables.api_url, params={"output_format": "json",
# "foreground": fg,
# "background": bg,
# "enrichment_method": "compare_samples",
# "species": "INVALID_TAXID"})
# assert type(response.json()) == dict
#
# @pytest.mark.xfail(reason="wrong URL test should fail")
# def test_wrong_URL(STRING_examples):
# foreground, taxid = STRING_examples
# background = query.get_proteins_of_taxid(taxid)
# fg = "%0d".join(foreground)
# bg = "%0d".join(background)
# response = requests.post("www.bubu_was_here.com", params={"output_format": "json",
# "foreground": fg,
# "background": bg,
# "enrichment_method": "compare_samples",
# "species": "INVALID_TAXID"})
# assert type(response.json()) == dict
| 55.162698 | 174 | 0.680527 | 1,655 | 13,901 | 5.283988 | 0.118429 | 0.030875 | 0.033047 | 0.027444 | 0.814751 | 0.783533 | 0.755975 | 0.729903 | 0.724414 | 0.701315 | 0 | 0.008823 | 0.23358 | 13,901 | 251 | 175 | 55.38247 | 0.811995 | 0.381483 | 0 | 0.610687 | 0 | 0 | 0.059076 | 0 | 0 | 0 | 0 | 0 | 0.129771 | 1 | 0.053435 | false | 0 | 0.038168 | 0.007634 | 0.099237 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
68ae1670ea9c9343f3f5fd6a56a121bbcd844794 | 43 | py | Python | conf/gunicorn.conf.py | rodrigoalveslima/buzzblog | 2b301ee363fbbebbe0ee31b1bf9538811d97b293 | [
"Apache-2.0"
] | null | null | null | conf/gunicorn.conf.py | rodrigoalveslima/buzzblog | 2b301ee363fbbebbe0ee31b1bf9538811d97b293 | [
"Apache-2.0"
] | null | null | null | conf/gunicorn.conf.py | rodrigoalveslima/buzzblog | 2b301ee363fbbebbe0ee31b1bf9538811d97b293 | [
"Apache-2.0"
] | null | null | null | bind = "0.0.0.0:81"
workers = 1
threads = 4 | 14.333333 | 19 | 0.604651 | 10 | 43 | 2.6 | 0.7 | 0.230769 | 0.230769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.228571 | 0.186047 | 43 | 3 | 20 | 14.333333 | 0.514286 | 0 | 0 | 0 | 0 | 0 | 0.227273 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d7aa153d1c850a71f6578e1a009d842b0a81509e | 5,663 | py | Python | src/msob_and_fp/compare_avoid_dep.py | paulnikolaus/snc-mgf-toolbox | bda07fedd92657628dd34e00911b344ee3535dad | [
"MIT"
] | 3 | 2020-06-01T14:46:15.000Z | 2022-03-24T15:02:03.000Z | src/msob_and_fp/compare_avoid_dep.py | paulnikolaus/snc-mgf-toolbox | bda07fedd92657628dd34e00911b344ee3535dad | [
"MIT"
] | null | null | null | src/msob_and_fp/compare_avoid_dep.py | paulnikolaus/snc-mgf-toolbox | bda07fedd92657628dd34e00911b344ee3535dad | [
"MIT"
] | null | null | null | """Compare standard standard_bound with negative dependence."""
from timeit import default_timer as timer
from typing import Tuple
from nc_operations.perform_enum import PerformEnum
from optimization.optimize import Optimize
from msob_and_fp.optimize_fp_bound import OptimizeFPBound
from msob_and_fp.optimize_server_bound import OptimizeServerBound
from msob_and_fp.setting_avoid_dep import SettingMSOBFP
def compare_avoid_dep_211(setting: SettingMSOBFP,
print_x=False) -> Tuple[float, float, float]:
"""Compare standard_bound with the new Lyapunov standard_bound."""
delta_val = 0.05
one_param_bounds = [(0.1, 10.0)]
two_param_bounds = [(0.1, 10.0), (1.1, 10.0)]
standard_bound = Optimize(setting=setting, number_param=2,
print_x=print_x).grid_search(
grid_bounds=two_param_bounds, delta=delta_val)
server_bound = OptimizeServerBound(setting_msob_fp=setting,
number_param=1,
print_x=print_x).grid_search(
grid_bounds=one_param_bounds,
delta=delta_val)
fp_bound = OptimizeFPBound(setting_msob_fp=setting,
number_param=1,
print_x=print_x).grid_search(
grid_bounds=one_param_bounds,
delta=delta_val)
return standard_bound, server_bound, fp_bound
def compare_avoid_dep_212(setting: SettingMSOBFP,
print_x=False) -> Tuple[float, float, float]:
"""Compare standard_bound with the new Lyapunov standard_bound."""
delta_val = 0.05
one_param_bounds = [(0.1, 10.0)]
two_param_bounds = [(0.1, 10.0), (1.1, 10.0)]
standard_bound = Optimize(setting=setting, number_param=2,
print_x=print_x).grid_search(
grid_bounds=two_param_bounds, delta=delta_val)
server_bound = OptimizeServerBound(setting_msob_fp=setting,
number_param=1,
print_x=print_x).grid_search(
grid_bounds=one_param_bounds,
delta=delta_val)
fp_bound = OptimizeFPBound(setting_msob_fp=setting,
number_param=2,
print_x=print_x).grid_search(
grid_bounds=two_param_bounds,
delta=delta_val)
return standard_bound, server_bound, fp_bound
def compare_time_211(setting: SettingMSOBFP) -> Tuple[float, float, float]:
"""Compare standard_bound with the new Lyapunov standard_bound."""
delta_val = 0.05
one_param_bounds = [(0.1, 10.0)]
two_param_bounds = [(0.1, 10.0), (1.1, 10.0)]
start = timer()
Optimize(setting=setting,
number_param=2).grid_search(grid_bounds=two_param_bounds,
delta=delta_val)
stop = timer()
time_standard = stop - start
start = timer()
OptimizeServerBound(setting_msob_fp=setting, number_param=1).grid_search(
grid_bounds=one_param_bounds, delta=delta_val)
stop = timer()
time_server_bound = stop - start
start = timer()
OptimizeFPBound(setting_msob_fp=setting,
number_param=1).grid_search(grid_bounds=one_param_bounds,
delta=delta_val)
stop = timer()
time_fp_bound = stop - start
return time_standard, time_server_bound, time_fp_bound
def compare_time_212(setting: SettingMSOBFP) -> Tuple[float, float, float]:
"""Compare standard_bound with the new Lyapunov standard_bound."""
delta_val = 0.05
one_param_bounds = [(0.1, 10.0)]
two_param_bounds = [(0.1, 10.0), (1.1, 10.0)]
start = timer()
Optimize(setting=setting,
number_param=2).grid_search(grid_bounds=two_param_bounds,
delta=delta_val)
stop = timer()
time_standard = stop - start
start = timer()
OptimizeServerBound(setting_msob_fp=setting, number_param=1).grid_search(
grid_bounds=one_param_bounds, delta=delta_val)
stop = timer()
time_server_bound = stop - start
start = timer()
OptimizeFPBound(setting_msob_fp=setting,
number_param=2).grid_search(grid_bounds=two_param_bounds,
delta=delta_val)
stop = timer()
time_fp_bound = stop - start
return time_standard, time_server_bound, time_fp_bound


if __name__ == '__main__':
    from nc_arrivals.iid import DM1
    from nc_operations.perform_enum import PerformEnum  # needed for PerformEnum.DELAY_PROB below
    from nc_server.constant_rate_server import ConstantRateServer
    from utils.perform_parameter import PerformParameter
    from msob_and_fp.overlapping_tandem_perform import OverlappingTandemPerform

    DELAY_PROB = PerformParameter(perform_metric=PerformEnum.DELAY_PROB,
value=10)
ARR_LIST = [DM1(lamb=7.0), DM1(lamb=7.0), DM1(lamb=6.0)]
SER_LIST = [
ConstantRateServer(rate=0.5),
ConstantRateServer(rate=8.0),
ConstantRateServer(rate=5.5)
]
SETTING = OverlappingTandemPerform(arr_list=ARR_LIST,
ser_list=SER_LIST,
perform_param=DELAY_PROB)
# print(compare_avoid_dep_211(setting=SETTING, print_x=False))
print(compare_time_211(setting=SETTING))


# === File: Templates/FuncApp-Http-Py-OptionB/ToolsB/__init__.py | repo: mmaysami/azure-functions-python | license: MIT ===
from .toolsB_F1 import *
from .toolsB_F2 import *


# === File: src/euler_python_package/euler_python/medium/p242.py | repo: wilsonify/euler | license: MIT ===
def problem242():
pass


# === File: quickforex/logger.py | repo: jean-edouard-boulanger/python-quickforex | license: MIT ===
import logging

logger = logging.getLogger("quickforex")


def get_module_logger(name) -> logging.Logger:
return logger.getChild(name)


# === File: mmdet/version.py | repo: Ixuanzhang/mmdet | license: Apache-2.0 ===
# GENERATED VERSION FILE
# TIME: Sun Mar 15 12:04:43 2020
__version__ = '1.1.0+unknown'
short_version = '1.1.0'


# === File: thglibs/dos/__init__.py | repo: darkcode357/thg_lib | license: MIT ===
from .brainiac_dos import DOS


# === File: API/feowl-api/feowl/tests.py | repo: jplusplus/feowl | license: Apache-2.0 ===
from django.test.client import Client
from models import PowerReport, Area, Contributor, Device
from django.contrib.auth.models import User, Permission
from tastypie_test import ResourceTestCase
from django.db import models
from tastypie.models import create_api_key
import json
from django.conf import settings

models.signals.post_save.connect(create_api_key, sender=User)


class PowerReportResourceTest(ResourceTestCase):
fixtures = ['test_data.json']

    def setUp(self):
super(PowerReportResourceTest, self).setUp()
# Create a user.
self.username = u'john'
self.password = u'doe'
self.user = User.objects.create_user(self.username, 'john@example.com', self.password)
self.api_key = self.user.api_key.key
self.c = Client()
self.post_data = {
"area": "/api/v1/areas/1/",
"happened_at": "2012-06-14 12:37:50",
"has_experienced_outage": True,
"duration": 60
}
# Fetch the ``Entry`` object we'll use in testing.
# Note that we aren't using PKs because they can change depending
# on what other tests are running.
self.power_report_1 = PowerReport.objects.get(duration=121)
# We also build a detail URI, since we will be using it all over.
# DRY, baby. DRY.
self.detail_url = '/api/v1/reports/{0}/'.format(self.power_report_1.pk)

    def get_credentials(self):
return {"username": self.username, "api_key": self.api_key}

    def test_get_list_unauthorized(self):
        """Get reports from the API without authentication."""
self.assertHttpUnauthorized(self.c.get('/api/v1/reports/'))

    def test_get_list_json(self):
        """Get reports from the API with authentication and check that all keys are present."""
resp = self.c.get('/api/v1/reports/', self.get_credentials())
self.assertValidJSONResponse(resp)
# Scope out the data for correctness.
self.assertEqual(len(self.deserialize(resp)['objects']), 5)
# Here we're checking an entire structure for the expected data.
self.assertEqual(self.deserialize(resp)['objects'][0], {
'area': '/api/v1/areas/1/',
'happened_at': '2012-06-13T12:37:50+00:00',
'has_experienced_outage': True,
'location': None,
'duration': 240,
'quality': '1.00',
'resource_uri': '/api/v1/reports/2/',
'contributor': None,
'device': None
})

    def test_header_auth(self):
        """Authenticate via the Authorization header instead of query parameters."""
resp = self.c.get(self.detail_url, **{'HTTP_AUTHORIZATION': 'ApiKey ' + self.username + ':' + self.api_key})
self.assertValidJSONResponse(resp)
# We use ``assertKeys`` here to just verify the keys, not all the data.
self.assertKeys(self.deserialize(resp), ['area', 'happened_at', 'has_experienced_outage', 'contributor', 'device', 'location', 'duration', 'quality', 'resource_uri'])
self.assertEqual(self.deserialize(resp)['duration'], 121)

    def test_get_detail_unauthenticated(self):
        """Try to get a single report from the API without authentication."""
self.assertHttpUnauthorized(self.c.get(self.detail_url))

    def test_get_detail_json(self):
        """Get a single report from the API with authentication and check that all keys are present."""
resp = self.c.get(self.detail_url, self.get_credentials())
self.assertValidJSONResponse(resp)
# We use ``assertKeys`` here to just verify the keys, not all the data.
self.assertKeys(self.deserialize(resp), ['area', 'happened_at', 'has_experienced_outage', 'contributor', 'device', 'location', 'duration', 'quality', 'resource_uri'])
self.assertEqual(self.deserialize(resp)['duration'], 121)

    def test_get_detail_csv(self):
        """Get a single report in CSV format with authentication and check the serialized content."""
content = 'area,contributor,device,happened_at,has_experienced_outage,location,duration,quality,resource_uri\r\n/api/v1/areas/1/,None,None,2012-06-13T09:56:25+00:00,True,POINT (10.0195312486050003 3.6011423196581309),121,1.00,/api/v1/reports/1/\r\n'
resp = self.c.get('/api/v1/reports/1/?username=' + self.username + '&api_key=' + self.api_key + '&format=csv')
self.assertEquals(resp.content, content)

    def test_get_list_csv(self):
        """Get the report list in CSV format with authentication and check the serialized content."""
content = 'area,contributor,device,happened_at,has_experienced_outage,location,duration,quality,resource_uri\r\n/api/v1/areas/1/,None,None,2012-06-13T12:37:50+00:00,True,None,240,1.00,/api/v1/reports/2/\r\n/api/v1/areas/5/,None,None,2012-06-13T12:38:06+00:00,True,None,32,1.00,/api/v1/reports/3/\r\n/api/v1/areas/2/,None,None,2012-06-13T12:38:18+00:00,True,None,1231,1.00,/api/v1/reports/4/\r\n/api/v1/areas/2/,None,None,2012-06-13T12:38:27+00:00,True,None,564,1.00,/api/v1/reports/5/\r\n/api/v1/areas/1/,None,None,2012-06-13T09:56:25+00:00,True,POINT (10.0195312486050003 3.6011423196581309),121,1.00,/api/v1/reports/1/\r\n'
resp = self.c.get('/api/v1/reports/?username=' + self.username + '&api_key=' + self.api_key + '&format=csv')
self.assertEquals(resp.content, content)

    def test_post_list_unauthenticated(self):
        """Try to post a single report to the API without authentication."""
self.assertHttpUnauthorized(self.c.post('/api/v1/reports/', data=self.post_data))

    def test_post_list_without_permissions(self):
        """Post a single report with authentication but without the add permission."""
add_powerreport = Permission.objects.get(codename="add_powerreport")
self.user.user_permissions.remove(add_powerreport)
self.assertEqual(PowerReport.objects.count(), 5)
self.assertHttpUnauthorized(self.c.post('/api/v1/reports/?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.post_data), content_type="application/json"))
# Verify that nothing was added to the db
self.assertEqual(PowerReport.objects.count(), 5)

    def test_post_list_with_permissions(self):
        """Post a single report with authentication and the add permission."""
add_powerreport = Permission.objects.get(codename="add_powerreport")
self.user.user_permissions.add(add_powerreport)
# Check how many there are first.
self.assertEqual(PowerReport.objects.count(), 5)
self.assertHttpCreated(self.c.post('/api/v1/reports/?username=%s&api_key=%s' % (self.username, self.api_key), data=json.dumps(self.post_data), content_type="application/json"))
# Verify a new one has been added.
self.assertEqual(PowerReport.objects.count(), 6)

    def test_put_detail_unauthenticated(self):
        """PUT on a single report is not allowed without authentication."""
self.assertHttpMethodNotAllowed(self.c.put(self.detail_url))

    def test_put_detail(self):
        """PUT on a single report is not allowed even with authentication."""
self.assertHttpMethodNotAllowed(self.c.put(self.detail_url, self.get_credentials()))

    def test_delete_detail_unauthenticated(self):
        """DELETE on a single report is not allowed without authentication."""
self.assertHttpMethodNotAllowed(self.c.delete(self.detail_url))

    def test_delete_detail(self):
        """DELETE on a single report is not allowed even with authentication."""
self.assertHttpMethodNotAllowed(self.c.delete(self.detail_url, self.get_credentials()))


class AreaResourceTest(ResourceTestCase):
    def setUp(self):
super(AreaResourceTest, self).setUp()
# Create a user.
self.username = u'john'
self.password = u'doe'
self.user = User.objects.create_user(self.username, 'john@example.com', self.password)
self.api_key = self.user.api_key.key
self.c = Client()
self.post_data = {
'city': 'Douala',
'country': 'Cameroon',
'name': 'Douala VI',
'pop_per_sq_km': '12323.00',
'overall_population': 200000
}
# Fetch the ``Entry`` object we'll use in testing.
# Note that we aren't using PKs because they can change depending
# on what other tests are running.
self.area_1 = Area.objects.get(name="Douala I")
# We also build a detail URI, since we will be using it all over.
# DRY, baby. DRY.
self.detail_url = '/api/v1/areas/{0}/'.format(self.area_1.pk)

    def get_credentials(self):
return {"username": self.username, "api_key": self.api_key}

    def test_get_list_unauthorized(self):
        """Get areas from the API without authentication."""
self.assertHttpUnauthorized(self.c.get('/api/v1/areas/'))

    def test_get_list_json(self):
        """Get areas from the API with authentication and check that all keys are present."""
resp = self.c.get('/api/v1/areas/', self.get_credentials())
self.assertValidJSONResponse(resp)
# Scope out the data for correctness.
self.assertEqual(len(self.deserialize(resp)['objects']), 5)
# Here, we're checking an entire structure for the expected data.
self.assertEqual(self.deserialize(resp)['objects'][0], {
'city': 'Douala',
'country': 'Cameroon',
'name': 'Douala I',
'pop_per_sq_km': '0.00',
'overall_population': 223214,
'resource_uri': '/api/v1/areas/1/',
'id': '1'
})

    def test_get_detail_unauthenticated(self):
        """Try to get a single area from the API without authentication."""
self.assertHttpUnauthorized(self.c.get(self.detail_url))

    def test_get_detail_json(self):
        """Get a single area from the API with authentication and check that all keys are present."""
resp = self.c.get(self.detail_url, self.get_credentials())
self.assertValidJSONResponse(resp)
# We use ``assertKeys`` here to just verify the keys, not all the data.
self.assertKeys(self.deserialize(resp), ['id', 'city', 'country', 'name', 'pop_per_sq_km', 'overall_population', 'resource_uri'])
self.assertEqual(self.deserialize(resp)['name'], "Douala I")

    def test_post_list_unauthenticated(self):
        """Try to post a single area to the API without authentication."""
self.assertHttpMethodNotAllowed(self.c.post('/api/v1/areas/', data=self.post_data))

    def test_post_list(self):
        """POST on areas is not allowed even with authentication."""
self.assertHttpMethodNotAllowed(self.c.post('/api/v1/areas/?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.post_data), content_type="application/json"))

    def test_put_detail_unauthenticated(self):
        """PUT on a single area is not allowed without authentication."""
self.assertHttpMethodNotAllowed(self.c.put(self.detail_url))

    def test_put_detail(self):
        """PUT on a single area is not allowed even with authentication."""
self.assertHttpMethodNotAllowed(self.c.put(self.detail_url, self.get_credentials()))

    def test_delete_detail_unauthenticated(self):
        """DELETE on a single area is not allowed without authentication."""
self.assertHttpMethodNotAllowed(self.c.delete(self.detail_url))

    def test_delete_detail(self):
        """DELETE on a single area is not allowed even with authentication."""
self.assertHttpMethodNotAllowed(self.c.delete(self.detail_url, self.get_credentials()))


class ContributorResourceTest(ResourceTestCase):
    def setUp(self):
super(ContributorResourceTest, self).setUp()
# Create a user.
self.username = u'john'
self.password = u'doe'
self.email = u'john@example.com'
self.user = User.objects.create_user(self.username, self.email, self.password)
self.api_key = self.user.api_key.key
self.c = Client()
self.post_data = {
'name': 'james',
'email': 'james@example.com',
'password': self.user.__dict__["password"],
'language': 'DE'
}
self.put_data = {
'email': 'jonny@example.com',
'language': 'DE'
}
# Fetch the ``Entry`` object we'll use in testing.
# Note that we aren't using PKs because they can change depending
# on what other tests are running.
Contributor(name="Tobias", email="tobias@test.de").save()
self.contributor_1 = Contributor.objects.get(pk=1)
# We also build a detail URI, since we will be using it all over.
# DRY, baby. DRY.
self.list_url = '/api/v1/contributors/'
self.detail_url = '{0}{1}/'.format(self.list_url, self.contributor_1.pk)

    def get_credentials(self):
return {"username": self.username, "api_key": self.api_key}

    def test_get_list_unauthorized(self):
        """Get contributors from the API without authentication."""
self.assertHttpUnauthorized(self.c.get(self.list_url))

    def test_get_list_json(self):
        """Get contributors from the API with authentication and check that all keys are present."""
resp = self.c.get(self.list_url, self.get_credentials())
self.assertValidJSONResponse(resp)
# Scope out the data for correctness.
self.assertEqual(len(self.deserialize(resp)['objects']), 1)
# Here, we're checking an entire structure for the expected data.
self.assertEqual(self.deserialize(resp)['objects'][0], {
'id': '1',
'name': 'Tobias',
'email': 'tobias@test.de',
'password': settings.DUMMY_PASSWORD,
'resource_uri': self.detail_url,
'language': 'EN' # EN is the default value
})

    def test_get_detail_unauthenticated(self):
        """Try to get a single contributor from the API without authentication."""
self.assertHttpUnauthorized(self.c.get(self.detail_url))

    def test_get_detail_json(self):
        """Get a single contributor from the API with authentication and check that all keys are present."""
resp = self.c.get(self.detail_url, self.get_credentials())
self.assertValidJSONResponse(resp)
# We use ``assertKeys`` here to just verify the keys, not all the data.
self.assertKeys(self.deserialize(resp), ['id', 'name', 'email', 'password', 'resource_uri', 'language'])
self.assertEqual(self.deserialize(resp)['name'], "Tobias")

    def test_post_list_unauthenticated(self):
        """Try to post a single contributor to the API without authentication."""
self.assertHttpUnauthorized(self.c.post(self.list_url, data=self.post_data))

    def test_post_list_without_permissions(self):
        """Try to post a single contributor with authentication but without the add permission."""
self.assertHttpUnauthorized(self.c.post(self.list_url + '?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.post_data), content_type="application/json"))

    def test_post_list_with_permissions(self):
        """Post a single contributor with authentication and the add permission."""
add_contributor = Permission.objects.get(codename="add_contributor")
self.user.user_permissions.add(add_contributor)
self.assertEqual(Contributor.objects.count(), 1)
self.assertHttpCreated(self.c.post(self.list_url + '?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.post_data), content_type="application/json"))
self.assertEqual(Contributor.objects.count(), 2)

    def test_put_detail_unauthenticated(self):
        """PUT on a single contributor is not allowed without authentication."""
self.assertHttpUnauthorized(self.c.put(self.detail_url))

    def test_put_detail_without_permission(self):
        """PUT on a single contributor is not allowed without the change permission."""
self.assertHttpUnauthorized(self.c.put(self.detail_url, self.get_credentials()))

    def test_put_detail_with_permission(self):
        """Put a single contributor with authentication and the change permission."""
change_contributor = Permission.objects.get(codename="change_contributor")
self.user.user_permissions.add(change_contributor)
self.assertEqual(Contributor.objects.count(), 1)
self.assertHttpAccepted(self.c.put(self.detail_url + '?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.put_data), content_type="application/json"))
self.assertEqual(Contributor.objects.count(), 1)
self.assertEqual(Contributor.objects.get(pk=self.contributor_1.pk).email, self.put_data.get("email"))

    def test_delete_detail_unauthenticated(self):
        """DELETE on a single contributor is not allowed without authentication."""
self.assertHttpUnauthorized(self.c.delete(self.detail_url))

    def test_delete_detail_without_permission(self):
        """DELETE on a single contributor is not allowed without the delete permission."""
self.assertEqual(Contributor.objects.count(), 1)
self.assertHttpUnauthorized(self.c.delete(self.detail_url, self.get_credentials()))
self.assertEqual(Contributor.objects.count(), 1)

    def test_delete_detail_with_permission(self):
        """Delete a single contributor with authentication and the delete permission."""
delete_contributor = Permission.objects.get(codename="delete_contributor")
self.user.user_permissions.add(delete_contributor)
self.assertEqual(Contributor.objects.count(), 1)
self.assertHttpAccepted(self.c.delete(self.detail_url, self.get_credentials()))
self.assertEqual(Contributor.objects.count(), 0)


class DeviceResourceTest(ResourceTestCase):
    def setUp(self):
super(DeviceResourceTest, self).setUp()
# Create a user.
self.username = u'john'
self.password = u'doe'
self.email = u'john@example.com'
self.user = User.objects.create_user(self.username, self.email, self.password)
self.api_key = self.user.api_key.key
self.c = Client()
self.post_data = {
'category': 'SubDevice',
'phone_number': '12381294323',
}
self.put_data = {
'phone_number': "4267486238"
}
Contributor(name="Tobias", email="tobias@test.de").save()
self.contributor = Contributor.objects.get(pk=1)
Device(phone_number="01234567890", category="MainDevice", contributor=self.contributor).save()
# Fetch the ``Entry`` object we'll use in testing.
# Note that we aren't using PKs because they can change depending
# on what other tests are running.
self.device_1 = Device.objects.get(pk=1)
# We also build a detail URI, since we will be using it all over.
# DRY, baby. DRY.
self.list_url = u'/api/v1/devices/'
self.detail_url = u'{0}{1}/'.format(self.list_url, self.device_1.pk)
self.user_url = u'/api/v1/contributors/{0}/'.format(self.contributor.id)

    def get_credentials(self):
return {"username": self.username, "api_key": self.api_key}

    def test_get_list_unauthorized(self):
        """Get devices from the API without authentication."""
self.assertHttpUnauthorized(self.c.get(self.list_url))

    def test_get_list_json(self):
        """Get devices from the API with authentication and check that all keys are present."""
resp = self.c.get(self.list_url, self.get_credentials())
self.assertValidJSONResponse(resp)
# Scope out the data for correctness.
self.assertEqual(len(self.deserialize(resp)['objects']), 1)
# Here, we're checking an entire structure for the expected data.
self.assertEqual(self.deserialize(resp)['objects'][0], {
u"category": u"MainDevice",
u"phone_number": u"01234567890",
u"resource_uri": self.detail_url,
u"contributor": self.user_url
})

    def test_get_detail_unauthenticated(self):
        """Try to get a single device from the API without authentication."""
self.assertHttpUnauthorized(self.c.get(self.detail_url))

    def test_get_detail_json(self):
        """Get a single device from the API with authentication and check that all keys are present."""
resp = self.c.get(self.detail_url, self.get_credentials())
self.assertValidJSONResponse(resp)
# We use ``assertKeys`` here to just verify the keys, not all the data.
self.assertKeys(self.deserialize(resp), ['category', 'phone_number', 'resource_uri', 'contributor'])
self.assertEqual(self.deserialize(resp)['category'], "MainDevice")

    def test_post_list_unauthenticated(self):
        """Try to post a single device to the API without authentication."""
self.assertHttpUnauthorized(self.c.post(self.list_url, data=self.post_data))

    def test_post_list_without_permissions(self):
        """Try to post a single device with authentication but without the add permission."""
self.assertHttpUnauthorized(self.c.post(self.list_url + '?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.post_data), content_type="application/json"))

    def test_post_list_with_permissions(self):
        """Post a single device with authentication and the add permission."""
add_device = Permission.objects.get(codename="add_device")
self.user.user_permissions.add(add_device)
self.assertEqual(Device.objects.count(), 1)
self.assertHttpCreated(self.c.post(self.list_url + '?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.post_data), content_type="application/json"))
self.assertEqual(Device.objects.count(), 2)

    def test_put_detail_unauthenticated(self):
        """PUT on a single device is not allowed without authentication."""
self.assertHttpUnauthorized(self.c.put(self.detail_url))

    def test_put_detail_without_permission(self):
        """PUT on a single device is not allowed without the change permission."""
self.assertHttpUnauthorized(self.c.put(self.detail_url, self.get_credentials()))

    def test_put_detail_with_permission(self):
        """Put a single device with authentication and the change permission."""
change_device = Permission.objects.get(codename="change_device")
self.user.user_permissions.add(change_device)
self.assertEqual(Device.objects.count(), 1)
self.assertHttpAccepted(self.c.put(self.detail_url + '?username=' + self.username + '&api_key=' + self.api_key, data=json.dumps(self.put_data), content_type="application/json"))
self.assertEqual(Device.objects.count(), 1)
self.assertEqual(Device.objects.get(pk=self.device_1.pk).phone_number, self.put_data.get("phone_number"))

    def test_delete_detail_unauthenticated(self):
        """DELETE on a single device is not allowed without authentication."""
self.assertHttpUnauthorized(self.c.delete(self.detail_url))

    def test_delete_detail_without_permission(self):
        """DELETE on a single device is not allowed without the delete permission."""
self.assertHttpUnauthorized(self.c.delete(self.detail_url, self.get_credentials()))

    def test_delete_detail_with_permission(self):
        """Delete a single device with authentication and the delete permission."""
delete_device = Permission.objects.get(codename="delete_device")
self.user.user_permissions.add(delete_device)
self.assertEqual(Device.objects.count(), 1)
self.assertHttpAccepted(self.c.delete(self.detail_url, self.get_credentials()))
self.assertEqual(Device.objects.count(), 0)


# === File: testpy.py | repo: hasanhamidi/Pointnet2_PyTorch | license: Unlicense ===
print("_".join("Area_4_bb".split("_")[0:2]))


# === File: Code/.ipynb_checkpoints/CCS3_Cleanup_Table-checkpoint.py | repo: arpanganguli/Snapshot_of_Indian_Economy | license: MIT ===
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 20 18:50:34 2021
@author: arpanganguli
"""

# import packages
import pandas as pd
from sqlalchemy import create_engine
import os

# import files
HOME = os.path.dirname(os.getcwd())
URL = 'sqlite:///' + os.path.join(HOME, 'Database/CCS.db')
engine = create_engine(URL, echo=True)  # sqlite:////absolute/path/to/file.db
# ALITER: engine = create_engine('sqlite:////Users/arpanganguli/Documents/Projects/SINE/Database/CCS.db', echo = True) # sqlite:////absolute/path/to/file.db

# initial query
query_string = "SELECT * FROM CCS"
df = pd.read_sql(sql=query_string, con=engine)
print(df.head())

df_working = df.copy(deep=True)  # deep copy to preserve the original dataframe
df_working = df_working.dropna(how="all")  # drop rows where every value is null/NA (dropna is not in-place)
df_working.drop('Assembly Constituency Name', inplace=True, axis=1)
# convert the survey columns to categorical dtype
categorical_cols = ["Age",
                    "Gender",
                    "Annual Income",
                    "Educational Qualification",
                    "No. of Family Members",
                    "Occupation of Respondent",
                    "Perception on General Economic condition - compared to one year ago",
                    "Outlook on General Economic condition - one year ahead",
                    "Perception on Household income - compared to one year ago",
                    "Outlook on Household income - one year ahead",
                    "Perception on Household spending - compared to one year ago",
                    "Outlook on Household spending - one year ahead",
                    "Perception on essential spending - compared to one year ago",
                    "Outlook on essential spending - one year ahead",
                    "Perception on non-essential spending - compared to one year ago",
                    "Outlook on non-essential spending - one year ahead",
                    "Perception on Employment scenario - compared to one year ago",
                    "Outlook on Employment scenario - one year ahead",
                    "Perception on General prices - compared to one year ago",
                    "Outlook on General prices - one year ahead",
                    "Perception on Inflation - compared to one year ago",
                    "Outlook on Inflation - one year ahead"]
df_working[categorical_cols] = df_working[categorical_cols].astype('category')
# encode the demographic columns as integer category codes
demographic_cols = ["Age",
                    "Gender",
                    "Annual Income",
                    "Educational Qualification",
                    "No. of Family Members"]
df_working[demographic_cols] = df_working[demographic_cols].apply(lambda x: x.cat.codes)
print(df_working.head())
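The `astype('category')` / `cat.codes` conversion used above can be sketched on a toy frame (the column and values here are illustrative, not from the CCS survey):

```python
import pandas as pd

# toy frame standing in for one survey column
df = pd.DataFrame({"Gender": ["Male", "Female", "Female", "Male"]})

# convert to categorical dtype, then read back the integer codes
df["Gender"] = df["Gender"].astype("category")
codes = df["Gender"].cat.codes

# codes follow the sorted category order: Female -> 0, Male -> 1
print(codes.tolist())  # [1, 0, 0, 1]
```

The codes are stable for a given set of categories, which is what makes this a quick label-encoding step before modelling.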
| 49.481928 | 156 | 0.58242 | 458 | 4,107 | 5.19214 | 0.255459 | 0.094197 | 0.087468 | 0.114382 | 0.746846 | 0.746846 | 0.746846 | 0.746846 | 0.718251 | 0.718251 | 0 | 0.005435 | 0.327977 | 4,107 | 82 | 157 | 50.085366 | 0.856159 | 0.097882 | 0 | 0.738462 | 0 | 0 | 0.561551 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.046154 | 0 | 0.046154 | 0.030769 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f0f22c1b90c9f47fa013c70b2990e4d84631a8ff | 328 | py | Python | target_extraction/__init__.py | apmoore1/target-extraction | 4139ecdc432411fcc4ed2723f4165e7dae93544d | [
"Apache-2.0"
] | 5 | 2019-07-27T13:57:47.000Z | 2021-06-16T13:17:44.000Z | target_extraction/__init__.py | apmoore1/target-extraction | 4139ecdc432411fcc4ed2723f4165e7dae93544d | [
"Apache-2.0"
] | 26 | 2019-05-01T11:56:35.000Z | 2020-06-18T16:06:40.000Z | target_extraction/__init__.py | apmoore1/target-extraction | 4139ecdc432411fcc4ed2723f4165e7dae93544d | [
"Apache-2.0"
] | 1 | 2019-07-11T07:16:09.000Z | 2019-07-11T07:16:09.000Z | from target_extraction import data_types, data_types_util
from target_extraction import tokenizers, pos_taggers, taggers_helper
from target_extraction import dataset_parsers
from target_extraction import allen
from target_extraction import analysis
# Required to import LanguageModelTokenEmbedder
from allennlp_models import lm | 41 | 69 | 0.890244 | 43 | 328 | 6.511628 | 0.488372 | 0.178571 | 0.357143 | 0.464286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 328 | 8 | 70 | 41 | 0.945946 | 0.137195 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f0f2f83256755222a218b7a124ff42997f08582f | 5,041 | py | Python | QUANTAXIS_Test/QAFetch_Test/QAQuery_Option_Test.py | Sinovel/QUANTAXIS | 97f1ea2140f58c92ff5c84b851886d9eda1f9ac3 | [
"MIT"
] | 14 | 2021-01-11T02:19:26.000Z | 2022-02-02T04:16:08.000Z | QUANTAXIS_Test/QAFetch_Test/QAQuery_Option_Test.py | Sinovel/QUANTAXIS | 97f1ea2140f58c92ff5c84b851886d9eda1f9ac3 | [
"MIT"
] | 2 | 2019-12-25T09:24:47.000Z | 2020-01-13T07:33:50.000Z | QUANTAXIS_Test/QAFetch_Test/QAQuery_Option_Test.py | Sinovel/QUANTAXIS | 97f1ea2140f58c92ff5c84b851886d9eda1f9ac3 | [
"MIT"
] | 8 | 2021-01-16T09:43:30.000Z | 2022-02-02T04:16:12.000Z | import unittest
import pprint
from QUANTAXIS import QUANTAXIS as QA
from QUANTAXIS.QAUtil.QADate import *
from QUANTAXIS.QAUtil.QADate_trade import *
from QUANTAXIS.QAFetch.QATdx import (QA_fetch_get_option_contract_time_to_market)
from QUANTAXIS.QAFetch.QATdx import (QA_fetch_get_commodity_option_M_contract_time_to_market)
from QUANTAXIS.QAFetch.QATdx import (QA_fetch_get_commodity_option_SR_contract_time_to_market)
from QUANTAXIS.QAFetch.QATdx import (QA_fetch_get_commodity_option_CU_contract_time_to_market)
from QUANTAXIS.QAFetch.QATdx import (QA_fetch_get_commodity_option_C_contract_time_to_market)
from QUANTAXIS.QAFetch.QATdx import (QA_fetch_get_commodity_option_CF_contract_time_to_market)
from QUANTAXIS.QAFetch.QATdx import (QA_fetch_get_commodity_option_RU_contract_time_to_market)
from QUANTAXIS.QASU.main import *
import numpy as np
import pandas as pd
class TestOptionData(unittest.TestCase):
def testQA_SU_save_stock_day(self):
QA_SU_save_option_50etf_day('tdx')
def testGetOptionMin(self):
#
# 去掉商品期权,保留510050开头的50ETF期权
rows = QA_fetch_get_option_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
# print(aRow)
            print(aRow.code)
            result2 = QA.QA_fetch_get_option_min(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
#
def testGetOptionList(self):
#
# 去掉商品期权,保留510050开头的50ETF期权
rows = QA_fetch_get_option_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
#print(aRow)
            print(aRow.code)
            result2 = QA.QA_fetch_get_option_day(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
#
#pprint.pprint(result)
#strToday = QA_util_today_str();
#result2 = QA.QA_fetch_get_option_day(package='tdx', code='10001214', start=strToday, end=strToday)
#pprint.pprint(result2)
def testGetCommodityCuOptionList(self):
rows = QA_fetch_get_commodity_option_CU_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
#print(aRow)
print("准备获取 ;-》")
print(aRow)
print("-------------")
result2 = QA.QA_fetch_get_option_day(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
print("结束获取 ;-》")
def testGetCommodityRUOptionList(self):
rows = QA_fetch_get_commodity_option_RU_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
# print(aRow)
print("准备获取 ;-》")
print(aRow)
print("-------------")
result2 = QA.QA_fetch_get_option_day(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
print("结束获取 ;-》")
def testGetCommodityCOptionList(self):
rows = QA_fetch_get_commodity_option_C_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
# print(aRow)
print("准备获取 ;-》")
print(aRow)
print("-------------")
result2 = QA.QA_fetch_get_option_day(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
time.sleep(1)
print("结束获取 ;-》")
def testGetCommodityCFOptionList(self):
rows = QA_fetch_get_commodity_option_CF_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
# print(aRow)
print("准备获取 ;-》")
print(aRow)
print("-------------")
result2 = QA.QA_fetch_get_option_day(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
print("结束获取 ;-》")
def testGetCommoditySROptionList(self):
rows = QA_fetch_get_commodity_option_SR_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
#print(aRow)
print("准备获取 ;-》")
print(aRow)
print("-------------")
result2 = QA.QA_fetch_get_option_day(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
print("结束获取 ;-》")
def testGetCommodityMOptionList(self):
rows = QA_fetch_get_commodity_option_M_contract_time_to_market()
strToday = QA_util_today_str()
for aRow in rows:
#print(aRow)
print("准备获取 ;-》")
print(aRow)
print("-------------")
result2 = QA.QA_fetch_get_option_day(package='tdx', code=aRow.code, start='2018-01-01', end=strToday)
            pprint.pprint(result2)
print("结束获取 ;-》")
def setUp(self):
pass
def tearDown(self):
pass
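Every commodity test above repeats the same fetch-and-print loop over contract rows; one way the shared body could be factored out is sketched below (the helper name is mine, not part of QUANTAXIS — the real fetchers are passed in as a callable):

```python
import pprint
import time


def fetch_option_days(rows, fetch_day, today, pause=0):
    """Fetch daily option data for every contract in rows and return the results.

    fetch_day is any fetcher with the (package, code, start, end) keyword
    signature used throughout the tests above.
    """
    results = []
    for row in rows:
        result = fetch_day(package='tdx', code=row.code,
                           start='2018-01-01', end=today)
        pprint.pprint(result)
        results.append(result)
        if pause:
            time.sleep(pause)  # throttle requests to the tdx server
    return results
```

Each `testGetCommodity*OptionList` body then collapses to one call with its own contract list, keeping the throttling used by the C-contract test optional.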
| 33.606667 | 113 | 0.631621 | 608 | 5,041 | 4.904605 | 0.131579 | 0.056338 | 0.080483 | 0.100604 | 0.813883 | 0.806506 | 0.790074 | 0.773977 | 0.758216 | 0.758216 | 0 | 0.028959 | 0.253323 | 5,041 | 149 | 114 | 33.832215 | 0.763284 | 0.062488 | 0 | 0.628866 | 0 | 0 | 0.059698 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.113402 | false | 0.020619 | 0.154639 | 0 | 0.278351 | 0.360825 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ac8f114b25d249a18d6b5a4c7b4a078aa6bfa438 | 5,535 | py | Python | MDAF/TestFunctions/Alpine.py | ejeanboris/MDAF_COMPLETE | e99c9762ea7304acc0a6795d33a55449a9800d89 | [
"CC0-1.0"
] | 1 | 2020-12-30T23:04:51.000Z | 2020-12-30T23:04:51.000Z | MDAF/TestFunctions/Alpine.py | ejeanboris/MDAF_COMPLETE | e99c9762ea7304acc0a6795d33a55449a9800d89 | [
"CC0-1.0"
] | null | null | null | MDAF/TestFunctions/Alpine.py | ejeanboris/MDAF_COMPLETE | e99c9762ea7304acc0a6795d33a55449a9800d89 | [
"CC0-1.0"
] | null | null | null | import math
def main(args):
'''
>>> (main([0,0]) - 0)<0.001
True
#_# dimmensions: 2
#_# upper: 10
#_# lower: -10
#_# minimum: [0,0]
#_# opti: 0
#_# cm_angle: array([[1.51864326e+00], [5.34186818e-01], [1.13769921e+00], [6.01461615e-01], [1.04599092e+02], [4.22931285e+01], [3.99368522e-01], [1.77424245e-01], [0.00000000e+00], [4.10000000e-02]])
#_# cm_conv: array([[0.46153846], [0.25 ], [0.48076923], [0.51923077], [0. ], [0.012 ]])
#_# cm_grad: array([[0.25422972], [0.15759445], [0. ], [0.028 ]])
#_# ela_conv: array([[ 5.49000000e-01], [ 0.00000000e+00], [-9.65901820e-01], [ 3.67214694e+00], [ 1.00000000e+03], [ 1.01000000e-01]])
#_# ela_curv: array([[2.42619601e-01], [3.20078135e+00], [5.52064754e+00], [5.50463872e+00], [7.36057390e+00], [1.32601509e+01], [2.88883096e+00], [0.00000000e+00], [1.00366087e+00], [1.58589209e+00], [6.83833374e+00], [2.81557142e+00], [5.79921113e+00], [1.54395057e+02], [1.45430442e+01], [0.00000000e+00], [1.00365450e+00], [1.31915593e+00], [2.27082230e+01], [2.14233097e+00], [4.20450173e+00], [2.46681056e+03], [1.83077553e+02], [0.00000000e+00], [8.40000000e+03], [8.49000000e-01]])
#_# ela_distr: array([[ 0.25150949], [-0.27290042], [ 1. ], [ 0. ], [ 0.016 ]])
#_# ela_local: array([[9.00000000e+01], [9.00000000e-01], [1.88758092e+00], [4.70404069e-01], [1.00000000e-02], [1.11235955e-02], [1.00000000e-02], [1.00000000e+01], [3.50000000e+01], [4.09500000e+01], [4.00000000e+01], [4.50000000e+01], [1.00000000e+02], [1.50201044e+01], [4.18500000e+03], [3.17000000e-01]])
#_# ela_meta: array([[0.01734519], [1.57345526], [0.09271256], [0.1030968 ], [1.11200464], [0.01547931], [0.12118168], [1.03294402], [0.11114866], [0. ], [0.007 ]])
#_# basic: array([[ 2.00000000e+00], [ 5.00000000e+02], [-1.00000000e+01], [-1.00000000e+01], [ 1.00000000e+01], [ 1.00000000e+01], [-1.01319516e+01], [ 1.74084063e+01], [ 6.00000000e+00], [ 6.00000000e+00], [ 3.60000000e+01], [ 3.60000000e+01], [ 1.00000000e+00], [ 0.00000000e+00], [ 1.00000000e-03]])
#_# disp: array([[ 0.81723285], [ 1.02288078], [ 1.00043181], [ 0.91285508], [ 0.94977571], [ 1.02392436], [ 1.0226333 ], [ 0.95015092], [-1.90889021], [ 0.23897563], [ 0.00451003], [-0.91017497], [-0.51416167], [ 0.2449211 ], [ 0.23170416], [-0.51032053], [ 0. ], [ 0.009 ]])
#_# limo: array([[ 0.16281056], [ 0.12956626], [ 1.98950747], [ 1.3449436 ], [ 0.07059703], [ 0.01217696], [ 7.43415949], [10.64830298], [ 1.09951251], [ 1.00718078], [ 1.70875965], [ 0.7110877 ], [ 0. ], [ 0.027 ]])
#_# nbc: array([[ 0.13314903], [ 0.74751908], [ 0.1546862 ], [ 0.22427331], [-0.51183856], [ 0. ], [ 0.031 ]])
#_# pca: array([[1. ], [1. ], [1. ], [1. ], [0.50758756], [0.50758756], [0.3850794 ], [0.38716446], [0. ], [0.002 ]])
#_# gcm: array([[4. ], [0.11111111], [0.88888889], [0.52777778], [0.24241125], [0.25 ], [0.25032008], [0.25694859], [0.00648962], [0.11111111], [0.11805556], [0.11111111], [0.13888889], [0.01388889], [0.47222222], [0.25 ], [0.25 ], [0.25 ], [0.25 ], [0. ], [1. ], [0.25694859], [0.02777778], [0. ], [0.022 ], [4. ], [0.11111111], [0.88888889], [0.75 ], [0.23441552], [0.25 ], [0.25096224], [0.26366 ], [0.01218501], [0.02777778], [0.0625 ], [0.02777778], [0.16666667], [0.06944444], [0.25 ], [0.25 ], [0.25 ], [0.25 ], [0.25 ], [0. ], [1. ], [0.26366 ], [0.02777778], [0. ], [0.022 ], [4. ], [0.11111111], [0.88888889], [0.75 ], [0.23766183], [0.25 ], [0.24248077], [0.27737663], [0.01848444], [0.02777778], [0.0625 ], [0.02777778], [0.16666667], [0.06944444], [0.25 ], [0.25 ], [0.25 ], [0.25 ], [0.25 ], [0. ], [1. ], [0.27737663], [0.02777778], [0. ], [0.024 ]])
#_# ic: array([[0.75502423], [1.00600601], [1.46273336], [0.48548549], [0.44176707], [0. ], [0.166 ]])
#_# Represented: 1
'''
    vals = [abs(x * math.sin(x) + 0.1 * x) for x in args]  # Alpine N.1: sum of |x*sin(x) + 0.1*x|, so the minimum at the origin is 0
return sum(vals)
if __name__ == "__main__":
import doctest
doctest.testmod()
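For a standalone check of the values the docstring promises (global minimum 0 at the origin), the Alpine N.1 definition, sum of |x*sin(x) + 0.1*x|, can be evaluated directly; this sketch repeats the formula so it runs on its own:

```python
import math


def alpine1(xs):
    # Alpine N.1 benchmark function: non-negative everywhere, 0 at the origin
    return sum(abs(x * math.sin(x) + 0.1 * x) for x in xs)


print(alpine1([0, 0]))       # 0.0 at the documented minimum
print(alpine1([1, 1]) > 0)   # True away from the origin
```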
| 138.375 | 1,511 | 0.405962 | 594 | 5,535 | 3.720539 | 0.333333 | 0.024434 | 0.032579 | 0.029864 | 0.167421 | 0.124887 | 0.115837 | 0.115837 | 0.115837 | 0.115837 | 0 | 0.535641 | 0.384101 | 5,535 | 39 | 1,512 | 141.923077 | 0.112643 | 0.971816 | 0 | 0 | 0 | 0 | 0.043011 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
acc3e0f8eb5031e1626aaf3486f04c548570281d | 44 | py | Python | avalanche/training/plugins/strategy_plugin.py | coreylowman/avalanche | 9c1e7765f1577c400ec0c57260221bcffd9566a2 | [
"MIT"
] | 1 | 2021-09-15T13:57:27.000Z | 2021-09-15T13:57:27.000Z | avalanche/training/plugins/strategy_plugin.py | coreylowman/avalanche | 9c1e7765f1577c400ec0c57260221bcffd9566a2 | [
"MIT"
] | null | null | null | avalanche/training/plugins/strategy_plugin.py | coreylowman/avalanche | 9c1e7765f1577c400ec0c57260221bcffd9566a2 | [
"MIT"
] | null | null | null | from avalanche.core import SupervisedPlugin
| 22 | 43 | 0.886364 | 5 | 44 | 7.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 44 | 1 | 44 | 44 | 0.975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
acea80043678f69a92cea2bd1992b736973083a7 | 246 | py | Python | bats/__init__.py | nalzok/BATS.py | d2c6c0bbec547fe48f13a62e23d41ff040b48796 | [
"MIT"
] | 2 | 2020-10-29T19:40:54.000Z | 2021-04-13T22:00:55.000Z | bats/__init__.py | nalzok/BATS.py | d2c6c0bbec547fe48f13a62e23d41ff040b48796 | [
"MIT"
] | 3 | 2020-04-25T00:50:50.000Z | 2021-07-02T15:27:03.000Z | bats/__init__.py | nalzok/BATS.py | d2c6c0bbec547fe48f13a62e23d41ff040b48796 | [
"MIT"
] | 1 | 2021-06-28T16:23:37.000Z | 2021-06-28T16:23:37.000Z | from ._version import __version__
from .topology import *
from .dense import *
from .linalg import *
from .linalg_f2 import *
from .linalg_f3 import *
from .linalg_auto import *
from .visualization import persistence_diagram, persistence_barcode
| 27.333333 | 67 | 0.804878 | 32 | 246 | 5.875 | 0.40625 | 0.319149 | 0.340426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00939 | 0.134146 | 246 | 8 | 68 | 30.75 | 0.873239 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c5a1a2c09a5bd214ab6ebccd45d119efe9e5b352 | 107 | py | Python | schedules/Errors.py | YishaiYosifov/schedules | a13f7b1b31163b67fddf1036e3cfdc9071031132 | [
"MIT"
] | null | null | null | schedules/Errors.py | YishaiYosifov/schedules | a13f7b1b31163b67fddf1036e3cfdc9071031132 | [
"MIT"
] | null | null | null | schedules/Errors.py | YishaiYosifov/schedules | a13f7b1b31163b67fddf1036e3cfdc9071031132 | [
"MIT"
] | null | null | null | class Errors():
class RangeError(Exception):
pass
class AsyncError(Exception):
pass | 21.4 | 32 | 0.626168 | 10 | 107 | 6.7 | 0.6 | 0.38806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.28972 | 107 | 5 | 33 | 21.4 | 0.881579 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.4 | 0 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
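A minimal sketch of how the nested exception classes are raised and caught through the outer namespace (the class is repeated here so the snippet runs standalone):

```python
class Errors():
    class RangeError(Exception):
        pass
    class AsyncError(Exception):
        pass


# the nested classes are referenced through the outer class
try:
    raise Errors.RangeError("value out of range")
except Errors.RangeError as e:
    print(e)  # value out of range
```

Nesting the exceptions this way keeps one importable name (`Errors`) while still letting callers catch each failure mode separately.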
c5d4f69e4088766789c2e26266a4c0f80e2416fb | 78 | py | Python | 6 kyu/Diff That Poly.py | mwk0408/codewars_solutions | 9b4f502b5f159e68024d494e19a96a226acad5e5 | [
"MIT"
] | 6 | 2020-09-03T09:32:25.000Z | 2020-12-07T04:10:01.000Z | 6 kyu/Diff That Poly.py | mwk0408/codewars_solutions | 9b4f502b5f159e68024d494e19a96a226acad5e5 | [
"MIT"
] | 1 | 2021-12-13T15:30:21.000Z | 2021-12-13T15:30:21.000Z | 6 kyu/Diff That Poly.py | mwk0408/codewars_solutions | 9b4f502b5f159e68024d494e19a96a226acad5e5 | [
"MIT"
] | null | null | null | def diff(poly):
return [j*(len(poly)-1-i) for i,j in enumerate(poly[:-1])] | 39 | 62 | 0.615385 | 16 | 78 | 3 | 0.6875 | 0.208333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029851 | 0.141026 | 78 | 2 | 62 | 39 | 0.686567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
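`poly` is read highest-degree-first, so `[3, 2, 1]` stands for 3x^2 + 2x + 1 and each coefficient `poly[i]` is multiplied by its degree `len(poly) - 1 - i`:

```python
def diff(poly):
    return [j * (len(poly) - 1 - i) for i, j in enumerate(poly[:-1])]


print(diff([3, 2, 1]))     # [6, 2]  -> derivative of 3x^2 + 2x + 1
print(diff([1, 0, 0, 0]))  # [3, 0, 0] -> derivative of x^3
print(diff([5]))           # [] -> a constant differentiates to nothing
```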
a870de5eb58d917152a3f2ff7315049917379a0a | 344 | py | Python | modules/tests/test_weather.py | rampreeth/JARVIS-on-Messenger | 414d0440f111cffb457707b62f0d022512ced6c4 | [
"MIT"
] | 1,465 | 2016-04-19T02:04:51.000Z | 2022-03-31T03:40:14.000Z | modules/tests/test_weather.py | rampreeth/JARVIS-on-Messenger | 414d0440f111cffb457707b62f0d022512ced6c4 | [
"MIT"
] | 529 | 2016-04-20T03:40:43.000Z | 2022-02-03T12:37:05.000Z | modules/tests/test_weather.py | rampreeth/JARVIS-on-Messenger | 414d0440f111cffb457707b62f0d022512ced6c4 | [
"MIT"
] | 1,407 | 2016-04-19T02:16:38.000Z | 2022-03-30T14:24:41.000Z | import modules
def test_weather():
assert ('weather' == modules.process_query('tell me the weather in London')[0])
assert ('weather' == modules.process_query('weather Delhi')[0])
assert ('weather' == modules.process_query('What\'s the weather in Texas?')[0])
assert ('weather' != modules.process_query('something random')[0])
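The four assertions follow one query-in, module-name-out pattern; a table-driven version is sketched below with a stand-in for `modules.process_query` (the stub is mine — the real dispatcher is far richer than a substring check):

```python
# stand-in for modules.process_query: returns a tuple whose first element
# names the module that should handle the query
def process_query(query):
    return ('weather',) if 'weather' in query.lower() else ('random',)


cases = [
    ('tell me the weather in London', True),
    ('weather Delhi', True),
    ("What's the weather in Texas?", True),
    ('something random', False),
]
for query, expect_weather in cases:
    assert (process_query(query)[0] == 'weather') == expect_weather
print('all cases pass')
```

Keeping the queries in a list makes it cheap to add new phrasings without duplicating the assertion line.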
| 38.222222 | 83 | 0.688953 | 45 | 344 | 5.155556 | 0.444444 | 0.224138 | 0.344828 | 0.465517 | 0.564655 | 0.426724 | 0 | 0 | 0 | 0 | 0 | 0.013559 | 0.142442 | 344 | 8 | 84 | 43 | 0.772881 | 0 | 0 | 0 | 0 | 0 | 0.264535 | 0 | 0 | 0 | 0 | 0 | 0.666667 | 1 | 0.166667 | true | 0 | 0.166667 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a8a7b1f81a18164cb02edc013dacf5c2a8fc48e2 | 24 | py | Python | tests/build/scipy/scipy/_build_utils/__init__.py | crougeux/-a-i_v1.6.3_modif | b499a812e79f335d082d3f9b1070e0465ad67bab | [
"BSD-3-Clause"
] | 26 | 2018-02-14T23:52:58.000Z | 2021-08-16T13:50:03.000Z | tests/build/scipy/scipy/_build_utils/__init__.py | crougeux/-a-i_v1.6.3_modif | b499a812e79f335d082d3f9b1070e0465ad67bab | [
"BSD-3-Clause"
] | 5 | 2021-03-19T08:36:48.000Z | 2022-01-13T01:52:34.000Z | tests/build/scipy/scipy/_build_utils/__init__.py | crougeux/-a-i_v1.6.3_modif | b499a812e79f335d082d3f9b1070e0465ad67bab | [
"BSD-3-Clause"
] | 10 | 2018-08-13T19:38:39.000Z | 2020-04-19T03:02:00.000Z | from ._fortran import *
| 12 | 23 | 0.75 | 3 | 24 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 24 | 1 | 24 | 24 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a8b28c37fc7b8a88683a42e9e1e1c0d582934ed8 | 21,414 | py | Python | pdf4me/Pdf4mePythonClientApiTest/unittests/pdfA_client_test.py | pdf4me/pdf4me-clientapi-python | 17a1c1baae861369084d658475be56be42ca92d0 | [
"MIT"
] | 1 | 2020-06-30T22:18:17.000Z | 2020-06-30T22:18:17.000Z | pdf4me/Pdf4mePythonClientApiTest/unittests/pdfA_client_test.py | pdf4me/pdf4me-clientapi-python | 17a1c1baae861369084d658475be56be42ca92d0 | [
"MIT"
] | null | null | null | pdf4me/Pdf4mePythonClientApiTest/unittests/pdfA_client_test.py | pdf4me/pdf4me-clientapi-python | 17a1c1baae861369084d658475be56be42ca92d0 | [
"MIT"
] | 1 | 2018-07-10T17:40:37.000Z | 2018-07-10T17:40:37.000Z | import sys
sys.path.append('../../Pdf4mePythonClientApi')
sys.path.append('../../Pdf4mePythonClientApiTest')
import unittest
from nose.tools import assert_raises, assert_equal
from pdf4me.client.pdf4me_client import Pdf4meClient
from pdf4me.client.pdfA_client import PdfAClient
from pdf4me.helper.file_reader import FileReader
from pdf4me.helper.pdf4me_exceptions import Pdf4meClientException
from pdf4me.model import CreatePdfA, Document, PdfAAction, Rotate, RotateAction, PdfRotate, Protect, ProtectAction, Validate, ValidateAction, Repair, RepairAction, SignPdf, SignAction
from test_helper.check import Check
from test_helper.test_files import TestFiles
class PdfAClientTest(unittest.TestCase):
pdf4me_client = Pdf4meClient()
pdfA_client = PdfAClient(pdf4me_client)
test_files = TestFiles()
check = Check()
file_reader = FileReader()
def create_create_pdfA(self):
create_pdfA = CreatePdfA(
document=Document(
doc_data=self.test_files.pdf_data
),
pdf_a_action=PdfAAction(
compliance='pdfA2u'
)
)
return create_pdfA
def test_pdfA_throws_exception(self):
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.pdfA(None)
assert_equal(e.exception.msg, 'The create_pdfA parameter cannot be None.')
def test_pdfA_document_throws_exception(self):
# prepare args
create_pdfA = self.create_create_pdfA()
create_pdfA.document = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.pdfA(create_pdfA=create_pdfA)
assert_equal(e.exception.msg, 'The create_pdfA document cannot be None.')
def test_pdfA_document_data_throws_exception(self):
# prepare args
create_pdfA = self.create_create_pdfA()
create_pdfA.document.doc_data = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.pdfA(create_pdfA=create_pdfA)
assert_equal(e.exception.msg, 'The create_pdfA document cannot be None.')
def test_pdfA_action_data_throws_exception(self):
# prepare args
create_pdfA = self.create_create_pdfA()
create_pdfA.pdf_a_action = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.pdfA(create_pdfA=create_pdfA)
assert_equal(e.exception.msg, 'The pdf_a_action cannot be None.')
def test_pdfA_no_none_response(self):
# request
create_pdfA = self.create_create_pdfA()
res = self.pdfA_client.pdfA(create_pdfA)
# validation
self.assertIsNotNone(res)
self.assertIsNotNone(res['document'])
self.assertIsNotNone(res['document']['doc_data'])
def test_pdfA_doc_length(self):
# request
create_pdfA = self.create_create_pdfA()
res = self.pdfA_client.pdfA(create_pdfA)
# validation
original_length = self.test_files.pdf_length
pdfA = len(res['document']['doc_data'])
self.assertTrue(self.check.below_not_zero(pdfA, original_length))
def test_create_pdfA_no_none_response(self):
# request
res = self.pdfA_client.create_pdfA(
pdf_compliance='pdfA2u',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_create_pdfA_doc_length(self):
# request
res = self.pdfA_client.create_pdfA(
pdf_compliance='pdfA2u',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
original_length = self.test_files.pdf_length
pdfA = self.check.get_doc_base64_length(res)
self.assertTrue(self.check.below_not_zero(pdfA, original_length))
def test_create_pdfA_without_compliance_no_none_response(self):
# request
res = self.pdfA_client.create_pdfA(
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_create_pdfA_without_compliance_doc_length(self):
# request
res = self.pdfA_client.create_pdfA(
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
original_length = self.test_files.pdf_length
pdfA = self.check.get_doc_base64_length(res)
self.assertTrue(self.check.below_not_zero(pdfA, original_length))
# Rotate
def create_rotate(self):
rotate = Rotate(
document=Document(
doc_data=self.test_files.pdf_data
),
rotate_action=RotateAction(
rotation_list=[
PdfRotate(
page_nr=2,
rotation_type='clockwise'
)
]
)
)
return rotate
def test_rotate_throws_exception(self):
        create_rotate = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.rotate(rotate=create_rotate)
assert_equal(e.exception.msg, 'The rotate parameter cannot be None.')
def test_rotate_document_throws_exception(self):
# prepare args
create_rotate = self.create_rotate()
create_rotate.document = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.rotate(rotate=create_rotate)
assert_equal(e.exception.msg, 'The rotate document cannot be None.')
def test_rotate_document_data_throws_exception(self):
# prepare args
create_rotate = self.create_rotate()
create_rotate.document.doc_data = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.rotate(rotate=create_rotate)
assert_equal(e.exception.msg, 'The rotate document cannot be None.')
def test_rotate_action_data_throws_exception(self):
# prepare args
create_rotate = self.create_rotate()
create_rotate.rotate_action = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.rotate(rotate=create_rotate)
assert_equal(e.exception.msg, 'The rotate_action cannot be None.')
def test_rotate_no_none_response(self):
# request
create_rotate = self.create_rotate()
res = self.pdfA_client.rotate(create_rotate)
# validation
self.assertIsNotNone(res)
self.assertIsNotNone(res['document'])
self.assertIsNotNone(res['document']['doc_data'])
def test_rotate_doc_length(self):
# request
create_rotate = self.create_rotate()
res = self.pdfA_client.rotate(create_rotate)
# validation
original_length = self.test_files.pdf_length
rotate = len(res['document']['doc_data'])
self.assertTrue(self.check.below_not_zero(rotate, original_length))
def test_rotate_page_no_none_response(self):
# request
res = self.pdfA_client.rotate_page(
page_nr='2',
rotate='clockwise',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_rotate_page_doc_length(self):
# request
res = self.pdfA_client.rotate_page(
page_nr='2',
rotate='clockwise',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
original_length = self.test_files.pdf_length
rotate = self.check.get_doc_base64_length(res)
self.assertTrue(self.check.below_not_zero(rotate, original_length))
# def test_rotate_page_without_compliance_no_none_response(self):
# # request
# res = self.pdfA_client.rotate_page(
# file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
# )
#
# # validation
# self.assertIsNotNone(res)
#
# def test_rotate_page_without_compliance_doc_length(self):
# # request
# res = self.pdfA_client.rotate_page(
# file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
# )
#
# # validation
# original_length = self.test_files.pdf_length
# pdfA = self.check.get_doc_base64_length(res)
#
# self.assertTrue(self.check.below_not_zero(pdfA, original_length))
def test_rotate_document_no_none_response(self):
# request
res = self.pdfA_client.rotate_document(
rotate='clockwise',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_rotate_document_doc_length(self):
# request
res = self.pdfA_client.rotate_document(
rotate='clockwise',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
original_length = self.test_files.pdf_length
rotate = self.check.get_doc_base64_length(res)
self.assertTrue(self.check.below_not_zero(rotate, original_length))
# Protect
def create_protect(self):
protect = Protect(
document=Document(
doc_data=self.test_files.pdf_data
),
protect_action=ProtectAction(
user_password='pdf4me',
owner_password='pdf4me',
# unlock=None,
# permissions=None
)
)
return protect
def test_protect_throws_exception(self):
        create_protect = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.protect(protect=create_protect)
assert_equal(e.exception.msg, 'The protect parameter cannot be None.')
def test_protect_document_throws_exception(self):
# prepare args
create_protect = self.create_protect()
create_protect.document = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.protect(protect=create_protect)
assert_equal(e.exception.msg, 'The protect document cannot be None.')
def test_protect_document_data_throws_exception(self):
# prepare args
create_protect = self.create_protect()
create_protect.document.doc_data = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.protect(protect=create_protect)
assert_equal(e.exception.msg, 'The protect document cannot be None.')
def test_protect_action_data_throws_exception(self):
# prepare args
create_protect = self.create_protect()
create_protect.protect_action = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.protect(protect=create_protect)
assert_equal(e.exception.msg, 'The protect_action cannot be None.')
def test_protect_no_none_response(self):
# request
create_protect = self.create_protect()
res = self.pdfA_client.protect(create_protect)
# validation
self.assertIsNotNone(res)
self.assertIsNotNone(res['document'])
self.assertIsNotNone(res['document']['doc_data'])
def test_protect_doc_length(self):
# request
create_protect = self.create_protect()
res = self.pdfA_client.protect(create_protect)
# validation
original_length = self.test_files.pdf_length
protect = len(res['document']['doc_data'])
self.assertTrue(self.check.below_not_zero(protect, original_length))
def test_protect_document_no_none_response(self):
# request
res = self.pdfA_client.protect_document(
password='pdf4me',
permissions='all',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_protect_document_doc_length(self):
# request
res = self.pdfA_client.protect_document(
password='pdf4me',
permissions='all',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
original_length = self.test_files.pdf_length
protect = self.check.get_doc_base64_length(res)
self.assertTrue(self.check.below_not_zero(protect, original_length))
def test_unlock_document_no_none_response(self):
# request
res = self.pdfA_client.unlock_document(
password='pdf4me',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_unlock_document_doc_length(self):
# request
res = self.pdfA_client.unlock_document(
password='pdf4me',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
original_length = self.test_files.pdf_length
unlock = self.check.get_doc_base64_length(res)
self.assertTrue(self.check.below_not_zero(unlock, original_length))
# Validate
def create_validate(self):
validate = Validate(
document=Document(
doc_data=self.test_files.pdf_data
),
validate_action=ValidateAction(
pdf_conformance='pdfA2u'
)
)
return validate
def test_validate_throws_exception(self):
        create_validate = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.validate(validate=create_validate)
assert_equal(e.exception.msg, 'The validate parameter cannot be None.')
def test_validate_document_throws_exception(self):
# prepare args
create_validate = self.create_validate()
create_validate.document = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.validate(validate=create_validate)
assert_equal(e.exception.msg, 'The validate document cannot be None.')
def test_validate_document_data_throws_exception(self):
# prepare args
create_validate = self.create_validate()
create_validate.document.doc_data = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.validate(validate=create_validate)
assert_equal(e.exception.msg, 'The validate document cannot be None.')
def test_validate_action_data_throws_exception(self):
# prepare args
create_validate = self.create_validate()
create_validate.validate_action = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.validate(validate=create_validate)
assert_equal(e.exception.msg, 'The validate_action cannot be None.')
def test_validate_no_none_response(self):
# request
create_validate = self.create_validate()
res = self.pdfA_client.validate(create_validate)
# validation
self.assertIsNotNone(res)
def test_validate_pdfValidation_length(self):
# request
create_validate = self.create_validate()
res = self.pdfA_client.validate(create_validate)
# validation
self.assertIsNotNone(res)
self.assertIsNotNone(res['pdf_validation'])
validate = len(res['pdf_validation']['validations'])
self.assertTrue(self.check.not_zero(validate))
def test_validate_document_no_none_response(self):
# request
res = self.pdfA_client.validate_document(
pdf_compliance='pdfA2u',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_validate_document_pdfValidation_length(self):
# request
res = self.pdfA_client.validate_document(
pdf_compliance='pdfA2u',
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
self.assertIsNotNone(res['pdf_validation'])
validate = len(res['pdf_validation']['validations'])
self.assertTrue(self.check.not_zero(validate))
# Repair
def create_repair(self):
repair = Repair(
document=Document(
doc_data=self.test_files.pdf_data
),
repair_action=RepairAction(
)
)
return repair
def test_repair_throws_exception(self):
create_repair: Repair = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.repair(repair=create_repair)
assert_equal(e.exception.msg, 'The repair parameter cannot be None.')
def test_repair_document_throws_exception(self):
# prepare args
create_repair = self.create_repair()
create_repair.document = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.repair(repair=create_repair)
assert_equal(e.exception.msg, 'The repair document cannot be None.')
def test_repair_document_data_throws_exception(self):
# prepare args
create_repair = self.create_repair()
create_repair.document.doc_data = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.repair(repair=create_repair)
assert_equal(e.exception.msg, 'The repair document cannot be None.')
def test_repair_no_none_response(self):
# request
create_repair = self.create_repair()
res = self.pdfA_client.repair(create_repair)
# validation
self.assertIsNotNone(res)
self.assertIsNotNone(res['document'])
self.assertIsNotNone(res['document']['doc_data'])
def test_repair_doc_length(self):
# request
create_repair = self.create_repair()
res = self.pdfA_client.repair(create_repair)
# validation
original_length = self.test_files.pdf_length
repair = len(res['document']['doc_data'])
self.assertTrue(self.check.below_not_zero(repair, original_length))
def test_repair_document_no_none_response(self):
# request
res = self.pdfA_client.repair_document(
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
def test_repair_document_doc_length(self):
# request
res = self.pdfA_client.repair_document(
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
original_length = self.test_files.pdf_length
repair = self.check.get_doc_base64_length(res)
self.assertTrue(self.check.below_not_zero(repair, original_length))
# SignPdf
def create_sign_pdf(self):
sign_pdf = SignPdf(
document=Document(
doc_data=self.test_files.pdf_data
),
sign_action=SignAction(
)
)
return sign_pdf
def test_sign_pdf_throws_exception(self):
create_sign_pdf: SignPdf = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.sign_pdf(sign_pdf=create_sign_pdf)
assert_equal(e.exception.msg, 'The sign_pdf parameter cannot be None.')
def test_sign_pdf_document_throws_exception(self):
# prepare args
create_sign_pdf = self.create_sign_pdf()
create_sign_pdf.document = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.sign_pdf(sign_pdf=create_sign_pdf)
assert_equal(e.exception.msg, 'The sign_pdf document cannot be None.')
def test_sign_pdf_document_data_throws_exception(self):
# prepare args
create_sign_pdf = self.create_sign_pdf()
create_sign_pdf.document.doc_data = None
with assert_raises(Pdf4meClientException) as e:
self.pdfA_client.sign_pdf(sign_pdf=create_sign_pdf)
assert_equal(e.exception.msg, 'The sign_pdf document cannot be None.')
# def test_sign_pdf_no_none_response(self):
# # request
# create_sign_pdf = self.create_sign_pdf()
# res = self.pdfA_client.sign_pdf(create_sign_pdf)
#
# # validation
# self.assertIsNotNone(res)
# self.assertIsNotNone(res['document'])
# self.assertIsNotNone(res['document']['doc_data'])
#
# def test_sign_pdf_doc_length(self):
# # request
# create_sign_pdf = self.create_sign_pdf()
# res = self.pdfA_client.sign_pdf(create_sign_pdf)
#
# # validation
# original_length = self.test_files.pdf_length
# sign_pdf = len(res['document']['doc_data'])
#
# self.assertTrue(self.check.below_not_zero(sign_pdf, original_length))
# Metadata
def test_metadata_no_none_response(self):
# request
res = self.pdfA_client.metadata(
file=self.file_reader.get_file_handler(path=self.test_files.pdf_path)
)
# validation
self.assertIsNotNone(res)
| 32.843558 | 183 | 0.662651 | 2,486 | 21,414 | 5.390587 | 0.042237 | 0.041042 | 0.055369 | 0.04537 | 0.86307 | 0.858966 | 0.83934 | 0.824789 | 0.810835 | 0.769943 | 0 | 0.004192 | 0.253713 | 21,414 | 651 | 184 | 32.894009 | 0.834366 | 0.095872 | 0 | 0.554688 | 0 | 0 | 0.063481 | 0.003015 | 0 | 0 | 0 | 0 | 0.21875 | 1 | 0.143229 | false | 0.015625 | 0.026042 | 0 | 0.200521 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
764a396e497210818b3b8a9378f28b1be8c49da3 | 6,239 | py | Python | tests/actions/test_move.py | tank0226/organize | d5595a52f06ea6c805fe421dcc2429a3ccd03b09 | [
"MIT"
] | 1,231 | 2018-01-13T17:06:24.000Z | 2022-03-31T22:14:36.000Z | tests/todo/todo_move.py | ytzhangFTD/organize | 706992385b1e4063cf5eb574511c49263dcc17b5 | [
"MIT"
] | 170 | 2018-03-13T19:15:17.000Z | 2022-03-31T10:14:15.000Z | tests/todo/todo_move.py | ytzhangFTD/organize | 706992385b1e4063cf5eb574511c49263dcc17b5 | [
"MIT"
] | 86 | 2018-03-14T02:12:49.000Z | 2022-03-27T00:16:07.000Z | import os
from organize.actions import Move
from pathlib import Path
from organize.utils import DotDict
USER_DIR = os.path.expanduser("~")
ARGS = DotDict(basedir=Path.home(), path=Path.home() / "test.py", simulate=False)
def test_tilde_expansion(mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir):
mock_exists.return_value = False
mock_samefile.return_value = False
move = Move(dest="~/newname.py", overwrite=False)
updates = move.run(**ARGS)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"), dst=os.path.join(USER_DIR, "newname.py")
)
assert updates == {"path": Path("~/newname.py").expanduser()}
def test_into_folder(mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir):
mock_exists.return_value = False
mock_samefile.return_value = False
move = Move(dest="~/somefolder/", overwrite=False)
updates = move.run(**ARGS)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"),
dst=os.path.join(USER_DIR, "somefolder", "test.py"),
)
assert updates == {"path": Path(USER_DIR) / "somefolder" / "test.py"}
def test_overwrite(mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir):
mock_exists.return_value = True
mock_samefile.return_value = False
move = Move(dest="~/somefolder/", overwrite=True)
updates = move.run(**ARGS)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_called_with(os.path.join(USER_DIR, "somefolder", "test.py"))
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"),
dst=os.path.join(USER_DIR, "somefolder", "test.py"),
)
assert updates is not None
def test_already_exists(mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir):
mock_exists.side_effect = [True, False]
mock_samefile.return_value = False
move = Move(dest="~/folder/", overwrite=False)
updates = move.run(**ARGS)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"),
dst=os.path.join(USER_DIR, "folder", "test 2.py"),
)
assert updates is not None
def test_already_exists_multiple(
mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir
):
mock_exists.side_effect = [True, True, True, False]
mock_samefile.return_value = False
move = Move(dest="~/folder/", overwrite=False)
updates = move.run(**ARGS)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"),
dst=os.path.join(USER_DIR, "folder", "test 4.py"),
)
assert updates is not None
def test_already_exists_multiple_separator(
mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir
):
mock_exists.side_effect = [True, True, True, False]
mock_samefile.return_value = False
move = Move(dest="~/folder/", overwrite=False, counter_separator="_")
updates = move.run(**ARGS)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"),
dst=os.path.join(USER_DIR, "folder", "test_4.py"),
)
assert updates is not None
def test_makedirs(mock_parent, mock_move, mock_trash):
move = Move(dest="~/some/new/folder/", overwrite=False)
updates = move.run(**ARGS)
mock_parent.mkdir.assert_called_with(parents=True, exist_ok=True)
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"),
dst=os.path.join(USER_DIR, "some", "new", "folder", "test.py"),
)
assert updates is not None
def test_args(mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir):
args = ARGS.merge({"nr": {"upper": 1}})
mock_exists.return_value = False
mock_samefile.return_value = False
move = Move(dest="~/{nr.upper}-name.py", overwrite=False)
updates = move.run(**args)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"), dst=os.path.join(USER_DIR, "1-name.py")
)
assert updates is not None
def test_path(mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir):
mock_exists.return_value = False
mock_samefile.return_value = False
move = Move(dest="~/{path.stem}/{path.suffix}/{path.name}", overwrite=False)
updates = move.run(**ARGS)
mock_mkdir.assert_called_with(exist_ok=True, parents=True)
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_called_with(
src=os.path.join(USER_DIR, "test.py"),
dst=os.path.join(USER_DIR, "test", ".py", "test.py"),
)
assert updates is not None
def test_keep_location(mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir):
mock_exists.return_value = True
mock_samefile.return_value = True
move = Move(dest="~/test.py")
updates = move.run(**ARGS)
mock_mkdir.assert_not_called()
mock_exists.assert_called_with()
mock_trash.assert_not_called()
mock_move.assert_not_called()
assert updates is not None
def test_dont_keep_case_sensitive(
mock_exists, mock_samefile, mock_move, mock_trash, mock_mkdir
):
mock_exists.return_value = True
mock_samefile.return_value = True
move = Move(dest="~/TEST.PY")
updates = move.run(**ARGS)
assert mock_mkdir.call_count > 0
mock_exists.assert_called_with()
mock_trash.assert_not_called()
assert mock_move.call_count > 0
assert updates is not None
| 36.48538 | 88 | 0.713255 | 908 | 6,239 | 4.596916 | 0.085903 | 0.071874 | 0.111164 | 0.063728 | 0.860565 | 0.842357 | 0.842357 | 0.830379 | 0.804983 | 0.788213 | 0 | 0.001338 | 0.161404 | 6,239 | 170 | 89 | 36.7 | 0.796445 | 0 | 0 | 0.634483 | 0 | 0 | 0.06812 | 0.006251 | 0 | 0 | 0 | 0 | 0.372414 | 1 | 0.075862 | false | 0 | 0.027586 | 0 | 0.103448 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4f17b93fa6cf1894584ca01717a6e7f199d0dab5 | 647 | py | Python | bireme/main/decorators.py | viniciusandrade/fi-admin-fork | 5187af08d1424f275222422ed12f9e54c14caa5b | [
"MIT",
"Python-2.0",
"Apache-2.0",
"BSD-3-Clause"
] | 5 | 2016-04-08T19:45:25.000Z | 2022-03-24T16:56:49.000Z | bireme/main/decorators.py | viniciusandrade/fi-admin-fork | 5187af08d1424f275222422ed12f9e54c14caa5b | [
"MIT",
"Python-2.0",
"Apache-2.0",
"BSD-3-Clause"
] | 1,214 | 2015-03-10T14:47:10.000Z | 2022-03-31T12:15:05.000Z | bireme/main/decorators.py | viniciusandrade/fi-admin-fork | 5187af08d1424f275222422ed12f9e54c14caa5b | [
"MIT",
"Python-2.0",
"Apache-2.0",
"BSD-3-Clause"
] | 7 | 2015-06-01T17:58:22.000Z | 2021-09-29T12:34:19.000Z | from django.http import HttpResponseForbidden
from functools import wraps
def advanced_permission(method):
@wraps(method)
def wrapper(request, *args, **kwargs):
user = request.user
if not user.is_superuser and user.profile.type == "basic":
return HttpResponseForbidden()
return method(request, *args, **kwargs)
return wrapper
def superuser_permission(method):
@wraps(method)
def wrapper(request, *args, **kwargs):
user = request.user
if not user.is_superuser:
return HttpResponseForbidden()
return method(request, *args, **kwargs)
return wrapper | 26.958333 | 66 | 0.666151 | 70 | 647 | 6.1 | 0.357143 | 0.103045 | 0.159251 | 0.126464 | 0.740047 | 0.740047 | 0.740047 | 0.740047 | 0.740047 | 0.416862 | 0 | 0 | 0.239567 | 647 | 24 | 67 | 26.958333 | 0.867886 | 0 | 0 | 0.666667 | 0 | 0 | 0.007716 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
4f3fd72a146a443c510cae4d21c42dc46ddafd13 | 5,297 | py | Python | tests/test_utils.py | doctorlard/nikola | d6dd2fc4af6fa0d92dfda500393bbd235b60df2a | [
"MIT"
] | 1 | 2015-12-14T21:38:33.000Z | 2015-12-14T21:38:33.000Z | tests/test_utils.py | doctorlard/nikola | d6dd2fc4af6fa0d92dfda500393bbd235b60df2a | [
"MIT"
] | 1 | 2019-08-18T13:37:20.000Z | 2019-08-18T16:09:08.000Z | tests/test_utils.py | doctorlard/nikola | d6dd2fc4af6fa0d92dfda500393bbd235b60df2a | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
import unittest
import mock
from nikola.post import get_meta
class dummy(object):
pass
class GetMetaTest(unittest.TestCase):
def test_getting_metadata_from_content(self):
file_metadata = [".. title: Nikola needs more tests!\n",
".. slug: write-tests-now\n",
".. date: 2012/09/15 19:52:05\n",
".. tags:\n",
".. link:\n",
".. description:\n",
"Post content\n"]
opener_mock = mock.mock_open(read_data=file_metadata)
opener_mock.return_value.readlines.return_value = file_metadata
post = dummy()
post.source_path = 'file_with_metadata'
post.metadata_path = 'file_with_metadata.meta'
with mock.patch('nikola.post.codecs.open', opener_mock, create=True):
meta = get_meta(post)
self.assertEqual('Nikola needs more tests!', meta['title'])
self.assertEqual('write-tests-now', meta['slug'])
self.assertEqual('2012/09/15 19:52:05', meta['date'])
self.assertFalse('tags' in meta)
self.assertFalse('link' in meta)
self.assertFalse('description' in meta)
def test_get_title_from_rest(self):
file_metadata = [".. slug: write-tests-now\n",
".. date: 2012/09/15 19:52:05\n",
".. tags:\n",
".. link:\n",
".. description:\n",
"Post Title\n",
"----------\n"]
opener_mock = mock.mock_open(read_data=file_metadata)
opener_mock.return_value.readlines.return_value = file_metadata
post = dummy()
post.source_path = 'file_with_metadata'
post.metadata_path = 'file_with_metadata.meta'
with mock.patch('nikola.post.codecs.open', opener_mock, create=True):
meta = get_meta(post)
self.assertEqual('Post Title', meta['title'])
self.assertEqual('write-tests-now', meta['slug'])
self.assertEqual('2012/09/15 19:52:05', meta['date'])
self.assertFalse('tags' in meta)
self.assertFalse('link' in meta)
self.assertFalse('description' in meta)
def test_get_title_from_fname(self):
file_metadata = [".. slug: write-tests-now\n",
".. date: 2012/09/15 19:52:05\n",
".. tags:\n",
".. link:\n",
".. description:\n"]
opener_mock = mock.mock_open(read_data=file_metadata)
opener_mock.return_value.readlines.return_value = file_metadata
post = dummy()
post.source_path = 'file_with_metadata'
post.metadata_path = 'file_with_metadata.meta'
with mock.patch('nikola.post.codecs.open', opener_mock, create=True):
meta = get_meta(post, 'file_with_metadata')
self.assertEqual('file_with_metadata', meta['title'])
self.assertEqual('write-tests-now', meta['slug'])
self.assertEqual('2012/09/15 19:52:05', meta['date'])
self.assertFalse('tags' in meta)
self.assertFalse('link' in meta)
self.assertFalse('description' in meta)
def test_use_filename_as_slug_fallback(self):
file_metadata = [".. title: Nikola needs more tests!\n",
".. date: 2012/09/15 19:52:05\n",
".. tags:\n",
".. link:\n",
".. description:\n",
"Post content\n"]
opener_mock = mock.mock_open(read_data=file_metadata)
opener_mock.return_value.readlines.return_value = file_metadata
post = dummy()
post.source_path = 'Slugify this'
post.metadata_path = 'Slugify this.meta'
with mock.patch('nikola.post.codecs.open', opener_mock, create=True):
meta = get_meta(post, 'Slugify this')
self.assertEqual('Nikola needs more tests!', meta['title'])
self.assertEqual('slugify-this', meta['slug'])
self.assertEqual('2012/09/15 19:52:05', meta['date'])
self.assertFalse('tags' in meta)
self.assertFalse('link' in meta)
self.assertFalse('description' in meta)
def test_extracting_metadata_from_filename(self):
post = dummy()
post.source_path = '2013-01-23-the_slug-dubdubtitle.md'
post.metadata_path = '2013-01-23-the_slug-dubdubtitle.meta'
with mock.patch('nikola.post.codecs.open', create=True):
meta = get_meta(post,
r'(?P<date>\d{4}-\d{2}-\d{2})-(?P<slug>.*)-(?P<title>.*)\.md')
self.assertEqual('dubdubtitle', meta['title'])
self.assertEqual('the_slug', meta['slug'])
self.assertEqual('2013-01-23', meta['date'])
def test_get_meta_slug_only_from_filename(self):
post = dummy()
post.source_path = 'some/path/the_slug.md'
post.metadata_path = 'some/path/the_slug.meta'
with mock.patch('nikola.post.codecs.open', create=True):
meta = get_meta(post)
self.assertEqual('the_slug', meta['slug'])
if __name__ == '__main__':
unittest.main()
| 38.384058 | 89 | 0.572588 | 631 | 5,297 | 4.614897 | 0.136292 | 0.082418 | 0.021978 | 0.027473 | 0.823832 | 0.812157 | 0.796703 | 0.777473 | 0.745536 | 0.717033 | 0 | 0.037027 | 0.2862 | 5,297 | 137 | 90 | 38.664234 | 0.733139 | 0.003965 | 0 | 0.682243 | 0 | 0.009346 | 0.258058 | 0.071862 | 0 | 0 | 0 | 0 | 0.261682 | 1 | 0.056075 | false | 0.009346 | 0.037383 | 0 | 0.11215 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4f7d343c75f75aa722fcdea561f93b13caf2fecd | 7 | py | Python | Random-Programs/dev/games/terminal/testenviroment.py | naumoff0/Archive | d4ad2da89abb1576dd5a7c72ded6bf9b45c3f610 | [
"MIT"
] | null | null | null | Random-Programs/dev/games/terminal/testenviroment.py | naumoff0/Archive | d4ad2da89abb1576dd5a7c72ded6bf9b45c3f610 | [
"MIT"
] | null | null | null | Random-Programs/dev/games/terminal/testenviroment.py | naumoff0/Archive | d4ad2da89abb1576dd5a7c72ded6bf9b45c3f610 | [
"MIT"
] | null | null | null | ±C;j= | 7 | 7 | 0.285714 | 4 | 7 | 1.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 | 1 | 7 | 7 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
96c37fe3f975b60c420f3029f2ed04a81740e2e3 | 215 | py | Python | students/K33422/practical_works/Chanova_Sofya/simple_django_web_project/django_project_chanova/project_first_app/admin.py | ShubhamKunal/ITMO_ICT_WebDevelopment_2020-2021 | bb91c91a56d21cec2b12ae4cc722eaa652a88420 | [
"MIT"
] | 4 | 2020-09-03T15:41:42.000Z | 2021-12-24T15:28:20.000Z | students/K33422/practical_works/Chanova_Sofya/simple_django_web_project/django_project_chanova/project_first_app/admin.py | ShubhamKunal/ITMO_ICT_WebDevelopment_2020-2021 | bb91c91a56d21cec2b12ae4cc722eaa652a88420 | [
"MIT"
] | 48 | 2020-09-13T20:22:42.000Z | 2021-04-30T11:13:30.000Z | students/K33422/practical_works/Chanova_Sofya/simple_django_web_project/django_project_chanova/project_first_app/admin.py | ShubhamKunal/ITMO_ICT_WebDevelopment_2020-2021 | bb91c91a56d21cec2b12ae4cc722eaa652a88420 | [
"MIT"
] | 69 | 2020-09-06T10:32:37.000Z | 2021-11-28T18:13:17.000Z | from django.contrib import admin
from .models import Owner, Car, CarOwnership, DriverLicense
admin.site.register(Owner)
admin.site.register(Car)
admin.site.register(CarOwnership)
admin.site.register(DriverLicense)
| 26.875 | 59 | 0.827907 | 28 | 215 | 6.357143 | 0.428571 | 0.202247 | 0.382022 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074419 | 215 | 7 | 60 | 30.714286 | 0.894472 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
96f0e215719e1b2faa534bcb1c3c2ce7fb1e8bc6 | 203 | py | Python | examples/admin.py | dashdrum/django-reportview | 617166b0c923fadebe5dcea1e73e598ca5a46d1c | [
"NCSA"
] | 7 | 2015-09-17T19:24:19.000Z | 2017-03-08T17:13:49.000Z | examples/admin.py | dashdrum/django-reportview | 617166b0c923fadebe5dcea1e73e598ca5a46d1c | [
"NCSA"
] | null | null | null | examples/admin.py | dashdrum/django-reportview | 617166b0c923fadebe5dcea1e73e598ca5a46d1c | [
"NCSA"
] | null | null | null | from models import Author, Book, Publisher, Store
from django.contrib import admin
admin.site.register(Author)
admin.site.register(Book)
admin.site.register(Publisher)
admin.site.register(Store)
| 25.375 | 50 | 0.788177 | 28 | 203 | 5.714286 | 0.428571 | 0.225 | 0.425 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1133 | 203 | 7 | 51 | 29 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
8c16e3a1a65a4ecb5a6bce6391818d72b2aab497 | 23,823 | py | Python | movie_debug.py | RenqinCai/KDD19-CoSTCo | cc19c4b029e55af1c6b2a4d60d96cb8048f22d90 | [
"MIT"
] | null | null | null | movie_debug.py | RenqinCai/KDD19-CoSTCo | cc19c4b029e55af1c6b2a4d60d96cb8048f22d90 | [
"MIT"
] | null | null | null | movie_debug.py | RenqinCai/KDD19-CoSTCo | cc19c4b029e55af1c6b2a4d60d96cb8048f22d90 | [
"MIT"
] | null | null | null | import os
import json
import torch
import numpy as np
from torch.utils.data import Dataset
import pandas as pd
import argparse
import copy
from collections import Counter
from nltk.tokenize import TweetTokenizer
import gensim
import random
class _MOVIE(Dataset):
def __init__(self, args, vocab_obj, df, boa_item_dict, boa_user_dict):
super().__init__()
self.m_data_dir = args.data_dir
self.m_max_seq_len = args.max_seq_length
self.m_batch_size = args.batch_size
# self.m_vocab_file = "amazon_vocab.json"
self.m_max_line = 1e10
self.m_sos_id = vocab_obj.sos_idx
self.m_eos_id = vocab_obj.eos_idx
self.m_pad_id = vocab_obj.pad_idx
self.m_vocab_size = vocab_obj.vocab_size
self.m_vocab = vocab_obj
self.m_sample_num = len(df)
print("sample num", self.m_sample_num)
self.m_batch_num = int(self.m_sample_num/self.m_batch_size)
print("batch num", self.m_batch_num)
if (self.m_sample_num/self.m_batch_size - self.m_batch_num) > 0:
self.m_batch_num += 1
### get lengths
self.m_attr_item_list = []
self.m_attr_tf_item_list = []
self.m_attr_length_item_list = []
self.m_item_batch_list = []
self.m_attr_user_list = []
self.m_attr_tf_user_list = []
self.m_attr_length_user_list = []
self.m_user_batch_list = []
self.m_pos_target_list = []
self.m_pos_len_list = []
self.m_neg_target_list = []
self.m_neg_len_list = []
self.m_user2uid = {}
self.m_item2iid = {}
userid_list = df.userid.tolist()
itemid_list = df.itemid.tolist()
# review_list = df.review.tolist()
# tokens_list = df.token_idxs.tolist()
pos_attr_list = df.pos_attr.tolist()
neg_attr_list = df.neg_attr.tolist()
# print("boa_user_dict", boa_user_dict)
max_neg_len_threshold = 10000
max_attr_len = 20
for sample_index in range(self.m_sample_num):
user_id = userid_list[sample_index]
item_id = itemid_list[sample_index]
# pos_attrlist_i = list(pos_attr_list[sample_index])
pos_attrlist_i = [int(pos_attr_list[sample_index])]
neg_attrlist_i = list(neg_attr_list[sample_index])
neg_attrlist_i = [int(j) for j in neg_attrlist_i]
# tmp_len = random.randint(2, 5)
# neg_attrlist_i = neg_attrlist_i[:tmp_len]
# neg_attrlist_i = neg_attr_list[sample_index]
# attrdict_item_i = boa_item_dict[str(item_id)]
# attrlist_item_i = list(attrdict_item_i.keys())
# attrlist_item_i = [int(i) for i in attrlist_item_i]
# attrlist_item_i = attrlist_item_i[:max_attr_len]
# attrfreq_list_item_i = list(attrdict_item_i.values())
# attrfreq_list_item_i = attrfreq_list_item_i[:max_attr_len]
# attrdict_user_i = boa_user_dict[str(user_id)]
# attrlist_user_i = list(attrdict_user_i.keys())
# attrlist_user_i = [int(i) for i in attrlist_user_i]
# attrlist_user_i = attrlist_user_i[:max_attr_len]
# attrfreq_list_user_i = list(attrdict_user_i.values())
# attrfreq_list_user_i = attrfreq_list_user_i[:max_attr_len]
"""
scale the item freq into the range [0, 1]
"""
# def max_min_scale(val_list):
# vals = np.array(val_list)
# min_val = min(vals)
# max_val = max(vals)
# if max_val == min_val:
# scale_vals = np.zeros_like(vals)
# # print("scale_vals", scale_vals)
# else:
# scale_vals = (vals-min_val)/(max_val-min_val)
# scale_vals = scale_vals+1.0
# scale_val_list = list(scale_vals)
# return scale_val_list
# attrfreq_list_item_i = max_min_scale(attrfreq_list_item_i)
# self.m_attr_item_list.append(attrlist_item_i)
# self.m_attr_tf_item_list.append(attrfreq_list_item_i)
# self.m_attr_length_item_list.append(len(attrlist_item_i))
# attrfreq_list_user_i = max_min_scale(attrfreq_list_user_i)
# self.m_attr_user_list.append(attrlist_user_i)
# self.m_attr_tf_user_list.append(attrfreq_list_user_i)
# self.m_attr_length_user_list.append(len(attrlist_user_i))
self.m_user_batch_list.append(user_id)
self.m_item_batch_list.append(item_id)
self.m_pos_target_list.append(pos_attrlist_i)
self.m_pos_len_list.append(len(pos_attrlist_i))
self.m_neg_target_list.append(neg_attrlist_i)
self.m_neg_len_list.append(len(neg_attrlist_i))
# only m_item_batch_list / m_user_batch_list are populated above; the attr lists are left empty while their population code is commented out
print("... load train data ...", len(self.m_item_batch_list), len(self.m_user_batch_list))
# exit()
def __len__(self):
return len(self.m_item_batch_list)
def __getitem__(self, idx):
if torch.is_tensor(idx):
idx = idx.tolist()
i = idx
# attr_item_i = self.m_attr_item_list[i]
# attr_tf_item_i = self.m_attr_tf_item_list[i]
# attr_length_item_i = self.m_attr_length_item_list[i]
item_i = self.m_item_batch_list[i]
# attr_user_i = self.m_attr_user_list[i]
# attr_tf_user_i = self.m_attr_tf_user_list[i]
# attr_length_user_i = self.m_attr_length_user_list[i]
user_i = self.m_user_batch_list[i]
pos_target_i = self.m_pos_target_list[i]
# pos_len_i = self.m_pos_len_list[i]
neg_target_i = self.m_neg_target_list[i]
neg_len_i = self.m_neg_len_list[i]
# sample_i = {"attr_item": attr_item_i, "attr_tf_item": attr_tf_item_i, "attr_length_item": attr_length_item_i, "item": item_i, "attr_user": attr_user_i, "attr_tf_user": attr_tf_user_i, "attr_length_user": attr_length_user_i, "user": user_i, "pos_target": pos_target_i, "pos_len": pos_len_i, "neg_target": neg_target_i, "neg_len": neg_len_i}
sample_i = {"item": item_i, "user": user_i, "pos_target": pos_target_i, "neg_target": neg_target_i, "neg_len": neg_len_i}
return sample_i
@staticmethod
def collate(batch):
batch_size = len(batch)
attr_item_iter = []
attr_tf_item_iter = []
attr_length_item_iter = []
item_iter = []
attr_user_iter = []
attr_tf_user_iter = []
attr_length_user_iter = []
user_iter = []
pos_target_iter = []
pos_len_iter = []
neg_target_iter = []
neg_len_iter = []
for i in range(batch_size):
sample_i = batch[i]
# attr_length_item_i = sample_i["attr_length_item"]
# attr_length_item_iter.append(attr_length_item_i)
# attr_length_user_i = sample_i["attr_length_user"]
# attr_length_user_iter.append(attr_length_user_i)
# pos_len_i = sample_i["pos_len"]
# pos_len_iter.append(pos_len_i)
neg_len_i = sample_i["neg_len"]
neg_len_iter.append(neg_len_i)
# max_attr_length_item_iter = max(attr_length_item_iter)
# max_attr_length_user_iter = max(attr_length_user_iter)
# max_pos_targetlen_iter = max(pos_len_iter)
max_neg_targetlen_iter = max(neg_len_iter)
# print("max_pos_targetlen_iter", max_pos_targetlen_iter)
# print("max_neg_targetlen_iter", max_neg_targetlen_iter)
# freq_pad_id = float('-inf')
freq_pad_id = float(0)
pad_id = 0
for i in range(batch_size):
sample_i = batch[i]
# attr_item_i = copy.deepcopy(sample_i["attr_item"])
# attr_item_i = [int(i) for i in attr_item_i]
# attr_tf_item_i = copy.deepcopy(sample_i['attr_tf_item'])
# attr_length_item_i = sample_i["attr_length_item"]
# attr_item_i.extend([pad_id]*(max_attr_length_item_iter-attr_length_item_i))
# attr_item_iter.append(attr_item_i)
# attr_tf_item_i.extend([freq_pad_id]*(max_attr_length_item_iter-attr_length_item_i))
# attr_tf_item_iter.append(attr_tf_item_i)
item_i = sample_i["item"]
item_iter.append(item_i)
# attr_user_i = copy.deepcopy(sample_i["attr_user"])
# attr_user_i = [int(i) for i in attr_user_i]
# attr_tf_user_i = copy.deepcopy(sample_i['attr_tf_user'])
# attr_length_user_i = sample_i["attr_length_user"]
# attr_user_i.extend([pad_id]*(max_attr_length_user_iter-attr_length_user_i))
# attr_user_iter.append(attr_user_i)
# attr_tf_user_i.extend([freq_pad_id]*(max_attr_length_user_iter-attr_length_user_i))
# attr_tf_user_iter.append(attr_tf_user_i)
user_i = sample_i["user"]
user_iter.append(user_i)
pos_target_i = copy.deepcopy(sample_i["pos_target"])
# pos_len_i = sample_i["pos_len"]
# pos_target_i.extend([pad_id]*(max_pos_targetlen_iter-pos_len_i))
pos_target_iter.append(pos_target_i)
pos_len_i = 1
pos_len_iter.append(pos_len_i)
neg_target_i = copy.deepcopy(sample_i["neg_target"])
neg_len_i = sample_i["neg_len"]
neg_target_i.extend([pad_id]*(max_neg_targetlen_iter-neg_len_i))
neg_target_iter.append(neg_target_i)
# neg_len_i = 1
# neg_len_iter.append(pos_len_i)
# convert the collected lists to long tensors
item_iter_tensor = torch.from_numpy(np.array(item_iter)).long()
user_iter_tensor = torch.from_numpy(np.array(user_iter)).long()
pos_target_iter_tensor = torch.from_numpy(np.array(pos_target_iter)).long()
pos_len_iter_tensor = torch.from_numpy(np.array(pos_len_iter)).long()
neg_target_iter_tensor = torch.from_numpy(np.array(neg_target_iter)).long()
neg_len_iter_tensor = torch.from_numpy(np.array(neg_len_iter)).long()
return item_iter_tensor, user_iter_tensor, pos_target_iter_tensor, pos_len_iter_tensor, neg_target_iter_tensor, neg_len_iter_tensor
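The padding step in the collate function above can be exercised on a toy batch. The ids below are made up for illustration; only the list padding logic is taken from the source.

```python
# Toy batch (hypothetical ids) showing how collate pads negative-target
# lists: each list is extended with pad_id up to the batch maximum length.
import copy

batch = [
    {"neg_target": [11, 12], "neg_len": 2},
    {"neg_target": [13, 14, 15], "neg_len": 3},
]
pad_id = 0
max_neg_len = max(s["neg_len"] for s in batch)
neg_padded = []
for s in batch:
    neg = copy.deepcopy(s["neg_target"])          # avoid mutating the sample
    neg.extend([pad_id] * (max_neg_len - s["neg_len"]))
    neg_padded.append(neg)
print(neg_padded)  # [[11, 12, 0], [13, 14, 15]]
```

After this step every row has equal length, so the batch can be stacked into a single long tensor.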
class _MOVIE_TEST(Dataset):
def __init__(self, args, vocab_obj, df, boa_item_dict, boa_user_dict):
super().__init__()
self.m_data_dir = args.data_dir
self.m_max_seq_len = args.max_seq_length
self.m_batch_size = args.batch_size
# self.m_vocab_file = "amazon_vocab.json"
self.m_max_line = 1e10
self.m_sos_id = vocab_obj.sos_idx
self.m_eos_id = vocab_obj.eos_idx
self.m_pad_id = vocab_obj.pad_idx
self.m_vocab_size = vocab_obj.vocab_size
self.m_vocab = vocab_obj
self.m_sample_num = len(df)
print("sample num", self.m_sample_num)
self.m_batch_num = int(self.m_sample_num/self.m_batch_size)
print("batch num", self.m_batch_num)
if (self.m_sample_num/self.m_batch_size - self.m_batch_num) > 0:
self.m_batch_num += 1
## per-sample bookkeeping lists
self.m_attr_item_list = []
self.m_attr_tf_item_list = []
self.m_attr_length_item_list = []
self.m_item_batch_list = []
self.m_attr_user_list = []
self.m_attr_tf_user_list = []
self.m_attr_length_user_list = []
self.m_user_batch_list = []
self.m_target_list = []
self.m_target_len_list = []
self.m_user2uid = {}
self.m_item2iid = {}
userid_list = df.userid.tolist()
itemid_list = df.itemid.tolist()
# review_list = df.review.tolist()
# tokens_list = df.token_idxs.tolist()
attr_list = df.attr.tolist()
# print("boa_user_dict", boa_user_dict)
# max_neg_len_threshold = 10000
max_attr_len = 20
for sample_index in range(self.m_sample_num):
user_id = userid_list[sample_index]
item_id = itemid_list[sample_index]
attrlist_i = list(attr_list[sample_index])
attrlist_i = [int(j) for j in attrlist_i]
# attribute-based item/user features (including the commented-out max-min
# scaling of attribute frequencies) are disabled; keep only the ids and
# the attribute targets for each sample
self.m_item_batch_list.append(item_id)
self.m_user_batch_list.append(user_id)
self.m_target_list.append(attrlist_i)
self.m_target_len_list.append(len(attrlist_i))
print("... load test data ...", len(self.m_item_batch_list), len(self.m_user_batch_list), len(self.m_target_list))
def __len__(self):
return len(self.m_item_batch_list)
def __getitem__(self, idx):
if torch.is_tensor(idx):
idx = idx.tolist()
i = idx
# attribute-based features are disabled; return ids and targets only
item_i = self.m_item_batch_list[i]
user_i = self.m_user_batch_list[i]
target_i = self.m_target_list[i]
target_len_i = self.m_target_len_list[i]
sample_i = {"item": item_i, "user": user_i, "target": target_i, "target_len": target_len_i}
return sample_i
@staticmethod
def collate(batch):
batch_size = len(batch)
item_iter = []
user_iter = []
target_iter = []
target_len_iter = []
target_mask_iter = []
for i in range(batch_size):
sample_i = batch[i]
# first pass: gather target lengths to find the batch maximum
target_len_iter.append(sample_i["target_len"])
max_targetlen_iter = max(target_len_iter)
for i in range(batch_size):
sample_i = batch[i]
# attribute-based features are disabled; collect item and user ids only
item_iter.append(sample_i["item"])
user_iter.append(sample_i["user"])
# pad targets with -1 and build a 0/1 mask marking the real entries
target_i = copy.deepcopy(sample_i["target"])
target_len_i = sample_i["target_len"]
target_i.extend([-1] * (max_targetlen_iter - target_len_i))
target_iter.append(target_i)
target_mask_iter.append([1] * target_len_i + [0] * (max_targetlen_iter - target_len_i))
item_iter_tensor = torch.from_numpy(np.array(item_iter)).long()
user_iter_tensor = torch.from_numpy(np.array(user_iter)).long()
target_iter_tensor = torch.from_numpy(np.array(target_iter)).long()
target_mask_iter_tensor = torch.from_numpy(np.array(target_mask_iter)).long()
return item_iter_tensor, user_iter_tensor, target_iter_tensor, target_mask_iter_tensor
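The -1 padding and the 0/1 mask built in this test-time collate can be sketched on a hypothetical pair of samples (ids are invented; only the padding/mask construction mirrors the source):

```python
# Two toy samples with different numbers of target attributes.
# Targets are right-padded with -1; the mask is 1 over real entries, 0 over pad.
import copy

batch = [
    {"target": [2, 5], "target_len": 2},
    {"target": [7, 8, 9], "target_len": 3},
]
max_len = max(s["target_len"] for s in batch)
targets, masks = [], []
for s in batch:
    t = copy.deepcopy(s["target"])
    t.extend([-1] * (max_len - s["target_len"]))   # -1 marks padding
    targets.append(t)
    masks.append([1] * s["target_len"] + [0] * (max_len - s["target_len"]))
print(targets)  # [[2, 5, -1], [7, 8, 9]]
print(masks)    # [[1, 1, 0], [1, 1, 1]]
```

Using -1 (rather than a valid attribute id) for target padding lets downstream evaluation distinguish padding from the id-0 attribute.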
def remove_target_zero_row(args):
data_dir = args.data_dir
train_data_file = data_dir + "/train.pickle"
valid_data_file = data_dir + "/valid.pickle"
# test_data_file = data_dir+"/test.pickle"
train_df = pd.read_pickle(train_data_file)
valid_df = pd.read_pickle(valid_data_file)
# test_df = pd.read_pickle(test_data_file)
print("columns", train_df.columns)
print('train num', len(train_df))
print('valid num', len(valid_df))
# print('test num', len(test_df))
item_boa_file = args.item_boa_file
with open(os.path.join(data_dir, item_boa_file), 'r', encoding='utf8') as f:
item_boa_dict = json.loads(f.read())
def remove_target_zero_row_df(df, train_valid_test_flag):
sample_num = len(df)
print("=="*10, train_valid_test_flag, "=="*10)
print("input sample num", sample_num)
userid_list = df.userid.tolist()
itemid_list = df.itemid.tolist()
attrlist_list = df.attr.tolist()
# rating_list = df.rating.tolist()
data_list = []
for sample_i in range(sample_num):
userid_i = userid_list[sample_i]
itemid_i = itemid_list[sample_i]
attrlist_i = attrlist_list[sample_i]
# rating_i = rating_list[sample_i]
item_boa_boafreq_list = item_boa_dict[str(itemid_i)]
# print(item_boa_boafreq_list)
item_boa = list(item_boa_boafreq_list.keys())
item_boa = [int(i) for i in item_boa]
item_boafreq = list(item_boa_boafreq_list.values())
# print(item_boa)
# print(attrlist_i)
# keep the sample only if its attributes overlap the item's bag-of-attributes
common_boa = list(set(attrlist_i) & set(item_boa))
if len(common_boa) == 0:
continue
# boa_i = set(boa_i)
sub_data = [userid_i, itemid_i, attrlist_i]
data_list.append(sub_data)
new_df = pd.DataFrame(data_list)
print(new_df.head())
new_df.columns = ['userid', 'itemid', 'attr']
print("output sample num", len(new_df))
return new_df
new_train_data_file = data_dir + "/new_train.pickle"
new_valid_data_file = data_dir + "/new_valid.pickle"
new_train_df = remove_target_zero_row_df(train_df, "train")
new_train_df.to_pickle(new_train_data_file)
new_valid_df = remove_target_zero_row_df(valid_df, "valid")
new_valid_df.to_pickle(new_valid_data_file)
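The zero-overlap filter inside `remove_target_zero_row_df` can be illustrated on made-up rows; the user/item ids and attribute dictionary below are purely hypothetical, but the intersection test matches the source:

```python
# Rows are (userid, itemid, attr_list); a row survives only if its attributes
# intersect the item's bag-of-attributes (keys of the per-item dict).
rows = [
    ("u1", "i1", [3, 4]),
    ("u2", "i2", [9]),
]
item_boa_dict = {"i1": {"4": 2, "5": 1}, "i2": {"1": 3}}

kept = []
for userid, itemid, attrs in rows:
    item_boa = [int(a) for a in item_boa_dict[itemid].keys()]
    if set(attrs) & set(item_boa):   # non-empty intersection -> keep
        kept.append((userid, itemid, attrs))
print(kept)  # [('u1', 'i1', [3, 4])]
```

Dropping these rows up front avoids training/evaluation samples whose targets can never be recovered from the item's attribute vocabulary.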
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument('--data_dir', type=str, default="../data/movie_attr_oov")
parser.add_argument('--item_boa_file', type=str, default="item_attr.json")
args = parser.parse_args()
remove_target_zero_row(args)
| 38.239165 | 350 | 0.64358 | 3,521 | 23,823 | 3.84067 | 0.047146 | 0.045108 | 0.045552 | 0.017748 | 0.846114 | 0.803002 | 0.758929 | 0.748798 | 0.71249 | 0.709754 | 0 | 0.002997 | 0.257566 | 23,823 | 622 | 351 | 38.300643 | 0.761576 | 0.402006 | 0 | 0.541985 | 0 | 0 | 0.031532 | 0.001573 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038168 | false | 0 | 0.045802 | 0.007634 | 0.118321 | 0.049618 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8c234b480328fa2c0a1a0b86378c67e2c0db3e3d | 2,142 | py | Python | blaze/tests/test_vlen.py | davidfischer/blaze-core | 19b55ed469aec742fd871b959115a3d87a89acb9 | [
"BSD-2-Clause"
] | 1 | 2015-01-02T18:16:07.000Z | 2015-01-02T18:16:07.000Z | blaze/tests/test_vlen.py | davidfischer/blaze-core | 19b55ed469aec742fd871b959115a3d87a89acb9 | [
"BSD-2-Clause"
] | null | null | null | blaze/tests/test_vlen.py | davidfischer/blaze-core | 19b55ed469aec742fd871b959115a3d87a89acb9 | [
"BSD-2-Clause"
] | null | null | null | """
Tests for the blob data type.
"""
import blaze
import tempfile, shutil, os.path
def test_simple_blob():
ds = blaze.dshape('x, blob')
c = blaze.Array(["s1", "sss2"], ds)
assert c[0] == "s1"
assert c[1] == "sss2"
def test_simple_persistent_blob():
td = tempfile.mkdtemp()
tmppath = os.path.join(td, 'c')
ds = blaze.dshape('x, blob')
c = blaze.Array(["s1", "sss2"], ds,
params=blaze.params(storage=tmppath))
assert c[0] == "s1"
assert c[1] == "sss2"
# Remove everything under the temporary dir
shutil.rmtree(td)
def test_object_blob():
ds = blaze.dshape('x, blob')
c = blaze.Array([(i, str(i*.2)) for i in range(10)], ds)
for i, v in enumerate(c):
assert v[0] == i
assert v[1] == str(i*.2)
def test_object_unicode():
ds = blaze.dshape('x, blob')
c = blaze.Array([u'a'*i for i in range(10)], ds)
for i, v in enumerate(c):
# The outcome are 0-dim arrays (that might change in the future)
assert v[()] == u'a'*i
def test_object_persistent_blob():
td = tempfile.mkdtemp()
tmppath = os.path.join(td, 'c')
ds = blaze.dshape('x, blob')
c = blaze.Array([(i, str(i*.2)) for i in range(10)], ds,
params=blaze.params(storage=tmppath))
for i, v in enumerate(c):
assert v[0] == i
assert v[1] == str(i*.2)
# Remove everything under the temporary dir
shutil.rmtree(td)
def test_object_persistent_blob_reopen():
td = tempfile.mkdtemp()
tmppath = os.path.join(td, 'c')
ds = blaze.dshape('x, blob')
c = blaze.Array([(i, "s"*i) for i in range(10)], ds,
params=blaze.params(storage=tmppath))
c2 = blaze.open(tmppath)
for i, v in enumerate(c2):
assert v[0] == i
assert v[1] == "s"*i
# Remove everything under the temporary dir
shutil.rmtree(td)
def test_intfloat_blob():
ds = blaze.dshape('x, blob')
c = blaze.Array([(i, i*.2) for i in range(10)], ds)
for i, v in enumerate(c):
print("v:", v, v[0], type(v[0]))
assert v[0] == i
assert v[1] == i*.2
| 24.906977 | 72 | 0.565359 | 336 | 2,142 | 3.550595 | 0.190476 | 0.033529 | 0.076278 | 0.082146 | 0.82062 | 0.797988 | 0.752724 | 0.722548 | 0.661358 | 0.658005 | 0 | 0.026081 | 0.266106 | 2,142 | 85 | 73 | 25.2 | 0.732824 | 0.087768 | 0 | 0.6 | 0 | 0 | 0.042887 | 0 | 0 | 0 | 0 | 0 | 0.236364 | 0 | null | null | 0 | 0.036364 | null | null | 0.018182 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8c2a8b6cfbd71b1f501ed69f5b4cac8f57bb6314 | 36 | py | Python | python/testData/completion/paramCompletionWithEquals.after.py | tgodzik/intellij-community | f5ef4191fc30b69db945633951fb160c1cfb7b6f | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/completion/paramCompletionWithEquals.after.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 2 | 2022-02-19T09:45:05.000Z | 2022-02-27T20:32:55.000Z | python/testData/completion/paramCompletionWithEquals.after.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | def func(myArg) : pass
func(myArg=) | 12 | 22 | 0.694444 | 6 | 36 | 4.166667 | 0.666667 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 36 | 3 | 23 | 12 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.5 | 0 | null | null | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
8c3c91dc1875e61ac451e5618e6619437e839ec8 | 397 | py | Python | string3.py | yanapermana/metadecryptor | 7c3e11ddb8ae05c19dbebd8ed2c744ff86ceaa2f | [
"BSD-3-Clause"
] | 49 | 2015-07-03T21:21:45.000Z | 2022-03-31T04:04:42.000Z | string3.py | hangetzzu/metadecryptor | 7c3e11ddb8ae05c19dbebd8ed2c744ff86ceaa2f | [
"BSD-3-Clause"
] | 3 | 2015-11-03T03:49:25.000Z | 2019-11-04T16:03:18.000Z | string3.py | hangetzzu/metadecryptor | 7c3e11ddb8ae05c19dbebd8ed2c744ff86ceaa2f | [
"BSD-3-Clause"
] | 17 | 2015-07-05T15:24:18.000Z | 2022-01-17T22:33:26.000Z | from gather3 import *
class String3:
def __init__(self):
pass
def digits(self, filename):
print(gather_digit(filename))
def lowers(self, filename):
print(gather_lower(filename))
def uppers(self, filename):
print(gather_upper(filename))
def upperdigits(self, filename):
print(gather_upper_digit(filename))
def lowerdigits(self, filename):
print(gather_lower_digit(filename)) | 19.85 | 37 | 0.755668 | 51 | 397 | 5.666667 | 0.392157 | 0.207612 | 0.294118 | 0.397924 | 0.387543 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005797 | 0.130982 | 397 | 20 | 38 | 19.85 | 0.831884 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.071429 | 0.071429 | 0 | 0.571429 | 0.357143 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
8c4edf8b06ad5581aa625d4bda42a9ee442de217 | 181 | py | Python | total_tolles_ferleihsystem/auth_providers/__init__.py | spethso/Verleihsystem-TTF | 39179f9ac5b07f5106e555f82f3c9011d33805bd | [
"MIT"
] | 1 | 2019-03-17T08:11:14.000Z | 2019-03-17T08:11:14.000Z | total_tolles_ferleihsystem/auth_providers/__init__.py | spethso/Verleihsystem-TTF | 39179f9ac5b07f5106e555f82f3c9011d33805bd | [
"MIT"
] | 60 | 2018-06-12T14:46:50.000Z | 2020-11-16T00:50:37.000Z | total_tolles_ferleihsystem/auth_providers/__init__.py | FIUS/ttf-backend | 39179f9ac5b07f5106e555f82f3c9011d33805bd | [
"MIT"
] | 1 | 2019-12-02T19:25:59.000Z | 2019-12-02T19:25:59.000Z | """
Authentication Providers
"""
from .. import APP
from . import ldap_auth_provider, basic_auth_provider
if APP.config.get('DEBUG', False):
from . import debug_auth_provider
| 18.1 | 53 | 0.751381 | 24 | 181 | 5.416667 | 0.583333 | 0.230769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.143646 | 181 | 9 | 54 | 20.111111 | 0.83871 | 0.132597 | 0 | 0 | 0 | 0 | 0.033557 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8c5e8ae34654b7169544c213164c00dbab6781bf | 25 | py | Python | test/__init__.py | Jyejin/hong | 62e762cdba60e07b673d4a1b0d64e758bacdeeb6 | [
"MIT"
] | null | null | null | test/__init__.py | Jyejin/hong | 62e762cdba60e07b673d4a1b0d64e758bacdeeb6 | [
"MIT"
] | 6 | 2018-03-06T07:53:09.000Z | 2022-03-11T23:15:52.000Z | test/__init__.py | Jyejin/hong | 62e762cdba60e07b673d4a1b0d64e758bacdeeb6 | [
"MIT"
] | null | null | null | from . import test_stocks | 25 | 25 | 0.84 | 4 | 25 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12 | 25 | 1 | 25 | 25 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4fc09d205a68bf0daf614b6c3c3912c87dcdd66f | 8,203 | py | Python | depth2normal/depth2normal_tf.py | velankar/LEGO | 39cd11a37cdd1fbb2ce0cd1d406f0488addc9a1a | [
"MIT"
] | 86 | 2018-07-02T07:54:49.000Z | 2021-12-28T06:37:39.000Z | depth2normal/depth2normal_tf.py | velankar/LEGO | 39cd11a37cdd1fbb2ce0cd1d406f0488addc9a1a | [
"MIT"
] | 9 | 2018-06-25T23:48:54.000Z | 2019-10-13T09:07:57.000Z | depth2normal/depth2normal_tf.py | velankar/LEGO | 39cd11a37cdd1fbb2ce0cd1d406f0488addc9a1a | [
"MIT"
] | 21 | 2018-08-02T14:36:16.000Z | 2022-03-04T03:40:31.000Z | import tensorflow as tf
import numpy as np
import time
def depth2normal_layer(depth_map, intrinsics, inverse):
## mask is used to filter the background with infinite depth
mask = tf.greater(depth_map, tf.zeros(depth_map.get_shape().as_list()))
if inverse:
mask_clip = 1e-8 * (1.0-tf.cast(mask, tf.float32)) ## Add black pixels (depth = infinite) with delta
depth_map += mask_clip
depth_map = 1.0/depth_map ## inverse depth map
kitti_shape = depth_map.get_shape().as_list()
pts_3d_map = compute_3dpts(depth_map, intrinsics)
print ("shape of pts_3d_map:")
print (pts_3d_map.get_shape().as_list())
nei = 5
## shift the 3d pts map by nei along 8 directions
pts_3d_map_ctr = pts_3d_map[nei:-nei, nei:-nei, :]
pts_3d_map_x0 = pts_3d_map[nei:-nei, 0:-(2*nei), :]
pts_3d_map_y0 = pts_3d_map[0:-(2*nei), nei:-nei, :]
pts_3d_map_x1 = pts_3d_map[nei:-nei, 2*nei:, :]
pts_3d_map_y1 = pts_3d_map[2*nei:, nei:-nei, :]
pts_3d_map_x0y0 = pts_3d_map[0:-(2*nei), 0:-(2*nei), :]
pts_3d_map_x0y1 = pts_3d_map[2*nei:, 0:-(2*nei), :]
pts_3d_map_x1y0 = pts_3d_map[0:-(2*nei), 2*nei:, :]
pts_3d_map_x1y1 = pts_3d_map[2*nei:, 2*nei:, :]
## generate difference between the central pixel and one of 8 neighboring pixels
diff_x0 = pts_3d_map_ctr - pts_3d_map_x0
diff_x1 = pts_3d_map_ctr - pts_3d_map_x1
diff_y0 = pts_3d_map_y0 - pts_3d_map_ctr
diff_y1 = pts_3d_map_y1 - pts_3d_map_ctr
diff_x0y0 = pts_3d_map_x0y0 - pts_3d_map_ctr
diff_x0y1 = pts_3d_map_ctr - pts_3d_map_x0y1
diff_x1y0 = pts_3d_map_x1y0 - pts_3d_map_ctr
diff_x1y1 = pts_3d_map_ctr - pts_3d_map_x1y1
## flatten the diff to a #pixle by 3 matrix
pix_num = (depth_map.get_shape().as_list()[0]-2*nei) * (depth_map.get_shape().as_list()[1]-2*nei)
diff_x0 = tf.reshape(diff_x0, [pix_num, 3])
diff_y0 = tf.reshape(diff_y0, [pix_num, 3])
diff_x1 = tf.reshape(diff_x1, [pix_num, 3])
diff_y1 = tf.reshape(diff_y1, [pix_num, 3])
diff_x0y0 = tf.reshape(diff_x0y0, [pix_num, 3])
diff_x0y1 = tf.reshape(diff_x0y1, [pix_num, 3])
diff_x1y0 = tf.reshape(diff_x1y0, [pix_num, 3])
diff_x1y1 = tf.reshape(diff_x1y1, [pix_num, 3])
## calculate normal by cross product of two vectors
normals0 = normalize_l2(tf.cross(diff_x1, diff_y1))
normals1 = normalize_l2(tf.cross(diff_x0, diff_y0))
normals2 = normalize_l2(tf.cross(diff_x0y1, diff_x0y0))
normals3 = normalize_l2(tf.cross(diff_x1y0, diff_x1y1))
normal_vector = tf.reduce_sum(tf.concat([[normals0], [normals1], [normals2], [normals3]], 0),0)
normal_vector = normalize_l2(normal_vector)
normal_map = tf.reshape(tf.squeeze(normal_vector), [kitti_shape[0]-2*nei]+[kitti_shape[1]-2*nei]+[3])
print (normal_map.get_shape().as_list())
normal_map *= tf.tile(tf.expand_dims(tf.cast(mask[nei:-nei, nei:-nei], tf.float32), 2), [1,1,3])
normal_map = tf.pad(normal_map, [[nei, nei], [nei, nei], [0,0]] ,"CONSTANT")
return normal_map
def depth2normal_layer_batch(depth_map, intrinsics, inverse, nei=3):
## depth_map is in rank 3 [batch, h, w], intrinsics are in rank 2 [batch,4]
## mask is used to filter the background with infinite depth
mask = tf.greater(depth_map, tf.zeros(depth_map.get_shape().as_list()))
if inverse:
mask_clip = 1e-8 * (1.0-tf.cast(mask, tf.float32)) ## Add black pixels (depth = infinite) with delta
depth_map += mask_clip
depth_map = 1.0/depth_map ## inverse depth map
kitti_shape = depth_map.get_shape().as_list()
pts_3d_map = compute_3dpts_batch(depth_map, intrinsics)
## shift the 3d pts map by nei along 8 directions
pts_3d_map_ctr = pts_3d_map[:,nei:-nei, nei:-nei, :]
pts_3d_map_x0 = pts_3d_map[:,nei:-nei, 0:-(2*nei), :]
pts_3d_map_y0 = pts_3d_map[:,0:-(2*nei), nei:-nei, :]
pts_3d_map_x1 = pts_3d_map[:,nei:-nei, 2*nei:, :]
pts_3d_map_y1 = pts_3d_map[:,2*nei:, nei:-nei, :]
pts_3d_map_x0y0 = pts_3d_map[:,0:-(2*nei), 0:-(2*nei), :]
pts_3d_map_x0y1 = pts_3d_map[:,2*nei:, 0:-(2*nei), :]
pts_3d_map_x1y0 = pts_3d_map[:,0:-(2*nei), 2*nei:, :]
pts_3d_map_x1y1 = pts_3d_map[:,2*nei:, 2*nei:, :]
## generate difference between the central pixel and one of 8 neighboring pixels
diff_x0 = pts_3d_map_ctr - pts_3d_map_x0
diff_x1 = pts_3d_map_ctr - pts_3d_map_x1
diff_y0 = pts_3d_map_y0 - pts_3d_map_ctr
diff_y1 = pts_3d_map_y1 - pts_3d_map_ctr
diff_x0y0 = pts_3d_map_x0y0 - pts_3d_map_ctr
diff_x0y1 = pts_3d_map_ctr - pts_3d_map_x0y1
diff_x1y0 = pts_3d_map_x1y0 - pts_3d_map_ctr
diff_x1y1 = pts_3d_map_ctr - pts_3d_map_x1y1
## flatten the diff to a #pixle by 3 matrix
pix_num = kitti_shape[0] * (kitti_shape[1]-2*nei) * (kitti_shape[2]-2*nei)
diff_x0 = tf.reshape(diff_x0, [pix_num, 3])
diff_y0 = tf.reshape(diff_y0, [pix_num, 3])
diff_x1 = tf.reshape(diff_x1, [pix_num, 3])
diff_y1 = tf.reshape(diff_y1, [pix_num, 3])
diff_x0y0 = tf.reshape(diff_x0y0, [pix_num, 3])
diff_x0y1 = tf.reshape(diff_x0y1, [pix_num, 3])
diff_x1y0 = tf.reshape(diff_x1y0, [pix_num, 3])
diff_x1y1 = tf.reshape(diff_x1y1, [pix_num, 3])
## calculate normal by cross product of two vectors
normals0 = normalize_l2(tf.cross(diff_x1, diff_y1)) #* tf.tile(normals0_mask[:, None], [1,3])
normals1 = normalize_l2(tf.cross(diff_x0, diff_y0)) #* tf.tile(normals1_mask[:, None], [1,3])
normals2 = normalize_l2(tf.cross(diff_x0y1, diff_x0y0)) #* tf.tile(normals2_mask[:, None], [1,3])
normals3 = normalize_l2(tf.cross(diff_x1y0, diff_x1y1)) #* tf.tile(normals3_mask[:, None], [1,3])
normal_vector = tf.reduce_sum(tf.concat([[normals0], [normals1], [normals2], [normals3]], 0),0)
normal_vector = normalize_l2(normals0)
normal_map = tf.reshape(tf.squeeze(normal_vector), [kitti_shape[0]]+[kitti_shape[1]-2*nei]+[kitti_shape[2]-2*nei]+[3])
normal_map *= tf.tile(tf.expand_dims(tf.cast(mask[:, nei:-nei, nei:-nei], tf.float32), -1), [1,1,1,3])
normal_map = tf.pad(normal_map, [[0,0], [nei, nei], [nei, nei], [0,0]] ,"CONSTANT")
return normal_map
def compute_3dpts(pts, intrinsics):
fx, fy, cx, cy = intrinsics[0], intrinsics[1], intrinsics[2], intrinsics[3]
pts_3d = tf.zeros(pts.get_shape().as_list()[:2]+[3])
pts_z = pts
x = tf.range(0, pts.get_shape().as_list()[1])
x = tf.cast(x, tf.float32)
y = tf.range(0, pts.get_shape().as_list()[0])
y = tf.cast(y, tf.float32)
pts_x = (tf.meshgrid(x, y)[0] - tf.ones(pts.get_shape().as_list())*cx) / (tf.ones(pts.get_shape().as_list())*fx) * pts
pts_y = (tf.meshgrid(x, y)[1] - tf.ones(pts.get_shape().as_list())*cy) / (tf.ones(pts.get_shape().as_list())*fy) * pts
pts_3d = tf.concat([[pts_x], [pts_y], [pts_z]], 0)
pts_3d = tf.transpose(pts_3d, perm = [1,2,0])
return pts_3d
def compute_3dpts_batch(pts, intrinsics):
## pts is the depth map of rank3 [batch, h, w], intrinsics is in [batch, 4]
fx, fy, cx, cy = intrinsics[:,0], intrinsics[:,1], intrinsics[:,2], intrinsics[:,3]
pts_shape = pts.get_shape().as_list()
pts_3d = tf.zeros(pts.get_shape().as_list()[:2]+[3])
pts_z = pts
x = tf.range(0, pts.get_shape().as_list()[2])
x = tf.cast(x, tf.float32)
y = tf.range(0, pts.get_shape().as_list()[1])
y = tf.cast(y, tf.float32)
cx_tile = tf.tile(tf.expand_dims(tf.expand_dims(cx, -1), -1), [1, pts_shape[1], pts_shape[2]])
cy_tile = tf.tile(tf.expand_dims(tf.expand_dims(cy, -1), -1), [1, pts_shape[1], pts_shape[2]])
fx_tile = tf.tile(tf.expand_dims(tf.expand_dims(fx, -1), -1), [1, pts_shape[1], pts_shape[2]])
fy_tile = tf.tile(tf.expand_dims(tf.expand_dims(fy, -1), -1), [1, pts_shape[1], pts_shape[2]])
pts_x = (tf.tile(tf.expand_dims(tf.meshgrid(x, y)[0], 0), [pts_shape[0], 1, 1]) - cx_tile) / fx_tile * pts
pts_y = (tf.tile(tf.expand_dims(tf.meshgrid(x, y)[1], 0), [pts_shape[0], 1, 1]) - cy_tile) / fy_tile * pts
pts_3d = tf.concat([[pts_x], [pts_y], [pts_z]], 0)
pts_3d = tf.transpose(pts_3d, perm = [1,2,3,0])
return pts_3d
def normalize_l2(vector):
return tf.nn.l2_normalize(vector, 1, epsilon=1e-20) | 47.97076 | 122 | 0.661465 | 1,477 | 8,203 | 3.372376 | 0.085985 | 0.082313 | 0.115639 | 0.053403 | 0.870508 | 0.847822 | 0.824332 | 0.805862 | 0.805862 | 0.689821 | 0 | 0.069524 | 0.172376 | 8,203 | 171 | 123 | 47.97076 | 0.664163 | 0.119347 | 0 | 0.53125 | 0 | 0 | 0.00501 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039063 | false | 0 | 0.023438 | 0.007813 | 0.101563 | 0.023438 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4fde1bf9e361c8c807ec30e51c5d7e504d3205c0 | 40 | py | Python | Flask_App/run.py | Data-Science-Community-SRM/disease-predictor | bc12e010cbbff7bf7ab0bd4691bef4553e473998 | [
"MIT"
] | 4 | 2021-06-01T15:02:07.000Z | 2022-03-31T04:56:15.000Z | Flask_App/run.py | Data-Science-Community-SRM/disease-predictor | bc12e010cbbff7bf7ab0bd4691bef4553e473998 | [
"MIT"
] | 1 | 2021-06-06T06:40:48.000Z | 2021-06-12T17:35:58.000Z | Flask_App/run.py | Data-Science-Community-SRM/disease-predictor | bc12e010cbbff7bf7ab0bd4691bef4553e473998 | [
"MIT"
] | 2 | 2021-02-09T12:33:54.000Z | 2021-06-06T06:36:47.000Z | #entry point
from Application import A
| 10 | 25 | 0.8 | 6 | 40 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175 | 40 | 3 | 26 | 13.333333 | 0.969697 | 0.275 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8b1548d7e655580428f978518f509f2c2af46e9e | 117 | py | Python | evd_ros_backend/evd_ros_core/src/evd_script/program_nodes/control_flow/conditionals/comparison.py | Wisc-HCI/CoFrame | 7a54344248d80cb316d36aabd40bbd3cdbbc07eb | [
"MIT"
] | null | null | null | evd_ros_backend/evd_ros_core/src/evd_script/program_nodes/control_flow/conditionals/comparison.py | Wisc-HCI/CoFrame | 7a54344248d80cb316d36aabd40bbd3cdbbc07eb | [
"MIT"
] | null | null | null | evd_ros_backend/evd_ros_core/src/evd_script/program_nodes/control_flow/conditionals/comparison.py | Wisc-HCI/CoFrame | 7a54344248d80cb316d36aabd40bbd3cdbbc07eb | [
"MIT"
] | null | null | null | '''
TODO
'''
from .abstract import AbstractConditional
class ComparisonConditional(AbstractConditional):
pass
| 11.7 | 49 | 0.769231 | 9 | 117 | 10 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145299 | 117 | 9 | 50 | 13 | 0.9 | 0.034188 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 6 |
8b436fb9f1050b2276730b96c8a1995991025fb5 | 164 | py | Python | packaging/portable/customize.py | shreyas202/thonny | ef894c359200b0591cf98451907243395b817c63 | [
"MIT"
] | 1 | 2021-10-30T16:56:40.000Z | 2021-10-30T16:56:40.000Z | packaging/portable/customize.py | shreyas202/thonny | ef894c359200b0591cf98451907243395b817c63 | [
"MIT"
] | 13 | 2018-11-15T09:31:06.000Z | 2019-11-22T18:16:54.000Z | packaging/portable/customize.py | shreyas202/thonny | ef894c359200b0591cf98451907243395b817c63 | [
"MIT"
] | 3 | 2018-11-24T14:00:30.000Z | 2019-07-02T02:32:26.000Z | import os.path
import thonny
user_dir = os.path.join(os.path.dirname(__file__), "..", "..", "..", ".thonny")
thonny.THONNY_USER_DIR = os.path.abspath(user_dir) | 32.8 | 80 | 0.676829 | 24 | 164 | 4.291667 | 0.416667 | 0.23301 | 0.252427 | 0.291262 | 0.368932 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109756 | 164 | 5 | 81 | 32.8 | 0.705479 | 0 | 0 | 0 | 0 | 0 | 0.080745 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
8c66049c2825623f67b50052cfff3b75dbad0299 | 40 | py | Python | coop/__init__.py | lionls/coop | 080f6684f198e1db3cbad41dfea24f481d8253f4 | [
"BSD-3-Clause"
] | 15 | 2021-09-14T07:08:31.000Z | 2022-03-21T02:20:40.000Z | coop/__init__.py | lionls/coop | 080f6684f198e1db3cbad41dfea24f481d8253f4 | [
"BSD-3-Clause"
] | 1 | 2022-02-11T11:17:53.000Z | 2022-02-15T16:56:09.000Z | coop/__init__.py | lionls/coop | 080f6684f198e1db3cbad41dfea24f481d8253f4 | [
"BSD-3-Clause"
] | 2 | 2021-09-19T09:35:31.000Z | 2022-03-24T12:33:50.000Z | from . import util
from .vae import VAE
| 13.333333 | 20 | 0.75 | 7 | 40 | 4.285714 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 40 | 2 | 21 | 20 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8ce49102d56cccef36b34dde6d5781cd7a0576dc | 241 | py | Python | orderAhead-BE/models/shipping/__init__.py | futuresea-dev/RetailApp_Flask_react-main | b5adb404fa6b227938d449ea79ed6f8a20692840 | [
"Unlicense"
] | null | null | null | orderAhead-BE/models/shipping/__init__.py | futuresea-dev/RetailApp_Flask_react-main | b5adb404fa6b227938d449ea79ed6f8a20692840 | [
"Unlicense"
] | null | null | null | orderAhead-BE/models/shipping/__init__.py | futuresea-dev/RetailApp_Flask_react-main | b5adb404fa6b227938d449ea79ed6f8a20692840 | [
"Unlicense"
] | null | null | null | from models.postgres_db import Postgres_DB
from models.shipping.zone import ShippingZone
from models.shipping.method import ShippingMethod
from models.shipping.instance import MethodInstance
from models.shipping.facade import ShippingFacade | 40.166667 | 51 | 0.879668 | 31 | 241 | 6.774194 | 0.451613 | 0.238095 | 0.342857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082988 | 241 | 6 | 52 | 40.166667 | 0.950226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
509010dae979619a789aa561df8ce0bb6fb2307d | 136 | py | Python | openapi2oms/exceptions/MappingError.py | microservices/openapi2oms | a22b73427d77b57629786040ca6fa7f1e18853de | [
"MIT"
] | 3 | 2019-10-31T11:31:05.000Z | 2020-01-11T01:43:50.000Z | openapi2oms/exceptions/MappingError.py | microservices/openapi2oms | a22b73427d77b57629786040ca6fa7f1e18853de | [
"MIT"
] | 3 | 2019-10-21T10:31:07.000Z | 2021-10-18T23:44:23.000Z | openapi2oms/exceptions/MappingError.py | microservices/openapi2oms | a22b73427d77b57629786040ca6fa7f1e18853de | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from openapi2oms.exceptions.ConverterError import ConverterError
class MappingError(ConverterError):
pass
| 19.428571 | 64 | 0.764706 | 13 | 136 | 8 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016949 | 0.132353 | 136 | 6 | 65 | 22.666667 | 0.864407 | 0.154412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
5095c907269d9dac35bfaade85a4c96a423be0d1 | 36 | py | Python | College/import_shaz.py | AuGom/Python-Studies | 6fdb41768c7a6304a7e46bf391513161d7028de4 | [
"MIT"
] | null | null | null | College/import_shaz.py | AuGom/Python-Studies | 6fdb41768c7a6304a7e46bf391513161d7028de4 | [
"MIT"
] | null | null | null | College/import_shaz.py | AuGom/Python-Studies | 6fdb41768c7a6304a7e46bf391513161d7028de4 | [
"MIT"
] | null | null | null | import shazan
print(shazan.fat(5))
| 9 | 20 | 0.75 | 6 | 36 | 4.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 0.111111 | 36 | 3 | 21 | 12 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
50a8c73717d4062d1a9aa5bae8e5328d6c753eb8 | 982 | py | Python | tests/test_cli.py | lorenzha/mlflow | 7c7266232273019b99fa0eb72289bf1f5701815b | [
"Apache-2.0"
] | 12 | 2018-08-11T08:25:31.000Z | 2018-08-28T23:41:23.000Z | tests/test_cli.py | kmader/mlflow | 781ce394953c7857b19c84dc937964a806b62798 | [
"Apache-2.0"
] | 17 | 2018-08-11T00:26:26.000Z | 2018-08-29T10:14:17.000Z | tests/test_cli.py | kmader/mlflow | 781ce394953c7857b19c84dc937964a806b62798 | [
"Apache-2.0"
] | 3 | 2018-08-21T15:14:51.000Z | 2019-11-06T23:25:32.000Z | from click.testing import CliRunner
from mock import mock
from mlflow.cli import server
def test_server_static_prefix_validation():
with mock.patch("mlflow.cli._run_server") as run_server_mock:
CliRunner().invoke(server)
run_server_mock.assert_called_once()
with mock.patch("mlflow.cli._run_server") as run_server_mock:
CliRunner().invoke(server, ["--static-prefix", "/mlflow"])
run_server_mock.assert_called_once()
with mock.patch("mlflow.cli._run_server") as run_server_mock:
result = CliRunner().invoke(server, ["--static-prefix", "mlflow/"])
assert "--static-prefix must begin with a '/'." in result.output
run_server_mock.assert_not_called()
with mock.patch("mlflow.cli._run_server") as run_server_mock:
result = CliRunner().invoke(server, ["--static-prefix", "/mlflow/"])
assert "--static-prefix should not end with a '/'." in result.output
run_server_mock.assert_not_called()
| 44.636364 | 76 | 0.700611 | 132 | 982 | 4.939394 | 0.234848 | 0.165644 | 0.159509 | 0.116564 | 0.800614 | 0.800614 | 0.773006 | 0.773006 | 0.773006 | 0.773006 | 0 | 0 | 0.169043 | 982 | 21 | 77 | 46.761905 | 0.79902 | 0 | 0 | 0.444444 | 0 | 0 | 0.239308 | 0.089613 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.055556 | false | 0 | 0.166667 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
50aed01ddeda33dd8798f74b04c8f7a810423427 | 46 | py | Python | src/meltano/core/runner/__init__.py | code-watch/meltano | 2afff73ed43669b5134dacfce61814f7f4e77a13 | [
"MIT"
] | 8 | 2020-06-16T22:29:54.000Z | 2021-06-04T11:57:57.000Z | src/meltano/core/runner/__init__.py | dotmesh-io/meltano | 4616d44ded9dff4e9ad19a9004349e9baa16ddd5 | [
"MIT"
] | 38 | 2019-12-09T06:53:33.000Z | 2022-03-29T22:29:19.000Z | src/meltano/core/runner/__init__.py | aroder/meltano | b8d1d812f4051b6334986fc6b447d23c4d0d5043 | [
"MIT"
] | 2 | 2020-06-16T22:29:59.000Z | 2020-11-04T05:47:50.000Z | class Runner:
def run(self):
pass
| 11.5 | 18 | 0.543478 | 6 | 46 | 4.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.369565 | 46 | 3 | 19 | 15.333333 | 0.862069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0 | 0 | 0.666667 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
50f32af191a97728093ccdf28e00761a86e4b085 | 46 | py | Python | lab-3/lab3_example/models/__init__.py | moevm/db_sql_lab_examples | 01aa4843b59bbddbea739b4c7b4db8958a2f8393 | [
"MIT"
] | 3 | 2021-09-02T21:03:30.000Z | 2021-10-08T13:48:04.000Z | lab-3/lab3_example/models/__init__.py | moevm/db_sql_lab_examples | 01aa4843b59bbddbea739b4c7b4db8958a2f8393 | [
"MIT"
] | null | null | null | lab-3/lab3_example/models/__init__.py | moevm/db_sql_lab_examples | 01aa4843b59bbddbea739b4c7b4db8958a2f8393 | [
"MIT"
] | 1 | 2021-09-05T02:44:19.000Z | 2021-09-05T02:44:19.000Z | from .department import *
from .user import *
| 15.333333 | 25 | 0.73913 | 6 | 46 | 5.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 46 | 2 | 26 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
50fa687e71b7a749362c99fe6c3f6a9f5a9a0970 | 9,533 | py | Python | src/vtra/analysis/failure/road_rail_flooding.py | oi-analytics/oia-transport-archive | f89cb686704fe76c1665697b35d14caccf37f3a1 | [
"PostgreSQL"
] | 1 | 2021-03-31T02:59:50.000Z | 2021-03-31T02:59:50.000Z | src/vtra/analysis/failure/road_rail_flooding.py | oi-analytics/oia-transport-archive | f89cb686704fe76c1665697b35d14caccf37f3a1 | [
"PostgreSQL"
] | null | null | null | src/vtra/analysis/failure/road_rail_flooding.py | oi-analytics/oia-transport-archive | f89cb686704fe76c1665697b35d14caccf37f3a1 | [
"PostgreSQL"
] | 1 | 2022-02-24T16:51:47.000Z | 2022-02-24T16:51:47.000Z |
# -*- coding: utf-8 -*-
"""
Python script to assign commodity flows on the road network
Created on Wed Nov 26 2017
@author: Raghav Pant
"""
import pandas as pd
import os
import psycopg2
import networkx as nx
import csv
from sqlalchemy import create_engine
import subprocess as sp
import operator
import itertools
import copy
import matplotlib.pyplot as plt
from vtra.utils import load_config
def add_column_to_table(table_name, table_match, col_name, col_id, cursor, connection):
sql_query = "alter table %s add column %s double precision"%(table_name,col_name)
cursor.execute(sql_query)
connection.commit()
sql_query = '''
update %s set %s = (select %s from %s as A where %s.%s = A.%s)
'''%(table_name,col_name,col_name,table_match,table_name,col_id,col_id)
cursor.execute(sql_query)
connection.commit()
sql_query = "update %s set %s = 0 where %s is Null"%(table_name,col_name,col_name)
cursor.execute(sql_query)
connection.commit()
def add_column_value_to_table(table_name, col_name, col_id,col_id_val_list, cursor, connection):
sql_query = "alter table %s add column %s integer"%(table_name,col_name)
cursor.execute(sql_query)
connection.commit()
for id_val in col_id_val_list:
c_id = id_val[0]
c_val = id_val[1]
if type(c_id) is str:
sql_query = '''
update %s set %s = %s where %s = '%s'
'''%(table_name,col_name,c_val,col_id,c_id)
else:
sql_query = '''
update %s set %s = %s where %s = %s
'''%(table_name,col_name,c_val,col_id,c_id)
cursor.execute(sql_query)
connection.commit()
sql_query = "update %s set %s = 0 where %s is Null"%(table_name,col_name,col_name)
cursor.execute(sql_query)
connection.commit()
def get_total_edges(sector_region_list):
region_edges_list = []
sector_region_dict = {}
for sid,region in sector_region_list:
if region not in sector_region_dict.keys():
sector_region_dict.update({region:[sid]})
else:
sector_region_dict[region].append(sid)
for region,edges in sector_region_dict.items():
total_edges = len(edges)
region_edges_list.append([region,total_edges])
return region_edges_list
def get_flood_results(flood_list,element_type,sector_type,model_include,flood_depth_min,flood_depth_max,flood_return_period,sector_list,sector_region_list):
flood_percentage_list = []
all_flood_edges = [fr.id for fr in flood_list if fr.network_element == element_type
and fr.sector == sector_type and fr.model in model_include and fr.flood_depth >= flood_depth_min
and fr.flood_depth < flood_depth_max and fr.return_period == flood_return_period]
all_flood_edges = list(set(all_flood_edges))
# print (all_flood_edges)
if sector_type == 'road':
selected_flood_edges = [int(fr) for fr in all_flood_edges if int(fr) in sector_list]
else:
selected_flood_edges = [fr for fr in all_flood_edges if fr in sector_list]
selected_flood_edges = list(set(selected_flood_edges))
percent_sector_flooded = int(100.0*len(selected_flood_edges)/len(sector_list))
# flood_percentage_list.append(('Total',percent_sector_flooded))
flood_percentage_list.append(('Total',len(selected_flood_edges)))
sector_region_dict = {}
for sid,region in sector_region_list:
if region not in sector_region_dict.keys():
sector_region_dict.update({region:{'edge':[sid],'flood_edge':[]}})
if sid in selected_flood_edges:
sector_region_dict[region]['flood_edge'].append(sid)
else:
sector_region_dict[region]['edge'].append(sid)
if sid in selected_flood_edges:
sector_region_dict[region]['flood_edge'].append(sid)
for region,edges in sector_region_dict.items():
total_edges = len(edges['edge'])
flooded_edges = len(edges['flood_edge'])
if total_edges > 0:
percent_flood = int(100.0*flooded_edges/total_edges)
else:
percent_flood = 0
# flood_percentage_list.append((region,percent_flood))
flood_percentage_list.append((region,flooded_edges))
return flood_percentage_list
def get_flood_id_depth(flood_list,element_type,sector_type,model_include,flood_depth_threshold_min,flood_depth_threshold_max,flood_return_period,sector_list,sector_region_list):
sector_region_dict = {}
all_flood_edges = [(fr.id,fr.flood_depth) for fr in flood_list if fr.network_element == element_type
and fr.sector == sector_type and fr.model in model_include and fr.flood_depth >= flood_depth_threshold_min
and fr.flood_depth < flood_depth_threshold_max and fr.return_period == flood_return_period]
all_flood_edges = list(set(all_flood_edges))
# print (all_flood_edges)
if sector_type == 'road':
selected_flood_edges = [(int(fr[0]),fr[1]) for fr in all_flood_edges if int(fr[0]) in sector_list]
else:
selected_flood_edges = [fr for fr in all_flood_edges if fr[0] in sector_list]
# selected_flood_edges = [fr for fr in all_flood_edges if fr[0] in sector_list]
# percent_sector_flooded = int(100.0*len(selected_flood_edges)/len(sector_list))
# flood_percentage_list.append(('Total',percent_sector_flooded))
# flood_percentage_list.append(('Total',len(selected_flood_edges)))
sector_region_dict = {}
for sid,region in sector_region_list:
if region not in sector_region_dict.keys():
sector_region_dict.update({region:{'flood_edge':[]}})
if sid in [s[0] for s in selected_flood_edges]:
fattr = [s for s in selected_flood_edges if s[0] == sid]
if fattr:
fattr_max = max([fa[1] for fa in fattr])
fattr = [sid,fattr_max]
sector_region_dict[region]['flood_edge'].append(fattr)
elif sid in [s[0] for s in selected_flood_edges]:
fattr = [s for s in selected_flood_edges if s[0] == sid]
if fattr:
fattr_max = max([fa[1] for fa in fattr])
fattr = [sid,fattr_max]
sector_region_dict[region]['flood_edge'].append(fattr)
return sector_region_dict
def main():
curdir = os.getcwd()
conf = load_config()
try:
conn = psycopg2.connect(**conf['database'])
except:
print ("I am unable to connect to the database")
curs = conn.cursor()
engine = create_engine('postgresql://{user}:{password}@{host}:{port}/{database}'.format(
**conf['database']
))
layers_ids = [('roads_edge_flood','edge_id','road2009edges','geom'),
('rail_edge_flood','edge_id','railnetworkedges','geom'),
('roads_node_flood','node_id','road2009nodes','geom'),
('rail_node_flood','node_id','railnetworknodes','geom'),
('port_flood','node_id','seaport_nodes','geom'),
('airport_flood','node_id','airport_nodes','geom'),
('provinces_flooding','name_eng','province_level_stats','geom')]
regional_table = 'province_level_stats'
mode_edge_tables = ['rail_edge_flood','roads_edge_flood']
edge_id = 'edge_id'
regional_name = 'name_eng'
col_names = ['level13_vt_100_mask_1','level14_vt_100_mask_1','level15_vt_100_mask_1','level16_vt_100_mask_1']
excel_writer = pd.ExcelWriter('vnm_road_rail_flood_list.xlsx')
for c in col_names:
fl_rg_list = []
for m in mode_edge_tables:
edge_region_list = []
sql_query = '''SELECT A.{0},B.{1} FROM {2} as A, {3} as B where st_intersects(A.geom,B.geom) is True and A.{4} > 0
'''.format(edge_id,regional_name,m,regional_table,c)
curs.execute(sql_query)
read_layer = curs.fetchall()
for row in read_layer:
edge_region_list.append((row[0],row[1]))
tedge_regions = [r[1] for r in edge_region_list]
tedge_regions = sorted(list(set(tedge_regions)))
for rg in tedge_regions:
ed = [r[0] for r in edge_region_list if r[1] == rg]
fl_rg_list.append((rg,{'flood_edge':ed}))
fl_rg_list = sorted(fl_rg_list, key=lambda x: x[0])
df = pd.DataFrame(fl_rg_list,columns = ['region','flood_list'])
df.to_excel(excel_writer,c,index = False)
excel_writer.save()
if __name__ == '__main__':
main()
| 34.046429 | 177 | 0.729676 | 1,528 | 9,533 | 4.221859 | 0.132853 | 0.046504 | 0.047124 | 0.019842 | 0.845295 | 0.821113 | 0.804371 | 0.7929 | 0.7929 | 0.770268 | 0 | 0.015424 | 0.136264 | 9,533 | 279 | 178 | 34.168459 | 0.768035 | 0.060946 | 0 | 0.681592 | 0 | 0.014925 | 0.205932 | 0.043872 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029851 | false | 0.00995 | 0.059701 | 0 | 0.104478 | 0.00995 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0fde45a23e1a535989d5e87508363ba424cb4d43 | 8,382 | py | Python | tests/api/test_prcurve.py | RUrlus/ModelMetricUncertainty | f401a25dd196d6e4edf4901fcfee4b56ebd7c10b | [
"Apache-2.0"
] | null | null | null | tests/api/test_prcurve.py | RUrlus/ModelMetricUncertainty | f401a25dd196d6e4edf4901fcfee4b56ebd7c10b | [
"Apache-2.0"
] | 11 | 2021-12-08T10:34:17.000Z | 2022-01-20T13:40:05.000Z | tests/api/test_prcurve.py | RUrlus/ModelMetricUncertainty | f401a25dd196d6e4edf4901fcfee4b56ebd7c10b | [
"Apache-2.0"
] | null | null | null | import os
import numpy as np
import sklearn.metrics as skm
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
import mmu
from mmu.commons._testing import generate_test_labels
from mmu.commons._testing import greater_equal_tol
def test_PRCMU_from_scores():
"""Test PRMU.from_scores"""
np.random.seed(412)
thresholds = np.linspace(1e-12, 1 - 1e-12, 200)
proba, _, y = generate_test_labels(N=500)
yhat = greater_equal_tol(proba, thresholds[100])
sk_conf_mat = skm.confusion_matrix(y, yhat)
pr_err = mmu.PRCU.from_scores(y=y, scores=proba, thresholds=thresholds)
assert pr_err.conf_mats is not None
assert pr_err.conf_mats.dtype == np.dtype(np.int64)
prec, rec, _, _ = skm.precision_recall_fscore_support(
y, yhat, zero_division=0.0 # type: ignore
)
assert pr_err.chi2_scores.shape == (
pr_err.rec_grid.size,
pr_err.rec_grid.size
)
assert np.isclose(pr_err.precision[100], prec[1])
assert np.isclose(pr_err.recall[100], rec[1])
assert np.array_equal(pr_err.conf_mats[100], sk_conf_mat.flatten())
def test_PRCMU_from_confusion_matrices():
"""Test PRMU.from_scores"""
np.random.seed(412)
thresholds = np.linspace(1e-12, 1 - 1e-12, 200)
proba, _, y = generate_test_labels(N=500)
yhat = greater_equal_tol(proba, thresholds[100])
sk_conf_mat = skm.confusion_matrix(y, yhat)
conf_mats = mmu.confusion_matrices_thresholds(y, proba, thresholds)
pr_err = mmu.PRCU.from_confusion_matrices(conf_mats=conf_mats)
assert pr_err.conf_mats is not None
assert pr_err.conf_mats.dtype == np.dtype(np.int64)
prec, rec, _, _ = skm.precision_recall_fscore_support(
y, yhat, zero_division=0.0 # type: ignore
)
assert pr_err.chi2_scores.shape == (
pr_err.rec_grid.size,
pr_err.rec_grid.size
)
assert np.isclose(pr_err.precision[100], prec[1])
assert np.isclose(pr_err.recall[100], rec[1])
assert np.array_equal(pr_err.conf_mats[100], sk_conf_mat.flatten())
def test_PRCMU_from_classifier():
"""Test PREU.from_classifier"""
# generate seeds to be used by sklearn
# do not use this in real scenarios,
# it's a convenience only used in the tutorial notebooks
seeds = mmu.commons.utils.SeedGenerator(234)
# generate 2 class dataset
X, y = make_classification(
n_samples=1000, n_classes=2, random_state=seeds()
)
# split into train/test sets
X_train, X_test, y_train, y_test = train_test_split(
X, y, test_size=0.5, random_state=seeds()
)
# fit a model
model = LogisticRegression(solver='lbfgs')
model.fit(X_train, y_train)
# predict probabilities, for the positive outcome only
y_scores = model.predict_proba(X_test)[:, 1]
thresholds = np.linspace(1e-12, 1 - 1e-12, 200)
yhat = greater_equal_tol(y_scores, thresholds[100])
sk_conf_mat = skm.confusion_matrix(y_test, yhat)
pr_err = mmu.PRCU.from_classifier(
clf=model, X=X_test, y=y_test, thresholds=thresholds
)
assert pr_err.conf_mats is not None
assert pr_err.conf_mats.dtype == np.dtype(np.int64)
prec, rec, _, _ = skm.precision_recall_fscore_support(
y_test, yhat, zero_division=0.0 # type: ignore
)
assert pr_err.chi2_scores.shape == (
pr_err.rec_grid.size,
pr_err.rec_grid.size
)
assert np.isclose(pr_err.precision[100], prec[1])
assert np.isclose(pr_err.recall[100], rec[1])
assert np.array_equal(pr_err.conf_mats[100], sk_conf_mat.flatten())
def test_PRCEU_from_scores():
"""Test PRMU.from_scores"""
np.random.seed(412)
thresholds = np.linspace(1e-12, 1 - 1e-12, 200)
proba, _, y = generate_test_labels(N=500)
yhat = greater_equal_tol(proba, thresholds[100])
sk_conf_mat = skm.confusion_matrix(y, yhat)
pr_err = mmu.PRCU.from_scores(
y=y, scores=proba, thresholds=thresholds, method='bvn'
)
assert pr_err.conf_mats is not None
assert pr_err.conf_mats.dtype == np.dtype(np.int64)
prec, rec, _, _ = skm.precision_recall_fscore_support(
y, yhat, zero_division=0.0 # type: ignore
)
assert pr_err.chi2_scores.shape == (
pr_err.rec_grid.size,
pr_err.rec_grid.size
)
assert np.isclose(pr_err.precision[100], prec[1])
assert np.isclose(pr_err.recall[100], rec[1])
assert np.array_equal(pr_err.conf_mats[100], sk_conf_mat.flatten())
def test_PRCEU_from_confusion_matrices():
"""Test PRMU.from_scores"""
np.random.seed(412)
thresholds = np.linspace(1e-12, 1 - 1e-12, 200)
proba, _, y = generate_test_labels(N=500)
yhat = greater_equal_tol(proba, thresholds[100])
sk_conf_mat = skm.confusion_matrix(y, yhat)
conf_mats = mmu.confusion_matrices_thresholds(y, proba, thresholds)
pr_err = mmu.PRCU.from_confusion_matrices(conf_mats=conf_mats, method='bvn')
assert pr_err.conf_mats is not None
assert pr_err.conf_mats.dtype == np.dtype(np.int64)
prec, rec, _, _ = skm.precision_recall_fscore_support(
y, yhat, zero_division=0.0 # type: ignore
)
assert pr_err.chi2_scores.shape == (
pr_err.rec_grid.size,
pr_err.rec_grid.size
)
assert np.isclose(pr_err.precision[100], prec[1])
assert np.isclose(pr_err.recall[100], rec[1])
assert np.array_equal(pr_err.conf_mats[100], sk_conf_mat.flatten())
def test_PRCEU_from_classifier():
"""Test PREU.from_classifier"""
# generate seeds to be used by sklearn
# do not use this in real scenarios,
# it's a convenience only used in the tutorial notebooks
seeds = mmu.commons.utils.SeedGenerator(234)
# generate 2 class dataset
X, y = make_classification(
n_samples=1000, n_classes=2, random_state=seeds()
)
# split into train/test sets
X_train, X_test, y_train, y_test = train_test_split(
X, y, test_size=0.5, random_state=seeds()
)
# fit a model
model = LogisticRegression(solver='lbfgs')
model.fit(X_train, y_train)
# predict probabilities, for the positive outcome only
y_scores = model.predict_proba(X_test)[:, 1]
thresholds = np.linspace(1e-12, 1 - 1e-12, 200)
yhat = greater_equal_tol(y_scores, thresholds[100])
sk_conf_mat = skm.confusion_matrix(y_test, yhat)
pr_err = mmu.PRCU.from_classifier(
clf=model, X=X_test, y=y_test, thresholds=thresholds, method='bvn'
)
assert pr_err.conf_mats is not None
assert pr_err.conf_mats.dtype == np.dtype(np.int64)
prec, rec, _, _ = skm.precision_recall_fscore_support(
y_test, yhat, zero_division=0.0 # type: ignore
)
assert pr_err.chi2_scores.shape == (
pr_err.rec_grid.size,
pr_err.rec_grid.size
)
assert np.isclose(pr_err.precision[100], prec[1])
assert np.isclose(pr_err.recall[100], rec[1])
assert np.array_equal(pr_err.conf_mats[100], sk_conf_mat.flatten())
def test_PRU_from_scores_with_train():
"""Test PREU.from_classifier"""
ref_path = os.path.join(
os.path.abspath(os.path.dirname(__file__)),
'train_reference_sets.npz'
)
ll = np.load(ref_path)
y_test = ll.get('y_test')
y_score = ll.get('y_score')
scores_bs = ll.get('scores_bs')
pr_err = mmu.PRCU.from_scores_with_train(
y_test,
scores=y_score,
scores_bs=scores_bs,
obs_axis=0
)
exp_shape = pr_err.thresholds.size, scores_bs.shape[1]
assert pr_err.train_precisions.shape == exp_shape
assert pr_err.train_recalls.shape == exp_shape
assert pr_err.train_conf_mats.shape == (
pr_err.thresholds.size, scores_bs.shape[1], 4
)
exp_cov_mat_shape = pr_err.thresholds.size, 4
assert pr_err.train_cov_mats.shape == exp_cov_mat_shape
assert pr_err.train_cov_mats.shape == exp_cov_mat_shape
assert pr_err.total_cov_mats.shape == exp_cov_mat_shape
# reference sets are computed with threshold 0.4999443530277085 which
# corresponds to idx: 418
ref_set_test = [0.00310889, 0.00112212, 0.00112212, 0.00258438]
ref_set_train = [0.00142558, -0.00041831, -0.00041831, 0.00189029]
assert np.allclose(pr_err.cov_mats[418].flatten(), ref_set_test, rtol=5e-3)
assert np.allclose(pr_err.train_cov_mats[418].flatten(), ref_set_train, rtol=5e-3)
| 34.780083 | 85 | 0.694703 | 1,299 | 8,382 | 4.203233 | 0.12933 | 0.06044 | 0.048352 | 0.042857 | 0.868132 | 0.845238 | 0.832784 | 0.817399 | 0.817399 | 0.80348 | 0 | 0.04859 | 0.192198 | 8,382 | 240 | 86 | 34.925 | 0.757791 | 0.098306 | 0 | 0.613636 | 0 | 0 | 0.008666 | 0.0032 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.039773 | false | 0 | 0.051136 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0fe02768dea51a841faddbe3c056f65a68ebab0c | 62,344 | py | Python | functional_tests/sample/test_list.py | SCiO-systems/qcat | 8c2b8e07650bc2049420fa6de758fba7e50c2f28 | [
"Apache-2.0"
] | null | null | null | functional_tests/sample/test_list.py | SCiO-systems/qcat | 8c2b8e07650bc2049420fa6de758fba7e50c2f28 | [
"Apache-2.0"
] | null | null | null | functional_tests/sample/test_list.py | SCiO-systems/qcat | 8c2b8e07650bc2049420fa6de758fba7e50c2f28 | [
"Apache-2.0"
] | null | null | null | import pytest
from django.contrib.auth.models import Group
from django.urls import reverse
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from unittest.mock import patch
from elasticmock import elasticmock
from functional_tests.base import FunctionalTest
from apps.accounts.models import User
from apps.questionnaire.models import Questionnaire
from apps.sample.tests.test_views import (
route_home,
route_questionnaire_list,
route_questionnaire_filter,
route_questionnaire_new_step)
from apps.search.tests.test_index import create_temp_indices
from functional_tests.pages.sample import SampleDetailPage, SampleEditPage, \
SampleStepPage, SampleListPage
@elasticmock
class ListTest(FunctionalTest):
fixtures = [
'global_key_values',
'flags',
'sample',
'unccd',
'sample_questionnaires_5',
'sample_projects',
'sample_institutions',
]
def setUp(self):
super(ListTest, self).setUp()
self.url_questionnaire_filter_sample = self.live_server_url + reverse(
route_questionnaire_filter) + '?type=sample'
create_temp_indices([('sample', '2015'), ('unccd', '2015')])
def test_list_is_available(self):
# She goes to the list page and sees all 4 questionnaires available.
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# The entries are also ordered and each entry contains Keys 1 and 5 of
# the questionnaires.
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 3")]')
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//p['
'text()="Faz 3"]')
link = self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[2]//a['
'contains(text(), "Foo 4")]')
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[2]//p['
'text()="Faz 4"]')
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[3]//a['
'contains(text(), "Foo 2")]')
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[3]//p['
'text()="Faz 2"]')
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[4]//a['
'contains(text(), "Foo 1")]')
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[4]//p['
'text()="Faz 1"]')
# Each entry has a placeholder because there is no image
for e in list_entries:
self.findBy(
'xpath', '//img[contains(@src, "picture.svg")]', base=e)
# She clicks a link and sees she is taken to the detail page of the
# questionnaire
link.click()
self.toggle_all_sections()
self.checkOnPage('Key 3')
# The button to create a summary is on the page.
self.findBy('xpath', '//a[@href="{summary_url}"]'.format(
summary_url=reverse('questionnaire_summary', args=[4])
))
def test_pagination(self):
# Alice goes to the list view
self.browser.get(self.url_questionnaire_filter_sample)
# She sees 4 entries
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# She does not see the pagination because there are not enough entries
self.findByNot('xpath', '//div[contains(@class, "pagination")]')
# She adds a limit to the URL and sees it is used to narrow the results
self.browser.get(self.url_questionnaire_filter_sample + '&limit=1')
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 3")]')
# Now the pagination is visible
self.findBy('xpath', '//div[contains(@class, "pagination")]')
pagination = self.findManyBy(
'xpath', '//ul[contains(@class, "pagination")]/li')
self.assertEqual(len(pagination), 6)
# She goes to the next page
self.findBy('xpath', '//li[@class="arrow"]/a').click()
WebDriverWait(self.browser, 10).until(
EC.invisibility_of_element_located(
(By.CLASS_NAME, "loading-indicator")))
pagination = self.findManyBy(
'xpath', '//ul[contains(@class, "pagination")]/li')
self.assertEqual(len(pagination), 6)
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 4")]')
# She adds a filter
self.add_advanced_filter('qg_11__key_14', 'value_14_3')
# She sees that the list was filtered and that she is back on the
# first page
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 3")]')
pagination = self.findManyBy(
'xpath', '//ul[contains(@class, "pagination")]/li')
self.assertEqual(len(pagination), 4)
# She goes to the next page and sees the filter persists
self.findBy('xpath', '//li[@class="arrow"]/a').click()
WebDriverWait(self.browser, 10).until(
EC.invisibility_of_element_located(
(By.CLASS_NAME, "loading-indicator")))
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 4")]')
pagination = self.findManyBy(
'xpath', '//ul[contains(@class, "pagination")]/li')
self.assertEqual(len(pagination), 4)
url = self.browser.current_url
item_1 = self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])').text
filter_1 = self.findBy('xpath', '//div[@id="active-filters"]/div').text
pagination_1 = self.findBy(
'xpath', '//ul[contains(@class, "pagination")]').text
# She opens the current URL directly and sees that she is taken to the
# exact same page
self.browser.get(url)
item_2 = self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])').text
filter_2 = self.findBy('xpath', '//div[@id="active-filters"]/div').text
pagination_2 = self.findBy(
'xpath', '//ul[contains(@class, "pagination")]').text
self.assertEqual(item_1, item_2)
self.assertEqual(filter_1, filter_2)
self.assertEqual(pagination_1, pagination_2)
# She removes the specific filter and is back on the first page;
# all results are available again (still limited to one entry per page)
self.remove_filter(0)
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 3")]')
pagination = self.findManyBy(
'xpath', '//ul[contains(@class, "pagination")]/li')
self.assertEqual(len(pagination), 6)
# She adds another filter
self.add_advanced_filter('qg_11__key_14', 'value_14_1')
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 4")]')
pagination = self.findManyBy(
'xpath', '//ul[contains(@class, "pagination")]/li')
self.assertEqual(len(pagination), 4)
# She goes to the second page
self.findBy('xpath', '//li[@class="arrow"]/a').click()
WebDriverWait(self.browser, 10).until(
EC.invisibility_of_element_located(
(By.CLASS_NAME, "loading-indicator")))
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 1")]')

@patch('apps.questionnaire.views.get_configuration_index_filter')
def test_list_with_foreign_configuration(self, mock_config_index_filter):
mock_config_index_filter.return_value = ['*']
# Alice goes to the list and sees that there are 5 questionnaires
# available (one in UNCCD config)
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 5)
expected_list_order = [
{
'name': 'Foo 3',
'description': 'Faz 3',
},
{
'name': 'UNCCD practice 1',
'description': 'This is the description of the first UNCCD practice.',
},
{
'name': 'Foo 4',
'description': 'Faz 4',
},
{
'name': 'Foo 2',
'description': 'Faz 2',
},
{
'name': 'Foo 1',
'description': 'Faz 1',
},
]
self.check_list_results(expected_list_order)

def test_list_database_es(self):
# Alice goes to the WOCAT list and sees that the list (retrieved from
# Elasticsearch) also contains metadata information and is
# practically identical to the one on the landing page
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
entry_xpath = '//article[contains(@class, "tech-item")][2]'
creation = self.findBy(
'xpath', '{}//time'.format(entry_xpath))
self.assertEqual(creation.text, '02/13/2014 5:08 p.m.')
compiler = self.findBy(
'xpath', '{}//li[contains(text(), "Compiler")]'.format(
entry_xpath))
self.assertIn('Foo Bar', compiler.text)
# She also sees that the first entry has one compiler and one editor
# but only the compiler is shown in the list
entry_xpath = '//article[contains(@class, "tech-item")][1]'
compiler = self.findBy(
'xpath', '{}//li[contains(text(), "Compiler")]'.format(
entry_xpath))
self.assertIn('Foo Bar', compiler.text)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 3")]').click()
info = self.findBy('xpath', '//ul[@class="tech-infos"]')
self.assertIn('Foo Bar', info.text)
self.assertIn('Faz Taz', info.text)

def test_filter_checkbox(self):
# Alice goes to the list view
self.browser.get(self.url_questionnaire_filter_sample)
# She sees there are 4 Questionnaires in the list
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# The number of Questionnaires is also indicated in the title
count = self.findBy(
'xpath', '//h2/span[@id="questionnaire-count"]')
self.assertEqual(count.text, '4')
# There is only one active filter set
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 1)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
# She filters by some value
self.add_advanced_filter('qg_11__key_14', 'value_14_2')
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 2")]')
# The number of Questionnaires in the title is updated
count = self.findBy(
'xpath', '//h2/span[@id="questionnaire-count"]')
self.assertEqual(count.text, '1')
# The filter was added to the list of active filters
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Key 14: Value 14_2')
# She unchecks the checkbox and updates the filter
self.toggle_selected_advanced_filters(display=True)
self.findBy('xpath',
'//input[@name="filter-value-select" and '
'@value="value_14_2" and @checked="checked"]',
wait=True).click()
self.apply_filter()
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 1)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
# She selects the first checkbox and updates the filter
self.add_advanced_filter('qg_11__key_14', 'value_14_1')
expected_list = [
{
'title': 'Foo 4'
},
{
'title': 'Foo 1'
}
]
self.check_list_results(expected_list)
# The number of Questionnaires in the title is updated
count = self.findBy(
'xpath', '//h2/span[@id="questionnaire-count"]')
self.assertEqual(count.text, '2')
# The filter was added to the list of active filters
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Key 14: Value 14_1')
# She also selects value 3 and updates the filter
self.toggle_selected_advanced_filters(display=True)
self.findBy('xpath',
'//input[@name="filter-value-select" and '
'@value="value_14_3"]', wait=True).click()
self.apply_filter()
WebDriverWait(self.browser, 10).until(
EC.invisibility_of_element_located(
(By.CLASS_NAME, "loading-indicator")))
# The filters are joined by OR, therefore there are now more results
expected_list = [
{
'title': 'Foo 3'
},
{
'title': 'Foo 4'
},
{
'title': 'Foo 1'
}
]
self.check_list_results(expected_list)
# The active filter was updated in the list of active filters
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Key 14: Value 14_1 / Value 14_3')
self.remove_filter(0)
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# Only the type filter remains active
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 1)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
# She adds a filter again
self.add_advanced_filter('qg_11__key_14', 'value_14_3')
# There are now two active filters set
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Key 14: Value 14_3')
# She swaps the filter value: she checks value 14_2 and unchecks value 14_3
self.toggle_selected_advanced_filters(display=True)
self.findBy('xpath',
'//input[@name="filter-value-select" and '
'@value="value_14_2"]', wait=True).click()
self.findBy('xpath',
'//input[@name="filter-value-select" and '
'@value="value_14_3"]').click()
self.apply_filter()
# Again, there are two active filters set
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Key 14: Value 14_2')
# She reloads the same page and sees the correct checkbox was selected
self.browser.get(self.browser.current_url)
self.toggle_selected_advanced_filters(display=True)
self.findBy('xpath',
'//input[@name="filter-value-select" and '
'@value="value_14_2" and @checked="checked"]', wait=True).click()

# def test_filter_flags(self):
#
# # Alice goes to the list view
# self.browser.get(self.live_server_url + reverse(
# route_questionnaire_list))
#
# # She sees there are 4 Questionnaires in the list
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 4)
#
# # There is no active filter set
# active_filter_panel = self.findBy(
# 'xpath', '//div[@id="active-filters"]/div')
# self.assertFalse(active_filter_panel.is_displayed())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 0)
#
# # She sees a link for advanced filtering which opens the filter
# # panel
# filter_panel = self.findBy('id', 'search-advanced-options')
# self.assertFalse(filter_panel.is_displayed())
# self.findBy('link_text', 'Advanced filter').click()
# self.assertTrue(filter_panel.is_displayed())
#
# # She expands the flag section
# self.findBy('id', 'filter-flags-heading').click()
#
# # She sees a checkbox to filter by flag
# WebDriverWait(self.browser, 10).until(
# EC.visibility_of_element_located(
# (By.ID, "flag_unccd_bp")))
#
# url = self.browser.current_url
# # She submits the filter and sees the flag values were not submitted
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
# self.assertEqual(self.browser.current_url, '{}?'.format(url))
#
# # She clicks the UNCCD flag checkbox
# flag_cb = self.findBy('id', 'flag_unccd_bp')
# flag_cb.click()
#
# # She submits the filter
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
#
# # She sees that the filter was submitted and the results are filtered.
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 1)
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
# 'contains(text(), "Foo 4")]')
#
# # The filter is in the list of active filters
# active_filter_panel = self.findBy(
# 'xpath', '//div[@id="active-filters"]/div')
# self.assertTrue(active_filter_panel.is_displayed())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 1)
# filter_1 = self.findBy('xpath', '//div[@id="active-filters"]//li[1]')
# self.assertEqual(filter_1.text, 'UNCCD Best Practice')
#
# # The checkbox is checked
# self.assertTrue(flag_cb.is_selected())
#
# # She clears the filter
# self.findBy('xpath', '(//a[@class="remove-filter"])[1]').click()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
#
# # She sees the active filter is gone, checkbox is not checked and the
# # results are all there again
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 4)
# self.assertFalse(flag_cb.is_selected())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 0)
#
# # She applies the filter once again
# flag_cb.click()
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 1)
#
# # She reloads the page
# url = self.browser.current_url
# self.browser.get(url)
#
# # She sees that the results are correctly filtered
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 1)
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
# 'contains(text(), "Foo 4")]')
#
# # The filter is in the list of active filters
# active_filter_panel = self.findBy(
# 'xpath', '//div[@id="active-filters"]/div')
# self.assertTrue(active_filter_panel.is_displayed())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 1)
# filter_1 = self.findBy('xpath', '//div[@id="active-filters"]//li[1]')
# self.assertEqual(filter_1.text, 'UNCCD Best Practice')
#
# # The checkbox is checked
# flag_cb = self.findBy('id', 'flag_unccd_bp')
# self.assertTrue(flag_cb.is_selected())
#
# # She clicks "filter" again and sees the filter is not added twice.
# url = self.browser.current_url
# self.findBy('link_text', 'Advanced filter').click()
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
# self.assertEqual(self.browser.current_url, url)
# def test_filter_dates(self):
#
# # Alice goes to the list view
# self.browser.get(self.live_server_url + reverse(
# route_questionnaire_list))
#
# # She sees there are 4 Questionnaires in the list
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 4)
#
# # There is no active filter set
# active_filter_panel = self.findBy(
# 'xpath', '//div[@id="active-filters"]/div')
# self.assertFalse(active_filter_panel.is_displayed())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 0)
#
# # She sees a link for advanced filtering which opens the filter
# # panel
# filter_panel = self.findBy('id', 'search-advanced-options')
# self.assertFalse(filter_panel.is_displayed())
# self.findBy('link_text', 'Advanced filter').click()
# self.assertTrue(filter_panel.is_displayed())
#
# # She opens the date filter section
# self.findBy('id', 'filter-dates-heading').click()
#
# WebDriverWait(self.browser, 10).until(
# EC.visibility_of_element_located(
# (By.CLASS_NAME, "filter-created")))
#
# # She sees a slider to filter by creation date
# leftLabel = self.findBy(
# 'xpath', '//span[contains(@class, "filter-created") and '
# 'contains(@class, "leftLabel")]')
# self.assertEqual(leftLabel.text, '2000')
# rightLabel = self.findBy(
# 'xpath', '//span[contains(@class, "filter-created") and '
# 'contains(@class, "rightLabel")]')
# self.assertEqual(rightLabel.text, '2016')
#
# url = self.browser.current_url
#
# # She submits the filter and sees the slider values were not submitted
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
# self.assertEqual(self.browser.current_url, '{}?'.format(url))
#
# # She "changes" the slider
# created_slider_min = self.findBy(
# 'xpath', '//input[contains(@class, "filter-created") and '
# 'contains(@class, "min")]')
# self.changeHiddenInput(created_slider_min, '2014')
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
#
# # She sees that the filter was submitted in the url and the results
# # are filtered
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 1)
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
# 'contains(text(), "Foo 4")]')
#
# # The filter was added to the list of active filters
# active_filter_panel = self.findBy(
# 'xpath', '//div[@id="active-filters"]/div')
# self.assertTrue(active_filter_panel.is_displayed())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 1)
# filter_1 = self.findBy('xpath', '//div[@id="active-filters"]//li[1]')
# self.assertEqual(filter_1.text, 'Created: 2014 - 2016')
#
# # She also sets a filter for the updated year
# updated_slider_max = self.findBy(
# 'xpath', '//input[contains(@class, "filter-updated") and '
# 'contains(@class, "max")]')
# self.changeHiddenInput(updated_slider_max, '2012')
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
#
# # As the filters are joined by OR, there are now more results
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 3)
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
# 'contains(text(), "Foo 4")]')
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[2]//a['
# 'contains(text(), "Foo 2")]')
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[3]//a['
# 'contains(text(), "Foo 1")]')
#
# filter_1 = self.findBy('xpath', '//div[@id="active-filters"]//li[1]')
# self.assertEqual(filter_1.text, 'Created: 2014 - 2016')
# filter_2 = self.findBy('xpath', '//div[@id="active-filters"]//li[2]')
# self.assertEqual(filter_2.text, 'Updated: 2000 - 2012')
#
# # She removes the first filter (creation date), 2 entries show up
# self.findBy('xpath', '(//a[@class="remove-filter"])[1]').click()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
#
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 2)
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
# 'contains(text(), "Foo 2")]')
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[2]//a['
# 'contains(text(), "Foo 1")]')
#
# # She hits the button to remove all filters
# self.findBy('id', 'filter-reset').click()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
#
# # She sees there are 4 Questionnaires in the list
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 4)
#
# # There is no active filter set
# active_filter_panel = self.findBy(
# 'xpath', '//div[@id="active-filters"]/div')
# self.assertFalse(active_filter_panel.is_displayed())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 0)
#
# # She sets a filter again and reloads the page
# created_slider_min = self.findBy(
# 'xpath', '//input[contains(@class, "filter-created") and '
# 'contains(@class, "min")]')
# self.changeHiddenInput(created_slider_min, '2012')
# created_slider_max = self.findBy(
# 'xpath', '//input[contains(@class, "filter-created") and '
# 'contains(@class, "max")]')
# self.changeHiddenInput(created_slider_max, '2013')
#
# created_left_handle = self.findBy(
# 'xpath', '//div[contains(@class, "leftGrip") and contains(@class, '
# '"filter-created")]')
# self.assertEqual(
# created_left_handle.get_attribute('style'), 'left: 0px;')
#
# self.apply_filter()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
# url = self.browser.current_url
# self.browser.get(url)
#
# self.findBy('link_text', 'Advanced filter').click()
# self.findBy('id', 'filter-dates-heading').click()
# WebDriverWait(self.browser, 10).until(
# EC.visibility_of_element_located(
# (By.CLASS_NAME, "filter-created")))
#
# # She sees the filter is set
# list_entries = self.findManyBy(
# 'xpath', '//article[contains(@class, "tech-item")]')
# self.assertEqual(len(list_entries), 2)
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
# 'contains(text(), "Foo 3")]')
# self.findBy(
# 'xpath', '(//article[contains(@class, "tech-item")])[2]//a['
# 'contains(text(), "Foo 2")]')
#
# # The filter was added to the list of active filters
# active_filter_panel = self.findBy(
# 'xpath', '//div[@id="active-filters"]/div')
# self.assertTrue(active_filter_panel.is_displayed())
# active_filters = self.findManyBy(
# 'xpath', '//div[@id="active-filters"]//li')
# self.assertEqual(len(active_filters), 1)
# filter_1 = self.findBy('xpath', '//div[@id="active-filters"]//li[1]')
# self.assertEqual(filter_1.text, 'Created: 2012 - 2013')
#
# # She sees the slider is set to match the current filter
# created_left_handle = self.findBy(
# 'xpath', '//div[contains(@class, "leftGrip") and contains(@class, '
# '"filter-created")]')
# self.assertNotEqual(
# created_left_handle.get_attribute('style'), 'left: 0px;')
#
# # She removes the filter and sees that the slider position has
# # been reset.
# self.findBy('xpath', '(//a[@class="remove-filter"])[1]').click()
# WebDriverWait(self.browser, 10).until(
# EC.invisibility_of_element_located(
# (By.CLASS_NAME, "loading-indicator")))
# time.sleep(1)
# self.assertEqual(
# created_left_handle.get_attribute('style'), 'left: 0px;')
#
# # She clicks "filter" again (slider not being set) and sees that
# # nothing is happening.
# # She submits the filter and sees the slider values were not submitted
# self.apply_filter()
# self.assertEqual(
# self.browser.current_url, '{}?'.format(
# self.live_server_url + reverse(route_questionnaire_list)))
# time.sleep(0.5)

def test_filter_project(self):
# Alice goes to the list view
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
# She sees there are 4 Questionnaires in the list
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# There is no active filter set
active_filter_panel = self.findBy(
'xpath', '//div[@id="active-filters"]/div')
self.assertFalse(active_filter_panel.is_displayed())
active_filters = self.findManyBy(
'xpath', '//div[@id="active-filters"]//li')
self.assertEqual(len(active_filters), 0)
# She sees a chosen dropdown to filter by project
project_filter = self.findBy(
'xpath', '//div/label[@for="filter-project"]/../div'
)
url = self.browser.current_url
# She submits the filter and sees no values were submitted
self.apply_filter()
self.assertEqual(self.browser.current_url, '{}?type=sample'.format(url))
# She enters a project
project_filter = self.findBy(
'xpath', '//div/label[@for="filter-project"]/../div'
)
project_filter.click()
project_filter.find_element_by_xpath(
'//ul[@class="chosen-results"]/li[text()="The first Project (TFP)"]'
).click()
self.apply_filter()
# She sees that the filter was submitted in the url and the results
# are filtered
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 3")]')
# The filter was added to the list of active filters
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertIn('The first Project (TFP)', active_filters[1].text)
# She hits the button to remove all filters
self.remove_filter(index=None)
# She sees there are 4 Questionnaires in the list
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# There is no active filter left
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 0)
# She sets a filter again and reloads the page
project_filter = self.findBy(
'xpath', '//div/label[@for="filter-project"]/../div'
)
project_filter.click()
project_filter.find_element_by_xpath(
'//ul[@class="chosen-results"]/li[text()="The first Project (TFP)"]'
).click()
self.apply_filter()
url = self.browser.current_url
self.browser.get(url)
# She sees the filter is set
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//a['
'contains(text(), "Foo 3")]')
# The filter was added to the list of active filters
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertIn('The first Project (TFP)', active_filters[1].text)
# She sees the text in the input field matches the project
project_filter = self.findBy(
'xpath', '//div/label[@for="filter-project"]/../div/a/span'
)
self.assertEqual(
project_filter.text,
'The first Project (TFP)'
)
# She removes the filter and sees that the input field has been
# reset
self.remove_filter(index=None)
project_filter = self.findBy(
'xpath', '//div/label[@for="filter-project"]/../div/a/span'
)
self.assertEqual(project_filter.text, 'Select or type a project name')
# She clicks "filter" again and sees that nothing is happening.
# She submits the filter and sees no values were submitted
self.apply_filter()
self.assertEqual(
self.browser.current_url, '{}?type=sample'.format(
self.live_server_url + reverse(route_questionnaire_list)))

def test_filter_search(self):
search_term = '4'
# Alice goes to the list view
self.browser.get(self.url_questionnaire_filter_sample)
# She sees there are 4 Questionnaires in the list
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# There is only the type filter active
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 1)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.add_advanced_filter('qg_11__key_14', 'value_14_1')
# She sees the results are filtered
expected_list = [
{
'title': 'Foo 4'
},
{
'title': 'Foo 1'
}
]
self.check_list_results(expected_list)
# She also searches for a word
self.findBy('xpath', '//input[@type="search"]').send_keys(search_term)
self.apply_filter()
# She sees that both filters are applied, they are joined by AND
expected_list = [
{
'title': 'Foo 4'
}
]
self.check_list_results(expected_list)
# The filter was added to the list of active filters
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 3)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Key 14: Value 14_1')
self.assertEqual(active_filters[2].text, f'Search Terms: {search_term}')
# She removes one filter
self.remove_filter(index=0)
# The filters are updated
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, f'Search Terms: {search_term}')
# The results are filtered
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
# She goes to the home page
self.browser.get(self.live_server_url + reverse(route_home))
# She enters a search term there
self.findBy('xpath', '//input[@type="search"]').send_keys(search_term)
self.set_input_value('search-type', 'sample')
self.findBy('id', 'submit-search').click()
# She sees she is taken to the list view with the filter set
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[1].text, 'SLM Data: sample')
self.assertEqual(active_filters[0].text, f'Search Terms: {search_term}')
# The results are filtered
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
# She goes to the advanced filter and sees the filter is still active
# there
self.open_advanced_filter('sample')
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, f'Search Terms: {search_term}')
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 1)
# She adds a second filter
self.add_advanced_filter('qg_11__key_14', 'value_14_1')
# The results are filtered
expected_list = [
{
'title': 'Foo 4'
}
]
self.check_list_results(expected_list)
# The filter was added to the list of active filters
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 3)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Key 14: Value 14_1')
self.assertEqual(active_filters[2].text, f'Search Terms: {search_term}')
# She removes all filters
self.remove_filter(index=None)
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# There is only one active filter left
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 1)
# The search field is empty
search_field = self.findBy('xpath', '//input[@type="search"]')
self.assertEqual(search_field.get_attribute('value'), '')
# She searches again by keyword
search_field.send_keys(search_term)
self.apply_filter()
# The filter is set correctly
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, f'Search Terms: {search_term}')
# She refreshes the page and sees the text in the search bar did not
# change
self.browser.get(self.browser.current_url)
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, f'Search Terms: {search_term}')
search_field = self.findBy('xpath', '//input[@type="search"]')
self.assertEqual(search_field.get_attribute('value'), search_term)
# She removes the filter and sees it works.
self.remove_filter(index=None)
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 1)
search_field = self.findBy('xpath', '//input[@type="search"]')
self.assertEqual(search_field.get_attribute('value'), '')
# She searches again, this time entering special characters
special_chars = ';ö$ ,_%as[df)'
search_field.send_keys(special_chars)
self.apply_filter()
# The filter is set correctly
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(
active_filters[1].text, f'Search Terms: {special_chars}')
# She refreshes the page and sees the text in the search bar did not
# change
self.browser.get(self.browser.current_url)
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(
active_filters[1].text, f'Search Terms: {special_chars}')
search_field = self.findBy('xpath', '//input[@type="search"]')
self.assertEqual(search_field.get_attribute('value'), special_chars)
# She removes the filter and sees it works.
self.remove_filter(index=None)
active_filters = self.get_active_filters(has_any=True)
self.assertEqual(len(active_filters), 1)
search_field = self.findBy('xpath', '//input[@type="search"]')
self.assertEqual(search_field.get_attribute('value'), '')
def test_filter_country(self):
# Alice goes to the list view
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
# She sees there are 4 Questionnaires in the list
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# There is no active filter set
self.get_active_filters(has_any=False)
# She sees a chosen container to filter by country and opens it
country_filter = self.findBy(
'xpath', '//div/label[@for="filter-country"]/../div')
country_filter.click()
url = self.browser.current_url
# She submits the filter and sees no values were submitted
self.apply_filter()
self.assertEqual(self.browser.current_url, '{}?type=sample'.format(url))
# She opens the country filter again and selects the value for
# Switzerland
country_filter = self.findBy(
'xpath', '//div/label[@for="filter-country"]/../div')
country_filter.click()
country_filter.find_element_by_xpath(
'//ul[@class="chosen-results"]/li[text()="Switzerland"]'
).click()
self.apply_filter()
# She sees that the filter was submitted in the url and the results
# are filtered
expected_list = [
{
'title': 'Foo 4'
},
{
'title': 'Foo 2'
},
{
'title': 'Foo 1'
}
]
self.check_list_results(expected_list)
# The filter was added to the list of active filters
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Country: Switzerland')
# She hits the button to remove all filters
self.remove_filter(index=None)
# She sees there are 4 Questionnaires in the list
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 4)
# There is no active filter set
self.get_active_filters(has_any=False)
# She sets a filter again and reloads the page
country_filter = self.findBy(
'xpath', '//div/label[@for="filter-country"]/../div')
country_filter.click()
country_filter.find_element_by_xpath(
'//ul[@class="chosen-results"]/li[text()="Afghanistan"]'
).click()
self.apply_filter()
url = self.browser.current_url
self.browser.get(url)
# She sees the filter is set
expected_list = [
{
'title': 'Foo 3'
}
]
self.check_list_results(expected_list)
# The filter was added to the list of active filters
active_filters = self.get_active_filters()
self.assertEqual(len(active_filters), 2)
self.assertEqual(active_filters[0].text, 'SLM Data: sample')
self.assertEqual(active_filters[1].text, 'Country: Afghanistan')
# She sees the text in the input field matches the country
selected_country = self.findBy(
'class_name', 'chosen-single'
).find_element_by_tag_name('span')
self.assertEqual(
selected_country.text,
'Afghanistan'
)
# She removes the filter and sees that the input field has been
# reset
self.remove_filter(index=None)
selected_country = self.findBy(
'class_name', 'chosen-single'
).find_element_by_tag_name('span')
self.assertEqual(selected_country.text, 'Select or type a country name')
# She clicks "filter" again and sees that nothing is happening.
# She submits the filter and sees no values were submitted
self.apply_filter()
self.assertEqual(
self.browser.current_url, '{}?type=sample'.format(
self.live_server_url + reverse(route_questionnaire_list)))
@elasticmock
class ListTestLinks(FunctionalTest):
fixtures = [
'groups_permissions',
'global_key_values',
'sample',
'samplemulti',
'sample_samplemulti_questionnaires',
]
"""
1
configurations: sample
code: sample_1
key_1 (en): This is the first key
links: 3
2
configurations: sample
code: sample_2
key_1 (en): Foo
links: -
3
configurations: samplemulti
code: samplemulti_1
key_1 (en): This is key 1a
links: 1
4
configurations: samplemulti
code: samplemulti_2
key_1 (en): This is key 1b
links: -
"""
def setUp(self):
super(ListTestLinks, self).setUp()
create_temp_indices([('sample', '2015'), ('samplemulti', '2015')])
def test_list_displays_links_user_questionnaires(self):
user_alice = User.objects.get(pk=101)
# Alice logs in
self.doLogin(user=user_alice)
# She enters a new questionnaire with some basic data
self.browser.get(self.live_server_url + reverse(
route_questionnaire_new_step,
kwargs={'identifier': 'new', 'step': 'cat_1'}))
self.findBy('name', 'qg_1-0-original_key_1').send_keys('Foo')
self.findBy('name', 'qg_1-0-original_key_3').send_keys('Bar')
self.submit_form_step()
# She also links another questionnaire
self.wait_for(
'xpath',
'//a[contains(@href, "/edit/") and contains(@href, "cat_5")]'
)
self.click_edit_section('cat_5')
self.findBy(
'xpath', '//input[contains(@class, "link-search-field")]'
'[1]').send_keys('key')
self.wait_for('xpath', '//li[@class="ui-menu-item"]')
self.findBy(
'xpath',
'//li[@class="ui-menu-item"]//strong[text()="This is key 1a"'
']').click()
self.submit_form_step()
# She goes to the page where she can see her own questionnaires
self.clickUserMenu(user_alice)
self.findBy(
'xpath', '//li[contains(@class, "has-dropdown")]/ul/li/a['
'contains(@href, "accounts/questionnaires")]').click()
# She sees the newly created questionnaire and that it contains only one
# link
self.findBy('xpath', '(//article[contains(@class, "tech-item")])[1]//a'
'[contains(text(), "Foo")]')
link_count = self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//ul['
'contains(@class, "tech-attached")]/li/a')
self.assertEqual(link_count.text, '') # No number
def test_list_displays_links_search(self):
# Alice logs in and creates a new questionnaire with some basic data and
# a linked questionnaire
user_alice = User.objects.get(pk=101)
user_alice.groups = [Group.objects.get(pk=3), Group.objects.get(pk=4)]
self.doLogin(user=user_alice)
# She submits and publishes the questionnaire
self.browser.get(self.live_server_url + reverse(
route_questionnaire_new_step,
kwargs={'identifier': 'new', 'step': 'cat_1'}))
self.findBy('name', 'qg_1-0-original_key_1').send_keys('Foo')
self.findBy('name', 'qg_1-0-original_key_3').send_keys('Bar')
self.submit_form_step()
btn = '//a[contains(@href, "/edit/") and contains(@href, "cat_5")]'
self.wait_for('xpath', btn)
self.click_edit_section('cat_5')
self.findBy(
'xpath', '//input[contains(@class, "link-search-field")]'
'[1]').send_keys('key')
self.wait_for('xpath', '//li[@class="ui-menu-item"]')
self.findBy(
'xpath',
'//li[@class="ui-menu-item"]//strong[text()="This is key 1a"'
']').click()
self.submit_form_step()
self.review_action('submit')
self.review_action('review')
self.review_action('publish')
# She goes to the SAMPLE search page and sees the questionnaires. These
# are: 3 (newly created, with link), 2 and 1 (with link)
self.browser.get(
self.live_server_url + reverse(route_questionnaire_list))
list_entries = self.findManyBy(
'xpath', '//article[contains(@class, "tech-item")]')
self.assertEqual(len(list_entries), 3)
# She sees that the first one contains a link to SAMPLEMULTI. The link
# count is only one!
link_count_1 = self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[1]//ul['
'contains(@class, "tech-attached")]/li/a')
self.assertEqual(link_count_1.text, '')
# Same for the third entry of the list
link_count_3 = self.findBy(
'xpath', '(//article[contains(@class, "tech-item")])[3]//ul['
'contains(@class, "tech-attached")]/li/a')
self.assertEqual(link_count_3.text, '')
@elasticmock
class ListTestStatus(FunctionalTest):
fixtures = [
'groups_permissions',
'global_key_values',
'sample',
'sample_questionnaire_status',
]
def setUp(self):
super(ListTestStatus, self).setUp()
create_temp_indices([('sample', '2015')])
# def test_unknown_name(self):
# user = create_new_user()
# user.groups = [
# Group.objects.get(pk=3), Group.objects.get(pk=4)]
# user.save()
#
# # Alice logs in
# self.doLogin(user=user)
#
# # She creates an empty questionnaire
# self.browser.get(self.live_server_url + reverse(
# route_questionnaire_new_step,
# kwargs={'identifier': 'new', 'step': 'cat_0'}))
# self.submit_form_step()
#
# url = self.browser.current_url
#
# # She goes to "my data" and sees it is listed there, no name indicated
# self.browser.get(self.live_server_url + reverse(
# accounts_route_questionnaires, kwargs={'user_id': user.id}))
#
# self.findByNot('xpath', '//a[contains(text(), "{}")]'.format(
# "{'en': 'Unknown name'}"))
#
# # She publishes the questionnaire
# self.browser.get(url)
# self.wait_for('class_name', 'review-panel-content')
#
# self.review_action('submit')
# self.review_action('review')
# self.review_action('publish')
#
# # She goes to the list and sees the data is listed there, "unknown name"
# # correctly displayed
# self.browser.get(self.live_server_url + reverse(
# route_questionnaire_list))
#
# self.findByNot('xpath', '//a[contains(text(), "{}")]'.format(
# "{'en': 'Unknown name'}"))
def test_list_status_public(self):
# She goes to the list view and sees the same questionnaires
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
expected_results = [
{
'title': 'Foo 5',
'status': 'public'
},
{
'title': 'Foo 3',
'status': 'public'
},
]
self.check_list_results(expected_results)
def test_list_status_logged_in(self):
# Alice logs in as user 1.
user = User.objects.get(pk=101)
self.doLogin(user=user)
# Design change: Home also shows only public questionnaires
# She goes to the list view and sees only the public
# questionnaires
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
expected_results = [
{
'title': 'Foo 5',
'status': 'public'
},
{
'title': 'Foo 3',
'status': 'public'
},
]
self.check_list_results(expected_results)
# She logs in as user 2
user = User.objects.get(pk=102)
self.doLogin(user=user)
# Design change: Home also shows only public questionnaires
# She goes to the list view and sees only the public
# questionnaires
self.browser.get(self.live_server_url + reverse(
route_questionnaire_list))
self.check_list_results(expected_results)
def test_list_shows_only_one_public(self):
code = 'sample_3'
old_key_1 = 'Foo 3'
new_key_1 = 'asdf'
old_questionnaire_count = Questionnaire.objects.count()
# Alice logs in and goes to the detail page of a "public" Questionnaire
user = User.objects.get(pk=101)
detail_page = SampleDetailPage(self)
detail_page.route_kwargs = {'identifier': code}
detail_page.open(login=True, user=user)
assert code in self.browser.current_url
assert detail_page.has_text(old_key_1)
assert not detail_page.has_text(new_key_1)
# She edits the Questionnaire and sees that the URL contains the
# code of the Questionnaire
detail_page.create_new_version()
edit_page = SampleEditPage(self)
edit_page.click_edit_category('cat_1')
assert code in self.browser.current_url
step_page = SampleStepPage(self)
step_page.enter_text(
step_page.LOC_FORM_INPUT_KEY_1, new_key_1, clear=True)
step_page.submit_step()
# She sees that the value of Key 1 was updated
assert not edit_page.has_text(old_key_1)
assert edit_page.has_text(new_key_1)
# Also there was an additional version created in the database
assert Questionnaire.objects.count() == old_questionnaire_count + 1
# The newly created version has the same code
assert Questionnaire.objects.filter(code=code).count() == 2
# She goes to the list view and sees only two entries: Foo 5 and Foo 3
list_page = SampleListPage(self)
list_page.open()
list_results = [
{
'title': 'Foo 5'
},
{
'title': old_key_1,
}
]
list_page.check_list_results(list_results)
# She goes to the detail page of the questionnaire and sees the
# draft version.
list_page.click_list_entry(index=1)
detail_page.expand_details()
assert not detail_page.has_text(old_key_1)
assert detail_page.has_text(new_key_1)
# She submits the questionnaire
detail_page.submit_questionnaire()
# In the DB, there is one active version
db_q = Questionnaire.objects.filter(code=code, status=4)
assert db_q.count() == 1
db_q_id = db_q[0].id
# Bob (the moderator) logs in
# The moderator publishes the questionnaire
user_moderator = User.objects.get(pk=105)
detail_page.open(login=True, user=user_moderator)
detail_page.review_questionnaire()
detail_page.publish_questionnaire()
# In the DB, there is still only one active version but it has a
# different ID now.
db_q = Questionnaire.objects.filter(code=code, status=4)
assert db_q.count() == 1
assert db_q_id != db_q[0].id
# He goes to the list view and sees only two entries: Foo 5 and asdf.
list_page = SampleListPage(self)
list_page.open()
list_results = [
{
'title': 'Foo 5'
},
{
'title': new_key_1
},
]
list_page.check_list_results(list_results)
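The tests above repeatedly verify the rendered result list through a `check_list_results` helper. A minimal sketch of the matching logic it implies is below; this is hypothetical (the real helper lives in `FunctionalTest` and scrapes titles out of the page via Selenium), but it shows the contract the tests rely on: each expected entry is a dict holding only the keys that should be checked, compared positionally against the scraped rows.

```python
def check_list_results(expected, actual):
    # Compare positionally; each expected dict may hold only a subset of keys.
    assert len(expected) == len(actual), (
        'expected %d entries, got %d' % (len(expected), len(actual)))
    for pos, (want, got) in enumerate(zip(expected, actual), start=1):
        for key, value in want.items():
            assert got.get(key) == value, (
                'entry %d: %r is %r, expected %r'
                % (pos, key, got.get(key), value))

scraped = [
    {'title': 'Foo 5', 'status': 'public'},
    {'title': 'asdf', 'status': 'public'},
]
check_list_results([{'title': 'Foo 5'}, {'title': 'asdf'}], scraped)
print('list matches')  # prints "list matches" when every expected entry is found
```

Checking only the supplied keys is what lets the tests assert titles alone in some places and title plus status in others.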
# epytope/Data/pssms/smmpmbec/mat/B_40_01_10.py (christopher-mohr/epytope, BSD-3-Clause)
B_40_01_10 = {0: {'A': -0.23, 'C': 0.103, 'E': 0.416, 'D': 0.968, 'G': -0.421, 'F': -0.63, 'I': -0.342, 'H': -0.095, 'K': -0.066, 'M': -0.447, 'L': 0.081, 'N': 0.265, 'Q': 0.026, 'P': 1.352, 'S': -0.141, 'R': -0.049, 'T': -0.097, 'W': -0.163, 'V': -0.19, 'Y': -0.339}, 1: {'A': 0.398, 'C': -0.468, 'E': -1.235, 'D': 0.014, 'G': -0.084, 'F': -0.712, 'I': 0.499, 'H': -0.251, 'K': 0.529, 'M': 0.349, 'L': 0.584, 'N': -0.057, 'Q': 0.14, 'P': 0.521, 'S': -0.123, 'R': 0.213, 'T': 0.009, 'W': -0.538, 'V': 0.589, 'Y': -0.378}, 2: {'A': 0.151, 'C': 0.444, 'E': 0.022, 'D': 0.359, 'G': 0.371, 'F': -0.584, 'I': -0.486, 'H': -0.829, 'K': 0.771, 'M': -0.299, 'L': 0.222, 'N': -0.217, 'Q': 0.265, 'P': 0.207, 'S': 0.106, 'R': -0.054, 'T': -0.063, 'W': -0.215, 'V': -0.093, 'Y': -0.078}, 3: {'A': -0.03, 'C': -0.018, 'E': -0.012, 'D': -0.016, 'G': 0.016, 'F': 0.005, 'I': 0.024, 'H': -0.009, 'K': 0.017, 'M': 0.015, 'L': 0.021, 'N': -0.002, 'Q': 0.007, 'P': -0.03, 'S': 0.013, 'R': -0.008, 'T': 0.001, 'W': -0.003, 'V': 0.003, 'Y': 0.005}, 4: {'A': -0.036, 'C': -0.009, 'E': -0.001, 'D': 0.084, 'G': -0.041, 'F': -0.176, 'I': -0.105, 'H': -0.002, 'K': 0.028, 'M': -0.077, 'L': -0.033, 'N': -0.024, 'Q': 0.054, 'P': 0.27, 'S': -0.039, 'R': 0.051, 'T': 0.029, 'W': 0.01, 'V': 0.022, 'Y': -0.005}, 5: {'A': -0.119, 'C': -0.051, 'E': 0.107, 'D': 0.091, 'G': -0.045, 'F': -0.1, 'I': 0.106, 'H': -0.01, 'K': -0.102, 'M': 0.136, 'L': 0.044, 'N': 0.131, 'Q': 0.257, 'P': 0.02, 'S': -0.152, 'R': -0.156, 'T': -0.083, 'W': 0.025, 'V': -0.084, 'Y': -0.015}, 6: {'A': 0.046, 'C': -0.025, 'E': 0.009, 'D': 0.002, 'G': -0.014, 'F': -0.096, 'I': -0.033, 'H': 0.011, 'K': 0.097, 'M': -0.036, 'L': -0.038, 'N': -0.051, 'Q': 0.015, 'P': 0.031, 'S': 0.031, 'R': 0.111, 'T': 0.033, 'W': -0.059, 'V': 0.005, 'Y': -0.039}, 7: {'A': -0.027, 'C': 0.008, 'E': -0.011, 'D': -0.028, 'G': 0.059, 'F': 0.008, 'I': -0.122, 'H': 0.101, 'K': 0.046, 'M': -0.0, 'L': -0.065, 'N':
0.023, 'Q': 0.058, 'P': -0.044, 'S': -0.013, 'R': 0.138, 'T': -0.049, 'W': 0.005, 'V': -0.174, 'Y': 0.088}, 8: {'A': -0.152, 'C': 0.025, 'E': -0.062, 'D': 0.059, 'G': -0.034, 'F': 0.196, 'I': -0.274, 'H': 0.11, 'K': -0.041, 'M': 0.067, 'L': -0.01, 'N': 0.096, 'Q': -0.001, 'P': -0.134, 'S': -0.095, 'R': 0.035, 'T': -0.046, 'W': 0.127, 'V': -0.081, 'Y': 0.215}, 9: {'A': 0.292, 'C': 0.207, 'E': -0.16, 'D': 0.128, 'G': 0.447, 'F': -0.183, 'I': -0.427, 'H': 0.192, 'K': 0.24, 'M': -0.641, 'L': -1.284, 'N': -0.105, 'Q': -0.219, 'P': 0.192, 'S': 0.369, 'R': 0.637, 'T': 0.236, 'W': 0.302, 'V': -0.363, 'Y': 0.14}, -1: {'con': 4.74752}}
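The matrix above maps peptide position to residue to a regression coefficient, with the intercept stored under key -1 as 'con'. A common convention for SMMPMBEC-style matrices (an assumption here; epytope's predictor code is the authoritative reference) is to score a peptide as the sum of per-position coefficients plus the intercept, interpreted as log10(IC50 in nM). A self-contained sketch with a toy two-position matrix:

```python
def score_peptide(matrix, peptide):
    # Assumed scoring rule: intercept (key -1, 'con') plus one coefficient
    # per peptide position; lower scores mean stronger predicted binding.
    total = matrix[-1]['con']
    for pos, aa in enumerate(peptide):
        total += matrix[pos][aa]
    return total

# Toy matrix in the same shape as B_40_01_10, not real data.
toy = {0: {'A': -0.2, 'E': 0.4}, 1: {'L': 0.5, 'A': 0.1}, -1: {'con': 2.0}}
log_ic50 = score_peptide(toy, 'AL')        # 2.0 - 0.2 + 0.5 = 2.3
print(round(log_ic50, 3), round(10 ** log_ic50, 1))  # -> 2.3 199.5
```

Exponentiating the score recovers the IC50 in nanomolar under that interpretation.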
# modules/Bak.July2013/XhhC.py (TrentFranks/ssNMR-Topspin-Python, MIT)
"""
Modules to Set default parameters:
W.T. Franks FMP Berlin
"""
import de.bruker.nmr.mfw.root as root
import math
import TopCmds
import IntShape
import Setup
p90H=2.5
ampH=4.0
p90C=3.0
ampC=0.0
p90N=5.0
ampN=-2.0
MAS =10000.0
def CH(units):
#48,49
p90H=float(TopCmds.GETPAR("P 3"))
ampH=float(TopCmds.GETPAR("PLdB 2"))
p90C=float(TopCmds.GETPAR("P 1"))
ampC=float(TopCmds.GETPAR("PLdB 1"))
MAS =float(TopCmds.GETPAR("CNST 31"))
SPH =TopCmds.GETPAR2("SPNAM 48")
SPH0=TopCmds.GETPAR2("SPNAM 40")
CNCT =float(TopCmds.GETPAR("P 48"))
CNCT0=float(TopCmds.GETPAR("P 15"))
MaxB1H=1000000./4./p90H
MaxB1C=1000000./4./p90C
if CNCT <= 1.00 : CNCT = CNCT0
if SPH == "gauss" or SPH == "None" or SPH == "" :
SPH = SPH0
TopCmds.PUTPAR("SPNAM 48",SPH)
TopCmds.XCMD("spnam48")
SPH=(TopCmds.GETPAR2("SPNAM 48"))
SPH.join()
Hav = IntShape.Integrate(SPH)
Hav0 = IntShape.Integrate(SPH0)
Hint = 0.01*((Hav)**2)/Hav0
#TopCmds.MSG("Hint "+str(Hint))
Hamp0=float(TopCmds.GETPAR("SPdB 40"))
Camp0=float(TopCmds.GETPAR("SPdB 41"))
B1H = MaxB1H*Hint*math.pow(10,(ampH-Hamp0)/20.)
B1C = MaxB1C*math.pow(10,(ampC-Camp0)/20.)
index = TopCmds.INPUT_DIALOG("C-H CP", "Contact and Ramp", \
["Proton B1 Field","H Ramp","Carbon B1 Field","Contact Time(P48)"],\
[str('%3.0f' %B1H),SPH,str('%3.0f' %B1C),str(CNCT)],\
["kHz","","kHz","us"],\
["1","1","1","1"],\
["Accept","Close"], ['a','c'], 10)
SP=index[1]
CNCT=float(index[3])
adjust=20*(math.log10(float(index[0])/MaxB1H))
Hamp1 = ampH-adjust
AvgAmp=IntShape.Integrate(index[1])/100.
adjust=20*(math.log10(1./AvgAmp))
Hamp = Hamp1-adjust
adjust=20*(math.log10(float(index[2])/MaxB1C))
Camp = ampC-adjust
if units == "W":
Hamp=Setup.dBtoW(Hamp)
Camp=Setup.dBtoW(Camp)
value = TopCmds.SELECT("Adjusting the CH CP parameters:",\
"This will set\n 1H power to: " + str('%3.2f' %Hamp)+" "+ units+"\n \
13C power to: " +str('%3.2f' %Camp) + units,["Update", "Keep Previous"])
if value != 1:
TopCmds.PUTPAR("SP"+units+" 48",str('%3.2f' %Hamp))
TopCmds.PUTPAR("SP"+units+" 49",str('%3.2f' %Camp))
TopCmds.PUTPAR("PL"+units+" 48",str('%3.2f' %Hamp))
TopCmds.PUTPAR("PL"+units+" 49",str('%3.2f' %Camp))
TopCmds.PUTPAR("P 48" ,str('%3.2f' %CNCT))
TopCmds.PUTPAR("SPNAM 48",SP)
def HC(units):
#44,45
p90H=float(TopCmds.GETPAR("P 3"))
ampH=float(TopCmds.GETPAR("PLdB 2"))
p90C=float(TopCmds.GETPAR("P 1"))
ampC=float(TopCmds.GETPAR("PLdB 1"))
MAS =float(TopCmds.GETPAR("CNST 31"))
SPH =TopCmds.GETPAR2("SPNAM 44")
SPH0=TopCmds.GETPAR2("SPNAM 40")
CNCT =float(TopCmds.GETPAR("P 44"))
CNCT0=float(TopCmds.GETPAR("P 15"))
MaxB1H=1000000./4./p90H
MaxB1C=1000000./4./p90C
if CNCT <= 1.00 : CNCT = CNCT0
if SPH == "gauss" or SPH == "None" or SPH == "" :
SPH = SPH0
TopCmds.PUTPAR("SPNAM 44",SPH)
TopCmds.XCMD("spnam44")
SPH=(TopCmds.GETPAR2("SPNAM 44"))
SPH.join()
Hav = IntShape.Integrate(SPH)
Hav0 = IntShape.Integrate(SPH0)
Hint = 0.01*((Hav)**2)/Hav0
#TopCmds.MSG("Hint "+str(Hint))
Hamp0=float(TopCmds.GETPAR("SPdB 40"))
Camp0=float(TopCmds.GETPAR("SPdB 41"))
B1H = MaxB1H*Hint*math.pow(10,(ampH-Hamp0)/20.)
B1C = MaxB1C*math.pow(10,(ampC-Camp0)/20.)
index = TopCmds.INPUT_DIALOG("H-C CP", "Contact and Ramp", \
["Proton B1 Field","H Ramp","Carbon B1 Field","Contact Time(P44)"],\
[str('%3.0f' %B1H),SPH,str('%3.0f' %B1C),str(CNCT)],\
["kHz","","kHz","us"],\
["1","1","1","1"],\
["Accept","Close"], ['a','c'], 10)
SP=index[1]
CNCT=float(index[3])
adjust=20*(math.log10(float(index[0])/MaxB1H))
Hamp1 = ampH-adjust
AvgAmp=IntShape.Integrate(index[1])/100.
adjust=20*(math.log10(1./AvgAmp))
Hamp = Hamp1-adjust
adjust=20*(math.log10(float(index[2])/MaxB1C))
Camp = ampC-adjust
if units == "W":
Hamp=Setup.dBtoW(Hamp)
Camp=Setup.dBtoW(Camp)
value = TopCmds.SELECT("Adjusting the CH CP parameters:",\
"This will set\n 1H power to: " + str('%3.2f' %Hamp)+" "+units+"\n \
13C power to: " +str('%3.2f' %Camp) + units,["Update", "Keep Previous"])
if value != 1:
TopCmds.PUTPAR("SP"+units+" 44",str('%3.2f' %Hamp))
TopCmds.PUTPAR("SP"+units+" 45",str('%3.2f' %Camp))
TopCmds.PUTPAR("PL"+units+" 44",str('%3.2f' %Hamp))
TopCmds.PUTPAR("PL"+units+" 45",str('%3.2f' %Camp))
TopCmds.PUTPAR("P 44" ,str('%3.2f' %CNCT))
TopCmds.PUTPAR("SPNAM 44",SP)
def NH(units):
#46,47
p90H=float(TopCmds.GETPAR("P 3"))
ampH=float(TopCmds.GETPAR("PLdB 2"))
p90C=float(TopCmds.GETPAR("P 1"))
ampC=float(TopCmds.GETPAR("PLdB 3"))
MAS =float(TopCmds.GETPAR("CNST 31"))
SPH =TopCmds.GETPAR2("SPNAM 46")
SPH0=TopCmds.GETPAR2("SPNAM 40")
CNCT =float(TopCmds.GETPAR("P 46"))
CNCT0=float(TopCmds.GETPAR("P 15"))
MaxB1H=1000000./4./p90H
MaxB1C=1000000./4./p90C
if CNCT <= 1.00 : CNCT = CNCT0
if SPH == "gauss" or SPH == "None" or SPH == "" :
SPH = SPH0
TopCmds.PUTPAR("SPNAM 46",SPH)
TopCmds.XCMD("spnam46")
SPH=(TopCmds.GETPAR2("SPNAM 46"))
SPH.join()
Hav = IntShape.Integrate(SPH)
Hav0 = IntShape.Integrate(SPH0)
Hint = 0.01*((Hav)**2)/Hav0
#TopCmds.MSG("Hint "+str(Hint))
Hamp0=float(TopCmds.GETPAR("SPdB 42"))
Camp0=float(TopCmds.GETPAR("SPdB 43"))
B1H = MaxB1H*Hint*math.pow(10,(ampH-Hamp0)/20.)
B1C = MaxB1C*math.pow(10,(ampC-Camp0)/20.)
index = TopCmds.INPUT_DIALOG("N-H CP", "Contact and Ramp", \
["Proton B1 Field","H Ramp","Nitrogen B1 Field","Contact Time(P46)"],\
[str('%3.0f' %B1H),SPH,str('%3.0f' %B1C),str(CNCT)],\
["kHz","","kHz","us"],\
["1","1","1","1"],\
["Accept","Close"], ['a','c'], 10)
SP=index[1]
CNCT=float(index[3])
adjust=20*(math.log10(float(index[0])/MaxB1H))
Hamp1 = ampH-adjust
AvgAmp=IntShape.Integrate(index[1])/100.
adjust=20*(math.log10(1./AvgAmp))
Hamp = Hamp1-adjust
adjust=20*(math.log10(float(index[2])/MaxB1C))
Camp = ampC-adjust
if units == "W":
Hamp=Setup.dBtoW(Hamp)
Camp=Setup.dBtoW(Camp)
value = TopCmds.SELECT("Adjusting the NH CP parameters:",\
"This will set\n 1H power to: " + str('%3.2f' %Hamp)+ " "+units+"\n \
15N power to: " +str('%3.2f' %Camp) + units,["Update", "Keep Previous"])
if value != 1:
TopCmds.PUTPAR("SP"+units+" 46",str('%3.2f' %Hamp))
TopCmds.PUTPAR("SP"+units+" 47",str('%3.2f' %Camp))
TopCmds.PUTPAR("PL"+units+" 46",str('%3.2f' %Hamp))
TopCmds.PUTPAR("PL"+units+" 47",str('%3.2f' %Camp))
TopCmds.PUTPAR("P 46" ,str('%3.2f' %CNCT))
TopCmds.PUTPAR("SPNAM 46",SP)
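The CH, HC, and NH routines above all share the same decibel arithmetic: a 90-degree pulse of p90 microseconds at the reference power corresponds to a maximum B1 field of 1e6/(4*p90) Hz, and retuning to a target B1 shifts the attenuation by 20*log10(target/max). A standalone sketch of that round trip (function names here are illustrative, not part of the TopCmds API):

```python
import math

def max_b1_hz(p90_us):
    # 90-degree pulse length (us) at the reference power level -> B1 in Hz.
    return 1e6 / 4.0 / p90_us

def db_for_target(amp_ref_db, target_hz, max_hz):
    # Bruker convention: a larger dB value means more attenuation, so
    # halving the B1 target raises the setting by about 6.02 dB.
    return amp_ref_db - 20.0 * math.log10(target_hz / max_hz)

max_h = max_b1_hz(2.5)  # 2.5 us p90 -> 100 kHz
print(round(max_h), round(db_for_target(4.0, max_h / 2, max_h), 2))  # -> 100000 10.02
```

This is the same relation the routines apply in `adjust = 20*(math.log10(target/MaxB1))` followed by `amp - adjust`, just factored into named helpers.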
# __init__.py (samesense/hgnc, MIT)
from hgnc import *
# howfast_apm/__init__.py (HowFast/apm-python, MIT)
from .flask import HowFastFlaskMiddleware
# ai/message_pb2.py (giongto35/Gowog, MIT)
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: message.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='message.proto',
package='Message_proto',
syntax='proto3',
serialized_options=None,
serialized_pb=_b('\n\rmessage.proto\x12\rMessage_proto\x1a\x1fgoogle/protobuf/timestamp.proto\"\xac\x03\n\x11\x43lientGameMessage\x12\x1d\n\x15input_sequence_number\x18\x01 \x01(\x05\x12\x14\n\x0ctime_elapsed\x18\x02 \x01(\x02\x12\x38\n\x13init_player_payload\x18\x03 \x01(\x0b\x32\x19.Message_proto.InitPlayerH\x00\x12<\n\x15move_position_payload\x18\x04 \x01(\x0b\x32\x1b.Message_proto.MovePositionH\x00\x12-\n\rshoot_payload\x18\x05 \x01(\x0b\x32\x14.Message_proto.ShootH\x00\x12\x36\n\x12\x62uild_wall_payload\x18\x06 \x01(\x0b\x32\x18.Message_proto.BuildWallH\x00\x12<\n\x15update_player_payload\x18\x07 \x01(\x0b\x32\x1b.Message_proto.UpdatePlayerH\x00\x12:\n\x14set_position_payload\x18\x08 \x01(\x0b\x32\x1a.Message_proto.SetPositionH\x00\x42\t\n\x07message\"\x14\n\x04Ping\x12\x0c\n\x04ping\x18\x01 \x01(\t\"/\n\x0bSetPosition\x12\n\n\x02id\x18\x01 \x01(\x05\x12\t\n\x01x\x18\x02 \x01(\x02\x12\t\n\x01y\x18\x03 \x01(\x02\"2\n\x0cMovePosition\x12\n\n\x02id\x18\x01 \x01(\x05\x12\n\n\x02\x64x\x18\x02 \x01(\x02\x12\n\n\x02\x64y\x18\x03 \x01(\x02\"`\n\nInitPlayer\x12\n\n\x02id\x18\x01 \x01(\x05\x12\x11\n\tclient_id\x18\x02 \x01(\x05\x12\t\n\x01x\x18\x03 \x01(\x02\x12\t\n\x01y\x18\x04 \x01(\x02\x12\x0c\n\x04name\x18\x05 \x01(\t\x12\x0f\n\x07is_main\x18\x06 \x01(\x08\"c\n\x03Map\x12\r\n\x05\x62lock\x18\x01 \x03(\x05\x12\x10\n\x08num_cols\x18\x02 \x01(\x05\x12\x10\n\x08num_rows\x18\x03 \x01(\x05\x12\x13\n\x0b\x62lock_width\x18\x04 \x01(\x02\x12\x14\n\x0c\x62lock_height\x18\x05 \x01(\x02\"_\n\x07InitAll\x12.\n\x0binit_player\x18\x01 \x03(\x0b\x32\x19.Message_proto.InitPlayer\x12$\n\x08init_map\x18\x02 \x01(\x0b\x32\x12.Message_proto.Map\"\x1a\n\x0cRemovePlayer\x12\n\n\x02id\x18\x01 \x01(\x05\"%\n\x10RegisterClientID\x12\x11\n\tclient_id\x18\x01 \x01(\x05\"\x9b\x01\n\x05Shoot\x12\n\n\x02id\x18\x01 \x01(\x03\x12\x11\n\tplayer_id\x18\x02 \x01(\x05\x12\t\n\x01x\x18\x03 \x01(\x02\x12\t\n\x01y\x18\x04 \x01(\x02\x12\n\n\x02\x64x\x18\x05 \x01(\x02\x12\n\n\x02\x64y\x18\x06 
\x01(\x02\x12,\n\x04type\x18\x07 \x01(\x0e\x32\x1e.Message_proto.Shoot.ShootType\"\x17\n\tShootType\x12\n\n\x06NORMAL\x10\x00\"-\n\tBuildWall\x12\n\n\x02id\x18\x01 \x01(\x05\x12\t\n\x01x\x18\x02 \x01(\x05\x12\t\n\x01y\x18\x03 \x01(\x05\"b\n\x0cUpdatePlayer\x12\n\n\x02id\x18\x01 \x01(\x05\x12\t\n\x01x\x18\x02 \x01(\x02\x12\t\n\x01y\x18\x03 \x01(\x02\x12\x0e\n\x06health\x18\x04 \x01(\x02\x12\x0c\n\x04name\x18\x05 \x01(\t\x12\x12\n\nis_destroy\x18\x06 \x01(\x08\"\xd7\x01\n\x06Player\x12\n\n\x02id\x18\x01 \x01(\x05\x12\t\n\x01x\x18\x02 \x01(\x02\x12\t\n\x01y\x18\x03 \x01(\x02\x12\x0e\n\x06health\x18\x04 \x01(\x02\x12\x0c\n\x04size\x18\x05 \x01(\x02\x12\r\n\x05level\x18\x06 \x01(\x05\x12\r\n\x05score\x18\x07 \x01(\x05\x12\x0c\n\x04name\x18\x08 \x01(\t\x12/\n\x0bnext_reload\x18\t \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x12\n\nis_destroy\x18\n \x01(\x08\x12\x1c\n\x14\x63urrent_input_number\x18\x0b \x01(\x05\"\x99\x03\n\x11ServerGameMessage\x12\x1a\n\x12last_process_input\x18\x01 \x01(\x05\x12\x32\n\x10init_all_payload\x18\x02 \x01(\x0b\x32\x16.Message_proto.InitAllH\x00\x12\x38\n\x13init_player_payload\x18\x03 \x01(\x0b\x32\x19.Message_proto.InitPlayerH\x00\x12\x36\n\x15update_player_payload\x18\x04 \x01(\x0b\x32\x15.Message_proto.PlayerH\x00\x12\x32\n\x12init_shoot_payload\x18\x05 \x01(\x0b\x32\x14.Message_proto.ShootH\x00\x12<\n\x15remove_player_payload\x18\x06 \x01(\x0b\x32\x1b.Message_proto.RemovePlayerH\x00\x12\x45\n\x1aregister_client_id_payload\x18\x07 \x01(\x0b\x32\x1f.Message_proto.RegisterClientIDH\x00\x42\t\n\x07message\"!\n\tWallBlock\x12\t\n\x01x\x18\x01 \x01(\x02\x12\t\n\x01y\x18\x02 \x01(\x02\"p\n\x06\x42ullet\x12\x11\n\tplayer_id\x18\x01 \x01(\x05\x12\n\n\x02sx\x18\x02 \x01(\x02\x12\n\n\x02sy\x18\x03 \x01(\x02\x12\t\n\x01x\x18\x04 \x01(\x02\x12\t\n\x01y\x18\x05 \x01(\x02\x12\n\n\x02\x64x\x18\x06 \x01(\x02\x12\n\n\x02\x64y\x18\x07 \x01(\x02\x12\r\n\x05stime\x18\x08 
\x01(\x02*2\n\tDirection\x12\x06\n\x02UP\x10\x00\x12\x08\n\x04\x44OWN\x10\x01\x12\x08\n\x04LEFT\x10\x02\x12\t\n\x05RIGHT\x10\x03\x62\x06proto3')
,
dependencies=[google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR,])
_DIRECTION = _descriptor.EnumDescriptor(
name='Direction',
full_name='Message_proto.Direction',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='UP', index=0, number=0,
serialized_options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='DOWN', index=1, number=1,
serialized_options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='LEFT', index=2, number=2,
serialized_options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='RIGHT', index=3, number=3,
serialized_options=None,
type=None),
],
containing_type=None,
serialized_options=None,
serialized_start=2066,
serialized_end=2116,
)
_sym_db.RegisterEnumDescriptor(_DIRECTION)
Direction = enum_type_wrapper.EnumTypeWrapper(_DIRECTION)
UP = 0
DOWN = 1
LEFT = 2
RIGHT = 3
_SHOOT_SHOOTTYPE = _descriptor.EnumDescriptor(
name='ShootType',
full_name='Message_proto.Shoot.ShootType',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='NORMAL', index=0, number=0,
serialized_options=None,
type=None),
],
containing_type=None,
serialized_options=None,
serialized_start=1115,
serialized_end=1138,
)
_sym_db.RegisterEnumDescriptor(_SHOOT_SHOOTTYPE)
_CLIENTGAMEMESSAGE = _descriptor.Descriptor(
name='ClientGameMessage',
full_name='Message_proto.ClientGameMessage',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='input_sequence_number', full_name='Message_proto.ClientGameMessage.input_sequence_number', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='time_elapsed', full_name='Message_proto.ClientGameMessage.time_elapsed', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='init_player_payload', full_name='Message_proto.ClientGameMessage.init_player_payload', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='move_position_payload', full_name='Message_proto.ClientGameMessage.move_position_payload', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='shoot_payload', full_name='Message_proto.ClientGameMessage.shoot_payload', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='build_wall_payload', full_name='Message_proto.ClientGameMessage.build_wall_payload', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='update_player_payload', full_name='Message_proto.ClientGameMessage.update_player_payload', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='set_position_payload', full_name='Message_proto.ClientGameMessage.set_position_payload', index=7,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='message', full_name='Message_proto.ClientGameMessage.message',
index=0, containing_type=None, fields=[]),
],
serialized_start=66,
serialized_end=494,
)
_PING = _descriptor.Descriptor(
name='Ping',
full_name='Message_proto.Ping',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='ping', full_name='Message_proto.Ping.ping', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=496,
serialized_end=516,
)
_SETPOSITION = _descriptor.Descriptor(
name='SetPosition',
full_name='Message_proto.SetPosition',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.SetPosition.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.SetPosition.x', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.SetPosition.y', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=518,
serialized_end=565,
)
_MOVEPOSITION = _descriptor.Descriptor(
name='MovePosition',
full_name='Message_proto.MovePosition',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.MovePosition.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dx', full_name='Message_proto.MovePosition.dx', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dy', full_name='Message_proto.MovePosition.dy', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=567,
serialized_end=617,
)
_INITPLAYER = _descriptor.Descriptor(
name='InitPlayer',
full_name='Message_proto.InitPlayer',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.InitPlayer.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='client_id', full_name='Message_proto.InitPlayer.client_id', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.InitPlayer.x', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.InitPlayer.y', index=3,
number=4, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='name', full_name='Message_proto.InitPlayer.name', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='is_main', full_name='Message_proto.InitPlayer.is_main', index=5,
number=6, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=619,
serialized_end=715,
)
_MAP = _descriptor.Descriptor(
name='Map',
full_name='Message_proto.Map',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='block', full_name='Message_proto.Map.block', index=0,
number=1, type=5, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='num_cols', full_name='Message_proto.Map.num_cols', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='num_rows', full_name='Message_proto.Map.num_rows', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='block_width', full_name='Message_proto.Map.block_width', index=3,
number=4, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='block_height', full_name='Message_proto.Map.block_height', index=4,
number=5, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=717,
serialized_end=816,
)
_INITALL = _descriptor.Descriptor(
name='InitAll',
full_name='Message_proto.InitAll',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='init_player', full_name='Message_proto.InitAll.init_player', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='init_map', full_name='Message_proto.InitAll.init_map', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=818,
serialized_end=913,
)
_REMOVEPLAYER = _descriptor.Descriptor(
name='RemovePlayer',
full_name='Message_proto.RemovePlayer',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.RemovePlayer.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=915,
serialized_end=941,
)
_REGISTERCLIENTID = _descriptor.Descriptor(
name='RegisterClientID',
full_name='Message_proto.RegisterClientID',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='client_id', full_name='Message_proto.RegisterClientID.client_id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=943,
serialized_end=980,
)
_SHOOT = _descriptor.Descriptor(
name='Shoot',
full_name='Message_proto.Shoot',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.Shoot.id', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='player_id', full_name='Message_proto.Shoot.player_id', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.Shoot.x', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.Shoot.y', index=3,
number=4, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dx', full_name='Message_proto.Shoot.dx', index=4,
number=5, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dy', full_name='Message_proto.Shoot.dy', index=5,
number=6, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='type', full_name='Message_proto.Shoot.type', index=6,
number=7, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
_SHOOT_SHOOTTYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=983,
serialized_end=1138,
)
_BUILDWALL = _descriptor.Descriptor(
name='BuildWall',
full_name='Message_proto.BuildWall',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.BuildWall.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.BuildWall.x', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.BuildWall.y', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1140,
serialized_end=1185,
)
_UPDATEPLAYER = _descriptor.Descriptor(
name='UpdatePlayer',
full_name='Message_proto.UpdatePlayer',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.UpdatePlayer.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.UpdatePlayer.x', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.UpdatePlayer.y', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='health', full_name='Message_proto.UpdatePlayer.health', index=3,
number=4, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='name', full_name='Message_proto.UpdatePlayer.name', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='is_destroy', full_name='Message_proto.UpdatePlayer.is_destroy', index=5,
number=6, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1187,
serialized_end=1285,
)
_PLAYER = _descriptor.Descriptor(
name='Player',
full_name='Message_proto.Player',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Message_proto.Player.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.Player.x', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.Player.y', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='health', full_name='Message_proto.Player.health', index=3,
number=4, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='size', full_name='Message_proto.Player.size', index=4,
number=5, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='level', full_name='Message_proto.Player.level', index=5,
number=6, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='score', full_name='Message_proto.Player.score', index=6,
number=7, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='name', full_name='Message_proto.Player.name', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='next_reload', full_name='Message_proto.Player.next_reload', index=8,
number=9, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='is_destroy', full_name='Message_proto.Player.is_destroy', index=9,
number=10, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='current_input_number', full_name='Message_proto.Player.current_input_number', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1288,
serialized_end=1503,
)
_SERVERGAMEMESSAGE = _descriptor.Descriptor(
name='ServerGameMessage',
full_name='Message_proto.ServerGameMessage',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='last_process_input', full_name='Message_proto.ServerGameMessage.last_process_input', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='init_all_payload', full_name='Message_proto.ServerGameMessage.init_all_payload', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='init_player_payload', full_name='Message_proto.ServerGameMessage.init_player_payload', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='update_player_payload', full_name='Message_proto.ServerGameMessage.update_player_payload', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='init_shoot_payload', full_name='Message_proto.ServerGameMessage.init_shoot_payload', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='remove_player_payload', full_name='Message_proto.ServerGameMessage.remove_player_payload', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='register_client_id_payload', full_name='Message_proto.ServerGameMessage.register_client_id_payload', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='message', full_name='Message_proto.ServerGameMessage.message',
index=0, containing_type=None, fields=[]),
],
serialized_start=1506,
serialized_end=1915,
)
_WALLBLOCK = _descriptor.Descriptor(
name='WallBlock',
full_name='Message_proto.WallBlock',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.WallBlock.x', index=0,
number=1, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.WallBlock.y', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1917,
serialized_end=1950,
)
_BULLET = _descriptor.Descriptor(
name='Bullet',
full_name='Message_proto.Bullet',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='player_id', full_name='Message_proto.Bullet.player_id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='sx', full_name='Message_proto.Bullet.sx', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='sy', full_name='Message_proto.Bullet.sy', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='x', full_name='Message_proto.Bullet.x', index=3,
number=4, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='Message_proto.Bullet.y', index=4,
number=5, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dx', full_name='Message_proto.Bullet.dx', index=5,
number=6, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dy', full_name='Message_proto.Bullet.dy', index=6,
number=7, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='stime', full_name='Message_proto.Bullet.stime', index=7,
number=8, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1952,
serialized_end=2064,
)
_CLIENTGAMEMESSAGE.fields_by_name['init_player_payload'].message_type = _INITPLAYER
_CLIENTGAMEMESSAGE.fields_by_name['move_position_payload'].message_type = _MOVEPOSITION
_CLIENTGAMEMESSAGE.fields_by_name['shoot_payload'].message_type = _SHOOT
_CLIENTGAMEMESSAGE.fields_by_name['build_wall_payload'].message_type = _BUILDWALL
_CLIENTGAMEMESSAGE.fields_by_name['update_player_payload'].message_type = _UPDATEPLAYER
_CLIENTGAMEMESSAGE.fields_by_name['set_position_payload'].message_type = _SETPOSITION
_CLIENTGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_CLIENTGAMEMESSAGE.fields_by_name['init_player_payload'])
_CLIENTGAMEMESSAGE.fields_by_name['init_player_payload'].containing_oneof = _CLIENTGAMEMESSAGE.oneofs_by_name['message']
_CLIENTGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_CLIENTGAMEMESSAGE.fields_by_name['move_position_payload'])
_CLIENTGAMEMESSAGE.fields_by_name['move_position_payload'].containing_oneof = _CLIENTGAMEMESSAGE.oneofs_by_name['message']
_CLIENTGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_CLIENTGAMEMESSAGE.fields_by_name['shoot_payload'])
_CLIENTGAMEMESSAGE.fields_by_name['shoot_payload'].containing_oneof = _CLIENTGAMEMESSAGE.oneofs_by_name['message']
_CLIENTGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_CLIENTGAMEMESSAGE.fields_by_name['build_wall_payload'])
_CLIENTGAMEMESSAGE.fields_by_name['build_wall_payload'].containing_oneof = _CLIENTGAMEMESSAGE.oneofs_by_name['message']
_CLIENTGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_CLIENTGAMEMESSAGE.fields_by_name['update_player_payload'])
_CLIENTGAMEMESSAGE.fields_by_name['update_player_payload'].containing_oneof = _CLIENTGAMEMESSAGE.oneofs_by_name['message']
_CLIENTGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_CLIENTGAMEMESSAGE.fields_by_name['set_position_payload'])
_CLIENTGAMEMESSAGE.fields_by_name['set_position_payload'].containing_oneof = _CLIENTGAMEMESSAGE.oneofs_by_name['message']
_INITALL.fields_by_name['init_player'].message_type = _INITPLAYER
_INITALL.fields_by_name['init_map'].message_type = _MAP
_SHOOT.fields_by_name['type'].enum_type = _SHOOT_SHOOTTYPE
_SHOOT_SHOOTTYPE.containing_type = _SHOOT
_PLAYER.fields_by_name['next_reload'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_SERVERGAMEMESSAGE.fields_by_name['init_all_payload'].message_type = _INITALL
_SERVERGAMEMESSAGE.fields_by_name['init_player_payload'].message_type = _INITPLAYER
_SERVERGAMEMESSAGE.fields_by_name['update_player_payload'].message_type = _PLAYER
_SERVERGAMEMESSAGE.fields_by_name['init_shoot_payload'].message_type = _SHOOT
_SERVERGAMEMESSAGE.fields_by_name['remove_player_payload'].message_type = _REMOVEPLAYER
_SERVERGAMEMESSAGE.fields_by_name['register_client_id_payload'].message_type = _REGISTERCLIENTID
_SERVERGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_SERVERGAMEMESSAGE.fields_by_name['init_all_payload'])
_SERVERGAMEMESSAGE.fields_by_name['init_all_payload'].containing_oneof = _SERVERGAMEMESSAGE.oneofs_by_name['message']
_SERVERGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_SERVERGAMEMESSAGE.fields_by_name['init_player_payload'])
_SERVERGAMEMESSAGE.fields_by_name['init_player_payload'].containing_oneof = _SERVERGAMEMESSAGE.oneofs_by_name['message']
_SERVERGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_SERVERGAMEMESSAGE.fields_by_name['update_player_payload'])
_SERVERGAMEMESSAGE.fields_by_name['update_player_payload'].containing_oneof = _SERVERGAMEMESSAGE.oneofs_by_name['message']
_SERVERGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_SERVERGAMEMESSAGE.fields_by_name['init_shoot_payload'])
_SERVERGAMEMESSAGE.fields_by_name['init_shoot_payload'].containing_oneof = _SERVERGAMEMESSAGE.oneofs_by_name['message']
_SERVERGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_SERVERGAMEMESSAGE.fields_by_name['remove_player_payload'])
_SERVERGAMEMESSAGE.fields_by_name['remove_player_payload'].containing_oneof = _SERVERGAMEMESSAGE.oneofs_by_name['message']
_SERVERGAMEMESSAGE.oneofs_by_name['message'].fields.append(
_SERVERGAMEMESSAGE.fields_by_name['register_client_id_payload'])
_SERVERGAMEMESSAGE.fields_by_name['register_client_id_payload'].containing_oneof = _SERVERGAMEMESSAGE.oneofs_by_name['message']
DESCRIPTOR.message_types_by_name['ClientGameMessage'] = _CLIENTGAMEMESSAGE
DESCRIPTOR.message_types_by_name['Ping'] = _PING
DESCRIPTOR.message_types_by_name['SetPosition'] = _SETPOSITION
DESCRIPTOR.message_types_by_name['MovePosition'] = _MOVEPOSITION
DESCRIPTOR.message_types_by_name['InitPlayer'] = _INITPLAYER
DESCRIPTOR.message_types_by_name['Map'] = _MAP
DESCRIPTOR.message_types_by_name['InitAll'] = _INITALL
DESCRIPTOR.message_types_by_name['RemovePlayer'] = _REMOVEPLAYER
DESCRIPTOR.message_types_by_name['RegisterClientID'] = _REGISTERCLIENTID
DESCRIPTOR.message_types_by_name['Shoot'] = _SHOOT
DESCRIPTOR.message_types_by_name['BuildWall'] = _BUILDWALL
DESCRIPTOR.message_types_by_name['UpdatePlayer'] = _UPDATEPLAYER
DESCRIPTOR.message_types_by_name['Player'] = _PLAYER
DESCRIPTOR.message_types_by_name['ServerGameMessage'] = _SERVERGAMEMESSAGE
DESCRIPTOR.message_types_by_name['WallBlock'] = _WALLBLOCK
DESCRIPTOR.message_types_by_name['Bullet'] = _BULLET
DESCRIPTOR.enum_types_by_name['Direction'] = _DIRECTION
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
ClientGameMessage = _reflection.GeneratedProtocolMessageType('ClientGameMessage', (_message.Message,), dict(
DESCRIPTOR = _CLIENTGAMEMESSAGE,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.ClientGameMessage)
))
_sym_db.RegisterMessage(ClientGameMessage)
Ping = _reflection.GeneratedProtocolMessageType('Ping', (_message.Message,), dict(
DESCRIPTOR = _PING,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.Ping)
))
_sym_db.RegisterMessage(Ping)
SetPosition = _reflection.GeneratedProtocolMessageType('SetPosition', (_message.Message,), dict(
DESCRIPTOR = _SETPOSITION,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.SetPosition)
))
_sym_db.RegisterMessage(SetPosition)
MovePosition = _reflection.GeneratedProtocolMessageType('MovePosition', (_message.Message,), dict(
DESCRIPTOR = _MOVEPOSITION,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.MovePosition)
))
_sym_db.RegisterMessage(MovePosition)
InitPlayer = _reflection.GeneratedProtocolMessageType('InitPlayer', (_message.Message,), dict(
DESCRIPTOR = _INITPLAYER,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.InitPlayer)
))
_sym_db.RegisterMessage(InitPlayer)
Map = _reflection.GeneratedProtocolMessageType('Map', (_message.Message,), dict(
DESCRIPTOR = _MAP,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.Map)
))
_sym_db.RegisterMessage(Map)
InitAll = _reflection.GeneratedProtocolMessageType('InitAll', (_message.Message,), dict(
DESCRIPTOR = _INITALL,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.InitAll)
))
_sym_db.RegisterMessage(InitAll)
RemovePlayer = _reflection.GeneratedProtocolMessageType('RemovePlayer', (_message.Message,), dict(
DESCRIPTOR = _REMOVEPLAYER,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.RemovePlayer)
))
_sym_db.RegisterMessage(RemovePlayer)
RegisterClientID = _reflection.GeneratedProtocolMessageType('RegisterClientID', (_message.Message,), dict(
DESCRIPTOR = _REGISTERCLIENTID,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.RegisterClientID)
))
_sym_db.RegisterMessage(RegisterClientID)
Shoot = _reflection.GeneratedProtocolMessageType('Shoot', (_message.Message,), dict(
DESCRIPTOR = _SHOOT,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.Shoot)
))
_sym_db.RegisterMessage(Shoot)
BuildWall = _reflection.GeneratedProtocolMessageType('BuildWall', (_message.Message,), dict(
DESCRIPTOR = _BUILDWALL,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.BuildWall)
))
_sym_db.RegisterMessage(BuildWall)
UpdatePlayer = _reflection.GeneratedProtocolMessageType('UpdatePlayer', (_message.Message,), dict(
DESCRIPTOR = _UPDATEPLAYER,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.UpdatePlayer)
))
_sym_db.RegisterMessage(UpdatePlayer)
Player = _reflection.GeneratedProtocolMessageType('Player', (_message.Message,), dict(
DESCRIPTOR = _PLAYER,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.Player)
))
_sym_db.RegisterMessage(Player)
ServerGameMessage = _reflection.GeneratedProtocolMessageType('ServerGameMessage', (_message.Message,), dict(
DESCRIPTOR = _SERVERGAMEMESSAGE,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.ServerGameMessage)
))
_sym_db.RegisterMessage(ServerGameMessage)
WallBlock = _reflection.GeneratedProtocolMessageType('WallBlock', (_message.Message,), dict(
DESCRIPTOR = _WALLBLOCK,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.WallBlock)
))
_sym_db.RegisterMessage(WallBlock)
Bullet = _reflection.GeneratedProtocolMessageType('Bullet', (_message.Message,), dict(
DESCRIPTOR = _BULLET,
__module__ = 'message_pb2'
# @@protoc_insertion_point(class_scope:Message_proto.Bullet)
))
_sym_db.RegisterMessage(Bullet)
# @@protoc_insertion_point(module_scope)
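The oneof wiring above appends every `*_payload` field to `oneofs_by_name['message']`, which is what gives `ClientGameMessage` and `ServerGameMessage` their "at most one payload set" semantics: assigning any member of the oneof clears the previously set one. A minimal pure-Python sketch of that last-writer-wins behavior (illustrative only — not the generated protobuf API):

```python
class OneofMessage:
    """Toy model of a protobuf oneof: at most one field is set at a time."""

    _ONEOF_FIELDS = ("init_player_payload", "move_position_payload", "shoot_payload")

    def __init__(self):
        self._which = None   # name of the currently-set oneof field
        self._value = None

    def set_field(self, name, value):
        if name not in self._ONEOF_FIELDS:
            raise ValueError(f"unknown oneof field: {name}")
        # Setting any member of the oneof implicitly clears the previous one.
        self._which = name
        self._value = value

    def which_oneof(self):
        # Mirrors Message.WhichOneof('message') in the generated API.
        return self._which


msg = OneofMessage()
msg.set_field("init_player_payload", {"name": "p1"})
msg.set_field("shoot_payload", {"x": 1.0, "y": 2.0})
assert msg.which_oneof() == "shoot_payload"  # earlier payload was displaced
```

With the real generated classes, `msg.WhichOneof('message')` reports the currently populated payload in the same way.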

# File: accelbyte_py_sdk/api/sessionbrowser/__init__.py (AccelByte/accelbyte-python-sdk, MIT)
# Copyright (c) 2021 AccelByte Inc. All Rights Reserved.
# This is licensed software from AccelByte Inc, for limitations
# and restrictions contact your company contract manager.
#
# Code generated. DO NOT EDIT!
# template file: justice_py_sdk_codegen/__main__.py
"""Auto-generated package that contains models used by the justice-session-browser-service."""
__version__ = ""
__author__ = "AccelByte"
__email__ = "dev@accelbyte.net"
# pylint: disable=line-too-long
# session
from .wrappers import add_player_to_session
from .wrappers import add_player_to_session_async
from .wrappers import admin_get_session
from .wrappers import admin_get_session_async
from .wrappers import create_session
from .wrappers import create_session_async
from .wrappers import delete_session
from .wrappers import delete_session_async
from .wrappers import delete_session_local_ds
from .wrappers import delete_session_local_ds_async
from .wrappers import get_active_custom_game_sessions
from .wrappers import get_active_custom_game_sessions_async
from .wrappers import get_active_matchmaking_game_sessions
from .wrappers import get_active_matchmaking_game_sessions_async
from .wrappers import get_recent_player
from .wrappers import get_recent_player_async
from .wrappers import get_session
from .wrappers import get_session_async
from .wrappers import get_session_by_user_i_ds
from .wrappers import get_session_by_user_i_ds_async
from .wrappers import get_total_active_session
from .wrappers import get_total_active_session_async
from .wrappers import join_session
from .wrappers import join_session_async
from .wrappers import query_session
from .wrappers import query_session_async
from .wrappers import remove_player_from_session
from .wrappers import remove_player_from_session_async
from .wrappers import update_session
from .wrappers import update_session_async
from .wrappers import update_settings
from .wrappers import update_settings_async
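Every operation above is exported in both a blocking form and an `_async` form. One common way to provide such pairs is to implement the async version once and derive the blocking wrapper from it — a generic sketch of that pattern (not AccelByte's actual implementation; `query_session` here is a stand-in):

```python
import asyncio


async def query_session_async(session_type: str) -> dict:
    # Stand-in for a real awaitable HTTP call to the session-browser service.
    await asyncio.sleep(0)
    return {"session_type": session_type, "sessions": []}


def query_session(session_type: str) -> dict:
    """Blocking wrapper: drive the async implementation to completion."""
    return asyncio.run(query_session_async(session_type))


result = query_session("custom")
assert result["session_type"] == "custom"
```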

# File: __init__.py (WikiaSun/Slash, MIT)
from .bot import *
from .enums import *
from .command import *
from .context import *
from .converter import *
from .message import *

# File: tests/basics/set_isdisjoint.py (geowor01/micropython, MIT)
s = {1, 2, 3, 4}
print(s.isdisjoint({1}))
print(s.isdisjoint([2]))
print(s.isdisjoint([]))
print(s.isdisjoint({7,8,9,10}))
print(s.isdisjoint([7,8,9,1]))
print("PASS")

# File: gym_aima/__init__.py (wegfawefgawefg/gym-aima, MIT)
from gym.envs.registration import register
# Classic Gridworld environments
register(
id='RussellNorvigGridworld-v0',
# To reproduce must use gamma of 1.0 (no discount)
entry_point='gym_aima.envs:AIMAEnv',
# With or without the sink state it will work
# because they don't use discounting
kwargs={'noise': 0.2, 'living_rew': -0.04, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
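The `noise` kwarg above is the probability that a move slips. Under the usual AIMA/Russell-Norvig convention, `noise=0.2` means the intended direction is taken with probability 0.8 and each perpendicular direction with probability 0.1; the exact split inside `gym_aima` may differ, so treat this as a sketch of the convention:

```python
def slip_distribution(action, noise=0.2):
    """Return {actual_action: probability} for an intended action.

    Assumes the AIMA convention: the noise mass is split evenly between
    the two directions perpendicular to the intended one.
    """
    perpendicular = {
        "up": ("left", "right"), "down": ("left", "right"),
        "left": ("up", "down"), "right": ("up", "down"),
    }
    a, b = perpendicular[action]
    return {action: 1.0 - noise, a: noise / 2, b: noise / 2}


dist = slip_distribution("up", noise=0.2)
assert abs(dist["up"] - 0.8) < 1e-9
assert abs(dist["left"] - 0.1) < 1e-9
assert abs(sum(dist.values()) - 1.0) < 1e-9
```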
register(
# To reproduce must use gamma of 0.9
id='AbbeelKleinGridworld-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': 0.0, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
# Low Noise, With Sink Final State, All living costs
register(
id='AIMAGridworldLowNoiseNoLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': 0.0, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseTinyLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -0.01, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseSmallLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -0.04, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseMediumLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -0.1, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseLargeLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -2, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseLargeLivingBonusWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': 2, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
# Medium Noise, With Sink Final State, All living costs
register(
id='AIMAGridworldMediumNoiseNoLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': 0.0, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseTinyLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -0.01, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseSmallLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -0.04, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseMediumLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -0.1, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseLargeLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -2, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseLargeLivingBonusWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': 2, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
# High Noise, With Sink Final State, All living costs
register(
id='AIMAGridworldHighNoiseNoLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': 0.0, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseTinyLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -0.01, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseSmallLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -0.04, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseMediumLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -0.1, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseLargeLivingCostWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -2, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseLargeLivingBonusWithSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': 2, 'sink': True},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
# Low Noise, Without Sink Final State, All living costs
register(
id='AIMAGridworldLowNoiseNoLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': 0.0, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseTinyLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -0.01, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseSmallLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -0.04, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseMediumLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -0.1, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseLargeLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': -2, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldLowNoiseLargeLivingBonusNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.2, 'living_rew': 2, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
# Medium Noise, Without Sink Final State, All living costs
register(
id='AIMAGridworldMediumNoiseNoLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': 0.0, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseTinyLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -0.01, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseSmallLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -0.04, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseMediumLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -0.1, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseLargeLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': -2, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldMediumNoiseLargeLivingBonusNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.5, 'living_rew': 2, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
# High Noise, Without Sink Final State, All living costs
register(
id='AIMAGridworldHighNoiseNoLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': 0.0, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseTinyLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -0.01, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseSmallLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -0.04, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseMediumLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -0.1, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseLargeLivingCostNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': -2, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)
register(
id='AIMAGridworldHighNoiseLargeLivingBonusNoSink-v0',
entry_point='gym_aima.envs:AIMAEnv',
kwargs={'noise': 0.6666, 'living_rew': 2, 'sink': False},
max_episode_steps=100,
reward_threshold=1.0,
nondeterministic=True,
)

# File: ensai_model.py (Unique-Divine/Neural-Networks-for-Gravitational-Lens-Modeling, MIT)
# Missing imports reconstructed; this is TF1-era code using contrib.slim.
import tensorflow as tf
import tensorflow.contrib.slim as slim
# NOTE: `transformer` (a spatial-transformer layer) and `numpix_side` (image side
# length in pixels; 192 would match the 192*0.04 grid scale used below) are assumed
# to be provided by the surrounding project, e.g.:
#   from spatial_transformer import transformer
#   numpix_side = 192

trunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)
def model_transformer(x_image , scope="transformer", reuse=None):
with tf.variable_scope(scope):
with slim.arg_scope([slim.conv2d, slim.fully_connected], activation_fn=tf.nn.relu):
net = slim.conv2d(x_image, 64, [11, 11], 4, padding='VALID', scope='conv1')
net = slim.max_pool2d(net, [2, 2], scope='pool1')
net = slim.conv2d(net, 256, [5, 5], padding='VALID', scope='conv2')
net = slim.max_pool2d(net, [2, 2], scope='pool2')
net = slim.conv2d(net, 512, [3, 3], scope='conv3')
net = slim.conv2d(net, 1024, [3, 3], scope='conv4')
net = slim.conv2d(net, 1024, [3, 3], scope='conv5')
net = slim.max_pool2d(net, [2, 2], scope='pool5')
with slim.arg_scope([slim.conv2d], weights_initializer=trunc_normal(0.005), biases_initializer=tf.constant_initializer(0.1)):
net = slim.conv2d(net, 3072, [2, 2], padding='VALID', scope='fc6')
net = slim.conv2d(net, 4096, [1, 1], scope='fc7')
net = slim.conv2d(net, 5, [1, 1], activation_fn=None, normalizer_fn=None, biases_initializer=tf.zeros_initializer(), scope='fc8')
net = slim.flatten(net, scope='Flatten')
net = slim.fully_connected(net, 5 , activation_fn = None , scope='FC1')
net_x = slim.fully_connected(net, 1 , activation_fn=None , scope='TF_predict_x')
net_y = slim.fully_connected(net, 1 , activation_fn=None , scope='TF_predict_y')
zero_col = net_x * 0
one_col = net_x * 0 + 1
transformation_tensor = tf.concat( axis = 1 , values = [one_col, zero_col , 1.0 * net_x / ((192*0.04)/2), zero_col , one_col , 1.0 * net_y / ((192*0.04)/2) ] )
x_image = transformer(x_image, transformation_tensor , (numpix_side, numpix_side) )
x_image = tf.reshape(x_image , [-1,numpix_side,numpix_side,1] )
return x_image, net_x, net_y
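`model_transformer` assembles a per-image affine row `[1, 0, tx, 0, 1, ty]` — a translation-only transform, with the offsets normalized by half the field of view, `(192*0.04)/2`. A pure-Python sketch of how such a 6-parameter vector acts on a coordinate (the `0.96` shift below is a hypothetical predicted centroid offset, not a value from the model):

```python
def apply_affine(params, x, y):
    """Apply a 2x3 affine [a, b, tx, c, d, ty] to the point (x, y)."""
    a, b, tx, c, d, ty = params
    return a * x + b * y + tx, c * x + d * y + ty


half_fov = (192 * 0.04) / 2   # matches the normalization in model_transformer
shift = 0.96                  # hypothetical predicted x-offset in image units
params = [1.0, 0.0, shift / half_fov, 0.0, 1.0, 0.0]
x2, y2 = apply_affine(params, 0.0, 0.0)
assert abs(x2 - 0.25) < 1e-9  # 0.96 / 3.84 = 0.25 in normalized coordinates
assert y2 == 0.0
```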
batch_norm_params = {
# Decay for the moving averages.
'decay': 0.9997,
# epsilon to prevent 0s in variance.
'epsilon': 0.001,
# collection containing update_ops.
'updates_collections': tf.GraphKeys.UPDATE_OPS,
}
def block_a(inputs, scope=None, reuse=None):
# By default use stride=1 and SAME padding
with slim.arg_scope([slim.conv2d], stride=1, padding='SAME'):
with tf.variable_scope(scope, 'BlockA', [inputs], reuse=reuse):
with tf.variable_scope('branch_0'):
branch_0 = slim.conv2d(inputs, 32, [1, 1], scope='Conv2d_0a_1x1')
with tf.variable_scope('branch_1'):
branch_1 = slim.conv2d(inputs, 16, [1, 1], scope='Conv2d_0a_1x1')
branch_1 = slim.conv2d(branch_1, 16, [3, 3], scope='Conv2d_0b_3x3')
with tf.variable_scope('branch_2'):
branch_2 = slim.conv2d(inputs, 32, [1, 1], scope='Conv2d_0a_1x1')
branch_2 = slim.conv2d(branch_2, 32, [3, 3], scope='Conv2d_0b_3x3')
branch_2 = slim.conv2d(branch_2, 32, [5, 5], scope='Conv2d_0c_5x5')
branch_2 = slim.conv2d(branch_2, 32, [10, 10], scope='Conv2d_0c_10x10')
with tf.variable_scope('branch_3'):
branch_3 = slim.conv2d(inputs, 32, [1, 1], scope='Conv2d_0b_1x1')
return tf.concat(axis=3, values=[branch_0, branch_1, branch_2, branch_3])
        # NOTE: the four lines below are unreachable -- they follow the return
        # above; dead code left over from an earlier residual (scaled
        # skip-connection) variant of this block.
mixed = tf.concat(axis=3, values=[branch_0, branch_1, branch_2, branch_3])
up = slim.conv2d(mixed, inputs.get_shape()[3], 1, normalizer_fn = slim.batch_norm , normalizer_params = batch_norm_params , activation_fn=tf.nn.relu, scope='RES_1')
scale = 0.7
inputs += scale * up
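Because `block_a` concatenates its branches along the channel axis (`tf.concat(axis=3, ...)`), the output depth is simply the sum of each branch's final conv width — 32 + 16 + 32 + 32 = 112 here. A trivial sketch of that bookkeeping:

```python
def concat_depth(branch_widths):
    """Channel count after concatenating parallel conv branches on the channel axis."""
    return sum(branch_widths)


# Final conv widths of branch_0..branch_3 in block_a above.
assert concat_depth([32, 16, 32, 32]) == 112
```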
def model_1(net, scope="EN_Model1", reuse=None):
with tf.variable_scope(scope):
with tf.variable_scope('BlockA_1',reuse=reuse):
net = block_a(net)
with tf.variable_scope('BlockA_2',reuse=reuse):
net = block_a(net)
net = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID', scope='MaxPool_1')
with tf.variable_scope('BlockA_3',reuse=reuse):
net = block_a(net)
with tf.variable_scope('BlockA_4',reuse=reuse):
net = block_a(net)
net = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID', scope='MaxPool_2')
net = slim.flatten(net, scope='Flatten')
net = slim.fully_connected(net, 124, activation_fn=tf.nn.relu, scope='FC1')
net = slim.fully_connected(net, 5 , activation_fn=None, scope='read_out_layer')
net = tf.reshape(net,[-1,5])
return net
def model_2(net, scope="EN_Model2", reuse=None):
with tf.variable_scope(scope):
with slim.arg_scope([slim.conv2d], activation_fn=tf.nn.relu):
net = slim.conv2d(net, 16, [3, 3], stride=1 , scope='conv_1')
net = slim.conv2d(net, 16, [3, 3], stride=1 , scope='conv_2')
net = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID', scope='MaxPool_1')
net = slim.conv2d(net, 32, [3, 3], stride=1 , scope='conv_3')
net = slim.conv2d(net, 32, [1, 1], stride=1 , scope='conv_4')
net = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID', scope='MaxPool_2')
net = slim.conv2d(net, 32, [5, 5], stride=2 , scope='conv_5')
net = slim.conv2d(net, 32, [5, 5], stride=2 , scope='conv_6')
net = slim.conv2d(net, 32, [1, 1], stride=1 , scope='conv_7')
net = slim.conv2d(net, 32, [5, 5], stride=2 , scope='conv_8')
net = slim.flatten(net, scope='Flatten')
net = slim.fully_connected(net, 124, activation_fn=tf.nn.relu , scope='FC1')
net = slim.fully_connected(net, 5 , activation_fn=None, scope='read_out_layer')
net = tf.reshape(net,[-1,5])
return net
# No pooling, conv stride 2 for reduction
def model_3(net, scope="EN_Model3", reuse=None):
with tf.variable_scope(scope):
with slim.arg_scope([slim.conv2d], activation_fn=tf.nn.relu, stride=1):
net = slim.conv2d(net, 16, [3, 3], activation_fn=None , scope='conv_10')
net = slim.conv2d(net, 16, [1, 1], activation_fn=None , scope='conv_11')
net = slim.conv2d(net, 16, [5, 5], activation_fn=None , scope='conv_20')
net = slim.conv2d(net, 16, [1, 1] , scope='conv_21')
net = slim.conv2d(net, 16, [10, 10], stride=2 , scope='conv_30')
net = slim.conv2d(net, 16, [1, 1] , scope='conv_31')
net = slim.conv2d(net, 16, [10, 10], stride=2 , scope='conv_40')
net = slim.conv2d(net, 16, [1, 1] , scope='conv_41')
net = slim.conv2d(net, 32, [10, 10], stride=2 , scope='conv_50')
net = slim.conv2d(net, 32, [1, 1] , scope='conv_51')
net = slim.conv2d(net, 64, [3, 3], stride=2 , scope='conv_60')
net = slim.conv2d(net, 64, [1, 1] , scope='conv_61')
net = slim.flatten(net, scope='Flatten')
net = slim.fully_connected(net, 124 , activation_fn=tf.nn.relu , scope='FC1')
net = slim.fully_connected(net, 5 , activation_fn=None, scope='read_out_layer')
net = tf.reshape(net,[-1,5])
return net
# Few layers, no pooling, only one stride 2 conv.
def model_4(net, scope="EN_Model4", reuse=None):
with tf.variable_scope(scope):
with slim.arg_scope([slim.conv2d], activation_fn=tf.nn.relu, stride=1):
net = slim.conv2d(net, 16, [3, 3], activation_fn=None , scope='conv_10')
net = slim.conv2d(net, 16, [1, 1], activation_fn=None , scope='conv_11')
net = slim.conv2d(net, 16, [5, 5], activation_fn=None , scope='conv_20')
net = slim.conv2d(net, 16, [1, 1] , scope='conv_21')
net = slim.conv2d(net, 16, [20, 20], stride=2 , scope='conv_30')
net = slim.conv2d(net, 16, [1, 1] , scope='conv_31')
net = slim.flatten(net, scope='Flatten')
net = slim.fully_connected(net, 32 , activation_fn=tf.nn.relu , scope='FC1')
net = slim.fully_connected(net, 16 , activation_fn=tf.nn.relu , scope='FC2')
net = slim.fully_connected(net, 5 , activation_fn=None, scope='read_out_layer')
net = tf.reshape(net,[-1,5])
return net
# Overfeat model
trunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)
def model_5(net, scope="EN_Model5", reuse=None):
    with tf.variable_scope(scope):
        with slim.arg_scope([slim.conv2d], padding='SAME', activation_fn=tf.nn.relu, stride=1):
            net = slim.conv2d(net, 64, [11, 11], stride=4, padding='VALID', scope='conv1')
            net = slim.max_pool2d(net, [2, 2], scope='pool1')
            net = slim.conv2d(net, 256, [5, 5], padding='VALID', scope='conv2')
            net = slim.max_pool2d(net, [2, 2], scope='pool2')
            net = slim.conv2d(net, 512, [3, 3], scope='conv3')
            net = slim.conv2d(net, 1024, [3, 3], scope='conv4')
            net = slim.conv2d(net, 1024, [3, 3], scope='conv5')
            net = slim.max_pool2d(net, [2, 2], scope='pool5')
            with slim.arg_scope([slim.conv2d], weights_initializer=trunc_normal(0.005),
                                biases_initializer=tf.constant_initializer(0.1), activation_fn=tf.nn.relu):
                net = slim.conv2d(net, 3072, [2, 2], padding='VALID', scope='fc6')
                net = slim.conv2d(net, 4096, [1, 1], scope='fc7')
                net = slim.conv2d(net, 5, [1, 1], activation_fn=None, normalizer_fn=None,
                                  biases_initializer=tf.zeros_initializer(), scope='fc8')
            net = slim.flatten(net, scope='Flatten')
            net = slim.fully_connected(net, 5, activation_fn=None, scope='FC1')
            net = tf.reshape(net, [-1, 5])
    return net
# Simple, minimalistic model with pool at every layer
def model_6(net, scope="EN_Model6", reuse=None):
    with tf.variable_scope(scope):
        with slim.arg_scope([slim.conv2d], padding='SAME', activation_fn=tf.nn.relu, stride=1):
            net = slim.conv2d(net, 8, [3, 3], scope='conv1')
            net = slim.max_pool2d(net, [3, 3], scope='pool1')
            net = slim.conv2d(net, 8, [5, 5], scope='conv2')
            net = slim.max_pool2d(net, [3, 3], scope='pool2')
            net = slim.conv2d(net, 16, [5, 5], scope='conv3')
            net = slim.max_pool2d(net, [2, 2], scope='pool3')
            net = slim.conv2d(net, 16, [3, 3], scope='conv4')
            net = slim.max_pool2d(net, [3, 3], scope='pool4')
            net = slim.conv2d(net, 16, [3, 3], scope='conv5')
            net = slim.max_pool2d(net, [2, 2], scope='pool5')
            net = slim.flatten(net, scope='Flatten')
            net = slim.fully_connected(net, 128, activation_fn=tf.nn.relu, scope='FC1')
            net = slim.fully_connected(net, 5, activation_fn=None, scope='FC3')
            net = tf.reshape(net, [-1, 5])
    return net
# only 2 fully connected layers
def model_7(net, scope="EN_Model7", reuse=None):
    with tf.variable_scope(scope):
        net = slim.flatten(net, scope='Flatten')
        net = slim.fully_connected(net, 128, activation_fn=tf.nn.relu, scope='FC1')
        net = slim.fully_connected(net, 64, activation_fn=tf.nn.relu, scope='FC2')
        net = slim.fully_connected(net, 5, activation_fn=None, scope='FC3')
        net = tf.reshape(net, [-1, 5])
    return net
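For quick sizing comparisons between these variants, the parameter count of a pure-MLP head like model_7 is easy to compute by hand. A plain-Python sketch (the helper name `mlp_param_count` and its defaults are illustrative, not part of the original code):

```python
def mlp_param_count(flat_dim, hidden=(128, 64), out=5):
    """Weights + biases for a dense stack: flat_dim -> hidden... -> out."""
    total, prev = 0, flat_dim
    for width in list(hidden) + [out]:
        total += prev * width + width  # weight matrix + bias vector
        prev = width
    return total

# For a flattened 64x64 single-channel input:
print(mlp_param_count(64 * 64))
```

Because the first dense layer dominates, the count grows linearly with the flattened input size, which is why the convolutional variants above reduce spatial resolution before flattening.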
# No pooling, conv stride 2 for reduction
def model_8(net, scope="EN_Model8", reuse=None):
    with tf.variable_scope(scope):
        with slim.arg_scope([slim.conv2d], activation_fn=tf.nn.relu, stride=1):
            net = slim.conv2d(net, 32, [3, 3], activation_fn=None, scope='conv_10')
            net = slim.conv2d(net, 32, [1, 1], activation_fn=None, scope='conv_11')
            net = slim.conv2d(net, 32, [5, 5], activation_fn=None, scope='conv_20')
            net = slim.conv2d(net, 32, [1, 1], scope='conv_21')
            net = slim.conv2d(net, 32, [10, 10], stride=2, scope='conv_30')
            net = slim.conv2d(net, 32, [1, 1], scope='conv_31')
            net = slim.conv2d(net, 32, [10, 10], stride=1, scope='conv_40')
            net = slim.conv2d(net, 32, [1, 1], scope='conv_41')
            net = slim.conv2d(net, 64, [10, 10], stride=2, scope='conv_50')
            net = slim.conv2d(net, 64, [1, 1], scope='conv_51')
            net = slim.conv2d(net, 64, [10, 10], stride=1, scope='conv_60')
            net = slim.conv2d(net, 64, [1, 1], scope='conv_61')
            net = slim.conv2d(net, 128, [10, 10], stride=2, scope='conv_70')
            net = slim.conv2d(net, 128, [1, 1], scope='conv_71')
            net = slim.conv2d(net, 256, [3, 3], stride=1, scope='conv_80')
            net = slim.conv2d(net, 256, [1, 1], scope='conv_81')
            net = slim.flatten(net, scope='Flatten')
            net = slim.fully_connected(net, 512, activation_fn=tf.nn.relu, scope='FC1')
            net = slim.fully_connected(net, 5, activation_fn=None, scope='read_out_layer')
            net = tf.reshape(net, [-1, 5])
    return net
trunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)
def alexnet_v2_arg_scope(weight_decay=0.0005):
    with slim.arg_scope([slim.conv2d, slim.fully_connected],
                        activation_fn=tf.nn.relu,
                        biases_initializer=tf.constant_initializer(0.1),
                        weights_regularizer=slim.l2_regularizer(weight_decay)):
        with slim.arg_scope([slim.conv2d], padding='SAME'):
            with slim.arg_scope([slim.max_pool2d], padding='VALID') as arg_sc:
                return arg_sc
def model_9(net, num_classes=5, is_training=True, scope='alexnet_v2'):
    with tf.variable_scope(scope, 'alexnet_v2', [net]) as sc:
        net = slim.conv2d(net, 64, [11, 11], 4, padding='VALID', scope='conv1')
        net = slim.max_pool2d(net, [3, 3], 2, scope='pool1')
        net = slim.conv2d(net, 192, [5, 5], scope='conv2')
        net = slim.max_pool2d(net, [3, 3], 2, scope='pool2')
        net = slim.conv2d(net, 384, [3, 3], scope='conv3')
        net = slim.conv2d(net, 384, [3, 3], scope='conv4')
        net = slim.conv2d(net, 256, [3, 3], scope='conv5')
        net = slim.max_pool2d(net, [3, 3], 2, scope='pool5')
        # Use conv2d instead of fully_connected layers.
        with slim.arg_scope([slim.conv2d], weights_initializer=trunc_normal(0.005),
                            biases_initializer=tf.constant_initializer(0.1)):
            net = slim.conv2d(net, 4096, [4, 4], padding='VALID', scope='fc6')
            net = slim.conv2d(net, 4096, [1, 1], scope='fc7')
            net = slim.conv2d(net, num_classes, [1, 1], activation_fn=None, normalizer_fn=None,
                              biases_initializer=tf.zeros_initializer(), scope='fc8')
        net = tf.reshape(net, [-1, 5])
    return net
model_9.default_image_size = 192
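The `default_image_size` of 192 can be checked against the layer stack with the standard VALID/SAME output-size arithmetic (a plain-Python sketch; `conv_out` is an illustrative helper, and slim's defaults of SAME for conv2d and VALID for max_pool2d are assumed):

```python
def conv_out(size, k, s=1, pad='VALID'):
    """Output spatial size of one conv/pool layer (TF padding semantics)."""
    if pad == 'SAME':
        return -(-size // s)          # ceil(size / s)
    return (size - k) // s + 1        # VALID

size = 192
size = conv_out(size, 11, 4)          # conv1, VALID -> 46
size = conv_out(size, 3, 2)           # pool1       -> 22
size = conv_out(size, 5, 1, 'SAME')   # conv2       -> 22
size = conv_out(size, 3, 2)           # pool2       -> 10
size = conv_out(size, 3, 1, 'SAME')   # conv3-conv5 -> 10
size = conv_out(size, 3, 2)           # pool5       -> 4
size = conv_out(size, 4, 1)           # fc6, VALID 4x4 kernel -> 1
```

A 192-pixel input collapses to a 1x1 feature map at fc6, which is exactly why fc6 uses a [4, 4] VALID kernel: it plays the role of the first fully connected layer.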
# Kitt Peak model
def model_10(net, scope="EN_Model10", reuse=None):
    with tf.variable_scope(scope):
        with slim.arg_scope([slim.conv2d], padding='SAME', activation_fn=tf.nn.relu, stride=1):
            # Estimate a per-image background bias from the valid (non-zero)
            # pixels and subtract it before the conv stack.
            MASK = tf.abs(tf.sign(net))
            XX = net + ((1 - MASK) * 1000.0)
            bias_measure_filt = tf.constant((1.0 / 16.0), shape=[4, 4, 1, 1])
            bias_measure = tf.nn.conv2d(XX, bias_measure_filt, strides=[1, 1, 1, 1], padding='VALID')
            im_bias = tf.reshape(tf.reduce_min(bias_measure, axis=[1, 2, 3]), [-1, 1, 1, 1])
            net = net - (im_bias * MASK)
            net = slim.conv2d(net, 64, [2, 2], scope='conv1')
            net = slim.max_pool2d(net, [2, 2], scope='pool1')
            net = slim.conv2d(net, 64, [2, 2], scope='conv2')
            net = slim.max_pool2d(net, [2, 2], scope='pool2')
            net = slim.conv2d(net, 64, [2, 2], scope='conv3')
            net = slim.max_pool2d(net, [2, 2], scope='pool3')
            net = slim.conv2d(net, 128, [3, 3], scope='conv4')
            net = slim.max_pool2d(net, [2, 2], scope='pool4')
            net = slim.conv2d(net, 128, [3, 3], scope='conv5')
            net = slim.max_pool2d(net, [2, 2], scope='pool5')
            net = slim.conv2d(net, 256, [3, 3], scope='conv6')
            net = slim.flatten(net, scope='Flatten')
            net = slim.fully_connected(net, 1024, activation_fn=tf.tanh, scope='FC1')
            net = slim.fully_connected(net, 128, activation_fn=tf.tanh, scope='FC2')
            net = slim.fully_connected(net, 5, activation_fn=None, scope='FC3')
            net = tf.reshape(net, [-1, 5])
    return net
def cost_tensor(y_conv, scale_pars=[[1., 0., 0., 0., 0.], [0., 1., 0., 0., 0.], [0., 0., 1., 0., 0.], [0., 0., 0., 1., 0.], [0., 0., 0., 0., 1.]]):
    # Note: y_ (the label placeholder) is expected to exist in the enclosing scope.
    FLIPXY = tf.constant([[1., 0., 0., 0., 0.], [0., -1., 0., 0., 0.], [0., 0., -1., 0., 0.], [0., 0., 0., 1., 0.], [0., 0., 0., 0., 1.]])
    y_conv_flipped = tf.matmul(y_conv, FLIPXY)
    scale_par_cost = tf.constant(scale_pars)
    scaled_delta_1 = tf.matmul(tf.pow(y_conv - y_, 2), scale_par_cost)
    scaled_delta_2 = tf.matmul(tf.pow(y_conv_flipped - y_, 2), scale_par_cost)
    MeanSquareCost = tf.reduce_mean(tf.minimum(tf.reduce_mean(scaled_delta_1, axis=1), tf.reduce_mean(scaled_delta_2, axis=1)), axis=0)
    return MeanSquareCost, y_conv_flipped
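The intent of cost_tensor — score a prediction against the target both as-is and with its x/y components sign-flipped, and keep the cheaper of the two per sample — can be mirrored by a plain-Python reference (a sketch; `flip_symmetric_mse` is an illustrative name, and the default identity `scale_pars` is treated as a per-component weight vector):

```python
def flip_symmetric_mse(pred, target, scale=(1.0,) * 5):
    """Weighted MSE vs. target, using whichever of pred or its
    sign-flipped copy (components 1 and 2 negated) scores lower."""
    flipped = [pred[0], -pred[1], -pred[2], pred[3], pred[4]]

    def mse(p):
        return sum(s * (a - b) ** 2 for s, a, b in zip(scale, p, target)) / len(p)

    return min(mse(pred), mse(flipped))
```

This makes the loss invariant to the x/y sign ambiguity of the labels: a prediction that matches the target up to that flip costs nothing extra.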
| 49.733146 | 171 | 0.574358 | 2,562 | 17,705 | 3.817721 | 0.087822 | 0.093753 | 0.102341 | 0.124323 | 0.807791 | 0.770371 | 0.748594 | 0.719047 | 0.67764 | 0.642777 | 0 | 0.082011 | 0.26309 | 17,705 | 355 | 172 | 49.873239 | 0.667663 | 0.024117 | 0 | 0.450199 | 0 | 0 | 0.069587 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d70cde36148a5783791e89a6e0ac7867bbe28dc0 | 143 | py | Python | metadata.py | ks8/conformation | f470849d5b7b90dc5a65bab8a536de1d57c1021a | [
"MIT"
] | null | null | null | metadata.py | ks8/conformation | f470849d5b7b90dc5a65bab8a536de1d57c1021a | [
"MIT"
] | null | null | null | metadata.py | ks8/conformation | f470849d5b7b90dc5a65bab8a536de1d57c1021a | [
"MIT"
] | null | null | null | """ Generate metadata. """
from conformation.metadata import metadata, Args
if __name__ == '__main__':
    metadata(Args().parse_args())
| 23.833333 | 49 | 0.685315 | 15 | 143 | 5.933333 | 0.666667 | 0.269663 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.167832 | 143 | 5 | 50 | 28.6 | 0.747899 | 0.125874 | 0 | 0 | 1 | 0 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
d724c935aa1a0f202caf8256f479745fcdd20684 | 23 | py | Python | vibora/headers/__init__.py | brettcannon/vibora | 1933b631d4df62e7d748016f7463ab746d4695cc | [
"MIT"
] | 6,238 | 2018-06-14T19:29:47.000Z | 2022-03-29T21:42:03.000Z | vibora/headers/__init__.py | LL816/vibora | 4cda888f89aec6bfb2541ee53548ae1bf50fbf1b | [
"MIT"
] | 213 | 2018-06-13T20:13:59.000Z | 2022-03-26T07:46:49.000Z | vibora/headers/__init__.py | LL816/vibora | 4cda888f89aec6bfb2541ee53548ae1bf50fbf1b | [
"MIT"
] | 422 | 2018-06-20T01:29:41.000Z | 2022-02-27T16:45:29.000Z | from .headers import *
| 11.5 | 22 | 0.73913 | 3 | 23 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d72d8a8775ebd4ca530985052eb4375ade157bf6 | 437 | py | Python | tests/conftest.py | arielbello/sql-test-website | 29361215e4147a3db0ff22bae8d67ac7cdf1aabe | [
"MIT"
] | null | null | null | tests/conftest.py | arielbello/sql-test-website | 29361215e4147a3db0ff22bae8d67ac7cdf1aabe | [
"MIT"
] | null | null | null | tests/conftest.py | arielbello/sql-test-website | 29361215e4147a3db0ff22bae8d67ac7cdf1aabe | [
"MIT"
] | null | null | null | from app.main import app
import pytest


@pytest.fixture
def flask_app():
    return app


@pytest.fixture
def client():
    with app.test_client() as client:
        with app.app_context():
            # context config
            pass
        yield client


@pytest.fixture(scope="session")
def valid_email():
    return "test_user@sqltest.com"


@pytest.fixture(scope="session")
def invalid_email():
    return "--; DROP * FROM User"
| 16.185185 | 37 | 0.647597 | 56 | 437 | 4.946429 | 0.464286 | 0.187726 | 0.115523 | 0.180505 | 0.202166 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.237986 | 437 | 26 | 38 | 16.807692 | 0.831832 | 0.032037 | 0 | 0.235294 | 0 | 0 | 0.130641 | 0.049881 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | true | 0.058824 | 0.117647 | 0.176471 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
d73536832af358998d772263830884f96ae53e63 | 223 | py | Python | app/common/song_metadata/__init__.py | Mai-icy/I-Music-Player | 92f2d886a2c15a2c000b6f618d3123f7c365534d | [
"MIT"
] | 1 | 2022-01-03T09:21:18.000Z | 2022-01-03T09:21:18.000Z | app/common/song_metadata/__init__.py | Mai-icy/I-Music-Player | 92f2d886a2c15a2c000b6f618d3123f7c365534d | [
"MIT"
] | null | null | null | app/common/song_metadata/__init__.py | Mai-icy/I-Music-Player | 92f2d886a2c15a2c000b6f618d3123f7c365534d | [
"MIT"
] | null | null | null | from .read_song_metadata import get_song_metadata, get_album_buffer, get_md5
from .write_song_metadata import write_mp3_metadata, write_flac_metadata, update_md5_to_file
from .compare_song_metadata import compare_song_info
| 55.75 | 92 | 0.896861 | 36 | 223 | 5 | 0.472222 | 0.266667 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014493 | 0.071749 | 223 | 3 | 93 | 74.333333 | 0.855072 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d777e33b899ef94e40b40bb2c7f2c92db54e3d38 | 147 | py | Python | python_base/panduan.py | semoren/learn_python | d3140c3d4cb2bdb674d9c072c4195ebcfe686be3 | [
"Apache-2.0"
] | null | null | null | python_base/panduan.py | semoren/learn_python | d3140c3d4cb2bdb674d9c072c4195ebcfe686be3 | [
"Apache-2.0"
] | null | null | null | python_base/panduan.py | semoren/learn_python | d3140c3d4cb2bdb674d9c072c4195ebcfe686be3 | [
"Apache-2.0"
] | null | null | null | # Conditional check
age = input()
if int(age) >= 18:
    print('your age is', age)
    print('adult')
else:
    print('your age is', age)
    print('teenager') | 16.333333 | 29 | 0.564626 | 22 | 147 | 3.772727 | 0.545455 | 0.216867 | 0.289157 | 0.337349 | 0.53012 | 0.53012 | 0 | 0 | 0 | 0 | 0 | 0.018018 | 0.244898 | 147 | 9 | 30 | 16.333333 | 0.72973 | 0.027211 | 0 | 0.285714 | 0 | 0 | 0.246479 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.571429 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6
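The if/else above prints the age in both branches; the decision itself can be isolated in one small helper (a sketch; the name `classify` is illustrative):

```python
def classify(age):
    """Mirror the if/else branch as a single expression."""
    return 'adult' if int(age) >= 18 else 'teenager'
```

With the helper, the script body shrinks to one unconditional print of the age followed by `print(classify(age))`.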
ad23943470b6077c3d6f53ecf9baa85a2c32fb20 | 25 | py | Python | reservoirpy/pvtpy/__init__.py | scuervo91/reservoirpy | a4db620baf3ff66a85c7f61b1919713a8642e6fc | [
"MIT"
] | 16 | 2020-05-07T01:57:04.000Z | 2021-11-27T12:45:59.000Z | reservoirpy/pvtpy/__init__.py | scuervo91/reservoirpy | a4db620baf3ff66a85c7f61b1919713a8642e6fc | [
"MIT"
] | null | null | null | reservoirpy/pvtpy/__init__.py | scuervo91/reservoirpy | a4db620baf3ff66a85c7f61b1919713a8642e6fc | [
"MIT"
] | 5 | 2020-05-12T07:28:24.000Z | 2021-12-10T21:24:59.000Z | from . import black_oil
| 8.333333 | 23 | 0.76 | 4 | 25 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 25 | 2 | 24 | 12.5 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ad31bf5cd4ae6a87c15c732b62ca26ddc53ef278 | 123 | py | Python | python/domain/quality_plan/content/__init__.py | ICTU/document-as-code | e65fddb94513e7c2f54f248b4ce69e9e10ce42f5 | [
"Apache-2.0"
] | 2 | 2021-01-09T17:00:51.000Z | 2021-02-19T09:35:26.000Z | python/domain/quality_plan/content/__init__.py | ICTU/document-as-code | e65fddb94513e7c2f54f248b4ce69e9e10ce42f5 | [
"Apache-2.0"
] | null | null | null | python/domain/quality_plan/content/__init__.py | ICTU/document-as-code | e65fddb94513e7c2f54f248b4ce69e9e10ce42f5 | [
"Apache-2.0"
] | 1 | 2020-02-24T15:50:05.000Z | 2020-02-24T15:50:05.000Z | from .deliverables import *
from .measures import *
from .requirements import *
from .sources import *
from .text import *
| 20.5 | 27 | 0.756098 | 15 | 123 | 6.2 | 0.466667 | 0.430108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162602 | 123 | 5 | 28 | 24.6 | 0.902913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ad32503eb3215ae17c6b2e365cd92eca487d4c23 | 46 | py | Python | source/writeproperties/__init__.py | luciebakels/gadgetanalyse | 63b80b5306ebc4a1fbfd9cfbe3608ca10b8a19a3 | [
"MIT"
] | null | null | null | source/writeproperties/__init__.py | luciebakels/gadgetanalyse | 63b80b5306ebc4a1fbfd9cfbe3608ca10b8a19a3 | [
"MIT"
] | null | null | null | source/writeproperties/__init__.py | luciebakels/gadgetanalyse | 63b80b5306ebc4a1fbfd9cfbe3608ca10b8a19a3 | [
"MIT"
] | null | null | null | from writeproperties.writeproperties import *
| 23 | 45 | 0.869565 | 4 | 46 | 10 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 46 | 1 | 46 | 46 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ad58832f9171063449bf521f90fbba3c7b8b2ecc | 5,565 | py | Python | src/titles.py | Zachary-Daum/2020-bioinformatics-hackathon | 31ae1afe57b07b1898c2cc8b52bd73a847687c4f | [
"MIT"
] | 1 | 2020-11-04T02:54:47.000Z | 2020-11-04T02:54:47.000Z | src/titles.py | Zachary-Daum/2020-bioinformatics-hackathon | 31ae1afe57b07b1898c2cc8b52bd73a847687c4f | [
"MIT"
] | 2 | 2020-04-25T04:51:05.000Z | 2020-04-29T12:45:39.000Z | src/titles.py | Zachary-Daum/2020-bioinformatics-hackathon | 31ae1afe57b07b1898c2cc8b52bd73a847687c4f | [
"MIT"
] | null | null | null | import nltk
from nltk import pos_tag
from nltk import RegexpParser
from nltk.corpus import state_union
from nltk.tokenize import PunktSentenceTokenizer
import numpy as np
import matplotlib
import matplotlib.pyplot as plt
train_text = state_union.raw("2005-GWBush.txt")
custom_sent_tokenizer = PunktSentenceTokenizer(train_text)
def between(value, a, b):
    # Find and validate before-part.
    pos_a = value.find(a)
    if pos_a == -1: return ""
    # Find and validate after part.
    pos_b = value.rfind(b)
    if pos_b == -1: return ""
    # Return middle part.
    adjusted_pos_a = pos_a + len(a)
    if adjusted_pos_a >= pos_b: return ""
    return value[adjusted_pos_a:pos_b]
# create file names to draw data from
with open('data/fake_pmids.txt', 'r') as fakeid:
    fakeid = fakeid.readlines()
    fakeid = ''.join(fakeid).replace("PMID:", "")
idlist = fakeid.splitlines()  # by here each PMID is its own item in the list
n = 0
VBlist = []
VBZlist = []
NNlist = []
JJlist = []
INlist = []
JJperNN = []
SLenList = []
NNWlist = []
while n <= 421:
    workingid = idlist[n]
    workingfile = "data/fakes/{}.txt"
    currentfile = workingfile.format(workingid)
    with open(currentfile, 'rt') as file:
        data = file.read().splitlines()
    temp = ''.join(data)
    title = between(temp, 'title { name "', '." }, authors {')
    # find sentence length
    sent_len = len(title.split())
    abstract = between(temp, 'abstract "', '.", mesh {')
    # PoS tagging
    tokenizedtitle = custom_sent_tokenizer.tokenize(title)
    def process_content():
        for i in tokenizedtitle:
            words = nltk.word_tokenize(i)
            tagged = nltk.pos_tag(words)
        # process title
        temp = str(tagged).strip('[]')
        # count verbs
        process_content.VBCount = temp.count('VB')
        # count VBZ
        process_content.VBZCount = temp.count('VBZ')
        # count NN
        process_content.NNCount = temp.count('NN')
        # count JJ
        process_content.JJCount = temp.count('JJ')
        # count IN
        process_content.INCount = temp.count('IN')
        # count JJ/NN
        if process_content.NNCount == 0:
            process_content.JJperNN = 0
        else:
            process_content.JJperNN = process_content.JJCount / process_content.NNCount
    process_content()
    VBlist.append(process_content.VBCount)
    JJperNN.append(process_content.JJperNN)
    # calc NN per S
    NNperW = process_content.NNCount / sent_len
    # append NN per S
    NNWlist.append(NNperW)
    n = n + 1
#### Plot fake data
plt.scatter(x=NNWlist, y=JJperNN, c='red')
# analyze real data to compare
with open('data/rand_pmids.txt', 'r') as fakeid:
    fakeid = fakeid.readlines()
    fakeid = ''.join(fakeid).replace("PMID:", "")
idlist = fakeid.splitlines()  # by here each PMID is its own item in the list
n = 0
VBlist = []
VBZlist = []
NNlist = []
JJlist = []
INlist = []
JJperNN = []
SLenList = []
NNWlist = []
while n <= 140:
    workingid = idlist[n]
    workingfile = "data/rand/{}.txt"
    currentfile = workingfile.format(workingid)
    with open(currentfile, 'rt') as file:
        data = file.read().splitlines()
    temp = ''.join(data)
    title = between(temp, 'title { name "', '." }, authors {')
    # find sentence length
    sent_len = len(title.split())
    abstract = between(temp, 'abstract "', '.", mesh {')
    # PoS tagging
    tokenizedtitle = custom_sent_tokenizer.tokenize(title)
    def process_content():
        for i in tokenizedtitle:
            words = nltk.word_tokenize(i)
            tagged = nltk.pos_tag(words)
        # process title
        temp = str(tagged).strip('[]')
        print(temp)
        # count verbs
        process_content.VBCount = temp.count('VB')
        # count VBZ
        process_content.VBZCount = temp.count('VBZ')
        # count NN
        process_content.NNCount = temp.count('NN')
        # count JJ
        process_content.JJCount = temp.count('JJ')
        # count IN
        process_content.INCount = temp.count('IN')
        # count JJ/NN
        if process_content.NNCount == 0:
            process_content.JJperNN = 0
        else:
            process_content.JJperNN = process_content.JJCount / process_content.NNCount
    process_content()
    JJperNN.append(process_content.JJperNN)
    # calc NN per S
    NNperW = process_content.NNCount / sent_len
    # append NN per S
    NNWlist.append(NNperW)
    n = n + 1
# plot everything
plt.scatter(x=NNWlist, y=JJperNN, c='blue')
plt.xlabel('Nouns per Word')
plt.ylabel('Adjectives per Noun')
plt.show()
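A subtlety in the script above: `temp.count('VB')` counts substrings of the stringified tag list, so every 'VBZ' also increments the 'VB' total (and 'NNS'/'NNP' the 'NN' total, 'JJR'/'JJS' the 'JJ' total). Counting over an actual tag list makes the prefix matching explicit, and the two plotted features reduce to one small function (a sketch; `tag_ratios` is an illustrative name):

```python
def tag_ratios(tags):
    """Nouns per word and adjectives per noun for one title's PoS tags."""
    nn = sum(1 for t in tags if t.startswith('NN'))
    jj = sum(1 for t in tags if t.startswith('JJ'))
    return nn / len(tags), (jj / nn if nn else 0.0)
```

Feeding it the tag column of `nltk.pos_tag(words)` output, e.g. `tag_ratios([t for _, t in tagged])`, would replace the string-count bookkeeping in `process_content`.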
| 36.372549 | 99 | 0.519137 | 572 | 5,565 | 4.940559 | 0.258741 | 0.143666 | 0.059448 | 0.015924 | 0.762916 | 0.729653 | 0.729653 | 0.710545 | 0.710545 | 0.710545 | 0 | 0.005775 | 0.377718 | 5,565 | 152 | 100 | 36.611842 | 0.81028 | 0.092722 | 0 | 0.720721 | 0 | 0 | 0.061029 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027027 | false | 0 | 0.072072 | 0 | 0.108108 | 0.009009 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ad60c13491545de9f7bb653d3b1b30cebe3f41c6 | 5,873 | py | Python | tests/test_builder.py | dusktreader/sphinx-view | ad877834c4cf9235a6249628907a73d569ff051a | [
"MIT"
] | 3 | 2017-11-29T13:39:25.000Z | 2020-06-20T10:23:29.000Z | tests/test_builder.py | dusktreader/sphinx-view | ad877834c4cf9235a6249628907a73d569ff051a | [
"MIT"
] | 24 | 2016-12-11T18:01:02.000Z | 2021-09-30T16:39:36.000Z | tests/test_builder.py | dusktreader/sphinx-view | ad877834c4cf9235a6249628907a73d569ff051a | [
"MIT"
] | 5 | 2017-03-05T03:33:39.000Z | 2020-04-25T00:15:17.000Z | import os
import sys
from sview.builder import Builder
class TestBuilder:
    def test_init_for_single(self, tmpdir, find_data_file):
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": find_data_file("single.rst"),
        }
        builder = Builder(**config)
        assert builder is not None
        assert builder.working_dir == str(tmpdir)
        assert builder.target == find_data_file("single.rst")
        assert builder.src_dir == os.path.join(str(tmpdir), "src")
        assert builder.build_dir == os.path.join(str(tmpdir), "build")

    def test_init_for_dir(self, tmpdir, find_data_file):
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": find_data_file("directory"),
        }
        builder = Builder(**config)
        assert builder is not None
        assert builder.working_dir == str(tmpdir)
        assert builder.target == find_data_file("directory")
        assert builder.src_dir == os.path.join(str(tmpdir), "src")
        assert builder.build_dir == os.path.join(str(tmpdir), "build")

    def test_init_for_package(self, tmpdir, find_data_file):
        target = find_data_file("package")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
            "PACKAGE": True,
            "PACKAGE_DOCS": "docs",
        }
        builder = Builder(**config)
        assert builder is not None
        assert builder.working_dir == str(tmpdir)
        assert builder.target == os.path.join(target, "docs")
        assert builder.build_dir == os.path.join(str(tmpdir), "build")
        assert os.path.exists(os.path.join(str(tmpdir), "env"))
        assert sys.prefix == os.path.join(str(tmpdir), "env")

    def test_copy_dir(self, tmpdir, find_data_file):
        target = find_data_file("package")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
            "PACKAGE": True,
            "PACKAGE_DOCS": "docs",
        }
        builder = Builder(**config)
        builder.remake_dirs()
        builder.copy_dir()
        assert os.path.exists(os.path.join(str(tmpdir), "src", "index.rst"))

    def test_copy_file(self, tmpdir, find_data_file):
        target = find_data_file("single.rst")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
        }
        builder = Builder(**config)
        builder.remake_dirs()
        builder.copy_file()
        final_path = os.path.join(str(tmpdir), "src", "index.rst")
        assert os.path.exists(final_path)
        with open(target) as original_file:
            with open(final_path) as final_file:
                assert original_file.read() == final_file.read()

    def test_copy_literal_includes(self, tmpdir, find_data_file):
        target = find_data_file("package")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
            "PACKAGE": True,
            "PACKAGE_DOCS": "docs",
        }
        builder = Builder(**config)
        builder.remake_dirs()
        builder.copy_dir()
        builder.copy_literal_includes()
        assert os.path.exists(os.path.join(str(tmpdir), "src", "dummy.py"))
        final_path = os.path.join(str(tmpdir), "src", "index.rst")
        with open(final_path) as final_file:
            assert ".. literalinclude:: dummy.py" in final_file.read()

    def test_fetch_ext_from_index(self, tmpdir, find_data_file):
        target = find_data_file("single.rst")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
        }
        builder = Builder(**config)
        builder.remake_dirs()
        builder.copy_file()
        assert builder.fetch_ext_from_index() == ".rst"
        target = find_data_file("single.md")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
        }
        builder = Builder(**config)
        builder.remake_dirs()
        builder.copy_file()
        assert builder.fetch_ext_from_index() == ".md"

    def test_build_conf_file(self, tmpdir, find_data_file):
        target = find_data_file("single.rst")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
        }
        builder = Builder(**config)
        builder.remake_dirs()
        builder.copy_file()
        builder.build_conf_file()
        final_path = os.path.join(str(tmpdir), "src", "conf.py")
        with open(final_path) as final_file:
            assert "source_suffix = '.rst'" in final_file.read()

    def test_build_single(self, tmpdir, find_data_file):
        target = find_data_file("single.rst")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
        }
        builder = Builder(**config)
        builder.build()
        final_path = os.path.join(str(tmpdir), "build", "index.html")
        assert os.path.exists(final_path)

    def test_build_directory(self, tmpdir, find_data_file):
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": find_data_file("directory"),
        }
        builder = Builder(**config)
        builder.build()
        final_path = os.path.join(str(tmpdir), "build", "index.html")
        assert os.path.exists(final_path)

    def test_build_package(self, tmpdir, find_data_file):
        target = find_data_file("package")
        config = {
            "WORKING_DIR": str(tmpdir),
            "TARGET": target,
            "PACKAGE": True,
            "PACKAGE_DOCS": "docs",
        }
        builder = Builder(**config)
        builder.build()
        build_dir = os.path.join(str(tmpdir), "build")
        assert os.path.exists(os.path.join(build_dir, "index.html"))
        assert os.path.exists(os.path.join(build_dir, "modules.html"))
        assert os.path.exists(os.path.join(build_dir, "svdummy.html"))
ad78bf9db59f447105067116f9d4a962fdd70b18 | 446 | py | Python | recognizer/pddl/state.py | RukNdf/MA-Landmark | 4038ebe7edc9e353e1987479f5f9edc528a4bd2a | [
"Unlicense"
] | null | null | null | recognizer/pddl/state.py | RukNdf/MA-Landmark | 4038ebe7edc9e353e1987479f5f9edc528a4bd2a | [
"Unlicense"
] | null | null | null | recognizer/pddl/state.py | RukNdf/MA-Landmark | 4038ebe7edc9e353e1987479f5f9edc528a4bd2a | [
"Unlicense"
] | null | null | null | # -----------------------------------------------
# Applicable
# -----------------------------------------------
def applicable(state, positive, negative):
    return positive.issubset(state) and not negative.intersection(state)
# -----------------------------------------------
# Apply
# -----------------------------------------------
def apply(state, positive, negative):
    return frozenset(state.difference(negative).union(positive))
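A minimal usage sketch of the two helpers above, progressing a STRIPS-style state through one action (the fact names and the local copies of `applicable`/`apply` are illustrative; in `apply`, `negative` plays the role of the delete list and `positive` the add list):

```python
def applicable(state, positive, negative):
    # Action is applicable when its positive preconditions hold
    # and none of its negative preconditions do.
    return positive.issubset(state) and not negative.intersection(state)

def apply(state, positive, negative):
    # Progress the state: remove deleted facts, then add new ones.
    return frozenset(state.difference(negative).union(positive))

state = frozenset({'at-a', 'fuel'})
move = dict(pre_pos=frozenset({'at-a'}), pre_neg=frozenset({'at-b'}),
            add=frozenset({'at-b'}), delete=frozenset({'at-a'}))

if applicable(state, move['pre_pos'], move['pre_neg']):
    state = apply(state, move['add'], move['delete'])
```

After the move, `state` holds `{'at-b', 'fuel'}`: the robot's location fact was deleted and replaced, while unrelated facts carried over.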
| 26.235294 | 72 | 0.419283 | 28 | 446 | 6.678571 | 0.5 | 0.139037 | 0.224599 | 0.28877 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091928 | 446 | 16 | 73 | 27.875 | 0.461728 | 0.466368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 6 |
ad8944ac723600ec12c5c0b8cf19ea2c38139159 | 3,449 | py | Python | PasswordGenerator/password_generator.py | Unbewohnte/Password-Generator-site | 7b331d6f1b5335cea872666b6d22636990159d49 | [
"MIT"
] | null | null | null | PasswordGenerator/password_generator.py | Unbewohnte/Password-Generator-site | 7b331d6f1b5335cea872666b6d22636990159d49 | [
"MIT"
] | null | null | null | PasswordGenerator/password_generator.py | Unbewohnte/Password-Generator-site | 7b331d6f1b5335cea872666b6d22636990159d49 | [
"MIT"
] | null | null | null | import random
import string
digits = list(string.digits*10 + string.ascii_lowercase*10)
uppercase = list(string.ascii_uppercase + string.ascii_lowercase)*30
lowercase = list(string.ascii_lowercase)*30
punct = list(string.ascii_lowercase*10 + string.punctuation*5)
upp_dig = list(string.digits*10 + string.ascii_lowercase*10 + string.ascii_uppercase*10)
upp_dig_pun = list(string.digits*10 + string.ascii_lowercase*10 + string.ascii_uppercase*10 + string.punctuation*5)
digits_pun = list(string.ascii_lowercase*10 + string.digits*10 + string.punctuation*5)
upp_pun = list(string.ascii_lowercase*10 + string.ascii_uppercase*10 +string.punctuation*5)
# Yeah, spaghetti code, but couldn't come up with a better idea
def generate(LEN, flags=None, how_many=None):
    passwords = []
    U, D, P = None, None, None
    for flag_dict in flags:
        if flag_dict['checkbox'] == 'Upper_reg' and flag_dict['value'] == 'y':
            U = True
        elif flag_dict['checkbox'] == 'Digits' and flag_dict['value'] == 'y':
            D = True
        elif flag_dict['checkbox'] == 'Punctuation' and flag_dict['value'] == 'y':
            P = True
    if U == None and D == None and P == None:
        for password in range(int(how_many)):
            password = random.sample(lowercase, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    elif U == True and D == None and P == None:
        for password in range(int(how_many)):
            password = random.sample(uppercase, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    elif U == None and D == True and P == None:
        for password in range(int(how_many)):
            password = random.sample(digits, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    elif U == None and D == None and P == True:
        for password in range(int(how_many)):
            password = random.sample(punct, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    elif U == True and D == True and P == None:
        for password in range(int(how_many)):
            password = random.sample(upp_dig, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    elif U == None and D == True and P == True:
        for password in range(int(how_many)):
            password = random.sample(digits_pun, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    elif U == True and D == None and P == True:
        for password in range(int(how_many)):
            password = random.sample(upp_pun, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    elif U == True and D == True and P == True:
        for password in range(int(how_many)):
            password = random.sample(upp_dig_pun, int(LEN))
            password = ''.join(password)
            passwords.append(password)
        return passwords
    else:
        print("Error!")
| 40.576471 | 116 | 0.575239 | 413 | 3,449 | 4.709443 | 0.150121 | 0.067866 | 0.082262 | 0.074036 | 0.813882 | 0.743445 | 0.726992 | 0.702828 | 0.682262 | 0.682262 | 0 | 0.014388 | 0.314874 | 3,449 | 84 | 117 | 41.059524 | 0.808718 | 0.017686 | 0 | 0.457143 | 0 | 0 | 0.022404 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014286 | false | 0.585714 | 0.028571 | 0 | 0.157143 | 0.014286 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
a8e89dc3ba0a767a8548b026ffc73127894e2e30 | 38,858 | py | Python | test/test_lca.py | MichaelTeti/lca-pytorch | 702ad3caba159390f4a031a3386587ff0cb79d12 | [
"MIT"
] | null | null | null | test/test_lca.py | MichaelTeti/lca-pytorch | 702ad3caba159390f4a031a3386587ff0cb79d12 | [
"MIT"
] | 7 | 2022-03-13T21:42:36.000Z | 2022-03-29T22:51:23.000Z | test/test_lca.py | MichaelTeti/lca-pytorch | 702ad3caba159390f4a031a3386587ff0cb79d12 | [
"MIT"
] | null | null | null | from tempfile import TemporaryDirectory
import unittest
import torch
from torch.testing import assert_close
from lcapt.lca import LCAConv1D, LCAConv2D, LCAConv3D
class TestLCA(unittest.TestCase):
def test_LCAConv1D_to_correct_input_shape_raises_ValueError(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir)
inputs_4d = torch.zeros(1, 3, 10, 10)
with self.assertRaises(ValueError):
lca._to_correct_input_shape(inputs_4d)
def test_LCAConv2D_to_correct_input_shape_raises_ValueError(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir)
inputs_3d = torch.zeros(1, 3, 10)
with self.assertRaises(ValueError):
lca._to_correct_input_shape(inputs_3d)
def test_LCAConv3D_to_correct_input_shape_raises_ValueError(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir)
inputs_4d = torch.zeros(1, 3, 10, 10)
with self.assertRaises(ValueError):
lca._to_correct_input_shape(inputs_4d)
def test_LCAConv1D_get_weights_returns_correct_shape(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5)
weights = lca.get_weights()
self.assertTrue(len(weights.shape) == 3)
self.assertTupleEqual(weights.numpy().shape, (10, 3, 5))
def test_LCAConv2D_get_weights_returns_correct_shape(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7)
weights = lca.get_weights()
self.assertTrue(len(weights.shape) == 4)
self.assertTupleEqual(weights.numpy().shape, (10, 3, 5, 7))
def test_LCAConv3D_get_weights_returns_correct_shape(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9)
weights = lca.get_weights()
self.assertTrue(len(weights.shape) == 5)
self.assertTupleEqual(weights.numpy().shape, (10, 3, 9, 5, 7))
def test_LCAConv1D_assign_weight_values(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5)
new_weights = torch.randn(10, 3, 5)
lca.assign_weight_values(new_weights)
assert_close(new_weights, lca.get_weights(), rtol=0, atol=0)
def test_LCAConv2D_assign_weight_values(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7)
new_weights = torch.randn(10, 3, 5, 7)
lca.assign_weight_values(new_weights)
assert_close(new_weights, lca.get_weights(), rtol=0, atol=0)
def test_LCAConv3D_assign_weight_values(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9)
new_weights = torch.randn(10, 3, 9, 5, 7)
lca.assign_weight_values(new_weights)
assert_close(new_weights, lca.get_weights(), rtol=0, atol=0)
def test_LCAConv1D_normalize_weights(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5)
new_weights = torch.rand(10, 3, 5) * 10 + 10
lca.assign_weight_values(new_weights)
lca.normalize_weights()
for feat in lca.get_weights():
self.assertAlmostEqual(feat.norm(2).item(), 1.0, 4)
def test_LCAConv2D_normalize_weights(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7)
new_weights = torch.rand(10, 3, 5, 7) * 10 + 10
lca.assign_weight_values(new_weights)
lca.normalize_weights()
for feat in lca.get_weights():
self.assertAlmostEqual(feat.norm(2).item(), 1.0, 4)
def test_LCAConv3D_normalize_weights(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9)
new_weights = torch.rand(10, 3, 9, 5, 7) * 10 + 10
lca.assign_weight_values(new_weights)
lca.normalize_weights()
for feat in lca.get_weights():
self.assertAlmostEqual(feat.norm(2).item(), 1.0, 4)
def test_LCAConv1D_initial_weights_are_normalized(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=25)
for feat in lca.get_weights():
self.assertAlmostEqual(feat.norm(2).item(), 1.0, 4)
def test_LCAConv2D_initial_weights_are_normalized(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7)
for feat in lca.get_weights():
self.assertAlmostEqual(feat.norm(2).item(), 1.0, 4)
def test_LCAConv3D_initial_weights_are_normalized(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9)
for feat in lca.get_weights():
self.assertAlmostEqual(feat.norm(2).item(), 1.0, 4)
def test_compute_input_pad_raises_ValueError(self):
with TemporaryDirectory() as tmp_dir:
with self.assertRaises(ValueError):
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, pad="weird_padding")
def test_LCAConv1D_input_padding_shape_same_padding(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5)
self.assertTupleEqual(lca.input_pad, (2, 0, 0))
def test_LCAConv2D_input_padding_shape_same_padding(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7)
self.assertTupleEqual(lca.input_pad, (0, 2, 3))
def test_LCAConv3D_input_padding_shape_same_padding(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9)
self.assertTupleEqual(lca.input_pad, (4, 2, 3))
def test_LCAConv1D_input_padding_shape_valid_padding(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5, pad="valid")
self.assertTupleEqual(lca.input_pad, (0, 0, 0))
def test_LCAConv2D_input_padding_shape_valid_padding(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, pad="valid")
self.assertTupleEqual(lca.input_pad, (0, 0, 0))
def test_LCAConv3D_input_padding_shape_valid_padding(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9, pad="valid")
self.assertTupleEqual(lca.input_pad, (0, 0, 0))
def test_LCAConv1D_code_shape_stride_1_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5, lca_iters=3)
inputs = torch.randn(1, 3, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 100))
def test_LCAConv2D_code_shape_stride_1_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, lca_iters=3)
inputs = torch.randn(1, 3, 100, 99)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 100, 99))
def test_LCAConv3D_code_shape_stride_1_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9, lca_iters=3)
inputs = torch.randn(1, 3, 8, 100, 101)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 8, 100, 101))
def test_LCAConv1D_code_shape_stride_2_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5, lca_iters=3, stride_t=2)
inputs = torch.randn(1, 3, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 50))
def test_LCAConv2D_code_shape_stride_2_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, lca_iters=3, stride_h=2, stride_w=2)
inputs = torch.randn(1, 3, 100, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 50, 50))
def test_LCAConv3D_code_shape_stride_2_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10, 3, tmp_dir, 5, 7, 9, lca_iters=3, stride_h=2, stride_w=2, stride_t=2
)
inputs = torch.randn(1, 3, 8, 100, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 4, 50, 50))
def test_LCAConv1D_code_shape_stride_4_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5, lca_iters=3, stride_t=4)
inputs = torch.randn(1, 3, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 25))
def test_LCAConv2D_code_shape_stride_4_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, lca_iters=3, stride_h=4, stride_w=4)
inputs = torch.randn(1, 3, 100, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 25, 25))
def test_LCAConv3D_code_shape_stride_4_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10, 3, tmp_dir, 5, 7, 9, lca_iters=3, stride_h=4, stride_w=4, stride_t=4
)
inputs = torch.randn(1, 3, 8, 100, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 2, 25, 25))
def test_LCAConv1D_recon_shape_stride_1_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, kt=5, lca_iters=3, return_all=True)
inputs = torch.randn(1, 3, 100)
recon = lca(inputs)[1]
assert_close(recon[..., -1].shape, inputs.shape, rtol=0, atol=0)
def test_LCAConv2D_recon_shape_stride_1_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, lca_iters=3, return_all=True)
inputs = torch.randn(1, 3, 100, 100)
recon = lca(inputs)[1]
assert_close(inputs.shape, recon[..., -1].shape, rtol=0, atol=0)
def test_LCAConv3D_recon_shape_stride_1_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9, lca_iters=3, return_all=True)
inputs = torch.randn(1, 3, 8, 100, 100)
recon = lca(inputs)[1]
assert_close(inputs.shape, recon[..., -1].shape, rtol=0, atol=0)
def test_LCAConv1D_recon_shape_stride_2_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, 5, 2, lca_iters=3, return_all=True)
inputs = torch.randn(1, 3, 100)
recon = lca(inputs)[1]
assert_close(recon[..., -1].shape, inputs.shape, rtol=0, atol=0)
def test_LCAConv2D_recon_shape_stride_2_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, 2, 2, lca_iters=3, return_all=True)
inputs = torch.randn(1, 3, 100, 100)
recon = lca(inputs)[1]
assert_close(inputs.shape, recon[..., -1].shape, rtol=0, atol=0)
def test_LCAConv3D_recon_shape_stride_2_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10, 3, tmp_dir, 5, 7, 9, 2, 2, 2, lca_iters=3, return_all=True
)
inputs = torch.randn(1, 3, 8, 100, 100)
recon = lca(inputs)[1]
assert_close(inputs.shape, recon[..., -1].shape, rtol=0, atol=0)
def test_LCAConv1D_recon_shape_stride_4_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, 5, 4, lca_iters=3, return_all=True)
inputs = torch.randn(1, 3, 100)
recon = lca(inputs)[1]
assert_close(recon[..., -1].shape, inputs.shape, rtol=0, atol=0)
def test_LCAConv2D_recon_shape_stride_4_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, 4, 4, lca_iters=3, return_all=True)
inputs = torch.randn(1, 3, 100, 100)
recon = lca(inputs)[1]
assert_close(inputs.shape, recon[..., -1].shape, rtol=0, atol=0)
def test_LCAConv3D_recon_shape_stride_4_pad_same(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10, 3, tmp_dir, 5, 7, 9, 4, 4, 4, lca_iters=3, return_all=True
)
inputs = torch.randn(1, 3, 8, 100, 100)
recon = lca(inputs)[1]
assert_close(inputs.shape, recon[..., -1].shape, rtol=0, atol=0)
def test_LCAConv1D_recon_shape_stride_1_pad_valid(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(
10, 3, tmp_dir, 100, pad="valid", lca_iters=3, return_all=True
)
inputs = torch.randn(1, 3, 100)
recon = lca(inputs)[1]
self.assertTupleEqual(recon[..., -1].numpy().shape, (1, 3, 100))
def test_LCAConv2D_recon_shape_stride_1_pad_valid(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(
10, 3, tmp_dir, 100, 100, lca_iters=3, return_all=True, pad="valid"
)
inputs = torch.randn(1, 3, 100, 100)
recon = lca(inputs)[1]
self.assertTupleEqual(recon[..., -1].numpy().shape, (1, 3, 100, 100))
def test_LCAConv3D_recon_shape_stride_1_pad_valid(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10, 3, tmp_dir, 20, 20, 4, pad="valid", lca_iters=3, return_all=True
)
inputs = torch.randn(1, 3, 4, 20, 20)
recon = lca(inputs)[1]
self.assertTupleEqual(recon[..., -1].numpy().shape, (1, 3, 4, 20, 20))
def test_LCAConv1D_code_shape_stride_1_pad_valid(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, 100, pad="valid", lca_iters=3)
inputs = torch.randn(1, 3, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 1))
def test_LCAConv2D_code_shape_stride_1_pad_valid(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 100, 100, lca_iters=3, pad="valid")
inputs = torch.randn(1, 3, 100, 100)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 1, 1))
def test_LCAConv3D_code_shape_stride_1_pad_valid(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 20, 20, 4, pad="valid", lca_iters=3)
inputs = torch.randn(1, 3, 4, 20, 20)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 1, 1, 1))
def test_LCAConv3D_code_shape_no_time_pad(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 7, 7, 5, lca_iters=3, no_time_pad=True)
inputs = torch.randn(1, 3, 5, 20, 20)
code = lca(inputs)
self.assertTupleEqual(code.numpy().shape, (1, 10, 1, 20, 20))
def test_LCAConv3D_recon_shape_no_time_pad(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10, 3, tmp_dir, 7, 7, 5, lca_iters=3, no_time_pad=True, return_all=True
)
inputs = torch.randn(1, 3, 5, 20, 20)
recon = lca(inputs)[1]
self.assertTupleEqual(recon[..., -1].numpy().shape, inputs.numpy().shape)
def test_LCAConv1D_gradient(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, 5, lca_iters=3, req_grad=True)
inputs = torch.randn(1, 3, 100)
with torch.no_grad():
code = lca(inputs)
loss = code.sum()
with self.assertRaises(RuntimeError):
loss.backward()
code = lca(inputs)
loss = code.sum()
loss.backward()
def test_LCAConv2D_gradient(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 5, 7, lca_iters=3, req_grad=True)
inputs = torch.randn(1, 3, 20, 20)
with torch.no_grad():
code = lca(inputs)
loss = code.sum()
with self.assertRaises(RuntimeError):
loss.backward()
code = lca(inputs)
loss = code.sum()
loss.backward()
def test_LCAConv3D_gradient(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 5, 7, 9, lca_iters=3, req_grad=True)
inputs = torch.randn(1, 3, 5, 20, 20)
with torch.no_grad():
code = lca(inputs)
loss = code.sum()
with self.assertRaises(RuntimeError):
loss.backward()
code = lca(inputs)
loss = code.sum()
loss.backward()
def test_LCAConv1D_code_feature_as_input(self):
with TemporaryDirectory() as tmp_dir:
for lambda_ in torch.arange(0.1, 1.0, 0.1):
lca = LCAConv1D(
10, 3, tmp_dir, 100, pad="valid", input_norm=False, lambda_=lambda_
)
inputs = lca.get_weights()[0].unsqueeze(0)
code = lca(inputs)
code = code.squeeze()
code = torch.sort(code, descending=True, stable=True)[0]
self.assertEqual(torch.count_nonzero(code), 1)
assert_close(code[0], code.max())
def test_LCAConv1D_recon_close_to_input_feature_as_input(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(
10,
3,
tmp_dir,
100,
pad="valid",
input_norm=False,
lambda_=0.1,
return_all=True,
)
inputs = lca.get_weights()[0].unsqueeze(0)
recon = lca(inputs)[1]
mae = (inputs - recon[..., -1]).abs().mean().item()
self.assertLess(mae, 5.5e-3)
def test_LCAConv2D_code_feature_as_input(self):
with TemporaryDirectory() as tmp_dir:
for lambda_ in torch.arange(0.1, 1.0, 0.1):
lca = LCAConv2D(
10,
3,
tmp_dir,
10,
10,
pad="valid",
input_norm=False,
lambda_=lambda_,
)
inputs = lca.get_weights()[0].unsqueeze(0)
code = lca(inputs)
code = code.squeeze()
code = torch.sort(code, descending=True, stable=True)[0]
self.assertEqual(torch.count_nonzero(code), 1)
assert_close(code[0], code.max())
def test_LCAConv2D_recon_close_to_input_feature_as_input(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(
10,
3,
tmp_dir,
10,
10,
pad="valid",
input_norm=False,
lambda_=0.1,
return_all=True,
)
inputs = lca.get_weights()[0].unsqueeze(0)
recon = lca(inputs)[1]
mae = (inputs - recon[..., -1]).abs().mean().item()
self.assertLess(mae, 5.5e-3)
def test_LCAConv3D_code_feature_as_input(self):
with TemporaryDirectory() as tmp_dir:
for lambda_ in torch.arange(0.1, 1.0, 0.1):
lca = LCAConv3D(
10,
3,
tmp_dir,
10,
10,
10,
pad="valid",
input_norm=False,
lambda_=lambda_,
)
inputs = lca.get_weights()[0].unsqueeze(0)
code = lca(inputs)
code = code.squeeze()
code = torch.sort(code, descending=True, stable=True)[0]
self.assertEqual(torch.count_nonzero(code), 1)
assert_close(code[0], code.max())
def test_LCAConv3D_recon_close_to_input_feature_as_input(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10,
3,
tmp_dir,
10,
10,
10,
pad="valid",
input_norm=False,
lambda_=0.1,
return_all=True,
)
inputs = lca.get_weights()[0].unsqueeze(0)
recon = lca(inputs)[1]
mae = (inputs - recon[..., -1]).abs().mean().item()
self.assertLess(mae, 5.5e-3)
def test_LCAConv1D_compute_lateral_connectivity_stride_1_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(1, 50, 2):
lca = LCAConv1D(15, 3, tmp_dir, ksize)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, 0].numpy().shape, (15, 15, ksize * 2 - 1)
)
def test_LCAConv1D_compute_lateral_connectivity_stride_1_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(2, 52, 2):
lca = LCAConv1D(15, 3, tmp_dir, ksize, pad="valid")
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, 0].numpy().shape, (15, 15, ksize * 2 - 1)
)
def test_LCAConv1D_compute_lateral_connectivity_stride_2_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(1, 50, 2):
lca = LCAConv1D(15, 3, tmp_dir, ksize, 2)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(conns[..., 0, 0].numpy().shape, (15, 15, ksize))
def test_LCAConv1D_compute_lateral_connectivity_stride_2_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(2, 52, 2):
lca = LCAConv1D(15, 3, tmp_dir, ksize, 2, pad="valid")
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(conns[..., 0, 0].numpy().shape, (15, 15, ksize - 1))
def test_LCAConv1D_compute_lateral_connectivity_odd_ksize_various_strides(self):
with TemporaryDirectory() as tmp_dir:
ksize = 7
for stride, exp_size in zip(range(1, 8), [13, 7, 5, 3, 3, 3, 1]):
lca = LCAConv1D(15, 3, tmp_dir, ksize, stride)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(conns[..., 0, 0].numpy().shape, (15, 15, exp_size))
def test_LCAConv1D_compute_lateral_connectivity_even_ksize_various_strides(self):
with TemporaryDirectory() as tmp_dir:
ksize = 8
for stride, exp_size in zip(range(1, 9), [15, 7, 5, 3, 3, 3, 3, 1]):
lca = LCAConv1D(15, 3, tmp_dir, ksize, stride, pad="valid")
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(conns[..., 0, 0].numpy().shape, (15, 15, exp_size))
def test_LCAConv2D_compute_lateral_connectivity_stride_1_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(1, 50, 2):
ksize2 = ksize + 2
lca = LCAConv2D(15, 3, tmp_dir, ksize, ksize2)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, :, :].numpy().shape,
(15, 15, ksize * 2 - 1, ksize2 * 2 - 1),
)
def test_LCAConv2D_compute_lateral_connectivity_stride_1_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(2, 52, 2):
ksize2 = ksize + 2
lca = LCAConv2D(15, 3, tmp_dir, ksize, ksize2, pad="valid")
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, :, :].numpy().shape,
(15, 15, ksize * 2 - 1, ksize2 * 2 - 1),
)
def test_LCAConv2D_compute_lateral_connectivity_stride_2_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(1, 50, 2):
ksize2 = ksize + 2
lca = LCAConv2D(15, 3, tmp_dir, ksize, ksize2, 2, 2)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, :, :].numpy().shape, (15, 15, ksize, ksize2)
)
def test_LCAConv2D_compute_lateral_connectivity_stride_2_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(2, 52, 2):
ksize2 = ksize + 2
lca = LCAConv2D(15, 3, tmp_dir, ksize, ksize2, 2, 2, pad="valid")
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, :, :].numpy().shape, (15, 15, ksize - 1, ksize2 - 1)
)
def test_LCAConv2D_compute_lateral_connectivity_odd_ksize_various_strides(self):
with TemporaryDirectory() as tmp_dir:
ksize = 7
for stride, exp_size in zip(range(1, 8), [13, 7, 5, 3, 3, 3, 1]):
lca = LCAConv2D(15, 3, tmp_dir, ksize, ksize, stride, stride)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, :, :].numpy().shape, (15, 15, exp_size, exp_size)
)
def test_LCAConv2D_compute_lateral_connectivity_even_ksize_various_strides(self):
with TemporaryDirectory() as tmp_dir:
ksize = 8
for stride, exp_size in zip(range(1, 9), [15, 7, 5, 3, 3, 3, 3, 1]):
lca = LCAConv2D(
15, 3, tmp_dir, ksize, ksize, stride, stride, pad="valid"
)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns[..., 0, :, :].numpy().shape, (15, 15, exp_size, exp_size)
)
def test_LCAConv3D_compute_lateral_connectivity_stride_1_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(1, 11, 2):
ksize2 = ksize + 2
ksize3 = ksize + 4
lca = LCAConv3D(15, 3, tmp_dir, ksize, ksize2, ksize3)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns.numpy().shape,
(15, 15, ksize3 * 2 - 1, ksize * 2 - 1, ksize2 * 2 - 1),
)
def test_LCAConv3D_compute_lateral_connectivity_stride_1_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(2, 12, 2):
ksize2 = ksize + 2
ksize3 = ksize + 4
lca = LCAConv3D(15, 3, tmp_dir, ksize, ksize2, ksize3, pad="valid")
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns.numpy().shape,
(15, 15, ksize3 * 2 - 1, ksize * 2 - 1, ksize2 * 2 - 1),
)
def test_LCAConv3D_compute_lateral_connectivity_stride_2_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(1, 11, 2):
ksize2 = ksize + 2
ksize3 = ksize + 4
lca = LCAConv3D(15, 3, tmp_dir, ksize, ksize2, ksize3, 2, 2, 2)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(conns.numpy().shape, (15, 15, ksize3, ksize, ksize2))
def test_LCAConv3D_compute_lateral_connectivity_stride_2_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
for ksize in range(2, 12, 2):
ksize2 = ksize + 2
ksize3 = ksize + 4
lca = LCAConv3D(
15, 3, tmp_dir, ksize, ksize2, ksize3, 2, 2, 2, pad="valid"
)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns.numpy().shape, (15, 15, ksize3 - 1, ksize - 1, ksize2 - 1)
)
def test_LCAConv3D_compute_lateral_connectivity_odd_ksize_various_strides(self):
with TemporaryDirectory() as tmp_dir:
ksize = 7
for stride, exp_size in zip(range(1, 8), [13, 7, 5, 3, 3, 3, 1]):
lca = LCAConv3D(
15, 3, tmp_dir, ksize, ksize, ksize, stride, stride, stride
)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns.numpy().shape, (15, 15, exp_size, exp_size, exp_size)
)
def test_LCAConv3D_compute_lateral_connectivity_even_ksize_various_strides(self):
with TemporaryDirectory() as tmp_dir:
ksize = 8
for stride, exp_size in zip(range(1, 9), [15, 7, 5, 3, 3, 3, 3, 1]):
lca = LCAConv3D(
15,
3,
tmp_dir,
ksize,
ksize,
ksize,
stride,
stride,
stride,
pad="valid",
)
conns = lca.compute_lateral_connectivity(lca.weights.detach())
self.assertEqual(
conns.numpy().shape, (15, 15, exp_size, exp_size, exp_size)
)
def test_LCAConv3D_no_time_pad(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 7, 7, 7)
self.assertEqual(lca.input_pad[0], 3)
lca = LCAConv3D(10, 3, tmp_dir, 7, 7, 7, no_time_pad=True)
self.assertEqual(lca.input_pad[0], 0)
def test_l1_norm_of_code_decreases_with_increasing_lambda(self):
with TemporaryDirectory() as tmp_dir:
l1_norms = []
for lambda_ in torch.arange(0.1, 1.0, 0.1):
lca = LCAConv2D(
10,
3,
tmp_dir,
10,
10,
lambda_=lambda_,
pad="valid",
input_norm=False,
)
inputs = lca.get_weights()[0].unsqueeze(0)
code = lca(inputs)
l1_norms.append(code.norm(1).item())
self.assertEqual(l1_norms, sorted(l1_norms, reverse=True))
def test_recon_error_increases_with_increasing_lambda(self):
with TemporaryDirectory() as tmp_dir:
errors = []
for lambda_ in torch.arange(0.1, 1.1, 0.1):
lca = LCAConv2D(
10,
3,
tmp_dir,
10,
10,
lambda_=lambda_,
pad="valid",
input_norm=False,
return_all=True,
)
inputs = lca.get_weights()[0].unsqueeze(0)
recon_error = lca(inputs)[2]
errors.append(0.5 * recon_error[..., -1].norm(2) ** 2)
self.assertEqual(errors, sorted(errors))
def test_inputs_equal_recon_error_plus_recon_LCAConv1D(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(
10, 5, tmp_dir, 5, lca_iters=3, input_norm=False, return_all=True
)
inputs = torch.randn(3, 5, 100)
recon, recon_error = lca(inputs)[1:3]
assert_close(inputs, recon_error[..., -1] + recon[..., -1])
def test_inputs_equal_recon_error_plus_recon_LCAConv2D(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(
10, 5, tmp_dir, 5, 5, lca_iters=3, input_norm=False, return_all=True
)
inputs = torch.randn(3, 5, 100, 100)
recon, recon_error = lca(inputs)[1:3]
assert_close(inputs, recon_error[..., -1] + recon[..., -1])
def test_inputs_equal_recon_error_plus_recon_LCAConv3D(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(
10,
5,
tmp_dir,
5,
5,
3,
2,
2,
1,
lca_iters=3,
input_norm=False,
return_all=True,
)
inputs = torch.randn(3, 5, 10, 100, 100)
recon, recon_error = lca(inputs)[1:3]
assert_close(inputs, recon_error[..., -1] + recon[..., -1])
def test_LCAConv2D_check_conv_params_raises_AssertionError_odd_even_ksizes(self):
with TemporaryDirectory() as tmp_dir:
for ksize1 in range(2, 12, 2):
for ksize2 in range(3, 12, 2):
with self.assertRaises(AssertionError):
lca = LCAConv2D(10, 1, tmp_dir, ksize1, ksize2)
def test_LCAConv3D_check_conv_params_raises_AssertionError_odd_even_ksizes(self):
with TemporaryDirectory() as tmp_dir:
for ksize1 in range(2, 12, 2):
for ksize2 in range(3, 12, 2):
for ksize3 in range(2, 12, 2):
with self.assertRaises(AssertionError):
lca = LCAConv3D(10, 1, tmp_dir, ksize1, ksize2, ksize3)
def test_LCAConv1D_compute_inhib_pad_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
ksize = 8
for stride, exp_size in zip(range(1, 9), [7, 6, 6, 4, 5, 6, 7, 0]):
lca = LCAConv1D(10, 3, tmp_dir, ksize, stride, pad="valid")
self.assertEqual(lca.lat_conn_pad[0], exp_size)
def test_LCAConv1D_compute_inhib_pad_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
ksize = 9
for stride, exp_size in zip(range(1, 10), [8, 8, 6, 8, 5, 6, 7, 8, 0]):
lca = LCAConv1D(10, 3, tmp_dir, ksize, stride, pad="valid")
self.assertEqual(lca.lat_conn_pad[0], exp_size)
def test_LCAConv2D_compute_inhib_pad_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
ksize = 8
for stride, exp_size in zip(range(1, 9), [7, 6, 6, 4, 5, 6, 7, 0]):
lca = LCAConv2D(
10, 3, tmp_dir, ksize, ksize, stride, stride, pad="valid"
)
self.assertEqual(lca.lat_conn_pad[1:], (exp_size, exp_size))
def test_LCAConv2D_compute_inhib_pad_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
ksize = 9
for stride, exp_size in zip(range(1, 10), [8, 8, 6, 8, 5, 6, 7, 8, 0]):
lca = LCAConv2D(
10, 3, tmp_dir, ksize, ksize, stride, stride, pad="valid"
)
self.assertEqual(lca.lat_conn_pad[1:], (exp_size, exp_size))
def test_LCAConv3D_compute_inhib_pad_even_ksize(self):
with TemporaryDirectory() as tmp_dir:
ksize = 8
for stride, exp_size in zip(range(1, 9), [7, 6, 6, 4, 5, 6, 7, 0]):
lca = LCAConv3D(
10,
3,
tmp_dir,
ksize,
ksize,
ksize,
stride,
stride,
stride,
pad="valid",
)
self.assertEqual(lca.lat_conn_pad, (exp_size,) * 3)
def test_LCAConv3D_compute_inhib_pad_odd_ksize(self):
with TemporaryDirectory() as tmp_dir:
ksize = 9
for stride, exp_size in zip(range(1, 10), [8, 8, 6, 8, 5, 6, 7, 8, 0]):
lca = LCAConv3D(
10,
3,
tmp_dir,
ksize,
ksize,
ksize,
stride,
stride,
stride,
pad="valid",
)
self.assertEqual(lca.lat_conn_pad, (exp_size,) * 3)
def test_LCAConv1D_compute_inhib_pad_ksize_equal_1(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, 1)
self.assertEqual(lca.lat_conn_pad, (0, 0, 0))
def test_LCAConv2D_compute_inhib_pad_ksize_equal_1(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 1, 1)
self.assertEqual(lca.lat_conn_pad, (0, 0, 0))
def test_LCAConv3D_compute_inhib_pad_ksize_equal_1(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 1, 1, 1)
self.assertEqual(lca.lat_conn_pad, (0, 0, 0))
def test_LCAConv1D_compute_lateral_connectivity_ksize_equal_1(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv1D(10, 3, tmp_dir, 1)
conns = lca.compute_lateral_connectivity(lca.weights)
self.assertEqual(conns.numpy().shape, (10, 10, 1, 1, 1))
def test_LCAConv2D_compute_lateral_connectivity_ksize_equal_1(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv2D(10, 3, tmp_dir, 1, 1)
conns = lca.compute_lateral_connectivity(lca.weights)
self.assertEqual(conns.numpy().shape, (10, 10, 1, 1, 1))
def test_LCAConv3D_compute_lateral_connectivity_ksize_equal_1(self):
with TemporaryDirectory() as tmp_dir:
lca = LCAConv3D(10, 3, tmp_dir, 1, 1, 1)
conns = lca.compute_lateral_connectivity(lca.weights)
self.assertEqual(conns.numpy().shape, (10, 10, 1, 1, 1))
if __name__ == "__main__":
unittest.main()
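The `normalize_weights` and `initial_weights_are_normalized` tests above all assert the same invariant: each feature, flattened to a vector, has unit L2 norm (to four decimal places). The invariant itself is independent of torch; a minimal pure-Python sketch of it (helper name is illustrative):

```python
import math

def l2_normalize(vec):
    """Scale a flat feature vector to unit L2 norm (no-op on the zero vector)."""
    norm = math.sqrt(sum(x * x for x in vec))
    if norm == 0.0:
        return list(vec)
    return [x / norm for x in vec]

feat = [10.0 + 10.0 * i for i in range(5)]  # arbitrary un-normalized feature
unit = l2_normalize(feat)
print(round(math.sqrt(sum(x * x for x in unit)), 4))  # → 1.0
```

This is the property each `assertAlmostEqual(feat.norm(2).item(), 1.0, 4)` call checks, applied per feature along the weight tensor's first dimension.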
| 43.079823 | 88 | 0.557028 | 4,841 | 38,858 | 4.238381 | 0.03429 | 0.055853 | 0.120382 | 0.129642 | 0.95945 | 0.952091 | 0.932937 | 0.917828 | 0.891949 | 0.862024 | 0 | 0.069415 | 0.333419 | 38,858 | 901 | 89 | 43.127636 | 0.722724 | 0 | 0 | 0.664141 | 0 | 0 | 0.004658 | 0 | 0 | 0 | 0 | 0 | 0.132576 | 1 | 0.119949 | false | 0 | 0.006313 | 0 | 0.127525 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d171185f8777ee35a4cf795f4714d8c65b368cdf | 2,156 | py | Python | backend/tests/serializers/test_media_asset.py | druzhynin-oleksii/model_garden | 3599f5d0c81bc79139ceabed5dd647c29ccadc31 | [
"MIT"
] | 8 | 2020-09-10T18:28:19.000Z | 2022-02-22T03:41:14.000Z | backend/tests/serializers/test_media_asset.py | druzhynin-oleksii/model_garden | 3599f5d0c81bc79139ceabed5dd647c29ccadc31 | [
"MIT"
] | 1 | 2020-09-15T21:20:29.000Z | 2020-09-15T21:20:29.000Z | backend/tests/serializers/test_media_asset.py | druzhynin-oleksii/model_garden | 3599f5d0c81bc79139ceabed5dd647c29ccadc31 | [
"MIT"
] | 8 | 2020-09-10T16:29:35.000Z | 2022-01-25T15:05:03.000Z | from tests import BaseTestCase
from model_garden.constants import LabelingTaskStatus
from model_garden.serializers import MediaAssetSerializer
class TestMediaAssetSerializer(BaseTestCase):
    def test_serialize(self):
        media_asset = self.test_factory.create_media_asset()
        serializer = MediaAssetSerializer(media_asset)
        self.assertEqual(
            serializer.data,
            {
                'dataset_id': media_asset.dataset.id,
                'filename': media_asset.filename,
                'remote_path': media_asset.remote_path,
                'remote_label_path': None,
                'labeling_task_name': None,
            },
        )

    def test_serialize_with_remote_label_path(self):
        labeling_task = self.test_factory.create_labeling_task(status=LabelingTaskStatus.SAVED)
        media_asset = self.test_factory.create_media_asset()
        media_asset.labeling_task = labeling_task
        media_asset.labeling_asset_filepath = media_asset.remote_label_path
        media_asset.save(update_fields=('labeling_task', 'labeling_asset_filepath'))
        serializer = MediaAssetSerializer(media_asset)
        self.assertEqual(
            serializer.data,
            {
                'dataset_id': media_asset.dataset.id,
                'filename': media_asset.filename,
                'remote_path': media_asset.remote_path,
                'remote_label_path': media_asset.labeling_asset_filepath,
                'labeling_task_name': media_asset.labeling_task.name,
            },
        )

    def test_serialize_with_remote_label_path_null(self):
        labeling_task = self.test_factory.create_labeling_task(status=LabelingTaskStatus.SAVED)
        media_asset = self.test_factory.create_media_asset()
        media_asset.labeling_task = labeling_task
        media_asset.labeling_asset_filepath = None
        media_asset.save(update_fields=('labeling_task', 'labeling_asset_filepath'))
        serializer = MediaAssetSerializer(media_asset)
        self.assertEqual(
            serializer.data,
            {
                'dataset_id': media_asset.dataset.id,
                'filename': media_asset.filename,
                'remote_path': media_asset.remote_path,
                'remote_label_path': media_asset.labeling_asset_filepath,
                'labeling_task_name': media_asset.labeling_task.name,
            },
        )
| 33.6875 | 91 | 0.739332 | 246 | 2,156 | 6.065041 | 0.166667 | 0.19437 | 0.096515 | 0.070375 | 0.845845 | 0.835791 | 0.835791 | 0.788874 | 0.761394 | 0.761394 | 0 | 0 | 0.174861 | 2,156 | 63 | 92 | 34.222222 | 0.838673 | 0 | 0 | 0.607843 | 0 | 0 | 0.122449 | 0.021336 | 0 | 0 | 0 | 0 | 0.058824 | 1 | 0.058824 | false | 0 | 0.058824 | 0 | 0.137255 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0f033b7fa016a99c7f5dcbea6ea31f3765c30612 | 42 | py | Python | zested/gui/__init__.py | Klafyvel/ZestEd | 4b508aa750c8acf6c592a9a725e260aeff0e43f0 | [
"BSD-2-Clause"
] | 1 | 2016-01-27T17:44:27.000Z | 2016-01-27T17:44:27.000Z | zested/gui/__init__.py | Klafyvel/ZestEd | 4b508aa750c8acf6c592a9a725e260aeff0e43f0 | [
"BSD-2-Clause"
] | 3 | 2017-12-06T13:23:32.000Z | 2017-12-06T13:23:43.000Z | zested/gui/__init__.py | Luthaf/Zested | 6731ddf50a44f94569528520395309037317f69b | [
"BSD-2-Clause"
] | null | null | null | from .editor import *
from .main import *
| 14 | 21 | 0.714286 | 6 | 42 | 5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 42 | 2 | 22 | 21 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0f3aa1e8783ed9701524a08556722c5b189a3575 | 7,347 | py | Python | exercises/networking_selfpaced/networking-workshop/collections/ansible_collections/community/general/tests/unit/modules/messaging/rabbitmq/test_rabbitmq_user.py | tr3ck3r/linklight | 5060f624c235ecf46cb62cefcc6bddc6bf8ca3e7 | [
"MIT"
] | null | null | null | exercises/networking_selfpaced/networking-workshop/collections/ansible_collections/community/general/tests/unit/modules/messaging/rabbitmq/test_rabbitmq_user.py | tr3ck3r/linklight | 5060f624c235ecf46cb62cefcc6bddc6bf8ca3e7 | [
"MIT"
] | null | null | null | exercises/networking_selfpaced/networking-workshop/collections/ansible_collections/community/general/tests/unit/modules/messaging/rabbitmq/test_rabbitmq_user.py | tr3ck3r/linklight | 5060f624c235ecf46cb62cefcc6bddc6bf8ca3e7 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from ansible_collections.community.general.plugins.modules.messaging.rabbitmq import rabbitmq_user
from ansible_collections.community.general.tests.unit.compat.mock import patch
from ansible_collections.community.general.tests.unit.modules.utils import AnsibleExitJson, AnsibleFailJson, ModuleTestCase, set_module_args
class TestRabbitMQUserModule(ModuleTestCase):
    def setUp(self):
        super(TestRabbitMQUserModule, self).setUp()
        self.module = rabbitmq_user

    def tearDown(self):
        super(TestRabbitMQUserModule, self).tearDown()

    def _assert(self, exc, attribute, expected_value, msg=""):
        value = exc.message[attribute] if hasattr(exc, attribute) else exc.args[0][attribute]
        assert value == expected_value, msg

    def test_without_required_parameters(self):
        """Failure must occur when all parameters are missing"""
        with self.assertRaises(AnsibleFailJson):
            set_module_args({})
            self.module.main()

    def test_permissions_with_same_vhost(self):
        set_module_args({
            'user': 'someuser',
            'password': 'somepassword',
            'state': 'present',
            'permissions': [{'vhost': '/'}, {'vhost': '/'}],
        })
        with patch('ansible.module_utils.basic.AnsibleModule.get_bin_path') as get_bin_path:
            get_bin_path.return_value = '/rabbitmqctl'
            try:
                self.module.main()
            except AnsibleFailJson as e:
                self._assert(e, 'failed', True)
                self._assert(e, 'msg',
                             "Error parsing permissions: You can't have two permission dicts for the same vhost")

    @patch('ansible.module_utils.basic.AnsibleModule.get_bin_path')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.get')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.check_password')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.has_tags_modifications')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.has_permissions_modifications')
    def test_password_changes_only_when_needed(self, has_permissions_modifications, has_tags_modifications,
                                               check_password, get, get_bin_path):
        set_module_args({
            'user': 'someuser',
            'password': 'somepassword',
            'state': 'present',
            'update_password': 'always',
        })
        get.return_value = True
        get_bin_path.return_value = '/rabbitmqctl'
        check_password.return_value = True
        has_tags_modifications.return_value = False
        has_permissions_modifications.return_value = False
        try:
            self.module.main()
        except AnsibleExitJson as e:
            self._assert(e, 'changed', False)
            self._assert(e, 'state', 'present')

    @patch('ansible.module_utils.basic.AnsibleModule.get_bin_path')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser._exec')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser._get_permissions')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.has_tags_modifications')
    def test_same_permissions_not_changing(self, has_tags_modifications, _get_permissions, _exec, get_bin_path):
        set_module_args({
            'user': 'someuser',
            'password': 'somepassword',
            'state': 'present',
            'permissions': [{'vhost': '/', 'configure_priv': '.*', 'write_priv': '.*', 'read_priv': '.*'}],
        })
        _get_permissions.return_value = [{'vhost': '/', 'configure_priv': '.*', 'write_priv': '.*', 'read_priv': '.*'}]
        _exec.return_value = ['someuser\t[]']
        get_bin_path.return_value = '/rabbitmqctl'
        has_tags_modifications.return_value = False
        try:
            self.module.main()
        except AnsibleExitJson as e:
            self._assert(e, 'changed', False)
            self._assert(e, 'state', 'present')

    @patch('ansible.module_utils.basic.AnsibleModule.get_bin_path')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser._exec')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser._get_permissions')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.set_permissions')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.has_tags_modifications')
    def test_permissions_are_fixed(self, has_tags_modifications, set_permissions, _get_permissions, _exec, get_bin_path):
        set_module_args({
            'user': 'someuser',
            'password': 'somepassword',
            'state': 'present',
            'permissions': [{'vhost': '/', 'configure_priv': '.*', 'write_priv': '.*', 'read_priv': '.*'}],
        })
        set_permissions.return_value = None
        _get_permissions.return_value = []
        _exec.return_value = ['someuser\t[]']
        get_bin_path.return_value = '/rabbitmqctl'
        has_tags_modifications.return_value = False
        try:
            self.module.main()
        except AnsibleExitJson as e:
            self._assert(e, 'changed', True)
            self._assert(e, 'state', 'present')
        assert set_permissions.call_count == 1

    @patch('ansible.module_utils.basic.AnsibleModule.get_bin_path')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser._exec')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser._get_permissions')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.set_permissions')
    @patch('ansible_collections.community.general.plugins.modules.messaging.rabbitmq.rabbitmq_user.RabbitMqUser.has_tags_modifications')
    def test_permissions_are_fixed_with_different_host(self, has_tags_modifications, set_permissions, _get_permissions,
                                                       _exec, get_bin_path):
        set_module_args({
            'user': 'someuser',
            'password': 'somepassword',
            'state': 'present',
            'permissions': [{'vhost': '/', 'configure_priv': '.*', 'write_priv': '.*', 'read_priv': '.*'}],
        })
        set_permissions.return_value = None
        _get_permissions.return_value = [{'vhost': 'monitoring', 'configure_priv': '.*', 'write_priv': '.*', 'read_priv': '.*'}]
        _exec.return_value = ['someuser\t[]']
        get_bin_path.return_value = '/rabbitmqctl'
        has_tags_modifications.return_value = False
        try:
            self.module.main()
        except AnsibleExitJson as e:
            self._assert(e, 'changed', True)
            self._assert(e, 'state', 'present')
        assert set_permissions.call_count == 1
| 54.422222 | 143 | 0.676058 | 765 | 7,347 | 6.194771 | 0.152941 | 0.050644 | 0.102553 | 0.129141 | 0.783921 | 0.768939 | 0.747837 | 0.726947 | 0.713231 | 0.702469 | 0 | 0.000681 | 0.200898 | 7,347 | 134 | 144 | 54.828358 | 0.806507 | 0.010072 | 0 | 0.68595 | 0 | 0 | 0.383239 | 0.274116 | 0 | 0 | 0 | 0 | 0.123967 | 1 | 0.07438 | false | 0.082645 | 0.024793 | 0 | 0.107438 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
7e3a338ef49f0143bb51b37cb7087fb1b187a54e | 17 | py | Python | mklaren/util/__init__.py | tkemps/mklaren | d9e7890aaa26cb3877e1a82114ab1e52df595d96 | [
"BSD-2-Clause"
] | 3 | 2019-10-28T17:20:37.000Z | 2020-08-20T22:59:18.000Z | mklaren/util/__init__.py | tkemps/mklaren | d9e7890aaa26cb3877e1a82114ab1e52df595d96 | [
"BSD-2-Clause"
] | null | null | null | mklaren/util/__init__.py | tkemps/mklaren | d9e7890aaa26cb3877e1a82114ab1e52df595d96 | [
"BSD-2-Clause"
] | 1 | 2019-10-28T17:20:35.000Z | 2019-10-28T17:20:35.000Z | from .la import * | 17 | 17 | 0.705882 | 3 | 17 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 17 | 1 | 17 | 17 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.