hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
296fbd7a7affa6769704c5e942dffa0717fa9a8d | 77 | py | Python | tests/modules/core/test_error.py | spxtr/bumblebee-status | 45125f39af8323775aeabf809ae5ae80cfe3ccd9 | [
"MIT"
] | 1,089 | 2016-11-06T10:02:53.000Z | 2022-03-26T12:53:30.000Z | tests/modules/core/test_error.py | spxtr/bumblebee-status | 45125f39af8323775aeabf809ae5ae80cfe3ccd9 | [
"MIT"
] | 817 | 2016-11-05T05:42:39.000Z | 2022-03-25T19:43:52.000Z | tests/modules/core/test_error.py | spxtr/bumblebee-status | 45125f39af8323775aeabf809ae5ae80cfe3ccd9 | [
"MIT"
] | 317 | 2016-11-05T00:35:06.000Z | 2022-03-24T13:35:03.000Z | import pytest
def test_load_module():
    __import__("modules.core.error")
| 12.833333 | 36 | 0.74026 | 10 | 77 | 5.1 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 77 | 5 | 37 | 15.4 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0.236842 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.666667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
29734aebd0c0c968d950abb77d05337197f86833 | 6,095 | py | Python | tests/integration/fixtures/problem_fixtures.py | camUrban/PteraSoftware | cd5163aa2cc5ea76a906b271d9d9958ad99e57bd | [
"MIT"
] | 68 | 2020-07-18T20:49:37.000Z | 2022-03-23T21:07:35.000Z | tests/integration/fixtures/problem_fixtures.py | camUrban/PteraSoftware | cd5163aa2cc5ea76a906b271d9d9958ad99e57bd | [
"MIT"
] | 13 | 2020-08-20T04:56:44.000Z | 2022-03-27T14:34:02.000Z | tests/integration/fixtures/problem_fixtures.py | camUrban/PteraSoftware | cd5163aa2cc5ea76a906b271d9d9958ad99e57bd | [
"MIT"
] | 17 | 2020-07-23T22:26:31.000Z | 2022-03-27T12:34:27.000Z | """This module creates problem objects to be used as fixtures.
This module contains the following classes:
None
This module contains the following exceptions:
None
This module contains the following functions:
make_steady_validation_problem: This function creates a steady problem object to
be used as a fixture.
make_steady_multiple_wing_validation_problem: This function creates a steady
problem object with multi-wing geometry to be used as a fixture.
make_unsteady_validation_problem_with_static_geometry: This function creates an
unsteady problem object with static geometry to be used as a fixture.
make_unsteady_validation_problem_with_variable_geometry: This function creates an
unsteady problem object with variable geometry to be used as a fixture.
make_unsteady_validation_problem_with_multiple_wing_static_geometry: This
function creates an unsteady problem object with multi-wing, static geometry to
be used as a fixture.
make_unsteady_validation_problem_with_multiple_wing_variable_geometry: This
function creates an unsteady problem object with multi-wing, variable geometry to
be used as a fixture.
"""
import pterasoftware as ps
from tests.integration.fixtures import airplane_fixtures
from tests.integration.fixtures import movement_fixtures
from tests.integration.fixtures import operating_point_fixtures
def make_steady_validation_problem():
"""This function creates a steady problem object to be used as a fixture.
:return steady_validation_problem: SteadyProblem
This is the problem fixture.
"""
# Create the constructing fixtures.
steady_validation_airplane = airplane_fixtures.make_steady_validation_airplane()
steady_validation_operating_point = (
operating_point_fixtures.make_validation_operating_point()
)
# Create the problem fixture.
steady_validation_problem = ps.problems.SteadyProblem(
airplane=steady_validation_airplane,
operating_point=steady_validation_operating_point,
)
# Delete the constructing fixtures.
del steady_validation_airplane
del steady_validation_operating_point
# Return the problem fixture.
return steady_validation_problem
def make_steady_multiple_wing_validation_problem():
"""This function creates a steady problem object with multi-wing geometry to be
used as a fixture.
:return steady_validation_problem: SteadyProblem
This is the problem fixture.
"""
# Create the constructing fixtures.
steady_validation_airplane = (
airplane_fixtures.make_multiple_wing_steady_validation_airplane()
)
steady_validation_operating_point = (
operating_point_fixtures.make_validation_operating_point()
)
# Create the problem fixture.
steady_validation_problem = ps.problems.SteadyProblem(
airplane=steady_validation_airplane,
operating_point=steady_validation_operating_point,
)
# Delete the constructing fixtures.
del steady_validation_airplane
del steady_validation_operating_point
# Return the problem fixture.
return steady_validation_problem
def make_unsteady_validation_problem_with_static_geometry():
"""This function creates an unsteady problem object with static geometry to be
used as a fixture.
:return unsteady_validation_problem: UnsteadyProblem
This is the problem fixture.
"""
# Create the constructing fixture.
unsteady_validation_movement = movement_fixtures.make_static_validation_movement()
# Create the problem fixture.
unsteady_validation_problem = ps.problems.UnsteadyProblem(
movement=unsteady_validation_movement
)
# Delete the constructing fixture.
del unsteady_validation_movement
# Return the problem fixture.
return unsteady_validation_problem
def make_unsteady_validation_problem_with_variable_geometry():
"""This function creates an unsteady problem object with variable geometry to be
used as a fixture.
:return unsteady_validation_problem: UnsteadyProblem
This is the problem fixture.
"""
# Create the constructing fixture.
unsteady_validation_movement = movement_fixtures.make_variable_validation_movement()
# Create the problem fixture.
unsteady_validation_problem = ps.problems.UnsteadyProblem(
movement=unsteady_validation_movement
)
# Delete the constructing fixture.
del unsteady_validation_movement
# Return the problem fixture.
return unsteady_validation_problem
def make_unsteady_validation_problem_with_multiple_wing_static_geometry():
"""This function creates an unsteady problem object with multi-wing,
static geometry to be used as a fixture.
:return unsteady_validation_problem: UnsteadyProblem
This is the problem fixture.
"""
# Create the constructing fixture.
unsteady_validation_movement = (
movement_fixtures.make_multiple_wing_static_validation_movement()
)
# Create the problem fixture.
unsteady_validation_problem = ps.problems.UnsteadyProblem(
movement=unsteady_validation_movement
)
# Delete the constructing fixture.
del unsteady_validation_movement
# Return the problem fixture.
return unsteady_validation_problem
def make_unsteady_validation_problem_with_multiple_wing_variable_geometry():
"""This function creates an unsteady problem object with multi-wing, variable
geometry to be used as a fixture.
:return unsteady_validation_problem: UnsteadyProblem
This is the problem fixture.
"""
# Create the constructing fixture.
unsteady_validation_movement = (
movement_fixtures.make_multiple_wing_variable_validation_movement()
)
# Create the problem fixture.
unsteady_validation_problem = ps.problems.UnsteadyProblem(
movement=unsteady_validation_movement
)
# Delete the constructing fixture.
del unsteady_validation_movement
# Return the problem fixture.
return unsteady_validation_problem
| 32.945946 | 88 | 0.773749 | 710 | 6,095 | 6.349296 | 0.074648 | 0.127773 | 0.110914 | 0.028838 | 0.966504 | 0.952307 | 0.918589 | 0.918589 | 0.918589 | 0.918589 | 0 | 0 | 0.185562 | 6,095 | 184 | 89 | 33.125 | 0.908139 | 0.493683 | 0 | 0.548387 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.096774 | false | 0 | 0.064516 | 0 | 0.258065 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
469691059120c8d11608ea6e91756d85e3f904f8 | 24 | py | Python | jetpack/renoir/__init__.py | vasudevanv/jetpack | 3a1d4e30e164450586ad4d212361a5e30cc9eb7a | [
"MIT"
] | null | null | null | jetpack/renoir/__init__.py | vasudevanv/jetpack | 3a1d4e30e164450586ad4d212361a5e30cc9eb7a | [
"MIT"
] | null | null | null | jetpack/renoir/__init__.py | vasudevanv/jetpack | 3a1d4e30e164450586ad4d212361a5e30cc9eb7a | [
"MIT"
] | null | null | null |
from .plot import Plot
| 8 | 22 | 0.75 | 4 | 24 | 4.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208333 | 24 | 2 | 23 | 12 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d3b36012b18a2d5286370492cad2225e2cb47544 | 55 | py | Python | mvpd/func_regression/__init__.py | kakurk/PyMVPD | 253974f0b5b7d15d27616540fc5166ddf3b7cc04 | [
"MIT"
] | null | null | null | mvpd/func_regression/__init__.py | kakurk/PyMVPD | 253974f0b5b7d15d27616540fc5166ddf3b7cc04 | [
"MIT"
] | null | null | null | mvpd/func_regression/__init__.py | kakurk/PyMVPD | 253974f0b5b7d15d27616540fc5166ddf3b7cc04 | [
"MIT"
] | null | null | null |
from .L2_LR import L2_LR
from .PCA_LR import PCA_LR
| 9.166667 | 26 | 0.763636 | 12 | 55 | 3.166667 | 0.416667 | 0.210526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.2 | 55 | 5 | 27 | 11 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d3ff394a4d5cc527c1285204b7dd641615871cb1 | 122 | py | Python | python/coursera_python/WESLEYAN/week1/2.py | SayanGhoshBDA/code-backup | 8b6135facc0e598e9686b2e8eb2d69dd68198b80 | [
"MIT"
] | 16 | 2018-11-26T08:39:42.000Z | 2019-05-08T10:09:52.000Z | python/coursera_python/WESLEYAN/week1/2.py | SayanGhoshBDA/code-backup | 8b6135facc0e598e9686b2e8eb2d69dd68198b80 | [
"MIT"
] | 8 | 2020-05-04T06:29:26.000Z | 2022-02-12T05:33:16.000Z | python/coursera_python/WESLEYAN/week1/2.py | SayanGhoshBDA/code-backup | 8b6135facc0e598e9686b2e8eb2d69dd68198b80 | [
"MIT"
] | 5 | 2020-02-11T16:02:21.000Z | 2021-02-05T07:48:30.000Z | def problem1_2(x,y):
    print(x+y)
    print(x*y)
    pass # replace this pass (a do-nothing) statement with your code
| 17.428571 | 68 | 0.647541 | 22 | 122 | 3.545455 | 0.727273 | 0.076923 | 0.179487 | 0.205128 | 0.205128 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021505 | 0.237705 | 122 | 6 | 69 | 20.333333 | 0.817204 | 0.467213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0 | 0 | 0.25 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 6 |
313e0796ee8955226838ac3d8693e1faeb10a630 | 3,250 | py | Python | tests/test_gcm_push_payload.py | FahadAlbukhari/django-push-notifications | 8c73a77131be2259f961a997cc3ca8945028f519 | [
"MIT"
] | 3 | 2019-11-09T13:11:00.000Z | 2020-01-07T03:02:58.000Z | tests/test_gcm_push_payload.py | FahadAlbukhari/django-push-notifications | 8c73a77131be2259f961a997cc3ca8945028f519 | [
"MIT"
] | null | null | null | tests/test_gcm_push_payload.py | FahadAlbukhari/django-push-notifications | 8c73a77131be2259f961a997cc3ca8945028f519 | [
"MIT"
] | 1 | 2020-06-02T22:39:31.000Z | 2020-06-02T22:39:31.000Z | from __future__ import absolute_import
from django.test import TestCase
from push_notifications.gcm import send_bulk_message, send_message
from ._mock import mock
from .responses import GCM_JSON, GCM_JSON_MULTIPLE
class GCMPushPayloadTest(TestCase):
def test_fcm_push_payload(self):
with mock.patch("push_notifications.gcm._fcm_send", return_value=GCM_JSON) as p:
send_message("abc", {"message": "Hello world"}, "FCM")
p.assert_called_once_with(
b'{"notification":{"body":"Hello world"},"registration_ids":["abc"]}',
"application/json", application_id=None)
def test_push_payload_with_app_id(self):
with mock.patch("push_notifications.gcm._gcm_send", return_value=GCM_JSON) as p:
send_message("abc", {"message": "Hello world"}, "GCM")
p.assert_called_once_with(
b'{"data":{"message":"Hello world"},"registration_ids":["abc"]}',
"application/json", application_id=None)
with mock.patch("push_notifications.gcm._gcm_send", return_value=GCM_JSON) as p:
send_message("abc", {"message": "Hello world"}, "GCM")
p.assert_called_once_with(
b'{"data":{"message":"Hello world"},"registration_ids":["abc"]}',
"application/json", application_id=None)
def test_fcm_push_payload_params(self):
with mock.patch("push_notifications.gcm._fcm_send", return_value=GCM_JSON) as p:
send_message(
"abc",
{"message": "Hello world", "title": "Push notification", "other": "misc"},
"FCM",
delay_while_idle=True, time_to_live=3600, foo="bar",
)
p.assert_called_once_with(
b'{"data":{"other":"misc"},"delay_while_idle":true,'
b'"notification":{"body":"Hello world","title":"Push notification"},'
b'"registration_ids":["abc"],"time_to_live":3600}',
"application/json", application_id=None)
def test_fcm_push_payload_many(self):
with mock.patch("push_notifications.gcm._fcm_send", return_value=GCM_JSON_MULTIPLE) as p:
send_bulk_message(["abc", "123"], {"message": "Hello world"}, "FCM")
p.assert_called_once_with(
b'{"notification":{"body":"Hello world"},"registration_ids":["abc","123"]}',
"application/json", application_id=None)
def test_gcm_push_payload(self):
with mock.patch("push_notifications.gcm._gcm_send", return_value=GCM_JSON) as p:
send_message("abc", {"message": "Hello world"}, "GCM")
p.assert_called_once_with(
b'{"data":{"message":"Hello world"},"registration_ids":["abc"]}',
"application/json", application_id=None)
def test_gcm_push_payload_params(self):
with mock.patch("push_notifications.gcm._gcm_send", return_value=GCM_JSON) as p:
send_message(
"abc", {"message": "Hello world"}, "GCM",
delay_while_idle=True, time_to_live=3600, foo="bar",
)
p.assert_called_once_with(
b'{"data":{"message":"Hello world"},"delay_while_idle":true,'
b'"registration_ids":["abc"],"time_to_live":3600}',
"application/json", application_id=None)
def test_gcm_push_payload_many(self):
with mock.patch("push_notifications.gcm._gcm_send", return_value=GCM_JSON_MULTIPLE) as p:
send_bulk_message(["abc", "123"], {"message": "Hello world"}, "GCM")
p.assert_called_once_with(
b'{"data":{"message":"Hello world"},"registration_ids":["abc","123"]}',
"application/json",
application_id=None)
| 41.666667 | 91 | 0.712308 | 453 | 3,250 | 4.783664 | 0.134658 | 0.073835 | 0.101984 | 0.06276 | 0.88694 | 0.833872 | 0.833872 | 0.833872 | 0.833872 | 0.828796 | 0 | 0.009729 | 0.114462 | 3,250 | 77 | 92 | 42.207792 | 0.743224 | 0 | 0 | 0.507692 | 0 | 0 | 0.392 | 0.266769 | 0 | 0 | 0 | 0 | 0.123077 | 1 | 0.107692 | false | 0 | 0.076923 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
31501718fa96fec037fdfcb981ac2cb72829a3c4 | 51,882 | py | Python | Apps/Python/app2/bot/cogs/mod.py | SinLess-Games-LimitedLC/Helix | 647f6b52b3432cfdd646f030fdc01ea20d57e70a | [
"Apache-2.0"
] | 3 | 2021-09-16T11:54:04.000Z | 2021-12-11T23:54:13.000Z | Apps/Python/app2/bot/cogs/mod.py | SinLess-Games-LimitedLC/Helix | 647f6b52b3432cfdd646f030fdc01ea20d57e70a | [
"Apache-2.0"
] | 15 | 2021-09-22T22:49:51.000Z | 2022-01-16T22:34:56.000Z | Apps/Python/app2/bot/cogs/mod.py | SinLess-Games/Helix | 5fdfca30ef132422be10bea5015804b4e8b1636f | [
"Apache-2.0"
] | null | null | null | import re
import discord
import mysql.connector
import copy
from discord.ext import commands
from util import check
from collections import Counter
from slugify import slugify
class Mod(commands.Cog):
def __init__(self, client):
self.client = client
self.client.modroles = {}
    async def bot_check(self, ctx):
self.blacklist = {}
        for guild in self.client.guilds:
            database = str(guild.id)
self.conn = mysql.connector.connect(host='192.168.86.244', user='admin', password='Shellshocker93!', database=database)
self.c = self.conn.cursor()
self.c.execute('''SELECT * FROM blacklist WHERE 1''')
for server, word in self.c.fetchall():
if server not in self.blacklist:
self.blacklist[server] = [word]
else:
self.blacklist[server].append(word)
self.c.execute(f'''SELECT guild_id, mod_roles FROM {ctx.guild.id}.config WHERE 1''')
for server, modrole in self.c.fetchall():
self.client.modroles[server] = modrole
# -| if carl -> True
# --| if command is disabled for the server -> False --- Without this, it's possible that commands clash with other bots
# ----| if subcommand is disabled for the server -> False --- Not to mention the fact that mods are actually impacted negatively by the alternatives
# ----| if manage_server -> True
# -----| if user is plonked -> False
# --------| if command is disabled for user -> False
# -----------| if subcommand is disabled for user -> False
# -----| if pms -> True (all mod actions are based on servers anyway)
# -----| if server is disabled -> False
# -----| if channel is ignored -> False
# -----| if command is disabled in the channel -> False
# -------| if subcommand is disabled in the channel -> False
# -----| if command is restricted -> get the bot channel, send a mention in the bot channel, send command IN the bot channel
# -| else True
if ctx.guild is None:
return True
self.c.execute("SELECT * FROM config WHERE guild_id='{}'".format(ctx.guild.id))
b = self.c.fetchone()
self.c.execute("SELECT bot_channel FROM servers WHERE id='{}'".format(ctx.guild.id))
bot_channel = self.c.fetchone()
bot_channel = self.client.get_channel(int(bot_channel[0])) if bot_channel[0] is not None else ctx.channel
command_name = ctx.command.root_parent
if command_name is None:
command_name = ctx.command.name
command_name = str(command_name)
if b[2] is not None:
if command_name in b[2].split(','):
# disabled commands aren't overridden by
return False
elif ctx.invoked_subcommand is not None:
sub_command_name = f"{command_name} {ctx.invoked_subcommand.name}"
if sub_command_name in b[2].split(','):
return False
bypass = ctx.author.guild_permissions.manage_guild
if self.client.modroles.get(str(ctx.guild.id)) in [x.id for x in ctx.author.roles]:
return True
if bypass:
return True
if not b[3]:
# plonked server/all commands are
return False
if b[1] is not None:
if str(ctx.channel.id) in b[1].split(','):
# ignored channel
return False
if b[4] is not None and bot_channel != ctx.channel:
if command_name in b[4].split(','):
await bot_channel.send(ctx.author.mention)
context = copy.copy(ctx)
context.channel = bot_channel
await self.client.invoke(context)
return False
elif ctx.invoked_subcommand is not None:
sub_command_name = f"{command_name} {ctx.invoked_subcommand.name}"
if sub_command_name in b[4].split(','):
await bot_channel.send(ctx.author.mention)
context = copy.copy(ctx)
context.channel = bot_channel
await self.client.invoke(context)
return False
        self.c.execute(f'''SELECT *
                           FROM {ctx.guild.id}.userconfig
                           WHERE (guild_id=%s AND user_id=%s)''',
                       (ctx.guild.id, ctx.author.id))
a = self.c.fetchone()
if a is not None:
            # If the user isn't in the list, we don't have to check
# This isn't really a speed up or anything, but it crashes otherwise xD
# Unfortunately highlight ignores will put you in the userconfig
# which isn't WEBSCALE BRO
if a[4]:
# user is plonked
# no, not per command
# just plonked
return False
if a[2] is not None:
if command_name in a[2].split(','):
# disabled user thing
return False
elif ctx.invoked_subcommand is not None:
# disabling subcommands per user
# is not fun
sub_command_name = f"{command_name} {ctx.invoked_subcommand.name}"
if sub_command_name in a[2].split(','):
return False
        self.c.execute("SELECT commands FROM channelconfig WHERE (guild_id='{}' AND channel_id='{}')".format(ctx.guild.id, ctx.channel.id))
channelconfig = self.c.fetchone()
if channelconfig is not None:
if channelconfig[0] is not None:
if command_name in channelconfig[0].split(','):
# ignored command
return False
elif ctx.invoked_subcommand is not None:
sub_command_name = f"{command_name} {ctx.invoked_subcommand.name}"
if sub_command_name in channelconfig[0].split(','):
return False
# command is allowed to be used, is it restricted though?
self.c.close()
self.conn.close()
return True
@staticmethod
def do_slugify(string):
string = slugify(string, separator="")
replacements = (('4', 'a'), ('@', 'a'), ('3', 'e'), ('1', 'i'), ('0', 'o'), ('7', 't'), ('5', 's'))
for old, new in replacements:
string = string.replace(old, new)
return string
@staticmethod
def clean_string(string):
string = re.sub('@', '@\u200b', string)
string = re.sub('#', '#\u200b', string)
return string
    @commands.Cog.listener()
    async def on_message_edit(self, before, after):
if before.guild is None:
return
        if before.author.bot:
return
bypass = before.author.guild_permissions.manage_guild
if bypass:
return
msg = self.do_slugify(after.content)
try:
if str(before.guild.id) in self.blacklist:
for blacklisted_word in self.blacklist[str(before.guild.id)]:
if blacklisted_word in msg:
try:
await before.delete()
return await before.author.send(
f'''Your message "{before.content}" was removed for containing the blacklisted word "{blacklisted_word}"''')
except Exception as e:
chan = self.client.get_channel(344986487676338187)
await chan.send(f"Error when trying to remove message {type(e).__name__}: {e}")
except Exception as e:
chan = self.client.get_channel(344986487676338187)
await chan.send(f"Error when trying to remove message (big) {type(e).__name__}: {e}")
    @commands.Cog.listener()
    async def on_message(self, message):
if message.guild is None:
return
        if message.author.bot:
return
bypass = message.author.guild_permissions.manage_guild
if bypass:
return
msg = self.do_slugify(message.content)
try:
if str(message.guild.id) in self.blacklist:
for blacklisted_word in self.blacklist[str(message.guild.id)]:
if blacklisted_word in msg:
try:
await message.delete()
return await message.author.send(
f'''Your message "{message.content}" was removed for containing the blacklisted word "{blacklisted_word}"''')
except Exception as e:
chan = self.client.get_channel(881083005186240522)
await chan.send(f"Error when trying to remove message {type(e).__name__}: {e}")
except Exception as e:
chan = self.client.get_channel(881083005186240522)
await chan.send(f"Error when trying to remove message (big) {type(e).__name__}: {e}")
@commands.group(invoke_without_command=True, name="blacklist", aliases=['bl'])
@check.admin_or_permissions(manage_server=True)
async def _blacklist(self, ctx):
await ctx.send("```Usage: !blacklist add <word>\n!blacklist remove <word>\n!blacklist show```")
@_blacklist.command(name="add", aliases=['+'])
@check.admin_or_permissions(manage_server=True)
async def add_word(self, ctx, *, to_be_blacklisted: str = None):
if to_be_blacklisted is None:
print(ctx)
await ctx.channel.send("You need to specify a word to blacklist")
return
self.blacklist = {}
        for guild in self.client.guilds:
            database = str(guild.id)
self.conn = mysql.connector.connect(host='192.168.86.244', user='admin', password='Shellshocker93!',
database=database)
self.c = self.conn.cursor()
slugified_word = self.do_slugify(to_be_blacklisted)
        self.c.execute("INSERT IGNORE INTO blacklist VALUES ('{}', '{}')".format(ctx.guild.id, slugified_word))
self.conn.commit()
try:
self.blacklist[str(ctx.guild.id)].append(slugified_word)
except KeyError:
self.blacklist[str(ctx.guild.id)] = [slugified_word]
to_be_blacklisted = self.clean_string(to_be_blacklisted)
await ctx.send(f'Added "{to_be_blacklisted}" to the blacklist')
@_blacklist.command(name="remove", aliases=['-'])
@check.admin_or_permissions(manage_server=True)
async def remove_word(self, ctx, *, word: str = None):
if word is None:
return await ctx.send("You need to specify a word to remove from the blacklist")
slugified_word = self.do_slugify(word)
if slugified_word not in self.blacklist[str(ctx.guild.id)]:
return await ctx.send("You don't seem to be blacklisting that word")
        self.c.execute(f'''DELETE FROM {ctx.guild.id}.blacklist WHERE (guild_id=%s AND word=%s)''',
                       (ctx.guild.id, slugified_word))
self.conn.commit()
        self.blacklist[str(ctx.guild.id)].remove(slugified_word)
word = self.clean_string(word)
await ctx.send(f'Removed "{word}" from the blacklist')
@_blacklist.command(name="display", aliases=['show'])
@check.admin_or_permissions(manage_server=True)
async def show_words(self, ctx):
        self.c.execute(f'SELECT word FROM {ctx.guild.id}.blacklist WHERE guild_id=%s', (ctx.guild.id,))
words = self.c.fetchall()
        if not words:
return await ctx.send("No blacklisted words yet, use `!blacklist add <word>` to get started")
e = discord.Embed(title="Blocked words", description='\n'.join(x[0] for x in words))
await ctx.send(embed=e)
@_blacklist.command(name="clear")
@check.admin_or_permissions(manage_server=True)
async def _clear(self, ctx):
        self.c.execute('''DELETE FROM blacklist WHERE guild_id=%s''', (ctx.guild.id,))
self.conn.commit()
        self.blacklist[str(ctx.guild.id)] = []
await ctx.send("Removed all blacklisted words")
@commands.group(no_pm=True, invoke_without_command=True)
@check.admin_or_permissions(manage_server=True)
async def ignore(self, ctx, channel: discord.TextChannel = None, command: str = None, subcommand: str = None):
if channel is None:
channels = [str(ctx.channel.id)]
else:
channels = [str(x.id) for x in ctx.message.channel_mentions]
if command is None:
            self.c.execute(
                f'SELECT ignored_channels FROM {ctx.guild.id}.config WHERE guild_id=%s', (ctx.guild.id,))
            a = self.c.fetchone()
if a[0] is None:
# no channels disabled, easy update
channels_to_add = ','.join(channels)
                self.c.execute(
                    f'UPDATE {ctx.guild.id}.config SET ignored_channels=%s WHERE guild_id=%s', (channels_to_add, ctx.guild.id))
self.conn.commit()
e = ', '.join([f"<#{x}>" for x in channels])
return await ctx.send(f"Ignored **{e}**.")
ignored_channels = a[0].split(',')
if Counter(ignored_channels) == Counter(channels):
# ALL mentioned channels are already in the database, remove them instead
                self.c.execute(
                    f'UPDATE {ctx.guild.id}.config SET ignored_channels=%s WHERE guild_id=%s', (None, ctx.guild.id))
self.conn.commit()
e = ', '.join([f"<#{x}>" for x in channels])
return await ctx.send(f"Unignored **{e}**.")
else:
# If the mentioned channels and the saved channels have nothing or some in common, extend it
say_list = [x for x in channels if x not in ignored_channels]
if set(channels).issubset(ignored_channels):
# unignore
ignored_channels = [
x for x in ignored_channels if x not in channels]
say_list = [x for x in channels]
saved_list = ','.join(list(set(ignored_channels)))
                    self.c.execute(
                        f'UPDATE {ctx.guild.id}.config SET ignored_channels=%s WHERE guild_id=%s', (saved_list, ctx.guild.id))
self.conn.commit()
e = ', '.join([f"<#{x}>" for x in say_list])
return await ctx.send(f"Unignored **{e}**.")
ignored_channels.extend(list(channels))
saved_list = ','.join(list(set(ignored_channels)))
                self.c.execute(
                    f'UPDATE {ctx.guild.id}.config SET ignored_channels=%s WHERE guild_id=%s', (saved_list, ctx.guild.id))
self.conn.commit()
e = ', '.join([f"<#{x}>" for x in say_list])
return await ctx.send(f"Ignored **{e}**.")
else:
# This is similar to the other subcommand disables
# But keeping all channels up to date is just unreasonable
# Instead we're gonna have it be a bit more lenient for
# non-existant rows and instead have it update as it goes along
command = command.lower()
if command in ('enable', 'disable'):
return await ctx.send('Cannot disable that command.')
cool_dict = {a.qualified_name: a.aliases for a in self.client.commands}
# I'm pretty sure ^ qualified_name is actually wrong and I should be using .name
# But at this point I'm too afraid to change anything
if command not in cool_dict:
for k, v in cool_dict.items():
if command in v:
command = k
if command not in [x.name for x in self.client.commands]:
return await ctx.send('I do not have this command registered.')
# We have the command picked, now we need to check if a subcommand has to be passed too
# Just like for the commands, we're gonna use the actual _name_ of the command and
# not an alias, this will be what the user is after in 99.999999% of cases so I'm not
# even gonna think twice about it
if subcommand is not None:
subcommand = subcommand.lower()
# a subcommand was defined and we need to check if it's registered as a name or as an alias
base_command = self.client.get_command(command)
try:
sub_commands = base_command.commands
except AttributeError:
return await ctx.send("That command has no subcommands")
# Now we have a generator with the group's subcommands
# we'll do the same thing as we did in the "cool dict"
second_cool_dict = {x.name: x.aliases for x in sub_commands}
if subcommand not in second_cool_dict:
print(second_cool_dict)
for k, v in second_cool_dict.items():
print(v)
if subcommand in v:
subcommand = k
if subcommand not in second_cool_dict:
return await ctx.send("I don't have that subcommand registered (if you used a prefix, don't!)")
command = f"{command} {subcommand}"
print(f"subcommand is {subcommand}")
print(f"making the disabled command '{command}'")
self.c.execute(f'''SELECT commands
FROM {ctx.guild.id}.channelconfig
WHERE (guild_id=? AND channel_id=?)''',
(ctx.guild.id, channel.id))
already_disabled_commands = self.c.fetchone()
if already_disabled_commands is None:
self.c.execute('''INSERT INTO channelconfig
VALUES (?, ?, ?, ?)''',
(ctx.guild.id, channel.id, command, None))
self.conn.commit()
return await ctx.send(f"Disabled **{command}** in #{channel.name}.")
print(already_disabled_commands)
if already_disabled_commands[0] is None:
# no commands disabled, easy update
self.c.execute(f'''UPDATE {ctx.guild.id}.channelconfig
SET commands=?
WHERE (guild_id=? AND channel_id=?)''',
(command, ctx.guild.id, channel.id))
self.conn.commit()
return await ctx.send(f"Disabled **{command}** in #{channel.name}.")
else:
# one or more commands disabled
disabled_commands = already_disabled_commands[0].split(',')
if command in disabled_commands:
# If the command is already disabled, we're gonna re-enable it
# the syntax is inconsistent and that's unfortunate, but it's
# pretty neat just having one command I think
disabled_commands.remove(command)
e = ','.join(
list(set([x for x in disabled_commands if x != ""])))
if e == "":
e = None
self.c.execute(f'''UPDATE {ctx.guild.id}.channelconfig
SET commands=?
WHERE (guild_id=? AND channel_id=?)''',
(e, ctx.guild.id, channel.id))
self.conn.commit()
return await ctx.send(f"Enabled **{command}** in #{channel.name}.")
else:
disabled_commands.append(command)
e = ','.join(
list(set([x for x in disabled_commands if x != ""])))
if e == "":
e = None
self.c.execute(f'''UPDATE {ctx.guild.id}.channelconfig
SET commands=?
WHERE (guild_id=? AND channel_id=?)''',
(e, ctx.guild.id, channel.id))
self.conn.commit()
return await ctx.send(f"Disabled **{command}** in #{channel.name}.")
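    # The channel toggle above can be sketched as a pure helper, independent of
    # Discord and SQLite. `toggle_csv` is a hypothetical function (not part of
    # this cog) showing the intended semantics of the comma-joined
    # `ignored_channels` column: if everything requested is already stored it
    # is removed, a partial overlap extends the list, and an empty result maps
    # back to NULL.

    ```python
    def toggle_csv(stored, requested):
        """Toggle `requested` items in the comma-joined string `stored`.

        Returns (new_stored, added, removed); `new_stored` is None when the
        resulting set is empty, mirroring a NULL column.
        """
        current = set(stored.split(',')) if stored else set()
        requested = list(requested)
        if set(requested) <= current:
            # Everything requested is already stored: remove (unignore) it all
            current -= set(requested)
            added, removed = [], requested
        else:
            # Partial or no overlap: extend with the missing items
            added = [x for x in requested if x not in current]
            removed = []
            current |= set(added)
        return (','.join(sorted(current)) or None, added, removed)
    ```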
    @ignore.command(name='server')
    @check.admin_or_permissions(manage_server=True)
    async def ignore_guild(self, ctx):
        a = self.c.execute(
            f'SELECT enabled FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0]:
            # The server is enabled, so disable it
            self.c.execute(
                f'UPDATE {ctx.guild.id}.config SET enabled=? WHERE guild_id=?', (False, ctx.guild.id))
            self.conn.commit()
            return await ctx.send(f"Disabled **{ctx.guild.name}**.")
        self.c.execute(
            f'UPDATE {ctx.guild.id}.config SET enabled=? WHERE guild_id=?', (True, ctx.guild.id))
        self.conn.commit()
        await ctx.send(f"Enabled **{ctx.guild.name}**.")
    @ignore.command(name='all')
    @check.admin_or_permissions(manage_server=True)
    async def ignore_all(self, ctx, command: str = None, subcommand: str = None):
        if command is None:
            # The standard form just ignores every channel
            saved_list = ','.join([str(x.id) for x in ctx.guild.text_channels])
            self.c.execute(
                f'UPDATE {ctx.guild.id}.config SET ignored_channels=? WHERE guild_id=?', (saved_list, ctx.guild.id))
            self.conn.commit()
            return await ctx.send("All channels ignored.")
        # With a command (and optional subcommand) given, iterate over the
        # guild's text channels and toggle that command for each of them.
        # This works the same as `ignore`, just on a larger scale.
        command = command.lower()
        if command in ('enable', 'disable'):
            return await ctx.send('Cannot disable that command.')
        cool_dict = {a.qualified_name: a.aliases for a in self.client.commands}
        # For top-level commands qualified_name and .name are identical,
        # so using qualified_name here is safe.
        if command not in cool_dict:
            for k, v in cool_dict.items():
                if command in v:
                    command = k
        if command not in [x.name for x in self.client.commands]:
            return await ctx.send('I do not have this command registered.')
        # The command is resolved; now check whether a subcommand was passed
        # too. As with commands, we store the actual _name_ of the subcommand,
        # not an alias, since that is what users expect.
        if subcommand is not None:
            subcommand = subcommand.lower()
            # A subcommand was given; check whether it is registered as a name or as an alias
            base_command = self.client.get_command(command)
            try:
                sub_commands = base_command.commands
            except AttributeError:
                return await ctx.send("That command has no subcommands")
            # Resolve aliases to names, same as for the parent command
            second_cool_dict = {x.name: x.aliases for x in sub_commands}
            if subcommand not in second_cool_dict:
                for k, v in second_cool_dict.items():
                    if subcommand in v:
                        subcommand = k
            if subcommand not in second_cool_dict:
                return await ctx.send("I don't have that subcommand registered")
            command = f"{command} {subcommand}"
        # With the (sub)command resolved, toggling it per channel is one loop.
        channels = list(ctx.guild.text_channels)
        # SQLite commits are slow, but committing per channel avoids
        # "database is locked" errors.
        disabled = []
        enabled = []
        for channel in channels:
            self.c.execute(f'''SELECT commands
                               FROM {ctx.guild.id}.channelconfig
                               WHERE (guild_id=? AND channel_id=?)''',
                           (ctx.guild.id, channel.id))
            already_disabled_commands = self.c.fetchone()
            if already_disabled_commands is None:
                self.c.execute(f'''INSERT INTO {ctx.guild.id}.channelconfig
                                   VALUES (?, ?, ?, ?)''',
                               (ctx.guild.id, channel.id, command, None))
                self.conn.commit()
                disabled.append(channel.name)
                continue
            if already_disabled_commands[0] is None:
                # No commands disabled yet, simple update
                self.c.execute(f'''UPDATE {ctx.guild.id}.channelconfig
                                   SET commands=?
                                   WHERE (guild_id=? AND channel_id=?)''',
                               (command, ctx.guild.id, channel.id))
                self.conn.commit()
                disabled.append(channel.name)
            else:
                # One or more commands already disabled
                disabled_commands = already_disabled_commands[0].split(',')
                if command in disabled_commands:
                    # Already disabled in this channel, so re-enable it
                    disabled_commands.remove(command)
                    e = ','.join(
                        list(set([x for x in disabled_commands if x != ""]))) or None
                    self.c.execute(f'''UPDATE {ctx.guild.id}.channelconfig
                                       SET commands=?
                                       WHERE (guild_id=? AND channel_id=?)''',
                                   (e, ctx.guild.id, channel.id))
                    self.conn.commit()
                    enabled.append(channel.name)
                else:
                    disabled_commands.append(command)
                    e = ','.join(
                        list(set([x for x in disabled_commands if x != ""])))
                    self.c.execute(f'''UPDATE {ctx.guild.id}.channelconfig
                                       SET commands=?
                                       WHERE (guild_id=? AND channel_id=?)''',
                                   (e, ctx.guild.id, channel.id))
                    self.conn.commit()
                    disabled.append(channel.name)
        enabled = ', '.join(enabled)
        disabled = ', '.join(disabled)
        await ctx.send(
            "**Command:** {}\n**Disabled channels:** {}\n**Enabled channels:** {}".format(command, disabled, enabled))
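    # The per-channel commit in `ignore_all` is the slow part the comment above
    # worries about. With sqlite3, wrapping the whole loop in one transaction
    # gives a single commit that still rolls back cleanly on error. This is a
    # standalone sketch against an in-memory database with a stand-in schema,
    # not the cog's actual tables.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE channelconfig (guild_id, channel_id, commands, extra)")
    rows = [(1, channel_id, "ping", None) for channel_id in range(5)]
    # `with conn` opens one transaction: commit on success, rollback on error
    with conn:
        conn.executemany("INSERT INTO channelconfig VALUES (?, ?, ?, ?)", rows)
    count = conn.execute("SELECT COUNT(*) FROM channelconfig").fetchone()[0]
    ```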
    @commands.group()
    async def unignore(self, ctx):
        return

    @unignore.command(name='all')
    @check.admin_or_permissions(manage_server=True)
    async def unignore_all(self, ctx):
        self.c.execute(
            f'UPDATE {ctx.guild.id}.config SET ignored_channels=? WHERE guild_id=?', (None, ctx.guild.id))
        self.conn.commit()
        await ctx.send("All channels unignored.")
    @ignore.command(name='list', no_pm=True)
    @check.admin_or_permissions(manage_server=True)
    async def ignore_list(self, ctx):
        a = self.c.execute(
            f'SELECT ignored_channels FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0] is None or a[0] == "":
            disabled = []
        else:
            disabled = a[0].split(',')
        e = discord.Embed()
        if disabled == []:
            dis = "None"
        else:
            dis = '\n'.join([f'<#{x}>' for x in disabled])
        enabled = [x for x in ctx.guild.text_channels if str(
            x.id) not in disabled]
        if len(enabled) == 0:
            ena = "None"
        else:
            ena = '\n'.join([f'<#{x.id}>' for x in enabled])
        e.add_field(name='Enabled', value=ena)
        e.add_field(name='Disabled', value=dis)
        e.set_footer(text=ctx.guild.name)
        await ctx.send(embed=e)
    @commands.group(no_pm=True, invoke_without_command=True, hidden=True)
    @check.admin_or_permissions(manage_server=True)
    async def disable(self, ctx, command: str, subcommand: str = None):
        command = command.lower()
        if command in ('enable', 'disable'):
            return await ctx.send('Cannot disable that command.')
        cool_dict = {a.qualified_name: a.aliases for a in self.client.commands}
        if command not in cool_dict:
            for k, v in cool_dict.items():
                if command in v:
                    command = k
        if command not in [x.name for x in self.client.commands]:
            return await ctx.send('I do not have this command registered.')
        # The command is resolved; now check whether a subcommand was passed
        # too. As with commands, we store the actual _name_ of the subcommand,
        # not an alias, since that is what users expect.
        if subcommand is not None:
            subcommand = subcommand.lower()
            # A subcommand was given; check whether it is registered as a name or as an alias
            base_command = self.client.get_command(command)
            try:
                sub_commands = base_command.commands
            except AttributeError:
                return await ctx.send("That command has no subcommands")
            # Resolve aliases to names, same as for the parent command
            second_cool_dict = {x.name: x.aliases for x in sub_commands}
            if subcommand not in second_cool_dict:
                for k, v in second_cool_dict.items():
                    if subcommand in v:
                        subcommand = k
            if subcommand not in second_cool_dict:
                return await ctx.send("I don't have that subcommand registered")
            command = f"{command} {subcommand}"
        a = self.c.execute(
            f'SELECT commands FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0] is None:
            # No commands disabled yet, simple update
            self.c.execute(
                f'UPDATE {ctx.guild.id}.config SET commands=? WHERE guild_id=?', (command, ctx.guild.id))
            self.conn.commit()
            return await ctx.send(f'"{command}" command disabled in this server.')
        else:
            # One or more commands already disabled
            disabled_commands = a[0].split(',')
            if command in disabled_commands:
                return await ctx.send(f"**{command}** is already disabled.")
            else:
                disabled_commands.append(command)
                e = ','.join(
                    list(set([x for x in disabled_commands if x != ""])))
                self.c.execute(
                    f'UPDATE {ctx.guild.id}.config SET commands=? WHERE guild_id=?', (e, ctx.guild.id))
                self.conn.commit()
                return await ctx.send(f"Disabled **{command}**.")
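    # The same resolve-name-or-alias walk appears in `ignore`, `ignore_all`,
    # `disable`, `enable`, `plonk`, `restrict` and `unrestrict`. As a sketch it
    # factors into one hypothetical helper (not part of this cog) operating on
    # a mapping of canonical names to alias lists, like the `cool_dict` built
    # above.

    ```python
    def resolve_command_name(name, commands):
        """Resolve `name` (a canonical name or an alias) to a canonical name.

        `commands` maps canonical command names to lists of aliases; returns
        None when the name is unknown.
        """
        if name in commands:
            return name
        for canonical, aliases in commands.items():
            if name in aliases:
                return canonical
        return None
    ```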
    @commands.group(no_pm=True, invoke_without_command=True, hidden=True)
    @check.admin_or_permissions(manage_server=True)
    async def enable(self, ctx, command: str, subcommand: str = None):
        command = command.lower()
        if command in ('enable', 'disable'):
            return await ctx.send('Cannot enable that command.')
        cool_dict = {a.qualified_name: a.aliases for a in self.client.commands}
        if command not in cool_dict:
            for k, v in cool_dict.items():
                if command in v:
                    command = k
        if command not in [x.name for x in self.client.commands]:
            return await ctx.send('I do not have this command registered.')
        if subcommand is not None:
            subcommand = subcommand.lower()
            # A subcommand was given; check whether it is registered as a name or as an alias
            base_command = self.client.get_command(command)
            try:
                sub_commands = base_command.commands
            except AttributeError:
                return await ctx.send("That command has no subcommands")
            # Resolve aliases to names, same as for the parent command
            second_cool_dict = {x.name: x.aliases for x in sub_commands}
            if subcommand not in second_cool_dict:
                for k, v in second_cool_dict.items():
                    if subcommand in v:
                        subcommand = k
            if subcommand not in second_cool_dict:
                return await ctx.send("I don't have that subcommand registered")
            command = f"{command} {subcommand}"
        a = self.c.execute(
            f'SELECT commands FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0] is None:
            # Nothing disabled, so everything is already enabled
            return await ctx.send('All commands are already enabled on this server.')
        else:
            # One or more commands disabled
            disabled_commands = a[0].split(',')
            if command in disabled_commands:
                # Remove the command from the disabled list; an empty
                # result is stored as NULL
                disabled_commands.remove(command)
                e = ','.join(
                    list(set([x for x in disabled_commands if x != ""]))) or None
                self.c.execute(
                    f'UPDATE {ctx.guild.id}.config SET commands=? WHERE guild_id=?', (e, ctx.guild.id))
                self.conn.commit()
                return await ctx.send(f"Enabled **{command}**.")
            else:
                return await ctx.send(
                    f"**{command}** was already enabled, use `{ctx.prefix}disable` if you wish to disable it.")
    @enable.command(name='list', no_pm=True)
    async def enable_list(self, ctx):
        a = self.c.execute(
            f'SELECT commands FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0] is None or a[0] == "":
            disabled = []
        else:
            disabled = a[0].split(',')
        e = discord.Embed()
        enabled = [
            x.name for x in self.client.commands if x.name not in disabled and not x.hidden]
        ena = ', '.join(enabled) or "None"
        dis = ', '.join(disabled) or "None"
        e.add_field(name='Enabled', value=ena)
        e.add_field(name='Disabled', value=dis)
        e.set_footer(text=ctx.guild.name)
        await ctx.send(embed=e)
    @disable.command(name='list', no_pm=True)
    async def disable_list(self, ctx):
        a = self.c.execute(
            f'SELECT commands FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0] is None or a[0] == "":
            # Keep this a list; joining the string "None" would
            # produce "N, o, n, e" below
            disabled = []
        else:
            disabled = a[0].split(',')
        e = discord.Embed()
        enabled = [
            x.name for x in self.client.commands if x.name not in disabled and not x.hidden]
        ena = ', '.join(enabled) or "None"
        dis = ', '.join(disabled) or "None"
        e.add_field(name='Disabled', value=dis)
        e.add_field(name='Enabled', value=ena)
        e.set_footer(text=ctx.guild.name)
        await ctx.send(embed=e)
    @disable.command(name='all', no_pm=True)
    @check.admin_or_permissions(manage_server=True)
    async def disable_all(self, ctx):
        to_disable = [x.name for x in self.client.commands if x.name not in ("enable", "disable")]
        e = ','.join(to_disable)
        self.c.execute(
            f'UPDATE {ctx.guild.id}.config SET commands=? WHERE guild_id=?', (e, ctx.guild.id))
        self.conn.commit()
        return await ctx.send("Disabled ALL commands.")

    @enable.command(name='all', no_pm=True)
    @check.admin_or_permissions(manage_server=True)
    async def enable_all(self, ctx):
        self.c.execute(
            f'UPDATE {ctx.guild.id}.config SET commands=? WHERE guild_id=?', (None, ctx.guild.id))
        self.conn.commit()
        return await ctx.send("Enabled ALL commands.")
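    # Every query in this cog interpolates ctx.guild.id into the SQL as a
    # schema prefix while binding values through `?` placeholders. That is safe
    # because guild ids are integers, but the invariant can be made explicit
    # with a small helper. `guild_table` is a hypothetical sketch, not used by
    # the cog above: only a validated integer and a whitelisted table name ever
    # reach the SQL string.

    ```python
    def guild_table(guild_id, table):
        """Build a '<guild>.<table>' identifier for a per-guild schema.

        Only the identifier is built here; all values must still go through
        `?` placeholders. int() rejects anything that is not a plain id.
        """
        if table not in ("config", "channelconfig", "userconfig"):
            raise ValueError(f"unknown table: {table}")
        return f"{int(guild_id)}.{table}"
    ```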
    @commands.command()
    @check.admin_or_permissions(manage_server=True)
    async def plonk(self, ctx, user: discord.Member = None, command: str = None, subcommand: str = None):
        if user is None:
            return await ctx.send("You need to mention a user to plonk")
        if command is None:
            # Global plonk: toggle the bot ban for this user
            a = self.c.execute(
                f'SELECT plonked FROM {ctx.guild.id}.userconfig WHERE (guild_id=? AND user_id=?)', (ctx.guild.id, user.id))
            a = a.fetchone()
            if a is not None:
                if a[0]:
                    # The user is plonked, so unplonk them
                    self.c.execute(
                        f'UPDATE {ctx.guild.id}.userconfig SET plonked=? WHERE (guild_id=? AND user_id=?)',
                        (False, ctx.guild.id, user.id))
                    self.conn.commit()
                    return await ctx.send(f"**{user.name}** is no longer banned from using the bot in this server.")
                else:
                    # Not plonked yet
                    self.c.execute(
                        f'UPDATE {ctx.guild.id}.userconfig SET plonked=? WHERE (guild_id=? AND user_id=?)',
                        (True, ctx.guild.id, user.id))
                    self.conn.commit()
                    return await ctx.send(f"**{user.name}** is now banned from using the bot in this server.")
            else:
                # No row yet means a clean user, who should always
                # end up plonked
                self.c.execute(f'''INSERT INTO {ctx.guild.id}.userconfig
                                   VALUES (?, ?, ?, ?, ?, ?)''', (ctx.guild.id, user.id, None, None, True, None))
                self.conn.commit()
                return await ctx.send(f"**{user.name}** is now banned from using the bot in this server.")
        else:
            command = command.lower()
            if command in ('enable', 'disable'):
                return await ctx.send('Cannot disable that command.')
            cool_dict = {
                a.qualified_name: a.aliases for a in self.client.commands}
            if command not in cool_dict:
                for k, v in cool_dict.items():
                    if command in v:
                        command = k
            if command not in [x.name for x in self.client.commands]:
                return await ctx.send('I do not have this command registered.')
            if subcommand is not None:
                subcommand = subcommand.lower()
                # A subcommand was given; check whether it is registered as a name or as an alias
                base_command = self.client.get_command(command)
                try:
                    sub_commands = base_command.commands
                except AttributeError:
                    return await ctx.send("That command has no subcommands")
                # Resolve aliases to names, same as for the parent command
                second_cool_dict = {x.name: x.aliases for x in sub_commands}
                if subcommand not in second_cool_dict:
                    for k, v in second_cool_dict.items():
                        if subcommand in v:
                            subcommand = k
                if subcommand not in second_cool_dict:
                    return await ctx.send("I don't have that subcommand registered")
                command = f"{command} {subcommand}"
            a = self.c.execute(
                f'SELECT command FROM {ctx.guild.id}.userconfig WHERE (guild_id=? AND user_id=?)', (ctx.guild.id, user.id))
            a = a.fetchone()
            if a is not None:
                if a[0] is None:
                    # No commands disabled for this user yet, simple update
                    self.c.execute(
                        f'UPDATE {ctx.guild.id}.userconfig SET command=? WHERE (guild_id=? AND user_id=?)',
                        (command, ctx.guild.id, user.id))
                    self.conn.commit()
                    return await ctx.send(f'"{command}" disabled for {user.name}.')
                else:
                    # One or more commands already disabled
                    disabled_commands = a[0].split(',')
                    if command in disabled_commands:
                        # Remove the command from the per-user ignore list
                        disabled_commands.remove(command)
                        e = ','.join(
                            list(set([x for x in disabled_commands if x != ""])))
                        if e == "":
                            # Normalise an empty string to NULL
                            e = None
                        self.c.execute(
                            f'UPDATE {ctx.guild.id}.userconfig SET command=? WHERE (guild_id=? AND user_id=?)',
                            (e, ctx.guild.id, user.id))
                        self.conn.commit()
                        return await ctx.send(f"Enabled **{command}** for {user.name}.")
                    else:
                        disabled_commands.append(command)
                        e = ','.join(
                            list(set([x for x in disabled_commands if x != ""])))
                        self.c.execute(
                            f'UPDATE {ctx.guild.id}.userconfig SET command=? WHERE (guild_id=? AND user_id=?)',
                            (e, ctx.guild.id, user.id))
                        self.conn.commit()
                        return await ctx.send(f"Disabled **{command}** for {user.name}.")
            else:
                # The row doesn't exist yet, so insert one; this mirrors
                # the update above
                self.c.execute(f'''INSERT INTO {ctx.guild.id}.userconfig
                                   VALUES (?, ?, ?, ?, ?, ?)''',
                               (ctx.guild.id, user.id, command, None, False, None))
                self.conn.commit()
                await ctx.send(f"Disabled **{command}** for {user.name}.")
    @commands.command(no_pm=True)
    async def plonks(self, ctx, user: discord.Member = None):
        if user is None:
            d = self.c.execute(
                f'SELECT user_id, plonked FROM {ctx.guild.id}.userconfig WHERE guild_id=?', (ctx.guild.id,))
            d = d.fetchall()
            plonks = '\n'.join([f"<@{x[0]}>" for x in d if x[1]]) or "None"
            e = discord.Embed(
                title=f"Plonks for {ctx.guild.name}", description=plonks)
            await ctx.send(embed=e)
        else:
            d = self.c.execute(
                f'SELECT user_id, command FROM {ctx.guild.id}.userconfig WHERE (guild_id=? AND user_id=?)', (ctx.guild.id, user.id))
            d = d.fetchone()
            if d is None or not d[1]:
                # No row, or no per-user disabled commands stored
                plonks = "None"
            else:
                plonks = '\n'.join(d[1].split(','))
            e = discord.Embed(
                title=f"Plonks for {user.name}#{user.discriminator}", description=plonks)
            await ctx.send(embed=e)
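    # The mention-list formatting used by `plonks` can be isolated so it is
    # testable without a database or a Discord connection. `format_plonks` is a
    # hypothetical helper (not part of the cog) over (user_id, plonked) rows as
    # returned by the SELECT above.

    ```python
    def format_plonks(rows):
        """Render (user_id, plonked) rows as mention lines for an embed body."""
        mentions = [f"<@{user_id}>" for user_id, plonked in rows if plonked]
        return '\n'.join(mentions) or "None"
    ```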
    @commands.group(no_pm=True, invoke_without_command=True, hidden=True)
    @check.admin_or_permissions(manage_server=True)
    async def restrict(self, ctx, command: str, subcommand: str = None):
        command = command.lower()
        if command in ('enable', 'disable'):
            return await ctx.send('Cannot restrict that command.')
        cool_dict = {a.qualified_name: a.aliases for a in self.client.commands}
        if command not in cool_dict:
            for k, v in cool_dict.items():
                if command in v:
                    command = k
        if command not in [x.name for x in self.client.commands]:
            return await ctx.send('I do not have this command registered.')
        # The command is resolved; now check whether a subcommand was passed
        # too. As with commands, we store the actual _name_ of the subcommand,
        # not an alias, since that is what users expect.
        if subcommand is not None:
            subcommand = subcommand.lower()
            # A subcommand was given; check whether it is registered as a name or as an alias
            base_command = self.client.get_command(command)
            try:
                sub_commands = base_command.commands
            except AttributeError:
                return await ctx.send("That command has no subcommands")
            # Resolve aliases to names, same as for the parent command
            second_cool_dict = {x.name: x.aliases for x in sub_commands}
            if subcommand not in second_cool_dict:
                for k, v in second_cool_dict.items():
                    if subcommand in v:
                        subcommand = k
            if subcommand not in second_cool_dict:
                return await ctx.send("I don't have that subcommand registered")
            command = f"{command} {subcommand}"
        a = self.c.execute(
            f'SELECT restricted_commands FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0] is None:
            # No commands restricted yet, simple update
            self.c.execute(
                f'UPDATE {ctx.guild.id}.config SET restricted_commands=? WHERE guild_id=?', (command, ctx.guild.id))
            self.conn.commit()
            return await ctx.send(f'"{command}" is now restricted to the bot channel.')
        else:
            # One or more commands already restricted
            restricted_commands = a[0].split(',')
            if command in restricted_commands:
                return await ctx.send(f"**{command}** is already restricted.")
            else:
                restricted_commands.append(command)
                e = ','.join(
                    list(set([x for x in restricted_commands if x != ""])))
                self.c.execute(
                    f'UPDATE {ctx.guild.id}.config SET restricted_commands=? WHERE guild_id=?', (e, ctx.guild.id))
                self.conn.commit()
                return await ctx.send(f"Restricted **{command}**.")
    @commands.group(no_pm=True, invoke_without_command=True, hidden=True)
    @check.admin_or_permissions(manage_server=True)
    async def unrestrict(self, ctx, command: str, subcommand: str = None):
        command = command.lower()
        if command in ('enable', 'disable'):
            return await ctx.send('Cannot unrestrict that command.')
        cool_dict = {a.qualified_name: a.aliases for a in self.client.commands}
        if command not in cool_dict:
            for k, v in cool_dict.items():
                if command in v:
                    command = k
        if command not in [x.name for x in self.client.commands]:
            return await ctx.send('I do not have this command registered.')
        if subcommand is not None:
            subcommand = subcommand.lower()
            # A subcommand was given; check whether it is registered as a name or as an alias
            base_command = self.client.get_command(command)
            try:
                sub_commands = base_command.commands
            except AttributeError:
                return await ctx.send("That command has no subcommands")
            # Resolve aliases to names, same as for the parent command
            second_cool_dict = {x.name: x.aliases for x in sub_commands}
            if subcommand not in second_cool_dict:
                for k, v in second_cool_dict.items():
                    if subcommand in v:
                        subcommand = k
            if subcommand not in second_cool_dict:
                return await ctx.send("I don't have that subcommand registered")
            command = f"{command} {subcommand}"
        a = self.c.execute(
            f'SELECT restricted_commands FROM {ctx.guild.id}.config WHERE guild_id=?', (ctx.guild.id,))
        a = a.fetchone()
        if a[0] is None:
            # Nothing restricted, so everything is already unrestricted
            return await ctx.send('No commands are restricted on this server.')
        else:
            # One or more commands restricted
            restricted_commands = a[0].split(',')
            if command in restricted_commands:
                # Remove the command from the restricted list; an empty
                # result is stored as NULL
                restricted_commands.remove(command)
                e = ','.join(
                    list(set([x for x in restricted_commands if x != ""]))) or None
                self.c.execute(
                    f'UPDATE {ctx.guild.id}.config SET restricted_commands=? WHERE guild_id=?', (e, ctx.guild.id))
                self.conn.commit()
                return await ctx.send(f"Set **{command}** free again.")
            else:
                return await ctx.send(
                    f"**{command}** was already unrestricted, use `{ctx.prefix}restrict` if you wish to restrict it.")

def setup(client):
    client.add_cog(Mod(client))
# xlib/net/__init__.py (jkennedyvz/DeepFaceLive)
from .ThreadFileDownloader import ThreadFileDownloader
# python/GafferSceneTest/MeshTangentsTest.py (ddesmond/gaffer)
##########################################################################
#
# Copyright (c) 2017, Image Engine Design Inc. All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above
# copyright notice, this list of conditions and the following
# disclaimer.
#
# * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided with
# the distribution.
#
# * Neither the name of John Haddon nor the names of
# any other contributors to this software may be used to endorse or
# promote products derived from this software without specific prior
# written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS
# IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
# THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
##########################################################################
import os
import unittest
import imath
import IECore
import IECoreScene
import Gaffer
import GafferScene
import GafferSceneTest
class MeshTangentsTest( GafferSceneTest.SceneTestCase ) :

	def makeTriangleScene( self ) :

		verticesPerFace = IECore.IntVectorData( [3] )
		vertexIds = IECore.IntVectorData( [0, 1, 2] )
		p = IECore.V3fVectorData( [imath.V3f( 0, 0, 0 ), imath.V3f( 1, 0, 0 ), imath.V3f( 0, 1, 0 )] )
		n = IECore.V3fVectorData( [imath.V3f( 0, 0, -1 ), imath.V3f( 0, 0, -1 ), imath.V3f( 0, 0, -1 )] )
		prefData = IECore.V3fVectorData( [imath.V3f( 0, 0, 0 ), imath.V3f( 0, -1, 0 ), imath.V3f( 1, 0, 0 )] )

		mesh = IECoreScene.MeshPrimitive( verticesPerFace, vertexIds, "linear", p )
		mesh["N"] = IECoreScene.PrimitiveVariable(
			IECoreScene.PrimitiveVariable.Interpolation.Vertex,
			n
		)
		mesh["uv"] = IECoreScene.PrimitiveVariable(
			IECoreScene.PrimitiveVariable.Interpolation.FaceVarying,
			IECore.V2fVectorData(
				[ imath.V2f( 0, 0 ), imath.V2f( 1, 0 ), imath.V2f( 0, 1 ) ],
				IECore.GeometricData.Interpretation.UV
			)
		)
		mesh["foo"] = IECoreScene.PrimitiveVariable(
			IECoreScene.PrimitiveVariable.Interpolation.FaceVarying,
			IECore.V2fVectorData(
				[ imath.V2f( 0, 0 ), imath.V2f( 0, 1 ), imath.V2f( 1, 0 ) ],
				IECore.GeometricData.Interpretation.UV
			)
		)
		mesh["Pref"] = IECoreScene.PrimitiveVariable( IECoreScene.PrimitiveVariable.Interpolation.Vertex, prefData )

		objectToScene = GafferScene.ObjectToScene()
		objectToScene["object"].setValue( mesh )

		return objectToScene

	def testModeUV( self ) :

		meshTangents = GafferScene.MeshTangents()
		triangleScene = self.makeTriangleScene()
		meshTangents["in"].setInput( triangleScene["out"] )

		pathFilter = GafferScene.PathFilter( "PathFilter" )
		pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
		meshTangents["filter"].setInput( pathFilter["out"] )
		meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.UV )

		object = meshTangents['out'].object( "/object" )

		uTangent = object["uTangent"]
		vTangent = object["vTangent"]

		self.assertEqual( len( uTangent.data ), 3 )
		self.assertEqual( len( vTangent.data ), 3 )

		for v in uTangent.data :
			self.assertTrue( v.equalWithAbsError( imath.V3f( 1, 0, 0 ), 0.000001 ) )
		for v in vTangent.data :
			self.assertTrue( v.equalWithAbsError( imath.V3f( 0, 1, 0 ), 0.000001 ) )

	def testModeFirstEdge( self ) :

		meshTangents = GafferScene.MeshTangents()
		triangleScene = self.makeTriangleScene()
		meshTangents["in"].setInput( triangleScene["out"] )

		pathFilter = GafferScene.PathFilter( "PathFilter" )
		pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
		meshTangents["filter"].setInput( pathFilter["out"] )
		meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.FirstEdge )

		object = meshTangents['out'].object( "/object" )

		tangent = object["tangent"]
		biTangent = object["biTangent"]

		self.assertEqual( len( tangent.data ), 3 )
		self.assertEqual( len( biTangent.data ), 3 )

		for v, v1 in zip( tangent.data, [imath.V3f( 1, 0, 0 ), imath.V3f( -1, 0, 0 ), imath.V3f( 0, -1, 0 ) ] ) :
			self.assertTrue( v.equalWithAbsError( v1, 0.000001 ) )
		for v, v1 in zip( biTangent.data, [imath.V3f( 0, -1, 0 ), imath.V3f( 0, 1, 0 ), imath.V3f( -1, 0, 0 ) ] ) :
			self.assertTrue( v.equalWithAbsError( v1, 0.000001 ) )

	def testModeTwoEdges( self ) :

		meshTangents = GafferScene.MeshTangents()
		triangleScene = self.makeTriangleScene()
		meshTangents["in"].setInput( triangleScene["out"] )
pathFilter = GafferScene.PathFilter( "PathFilter" )
pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
meshTangents["filter"].setInput( pathFilter["out"] )
meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.TwoEdges )
object = meshTangents['out'].object( "/object" )
tangent = object["tangent"]
biTangent = object["biTangent"]
self.assertEqual( len( tangent.data ), 3 )
self.assertEqual( len( biTangent.data ), 3 )
for v, v1 in zip( tangent.data, [x.normalized() for x in [ imath.V3f( 1, 1, 0 ), imath.V3f( -2, 1, 0 ), imath.V3f( 1, -2, 0 ) ] ] ):
self.assertTrue( v.equalWithAbsError( v1, 0.000001 ) )
# the biTangent should be N x tangent ( with N = ( 0, 0, -1 ) for this mesh )
for v, v1 in zip( biTangent.data, [x.normalized() for x in [ imath.V3f( 1, -1, 0 ), imath.V3f( 1, 2, 0 ), imath.V3f( -2, -1, 0 ) ] ] ):
self.assertTrue( v.equalWithAbsError( v1, 0.000001 ) )
def testModeCentroid( self ) :
meshTangents = GafferScene.MeshTangents()
triangleScene = self.makeTriangleScene()
meshTangents["in"].setInput( triangleScene["out"] )
pathFilter = GafferScene.PathFilter( "PathFilter" )
pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
meshTangents["filter"].setInput( pathFilter["out"] )
meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.PrimitiveCentroid )
object = meshTangents['out'].object( "/object" )
tangent = object["tangent"]
biTangent = object["biTangent"]
self.assertEqual( len( tangent.data ), 3 )
self.assertEqual( len( biTangent.data ), 3 )
for v, v1 in zip( tangent.data, [x.normalized() for x in [ imath.V3f( 1, 1, 0 ), imath.V3f( -2, 1, 0 ), imath.V3f( 1, -2, 0 ) ] ] ):
self.assertTrue( v.equalWithAbsError( v1, 0.000001 ) )
# the biTangent should be N x tangent ( with N = ( 0, 0, -1 ) for this mesh )
for v, v1 in zip( biTangent.data, [x.normalized() for x in [ imath.V3f( 1, -1, 0 ), imath.V3f( 1, 2, 0 ), imath.V3f( -2, -1, 0 ) ] ] ):
self.assertTrue( v.equalWithAbsError( v1, 0.000001 ) )
def testCanRenameOutputTangents( self ) :
meshTangents = GafferScene.MeshTangents()
triangleScene = self.makeTriangleScene()
meshTangents["in"].setInput( triangleScene["out"] )
pathFilter = GafferScene.PathFilter( "PathFilter" )
pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
meshTangents["filter"].setInput( pathFilter["out"] )
meshTangents["uTangent"].setValue( "foo" )
meshTangents["vTangent"].setValue( "bar" )
object = meshTangents['out'].object( "/object" )
uTangent = object["foo"]
vTangent = object["bar"]
self.assertEqual( len( uTangent.data ), 3 )
self.assertEqual( len( vTangent.data ), 3 )
for v in uTangent.data :
self.assertTrue( v.equalWithAbsError( imath.V3f( 1, 0, 0 ), 0.000001 ) )
for v in vTangent.data :
self.assertTrue( v.equalWithAbsError( imath.V3f( 0, 1, 0 ), 0.000001 ) )
def testCanUseSecondUVSet( self ) :
meshTangents = GafferScene.MeshTangents()
meshTangents["uvSet"].setValue( "foo" )
triangleScene = self.makeTriangleScene()
meshTangents["in"].setInput( triangleScene["out"] )
pathFilter = GafferScene.PathFilter( "PathFilter" )
pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
meshTangents["filter"].setInput( pathFilter["out"] )
object = meshTangents['out'].object( "/object" )
uTangent = object["uTangent"]
vTangent = object["vTangent"]
self.assertEqual( len( uTangent.data ), 3 )
self.assertEqual( len( vTangent.data ), 3 )
for v in uTangent.data :
self.assertTrue( v.equalWithAbsError( imath.V3f( 0, 1, 0 ), 0.000001 ) )
# Naively we'd expect the vTangent here to be imath.V3f( 1, 0, 0 ), but the code
# flips the direction of vT whenever the triple ( n, uT, vT ) would not form a
# correctly handed set of basis vectors.
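# A worked sketch of that check: the winding ( 0, 1, 2 ) of this triangle
# implies a geometric face normal of imath.V3f( 0, 0, 1 ). With
# uT = imath.V3f( 0, 1, 0 ), the naive vT = imath.V3f( 1, 0, 0 ) gives
# uT.cross( vT ).dot( n ) == -1 ( left handed ), so vT is flipped to
# imath.V3f( -1, 0, 0 ), for which the product is +1 ( right handed ).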
for v in vTangent.data :
self.assertTrue( v.equalWithAbsError( imath.V3f( -1, 0, 0 ), 0.000001 ) )
def testCanUsePref( self ) :
meshTangents = GafferScene.MeshTangents()
meshTangents["position"].setValue( "Pref" )
triangleScene = self.makeTriangleScene()
meshTangents["in"].setInput( triangleScene["out"] )
pathFilter = GafferScene.PathFilter( "PathFilter" )
pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
meshTangents["filter"].setInput( pathFilter["out"] )
object = meshTangents['out'].object( "/object" )
uTangent = object["uTangent"]
vTangent = object["vTangent"]
self.assertEqual( len( uTangent.data ), 3 )
self.assertEqual( len( vTangent.data ), 3 )
for v in uTangent.data :
self.assertTrue( v.equalWithAbsError( imath.V3f( 0, -1, 0 ), 0.000001 ) )
for v in vTangent.data :
self.assertTrue( v.equalWithAbsError( imath.V3f( 1, 0, 0 ), 0.000001 ) )
def testHandedness( self ) :
isLeftHanded = lambda u, v, n : u.cross( v ).dot( n ) < 0
meshTangents = GafferScene.MeshTangents()
triangleScene = self.makeTriangleScene()
meshTangents["in"].setInput( triangleScene["out"] )
pathFilter = GafferScene.PathFilter( "PathFilter" )
pathFilter["paths"].setValue( IECore.StringVectorData( ['/object'] ) )
meshTangents["filter"].setInput( pathFilter["out"] )
meshTangents['orthogonal'].setValue( True )
meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.UV )
meshTangents['leftHanded'].setValue( True )
object = meshTangents['out'].object( "/object" )
n = imath.V3f(0,0,1)
for u, v in zip( object['uTangent'].data, object['vTangent'].data ) :
self.assertTrue( isLeftHanded( u, v, n ) )
meshTangents['leftHanded'].setValue( False )
object = meshTangents['out'].object( "/object" )
for u, v in zip( object['uTangent'].data, object['vTangent'].data ) :
self.assertFalse( isLeftHanded( u, v, n ) )
meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.FirstEdge )
meshTangents['leftHanded'].setValue( True )
object = meshTangents['out'].object( "/object" )
for u, v, n in zip( object['tangent'].data, object['biTangent'].data, object['N'].data ) :
self.assertTrue( isLeftHanded( u, v, n ) )
meshTangents['leftHanded'].setValue( False )
object = meshTangents['out'].object( "/object" )
for u, v, n in zip( object['tangent'].data, object['biTangent'].data, object['N'].data ) :
self.assertFalse( isLeftHanded( u, v, n ) )
meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.TwoEdges )
meshTangents['leftHanded'].setValue( True )
object = meshTangents['out'].object( "/object" )
for u, v, n in zip( object['tangent'].data, object['biTangent'].data, object['N'].data ) :
self.assertTrue( isLeftHanded( u, v, n ) )
meshTangents['leftHanded'].setValue( False )
object = meshTangents['out'].object( "/object" )
for u, v, n in zip( object['tangent'].data, object['biTangent'].data, object['N'].data ) :
self.assertFalse( isLeftHanded( u, v, n ) )
meshTangents['mode'].setValue( GafferScene.MeshTangents.Mode.PrimitiveCentroid )
meshTangents['leftHanded'].setValue( True )
object = meshTangents['out'].object( "/object" )
for u, v, n in zip( object['tangent'].data, object['biTangent'].data, object['N'].data ) :
self.assertTrue( isLeftHanded( u, v, n ) )
meshTangents['leftHanded'].setValue( False )
object = meshTangents['out'].object( "/object" )
for u, v, n in zip( object['tangent'].data, object['biTangent'].data, object['N'].data ) :
self.assertFalse( isLeftHanded( u, v, n ) )
| 35.980057 | 134 | 0.681368 | 1,485 | 12,629 | 5.794613 | 0.160269 | 0.033469 | 0.01778 | 0.047066 | 0.771993 | 0.742708 | 0.723068 | 0.701104 | 0.701104 | 0.68158 | 0 | 0.028736 | 0.162325 | 12,629 | 350 | 135 | 36.082857 | 0.784668 | 0.141896 | 0 | 0.740196 | 0 | 0 | 0.080736 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 1 | 0.044118 | false | 0 | 0.039216 | 0 | 0.093137 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
317ec133ace71f28a2d18cd183b30dc6e5f8e960 | 35 | py | Python | inline_text_assets/__init__.py | jungvonmatt/grow-ext-dynamic-css-bundles | 733fb07e2e0406a5df7e741927b77676a35d23cc | [
"Apache-2.0"
] | null | null | null | inline_text_assets/__init__.py | jungvonmatt/grow-ext-dynamic-css-bundles | 733fb07e2e0406a5df7e741927b77676a35d23cc | [
"Apache-2.0"
] | null | null | null | inline_text_assets/__init__.py | jungvonmatt/grow-ext-dynamic-css-bundles | 733fb07e2e0406a5df7e741927b77676a35d23cc | [
"Apache-2.0"
] | 1 | 2021-01-15T15:40:34.000Z | 2021-01-15T15:40:34.000Z | from . inline_text_assets import *
| 17.5 | 34 | 0.8 | 5 | 35 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 35 | 1 | 35 | 35 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3192a7ff1755652cf0db501944b470e76c2d779a | 4,601 | py | Python | CricketGround/Cricketground.py | Vignesh1326/Glowscript-Works | 8a0ead2ad84e5051e21d093b95e36e22324688dd | [
"MIT"
] | null | null | null | CricketGround/Cricketground.py | Vignesh1326/Glowscript-Works | 8a0ead2ad84e5051e21d093b95e36e22324688dd | [
"MIT"
] | null | null | null | CricketGround/Cricketground.py | Vignesh1326/Glowscript-Works | 8a0ead2ad84e5051e21d093b95e36e22324688dd | [
"MIT"
] | null | null | null | GlowScript 3.0 VPython
ground=box(pos=vector(0,0,0),size=vector(200,200,1),color=color.green)
bline1=box(pos=vector(0,-100,1),size=vector(200,1,1),color=color.red)
bline2=box(pos=vector(100,0,1),size=vector(1,200,1),color=color.red)
bline3=box(pos=vector(-100,0,1),size=vector(1,200,1),color=color.red)
bline4=box(pos=vector(0,100,1),size=vector(200,1,1),color=color.red)
ptch=box(pos=vector(0,0,0),size=vector(15,40,2),color=color.white)
s11=arrow(pos=vector(-1,-20,1),color=color.black,axis=vector(0,0,3))
s12=arrow(pos=vector(0,-20,1),color=color.black,axis=vector(0,0,3))
s13=arrow(pos=vector(1,-20,1),color=color.black,axis=vector(0,0,3))
s21=arrow(pos=vector(-1,20,1),color=color.black,axis=vector(0,0,3))
s22=arrow(pos=vector(0,20,1),color=color.black,axis=vector(0,0,3))
s23=arrow(pos=vector(1,20,1),color=color.black,axis=vector(0,0,3))
striker=arrow(pos=vector(0,18,1),color=color.yellow,axis=vector(0,0,5))
nonstriker=arrow(pos=vector(5,-18,1),color=color.yellow,axis=vector(0,0,5))
bowler=arrow(pos=vector(-1,-18,1),color=color.blue,axis=vector(0,0,5))
keeper=arrow(pos=vector(-3,30,1),color=color.blue,axis=vector(0,0,5))
slip1=arrow(pos=vector(-6,33,1),color=color.blue,axis=vector(0,0,5))
extracoverf=arrow(pos=vector(-85,-70,1),color=color.blue,axis=vector(0,0,5))
coverf=arrow(pos=vector(-45,-10,1),color=color.blue,axis=vector(0,0,5))
longonf=arrow(pos=vector(-30,-90,1),color=color.blue,axis=vector(0,0,5))
longofff=arrow(pos=vector(30,-90,1),color=color.blue,axis=vector(0,0,5))
legf=arrow(pos=vector(45,-10,1),color=color.blue,axis=vector(0,0,5))
thirdmanf=arrow(pos=vector(-90,90,1),color=color.blue,axis=vector(0,0,5))
leggf=arrow(pos=vector(10,85,1),color=color.blue,axis=vector(0,0,5))
gullyf=arrow(pos=vector(-35,15,1),color=color.blue,axis=vector(0,0,5))
umpire=arrow(pos=vector(0,-22,1),color=color.red,axis=vector(0,0,5))
batcrease=box(pos=vector(0,17,0),color=color.black,size=vector(15,0.3,2))
bowlcrease=box(pos=vector(0,-17,0),color=color.black,size=vector(15,0.3,2))
wide1=box(pos=vector(-5,18.5,0),color=color.black,size=vector(0.3,3,2))
wide2=box(pos=vector(5,18.5,0),color=color.black,size=vector(0.3,3,2))
ball=sphere(pos=vector(-1,-18,4),color=color.cyan,radius=0.5)
while True:
rate(10)
k=keysdown()
scene.camera.follow(ball)
if 'home' in k:
for i in range(17):
rate(10)
ball.pos.y=ball.pos.y+1
ball.pos.z=ball.pos.z+0.5
for j in range(18):
rate(10)
ball.pos.y=ball.pos.y+1
ball.pos.z=ball.pos.z-0.5
if 'down' in k:
if(ball.pos.y>=16 and ball.pos.y<=18):
for ii in range(60):
rate(10)
ball.pos.y=ball.pos.y-1
ball.pos.z=ball.pos.z+1
longonf.pos.x=longonf.pos.x+0.2
for jj in range(62):
rate(10)
ball.pos.y=ball.pos.y-1
ball.pos.z=ball.pos.z-1
longonf.pos.x=longonf.pos.x+0.2
t = text(text='That is a huge six', align='center', color=color.red,height=10,depth=5)
break
if 'left' in k:
if(ball.pos.y>=16 and ball.pos.y<=18):
for ii in range(50):
rate(10)
ball.pos.x=ball.pos.x-1
ball.pos.z=ball.pos.z+1
extracoverf.pos.y=extracoverf.pos.y+0.8
for jj in range(52):
rate(10)
ball.pos.x=ball.pos.x-1
ball.pos.z=ball.pos.z-1
extracoverf.pos.y=extracoverf.pos.y+0.8
t = text(text='That is a huge six', align='center', color=color.red,height=10,depth=5)
break
if 'right' in k:
if(ball.pos.y>=16 and ball.pos.y<=18):
for ii in range(50):
rate(10)
ball.pos.x=ball.pos.x+1
ball.pos.z=ball.pos.z+1
legf.pos.x=legf.pos.x+0.5
for jj in range(52):
rate(10)
ball.pos.x=ball.pos.x+1
ball.pos.z=ball.pos.z-1
legf.pos.x=legf.pos.x+0.5
t = text(text='That is a huge six', align='center', color=color.red,height=10,depth=5)
break
if 'up' in k:
for ii in range(2):
rate(10)
ball.pos.y=ball.pos.y+1
ball.pos.z=ball.pos.z-0.5
for jj in range(5):
rate(10)
s21.pos.y=s21.pos.y+0.4
s21.axis=vector(0,3,0)
t = text(text='Well bowled...!!!...Sticks off', align='center', color=color.red,height=10,depth=5)
break
| 46.01 | 106 | 0.593784 | 865 | 4,601 | 3.158382 | 0.137572 | 0.107613 | 0.100659 | 0.087848 | 0.759151 | 0.754392 | 0.751098 | 0.751098 | 0.732796 | 0.671303 | 0 | 0.103543 | 0.20865 | 4,601 | 99 | 107 | 46.474747 | 0.6468 | 0 | 0 | 0.489796 | 0 | 0 | 0.027603 | 0.004564 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
31da5738bf249f7b6c3f9753778efeeee9a73203 | 281 | py | Python | snmpagent_unity/unity_impl/PowerSupplyName.py | factioninc/snmp-unity-agent | 3525dc0fac60d1c784dcdd7c41693544bcbef843 | [
"Apache-2.0"
] | 2 | 2019-03-01T11:14:59.000Z | 2019-10-02T17:47:59.000Z | snmpagent_unity/unity_impl/PowerSupplyName.py | factioninc/snmp-unity-agent | 3525dc0fac60d1c784dcdd7c41693544bcbef843 | [
"Apache-2.0"
] | 2 | 2019-03-01T11:26:29.000Z | 2019-10-11T18:56:54.000Z | snmpagent_unity/unity_impl/PowerSupplyName.py | factioninc/snmp-unity-agent | 3525dc0fac60d1c784dcdd7c41693544bcbef843 | [
"Apache-2.0"
] | 1 | 2019-10-03T21:09:17.000Z | 2019-10-03T21:09:17.000Z | class PowerSupplyName(object):
def read_get(self, name, idx_name, unity_client):
return unity_client.get_power_supply_name(idx_name)
class PowerSupplyNameColumn(object):
def get_idx(self, name, idx, unity_client):
return unity_client.get_power_supplies()
| 31.222222 | 59 | 0.758007 | 38 | 281 | 5.263158 | 0.421053 | 0.22 | 0.11 | 0.22 | 0.36 | 0.36 | 0.36 | 0 | 0 | 0 | 0 | 0 | 0.156584 | 281 | 8 | 60 | 35.125 | 0.843882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
31dddd6ae020792f4e13ab38614ad5f97ecd1813 | 41 | py | Python | xbwt/__init__.py | dolce95/xbw-transform | bf6b402e2f04cacf2cb5120b00e7d9190b35e9f8 | [
"MIT"
] | null | null | null | xbwt/__init__.py | dolce95/xbw-transform | bf6b402e2f04cacf2cb5120b00e7d9190b35e9f8 | [
"MIT"
] | null | null | null | xbwt/__init__.py | dolce95/xbw-transform | bf6b402e2f04cacf2cb5120b00e7d9190b35e9f8 | [
"MIT"
] | null | null | null | from .xbwt import readAndImportTree, XBWT | 41 | 41 | 0.853659 | 5 | 41 | 7 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 41 | 1 | 41 | 41 | 0.945946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
31ebacc8a867fc81762b969d67dceab19d59d4a9 | 455 | py | Python | torchsketch/data/datasets/__init__.py | songyzh/torchsketch | 42bca1b31ab9699d9b6d77a102b1f46bba82fb33 | [
"MIT"
] | 182 | 2020-03-25T01:59:11.000Z | 2022-03-29T08:58:47.000Z | torchsketch/data/datasets/__init__.py | songyzh/torchsketch | 42bca1b31ab9699d9b6d77a102b1f46bba82fb33 | [
"MIT"
] | 5 | 2020-03-25T13:16:50.000Z | 2022-02-19T09:51:39.000Z | torchsketch/data/datasets/__init__.py | songyzh/torchsketch | 42bca1b31ab9699d9b6d77a102b1f46bba82fb33 | [
"MIT"
] | 17 | 2020-03-25T12:40:49.000Z | 2022-03-28T06:34:40.000Z | from torchsketch.data.datasets.qmul_chair.download_qmul_chair import download_qmul_chair
from torchsketch.data.datasets.qmul_shoe.download_qmul_shoe import download_qmul_shoe
from torchsketch.data.datasets.quickdraw.quickdraw_414k.download_quickdraw_414k import download_quickdraw_414k
from torchsketch.data.datasets.sketchy.download_sketchy import download_sketchy
from torchsketch.data.datasets.tu_berlin.download_tu_berlin import download_tu_berlin | 91 | 111 | 0.903297 | 63 | 455 | 6.174603 | 0.222222 | 0.192802 | 0.244216 | 0.347044 | 0.159383 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020833 | 0.050549 | 455 | 5 | 112 | 91 | 0.87963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
735f5eaec41e16b05672f2d0fd17dad013a9573b | 52 | py | Python | tools/Polygraphy/polygraphy/tools/to_json/__init__.py | spradius/TensorRT | eb5de99b523c76c2f3ae997855ad86d3a1e86a31 | [
"Apache-2.0"
] | 1 | 2021-08-23T01:15:16.000Z | 2021-08-23T01:15:16.000Z | tools/Polygraphy/polygraphy/tools/to_json/__init__.py | spradius/TensorRT | eb5de99b523c76c2f3ae997855ad86d3a1e86a31 | [
"Apache-2.0"
] | null | null | null | tools/Polygraphy/polygraphy/tools/to_json/__init__.py | spradius/TensorRT | eb5de99b523c76c2f3ae997855ad86d3a1e86a31 | [
"Apache-2.0"
] | 1 | 2022-03-29T12:39:29.000Z | 2022-03-29T12:39:29.000Z | from polygraphy.tools.to_json.to_json import ToJSON
| 26 | 51 | 0.865385 | 9 | 52 | 4.777778 | 0.777778 | 0.27907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 52 | 1 | 52 | 52 | 0.895833 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c308240704479edfa22634c9c2178c3e8c6ef1b0 | 207 | py | Python | office365/graph/directory/directoryRole.py | stardust85/Office365-REST-Python-Client | cd369c607c7d137a000734e9c5e8f03ae3e3c603 | [
"MIT"
] | null | null | null | office365/graph/directory/directoryRole.py | stardust85/Office365-REST-Python-Client | cd369c607c7d137a000734e9c5e8f03ae3e3c603 | [
"MIT"
] | null | null | null | office365/graph/directory/directoryRole.py | stardust85/Office365-REST-Python-Client | cd369c607c7d137a000734e9c5e8f03ae3e3c603 | [
"MIT"
] | null | null | null | from office365.graph.directory import DirectoryObject
class DirectoryRole(DirectoryObject):
"""Represents an Azure AD directory role. Azure AD directory roles are also known as administrator roles """
| 34.5 | 112 | 0.801932 | 25 | 207 | 6.64 | 0.76 | 0.084337 | 0.192771 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016854 | 0.140097 | 207 | 5 | 113 | 41.4 | 0.91573 | 0.487923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c321c5c72e2f30ee59e53d907568b47935c74313 | 17,301 | py | Python | creel_portal/api/filters/FN123_Filter.py | AdamCottrill/CreelPortal | 5ec867c4f11b4231c112e8209116b6b96c2830ec | [
"MIT"
] | null | null | null | creel_portal/api/filters/FN123_Filter.py | AdamCottrill/CreelPortal | 5ec867c4f11b4231c112e8209116b6b96c2830ec | [
"MIT"
] | null | null | null | creel_portal/api/filters/FN123_Filter.py | AdamCottrill/CreelPortal | 5ec867c4f11b4231c112e8209116b6b96c2830ec | [
"MIT"
] | null | null | null | """
=============================================================
~/creel_portal/creel_portal/api/filters/FN123_Filter.py
Created: 29 Mar 2020 09:49:43
DESCRIPTION:
Filters associated with our API endpoints.
A. Cottrill
=============================================================
"""
import django_filters
from django_filters import rest_framework as filters
from ...models.fishnet2 import FN123
from .filter_utils import ValueInFilter
class FN123Filter(filters.FilterSet):
"""FN123 objects represent catch counts associated with an interview."""
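# A hypothetical query string combining these filters ( parameter values are
# illustrative only, assuming ValueInFilter accepts comma-separated lists ):
# ?spc=076,331&hsvcnt__gte=1
# would return catch counts for species 076 or 331 where at least one fish
# was harvested.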
# FN123 field filters:
# "spc",
# "grp",
# "sek",
# "hsvcnt",
# "rlscnt",
# "mescnt",
# "meswt",
spc = ValueInFilter(field_name="spc")
spc__not = ValueInFilter(field_name="spc", exclude=True)
grp = ValueInFilter(field_name="grp")
grp__not = ValueInFilter(field_name="grp", exclude=True)
sek = django_filters.BooleanFilter(field_name="sek")
hsvcnt = django_filters.CharFilter(field_name="hsvcnt", lookup_expr="exact")
hsvcnt__gte = django_filters.NumberFilter(field_name="hsvcnt", lookup_expr="gte")
hsvcnt__lte = django_filters.NumberFilter(field_name="hsvcnt", lookup_expr="lte")
hsvcnt__gt = django_filters.NumberFilter(field_name="hsvcnt", lookup_expr="gt")
hsvcnt__lt = django_filters.NumberFilter(field_name="hsvcnt", lookup_expr="lt")
rlscnt = django_filters.CharFilter(field_name="rlscnt", lookup_expr="exact")
rlscnt__gte = django_filters.NumberFilter(field_name="rlscnt", lookup_expr="gte")
rlscnt__lte = django_filters.NumberFilter(field_name="rlscnt", lookup_expr="lte")
rlscnt__gt = django_filters.NumberFilter(field_name="rlscnt", lookup_expr="gt")
rlscnt__lt = django_filters.NumberFilter(field_name="rlscnt", lookup_expr="lt")
mescnt = django_filters.CharFilter(field_name="mescnt", lookup_expr="exact")
mescnt__gte = django_filters.NumberFilter(field_name="mescnt", lookup_expr="gte")
mescnt__lte = django_filters.NumberFilter(field_name="mescnt", lookup_expr="lte")
mescnt__gt = django_filters.NumberFilter(field_name="mescnt", lookup_expr="gt")
mescnt__lt = django_filters.NumberFilter(field_name="mescnt", lookup_expr="lt")
# INTERVIEW ATTRIBUTES:
itvtm0 = django_filters.TimeFilter(
field_name="interview__itvtm0", help_text="format: HH:MM"
)
itvtm0__gte = django_filters.TimeFilter(
field_name="interview__itvtm0", lookup_expr="gte", help_text="format: HH:MM"
)
itvtm0__lte = django_filters.TimeFilter(
field_name="interview__itvtm0", lookup_expr="lte", help_text="format: HH:MM"
)
itvtm0__gt = django_filters.TimeFilter(
field_name="interview__itvtm0", lookup_expr="gt", help_text="format: HH:MM"
)
itvtm0__lt = django_filters.TimeFilter(
field_name="interview__itvtm0", lookup_expr="lt", help_text="format: HH:MM"
)
efftm0 = django_filters.TimeFilter(
field_name="interview__efftm0", help_text="format: HH:MM"
)
efftm0__gte = django_filters.TimeFilter(
field_name="interview__efftm0", lookup_expr="gte", help_text="format: HH:MM"
)
efftm0__lte = django_filters.TimeFilter(
field_name="interview__efftm0", lookup_expr="lte", help_text="format: HH:MM"
)
efftm0__gt = django_filters.TimeFilter(
field_name="interview__efftm0", lookup_expr="gt", help_text="format: HH:MM"
)
efftm0__lt = django_filters.TimeFilter(
field_name="interview__efftm0", lookup_expr="lt", help_text="format: HH:MM"
)
efftm1 = django_filters.TimeFilter(
field_name="interview__efftm1", help_text="format: HH:MM"
)
efftm1__gte = django_filters.TimeFilter(
field_name="interview__efftm1", lookup_expr="gte", help_text="format: HH:MM"
)
efftm1__lte = django_filters.TimeFilter(
field_name="interview__efftm1", lookup_expr="lte", help_text="format: HH:MM"
)
efftm1__gt = django_filters.TimeFilter(
field_name="interview__efftm1", lookup_expr="gt", help_text="format: HH:MM"
)
efftm1__lt = django_filters.TimeFilter(
field_name="interview__efftm1", lookup_expr="lt", help_text="format: HH:MM"
)
date = django_filters.DateFilter(
field_name="interview__date", help_text="format: yyyy-mm-dd"
)
date__gte = django_filters.DateFilter(
field_name="interview__date", lookup_expr="gte", help_text="format: yyyy-mm-dd"
)
date__lte = django_filters.DateFilter(
field_name="interview__date", lookup_expr="lte", help_text="format: yyyy-mm-dd"
)
date__gt = django_filters.DateFilter(
field_name="interview__date", lookup_expr="gt", help_text="format: yyyy-mm-dd"
)
date__lt = django_filters.DateFilter(
field_name="interview__date", lookup_expr="lt", help_text="format: yyyy-mm-dd"
)
persons = django_filters.CharFilter(
field_name="interview__persons", lookup_expr="exact"
)
persons__gte = django_filters.NumberFilter(
field_name="interview__persons", lookup_expr="gte"
)
persons__lte = django_filters.NumberFilter(
field_name="interview__persons", lookup_expr="lte"
)
persons__gt = django_filters.NumberFilter(
field_name="interview__persons", lookup_expr="gt"
)
persons__lt = django_filters.NumberFilter(
field_name="interview__persons", lookup_expr="lt"
)
anglers = django_filters.CharFilter(
field_name="interview__anglers", lookup_expr="exact"
)
anglers__gte = django_filters.NumberFilter(
field_name="interview__anglers", lookup_expr="gte"
)
anglers__lte = django_filters.NumberFilter(
field_name="interview__anglers", lookup_expr="lte"
)
anglers__gt = django_filters.NumberFilter(
field_name="interview__anglers", lookup_expr="gt"
)
anglers__lt = django_filters.NumberFilter(
field_name="interview__anglers", lookup_expr="lt"
)
rods = django_filters.CharFilter(field_name="interview__rods", lookup_expr="exact")
rods__gte = django_filters.NumberFilter(
field_name="interview__rods", lookup_expr="gte"
)
rods__lte = django_filters.NumberFilter(
field_name="interview__rods", lookup_expr="lte"
)
rods__gt = django_filters.NumberFilter(
field_name="interview__rods", lookup_expr="gt"
)
rods__lt = django_filters.NumberFilter(
field_name="interview__rods", lookup_expr="lt"
)
angmeth = ValueInFilter(field_name="interview__angmeth")
angmeth__not = ValueInFilter(field_name="interview__angmeth", exclude=True)
comment1__like = django_filters.CharFilter(
field_name="interview__comment1", lookup_expr="icontains"
)
# FN011 ATTRIBUTES
year = django_filters.CharFilter(
field_name="interview__sama__creel__year", lookup_expr="exact"
)
year__gte = django_filters.NumberFilter(
field_name="interview__sama__creel__year", lookup_expr="gte"
)
year__lte = django_filters.NumberFilter(
field_name="interview__sama__creel__year", lookup_expr="lte"
)
year__gt = django_filters.NumberFilter(
field_name="interview__sama__creel__year", lookup_expr="gt"
)
year__lt = django_filters.NumberFilter(
field_name="interview__sama__creel__year", lookup_expr="lt"
)
prj_date0 = django_filters.DateFilter(
field_name="interview__sama__creel__prj_date0", help_text="format: yyyy-mm-dd"
)
prj_date0__gte = django_filters.DateFilter(
field_name="interview__sama__creel__prj_date0",
lookup_expr="gte",
help_text="format: yyyy-mm-dd",
)
prj_date0__lte = django_filters.DateFilter(
field_name="interview__sama__creel__prj_date0",
lookup_expr="lte",
help_text="format: yyyy-mm-dd",
)
prj_date1 = django_filters.DateFilter(
field_name="interview__sama__creel__prj_date1", help_text="format: yyyy-mm-dd"
)
prj_date1__gte = django_filters.DateFilter(
field_name="interview__sama__creel__prj_date1",
lookup_expr="gte",
help_text="format: yyyy-mm-dd",
)
prj_date1__lte = django_filters.DateFilter(
field_name="interview__sama__creel__prj_date1",
lookup_expr="lte",
help_text="format: yyyy-mm-dd",
)
prj_cd = ValueInFilter(field_name="interview__sama__creel__prj_cd")
prj_cd__not = ValueInFilter(
field_name="interview__sama__creel__prj_cd", exclude=True
)
prj_cd__like = django_filters.CharFilter(
field_name="interview__sama__creel__prj_cd", lookup_expr="icontains"
)
prj_cd__not_like = django_filters.CharFilter(
field_name="interview__sama__creel__prj_cd",
lookup_expr="icontains",
exclude=True,
)
prj_cd__endswith = django_filters.CharFilter(
field_name="interview__sama__creel__prj_cd", lookup_expr="endswith"
)
prj_cd__not_endswith = django_filters.CharFilter(
field_name="interview__sama__creel__prj_cd",
lookup_expr="endswith",
exclude=True,
)
prj_nm__like = django_filters.CharFilter(
field_name="interview__sama__creel__prj_nm", lookup_expr="icontains"
)
prj_nm__not_like = django_filters.CharFilter(
field_name="interview__sama__creel__prj_nm",
lookup_expr="icontains",
exclude=True,
)
prj_ldr = django_filters.CharFilter(
field_name="interview__sama__creel__prj_ldr__username", lookup_expr="iexact"
)
contmeth = ValueInFilter(field_name="interview__sama__creel__contmeth")
contmeth__not = ValueInFilter(
field_name="interview__sama__creel__contmeth", exclude=True
)
lake = ValueInFilter(field_name="interview__sama__creel__lake__abbrev")
lake__not = ValueInFilter(
field_name="interview__sama__creel__lake__abbrev", exclude=True
)
# SEASON FILTERS:
ssn = ValueInFilter(field_name="interview__sama__season__ssn")
ssn__not = ValueInFilter(field_name="interview__sama__season__ssn", exclude=True)
ssn__like = django_filters.CharFilter(
field_name="interview__sama__season__ssn", lookup_expr="icontains"
)
ssn__not_like = django_filters.CharFilter(
field_name="interview__sama__season__ssn", lookup_expr="icontains", exclude=True
)
ssn_des = ValueInFilter(field_name="interview__sama__season__ssn_des")
ssn_des__not = ValueInFilter(
field_name="interview__sama__season__ssn_des", exclude=True
)
ssn_des__like = django_filters.CharFilter(
field_name="interview__sama__season__ssn_des", lookup_expr="icontains"
)
ssn_des__not_like = django_filters.CharFilter(
field_name="interview__sama__season__ssn_des",
lookup_expr="icontains",
exclude=True,
)
ssn_date0 = django_filters.DateFilter(
field_name="interview__sama__season__ssn_date0", help_text="format: yyyy-mm-dd"
)
ssn_date0__gte = django_filters.DateFilter(
field_name="interview__sama__season__ssn_date0",
lookup_expr="gte",
help_text="format: yyyy-mm-dd",
)
ssn_date0__lte = django_filters.DateFilter(
field_name="interview__sama__season__ssn_date0",
lookup_expr="lte",
help_text="format: yyyy-mm-dd",
)
ssn_date1 = django_filters.DateFilter(
field_name="interview__sama__season__ssn_date1", help_text="format: yyyy-mm-dd"
)
ssn_date1__gte = django_filters.DateFilter(
field_name="interview__sama__season__ssn_date1",
lookup_expr="gte",
help_text="format: yyyy-mm-dd",
)
ssn_date1__lte = django_filters.DateFilter(
field_name="interview__sama__season__ssn_date1",
lookup_expr="lte",
help_text="format: yyyy-mm-dd",
)
# daytype filters
dtp = ValueInFilter(field_name="interview__sama__daytype__dtp")
dtp__not = ValueInFilter(field_name="interview__sama__daytype__dtp", exclude=True)
dtp_nm__like = django_filters.CharFilter(
field_name="interview__sama__daytype__dtp_nm", lookup_expr="icontains"
)
dtp_nm__not_like = django_filters.CharFilter(
field_name="interview__sama__daytype__dtp_nm",
lookup_expr="icontains",
exclude=True,
)
# Period filters
prd = ValueInFilter(field_name="interview__sama__prd__prd")
prd__not = ValueInFilter(field_name="interview__sama__prd__prd", exclude=True)
prdtm0 = django_filters.TimeFilter(
field_name="interview__sama__prd__prdtm0", help_text="format: HH:MM"
)
prdtm0__gte = django_filters.TimeFilter(
field_name="interview__sama__prd__prdtm0",
lookup_expr="gte",
help_text="format: HH:MM",
)
prdtm0__lte = django_filters.TimeFilter(
field_name="interview__sama__prd__prdtm0",
lookup_expr="lte",
help_text="format: HH:MM",
)
prdtm1 = django_filters.TimeFilter(
field_name="interview__sama__prd__prdtm1", help_text="format: HH:MM"
)
prdtm1__gte = django_filters.TimeFilter(
field_name="interview__sama__prd__prdtm1",
lookup_expr="gte",
help_text="format: HH:MM",
)
prdtm1__lte = django_filters.TimeFilter(
field_name="interview__sama__prd__prdtm1",
lookup_expr="lte",
help_text="format: HH:MM",
)
prd_dur__gte = django_filters.NumberFilter(
field_name="interview__sama__prd__prd_dur", lookup_expr="gte"
)
prd_dur__lte = django_filters.NumberFilter(
field_name="interview__sama__prd__prd_dur", lookup_expr="lte"
)
# SPACE filters
space = ValueInFilter(field_name="interview__sama__area__space")
space__not = ValueInFilter(field_name="interview__sama__area__space", exclude=True)
space__like = django_filters.CharFilter(
field_name="interview__sama__area__space", lookup_expr="icontains"
)
space__not_like = django_filters.CharFilter(
field_name="interview__sama__area__space", lookup_expr="icontains", exclude=True
)
space_des = ValueInFilter(field_name="interview__sama__area__space_des")
space_des__not = ValueInFilter(
field_name="interview__sama__area__space_des", exclude=True
)
space_des__like = django_filters.CharFilter(
field_name="interview__sama__area__space_des", lookup_expr="icontains"
)
space_des__not_like = django_filters.CharFilter(
field_name="interview__sama__area__space_des",
lookup_expr="icontains",
exclude=True,
)
    # TODO: add NULL / NOT_NULL filters
space_siz__gte = django_filters.NumberFilter(
field_name="interview__sama__area__space_siz", lookup_expr="gte"
)
space_siz__lte = django_filters.NumberFilter(
field_name="interview__sama__area__space_siz", lookup_expr="lte"
)
space_siz__gt = django_filters.NumberFilter(
field_name="interview__sama__area__space_siz", lookup_expr="gt"
)
space_siz__lt = django_filters.NumberFilter(
field_name="interview__sama__area__space_siz", lookup_expr="lt"
)
    # TODO: add NULL / NOT_NULL filters
area_cnt__gte = django_filters.NumberFilter(
field_name="interview__sama__area__area_cnt", lookup_expr="gte"
)
area_cnt__lte = django_filters.NumberFilter(
field_name="interview__sama__area__area_cnt", lookup_expr="lte"
)
area_cnt__gt = django_filters.NumberFilter(
field_name="interview__sama__area__area_cnt", lookup_expr="gt"
)
area_cnt__lt = django_filters.NumberFilter(
field_name="interview__sama__area__area_cnt", lookup_expr="lt"
)
    # TODO: add NULL / NOT_NULL filters
area_wt__gte = django_filters.NumberFilter(
field_name="interview__sama__area__area_wt", lookup_expr="gte"
)
area_wt__lte = django_filters.NumberFilter(
field_name="interview__sama__area__area_wt", lookup_expr="lte"
)
area_wt__gt = django_filters.NumberFilter(
field_name="interview__sama__area__area_wt", lookup_expr="gt"
)
area_wt__lt = django_filters.NumberFilter(
field_name="interview__sama__area__area_wt", lookup_expr="lt"
)
# MODE
mode = ValueInFilter(field_name="interview__sama__mode__mode")
mode__not = ValueInFilter(field_name="interview__sama__mode__mode", exclude=True)
mode__like = django_filters.CharFilter(
field_name="interview__sama__mode__mode", lookup_expr="icontains"
)
mode__not_like = django_filters.CharFilter(
field_name="interview__sama__mode__mode", lookup_expr="icontains", exclude=True
)
mode_des = ValueInFilter(field_name="interview__sama__mode__mode_des")
mode_des__not = ValueInFilter(
field_name="interview__sama__mode__mode_des", exclude=True
)
mode_des__like = django_filters.CharFilter(
field_name="interview__sama__mode__mode_des", lookup_expr="icontains"
)
mode_des__not_like = django_filters.CharFilter(
field_name="interview__sama__mode__mode_des",
lookup_expr="icontains",
exclude=True,
)
class Meta:
model = FN123
fields = [
"grp",
"sek",
"hvscnt",
"rlscnt",
"mescnt",
"meswt",
]
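The FilterSet above follows one naming convention throughout: each field gets companion filters such as `ssn__like` (icontains), `ssn__not_like` (icontains with `exclude=True`), and `ssn__not`. A minimal, dependency-free sketch of that convention — `apply_filters` and the sample rows are hypothetical, not part of the FilterSet:

```python
def apply_filters(rows, params):
    # rows: list of dicts; params: {filter_name: value} using the
    # double-underscore suffixes from the FilterSet above.
    out = []
    for row in rows:
        keep = True
        for key, value in params.items():
            if key.endswith('__not_like'):
                field = key[: -len('__not_like')]
                # negated case-insensitive substring match (exclude=True)
                keep &= value.lower() not in str(row[field]).lower()
            elif key.endswith('__like'):
                field = key[: -len('__like')]
                # case-insensitive substring match (icontains)
                keep &= value.lower() in str(row[field]).lower()
            else:
                # bare name: exact match
                keep &= row[key] == value
        if keep:
            out.append(row)
    return out
```

The real FilterSet translates each suffix into an ORM `lookup_expr` instead, but the filtering semantics are the same.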

# NucleationModel/losses.py (MFrassek/CommittorEAE, MIT)
import tensorflow as tf
def binaryNegLikelihood(y_actual, y_pred):
    # Per-sample negative Bernoulli log-likelihood (binary cross-entropy).
    return -(y_actual * tf.math.log(y_pred)
             + (1 - y_actual) * tf.math.log(1 - y_pred))


def binomialNegLikelihood(y_actual, y_pred):
    # Same likelihood with both terms weighted by 2 (two Bernoulli draws).
    return -(2 * y_actual * tf.math.log(y_pred)
             + (2 * (1 - y_actual)) * tf.math.log(1 - y_pred))
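For sanity-checking these losses outside a TensorFlow session, the same expression can be written with NumPy. `binary_neg_likelihood_np` below is a hypothetical helper mirroring `binaryNegLikelihood`; at `y_pred = 0.5` it evaluates to log 2 for either label.

```python
import numpy as np

def binary_neg_likelihood_np(y_actual, y_pred):
    # NumPy mirror of binaryNegLikelihood: per-sample negative
    # Bernoulli log-likelihood (binary cross-entropy).
    y_actual = np.asarray(y_actual, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return -(y_actual * np.log(y_pred)
             + (1.0 - y_actual) * np.log(1.0 - y_pred))
```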

# compilacion/analisis_semantico/Ast/instructions/breakNode.py
# (Gusta2307/Football-Simulation-IA-SIM-COM-, MIT)
from compilacion.analisis_semantico.Ast.instruction import Instruction
from compilacion.analisis_semantico.scope import Scope


class BreakNode(Instruction):
    def __init__(self) -> None:
        pass

    def checkSemantic(self, scope: Scope) -> bool:
        return True

    def execute(self, scope: Scope):
        pass

    def visit(self, scope):
        return True

# btbot/siders/__init__.py (virtualpeer/btbot, MIT)
from .crossover import CrossOverSider

# app/forms.py (HackCCNU/mumei, WTFPL/Unlicense)
# coding: utf-8
from flask_wtf import Form
# from wtforms import
# from wtforms.validators import

# working/Lib/__init__.py (tehologist/x-venture, MIT)
#@+leo-ver=4-thin
#@+node:234.20061005223719:@thin .\\Lib\\__init__.py
"""Various classes for use in implementing generic mud."""
#@nonl
#@-node:234.20061005223719:@thin .\\Lib\\__init__.py
#@-leo

# timeframeds/__init__.py (andyceo/pylibs, MIT)
from .timeframe import *
from .timeframe_dataset import *

# pycord/models/__init__.py (4rqm/pycord, MIT)
from .channel import Channel, TextChannel, VoiceChannel, CategoryChannel, DMChannel, DMGroupChannel, TEXTCHANNEL, \
VOICECHANNEL, CATEGORYCHANNEL, DMCHANNEL, GROUPDMCHANNEL, GUILD_CHANNELS, DM_CHANNELS
from .embed import Embed
from .guild import Guild
from .message import Message
from .role import Role
from .user import User, ClientUser, Member

# chigre/tests/test_secrets.py (javiercp/holdmybeer, MIT)
from django.contrib.auth.models import User
from rest_framework import status
from rest_framework.test import APITestCase
from rest_framework.reverse import reverse
from chigre.models import Secrets
from chigre.serializers import SecretsSerializer


# Create your tests here.
class SecretsReadTest(APITestCase):
    def setUp(self):
        self.superuser = User.objects.create_superuser('john', 'john@snow.com', 'johnpassword')
        self.client.login(username='john', password='johnpassword')
        self.secrets = Secrets.load()
        self.secrets.maps_key = '123'
        self.secrets.save()

    def test_read_secrets(self):
        """
        Ensure we can read the secrets object.
        """
        url = reverse('secrets-info')
        response = self.client.get(url, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.data, {'maps_key': '123'})


class SecretsUpdateTest(APITestCase):
    def setUp(self):
        self.superuser = User.objects.create_superuser('john', 'john@snow.com', 'johnpassword')
        self.client.login(username='john', password='johnpassword')
        self.secrets = Secrets.load()
        self.secrets.maps_key = '321'
        self.secrets.save()
        self.data = SecretsSerializer(self.secrets).data
        self.data.update({'maps_key': '123'})

    def test_update_secrets(self):
        """
        Ensure we can update the secrets object.
        """
        url = reverse('secrets-info')
        response = self.client.put(url, self.data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.data, {'maps_key': '123'})


class PubDeleteTest(APITestCase):
    def setUp(self):
        self.superuser = User.objects.create_superuser('john', 'john@snow.com', 'johnpassword')
        self.client.login(username='john', password='johnpassword')
        self.secrets = Secrets.load()
        self.secrets.maps_key = '123'
        self.secrets.save()
        self.secrets.delete()

    def test_read_secrets(self):
        """
        Ensure we can read the secrets object.
        """
        url = reverse('secrets-info')
        response = self.client.get(url, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.data, {'maps_key': '123'})
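These tests lean on `Secrets.load()` handing back a single shared settings row. That contract can be sketched in plain Python — the `SingletonSettings` class below is hypothetical, not the Django model:

```python
class SingletonSettings:
    # Plain-Python sketch of the Secrets.load() contract used above:
    # load() always returns the one shared instance, creating it on
    # first use, so every caller sees the same settings object.
    _instance = None

    def __init__(self):
        self.maps_key = ''

    @classmethod
    def load(cls):
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance
```

The Django version presumably persists that single row to the database, which is why `PubDeleteTest` still expects a readable object after `delete()`.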

# callbacks/data_anti_bar_drug_callbacks.py
# (peilinja/A-medical-business-data-quality-check-web-tool, MIT)
import json
import io
import plotly.graph_objects as go
from plotly.subplots import make_subplots
import dash
from dash import html
from dash import dcc
import dash_bootstrap_components as dbc
import pandas as pd
import numpy as np
import plotly.express as px
from dash.dependencies import Output, Input, State
from datetime import datetime, timedelta
from server import app
from sqlalchemy import create_engine
from flask import send_file
import os
from joblib import Parallel, delayed
from dash.exceptions import PreventUpdate
import time
import re
def discriminated_antis(all_antis):
try:
df_抗菌药物 = pd.read_csv(r'./抗菌药物字典.csv')
except:
df_抗菌药物 = pd.read_csv(r'./抗菌药物字典.csv', encoding='gbk')
def isanti(x):
df_抗菌药物['药品'] = x.抗菌药物
df1 = df_抗菌药物[df_抗菌药物['规则等级']==1]
if x.抗菌药物 in list(df1['匹配规则'].values):
return df1[df1['匹配规则']==x.抗菌药物].reset_index(drop=True).loc[0]['抗菌药物通用名']
else:
df2 = df_抗菌药物[df_抗菌药物['规则等级']==2]
df2['是否匹配'] = df2.apply(lambda y: y.抗菌药物通用名 if re.match(y.匹配规则, y.药品) else np.nan, axis=1)
df2['匹配长度'] = df2.apply(lambda y: 0 if pd.isnull(y.是否匹配) else len( y.匹配规则 ), axis=1)
if df2[~df2['是否匹配'].isnull()].shape[0]==0:
df3 = df_抗菌药物[df_抗菌药物['规则等级']==3]
df3['是否匹配'] = df3.apply(lambda y: y.抗菌药物通用名 if re.match(y.匹配规则, y.药品) else np.nan, axis=1)
df3['匹配长度'] = df3.apply(lambda y: 0 if pd.isnull(y.是否匹配) else len( y.匹配规则 ), axis=1)
if df3[~df3['是否匹配'].isnull()].shape[0]==0:
df4 = df_抗菌药物[df_抗菌药物['规则等级']==4]
df4['是否匹配'] = df4.apply(lambda y: y.抗菌药物通用名 if re.match(y.匹配规则, y.药品) else np.nan, axis=1)
df4['匹配长度'] = df4.apply(lambda y: 0 if pd.isnull(y.是否匹配) else len( y.匹配规则 ), axis=1)
if df4[~df4['是否匹配'].isnull()].shape[0]==0:
return np.nan
else:
                        return df4[~df4['是否匹配'].isnull()][['抗菌药物通用名','匹配长度']].drop_duplicates().sort_values(by=['匹配长度'], ascending=False).reset_index(drop=True)['抗菌药物通用名'].loc[0]  # return the generic name whose regex matched with the longest pattern
else:
                    return df3[~df3['是否匹配'].isnull()][['抗菌药物通用名','匹配长度']].drop_duplicates().sort_values(by=['匹配长度'], ascending=False).reset_index(drop=True)['抗菌药物通用名'].loc[0]  # return the generic name whose regex matched with the longest pattern
else:
                return df2[~df2['是否匹配'].isnull()][['抗菌药物通用名','匹配长度']].drop_duplicates().sort_values(by=['匹配长度'], ascending=False).reset_index(drop=True)['抗菌药物通用名'].loc[0]  # return the generic name whose regex matched with the longest pattern
all_antis['抗菌药物通用名'] = all_antis.apply(isanti, axis=1)
return all_antis
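The tiered lookup above can be condensed: the rule table is tried one priority level at a time, and within a level the longest matching pattern wins. A simplified sketch of that idea — `match_by_priority` is hypothetical, and unlike level 1 above (an exact-membership lookup) it treats every level as a regex:

```python
import re
import pandas as pd

def match_by_priority(drug, rules):
    # rules columns: 规则等级 (tier), 匹配规则 (pattern), 抗菌药物通用名 (generic name).
    for level in sorted(rules['规则等级'].unique()):
        tier = rules[rules['规则等级'] == level]
        hits = tier[tier['匹配规则'].apply(lambda p: re.match(p, drug) is not None)]
        if not hits.empty:
            # Prefer the longest pattern, like the 匹配长度 sort above.
            return hits.loc[hits['匹配规则'].str.len().idxmax(), '抗菌药物通用名']
    return None
```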
# ----------------------------------------------------------------------------------------------------- Level-1 figure 1 ----------------------------------------------------------------------------------------------------------------------
# Fetch the data for level-1 figure 1 (antibiotic use / bacteria detection / drug susceptibility volumes)
def get_first_lev_first_fig_date(engine):
res_数据时间缺失及汇总 = pd.DataFrame(columns=['业务类型', 'num', 'month' ])
    # Problem categories, problem-record counts, and overall record counts
bus_dic = {
'给药': "select '给药' as 业务类型 ,count(1) as num ,substr(BEGINTIME,1,7) as month from ANTIBIOTICS where BEGINTIME is not null group by substr(BEGINTIME,1,7)",
'菌检出': " select '菌检出' as 业务类型 , count(1) as num ,substr(REQUESTTIME,1,7) as month from BACTERIA where REQUESTTIME is not null group by substr(REQUESTTIME,1,7) ",
'药敏': " select '药敏' as 业务类型 , count(1) as num ,substr(REQUESTTIME,1,7) as month from DRUGSUSCEPTIBILITY where REQUESTTIME is not null group by substr(REQUESTTIME,1,7) ",
}
for bus in bus_dic:
res_数据时间缺失及汇总 = res_数据时间缺失及汇总.append(pd.read_sql(bus_dic[bus],con=engine))
print('抗菌药物-菌检出-药敏一级图一',bus)
return res_数据时间缺失及汇总
# Update level-1 figure 1 (antibiotic use / bacteria detection / drug susceptibility)
@app.callback(
Output('anti_bar_drug_first_level_first_fig','figure'),
Output('anti_bar_drug_first_level_first_fig_data','data'),
Input('anti_bar_drug_first_level_first_fig_data','data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
# prevent_initial_call=True
)
def update_first_level_first_fig(anti_bar_drug_first_level_first_fig_data,db_con_url,count_time):
if db_con_url is None :
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
engine = create_engine(db_con_url['db'])
if anti_bar_drug_first_level_first_fig_data is None:
anti_bar_drug_first_level_first_fig_data = {}
anti_bar_drug_first_level_first_fig = get_first_lev_first_fig_date(engine)
anti_bar_drug_first_level_first_fig_data['anti_bar_drug_first_level_first_fig'] = anti_bar_drug_first_level_first_fig.to_json(orient='split', date_format='iso')
anti_bar_drug_first_level_first_fig_data['hosname'] = db_con_url['hosname']
anti_bar_drug_first_level_first_fig_data['btime'] = btime
anti_bar_drug_first_level_first_fig_data['etime'] = etime
anti_bar_drug_first_level_first_fig_data = json.dumps(anti_bar_drug_first_level_first_fig_data)
else:
anti_bar_drug_first_level_first_fig_data = json.loads(anti_bar_drug_first_level_first_fig_data)
if db_con_url['hosname'] != anti_bar_drug_first_level_first_fig_data['hosname']:
anti_bar_drug_first_level_first_fig = get_first_lev_first_fig_date(engine)
anti_bar_drug_first_level_first_fig_data['anti_bar_drug_first_level_first_fig'] = anti_bar_drug_first_level_first_fig.to_json(orient='split',date_format='iso')
anti_bar_drug_first_level_first_fig_data['hosname'] = db_con_url['hosname']
anti_bar_drug_first_level_first_fig_data = json.dumps(anti_bar_drug_first_level_first_fig_data)
else:
anti_bar_drug_first_level_first_fig = pd.read_json(anti_bar_drug_first_level_first_fig_data['anti_bar_drug_first_level_first_fig'], orient='split')
anti_bar_drug_first_level_first_fig_data = dash.no_update
#
anti_bar_drug_first_level_first_fig = anti_bar_drug_first_level_first_fig[(anti_bar_drug_first_level_first_fig['month']>=btime) & (anti_bar_drug_first_level_first_fig['month']<=etime)]
anti_bar_drug_first_level_first_fig = anti_bar_drug_first_level_first_fig.sort_values(['month','业务类型'])
fig1 = px.line(anti_bar_drug_first_level_first_fig, x='month', y='num', color='业务类型',
color_discrete_sequence=px.colors.qualitative.Dark24)
        # Horizontal legend, positioned above the plot
fig1.update_layout(
margin=dict(l=20, r=20, t=20, b=20),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
))
fig1.update_yaxes(title_text="业务数据量")
fig1.update_xaxes(title_text="时间")
return fig1,anti_bar_drug_first_level_first_fig_data
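Each callback above caches its query result in a `dcc.Store` by serializing the DataFrame to JSON and reading it back on later invocations. The round-trip boils down to the following — `cache_df` and `load_df` are hypothetical names for the pattern:

```python
import io
import pandas as pd

def cache_df(df):
    # Same serialization the callbacks use before writing to dcc.Store.
    return df.to_json(orient='split', date_format='iso')

def load_df(payload):
    # Recover the DataFrame inside a later callback invocation.
    # StringIO avoids the literal-string deprecation in newer pandas.
    return pd.read_json(io.StringIO(payload), orient='split')
```

Storing with `orient='split'` keeps column order and the index, which is why every `pd.read_json(..., orient='split')` call in this module uses the matching orient.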
# # ----------------------------------------------------------------------------------------------------- Level-1 figure 2 ----------------------------------------------------------------------------------------------------------------------
# # Fetch the data for level-1 figure 2 (antibiotic use / bacteria detection / drug susceptibility by department)
def get_first_lev_second_fig_date(engine,btime,etime):
res_数据关键字缺失及汇总 = pd.DataFrame(columns=['业务类型', '科室', '科室名称', 'num'])
bus_dic = {'8种耐药菌检出': f""" select '8种耐药菌检出' as 业务类型, t1.dept as 科室,t2.label as 科室名称,t1.num from
(select dept,count(1) as num from BACTERIA where BACTERIA in ('大肠埃希菌', '鲍曼不动杆菌', '肺炎克雷伯菌', '金黄色葡萄球菌', '铜绿假单胞菌', '屎肠球菌', '粪肠球菌')
and substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}' and dept is not null
group by dept) t1,s_departments t2
where t1.dept=t2.code(+) order by t1.num desc
""",
"限制级特殊级抗菌药物使用" : f"""select '限制级特殊级抗菌药物使用' as 业务类型,t1.dept as 科室,t2.label as 科室名称,t1.num from
(select dept,count(1) as num from ANTIBIOTICS where ALEVEL in ('限制类', '特殊类')
and substr(BEGINTIME,1,7)>='{btime}' and substr(BEGINTIME,1,7)<='{etime}' and dept is not null
group by dept) t1,s_departments t2
where t1.dept=t2.code(+) order by t1.num desc
""",
'药敏结果为耐药': f""" select '药敏结果为耐药' as 业务类型,t1.dept as 科室,t2.label as 科室名称,t1.num from
(select dept,count(1) as num from DRUGSUSCEPTIBILITY where SUSCEPTIBILITY like '%耐药%'
and substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}' and dept is not null
group by dept) t1,s_departments t2
where t1.dept=t2.code(+) order by t1.num desc
"""
}
for bus in bus_dic:
temp = pd.read_sql(bus_dic[bus],con=engine)
temp = temp[0:8]
res_数据关键字缺失及汇总 = res_数据关键字缺失及汇总.append(temp)
return res_数据关键字缺失及汇总
# Update level-1 figure 2
@app.callback(
Output('anti_bar_drug_first_level_second_fig','figure'),
Output('anti_bar_drug_first_level_second_fig_data','data'),
# Output('rank_month_choice','min'),
# Output('rank_month_choice','max'),
# Output('rank_month_choice','value'),
# Output('rank_month_choice','marks'),
Input('anti_bar_drug_first_level_second_fig_data','data'),
# Input('rank_month_choice','value'),
Input("db_con_url", "data"),
Input("count_time", "data"),
# Input('rank_month_choice','marks'),
# prevent_initial_call=True
)
# def update_first_level_second_fig(anti_bar_drug_first_level_second_fig_data,rank_month_choice,db_con_url,count_time,marks):
def update_first_level_second_fig(anti_bar_drug_first_level_second_fig_data,db_con_url,count_time):
# def unixTimeMillis(dt):
# return int(time.mktime(dt.timetuple()))
#
# def unixToDatetime(unix):
# return pd.to_datetime(unix, unit='s')
#
# def getMarks(start, end, Nth=100):
# result = {}
# for i, date in enumerate(daterange):
# result[unixTimeMillis(date)] = str(date.strftime('%Y-%m'))
if db_con_url is None :
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
min = dash.no_update
max = dash.no_update
value = dash.no_update
marks = dash.no_update
if anti_bar_drug_first_level_second_fig_data is None:
anti_bar_drug_first_level_second_fig_data = {}
first_level_second_fig_data = get_first_lev_second_fig_date(engine,btime,etime)
anti_bar_drug_first_level_second_fig_data['first_level_second_fig_data'] = first_level_second_fig_data.to_json(orient='split', date_format='iso')
anti_bar_drug_first_level_second_fig_data['hosname'] = db_con_url['hosname']
anti_bar_drug_first_level_second_fig_data['btime'] = btime
anti_bar_drug_first_level_second_fig_data['etime'] = etime
anti_bar_drug_first_level_second_fig_data = json.dumps(anti_bar_drug_first_level_second_fig_data)
# end_date = datetime(int(etime[0:4]), int(etime[5:7]), 1)
# start_date = datetime(int(btime[0:4]), int(btime[5:7]), 1)
# daterange = pd.date_range(start=btime+'-01', periods=((end_date.year - start_date.year) * 12 + (end_date.month - start_date.month)), freq='M')
# min = unixTimeMillis(daterange.min())
# max = unixTimeMillis(daterange.max())
# value = [unixTimeMillis(daterange.min()), unixTimeMillis(daterange.max())]
# marks = getMarks(daterange.min(), daterange.max())
else:
anti_bar_drug_first_level_second_fig_data = json.loads(anti_bar_drug_first_level_second_fig_data)
if db_con_url['hosname'] != anti_bar_drug_first_level_second_fig_data['hosname']:
first_level_second_fig_data = get_first_lev_second_fig_date(engine, btime, etime)
anti_bar_drug_first_level_second_fig_data['first_level_second_fig_data'] = first_level_second_fig_data.to_json(orient='split',date_format='iso')
anti_bar_drug_first_level_second_fig_data['hosname'] = db_con_url['hosname']
anti_bar_drug_first_level_second_fig_data['btime'] = btime
anti_bar_drug_first_level_second_fig_data['etime'] = etime
anti_bar_drug_first_level_second_fig_data = json.dumps( anti_bar_drug_first_level_second_fig_data)
# end_date = datetime(int(etime[0:4]), int(etime[5:7]), 1)
# start_date = datetime(int(btime[0:4]), int(btime[5:7]), 1)
# daterange = pd.date_range(start=btime + '-01', periods=( (end_date.year - start_date.year) * 12 + (end_date.month - start_date.month)), freq='M')
# min = unixTimeMillis(daterange.min())
# max = unixTimeMillis(daterange.max())
# value = [unixTimeMillis(daterange.min()), unixTimeMillis(daterange.max())]
# print(value)
# marks = getMarks(daterange.min(), daterange.max())
else:
if anti_bar_drug_first_level_second_fig_data['btime'] != btime or anti_bar_drug_first_level_second_fig_data['etime'] != etime:
# if rank_month_choice is not None and len(rank_month_choice)>0:
# print(rank_month_choice)
# btime1 = time.gmtime(rank_month_choice[0])
# etime1 = time.gmtime(rank_month_choice[1])
# btime = f"{btime1.tm_year}-0{btime1.tm_mon}" if btime1.tm_mon<10 else f"{btime1.tm_year}-{btime1.tm_mon}"
# etime = f"{etime1.tm_year}-0{etime1.tm_mon}" if etime1.tm_mon<10 else f"{etime1.tm_year}-{etime1.tm_mon}"
# print(btime,etime)
first_level_second_fig_data = get_first_lev_second_fig_date(engine, btime, etime)
anti_bar_drug_first_level_second_fig_data[ 'first_level_second_fig_data'] = first_level_second_fig_data.to_json(orient='split', date_format='iso')
anti_bar_drug_first_level_second_fig_data['btime'] = btime
anti_bar_drug_first_level_second_fig_data['etime'] = etime
anti_bar_drug_first_level_second_fig_data = json.dumps(anti_bar_drug_first_level_second_fig_data)
else:
first_level_second_fig_data = pd.read_json(anti_bar_drug_first_level_second_fig_data['first_level_second_fig_data'], orient='split')
anti_bar_drug_first_level_second_fig_data = dash.no_update
# print("一级第二张图数据:")
# print(rank_month_choice)
# print(marks)
bar = first_level_second_fig_data[first_level_second_fig_data['业务类型']=='8种耐药菌检出']
anti = first_level_second_fig_data[first_level_second_fig_data['业务类型']=='限制级特殊级抗菌药物使用']
drug = first_level_second_fig_data[first_level_second_fig_data['业务类型']=='药敏结果为耐药']
bar = bar.sort_values(['num'], ascending=True)
anti = anti.sort_values(['num'], ascending=True)
drug = drug.sort_values(['num'], ascending=True)
fig = make_subplots(rows=1,cols=3)
fig.add_trace(
go.Bar(x=anti['num'], y=anti['科室名称'], orientation='h', name='给药', marker_color=px.colors.qualitative.Dark24[0]),
row=1, col=1
)
fig.add_trace(
go.Bar(x=drug['num'], y=drug['科室名称'], orientation='h', name='药敏',
marker_color=px.colors.qualitative.Dark24[1]),
row=1, col=2,
)
fig.add_trace(
go.Bar(x=bar['num'],y=bar['科室名称'],orientation='h',name='菌检出', marker_color=px.colors.qualitative.Dark24[2]),
row=1,col=3
)
        # Horizontal legend, positioned above the plot
fig.update_layout(
margin=dict(l=20, r=20, t=20, b=20),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
))
return fig,anti_bar_drug_first_level_second_fig_data
# return fig,anti_bar_drug_first_level_second_fig_data,min ,max ,value ,marks
# # ----------------------------------------------------------------------------------------------------- Level-2 figure 1 ----------------------------------------------------------------------------------------------------------------------
# Fetch the data for level-2 figure 1 (antibiotic order data-quality checks)
def get_second_lev_first_fig_date(engine,btime,etime):
res_数据科室信息缺失及汇总 = pd.DataFrame(columns=['业务类型', 'num', 'month' ])
bus_dic = {'用药目的': f" select '用药目的缺失' as 业务类型,count(1) as num ,substr(BEGINTIME,1,7) as month from ANTIBIOTICS where (substr(BEGINTIME,1,7)>='{btime}' and substr(BEGINTIME,1,7)<='{etime}') group by substr(BEGINTIME,1,7) ",
'药物等级': f" select '药物等级缺失' as 业务类型,count(1) as num ,substr(BEGINTIME,1,7) as month from ANTIBIOTICS t1 where (substr(t1.BEGINTIME,1,7)>='{btime}' and substr(t1.BEGINTIME,1,7)<='{etime}') group by substr(BEGINTIME,1,7) ",
'医嘱开始时间大于结束时间': f" select '医嘱开始时间大于结束时间' as 业务类型,count(1) as num ,substr(BEGINTIME,1,7) as month from ANTIBIOTICS t1 where (substr(BEGINTIME,1,7)>='{btime}' and substr(BEGINTIME,1,7)<='{etime}') and BEGINTIME is not null and ENDTIME is not null and BEGINTIME>endtime group by substr(BEGINTIME,1,7) ",
'医嘱时间在出入院时间之外' : f""" select '医嘱时间在出入院时间之外' as 业务类型,count(1) as num ,substr(BEGINTIME,1,7) as month from ANTIBIOTICS t1,overall t2 where
( t1.BEGINTIME is not null and t1.ENDTIME is not null and t2.in_time is not null and t2.out_time is not null)
and t1.caseid = t2.caseid
and (t1.BEGINTIME<t2.IN_TIME or t1.BEGINTIME > t2.OUT_TIME or t1.ENDTIME<t2.IN_TIME or t1.ENDTIME > t2.OUT_TIME )
and (substr(t1.BEGINTIME,1,7)>='{btime}' and substr(t1.BEGINTIME,1,7)<='{etime}')
group by substr(BEGINTIME,1,7)
""",
}
for bus in bus_dic:
res_数据科室信息缺失及汇总 = pd.concat([res_数据科室信息缺失及汇总, pd.read_sql(bus_dic[bus], con=engine)])  # DataFrame.append was removed in pandas 2.x
return res_数据科室信息缺失及汇总
# Update second-level figure 1
@app.callback(
Output('anti_second_level_first_fig','figure'),
Output('anti_second_level_first_fig_data','data'),
Input('anti_second_level_first_fig_data','data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
# prevent_initial_call=True
)
def update_second_level_first_fig(anti_second_level_first_fig_data,db_con_url,count_time):
if db_con_url is None:
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
if anti_second_level_first_fig_data is None:
anti_second_level_first_fig = get_second_lev_first_fig_date(engine, btime, etime)
anti_second_level_first_fig_data={}
anti_second_level_first_fig_data['anti_second_level_first_fig'] = anti_second_level_first_fig.to_json(orient='split', date_format='iso')
anti_second_level_first_fig_data['hosname'] = db_con_url['hosname']
anti_second_level_first_fig_data['btime'] = btime
anti_second_level_first_fig_data['etime'] = etime
anti_second_level_first_fig_data = json.dumps(anti_second_level_first_fig_data)
else:
anti_second_level_first_fig_data = json.loads(anti_second_level_first_fig_data)
if db_con_url['hosname'] != anti_second_level_first_fig_data['hosname']:
anti_second_level_first_fig = get_second_lev_first_fig_date(engine, btime, etime)
anti_second_level_first_fig_data['anti_second_level_first_fig'] = anti_second_level_first_fig.to_json(orient='split',date_format='iso')
anti_second_level_first_fig_data['hosname'] = db_con_url['hosname']
anti_second_level_first_fig_data['btime'] = btime
anti_second_level_first_fig_data['etime'] = etime
anti_second_level_first_fig_data = json.dumps(anti_second_level_first_fig_data)
else:
if anti_second_level_first_fig_data['btime'] != btime or anti_second_level_first_fig_data['etime'] != etime:
anti_second_level_first_fig = get_second_lev_first_fig_date(engine, btime, etime)
anti_second_level_first_fig_data['anti_second_level_first_fig'] = anti_second_level_first_fig.to_json(orient='split',date_format='iso')
anti_second_level_first_fig_data['btime'] = btime
anti_second_level_first_fig_data['etime'] = etime
anti_second_level_first_fig_data = json.dumps(anti_second_level_first_fig_data)
else:
anti_second_level_first_fig = pd.read_json(anti_second_level_first_fig_data['anti_second_level_first_fig'], orient='split')
anti_second_level_first_fig_data = dash.no_update
fig_概览一级_科室映射缺失 = go.Figure()
bus_opts = anti_second_level_first_fig[['业务类型']].drop_duplicates().reset_index(drop=True)
for tem,bus in bus_opts.iterrows():
temp = anti_second_level_first_fig[anti_second_level_first_fig['业务类型']==bus['业务类型']]
temp = temp.sort_values(['month'])
if temp.shape[0]>0:
fig_概览一级_科室映射缺失.add_trace(
go.Scatter(x=temp['month'], y=temp['num'], name=bus['业务类型'] ,marker_color=px.colors.qualitative.Dark24[tem] )
)
fig_概览一级_科室映射缺失.update_layout(
margin=dict(l=20, r=20, t=20, b=20),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
)
)
fig_概览一级_科室映射缺失.update_yaxes(title_text="问题数量")
fig_概览一级_科室映射缺失.update_xaxes(title_text="月份")
return fig_概览一级_科室映射缺失,anti_second_level_first_fig_data
# Download the detail data behind second-level figure 1
@app.callback(
Output('anti_second_level_first_fig_date_detail', 'data'),
Input('anti_second_level_first_fig_data_detail_down','n_clicks'),
Input("db_con_url", "data"),
Input("count_time", "data"),
prevent_initial_call=True,
)
def download_second_level_first_fig_data_detail(n_clicks,db_con_url,count_time):
if db_con_url is None :
return dash.no_update
else:
if n_clicks is not None and n_clicks>0:
n_clicks = 0
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
bus_dic = {
'用药目的缺失': f" select * from ANTIBIOTICS where (substr(BEGINTIME,1,7)>='{btime}' and substr(BEGINTIME,1,7)<='{etime}') ",  # NOTE: no "purpose is null" predicate, so this exports every order in the period
'药物等级缺失': f" select t1.* from ANTIBIOTICS t1 where (substr(t1.BEGINTIME,1,7)>='{btime}' and substr(t1.BEGINTIME,1,7)<='{etime}') and t1.ALEVEL is null ",
'医嘱开始时间大于结束时间': f" select t1.* from ANTIBIOTICS t1 where (substr(BEGINTIME,1,7)>='{btime}' and substr(BEGINTIME,1,7)<='{etime}') and BEGINTIME is not null and ENDTIME is not null and BEGINTIME>endtime ",
'医嘱时间在出入院时间之外': f""" select t1.* from ANTIBIOTICS t1,overall t2 where
( t1.BEGINTIME is not null and t1.ENDTIME is not null and t2.in_time is not null and t2.out_time is not null)
and t1.caseid = t2.caseid
and (t1.BEGINTIME<t2.IN_TIME or t1.BEGINTIME > t2.OUT_TIME or t1.ENDTIME<t2.IN_TIME or t1.ENDTIME > t2.OUT_TIME )
and (substr(t1.BEGINTIME,1,7)>='{btime}' and substr(t1.BEGINTIME,1,7)<='{etime}')
""",
}
output = io.BytesIO()
writer = pd.ExcelWriter(output, engine='xlsxwriter')
for key in bus_dic.keys():
try:
temp = pd.read_sql(bus_dic[key], con=engine)
if temp.shape[0] > 0:
temp.to_excel(writer, sheet_name=key)
except Exception:
error_df = pd.DataFrame(['明细数据获取出错'], columns=[key])
error_df.to_excel(writer, sheet_name=key)
writer.close()  # ExcelWriter.save() was removed in pandas 2.x; close() writes the workbook
data = output.getvalue()
hosName = db_con_url['hosname']
return dcc.send_bytes(data, f'{hosName}抗菌药物问题数据明细.xlsx')
else:
return dash.no_update
# # ----------------------------------------------------------------------------------------------------- Second-level figure 2 ----------------------------------------------------------------------------------------------------------------------
# Fetch the data for antibiotics second-level figure 2
def get_second_level_second_fig_date(engine,btime,etime):
res_业务逻辑问题数据汇总 = pd.read_sql(f" select ANAME as 抗菌药物,count(1) as num , substr(BEGINTIME,1,7) as 月份 from antibiotics where substr(BEGINTIME,1,7)>='{btime}' and substr(BEGINTIME,1,7)<='{etime}' group by substr(BEGINTIME,1,7),ANAME ",con=engine)
return res_业务逻辑问题数据汇总
# Update second-level figure 2
@app.callback(
Output('anti_second_level_second_fig','figure'),
Output('anti_second_level_second_fig_data','data'),
Input('anti_second_level_second_fig_data','data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
# prevent_initial_call=True
)
def update_second_level_fig(anti_second_level_second_fig_data,db_con_url,count_time):
if db_con_url is None:
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
if anti_second_level_second_fig_data is None:
anti_second_level_second_fig_data = {}
anti_second_level_second_fig = get_second_level_second_fig_date(engine, btime, etime)
anti_second_level_second_fig_data['anti_second_level_second_fig'] = anti_second_level_second_fig.to_json(orient='split', date_format='iso')
anti_second_level_second_fig_data['hosname'] = db_con_url['hosname']
anti_second_level_second_fig_data['btime'] = btime
anti_second_level_second_fig_data['etime'] = etime
anti_second_level_second_fig_data = json.dumps(anti_second_level_second_fig_data)
else:
anti_second_level_second_fig_data = json.loads(anti_second_level_second_fig_data)
if db_con_url['hosname'] != anti_second_level_second_fig_data['hosname']:
anti_second_level_second_fig = get_second_level_second_fig_date(engine, btime, etime)
anti_second_level_second_fig_data['anti_second_level_second_fig'] = anti_second_level_second_fig.to_json(orient='split',date_format='iso')
anti_second_level_second_fig_data['hosname'] = db_con_url['hosname']
anti_second_level_second_fig_data['btime'] = btime
anti_second_level_second_fig_data['etime'] = etime
anti_second_level_second_fig_data = json.dumps(anti_second_level_second_fig_data)
else:
if anti_second_level_second_fig_data['btime'] != btime or anti_second_level_second_fig_data['etime'] != etime:
anti_second_level_second_fig = get_second_level_second_fig_date(engine, btime, etime)
anti_second_level_second_fig_data['anti_second_level_second_fig'] = anti_second_level_second_fig.to_json(orient='split',date_format='iso')
anti_second_level_second_fig_data['btime'] = btime
anti_second_level_second_fig_data['etime'] = etime
anti_second_level_second_fig_data = json.dumps(anti_second_level_second_fig_data)
else:
anti_second_level_second_fig = pd.read_json(anti_second_level_second_fig_data['anti_second_level_second_fig'], orient='split')
anti_second_level_second_fig_data = dash.no_update
antis_dict = discriminated_antis(anti_second_level_second_fig[['抗菌药物']].drop_duplicates())
anti_second_level_second_fig = anti_second_level_second_fig.merge(antis_dict,on='抗菌药物',how='left')
anti_second_level_second_fig['抗菌药物通用名'] = np.where(anti_second_level_second_fig['抗菌药物通用名'].isnull(),anti_second_level_second_fig['抗菌药物'],anti_second_level_second_fig['抗菌药物通用名'])
anti_second_level_second_fig = anti_second_level_second_fig.sort_values(['月份'])
fig = px.bar(anti_second_level_second_fig, x="月份", y="num", color='抗菌药物通用名' ,color_discrete_sequence=px.colors.qualitative.Dark24)
fig.update_layout(
margin=dict(l=20, r=20, t=20, b=20),
#title=f"{btime}--{etime}",
)
fig.update_yaxes(title_text="医嘱数量", )
fig.update_xaxes(title_text="月份", )
return fig,anti_second_level_second_fig_data
# ----------------------------------------------------------------------------------------------------- Second-level figure 3 ----------------------------------------------------------------------------------------------------------------------
# Fetch the data for antibiotics second-level figure 3
def get_second_level_third_fig_date(engine,btime,etime):
res_业务逻辑问题数据汇总 = pd.read_sql(
f" select ALEVEL as 抗菌药物等级,count(1) as num , substr(BEGINTIME,1,7) as 月份 from antibiotics where substr(BEGINTIME,1,7)>='{btime}' and substr(BEGINTIME,1,7)<='{etime}' and ALEVEL is not null group by substr(BEGINTIME,1,7),ALEVEL ",
con=engine)
return res_业务逻辑问题数据汇总
# Update second-level figure 3
@app.callback(
Output('anti_second_level_third_fig','figure'),
Output('anti_second_level_third_fig_data', 'data'),
Input('anti_second_level_third_fig_data', 'data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
)
def update_second_level_third_fig(anti_second_level_third_fig_data,db_con_url,count_time):
if db_con_url is None:
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
if anti_second_level_third_fig_data is None:
anti_second_level_third_fig_data = {}
anti_second_level_third_fig = get_second_level_third_fig_date(engine, btime, etime)
anti_second_level_third_fig_data['anti_second_level_third_fig'] = anti_second_level_third_fig.to_json( orient='split', date_format='iso')
anti_second_level_third_fig_data['hosname'] = db_con_url['hosname']
anti_second_level_third_fig_data['btime'] = btime
anti_second_level_third_fig_data['etime'] = etime
anti_second_level_third_fig_data = json.dumps(anti_second_level_third_fig_data)
else:
anti_second_level_third_fig_data = json.loads(anti_second_level_third_fig_data)
if db_con_url['hosname'] != anti_second_level_third_fig_data['hosname']:
anti_second_level_third_fig = get_second_level_third_fig_date(engine, btime, etime)
anti_second_level_third_fig_data['anti_second_level_third_fig'] = anti_second_level_third_fig.to_json(orient='split', date_format='iso')
anti_second_level_third_fig_data['hosname'] = db_con_url['hosname']
anti_second_level_third_fig_data['btime'] = btime
anti_second_level_third_fig_data['etime'] = etime
anti_second_level_third_fig_data = json.dumps(anti_second_level_third_fig_data)
else:
if anti_second_level_third_fig_data['btime'] != btime or anti_second_level_third_fig_data['etime'] != etime:
anti_second_level_third_fig = get_second_level_third_fig_date(engine, btime, etime)
anti_second_level_third_fig_data['anti_second_level_third_fig'] = anti_second_level_third_fig.to_json(orient='split', date_format='iso')
anti_second_level_third_fig_data['btime'] = btime
anti_second_level_third_fig_data['etime'] = etime
anti_second_level_third_fig_data = json.dumps(anti_second_level_third_fig_data)
else:
anti_second_level_third_fig = pd.read_json( anti_second_level_third_fig_data['anti_second_level_third_fig'], orient='split')
anti_second_level_third_fig_data = dash.no_update
anti_second_level_third_fig = anti_second_level_third_fig.sort_values(['月份'])
fig = px.bar(anti_second_level_third_fig, x="月份", y="num", color='抗菌药物等级', color_discrete_sequence=px.colors.qualitative.Dark24)
fig.update_layout(
margin=dict(l=30, r=30, t=30, b=30),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
),
)
fig.update_yaxes(title_text="医嘱数量", )
fig.update_xaxes(title_text="月份", )
return fig,anti_second_level_third_fig_data
# # ----------------------------------------------------------------------------------------------------- Third-level figure 1 ----------------------------------------------------------------------------------------------------------------------
# # Fetch the data for bacteria-detection third-level figure 1
def get_third_level_first_fig_date(engine,btime,etime):
res = pd.read_sql(f"""select substr(REQUESTTIME,1,7) as month,BACTERIA as 菌,count(1) as num from BACTERIA where BACTERIA in ('大肠埃希菌', '鲍曼不动杆菌', '肺炎克雷伯菌', '金黄色葡萄球菌', '铜绿假单胞菌', '屎肠球菌', '粪肠球菌')
and substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}'
group by BACTERIA, substr(REQUESTTIME,1,7)
""",con=engine)
return res
# Update third-level figure 1
@app.callback(
Output('bar_third_level_first_fig', 'figure'),
Output('bar_third_level_first_fig_data', 'data'),
Input('bar_third_level_first_fig_data', 'data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
)
def update_third_level_first_fig(bar_third_level_first_fig_data,db_con_url,count_time):
if db_con_url is None:
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
if bar_third_level_first_fig_data is None:
bar_third_level_first_fig_data = {}
bar_third_level_first_fig = get_third_level_first_fig_date(engine, btime, etime)
bar_third_level_first_fig_data['bar_third_level_first_fig'] = bar_third_level_first_fig.to_json( orient='split', date_format='iso')
bar_third_level_first_fig_data['hosname'] = db_con_url['hosname']
bar_third_level_first_fig_data['btime'] = btime
bar_third_level_first_fig_data['etime'] = etime
bar_third_level_first_fig_data = json.dumps(bar_third_level_first_fig_data)
else:
bar_third_level_first_fig_data = json.loads(bar_third_level_first_fig_data)
if db_con_url['hosname'] != bar_third_level_first_fig_data['hosname']:
bar_third_level_first_fig = get_third_level_first_fig_date(engine, btime, etime)
bar_third_level_first_fig_data['bar_third_level_first_fig'] = bar_third_level_first_fig.to_json(orient='split', date_format='iso')
bar_third_level_first_fig_data['hosname'] = db_con_url['hosname']
bar_third_level_first_fig_data['btime'] = btime
bar_third_level_first_fig_data['etime'] = etime
bar_third_level_first_fig_data = json.dumps(bar_third_level_first_fig_data)
else:
if bar_third_level_first_fig_data['btime'] != btime or bar_third_level_first_fig_data['etime'] != etime:
bar_third_level_first_fig = get_third_level_first_fig_date(engine, btime, etime)
bar_third_level_first_fig_data['bar_third_level_first_fig'] = bar_third_level_first_fig.to_json(orient='split', date_format='iso')
bar_third_level_first_fig_data['btime'] = btime
bar_third_level_first_fig_data['etime'] = etime
bar_third_level_first_fig_data = json.dumps(bar_third_level_first_fig_data)
else:
bar_third_level_first_fig = pd.read_json( bar_third_level_first_fig_data['bar_third_level_first_fig'], orient='split')
bar_third_level_first_fig_data = dash.no_update
bar_third_level_first_fig = bar_third_level_first_fig.sort_values(['month' ])
fig1 = px.line(bar_third_level_first_fig, x='month', y= 'num' , color= '菌', color_discrete_sequence=px.colors.qualitative.Dark24)
fig1.update_layout(
margin=dict(l=30, r=30, t=30, b=30),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
),
)
fig1.update_yaxes(title_text= '菌检出数量', )
fig1.update_xaxes(title_text= '月份', )
return fig1,bar_third_level_first_fig_data
# # ----------------------------------------------------------------------------------------------------- Third-level figure 2 ----------------------------------------------------------------------------------------------------------------------
# # Fetch the data for bacteria-detection third-level figure 2
def get_third_level_second_fig_date(engine,btime,etime):
res_信息缺失及汇总 = pd.DataFrame(columns=['业务类型', 'num', 'month'])
bus_dic = {
'菌检出类型': f" select '菌检出类型缺失' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from BACTERIA where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and BTYPE is null group by substr(REQUESTTIME,1,7) ",
'院内外': f" select '院内外标识缺失' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from BACTERIA t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and OUTSIDE is null group by substr(REQUESTTIME,1,7) ",
'标本缺失': f" select '标本缺失' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from BACTERIA t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and SPECIMEN is null group by substr(REQUESTTIME,1,7) ",
'检验项目': f" select '检验项目缺失' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from BACTERIA t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and SUBJECT is null group by substr(REQUESTTIME,1,7) ",
'申请时间大于报告时间': f" select '菌检出申请时间大于报告时间' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from BACTERIA t1 where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and REQUESTTIME is not null and REPORTTIME is not null and REQUESTTIME>REPORTTIME group by substr(REQUESTTIME,1,7) ",
'申请时间在出入院时间之外': f""" select '菌检出申请时间在出入院时间之外' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from BACTERIA t1,overall t2 where
( t1.REQUESTTIME is not null and t2.in_time is not null and t2.out_time is not null)
and t1.caseid = t2.caseid
and (t1.REQUESTTIME<t2.IN_TIME or t1.REQUESTTIME > t2.OUT_TIME )
and (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}')
group by substr(REQUESTTIME,1,7)
""",
}
for bus in bus_dic:
res_信息缺失及汇总 = pd.concat([res_信息缺失及汇总, pd.read_sql(bus_dic[bus], con=engine)])  # DataFrame.append was removed in pandas 2.x
return res_信息缺失及汇总
# Update third-level figure 2
@app.callback(
Output('bar_third_level_second_fig', 'figure'),
Output('bar_third_level_second_fig_data', 'data'),
Input('bar_third_level_second_fig_data', 'data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
)
def update_third_level_second_fig(bar_third_level_second_fig_data,db_con_url,count_time):
if db_con_url is None:
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
if bar_third_level_second_fig_data is None:
bar_third_level_second_fig_data = {}
bar_third_level_second_fig = get_third_level_second_fig_date(engine, btime, etime)
bar_third_level_second_fig_data['bar_third_level_second_fig'] = bar_third_level_second_fig.to_json( orient='split', date_format='iso')
bar_third_level_second_fig_data['hosname'] = db_con_url['hosname']
bar_third_level_second_fig_data['btime'] = btime
bar_third_level_second_fig_data['etime'] = etime
bar_third_level_second_fig_data = json.dumps(bar_third_level_second_fig_data)
else:
bar_third_level_second_fig_data = json.loads(bar_third_level_second_fig_data)
if db_con_url['hosname'] != bar_third_level_second_fig_data['hosname']:
bar_third_level_second_fig = get_third_level_second_fig_date(engine, btime, etime)
bar_third_level_second_fig_data['bar_third_level_second_fig'] = bar_third_level_second_fig.to_json(orient='split', date_format='iso')
bar_third_level_second_fig_data['hosname'] = db_con_url['hosname']
bar_third_level_second_fig_data['btime'] = btime
bar_third_level_second_fig_data['etime'] = etime
bar_third_level_second_fig_data = json.dumps(bar_third_level_second_fig_data)
else:
if bar_third_level_second_fig_data['btime'] != btime or bar_third_level_second_fig_data['etime'] != etime:
bar_third_level_second_fig = get_third_level_second_fig_date(engine, btime, etime)
bar_third_level_second_fig_data['bar_third_level_second_fig'] = bar_third_level_second_fig.to_json(orient='split', date_format='iso')
bar_third_level_second_fig_data['btime'] = btime
bar_third_level_second_fig_data['etime'] = etime
bar_third_level_second_fig_data = json.dumps(bar_third_level_second_fig_data)
else:
bar_third_level_second_fig = pd.read_json( bar_third_level_second_fig_data['bar_third_level_second_fig'], orient='split')
bar_third_level_second_fig_data = dash.no_update
bar_third_level_second_fig = bar_third_level_second_fig.sort_values(['month' ])
fig1 = px.line(bar_third_level_second_fig, x='month', y= 'num' , color= '业务类型', color_discrete_sequence=px.colors.qualitative.Dark24)
fig1.update_layout(
margin=dict(l=30, r=30, t=30, b=30),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
),
)
fig1.update_yaxes(title_text= '数据缺失数量', )
fig1.update_xaxes(title_text= '月份', )
return fig1,bar_third_level_second_fig_data
# Download the detail data behind third-level figure 2
@app.callback(
Output('bar_third_level_second_fig_data_detail', 'data'),
Input('bar_third_level_second_fig_data_detail_down','n_clicks'),
Input("db_con_url", "data"),
Input("count_time", "data"),
prevent_initial_call=True,
)
def download_third_level_second_fig_data_detail(n_clicks,db_con_url,count_time):
if db_con_url is None :
return dash.no_update
else:
if n_clicks is not None and n_clicks>0:
n_clicks = 0
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
bus_dic = {
'菌检出类型缺失': f" select * from BACTERIA where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and BTYPE is null ",
'院内外标识缺失': f" select t1.* from BACTERIA t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and OUTSIDE is null ",
'标本缺失': f" select t1.* from BACTERIA t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and SPECIMEN is null ",
'检验项目缺失': f" select t1.* from BACTERIA t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and SUBJECT is null ",
'申请时间大于报告时间': f" select t1.* from BACTERIA t1 where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and REQUESTTIME is not null and REPORTTIME is not null and REQUESTTIME>REPORTTIME",
'申请时间在出入院时间之外': f""" select t1.* from BACTERIA t1,overall t2 where
( t1.REQUESTTIME is not null and t2.in_time is not null and t2.out_time is not null)
and t1.caseid = t2.caseid
and (t1.REQUESTTIME<t2.IN_TIME or t1.REQUESTTIME > t2.OUT_TIME )
and (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}')
""",
}
output = io.BytesIO()
writer = pd.ExcelWriter(output, engine='xlsxwriter')
for key in bus_dic.keys():
try:
temp = pd.read_sql(bus_dic[key], con=engine)
if temp.shape[0] > 0:
temp.to_excel(writer, sheet_name=key)
except Exception:
error_df = pd.DataFrame(['明细数据获取出错'], columns=[key])
error_df.to_excel(writer, sheet_name=key)
writer.close()  # ExcelWriter.save() was removed in pandas 2.x; close() writes the workbook
data = output.getvalue()
hosName = db_con_url['hosname']
return dcc.send_bytes(data, f'{hosName}菌检出问题数据明细.xlsx')
else:
return dash.no_update
# # ----------------------------------------------------------------------------------------------------- Fourth-level figure 1 ----------------------------------------------------------------------------------------------------------------------
# # Fetch the data for drug-susceptibility fourth-level figure 1
def get_fourth_level_first_fig_date(engine,btime,etime):
res_信息缺失及汇总 = pd.DataFrame(columns=['业务类型', 'num', 'month'])
bus_dic = {
'药敏结果': f" select '药敏结果缺失' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from DRUGSUSCEPTIBILITY where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and SUSCEPTIBILITY is null group by substr(REQUESTTIME,1,7) ",
'标本缺失': f" select '标本缺失' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from DRUGSUSCEPTIBILITY t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and SPECIMEN is null group by substr(REQUESTTIME,1,7) ",
'申请时间大于报告时间': f" select '药敏申请时间大于报告时间' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from DRUGSUSCEPTIBILITY t1 where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and REQUESTTIME is not null and REPORTTIME is not null and REQUESTTIME>REPORTTIME group by substr(REQUESTTIME,1,7) ",
'申请时间在出入院时间之外': f""" select '药敏申请时间在出入院时间之外' as 业务类型,count(1) as num ,substr(REQUESTTIME,1,7) as month from DRUGSUSCEPTIBILITY t1,overall t2 where
( t1.REQUESTTIME is not null and t2.in_time is not null and t2.out_time is not null)
and t1.caseid = t2.caseid
and (t1.REQUESTTIME<t2.IN_TIME or t1.REQUESTTIME > t2.OUT_TIME )
and (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}')
group by substr(REQUESTTIME,1,7)
""",
}
for bus in bus_dic:
res_信息缺失及汇总 = pd.concat([res_信息缺失及汇总, pd.read_sql(bus_dic[bus], con=engine)])  # DataFrame.append was removed in pandas 2.x
return res_信息缺失及汇总
# Update fourth-level figure 1
@app.callback(
Output('drug_fourth_level_first_fig', 'figure'),
Output('drug_fourth_level_first_fig_data', 'data'),
Input('drug_fourth_level_first_fig_data', 'data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
)
def update_fourth_level_first_fig(drug_fourth_level_first_fig_data,db_con_url,count_time):
if db_con_url is None:
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
if drug_fourth_level_first_fig_data is None:
drug_fourth_level_first_fig_data = {}
drug_fourth_level_first_fig = get_fourth_level_first_fig_date(engine, btime, etime)
drug_fourth_level_first_fig_data['drug_fourth_level_first_fig'] = drug_fourth_level_first_fig.to_json( orient='split', date_format='iso')
drug_fourth_level_first_fig_data['hosname'] = db_con_url['hosname']
drug_fourth_level_first_fig_data['btime'] = btime
drug_fourth_level_first_fig_data['etime'] = etime
drug_fourth_level_first_fig_data = json.dumps(drug_fourth_level_first_fig_data)
else:
drug_fourth_level_first_fig_data = json.loads(drug_fourth_level_first_fig_data)
if db_con_url['hosname'] != drug_fourth_level_first_fig_data['hosname']:
drug_fourth_level_first_fig = get_fourth_level_first_fig_date(engine, btime, etime)
drug_fourth_level_first_fig_data['drug_fourth_level_first_fig'] = drug_fourth_level_first_fig.to_json(orient='split', date_format='iso')
drug_fourth_level_first_fig_data['hosname'] = db_con_url['hosname']
drug_fourth_level_first_fig_data['btime'] = btime
drug_fourth_level_first_fig_data['etime'] = etime
drug_fourth_level_first_fig_data = json.dumps(drug_fourth_level_first_fig_data)
else:
if drug_fourth_level_first_fig_data['btime'] != btime or drug_fourth_level_first_fig_data['etime'] != etime:
drug_fourth_level_first_fig = get_fourth_level_first_fig_date(engine, btime, etime)
drug_fourth_level_first_fig_data['drug_fourth_level_first_fig'] = drug_fourth_level_first_fig.to_json(orient='split', date_format='iso')
drug_fourth_level_first_fig_data['btime'] = btime
drug_fourth_level_first_fig_data['etime'] = etime
drug_fourth_level_first_fig_data = json.dumps(drug_fourth_level_first_fig_data)
else:
drug_fourth_level_first_fig = pd.read_json( drug_fourth_level_first_fig_data['drug_fourth_level_first_fig'], orient='split')
drug_fourth_level_first_fig_data = dash.no_update
drug_fourth_level_first_fig = drug_fourth_level_first_fig.sort_values(['month' ])
fig1 = px.line(drug_fourth_level_first_fig, x='month', y= 'num' , color= '业务类型', color_discrete_sequence=px.colors.qualitative.Dark24)
fig1.update_layout(
margin=dict(l=30, r=30, t=30, b=30),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
),
)
fig1.update_yaxes(title_text= '数据缺失数量', )
fig1.update_xaxes(title_text= '月份', )
return fig1,drug_fourth_level_first_fig_data
# Download the detail data behind fourth-level figure 1
@app.callback(
Output('drug_fourth_level_first_fig_data_detail', 'data'),
Input('drug_fourth_level_first_fig_data_detail_down','n_clicks'),
Input("db_con_url", "data"),
Input("count_time", "data"),
prevent_initial_call=True,
)
def download_fourth_level_first_fig_data_detail(n_clicks,db_con_url,count_time):
if db_con_url is None :
return dash.no_update
else:
if n_clicks is not None and n_clicks>0:
n_clicks = 0
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
bus_dic = {
'药敏结果缺失': f" select * from DRUGSUSCEPTIBILITY where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and SUSCEPTIBILITY is null ",
'标本缺失': f" select t1.* from DRUGSUSCEPTIBILITY t1 where (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}') and SPECIMEN is null ",
'申请时间大于报告时间': f" select t1.* from DRUGSUSCEPTIBILITY t1 where (substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}') and REQUESTTIME is not null and REPORTTIME is not null and REQUESTTIME>REPORTTIME ",
'申请时间在出入院时间之外': f""" select t1.* from DRUGSUSCEPTIBILITY t1,overall t2 where
( t1.REQUESTTIME is not null and t2.in_time is not null and t2.out_time is not null)
and t1.caseid = t2.caseid
and (t1.REQUESTTIME<t2.IN_TIME or t1.REQUESTTIME > t2.OUT_TIME )
and (substr(t1.REQUESTTIME,1,7)>='{btime}' and substr(t1.REQUESTTIME,1,7)<='{etime}')
""",
}
output = io.BytesIO()
writer = pd.ExcelWriter(output, engine='xlsxwriter')
for key in bus_dic.keys():
try:
temp = pd.read_sql(bus_dic[key], con=engine)
if temp.shape[0] > 0:
temp.to_excel(writer, sheet_name=key)
except Exception:
error_df = pd.DataFrame(['明细数据获取出错'], columns=[key])
error_df.to_excel(writer, sheet_name=key)
writer.close()  # ExcelWriter.save() was removed in pandas 2.x; close() writes the workbook
data = output.getvalue()
hosName = db_con_url['hosname']
return dcc.send_bytes(data, f'{hosName}药敏问题数据明细.xlsx')
else:
return dash.no_update
# # ----------------------------------------------------------------------------------------------------- Fourth-level figure 2 ----------------------------------------------------------------------------------------------------------------------
# # Fetch the data for drug-susceptibility fourth-level figure 2
def get_fourth_level_second_fig_date(engine,btime,etime):
res = pd.read_sql(f"""select count(1) as num,substr(REQUESTTIME,1,7) as month from (
select * from bacteria where (caseid,testno) not in (select caseid,testno from drugsusceptibility) and bacteria !='无菌' and bacteria not like '%酵母%' and bacteria not like '%念珠%' and bacteria not like '%真菌%'
) t1 where substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}' group by substr(REQUESTTIME,1,7)
""",con=engine)
return res
# Update fourth-level figure 2
@app.callback(
Output('drug_fourth_level_second_fig', 'figure'),
Output('drug_fourth_level_second_fig_data', 'data'),
Input('drug_fourth_level_second_fig_data', 'data'),
Input("db_con_url", "data"),
Input("count_time", "data"),
)
def update_fourth_level_second_fig(drug_fourth_level_second_fig_data,db_con_url,count_time):
if db_con_url is None:
return dash.no_update
else:
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
if drug_fourth_level_second_fig_data is None:
drug_fourth_level_second_fig_data = {}
drug_fourth_level_second_fig = get_fourth_level_second_fig_date(engine, btime, etime)
drug_fourth_level_second_fig_data['drug_fourth_level_second_fig'] = drug_fourth_level_second_fig.to_json( orient='split', date_format='iso')
drug_fourth_level_second_fig_data['hosname'] = db_con_url['hosname']
drug_fourth_level_second_fig_data['btime'] = btime
drug_fourth_level_second_fig_data['etime'] = etime
drug_fourth_level_second_fig_data = json.dumps(drug_fourth_level_second_fig_data)
else:
drug_fourth_level_second_fig_data = json.loads(drug_fourth_level_second_fig_data)
if db_con_url['hosname'] != drug_fourth_level_second_fig_data['hosname']:
drug_fourth_level_second_fig = get_fourth_level_second_fig_date(engine, btime, etime)
drug_fourth_level_second_fig_data['drug_fourth_level_second_fig'] = drug_fourth_level_second_fig.to_json(orient='split', date_format='iso')
drug_fourth_level_second_fig_data['hosname'] = db_con_url['hosname']
drug_fourth_level_second_fig_data['btime'] = btime
drug_fourth_level_second_fig_data['etime'] = etime
drug_fourth_level_second_fig_data = json.dumps(drug_fourth_level_second_fig_data)
else:
if drug_fourth_level_second_fig_data['btime'] != btime or drug_fourth_level_second_fig_data['etime'] != etime:
drug_fourth_level_second_fig = get_fourth_level_second_fig_date(engine, btime, etime)
drug_fourth_level_second_fig_data['drug_fourth_level_second_fig'] = drug_fourth_level_second_fig.to_json(orient='split', date_format='iso')
drug_fourth_level_second_fig_data['btime'] = btime
drug_fourth_level_second_fig_data['etime'] = etime
drug_fourth_level_second_fig_data = json.dumps(drug_fourth_level_second_fig_data)
else:
drug_fourth_level_second_fig = pd.read_json( drug_fourth_level_second_fig_data['drug_fourth_level_second_fig'], orient='split')
drug_fourth_level_second_fig_data = dash.no_update
drug_fourth_level_second_fig = drug_fourth_level_second_fig.sort_values(['month'])
fig1 = px.line(drug_fourth_level_second_fig, x='month', y= 'num' , color_discrete_sequence=px.colors.qualitative.Dark24)
fig1.update_layout(
margin=dict(l=30, r=30, t=30, b=30),
legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=1
),
)
fig1.update_yaxes(title_text= '有菌检出无药敏数据量', )
fig1.update_xaxes(title_text= '月份', )
return fig1,drug_fourth_level_second_fig_data
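The callback above repeats the same cache-invalidation rule in three branches: recompute when there is no cached store, when the hospital changed, or when the requested time window changed. That rule can be factored into a small helper; this is a standalone sketch assuming the same JSON-string store layout (`hosname`/`btime`/`etime` keys), not part of the original app:

```python
import json

def needs_refresh(store_json, hosname, btime, etime):
    # Recompute when the store is empty, the hospital changed,
    # or the requested [btime, etime] window changed.
    if store_json is None:
        return True
    store = json.loads(store_json) if isinstance(store_json, str) else store_json
    return (store.get('hosname') != hosname
            or store.get('btime') != btime
            or store.get('etime') != etime)
```

With such a helper, each callback collapses to one `if needs_refresh(...)` branch instead of the nested hospital/time-window checks.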
# Download details for level-4 figure 2
@app.callback(
Output('drug_fourth_level_second_fig_data_detail', 'data'),
Input('drug_fourth_level_second_fig_data_detail_down','n_clicks'),
Input("db_con_url", "data"),
Input("count_time", "data"),
prevent_initial_call=True,
)
def download_fourth_level_second_fig_data_detail(n_clicks, db_con_url, count_time):
if db_con_url is None :
return dash.no_update
else:
if n_clicks is not None and n_clicks>0:
n_clicks = 0
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
bus_dic = {
'有菌检出结果无药敏结果数据': f""" select t1.* from (
select * from bacteria where (caseid,testno) not in (select caseid,testno from drugsusceptibility) and bacteria !='无菌' and bacteria not like '%酵母%' and bacteria not like '%念珠%' and bacteria not like '%真菌%'
) t1 where substr(REQUESTTIME,1,7)>='{btime}' and substr(REQUESTTIME,1,7)<='{etime}'
""",
}
output = io.BytesIO()
writer = pd.ExcelWriter(output, engine='xlsxwriter')
for key in bus_dic.keys():
try:
temp = pd.read_sql(bus_dic[key], con=engine)
if temp.shape[0] > 0:
temp.to_excel(writer, sheet_name=key)
except Exception:
error_df = pd.DataFrame(['明细数据获取出错'], columns=[key])
error_df.to_excel(writer, sheet_name=key)
writer.save()
data = output.getvalue()
hosName = db_con_url['hosname']
return dcc.send_bytes(data, f'{hosName}有菌检出结果无药敏结果数据明细.xlsx')
else:
return dash.no_update
#
# # ----------------------------------------------------------------------------------------------------- Download all ----------------------------------------------------------------------------------------------------------------------
# Download of the page's statistics results
@app.callback(
Output("down-anti-bar-drug", "data"),
Input("anti-all-count-data-down", "n_clicks"),
Input("anti_bar_drug_first_level_first_fig_data", "data"),
Input("anti_bar_drug_first_level_second_fig_data", "data"),
Input("anti_second_level_first_fig_data", "data"),
Input("anti_second_level_second_fig_data", "data"),
Input("anti_second_level_third_fig_data", "data"),
Input("bar_third_level_first_fig_data", "data"),
Input("bar_third_level_second_fig_data", "data"),
Input("drug_fourth_level_first_fig_data", "data"),
Input("drug_fourth_level_second_fig_data", "data"),
Input("db_con_url", "data"),
Input("count_time", "data"),
prevent_initial_call=True,
)
def get_all_count_data(n_clicks, anti_bar_drug_first_level_first_fig_data,
anti_bar_drug_first_level_second_fig_data,
anti_second_level_first_fig_data,
anti_second_level_second_fig_data,
anti_second_level_third_fig_data,
bar_third_level_first_fig_data,
bar_third_level_second_fig_data,
drug_fourth_level_first_fig_data,
drug_fourth_level_second_fig_data,
db_con_url,count_time):
if db_con_url is None :
return dash.no_update
else:
if n_clicks is not None and n_clicks>0:
n_clicks = 0
db_con_url = json.loads(db_con_url)
count_time = json.loads(count_time)
engine = create_engine(db_con_url['db'])
hosName = db_con_url['hosname']
btime = count_time['btime'][0:7]
etime = count_time['etime'][0:7]
now_time = str(datetime.now())[0:19].replace(' ', '_').replace(':', '_')
if anti_bar_drug_first_level_first_fig_data is not None and anti_bar_drug_first_level_second_fig_data is not None and anti_second_level_first_fig_data is not None and \
anti_second_level_second_fig_data is not None and anti_second_level_third_fig_data is not None and bar_third_level_first_fig_data is not None and \
bar_third_level_second_fig_data is not None and drug_fourth_level_first_fig_data is not None and drug_fourth_level_second_fig_data is not None :
anti_bar_drug_first_level_first_fig_data = json.loads(anti_bar_drug_first_level_first_fig_data )
anti_bar_drug_first_level_second_fig_data = json.loads(anti_bar_drug_first_level_second_fig_data )
anti_second_level_first_fig_data = json.loads(anti_second_level_first_fig_data )
anti_second_level_second_fig_data = json.loads(anti_second_level_second_fig_data )
anti_second_level_third_fig_data = json.loads(anti_second_level_third_fig_data )
bar_third_level_first_fig_data = json.loads(bar_third_level_first_fig_data )
bar_third_level_second_fig_data = json.loads(bar_third_level_second_fig_data )
drug_fourth_level_first_fig_data = json.loads(drug_fourth_level_first_fig_data )
drug_fourth_level_second_fig_data = json.loads(drug_fourth_level_second_fig_data )
if anti_bar_drug_first_level_first_fig_data['hosname'] == hosName and anti_bar_drug_first_level_first_fig_data['btime'] == btime and anti_bar_drug_first_level_first_fig_data['etime'] == etime and \
anti_bar_drug_first_level_second_fig_data['hosname'] == hosName and anti_bar_drug_first_level_second_fig_data['btime'] == btime and anti_bar_drug_first_level_second_fig_data['etime'] == etime and \
anti_second_level_first_fig_data['hosname'] == hosName and anti_second_level_first_fig_data['btime'] == btime and anti_second_level_first_fig_data['etime'] == etime and \
anti_second_level_second_fig_data['hosname'] == hosName and anti_second_level_second_fig_data['btime'] == btime and anti_second_level_second_fig_data['etime'] == etime and \
anti_second_level_third_fig_data['hosname'] == hosName and anti_second_level_third_fig_data['btime'] == btime and anti_second_level_third_fig_data['etime'] == etime and \
bar_third_level_first_fig_data['hosname'] == hosName and bar_third_level_first_fig_data['btime'] == btime and bar_third_level_first_fig_data['etime'] == etime and \
bar_third_level_second_fig_data['hosname'] == hosName and bar_third_level_second_fig_data['btime'] == btime and bar_third_level_second_fig_data['etime'] == etime and \
drug_fourth_level_first_fig_data['hosname'] == hosName and drug_fourth_level_first_fig_data['btime'] == btime and drug_fourth_level_first_fig_data['etime'] == etime and \
drug_fourth_level_second_fig_data['hosname'] == hosName and drug_fourth_level_second_fig_data['btime'] == btime and drug_fourth_level_second_fig_data['etime'] == etime:
anti_bar_drug_first_level_first_fig = pd.read_json(anti_bar_drug_first_level_first_fig_data['anti_bar_drug_first_level_first_fig'], orient='split')
anti_bar_drug_first_level_first_fig = anti_bar_drug_first_level_first_fig[(anti_bar_drug_first_level_first_fig['month'] >= btime) & (anti_bar_drug_first_level_first_fig['month'] <= etime)]
anti_bar_drug_first_level_second_fig = pd.read_json( anti_bar_drug_first_level_second_fig_data['first_level_second_fig_data'], orient='split')
anti_second_level_first_fig = pd.read_json( anti_second_level_first_fig_data['anti_second_level_first_fig'], orient='split')
anti_second_level_second_fig = pd.read_json( anti_second_level_second_fig_data['anti_second_level_second_fig'], orient='split')
antis_dict = discriminated_antis(anti_second_level_second_fig[['抗菌药物']].drop_duplicates())
anti_second_level_second_fig = anti_second_level_second_fig.merge(antis_dict, on='抗菌药物', how='left')
anti_second_level_second_fig['抗菌药物通用名'] = np.where(anti_second_level_second_fig['抗菌药物通用名'].isnull(),
anti_second_level_second_fig['抗菌药物'],
anti_second_level_second_fig['抗菌药物通用名'])
anti_second_level_third_fig = pd.read_json( anti_second_level_third_fig_data['anti_second_level_third_fig'], orient='split')
bar_third_level_first_fig = pd.read_json( bar_third_level_first_fig_data['bar_third_level_first_fig'], orient='split')
bar_third_level_second_fig = pd.read_json( bar_third_level_second_fig_data['bar_third_level_second_fig'], orient='split')
drug_fourth_level_first_fig = pd.read_json( drug_fourth_level_first_fig_data['drug_fourth_level_first_fig'], orient='split')
drug_fourth_level_second_fig = pd.read_json( drug_fourth_level_second_fig_data['drug_fourth_level_second_fig'], orient='split')
output = io.BytesIO()
writer = pd.ExcelWriter(output, engine='xlsxwriter')
anti_bar_drug_first_level_first_fig.to_excel(writer, sheet_name='抗菌药物_菌检出_药敏每月业务数据量',index=False)
anti_bar_drug_first_level_second_fig.to_excel(writer, sheet_name='抗菌药物_菌检出_药敏业务数据量科室排行',index=False)
anti_second_level_first_fig.to_excel(writer, sheet_name='抗菌药物问题数据每月分布',index=False)
anti_second_level_second_fig.to_excel(writer, sheet_name='抗菌药物使用比例每月分布',index=False)
anti_second_level_third_fig.to_excel(writer, sheet_name='抗菌药物等级使用比例每月分布', index=False)
bar_third_level_first_fig.to_excel(writer, sheet_name='八种重点菌检出量每月分布', index=False)
bar_third_level_second_fig.to_excel(writer, sheet_name='菌检出问题数据每月分布', index=False)
drug_fourth_level_first_fig.to_excel(writer, sheet_name='药敏问题数据每月分布', index=False)
drug_fourth_level_second_fig.to_excel(writer, sheet_name='菌检出无药敏数据每月分布', index=False)
writer.save()
data = output.getvalue()
hosName = db_con_url['hosname']
return dcc.send_bytes(data, f'{hosName}_{now_time}抗菌药物_菌检出_药敏.xlsx')
else:
return dash.no_update
else:
return dash.no_update
else:
return dash.no_update
d2b513445b58e5d332735b3dcf33f366cf53bca0 | 40 | py | Python | brainframe_qt/ui/dialogs/license_dialog/widgets/brainframe_license/license_terms/__init__.py | aotuai/brainframe-qt | 082cfd0694e569122ff7c63e56dd0ec4b62d5bac | ["BSD-3-Clause"] | 17 stars
from .license_terms import LicenseTerms
d2c916d3763d82d7ccd264e658243d34602a5549 | 64 | py | Python | Python-Advanced/modules/ascii_art.py | Xamaneone/SoftUni-Intro | 985fe3249cd2adf021c2003372e840219811d989 | ["MIT"]
from pyfiglet import figlet_format
print(figlet_format(input()))
d2f776da7fcbef6008fab74c66010137456ce30e | 1,413 | py | Python | tests/unit_test.py | dljessup/dozenal-calendar | 3e9f057f44325864ef805f6cd2811fa8a865c348 | ["MIT"]
import pytest
import dozenal
def test_dozenal_digit_decimals():
assert dozenal.dozenal_digit(0) == '0'
assert dozenal.dozenal_digit(1) == '1'
assert dozenal.dozenal_digit(2) == '2'
assert dozenal.dozenal_digit(3) == '3'
assert dozenal.dozenal_digit(4) == '4'
assert dozenal.dozenal_digit(5) == '5'
assert dozenal.dozenal_digit(6) == '6'
assert dozenal.dozenal_digit(7) == '7'
assert dozenal.dozenal_digit(8) == '8'
assert dozenal.dozenal_digit(9) == '9'
def test_dozenal_digit_dek():
assert dozenal.dozenal_digit(10) == 'ᘔ'
def test_dozenal_digit_elv():
assert dozenal.dozenal_digit(11) == 'Ɛ'
def test_dozenal_digit_outofrange():
with pytest.raises(ValueError):
dozenal.dozenal_digit(12)
def test_dozenal_digit_float():
with pytest.raises(ValueError):
dozenal.dozenal_digit(3.0)
def test_dozenal_digit_str():
with pytest.raises(ValueError):
dozenal.dozenal_digit('9')
def test_iso2doz_normal():
assert dozenal.iso2doz('2017-10-20') == '6901-9-25'
def test_iso2doz_single_digit_day():
assert dozenal.iso2doz('2017-10-22') == '6901-ᘔ-00'
def test_iso2doz_presolstice():
assert dozenal.iso2doz('2017-12-20') == '6901-Ɛ-25'
def test_iso2doz_solstice():
assert dozenal.iso2doz('2017-12-21') == '6902-0-00'
def test_iso2doz_postsolstice():
assert dozenal.iso2doz('2017-12-22') == '6902-0-01'
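The digit behaviour these tests pin down (decimal digits pass through, 10 and 11 map to the dozenal characters ᘔ and Ɛ, anything else raises `ValueError`) can be sketched as a minimal reference implementation. This is a hypothetical illustration, not the project's actual `dozenal` module:

```python
# Hypothetical dozenal_digit matching the unit tests above.
# Dozenal (base-12) digits: 0-9, then dek (ᘔ) and el (Ɛ).
_DOZENAL_DIGITS = '0123456789ᘔƐ'

def dozenal_digit(n):
    # Reject bools, floats, and strings: only ints 0..11 are valid digits.
    if not isinstance(n, int) or isinstance(n, bool):
        raise ValueError(f'not an integer: {n!r}')
    if not 0 <= n <= 11:
        raise ValueError(f'out of digit range: {n}')
    return _DOZENAL_DIGITS[n]
```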
825999d47aed4b73d6686ac604dbb08a1f8783c2 | 41,318 | py | Python | tests/integration_tests/tests/agentless_tests/test_environments.py | cloudify-cosmo/cloudify-manager | 4a3f44ceb49d449bc5ebc8766b1c7b9c174ff972 | ["Apache-2.0"] | 124 stars
import time
import pytest
from integration_tests import AgentlessTestCase
from integration_tests.tests.utils import get_resource as resource
from cloudify.models_states import DeploymentState
from cloudify_rest_client.exceptions import CloudifyClientError
pytestmark = pytest.mark.group_environments
@pytest.mark.usefixtures('cloudmock_plugin')
@pytest.mark.usefixtures('mock_workflows_plugin')
@pytest.mark.usefixtures('testmockoperations_plugin')
class EnvironmentTest(AgentlessTestCase):
def _deploy_main_environment(self, resource_path,
blueprint_id=None,
deployment_id=None):
dsl_path = resource(resource_path)
deployment, _ = self.deploy_application(dsl_path,
blueprint_id=blueprint_id,
deployment_id=deployment_id)
self.client.deployments.update_labels(
deployment.id,
[
{
'csys-obj-type': 'Environment'
},
]
)
return deployment
def _assert_main_environment_after_installation(self, environment_id,
deployment_status):
environment = self.client.deployments.get(environment_id)
# The environment itself is deployed and installed correctly
self.assertEqual(
environment.deployment_status,
deployment_status
)
def _assert_deployment_environment_attr(self,
deployment,
deployment_status,
sub_services_status=None,
sub_environments_status=None,
sub_services_count=0,
sub_environments_count=0):
self.assertEqual(
deployment.deployment_status,
deployment_status
)
self.assertEqual(
deployment.sub_services_status,
sub_services_status
)
self.assertEqual(
deployment.sub_environments_status,
sub_environments_status
)
self.assertEqual(
deployment.sub_services_count,
sub_services_count
)
self.assertEqual(
deployment.sub_environments_count,
sub_environments_count
)
def _verify_statuses_and_count_for_deployment(self,
deployment_id,
deployment_status,
sub_services_status=None,
sub_environments_status=None,
sub_services_count=0,
sub_environments_count=0):
deployment = self.client.deployments.get(deployment_id)
self._assert_deployment_environment_attr(
deployment,
deployment_status,
sub_services_status,
sub_environments_status,
sub_services_count,
sub_environments_count
)
def _attach_deployment_to_parents(self, deployment_id, parents_ids,
deployment_type):
if not parents_ids:
return
parents = []
for parent_id in parents_ids:
parents.append({'csys-obj-parent': parent_id})
labels = [{'csys-obj-type': deployment_type}]
labels.extend(parents)
self.client.deployments.update_labels(deployment_id, labels)
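The helper above builds a labels payload of one `csys-obj-type` entry plus one `csys-obj-parent` entry per parent. Extracted as a standalone sketch (the function name here is illustrative, not part of the test suite):

```python
# Standalone sketch of the labels list _attach_deployment_to_parents sends.
def build_parent_labels(parents_ids, deployment_type):
    labels = [{'csys-obj-type': deployment_type}]
    labels.extend({'csys-obj-parent': parent_id} for parent_id in parents_ids)
    return labels

# build_parent_labels(['env1', 'env2'], 'service') yields:
# [{'csys-obj-type': 'service'},
#  {'csys-obj-parent': 'env1'},
#  {'csys-obj-parent': 'env2'}]
```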
def _deploy_deployment_to_environment(self,
environment,
resource_path,
deployment_type,
blueprint_id=None,
deployment_id=None,
install=False):
dsl_path = resource(resource_path)
if not install:
deployment = self.deploy(dsl_path,
blueprint_id=blueprint_id,
deployment_id=deployment_id)
else:
deployment, _ = self.deploy_application(dsl_path)
self._attach_deployment_to_parents(
deployment.id,
[environment.id],
deployment_type
)
return deployment
def _deploy_environment_with_two_levels(self, main_environment):
# # First environment
env1 = self._deploy_deployment_to_environment(
main_environment,
'dsl/simple_deployment.yaml',
'environment',
install=True
)
# # Second environment
env2 = self._deploy_deployment_to_environment(
main_environment,
'dsl/simple_deployment.yaml',
'environment',
install=True
)
# # Add service + environment to the env1
service1, environment1 = \
self._deploy_environment_with_service_and_environment(env1)
# # Add service + environment to the env2
service2, environment2 = \
self._deploy_environment_with_service_and_environment(env2)
return service1, environment1, service2, environment2
def _deploy_environment_with_three_levels(self, main_environment):
_, environment11, _, environment21 = \
self._deploy_environment_with_two_levels(main_environment)
service111, environment111 = \
self._deploy_environment_with_service_and_environment(
environment11
)
service211, environment211 = \
self._deploy_environment_with_service_and_environment(
environment21
)
return service111, environment111, service211, environment211
def _deploy_environment_with_service_and_environment(self,
main_environment):
service = self._deploy_deployment_to_environment(
main_environment,
'dsl/simple_deployment.yaml',
'service',
install=True
)
environment = self._deploy_deployment_to_environment(
main_environment,
'dsl/simple_deployment.yaml',
'environment',
install=True
)
main_environment = self.client.deployments.get(main_environment.id)
self._assert_deployment_environment_attr(
main_environment,
deployment_status=DeploymentState.GOOD,
sub_services_status=DeploymentState.GOOD,
sub_environments_status=DeploymentState.GOOD,
sub_services_count=1,
sub_environments_count=1
)
return service, environment
def _create_deployment_group_from_blueprint(self,
resource_path,
blueprint_id,
group_id,
group_size,
labels_to_add=None,
wait_on_labels_add=12):
# Upload group base blueprint
self.upload_blueprint_resource(
resource_path,
blueprint_id
)
# Handle group actions
self.client.deployment_groups.put(
group_id, blueprint_id=blueprint_id
)
self.client.deployment_groups.add_deployments(
group_id,
count=group_size
)
# Wait till the deployment created successfully before adding any
# labels in order to avoid any race condition
if labels_to_add:
time.sleep(wait_on_labels_add)
self.client.deployment_groups.put(
group_id,
labels=labels_to_add,
)
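The fixed `time.sleep(wait_on_labels_add)` above is a crude guard against the race between deployment creation and label attachment. A polling loop makes the same wait explicit and bounded; this is a generic sketch independent of the REST client:

```python
import time

def wait_until(predicate, timeout=30, interval=1):
    # Poll `predicate` until it returns True or `timeout` seconds elapse.
    # Returns True on success, False on timeout.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False
```

In the group helper, the predicate could check that all group deployments exist before `labels` are applied, instead of sleeping an arbitrary number of seconds.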
def _execute_workflow_on_group(self, group_id, workflow_id):
execution_group = self.client.execution_groups.start(
deployment_group_id=group_id,
workflow_id=workflow_id
)
self.wait_for_execution_to_end(execution_group, is_group=True)
def test_create_deployment_with_invalid_parent_label(self):
dsl_path = resource('dsl/basic.yaml')
deployment = self.deploy(dsl_path)
with self.assertRaises(CloudifyClientError):
self._attach_deployment_to_parents(
deployment.id,
['invalid_parent'],
'service'
)
def test_environment_with_cyclic_dependencies(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
deployment = self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'service'
)
with self.assertRaises(CloudifyClientError):
self._attach_deployment_to_parents(
environment.id,
[deployment.id],
'environment'
)
def test_environment_after_deploy_single_service(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'service'
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_count=1
)
def test_environment_after_deploy_multiple_services(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'service'
)
self._deploy_deployment_to_environment(
environment,
'dsl/empty_blueprint.yaml',
'service'
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_count=2
)
def test_environment_after_install_single_service(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'service',
install=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.GOOD,
sub_services_status=DeploymentState.GOOD,
sub_services_count=1
)
def test_environment_after_install_multiple_services(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'service',
install=True
)
self._deploy_deployment_to_environment(
environment,
'dsl/empty_blueprint.yaml',
'service',
install=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.GOOD,
sub_services_status=DeploymentState.GOOD,
sub_services_count=2
)
def test_environment_after_install_single_service_with_failure(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
deployment = self._deploy_deployment_to_environment(
environment,
'dsl/workflow_api.yaml',
'service',
install=True
)
with self.assertRaises(RuntimeError):
self.execute_workflow(
workflow_name='execute_operation',
deployment_id=deployment.id,
parameters={
'operation': 'test.fail',
'node_ids': ['test_node'],
'operation_kwargs': {'non_recoverable': True}
},
wait_for_execution=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_count=1
)
def test_environment_after_install_multiple_services_with_failure(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'service',
install=True
)
service2 = self._deploy_deployment_to_environment(
environment,
'dsl/workflow_api.yaml',
'service',
install=True
)
with self.assertRaises(RuntimeError):
self.execute_workflow(
workflow_name='execute_operation',
deployment_id=service2.id,
parameters={
'operation': 'test.fail',
'node_ids': ['test_node'],
'operation_kwargs': {'non_recoverable': True}
},
wait_for_execution=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_status=DeploymentState.REQUIRE_ATTENTION,
sub_services_count=2
)
def test_environment_after_deploy_single_environment(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'environment'
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_count=1
)
def test_environment_after_deploy_multiple_environments(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'environment'
)
self._deploy_deployment_to_environment(
environment,
'dsl/empty_blueprint.yaml',
'environment'
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_count=2
)
def test_environment_after_install_single_environment(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'environment',
install=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.GOOD,
sub_environments_status=DeploymentState.GOOD,
sub_environments_count=1
)
def test_environment_after_install_multiple_environments(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'environment',
install=True
)
self._deploy_deployment_to_environment(
environment,
'dsl/empty_blueprint.yaml',
'environment',
install=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.GOOD,
sub_environments_status=DeploymentState.GOOD,
sub_environments_count=2
)
def test_environment_after_install_single_environment_with_failure(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
deployment = self._deploy_deployment_to_environment(
environment,
'dsl/workflow_api.yaml',
'environment',
install=True
)
with self.assertRaises(RuntimeError):
self.execute_workflow(
workflow_name='execute_operation',
deployment_id=deployment.id,
parameters={
'operation': 'test.fail',
'node_ids': ['test_node'],
'operation_kwargs': {'non_recoverable': True}
},
wait_for_execution=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_count=1
)
def test_environment_after_install_multiple_environments_with_failure(
self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'environment',
install=True
)
deployment = self._deploy_deployment_to_environment(
environment,
'dsl/workflow_api.yaml',
'environment',
install=True
)
with self.assertRaises(RuntimeError):
self.execute_workflow(
workflow_name='execute_operation',
deployment_id=deployment.id,
parameters={
'operation': 'test.fail',
'node_ids': ['test_node'],
'operation_kwargs': {'non_recoverable': True}
},
wait_for_execution=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_status=DeploymentState.REQUIRE_ATTENTION,
sub_environments_count=2
)
def test_environment_after_install_service_and_environment(self):
environment = self._deploy_main_environment('dsl/basic.yaml')
self._assert_main_environment_after_installation(
environment.id, DeploymentState.GOOD
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'environment',
install=True
)
self._deploy_deployment_to_environment(
environment,
'dsl/simple_deployment.yaml',
'service',
install=True
)
self._verify_statuses_and_count_for_deployment(
environment.id,
deployment_status=DeploymentState.GOOD,
sub_environments_status=DeploymentState.GOOD,
sub_services_status=DeploymentState.GOOD,
sub_services_count=1,
sub_environments_count=1
)
    def test_environment_after_removing_service(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        deployment = self._deploy_deployment_to_environment(
            environment,
            'dsl/simple_deployment.yaml',
            'service'
        )
        environment = self.client.deployments.get(environment.id)
        self._assert_deployment_environment_attr(
            environment,
            deployment_status=DeploymentState.REQUIRE_ATTENTION,
            sub_services_status=DeploymentState.REQUIRE_ATTENTION,
            sub_services_count=1
        )
        self.delete_deployment(deployment.id, validate=True)
        self._verify_statuses_and_count_for_deployment(
            environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_count=0
        )

    def test_environment_after_uninstall_service(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        deployment = self._deploy_deployment_to_environment(
            environment,
            'dsl/simple_deployment.yaml',
            'service',
            install=True
        )
        environment = self.client.deployments.get(environment.id)
        self._assert_deployment_environment_attr(
            environment,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=1
        )
        self.execute_workflow(workflow_name='uninstall',
                              deployment_id=deployment.id,
                              wait_for_execution=True)
        self._verify_statuses_and_count_for_deployment(
            environment.id,
            deployment_status=DeploymentState.REQUIRE_ATTENTION,
            sub_services_status=DeploymentState.REQUIRE_ATTENTION,
            sub_services_count=1
        )

    def test_environment_after_removing_environment(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        deployment = self._deploy_deployment_to_environment(
            environment,
            'dsl/simple_deployment.yaml',
            'environment'
        )
        environment = self.client.deployments.get(environment.id)
        self._assert_deployment_environment_attr(
            environment,
            deployment_status=DeploymentState.REQUIRE_ATTENTION,
            sub_environments_status=DeploymentState.REQUIRE_ATTENTION,
            sub_environments_count=1
        )
        self.delete_deployment(deployment.id, validate=True)
        self._verify_statuses_and_count_for_deployment(
            environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_environments_count=0
        )

    def test_environment_after_uninstall_environment(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        deployment = self._deploy_deployment_to_environment(
            environment,
            'dsl/simple_deployment.yaml',
            'environment',
            install=True
        )
        self._verify_statuses_and_count_for_deployment(
            environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_environments_status=DeploymentState.GOOD,
            sub_environments_count=1
        )
        self.execute_workflow(workflow_name='uninstall',
                              deployment_id=deployment.id,
                              wait_for_execution=True)
        self._verify_statuses_and_count_for_deployment(
            environment.id,
            deployment_status=DeploymentState.REQUIRE_ATTENTION,
            sub_environments_status=DeploymentState.REQUIRE_ATTENTION,
            sub_environments_count=1
        )
    def test_environment_after_update_workflow(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        deployment = self._deploy_deployment_to_environment(
            main_environment,
            'dsl/simple_deployment.yaml',
            'service',
            install=True
        )
        main_environment = self.client.deployments.get(main_environment.id)
        self._assert_deployment_environment_attr(
            main_environment,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=1
        )
        # Deploy new parent 1
        environment_1 = self._deploy_main_environment(
            'dsl/basic.yaml',
            blueprint_id='new_parent_1',
            deployment_id='new_parent_1'
        )
        # Deploy new parent 2
        environment_2 = self._deploy_main_environment(
            'dsl/basic.yaml',
            blueprint_id='new_parent_2',
            deployment_id='new_parent_2'
        )
        self._assert_main_environment_after_installation(
            environment_1.id, DeploymentState.GOOD
        )
        self._assert_main_environment_after_installation(
            environment_2.id, DeploymentState.GOOD
        )
        self.upload_blueprint_resource(
            'dsl/simple_deployment_with_parents.yaml',
            'updated-blueprint'
        )
        dep_up = self.client.deployment_updates.update_with_existing_blueprint(
            deployment.id,
            blueprint_id='updated-blueprint'
        )
        self.wait_for_execution_to_end(
            self.client.executions.get(dep_up.execution_id))
        self._verify_statuses_and_count_for_deployment(
            environment_1.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=1
        )
        self._verify_statuses_and_count_for_deployment(
            environment_2.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=1
        )

    def test_uninstall_environment_linked_with_multiple_deployments(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        self._deploy_environment_with_service_and_environment(environment)
        with self.assertRaises(CloudifyClientError):
            self.execute_workflow(workflow_name='uninstall',
                                  deployment_id=environment.id)

    def test_stop_environment_linked_with_multiple_deployments(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        self._deploy_environment_with_service_and_environment(environment)
        with self.assertRaises(CloudifyClientError):
            self.execute_workflow(workflow_name='stop',
                                  deployment_id=environment.id)

    def test_delete_environment_linked_with_multiple_deployments(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        self._deploy_environment_with_service_and_environment(environment)
        with self.assertRaises(CloudifyClientError):
            self.client.deployments.delete(environment.id)

    def test_update_environment_linked_with_multiple_deployments(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        self._deploy_environment_with_service_and_environment(environment)
        self.upload_blueprint_resource(
            'dsl/basic_get_secret.yaml',
            'updated_basic'
        )
        with self.assertRaises(CloudifyClientError):
            self.client.deployment_updates.update_with_existing_blueprint(
                environment.id, 'updated_basic')
    def test_uninstall_environment_and_parent(self):
        environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment.id, DeploymentState.GOOD
        )
        deployment = self._deploy_deployment_to_environment(
            environment,
            'dsl/simple_deployment.yaml',
            'environment',
            install=True
        )
        self._verify_statuses_and_count_for_deployment(
            environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_environments_status=DeploymentState.GOOD,
            sub_environments_count=1
        )
        self.execute_workflow(workflow_name='uninstall',
                              deployment_id=deployment.id,
                              wait_for_execution=True)
        self.delete_deployment(deployment.id, validate=True)
        self._verify_statuses_and_count_for_deployment(
            environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_count=0
        )
        self.execute_workflow(workflow_name='uninstall',
                              deployment_id=environment.id,
                              wait_for_execution=True)
        self.delete_deployment(environment.id, validate=True)

    def test_environment_with_two_levels(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        self._deploy_environment_with_two_levels(main_environment)
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_environments_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_environments_count=4,
            sub_services_count=2
        )

    def test_environment_with_three_levels(self):
        # Main parent
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        self._deploy_environment_with_three_levels(main_environment)
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_environments_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_environments_count=6,
            sub_services_count=4
        )

    def test_environment_with_delete_child(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        service, environment = \
            self._deploy_environment_with_service_and_environment(
                main_environment
            )
        self.execute_workflow(workflow_name='uninstall',
                              deployment_id=service.id)
        self.delete_deployment(service.id, validate=True)
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_environments_status=DeploymentState.GOOD,
            sub_environments_count=1
        )

    def test_environment_with_uninstall_child(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        service, environment = \
            self._deploy_environment_with_service_and_environment(
                main_environment
            )
        self.execute_workflow(workflow_name='uninstall',
                              deployment_id=service.id)
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.REQUIRE_ATTENTION,
            sub_services_status=DeploymentState.REQUIRE_ATTENTION,
            sub_environments_status=DeploymentState.GOOD,
            sub_services_count=1,
            sub_environments_count=1
        )
    def test_environment_after_removing_parent_label(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        service, environment = \
            self._deploy_environment_with_service_and_environment(
                main_environment
            )
        self.client.deployments.update_labels(environment.id, [
            {
                'csys-obj-type': 'environment'
            }
        ])
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=1
        )

    def test_environment_after_conversion_to_service_type(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        service, environment = \
            self._deploy_environment_with_service_and_environment(
                main_environment
            )
        self.client.deployments.update_labels(environment.id, [
            {
                'csys-obj-type': 'service'
            },
            {
                'csys-obj-parent': main_environment.id
            },
        ])
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=2
        )

    def test_environment_after_conversion_to_environment_type(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        service, environment = \
            self._deploy_environment_with_service_and_environment(
                main_environment
            )
        self.client.deployments.update_labels(service.id, [
            {
                'csys-obj-type': 'environment'
            },
            {
                'csys-obj-parent': main_environment.id
            },
        ])
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_environments_status=DeploymentState.GOOD,
            sub_environments_count=2
        )

    def test_environment_with_adding_single_parent_to_group(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        self._create_deployment_group_from_blueprint(
            'dsl/simple_deployment.yaml',
            'grp-blueprint',
            'group1',
            4,
            labels_to_add=[{'csys-obj-parent': main_environment.id}]
        )
        self._execute_workflow_on_group('group1', 'install')
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=4
        )
    def test_environment_with_adding_multiple_parents_to_group(self):
        environment1 = self._deploy_main_environment('dsl/basic.yaml')
        environment2 = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            environment1.id, DeploymentState.GOOD
        )
        self._assert_main_environment_after_installation(
            environment2.id, DeploymentState.GOOD
        )
        self._create_deployment_group_from_blueprint(
            'dsl/simple_deployment.yaml',
            'grp-blueprint',
            'group1',
            4,
            labels_to_add=[{'csys-obj-parent': environment1.id},
                           {'csys-obj-parent': environment2.id}]
        )
        self._execute_workflow_on_group('group1', 'install')
        self._verify_statuses_and_count_for_deployment(
            environment1.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=4
        )
        self._verify_statuses_and_count_for_deployment(
            environment2.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=4
        )

    def test_environment_with_removing_parent_from_group(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        self._create_deployment_group_from_blueprint(
            'dsl/simple_deployment.yaml',
            'grp-blueprint',
            'group1',
            4,
            labels_to_add=[{'csys-obj-parent': main_environment.id}]
        )
        self._execute_workflow_on_group('group1', 'install')
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=4
        )
        self.client.deployment_groups.put('group1', labels=[])
        main_environment = self.client.deployments.get(main_environment.id)
        self._assert_deployment_environment_attr(
            main_environment,
            deployment_status=DeploymentState.GOOD
        )

    def test_environment_after_conversion_to_environment_type_for_group(self):
        main_environment = self._deploy_main_environment('dsl/basic.yaml')
        self._assert_main_environment_after_installation(
            main_environment.id, DeploymentState.GOOD
        )
        self._create_deployment_group_from_blueprint(
            'dsl/simple_deployment.yaml',
            'grp-blueprint',
            'group1',
            4,
            labels_to_add=[{'csys-obj-parent': main_environment.id}]
        )
        self._execute_workflow_on_group('group1', 'install')
        self._verify_statuses_and_count_for_deployment(
            main_environment.id,
            deployment_status=DeploymentState.GOOD,
            sub_services_status=DeploymentState.GOOD,
            sub_services_count=4
        )
        self.client.deployment_groups.put(
            'group1',
            labels=[{'csys-obj-parent': main_environment.id},
                    {'csys-obj-type': 'environment'}],
        )
        main_environment = self.client.deployments.get(main_environment.id)
        self._assert_deployment_environment_attr(
            main_environment,
            deployment_status=DeploymentState.GOOD,
            sub_environments_status=DeploymentState.GOOD,
            sub_environments_count=4
        )
# Ryven/packages/auto_generated/_markupbase/nodes.py (from Ryven, MIT)
from NENV import *

import _markupbase


class NodeBase(Node):
    pass


class _Declname_Match_Node(NodeBase):
    """
    Matches zero or more characters at the beginning of the string."""

    title = '_declname_match'
    type_ = '_markupbase'
    init_inputs = [
        NodeInputBP(label='string'),
        NodeInputBP(label='pos', dtype=dtypes.Data(default=0, size='s')),
        NodeInputBP(label='endpos', dtype=dtypes.Data(default=9223372036854775807, size='s')),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, _markupbase._declname_match(self.input(0), self.input(1), self.input(2)))


class _Declstringlit_Match_Node(NodeBase):
    """
    Matches zero or more characters at the beginning of the string."""

    title = '_declstringlit_match'
    type_ = '_markupbase'
    init_inputs = [
        NodeInputBP(label='string'),
        NodeInputBP(label='pos', dtype=dtypes.Data(default=0, size='s')),
        NodeInputBP(label='endpos', dtype=dtypes.Data(default=9223372036854775807, size='s')),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, _markupbase._declstringlit_match(self.input(0), self.input(1), self.input(2)))


export_nodes(
    _Declname_Match_Node,
    _Declstringlit_Match_Node,
)
# test_data/outputs/no_eol.py (from reorder_python_imports, MIT)
import os
import sys
# tests/unit/forum/test_visibility.py (from django-machina, BSD-3-Clause)
# -*- coding: utf-8 -*-
import pytest

from machina.core.db.models import get_model
from machina.core.loading import get_class
from machina.test.factories import PostFactory
from machina.test.factories import UserFactory
from machina.test.factories import create_category_forum
from machina.test.factories import create_forum
from machina.test.factories import create_topic

Forum = get_model('forum', 'Forum')
ForumVisibilityContentTree = get_class('forum.visibility', 'ForumVisibilityContentTree')


@pytest.mark.django_db
class TestForumVisibilityContentTree(object):
    @pytest.fixture(autouse=True)
    def setup(self):
        self.user = UserFactory.create()

        # Set up the following forum tree:
        #
        #     top_level_cat
        #         forum_1
        #         forum_2
        #             forum_2_child_1
        #     top_level_forum_1
        #     top_level_forum_2
        #         sub_cat
        #             sub_sub_forum
        #     top_level_forum_3
        #         forum_3
        #             forum_3_child_1
        #                 forum_3_child_1_1
        #                     deep_forum
        #     last_forum
        #
        self.top_level_cat = create_category_forum()
        self.forum_1 = create_forum(parent=self.top_level_cat)
        self.forum_2 = create_forum(parent=self.top_level_cat)
        self.forum_2_child_1 = create_forum(parent=self.forum_2)
        self.top_level_forum_1 = create_forum()
        self.top_level_forum_2 = create_forum()
        self.sub_cat = create_category_forum(parent=self.top_level_forum_2)
        self.sub_sub_forum = create_forum(parent=self.sub_cat)
        self.top_level_forum_3 = create_forum()
        self.forum_3 = create_forum(parent=self.top_level_forum_3)
        self.forum_3_child_1 = create_forum(parent=self.forum_3)
        self.forum_3_child_1_1 = create_forum(parent=self.forum_3_child_1)
        self.deep_forum = create_forum(parent=self.forum_3_child_1_1)
        self.last_forum = create_forum()

        # Set up a topic and some posts
        self.topic_1 = create_topic(forum=self.forum_1, poster=self.user)
        self.post_1 = PostFactory.create(topic=self.topic_1, poster=self.user)
        self.topic_2 = create_topic(forum=self.forum_2, poster=self.user)
        self.post_2 = PostFactory.create(topic=self.topic_2, poster=self.user)
        self.topic_3 = create_topic(forum=self.forum_2_child_1, poster=self.user)
        self.post_3 = PostFactory.create(topic=self.topic_3, poster=self.user)

    def test_can_be_initialized_from_a_list_of_forums(self):
        # Run & check
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        for forum in Forum.objects.all():
            assert forum in visibility_tree.forums

    def test_can_return_the_root_level_number(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert visibility_tree.root_level == 0

    def test_can_return_its_top_nodes(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert [n.obj for n in visibility_tree.top_nodes] == [
            self.top_level_cat, self.top_level_forum_1, self.top_level_forum_2,
            self.top_level_forum_3, self.last_forum, ]

    def test_can_return_its_visible_forums(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert [n.obj for n in visibility_tree.visible_nodes] == [
            self.top_level_cat, self.forum_1, self.forum_2, self.forum_2_child_1,
            self.top_level_forum_1, self.top_level_forum_2, self.sub_cat, self.top_level_forum_3,
            self.forum_3, self.last_forum, ]
        assert visibility_tree.visible_forums == [
            self.top_level_cat, self.forum_1, self.forum_2, self.forum_2_child_1,
            self.top_level_forum_1, self.top_level_forum_2, self.sub_cat, self.top_level_forum_3,
            self.forum_3, self.last_forum, ]
@pytest.mark.django_db
class TestForumVisibilityContentNode(object):
    @pytest.fixture(autouse=True)
    def setup(self):
        self.user = UserFactory.create()

        # Set up the following forum tree:
        #
        #     top_level_cat
        #         forum_1
        #         forum_2
        #             forum_2_child_1
        #     top_level_forum_1
        #     top_level_forum_2
        #         sub_cat
        #             sub_sub_forum
        #     top_level_forum_3
        #         forum_3
        #             forum_3_child_1
        #                 forum_3_child_1_1
        #                     deep_forum
        #     last_forum
        #
        self.top_level_cat = create_category_forum()
        self.forum_1 = create_forum(parent=self.top_level_cat)
        self.forum_2 = create_forum(parent=self.top_level_cat)
        self.forum_2_child_1 = create_forum(parent=self.forum_2)
        self.top_level_forum_1 = create_forum()
        self.top_level_forum_2 = create_forum()
        self.sub_cat = create_category_forum(parent=self.top_level_forum_2)
        self.sub_sub_forum = create_forum(parent=self.sub_cat)
        self.top_level_forum_3 = create_forum()
        self.forum_3 = create_forum(parent=self.top_level_forum_3)
        self.forum_3_child_1 = create_forum(parent=self.forum_3)
        self.forum_3_child_1_1 = create_forum(parent=self.forum_3_child_1)
        self.deep_forum = create_forum(parent=self.forum_3_child_1_1)
        self.last_forum = create_forum()

        # Set up a topic and some posts
        self.topic_1 = create_topic(forum=self.forum_1, poster=self.user)
        self.post_1 = PostFactory.create(topic=self.topic_1, poster=self.user)
        self.topic_2 = create_topic(forum=self.forum_2, poster=self.user)
        self.post_2 = PostFactory.create(topic=self.topic_2, poster=self.user)
        self.topic_3 = create_topic(forum=self.forum_2_child_1, poster=self.user)
        self.post_3 = PostFactory.create(topic=self.topic_3, poster=self.user)

    def test_can_return_its_last_post_date(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert visibility_tree.as_dict[self.top_level_cat.id].last_post_on == self.post_3.created

    def test_can_return_its_next_sibiling(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert visibility_tree.as_dict[self.forum_1.id].next_sibling \
            == visibility_tree.as_dict[self.forum_2.id]
        assert visibility_tree.as_dict[self.top_level_cat.id].next_sibling \
            == visibility_tree.as_dict[self.top_level_forum_1.id]
        assert visibility_tree.as_dict[self.forum_3_child_1_1.id].next_sibling is None

    def test_can_return_its_previous_sibiling(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert visibility_tree.as_dict[self.forum_2.id].previous_sibling \
            == visibility_tree.as_dict[self.forum_1.id]
        assert visibility_tree.as_dict[self.top_level_forum_1.id].previous_sibling \
            == visibility_tree.as_dict[self.top_level_cat.id]
        assert visibility_tree.as_dict[self.forum_3_child_1_1.id].previous_sibling is None

    def test_can_return_its_post_count(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert visibility_tree.as_dict[self.top_level_cat.id].posts_count == 3

    def test_can_return_its_topic_count(self):
        # Setup
        visibility_tree = ForumVisibilityContentTree.from_forums(Forum.objects.all())
        # Run & check
        assert visibility_tree.as_dict[self.top_level_cat.id].topics_count == 3
# core/internals/__init__.py (from yeti, Apache-2.0)
from .internals import Internals
# app/core/tests/test_admin.py (from recipe-app-api, MIT)
from django.test import TestCase, Client
from django.contrib.auth import get_user_model
# tests/guinea-pigs/diff_toplevel_assert_error.py (from teamcity-messages, Apache-2.0)
def test_test():
    assert "spam" == "eggs"
# amos-demo-python-remake/src/screen_specs.py (from amiga-experiments, MIT)
demo_screen_size = [1280, 720, False]
amiga_screen_size = [320, 200]


def zoom_size():
    return demo_screen_size[1] / amiga_screen_size[1]
# pynetbox/lib/__init__.py (from pynetbox, Apache-2.0)
from pynetbox.lib.endpoint import Endpoint
from pynetbox.lib.response import Record, IPRecord
from pynetbox.lib.query import Request, RequestError
# zhihu_topic/zhihu/zhihu/run.py (from Some-Spiders, MIT)
from scrapy import cmdline

cmdline.execute('scrapy crawl zhihu_topic-IT -s JOB_DIR=crawls/zhihu_topic-IT'.split())
# File: web/app/syzygy/__init__.py (repo: aaronSchanck/SepTech, license: Apache-2.0)
from .users.model import User
from .items.model import Item
# File: python_face_recognition_door_opener_raspberry_pi/__init__.py (repo: adneovrebo/face-recognition-door-opener-raspberry-pi, license: MIT)
# Ådne Øvrebø 2019
# File: asyncio_throttle/__init__.py (repo: dizballanze/asyncio-throttle, license: MIT)
from .throttler import Throttler
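The `Throttler` re-exported here limits how many coroutines may enter a section per time window. As a rough, self-contained sketch of that idea (this is not the library's actual implementation; `TinyThrottler` and its parameters are illustrative):

```python
import asyncio
import time


class TinyThrottler:
    """Allow at most `rate_limit` entries per `period` seconds (illustrative only)."""

    def __init__(self, rate_limit, period=1.0):
        self.rate_limit = rate_limit
        self.period = period
        self._timestamps = []

    async def __aenter__(self):
        while True:
            now = time.monotonic()
            # Drop entries that have aged out of the sliding window.
            self._timestamps = [t for t in self._timestamps if now - t < self.period]
            if len(self._timestamps) < self.rate_limit:
                self._timestamps.append(now)
                return self
            await asyncio.sleep(0.01)

    async def __aexit__(self, *exc):
        return False


async def demo():
    # Four sequential acquisitions at 2 per 0.2s: the third must wait.
    throttler = TinyThrottler(rate_limit=2, period=0.2)
    start = time.monotonic()
    for _ in range(4):
        async with throttler:
            pass
    return time.monotonic() - start
```

Used as `async with throttler: ...`, which matches the async-context-manager style the real library exposes.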
# File: tests/test_remote.py (repo: wildfish/sphinx-gitref, license: BSD-3-Clause)
"""
Test the remote definitions
"""
import re

import pytest

from sphinx_gitref.remote import Bitbucket, GitHub, GitLab, Remote, registry


def test_registry_get_by_url__github_ssh__detected():
    url = "git@github.com:user/repo.git"
    remote = registry.get_by_url(url, "branch")
    assert isinstance(remote, GitHub)
    assert remote.repo == "user/repo"
    assert remote.branch == "branch"


def test_registry_get_by_url__github_https__detected():
    url = "https://github.com/user/repo.git"
    remote = registry.get_by_url(url, "branch")
    assert isinstance(remote, GitHub)
    assert remote.repo == "user/repo"
    assert remote.branch == "branch"


def test_github_url__url_without_line__correct_url():
    remote = GitHub("user/repo", "branch")
    assert (
        remote.get_url("filename.py")
        == "https://github.com/user/repo/blob/branch/filename.py"
    )


def test_github_url__url_with_line__correct_url():
    remote = GitHub("user/repo", "branch")
    assert (
        remote.get_url("filename.py", 20)
        == "https://github.com/user/repo/blob/branch/filename.py#L20"
    )


def test_registry_get_by_url__bitbucket_ssh__detected():
    url = "git@bitbucket.org:user/repo.git"
    remote = registry.get_by_url(url, "branch")
    assert isinstance(remote, Bitbucket)
    assert remote.repo == "user/repo"
    assert remote.branch == "branch"


def test_registry_get_by_url__bitbucket_https__detected():
    url = "https://bitbucket.org/user/repo.git"
    remote = registry.get_by_url(url, "branch")
    assert isinstance(remote, Bitbucket)
    assert remote.repo == "user/repo"
    assert remote.branch == "branch"


def test_bitbucket_url__url_without_line__correct_url():
    remote = Bitbucket("user/repo", "branch")
    assert (
        remote.get_url("filename.py")
        == "https://bitbucket.org/user/repo/src/branch/filename.py"
    )


def test_bitbucket_url__url_with_line__correct_url():
    remote = Bitbucket("user/repo", "branch")
    assert (
        remote.get_url("filename.py", 20)
        == "https://bitbucket.org/user/repo/src/branch/filename.py#lines-20"
    )


def test_registry_get_by_url__gitlab_ssh__detected():
    url = "git@gitlab.com:user/repo.git"
    remote = registry.get_by_url(url, "branch")
    assert isinstance(remote, GitLab)
    assert remote.repo == "user/repo"
    assert remote.branch == "branch"


def test_registry_get_by_url__gitlab_https__detected():
    url = "https://gitlab.com/user/repo.git"
    remote = registry.get_by_url(url, "branch")
    assert isinstance(remote, GitLab)
    assert remote.repo == "user/repo"
    assert remote.branch == "branch"


def test_gitlab_url__url_without_line__correct_url():
    remote = GitLab("user/repo", "branch")
    assert (
        remote.get_url("filename.py")
        == "https://gitlab.com/user/repo/blob/branch/filename.py"
    )


def test_gitlab_url__url_with_line__correct_url():
    remote = GitLab("user/repo", "branch")
    assert (
        remote.get_url("filename.py", 20)
        == "https://gitlab.com/user/repo/blob/branch/filename.py#L20"
    )


def test_registry_get_by_url__unknown__raises_error():
    invalid = "https://invalid.example.com/user/repo.git"
    with pytest.raises(ValueError) as e:
        registry.get_by_url(invalid, "branch")
    assert str(e.value) == f"Unable to find a match for {invalid}"


def test_custom_remote__get_by_url__finds():
    class TestRemote(Remote):
        remote_match = re.compile(r"^https://test.example.com/(?P<repo>.+?).git$")

    url = "https://test.example.com/user/repo.git"
    remote = registry.get_by_url(url, "branch")
    assert isinstance(remote, TestRemote)
    assert remote.repo == "user/repo"
    assert remote.branch == "branch"
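The custom-remote test above hinges on a regex with a named `repo` group. A self-contained sketch of how such a pattern pulls the repo slug out of both SSH and HTTPS remote URLs (the patterns and names here are illustrative, not sphinx-gitref's actual ones):

```python
import re

# Illustrative patterns, one per URL style; each exposes a named `repo` group.
REMOTE_PATTERNS = [
    re.compile(r"^git@github\.com:(?P<repo>.+?)\.git$"),      # SSH style
    re.compile(r"^https://github\.com/(?P<repo>.+?)\.git$"),  # HTTPS style
]


def repo_from_url(url):
    """Return the `user/repo` slug, or raise ValueError like the registry does."""
    for pattern in REMOTE_PATTERNS:
        match = pattern.match(url)
        if match:
            return match.group("repo")
    raise ValueError(f"Unable to find a match for {url}")
```

The non-greedy `.+?` before the literal `.git` keeps the group from swallowing the suffix.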
# File: mortgageinsurance/tests/__init__.py (repo: fna/owning-a-home-api, license: CC0-1.0)
from test_commands_load_mortgage_insurance import *
from test_views import *
# File: google/ads/google_ads/v5/proto/services/account_link_service_pb2_grpc.py
# (repo: arammaliachi/google-ads-python, license: Apache-2.0)
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc

from google.ads.google_ads.v5.proto.resources import account_link_pb2 as google_dot_ads_dot_googleads__v5_dot_proto_dot_resources_dot_account__link__pb2
from google.ads.google_ads.v5.proto.services import account_link_service_pb2 as google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2


class AccountLinkServiceStub(object):
    """This service allows management of links between Google Ads accounts and other
    accounts.
    """

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.GetAccountLink = channel.unary_unary(
            '/google.ads.googleads.v5.services.AccountLinkService/GetAccountLink',
            request_serializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.GetAccountLinkRequest.SerializeToString,
            response_deserializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_resources_dot_account__link__pb2.AccountLink.FromString,
        )
        self.CreateAccountLink = channel.unary_unary(
            '/google.ads.googleads.v5.services.AccountLinkService/CreateAccountLink',
            request_serializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.CreateAccountLinkRequest.SerializeToString,
            response_deserializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.CreateAccountLinkResponse.FromString,
        )
        self.MutateAccountLink = channel.unary_unary(
            '/google.ads.googleads.v5.services.AccountLinkService/MutateAccountLink',
            request_serializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.MutateAccountLinkRequest.SerializeToString,
            response_deserializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.MutateAccountLinkResponse.FromString,
        )


class AccountLinkServiceServicer(object):
    """This service allows management of links between Google Ads accounts and other
    accounts.
    """

    def GetAccountLink(self, request, context):
        """Returns the account link in full detail.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def CreateAccountLink(self, request, context):
        """Creates an account link.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def MutateAccountLink(self, request, context):
        """Creates or removes an account link.
        From V5, create is not supported through
        AccountLinkService.MutateAccountLink. Please use
        AccountLinkService.CreateAccountLink instead.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')


def add_AccountLinkServiceServicer_to_server(servicer, server):
    rpc_method_handlers = {
        'GetAccountLink': grpc.unary_unary_rpc_method_handler(
            servicer.GetAccountLink,
            request_deserializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.GetAccountLinkRequest.FromString,
            response_serializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_resources_dot_account__link__pb2.AccountLink.SerializeToString,
        ),
        'CreateAccountLink': grpc.unary_unary_rpc_method_handler(
            servicer.CreateAccountLink,
            request_deserializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.CreateAccountLinkRequest.FromString,
            response_serializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.CreateAccountLinkResponse.SerializeToString,
        ),
        'MutateAccountLink': grpc.unary_unary_rpc_method_handler(
            servicer.MutateAccountLink,
            request_deserializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.MutateAccountLinkRequest.FromString,
            response_serializer=google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.MutateAccountLinkResponse.SerializeToString,
        ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        'google.ads.googleads.v5.services.AccountLinkService', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))


# This class is part of an EXPERIMENTAL API.
class AccountLinkService(object):
    """This service allows management of links between Google Ads accounts and other
    accounts.
    """

    @staticmethod
    def GetAccountLink(request,
                       target,
                       options=(),
                       channel_credentials=None,
                       call_credentials=None,
                       compression=None,
                       wait_for_ready=None,
                       timeout=None,
                       metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/google.ads.googleads.v5.services.AccountLinkService/GetAccountLink',
            google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.GetAccountLinkRequest.SerializeToString,
            google_dot_ads_dot_googleads__v5_dot_proto_dot_resources_dot_account__link__pb2.AccountLink.FromString,
            options, channel_credentials,
            call_credentials, compression, wait_for_ready, timeout, metadata)

    @staticmethod
    def CreateAccountLink(request,
                          target,
                          options=(),
                          channel_credentials=None,
                          call_credentials=None,
                          compression=None,
                          wait_for_ready=None,
                          timeout=None,
                          metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/google.ads.googleads.v5.services.AccountLinkService/CreateAccountLink',
            google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.CreateAccountLinkRequest.SerializeToString,
            google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.CreateAccountLinkResponse.FromString,
            options, channel_credentials,
            call_credentials, compression, wait_for_ready, timeout, metadata)

    @staticmethod
    def MutateAccountLink(request,
                          target,
                          options=(),
                          channel_credentials=None,
                          call_credentials=None,
                          compression=None,
                          wait_for_ready=None,
                          timeout=None,
                          metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/google.ads.googleads.v5.services.AccountLinkService/MutateAccountLink',
            google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.MutateAccountLinkRequest.SerializeToString,
            google_dot_ads_dot_googleads__v5_dot_proto_dot_services_dot_account__link__service__pb2.MutateAccountLinkResponse.FromString,
            options, channel_credentials,
            call_credentials, compression, wait_for_ready, timeout, metadata)
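The generated `add_AccountLinkServiceServicer_to_server` above is, at its core, a method-name-to-handler map registered under a service path. A minimal, library-free sketch of that dispatch shape (all names here are illustrative stand-ins, not grpc's API):

```python
class FakeServer:
    """Stand-in for a gRPC server that just records registered handlers."""

    def __init__(self):
        self.handlers = {}

    def add_generic_rpc_handlers(self, service_name, method_handlers):
        # Real gRPC keys handlers by "/package.Service/Method".
        for method, handler in method_handlers.items():
            self.handlers["/{}/{}".format(service_name, method)] = handler

    def invoke(self, path, request):
        return self.handlers[path](request)


class EchoServicer:
    def get_account_link(self, request):
        return {"resource_name": request}


def add_servicer_to_server(servicer, server):
    # Same shape as the generated registration function: build a
    # method-name -> bound-method map, then register it under the service path.
    method_handlers = {"GetAccountLink": servicer.get_account_link}
    server.add_generic_rpc_handlers("example.AccountLinkService", method_handlers)
```

This mirrors why the generated function needs both the servicer (for the bound methods) and the server (for registration).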
f20df414439db12060652bab7b51e095499b8c23 | 42,782 | py | Python | tests/unit/transformer/test_mapper_custom_data_types.py | Breaka84/spooq | d44eb5ad98612ff59826e7caa6d3b255f3a351c2 | [
"MIT"
] | 3 | 2021-06-15T13:48:23.000Z | 2021-11-16T12:04:55.000Z | tests/unit/transformer/test_mapper_custom_data_types.py | Breaka84/spooq | d44eb5ad98612ff59826e7caa6d3b255f3a351c2 | [
"MIT"
] | 11 | 2020-05-22T13:46:22.000Z | 2021-11-16T09:34:09.000Z | tests/unit/transformer/test_mapper_custom_data_types.py | Breaka84/spooq | d44eb5ad98612ff59826e7caa6d3b255f3a351c2 | [
"MIT"
] | 1 | 2021-03-17T16:32:45.000Z | 2021-03-17T16:32:45.000Z | from builtins import object
import json
import pytest
import datetime
import pandas as pd

from pyspark.sql import functions as F
from pyspark.sql import Row
from pyspark.sql import types as T

import spooq.transformer.mapper_custom_data_types as custom_types
from spooq.transformer import Mapper
from ...data.test_fixtures.mapper_custom_data_types_fixtures import (
    fixtures_for_spark_sql_object,
    fixtures_for_has_value,
    fixtures_for_extended_string_to_int,
    fixtures_for_extended_string_to_long,
    fixtures_for_extended_string_to_float,
    fixtures_for_extended_string_to_double,
    fixtures_for_extended_string_to_boolean,
    fixtures_for_extended_string_to_timestamp_spark2,
    fixtures_for_extended_string_unix_timestamp_ms_to_timestamp_spark2,
    fixtures_for_extended_string_to_date_spark2,
    fixtures_for_extended_string_unix_timestamp_ms_to_date_spark2,
    fixtures_for_extended_string_to_timestamp,
    fixtures_for_extended_string_unix_timestamp_ms_to_timestamp,
    fixtures_for_extended_string_to_date,
    fixtures_for_extended_string_unix_timestamp_ms_to_date,
)
from ...helpers.skip_conditions import only_spark2, only_spark3


def get_spark_data_type(input_value):
    return {
        "str": T.StringType(),
        "int": T.LongType(),
        "bool": T.BooleanType(),
        "float": T.DoubleType(),
        "NoneType": T.NullType(),
    }[type(input_value).__name__]
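`get_spark_data_type` dispatches on `type(x).__name__`. The same pattern works in plain Python without Spark, sketched here with illustrative labels in place of the `pyspark.sql.types` instances:

```python
def label_for_value(input_value):
    # Dispatch on the runtime type's name, mirroring get_spark_data_type above.
    # Note bool is checked via its own name, so True does not fall into "int".
    return {
        "str": "StringType",
        "int": "LongType",
        "bool": "BooleanType",
        "float": "DoubleType",
        "NoneType": "NullType",
    }[type(input_value).__name__]
```

Any type outside the map raises a `KeyError`, which is the desired fail-fast behaviour for fixtures with unsupported input types.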
def parameter_to_string_id(val):
    return "<" + str(val) + ">"


def parameters_to_string_id(actual_value, expected_value):
    return " actual: <{a}> ({a_cls}) -> expected: <{e}> ({e_cls})".format(
        a=actual_value, a_cls=str(type(actual_value)), e=expected_value, e_cls=str(type(expected_value))
    )


def get_input_df(spark_session, spark_context, source_key, input_value):
    input_json = json.dumps({"attributes": {"data": {source_key: input_value}}})
    return spark_session.read.json(spark_context.parallelize([input_json]))


class TestDynamicallyCallMethodsByDataTypeName(object):
    # fmt: off
    @pytest.mark.parametrize(("function_name", "data_type"), [
        ("_generate_select_expression_for_as_is", "as_is"),
        ("_generate_select_expression_without_casting", "as_is"),
        ("_generate_select_expression_without_casting", "keep"),
        ("_generate_select_expression_without_casting", "no_change"),
        ("_generate_select_expression_for_json_string", "json_string"),
        ("_generate_select_expression_for_meters_to_cm", "meters_to_cm"),
        ("_generate_select_expression_for_has_value", "has_value"),
        ("_generate_select_expression_for_timestamp_ms_to_ms", "timestamp_ms_to_ms"),
        ("_generate_select_expression_for_timestamp_ms_to_s", "timestamp_ms_to_s"),
        ("_generate_select_expression_for_timestamp_s_to_ms", "timestamp_s_to_ms"),
        ("_generate_select_expression_for_timestamp_s_to_ms", "timestamp_s_to_ms"),
        ("_generate_select_expression_for_StringNull", "StringNull"),
        ("_generate_select_expression_for_IntNull", "IntNull"),
        ("_generate_select_expression_for_IntBoolean", "IntBoolean"),
        ("_generate_select_expression_for_StringBoolean", "StringBoolean"),
        ("_generate_select_expression_for_TimestampMonth", "TimestampMonth"),
        ("_generate_select_expression_for_extended_string_to_int", "extended_string_to_int"),
        ("_generate_select_expression_for_extended_string_to_long", "extended_string_to_long"),
        ("_generate_select_expression_for_extended_string_to_float", "extended_string_to_float"),
        ("_generate_select_expression_for_extended_string_to_double", "extended_string_to_double"),
        ("_generate_select_expression_for_extended_string_to_boolean", "extended_string_to_boolean"),
        ("_generate_select_expression_for_extended_string_to_timestamp", "extended_string_to_timestamp"),
        ("_generate_select_expression_for_extended_string_unix_timestamp_ms_to_timestamp",
         "extended_string_unix_timestamp_ms_to_timestamp"),
    ])
    # fmt: on
    def test_get_select_expression_for_custom_type(self, data_type, function_name, mocker):
        source_column, name = "element.key", "element_key"
        mocked_function = mocker.patch.object(custom_types, function_name)
        custom_types._get_select_expression_for_custom_type(source_column, name, data_type)
        mocked_function.assert_called_once_with(source_column, name)

    def test_exception_is_raised_if_data_type_not_found(self):
        source_column, name = "element.key", "element_key"
        data_type = "NowhereToBeFound"
        with pytest.raises(AttributeError):
            custom_types._get_select_expression_for_custom_type(source_column, name, data_type)
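The two tests above expect each data-type name to resolve to a `_generate_select_expression_for_<name>` function, with unknown names raising `AttributeError`. A minimal, library-free sketch of that getattr-style dispatch (this is an illustration of the convention the tests assert, not spooq's actual code):

```python
class FakeModule:
    """Stand-in for the custom_types module, with one handler defined."""

    @staticmethod
    def _generate_select_expression_for_as_is(source_column, name):
        return ("as_is", source_column, name)


def get_select_expression_for_custom_type(module, source_column, name, data_type):
    # Resolve the handler by naming convention; unknown data types raise
    # AttributeError, matching the behaviour asserted in the tests above.
    function_name = "_generate_select_expression_for_" + data_type
    handler = getattr(module, function_name)
    return handler(source_column, name)
```

This explains why the test only needs to patch the target function and assert it was called once with `(source_column, name)`.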
class TestAdHocSparkSqlFunctions(object):
    @staticmethod
    def create_input_df(input_value_1, input_value_2, spark_session):
        spark_schema = T.StructType(
            [
                T.StructField(
                    "nested",
                    T.StructType(
                        [
                            T.StructField("input_key_1", get_spark_data_type(input_value_1)),
                            T.StructField("input_key_2", get_spark_data_type(input_value_2)),
                        ]
                    ),
                )
            ]
        )
        return spark_session.createDataFrame(
            data=[Row(nested=Row(input_key_1=input_value_1, input_key_2=input_value_2))], schema=spark_schema
        )

    @pytest.mark.parametrize(
        argnames=("input_value_1", "input_value_2", "mapper_function", "expected_value"),
        argvalues=fixtures_for_spark_sql_object,
    )
    def test_spark_sql_object(self, spark_session, input_value_1, input_value_2, mapper_function, expected_value):
        input_df = self.create_input_df(input_value_1, input_value_2, spark_session)
        output_df = Mapper(mapping=[("output_key", mapper_function, "as_is")]).transform(input_df)
        actual = output_df.first().output_key
        if isinstance(expected_value, datetime.datetime):
            assert (expected_value - datetime.timedelta(seconds=30)) < actual < datetime.datetime.now()
        else:
            assert actual == expected_value
class TestMiscConversions(object):
    # fmt: off
    @pytest.mark.parametrize(("input_value", "value"), [
        ("only some text",
         "only some text"),
        (None,
         None),
        ({"key": "value"},
         Row(key="value")),
        ({"key": {"other_key": "value"}},
         Row(key=Row(other_key="value"))),
        ({"age": 18, "weight": 75},
         Row(age=18, weight=75)),
        ({"list_of_friend_ids": [12, 75, 44, 76]},
         Row(list_of_friend_ids=[12, 75, 44, 76])),
        ([{"weight": "75"}, {"weight": "76"}, {"weight": "73"}],
         [Row(weight="75"), Row(weight="76"), Row(weight="73")]),
        ({"list_of_friend_ids": [{"id": 12}, {"id": 75}, {"id": 44}, {"id": 76}]},
         Row(list_of_friend_ids=[Row(id=12), Row(id=75), Row(id=44), Row(id=76)])),
    ])
    # fmt: on
    def test_generate_select_expression_without_casting(self, input_value, value, spark_session, spark_context):
        source_key, name = "demographics", "statistics"
        input_df = get_input_df(spark_session, spark_context, source_key, input_value)
        result_column = custom_types._generate_select_expression_without_casting(
            source_column=input_df["attributes"]["data"][source_key], name=name
        )
        output_df = input_df.select(result_column)
        assert output_df.schema.fieldNames() == [name], "Renaming of column"
        assert output_df.first()[name] == value, "Processing of column value"

    # fmt: off
    @pytest.mark.parametrize(("input_value", "value"), [
        ("only some text",
         "only some text"),
        (None,
         None),
        ({"key": "value"},
         '{"key": "value"}'),
        ({"key": {"other_key": "value"}},
         '{"key": {"other_key": "value"}}'),
        ({"age": 18, "weight": 75},
         '{"age": 18, "weight": 75}'),
        ({"list_of_friend_ids": [12, 75, 44, 76]},
         '{"list_of_friend_ids": [12, 75, 44, 76]}'),
        ([{"weight": "75"}, {"weight": "76"}, {"weight": "73"}],
         '[{"weight": "75"}, {"weight": "76"}, {"weight": "73"}]'),
        ({"list_of_friend_ids": [{"id": 12}, {"id": 75}, {"id": 44}, {"id": 76}]},
         '{"list_of_friend_ids": [{"id": 12}, {"id": 75}, {"id": 44}, {"id": 76}]}'),
    ])
    # fmt: on
    def test_generate_select_expression_for_json_string(self, input_value, value, spark_session, spark_context):
        source_key, name = "demographics", "statistics"
        input_df = get_input_df(spark_session, spark_context, source_key, input_value)
        result_column = custom_types._generate_select_expression_for_json_string(
            source_column=input_df["attributes"]["data"][source_key], name=name
        )
        output_df = input_df.select(result_column)
        assert output_df.schema.fieldNames() == [name], "Renaming of column"
        assert output_df.schema[name].dataType.typeName() == "string", "Casting of column"
        assert output_df.first()[name] == value, "Processing of column value"

    # fmt: off
    @pytest.mark.parametrize(
        argnames=("input_value", "expected_value"),
        argvalues=[
            (1.80, 180),
            (2., 200),
            (-1.0, -100),
            (0.0, 0),
            (0, 0),
            (2, 200),
            (-4, -400),
            ("1.80", 180),
            ("2.", 200),
            ("-1.0", -100),
            ("0.0", 0),
            (None, None),
            ("one", None),
        ])
    # fmt: on
    def test_generate_select_expression_for_meters_to_cm(self, input_value, expected_value, spark_session):
        input_df = spark_session.createDataFrame(
            data=[Row(input_key=input_value)],
            schema=T.StructType([T.StructField("input_key", get_spark_data_type(input_value), True)]),
        )
        output_df = Mapper(mapping=[("output_column", "input_key", "meters_to_cm")]).transform(input_df)
        assert output_df.first().output_column == expected_value, "Processing of column value"
        assert output_df.schema.fieldNames() == ["output_column"], "Renaming of column"
        assert output_df.schema["output_column"].dataType.typeName() == "integer", "Casting of column"

    @pytest.mark.parametrize(argnames=("input_value", "expected_value"), argvalues=fixtures_for_has_value)
    def test_generate_select_expression_for_has_value(self, input_value, expected_value, spark_session):
        input_df = spark_session.createDataFrame(
            data=[Row(input_key=input_value)],
            schema=T.StructType([T.StructField("input_key", get_spark_data_type(input_value), True)]),
        )
        output_df = Mapper(mapping=[("output_column", "input_key", "has_value")]).transform(input_df)
        assert output_df.first().output_column == expected_value, "Processing of column value"
        assert output_df.schema.fieldNames() == ["output_column"], "Renaming of column"
        assert output_df.schema["output_column"].dataType.typeName() == "boolean", "Casting of column"
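The `meters_to_cm` fixtures above cover floats, ints, numeric strings, `None`, and unparseable text. A plain-Python version of the same conversion rules, useful for checking the fixture table by hand (this is an illustration, not spooq's Spark select expression):

```python
def meters_to_cm(value):
    # Mirror the fixtures: numeric input (or numeric string) becomes
    # centimetres truncated to int; None or junk becomes None.
    if value is None:
        return None
    try:
        return int(float(value) * 100)
    except (TypeError, ValueError):
        return None
```

Truncation via `int()` matches integer casting semantics, so `1.80` maps to `180` and `"one"` maps to `None`.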
class TestExtendedStringConversions(object):
@staticmethod
def create_input_df(input_value, spark_session):
return spark_session.createDataFrame(
data=[Row(input_key=input_value)],
schema=T.StructType([T.StructField("input_key", get_spark_data_type(input_value), True)]),
)
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_int,
ids=[parameters_to_string_id(actual, expected) for actual, expected in fixtures_for_extended_string_to_int],
)
def test_extended_string_to_int(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_int")]).transform(input_df)
assert output_df.first().output_key == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.IntegerType)
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_long,
ids=[parameters_to_string_id(actual, expected) for actual, expected in fixtures_for_extended_string_to_long],
)
def test_extended_string_to_long(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_long")]).transform(input_df)
assert output_df.first().output_key == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.LongType)
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_float,
ids=[parameters_to_string_id(actual, expected) for actual, expected in fixtures_for_extended_string_to_float],
)
def test_extended_string_to_float(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_float")]).transform(input_df)
actual_value = output_df.first().output_key
if actual_value is not None:
assert pytest.approx(actual_value) == expected_value
else:
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.FloatType)
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_double,
ids=[parameters_to_string_id(actual, expected) for actual, expected in fixtures_for_extended_string_to_double],
)
def test_extended_string_to_double(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_double")]).transform(input_df)
actual_value = output_df.first().output_key
if actual_value is not None:
assert pytest.approx(actual_value) == expected_value
else:
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.DoubleType)
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_boolean,
ids=[parameters_to_string_id(actual, expected) for actual, expected in fixtures_for_extended_string_to_boolean],
)
def test_extended_string_to_boolean(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_boolean")]).transform(input_df)
assert output_df.first().output_key == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.BooleanType)
@only_spark2
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_timestamp_spark2,
ids=[
parameters_to_string_id(actual, expected)
for actual, expected in fixtures_for_extended_string_to_timestamp_spark2
],
)
def test_extended_string_to_timestamp_spark2(self, spark_session, input_value, expected_value):
# test uses timezone set to GMT / UTC (pytest.ini)!
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_timestamp")]).transform(input_df)
# workaround via pandas necessary due to bug with direct conversion
# to python datetime wrt timezone conversions (https://issues.apache.org/jira/browse/SPARK-32123)
try:
output_pd_df = output_df.toPandas()
actual_value = output_pd_df.iloc[0]["output_key"].to_pydatetime()
except ValueError:
# If input is in milliseconds it will still be stored in the DF but cannot be collected in Python
actual_value = "out_of_range_for_python"
except AttributeError:
# `.to_pydatetime()` can only be used on datetimes and throws AttributeErrors on other objects / None
actual_value = None
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.TimestampType)
@only_spark3
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_timestamp,
ids=[
parameters_to_string_id(actual, expected) for actual, expected in fixtures_for_extended_string_to_timestamp
],
)
def test_extended_string_to_timestamp(self, spark_session, input_value, expected_value):
# test uses timezone set to GMT / UTC (pytest.ini)!
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_timestamp")]).transform(input_df)
# workaround via pandas necessary due to bug with direct conversion
# to python datetime wrt timezone conversions (https://issues.apache.org/jira/browse/SPARK-32123)
output_pd_df = output_df.toPandas()
output_value = output_pd_df.iloc[0]["output_key"]
if isinstance(output_value, type(pd.NaT)):
actual_value = None
else:
actual_value = output_value.to_pydatetime()
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.TimestampType)
@only_spark2
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_unix_timestamp_ms_to_timestamp_spark2,
ids=[
parameters_to_string_id(actual, expected)
for actual, expected in fixtures_for_extended_string_unix_timestamp_ms_to_timestamp_spark2
],
)
def test_extended_string_unix_timestamp_ms_to_timestamp_spark2(self, spark_session, input_value, expected_value):
# test uses timezone set to GMT / UTC (pytest.ini)!
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(
mapping=[("output_key", "input_key", "extended_string_unix_timestamp_ms_to_timestamp")]
).transform(input_df)
# workaround via pandas necessary due to bug with direct conversion
# to python datetime wrt timezone conversions (https://issues.apache.org/jira/browse/SPARK-32123)
try:
output_pd_df = output_df.toPandas()
actual_value = output_pd_df.iloc[0]["output_key"].to_pydatetime()
assert actual_value.toordinal() == expected_value.toordinal(), (
    "actual_value: {act_val}, expected value: {expected_val}".format(
        act_val=actual_value, expected_val=expected_value
    )
)
except AttributeError:
# `.to_pydatetime()` can only be used on datetimes and throws AttributeErrors on None
assert expected_value is None
assert isinstance(output_df.schema["output_key"].dataType, T.TimestampType)
@only_spark3
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_unix_timestamp_ms_to_timestamp,
ids=[
parameters_to_string_id(actual, expected)
for actual, expected in fixtures_for_extended_string_unix_timestamp_ms_to_timestamp
],
)
def test_extended_string_unix_timestamp_ms_to_timestamp(self, spark_session, input_value, expected_value):
# test uses timezone set to GMT / UTC (pytest.ini)!
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(
mapping=[("output_key", "input_key", "extended_string_unix_timestamp_ms_to_timestamp")]
).transform(input_df)
# workaround via pandas necessary due to bug with direct conversion
# to python datetime wrt timezone conversions (https://issues.apache.org/jira/browse/SPARK-32123)
output_pd_df = output_df.toPandas()
output_value = output_pd_df.iloc[0]["output_key"]
if isinstance(output_value, type(pd.NaT)):
actual_value = None
else:
actual_value = output_value.to_pydatetime().toordinal()
expected_value = expected_value.toordinal()
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.TimestampType)
@only_spark2
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_date_spark2,
ids=[
parameters_to_string_id(actual, expected)
for actual, expected in fixtures_for_extended_string_to_date_spark2
],
)
def test_extended_string_to_date_spark2(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_date")]).transform(input_df)
try:
actual_value = output_df.first().output_key
except ValueError:
# If input is in milliseconds it will still be stored in the DF but cannot be collected in Python
actual_value = "out_of_range_for_python"
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.DateType)
@only_spark3
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_to_date,
ids=[parameters_to_string_id(actual, expected) for actual, expected in fixtures_for_extended_string_to_date],
)
def test_extended_string_to_date(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(mapping=[("output_key", "input_key", "extended_string_to_date")]).transform(input_df)
actual_value = output_df.first().output_key
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.DateType)
@only_spark2
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_unix_timestamp_ms_to_date_spark2,
ids=[
parameters_to_string_id(actual, expected)
for actual, expected in fixtures_for_extended_string_unix_timestamp_ms_to_date_spark2
],
)
def test_extended_string_unix_timestamp_ms_to_date_spark2(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(
mapping=[("output_key", "input_key", "extended_string_unix_timestamp_ms_to_date")]
).transform(input_df)
actual_value = output_df.first().output_key
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.DateType)
@only_spark3
@pytest.mark.parametrize(
argnames=("input_value", "expected_value"),
argvalues=fixtures_for_extended_string_unix_timestamp_ms_to_date,
ids=[
parameters_to_string_id(actual, expected)
for actual, expected in fixtures_for_extended_string_unix_timestamp_ms_to_date
],
)
def test_extended_string_unix_timestamp_ms_to_date(self, spark_session, input_value, expected_value):
input_df = self.create_input_df(input_value, spark_session)
output_df = Mapper(
mapping=[("output_key", "input_key", "extended_string_unix_timestamp_ms_to_date")]
).transform(input_df)
actual_value = output_df.first().output_key
assert actual_value == expected_value
assert isinstance(output_df.schema["output_key"].dataType, T.DateType)
class TestAnonymizingMethods(object):
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
("my_first_mail@myspace.com", "1"),
("", None),
(None, None),
(" ", "1"),
(100, "1"),
(0, "1")],
)
# fmt: on
def test_generate_select_expression_for_StringBoolean(self, input_value, value, spark_session, spark_context):
source_key, name = "email_address", "mail"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_StringBoolean(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "string", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
("my_first_mail@myspace.com", None),
("", None),
(None, None),
(" ", None),
(100, None),
(0, None)],
)
# fmt: on
def test_generate_select_expression_for_StringNull(self, input_value, value, spark_session, spark_context):
source_key, name = "email_address", "mail"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_StringNull(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "string", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
(12345, 1),
("", 1),
("some text", 1),
(None, None),
(0, 1),
(1, 1),
(-1, 1),
(5445.23, 1),
(float("inf"), 1),
(-1 * float("inf"), 1)
])
# fmt: on
def test_generate_select_expression_for_IntBoolean(self, input_value, value, spark_session, spark_context):
source_key, name = "user_id", "id"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_IntBoolean(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "integer", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
(12345, None),
("", None),
("some text", None),
(None, None),
(0, None),
(1, None),
(-1, None),
(5445.23, None),
(float("inf"), None),
(-1 * float("inf"), None),
])
# fmt: on
def test_generate_select_expression_for_IntNull(self, input_value, value, spark_session, spark_context):
source_key, name = "user_id", "id"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_IntNull(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "integer", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@only_spark2
@pytest.mark.parametrize(("input_value", "value"), [
(None, None),
("1955-09-41", None),
("1969-04-03", "1969-04-01"),
("1985-03-07", "1985-03-01"),
("1998-06-10", "1998-06-01"),
("1967-05-16", "1967-05-01"),
("1953-01-01", "1953-01-01"),
("1954-11-06", "1954-11-01"),
("1978-09-05", "1978-09-01"),
("1999-05-23", "1999-05-01"),
])
# fmt: on
def test_generate_select_expression_for_TimestampMonth_spark2(
self, input_value, value, spark_session, spark_context
):
source_key, name = "day_of_birth", "birthday"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
input_df = input_df.withColumn(
source_key, F.to_utc_timestamp(input_df["attributes"]["data"][source_key], "yyyy-MM-dd")
)
result_column = custom_types._generate_select_expression_for_TimestampMonth(
source_column=input_df[source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "timestamp", "Casting of column"
output_value = output_df.first()[name]
if output_value:
output_value = output_value.strftime("%Y-%m-%d")
else:
output_value = None
assert output_value == value, "Processing of column value"
# fmt: off
@only_spark3
@pytest.mark.parametrize(("input_value", "value"), [
(None, None),
("1955-09-41", None),
("1969-04-03", "1969-04-01"),
("1985-03-07", "1985-03-01"),
("1998-06-10", "1998-06-01"),
("1967-05-16", "1967-05-01"),
("1953-01-01", "1953-01-01"),
("1954-11-06", "1954-11-01"),
("1978-09-05", "1978-09-01"),
("1999-05-23", "1999-05-01"),
])
# fmt: on
def test_generate_select_expression_for_TimestampMonth(self, input_value, value, spark_session, spark_context):
source_key, name = "day_of_birth", "birthday"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
input_df = input_df.withColumn(source_key, input_df["attributes"]["data"][source_key].cast(T.DateType()))
result_column = custom_types._generate_select_expression_for_TimestampMonth(
source_column=input_df[source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "timestamp", "Casting of column"
output_value = output_df.first()[name]
if output_value:
output_value = output_value.strftime("%Y-%m-%d")
else:
output_value = None
assert output_value == value, "Processing of column value"
class TestTimestampMethods(object):
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
(0, 0), # minimum valid timestamp
(-1, None), # minimum valid timestamp - 1 ms
(None, None),
(4102358400000, 4102358400000), # maximum valid timestamp
(4102358400001, None), # maximum valid timestamp + 1 ms
(5049688276000, None),
(3469296996000, 3469296996000),
(7405162940000, None),
(2769601503000, 2769601503000),
(-1429593275000, None),
(3412549669000, 3412549669000),
])
# fmt: on
def test_generate_select_expression_for_timestamp_ms_to_ms(self, input_value, value, spark_session, spark_context):
source_key, name = "updated_at", "updated_at_ms"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_timestamp_ms_to_ms(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "long", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
(0, 0), # minimum valid timestamp
(-1, None), # minimum valid timestamp - 1 ms
(None, None),
(4102358400000, 4102358400), # maximum valid timestamp
(4102358400001, None), # maximum valid timestamp + 1 ms
(5049688276000, None),
(3469296996000, 3469296996),
(7405162940000, None),
(2769601503000, 2769601503),
(-1429593275000, None),
(3412549669000, 3412549669),
])
# fmt: on
def test_generate_select_expression_for_timestamp_ms_to_s(self, input_value, value, spark_session, spark_context):
source_key, name = "updated_at", "updated_at_ms"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_timestamp_ms_to_s(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "long", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
(0, 0), # minimum valid timestamp
(-1, None), # minimum valid timestamp - 1 s
(None, None),
(4102358400, 4102358400000), # maximum valid timestamp
(4102358401, None), # maximum valid timestamp + 1 s
(5049688276, None),
(3469296996, 3469296996000),
(7405162940, None),
(2769601503, 2769601503000),
(-1429593275, None),
(3412549669, 3412549669000),
])
# fmt: on
def test_generate_select_expression_for_timestamp_s_to_ms(self, input_value, value, spark_session, spark_context):
source_key, name = "updated_at", "updated_at_ms"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_timestamp_s_to_ms(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "long", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
(0, 0), # minimum valid timestamp
(-1, None), # minimum valid timestamp - 1 s
(None, None),
(4102358400, 4102358400), # maximum valid timestamp
(4102358401, None), # maximum valid timestamp + 1 s
(5049688276, None),
(3469296996, 3469296996),
(7405162940, None),
(2769601503, 2769601503),
(-1429593275, None),
(3412549669, 3412549669),
])
# fmt: on
def test_generate_select_expression_for_timestamp_s_to_s(self, input_value, value, spark_session, spark_context):
source_key, name = "updated_at", "updated_at_ms"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._generate_select_expression_for_timestamp_s_to_s(
source_column=input_df["attributes"]["data"][source_key], name=name
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "long", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
# fmt: off
@pytest.mark.parametrize(
argnames="input_value",
argvalues=[1591627696951, 0, -1, 1]
)
# fmt: on
def test_generate_select_expression_for_unix_timestamp_ms_to_spark_timestamp(self, input_value, spark_session):
input_df = spark_session.createDataFrame(
[Row(input_column=input_value)], schema=T.StructType([T.StructField("input_column", T.LongType(), True)])
)
output_df = Mapper(
mapping=[("output_column", "input_column", "unix_timestamp_ms_to_spark_timestamp")]
).transform(input_df)
expected_value = datetime.datetime.fromtimestamp(input_value / 1000.0)
assert output_df.first().output_column == expected_value, "Processing of column value"
assert output_df.schema.fieldNames() == ["output_column"], "Renaming of column"
assert output_df.schema["output_column"].dataType.typeName() == "timestamp", "Casting of column"
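The timestamp tests above all share the same validity window (0 to 4102358400000 ms, per the fixture comments). A pure-Python sketch of the ms-to-s conversion those fixtures describe, with illustrative names that are not part of the library under test:

```python
MIN_VALID_MS = 0
MAX_VALID_MS = 4102358400000  # maximum valid timestamp per the fixtures above

def timestamp_ms_to_s(value):
    """Map out-of-range inputs to None, truncate valid ones to seconds."""
    if value is None or not (MIN_VALID_MS <= value <= MAX_VALID_MS):
        return None
    return value // 1000

assert timestamp_ms_to_s(4102358400000) == 4102358400
assert timestamp_ms_to_s(-1) is None
```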
class TestAddCustomDataTypeInRuntime(object):
@staticmethod
def _generate_select_expression_for_HelloWorld(source_column, name):
from pyspark.sql import types as sql_types
from pyspark.sql.functions import udf
def _to_hello_world(col):
if not col:
return None
else:
return "Hello World"
udf_hello_world = udf(_to_hello_world, sql_types.StringType())
return udf_hello_world(source_column).alias(name)
def test_custom_data_type_is_added(self, mocker):
source_column, name = "element.key", "element_key"
custom_types.add_custom_data_type(
function_name="_generate_select_expression_for_HelloWorld",
func=self._generate_select_expression_for_HelloWorld,
)
mocked_function = mocker.patch.object(custom_types, "_generate_select_expression_for_HelloWorld")
custom_types._get_select_expression_for_custom_type(source_column, name, "HelloWorld")
mocked_function.assert_called_once_with(source_column, name)
# fmt: off
@pytest.mark.parametrize(("input_value", "value"), [
("Some other string", "Hello World"),
("", None),
(None, None),
(" ", "Hello World"),
(100, "Hello World"),
(0, None)
])
# fmt: on
def test_custom_data_type_is_applied(self, input_value, value, spark_session, spark_context):
custom_types.add_custom_data_type(
function_name="_generate_select_expression_for_HelloWorld",
func=self._generate_select_expression_for_HelloWorld,
)
source_key, name, data_type = "key_name", "key", "HelloWorld"
input_df = get_input_df(spark_session, spark_context, source_key, input_value)
result_column = custom_types._get_select_expression_for_custom_type(
source_column=input_df["attributes"]["data"][source_key], name=name, data_type=data_type
)
output_df = input_df.select(result_column)
assert output_df.schema.fieldNames() == [name], "Renaming of column"
assert output_df.schema[name].dataType.typeName() == "string", "Casting of column"
assert output_df.first()[name] == value, "Processing of column value"
def test_multiple_columns_are_accessed(self, spark_session):
input_df = spark_session.createDataFrame(
[
Row(first_name="David", last_name="Eigenstuhler"),
Row(first_name="Katharina", last_name="Hohensinn"),
Row(first_name="Nora", last_name="Hohensinn"),
]
)
input_values = input_df.rdd.map(lambda x: x.asDict()).collect()
expected_values = [d["first_name"] + "_" + d["last_name"] for d in input_values]
def _first_and_last_name(source_column, name):
return F.concat_ws("_", source_column, F.col("last_name")).alias(name)
custom_types.add_custom_data_type(function_name="fullname", func=_first_and_last_name)
output_df = Mapper([("full_name", "first_name", "fullname")]).transform(input_df)
output_values = output_df.rdd.map(lambda x: x.asDict()["full_name"]).collect()
assert expected_values == output_values
def test_function_name_is_shortened(self, spark_session):
input_df = spark_session.createDataFrame(
[
Row(first_name="David"),
Row(first_name="Katharina"),
Row(first_name="Nora"),
]
)
input_values = input_df.rdd.map(lambda x: x.asDict()["first_name"]).collect()
expected_values = [fn.lower() for fn in input_values]
def _lowercase(source_column, name):
return F.lower(source_column).alias(name)
custom_types.add_custom_data_type(function_name="lowercase", func=_lowercase)
output_df = Mapper([("first_name", "first_name", "lowercase")]).transform(input_df)
output_values = output_df.rdd.map(lambda x: x.asDict()["first_name"]).collect()
assert expected_values == output_values
# File: ex4_02.py (repo: FMarnix/Py4Ev, license: MIT)
def print_lyrics():
print("I'm a Lumberjack, and I'm okay.")
print('I sleep all night and I work all day.')
def repeat_lyrics():
print_lyrics()
print_lyrics()
repeat_lyrics()
# File: tests/helpers/examples/cart/serializers.py (repo: nicoddemus/dependencies, license: BSD-2-Clause)
from rest_framework import serializers
class ItemSerializer(serializers.Serializer):
pass
class UserSerializer(serializers.Serializer):
pass
# File: RPi/GPIO/__init__.py (repo: Def4l71diot/RPi.GPIO-def, license: MIT)
from RPi.GPIO.definitions import *
# File: posts/views.py (repo: B339r1p/twitter-lite, license: MIT)
from django.shortcuts import HttpResponse
# Create your views here.
def homepage(request):
return HttpResponse('Hello Universe, this is our first alien meeting. Behave yourself or get expelled.')
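A view like this is only reachable once it is wired into a URLconf; a minimal, hypothetical `urls.py` fragment (module path and route name are assumptions, not taken from this repo) might look like:

```python
from django.urls import path

from posts import views

urlpatterns = [
    # route the site root to the homepage view above
    path("", views.homepage, name="homepage"),
]
```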
#!/usr/bin/env python
# File: taco_salad/toppings/criteria.py (repo: mbrner/taco_salat, license: MIT)
# -*- coding: utf-8 -*-
from __future__ import absolute_import, print_function, division
import numpy as np
import numexpr as ne
def purity_criteria(threshold=0.99):
"""Returns a decision function which computes the absolute difference
between the achieved and the desired purity.
Parameters
----------
threshold : float or callable
If float, a constant criterion, independent of the position of the
window, has to be fulfilled for each window.
If callable, the function has to take the position and return a
criterion not greater than 1.
Returns
-------
decision_function : callable
A func(y_true, y_pred, position, sample_weights=None) returning
'difference', the absolute difference between the achieved and
the desired purity.
"""
if isinstance(threshold, float):
if threshold > 1.:
raise ValueError('Constant threshold must be <= 1')
def threshold_func(x):
return threshold
elif callable(threshold):
threshold_func = threshold
else:
raise TypeError('\'threshold\' must be either float or callable')
def decision_function(y_true, y_pred, position, sample_weights=None):
"""Measures how far the achieved purity is from the desired purity.
Parameters
----------
y_true : 1d array-like
Ground truth (correct) target values. Only binary
classification is supported.
y_pred : 1d array-like
Estimated targets.
position : float
Value indicating the position of the cut window.
Returns
-------
difference : float
Absolute difference between the achieved purity and the
threshold at this position.
"""
float_criteria = threshold_func(position)
if not isinstance(float_criteria, float):
raise TypeError('Callable threshold must return float <= 1.')
if float_criteria > 1.:
raise ValueError('Callable threshold returned value > 1')
y_true_bool = np.array(y_true, dtype=bool)
y_pred_bool = np.array(y_pred, dtype=bool)
if sample_weights is None:
tp = np.sum(y_true_bool[y_pred_bool])
fp = np.sum(~y_true_bool[y_pred_bool])
else:
idx_tp = np.logical_and(y_true_bool, y_pred_bool)
idx_fp = np.logical_and(~y_true_bool, y_pred_bool)
tp = np.sum(sample_weights[idx_tp])
fp = np.sum(sample_weights[idx_fp])
if tp + fp == 0:
purity = 0.
else:
purity = tp / (tp + fp)
return np.absolute(purity - float_criteria)
return decision_function
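For intuition, the behaviour of the returned decision function can be reproduced without numpy; a self-contained pure-Python sketch for the constant-threshold, unweighted case (names here are illustrative):

```python
def make_purity_decision(threshold=0.9):
    # mirrors purity_criteria's inner decision_function for the
    # unweighted case, using plain lists instead of numpy arrays
    def decision(y_true, y_pred, position, sample_weights=None):
        pairs = list(zip(map(bool, y_true), map(bool, y_pred)))
        tp = sum(1 for t, p in pairs if t and p)
        fp = sum(1 for t, p in pairs if not t and p)
        purity = 0.0 if tp + fp == 0 else tp / (tp + fp)
        return abs(purity - threshold)
    return decision

decision = make_purity_decision(threshold=0.9)
# tp=2, fp=1 -> purity = 2/3, distance to threshold = |2/3 - 0.9|
distance = decision([1, 1, 0, 1], [1, 1, 1, 0], position=0.5)
```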
def general_confusion_matrix_criteria(eval_str, threshold=0.99):
"""Returns a decision function which computes the absolute difference
between the evaluated criterion and the threshold.
Parameters
----------
eval_str : string
String for the criterion. The criterion is evaluated using
the numexpr module. Usable values are:
tp: True Positives
fp: False Positives
tn: True Negatives
fn: False Negatives
E.g. purity as the criterion: 'tp / (tp + fp)'
threshold : float or callable
If float, a constant criterion, independent of the position of the
window, has to be fulfilled for each window.
If callable, the function has to take the position and return a
criterion not greater than 1.
Returns
-------
decision_function : callable
A func(y_true, y_pred, position, sample_weights=None) returning
'difference', the absolute difference between the evaluated
criterion and the threshold.
"""
if isinstance(threshold, float):
if threshold > 1.:
raise ValueError('Constant threshold must be <= 1')
def threshold_func(x):
return threshold
elif callable(threshold):
threshold_func = threshold
else:
raise TypeError('\'threshold\' must be either float or callable')
calc_tp = "tp" in eval_str
calc_fp = "fp" in eval_str
calc_tn = "tn" in eval_str
calc_fn = "fn" in eval_str
def decision_function(y_true, y_pred, position, sample_weights=None):
"""Measures how far the evaluated criterion is from the threshold.
Parameters
----------
y_true : 1d array-like
Ground truth (correct) target values. Only binary
classification is supported.
y_pred : 1d array-like
Estimated targets.
position : float
Value indicating the position of the cut window.
Returns
-------
difference : float
Absolute difference between the evaluated criterion and the
threshold at this position.
"""
float_criteria = threshold_func(position)
if not isinstance(float_criteria, float):
raise TypeError('Callable threshold must return float <= 1.')
if float_criteria > 1.:
raise ValueError('Callable threshold returned value > 1')
y_true_bool = np.array(y_true, dtype=bool)
y_pred_bool = np.array(y_pred, dtype=bool)
if calc_tp:
if sample_weights is None:
tp = np.sum(y_true_bool[y_pred_bool])
else:
idx_tp = np.logical_and(y_true_bool, y_pred_bool)
tp = np.sum(sample_weights[idx_tp]) # NOQA
if calc_fp:
if sample_weights is None:
fp = np.sum(~y_true_bool[y_pred_bool])
else:
idx_fp = np.logical_and(~y_true_bool, y_pred_bool)
fp = np.sum(sample_weights[idx_fp]) # NOQA
if calc_tn:
if sample_weights is None:
tn = np.sum(~y_true_bool[~y_pred_bool])
else:
idx_tn = np.logical_and(~y_true_bool, ~y_pred_bool)
tn = np.sum(sample_weights[idx_tn]) # NOQA
if calc_fn:
if sample_weights is None:
fn = np.sum(y_true_bool[~y_pred_bool])
else:
idx_fn = np.logical_and(y_true_bool, ~y_pred_bool)
fn = np.sum(sample_weights[idx_fn]) # NOQA
criteria_value = ne.evaluate(eval_str)
return np.absolute(criteria_value - float_criteria)
return decision_function
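The `eval_str` mechanism can be illustrated without numexpr by evaluating the same expression over plain confusion-matrix counts. This is a stand-in sketch only; the module itself uses `ne.evaluate`, and the helper name is hypothetical:

```python
def evaluate_criterion(eval_str, y_true, y_pred):
    pairs = list(zip(map(bool, y_true), map(bool, y_pred)))
    counts = {
        "tp": sum(1 for t, p in pairs if t and p),
        "fp": sum(1 for t, p in pairs if not t and p),
        "tn": sum(1 for t, p in pairs if not t and not p),
        "fn": sum(1 for t, p in pairs if t and not p),
    }
    # numexpr stand-in: evaluate the expression with the counts
    # as the only available names
    return eval(eval_str, {"__builtins__": {}}, counts)

# tp=2, fp=1 for these labels, so purity evaluates to 2/3
purity = evaluate_criterion("tp / (tp + fp)", [1, 1, 0, 1], [1, 1, 1, 0])
```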
# File: saleor/payment/gateways/stripe/tests/test_stripe_api.py (repo: koopape/saleor, license: CC-BY-4.0)
from decimal import Decimal
from unittest.mock import patch
from stripe.error import AuthenticationError, StripeError
from stripe.stripe_object import StripeObject
from saleor.payment.interface import PaymentMethodInfo
from saleor.payment.utils import price_to_minor_unit
from ..consts import (
    AUTOMATIC_CAPTURE_METHOD,
    MANUAL_CAPTURE_METHOD,
    METADATA_IDENTIFIER,
    STRIPE_API_VERSION,
    WEBHOOK_EVENTS,
)
from ..stripe_api import (
    cancel_payment_intent,
    capture_payment_intent,
    create_payment_intent,
    delete_webhook,
    get_or_create_customer,
    get_payment_method_details,
    is_secret_api_key_valid,
    list_customer_payment_methods,
    refund_payment_intent,
    retrieve_payment_intent,
    subscribe_webhook,
)

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint",
)
def test_is_secret_api_key_valid_incorrect_key(mocked_webhook):
    api_key = "incorrect"
    mocked_webhook.list.side_effect = AuthenticationError()

    assert is_secret_api_key_valid(api_key) is False

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint",
)
def test_is_secret_api_key_valid_correct_key(mocked_webhook):
    api_key = "correct_key"

    assert is_secret_api_key_valid(api_key) is True
    mocked_webhook.list.assert_called_with(api_key, stripe_version=STRIPE_API_VERSION)

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint",
)
def test_subscribe_webhook_returns_webhook_object(mocked_webhook, channel_USD):
    api_key = "api_key"
    expected_url = (
        "http://mirumee.com/plugins/channel/main/saleor.payments.stripe/webhooks/"
    )

    subscribe_webhook(api_key, channel_slug=channel_USD.slug)

    mocked_webhook.create.assert_called_with(
        api_key=api_key,
        url=expected_url,
        enabled_events=WEBHOOK_EVENTS,
        metadata={METADATA_IDENTIFIER: "mirumee.com"},
        stripe_version=STRIPE_API_VERSION,
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint",
)
def test_delete_webhook(mocked_webhook):
    api_key = "api_key"

    delete_webhook(api_key, "webhook_id")

    mocked_webhook.delete.assert_called_with(
        "webhook_id", api_key=api_key, stripe_version=STRIPE_API_VERSION
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_create_payment_intent_returns_intent_object(mocked_payment_intent):
    api_key = "api_key"
    mocked_payment_intent.create.return_value = StripeObject()

    intent, error = create_payment_intent(
        api_key, Decimal(10), "USD", auto_capture=True
    )

    mocked_payment_intent.create.assert_called_with(
        api_key=api_key,
        amount="1000",
        currency="USD",
        capture_method=AUTOMATIC_CAPTURE_METHOD,
        stripe_version=STRIPE_API_VERSION,
    )
    assert isinstance(intent, StripeObject)
    assert error is None

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_create_payment_intent_with_customer(mocked_payment_intent):
    customer = StripeObject(id="c_ABC")
    api_key = "api_key"
    mocked_payment_intent.create.return_value = StripeObject()

    intent, error = create_payment_intent(
        api_key, Decimal(10), "USD", auto_capture=True, customer=customer
    )

    mocked_payment_intent.create.assert_called_with(
        api_key=api_key,
        amount="1000",
        currency="USD",
        capture_method=AUTOMATIC_CAPTURE_METHOD,
        customer=customer,
        stripe_version=STRIPE_API_VERSION,
    )
    assert isinstance(intent, StripeObject)
    assert error is None

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_create_payment_intent_manual_auto_capture(mocked_payment_intent):
    api_key = "api_key"
    mocked_payment_intent.create.return_value = StripeObject()

    _intent, _error = create_payment_intent(
        api_key, Decimal(10), "USD", auto_capture=False
    )

    mocked_payment_intent.create.assert_called_with(
        api_key=api_key,
        amount="1000",
        currency="USD",
        capture_method=MANUAL_CAPTURE_METHOD,
        stripe_version=STRIPE_API_VERSION,
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_create_payment_intent_returns_error(mocked_payment_intent):
    api_key = "api_key"
    mocked_payment_intent.create.side_effect = StripeError(
        json_body={"error": "stripe-error"}
    )

    intent, error = create_payment_intent(api_key, Decimal(10), "USD")

    mocked_payment_intent.create.assert_called_with(
        api_key=api_key,
        amount="1000",
        currency="USD",
        capture_method=AUTOMATIC_CAPTURE_METHOD,
        stripe_version=STRIPE_API_VERSION,
    )
    assert intent is None
    assert error

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_retrieve_payment_intent(mocked_payment_intent):
    api_key = "api_key"
    payment_intent_id = "id1234"
    mocked_payment_intent.retrieve.return_value = StripeObject()

    intent, _ = retrieve_payment_intent(api_key, payment_intent_id)

    mocked_payment_intent.retrieve.assert_called_with(
        payment_intent_id, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )
    assert isinstance(intent, StripeObject)

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_retrieve_payment_intent_stripe_returns_error(mocked_payment_intent):
    api_key = "api_key"
    payment_intent_id = "id1234"
    expected_error = StripeError(message="stripe-error")
    mocked_payment_intent.retrieve.side_effect = expected_error

    _, error = retrieve_payment_intent(api_key, payment_intent_id)

    mocked_payment_intent.retrieve.assert_called_with(
        payment_intent_id, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )
    assert error == expected_error

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_capture_payment_intent(mocked_payment_intent):
    api_key = "api_key"
    payment_intent_id = "id1234"
    amount = price_to_minor_unit(Decimal("10.0"), "USD")
    mocked_payment_intent.capture.return_value = StripeObject()

    intent, _ = capture_payment_intent(
        api_key=api_key, payment_intent_id=payment_intent_id, amount_to_capture=amount
    )

    mocked_payment_intent.capture.assert_called_with(
        payment_intent_id,
        amount_to_capture=amount,
        api_key=api_key,
        stripe_version=STRIPE_API_VERSION,
    )
    assert isinstance(intent, StripeObject)

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_capture_payment_intent_stripe_returns_error(mocked_payment_intent):
    api_key = "api_key"
    payment_intent_id = "id1234"
    amount = price_to_minor_unit(Decimal("10.0"), "USD")
    expected_error = StripeError(message="stripe-error")
    mocked_payment_intent.capture.side_effect = expected_error

    _, error = capture_payment_intent(
        api_key=api_key, payment_intent_id=payment_intent_id, amount_to_capture=amount
    )

    mocked_payment_intent.capture.assert_called_with(
        payment_intent_id,
        amount_to_capture=amount,
        api_key=api_key,
        stripe_version=STRIPE_API_VERSION,
    )
    assert error == expected_error

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.Refund",
)
def test_refund_payment_intent(mocked_refund):
    api_key = "api_key"
    payment_intent_id = "id1234"
    amount = price_to_minor_unit(Decimal("10.0"), "USD")
    mocked_refund.create.return_value = StripeObject()

    intent, _ = refund_payment_intent(
        api_key=api_key, payment_intent_id=payment_intent_id, amount_to_refund=amount
    )

    mocked_refund.create.assert_called_with(
        payment_intent=payment_intent_id,
        amount=amount,
        api_key=api_key,
        stripe_version=STRIPE_API_VERSION,
    )
    assert isinstance(intent, StripeObject)

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.Refund",
)
def test_refund_payment_intent_returns_error(mocked_refund):
    api_key = "api_key"
    payment_intent_id = "id1234"
    amount = price_to_minor_unit(Decimal("10.0"), "USD")
    expected_error = StripeError(message="stripe-error")
    mocked_refund.create.side_effect = expected_error

    _, error = refund_payment_intent(
        api_key=api_key, payment_intent_id=payment_intent_id, amount_to_refund=amount
    )

    mocked_refund.create.assert_called_with(
        payment_intent=payment_intent_id,
        amount=amount,
        api_key=api_key,
        stripe_version=STRIPE_API_VERSION,
    )
    assert error == expected_error

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_cancel_payment_intent(mocked_payment_intent):
    api_key = "api_key"
    payment_intent_id = "id1234"
    mocked_payment_intent.cancel.return_value = StripeObject()

    intent, _ = cancel_payment_intent(
        api_key=api_key, payment_intent_id=payment_intent_id
    )

    mocked_payment_intent.cancel.assert_called_with(
        payment_intent_id, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )
    assert isinstance(intent, StripeObject)

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent",
)
def test_cancel_payment_intent_stripe_returns_error(mocked_payment_intent):
    api_key = "api_key"
    payment_intent_id = "id1234"
    expected_error = StripeError(message="stripe-error")
    mocked_payment_intent.cancel.side_effect = expected_error

    _, error = cancel_payment_intent(
        api_key=api_key, payment_intent_id=payment_intent_id
    )

    mocked_payment_intent.cancel.assert_called_with(
        payment_intent_id, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )
    assert error == expected_error

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.Customer",
)
def test_get_or_create_customer_retrieve(mocked_customer):
    mocked_customer.retrieve.return_value = StripeObject()
    api_key = "123"
    customer_email = "admin@example.com"
    customer_id = "c_12345"

    customer = get_or_create_customer(
        api_key=api_key,
        customer_email=customer_email,
        customer_id=customer_id,
    )

    assert isinstance(customer, StripeObject)
    mocked_customer.retrieve.assert_called_with(
        customer_id, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.Customer",
)
def test_get_or_create_customer_failed_retrieve(mocked_customer):
    expected_error = StripeError(message="stripe-error")
    mocked_customer.retrieve.side_effect = expected_error
    api_key = "123"
    customer_email = "admin@example.com"
    customer_id = "c_12345"

    customer = get_or_create_customer(
        api_key=api_key,
        customer_email=customer_email,
        customer_id=customer_id,
    )

    assert customer is None
    mocked_customer.retrieve.assert_called_with(
        customer_id, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.Customer",
)
def test_get_or_create_customer_create(mocked_customer):
    mocked_customer.create.return_value = StripeObject()
    api_key = "123"
    customer_email = "admin@example.com"

    customer = get_or_create_customer(
        api_key=api_key,
        customer_email=customer_email,
        customer_id=None,
    )

    assert isinstance(customer, StripeObject)
    mocked_customer.create.assert_called_with(
        email=customer_email, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.Customer",
)
def test_get_or_create_customer_failed_create(mocked_customer):
    expected_error = StripeError(message="stripe-error")
    mocked_customer.create.side_effect = expected_error
    api_key = "123"
    customer_email = "admin@example.com"

    customer = get_or_create_customer(
        api_key=api_key,
        customer_email=customer_email,
        customer_id=None,
    )

    assert customer is None
    mocked_customer.create.assert_called_with(
        email=customer_email, api_key=api_key, stripe_version=STRIPE_API_VERSION
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentMethod",
)
def test_list_customer_payment_methods(mocked_payment_method):
    api_key = "123"
    customer_id = "c_customer_id"
    mocked_payment_method.list.return_value = StripeObject()

    payment_method, error = list_customer_payment_methods(
        api_key=api_key, customer_id=customer_id
    )

    assert error is None
    assert isinstance(payment_method, StripeObject)
    mocked_payment_method.list.assert_called_with(
        api_key=api_key,
        customer=customer_id,
        type="card",
        stripe_version=STRIPE_API_VERSION,
    )

@patch(
    "saleor.payment.gateways.stripe.stripe_api.stripe.PaymentMethod",
)
def test_list_customer_payment_methods_failed_to_fetch(mocked_payment_method):
    api_key = "123"
    customer_id = "c_customer_id"
    expected_error = StripeError(message="stripe-error")
    mocked_payment_method.list.side_effect = expected_error

    payment_method, error = list_customer_payment_methods(
        api_key=api_key, customer_id=customer_id
    )

    assert payment_method is None
    assert isinstance(error, StripeError)
    mocked_payment_method.list.assert_called_with(
        api_key=api_key,
        customer=customer_id,
        type="card",
        stripe_version=STRIPE_API_VERSION,
    )

def test_get_payment_method_details():
    payment_intent = StripeObject()
    payment_intent.charges = {
        "data": [
            {
                "type": "card",
                "card": {
                    "last4": "1234",
                    "exp_year": "2222",
                    "exp_month": "12",
                    "brand": "visa",
                },
            }
        ]
    }

    payment_method_info = get_payment_method_details(payment_intent)

    assert payment_method_info == PaymentMethodInfo(
        last_4="1234", exp_year=2222, exp_month=12, brand="visa", type="card"
    )

def test_get_payment_method_details_missing_charges():
    payment_intent = StripeObject()
    payment_intent.charges = None

    payment_method_info = get_payment_method_details(payment_intent)

    assert payment_method_info is None

def test_get_payment_method_details_missing_charges_data():
    payment_intent = StripeObject()
    payment_intent.charges = {"data": None}

    payment_method_info = get_payment_method_details(payment_intent)

    assert payment_method_info is None
| 28.622093 | 86 | 0.735595 | 1,827 | 14,769 | 5.512863 | 0.065681 | 0.069102 | 0.041104 | 0.054805 | 0.832605 | 0.804011 | 0.785842 | 0.774821 | 0.757248 | 0.742156 | 0 | 0.009738 | 0.179565 | 14,769 | 515 | 87 | 28.67767 | 0.82149 | 0 | 0 | 0.546366 | 0 | 0 | 0.134606 | 0.090595 | 0 | 0 | 0 | 0 | 0.120301 | 1 | 0.062657 | false | 0 | 0.02005 | 0 | 0.082707 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
48199b6e8e3323617e90d6aa8272c0cbdf1d5160 | 33 | py | Python | simpleder/__init__.py | wq2012/SimpleDER | 8e5838969f6c5040db9970e619fffa8386bfe4c7 | [
"Apache-2.0"
] | 34 | 2019-01-27T18:42:39.000Z | 2022-02-28T06:08:25.000Z | simpleder/__init__.py | wq2012/SimpleDER | 8e5838969f6c5040db9970e619fffa8386bfe4c7 | [
"Apache-2.0"
] | 2 | 2020-11-16T07:04:58.000Z | 2021-06-24T06:24:14.000Z | simpleder/__init__.py | wq2012/SimpleDER | 8e5838969f6c5040db9970e619fffa8386bfe4c7 | [
"Apache-2.0"
] | 6 | 2019-03-26T00:02:23.000Z | 2021-02-18T20:02:39.000Z | from . import der
DER = der.DER
| 8.25 | 17 | 0.666667 | 6 | 33 | 3.666667 | 0.5 | 0.818182 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.242424 | 33 | 3 | 18 | 11 | 0.88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
4861ca2049d7210ca92c976ae881b906691c1513 | 43 | py | Python | activeteacher/data/transforms/__init__.py | HunterJ-Lin/ActiveTeacher | 7e15c5fa118a0b5779f44811be2be41fe2f9d169 | [
"Apache-2.0"
] | 1 | 2022-03-09T12:36:58.000Z | 2022-03-09T12:36:58.000Z | activeteacher/data/transforms/__init__.py | HunterJ-Lin/ActiveTeacher | 7e15c5fa118a0b5779f44811be2be41fe2f9d169 | [
"Apache-2.0"
] | null | null | null | activeteacher/data/transforms/__init__.py | HunterJ-Lin/ActiveTeacher | 7e15c5fa118a0b5779f44811be2be41fe2f9d169 | [
"Apache-2.0"
] | null | null | null | from .augmentation_impl import GaussianBlur | 43 | 43 | 0.906977 | 5 | 43 | 7.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 43 | 1 | 43 | 43 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4862f3a8ee900eceeab11e43b81a110d3208d71f | 44 | py | Python | radam/__init__.py | wenhui-prudencemed/RAdam | baa462c2ab2fd114c69c2c35b47cb4973efea376 | [
"Apache-2.0"
] | 2,597 | 2019-08-09T20:42:11.000Z | 2022-03-29T14:09:17.000Z | radam/__init__.py | wenhui-prudencemed/RAdam | baa462c2ab2fd114c69c2c35b47cb4973efea376 | [
"Apache-2.0"
] | 56 | 2019-08-15T22:56:37.000Z | 2021-09-19T21:16:15.000Z | radam/__init__.py | wenhui-prudencemed/RAdam | baa462c2ab2fd114c69c2c35b47cb4973efea376 | [
"Apache-2.0"
] | 385 | 2019-08-15T23:06:09.000Z | 2022-01-04T09:22:00.000Z | from .radam import RAdam, PlainRAdam, AdamW
| 22 | 43 | 0.795455 | 6 | 44 | 5.833333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 44 | 1 | 44 | 44 | 0.921053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6f94f9e550a17844fb0fb1bf942d3f2d26b933f1 | 65 | py | Python | py_tdlib/constructors/test_use_error.py | Mr-TelegramBot/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 24 | 2018-10-05T13:04:30.000Z | 2020-05-12T08:45:34.000Z | py_tdlib/constructors/test_use_error.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 3 | 2019-06-26T07:20:20.000Z | 2021-05-24T13:06:56.000Z | py_tdlib/constructors/test_use_error.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 5 | 2018-10-05T14:29:28.000Z | 2020-08-11T15:04:10.000Z | from ..factory import Method
class testUseError(Method):
    pass
| 10.833333 | 28 | 0.769231 | 8 | 65 | 6.25 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 65 | 5 | 29 | 13 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
6fab5742afcf6441a1ed3799f0c491f944097ef1 | 170 | py | Python | src_py/formulations/formulation_abstract.py | sukolsak/kpd_opt | f2b8c05234ce9722235151e324a7026b642d4fb5 | [
"MIT"
] | null | null | null | src_py/formulations/formulation_abstract.py | sukolsak/kpd_opt | f2b8c05234ce9722235151e324a7026b642d4fb5 | [
"MIT"
] | null | null | null | src_py/formulations/formulation_abstract.py | sukolsak/kpd_opt | f2b8c05234ce9722235151e324a7026b642d4fb5 | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
from utils.transport import Output
class Formulation(ABC):
    @abstractmethod
    def solve(self) -> {Output, None}:
        pass
| 17 | 38 | 0.7 | 20 | 170 | 5.95 | 0.7 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.217647 | 170 | 9 | 39 | 18.888889 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.166667 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
d205497b261bab4ecfb9ddd7a6b4a29f81003f5a | 220 | py | Python | tests/wildcat_connection/conftest.py | nathanllww/slivka-irv | 6688ac88a34387df476d51f66d291e9f3b6454aa | [
"MIT"
] | null | null | null | tests/wildcat_connection/conftest.py | nathanllww/slivka-irv | 6688ac88a34387df476d51f66d291e9f3b6454aa | [
"MIT"
] | 5 | 2021-10-08T17:44:32.000Z | 2021-12-19T18:47:59.000Z | tests/wildcat_connection/conftest.py | nathanllww/slivka-irv | 6688ac88a34387df476d51f66d291e9f3b6454aa | [
"MIT"
] | null | null | null | import os
import pytest
from . import TEST_CASE_INVALID, WCTestCase
@pytest.fixture()
def duplicate_question_test_case() -> WCTestCase:
    return WCTestCase(os.path.join(TEST_CASE_INVALID, "duplicate_questions.csv"))
| 24.444444 | 81 | 0.8 | 29 | 220 | 5.793103 | 0.586207 | 0.142857 | 0.178571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104545 | 220 | 8 | 82 | 27.5 | 0.852792 | 0 | 0 | 0 | 0 | 0 | 0.104545 | 0.104545 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.5 | 0.166667 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
d2514cd2a3dba9b04975c577ac7e853ce1bf6b8b | 85 | py | Python | addons14/sale_timesheet_rounded/models/__init__.py | odoochain/addons_oca | 55d456d798aebe16e49b4a6070765f206a8885ca | [
"MIT"
] | 1 | 2021-06-10T14:59:13.000Z | 2021-06-10T14:59:13.000Z | addons14/sale_timesheet_rounded/models/__init__.py | odoochain/addons_oca | 55d456d798aebe16e49b4a6070765f206a8885ca | [
"MIT"
] | null | null | null | addons14/sale_timesheet_rounded/models/__init__.py | odoochain/addons_oca | 55d456d798aebe16e49b4a6070765f206a8885ca | [
"MIT"
] | 1 | 2021-04-09T09:44:44.000Z | 2021-04-09T09:44:44.000Z | from . import account_analytic_line
from . import project_project
from . import sale
| 21.25 | 35 | 0.823529 | 12 | 85 | 5.583333 | 0.583333 | 0.447761 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141176 | 85 | 3 | 36 | 28.333333 | 0.917808 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d26fefbe05d2dc63c65d1e11810217709b4b24c1 | 26 | py | Python | powerline_iconcwd/__init__.py | pronobis/powerline_iconcwd | bc9aee33200070eb1a97f7338bfd183e7adee2e2 | [
"MIT"
] | 1 | 2021-09-15T15:13:14.000Z | 2021-09-15T15:13:14.000Z | powerline_iconcwd/__init__.py | pronobis/powerline_iconcwd | bc9aee33200070eb1a97f7338bfd183e7adee2e2 | [
"MIT"
] | null | null | null | powerline_iconcwd/__init__.py | pronobis/powerline_iconcwd | bc9aee33200070eb1a97f7338bfd183e7adee2e2 | [
"MIT"
] | null | null | null | from .segments import cwd
| 13 | 25 | 0.807692 | 4 | 26 | 5.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 26 | 1 | 26 | 26 | 0.954545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
967886ea7bb9e49fd56425129be6fc0a79e70cdf | 117 | py | Python | catflap/data/__init__.py | yay4ya/catflap | 1cb146417948c4c60c4765287c6f62664058290b | [
"MIT"
] | null | null | null | catflap/data/__init__.py | yay4ya/catflap | 1cb146417948c4c60c4765287c6f62664058290b | [
"MIT"
] | null | null | null | catflap/data/__init__.py | yay4ya/catflap | 1cb146417948c4c60c4765287c6f62664058290b | [
"MIT"
] | 1 | 2020-06-30T15:56:07.000Z | 2020-06-30T15:56:07.000Z | from catflap.data.author import Author
from catflap.data.message import Message
from catflap.data.video import Video
| 29.25 | 40 | 0.846154 | 18 | 117 | 5.5 | 0.388889 | 0.333333 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 117 | 3 | 41 | 39 | 0.942857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
967f4faed0173677aa1046198821e74aa78ef993 | 225 | py | Python | supervisor/nacos/__init__.py | Leon-sk/supervisor-nacos | 5de2dd5ccdc16cc1142419945d0e41e78387040f | [
"Apache-2.0"
] | 1 | 2022-03-31T06:48:17.000Z | 2022-03-31T06:48:17.000Z | supervisor/nacos/__init__.py | Leon-sk/supervisor-nacos | 5de2dd5ccdc16cc1142419945d0e41e78387040f | [
"Apache-2.0"
] | null | null | null | supervisor/nacos/__init__.py | Leon-sk/supervisor-nacos | 5de2dd5ccdc16cc1142419945d0e41e78387040f | [
"Apache-2.0"
] | null | null | null | #!/usr/local/bin/env python3
# -*- coding:utf-8 -*-
from supervisor.nacos.client import NacosClient, NacosException, DEFAULTS, DEFAULT_GROUP_NAME
__all__ = ["NacosClient", "NacosException", "DEFAULTS", "DEFAULT_GROUP_NAME"]
| 32.142857 | 93 | 0.76 | 26 | 225 | 6.269231 | 0.769231 | 0.306748 | 0.404908 | 0.490798 | 0.601227 | 0.601227 | 0 | 0 | 0 | 0 | 0 | 0.009852 | 0.097778 | 225 | 6 | 94 | 37.5 | 0.793103 | 0.217778 | 0 | 0 | 0 | 0 | 0.189655 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
73be4ca4a52c93ecedb9b2d345fa247dbb39265a | 98 | py | Python | gestionale/controllers/cbi.py | ValentinoUberti/mcimporter | b9d777ab02eb51d193a86af096b7976e6d5a4dc9 | [
"Apache-2.0"
] | null | null | null | gestionale/controllers/cbi.py | ValentinoUberti/mcimporter | b9d777ab02eb51d193a86af096b7976e6d5a4dc9 | [
"Apache-2.0"
] | null | null | null | gestionale/controllers/cbi.py | ValentinoUberti/mcimporter | b9d777ab02eb51d193a86af096b7976e6d5a4dc9 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# try something like
def index(): return dict(message="hello from cbi.py") | 32.666667 | 53 | 0.663265 | 15 | 98 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011905 | 0.142857 | 98 | 3 | 53 | 32.666667 | 0.761905 | 0.408163 | 0 | 0 | 0 | 0 | 0.303571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | true | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
73d6484df479d3df9b6bd67f152b20c2f88018f1 | 158 | py | Python | structural/decorator/logic/__init__.py | Kozak24/Patterns | 351d5c11f7c64ce5d58db37b6715fc8f7d31945a | [
"MIT"
] | null | null | null | structural/decorator/logic/__init__.py | Kozak24/Patterns | 351d5c11f7c64ce5d58db37b6715fc8f7d31945a | [
"MIT"
] | null | null | null | structural/decorator/logic/__init__.py | Kozak24/Patterns | 351d5c11f7c64ce5d58db37b6715fc8f7d31945a | [
"MIT"
] | null | null | null | from .decorator import Decorator
from .performance_decorator import PerformanceDecorator
from .logger_decorator import LoggerDecorator
from .stub import Stub
| 31.6 | 55 | 0.873418 | 18 | 158 | 7.555556 | 0.444444 | 0.330882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101266 | 158 | 4 | 56 | 39.5 | 0.957746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fb4217401c788111d1ceb3aa73d96a8bf28c0c2f | 40 | py | Python | ibis/utility/__init__.py | CIDARLAB/ibis | 5108848c1d45326b3c65213b2f8deaa88fd29be6 | [
"MIT"
] | 1 | 2021-02-16T23:19:07.000Z | 2021-02-16T23:19:07.000Z | ibis/utility/__init__.py | CIDARLAB/ibis | 5108848c1d45326b3c65213b2f8deaa88fd29be6 | [
"MIT"
] | 1 | 2021-04-18T13:45:15.000Z | 2021-04-18T13:45:15.000Z | ibis/utility/__init__.py | CIDARLAB/scoring-project | 5108848c1d45326b3c65213b2f8deaa88fd29be6 | [
"MIT"
] | 2 | 2021-04-27T23:52:06.000Z | 2021-07-02T13:44:26.000Z | from .graphing import plot_cell_edgelist | 40 | 40 | 0.9 | 6 | 40 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 40 | 1 | 40 | 40 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fbcc10ef9a2b6ba196c51e1f9a76827b1d380ee3 | 104 | py | Python | src/apps/notes/admin.py | mentalnoteapp/mentalnote-server-graphene | c7ca74488c3bca0e4173cf1c5dffc906cd6ce3e3 | [
"MIT"
] | null | null | null | src/apps/notes/admin.py | mentalnoteapp/mentalnote-server-graphene | c7ca74488c3bca0e4173cf1c5dffc906cd6ce3e3 | [
"MIT"
] | null | null | null | src/apps/notes/admin.py | mentalnoteapp/mentalnote-server-graphene | c7ca74488c3bca0e4173cf1c5dffc906cd6ce3e3 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Note
admin.site.register(Note, admin.ModelAdmin)
| 17.333333 | 43 | 0.807692 | 15 | 104 | 5.6 | 0.666667 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 104 | 5 | 44 | 20.8 | 0.913043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
83fa38d4fd943fd9102f3aa41c02f74a3f7da323 | 14,791 | py | Python | unit_tests/view_modify_lon/test_modify_lon.py | LandRegistry/maintain-frontend | d92446a9972ebbcd9a43a7a7444a528aa2f30bf7 | [
"MIT"
] | 1 | 2019-10-03T13:58:29.000Z | 2019-10-03T13:58:29.000Z | unit_tests/view_modify_lon/test_modify_lon.py | LandRegistry/maintain-frontend | d92446a9972ebbcd9a43a7a7444a528aa2f30bf7 | [
"MIT"
] | null | null | null | unit_tests/view_modify_lon/test_modify_lon.py | LandRegistry/maintain-frontend | d92446a9972ebbcd9a43a7a7444a528aa2f30bf7 | [
"MIT"
] | 1 | 2021-04-11T05:24:57.000Z | 2021-04-11T05:24:57.000Z | from maintain_frontend import main
from flask_testing import TestCase
from unit_tests.utilities import Utilities
from unittest.mock import patch, MagicMock
from flask import url_for
from maintain_frontend.dependencies.session_api.session import Session
from maintain_frontend.models import LightObstructionNoticeItem
from maintain_frontend.constants.permissions import Permissions
import datetime
from dateutil.relativedelta import relativedelta
from unit_tests.mock_data.mock_land_charges import get_mock_lon_item
class TestModifyLON(TestCase):
    def create_app(self):
        main.app.testing = True
        Utilities.mock_session_cookie_flask_test(self)
        return main.app

    def setUp(self):
        main.app.config['Testing'] = True
        main.app.config['WTF_CSRF_ENABLED'] = False

    @patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
    def test_modify_lon_upload_get_invalid_id(self, mock_service):
        self.client.set_cookie('localhost', Session.session_cookie_name,
                               'cookie_value')
        self.mock_session.return_value.user.permissions = [Permissions.vary_lon]

        response = self.client.get(url_for('modify_lon.modify_lon_upload_get', charge_id=123))

        self.assert_status(response, 302)

    @patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
    def test_modify_lon_upload_cancelled(self, mock_service):
        self.client.set_cookie('localhost', Session.session_cookie_name,
                               'cookie_value')
        self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
        self.mock_session.return_value.add_lon_charge_state = None

        mock_item = get_mock_lon_item().copy()
        mock_item['end-date'] = '2017-01-01'
        mock_response = MagicMock()
        mock_response.status_code = 200
        mock_response.json.return_value = [{
            "item": mock_item,
            "display_id": "LLC-TST",
            "cancelled": True
        }]
        mock_service.return_value.get_by_charge_number.return_value = mock_response

        response = self.client.get(url_for('modify_lon.modify_lon_upload_get', charge_id='LLC-TST'))

        self.assert_redirects(response, '/error')

    @patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
    def test_modify_lon_upload_get_found(self, mock_service):
        self.client.set_cookie('localhost', Session.session_cookie_name,
                               'cookie_value')
        self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
        self.mock_session.return_value.add_lon_charge_state = None

        mock_response = MagicMock()
        mock_response.status_code = 200
        mock_response.json.return_value = [{
            "item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False
        }]
        mock_service.return_value.get_by_charge_number.return_value = mock_response

        response = self.client.get(url_for('modify_lon.modify_lon_upload_get', charge_id='LLC-TST'))

        self.assert_status(response, 200)

    @patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
    def test_modify_lon_upload_get_not_found(self, mock_service):
        self.client.set_cookie('localhost', Session.session_cookie_name,
                               'cookie_value')
        self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
        self.mock_session.return_value.add_lon_charge_state = None

        mock_response = MagicMock()
        mock_response.status_code = 404
        mock_service.return_value.get_by_charge_number.return_value = mock_response

        response = self.client.get(url_for('modify_lon.modify_lon_upload_get', charge_id='LLC-TST'))

        self.assert_status(response, 302)
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_modify_lon_redirect_when_different_id_in_session(self, mock_service):
self.client.set_cookie('localhost', Session.session_cookie_name,
'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
state = LightObstructionNoticeItem.from_json(get_mock_lon_item())
self.mock_session.return_value.add_lon_charge_state = state
response = self.client.get(url_for('modify_lon.modify_lon_upload_get', charge_id=3))
self.assert_status(response, 302)
@patch('maintain_frontend.view_modify_lon.modify_lon.VaryLonValidator')
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_upload_lon_documents_no_select(self, mock_service, mock_validator):
self.client.set_cookie('localhost', Session.session_cookie_name, 'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.json.return_value = [{"item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False}]
mock_service.return_value.get_by_charge_number.return_value = mock_response
mock_validator.validate.return_value.errors = {'vary-lon-options': ['Choose one']}
response = self.client.post(url_for('modify_lon.modify_lon_upload_post', charge_id="LLC-TST"))
self.assert_context('validation_errors', {'vary-lon-options': ['Choose one']})
self.assert_status(response, 400)
self.assert_template_used('modify_lon_upload.html')
@patch('maintain_frontend.view_modify_lon.modify_lon.handle_vary_lon_options_choice')
@patch('maintain_frontend.view_modify_lon.modify_lon.VaryLonValidator')
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_upload_lon_documents_no_errors(self, mock_service, mock_validator, mock_handle_options):
self.client.set_cookie('localhost', Session.session_cookie_name, 'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.json.return_value = [{"item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False}]
mock_service.return_value.get_by_charge_number.return_value = mock_response
mock_validator.validate.return_value.errors = None
response = self.client.post(url_for('modify_lon.modify_lon_upload_post', charge_id="LLC-TST"))
self.assert_status(response, 302)
self.assertRedirects(response, url_for('modify_lon.modify_lon_details_get', charge_id="LLC-TST"))
@patch('maintain_frontend.view_modify_lon.modify_lon.request')
@patch('maintain_frontend.view_modify_lon.modify_lon.VaryLonValidator')
@patch('maintain_frontend.view_modify_lon.modify_lon.StorageAPIService')
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_upload_definitive_certificate_fields(self, mock_service, mock_storage_service, mock_validator,
mock_request):
self.client.set_cookie('localhost', Session.session_cookie_name, 'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
state = LightObstructionNoticeItem()
self.mock_session.return_value.add_lon_charge_state = state
self.mock_session.return_value.edited_fields = {}
expected_expiry_date = datetime.date.today() + relativedelta(years=+21)
expected_definitive_date = datetime.date(1, 1, 1)
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.json.return_value = [{"item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False}]
mock_service.return_value.get_by_charge_number.return_value = mock_response
mock_validator.validate.return_value.errors = None
mock_request.form.getlist.return_value = ["Definitive Certificate"]
        form_values = {'definitive_cert_day': 1,
                       'definitive_cert_month': 1,
                       'definitive_cert_year': 1}
        mock_request.form.get.side_effect = lambda key, default=None: form_values.get(key, default)
        mock_request.files.get.side_effect = (
            lambda key: True if key == 'definitive-certificate-file-input' else None)
response = self.client.post(url_for('modify_lon.modify_lon_upload_post', charge_id="LLC-TST"))
self.assertEqual(self.mock_session.return_value.add_lon_charge_state.expiry_date,
expected_expiry_date)
self.assertEqual(self.mock_session.return_value.add_lon_charge_state.tribunal_definitive_certificate_date,
expected_definitive_date)
self.assertEqual(self.mock_session.return_value.edited_fields['tribunal-definitive-certificate-date'],
"Tribunal definitive certificate date")
self.assertEqual(self.mock_session.return_value.edited_fields['expiry-date'],
"Expiry date")
self.assert_status(response, 302)
self.assertRedirects(response, url_for('modify_lon.modify_lon_details_get', charge_id="LLC-TST"))
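Making a mocked `request.form.get` answer differently per key requires `side_effect` rather than setting `return_value` on the result of a call; a minimal standalone sketch of the pattern (key names mirror the form fields above, the rest is illustrative):

```python
from unittest.mock import MagicMock

# A request-like mock whose form.get answers per key, like a real MultiDict.
mock_request = MagicMock()
form_values = {'definitive_cert_day': 1,
               'definitive_cert_month': 1,
               'definitive_cert_year': 1}
mock_request.form.get.side_effect = lambda key, default=None: form_values.get(key, default)

assert mock_request.form.get('definitive_cert_day') == 1
assert mock_request.form.get('missing') is None
```

Because `side_effect` is invoked with the call's own arguments, the same mock also honours an explicit default argument, which `return_value` cannot do.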
@patch('maintain_frontend.view_modify_lon.modify_lon.request')
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_vary_lon_validator_no_option_selected(self, mock_service, mock_request):
self.client.set_cookie('localhost', Session.session_cookie_name, 'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
self.mock_session.return_value.edited_fields = {}
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.json.return_value = [{"item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False}]
mock_service.return_value.get_by_charge_number.return_value = mock_response
mock_request.form.getlist.return_value = []
response = self.client.post(url_for('modify_lon.modify_lon_upload_post', charge_id="LLC-TST"))
context_validation_error = self.get_context_variable('validation_errors')
self.assert_status(response, 400)
self.assertIsNotNone(context_validation_error["vary-lon-options"])
self.assertEqual(context_validation_error["vary-lon-options"].inline_message, "Choose one option")
@patch('maintain_frontend.view_modify_lon.modify_lon.request')
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_vary_lon_validator_required_fields_definitive_certificate(self, mock_service, mock_request):
self.client.set_cookie('localhost', Session.session_cookie_name, 'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
self.mock_session.return_value.edited_fields = {}
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.json.return_value = [{"item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False}]
mock_service.return_value.get_by_charge_number.return_value = mock_response
mock_request.form.getlist.return_value = ["Definitive Certificate"]
mock_request.form.get.return_value = ""
mock_request.files.get.return_value = None
response = self.client.post(url_for('modify_lon.modify_lon_upload_post', charge_id="LLC-TST"))
context_validation_error = self.get_context_variable('validation_errors')
self.assert_status(response, 400)
self.assertIsNotNone(context_validation_error["tribunal_definitive_certificate_date"])
self.assertEqual(context_validation_error["tribunal_definitive_certificate_date"].inline_message,
"Check that the date is in Day/Month/Year format")
self.assertIsNotNone(context_validation_error["definitive-certificate-file-input"])
self.assertEqual(context_validation_error["definitive-certificate-file-input"].inline_message,
"Upload a document for Definitive Certificate")
@patch('maintain_frontend.view_modify_lon.modify_lon.request')
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_vary_lon_validator_required_fields_form_b(self, mock_service, mock_request):
self.client.set_cookie('localhost', Session.session_cookie_name, 'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
self.mock_session.return_value.edited_fields = {}
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.json.return_value = [{"item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False}]
mock_service.return_value.get_by_charge_number.return_value = mock_response
mock_request.form.getlist.return_value = ["Form B"]
mock_request.files.get.return_value = None
response = self.client.post(url_for('modify_lon.modify_lon_upload_post', charge_id="LLC-TST"))
context_validation_error = self.get_context_variable('validation_errors')
self.assert_status(response, 400)
self.assertIsNotNone(context_validation_error["form-b-file-input"])
self.assertEqual(context_validation_error["form-b-file-input"].inline_message,
"Upload a document for Form B")
@patch('maintain_frontend.view_modify_lon.modify_lon.request')
@patch('maintain_frontend.view_modify_lon.modify_lon.LocalLandChargeService')
def test_vary_lon_validator_required_fields_court_order(self, mock_service, mock_request):
self.client.set_cookie('localhost', Session.session_cookie_name, 'cookie_value')
self.mock_session.return_value.user.permissions = [Permissions.vary_lon]
self.mock_session.return_value.edited_fields = {}
mock_response = MagicMock()
mock_response.status_code = 200
mock_response.json.return_value = [{"item": get_mock_lon_item(), "display_id": "LLC-TST", "cancelled": False}]
mock_service.return_value.get_by_charge_number.return_value = mock_response
mock_request.form.getlist.return_value = ["Court Order"]
mock_request.files.get.return_value = None
response = self.client.post(url_for('modify_lon.modify_lon_upload_post', charge_id="LLC-TST"))
context_validation_error = self.get_context_variable('validation_errors')
self.assert_status(response, 400)
self.assertIsNotNone(context_validation_error["court-order-file-input"])
self.assertEqual(context_validation_error["court-order-file-input"].inline_message,
"Upload a document for Court Order")
f7b5c22683ce914553be6f051c4779e19164439c | 141 | py | Python | myapp/oauth2_client.py | yoophi/flask-google-login-example | MIT
from oauthlib.oauth2 import WebApplicationClient
from myapp.config import GOOGLE_CLIENT_ID
client = WebApplicationClient(GOOGLE_CLIENT_ID)
f7c56a38199c2398fefe63d2a8af1c5a7b2ba188 | 81 | py | Python | ProcessingTools/__init__.py | ysy9997/functions | MIT | 2 stars
from ProcessingTools.functions import *
from ProcessingTools.PrgressBar import *
f707de5d7baf6c48523186021b7d2ca01cdbe868 | 220 | py | Python | tflib/__init__.py | nguyenquangduc2000/AttGAN | MIT | 405 stars
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from tflib.checkpoint import *
from tflib.ops import *
from tflib.utils import *
from tflib.variable import *
f70d2c13b10c9d61592c4394147f47c2b83b8834 | 873 | py | Python | src/ControlManager.py | NEKERAFA/Soul-Tower | MIT
import pygame
class ControlManager(object):
@classmethod
def up(cls):
raise NotImplementedError('Error: Abstract class')
@classmethod
def down(cls):
raise NotImplementedError('Error: Abstract class')
@classmethod
def left(cls):
raise NotImplementedError('Error: Abstract class')
@classmethod
def right(cls):
raise NotImplementedError('Error: Abstract class')
@classmethod
def angle(cls, pos):
raise NotImplementedError('Error: Abstract class')
@classmethod
def prim_button(cls):
raise NotImplementedError('Error: Abstract class')
@classmethod
def sec_button(cls):
raise NotImplementedError('Error: Abstract class')
@classmethod
def select_button(cls):
raise NotImplementedError('Error: Abstract class')
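Concrete input backends are expected to override each classmethod of the abstract interface above; a pygame-free sketch of one such subclass (the `ScriptedControls` name and its pre-recorded state are illustrative, not part of the repo):

```python
class AbstractControls(object):
    """Trimmed copy of the abstract interface, for a self-contained example."""
    @classmethod
    def up(cls):
        raise NotImplementedError('Error: Abstract class')

    @classmethod
    def down(cls):
        raise NotImplementedError('Error: Abstract class')


class ScriptedControls(AbstractControls):
    """Replays a pre-recorded input state; useful for automated tests."""
    pressed = {'up': True, 'down': False}

    @classmethod
    def up(cls):
        return cls.pressed.get('up', False)

    @classmethod
    def down(cls):
        return cls.pressed.get('down', False)


assert ScriptedControls.up() is True
assert ScriptedControls.down() is False
```

Keeping the interface as classmethods means callers never need an instance, which is why the base class can simply raise until a backend overrides each method.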
f711d5c3ee19c95b26ff713f3e6d532c9a88b33a | 109 | py | Python | python/karps/__init__.py | tjhunter/karps | Apache-2.0 | 5 stars
from .types import *
from .row import *
from .column import *
from .utils import scope
from .session import *
f74546c3f5a70d993244acbf1d62f9927dccb094 | 321 | py | Python | oms/__init__.py | alphamatic/amp | BSD-3-Clause | 5 stars
"""
Import as:
import oms as oms
"""
from oms.oms_db import * # pylint: disable=unused-import # NOQA
from oms.order_processor import * # pylint: disable=unused-import # NOQA
from oms.portfolio import * # pylint: disable=unused-import # NOQA
from oms.portfolio_example import *  # pylint: disable=unused-import  # NOQA
f748b3735fe3bcb7905fab7c85053e6d978932b1 | 72 | py | Python | conan/tools/meson/__init__.py | ssaavedra/conan | MIT | 1 star
# noinspection PyUnresolvedReferences
from .meson import MesonToolchain
f788b064b0a64252133f96ae87b1a8b56d17679a | 15,176 | py | Python | tests/unit-tests/test_sphinx_toctree.py | BenGale93/confluencebuilder | BSD-2-Clause
# -*- coding: utf-8 -*-
"""
:copyright: Copyright 2016-2021 Sphinx Confluence Builder Contributors (AUTHORS)
:license: BSD-2-Clause (LICENSE)
"""
from tests.lib import build_sphinx
from tests.lib import parse
from tests.lib import prepare_conf
import os
import unittest
class TestConfluenceSphinxToctree(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.config = prepare_conf()
cls.test_dir = os.path.dirname(os.path.realpath(__file__))
def test_storage_sphinx_toctree_caption(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-caption')
out_dir = build_sphinx(dataset, config=self.config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
caption = root_toc.findPrevious()
self.assertIsNotNone(caption)
self.assertEqual(caption.text, 'toctree caption')
def test_storage_sphinx_toctree_child_macro(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-default')
config = dict(self.config)
config['confluence_adv_hierarchy_child_macro'] = True
config['confluence_page_hierarchy'] = True
# relax due to this test (confluence_adv_hierarchy_child_macro) being
# deprecated
out_dir = build_sphinx(dataset, config=config, relax=True)
with parse('index', out_dir) as data:
macro = data.find('ac:structured-macro', recursive=False)
self.assertIsNotNone(macro)
self.assertTrue(macro.has_attr('ac:name'))
self.assertEqual(macro['ac:name'], 'children')
all_param = macro.find('ac:parameter', recursive=False)
self.assertIsNotNone(all_param)
self.assertTrue(all_param.has_attr('ac:name'))
self.assertEqual(all_param['ac:name'], 'all')
self.assertEqual(all_param.text, 'true')
def test_storage_sphinx_toctree_default(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-default')
out_dir = build_sphinx(dataset, config=self.config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
primary_docs = root_toc.findChildren('li', recursive=False)
self.assertEqual(len(primary_docs), 2)
# document-a group
group = primary_docs[0]
self._verify_link(group, 'doc-a')
group_docs = group.find('ul', recursive=False)
self.assertIsNotNone(group_docs)
sub_docs = group_docs.findChildren('li', recursive=False)
self.assertIsNotNone(sub_docs)
self.assertEqual(len(sub_docs), 2)
self._verify_link(sub_docs[0], 'doc-a1')
self._verify_link(sub_docs[1], 'doc-a2')
# document-b group
group = primary_docs[1]
self._verify_link(group, 'doc-b')
subheader_group = group.find('ul', recursive=False)
self.assertIsNotNone(subheader_group)
subheader = subheader_group.find('li', recursive=False)
self.assertIsNotNone(subheader)
self._verify_link(subheader, 'doc-b', label='subheader',
anchor='subheader')
final_group = subheader.find('ul', recursive=False)
self.assertIsNotNone(final_group)
sub_docs = final_group.findChildren('li', recursive=False)
self.assertIsNotNone(sub_docs)
self.assertEqual(len(sub_docs), 1)
self._verify_link(sub_docs[0], 'doc-b1')
def test_storage_sphinx_toctree_hidden(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-hidden')
out_dir = build_sphinx(dataset, config=self.config)
with parse('index', out_dir) as data:
# expect no content with a hidden toctree
self.assertEqual(data.text, '')
def test_storage_sphinx_toctree_maxdepth(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-maxdepth')
out_dir = build_sphinx(dataset, config=self.config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 1)
doc = docs[0]
doc_tags = doc.findChildren(recursive=False)
self.assertIsNotNone(doc_tags)
self.assertEqual(len(doc_tags), 1) # no other links beyond depth
self._verify_link(doc, 'doc')
def test_storage_sphinx_toctree_numbered_default(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-numbered')
out_dir = build_sphinx(dataset, config=self.config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 4)
group = docs.pop(0)
self._verify_link(group, '1. doc')
group_docs = group.find('ul', recursive=False)
self.assertIsNotNone(group_docs)
sub_docs = group_docs.findChildren('li', recursive=False)
self.assertIsNotNone(sub_docs)
self.assertEqual(len(sub_docs), 1)
self._verify_link(sub_docs[0], '1.1. child')
group = docs.pop(0)
self._verify_link(group, '1. doc',
label='2. section with spaces')
group = docs.pop(0)
self._verify_link(group, '1. doc',
label='3. section_with_underscores')
group = docs.pop(0)
self._verify_link(group, '1. doc',
label='4. section with a large name - Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent vitae volutpat ipsum, quis sodales eros. Aenean quis nunc quis leo aliquam gravida. Fusce accumsan nibh vitae enim ullamcorper iaculis. Duis eget augue dolor. Curabitur at enim elit. Nullam luctus mollis magna. Pellentesque pellentesque, leo quis suscipit finibus, diam justo convallis.')
with parse('doc', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 1)
doc = docs[0]
self._verify_link(doc, '1.1. child')
def test_storage_sphinx_toctree_numbered_depth(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-numbered-depth')
out_dir = build_sphinx(dataset, config=self.config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 1)
group = docs[0]
self._verify_link(group, '1. doc')
group_docs = group.find('ul', recursive=False)
self.assertIsNotNone(group_docs)
sub_docs = group_docs.findChildren('li', recursive=False)
self.assertIsNotNone(sub_docs)
self.assertEqual(len(sub_docs), 1)
self._verify_link(sub_docs[0], 'child')
with parse('doc', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 1)
doc = docs[0]
self._verify_link(doc, 'child')
def test_storage_sphinx_toctree_numbered_disable(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-numbered')
config = dict(self.config)
config['confluence_add_secnumbers'] = False
out_dir = build_sphinx(dataset, config=config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 4)
group = docs.pop(0)
self._verify_link(group, 'doc')
group_docs = group.find('ul', recursive=False)
self.assertIsNotNone(group_docs)
sub_docs = group_docs.findChildren('li', recursive=False)
self.assertIsNotNone(sub_docs)
self.assertEqual(len(sub_docs), 1)
self._verify_link(sub_docs[0], 'child')
group = docs.pop(0)
self._verify_link(group, 'doc', label='section with spaces')
group = docs.pop(0)
self._verify_link(group, 'doc', label='section_with_underscores')
group = docs.pop(0)
self._verify_link(group, 'doc', label='section with a large name - Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent vitae volutpat ipsum, quis sodales eros. Aenean quis nunc quis leo aliquam gravida. Fusce accumsan nibh vitae enim ullamcorper iaculis. Duis eget augue dolor. Curabitur at enim elit. Nullam luctus mollis magna. Pellentesque pellentesque, leo quis suscipit finibus, diam justo convallis.')
with parse('doc', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 1)
doc = docs[0]
self._verify_link(doc, 'child')
def test_storage_sphinx_toctree_numbered_secnumbers_suffix_empty(self):
"""validate toctree secnumber supports empty str (storage)"""
#
# Ensure that the toctree secnumber suffix value can be set to an
# empty string.
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-numbered')
config = dict(self.config)
config['confluence_secnumber_suffix'] = ''
out_dir = build_sphinx(dataset, config=config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 4)
group = docs.pop(0)
self._verify_link(group, '1doc')
group_docs = group.find('ul', recursive=False)
self.assertIsNotNone(group_docs)
sub_docs = group_docs.findChildren('li', recursive=False)
self.assertIsNotNone(sub_docs)
self.assertEqual(len(sub_docs), 1)
self._verify_link(sub_docs[0], '1.1child')
group = docs.pop(0)
self._verify_link(group, '1doc',
label='2section with spaces')
group = docs.pop(0)
self._verify_link(group, '1doc',
label='3section_with_underscores')
group = docs.pop(0)
self._verify_link(group, '1doc',
label='4section with a large name - Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent vitae volutpat ipsum, quis sodales eros. Aenean quis nunc quis leo aliquam gravida. Fusce accumsan nibh vitae enim ullamcorper iaculis. Duis eget augue dolor. Curabitur at enim elit. Nullam luctus mollis magna. Pellentesque pellentesque, leo quis suscipit finibus, diam justo convallis.')
with parse('doc', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 1)
doc = docs[0]
self._verify_link(doc, '1.1child')
def test_storage_sphinx_toctree_numbered_secnumbers_suffix_set(self):
dataset = os.path.join(self.test_dir, 'datasets', 'toctree-numbered')
config = dict(self.config)
config['confluence_secnumber_suffix'] = '!Z /+4'
out_dir = build_sphinx(dataset, config=config)
with parse('index', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 4)
group = docs.pop(0)
self._verify_link(group, '1!Z /+4doc')
group_docs = group.find('ul', recursive=False)
self.assertIsNotNone(group_docs)
sub_docs = group_docs.findChildren('li', recursive=False)
self.assertIsNotNone(sub_docs)
self.assertEqual(len(sub_docs), 1)
self._verify_link(sub_docs[0], '1.1!Z /+4child')
group = docs.pop(0)
self._verify_link(group, '1!Z /+4doc',
label='2!Z /+4section with spaces')
group = docs.pop(0)
self._verify_link(group, '1!Z /+4doc',
label='3!Z /+4section_with_underscores')
group = docs.pop(0)
self._verify_link(group, '1!Z /+4doc',
label='4!Z /+4section with a large name - Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent vitae volutpat ipsum, quis sodales eros. Aenean quis nunc quis leo aliquam gravida. Fusce accumsan nibh vitae enim ullamcorper iaculis. Duis eget augue dolor. Curabitur at enim elit. Nullam luctus mollis magna. Pellentesque pellentesque, leo quis suscipit finibus, diam justo convallis.')
with parse('doc', out_dir) as data:
root_toc = data.find('ul', recursive=False)
self.assertIsNotNone(root_toc)
docs = root_toc.findChildren('li', recursive=False)
self.assertIsNotNone(docs)
self.assertEqual(len(docs), 1)
doc = docs[0]
self._verify_link(doc, '1.1!Z /+4child')
def _verify_link(self, entity, title, label=None, anchor=None):
label = label if label else title
ac_link = entity.find('ac:link', recursive=False)
self.assertIsNotNone(ac_link)
if anchor:
self.assertTrue(ac_link.has_attr('ac:anchor'))
self.assertEqual(ac_link['ac:anchor'], anchor)
page_ref = ac_link.find('ri:page', recursive=False)
self.assertIsNotNone(page_ref)
self.assertTrue(page_ref.has_attr('ri:content-title'))
self.assertEqual(page_ref['ri:content-title'], title)
link_body = ac_link.find('ac:link-body', recursive=False)
self.assertIsNotNone(link_body)
self.assertEqual(link_body.text, label)
e39d2613b09a4410d0b2618e9ad4fb617fbd78e6 | 107 | py | Python | tests/test_refwebapp.py | pshchelo/refwebapp | 101598a71376dcba72248b458f8f1b24a82adf66 | [
"MIT"
] | null | null | null | tests/test_refwebapp.py | pshchelo/refwebapp | 101598a71376dcba72248b458f8f1b24a82adf66 | [
"MIT"
] | null | null | null | tests/test_refwebapp.py | pshchelo/refwebapp | 101598a71376dcba72248b458f8f1b24a82adf66 | [
"MIT"
] | null | null | null | from refwebapp import version
def test_version():
    assert version.version_info.package == "refwebapp"
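The test above assumes `refwebapp.version` exposes a `version_info` object with a `package` attribute. A hypothetical sketch of that shape (the real module may build it differently, e.g. via a packaging helper):

```python
from collections import namedtuple

# Hypothetical stand-in for refwebapp.version.version_info;
# field names beyond "package" are assumptions.
VersionInfo = namedtuple("VersionInfo", ["package", "version"])
version_info = VersionInfo(package="refwebapp", version="0.1.0")

# Mirrors the assertion in test_version().
assert version_info.package == "refwebapp"
```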
| 17.833333 | 54 | 0.766355 | 13 | 107 | 6.153846 | 0.692308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149533 | 107 | 5 | 55 | 21.4 | 0.879121 | 0 | 0 | 0 | 0 | 0 | 0.084112 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e3c31d621b7edcb4e90a16a57bfdcd796142e7fc | 1,339 | py | Python | resources/panels/seeds/planet romance_adjective.py | exposit/pythia-oracle | 60e4e806c9ed1627f2649822ab1901d28933daac | [
"MIT"
] | 32 | 2016-08-27T01:31:42.000Z | 2022-03-21T08:59:28.000Z | resources/panels/seeds/planet romance_adjective.py | exposit/pythia-oracle | 60e4e806c9ed1627f2649822ab1901d28933daac | [
"MIT"
] | 3 | 2016-08-27T00:51:47.000Z | 2019-08-26T13:23:04.000Z | resources/panels/seeds/planet romance_adjective.py | exposit/pythia-oracle | 60e4e806c9ed1627f2649822ab1901d28933daac | [
"MIT"
] | 10 | 2016-08-28T14:14:41.000Z | 2021-03-18T03:24:22.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# Princess of Mars
chart = ["additional", "adjoined", "advanced", "afraid", "agreeable", "anxious", "awful", "baffled", "baleful", "bloodthirsty", "busy", "clever", "concerned", "congenial", "considerable", "continual", "correct", "deep", "delicious", "deserted", "desperate", "devoid", "dilated", "discernible", "distant", "entangled", "eternal", "fair", "fertile", "filial", "foolish", "formidable", "forward", "frenzied", "frightened", "full", "full-fledged", "happy", "harmless", "hated", "headfirst", "hereditary", "highest", "ice-clad", "identical", "important", "inaccessible", "ineffectual", "innumerable", "large", "late", "lesser", "long-extinct", "lost", "loud", "married", "mechanical", "mental", "mightiest", "moon-haunted", "mossy", "mutual", "natural", "naval", "new-found", "ornamental", "perpendicular", "pitiful", "possible", "precious", "proficient", "ragged", "resourceful", "respectable", "responsible", "richest", "rocky", "ruined", "selfish", "enshrined", "smooth", "snowiest", "startled", "strangest", "striking", "sufficient", "superhuman", "surest", "symmetrical", "tense", "tenuous", "penetrating", "tiny", "titanic", "tractable", "uncanny", "undisguised", "undue", "unfailing", "unnecessary", "vast", "vouchsafed", "wicked-looking", "wildest", "worship", "wrong"],
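Seed files like this one supply word pools for random draws. A small sketch of the obvious consumption pattern, using an abbreviated copy of the list above (the full `chart` holds roughly a hundred adjectives):

```python
import random

# Abbreviated copy of the chart above, for illustration only.
chart = ["additional", "adjoined", "advanced", "afraid", "agreeable"]

# A seeded generator makes the draw repeatable across runs.
rng = random.Random(0)
adjective = rng.choice(chart)
assert adjective in chart
```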
| 267.8 | 1,273 | 0.655713 | 123 | 1,339 | 7.138211 | 0.99187 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00082 | 0.088872 | 1,339 | 4 | 1,274 | 334.75 | 0.718852 | 0.044063 | 0 | 0 | 0 | 0 | 0.657792 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
54117b9e3cd3d33ec7f78578d17275f6523a45c8 | 4,530 | py | Python | tests/py/test_multi_gpu.py | p1x31/TRTorch | f99a6ca763eb08982e8b7172eb948a090bcbf11c | [
"BSD-3-Clause"
] | 944 | 2020-03-13T22:50:32.000Z | 2021-11-09T05:39:28.000Z | tests/py/test_multi_gpu.py | p1x31/TRTorch | f99a6ca763eb08982e8b7172eb948a090bcbf11c | [
"BSD-3-Clause"
] | 434 | 2020-03-18T03:00:29.000Z | 2021-11-09T00:26:36.000Z | tests/py/test_multi_gpu.py | p1x31/TRTorch | f99a6ca763eb08982e8b7172eb948a090bcbf11c | [
"BSD-3-Clause"
] | 106 | 2020-03-18T02:20:21.000Z | 2021-11-08T18:58:58.000Z | import unittest
import trtorch
import torch
import torchvision.models as models

from model_test_case import ModelTestCase


class TestMultiGpuSwitching(ModelTestCase):

    def setUp(self):
        if torch.cuda.device_count() < 2:
            self.fail("Test is not relevant for this platform since number of available CUDA devices is less than 2")

        trtorch.set_device(0)
        self.target_gpu = 1
        self.input = torch.randn((1, 3, 224, 224)).to("cuda:1")
        self.model = self.model.to("cuda:1")
        self.traced_model = torch.jit.trace(self.model, [self.input])
        self.scripted_model = torch.jit.script(self.model)

    def test_compile_traced(self):
        trtorch.set_device(0)
        compile_spec = {
            "inputs": [trtorch.Input(self.input.shape)],
            "device": {
                "device_type": trtorch.DeviceType.GPU,
                "gpu_id": self.target_gpu,
                "dla_core": 0,
                "allow_gpu_fallback": False,
                "disable_tf32": False
            }
        }

        trt_mod = trtorch.compile(self.traced_model, **compile_spec)
        trtorch.set_device(self.target_gpu)
        same = (trt_mod(self.input) - self.traced_model(self.input)).abs().max()
        trtorch.set_device(0)
        self.assertTrue(same < 2e-3)

    def test_compile_script(self):
        trtorch.set_device(0)
        compile_spec = {
            "inputs": [trtorch.Input(self.input.shape)],
            "device": {
                "device_type": trtorch.DeviceType.GPU,
                "gpu_id": self.target_gpu,
                "dla_core": 0,
                "allow_gpu_fallback": False,
                "disable_tf32": False
            }
        }

        trt_mod = trtorch.compile(self.scripted_model, **compile_spec)
        trtorch.set_device(self.target_gpu)
        same = (trt_mod(self.input) - self.scripted_model(self.input)).abs().max()
        trtorch.set_device(0)
        self.assertTrue(same < 2e-3)


class TestMultiGpuSerializeDeserializeSwitching(ModelTestCase):

    def setUp(self):
        if torch.cuda.device_count() < 2:
            self.fail("Test is not relevant for this platform since number of available CUDA devices is less than 2")

        self.target_gpu = 0
        trtorch.set_device(0)
        self.input = torch.randn((1, 3, 224, 224)).to("cuda:0")
        self.model = self.model.to("cuda:0")
        self.traced_model = torch.jit.trace(self.model, [self.input])
        self.scripted_model = torch.jit.script(self.model)

    def test_compile_traced(self):
        trtorch.set_device(0)
        compile_spec = {
            "inputs": [trtorch.Input(self.input.shape)],
            "device": {
                "device_type": trtorch.DeviceType.GPU,
                "gpu_id": self.target_gpu,
                "dla_core": 0,
                "allow_gpu_fallback": False,
                "disable_tf32": False
            }
        }

        trt_mod = trtorch.compile(self.traced_model, **compile_spec)
        # Changing the device ID deliberately. It should still run on correct device ID by context switching
        trtorch.set_device(1)
        same = (trt_mod(self.input) - self.traced_model(self.input)).abs().max()
        self.assertTrue(same < 2e-3)

    def test_compile_script(self):
        trtorch.set_device(0)
        compile_spec = {
            "inputs": [trtorch.Input(self.input.shape)],
            "device": {
                "device_type": trtorch.DeviceType.GPU,
                "gpu_id": self.target_gpu,
                "dla_core": 0,
                "allow_gpu_fallback": False,
                "disable_tf32": False
            }
        }

        trt_mod = trtorch.compile(self.scripted_model, **compile_spec)
        # Changing the device ID deliberately. It should still run on correct device ID by context switching
        trtorch.set_device(1)
        same = (trt_mod(self.input) - self.scripted_model(self.input)).abs().max()
        self.assertTrue(same < 2e-3)


def test_suite():
    suite = unittest.TestSuite()
    suite.addTest(TestMultiGpuSwitching.parametrize(TestMultiGpuSwitching, model=models.resnet18(pretrained=True)))
    suite.addTest(
        TestMultiGpuSerializeDeserializeSwitching.parametrize(TestMultiGpuSwitching,
                                                              model=models.resnet18(pretrained=True)))
    return suite


suite = test_suite()
runner = unittest.TextTestRunner()
result = runner.run(suite)
exit(int(not result.wasSuccessful()))
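Both test classes above end with the same closeness check: the largest absolute elementwise difference between the TensorRT module's output and the reference TorchScript module's output must stay under `2e-3`. A dependency-free sketch of that check, with made-up values for illustration:

```python
# Pure-Python version of `(trt_mod(x) - ref(x)).abs().max()` over flat lists.
def max_abs_diff(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

# Made-up outputs standing in for the reference and compiled modules.
reference = [0.10, 0.20, 0.30]
trt_output = [0.1005, 0.1999, 0.3001]

# The tolerance used by the tests above.
assert max_abs_diff(reference, trt_output) < 2e-3
```

Comparing against a tolerance rather than exact equality matters here because TensorRT's kernels (and TF32/FP16 paths, when enabled) legitimately produce slightly different floating-point results than eager PyTorch.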
| 35.116279 | 117 | 0.601545 | 526 | 4,530 | 5.01711 | 0.188213 | 0.054566 | 0.072755 | 0.051535 | 0.840849 | 0.826449 | 0.808261 | 0.759 | 0.759 | 0.759 | 0 | 0.018536 | 0.28543 | 4,530 | 128 | 118 | 35.390625 | 0.796725 | 0.043488 | 0 | 0.686275 | 0 | 0 | 0.109931 | 0 | 0 | 0 | 0 | 0 | 0.039216 | 1 | 0.068627 | false | 0 | 0.04902 | 0 | 0.147059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
541d78494e019792ed09ebd061ab1af3d2c34747 | 5,620 | py | Python | tests/conftest.py | 0x7d7b/flask-oauth2-validation | ee433ad692e734adeb4dfe0b5fb408704d253706 | [
"MIT"
] | null | null | null | tests/conftest.py | 0x7d7b/flask-oauth2-validation | ee433ad692e734adeb4dfe0b5fb408704d253706 | [
"MIT"
] | null | null | null | tests/conftest.py | 0x7d7b/flask-oauth2-validation | ee433ad692e734adeb4dfe0b5fb408704d253706 | [
"MIT"
] | null | null | null | from flask import Flask

import pytest
import requests_mock

from functional import test_jwk


@pytest.fixture(scope='function')
def test_app():
    with requests_mock.Mocker() as mock:
        _register_mock_addresses(mock)
        yield _create_flask_app()


def _create_flask_app():
    app = Flask(__name__)
    app.config['TESTING'] = True
    return app


def _register_mock_addresses(
    mock: requests_mock.Mocker
):
    mock.get(
        'https://unsupported.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        status_code=404
    )
    mock.get(
        'https://missing_jwks.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'introspection_endpoint':
                'https://issuer.local/oauth2/introspect',
            'introspection_endpoint_auth_methods_supported': [
                'client_secret_basic',
                'client_secret_post'
            ]
        }
    )
    mock.get(
        'https://missing_introspection.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://issuer.local/oauth2/keys',
            'introspection_endpoint_auth_methods_supported': [
                'client_secret_basic',
                'client_secret_post'
            ]
        }
    )
    mock.get(
        'https://missing_introspection_auth.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://issuer.local/oauth2/keys',
            'introspection_endpoint':
                'https://issuer.local/oauth2/introspect',
        }
    )
    mock.get(
        'https://issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://issuer.local/oauth2/keys',
            'introspection_endpoint':
                'https://issuer.local/oauth2/introspect',
            'introspection_endpoint_auth_methods_supported': [
                'client_secret_basic',
                'client_secret_post'
            ]
        }
    )
    mock.get(
        'https://basic_introspection.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://issuer.local/oauth2/keys',
            'introspection_endpoint':
                'https://basic_introspection.issuer.local/oauth2/introspect',
            'introspection_endpoint_auth_methods_supported': [
                'client_secret_basic'
            ]
        }
    )
    mock.get(
        'https://issuer.local/oauth2/keys',
        json={
            'keys': [
                {'kid': 'x'},
                test_jwk,
                {'kid': 'z'}
            ]
        }
    )
    mock.get(
        'https://key_lookup_error.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://key_lookup_error.issuer.local/oauth2/keys',
        }
    )
    mock.get(
        'https://key_lookup_error.issuer.local/oauth2/keys',
        status_code=500
    )
    mock.get(
        'https://invalid_introspection.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://issuer.local/oauth2/keys',
            'introspection_endpoint':
                'https://invalid_introspection.issuer.local/oauth2/introspect',
            'introspection_endpoint_auth_methods_supported': [
                'client_secret_basic',
                'client_secret_post'
            ]
        }
    )
    mock.post(
        'https://invalid_introspection.issuer.local/oauth2/introspect',
        json={
            'active': False
        }
    )
    mock.post(
        'https://issuer.local/oauth2/introspect',
        json={
            'active': True
        }
    )
    mock.post(
        'https://basic_introspection.issuer.local/oauth2/introspect',
        json={
            'active': True
        }
    )
    mock.get(
        'https://introspection_error.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://issuer.local/oauth2/keys',
            'introspection_endpoint':
                'https://introspection_error.issuer.local/oauth2/introspect',
            'introspection_endpoint_auth_methods_supported': [
                'client_secret_basic',
                'client_secret_post'
            ]
        }
    )
    mock.post(
        'https://introspection_error.issuer.local/oauth2/introspect',
        status_code=500
    )
    mock.get(
        'https://unknown_auth.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://issuer.local/oauth2/keys',
            'introspection_endpoint':
                'https://unknown_auth.issuer.local/oauth2/introspect',
            'introspection_endpoint_auth_methods_supported': [
                'client_secret_unknown'
            ]
        }
    )
    mock.get(
        'https://unknown_pubkey.issuer.local/oauth2' +
        '/.well-known/oauth-authorization-server',
        json={
            'jwks_uri':
                'https://unknown_pubkey.issuer.local/oauth2/keys',
        }
    )
    mock.get(
        'https://unknown_pubkey.issuer.local/oauth2/keys',
        json={
            'keys': [
                {'kid': 'x'},
                {'kid': 'z'}
            ]
        }
    )
| 26.761905 | 79 | 0.533986 | 496 | 5,620 | 5.814516 | 0.135081 | 0.129681 | 0.200416 | 0.099168 | 0.864771 | 0.863037 | 0.820735 | 0.755201 | 0.652913 | 0.611997 | 0 | 0.011575 | 0.338968 | 5,620 | 209 | 80 | 26.889952 | 0.764738 | 0 | 0 | 0.592391 | 0 | 0 | 0.486299 | 0.163523 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016304 | false | 0 | 0.021739 | 0 | 0.043478 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
584253e5b62da483fb29dab962b906a70b57da19 | 164,202 | py | Python | file_storage/avm_results_la_grange.py | johncoleman83/attom_python_client | 2fad572162f481a71cccf6003da4cbd8ec4477d4 | [
"MIT"
] | null | null | null | file_storage/avm_results_la_grange.py | johncoleman83/attom_python_client | 2fad572162f481a71cccf6003da4cbd8ec4477d4 | [
"MIT"
] | null | null | null | file_storage/avm_results_la_grange.py | johncoleman83/attom_python_client | 2fad572162f481a71cccf6003da4cbd8ec4477d4 | [
"MIT"
] | 1 | 2020-11-20T19:28:36.000Z | 2020-11-20T19:28:36.000Z | #!/usr/bin/env python3
avm_results = [
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T04:59:47", "transactionID": "330acf8a-d3bd-469d-b6dd-2ecc4fa06303"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 28768917031, "fips": "17031", "apn": "18091110260000", "attomID": 287689}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.186, "lotsize2": 8100}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "608 S KENSINGTON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "608 S KENSINGTON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2705", "postal3": "C020"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802985", "longitude": "-87.874288", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1949, "propLandUse": "SFR", "propIndicator": "10", "legal1": "EAST2 W2 NW4 SEC09 E2NW4SW4 S09 T38 N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1899, "groundfloorsize": 0, "livingsize": 1899, "sizeInd": "LIVING SQFT ", "universalsize": 1899}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 84, "value": 512100, "high": 586000, "low": 431300, 
"fsd": 16.0}}, "sale": {"buyerName": "MROZ,ELIZABETH K & KEVIN M", "sellerName": "KETCHUM,KATHLEEN T & ROBERT P", "salesearchdate": "2005-8-19", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1819245021", "amount": 359920}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 510000, "salerecdate": "2005-8-19", "saledisclosuretype": 0, "saledocnum": "0523120096", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 269}}, "assessment": {"assessed": {"assdttlvalue": 46149}, "market": {"mktttlvalue": 461490}}, "owner": {}}]},
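Each record above nests the AVM estimate under `property[0].avm.amount` (`value` with a `high`/`low` band, a confidence `scr`, and `fsd`). A sketch of pulling the estimate, the width of the band, and price per square foot out of one such record, using values copied from the first entry:

```python
# Trimmed record shaped like the entries above (values from 608 S KENSINGTON AVE).
record = {
    "property": [{
        "avm": {"amount": {"scr": 84, "value": 512100,
                           "high": 586000, "low": 431300, "fsd": 16.0}},
        "building": {"size": {"livingsize": 1899}},
    }]
}

prop = record["property"][0]
amount = prop["avm"]["amount"]

# Width of the value band, and estimate per square foot of living area.
spread = amount["high"] - amount["low"]
per_sqft = amount["value"] / prop["building"]["size"]["livingsize"]

assert spread == 154700
assert round(per_sqft) == 270
```

Note `fsd` (forecast standard deviation, as a percentage) tracks the band width: a lower `fsd` corresponds to a tighter `high`/`low` range around `value`.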
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T04:59:51", "transactionID": "54f8f6e2-10bb-4564-bfab-75034d39125f"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17523301917031, "fips": "17031", "apn": "18093150040000", "attomID": 175233019}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1453, "lotsize2": 6329}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1011 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1011 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2854", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.795556", "longitude": "-87.870177", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1967, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2SW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1731, "groundfloorsize": 0, "livingsize": 1731, "sizeInd": "LIVING SQFT ", "universalsize": 1731}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 498200, "high": 589500, "low": 391200, "fsd": 22.0}}, "sale": {"buyerName": "DOYLE,CHRISTINE|MILLER,VINCENT K", 
"sellerName": "OZER,MARC R & ANN M", "salesearchdate": "2017-5-18", "saleTransDate": "2017-5-18", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1713908044", "amount": 421200}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 468000, "salerecdate": "2017-5-19", "saledisclosuretype": 0, "saledocnum": "1713908043", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 270}}, "assessment": {"assessed": {"assdttlvalue": 48066}, "market": {"mktttlvalue": 480660}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T04:59:53", "transactionID": "912a0380-c3f7-4ec5-9e61-31fd4a96b7f6"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17484776217031, "fips": "17031", "apn": "18091300200000", "attomID": 174847762}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "828 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "828 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2807", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799070", "longitude": "-87.870677", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1948, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L8 B15 COUNTRY CLUB ADDT TO LA GRAN GE E1/2 NW1/4 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2564, "groundfloorsize": 0, "livingsize": 2564, "sizeInd": "LIVING SQFT ", "universalsize": 2564}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 627400, "high": 754400, "low": 491500, "fsd": 22.0}}, 
"sale": {"buyerName": "MICHELINI,LUCAS V & MICHELLE P", "sellerName": "WILLIAMS,R RICHARD & JOYCE H", "salesearchdate": "2016-7-26", "saleTransDate": "2016-7-26", "mortgage": {"FirstConcurrent": {"amount": 404000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 505000, "salerecdate": "2016-7-27", "saledisclosuretype": 0, "saledocnum": "1620918071", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 197}}, "assessment": {"assessed": {"assdttlvalue": 53727}, "market": {"mktttlvalue": 537270}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T04:59:56", "transactionID": "3f7572ce-1e4a-4241-bac0-37b9b36e19c3"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17599964617031, "fips": "17031", "apn": "18093000310000", "attomID": 175999646}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1701, "lotsize2": 7411}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "904 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "904 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2728", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.797520", "longitude": "-87.877730", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1933, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1681, "groundfloorsize": 0, "livingsize": 1681, "sizeInd": "LIVING SQFT ", "universalsize": 1681}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 85, "value": 410800, "high": 472100, "low": 351600, "fsd": 15.0}}, "sale": {"buyerName": "PICKERING,JOHN R & JEANINE", "sellerName": "MERRION,DAVID W", 
"salesearchdate": "2016-12-9", "saleTransDate": "2016-12-9", "mortgage": {"FirstConcurrent": {"amount": 383200}, "SecondConcurrent": {"amount": 46300}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 479000, "salerecdate": "2016-12-20", "saledisclosuretype": 0, "saledocnum": "1635549053", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 285}}, "assessment": {"assessed": {"assdttlvalue": 41716}, "market": {"mktttlvalue": 417160}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T04:59:59", "transactionID": "3af7c23a-94a3-4805-a089-bc4cfade2e2b"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 2322812117031, "fips": "17031", "apn": "18091000270000", "attomID": 23228121}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "505 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "505 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "6107", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.804677", "longitude": "-87.878924", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1956, "propLandUse": "SFR", "propIndicator": "10", "legal1": "WEST2 W2 NW4 SEC09 W2NW4SW4 S09 T38 N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2139, "groundfloorsize": 0, "livingsize": 2139, "sizeInd": "LIVING SQFT ", "universalsize": 2139}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 75, "value": 534800, "high": 667700, "low": 421800, 
"fsd": 25.0}}, "sale": {"sellerName": "LALLY,BRIAN & ANGELLA B", "salesearchdate": "2018-6-18", "saleTransDate": "2018-6-18", "mortgage": {"FirstConcurrent": {"amount": 377000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 532000, "salerecdate": "2018-10-11", "saledisclosuretype": 0, "saledocnum": "1828457101", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 249}}, "assessment": {"assessed": {"assdttlvalue": 45979}, "market": {"mktttlvalue": 459790}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:01", "transactionID": "67242182-7e21-458e-9be3-59a65f11858f"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17484770317031, "fips": "17031", "apn": "18091140020000", "attomID": 174847703}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "605 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "605 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2814", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.803142", "longitude": "-87.871653", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1941, "propLandUse": "SFR", "propIndicator": "10", "legal1": "(COUNTRY) (CLUB) ADD TO (LAGRANGE) SUB OF EH NW SEC 0 9-38-12"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1860, "groundfloorsize": 0, "livingsize": 1860, "sizeInd": "LIVING SQFT ", "universalsize": 1860}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 74, "value": 457700, "high": 578000, 
"low": 369900, "fsd": 26.0}}, "sale": {"buyerName": "PFISTER,MICHAEL & OLIVIA", "sellerName": "MUSCATO,TODD M & MEGHAN D", "salesearchdate": "2016-7-25", "saleTransDate": "2016-7-25", "mortgage": {"FirstConcurrent": {"amount": 360000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 450000, "salerecdate": "2016-8-26", "saledisclosuretype": 0, "saledocnum": "1623919393", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 242}}, "assessment": {"assessed": {"assdttlvalue": 44865}, "market": {"mktttlvalue": 448650}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:05", "transactionID": "a0768a0a-5675-4e00-b550-6996fe034e82"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524260717031, "fips": "17031", "apn": "18093200160000", "attomID": 175242607}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "5316 S CATHERINE AVE", "line2": "COUNTRYSIDE, IL 60525", "locality": "Countryside", "matchCode": "ExaStr", "oneLine": "5316 S CATHERINE AVE, COUNTRYSIDE, IL 60525", "postal1": "60525", "postal2": "2840", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.792306", "longitude": "-87.872667", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004683765, PL1716873, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1944, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2SW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1041, "groundfloorsize": 0, "livingsize": 1041, "sizeInd": "LIVING SQFT ", "universalsize": 1041}, "rooms": {"bathstotal": 1.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 73, "value": 248700, "high": 315100, "low": 189800, "fsd": 27.0}}, "sale": 
{"buyerName": "REINHART,NICOLE & JACOB", "sellerName": "BMF REMODELING LLC", "salesearchdate": "2017-1-26", "saleTransDate": "2017-1-26", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1703910117", "amount": 424100}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 535000, "salerecdate": "2017-2-8", "saledisclosuretype": 0, "saledocnum": "1703910116", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 514}}, "assessment": {"assessed": {"assdttlvalue": 20041}, "market": {"mktttlvalue": 200410}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:07", "transactionID": "40438715-e68e-4342-ad40-24572557a0fb"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 865277517031, "fips": "17031", "apn": "18091060160000", "attomID": 8652775}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "512 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "512 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2801", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.804829", "longitude": "-87.870943", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1965, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1725, "groundfloorsize": 0, "livingsize": 1725, "sizeInd": "LIVING SQFT ", "universalsize": 1725}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 0, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 76, "value": 432300, "high": 536200, "low": 354800, "fsd": 24.0}}, "sale": {"buyerName": "MANTA,JOHN L TRUST", 
"sellerName": "BENSON,THOMAS G & BRIGID", "salesearchdate": "2016-6-17", "saleTransDate": "2016-6-17", "mortgage": {"FirstConcurrent": {"amount": 388000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 485000, "salerecdate": "2016-6-23", "saledisclosuretype": 0, "saledocnum": "1617539110", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 281}}, "assessment": {"assessed": {"assdttlvalue": 43561}, "market": {"mktttlvalue": 435610}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:11", "transactionID": "36ea88aa-609b-41e1-bc05-14cbff1d3f66"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524260317031, "fips": "17031", "apn": "18093200110000", "attomID": 175242603}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "5347 S KENSINGTON AVE", "line2": "COUNTRYSIDE, IL 60525", "locality": "Countryside", "matchCode": "ExaStr", "oneLine": "5347 S KENSINGTON AVE, COUNTRYSIDE, IL 60525", "postal1": "60525", "postal2": "6607", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.792865", "longitude": "-87.873492", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004683765, PL1716873, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1955, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L14 B8 5TH AVENUE MANOR E1/2 SW1/4 XN25AC S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1824, "groundfloorsize": 0, "livingsize": 1824, "sizeInd": "LIVING SQFT ", "universalsize": 1824}, "rooms": {"bathstotal": 3.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 73, "value": 467000, "high": 591500, "low": 353900, "fsd": 27.0}}, "sale": 
{"buyerName": "SWEENEY,BRENDAN M & JENNIFER H", "sellerName": "BUDDE,MICHAEL JR & GINA", "salesearchdate": "2017-2-21", "saleTransDate": "2017-2-21", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1705306146", "amount": 388000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 485000, "salerecdate": "2017-2-22", "saledisclosuretype": 0, "saledocnum": "1705306145", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 266}}, "assessment": {"assessed": {"assdttlvalue": 35105}, "market": {"mktttlvalue": 351050}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:13", "transactionID": "a1d52055-9166-4809-ad57-18e0abfc7ba5"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17523295717031, "fips": "17031", "apn": "18091020030000", "attomID": 175232957}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "509 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "509 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2732", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.804790", "longitude": "-87.876487", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1930, "propLandUse": "SFR", "propIndicator": "10", "legal1": "(SPRING)(GARDENS) SUB OF EH OF WH OF NW 1/4 OF EH OF NW 1/4 OF SW 1/4 SEC 09-38-12"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1580, "groundfloorsize": 0, "livingsize": 1580, "sizeInd": "LIVING SQFT ", "universalsize": 1580}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 82, "value": 497000, 
"high": 555700, "low": 405600, "fsd": 18.0}}, "sale": {"buyerName": "NAGEL,MATTHEW H & CAITLIN A", "sellerName": "YODER,MARK R", "salesearchdate": "2017-2-27", "saleTransDate": "2017-2-27", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1707255133", "amount": 361600}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 452000, "salerecdate": "2017-3-13", "saledisclosuretype": 0, "saledocnum": "1707255132", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 286}}, "assessment": {"assessed": {"assdttlvalue": 45228}, "market": {"mktttlvalue": 452280}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:16", "transactionID": "afe22eef-87d2-489b-98d0-e75727245f34"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17527273917031, "fips": "17031", "apn": "18093160300000", "attomID": 175272739}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.3698, "lotsize2": 16108}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1103 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1103 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "6605", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.793652", "longitude": "-87.878451", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2636, "groundfloorsize": 0, "livingsize": 2636, "sizeInd": "LIVING SQFT ", "universalsize": 2636}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 0, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 540600, "high": 612400, "low": 423200, "fsd": 22.0}}, "sale": {"buyerName": "CARDONE,JOHN & JOAN", 
"sellerName": "BAZZONI MARGARET E TRUST", "salesearchdate": "2016-10-4", "saleTransDate": "2016-10-4", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1630834045", "amount": 376000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 470000, "salerecdate": "2016-11-3", "saledisclosuretype": 0, "saledocnum": "1630834044", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 178}}, "assessment": {"assessed": {"assdttlvalue": 55137}, "market": {"mktttlvalue": 551370}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:18", "transactionID": "3747a6b4-c770-47a1-b993-c1754f8d3065"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 1461398217031, "fips": "17031", "apn": "18091120180000", "attomID": 14613982}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "620 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "620 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2825", "postal3": "C020"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802795", "longitude": "-87.873140", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1948, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2NW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1666, "groundfloorsize": 0, "livingsize": 1666, "sizeInd": "LIVING SQFT ", "universalsize": 1666}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 81, "value": 470500, "high": 550200, "low": 383200, "fsd": 19.0}}, "sale": 
{"buyerName": "SWINIARSKI,BENNETT & EMILY", "sellerName": "MC LANE,ROBERT & CHRISTINA", "salesearchdate": "2016-10-11", "saleTransDate": "2016-10-11", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1629150030", "amount": 385200}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 481500, "salerecdate": "2016-10-17", "saledisclosuretype": 0, "saledocnum": "1629150029", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 289}}, "assessment": {"assessed": {"assdttlvalue": 40668}, "market": {"mktttlvalue": 406680}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 2, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:21", "transactionID": "cee5f63f-c537-4f01-a496-6f0dcdf4d58c"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17484773217031, "fips": "17031", "apn": "18091190110000", "attomID": 174847732}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "741 S SPRING AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "741 S SPRING AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2754", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800553", "longitude": "-87.875054", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "SOUTH25 FT LOT111 E2 W2 NW4 E2 NW4 SW4 SEC09"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1522, "groundfloorsize": 0, "livingsize": 1522, "sizeInd": "LIVING SQFT ", "universalsize": 1522}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 290100, "high": 344600, "low": 227200, 
"fsd": 22.0}}, "sale": {"buyerName": "FOLEY,SEAN & SARA FAMILY TRUST", "multiAPNflag": "MULTIPLE", "sellerName": "FOLEY,SEAN M", "salesearchdate": "2017-9-12", "saleTransDate": "2017-9-12", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "salerecdate": "2017-10-13", "saledisclosuretype": 0, "saledocnum": "1728657261", "saletranstype": "Nominal - Non/Arms Length Sale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 27237}, "market": {"mktttlvalue": 272370}}, "owner": {}}, {"identifier": {"obPropId": 17484774217031, "fips": "17031", "apn": "18091190330000", "attomID": 174847742}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.0775, "lotsize2": 3375}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "741 S SPRING AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "741 S SPRING AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2754", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800553", "longitude": "-87.875054", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1522, "groundfloorsize": 0, "livingsize": 1522, "sizeInd": "LIVING SQFT ", "universalsize": 1522}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": 
"YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 81, "value": 187700, "high": 214200, "low": 151900, "fsd": 19.0}}, "sale": {"buyerName": "FOLEY,SEAN & SARA FAMILY TRUST", "multiAPNflag": "MULTIPLE", "sellerName": "FOLEY,SEAN M", "salesearchdate": "2017-9-12", "saleTransDate": "2017-9-12", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "salerecdate": "2017-10-13", "saledisclosuretype": 0, "saledocnum": "1728657261", "saletranstype": "Nominal - Non/Arms Length Sale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 17285}, "market": {"mktttlvalue": 172850}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:24", "transactionID": "102a93c1-e27c-4e0f-9aeb-4953d05ad8bd"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17484771817031, "fips": "17031", "apn": "18091150060000", "attomID": 174847718}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1951, "lotsize2": 8500}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "621 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "621 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2804", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802864", "longitude": "-87.870476", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L19 B8 COUNTRY CLUB ADDT TO LAGRANG E SUBD E1/2 NW1/4 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 3077, "groundfloorsize": 0, "livingsize": 3077, "sizeInd": "LIVING SQFT ", "universalsize": 3077}, "rooms": {"bathstotal": 3.0, "beds": 5, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 693800, "high": 
849200, "low": 553500, "fsd": 22.0}}, "sale": {"buyerName": "HERNANDEZ,SCOTT C & STEPHANIE K", "sellerName": "LAATZ,ALAN J & ALANE L", "salesearchdate": "2018-5-18", "saleTransDate": "2018-5-18", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1814229338", "amount": 420000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 525000, "salerecdate": "2018-5-22", "saledisclosuretype": 0, "saledocnum": "1814229337", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 171}}, "assessment": {"assessed": {"assdttlvalue": 62955}, "market": {"mktttlvalue": 629550}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:26", "transactionID": "46f56645-9d6a-482a-92e0-0c004f267d22"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17338671117031, "fips": "17031", "apn": "18093020060000", "attomID": 173386711}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1538, "lotsize2": 6700}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "921 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "921 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2740", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.797261", "longitude": "-87.876129", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 235 IN SPRING GARDENS E2 W2 NW4 OF SEC09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1578, "groundfloorsize": 0, "livingsize": 1578, "sizeInd": "LIVING SQFT ", "universalsize": 1578}, "rooms": {"bathstotal": 3.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 74, "value": 476400, "high": 
535100, "low": 354800, "fsd": 26.0}}, "sale": {"sellerName": "HUBER,VIRGINIA & JOSHUA F", "salesearchdate": "2018-10-26", "saleTransDate": "2018-10-26", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1830606109", "amount": 336000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 420000, "salerecdate": "2018-11-2", "saledisclosuretype": 0, "saledocnum": "1830606108", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 266}}, "assessment": {"assessed": {"assdttlvalue": 37857}, "market": {"mktttlvalue": 378570}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:29", "transactionID": "dcdd5c58-2471-4f2d-ba62-4f14dc0e3f4d"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4518990617031, "fips": "17031", "apn": "18093300060000", "attomID": 45189906}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.2307, "lotsize2": 10050}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "5400 S KENSINGTON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "5400 S KENSINGTON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "6610", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.792073", "longitude": "-87.873806", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1956, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1529, "groundfloorsize": 0, "livingsize": 1529, "sizeInd": "LIVING SQFT ", "universalsize": 1529}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 75, "value": 382700, "high": 478200, "low": 286400, "fsd": 25.0}}, "sale": {"buyerName": "FLEMING,ERIC & RACHEL", "sellerName": 
"HALLIGAN,GERRY A", "salesearchdate": "2017-6-29", "saleTransDate": "2017-6-29", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1720149049", "amount": 340000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 425000, "salerecdate": "2017-7-20", "saledisclosuretype": 0, "saledocnum": "1720149048", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 278}}, "assessment": {"assessed": {"assdttlvalue": 30836}, "market": {"mktttlvalue": 308360}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:32", "transactionID": "e4d6a0da-65aa-48f2-8891-19910e3e77eb"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17617177817031, "fips": "17031", "apn": "18091220090000", "attomID": 176171778}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "735 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "735 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2816", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800741", "longitude": "-87.871538", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1952, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2NW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1564, "groundfloorsize": 0, "livingsize": 1564, "sizeInd": "LIVING SQFT ", "universalsize": 1564}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 76, "value": 493200, "high": 580700, "low": 373900, "fsd": 24.0}}, "sale": {"buyerName": "OBRIEN,BARBARA", 
"sellerName": "EDWARDS,LAURA C & MATTHEW J", "salesearchdate": "2016-8-16", "saleTransDate": "2016-8-16", "mortgage": {"FirstConcurrent": {"amount": 339200}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 424000, "salerecdate": "2016-8-22", "saledisclosuretype": 0, "saledocnum": "1623518015", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 271}}, "assessment": {"assessed": {"assdttlvalue": 42495}, "market": {"mktttlvalue": 424950}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:35", "transactionID": "8c0a4427-2498-4205-841c-a256ab24227f"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17617177317031, "fips": "17031", "apn": "18091020010000", "attomID": 176171773}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.152, "lotsize2": 6620}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "501 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "501 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2732", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.804938", "longitude": "-87.876496", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721600, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085027, SB0000085030, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1951, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 288 IN SPRING GARDENS E2 W2 NW4 OF SEC09 T38N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2304, "groundfloorsize": 0, "livingsize": 2304, "sizeInd": "LIVING SQFT ", "universalsize": 2304}, "rooms": {"bathstotal": 2.0, "beds": 5, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 82, "value": 567900, "high": 662600, "low": 466600, "fsd": 
18.0}}, "sale": {"buyerName": "MICHEL,JAMES & KIMBERLY", "sellerName": "AHERN LAWRENCE R & F TRUST", "salesearchdate": "2016-7-8", "saleTransDate": "2016-7-8", "mortgage": {"FirstConcurrent": {"amount": 400000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 500000, "salerecdate": "2016-8-3", "saledisclosuretype": 0, "saledocnum": "1621617118", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 217}}, "assessment": {"assessed": {"assdttlvalue": 53221}, "market": {"mktttlvalue": 532210}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:37", "transactionID": "e0f7368f-9fd8-493a-9009-e0a337216b78"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 3885739217031, "fips": "17031", "apn": "18091270030000", "attomID": 38857392}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "809 S SPRING AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "809 S SPRING AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2756", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799329", "longitude": "-87.874998", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1940, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L106 SPRINGS GARDENS SUBD E1/2 W1/2 NW1/4 & E1/2 NW1/4 SW1/4 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2116, "groundfloorsize": 0, "livingsize": 2116, "sizeInd": "LIVING SQFT ", "universalsize": 2116}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 498800, "high": 588700, "low": 390100, "fsd": 22.0}}, 
"sale": {"buyerName": "REYNOLDS,TIMOTHY & SARAH", "sellerName": "BURKE,PATRICK D & DIANE M", "salesearchdate": "2016-7-8", "saleTransDate": "2016-7-8", "mortgage": {"FirstConcurrent": {"amount": 508250}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 535000, "salerecdate": "2016-7-25", "saledisclosuretype": 0, "saledocnum": "1620756042", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 253}}, "assessment": {"assessed": {"assdttlvalue": 43488}, "market": {"mktttlvalue": 434880}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:40", "transactionID": "dec6c55d-a20d-4817-95d4-450693ae04ba"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 15533993817031, "fips": "17031", "apn": "18091170120000", "attomID": 155339938}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "745 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "745 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2725", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800454", "longitude": "-87.877503", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1602, "groundfloorsize": 0, "livingsize": 1602, "sizeInd": "LIVING SQFT ", "universalsize": 1602}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 84, "value": 387000, "high": 433700, "low": 324200, "fsd": 16.0}}, "sale": {"buyerName": "DEMATTEO,RICHARD J & 
MAUREEN G", "sellerName": "TRUST 99-2128", "salesearchdate": "2017-6-20", "saleTransDate": "2017-6-20", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1717415091", "amount": 329000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 412000, "salerecdate": "2017-6-23", "saledisclosuretype": 0, "saledocnum": "1717415090", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 257}}, "assessment": {"assessed": {"assdttlvalue": 37143}, "market": {"mktttlvalue": 371430}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:43", "transactionID": "a93e773e-2974-40aa-a3dd-e2538c050ad7"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524102117031, "fips": "17031", "apn": "18091090120000", "attomID": 175241021}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "645 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "645 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2723", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802269", "longitude": "-87.877590", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1945, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 13 IN BLK 4 IN H O STONE & COMP ANY'S BRAINARD PARK W2 W2 NW4 W2 NW 4 SW4 OF SEC09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2045, "groundfloorsize": 0, "livingsize": 2045, "sizeInd": "LIVING SQFT ", "universalsize": 2045}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 85, 
"value": 475700, "high": 543600, "low": 405100, "fsd": 15.0}}, "sale": {"buyerName": "WOODS,BARBARA Y", "sellerName": "ADDISON,JAMES B & NANCY M", "salesearchdate": "2017-7-24", "saleTransDate": "2017-7-24", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1720608188", "amount": 335000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 535000, "salerecdate": "2017-7-25", "saledisclosuretype": 0, "saledocnum": "1720608187", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 262}}, "assessment": {"assessed": {"assdttlvalue": 46828}, "market": {"mktttlvalue": 468280}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:45", "transactionID": "033fd796-85ed-4aed-93d9-605c25815814"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 3467042017031, "fips": "17031", "apn": "18093240250000", "attomID": 34670420}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.173, "lotsize2": 7535}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "5409 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "5409 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "6606", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.791250", "longitude": "-87.878301", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1943, "propLandUse": "SFR", "propIndicator": "10", "legal1": "WEST 2 OF N 55 FT MEASURED ON W LIN E OF S 175 FT MEASURED ON W LINE OF N 2 ACRES ON W LINE OF SEC09 T38N"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1133, "groundfloorsize": 0, "livingsize": 1133, "sizeInd": "LIVING SQFT ", "universalsize": 1133}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 80, "value": 
346600, "high": 410100, "low": 277200, "fsd": 20.0}}, "sale": {"sellerName": "MCGINNIS,RYAN & MELISSA", "salesearchdate": "2017-8-14", "saleTransDate": "2017-8-14", "mortgage": {"FirstConcurrent": {"amount": 391500}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 435000, "salerecdate": "2017-9-1", "saledisclosuretype": 0, "saledocnum": "1724434020", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 384}}, "assessment": {"assessed": {"assdttlvalue": 29530}, "market": {"mktttlvalue": 295300}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:48", "transactionID": "02d0ef7e-55bf-4e5d-861a-13a503cd9ecd"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524233417031, "fips": "17031", "apn": "18091130060000", "attomID": 175242334}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "621 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "621 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2826", "postal3": "C020"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802806", "longitude": "-87.872772", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1950, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 19 IN BLK 6 IN COUNTRY CLUB ADD TO LA GRANGE E2 NW4 OF SEC09 T38N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2429, "groundfloorsize": 0, "livingsize": 2429, "sizeInd": "LIVING SQFT ", "universalsize": 2429}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 80, "value": 
549800, "high": 660500, "low": 450700, "fsd": 20.0}}, "sale": {"buyerName": "WISE,JOANNE T", "sellerName": "FORD,MARY B & RICHARD J", "salesearchdate": "2017-6-15", "saleTransDate": "2017-6-15", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1717357074", "amount": 440000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 550000, "salerecdate": "2017-6-22", "saledisclosuretype": 0, "saledocnum": "1717357073", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 226}}, "assessment": {"assessed": {"assdttlvalue": 53921}, "market": {"mktttlvalue": 539210}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:51", "transactionID": "91154cce-b5cc-4b9e-a15f-9501c4874a2e"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 775053417031, "fips": "17031", "apn": "18091080030000", "attomID": 7750534}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "609 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "609 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2744", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802741", "longitude": "-87.878864", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1949, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L22 B3 BRAINARD PARK W1/2 W1/2 NW1/ 4 & W1/2 NW1/4 SW1/4 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1567, "groundfloorsize": 0, "livingsize": 1567, "sizeInd": "LIVING SQFT ", "universalsize": 1567}, "rooms": {"bathstotal": 3.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 79, "value": 446300, "high": 
541700, "low": 358400, "fsd": 21.0}}, "sale": {"buyerName": "BIROS,JOSHUA|SCHWANZ,KRISTEN", "sellerName": "AARONSON,STEVEN & ELIZABETH", "salesearchdate": "2017-6-22", "saleTransDate": "2017-6-22", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1718808112", "amount": 350000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 437500, "salerecdate": "2017-7-7", "saledisclosuretype": 0, "saledocnum": "1718808111", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 279}}, "assessment": {"assessed": {"assdttlvalue": 37088}, "market": {"mktttlvalue": 370880}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:53", "transactionID": "a51512cd-c91e-4102-8f0b-0451de87b7ce"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4503258317031, "fips": "17031", "apn": "18091290060000", "attomID": 45032583}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "821 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "821 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2830", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799151", "longitude": "-87.872614", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2NW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1249, "groundfloorsize": 0, "livingsize": 1249, "sizeInd": "LIVING SQFT ", "universalsize": 1249}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 76, "value": 415900, "high": 490800, "low": 315500, "fsd": 24.0}}, "sale": 
{"buyerName": "DOWD,JILLIAN M|ELKINS,ALICIA M", "sellerName": "GRELEWICZ,KIRSTEN H & JANUSZ K", "salesearchdate": "2017-4-10", "saleTransDate": "2017-4-10", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1711418064", "amount": 412250}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 485000, "salerecdate": "2017-4-24", "saledisclosuretype": 0, "saledocnum": "1711418063", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 388}}, "assessment": {"assessed": {"assdttlvalue": 37499}, "market": {"mktttlvalue": 374990}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:57", "transactionID": "32f8c417-caa4-4efb-8af1-802ed0de8fa5"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17618571917031, "fips": "17031", "apn": "18093080020000", "attomID": 176185719}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1538, "lotsize2": 6700}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1005 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1005 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2717", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.795701", "longitude": "-87.878533", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1962, "propLandUse": "SFR", "propIndicator": "10", "legal1": "H O (STONE) &COS (BRAINARD) PARK SUB OF WH WH NW SEC 09-38-12"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1718, "groundfloorsize": 0, "livingsize": 1718, "sizeInd": "LIVING SQFT ", "universalsize": 1718}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 474700, "high": 
565700, "low": 372100, "fsd": 22.0}}, "sale": {"buyerName": "MILLER,HAL & JANICE", "sellerName": "HUERGO,IVAN", "salesearchdate": "2017-9-28", "saleTransDate": "2017-9-28", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 480500, "salerecdate": "2017-10-10", "saledisclosuretype": 0, "saledocnum": "1728341016", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 280}}, "assessment": {"assessed": {"assdttlvalue": 43607}, "market": {"mktttlvalue": 436070}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:00:59", "transactionID": "e07db813-229c-4e67-8c19-5d8528e461d9"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4502373817031, "fips": "17031", "apn": "18091260040000", "attomID": 45023738}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "815 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "815 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2738", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799195", "longitude": "-87.876215", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1833, "groundfloorsize": 0, "livingsize": 1833, "sizeInd": "LIVING SQFT ", "universalsize": 1833}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 85, "value": 479000, "high": 524500, "low": 408100, "fsd": 15.0}}, "sale": {"sellerName": "BODINE,DANIEL T", 
"salesearchdate": "2013-3-21", "saleTransDate": "2013-3-21", "mortgage": {"FirstConcurrent": {"amount": 391500}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "salerecdate": "2013-3-28", "saledisclosuretype": 0, "saledocnum": "1308746001", "saletranstype": "Nominal - Non/Arms Length Sale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 41378}, "market": {"mktttlvalue": 413780}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:02", "transactionID": "f9e3589f-526a-42b0-8ce4-a72607013843"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17618565617031, "fips": "17031", "apn": "18091240220000", "attomID": 176185656}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "838 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "838 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2726", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.798726", "longitude": "-87.877789", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1956, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 10 IN BLK 7 IN H O STONE & COMP ANY'S BRAINARD PARK OF W2 W2 NW4 W2 NW4 SW4 OF SEC09 T38N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1790, "groundfloorsize": 0, "livingsize": 1790, "sizeInd": "LIVING SQFT ", "universalsize": 1790}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 
85, "value": 422200, "high": 477400, "low": 357200, "fsd": 15.0}}, "sale": {"mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "saledisclosuretype": 0}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 40648}, "market": {"mktttlvalue": 406480}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:04", "transactionID": "a86222f5-6e4c-4be5-bfe8-d4604139dca2"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 5238144917031, "fips": "17031", "apn": "18093000290000", "attomID": 52381449}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.3099, "lotsize2": 13500}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "911 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "911 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2716", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.797410", "longitude": "-87.878608", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1953, "propLandUse": "SFR", "propIndicator": "10", "legal1": "WEST2 NW4 SW4 SEC09"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1799, "groundfloorsize": 0, "livingsize": 1799, "sizeInd": "LIVING SQFT ", "universalsize": 1799}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 82, "value": 445400, "high": 510700, "low": 367600, "fsd": 18.0}}, "sale": 
{"buyerName": "FEIPEL,NICHOLAS", "sellerName": "EBRIGHT,GEORGE E & PATRICIA J", "salesearchdate": "2006-1-10", "mortgage": {"FirstConcurrent": {"amount": 394250}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "salerecdate": "2006-1-10", "saledisclosuretype": 0, "saledocnum": "0601019005", "saletranstype": "Nominal - Non/Arms Length Sale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 41172}, "market": {"mktttlvalue": 411720}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 2, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:07", "transactionID": "dbdcc65c-16f9-4ac8-ae11-addc1d888f62"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 3877212617031, "fips": "17031", "apn": "18091210210000", "attomID": 38772126}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "732 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "732 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2815", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800766", "longitude": "-87.871908", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1942, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2NW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1585, "groundfloorsize": 0, "livingsize": 1585, "sizeInd": "LIVING SQFT ", "universalsize": 1585}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 76, "value": 431100, "high": 510500, "low": 328200, "fsd": 
24.0}}, "sale": {"buyerName": "KOVEL BRETT W & KERRI L TRUST", "sellerName": "KOVEL B W & K L TRUST", "salesearchdate": "2018-10-30", "saleTransDate": "2018-10-30", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1831306084", "amount": 422000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 527500, "salerecdate": "2018-11-9", "saledisclosuretype": 0, "saledocnum": "1831306083", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 333}}, "assessment": {"assessed": {"assdttlvalue": 37084}, "market": {"mktttlvalue": 370840}}, "owner": {}}, {"identifier": {"obPropId": 17623763117031, "fips": "17031", "apn": "17184110290000", "attomID": 176237631}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.0943, "lotsize2": 4106}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "732 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "732 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2815", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800766", "longitude": "-87.871908", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Exempt", "propsubtype": "RESIDENTIAL", "proptype": "PUBLIC (NEC)", "yearbuilt": 0, "propLandUse": "PUBLIC (NEC)", "propIndicator": "90"}, "utilities": {}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 0, "groundfloorsize": 0, "livingsize": 0, "sizeInd": "BUILDING SQFT ", "universalsize": 0}, "rooms": {"bathstotal": 0.0, "beds": 0, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0}, "summary": {"levels": 0, "storyDesc": "TYPE UNKNOWN", "unitsCount": 
"0"}}, "avm": {"amount": {"scr": 0, "value": 0, "high": 0, "low": 0, "fsd": 0.0}}, "sale": {"mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "saledisclosuretype": 0}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 0}, "market": {"mktttlvalue": 0}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:10", "transactionID": "28d67243-30f2-466e-a772-e5432119a98b"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17527272417031, "fips": "17031", "apn": "18091040240000", "attomID": 175272724}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "544 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "544 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2823", "postal3": "C020"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.804177", "longitude": "-87.873203", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1948, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2NW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1790, "groundfloorsize": 0, "livingsize": 1790, "sizeInd": "LIVING SQFT ", "universalsize": 1790}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 2, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 81, "value": 448400, "high": 510000, "low": 361300, "fsd": 19.0}}, 
"sale": {"buyerName": "CARR,WILLIAM & WILLIAM", "sellerName": "MICHAEL G ROHAN LIVING TRUST", "salesearchdate": "2018-5-10", "saleTransDate": "2018-5-10", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 473000, "salerecdate": "2018-5-15", "saledisclosuretype": 0, "saledocnum": "1813546065", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 264}}, "assessment": {"assessed": {"assdttlvalue": 41530}, "market": {"mktttlvalue": 415300}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:12", "transactionID": "a2ca7008-dbd3-48b2-953a-86c0df1a04d4"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17527273317031, "fips": "17031", "apn": "18093160240000", "attomID": 175272733}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1722, "lotsize2": 7500}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1128 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1128 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "6624", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.793435", "longitude": "-87.877534", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1967, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2021, "groundfloorsize": 0, "livingsize": 2021, "sizeInd": "LIVING SQFT ", "universalsize": 2021}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 81, "value": 463800, "high": 552200, "low": 381500, "fsd": 19.0}}, "sale": {"multiAPNflag": "MULTIPLE", 
"sellerName": "HEIDERSCHEIDT,NORA J", "salesearchdate": "2019-3-29", "saleTransDate": "2019-3-29", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1909908177", "amount": 515000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 525000, "salerecdate": "2019-4-9", "saledisclosuretype": 0, "saledocnum": "1909908176", "saletranstype": "Resale"}, "calculation": {"priceperbed": 131250, "pricepersizeunit": 260}}, "assessment": {"assessed": {"assdttlvalue": 50390}, "market": {"mktttlvalue": 503900}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:16", "transactionID": "27b78e3c-18e9-42d6-90d3-78fe82d8b276"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17617178317031, "fips": "17031", "apn": "18091220140000", "attomID": 176171783}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "704 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "704 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2805", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.801321", "longitude": "-87.870761", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1951, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1634, "groundfloorsize": 0, "livingsize": 1634, "sizeInd": "LIVING SQFT ", "universalsize": 1634}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 79, "value": 410600, "high": 494900, "low": 326800, "fsd": 21.0}}, "sale": {"buyerName": "QUINS CONSTRUCTION CORP", "sellerName": "BEERMANN 
HOWARD H TRUST", "salesearchdate": "2018-6-26", "saleTransDate": "2018-6-26", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1818034044", "amount": 363750}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 315000, "salerecdate": "2018-6-29", "saledisclosuretype": 0, "saledocnum": "1818034043", "saletranstype": "Construction Loan/Financing"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 193}}, "assessment": {"assessed": {"assdttlvalue": 36662}, "market": {"mktttlvalue": 366620}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:18", "transactionID": "a74b4f0e-0176-44bd-9385-9e9f657b6343"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17484775817031, "fips": "17031", "apn": "18091240010000", "attomID": 174847758}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1517, "lotsize2": 6610}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "801 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "801 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2746", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799427", "longitude": "-87.878716", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1952, "propLandUse": "SFR", "propIndicator": "10", "legal1": "H O (STONE) &COS (BRAINARD) PARK SUB OF WH WH NW SEC 09-38-12"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1618, "groundfloorsize": 0, "livingsize": 1618, "sizeInd": "LIVING SQFT ", "universalsize": 1618}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 79, "value": 
431800, "high": 494000, "low": 341600, "fsd": 21.0}}, "sale": {"sellerName": "PINNACLE DREAM HOME INC", "salesearchdate": "2019-5-6", "saleTransDate": "2019-5-6", "mortgage": {"FirstConcurrent": {"amount": 75000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 519500, "salerecdate": "2019-5-8", "saledisclosuretype": 0, "saledocnum": "1912842003", "saletranstype": "Resale"}, "calculation": {"priceperbed": 173167, "pricepersizeunit": 321}}, "assessment": {"assessed": {"assdttlvalue": 39394}, "market": {"mktttlvalue": 393940}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:21", "transactionID": "fbb97083-9c20-448c-91c5-93015af1f4bc"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17484767817031, "fips": "17031", "apn": "18091100140000", "attomID": 174847678}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1521, "lotsize2": 6625}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "600 S SPRING AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "600 S SPRING AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2751", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.803103", "longitude": "-87.875544", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157782, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1949, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L157 SPRING GARDENS SUBD E1/2 W1/2 NW1/4 & E1/2 NW1/4 SW1/4 S9 T38N R12E"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1742, "groundfloorsize": 0, "livingsize": 1742, "sizeInd": "LIVING SQFT ", "universalsize": 1742}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 81, "value": 437100, "high": 
506500, "low": 352300, "fsd": 19.0}}, "sale": {"sellerName": "KOZLOWSKI SUSAN D", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1910055029", "amount": 380000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "saledisclosuretype": 0}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 40911}, "market": {"mktttlvalue": 409110}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:23", "transactionID": "61cf6f19-c5cd-4450-9e02-86fe10e7f27c"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524260217031, "fips": "17031", "apn": "18093200100000", "attomID": 175242602}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "5339 S KENSINGTON AVE", "line2": "COUNTRYSIDE, IL 60525", "locality": "Countryside", "matchCode": "ExaStr", "oneLine": "5339 S KENSINGTON AVE, COUNTRYSIDE, IL 60525", "postal1": "60525", "postal2": "6607", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.792959", "longitude": "-87.873497", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004683765, PL1716873, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 2002, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L15 B8 H O STONE & COS 5TH AVENUE M ANOR E1/2 SW1/4 XN25AC S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1808, "groundfloorsize": 0, "livingsize": 1808, "sizeInd": "LIVING SQFT ", "universalsize": 1808}, "rooms": {"bathstotal": 3.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 72, "value": 474500, "high": 606100, 
"low": 359400, "fsd": 28.0}}, "sale": {"sellerName": "HESLIN,PATRICK J & JULIE M", "salesearchdate": "2019-1-11", "saleTransDate": "2019-1-11", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1901710056", "amount": 302700}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 547000, "salerecdate": "2019-1-17", "saledisclosuretype": 0, "saledocnum": "1901710054", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 303}}, "assessment": {"assessed": {"assdttlvalue": 35221}, "market": {"mktttlvalue": 352210}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:27", "transactionID": "540f1c7d-8fcd-4a02-9ace-c536e6ed6c3e"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 1466816117031, "fips": "17031", "apn": "18093000110000", "attomID": 14668161}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1538, "lotsize2": 6700}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "941 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "941 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2716", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.796853", "longitude": "-87.878584", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1958, "propLandUse": "SFR", "propIndicator": "10", "legal1": "WEST2 W2 NW4 W2 NW4 SW4 SEC09"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1793, "groundfloorsize": 0, "livingsize": 1793, "sizeInd": "LIVING SQFT ", "universalsize": 1793}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 2, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 82, "value": 518600, "high": 613100, "low": 427900, "fsd": 18.0}}, 
"sale": {"buyerName": "HAVEY,KEVIN J", "sellerName": "REID,GARRETT F & MARY E", "salesearchdate": "2018-4-21", "saleTransDate": "2018-4-21", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 490000, "salerecdate": "2018-5-1", "saledisclosuretype": 0, "saledocnum": "1812155062", "saletranstype": "Resale"}, "calculation": {"priceperbed": 163333, "pricepersizeunit": 273}}, "assessment": {"assessed": {"assdttlvalue": 49088}, "market": {"mktttlvalue": 490880}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:29", "transactionID": "1e865349-249a-4545-92aa-efb6c19ae90f"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4952444017031, "fips": "17031", "apn": "18093090150000", "attomID": 49524440}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1538, "lotsize2": 6700}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1010 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1010 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2741", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.795635", "longitude": "-87.876443", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1957, "propLandUse": "SFR", "propIndicator": "10", "legal1": "H O (STONE) &COS (BRAINARD) PARK SUB OF WH WH NW SEC 09-38-12"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1594, "groundfloorsize": 0, "livingsize": 1594, "sizeInd": "LIVING SQFT ", "universalsize": 1594}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 75, "value": 430200, "high": 477500, 
"low": 324500, "fsd": 25.0}}, "sale": {"buyerName": "BROSKEY,ROBERT & SARAH", "sellerName": "KORINEK,WILLIAM G & DARLENE S", "salesearchdate": "2017-10-30", "saleTransDate": "2017-10-30", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 411000, "salerecdate": "2017-11-1", "saledisclosuretype": 0, "saledocnum": "1730549092", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 258}}, "assessment": {"assessed": {"assdttlvalue": 39994}, "market": {"mktttlvalue": 399940}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:31", "transactionID": "67cd816d-2704-46c7-b9b9-d65cf7403c31"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 15398326817031, "fips": "17031", "apn": "18091280130000", "attomID": 153983268}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1388, "lotsize2": 6045}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "800 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "800 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2829", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799516", "longitude": "-87.872994", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1942, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L1 B13 COUNTRY CLUB ADDT TO LA GRAN GE E1/2 NW1/4 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1711, "groundfloorsize": 0, "livingsize": 1711, "sizeInd": "LIVING SQFT ", "universalsize": 1711}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 465400, "high": 544700, 
"low": 361800, "fsd": 22.0}}, "sale": {"buyerName": "RIEDERER,DANA & JULIE", "sellerName": "ERDTMANN,SANDRA M", "salesearchdate": "2017-6-28", "saleTransDate": "2017-6-28", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1719510058", "amount": 403750}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 425000, "salerecdate": "2017-7-14", "saledisclosuretype": 0, "saledocnum": "1719510057", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 248}}, "assessment": {"assessed": {"assdttlvalue": 41304}, "market": {"mktttlvalue": 413040}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:35", "transactionID": "0090ebf0-d985-4c8f-b9a9-e5c62038c50b"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17527273417031, "fips": "17031", "apn": "18093160250000", "attomID": 175272734}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1794, "lotsize2": 7816}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1132 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1132 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "6624", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.793360", "longitude": "-87.877531", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1967, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2201, "groundfloorsize": 0, "livingsize": 2201, "sizeInd": "LIVING SQFT ", "universalsize": 2201}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 509600, "high": 580600, "low": 399600, "fsd": 22.0}}, "sale": {"buyerName": "PREUSSNER,JEFFREY G TRUST 
101|PREUSSNER,KELLY K TR", "sellerName": "PREUSSNER,JEFFREY G & KELLY K", "salesearchdate": "2018-1-12", "saleTransDate": "2018-1-12", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 449000, "salerecdate": "2018-1-18", "saledisclosuretype": 0, "saledocnum": "1801846152", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 204}}, "assessment": {"assessed": {"assdttlvalue": 55666}, "market": {"mktttlvalue": 556660}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:37", "transactionID": "16c34cb4-63bb-4044-8a3d-4019fc95fd6d"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17618565317031, "fips": "17031", "apn": "18091240190000", "attomID": 176185653}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "824 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "824 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2726", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.798990", "longitude": "-87.877802", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1955, "propLandUse": "SFR", "propIndicator": "10", "legal1": "WEST2 W2 NW4 SEC09 W2NW4SW4 S09 T38 N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2292, "groundfloorsize": 0, "livingsize": 2292, "sizeInd": "LIVING SQFT ", "universalsize": 2292}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 2, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 87, "value": 503700, "high": 567700, "low": 440900, "fsd": 
13.0}}, "sale": {"buyerName": "POLIVKA,JOHN W & LAURA A", "sellerName": "MILES,CURTIS E & MEG F", "salesearchdate": "2017-11-7", "saleTransDate": "2017-11-7", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1731915018", "amount": 424100}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 535000, "salerecdate": "2017-11-15", "saledisclosuretype": 0, "saledocnum": "1731915017", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 233}}, "assessment": {"assessed": {"assdttlvalue": 51987}, "market": {"mktttlvalue": 519870}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 2, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:40", "transactionID": "7f653f2c-7dd6-4889-9bdb-32282ebfa19d"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 1802163717031, "fips": "17031", "apn": "18093130240000", "attomID": 18021637}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1044 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1044 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2821", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.795867", "longitude": "-87.871721", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1943, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1224, "groundfloorsize": 0, "livingsize": 1224, "sizeInd": "LIVING SQFT ", "universalsize": 1224}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 77, "value": 463800, "high": 559300, "low": 358900, "fsd": 23.0}}, "sale": {"buyerName": "MCMULLEN,SHAWN A & JENNIFER R", "sellerName": "WEBER,JOEL A", 
"salesearchdate": "2017-7-18", "saleTransDate": "2017-7-18", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1722310006", "amount": 340000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 425000, "salerecdate": "2017-8-11", "saledisclosuretype": 0, "saledocnum": "1722310005", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 347}}, "assessment": {"assessed": {"assdttlvalue": 39003}, "market": {"mktttlvalue": 390030}}, "owner": {}}, {"identifier": {"obPropId": 20429158117031, "fips": "17031", "apn": "18093130230000", "attomID": 204291581}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1421, "lotsize2": 6192}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1044 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1044 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2821", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.795867", "longitude": "-87.871721", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Vacant", "propsubtype": "RESIDENTIAL", "proptype": "RESIDENTIAL ACREAGE", "yearbuilt": 0, "propLandUse": "RESIDENTIAL ACREAGE", "propIndicator": "80", "legal1": "E2SW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "heatingtype": "FORCED AIR", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 0, "groundfloorsize": 0, "livingsize": 0, "sizeInd": "LIVING SQFT ", "universalsize": 0}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 10}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - 2 ATTACHED SPACES"}, 
"summary": {"levels": 2, "storyDesc": "TYPE UNKNOWN", "unitsCount": "0"}}, "avm": {"amount": {"scr": 0, "value": 0, "high": 0, "low": 0, "fsd": 0.0}}, "sale": {"buyerName": "COYLE CONSTRUCTION CO", "sellerName": "WEBER,JOEL A", "salesearchdate": "2017-4-4", "saleTransDate": "2017-4-4", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 232500, "salerecdate": "2017-4-17", "saledisclosuretype": 0, "saledocnum": "1710704053", "saletranstype": "Resale"}, "calculation": {"priceperbed": 58125, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 4766}, "market": {"mktttlvalue": 0}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:42", "transactionID": "f5b2759e-b9be-4fa2-8de2-2460d97ab9d2"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 3905103717031, "fips": "17031", "apn": "18091230030000", "attomID": 39051037}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1951, "lotsize2": 8500}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "711 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "711 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2806", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.801222", "longitude": "-87.870389", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1953, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2632, "groundfloorsize": 0, "livingsize": 2632, "sizeInd": "LIVING SQFT ", "universalsize": 2632}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 77, "value": 597600, "high": 735400, "low": 480300, "fsd": 23.0}}, "sale": {"buyerName": "VENTURINI,PABLO M", 
"sellerName": "PICCIONE,GIUSEPPE & GAIL", "salesearchdate": "2018-2-16", "saleTransDate": "2018-2-16", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1806157015", "amount": 437600}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 547000, "salerecdate": "2018-3-2", "saledisclosuretype": 0, "saledocnum": "1806157014", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 208}}, "assessment": {"assessed": {"assdttlvalue": 59710}, "market": {"mktttlvalue": 597100}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:45", "transactionID": "c01c1f9a-adb9-4cfe-8858-37eaec304c8c"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4363572917031, "fips": "17031", "apn": "18093170170000", "attomID": 43635729}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.231, "lotsize2": 10064}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1116 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1116 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "6627", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.793689", "longitude": "-87.876342", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1957, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 6 IN VIRENS SUB RESUB OF LOTS 3 & 4 IN FLORENCE HOME GARDENS SW4 SW4 OF SEC09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1965, "groundfloorsize": 0, "livingsize": 1965, "sizeInd": "LIVING SQFT ", "universalsize": 1965}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 78, "value": 
506700, "high": 577800, "low": 394300, "fsd": 22.0}}, "sale": {"salesearchdate": "2018-3-21", "saleTransDate": "2018-3-21", "mortgage": {"FirstConcurrent": {"amount": 534215}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 547500, "salerecdate": "2018-4-9", "saledisclosuretype": 0, "saledocnum": "1809912058", "saletranstype": "Resale"}, "calculation": {"priceperbed": 136875, "pricepersizeunit": 279}}, "assessment": {"assessed": {"assdttlvalue": 53074}, "market": {"mktttlvalue": 530740}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:48", "transactionID": "9ff01e31-391b-4b0b-9c3c-1eb84401efd7"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 14757480317031, "fips": "17031", "apn": "18093130260000", "attomID": 147574803}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1977, "lotsize2": 8610}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1021 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1021 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2834", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.795869", "longitude": "-87.872456", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1964, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1150, "groundfloorsize": 0, "livingsize": 1150, "sizeInd": "LIVING SQFT ", "universalsize": 1150}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 0, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 75, "value": 403500, "high": 492600, "low": 304500, "fsd": 25.0}}, "sale": {"buyerName": "PERRI,CHRISTOPHER K|FULLER,HOPE O", "sellerName": "BACH,HAROLD H 
IV & ROCHELLE", "salesearchdate": "2018-6-4", "saleTransDate": "2018-6-4", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1817329154", "amount": 347600}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 434500, "salerecdate": "2018-6-22", "saledisclosuretype": 0, "saledocnum": "1817329153", "saletranstype": "Resale"}, "calculation": {"priceperbed": 144833, "pricepersizeunit": 378}}, "assessment": {"assessed": {"assdttlvalue": 33789}, "market": {"mktttlvalue": 337890}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:51", "transactionID": "0637feac-69b6-40ad-a082-b66ec2b60dc9"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 2209292517031, "fips": "17031", "apn": "18091140300000", "attomID": 22092925}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.2118, "lotsize2": 9225}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "634 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "634 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2803", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802588", "longitude": "-87.870829", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "(COUNTRY) (CLUB) ADD TO (LAGRANGE) SUB OF EH NW SEC 0 9-38-12"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1594, "groundfloorsize": 0, "livingsize": 1594, "sizeInd": "LIVING SQFT ", "universalsize": 1594}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 79, "value": 397500, "high": 475500, 
"low": 312500, "fsd": 21.0}}, "sale": {"buyerName": "VOLZ,CARL E & SHARON N", "sellerName": "WHALEY,BARBARA A", "salesearchdate": "2018-6-26", "saleTransDate": "2018-6-26", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1817955027", "amount": 372099}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 411500, "salerecdate": "2018-6-28", "saledisclosuretype": 0, "saledocnum": "1817955025", "saletranstype": "Resale"}, "calculation": {"priceperbed": 137167, "pricepersizeunit": 258}}, "assessment": {"assessed": {"assdttlvalue": 38630}, "market": {"mktttlvalue": 386300}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:54", "transactionID": "ab9941ab-e2fa-4e26-aa10-d69e1223eb24"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524240417031, "fips": "17031", "apn": "18091300190000", "attomID": 175242404}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "824 S MADISON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "824 S MADISON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2807", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799145", "longitude": "-87.870679", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1951, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L7 B15 COUNTRY CLUB ADDT TO LA GRAN GE S9 T38N R12E"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1601, "groundfloorsize": 0, "livingsize": 1601, "sizeInd": "LIVING SQFT ", "universalsize": 1601}, "rooms": {"bathstotal": 3.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 79, "value": 453100, "high": 548100, "low": 355900, "fsd": 21.0}}, "sale": 
{"buyerName": "SYRUP,KRISTA A", "sellerName": "NADDY,SEAN M & MAURA A", "salesearchdate": "2018-4-23", "saleTransDate": "2018-4-23", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1813022048", "amount": 339200}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 424000, "salerecdate": "2018-5-10", "saledisclosuretype": 0, "saledocnum": "1813022047", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 265}}, "assessment": {"assessed": {"assdttlvalue": 38826}, "market": {"mktttlvalue": 388260}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:56", "transactionID": "3c1f92f1-eaac-4e8e-aa69-8039dab83be2"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4359978217031, "fips": "17031", "apn": "18091120150000", "attomID": 43599782}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "610 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "610 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2825", "postal3": "C020"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.802980", "longitude": "-87.873147", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1930, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2NW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1700, "groundfloorsize": 0, "livingsize": 1700, "sizeInd": "LIVING SQFT ", "universalsize": 1700}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 81, "value": 450500, "high": 517100, "low": 363800, "fsd": 19.0}}, "sale": 
{"buyerName": "KUNKEL,ROBERT & EMILY", "sellerName": "MCCRACKEN,SEAN J & KRISTIN J", "salesearchdate": "2018-4-9", "saleTransDate": "2018-4-9", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 500000, "salerecdate": "2018-4-18", "saledisclosuretype": 0, "saledocnum": "1810849058", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 294}}, "assessment": {"assessed": {"assdttlvalue": 41512}, "market": {"mktttlvalue": 415120}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:01:59", "transactionID": "b6b73238-db00-41fe-ae42-fd96cef293c6"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524233717031, "fips": "17031", "apn": "18091160130000", "attomID": 175242337}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1519, "lotsize2": 6615}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "700 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "700 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2724", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.801252", "longitude": "-87.877914", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1977, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L1 B6 H O STONE & COS BRAINARD PARK W1/2 W1/2 NW1/4 W1/2 NW1/4 SW1/4 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2352, "groundfloorsize": 0, "livingsize": 2352, "sizeInd": "LIVING SQFT ", "universalsize": 2352}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 85, "value": 589600, 
"high": 676000, "low": 500600, "fsd": 15.0}}, "sale": {"buyerName": "HALL,CHRISTOPHER L & ERIN M", "sellerName": "MARSKE,SCOTT & THERESE A", "salesearchdate": "2017-4-5", "saleTransDate": "2017-4-5", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1710246040", "amount": 401600}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 502000, "salerecdate": "2017-4-12", "saledisclosuretype": 0, "saledocnum": "1710246039", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 213}}, "assessment": {"assessed": {"assdttlvalue": 60277}, "market": {"mktttlvalue": 602770}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:01", "transactionID": "5668d333-2bee-4522-a58d-b725eb1bb4eb"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17525073017031, "fips": "17031", "apn": "18093300010000", "attomID": 175250730}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.2317, "lotsize2": 10094}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1201 S SPRING AVE", "line2": "COUNTRYSIDE, IL 60525", "locality": "Countryside", "matchCode": "ExaStr", "oneLine": "1201 S SPRING AVE, COUNTRYSIDE, IL 60525", "postal1": "60525", "postal2": "6604", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.792208", "longitude": "-87.874708", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000177226, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1955, "propLandUse": "SFR", "propIndicator": "10", "legal1": "(PAULINES) SUB IN SW 1/4 OF SW 1/4 SEC 09-38-12"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "WOOD"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1512, "groundfloorsize": 0, "livingsize": 1512, "sizeInd": "LIVING SQFT ", "universalsize": 1512}, "rooms": {"bathstotal": 1.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 73, "value": 361100, "high": 430600, "low": 263600, 
"fsd": 27.0}}, "sale": {"buyerName": "KRUG,MAUREEN", "sellerName": "CHROMNIAK,ANEIA", "salesearchdate": "2018-5-29", "saleTransDate": "2018-5-29", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1815555012", "amount": 160000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 475000, "salerecdate": "2018-6-4", "saledisclosuretype": 0, "saledocnum": "1815555011", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 314}}, "assessment": {"assessed": {"assdttlvalue": 28477}, "market": {"mktttlvalue": 284770}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:04", "transactionID": "c5ba26bd-d4fb-4713-ae8b-45b28964bb12"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 14942575917031, "fips": "17031", "apn": "18091200100000", "attomID": 149425759}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "737 S KENSINGTON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "737 S KENSINGTON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2708", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800643", "longitude": "-87.873823", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1951, "propLandUse": "SFR", "propIndicator": "10", "legal1": "(COUNTRY) (CLUB) ADD TO (LAGRANGE) SUB OF EH NW SEC 0 9-38-12"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1950, "groundfloorsize": 0, "livingsize": 1950, "sizeInd": "LIVING SQFT ", "universalsize": 1950}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 83, "value": 490500, "high": 
550400, "low": 405000, "fsd": 17.0}}, "sale": {"sellerName": "SOMMERFELD,MICHAEL & JEANNE", "salesearchdate": "2018-3-17", "saleTransDate": "2018-3-17", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1808547067", "amount": 393750}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 525000, "salerecdate": "2018-3-26", "saledisclosuretype": 0, "saledocnum": "1808547066", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 269}}, "assessment": {"assessed": {"assdttlvalue": 42499}, "market": {"mktttlvalue": 424990}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:07", "transactionID": "ad90e5cb-f9cc-48fc-8ad6-bf2902ca1557"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 15516024717031, "fips": "17031", "apn": "18091080010000", "attomID": 155160247}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1517, "lotsize2": 6609}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "601 S BRAINARD AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "601 S BRAINARD AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2744", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.803075", "longitude": "-87.878876", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1937, "propLandUse": "SFR", "propIndicator": "10", "legal1": "H O (STONE) &COS (BRAINARD) PARK SUB OF WH WH NW SEC 09-38-12"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2920, "groundfloorsize": 0, "livingsize": 2920, "sizeInd": "LIVING SQFT ", "universalsize": 2920}, "rooms": {"bathstotal": 3.0, "beds": 5, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 77, "value": 
690300, "high": 848800, "low": 547600, "fsd": 23.0}}, "sale": {"sellerName": "MOVRICH CARL & SHARON TRUST", "salesearchdate": "2018-10-19", "saleTransDate": "2018-10-19", "mortgage": {"FirstConcurrent": {"amount": 497000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 535000, "salerecdate": "2018-11-1", "saledisclosuretype": 0, "saledocnum": "1830516005", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 183}}, "assessment": {"assessed": {"assdttlvalue": 59616}, "market": {"mktttlvalue": 596160}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:10", "transactionID": "e6bf3555-62bf-4cce-a8a6-122ffa9d7f04"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 78729217031, "fips": "17031", "apn": "18093040020000", "attomID": 787292}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "907 S KENSINGTON AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "907 S KENSINGTON AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2712", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.797564", "longitude": "-87.873722", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1970, "propLandUse": "SFR", "propIndicator": "10", "legal1": "ALBERT (ANDERSONS) SUB OF N 25 ACS OF EH OF SW 1/4 SE C 09-38-12"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1700, "groundfloorsize": 0, "livingsize": 1700, "sizeInd": "LIVING SQFT ", "universalsize": 1700}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 79, "value": 514600, "high": 584000, "low": 406900, "fsd": 21.0}}, 
"sale": {"buyerName": "JANKA,JEFFREY & DIANE W", "sellerName": "KERNAGIS,ANTHONY M & AMANDA K", "salesearchdate": "2018-6-27", "saleTransDate": "2018-6-27", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1818355024", "amount": 397800}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 468000, "salerecdate": "2018-7-2", "saledisclosuretype": 0, "saledocnum": "1818355023", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 275}}, "assessment": {"assessed": {"assdttlvalue": 44291}, "market": {"mktttlvalue": 442910}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:12", "transactionID": "eddc8997-3acc-4e0d-9127-06bf42a62e3c"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 3877474717031, "fips": "17031", "apn": "18091060090000", "attomID": 38774747}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "531 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "531 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2812", "postal3": "C020"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.804478", "longitude": "-87.871705", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 16 IN BLK 2 IN COUNTRY CLUB ADD TO LAGRANGE OF E2 NW4 OF SEC09 T38 N R12E 3P"}, "utilities": {"coolingtype": "TYPE UNKNOWN", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1673, "groundfloorsize": 0, "livingsize": 1673, "sizeInd": "LIVING SQFT ", "universalsize": 1673}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 0, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 77, "value": 450100, "high": 552200, "low": 363800, 
"fsd": 23.0}}, "sale": {"sellerName": "QUINN,MATTHEW & LESLIE", "salesearchdate": "2005-12-16", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1822844017", "amount": 399200}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 0, "salerecdate": "2005-12-16", "saledisclosuretype": 0, "saledocnum": "0505035047", "saletranstype": "Nominal - Non/Arms Length Sale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 0}}, "assessment": {"assessed": {"assdttlvalue": 42568}, "market": {"mktttlvalue": 425680}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:15", "transactionID": "0df14f56-4a15-4617-bcff-fbdd22c3d377"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4361745017031, "fips": "17031", "apn": "18091300060000", "attomID": 43617450}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "821 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "821 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2818", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.799173", "longitude": "-87.871470", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1950, "propLandUse": "SFR", "propIndicator": "10", "legal1": "E2NW4 S09 T38N R12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1404, "groundfloorsize": 0, "livingsize": 1404, "sizeInd": "LIVING SQFT ", "universalsize": 1404}, "rooms": {"bathstotal": 2.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 0}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 79, "value": 420600, "high": 507800, "low": 335100, "fsd": 21.0}}, "sale": {"buyerName": "NADDY,MAURA A & SEAN M", 
"sellerName": "RENEHAN,MEGAN", "salesearchdate": "2018-4-25", "saleTransDate": "2018-4-25", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 485000, "salerecdate": "2018-4-26", "saledisclosuretype": 0, "saledocnum": "1811655186", "saletranstype": "Resale"}, "calculation": {"priceperbed": 161667, "pricepersizeunit": 345}}, "assessment": {"assessed": {"assdttlvalue": 38127}, "market": {"mktttlvalue": 381270}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:18", "transactionID": "bb5bfe28-d906-4ee2-8440-ee74c45167d9"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17394145117031, "fips": "17031", "apn": "18091200220000", "attomID": 173941451}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1412, "lotsize2": 6150}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "738 S CATHERINE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "738 S CATHERINE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2827", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.800640", "longitude": "-87.873049", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "ABSENTEE(MAIL AND SITUS NOT =)", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1949, "propLandUse": "SFR", "propIndicator": "10"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1512, "groundfloorsize": 0, "livingsize": 1512, "sizeInd": "LIVING SQFT ", "universalsize": 1512}, "rooms": {"bathstotal": 1.0, "beds": 3, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 77, "value": 408500, "high": 460800, "low": 314400, "fsd": 23.0}}, "sale": {"buyerName": 
"KOPP,DAVID E & CLARE D", "sellerName": "KOPP,DAVID E & CLARE D", "salesearchdate": "2018-11-7", "saleTransDate": "2018-11-7", "mortgage": {"FirstConcurrent": {"amount": 0}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 475000, "salerecdate": "2018-11-13", "saledisclosuretype": 0, "saledocnum": "1831708048", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 314}}, "assessment": {"assessed": {"assdttlvalue": 34250}, "market": {"mktttlvalue": 342500}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:21", "transactionID": "9c180c9b-34e6-4e77-af1b-bb33ce225c8e"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 14760732317031, "fips": "17031", "apn": "18093090140000", "attomID": 147607323}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1538, "lotsize2": 6700}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "1004 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "1004 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2741", "postal3": "C072"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.795747", "longitude": "-87.876450", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "H O STONE & COMPANYS BRAI BLK 12 LOT 2"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "BRICK/STONE"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1203, "groundfloorsize": 0, "livingsize": 1203, "sizeInd": "LIVING SQFT ", "universalsize": 1203}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 1, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 76, "value": 374900, "high": 425300, "low": 285000, "fsd": 
24.0}}, "sale": {"buyerName": "MALONEY,ROBERT & MARY", "sellerName": "MALONEY,ROBERT & MARY", "salesearchdate": "2018-8-23", "saleTransDate": "2018-8-23", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1824357066", "amount": 348000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 435000, "salerecdate": "2018-8-31", "saledisclosuretype": 0, "saledocnum": "1824357065", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 362}}, "assessment": {"assessed": {"assdttlvalue": 30214}, "market": {"mktttlvalue": 302140}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:23", "transactionID": "d8d61e4d-7458-4fc7-b9f5-9aaf34228174"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524250617031, "fips": "17031", "apn": "18093010190000", "attomID": 175242506}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1538, "lotsize2": 6700}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "924 S WAIOLA AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "924 S WAIOLA AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2739", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.797175", "longitude": "-87.876493", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1959, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L7 B9 H E STONE & COMPANYS BRAINARD PARK SUBD W1/2 W1/2 NW1/4 W1/2 NW 1/4 W1/2 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1789, "groundfloorsize": 0, "livingsize": 1789, "sizeInd": "LIVING SQFT ", "universalsize": 1789}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 80, 
"value": 520800, "high": 604800, "low": 414800, "fsd": 20.0}}, "sale": {"buyerName": "SACKS,JASON & COLLEEN", "sellerName": "SMITH,DAVID J & CATHERINE E", "salesearchdate": "2018-5-31", "saleTransDate": "2018-5-31", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1815549320", "amount": 400610}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 413000, "salerecdate": "2018-6-4", "saledisclosuretype": 0, "saledocnum": "1815549319", "saletranstype": "Resale"}, "calculation": {"priceperbed": 103250, "pricepersizeunit": 231}}, "assessment": {"assessed": {"assdttlvalue": 46780}, "market": {"mktttlvalue": 467800}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:26", "transactionID": "63d6549b-ae79-4a1b-ae83-657f068e5208"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17484769817031, "fips": "17031", "apn": "18091130130000", "attomID": 174847698}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1388, "lotsize2": 6047}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "600 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "600 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2813", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.803203", "longitude": "-87.872024", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000187321, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1949, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 1 IN BLK 6 IN COUNTRY CLUB ADD TO LA GRANGE E2 NW4 OF SEC09 T38N R 12E 3P"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2600, "groundfloorsize": 0, "livingsize": 2600, "sizeInd": "LIVING SQFT ", "universalsize": 2600}, "rooms": {"bathstotal": 3.0, "beds": 5, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 80, "value": 565200, 
"high": 665200, "low": 453800, "fsd": 20.0}}, "sale": {"buyerName": "MCKEEHAN,NICHOLAS & GUADALUPE", "sellerName": "COWHEY,DENNIS P & JANET L", "salesearchdate": "2016-9-16", "saleTransDate": "2016-9-16", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1629210014", "amount": 336000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 420000, "salerecdate": "2016-10-18", "saledisclosuretype": 0, "saledocnum": "1629210013", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 162}}, "assessment": {"assessed": {"assdttlvalue": 54320}, "market": {"mktttlvalue": 543200}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:28", "transactionID": "53d5da34-ab13-4bf7-abc4-e09118dc5b0d"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 17524255717031, "fips": "17031", "apn": "18093050320000", "attomID": 175242557}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.1977, "lotsize2": 8610}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "928 S ASHLAND AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "928 S ASHLAND AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2819", "postal3": "C012"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.797190", "longitude": "-87.871753", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1954, "propLandUse": "SFR", "propIndicator": "10", "legal1": "LOT 15 EXCEPT S 35 FT & LOT 16 & 17 IN BLK 2 IN ALBERT ANDERSON'S SUB OF N 25 ACRES OF E2 SW4 OF SEC09 T"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 2268, "groundfloorsize": 0, "livingsize": 2268, "sizeInd": "LIVING SQFT ", "universalsize": 2268}, "rooms": {"bathstotal": 3.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE - ATTACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 74, 
"value": 546700, "high": 653400, "low": 406400, "fsd": 26.0}}, "sale": {"buyerName": "MYERS,MICHAEL J & LISA M", "sellerName": "OSHEA,MARTIN M & ROBERTA", "salesearchdate": "2016-12-19", "saleTransDate": "2016-12-19", "mortgage": {"FirstConcurrent": {"trustDeedDocumentNumber": "1636419130", "amount": 372000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 465000, "salerecdate": "2016-12-29", "saledisclosuretype": 0, "saledocnum": "1636419129", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 205}}, "assessment": {"assessed": {"assdttlvalue": 46362}, "market": {"mktttlvalue": 463620}}, "owner": {}}]},
{"status": {"version": "1.0.0", "code": 0, "msg": "SuccessWithResult", "total": 1, "page": 1, "pagesize": 10, "responseDateTime": "2019-6-20T05:02:31", "transactionID": "439529f0-475e-40d3-8f96-bfe9c56ea45d"}, "echoed_fields": {"jobID": "", "loanNumber": "", "preparedBy": "", "resellerID": "", "preparedFor": ""}, "property": [{"identifier": {"obPropId": 4364271417031, "fips": "17031", "apn": "18091000160000", "attomID": 43642714}, "lot": {"depth": 0, "frontage": 0, "lotsize1": 0.155, "lotsize2": 6750}, "area": {"countrysecsubd": "Cook County"}, "address": {"country": "US", "countrySubd": "IL", "line1": "512 S STONE AVE", "line2": "LA GRANGE, IL 60525", "locality": "La Grange", "matchCode": "ExaStr", "oneLine": "512 S STONE AVE, LA GRANGE, IL 60525", "postal1": "60525", "postal2": "2720", "postal3": "C040"}, "location": {"accuracy": "Street", "elevation": 0.0, "latitude": "41.804688", "longitude": "-87.878073", "distance": 0.0, "geoid": "CO17031, CS1745447, DB1721630, DB1723880, MT30001048, ND0004439079, PL1740767, RS0000157153, SB0000085034, SB0000085035, SB0000085135, SB0000085136, ZI60525"}, "summary": {"absenteeInd": "OWNER OCCUPIED", "propclass": "Single Family Residence / Townhouse", "propsubtype": "HOUSE", "proptype": "SFR", "yearbuilt": 1942, "propLandUse": "SFR", "propIndicator": "10", "legal1": "L4 B2 STONE & COS BRAINARD PARK SUB D W1/2 W1/2 NW1/4 & W1/2 NW1/4 SW1/ 4 S9 T38N R12E"}, "utilities": {"coolingtype": "AC.CENTRAL", "walltype": "FRAME/MASONRY"}, "building": {"size": {"bldgsize": 0, "grosssize": 0, "grosssizeadjusted": 1759, "groundfloorsize": 0, "livingsize": 1759, "sizeInd": "LIVING SQFT ", "universalsize": 1759}, "rooms": {"bathstotal": 2.0, "beds": 4, "roomsTotal": 0}, "interior": {"bsmtsize": 0, "fplccount": 1, "fplcind": "Y", "fplctype": "YES"}, "parking": {"prkgSize": 0, "prkgType": "GARAGE DETACHED"}, "summary": {"levels": 2, "storyDesc": "HOUSE", "unitsCount": "1"}}, "avm": {"eventDate": "2019-3-22", "amount": {"scr": 82, "value": 467100, 
"high": 527000, "low": 385400, "fsd": 18.0}}, "sale": {"buyerName": "GAMACHE,THOMAS A|FITTS-GAMACHE,LARA E", "sellerName": "SZYMANSKI,BARTLOMIEJ & KATHLEEN", "salesearchdate": "2016-7-11", "saleTransDate": "2016-7-11", "mortgage": {"FirstConcurrent": {"amount": 430000}, "SecondConcurrent": {"amount": 0}, "ThirdConcurrent": {"amount": 0}}, "amount": {"saleamt": 538000, "salerecdate": "2016-7-13", "saledisclosuretype": 0, "saledocnum": "1619539051", "saletranstype": "Resale"}, "calculation": {"priceperbed": 0, "pricepersizeunit": 306}}, "assessment": {"assessed": {"assdttlvalue": 42786}, "market": {"mktttlvalue": 427860}}, "owner": {}}]},
] | 2,487.909091 | 4,862 | 0.648628 | 17,614 | 164,202 | 6.043148 | 0.116385 | 0.003608 | 0.011274 | 0.01691 | 0.730995 | 0.712882 | 0.688653 | 0.682284 | 0.67023 | 0.667816 | 0 | 0.165416 | 0.100273 | 164,202 | 66 | 4,863 | 2,487.909091 | 0.555081 | 0.000128 | 0 | 0 | 0 | 1.349206 | 0.638363 | 0.018772 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
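The rows above carry ATTOM-style property API responses as their `content`: a `status` envelope plus a `property` list holding nested `address`, `avm`, and `sale` objects. A minimal sketch of pulling the headline fields out of one such record (field names and values taken from the records above):

```python
import json

# Abbreviated copy of one response from the rows above.
record = json.loads("""{
  "status": {"code": 0, "msg": "SuccessWithResult"},
  "property": [{
    "address": {"oneLine": "1004 S WAIOLA AVE, LA GRANGE, IL 60525"},
    "avm": {"amount": {"value": 374900, "high": 425300, "low": 285000}},
    "sale": {"amount": {"saleamt": 435000}}
  }]
}""")

# Each response wraps a single-element "property" list.
prop = record["property"][0]
print(prop["address"]["oneLine"], prop["avm"]["amount"]["value"])
```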
58724741d31049379691c8e1ca7683942d16c3e9 | 11,207 | py | Python | wagl/longitude_latitude_arrays.py | Oceancolour-RG/wagl | f002a1c0a373d21758d44d2a808bdfd755d90226 | [
"Apache-2.0"
] | null | null | null | wagl/longitude_latitude_arrays.py | Oceancolour-RG/wagl | f002a1c0a373d21758d44d2a808bdfd755d90226 | [
"Apache-2.0"
] | null | null | null | wagl/longitude_latitude_arrays.py | Oceancolour-RG/wagl | f002a1c0a373d21758d44d2a808bdfd755d90226 | [
"Apache-2.0"
] | 1 | 2019-01-23T00:51:56.000Z | 2019-01-23T00:51:56.000Z | #!/usr/bin/env python
"""
Longitude and Latitude 2D grid creation.
"""
from __future__ import absolute_import, print_function
from functools import partial
import numpy
from osgeo import osr  # bare "import osr" no longer resolves with modern GDAL bindings
import h5py
from wagl.constants import DatasetName, GroupName
from wagl.interpolation import interpolate_grid
from wagl.hdf5 import H5CompressionFilter, attach_image_attributes
CRS = "EPSG:4326"
LON_DESC = "Contains the longitude values for each pixel."
LAT_DESC = "Contains the latitude values for each pixel."
def get_lon_coordinate(y, x, geobox, geo_crs=None, centre=False):
"""
Given an image/array y & x co-ordinate return the corresponding
longitude co-ordinate. The y, x style mimics Python indices.
:param y:
An integer representing an image/array row coordinate.
:param x:
An integer representing an image/array column coordinate.
:param geobox:
An instance of a GriddedGeoBox object.
:param geo_crs:
An instance of a defined geographic osr.SpatialReference
object. If set to None (Default), then geo_crs will be set
to WGS84.
:param centre:
A boolean indicating whether or not the returned co-ordinate
should reference the centre of a pixel, in which case a 0.5
offset is applied in the x & y directions. Default is False.
:return:
A floating point value representing the longitude
co-ordinate.
"""
if geo_crs is None:
geo_crs = osr.SpatialReference()
geo_crs.SetFromUserInput(CRS)
xy = (x, y)
mapx, mapy = geobox.convert_coordinates(xy, to_map=True, centre=centre)
x = geobox.transform_coordinates((mapx, mapy), geo_crs)[0]
return x
def get_lat_coordinate(y, x, geobox, geo_crs=None, centre=False):
"""
Given an image/array y & x co-ordinate return the corresponding
latitude co-ordinate. The y, x style mimics Python indices.
:param y:
An integer representing an image/array row coordinate.
:param x:
An integer representing an image/array column coordinate.
:param geobox:
An instance of a GriddedGeoBox object.
:param geo_crs:
An instance of a defined geographic osr.SpatialReference
object. If set to None (Default), then geo_crs will be set
to WGS84.
:param centre:
A boolean indicating whether or not the returned co-ordinate
should reference the centre of a pixel, in which case a 0.5
offset is applied in the x & y directions. Default is False.
:return:
A floating point value representing the latitude
co-ordinate.
"""
if geo_crs is None:
geo_crs = osr.SpatialReference()
geo_crs.SetFromUserInput(CRS)
xy = (x, y)
mapx, mapy = geobox.convert_coordinates(xy, to_map=True, centre=centre)
y = geobox.transform_coordinates((mapx, mapy), geo_crs)[1]
return y
def _create_lon_lat_grids(acquisition, out_fname=None,
compression=H5CompressionFilter.LZF,
filter_opts=None, depth=7):
"""
A private wrapper for dealing with the internal custom workings of the
multifile workflow.
"""
with h5py.File(out_fname, 'w') as fid:
create_lon_lat_grids(acquisition, fid, compression, filter_opts, depth)
def create_lon_lat_grids(acquisition, out_group=None,
compression=H5CompressionFilter.LZF,
filter_opts=None, depth=7):
"""
Creates two 2D NumPy arrays containing longitude and latitude
co-ordinates for each array element.
:param acquisition:
An instance of an `Acquisition` object.
:param out_group:
If set to None (default) then the results will be returned
as an in-memory hdf5 file, i.e. the `core` driver. Otherwise,
a writeable HDF5 `Group` object.
The dataset names will be given by:
* constants.DatasetName.LON.value
* constants.DatasetName.LAT.value
:param compression:
The compression filter to use.
Default is H5CompressionFilter.LZF
:param filter_opts:
A dict of key value pairs available to the given configuration
instance of H5CompressionFilter. For example
H5CompressionFilter.LZF has the keywords *chunks* and *shuffle*
available.
Default is None, which will use the default settings for the
chosen H5CompressionFilter instance.
:return:
An opened `h5py.File` object, that is either in-memory using the
`core` driver, or on disk.
"""
geobox = acquisition.gridded_geo_box()
# Define the lon and lat transform functions
lon_func = partial(get_lon_coordinate, geobox=geobox, centre=True)
lat_func = partial(get_lat_coordinate, geobox=geobox, centre=True)
# Get some basic info about the image
shape = geobox.get_shape_yx()
# Initialise the array to contain the result
result = numpy.zeros(shape, dtype='float64')
interpolate_grid(result, lon_func, depth=depth, origin=(0, 0), shape=shape)
# Initialise the output files
if out_group is None:
fid = h5py.File('longitude-latitude.h5', driver='core',
backing_store=False)
else:
fid = out_group
if GroupName.LON_LAT_GROUP.value not in fid:
fid.create_group(GroupName.LON_LAT_GROUP.value)
grp = fid[GroupName.LON_LAT_GROUP.value]
# define some base attributes for the image datasets
attrs = {'crs_wkt': geobox.crs.ExportToWkt(),
'geotransform': geobox.transform.to_gdal(),
'description': LON_DESC}
if filter_opts is None:
filter_opts = {}
filter_opts['chunks'] = acquisition.tile_size
kwargs = compression.config(**filter_opts).dataset_compression_kwargs()
lon_dset = grp.create_dataset(DatasetName.LON.value, data=result, **kwargs)
attach_image_attributes(lon_dset, attrs)
result = numpy.zeros(shape, dtype='float64')
interpolate_grid(result, lat_func, depth=depth, origin=(0, 0), shape=shape)
attrs['description'] = LAT_DESC
lat_dset = grp.create_dataset(DatasetName.LAT.value, data=result, **kwargs)
attach_image_attributes(lat_dset, attrs)
return fid
def create_grid(geobox, coord_fn, depth=7):
"""
Interpolates a `NumPy` array based on the input coordinate function
`coord_fn`.
:param geobox:
An instance of a `GriddedGeoBox` object.
:param coord_fn:
A function that maps coordinates.
:return:
A `NumPy` array.
"""
# Define the transform functions
func = partial(coord_fn, geobox=geobox, centre=True)
# Get some basic info about the image
shape = geobox.get_shape_yx()
# Initialise the array to contain the result
arr = numpy.zeros(shape, dtype='float64')
interpolate_grid(arr, func, depth=depth, origin=(0, 0), shape=shape)
return arr
def create_lon_grid(acquisition, out_fname=None,
compression=H5CompressionFilter.LZF, filter_opts=None,
depth=7):
"""Create longitude grid.
:param acquisition:
An instance of an `Acquisition` object.
:param out_fname:
If set to None (default) then the results will be returned
as an in-memory hdf5 file, i.e. the `core` driver.
Otherwise it should be a string containing the full file path
name to a writeable location on disk in which to save the HDF5
file.
The dataset path names will be as follows:
* constants.DatasetName.LON.value
:param compression:
The compression filter to use.
Default is H5CompressionFilter.LZF
:param filter_opts:
A dict of key value pairs available to the given configuration
instance of H5CompressionFilter. For example
H5CompressionFilter.LZF has the keywords *chunks* and *shuffle*
available.
Default is None, which will use the default settings for the
chosen H5CompressionFilter instance.
:return:
An opened `h5py.File` object, that is either in-memory using the
`core` driver, or on disk.
"""
# Initialise the output files
if out_fname is None:
fid = h5py.File('longitude.h5', driver='core',
backing_store=False)
else:
fid = h5py.File(out_fname, 'w')
geobox = acquisition.gridded_geo_box()
# define some base attributes for the image datasets
attrs = {'crs_wkt': geobox.crs.ExportToWkt(),
'geotransform': geobox.transform.to_gdal(),
'description': LON_DESC}
if filter_opts is None:
filter_opts = {}
filter_opts['chunks'] = acquisition.tile_size
kwargs = compression.config(**filter_opts).dataset_compression_kwargs()
lon_grid = create_grid(geobox, get_lon_coordinate, depth)
grp = fid.create_group(GroupName.LON_LAT_GROUP.value)
dset = grp.create_dataset(DatasetName.LON.value, data=lon_grid, **kwargs)
attach_image_attributes(dset, attrs)
return fid
def create_lat_grid(acquisition, out_fname=None,
compression=H5CompressionFilter.LZF, filter_opts=None,
depth=7):
"""Create latitude grid.
:param acquisition:
An instance of an `Acquisition` object.
:param out_fname:
If set to None (default) then the results will be returned
as an in-memory hdf5 file, i.e. the `core` driver.
Otherwise it should be a string containing the full file path
name to a writeable location on disk in which to save the HDF5
file.
The dataset path names will be as follows:
* constants.DatasetName.LAT.value
:param compression:
The compression filter to use.
Default is H5CompressionFilter.LZF
:param filter_opts:
A dict of key value pairs available to the given configuration
instance of H5CompressionFilter. For example
H5CompressionFilter.LZF has the keywords *chunks* and *shuffle*
available.
Default is None, which will use the default settings for the
chosen H5CompressionFilter instance.
:return:
An opened `h5py.File` object, that is either in-memory using the
`core` driver, or on disk.
"""
# Initialise the output files
if out_fname is None:
fid = h5py.File('latitude.h5', driver='core',
backing_store=False)
else:
fid = h5py.File(out_fname, 'w')
geobox = acquisition.gridded_geo_box()
# define some base attributes for the image datasets
attrs = {'crs_wkt': geobox.crs.ExportToWkt(),
'geotransform': geobox.transform.to_gdal(),
'description': LAT_DESC}
if filter_opts is None:
filter_opts = {}
filter_opts['chunks'] = acquisition.tile_size
kwargs = compression.config(**filter_opts).dataset_compression_kwargs()
lat_grid = create_grid(geobox, get_lat_coordinate, depth)
grp = fid.create_group(GroupName.LON_LAT_GROUP.value)
dset = grp.create_dataset(DatasetName.LAT.value, data=lat_grid, **kwargs)
attach_image_attributes(dset, attrs)
return fid
| 32.29683 | 79 | 0.671188 | 1,460 | 11,207 | 5.035616 | 0.158219 | 0.027203 | 0.013058 | 0.030468 | 0.840452 | 0.803727 | 0.785229 | 0.752312 | 0.718988 | 0.676143 | 0 | 0.008469 | 0.251896 | 11,207 | 346 | 80 | 32.390173 | 0.86844 | 0.476756 | 0 | 0.553571 | 0 | 0 | 0.056069 | 0.003965 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.071429 | 0 | 0.1875 | 0.008929 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
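The wagl module above avoids calling the expensive per-pixel coordinate transform everywhere: `interpolate_grid` evaluates the coordinate function at a sparse set of points and fills the rest by interpolation. A minimal pure-Python sketch of that idea, with toy stand-ins (`affine_lon` and `interp_rows` are illustrative names, not the wagl API, and this does a single linear pass rather than wagl's depth-limited recursion):

```python
def affine_lon(y, x):
    """Toy stand-in for get_lon_coordinate: an affine array->map transform."""
    return -87.88 + x * 1e-4

def interp_rows(func, shape):
    """Evaluate func only at the first and last column of each row and
    linearly interpolate the columns in between."""
    rows, cols = shape
    grid = [[0.0] * cols for _ in range(rows)]
    for y in range(rows):
        left, right = func(y, 0), func(y, cols - 1)
        for x in range(cols):
            t = x / (cols - 1)
            grid[y][x] = left + t * (right - left)
    return grid

grid = interp_rows(affine_lon, (4, 5))
# For an affine mapping, linear interpolation reproduces the exact values,
# so the maximum error is at floating-point noise level.
print(max(abs(grid[y][x] - affine_lon(y, x)) for y in range(4) for x in range(5)))
```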
5464315b316772de03840ee8b854ff40843536af | 2,646 | py | Python | src/open_geodata/lyr/base.py | open-geodata/open-geodata | 609a040e06dab2e7c7c627b20640011a8b381c4d | [
"MIT"
] | null | null | null | src/open_geodata/lyr/base.py | open-geodata/open-geodata | 609a040e06dab2e7c7c627b20640011a8b381c4d | [
"MIT"
] | null | null | null | src/open_geodata/lyr/base.py | open-geodata/open-geodata | 609a040e06dab2e7c7c627b20640011a8b381c4d | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# coding: utf-8
import folium
def google_hybrid(min_zoom, max_zoom):
row = {
'link': 'https://mt1.google.com/vt/lyrs=y&x={x}&y={y}&z={z}',
'name': 'Google Hybrid',
'attribution': 'https://www.google.com/maps',
}
return folium.TileLayer(
tiles=row['link'],
attr=('<a href="{}" target="blank">{}</a>'.format(row['attribution'], row['name'])),
name=row['name'],
min_zoom=min_zoom,
max_zoom=max_zoom,
subdomains=['mt0', 'mt1', 'mt2', 'mt3'],
overlay=False,
control=True,
show=True,
)
def google_satellite(min_zoom, max_zoom):
row = {
'link': 'https://mt1.google.com/vt/lyrs=s&x={x}&y={y}&z={z}',
'name': 'Google Satellite',
'attribution': 'https://www.google.com/maps',
}
return folium.TileLayer(
tiles=row['link'],
attr=('<a href="{}" target="blank">{}</a>'.format(row['attribution'], row['name'])),
name=row['name'],
min_zoom=min_zoom,
max_zoom=max_zoom,
subdomains=['mt0', 'mt1', 'mt2', 'mt3'],
overlay=False,
control=True,
show=False,
)
def google_terrain(min_zoom, max_zoom):
row = {
'link': 'https://mt1.google.com/vt/lyrs=p&x={x}&y={y}&z={z}',
'name': 'Google Terrain',
'attribution': 'https://www.google.com/maps',
}
return folium.TileLayer(
tiles=row['link'],
attr=('<a href="{}" target="blank">{}</a>'.format(row['attribution'], row['name'])),
name=row['name'],
min_zoom=min_zoom,
max_zoom=max_zoom,
subdomains=['mt0', 'mt1', 'mt2', 'mt3'],
overlay=False,
control=True,
show=False,
)
def google_streets(min_zoom, max_zoom):
row = {
'link': 'https://mt1.google.com/vt/lyrs=m&x={x}&y={y}&z={z}',
'name': 'Google Streets',
'attribution': 'https://www.google.com/maps',
}
return folium.TileLayer(
tiles=row['link'],
attr=('<a href="{}" target="blank">{}</a>'.format(row['attribution'], row['name'])),
name=row['name'],
min_zoom=min_zoom,
max_zoom=max_zoom,
subdomains=['mt0', 'mt1', 'mt2', 'mt3'],
overlay=False,
control=True,
show=False,
)
def cartodb_positron(min_zoom, max_zoom):
return folium.TileLayer(
tiles='cartodbpositron',
attr='Carto',
name='CartoDB Positron',
min_zoom=min_zoom,
max_zoom=max_zoom,
overlay=False,
control=True,
show=False,
)
if __name__ == '__main__':
pass
| 26.727273 | 92 | 0.534769 | 324 | 2,646 | 4.234568 | 0.188272 | 0.076531 | 0.120262 | 0.102041 | 0.830904 | 0.830904 | 0.80758 | 0.80758 | 0.740525 | 0.740525 | 0 | 0.010753 | 0.261905 | 2,646 | 98 | 93 | 27 | 0.691756 | 0.01285 | 0 | 0.646341 | 0 | 0.04878 | 0.291188 | 0.032184 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060976 | false | 0.012195 | 0.012195 | 0.012195 | 0.134146 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
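The four `google_*` helpers in the module above differ only in the `lyrs` code, the display name, and the `show` flag, so the shared keyword arguments could be built once. A sketch of that refactor (the final `folium.TileLayer(**kwargs)` call is omitted so the snippet runs without folium installed; `google_layer_kwargs` is a hypothetical helper name):

```python
def google_layer_kwargs(lyrs, name, min_zoom, max_zoom, show=False):
    """Build the TileLayer keyword arguments shared by the four google_* helpers.
    In the real module these would be passed to folium.TileLayer(**kwargs)."""
    link = 'https://mt1.google.com/vt/lyrs={}&x={{x}}&y={{y}}&z={{z}}'.format(lyrs)
    attribution = 'https://www.google.com/maps'
    return {
        'tiles': link,
        'attr': '<a href="{}" target="blank">{}</a>'.format(attribution, name),
        'name': name,
        'min_zoom': min_zoom,
        'max_zoom': max_zoom,
        'subdomains': ['mt0', 'mt1', 'mt2', 'mt3'],
        'overlay': False,
        'control': True,
        'show': show,
    }

# 'y' is the hybrid layer code used by google_hybrid above.
kwargs = google_layer_kwargs('y', 'Google Hybrid', 0, 18, show=True)
print(kwargs['tiles'])
```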
5466dc571c7babd43d6031431d45675cfea9acda | 128 | py | Python | crawl_utils/file/__init__.py | cjr0707/CrawlUtils | 723f0b8ef2a617ff0ca1b51e35a5ded43ab76ff0 | [
"MIT"
] | 1 | 2021-03-11T03:00:10.000Z | 2021-03-11T03:00:10.000Z | crawl_utils/file/__init__.py | cjr0707/CrawlUtils | 723f0b8ef2a617ff0ca1b51e35a5ded43ab76ff0 | [
"MIT"
] | null | null | null | crawl_utils/file/__init__.py | cjr0707/CrawlUtils | 723f0b8ef2a617ff0ca1b51e35a5ded43ab76ff0 | [
"MIT"
] | null | null | null | from .extract_attachment import extract_attachment
from .extract_img import extract_img
from .extract_time import TimeExtractor
| 32 | 50 | 0.882813 | 17 | 128 | 6.352941 | 0.411765 | 0.305556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 128 | 3 | 51 | 42.666667 | 0.931034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
54774dbc5112f15c1ad0e9667599cae448e2d045 | 298 | py | Python | plotly/graph_objs/ohlc/__init__.py | gnestor/plotly.py | a8ae062795ddbf9867b8578fe6d9e244948c15ff | [
"MIT"
] | 12 | 2020-04-18T18:10:22.000Z | 2021-12-06T10:11:15.000Z | plotly/graph_objs/ohlc/__init__.py | Vesauza/plotly.py | e53e626d59495d440341751f60aeff73ff365c28 | [
"MIT"
] | 27 | 2020-04-28T21:23:12.000Z | 2021-06-25T15:36:38.000Z | plotly/graph_objs/ohlc/__init__.py | Vesauza/plotly.py | e53e626d59495d440341751f60aeff73ff365c28 | [
"MIT"
] | 6 | 2020-04-18T23:07:08.000Z | 2021-11-18T07:53:06.000Z | from ._stream import Stream
from ._line import Line
from ._increasing import Increasing
from plotly.graph_objs.ohlc import increasing
from ._hoverlabel import Hoverlabel
from plotly.graph_objs.ohlc import hoverlabel
from ._decreasing import Decreasing
from plotly.graph_objs.ohlc import decreasing
| 33.111111 | 45 | 0.855705 | 41 | 298 | 6.02439 | 0.268293 | 0.121457 | 0.182186 | 0.230769 | 0.352227 | 0.352227 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107383 | 298 | 8 | 46 | 37.25 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
548dd582bcb413de0a6a7e6d8c6fd1a9ee69f099 | 59 | py | Python | test_fixtures/general_repo/file_with_multiple_imports_on_same_line.py | SerejkaSJ/fiasko_bro | dfb8c30109f317c1e5b6d211e002fd148695809e | [
"MIT"
] | 25 | 2018-01-24T10:45:35.000Z | 2020-12-05T21:47:20.000Z | test_fixtures/general_repo/file_with_multiple_imports_on_same_line.py | SerejkaSJ/fiasko_bro | dfb8c30109f317c1e5b6d211e002fd148695809e | [
"MIT"
] | 110 | 2018-01-21T12:25:13.000Z | 2021-06-10T19:27:22.000Z | test_fixtures/general_repo/file_with_multiple_imports_on_same_line.py | SerejkaSJ/fiasko_bro | dfb8c30109f317c1e5b6d211e002fd148695809e | [
"MIT"
] | 13 | 2017-12-12T22:19:01.000Z | 2019-01-29T18:08:05.000Z | import foo
from foo import one, two, three
import foo, bar
| 14.75 | 31 | 0.762712 | 11 | 59 | 4.090909 | 0.636364 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186441 | 59 | 3 | 32 | 19.666667 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
54a75ecefd001ef0d4176bc17eef99ff4e29ff3b | 9,551 | py | Python | sfm/ui/test_jobs.py | AramZS/sfm-ui | accfa59bf203e5dee7db46c8072256700278b462 | [
"MIT"
] | 1 | 2021-01-09T04:58:17.000Z | 2021-01-09T04:58:17.000Z | sfm/ui/test_jobs.py | AramZS/sfm-ui | accfa59bf203e5dee7db46c8072256700278b462 | [
"MIT"
] | null | null | null | sfm/ui/test_jobs.py | AramZS/sfm-ui | accfa59bf203e5dee7db46c8072256700278b462 | [
"MIT"
] | null | null | null | from django.test import TestCase
from django.conf import settings
import json
from mock import MagicMock, patch
from .jobs import collection_harvest, collection_stop
from .models import Collection, CollectionSet, Seed, Credential, Group, User, Harvest
from .rabbit import RabbitWorker
class StartJobsTests(TestCase):
def setUp(self):
self.user = User.objects.create_superuser(username="test_user", email="test_user@test.com",
password="test_password")
        self.group = Group.objects.create(name="test_group")
        self.collection_set = CollectionSet.objects.create(group=self.group, name="test_collection_set")
        self.credential_token = {"key": "test_key"}
        self.credential = Credential.objects.create(user=self.user, platform="test_platform",
                                                    token=json.dumps(self.credential_token))
        self.harvest_options = {"test_option": "test_value"}

    @patch("ui.jobs.RabbitWorker", autospec=True)
    def test_collection_harvest(self, mock_rabbit_worker_class):
        collection = Collection.objects.create(collection_set=self.collection_set, credential=self.credential,
                                               harvest_type=Collection.TWITTER_USER_TIMELINE, name="test_collection",
                                               harvest_options=json.dumps(self.harvest_options), is_active=True)
        Seed.objects.create(collection=collection, token="test_token1", seed_id="1")
        Seed.objects.create(collection=collection, uid="test_uid2", seed_id="2")
        Seed.objects.create(collection=collection, token="test_token3", uid="test_uid3", seed_id="3")
        # Create an inactive seed, which the harvest should ignore
        Seed.objects.create(collection=collection, uid="test_uid4", seed_id="4", is_active=False)
        mock_rabbit_worker = MagicMock(spec=RabbitWorker)
        mock_rabbit_worker_class.side_effect = [mock_rabbit_worker]

        collection_harvest(collection.id)

        # Harvest start message sent
        name, args, kwargs = mock_rabbit_worker.mock_calls[0]
        self.assertEqual("send_message", name)
        message = args[0]
        self.assertTrue(message["collection_set"]["id"])
        self.assertEqual(
            "{}/collection_set/{}/{}".format(settings.SFM_DATA_DIR, self.collection_set.collection_set_id,
                                             collection.collection_id),
            message["path"])
        self.assertDictEqual(self.harvest_options, message["options"])
        self.assertDictEqual({"token": "test_token1", "id": "1"}, message["seeds"][0])
        self.assertDictEqual({"uid": "test_uid2", "id": "2"}, message["seeds"][1])
        self.assertDictEqual({"token": "test_token3", "uid": "test_uid3", "id": "3"}, message["seeds"][2])
        self.assertEqual(Collection.TWITTER_USER_TIMELINE, message["type"])
        self.assertTrue(message["id"])
        self.assertEqual("harvest.start.test_platform.twitter_user_timeline", args[1])

        # Harvest model object created
        harvest = Harvest.objects.get(harvest_id=message["id"])
        self.assertIsNotNone(harvest.date_requested)
        self.assertEqual(collection, harvest.collection)
        self.assertEqual(Harvest.REQUESTED, harvest.status)
        self.assertEqual(Collection.TWITTER_USER_TIMELINE, harvest.harvest_type)

    @patch("ui.jobs.RabbitWorker", autospec=True)
    def test_missing_collection_harvest(self, mock_rabbit_worker_class):
        mock_rabbit_worker = MagicMock(spec=RabbitWorker)
        mock_rabbit_worker_class.side_effect = [mock_rabbit_worker]

        # An error should be logged and nothing else should happen
        collection_harvest(1234567)
        mock_rabbit_worker.assert_not_called()

    @patch("ui.jobs.RabbitWorker", autospec=True)
    def test_collection_without_seeds_harvest(self, mock_rabbit_worker_class):
        collection = Collection.objects.create(collection_set=self.collection_set, credential=self.credential,
                                               harvest_type=Collection.TWITTER_SAMPLE, name="test_collection",
                                               harvest_options=json.dumps(self.harvest_options), is_active=True)
        mock_rabbit_worker = MagicMock(spec=RabbitWorker)
        mock_rabbit_worker_class.side_effect = [mock_rabbit_worker]

        collection_harvest(collection.id)

        # Harvest start message sent
        name, args, kwargs = mock_rabbit_worker.mock_calls[0]
        self.assertEqual("send_message", name)
        message = args[0]
        self.assertTrue(message["collection_set"]["id"])
        self.assertEqual(
            "{}/collection_set/{}/{}".format(settings.SFM_DATA_DIR, self.collection_set.collection_set_id,
                                             collection.collection_id),
            message["path"])
        self.assertDictEqual(self.harvest_options, message["options"])
        self.assertFalse("seeds" in message)
        self.assertEqual(Collection.TWITTER_SAMPLE, message["type"])
        self.assertTrue(message["id"])
        self.assertEqual("harvest.start.test_platform.twitter_sample", args[1])

        # Harvest model object created
        harvest = Harvest.objects.get(harvest_id=message["id"])
        self.assertIsNotNone(harvest.date_requested)
        self.assertEqual(collection, harvest.collection)
        self.assertEqual(Harvest.REQUESTED, harvest.status)

    @patch("ui.jobs.RabbitWorker", autospec=True)
    def test_missing_seeds(self, mock_rabbit_worker_class):
        collection = Collection.objects.create(collection_set=self.collection_set, credential=self.credential,
                                               harvest_type=Collection.TWITTER_USER_TIMELINE, name="test_collection",
                                               harvest_options=json.dumps(self.harvest_options), is_active=True)
        mock_rabbit_worker = MagicMock(spec=RabbitWorker)
        mock_rabbit_worker_class.side_effect = [mock_rabbit_worker]

        # An error should be logged and nothing else should happen
        collection_harvest(collection.id)
        mock_rabbit_worker.assert_not_called()

    @patch("ui.jobs.RabbitWorker", autospec=True)
    def test_wrong_number_of_seeds(self, mock_rabbit_worker_class):
        collection = Collection.objects.create(collection_set=self.collection_set, credential=self.credential,
                                               harvest_type=Collection.TWITTER_SAMPLE, name="test_collection",
                                               harvest_options=json.dumps(self.harvest_options), is_active=True)
        Seed.objects.create(collection=collection, token="test_token1")
        mock_rabbit_worker = MagicMock(spec=RabbitWorker)
        mock_rabbit_worker_class.side_effect = [mock_rabbit_worker]

        # An error should be logged and nothing else should happen
        collection_harvest(collection.id)
        mock_rabbit_worker.assert_not_called()


class StopJobsTests(TestCase):
    def setUp(self):
        self.user = User.objects.create_superuser(username="test_user", email="test_user@test.com",
                                                  password="test_password")
        self.group = Group.objects.create(name="test_group")
        self.collection_set = CollectionSet.objects.create(group=self.group, name="test_collection_set")
        self.credential_token = {"key": "test_key"}
        self.credential = Credential.objects.create(user=self.user, platform="test_platform",
                                                    token=json.dumps(self.credential_token))
        self.collection = Collection.objects.create(collection_set=self.collection_set, credential=self.credential,
                                                    harvest_type=Collection.TWITTER_SAMPLE, name="test_collection",
                                                    is_active=True)
        self.historical_collection = self.collection.history.all()[0]
        self.historical_credential = self.historical_collection.credential.history.all()[0]

    @patch("ui.jobs.RabbitWorker", autospec=True)
    def test_stop_harvest(self, mock_rabbit_worker_class):
        harvest = Harvest.objects.create(harvest_type=Collection.TWITTER_SAMPLE,
                                         collection=self.collection,
                                         historical_collection=self.historical_collection,
                                         historical_credential=self.historical_credential)
        mock_rabbit_worker = MagicMock(spec=RabbitWorker)
        mock_rabbit_worker_class.side_effect = [mock_rabbit_worker]

        collection_stop(self.collection.id)

        # Harvest stop message sent
        name, args, kwargs = mock_rabbit_worker.mock_calls[0]
        self.assertEqual("send_message", name)
        message = args[0]
        self.assertEqual(message["id"], harvest.harvest_id)
        self.assertEqual("harvest.stop.test_platform.twitter_sample", args[1])

        # Harvest model object updated
        harvest = Harvest.objects.get(harvest_id=message["id"])
        self.assertEqual(Harvest.STOP_REQUESTED, harvest.status)

    @patch("ui.jobs.RabbitWorker", autospec=True)
    def test_missing_collection(self, mock_rabbit_worker_class):
        mock_rabbit_worker = MagicMock(spec=RabbitWorker)
        mock_rabbit_worker_class.side_effect = [mock_rabbit_worker]

        # An error should be logged and nothing else should happen
        collection_stop(1234567)
        mock_rabbit_worker.assert_not_called()
| 53.960452 | 117 | 0.667469 | 1,038 | 9,551 | 5.88921 | 0.117534 | 0.057255 | 0.091608 | 0.048094 | 0.836905 | 0.823164 | 0.796499 | 0.766072 | 0.759202 | 0.732701 | 0 | 0.006146 | 0.233379 | 9,551 | 176 | 118 | 54.267045 | 0.828735 | 0.041357 | 0 | 0.666667 | 0 | 0 | 0.093941 | 0.019466 | 0 | 0 | 0 | 0 | 0.244444 | 1 | 0.066667 | false | 0.014815 | 0.051852 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
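The tests above unpack entries of `mock_calls` as `(name, args, kwargs)` triples, which is the standard call-record format of `unittest.mock`. A minimal standalone sketch of that pattern, using a hypothetical `Worker` class as a stand-in for `RabbitWorker`:

```python
from unittest.mock import MagicMock


class Worker:
    """Hypothetical stand-in for RabbitWorker, for illustration only."""

    def send_message(self, message, routing_key):
        pass


# A spec'd MagicMock records every call; each mock_calls entry
# unpacks into (method name, positional args, keyword args).
worker = MagicMock(spec=Worker)
worker.send_message({"id": "abc"}, "harvest.start.test_platform.twitter_sample")

name, args, kwargs = worker.mock_calls[0]
assert name == "send_message"
assert args[0] == {"id": "abc"}
assert args[1] == "harvest.start.test_platform.twitter_sample"
```

Assigning the pre-built mock to `mock_rabbit_worker_class.side_effect`, as the tests do, makes the patched class return that exact instance when the code under test instantiates it.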
54b512d32f323a8689bd9353d417757575bae1dc | 26 | py | Python | cyvlfeat/fisher/__init__.py | hubutui/cyvlfeat | a0cfe17b0cc6fe14a9270b50592b4e0b0ec8ed1c | [
"BSD-2-Clause"
] | 103 | 2015-02-12T20:21:53.000Z | 2022-03-29T15:30:47.000Z | cyvlfeat/fisher/__init__.py | samousavizade/cyvlfeat | 03297e4d1a6924920a7cf2df9d558c93a8445b9f | [
"BSD-2-Clause"
] | 49 | 2015-05-05T03:48:37.000Z | 2022-03-09T13:54:24.000Z | cyvlfeat/fisher/__init__.py | samousavizade/cyvlfeat | 03297e4d1a6924920a7cf2df9d558c93a8445b9f | [
"BSD-2-Clause"
] | 68 | 2015-02-11T10:33:11.000Z | 2022-02-08T09:26:34.000Z | from .fisher import fisher | 26 | 26 | 0.846154 | 4 | 26 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
49af530a7c6786d68f8b4bb17407c0b8edef0c83 | 174 | py | Python | blurr/modeling/all.py | bogdansalyp/blurr | 77351fb4585b2383ee1507cec9f6625787e4e368 | [
"Apache-2.0"
] | null | null | null | blurr/modeling/all.py | bogdansalyp/blurr | 77351fb4585b2383ee1507cec9f6625787e4e368 | [
"Apache-2.0"
] | null | null | null | blurr/modeling/all.py | bogdansalyp/blurr | 77351fb4585b2383ee1507cec9f6625787e4e368 | [
"Apache-2.0"
] | null | null | null | from ..utils import *
from .core import *
from .language_modeling import *
from .question_answering import *
from .token_classification import *
from .summarization import *
| 24.857143 | 35 | 0.787356 | 21 | 174 | 6.380952 | 0.52381 | 0.373134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 174 | 6 | 36 | 29 | 0.893333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f967e52bda038d5208b1212e3fcae5818c6f768 | 2,051 | py | Python | src/AuShadha/patient/dijit_fields_constants.py | GosthMan/AuShadha | 3ab48825a0dba19bf880b6ac6141ab7a6adf1f3e | [
"PostgreSQL"
] | 46 | 2015-03-04T14:19:47.000Z | 2021-12-09T02:58:46.000Z | src/AuShadha/patient/dijit_fields_constants.py | aytida23/AuShadha | 3ab48825a0dba19bf880b6ac6141ab7a6adf1f3e | [
"PostgreSQL"
] | 2 | 2015-06-05T10:29:04.000Z | 2015-12-06T16:54:10.000Z | src/AuShadha/patient/dijit_fields_constants.py | aytida23/AuShadha | 3ab48825a0dba19bf880b6ac6141ab7a6adf1f3e | [
"PostgreSQL"
] | 24 | 2015-03-23T01:38:11.000Z | 2022-01-24T16:23:42.000Z | PATIENT_DETAIL_FORM_CONSTANTS = {
    'first_name': {
        'max_length': 30,
        "data-dojo-type": "dijit.form.ValidationTextBox",
        "data-dojo-props": r"'required' :true ,'regExp':'[a-zA-Z\'-. ]+','invalidMessage':'Invalid Character' "
    },
    'middle_name': {
        'max_length': 30,
        "data-dojo-type": "dijit.form.ValidationTextBox",
        "data-dojo-props": r"'required' : false ,'regExp':'[a-zA-Z\'-. ]+','invalidMessage' : 'Invalid Character'"
    },
    'last_name': {
        'max_length': 30,
        "data-dojo-type": "dijit.form.ValidationTextBox",
        "data-dojo-props": r"'required' : false ,'regExp':'[a-zA-Z\'-. ]+','invalidMessage' : 'Invalid Character'"
    },
    'patient_hospital_id': {
        'max_length': 30,
        "data-dojo-type": "dijit.form.ValidationTextBox",
        "data-dojo-props": r"'required' : true ,'regExp':'[\\w]+','invalidMessage' : 'Invalid Character'"
    },
    'age': {
        'max_length': 30,
        "data-dojo-type": "dijit.form.ValidationTextBox",
        "data-dojo-props": r"'required' : true ,'regExp':'\\d{1,3}','invalidMessage' : 'Only Numbers <1000 are allowed'"
    },
    'sex': {
        'max_length': 30,
        "data-dojo-type": "dijit.form.Select",
        "data-dojo-props": r"'required' : true ,'regExp':'[\\w]+','invalidMessage' : ''"
    }
    # ,
    # "parent_clinic": {
    #     "max_length": 30,
    #     "data-dojo-type": "dijit.form.Select",
    #     "data-dojo-props": r"'required':'true', 'regExp': '', 'invalidMessage': 'Please select a value' "
    # }
}
| 52.589744 | 136 | 0.425646 | 167 | 2,051 | 5.131737 | 0.275449 | 0.130688 | 0.089848 | 0.12252 | 0.806301 | 0.806301 | 0.806301 | 0.766628 | 0.766628 | 0.731622 | 0 | 0.016393 | 0.405168 | 2,051 | 38 | 137 | 53.973684 | 0.686066 | 0.083374 | 0 | 0.40625 | 0 | 0.09375 | 0.489861 | 0.182497 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
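A constants table like `PATIENT_DETAIL_FORM_CONSTANTS` is typically flattened into HTML attributes when the form widgets are rendered. A minimal sketch of that consumption step, assuming a hypothetical `widget_attrs` helper (not part of AuShadha) and a trimmed copy of the table:

```python
# Trimmed copy of the constants table above (one field only).
PATIENT_DETAIL_FORM_CONSTANTS = {
    'age': {
        'max_length': 30,
        "data-dojo-type": "dijit.form.ValidationTextBox",
        "data-dojo-props": r"'required' : true ,'regExp':'\\d{1,3}','invalidMessage' : 'Only Numbers <1000 are allowed'",
    },
}


def widget_attrs(field_name, constants=PATIENT_DETAIL_FORM_CONSTANTS):
    """Flatten one field's constants into an HTML attribute string."""
    attrs = constants[field_name]
    return " ".join('{}="{}"'.format(k, v) for k, v in attrs.items())


print(widget_attrs('age'))
```

The dijit attribute values use single quotes internally, so wrapping each value in double quotes keeps the rendered attribute string well-formed.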
3fa2272dd4e8b6e532e5c059cbde3eb8e15dd9de | 27 | py | Python | rook_client/__init__.py | ceph/rook-client-python | 82673cd7c7a3f4919b98706985ff27e57d2c1b94 | [
"Apache-2.0"
] | 2 | 2020-02-12T16:06:05.000Z | 2021-07-21T03:43:43.000Z | rook_client/__init__.py | ceph/rook-client-python | 82673cd7c7a3f4919b98706985ff27e57d2c1b94 | [
"Apache-2.0"
] | 4 | 2020-02-13T16:55:33.000Z | 2021-09-20T08:22:44.000Z | rook_client/__init__.py | ceph/rook-client-python | 82673cd7c7a3f4919b98706985ff27e57d2c1b94 | [
"Apache-2.0"
] | 11 | 2020-02-11T13:57:15.000Z | 2021-12-07T06:37:36.000Z | from ._helper import STRICT | 27 | 27 | 0.851852 | 4 | 27 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 27 | 1 | 27 | 27 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3fa6135d641688595ad1a77e5bc2e4846d849196 | 45 | py | Python | test/navigator_test/navigator_testing_suite/mission_perception_test/__init__.py | jaxnb/NaviGator | 2edb85cf5eab38f62132b3f467814516d2bb05f3 | [
"MIT"
] | 27 | 2020-02-17T21:54:09.000Z | 2022-03-18T17:49:23.000Z | test/navigator_test/navigator_testing_suite/mission_perception_test/__init__.py | jaxnb/NaviGator | 2edb85cf5eab38f62132b3f467814516d2bb05f3 | [
"MIT"
] | 325 | 2019-09-11T14:13:56.000Z | 2022-03-31T00:38:30.000Z | test/navigator_test/navigator_testing_suite/mission_perception_test/__init__.py | ericgorday/NaviGator | cc929a8609d7a416d0b8c9a95059e296f669464a | [
"MIT"
] | 24 | 2019-09-16T00:29:45.000Z | 2022-03-06T10:56:38.000Z | from find_the_break_test_perception import *
| 22.5 | 44 | 0.888889 | 7 | 45 | 5.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 45 | 1 | 45 | 45 | 0.878049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3fa88be13c3b525357b2ef387f39d97ef32a4bdb | 29 | py | Python | ants/learn/__init__.py | xemio/ANTsPy | ef610318e217bb04d3850d480c2e51df695d56c0 | [
"Apache-2.0"
] | 338 | 2017-09-01T06:47:54.000Z | 2022-03-31T12:11:46.000Z | ants/learn/__init__.py | xemio/ANTsPy | ef610318e217bb04d3850d480c2e51df695d56c0 | [
"Apache-2.0"
] | 306 | 2017-08-30T20:05:07.000Z | 2022-03-31T16:20:44.000Z | ants/learn/__init__.py | xemio/ANTsPy | ef610318e217bb04d3850d480c2e51df695d56c0 | [
"Apache-2.0"
] | 115 | 2017-09-08T11:53:17.000Z | 2022-03-27T05:53:39.000Z |
from .decomposition import * | 14.5 | 28 | 0.793103 | 3 | 29 | 7.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 2 | 28 | 14.5 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3fbf5e1454fa23453ed4ee914edf9e6975816b6e | 30 | py | Python | pyinfra/api/connectors/sshuserclient/__init__.py | ryan109/pyinfra | 6814920c634a2a047299875077633f72ecc46fce | [
"MIT"
] | 1 | 2020-04-12T16:15:15.000Z | 2020-04-12T16:15:15.000Z | pyinfra/api/connectors/sshuserclient/__init__.py | gchazot/pyinfra | 40dd01f11086dba2be20fa2f509556abb40d84f8 | [
"MIT"
] | null | null | null | pyinfra/api/connectors/sshuserclient/__init__.py | gchazot/pyinfra | 40dd01f11086dba2be20fa2f509556abb40d84f8 | [
"MIT"
] | null | null | null | from .client import SSHClient
| 15 | 29 | 0.833333 | 4 | 30 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.961538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3fc28663533e6d480fb216a31e1ec314702884b6 | 139 | py | Python | sambung_ayat/views.py | arroganthooman/gomurojaah | 93a1835f22e05607ba3dcc9ee2ccd9930b5b69f0 | [
"Unlicense"
] | null | null | null | sambung_ayat/views.py | arroganthooman/gomurojaah | 93a1835f22e05607ba3dcc9ee2ccd9930b5b69f0 | [
"Unlicense"
] | null | null | null | sambung_ayat/views.py | arroganthooman/gomurojaah | 93a1835f22e05607ba3dcc9ee2ccd9930b5b69f0 | [
"Unlicense"
] | null | null | null | from django.shortcuts import render
# Create your views here.
def sambungAyat(request):
    return render(request, "sambung_ayat.html") | 23.166667 | 44 | 0.76259 | 18 | 139 | 5.833333 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151079 | 139 | 6 | 44 | 23.166667 | 0.889831 | 0.165468 | 0 | 0 | 0 | 0 | 0.154545 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6
b76c1fe85b8ee5e324f4bf10dc0093ef2a8dabf0 | 2,402 | py | Python | tests/test_having_specs.py | scimas/druid_query | 7b281ef83e032a2765c9840400baf08c75818fb5 | [
"MIT"
] | null | null | null | tests/test_having_specs.py | scimas/druid_query | 7b281ef83e032a2765c9840400baf08c75818fb5 | [
"MIT"
] | null | null | null | tests/test_having_specs.py | scimas/druid_query | 7b281ef83e032a2765c9840400baf08c75818fb5 | [
"MIT"
] | null | null | null | import json
from druid_query.utils import druid_serealize
from druid_query.components import having_specs as hss
from druid_query.components import filters as flt
def test_json_conversion():
    hs = hss.Filter(flt.TrueF())
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'filter',
        'filter': {
            'type': 'true'
        }
    }
    assert generated == expected

    hs = hss.EqualTo('agg', 2)
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'equalTo',
        'aggregation': 'agg',
        'value': 2
    }
    assert generated == expected

    hs = hss.GreaterThan('agg', 2)
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'greaterThan',
        'aggregation': 'agg',
        'value': 2
    }
    assert generated == expected

    hs = hss.LessThan('agg', 2)
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'lessThan',
        'aggregation': 'agg',
        'value': 2
    }
    assert generated == expected

    hs = hss.DimSelector('dim', 2)
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'dimSelector',
        'dimension': 'dim',
        'value': 2
    }
    assert generated == expected

    hs = hss.And([hss.EqualTo('agg', 2), hss.EqualTo('agg', 2)])
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'and',
        'havingSpecs': [
            {'type': 'equalTo', 'aggregation': 'agg', 'value': 2},
            {'type': 'equalTo', 'aggregation': 'agg', 'value': 2}
        ]
    }
    assert generated == expected

    hs = hss.Or([hss.EqualTo('agg', 2), hss.EqualTo('agg', 2)])
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'or',
        'havingSpecs': [
            {'type': 'equalTo', 'aggregation': 'agg', 'value': 2},
            {'type': 'equalTo', 'aggregation': 'agg', 'value': 2}
        ]
    }
    assert generated == expected

    hs = hss.Not(hss.EqualTo('agg', 2))
    generated = json.loads(json.dumps(hs, default=druid_serealize))
    expected = {
        'type': 'not',
        'havingSpec': {'type': 'equalTo', 'aggregation': 'agg', 'value': 2}
    }
    assert generated == expected
| 28.595238 | 75 | 0.569525 | 256 | 2,402 | 5.285156 | 0.167969 | 0.093126 | 0.10643 | 0.130081 | 0.824095 | 0.761271 | 0.761271 | 0.736142 | 0.736142 | 0.625277 | 0 | 0.01021 | 0.266028 | 2,402 | 83 | 76 | 28.939759 | 0.757232 | 0 | 0 | 0.506849 | 0 | 0 | 0.158202 | 0 | 0 | 0 | 0 | 0 | 0.109589 | 1 | 0.013699 | false | 0 | 0.054795 | 0 | 0.068493 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
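The round-trips above rely on the `default=` hook of `json.dumps`, which is called for any object the encoder cannot serialize natively (here, `druid_serealize` from `druid_query.utils`). A minimal sketch of the mechanism, using a hypothetical simplified having-spec component rather than the real library classes:

```python
import json


class EqualTo:
    """Hypothetical, simplified stand-in for a having-spec component."""

    def __init__(self, aggregation, value):
        self.having_type = 'equalTo'
        self.aggregation = aggregation
        self.value = value


def serialize(obj):
    # json.dumps invokes this for objects it cannot encode natively;
    # returning a plain dict lets encoding proceed recursively.
    return {'type': obj.having_type,
            'aggregation': obj.aggregation,
            'value': obj.value}


generated = json.loads(json.dumps(EqualTo('agg', 2), default=serialize))
assert generated == {'type': 'equalTo', 'aggregation': 'agg', 'value': 2}
```

Because the hook is applied recursively, nested components (as in the `And`/`Or`/`Not` cases above) serialize without any extra handling, provided the returned dicts contain only encodable values or further component objects.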
b7a7a3a780a3cbabad529c1c0fa99eddcab9d6a8 | 3,608 | py | Python | pycharm-remap/frictionless/frictionless_regex.py | indika/frictionless | 3ad2a39f0e1c9d9be3bc677ade5fb4d4045c411b | [
"MIT"
] | null | null | null | pycharm-remap/frictionless/frictionless_regex.py | indika/frictionless | 3ad2a39f0e1c9d9be3bc677ade5fb4d4045c411b | [
"MIT"
] | null | null | null | pycharm-remap/frictionless/frictionless_regex.py | indika/frictionless | 3ad2a39f0e1c9d9be3bc677ade5fb4d4045c411b | [
"MIT"
] | null | null | null | __author__ = "Indika Piyasena"
import logging
import unittest
import re
from mappings.translate import Translate
logger = logging.getLogger(__name__)
class FrictionlessRegex:
    def __init__(self):
        pass

    @staticmethod
    def replace_one_keystroke(source, fmap):
        # <keyboard-shortcut first-keystroke="control alt J" />
        pattern = r'<keyboard-shortcut first-keystroke=\"([a-zA-Z0-9_\s]*) ([a-zA-Z0-9_]*)\" />'
        regex = re.compile(pattern)
        return regex.sub(
            lambda match: '<keyboard-shortcut first-keystroke="{0} {1}" />'
            .format(
                match.group(1),
                fmap(match.group(2))
            ), source)

    @staticmethod
    def replace_two_keystrokes(source, fmap):
        pattern = r'<keyboard-shortcut first-keystroke=\"([a-zA-Z0-9_\s]*) ([a-zA-Z0-9_]*)\" second-keystroke="([a-zA-Z0-9_\s]*) ([a-zA-Z0-9_]*)" />'
        regex = re.compile(pattern)
        return regex.sub(
            lambda match: '<keyboard-shortcut first-keystroke="{0} {1}" second-keystroke="{2} {3}" />'
            .format(
                match.group(1),
                fmap(match.group(2)),
                match.group(3),
                fmap(match.group(4))
            ), source)


class FrictionlessRegexTestCase(unittest.TestCase):
    def test_one_keystroke(self):
        fmap = Translate.map_to_qwerty
        keystroke = '<keyboard-shortcut first-keystroke="control alt J" />'
        result = FrictionlessRegex.replace_one_keystroke(keystroke, fmap)
        print(result)
        self.assertEqual(
            '<keyboard-shortcut first-keystroke="control alt C" />', result)

    def test_one_keystroke_with_two(self):
        fmap = Translate.map_to_qwerty
        keystroke = '<keyboard-shortcut first-keystroke="control alt J" />'
        result = FrictionlessRegex.replace_two_keystrokes(keystroke, fmap)
        print(result)
        self.assertEqual(
            '<keyboard-shortcut first-keystroke="control alt J" />', result)

    def test_one_keystroke_redundant(self):
        fmap = Translate.map_to_qwerty
        keystroke = '<action id="CommentByLineComment">'
        result = FrictionlessRegex.replace_one_keystroke(keystroke, fmap)
        print(result)
        self.assertEqual(
            '<action id="CommentByLineComment">', result)

    def test_two_keystrokes(self):
        fmap = Translate.map_to_qwerty
        keystroke = '<keyboard-shortcut first-keystroke="meta E" second-keystroke="meta R" />'
        result = FrictionlessRegex.replace_two_keystrokes(keystroke, fmap)
        print(result)
        self.assertEqual(
            '<keyboard-shortcut first-keystroke="meta D" second-keystroke="meta O" />',
            result)

    def test_two_keystrokes_with_one(self):
        fmap = Translate.map_to_qwerty
        keystroke = '<keyboard-shortcut first-keystroke="meta E" second-keystroke="meta R" />'
        result = FrictionlessRegex.replace_one_keystroke(keystroke, fmap)
        print(result)
        self.assertEqual(
            '<keyboard-shortcut first-keystroke="meta E" second-keystroke="meta R" />',
            result)

    def test_two_keystrokes_redundant(self):
        fmap = Translate.map_to_qwerty
        keystroke = '<action id="CommentByLineComment">'
        result = FrictionlessRegex.replace_two_keystrokes(keystroke, fmap)
        print(result)
        self.assertEqual(
            '<action id="CommentByLineComment">',
            result)


if __name__ == '__main__':
    print("Testing FrictionlessRegex in stand-alone-mode")
    unittest.main()
| 35.029126 | 149 | 0.628049 | 385 | 3,608 | 5.698701 | 0.207792 | 0.094804 | 0.12443 | 0.177758 | 0.784868 | 0.734731 | 0.734731 | 0.716044 | 0.683683 | 0.683683 | 0 | 0.008902 | 0.252772 | 3,608 | 102 | 150 | 35.372549 | 0.804896 | 0.01469 | 0 | 0.555556 | 0 | 0.037037 | 0.289414 | 0.140203 | 0 | 0 | 0 | 0 | 0.074074 | 0 | null | null | 0.012346 | 0.049383 | null | null | 0.08642 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
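The core technique in `FrictionlessRegex` is `re.sub` with a callable replacement: the lambda receives each match object and rebuilds the attribute, passing only the final keystroke through the translation map. A condensed, self-contained sketch of that pattern (the two-entry Dvorak-to-QWERTY map here is illustrative, not the real `Translate` table):

```python
import re


def remap_keystroke(source, fmap):
    """Rewrite the final key of a first-keystroke attribute via fmap."""
    pattern = r'first-keystroke="([a-zA-Z ]+) ([a-zA-Z])"'
    return re.sub(
        pattern,
        # The callable replacement gets a match object; group(1) holds the
        # modifier prefix, group(2) the key letter to be translated.
        lambda m: 'first-keystroke="{} {}"'.format(m.group(1), fmap(m.group(2))),
        source)


# Tiny illustrative mapping; fall back to the original key when unmapped.
dvorak_to_qwerty = {'J': 'C', 'E': 'D'}.get
result = remap_keystroke('<keyboard-shortcut first-keystroke="control alt J" />',
                         lambda k: dvorak_to_qwerty(k, k))
assert result == '<keyboard-shortcut first-keystroke="control alt C" />'
```

Strings that do not match the pattern (such as the `<action id=...>` lines in the redundant tests) pass through `re.sub` unchanged, which is why the class can be applied to a whole keymap file safely.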
b7e428545526e21433d8fefd68a57a49d23e5435 | 29 | py | Python | shortenedurls/models/__init__.py | eliasvelardezft/urlshortener | e6e9fb3de1a6cb1481eeba221cdaad91698b0214 | [
"MIT"
] | null | null | null | shortenedurls/models/__init__.py | eliasvelardezft/urlshortener | e6e9fb3de1a6cb1481eeba221cdaad91698b0214 | [
"MIT"
] | null | null | null | shortenedurls/models/__init__.py | eliasvelardezft/urlshortener | e6e9fb3de1a6cb1481eeba221cdaad91698b0214 | [
"MIT"
] | null | null | null | from .shortened_urls import * | 29 | 29 | 0.827586 | 4 | 29 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 29 | 1 | 29 | 29 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4d262614d34396f006db4672412ecace4d87b960 | 96,991 | py | Python | sdk/python/pulumi_azure_nextgen/cdn/v20191231/outputs.py | test-wiz-sec/pulumi-azure-nextgen | 20a695af0d020b34b0f1c336e1b69702755174cc | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_nextgen/cdn/v20191231/outputs.py | test-wiz-sec/pulumi-azure-nextgen | 20a695af0d020b34b0f1c336e1b69702755174cc | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_nextgen/cdn/v20191231/outputs.py | test-wiz-sec/pulumi-azure-nextgen | 20a695af0d020b34b0f1c336e1b69702755174cc | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union
from ... import _utilities, _tables
from . import outputs
__all__ = [
    'CacheExpirationActionParametersResponse',
    'CacheKeyQueryStringActionParametersResponse',
    'CookiesMatchConditionParametersResponse',
    'DeepCreatedOriginGroupResponse',
    'DeepCreatedOriginResponse',
    'DeliveryRuleCacheExpirationActionResponse',
    'DeliveryRuleCacheKeyQueryStringActionResponse',
    'DeliveryRuleCookiesConditionResponse',
    'DeliveryRuleHttpVersionConditionResponse',
    'DeliveryRuleIsDeviceConditionResponse',
    'DeliveryRulePostArgsConditionResponse',
    'DeliveryRuleQueryStringConditionResponse',
    'DeliveryRuleRemoteAddressConditionResponse',
    'DeliveryRuleRequestBodyConditionResponse',
    'DeliveryRuleRequestHeaderActionResponse',
    'DeliveryRuleRequestHeaderConditionResponse',
    'DeliveryRuleRequestMethodConditionResponse',
    'DeliveryRuleRequestSchemeConditionResponse',
    'DeliveryRuleRequestUriConditionResponse',
    'DeliveryRuleResponse',
    'DeliveryRuleResponseHeaderActionResponse',
    'DeliveryRuleUrlFileExtensionConditionResponse',
    'DeliveryRuleUrlFileNameConditionResponse',
    'DeliveryRuleUrlPathConditionResponse',
    'EndpointPropertiesUpdateParametersResponseDeliveryPolicy',
    'GeoFilterResponse',
    'HeaderActionParametersResponse',
    'HealthProbeParametersResponse',
    'HttpErrorRangeParametersResponse',
    'HttpVersionMatchConditionParametersResponse',
    'IsDeviceMatchConditionParametersResponse',
    'OriginGroupOverrideActionParametersResponse',
    'OriginGroupOverrideActionResponse',
    'PostArgsMatchConditionParametersResponse',
    'QueryStringMatchConditionParametersResponse',
    'RemoteAddressMatchConditionParametersResponse',
    'RequestBodyMatchConditionParametersResponse',
    'RequestHeaderMatchConditionParametersResponse',
    'RequestMethodMatchConditionParametersResponse',
    'RequestSchemeMatchConditionParametersResponse',
    'RequestUriMatchConditionParametersResponse',
    'ResourceReferenceResponse',
    'ResponseBasedOriginErrorDetectionParametersResponse',
    'SkuResponse',
    'UrlFileExtensionMatchConditionParametersResponse',
    'UrlFileNameMatchConditionParametersResponse',
    'UrlPathMatchConditionParametersResponse',
    'UrlRedirectActionParametersResponse',
    'UrlRedirectActionResponse',
    'UrlRewriteActionParametersResponse',
    'UrlRewriteActionResponse',
]
@pulumi.output_type
class CacheExpirationActionParametersResponse(dict):
"""
Defines the parameters for the cache expiration action.
"""
def __init__(__self__, *,
cache_behavior: str,
cache_type: str,
odata_type: str,
cache_duration: Optional[str] = None):
"""
Defines the parameters for the cache expiration action.
:param str cache_behavior: Caching behavior for the requests
:param str cache_type: The level at which the content needs to be cached.
:param str cache_duration: The duration for which the content needs to be cached. Allowed format is [d.]hh:mm:ss
"""
pulumi.set(__self__, "cache_behavior", cache_behavior)
pulumi.set(__self__, "cache_type", cache_type)
pulumi.set(__self__, "odata_type", odata_type)
if cache_duration is not None:
pulumi.set(__self__, "cache_duration", cache_duration)
@property
@pulumi.getter(name="cacheBehavior")
def cache_behavior(self) -> str:
"""
Caching behavior for the requests
"""
return pulumi.get(self, "cache_behavior")
@property
@pulumi.getter(name="cacheType")
def cache_type(self) -> str:
"""
The level at which the content needs to be cached.
"""
return pulumi.get(self, "cache_type")
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter(name="cacheDuration")
def cache_duration(self) -> Optional[str]:
"""
The duration for which the content needs to be cached. Allowed format is [d.]hh:mm:ss
"""
return pulumi.get(self, "cache_duration")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class CacheKeyQueryStringActionParametersResponse(dict):
"""
Defines the parameters for the cache-key query string action.
"""
def __init__(__self__, *,
odata_type: str,
query_string_behavior: str,
query_parameters: Optional[str] = None):
"""
Defines the parameters for the cache-key query string action.
:param str query_string_behavior: Caching behavior for the requests
:param str query_parameters: query parameters to include or exclude (comma separated).
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "query_string_behavior", query_string_behavior)
if query_parameters is not None:
pulumi.set(__self__, "query_parameters", query_parameters)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter(name="queryStringBehavior")
def query_string_behavior(self) -> str:
"""
Caching behavior for the requests
"""
return pulumi.get(self, "query_string_behavior")
@property
@pulumi.getter(name="queryParameters")
def query_parameters(self) -> Optional[str]:
"""
query parameters to include or exclude (comma separated).
"""
return pulumi.get(self, "query_parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class CookiesMatchConditionParametersResponse(dict):
"""
Defines the parameters for Cookies match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
selector: Optional[str] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for Cookies match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param str selector: Name of Cookies to be matched
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if selector is not None:
pulumi.set(__self__, "selector", selector)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def selector(self) -> Optional[str]:
"""
Name of Cookies to be matched
"""
return pulumi.get(self, "selector")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeepCreatedOriginGroupResponse(dict):
"""
The origin group for CDN content which is added when creating a CDN endpoint. Traffic is sent to the origins within the origin group based on origin health.
"""
def __init__(__self__, *,
name: str,
origins: Sequence['outputs.ResourceReferenceResponse'],
health_probe_settings: Optional['outputs.HealthProbeParametersResponse'] = None,
response_based_origin_error_detection_settings: Optional['outputs.ResponseBasedOriginErrorDetectionParametersResponse'] = None,
traffic_restoration_time_to_healed_or_new_endpoints_in_minutes: Optional[int] = None):
"""
The origin group for CDN content which is added when creating a CDN endpoint. Traffic is sent to the origins within the origin group based on origin health.
:param str name: Origin group name which must be unique within the endpoint.
:param Sequence['ResourceReferenceResponseArgs'] origins: The source of the content being delivered via CDN within given origin group.
:param 'HealthProbeParametersResponseArgs' health_probe_settings: Health probe settings to the origin that is used to determine the health of the origin.
:param 'ResponseBasedOriginErrorDetectionParametersResponseArgs' response_based_origin_error_detection_settings: The JSON object that contains the properties to determine origin health using real requests/responses. This property is currently not supported.
:param int traffic_restoration_time_to_healed_or_new_endpoints_in_minutes: Time in minutes to shift the traffic to the endpoint gradually when an unhealthy endpoint becomes healthy or a new endpoint is added. Default is 10 mins. This property is currently not supported.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "origins", origins)
if health_probe_settings is not None:
pulumi.set(__self__, "health_probe_settings", health_probe_settings)
if response_based_origin_error_detection_settings is not None:
pulumi.set(__self__, "response_based_origin_error_detection_settings", response_based_origin_error_detection_settings)
if traffic_restoration_time_to_healed_or_new_endpoints_in_minutes is not None:
pulumi.set(__self__, "traffic_restoration_time_to_healed_or_new_endpoints_in_minutes", traffic_restoration_time_to_healed_or_new_endpoints_in_minutes)
@property
@pulumi.getter
def name(self) -> str:
"""
Origin group name which must be unique within the endpoint.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def origins(self) -> Sequence['outputs.ResourceReferenceResponse']:
"""
The source of the content being delivered via CDN within given origin group.
"""
return pulumi.get(self, "origins")
@property
@pulumi.getter(name="healthProbeSettings")
def health_probe_settings(self) -> Optional['outputs.HealthProbeParametersResponse']:
"""
Health probe settings to the origin that is used to determine the health of the origin.
"""
return pulumi.get(self, "health_probe_settings")
@property
@pulumi.getter(name="responseBasedOriginErrorDetectionSettings")
def response_based_origin_error_detection_settings(self) -> Optional['outputs.ResponseBasedOriginErrorDetectionParametersResponse']:
"""
The JSON object that contains the properties to determine origin health using real requests/responses. This property is currently not supported.
"""
return pulumi.get(self, "response_based_origin_error_detection_settings")
@property
@pulumi.getter(name="trafficRestorationTimeToHealedOrNewEndpointsInMinutes")
def traffic_restoration_time_to_healed_or_new_endpoints_in_minutes(self) -> Optional[int]:
"""
Time in minutes to shift the traffic to the endpoint gradually when an unhealthy endpoint becomes healthy or a new endpoint is added. Default is 10 mins. This property is currently not supported.
"""
return pulumi.get(self, "traffic_restoration_time_to_healed_or_new_endpoints_in_minutes")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeepCreatedOriginResponse(dict):
"""
The main origin of CDN content which is added when creating a CDN endpoint.
"""
def __init__(__self__, *,
host_name: str,
name: str,
enabled: Optional[bool] = None,
http_port: Optional[int] = None,
https_port: Optional[int] = None,
origin_host_header: Optional[str] = None,
priority: Optional[int] = None,
weight: Optional[int] = None):
"""
The main origin of CDN content which is added when creating a CDN endpoint.
:param str host_name: The address of the origin. It can be a domain name, IPv4 address, or IPv6 address. This should be unique across all origins in an endpoint.
:param str name: Origin name which must be unique within the endpoint.
:param bool enabled: Whether the origin is enabled for load balancing. By default, origins are always enabled.
:param int http_port: The value of the HTTP port. Must be between 1 and 65535.
:param int https_port: The value of the HTTPS port. Must be between 1 and 65535.
:param str origin_host_header: The host header value sent to the origin with each request. If you leave this blank, the request hostname determines this value. Azure CDN origins, such as Web Apps, Blob Storage, and Cloud Services, require this host header value to match the origin hostname by default. If the endpoint uses multiple origins for load balancing, the host header at the endpoint is ignored and this one is used.
:param int priority: Priority of origin in given origin group for load balancing. Higher priorities will not be used for load balancing if any lower priority origin is healthy. Must be between 1 and 5.
:param int weight: Weight of the origin in given origin group for load balancing. Must be between 1 and 1000.
"""
pulumi.set(__self__, "host_name", host_name)
pulumi.set(__self__, "name", name)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if http_port is not None:
pulumi.set(__self__, "http_port", http_port)
if https_port is not None:
pulumi.set(__self__, "https_port", https_port)
if origin_host_header is not None:
pulumi.set(__self__, "origin_host_header", origin_host_header)
if priority is not None:
pulumi.set(__self__, "priority", priority)
if weight is not None:
pulumi.set(__self__, "weight", weight)
@property
@pulumi.getter(name="hostName")
def host_name(self) -> str:
"""
The address of the origin. It can be a domain name, IPv4 address, or IPv6 address. This should be unique across all origins in an endpoint.
"""
return pulumi.get(self, "host_name")
@property
@pulumi.getter
def name(self) -> str:
"""
Origin name which must be unique within the endpoint.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def enabled(self) -> Optional[bool]:
"""
Whether the origin is enabled for load balancing. By default, origins are always enabled.
"""
return pulumi.get(self, "enabled")
@property
@pulumi.getter(name="httpPort")
def http_port(self) -> Optional[int]:
"""
The value of the HTTP port. Must be between 1 and 65535.
"""
return pulumi.get(self, "http_port")
@property
@pulumi.getter(name="httpsPort")
def https_port(self) -> Optional[int]:
"""
The value of the HTTPS port. Must be between 1 and 65535.
"""
return pulumi.get(self, "https_port")
@property
@pulumi.getter(name="originHostHeader")
def origin_host_header(self) -> Optional[str]:
"""
The host header value sent to the origin with each request. If you leave this blank, the request hostname determines this value. Azure CDN origins, such as Web Apps, Blob Storage, and Cloud Services, require this host header value to match the origin hostname by default. If the endpoint uses multiple origins for load balancing, the host header at the endpoint is ignored and this one is used.
"""
return pulumi.get(self, "origin_host_header")
@property
@pulumi.getter
def priority(self) -> Optional[int]:
"""
Priority of origin in given origin group for load balancing. Higher priorities will not be used for load balancing if any lower priority origin is healthy. Must be between 1 and 5.
"""
return pulumi.get(self, "priority")
@property
@pulumi.getter
def weight(self) -> Optional[int]:
"""
Weight of the origin in given origin group for load balancing. Must be between 1 and 1000.
"""
return pulumi.get(self, "weight")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleCacheExpirationActionResponse(dict):
"""
Defines the cache expiration action for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.CacheExpirationActionParametersResponse'):
"""
Defines the cache expiration action for the delivery rule.
:param str name: The name of the action for the delivery rule.
:param 'CacheExpirationActionParametersResponseArgs' parameters: Defines the parameters for the action.
"""
pulumi.set(__self__, "name", 'CacheExpiration')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the action for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.CacheExpirationActionParametersResponse':
"""
Defines the parameters for the action.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleCacheKeyQueryStringActionResponse(dict):
"""
Defines the cache-key query string action for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.CacheKeyQueryStringActionParametersResponse'):
"""
Defines the cache-key query string action for the delivery rule.
:param str name: The name of the action for the delivery rule.
:param 'CacheKeyQueryStringActionParametersResponseArgs' parameters: Defines the parameters for the action.
"""
pulumi.set(__self__, "name", 'CacheKeyQueryString')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the action for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.CacheKeyQueryStringActionParametersResponse':
"""
Defines the parameters for the action.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleCookiesConditionResponse(dict):
"""
Defines the Cookies condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.CookiesMatchConditionParametersResponse'):
"""
Defines the Cookies condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'CookiesMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'Cookies')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.CookiesMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleHttpVersionConditionResponse(dict):
"""
Defines the HttpVersion condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.HttpVersionMatchConditionParametersResponse'):
"""
Defines the HttpVersion condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'HttpVersionMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'HttpVersion')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.HttpVersionMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleIsDeviceConditionResponse(dict):
"""
Defines the IsDevice condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.IsDeviceMatchConditionParametersResponse'):
"""
Defines the IsDevice condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'IsDeviceMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'IsDevice')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.IsDeviceMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRulePostArgsConditionResponse(dict):
"""
Defines the PostArgs condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.PostArgsMatchConditionParametersResponse'):
"""
Defines the PostArgs condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'PostArgsMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'PostArgs')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.PostArgsMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleQueryStringConditionResponse(dict):
"""
Defines the QueryString condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.QueryStringMatchConditionParametersResponse'):
"""
Defines the QueryString condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'QueryStringMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'QueryString')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.QueryStringMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleRemoteAddressConditionResponse(dict):
"""
Defines the RemoteAddress condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.RemoteAddressMatchConditionParametersResponse'):
"""
Defines the RemoteAddress condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'RemoteAddressMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'RemoteAddress')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.RemoteAddressMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleRequestBodyConditionResponse(dict):
"""
Defines the RequestBody condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.RequestBodyMatchConditionParametersResponse'):
"""
Defines the RequestBody condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'RequestBodyMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'RequestBody')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.RequestBodyMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleRequestHeaderActionResponse(dict):
"""
Defines the request header action for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.HeaderActionParametersResponse'):
"""
Defines the request header action for the delivery rule.
:param str name: The name of the action for the delivery rule.
:param 'HeaderActionParametersResponseArgs' parameters: Defines the parameters for the action.
"""
pulumi.set(__self__, "name", 'ModifyRequestHeader')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the action for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.HeaderActionParametersResponse':
"""
Defines the parameters for the action.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleRequestHeaderConditionResponse(dict):
"""
Defines the RequestHeader condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.RequestHeaderMatchConditionParametersResponse'):
"""
Defines the RequestHeader condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'RequestHeaderMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'RequestHeader')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.RequestHeaderMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleRequestMethodConditionResponse(dict):
"""
Defines the RequestMethod condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.RequestMethodMatchConditionParametersResponse'):
"""
Defines the RequestMethod condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'RequestMethodMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'RequestMethod')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.RequestMethodMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleRequestSchemeConditionResponse(dict):
"""
Defines the RequestScheme condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.RequestSchemeMatchConditionParametersResponse'):
"""
Defines the RequestScheme condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'RequestSchemeMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'RequestScheme')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.RequestSchemeMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleRequestUriConditionResponse(dict):
"""
Defines the RequestUri condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.RequestUriMatchConditionParametersResponse'):
"""
Defines the RequestUri condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'RequestUriMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'RequestUri')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.RequestUriMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleResponse(dict):
"""
A rule that specifies a set of actions and conditions
"""
def __init__(__self__, *,
actions: Sequence[Any],
order: int,
conditions: Optional[Sequence[Any]] = None,
name: Optional[str] = None):
"""
A rule that specifies a set of actions and conditions
:param Sequence[Union['DeliveryRuleCacheExpirationActionResponseArgs', 'DeliveryRuleCacheKeyQueryStringActionResponseArgs', 'DeliveryRuleRequestHeaderActionResponseArgs', 'DeliveryRuleResponseHeaderActionResponseArgs', 'OriginGroupOverrideActionResponseArgs', 'UrlRedirectActionResponseArgs', 'UrlRewriteActionResponseArgs']] actions: A list of actions that are executed when all the conditions of a rule are satisfied.
:param int order: The order in which the rules are applied for the endpoint. Possible values {0,1,2,3,...}. A rule with a lesser order will be applied before a rule with a greater order. A rule with order 0 is a special rule; it does not require any condition, and the actions listed in it will always be applied.
:param Sequence[Union['DeliveryRuleCookiesConditionResponseArgs', 'DeliveryRuleHttpVersionConditionResponseArgs', 'DeliveryRuleIsDeviceConditionResponseArgs', 'DeliveryRulePostArgsConditionResponseArgs', 'DeliveryRuleQueryStringConditionResponseArgs', 'DeliveryRuleRemoteAddressConditionResponseArgs', 'DeliveryRuleRequestBodyConditionResponseArgs', 'DeliveryRuleRequestHeaderConditionResponseArgs', 'DeliveryRuleRequestMethodConditionResponseArgs', 'DeliveryRuleRequestSchemeConditionResponseArgs', 'DeliveryRuleRequestUriConditionResponseArgs', 'DeliveryRuleUrlFileExtensionConditionResponseArgs', 'DeliveryRuleUrlFileNameConditionResponseArgs', 'DeliveryRuleUrlPathConditionResponseArgs']] conditions: A list of conditions that must be matched for the actions to be executed
:param str name: Name of the rule
"""
pulumi.set(__self__, "actions", actions)
pulumi.set(__self__, "order", order)
if conditions is not None:
pulumi.set(__self__, "conditions", conditions)
if name is not None:
pulumi.set(__self__, "name", name)
@property
@pulumi.getter
def actions(self) -> Sequence[Any]:
"""
A list of actions that are executed when all the conditions of a rule are satisfied.
"""
return pulumi.get(self, "actions")
@property
@pulumi.getter
def order(self) -> int:
"""
The order in which the rules are applied for the endpoint. Possible values {0,1,2,3,...}. A rule with a lesser order will be applied before a rule with a greater order. A rule with order 0 is a special rule; it does not require any condition, and the actions listed in it will always be applied.
"""
return pulumi.get(self, "order")
@property
@pulumi.getter
def conditions(self) -> Optional[Sequence[Any]]:
"""
A list of conditions that must be matched for the actions to be executed
"""
return pulumi.get(self, "conditions")
@property
@pulumi.getter
def name(self) -> Optional[str]:
"""
Name of the rule
"""
return pulumi.get(self, "name")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
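# DeliveryRuleResponse above holds its actions and conditions as Sequence[Any];
# the concrete condition/action classes each pin a discriminator into "name"
# (e.g. 'Cookies', 'UrlPath', 'ModifyRequestHeader'), which is how the
# heterogeneous members can be told apart. A hedged sketch using plain dicts
# as stand-ins for the generated objects (the dict shapes are illustrative
# assumptions, not the real wire format):
def _condition_kind_sketch(condition) -> str:
    # Each object carries its type in the "name" field.
    return condition.get("name", "unknown")

_example_rule_sketch = {
    "order": 1,
    "conditions": [{"name": "UrlPath"}, {"name": "Cookies"}],
}
# [_condition_kind_sketch(c) for c in _example_rule_sketch["conditions"]]
#   == ["UrlPath", "Cookies"]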
@pulumi.output_type
class DeliveryRuleResponseHeaderActionResponse(dict):
"""
Defines the response header action for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.HeaderActionParametersResponse'):
"""
Defines the response header action for the delivery rule.
:param str name: The name of the action for the delivery rule.
:param 'HeaderActionParametersResponseArgs' parameters: Defines the parameters for the action.
"""
pulumi.set(__self__, "name", 'ModifyResponseHeader')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the action for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.HeaderActionParametersResponse':
"""
Defines the parameters for the action.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleUrlFileExtensionConditionResponse(dict):
"""
Defines the UrlFileExtension condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.UrlFileExtensionMatchConditionParametersResponse'):
"""
Defines the UrlFileExtension condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'UrlFileExtensionMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'UrlFileExtension')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.UrlFileExtensionMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleUrlFileNameConditionResponse(dict):
"""
Defines the UrlFileName condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.UrlFileNameMatchConditionParametersResponse'):
"""
Defines the UrlFileName condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'UrlFileNameMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'UrlFileName')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.UrlFileNameMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class DeliveryRuleUrlPathConditionResponse(dict):
"""
Defines the UrlPath condition for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.UrlPathMatchConditionParametersResponse'):
"""
Defines the UrlPath condition for the delivery rule.
:param str name: The name of the condition for the delivery rule.
:param 'UrlPathMatchConditionParametersResponseArgs' parameters: Defines the parameters for the condition.
"""
pulumi.set(__self__, "name", 'UrlPath')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the condition for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.UrlPathMatchConditionParametersResponse':
"""
Defines the parameters for the condition.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class EndpointPropertiesUpdateParametersResponseDeliveryPolicy(dict):
"""
A policy that specifies the delivery rules to be used for an endpoint.
"""
def __init__(__self__, *,
rules: Sequence['outputs.DeliveryRuleResponse'],
description: Optional[str] = None):
"""
A policy that specifies the delivery rules to be used for an endpoint.
:param Sequence['DeliveryRuleResponseArgs'] rules: A list of the delivery rules.
:param str description: User-friendly description of the policy.
"""
pulumi.set(__self__, "rules", rules)
if description is not None:
pulumi.set(__self__, "description", description)
@property
@pulumi.getter
def rules(self) -> Sequence['outputs.DeliveryRuleResponse']:
"""
A list of the delivery rules.
"""
return pulumi.get(self, "rules")
@property
@pulumi.getter
def description(self) -> Optional[str]:
"""
User-friendly description of the policy.
"""
return pulumi.get(self, "description")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
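# The delivery policy above carries a list of DeliveryRuleResponse objects
# whose "order" field determines application order: lesser orders apply first,
# and order 0 is unconditional. A hedged sketch of that ordering with plain
# dicts standing in for rules (the shapes are illustrative assumptions):
def _apply_order_sketch(rules):
    # Sort ascending; a rule with a lesser order is applied before one with a
    # greater order. Condition evaluation and the special-casing of order 0
    # happen service-side and are not modeled here.
    return sorted(rules, key=lambda r: r["order"])

# _apply_order_sketch([{"order": 2}, {"order": 0}, {"order": 1}])
#   == [{"order": 0}, {"order": 1}, {"order": 2}]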
@pulumi.output_type
class GeoFilterResponse(dict):
"""
Rules defining user's geo access within a CDN endpoint.
"""
def __init__(__self__, *,
action: str,
country_codes: Sequence[str],
relative_path: str):
"""
Rules defining user's geo access within a CDN endpoint.
:param str action: Action of the geo filter, i.e. allow or block access.
:param Sequence[str] country_codes: Two-letter country codes defining user country access in a geo filter, e.g. AU, MX, US.
:param str relative_path: Relative path applicable to geo filter. (e.g. '/mypictures', '/mypicture/kitty.jpg', etc.)
"""
pulumi.set(__self__, "action", action)
pulumi.set(__self__, "country_codes", country_codes)
pulumi.set(__self__, "relative_path", relative_path)
@property
@pulumi.getter
def action(self) -> str:
"""
Action of the geo filter, i.e. allow or block access.
"""
return pulumi.get(self, "action")
@property
@pulumi.getter(name="countryCodes")
def country_codes(self) -> Sequence[str]:
"""
Two-letter country codes defining user country access in a geo filter, e.g. AU, MX, US.
"""
return pulumi.get(self, "country_codes")
@property
@pulumi.getter(name="relativePath")
def relative_path(self) -> str:
"""
Relative path applicable to geo filter. (e.g. '/mypictures', '/mypicture/kitty.jpg', etc.)
"""
return pulumi.get(self, "relative_path")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class HeaderActionParametersResponse(dict):
"""
Defines the parameters for the request header action.
"""
def __init__(__self__, *,
header_action: str,
header_name: str,
odata_type: str,
value: Optional[str] = None):
"""
Defines the parameters for the request header action.
:param str header_action: Action to perform
:param str header_name: Name of the header to modify
:param str value: Value for the specified action
"""
pulumi.set(__self__, "header_action", header_action)
pulumi.set(__self__, "header_name", header_name)
pulumi.set(__self__, "odata_type", odata_type)
if value is not None:
pulumi.set(__self__, "value", value)
@property
@pulumi.getter(name="headerAction")
def header_action(self) -> str:
"""
Action to perform
"""
return pulumi.get(self, "header_action")
@property
@pulumi.getter(name="headerName")
def header_name(self) -> str:
"""
Name of the header to modify
"""
return pulumi.get(self, "header_name")
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def value(self) -> Optional[str]:
"""
Value for the specified action
"""
return pulumi.get(self, "value")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class HealthProbeParametersResponse(dict):
"""
The JSON object that contains the properties to send health probes to origin.
"""
def __init__(__self__, *,
probe_interval_in_seconds: Optional[int] = None,
probe_path: Optional[str] = None,
probe_protocol: Optional[str] = None,
probe_request_type: Optional[str] = None):
"""
The JSON object that contains the properties to send health probes to origin.
:param int probe_interval_in_seconds: The number of seconds between health probes. Default is 240 seconds.
:param str probe_path: The path relative to the origin that is used to determine the health of the origin.
:param str probe_protocol: Protocol to use for health probe.
:param str probe_request_type: The type of health probe request that is made.
"""
if probe_interval_in_seconds is not None:
pulumi.set(__self__, "probe_interval_in_seconds", probe_interval_in_seconds)
if probe_path is not None:
pulumi.set(__self__, "probe_path", probe_path)
if probe_protocol is not None:
pulumi.set(__self__, "probe_protocol", probe_protocol)
if probe_request_type is not None:
pulumi.set(__self__, "probe_request_type", probe_request_type)
@property
@pulumi.getter(name="probeIntervalInSeconds")
def probe_interval_in_seconds(self) -> Optional[int]:
"""
The number of seconds between health probes. Default is 240 seconds.
"""
return pulumi.get(self, "probe_interval_in_seconds")
@property
@pulumi.getter(name="probePath")
def probe_path(self) -> Optional[str]:
"""
The path relative to the origin that is used to determine the health of the origin.
"""
return pulumi.get(self, "probe_path")
@property
@pulumi.getter(name="probeProtocol")
def probe_protocol(self) -> Optional[str]:
"""
Protocol to use for health probe.
"""
return pulumi.get(self, "probe_protocol")
@property
@pulumi.getter(name="probeRequestType")
def probe_request_type(self) -> Optional[str]:
"""
The type of health probe request that is made.
"""
return pulumi.get(self, "probe_request_type")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class HttpErrorRangeParametersResponse(dict):
"""
The JSON object that represents the range for http status codes
"""
def __init__(__self__, *,
begin: Optional[int] = None,
end: Optional[int] = None):
"""
The JSON object that represents the range for http status codes
:param int begin: The inclusive start of the http status code range.
:param int end: The inclusive end of the http status code range.
"""
if begin is not None:
pulumi.set(__self__, "begin", begin)
if end is not None:
pulumi.set(__self__, "end", end)
@property
@pulumi.getter
def begin(self) -> Optional[int]:
"""
The inclusive start of the http status code range.
"""
return pulumi.get(self, "begin")
@property
@pulumi.getter
def end(self) -> Optional[int]:
"""
The inclusive end of the http status code range.
"""
return pulumi.get(self, "end")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
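# Illustrative sketch (not part of the generated SDK): `begin` and `end` in
# HttpErrorRangeParametersResponse are both inclusive, so a response-based
# error detector would classify a status code like this:

```python
def in_error_range(status: int, begin: int, end: int) -> bool:
    # Both bounds are inclusive per HttpErrorRangeParametersResponse.
    return begin <= status <= end
```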
@pulumi.output_type
class HttpVersionMatchConditionParametersResponse(dict):
"""
Defines the parameters for HttpVersion match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None):
"""
Defines the parameters for HttpVersion match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class IsDeviceMatchConditionParametersResponse(dict):
"""
Defines the parameters for IsDevice match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for IsDevice match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class OriginGroupOverrideActionParametersResponse(dict):
"""
Defines the parameters for the Origin Group override action.
"""
def __init__(__self__, *,
odata_type: str,
origin_group: 'outputs.ResourceReferenceResponse'):
"""
Defines the parameters for the Origin Group override action.
:param 'ResourceReferenceResponseArgs' origin_group: A reference to the origin group from which the content will be fetched when the CDN does not have it
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "origin_group", origin_group)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter(name="originGroup")
def origin_group(self) -> 'outputs.ResourceReferenceResponse':
"""
A reference to the origin group from which the content will be fetched when the CDN does not have it
"""
return pulumi.get(self, "origin_group")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class OriginGroupOverrideActionResponse(dict):
"""
Defines the Origin Group override action for the delivery rule.
"""
def __init__(__self__, *,
name: str,
parameters: 'outputs.OriginGroupOverrideActionParametersResponse'):
"""
Defines the Origin Group override action for the delivery rule.
:param str name: The name of the action for the delivery rule.
:param 'OriginGroupOverrideActionParametersResponseArgs' parameters: Defines the parameters for the action.
"""
pulumi.set(__self__, "name", 'OriginGroupOverride')
pulumi.set(__self__, "parameters", parameters)
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the action for the delivery rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def parameters(self) -> 'outputs.OriginGroupOverrideActionParametersResponse':
"""
Defines the parameters for the action.
"""
return pulumi.get(self, "parameters")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class PostArgsMatchConditionParametersResponse(dict):
"""
Defines the parameters for PostArgs match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
selector: Optional[str] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for PostArgs match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param str selector: Name of PostArg to be matched
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if selector is not None:
pulumi.set(__self__, "selector", selector)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def selector(self) -> Optional[str]:
"""
Name of PostArg to be matched
"""
return pulumi.get(self, "selector")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class QueryStringMatchConditionParametersResponse(dict):
"""
Defines the parameters for QueryString match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for QueryString match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class RemoteAddressMatchConditionParametersResponse(dict):
"""
Defines the parameters for RemoteAddress match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for RemoteAddress match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: Match values to match against. The operator will apply to each value in here with OR semantics. If any of them match the variable with the given operator this match condition is considered a match.
:param bool negate_condition: Describes if this is negate condition or not
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
Match values to match against. The operator will apply to each value in here with OR semantics. If any of them match the variable with the given operator this match condition is considered a match.
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class RequestBodyMatchConditionParametersResponse(dict):
"""
Defines the parameters for RequestBody match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for RequestBody match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class RequestHeaderMatchConditionParametersResponse(dict):
"""
Defines the parameters for RequestHeader match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
selector: Optional[str] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for RequestHeader match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param str selector: Name of Header to be matched
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if selector is not None:
pulumi.set(__self__, "selector", selector)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def selector(self) -> Optional[str]:
"""
Name of Header to be matched
"""
return pulumi.get(self, "selector")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class RequestMethodMatchConditionParametersResponse(dict):
"""
Defines the parameters for RequestMethod match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None):
"""
Defines the parameters for RequestMethod match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class RequestSchemeMatchConditionParametersResponse(dict):
"""
Defines the parameters for RequestScheme match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None):
"""
Defines the parameters for RequestScheme match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class RequestUriMatchConditionParametersResponse(dict):
"""
Defines the parameters for RequestUri match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for RequestUri match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class ResourceReferenceResponse(dict):
"""
Reference to another resource.
"""
def __init__(__self__, *,
id: Optional[str] = None):
"""
Reference to another resource.
:param str id: Resource ID.
"""
if id is not None:
pulumi.set(__self__, "id", id)
@property
@pulumi.getter
def id(self) -> Optional[str]:
"""
Resource ID.
"""
return pulumi.get(self, "id")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class ResponseBasedOriginErrorDetectionParametersResponse(dict):
"""
The JSON object that contains the properties to determine origin health using real requests/responses.
"""
def __init__(__self__, *,
http_error_ranges: Optional[Sequence['outputs.HttpErrorRangeParametersResponse']] = None,
response_based_detected_error_types: Optional[str] = None,
response_based_failover_threshold_percentage: Optional[int] = None):
"""
The JSON object that contains the properties to determine origin health using real requests/responses.
:param Sequence['HttpErrorRangeParametersResponseArgs'] http_error_ranges: The list of HTTP status code ranges that are considered server errors for the origin and mark it as unhealthy.
:param str response_based_detected_error_types: Type of response errors for real user requests for which the origin will be deemed unhealthy
:param int response_based_failover_threshold_percentage: The percentage of failed requests in the sample where failover should trigger.
"""
if http_error_ranges is not None:
pulumi.set(__self__, "http_error_ranges", http_error_ranges)
if response_based_detected_error_types is not None:
pulumi.set(__self__, "response_based_detected_error_types", response_based_detected_error_types)
if response_based_failover_threshold_percentage is not None:
pulumi.set(__self__, "response_based_failover_threshold_percentage", response_based_failover_threshold_percentage)
@property
@pulumi.getter(name="httpErrorRanges")
def http_error_ranges(self) -> Optional[Sequence['outputs.HttpErrorRangeParametersResponse']]:
"""
The list of HTTP status code ranges that are considered server errors for the origin and mark it as unhealthy.
"""
return pulumi.get(self, "http_error_ranges")
@property
@pulumi.getter(name="responseBasedDetectedErrorTypes")
def response_based_detected_error_types(self) -> Optional[str]:
"""
Type of response errors for real user requests for which the origin will be deemed unhealthy
"""
return pulumi.get(self, "response_based_detected_error_types")
@property
@pulumi.getter(name="responseBasedFailoverThresholdPercentage")
def response_based_failover_threshold_percentage(self) -> Optional[int]:
"""
The percentage of failed requests in the sample where failover should trigger.
"""
return pulumi.get(self, "response_based_failover_threshold_percentage")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class SkuResponse(dict):
"""
The pricing tier (defines a CDN provider, feature list and rate) of the CDN profile.
"""
def __init__(__self__, *,
name: Optional[str] = None):
"""
The pricing tier (defines a CDN provider, feature list and rate) of the CDN profile.
:param str name: Name of the pricing tier.
"""
if name is not None:
pulumi.set(__self__, "name", name)
@property
@pulumi.getter
def name(self) -> Optional[str]:
"""
Name of the pricing tier.
"""
return pulumi.get(self, "name")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class UrlFileExtensionMatchConditionParametersResponse(dict):
"""
Defines the parameters for UrlFileExtension match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for UrlFileExtension match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param Sequence[str] transforms: List of transforms
"""
pulumi.set(__self__, "odata_type", odata_type)
pulumi.set(__self__, "operator", operator)
if match_values is not None:
pulumi.set(__self__, "match_values", match_values)
if negate_condition is not None:
pulumi.set(__self__, "negate_condition", negate_condition)
if transforms is not None:
pulumi.set(__self__, "transforms", transforms)
@property
@pulumi.getter(name="odataType")
def odata_type(self) -> str:
return pulumi.get(self, "odata_type")
@property
@pulumi.getter
def operator(self) -> str:
"""
Describes operator to be matched
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter(name="matchValues")
def match_values(self) -> Optional[Sequence[str]]:
"""
The match value for the condition of the delivery rule
"""
return pulumi.get(self, "match_values")
@property
@pulumi.getter(name="negateCondition")
def negate_condition(self) -> Optional[bool]:
"""
Describes if this is negate condition or not
"""
return pulumi.get(self, "negate_condition")
@property
@pulumi.getter
def transforms(self) -> Optional[Sequence[str]]:
"""
List of transforms
"""
return pulumi.get(self, "transforms")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class UrlFileNameMatchConditionParametersResponse(dict):
"""
Defines the parameters for UrlFilename match conditions
"""
def __init__(__self__, *,
odata_type: str,
operator: str,
match_values: Optional[Sequence[str]] = None,
negate_condition: Optional[bool] = None,
transforms: Optional[Sequence[str]] = None):
"""
Defines the parameters for UrlFilename match conditions
:param str operator: Describes operator to be matched
:param Sequence[str] match_values: The match value for the condition of the delivery rule
:param bool negate_condition: Describes if this is negate condition or not
:param Sequence[str] transforms: List of transforms
"""
        pulumi.set(__self__, "odata_type", odata_type)
        pulumi.set(__self__, "operator", operator)
        if match_values is not None:
            pulumi.set(__self__, "match_values", match_values)
        if negate_condition is not None:
            pulumi.set(__self__, "negate_condition", negate_condition)
        if transforms is not None:
            pulumi.set(__self__, "transforms", transforms)

    @property
    @pulumi.getter(name="odataType")
    def odata_type(self) -> str:
        return pulumi.get(self, "odata_type")

    @property
    @pulumi.getter
    def operator(self) -> str:
        """
        Describes operator to be matched
        """
        return pulumi.get(self, "operator")

    @property
    @pulumi.getter(name="matchValues")
    def match_values(self) -> Optional[Sequence[str]]:
        """
        The match value for the condition of the delivery rule
        """
        return pulumi.get(self, "match_values")

    @property
    @pulumi.getter(name="negateCondition")
    def negate_condition(self) -> Optional[bool]:
        """
        Describes if this is negate condition or not
        """
        return pulumi.get(self, "negate_condition")

    @property
    @pulumi.getter
    def transforms(self) -> Optional[Sequence[str]]:
        """
        List of transforms
        """
        return pulumi.get(self, "transforms")

    def _translate_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop


@pulumi.output_type
class UrlPathMatchConditionParametersResponse(dict):
    """
    Defines the parameters for UrlPath match conditions
    """
    def __init__(__self__, *,
                 odata_type: str,
                 operator: str,
                 match_values: Optional[Sequence[str]] = None,
                 negate_condition: Optional[bool] = None,
                 transforms: Optional[Sequence[str]] = None):
        """
        Defines the parameters for UrlPath match conditions
        :param str operator: Describes operator to be matched
        :param Sequence[str] match_values: The match value for the condition of the delivery rule
        :param bool negate_condition: Describes if this is negate condition or not
        :param Sequence[str] transforms: List of transforms
        """
        pulumi.set(__self__, "odata_type", odata_type)
        pulumi.set(__self__, "operator", operator)
        if match_values is not None:
            pulumi.set(__self__, "match_values", match_values)
        if negate_condition is not None:
            pulumi.set(__self__, "negate_condition", negate_condition)
        if transforms is not None:
            pulumi.set(__self__, "transforms", transforms)

    @property
    @pulumi.getter(name="odataType")
    def odata_type(self) -> str:
        return pulumi.get(self, "odata_type")

    @property
    @pulumi.getter
    def operator(self) -> str:
        """
        Describes operator to be matched
        """
        return pulumi.get(self, "operator")

    @property
    @pulumi.getter(name="matchValues")
    def match_values(self) -> Optional[Sequence[str]]:
        """
        The match value for the condition of the delivery rule
        """
        return pulumi.get(self, "match_values")

    @property
    @pulumi.getter(name="negateCondition")
    def negate_condition(self) -> Optional[bool]:
        """
        Describes if this is negate condition or not
        """
        return pulumi.get(self, "negate_condition")

    @property
    @pulumi.getter
    def transforms(self) -> Optional[Sequence[str]]:
        """
        List of transforms
        """
        return pulumi.get(self, "transforms")

    def _translate_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop


@pulumi.output_type
class UrlRedirectActionParametersResponse(dict):
    """
    Defines the parameters for the url redirect action.
    """
    def __init__(__self__, *,
                 odata_type: str,
                 redirect_type: str,
                 custom_fragment: Optional[str] = None,
                 custom_hostname: Optional[str] = None,
                 custom_path: Optional[str] = None,
                 custom_query_string: Optional[str] = None,
                 destination_protocol: Optional[str] = None):
        """
        Defines the parameters for the url redirect action.
        :param str redirect_type: The redirect type the rule will use when redirecting traffic.
        :param str custom_fragment: Fragment to add to the redirect URL. Fragment is the part of the URL that comes after #. Do not include the #.
        :param str custom_hostname: Host to redirect. Leave empty to use the incoming host as the destination host.
        :param str custom_path: The full path to redirect. Path cannot be empty and must start with /. Leave empty to use the incoming path as destination path.
        :param str custom_query_string: The set of query strings to be placed in the redirect URL. Setting this value would replace any existing query string; leave empty to preserve the incoming query string. Query string must be in <key>=<value> format. ? and & will be added automatically so do not include them.
        :param str destination_protocol: Protocol to use for the redirect. The default value is MatchRequest
        """
        pulumi.set(__self__, "odata_type", odata_type)
        pulumi.set(__self__, "redirect_type", redirect_type)
        if custom_fragment is not None:
            pulumi.set(__self__, "custom_fragment", custom_fragment)
        if custom_hostname is not None:
            pulumi.set(__self__, "custom_hostname", custom_hostname)
        if custom_path is not None:
            pulumi.set(__self__, "custom_path", custom_path)
        if custom_query_string is not None:
            pulumi.set(__self__, "custom_query_string", custom_query_string)
        if destination_protocol is not None:
            pulumi.set(__self__, "destination_protocol", destination_protocol)

    @property
    @pulumi.getter(name="odataType")
    def odata_type(self) -> str:
        return pulumi.get(self, "odata_type")

    @property
    @pulumi.getter(name="redirectType")
    def redirect_type(self) -> str:
        """
        The redirect type the rule will use when redirecting traffic.
        """
        return pulumi.get(self, "redirect_type")

    @property
    @pulumi.getter(name="customFragment")
    def custom_fragment(self) -> Optional[str]:
        """
        Fragment to add to the redirect URL. Fragment is the part of the URL that comes after #. Do not include the #.
        """
        return pulumi.get(self, "custom_fragment")

    @property
    @pulumi.getter(name="customHostname")
    def custom_hostname(self) -> Optional[str]:
        """
        Host to redirect. Leave empty to use the incoming host as the destination host.
        """
        return pulumi.get(self, "custom_hostname")

    @property
    @pulumi.getter(name="customPath")
    def custom_path(self) -> Optional[str]:
        """
        The full path to redirect. Path cannot be empty and must start with /. Leave empty to use the incoming path as destination path.
        """
        return pulumi.get(self, "custom_path")

    @property
    @pulumi.getter(name="customQueryString")
    def custom_query_string(self) -> Optional[str]:
        """
        The set of query strings to be placed in the redirect URL. Setting this value would replace any existing query string; leave empty to preserve the incoming query string. Query string must be in <key>=<value> format. ? and & will be added automatically so do not include them.
        """
        return pulumi.get(self, "custom_query_string")

    @property
    @pulumi.getter(name="destinationProtocol")
    def destination_protocol(self) -> Optional[str]:
        """
        Protocol to use for the redirect. The default value is MatchRequest
        """
        return pulumi.get(self, "destination_protocol")

    def _translate_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop


@pulumi.output_type
class UrlRedirectActionResponse(dict):
    """
    Defines the url redirect action for the delivery rule.
    """
    def __init__(__self__, *,
                 name: str,
                 parameters: 'outputs.UrlRedirectActionParametersResponse'):
        """
        Defines the url redirect action for the delivery rule.
        :param str name: The name of the action for the delivery rule.
        :param 'UrlRedirectActionParametersResponseArgs' parameters: Defines the parameters for the action.
        """
        pulumi.set(__self__, "name", 'UrlRedirect')
        pulumi.set(__self__, "parameters", parameters)

    @property
    @pulumi.getter
    def name(self) -> str:
        """
        The name of the action for the delivery rule.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter
    def parameters(self) -> 'outputs.UrlRedirectActionParametersResponse':
        """
        Defines the parameters for the action.
        """
        return pulumi.get(self, "parameters")

    def _translate_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop


@pulumi.output_type
class UrlRewriteActionParametersResponse(dict):
    """
    Defines the parameters for the url rewrite action.
    """
    def __init__(__self__, *,
                 destination: str,
                 odata_type: str,
                 source_pattern: str,
                 preserve_unmatched_path: Optional[bool] = None):
        """
        Defines the parameters for the url rewrite action.
        :param str destination: Define the relative URL to which the above requests will be rewritten by.
        :param str source_pattern: define a request URI pattern that identifies the type of requests that may be rewritten. If value is blank, all strings are matched.
        :param bool preserve_unmatched_path: Whether to preserve unmatched path. Default value is true.
        """
        pulumi.set(__self__, "destination", destination)
        pulumi.set(__self__, "odata_type", odata_type)
        pulumi.set(__self__, "source_pattern", source_pattern)
        if preserve_unmatched_path is not None:
            pulumi.set(__self__, "preserve_unmatched_path", preserve_unmatched_path)

    @property
    @pulumi.getter
    def destination(self) -> str:
        """
        Define the relative URL to which the above requests will be rewritten by.
        """
        return pulumi.get(self, "destination")

    @property
    @pulumi.getter(name="odataType")
    def odata_type(self) -> str:
        return pulumi.get(self, "odata_type")

    @property
    @pulumi.getter(name="sourcePattern")
    def source_pattern(self) -> str:
        """
        define a request URI pattern that identifies the type of requests that may be rewritten. If value is blank, all strings are matched.
        """
        return pulumi.get(self, "source_pattern")

    @property
    @pulumi.getter(name="preserveUnmatchedPath")
    def preserve_unmatched_path(self) -> Optional[bool]:
        """
        Whether to preserve unmatched path. Default value is true.
        """
        return pulumi.get(self, "preserve_unmatched_path")

    def _translate_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop


@pulumi.output_type
class UrlRewriteActionResponse(dict):
    """
    Defines the url rewrite action for the delivery rule.
    """
    def __init__(__self__, *,
                 name: str,
                 parameters: 'outputs.UrlRewriteActionParametersResponse'):
        """
        Defines the url rewrite action for the delivery rule.
        :param str name: The name of the action for the delivery rule.
        :param 'UrlRewriteActionParametersResponseArgs' parameters: Defines the parameters for the action.
        """
        pulumi.set(__self__, "name", 'UrlRewrite')
        pulumi.set(__self__, "parameters", parameters)

    @property
    @pulumi.getter
    def name(self) -> str:
        """
        The name of the action for the delivery rule.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter
    def parameters(self) -> 'outputs.UrlRewriteActionParametersResponse':
        """
        Defines the parameters for the action.
        """
        return pulumi.get(self, "parameters")

    def _translate_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop