hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
3f3ecef6808c96fdb3c1da71c5c7f99ffaa8290c | 178 | py | Python | profit/sur/gp/__init__.py | krystophny/unsur | 6f1f6faebae640dbbd724db42fb6343f884face6 | [
"MIT"
] | 2 | 2019-09-08T20:52:17.000Z | 2019-11-05T15:36:06.000Z | profit/sur/gp/__init__.py | krystophny/redmod | cfef0aa4f1498615c8ad3c814e939b6ab78abff6 | [
"MIT"
] | 3 | 2019-07-26T08:08:50.000Z | 2019-07-26T09:17:03.000Z | profit/sur/gp/__init__.py | krystophny/redmod | cfef0aa4f1498615c8ad3c814e939b6ab78abff6 | [
"MIT"
] | null | null | null | from .gaussian_process import GaussianProcess
from .gpy_surrogate import GPySurrogate
from .custom_surrogate import GPSurrogate
from .sklearn_surrogate import SklearnGPSurrogate
| 35.6 | 49 | 0.88764 | 20 | 178 | 7.7 | 0.6 | 0.292208 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089888 | 178 | 4 | 50 | 44.5 | 0.950617 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f5058494a5507508b32b407fa96490f390a0001 | 45 | py | Python | spirl/configs/rl/maze/SAC/conf.py | kouroshHakha/fist | 328c098789239fd892e17edefd799fc1957ab637 | [
"BSD-3-Clause"
] | 8 | 2021-10-14T03:14:23.000Z | 2022-03-15T21:31:17.000Z | spirl/configs/rl/maze/SAC/conf.py | kouroshHakha/fist | 328c098789239fd892e17edefd799fc1957ab637 | [
"BSD-3-Clause"
] | null | null | null | spirl/configs/rl/maze/SAC/conf.py | kouroshHakha/fist | 328c098789239fd892e17edefd799fc1957ab637 | [
"BSD-3-Clause"
] | 1 | 2021-09-13T20:42:28.000Z | 2021-09-13T20:42:28.000Z | from spirl.configs.rl.maze.base_conf import * | 45 | 45 | 0.822222 | 8 | 45 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 45 | 1 | 45 | 45 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f506cfdc03e638cbbccb2066a6a7d2789299308 | 218 | py | Python | applications/users/functions.py | diegovasconcelo/FusionAI | 7dbd1c7c0465e742ed1d4a2604244abbfccbea59 | [
"MIT"
] | null | null | null | applications/users/functions.py | diegovasconcelo/FusionAI | 7dbd1c7c0465e742ed1d4a2604244abbfccbea59 | [
"MIT"
] | null | null | null | applications/users/functions.py | diegovasconcelo/FusionAI | 7dbd1c7c0465e742ed1d4a2604244abbfccbea59 | [
"MIT"
] | null | null | null | # Extra functions
import random
import string
# Generate a Code for register
def code_generator(size = 6, chars = string.ascii_uppercase + string.digits):
return ''.join(random.choice(chars) for _ in range(size))
| 27.25 | 77 | 0.752294 | 31 | 218 | 5.193548 | 0.741935 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005405 | 0.151376 | 218 | 7 | 78 | 31.142857 | 0.864865 | 0.201835 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
3f531a5c6d8a2bf4146ee2eac9364b1696c84174 | 117 | py | Python | models/cls/__init__.py | luost26/Equivariant-OrientedMP | 597f9c4ace953929e5eefef84e4c840d6636b818 | [
"MIT"
] | 5 | 2022-03-26T07:08:21.000Z | 2022-03-31T12:23:40.000Z | models/cls/__init__.py | luost26/Equivariant-OrientedMP | 597f9c4ace953929e5eefef84e4c840d6636b818 | [
"MIT"
] | null | null | null | models/cls/__init__.py | luost26/Equivariant-OrientedMP | 597f9c4ace953929e5eefef84e4c840d6636b818 | [
"MIT"
] | null | null | null | from ._registry import get_model
from .oriented_dgcnn import OrientedDGCNN
from .oriented_rscnn import OrientedRSCNN
| 29.25 | 41 | 0.871795 | 15 | 117 | 6.533333 | 0.666667 | 0.244898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 117 | 3 | 42 | 39 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f601ccc708ab7ed40ec44841db36dc37573f65b | 325 | py | Python | lpdn/__init__.py | marcoancona/LPDN | 095336227b6c56462a251e11952656c10c3f8bf3 | [
"MIT"
] | 9 | 2019-07-01T06:18:19.000Z | 2022-03-05T06:41:45.000Z | lpdn/__init__.py | marcoancona/LPDN | 095336227b6c56462a251e11952656c10c3f8bf3 | [
"MIT"
] | null | null | null | lpdn/__init__.py | marcoancona/LPDN | 095336227b6c56462a251e11952656c10c3f8bf3 | [
"MIT"
] | 6 | 2019-03-01T14:15:21.000Z | 2020-11-19T20:01:00.000Z | from .utils.conversion import convert_to_lpdn
from .layers.activation import filter_activation
from .layers.pooling import LPMaxPooling2D, LPAveragePooling1D
from .layers.dense import LPDense
from .layers.convolution import LPConv1D, LPConv2D
from .layers.activation import LPActivation
from .layers.reshape import LPFlatten
| 40.625 | 62 | 0.858462 | 40 | 325 | 6.9 | 0.525 | 0.217391 | 0.144928 | 0.188406 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013559 | 0.092308 | 325 | 7 | 63 | 46.428571 | 0.922034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f68f1d780469ba9030cd784d0e5991e108b6d11 | 172 | py | Python | src/iert_news/admin.py | MetricsGroup/IERT-Webapp | 9e43f1775767412898f9340b9cc84196eb4abfdb | [
"MIT"
] | 3 | 2019-04-25T11:19:22.000Z | 2020-05-10T20:41:12.000Z | src/iert_news/admin.py | MetricsGroup/IERT-Webapp | 9e43f1775767412898f9340b9cc84196eb4abfdb | [
"MIT"
] | 5 | 2020-06-17T05:16:27.000Z | 2022-01-13T02:15:56.000Z | src/iert_news/admin.py | MetricsGroup/IERT-Webapp | 9e43f1775767412898f9340b9cc84196eb4abfdb | [
"MIT"
] | 3 | 2020-06-13T10:40:27.000Z | 2021-10-13T15:45:50.000Z | from django.contrib import admin
from .models import new, new_by_viewer, Comment
admin.site.register(new)
admin.site.register(new_by_viewer)
admin.site.register(Comment)
| 21.5 | 47 | 0.819767 | 27 | 172 | 5.074074 | 0.444444 | 0.19708 | 0.372263 | 0.291971 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087209 | 172 | 7 | 48 | 24.571429 | 0.872611 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
58bd338d9226ad08c33ce98e31ea266aa0ab0cd2 | 132,475 | py | Python | tests/test_core_network.py | tkahng/genet | d5c29ed9e44408b60f55d8de889d7430debc9f04 | [
"MIT"
] | 1 | 2021-03-09T20:21:26.000Z | 2021-03-09T20:21:26.000Z | tests/test_core_network.py | tkahng/genet | d5c29ed9e44408b60f55d8de889d7430debc9f04 | [
"MIT"
] | null | null | null | tests/test_core_network.py | tkahng/genet | d5c29ed9e44408b60f55d8de889d7430debc9f04 | [
"MIT"
] | null | null | null | import ast
import json
import os
import sys
import uuid
import lxml
import networkx as nx
import pandas as pd
import geopandas as gpd
import pytest
from pandas.testing import assert_frame_equal, assert_series_equal
from shapely.geometry import LineString, Polygon, Point
from genet.core import Network
from genet.inputs_handler import matsim_reader
from tests.test_outputs_handler_matsim_xml_writer import network_dtd, schedule_dtd
from genet.schedule_elements import Route, Service, Schedule
from genet.utils import plot, spatial
from genet.inputs_handler import read
from tests.fixtures import assert_semantically_equal, route, stop_epsg_27700, network_object_from_test_data, \
full_fat_default_config_path, correct_schedule, vehicle_definitions_config_path
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
pt2matsim_network_test_file = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "matsim", "network.xml"))
pt2matsim_schedule_file = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "matsim", "schedule.xml"))
puma_network_test_file = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "puma", "network.xml"))
puma_schedule_test_file = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "puma", "schedule.xml"))
simplified_network = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "simplified_network", "network.xml"))
simplified_schedule = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "simplified_network", "schedule.xml"))
network_link_attrib_text_missing = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "matsim", "network_link_attrib_text_missing.xml"))
@pytest.fixture()
def network1():
n1 = Network('epsg:27700')
n1.add_node('101982',
{'id': '101982',
'x': '528704.1425925883',
'y': '182068.78193707118',
'lon': -0.14625948709424305,
'lat': 51.52287873323954,
's2_id': 5221390329378179879})
n1.add_node('101986',
{'id': '101986',
'x': '528835.203274008',
'y': '182006.27331298392',
'lon': -0.14439428709377497,
'lat': 51.52228713323965,
's2_id': 5221390328605860387})
n1.add_link('0', '101982', '101986',
attribs={'id': '0',
'from': '101982',
'to': '101986',
'freespeed': 4.166666666666667,
'capacity': 600.0,
'permlanes': 1.0,
'oneway': '1',
'modes': ['car'],
's2_from': 5221390329378179879,
's2_to': 5221390328605860387,
'length': 52.765151087870265,
'attributes': {'osm:way:access': {'name': 'osm:way:access',
'class': 'java.lang.String',
'text': 'permissive'},
'osm:way:highway': {'name': 'osm:way:highway',
'class': 'java.lang.String',
'text': 'unclassified'},
'osm:way:id': {'name': 'osm:way:id',
'class': 'java.lang.Long',
'text': '26997928'},
'osm:way:name': {'name': 'osm:way:name',
'class': 'java.lang.String',
'text': 'Brunswick Place'}}})
return n1
@pytest.fixture()
def network2():
n2 = Network('epsg:4326')
n2.add_node('101982',
{'id': '101982',
'x': -0.14625948709424305,
'y': 51.52287873323954,
'lon': -0.14625948709424305,
'lat': 51.52287873323954,
's2_id': 5221390329378179879})
n2.add_node('101990',
{'id': '101990',
'x': -0.14770188709624754,
'y': 51.5205729332399,
'lon': -0.14770188709624754,
'lat': 51.5205729332399,
's2_id': 5221390304444511271})
n2.add_link('0', '101982', '101990',
attribs={'id': '0',
'from': '101982',
'to': '101990',
'freespeed': 4.166666666666667,
'capacity': 600.0,
'permlanes': 1.0,
'oneway': '1',
'modes': ['car'],
's2_from': 5221390329378179879,
's2_to': 5221390304444511271,
'length': 52.765151087870265,
'attributes': {'osm:way:access': {'name': 'osm:way:access',
'class': 'java.lang.String',
'text': 'permissive'},
'osm:way:highway': {'name': 'osm:way:highway',
'class': 'java.lang.String',
'text': 'unclassified'},
'osm:way:id': {'name': 'osm:way:id',
'class': 'java.lang.Long',
'text': '26997928'},
'osm:way:name': {'name': 'osm:way:name',
'class': 'java.lang.String',
'text': 'Brunswick Place'}}})
return n2
def test_network_graph_initiates_as_not_simplififed():
n = Network('epsg:27700')
assert not n.graph.graph['simplified']
def test__repr__shows_graph_info_and_schedule_info():
n = Network('epsg:4326')
assert 'instance at' in n.__repr__()
assert 'graph' in n.__repr__()
assert 'schedule' in n.__repr__()
def test__str__shows_info():
n = Network('epsg:4326')
assert 'Graph info' in n.__str__()
assert 'Schedule info' in n.__str__()
def test_reproject_changes_x_y_values_for_all_nodes(network1):
network1.reproject('epsg:4326')
nodes = dict(network1.nodes())
correct_nodes = {
'101982': {'id': '101982', 'x': -0.14625948709424305, 'y': 51.52287873323954, 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879},
'101986': {'id': '101986', 'x': -0.14439428709377497, 'y': 51.52228713323965, 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387}}
target_change_log = pd.DataFrame(
{'timestamp': {3: '2020-07-09 19:50:51', 4: '2020-07-09 19:50:51'}, 'change_event': {3: 'modify', 4: 'modify'},
'object_type': {3: 'node', 4: 'node'}, 'old_id': {3: '101982', 4: '101986'},
'new_id': {3: '101982', 4: '101986'}, 'old_attributes': {
3: "{'id': '101982', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}",
4: "{'id': '101986', 'x': '528835.203274008', 'y': '182006.27331298392', 'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387}"},
'new_attributes': {
3: "{'id': '101982', 'x': -0.14625948709424305, 'y': 51.52287873323954, 'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}",
4: "{'id': '101986', 'x': -0.14439428709377497, 'y': 51.52228713323965, 'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387}"},
'diff': {3: [('change', 'x', ('528704.1425925883', -0.14625948709424305)),
('change', 'y', ('182068.78193707118', 51.52287873323954))],
4: [('change', 'x', ('528835.203274008', -0.14439428709377497)),
('change', 'y', ('182006.27331298392', 51.52228713323965))]}}
)
assert_semantically_equal(nodes, correct_nodes)
for i in [3, 4]:
assert_semantically_equal(ast.literal_eval(target_change_log.loc[i, 'old_attributes']),
ast.literal_eval(network1.change_log.loc[i, 'old_attributes']))
assert_semantically_equal(ast.literal_eval(target_change_log.loc[i, 'new_attributes']),
ast.literal_eval(network1.change_log.loc[i, 'new_attributes']))
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'diff']
assert_frame_equal(network1.change_log[cols_to_compare].tail(2), target_change_log[cols_to_compare],
check_dtype=False)
def test_reproject_delegates_reprojection_to_schedules_own_method(network1, route, mocker):
mocker.patch.object(Schedule, 'reproject')
network1.schedule = Schedule(epsg='epsg:27700', services=[Service(id='id', routes=[route])])
network1.reproject('epsg:4326')
network1.schedule.reproject.assert_called_once_with('epsg:4326', 1)
def test_reproject_updates_graph_crs(network1):
network1.reproject('epsg:4326')
assert network1.graph.graph['crs'] == {'init': 'epsg:4326'}
def test_reprojecting_links_with_geometries():
n = Network('epsg:27700')
n.add_nodes({'A': {'x': -82514.72274, 'y': 220772.02798},
'B': {'x': -82769.25894, 'y': 220773.0637}})
n.add_links({'1': {'from': 'A', 'to': 'B',
'geometry': LineString([(-82514.72274, 220772.02798),
(-82546.23894, 220772.88254),
(-82571.87107, 220772.53339),
(-82594.92709, 220770.68385),
(-82625.33255, 220770.45579),
(-82631.26842, 220770.40158),
(-82669.7309, 220770.04349),
(-82727.94946, 220770.79793),
(-82757.38528, 220771.75412),
(-82761.82425, 220771.95614),
(-82769.25894, 220773.0637)])}})
n.reproject('epsg:2157')
geometry_coords = list(n.link('1')['geometry'].coords)
assert round(geometry_coords[0][0], 7) == 532006.5605980
assert round(geometry_coords[0][1], 7) == 547653.3751768
assert round(geometry_coords[-1][0], 7) == 531753.4315189
assert round(geometry_coords[-1][1], 7) == 547633.5224837
def test_adding_the_same_networks():
n_left = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_right = Network('epsg:27700')
n_right.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_right.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_right.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_left.add(n_right)
assert_semantically_equal(dict(n_left.nodes()), {
'1': {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118, 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879},
'2': {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392, 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387}})
assert_semantically_equal(dict(n_left.links()), {'1': {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}})
def test_adding_the_same_networks_but_with_differing_projections():
n_left = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_right = Network('epsg:27700')
n_right.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_right.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_right.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_right.reproject('epsg:4326')
n_left.add(n_right)
assert_semantically_equal(dict(n_left.nodes()), {
'1': {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118, 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879},
'2': {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392, 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387}})
assert_semantically_equal(dict(n_left.links()), {'1': {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}})
def test_adding_networks_with_clashing_node_ids():
n_left = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_right = Network('epsg:27700')
n_right.add_node('10', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_right.add_node('20', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_right.add_link('1', '10', '20', attribs={'modes': ['walk']})
n_left.add(n_right)
assert_semantically_equal(dict(n_left.nodes()), {
'1': {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118, 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879},
'2': {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392, 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387}})
assert_semantically_equal(dict(n_left.links()), {'1': {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}})
def test_adding_networks_with_clashing_link_ids():
n_left = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_right = Network('epsg:27700')
n_right.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_right.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_right.add_link('10', '1', '2', attribs={'modes': ['walk']})
n_left.add(n_right)
assert_semantically_equal(dict(n_left.nodes()), {
'1': {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118, 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879},
'2': {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392, 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387}})
assert_semantically_equal(dict(n_left.links()), {'1': {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}})
def test_adding_networks_with_clashing_multiindices():
n_left = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', 0, attribs={'modes': ['walk']})
n_right = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', 0, attribs={'modes': ['walk', 'bike']})
n_left.add(n_right)
assert len(list(n_left.nodes())) == 2
assert n_left.node('1') == {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}
assert n_left.node('2') == {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387}
assert len(n_left.link_id_mapping) == 2
assert n_left.link('1') == {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}
assert n_left.graph['1']['2'][0] == {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}
def test_adding_disjoint_networks_with_unique_ids():
n_left = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_right = Network('epsg:27700')
n_right.add_node('10', {'id': '1', 'x': 1, 'y': 1,
'lon': 1, 'lat': 1, 's2_id': 1})
n_right.add_node('20', {'id': '2', 'x': 1, 'y': 1,
'lon': 1, 'lat': 1, 's2_id': 2})
n_right.add_link('100', '10', '20', attribs={'modes': ['walk']})
n_left.add(n_right)
assert_semantically_equal(dict(n_left.nodes()), {'10': {'id': '1', 'x': 1, 'y': 1, 'lon': 1, 'lat': 1, 's2_id': 1},
'20': {'id': '2', 'x': 1, 'y': 1, 'lon': 1, 'lat': 1, 's2_id': 2},
'1': {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954,
's2_id': 5221390329378179879},
'2': {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965,
's2_id': 5221390328605860387}})
assert_semantically_equal(dict(n_left.links()), {'100': {'modes': ['walk'], 'from': '10', 'to': '20', 'id': '100'},
'1': {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}})
def test_adding_disjoint_networks_with_clashing_ids():
n_left = Network('epsg:27700')
n_left.add_node('1', {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879})
n_left.add_node('2', {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387})
n_left.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_right = Network('epsg:27700')
n_right.add_node('1', {'id': '1', 'x': 1, 'y': 1,
'lon': 1, 'lat': 1, 's2_id': 1})
n_right.add_node('2', {'id': '2', 'x': 1, 'y': 1,
'lon': 1, 'lat': 1, 's2_id': 2})
n_right.add_link('1', '1', '2', attribs={'modes': ['walk']})
n_left.add(n_right)
assert len(list(n_left.nodes())) == 4
assert n_left.node('1') == {'id': '1', 'x': 528704.1425925883, 'y': 182068.78193707118,
'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}
assert n_left.node('2') == {'id': '2', 'x': 528835.203274008, 'y': 182006.27331298392,
'lon': -0.14439428709377497, 'lat': 51.52228713323965, 's2_id': 5221390328605860387}
assert len(n_left.link_id_mapping) == 2
assert n_left.link('1') == {'modes': ['walk'], 'from': '1', 'to': '2', 'id': '1'}
def test_adding_simplified_network_and_not_throws_error():
n = Network('epsg:2770')
m = Network('epsg:2770')
m.graph.graph['simplified'] = True
with pytest.raises(RuntimeError) as error_info:
n.add(m)
assert "cannot add" in str(error_info.value)
def test_print_shows_info(mocker):
mocker.patch.object(Network, 'info')
n = Network('epsg:27700')
n.print()
n.info.assert_called_once()
def test_plot_delegates_to_util_plot_plot_graph_routes(mocker):
mocker.patch.object(plot, 'plot_graph_routes')
n = Network('epsg:27700')
n.plot()
plot.plot_graph_routes.assert_called_once()
def test_plot_graph_delegates_to_util_plot_plot_graph(mocker):
mocker.patch.object(plot, 'plot_graph')
n = Network('epsg:27700')
n.plot_graph()
plot.plot_graph.assert_called_once()
def test_plot_schedule_delegates_to_util_plot_plot_non_routed_schedule_graph(mocker, network_object_from_test_data):
mocker.patch.object(plot, 'plot_non_routed_schedule_graph')
n = network_object_from_test_data
n.plot_schedule()
plot.plot_non_routed_schedule_graph.assert_called_once()
def test_attempt_to_simplify_already_simplified_network_throws_error():
n = Network('epsg:27700')
n.graph.graph["simplified"] = True
with pytest.raises(RuntimeError) as error_info:
n.simplify()
assert "cannot simplify" in str(error_info.value)
def test_simplifing_puma_network_results_in_correct_record_of_removed_links_and_expected_graph_data():
n = read.read_matsim(path_to_network=puma_network_test_file, epsg='epsg:27700',
path_to_schedule=puma_schedule_test_file)
link_ids_pre_simplify = set(dict(n.links()).keys())
n.simplify()
assert n.is_simplified()
link_ids_post_simplify = set(dict(n.links()).keys())
assert link_ids_post_simplify & link_ids_pre_simplify
new_links = link_ids_post_simplify - link_ids_pre_simplify
deleted_links = link_ids_pre_simplify - link_ids_post_simplify
assert set(n.link_simplification_map.keys()) == deleted_links
assert set(n.link_simplification_map.values()) == new_links
assert (set(n.link_id_mapping.keys()) & new_links) == new_links
report = n.generate_validation_report()
assert report['routing']['services_have_routes_in_the_graph']
assert report['schedule']['schedule_level']['is_valid_schedule']
def test_simplified_network_saves_to_correct_dtds(tmpdir, network_dtd, schedule_dtd):
n = read.read_matsim(path_to_network=puma_network_test_file, epsg='epsg:27700',
path_to_schedule=puma_schedule_test_file)
n.simplify()
n.write_to_matsim(tmpdir)
generated_network_file_path = os.path.join(tmpdir, 'network.xml')
xml_obj = lxml.etree.parse(generated_network_file_path)
assert network_dtd.validate(xml_obj), \
'Doc generated at {} is not valid against DTD due to {}'.format(generated_network_file_path,
network_dtd.error_log.filter_from_errors())
generated_schedule_file_path = os.path.join(tmpdir, 'schedule.xml')
xml_obj = lxml.etree.parse(generated_schedule_file_path)
assert schedule_dtd.validate(xml_obj), \
        'Doc generated at {} is not valid against DTD due to {}'.format(generated_schedule_file_path,
schedule_dtd.error_log.filter_from_errors())
def test_simplifying_network_with_multi_edges_resulting_in_multi_paths():
n = Network('epsg:27700')
n.add_nodes({
'n_-1': {'x': -1, 'y': -1, 's2_id': -1},
'n_0': {'x': 0, 'y': 0, 's2_id': 0},
'n_1': {'x': 1, 'y': 1, 's2_id': 1},
'n_2': {'x': 2, 'y': 2, 's2_id': 2},
'n_3': {'x': 3, 'y': 3, 's2_id': 3},
'n_4': {'x': 4, 'y': 4, 's2_id': 4},
'n_5': {'x': 5, 'y': 5, 's2_id': 5},
'n_6': {'x': 6, 'y': 5, 's2_id': 6},
})
n.add_links({
'l_-1': {'from': 'n_-1', 'to': 'n_1', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_0': {'from': 'n_0', 'to': 'n_1', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_1': {'from': 'n_1', 'to': 'n_2', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_2': {'from': 'n_1', 'to': 'n_2', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_3': {'from': 'n_2', 'to': 'n_3', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_4': {'from': 'n_2', 'to': 'n_3', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_5': {'from': 'n_3', 'to': 'n_4', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_6': {'from': 'n_3', 'to': 'n_4', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_7': {'from': 'n_4', 'to': 'n_5', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}},
'l_8': {'from': 'n_4', 'to': 'n_6', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1,
'modes': {'car'}}
})
n.simplify()
assert set(n.link_simplification_map) == {'l_4', 'l_1', 'l_5', 'l_3', 'l_6', 'l_2'}
def test_reading_back_simplified_network():
    # simplified networks have an additional geometry attribute and some of their attributes are composite,
    # e.g. links now refer to a number of OSM ways, each with a unique id
n = read.read_matsim(path_to_network=simplified_network, epsg='epsg:27700',
path_to_schedule=simplified_schedule)
number_of_simplified_links = 659
links_with_geometry = n.extract_links_on_edge_attributes(conditions={'geometry': lambda x: True})
assert len(links_with_geometry) == number_of_simplified_links
for link in links_with_geometry:
attribs = n.link(link)
        if 'attributes' in attribs:
            assert 'geometry' not in attribs['attributes']
            for k, v in attribs['attributes'].items():
                if isinstance(v['text'], str):
                    assert ',' not in v['text']
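# A minimal sketch (not a test): how a link simplification map of the shape held in
# `Network.link_simplification_map` can be used to trace a pre-simplification link id to
# the id that replaced it. The helper name and the fall-through behaviour for surviving
# links are assumptions for illustration only.
def trace_simplified_link(link_simplification_map, link_id):
    # Links removed by simplification map to their replacement id; links that
    # survived simplification unchanged are returned as-is.
    return link_simplification_map.get(link_id, link_id)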
def test_network_with_missing_link_attribute_elem_text_is_read_and_able_to_save_again(tmpdir):
n = read.read_matsim(path_to_network=network_link_attrib_text_missing, epsg='epsg:27700')
n.write_to_matsim(tmpdir)
def test_node_attribute_data_under_key_returns_correct_pd_series_with_nested_keys():
n = Network('epsg:27700')
n.add_node(1, {'a': {'b': 1}})
n.add_node(2, {'a': {'b': 4}})
output_series = n.node_attribute_data_under_key(key={'a': 'b'})
assert_series_equal(output_series, pd.Series({1: 1, 2: 4}))
def test_node_attribute_data_under_key_returns_correct_pd_series_with_flat_keys():
n = Network('epsg:27700')
n.add_node(1, {'b': 1})
n.add_node(2, {'b': 4})
output_series = n.node_attribute_data_under_key(key='b')
assert_series_equal(output_series, pd.Series({1: 1, 2: 4}))
def test_node_attribute_data_under_keys(network1):
df = network1.node_attribute_data_under_keys(['x', 'y'])
df_to_compare = pd.DataFrame({'x': {'101982': '528704.1425925883', '101986': '528835.203274008'},
'y': {'101982': '182068.78193707118', '101986': '182006.27331298392'}})
assert_frame_equal(df, df_to_compare)
def test_node_attribute_data_under_keys_with_named_index(network1):
df = network1.node_attribute_data_under_keys(['x', 'y'], index_name='index')
assert df.index.name == 'index'
def test_node_attribute_data_under_keys_generates_key_for_nested_data(network1):
network1.add_node('1', {'key': {'nested_value': {'more_nested': 4}}})
df = network1.node_attribute_data_under_keys([{'key': {'nested_value': 'more_nested'}}])
assert isinstance(df, pd.DataFrame)
assert 'key::nested_value::more_nested' in df.columns
def test_node_attribute_data_under_keys_returns_dataframe_with_one_col_if_passed_one_key(network1):
df = network1.node_attribute_data_under_keys(['x'], index_name='index')
assert isinstance(df, pd.DataFrame)
assert len(df.columns) == 1
def test_link_attribute_data_under_key_returns_correct_pd_series_with_nested_keys():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': {'b': 1}})
n.add_link('1', 1, 2, attribs={'a': {'b': 4}})
output_series = n.link_attribute_data_under_key(key={'a': 'b'})
assert_series_equal(output_series, pd.Series({'0': 1, '1': 4}))
def test_link_attribute_data_under_key_returns_correct_pd_series_with_flat_keys():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'b': 1})
n.add_link('1', 1, 2, attribs={'b': 4})
output_series = n.link_attribute_data_under_key(key='b')
assert_series_equal(output_series, pd.Series({'0': 1, '1': 4}))
def test_link_attribute_data_under_keys(network1):
df = network1.link_attribute_data_under_keys(['modes', 'freespeed', 'capacity', 'permlanes'])
df_to_compare = pd.DataFrame({'modes': {'0': ['car']}, 'freespeed': {'0': 4.166666666666667},
'capacity': {'0': 600.0}, 'permlanes': {'0': 1.0}})
assert_frame_equal(df, df_to_compare)
def test_link_attribute_data_under_keys_with_named_index(network1):
df = network1.link_attribute_data_under_keys(['modes', 'freespeed', 'capacity', 'permlanes'], index_name='index')
assert df.index.name == 'index'
def test_link_attribute_data_under_keys_returns_dataframe_with_one_col_if_passed_one_key(network1):
df = network1.link_attribute_data_under_keys(['modes'])
assert isinstance(df, pd.DataFrame)
assert len(df.columns) == 1
def test_link_attribute_data_under_keys_generates_key_for_nested_data(network1):
df = network1.link_attribute_data_under_keys([{'attributes': {'osm:way:access': 'text'}}])
assert isinstance(df, pd.DataFrame)
assert 'attributes::osm:way:access::text' in df.columns
def test_add_node_adds_node_to_graph_with_attribs():
n = Network('epsg:27700')
n.add_node(1, {'a': 1})
assert n.graph.has_node(1)
assert n.node(1) == {'a': 1}
def test_add_node_adds_node_to_graph_without_attribs():
n = Network('epsg:27700')
n.add_node(1)
assert n.node(1) == {}
assert n.graph.has_node(1)
def test_add_multiple_nodes():
n = Network('epsg:27700')
reindexing_dict, actual_nodes_added = n.add_nodes({1: {'x': 1, 'y': 2}, 2: {'x': 2, 'y': 2}})
assert n.graph.has_node(1)
assert n.node(1) == {'x': 1, 'y': 2, 'id': 1}
assert n.graph.has_node(2)
assert n.node(2) == {'x': 2, 'y': 2, 'id': 2}
assert reindexing_dict == {}
def test_add_nodes_with_clashing_ids():
n = Network('epsg:27700')
n.add_node(1, {})
reindexing_dict, actual_nodes_added = n.add_nodes({1: {'x': 1, 'y': 2}, 2: {'x': 2, 'y': 2}})
assert n.graph.has_node(1)
assert n.node(1) == {}
assert n.graph.has_node(2)
assert n.node(2) == {'x': 2, 'y': 2, 'id': 2}
assert 1 in reindexing_dict
assert n.graph.has_node(reindexing_dict[1])
assert n.node(reindexing_dict[1]) == {'x': 1, 'y': 2, 'id': reindexing_dict[1]}
def test_add_nodes_with_multiple_clashing_ids():
n = Network('epsg:27700')
n.add_node(1, {})
n.add_node(2, {})
assert n.graph.has_node(1)
assert n.node(1) == {}
assert n.graph.has_node(2)
assert n.node(2) == {}
reindexing_dict, actual_nodes_added = n.add_nodes({1: {'x': 1, 'y': 2}, 2: {'x': 2, 'y': 2}})
assert 1 in reindexing_dict
assert n.graph.has_node(reindexing_dict[1])
assert n.node(reindexing_dict[1]) == {'x': 1, 'y': 2, 'id': reindexing_dict[1]}
assert 2 in reindexing_dict
assert n.graph.has_node(reindexing_dict[2])
assert n.node(reindexing_dict[2]) == {'x': 2, 'y': 2, 'id': reindexing_dict[2]}
def test_add_edge_generates_a_link_id_and_delegated_to_add_link_id(mocker):
mocker.patch.object(Network, 'add_link')
mocker.patch.object(Network, 'generate_index_for_edge', return_value='12345')
n = Network('epsg:27700')
n.add_edge(1, 2, attribs={'a': 1})
Network.generate_index_for_edge.assert_called_once()
Network.add_link.assert_called_once_with('12345', 1, 2, None, {'a': 1}, False)
def test_add_edge_generates_a_link_id_with_specified_multiidx(mocker):
mocker.patch.object(Network, 'add_link')
mocker.patch.object(Network, 'generate_index_for_edge', return_value='12345')
n = Network('epsg:27700')
n.add_edge(1, 2, multi_edge_idx=10, attribs={'a': 1})
Network.generate_index_for_edge.assert_called_once()
Network.add_link.assert_called_once_with('12345', 1, 2, 10, {'a': 1}, False)
def test_adding_multiple_edges():
n = Network('epsg:27700')
n.add_edges([{'from': 1, 'to': 2}, {'from': 2, 'to': 3}])
assert n.graph.has_edge(1, 2)
assert n.graph.has_edge(2, 3)
assert '0' in n.link_id_mapping
assert '1' in n.link_id_mapping
if n.link_id_mapping['0'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}:
assert n.link_id_mapping['1'] == {'from': 2, 'to': 3, 'multi_edge_idx': 0}
elif n.link_id_mapping['1'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}:
assert n.link_id_mapping['0'] == {'from': 2, 'to': 3, 'multi_edge_idx': 0}
    else:
        raise AssertionError('link_id_mapping does not contain the expected edges')
def test_adding_multiple_edges_between_same_nodes():
n = Network('epsg:27700')
n.add_edges([{'from': 1, 'to': 2}, {'from': 1, 'to': 2}, {'from': 1, 'to': 2}, {'from': 2, 'to': 3}])
assert n.graph.has_edge(1, 2)
assert n.graph.number_of_edges(1, 2) == 3
assert n.graph.has_edge(2, 3)
assert len(n.link_id_mapping) == 4
def test_add_link_adds_edge_to_graph_with_attribs():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
assert n.graph.has_edge(1, 2)
assert '0' in n.link_id_mapping
assert n.edge(1, 2) == {0: {'a': 1, 'from': 1, 'id': '0', 'to': 2}}
def test_add_link_adds_edge_to_graph_without_attribs():
n = Network('epsg:27700')
n.add_link('0', 1, 2)
    assert n.graph.has_edge(1, 2)
assert '0' in n.link_id_mapping
assert n.link_id_mapping['0'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}
def test_adding_multiple_links():
n = Network('epsg:27700')
n.add_links({'0': {'from': 1, 'to': 2}, '1': {'from': 2, 'to': 3}})
assert n.graph.has_edge(1, 2)
assert n.graph.has_edge(2, 3)
assert '0' in n.link_id_mapping
assert '1' in n.link_id_mapping
assert n.link_id_mapping['0'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}
assert n.link_id_mapping['1'] == {'from': 2, 'to': 3, 'multi_edge_idx': 0}
def test_adding_multiple_links_with_id_clashes():
n = Network('epsg:27700')
n.add_link('0', 10, 20)
assert '0' in n.link_id_mapping
reindexing_dict, links_and_attribs = n.add_links({'0': {'from': 1, 'to': 2}, '1': {'from': 2, 'to': 3}})
assert '1' in n.link_id_mapping
assert '0' in reindexing_dict
assert len(n.link_id_mapping) == 3
assert_semantically_equal(links_and_attribs[reindexing_dict['0']], {'from': 1, 'to': 2, 'id': reindexing_dict['0']})
assert_semantically_equal(links_and_attribs['1'], {'from': 2, 'to': 3, 'id': '1'})
def test_adding_multiple_links_with_multiple_id_clashes():
n = Network('epsg:27700')
n.add_link('0', 10, 20)
n.add_link('1', 10, 20)
assert '0' in n.link_id_mapping
assert '1' in n.link_id_mapping
reindexing_dict, links_and_attribs = n.add_links({'0': {'from': 1, 'to': 2}, '1': {'from': 2, 'to': 3}})
assert '0' in reindexing_dict
assert '1' in reindexing_dict
assert len(n.link_id_mapping) == 4
assert_semantically_equal(links_and_attribs[reindexing_dict['0']], {'from': 1, 'to': 2, 'id': reindexing_dict['0']})
assert_semantically_equal(links_and_attribs[reindexing_dict['1']], {'from': 2, 'to': 3, 'id': reindexing_dict['1']})
def test_adding_loads_of_multiple_links_between_same_nodes():
n = Network('epsg:27700')
reindexing_dict, links_and_attribs = n.add_links({i: {'from': 1, 'to': 2} for i in range(10)})
assert_semantically_equal(links_and_attribs, {i: {'from': 1, 'to': 2, 'id': i} for i in range(10)})
assert_semantically_equal(n.link_id_mapping, {i: {'from': 1, 'to': 2, 'multi_edge_idx': i} for i in range(10)})
def test_adding_multiple_links_with_multi_idx_clashes():
n = Network('epsg:27700')
n.add_link('0', 1, 2)
n.add_link('1', 1, 2)
assert '0' in n.link_id_mapping
assert '1' in n.link_id_mapping
n.add_links({'2': {'from': 1, 'to': 2}, '3': {'from': 1, 'to': 2}, '4': {'from': 2, 'to': 3}})
assert n.link_id_mapping['2'] == {'from': 1, 'to': 2, 'multi_edge_idx': 2}
assert n.link_id_mapping['3'] == {'from': 1, 'to': 2, 'multi_edge_idx': 3}
assert n.link_id_mapping['4'] == {'from': 2, 'to': 3, 'multi_edge_idx': 0}
def test_adding_multiple_links_with_id_and_multi_idx_clashes():
n = Network('epsg:27700')
n.add_link('0', 1, 2)
n.add_link('1', 1, 2)
assert '0' in n.link_id_mapping
assert '1' in n.link_id_mapping
reindexing_dict, links_and_attribs = n.add_links(
{'0': {'from': 1, 'to': 2}, '1': {'from': 1, 'to': 2}, '2': {'from': 2, 'to': 3}})
assert '0' in reindexing_dict
assert '1' in reindexing_dict
assert len(n.link_id_mapping) == 5
assert_semantically_equal(n.link_id_mapping[reindexing_dict['0']], {'from': 1, 'to': 2, 'multi_edge_idx': 2})
assert_semantically_equal(n.link_id_mapping[reindexing_dict['1']], {'from': 1, 'to': 2, 'multi_edge_idx': 3})
def test_adding_multiple_links_missing_some_from_nodes():
n = Network('epsg:27700')
with pytest.raises(RuntimeError) as error_info:
n.add_links({'0': {'to': 2}, '1': {'from': 2, 'to': 3}})
assert "You are trying to add links which are missing `from` (origin) nodes" in str(error_info.value)
def test_adding_multiple_links_missing_from_nodes_completely():
n = Network('epsg:27700')
with pytest.raises(RuntimeError) as error_info:
n.add_links({'0': {'to': 2}, '1': {'to': 3}})
assert "You are trying to add links which are missing `from` (origin) nodes" in str(error_info.value)
def test_adding_multiple_links_missing_some_to_nodes():
n = Network('epsg:27700')
with pytest.raises(RuntimeError) as error_info:
n.add_links({'0': {'from': 2}, '1': {'from': 2, 'to': 3}})
assert "You are trying to add links which are missing `to` (destination) nodes" in str(error_info.value)
def test_adding_multiple_links_missing_to_nodes_completely():
n = Network('epsg:27700')
with pytest.raises(RuntimeError) as error_info:
n.add_links({'0': {'from': 2}, '1': {'from': 2}})
assert "You are trying to add links which are missing `to` (destination) nodes" in str(error_info.value)
def test_adding_links_with_different_non_overlapping_attributes():
    # attributes not shared across links are filled with NaN internally; the NaNs are then
    # dropped, so each link keeps only the attributes it was given
n = Network('epsg:27700')
reindexing_dict, links_and_attributes = n.add_links({
'2': {'from': 1, 'to': 2, 'speed': 20},
'3': {'from': 1, 'to': 2, 'capacity': 123},
'4': {'from': 2, 'to': 3, 'modes': [1, 2, 3]}})
assert reindexing_dict == {}
assert_semantically_equal(links_and_attributes, {
'2': {'id': '2', 'from': 1, 'to': 2, 'speed': 20},
'3': {'id': '3', 'from': 1, 'to': 2, 'capacity': 123},
'4': {'id': '4', 'from': 2, 'to': 3, 'modes': [1, 2, 3]}})
def test_adding_multiple_links_to_same_edge_clashing_with_existing_edge():
n = Network('epsg:27700')
n.add_link(link_id='0', u='2', v='2', attribs={'speed': 20})
n.add_links({'1': {'from': '2', 'to': '2', 'something': 20},
'2': {'from': '2', 'to': '2', 'capacity': 123}})
assert_semantically_equal(dict(n.links()), {'0': {'speed': 20, 'from': '2', 'to': '2', 'id': '0'},
'1': {'from': '2', 'to': '2', 'something': 20.0, 'id': '1'},
'2': {'from': '2', 'to': '2', 'capacity': 123.0, 'id': '2'}})
assert_semantically_equal(n.link_id_mapping, {'0': {'from': '2', 'to': '2', 'multi_edge_idx': 0},
'1': {'from': '2', 'to': '2', 'multi_edge_idx': 1},
'2': {'from': '2', 'to': '2', 'multi_edge_idx': 2}})
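# A minimal sketch (not a test): the multi-edge index allocation exercised by the tests
# above. When a link is added between a node pair that already carries edges, the smallest
# unused non-negative integer index is taken. `used_idxs` stands in for the `multi_edge_idx`
# values already recorded for that node pair; the helper itself is hypothetical.
def next_multi_edge_idx(used_idxs):
    # Find the smallest non-negative integer not yet used between the two nodes.
    idx = 0
    while idx in used_idxs:
        idx += 1
    return idx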
def test_network_modal_subgraph_using_general_subgraph_on_link_attribs():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike']})
n.add_link('1', 2, 3, attribs={'modes': ['car']})
n.add_link('2', 2, 3, attribs={'modes': ['bike']})
car_graph = n.subgraph_on_link_conditions(conditions={'modes': 'car'}, mixed_dtypes=True)
assert list(car_graph.edges) == [(1, 2, 0), (2, 3, 0)]
def test_modes():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike']})
n.add_link('1', 2, 3, attribs={'modes': ['car']})
n.add_link('2', 2, 3, attribs={'modes': ['bike']})
n.add_link('3', 2, 3, attribs={})
assert n.modes() == {'car', 'bike'}
def test_network_modal_subgraph_using_specific_modal_subgraph_method_single_mode():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike']})
n.add_link('1', 2, 3, attribs={'modes': ['car']})
n.add_link('2', 2, 3, attribs={'modes': ['bike']})
car_graph = n.modal_subgraph(modes='car')
assert list(car_graph.edges) == [(1, 2, 0), (2, 3, 0)]
def test_network_modal_subgraph_using_specific_modal_subgraph_method_several_modes():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike']})
n.add_link('1', 2, 3, attribs={'modes': ['car']})
n.add_link('2', 2, 3, attribs={'modes': ['bike']})
n.add_link('3', 2, 3, attribs={'modes': ['walk']})
car_bike_graph = n.modal_subgraph(modes=['car', 'bike'])
assert list(car_bike_graph.edges) == [(1, 2, 0), (2, 3, 0), (2, 3, 1)]
def test_links_on_modal_condition():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike']})
n.add_link('1', 2, 3, attribs={'modes': ['car']})
n.add_link('2', 2, 3, attribs={'modes': ['bike']})
n.add_link('3', 2, 3, attribs={'modes': ['walk']})
car_links = n.links_on_modal_condition(modes=['car'])
assert set(car_links) == {'0', '1'}
def test_nodes_on_modal_condition():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike']})
n.add_link('1', 2, 3, attribs={'modes': ['car']})
n.add_link('2', 2, 3, attribs={'modes': ['bike']})
n.add_link('3', 2, 3, attribs={'modes': ['walk']})
car_nodes = n.nodes_on_modal_condition(modes=['car'])
assert set(car_nodes) == {1, 2, 3}
test_geojson = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "test_geojson.geojson"))
def test_nodes_on_spatial_condition_with_geojson(network_object_from_test_data):
network_object_from_test_data.add_node('1', {'id': '1', 'x': 508400, 'y': 162050})
nodes = network_object_from_test_data.nodes_on_spatial_condition(test_geojson)
assert set(nodes) == {'21667818', '25508485'}
def test_nodes_on_spatial_condition_with_shapely_geom(network_object_from_test_data):
region = Polygon([(-0.1487016677856445, 51.52556684350165), (-0.14063358306884766, 51.5255134425896),
(-0.13865947723388672, 51.5228700191647), (-0.14093399047851562, 51.52006622056997),
(-0.1492595672607422, 51.51974577545329), (-0.1508045196533203, 51.52276321095246),
(-0.1487016677856445, 51.52556684350165)])
network_object_from_test_data.add_node('1', {'id': '1', 'x': 508400, 'y': 162050})
nodes = network_object_from_test_data.nodes_on_spatial_condition(region)
assert set(nodes) == {'21667818', '25508485'}
def test_nodes_on_spatial_condition_with_s2_region(network_object_from_test_data):
region = '48761ad04d,48761ad054,48761ad05c,48761ad061,48761ad085,48761ad08c,48761ad094,48761ad09c,48761ad0b,48761ad0d,48761ad0f,48761ad14,48761ad182c,48761ad19c,48761ad1a4,48761ad1ac,48761ad1b4,48761ad1bac,48761ad3d7f,48761ad3dc,48761ad3e4,48761ad3ef,48761ad3f4,48761ad3fc,48761ad41,48761ad43,48761ad5d,48761ad5e4,48761ad5ec,48761ad5fc,48761ad7,48761ad803,48761ad81c,48761ad824,48761ad82c,48761ad9d,48761ad9e4,48761ad9e84,48761ad9fc,48761ada04,48761ada0c,48761b2804,48761b2814,48761b281c,48761b283,48761b2844,48761b284c,48761b2995,48761b29b4,48761b29bc,48761b29d,48761b29f,48761b2a04'
network_object_from_test_data.add_node(
'1', {'id': '1', 'x': 508400, 'y': 162050, 's2_id': spatial.generate_index_s2(51.3472033, 0.4449167)})
nodes = network_object_from_test_data.nodes_on_spatial_condition(region)
assert set(nodes) == {'21667818', '25508485'}
def test_links_on_spatial_condition_with_geojson(network_object_from_test_data):
network_object_from_test_data.add_node('1', {'id': '1', 'x': 508400, 'y': 162050})
network_object_from_test_data.add_link('2', u='21667818', v='1')
links = network_object_from_test_data.links_on_spatial_condition(test_geojson)
assert set(links) == {'1', '2'}
def test_links_on_spatial_condition_with_shapely_geom(network_object_from_test_data):
region = Polygon([(-0.1487016677856445, 51.52556684350165), (-0.14063358306884766, 51.5255134425896),
(-0.13865947723388672, 51.5228700191647), (-0.14093399047851562, 51.52006622056997),
(-0.1492595672607422, 51.51974577545329), (-0.1508045196533203, 51.52276321095246),
(-0.1487016677856445, 51.52556684350165)])
network_object_from_test_data.add_node('1', {'id': '1', 'x': 508400, 'y': 162050})
network_object_from_test_data.add_link('2', u='21667818', v='1')
links = network_object_from_test_data.links_on_spatial_condition(region)
assert set(links) == {'1', '2'}
def test_links_on_spatial_condition_with_s2_region(network_object_from_test_data):
region = '48761ad04d,48761ad054,48761ad05c,48761ad061,48761ad085,48761ad08c,48761ad094,48761ad09c,48761ad0b,48761ad0d,48761ad0f,48761ad14,48761ad182c,48761ad19c,48761ad1a4,48761ad1ac,48761ad1b4,48761ad1bac,48761ad3d7f,48761ad3dc,48761ad3e4,48761ad3ef,48761ad3f4,48761ad3fc,48761ad41,48761ad43,48761ad5d,48761ad5e4,48761ad5ec,48761ad5fc,48761ad7,48761ad803,48761ad81c,48761ad824,48761ad82c,48761ad9d,48761ad9e4,48761ad9e84,48761ad9fc,48761ada04,48761ada0c,48761b2804,48761b2814,48761b281c,48761b283,48761b2844,48761b284c,48761b2995,48761b29b4,48761b29bc,48761b29d,48761b29f,48761b2a04'
network_object_from_test_data.add_node('1', {'id': '1', 'x': 508400, 'y': 162050})
network_object_from_test_data.add_link('2', u='21667818', v='1')
links = network_object_from_test_data.links_on_spatial_condition(region)
assert set(links) == {'1', '2'}
def test_links_on_spatial_condition_with_intersection_and_complex_geometry_that_falls_outside_region(
network_object_from_test_data):
region = Polygon([(-0.1487016677856445, 51.52556684350165), (-0.14063358306884766, 51.5255134425896),
(-0.13865947723388672, 51.5228700191647), (-0.14093399047851562, 51.52006622056997),
(-0.1492595672607422, 51.51974577545329), (-0.1508045196533203, 51.52276321095246),
(-0.1487016677856445, 51.52556684350165)])
network_object_from_test_data.add_link(
'2', u='21667818', v='25508485',
attribs={'geometry': LineString(
[(528504.1342843144, 182155.7435136598), (508400, 162050), (528489.467895946, 182206.20303669578)])})
links = network_object_from_test_data.links_on_spatial_condition(region, how='intersect')
assert set(links) == {'1', '2'}
def test_links_on_spatial_condition_with_containment(network_object_from_test_data):
region = Polygon([(-0.1487016677856445, 51.52556684350165), (-0.14063358306884766, 51.5255134425896),
(-0.13865947723388672, 51.5228700191647), (-0.14093399047851562, 51.52006622056997),
(-0.1492595672607422, 51.51974577545329), (-0.1508045196533203, 51.52276321095246),
(-0.1487016677856445, 51.52556684350165)])
network_object_from_test_data.add_node('1', {'id': '1', 'x': 508400, 'y': 162050})
network_object_from_test_data.add_link('2', u='21667818', v='1')
links = network_object_from_test_data.links_on_spatial_condition(region, how='within')
assert set(links) == {'1'}
def test_links_on_spatial_condition_with_containment_and_complex_geometry_that_falls_outside_region(
network_object_from_test_data):
region = Polygon([(-0.1487016677856445, 51.52556684350165), (-0.14063358306884766, 51.5255134425896),
(-0.13865947723388672, 51.5228700191647), (-0.14093399047851562, 51.52006622056997),
(-0.1492595672607422, 51.51974577545329), (-0.1508045196533203, 51.52276321095246),
(-0.1487016677856445, 51.52556684350165)])
network_object_from_test_data.add_link(
'2', u='21667818', v='25508485',
attribs={'geometry': LineString(
[(528504.1342843144, 182155.7435136598), (508400, 162050), (528489.467895946, 182206.20303669578)])})
links = network_object_from_test_data.links_on_spatial_condition(region, how='within')
assert set(links) == {'1'}
def test_links_on_spatial_condition_with_containment_and_s2_region(network_object_from_test_data):
region = '48761ad04d,48761ad054,48761ad05c,48761ad061,48761ad085,48761ad08c,48761ad094,48761ad09c,48761ad0b,48761ad0d,48761ad0f,48761ad14,48761ad182c,48761ad19c,48761ad1a4,48761ad1ac,48761ad1b4,48761ad1bac,48761ad3d7f,48761ad3dc,48761ad3e4,48761ad3ef,48761ad3f4,48761ad3fc,48761ad41,48761ad43,48761ad5d,48761ad5e4,48761ad5ec,48761ad5fc,48761ad7,48761ad803,48761ad81c,48761ad824,48761ad82c,48761ad9d,48761ad9e4,48761ad9e84,48761ad9fc,48761ada04,48761ada0c,48761b2804,48761b2814,48761b281c,48761b283,48761b2844,48761b284c,48761b2995,48761b29b4,48761b29bc,48761b29d,48761b29f,48761b2a04'
network_object_from_test_data.add_node('1', {'id': '1', 'x': 508400, 'y': 162050})
network_object_from_test_data.add_link('2', u='21667818', v='1')
links = network_object_from_test_data.links_on_spatial_condition(region, how='within')
assert set(links) == {'1'}
def test_links_on_spatial_condition_with_containment_and_complex_geometry_that_falls_outside_s2_region(
network_object_from_test_data):
region = '48761ad04d,48761ad054,48761ad05c,48761ad061,48761ad085,48761ad08c,48761ad094,48761ad09c,48761ad0b,48761ad0d,48761ad0f,48761ad14,48761ad182c,48761ad19c,48761ad1a4,48761ad1ac,48761ad1b4,48761ad1bac,48761ad3d7f,48761ad3dc,48761ad3e4,48761ad3ef,48761ad3f4,48761ad3fc,48761ad41,48761ad43,48761ad5d,48761ad5e4,48761ad5ec,48761ad5fc,48761ad7,48761ad803,48761ad81c,48761ad824,48761ad82c,48761ad9d,48761ad9e4,48761ad9e84,48761ad9fc,48761ada04,48761ada0c,48761b2804,48761b2814,48761b281c,48761b283,48761b2844,48761b284c,48761b2995,48761b29b4,48761b29bc,48761b29d,48761b29f,48761b2a04'
network_object_from_test_data.add_link(
'2', u='21667818', v='25508485',
attribs={'geometry': LineString(
[(528504.1342843144, 182155.7435136598), (508400, 162050), (528489.467895946, 182206.20303669578)])})
links = network_object_from_test_data.links_on_spatial_condition(region, how='within')
assert set(links) == {'1'}
def test_find_shortest_path_when_graph_has_no_extra_edge_choices():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike'], 'length': 1})
n.add_link('1', 2, 3, attribs={'modes': ['car'], 'length': 1})
n.add_link('2', 2, 3, attribs={'modes': ['bike'], 'length': 1})
n.add_link('3', 2, 3, attribs={'modes': ['walk'], 'length': 1})
bike_route = n.find_shortest_path(1, 3, modes='bike')
assert bike_route == ['0', '2']
def test_find_shortest_path_when_subgraph_is_pre_computed():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike'], 'length': 1})
n.add_link('1', 2, 3, attribs={'modes': ['car'], 'length': 1})
n.add_link('2', 2, 3, attribs={'modes': ['bike'], 'length': 1})
n.add_link('3', 2, 3, attribs={'modes': ['walk'], 'length': 1})
bike_g = n.modal_subgraph(modes='bike')
bike_route = n.find_shortest_path(1, 3, subgraph=bike_g)
assert bike_route == ['0', '2']
def test_find_shortest_path_defaults_to_full_graph():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike'], 'length': 1})
n.add_link('1', 2, 3, attribs={'modes': ['car'], 'freespeed': 3})
n.add_link('2', 2, 3, attribs={'modes': ['bike'], 'freespeed': 2})
n.add_link('3', 2, 3, attribs={'modes': ['walk'], 'freespeed': 1})
    route = n.find_shortest_path(1, 3)
    assert route == ['0', '1']
def test_find_shortest_path_when_graph_has_extra_edge_choice_for_freespeed_that_is_obvious():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike'], 'length': 1, 'freespeed': 10})
n.add_link('2', 2, 3, attribs={'modes': ['car', 'bike'], 'length': 1, 'freespeed': 10})
n.add_link('3', 2, 3, attribs={'modes': ['car', 'bike'], 'length': 1, 'freespeed': 1})
bike_route = n.find_shortest_path(1, 3, modes='bike')
assert bike_route == ['0', '2']
def test_find_shortest_path_when_graph_has_extra_edge_choice_with_attractive_mode():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike'], 'length': 1, 'freespeed': 10})
n.add_link('2', 2, 3, attribs={'modes': ['car', 'bike'], 'length': 1, 'freespeed': 10})
n.add_link('3', 2, 3, attribs={'modes': ['bike'], 'length': 1, 'freespeed': 1})
bike_route = n.find_shortest_path(1, 3, modes='bike')
assert bike_route == ['0', '3']
def test_find_shortest_path_and_return_just_nodes():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'modes': ['car', 'bike'], 'length': 1, 'freespeed': 10})
n.add_link('1', 2, 3, attribs={'modes': ['car', 'bike'], 'length': 1, 'freespeed': 10})
    route = n.find_shortest_path(1, 3, return_nodes=True)
    assert route == [1, 2, 3]
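# A minimal sketch (not a test) of the inverse-speed weighting the shortest-path tests
# above rely on: links with higher `freespeed` cost less, so routes prefer them. This is
# an assumption about the weighting, not a copy of the library's implementation; the
# default values used for missing attributes are hypothetical.
def edge_travel_time(attribs):
    # Free-flow travel time proxy: length divided by free-flow speed.
    return attribs.get('length', 1.0) / attribs.get('freespeed', 1.0)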
def test_add_link_adds_link_with_specific_multi_idx():
n = Network('epsg:27700')
n.add_link('0', 1, 2, 0)
assert '0' in n.link_id_mapping
assert n.link_id_mapping['0'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}
assert n.graph[1][2][0] == {'from': 1, 'to': 2, 'id': '0'}
def test_add_link_generates_new_multi_idx_if_already_exists():
n = Network('epsg:27700')
n.add_link('0', 1, 2, 0)
n.add_link('1', 1, 2, 0)
assert '0' in n.link_id_mapping
assert '1' in n.link_id_mapping
assert n.link_id_mapping['0'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}
assert n.graph[1][2][0] == {'from': 1, 'to': 2, 'id': '0'}
assert n.link_id_mapping['1']['multi_edge_idx'] != 0
assert n.graph[1][2][n.link_id_mapping['1']['multi_edge_idx']] == {'from': 1, 'to': 2, 'id': '1'}
def test_reindex_node(network1):
assert [id for id, attribs in network1.nodes()] == ['101982', '101986']
assert [id for id, attribs in network1.links()] == ['0']
assert network1.link('0')['from'] == '101982'
assert network1.link('0')['to'] == '101986'
assert [(from_n, to_n) for from_n, to_n, attribs in network1.edges()] == [('101982', '101986')]
assert network1.link_id_mapping['0']['from'] == '101982'
network1.reindex_node('101982', '007')
assert [id for id, attribs in network1.nodes()] == ['007', '101986']
assert [id for id, attribs in network1.links()] == ['0']
assert network1.link('0')['from'] == '007'
assert network1.link('0')['to'] == '101986'
assert [(from_n, to_n) for from_n, to_n, attribs in network1.edges()] == [('007', '101986')]
assert network1.link_id_mapping['0']['from'] == '007'
correct_change_log_df = pd.DataFrame(
{'timestamp': {3: '2020-06-08 19:39:08', 4: '2020-06-08 19:39:08', 5: '2020-06-08 19:39:08'},
'change_event': {3: 'modify', 4: 'modify', 5: 'modify'}, 'object_type': {3: 'link', 4: 'node', 5: 'node'},
'old_id': {3: '0', 4: '101982', 5: '101982'}, 'new_id': {3: '0', 4: '007', 5: '101982'}, 'old_attributes': {
3: "{'id': '0', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0, 'permlanes': 1.0, 'oneway': '1', 'modes': ['car'], 's2_from': 5221390329378179879, 's2_to': 5221390328605860387, 'length': 52.765151087870265, 'attributes': {'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'}, 'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String', 'text': 'unclassified'}, 'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'}, 'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String', 'text': 'Brunswick Place'}}}",
4: "{'id': '101982', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}",
5: "{'id': '101982', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}"},
'new_attributes': {
3: "{'id': '0', 'from': '007', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0, 'permlanes': 1.0, 'oneway': '1', 'modes': ['car'], 's2_from': 5221390329378179879, 's2_to': 5221390328605860387, 'length': 52.765151087870265, 'attributes': {'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'}, 'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String', 'text': 'unclassified'}, 'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'}, 'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String', 'text': 'Brunswick Place'}}}",
4: "{'id': '007', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}",
5: "{'id': '007', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305, 'lat': 51.52287873323954, 's2_id': 5221390329378179879}"},
'diff': {3: [('change', 'from', ('101982', '007'))],
4: [('change', 'id', ('101982', '007')), ('change', 'id', ('101982', '007'))],
5: [('change', 'id', ('101982', '007'))]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(network1.change_log[cols_to_compare].tail(3), correct_change_log_df[cols_to_compare],
check_names=False,
check_dtype=False)
def test_reindex_node_when_node_id_already_exists(network1):
assert [id for id, attribs in network1.nodes()] == ['101982', '101986']
assert [id for id, attribs in network1.links()] == ['0']
assert network1.link('0')['from'] == '101982'
assert network1.link('0')['to'] == '101986'
assert [(from_n, to_n) for from_n, to_n, attribs in network1.edges()] == [('101982', '101986')]
assert network1.link_id_mapping['0']['from'] == '101982'
network1.reindex_node('101982', '101986')
node_ids = [id for id, attribs in network1.nodes()]
assert '101986' in node_ids
assert '101982' not in node_ids
assert len(set(node_ids)) == 2
assert network1.node(node_ids[0]) != network1.node(node_ids[1])
def test_reindex_link(network1):
assert [id for id, attribs in network1.nodes()] == ['101982', '101986']
assert [id for id, attribs in network1.links()] == ['0']
assert '0' in network1.link_id_mapping
assert network1.link('0')['from'] == '101982'
assert network1.link('0')['to'] == '101986'
assert [(from_n, to_n) for from_n, to_n, attribs in network1.edges()] == [('101982', '101986')]
assert network1.edge('101982', '101986')[0]['id'] == '0'
network1.reindex_link('0', '007')
assert [id for id, attribs in network1.nodes()] == ['101982', '101986']
assert [id for id, attribs in network1.links()] == ['007']
assert '0' not in network1.link_id_mapping
assert '007' in network1.link_id_mapping
assert network1.link('007')['from'] == '101982'
assert network1.link('007')['to'] == '101986'
assert [(from_n, to_n) for from_n, to_n, attribs in network1.edges()] == [('101982', '101986')]
assert network1.edge('101982', '101986')[0]['id'] == '007'
correct_change_log_df = pd.DataFrame(
{'timestamp': {3: '2020-06-08 19:34:48', 4: '2020-06-08 19:34:48'}, 'change_event': {3: 'modify', 4: 'modify'},
'object_type': {3: 'link', 4: 'link'}, 'old_id': {3: '0', 4: '0'}, 'new_id': {3: '007', 4: '0'},
'old_attributes': {
3: "{'id': '0', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0, 'permlanes': 1.0, 'oneway': '1', 'modes': ['car'], 's2_from': 5221390329378179879, 's2_to': 5221390328605860387, 'length': 52.765151087870265, 'attributes': {'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'}, 'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String', 'text': 'unclassified'}, 'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'}, 'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String', 'text': 'Brunswick Place'}}}",
4: "{'id': '0', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0, 'permlanes': 1.0, 'oneway': '1', 'modes': ['car'], 's2_from': 5221390329378179879, 's2_to': 5221390328605860387, 'length': 52.765151087870265, 'attributes': {'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'}, 'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String', 'text': 'unclassified'}, 'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'}, 'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String', 'text': 'Brunswick Place'}}}"},
'new_attributes': {
3: "{'id': '007', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0, 'permlanes': 1.0, 'oneway': '1', 'modes': ['car'], 's2_from': 5221390329378179879, 's2_to': 5221390328605860387, 'length': 52.765151087870265, 'attributes': {'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'}, 'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String', 'text': 'unclassified'}, 'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'}, 'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String', 'text': 'Brunswick Place'}}}",
4: "{'id': '007', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0, 'permlanes': 1.0, 'oneway': '1', 'modes': ['car'], 's2_from': 5221390329378179879, 's2_to': 5221390328605860387, 'length': 52.765151087870265, 'attributes': {'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'}, 'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String', 'text': 'unclassified'}, 'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'}, 'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String', 'text': 'Brunswick Place'}}}"},
'diff': {3: [('change', 'id', ('0', '007')), ('change', 'id', ('0', '007'))],
4: [('change', 'id', ('0', '007'))]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(network1.change_log[cols_to_compare].tail(2), correct_change_log_df[cols_to_compare],
check_names=False, check_dtype=False)
def test_reindex_link_when_link_id_already_exists(network1):
assert [id for id, attribs in network1.nodes()] == ['101982', '101986']
assert [id for id, attribs in network1.links()] == ['0']
assert network1.link('0')['from'] == '101982'
assert network1.link('0')['to'] == '101986'
assert [(from_n, to_n) for from_n, to_n, attribs in network1.edges()] == [('101982', '101986')]
network1.add_link('1', '101986', '101982', attribs={})
network1.reindex_link('0', '1')
link_ids = [id for id, attribs in network1.links()]
assert '1' in link_ids
assert '0' not in link_ids
assert len(set(link_ids)) == 2
assert network1.link(link_ids[0]) != network1.link(link_ids[1])
def test_modify_node_adds_attributes_in_the_graph_and_change_is_recorded_by_change_log():
n = Network('epsg:27700')
n.add_node(1, {'a': 1})
n.apply_attributes_to_node(1, {'b': 1})
assert n.node(1) == {'b': 1, 'a': 1}
correct_change_log_df = pd.DataFrame(
{'timestamp': {0: '2020-05-28 13:49:53', 1: '2020-05-28 13:49:53'}, 'change_event': {0: 'add', 1: 'modify'},
'object_type': {0: 'node', 1: 'node'}, 'old_id': {0: None, 1: 1}, 'new_id': {0: 1, 1: 1},
'old_attributes': {0: None, 1: "{'a': 1}"}, 'new_attributes': {0: "{'a': 1}", 1: "{'a': 1, 'b': 1}"},
'diff': {0: [('add', '', [('a', 1)]), ('add', 'id', 1)], 1: [('add', '', [('b', 1)])]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare], correct_change_log_df[cols_to_compare], check_names=False,
check_dtype=False)
def test_modify_node_overwrites_existing_attributes_in_the_graph_and_change_is_recorded_by_change_log():
n = Network('epsg:27700')
n.add_node(1, {'a': 1})
n.apply_attributes_to_node(1, {'a': 4})
assert n.node(1) == {'a': 4}
correct_change_log_df = pd.DataFrame(
{'timestamp': {0: '2020-05-28 13:49:53', 1: '2020-05-28 13:49:53'}, 'change_event': {0: 'add', 1: 'modify'},
'object_type': {0: 'node', 1: 'node'}, 'old_id': {0: None, 1: 1}, 'new_id': {0: 1, 1: 1},
'old_attributes': {0: None, 1: "{'a': 1}"}, 'new_attributes': {0: "{'a': 1}", 1: "{'a': 4}"},
'diff': {0: [('add', '', [('a', 1)]), ('add', 'id', 1)], 1: [('change', 'a', (1, 4))]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare], correct_change_log_df[cols_to_compare], check_dtype=False)
def test_modify_nodes_adds_and_changes_attributes_in_the_graph_and_change_is_recorded_by_change_log():
n = Network('epsg:27700')
n.add_node(1, {'a': 1})
n.add_node(2, {'b': 1})
n.apply_attributes_to_nodes({1: {'a': 4}, 2: {'a': 1}})
assert n.node(1) == {'a': 4}
assert n.node(2) == {'b': 1, 'a': 1}
correct_change_log_df = pd.DataFrame(
{'timestamp': {0: '2020-06-01 15:07:51', 1: '2020-06-01 15:07:51', 2: '2020-06-01 15:07:51',
3: '2020-06-01 15:07:51'}, 'change_event': {0: 'add', 1: 'add', 2: 'modify', 3: 'modify'},
'object_type': {0: 'node', 1: 'node', 2: 'node', 3: 'node'}, 'old_id': {0: None, 1: None, 2: 1, 3: 2},
'new_id': {0: 1, 1: 2, 2: 1, 3: 2}, 'old_attributes': {0: None, 1: None, 2: "{'a': 1}", 3: "{'b': 1}"},
'new_attributes': {0: "{'a': 1}", 1: "{'b': 1}", 2: "{'a': 4}", 3: "{'b': 1, 'a': 1}"},
'diff': {0: [('add', '', [('a', 1)]), ('add', 'id', 1)], 1: [('add', '', [('b', 1)]), ('add', 'id', 2)],
2: [('change', 'a', (1, 4))], 3: [('add', '', [('a', 1)])]}
})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare], correct_change_log_df[cols_to_compare], check_dtype=False)
def multiply_node_attribs(node_attribs):
return node_attribs['a'] * node_attribs['c']
def test_apply_function_to_nodes():
n = Network('epsg:27700')
n.add_node('0', attribs={'a': 2, 'c': 3})
n.add_node('1', attribs={'c': 100})
n.apply_function_to_nodes(function=multiply_node_attribs, location='new_computed_attrib')
assert_semantically_equal(dict(n.nodes()),
{'0': {'a': 2, 'c': 3, 'new_computed_attrib': 6},
'1': {'c': 100}})
def test_apply_attributes_to_edge_without_filter_conditions():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 1})
n.apply_attributes_to_edge(1, 2, {'c': 1})
assert n.link('0') == {'a': 1, 'from': 1, 'to': 2, 'id': '0', 'c': 1}
assert n.link('1') == {'b': 1, 'from': 1, 'to': 2, 'id': '1', 'c': 1}
correct_change_log_df = pd.DataFrame(
{'timestamp': {2: '2020-07-10 14:53:25', 3: '2020-07-10 14:53:25'}, 'change_event': {2: 'modify', 3: 'modify'},
'object_type': {2: 'edge', 3: 'edge'}, 'old_id': {2: '(1, 2, 0)', 3: '(1, 2, 1)'},
'new_id': {2: '(1, 2, 0)', 3: '(1, 2, 1)'},
'old_attributes': {2: "{'a': 1, 'from': 1, 'to': 2, 'id': '0'}", 3: "{'b': 1, 'from': 1, 'to': 2, 'id': '1'}"},
'new_attributes': {2: "{'a': 1, 'from': 1, 'to': 2, 'id': '0', 'c': 1}",
3: "{'b': 1, 'from': 1, 'to': 2, 'id': '1', 'c': 1}"},
'diff': {2: [('add', '', [('c', 1)])], 3: [('add', '', [('c', 1)])]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare].tail(2), correct_change_log_df[cols_to_compare],
check_dtype=False)
def test_apply_attributes_to_edge_with_filter_conditions():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 1})
n.apply_attributes_to_edge(1, 2, {'c': 1}, conditions={'a': (0, 2)})
assert n.link('0') == {'a': 1, 'from': 1, 'to': 2, 'id': '0', 'c': 1}
assert n.link('1') == {'b': 1, 'from': 1, 'to': 2, 'id': '1'}
correct_change_log_df = pd.DataFrame(
{'timestamp': {2: '2020-07-10 14:53:25'}, 'change_event': {2: 'modify'},
'object_type': {2: 'edge'}, 'old_id': {2: '(1, 2, 0)'},
'new_id': {2: '(1, 2, 0)'},
'old_attributes': {2: "{'a': 1, 'from': 1, 'to': 2, 'id': '0'}"},
'new_attributes': {2: "{'a': 1, 'from': 1, 'to': 2, 'id': '0', 'c': 1}"},
'diff': {2: [('add', '', [('c', 1)])]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare].tail(1), correct_change_log_df[cols_to_compare],
check_dtype=False)
def test_apply_attributes_to_multiple_edges():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 1})
n.add_link('2', 2, 3, attribs={'c': 1})
n.add_link('3', 2, 3, attribs={'d': 1})
n.apply_attributes_to_edges({(1, 2): {'e': 1}, (2, 3): {'f': 1}})
assert n.link('0') == {'a': 1, 'from': 1, 'to': 2, 'id': '0', 'e': 1}
assert n.link('1') == {'b': 1, 'from': 1, 'to': 2, 'id': '1', 'e': 1}
assert n.link('2') == {'c': 1, 'from': 2, 'to': 3, 'id': '2', 'f': 1}
assert n.link('3') == {'d': 1, 'from': 2, 'to': 3, 'id': '3', 'f': 1}
def test_apply_attributes_to_multiple_edges_with_conditions():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 1})
n.add_link('2', 2, 3, attribs={'c': 1})
n.add_link('3', 2, 3, attribs={'d': 1})
n.apply_attributes_to_edges({(1, 2): {'e': 1}, (2, 3): {'f': 1}}, conditions=[{'a': (0, 2)}, {'c': (0, 2)}])
assert n.link('0') == {'a': 1, 'from': 1, 'to': 2, 'id': '0', 'e': 1}
assert n.link('1') == {'b': 1, 'from': 1, 'to': 2, 'id': '1'}
assert n.link('2') == {'c': 1, 'from': 2, 'to': 3, 'id': '2', 'f': 1}
assert n.link('3') == {'d': 1, 'from': 2, 'to': 3, 'id': '3'}
def test_modify_link_adds_attributes_in_the_graph_and_change_is_recorded_by_change_log():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.apply_attributes_to_link('0', {'b': 1})
assert n.link('0') == {'a': 1, 'from': 1, 'to': 2, 'id': '0', 'b': 1}
correct_change_log_df = pd.DataFrame(
{'timestamp': {0: '2020-06-12 20:02:49', 1: '2020-06-12 20:02:49'}, 'change_event': {0: 'add', 1: 'modify'},
'object_type': {0: 'link', 1: 'link'}, 'old_id': {0: None, 1: '0'}, 'new_id': {0: '0', 1: '0'},
'old_attributes': {0: None, 1: "{'a': 1, 'from': 1, 'to': 2, 'id': '0'}"},
'new_attributes': {0: "{'a': 1, 'from': 1, 'to': 2, 'id': '0'}",
1: "{'a': 1, 'from': 1, 'to': 2, 'id': '0', 'b': 1}"},
'diff': {0: [('add', '', [('a', 1), ('from', 1), ('to', 2), ('id', '0')]), ('add', 'id', '0')],
1: [('add', '', [('b', 1)])]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare], correct_change_log_df[cols_to_compare], check_dtype=False)
def test_modify_link_overwrites_existing_attributes_in_the_graph_and_change_is_recorded_by_change_log():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.apply_attributes_to_link('0', {'a': 4})
assert n.link('0') == {'a': 4, 'from': 1, 'to': 2, 'id': '0'}
correct_change_log_df = pd.DataFrame(
{'timestamp': {0: '2020-06-12 20:04:23', 1: '2020-06-12 20:04:23'}, 'change_event': {0: 'add', 1: 'modify'},
'object_type': {0: 'link', 1: 'link'}, 'old_id': {0: None, 1: '0'}, 'new_id': {0: '0', 1: '0'},
'old_attributes': {0: None, 1: "{'a': 1, 'from': 1, 'to': 2, 'id': '0'}"},
'new_attributes': {0: "{'a': 1, 'from': 1, 'to': 2, 'id': '0'}", 1: "{'a': 4, 'from': 1, 'to': 2, 'id': '0'}"},
'diff': {0: [('add', '', [('a', 1), ('from', 1), ('to', 2), ('id', '0')]), ('add', 'id', '0')],
1: [('change', 'a', (1, 4))]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare], correct_change_log_df[cols_to_compare], check_dtype=False)
def test_modify_link_adds_attributes_in_the_graph_with_multiple_edges():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'c': 100})
n.apply_attributes_to_link('0', {'b': 1})
assert n.link('0') == {'a': 1, 'from': 1, 'to': 2, 'id': '0', 'b': 1}
assert n.link('1') == {'c': 100, 'from': 1, 'to': 2, 'id': '1'}
def test_modify_links_adds_and_changes_attributes_in_the_graph_with_multiple_edges_and_change_is_recorded_by_change_log():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': {'b': 1}})
n.add_link('1', 1, 2, attribs={'c': 100})
n.apply_attributes_to_links({'0': {'a': {'b': 100}}, '1': {'a': {'b': 10}}})
assert n.link('0') == {'a': {'b': 100}, 'from': 1, 'to': 2, 'id': '0'}
assert n.link('1') == {'c': 100, 'from': 1, 'to': 2, 'id': '1', 'a': {'b': 10}}
correct_change_log_df = pd.DataFrame(
{'timestamp': {2: '2020-06-12 19:59:40', 3: '2020-06-12 19:59:40'}, 'change_event': {2: 'modify', 3: 'modify'},
'object_type': {2: 'link', 3: 'link'}, 'old_id': {2: '0', 3: '1'}, 'new_id': {2: '0', 3: '1'},
'old_attributes': {2: "{'a': {'b': 1}, 'from': 1, 'to': 2, 'id': '0'}",
3: "{'c': 100, 'from': 1, 'to': 2, 'id': '1'}"},
'new_attributes': {2: "{'a': {'b': 100}, 'from': 1, 'to': 2, 'id': '0'}",
3: "{'c': 100, 'from': 1, 'to': 2, 'id': '1', 'a': {'b': 10}}"},
'diff': {2: [('change', 'a.b', (1, 100))], 3: [('add', '', [('a', {'b': 10})])]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare].tail(2), correct_change_log_df[cols_to_compare],
check_dtype=False)
def multiply_link_attribs(link_attribs):
return link_attribs['a'] * link_attribs['c']
def test_apply_function_to_links():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 2, 'c': 3})
n.add_link('1', 1, 2, attribs={'c': 100})
n.apply_function_to_links(function=multiply_link_attribs, location='new_computed_attrib')
assert_semantically_equal(dict(n.links()),
{'0': {'a': 2, 'c': 3, 'from': 1, 'to': 2, 'id': '0', 'new_computed_attrib': 6},
'1': {'c': 100, 'from': 1, 'to': 2, 'id': '1'}})
def test_resolves_link_id_clashes_by_mapping_clashing_link_to_a_new_id(mocker):
mocker.patch.object(Network, 'generate_index_for_edge', return_value='1')
n = Network('epsg:27700')
n.add_link('0', 1, 2)
assert n.graph.has_edge(1, 2)
assert n.link_id_mapping['0'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}
assert '1' not in n.link_id_mapping
n.add_link('0', 3, 0)
assert n.graph.has_edge(3, 0)
assert n.link_id_mapping['1'] == {'from': 3, 'to': 0, 'multi_edge_idx': 0}
# also assert that the link mapped to '0' is still as expected
assert n.link_id_mapping['0'] == {'from': 1, 'to': 2, 'multi_edge_idx': 0}
def test_removing_single_node():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 4})
n.add_link('2', 2, 3, attribs={'a': 1})
n.add_link('3', 2, 3, attribs={'b': 4})
n.remove_node(1)
assert list(n.graph.nodes) == [2, 3]
assert list(n.graph.edges) == [(2, 3, 0), (2, 3, 1)]
correct_change_log = pd.DataFrame(
{'timestamp': {4: '2020-06-11 10:37:54'}, 'change_event': {4: 'remove'}, 'object_type': {4: 'node'},
'old_id': {4: 1}, 'new_id': {4: None}, 'old_attributes': {4: '{}'}, 'new_attributes': {4: None},
'diff': {4: [('remove', 'id', 1)]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare].tail(1), correct_change_log[cols_to_compare],
check_dtype=False)
def test_removing_multiple_nodes():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 4})
n.add_link('2', 2, 3, attribs={'a': 1})
n.add_link('3', 2, 3, attribs={'b': 4})
n.remove_nodes([1, 2])
assert list(n.graph.nodes) == [3]
assert list(n.graph.edges) == []
correct_change_log = pd.DataFrame(
{'timestamp': {4: '2020-06-11 10:39:52', 5: '2020-06-11 10:39:52'}, 'change_event': {4: 'remove', 5: 'remove'},
'object_type': {4: 'node', 5: 'node'}, 'old_id': {4: 1, 5: 2}, 'new_id': {4: None, 5: None},
'old_attributes': {4: '{}', 5: '{}'}, 'new_attributes': {4: None, 5: None},
'diff': {4: [('remove', 'id', 1)], 5: [('remove', 'id', 2)]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare].tail(2), correct_change_log[cols_to_compare],
check_dtype=False)
def test_removing_single_link():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 4})
n.add_link('2', 2, 3, attribs={'a': 1})
n.add_link('3', 2, 3, attribs={'b': 4})
assert '1' in n.link_id_mapping
n.remove_link('1')
assert list(n.graph.nodes) == [1, 2, 3]
assert list(n.graph.edges) == [(1, 2, 0), (2, 3, 0), (2, 3, 1)]
assert '1' not in n.link_id_mapping
correct_change_log = pd.DataFrame(
{'timestamp': {4: '2020-06-12 19:58:01'}, 'change_event': {4: 'remove'}, 'object_type': {4: 'link'},
'old_id': {4: '1'}, 'new_id': {4: None}, 'old_attributes': {4: "{'b': 4, 'from': 1, 'to': 2, 'id': '1'}"},
'new_attributes': {4: None},
'diff': {4: [('remove', '', [('b', 4), ('from', 1), ('to', 2), ('id', '1')]), ('remove', 'id', '1')]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare].tail(1), correct_change_log[cols_to_compare],
check_dtype=False)
def test_removing_multiple_links():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'a': 1})
n.add_link('1', 1, 2, attribs={'b': 4})
n.add_link('2', 2, 3, attribs={'a': 1})
n.add_link('3', 2, 3, attribs={'b': 4})
assert '0' in n.link_id_mapping
assert '2' in n.link_id_mapping
n.remove_links(['0', '2'])
assert list(n.graph.nodes) == [1, 2, 3]
assert list(n.graph.edges) == [(1, 2, 1), (2, 3, 1)]
assert '0' not in n.link_id_mapping
assert '2' not in n.link_id_mapping
correct_change_log = pd.DataFrame(
{'timestamp': {4: '2020-06-12 19:55:10', 5: '2020-06-12 19:55:10'}, 'change_event': {4: 'remove', 5: 'remove'},
'object_type': {4: 'link', 5: 'link'}, 'old_id': {4: '0', 5: '2'}, 'new_id': {4: None, 5: None},
'old_attributes': {4: "{'a': 1, 'from': 1, 'to': 2, 'id': '0'}", 5: "{'a': 1, 'from': 2, 'to': 3, 'id': '2'}"},
'new_attributes': {4: None, 5: None},
'diff': {4: [('remove', '', [('a', 1), ('from', 1), ('to', 2), ('id', '0')]), ('remove', 'id', '0')],
5: [('remove', '', [('a', 1), ('from', 2), ('to', 3), ('id', '2')]), ('remove', 'id', '2')]}})
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(n.change_log[cols_to_compare].tail(2), correct_change_log[cols_to_compare],
check_dtype=False)
def test_number_of_multi_edges_counts_multi_edges_on_single_edge():
n = Network('epsg:27700')
n.graph.add_edges_from([(1, 2), (2, 3), (3, 4)])
assert n.number_of_multi_edges(1, 2) == 1
def test_number_of_multi_edges_counts_multi_edges_on_multi_edge():
n = Network('epsg:27700')
n.graph.add_edges_from([(1, 2), (1, 2), (3, 4)])
assert n.number_of_multi_edges(1, 2) == 2
def test_number_of_multi_edges_counts_multi_edges_on_non_existing_edge():
n = Network('epsg:27700')
n.graph.add_edges_from([(1, 2), (1, 2), (3, 4)])
assert n.number_of_multi_edges(1214, 21321) == 0
def test_nodes_gives_iterator_of_node_id_and_attribs():
n = Network('epsg:27700')
n.graph.add_edges_from([(1, 2), (2, 3), (3, 4)])
assert list(n.nodes()) == [(1, {}), (2, {}), (3, {}), (4, {})]
def test_node_gives_node_attribs():
n = Network('epsg:27700')
n.graph.add_node(1, **{'attrib': 1})
assert n.node(1) == {'attrib': 1}
def test_edges_gives_iterator_of_edge_from_to_nodes_and_attribs():
n = Network('epsg:27700')
n.graph.add_edges_from([(1, 2), (2, 3), (3, 4)])
assert list(n.edges()) == [(1, 2, {0: {}}), (2, 3, {0: {}}), (3, 4, {0: {}})]
def test_edge_method_gives_attributes_for_given_from_and_to_nodes():
n = Network('epsg:27700')
n.graph.add_edge(1, 2, **{'attrib': 1})
assert n.edge(1, 2) == {0: {'attrib': 1}}
def test_links_gives_iterator_of_link_id_and_edge_attribs():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'f': 's'})
n.add_link('1', 2, 3, attribs={'h': 1})
assert list(n.links()) == [('0', {'f': 's', 'from': 1, 'to': 2, 'id': '0'}),
('1', {'h': 1, 'from': 2, 'to': 3, 'id': '1'})]
def test_link_gives_link_attribs():
n = Network('epsg:27700')
n.add_link('0', 1, 2, attribs={'attrib': 1})
assert n.link('0') == {'attrib': 1, 'from': 1, 'to': 2, 'id': '0'}
def test_schedule_routes(network_object_from_test_data):
n = network_object_from_test_data
correct_routes = [['25508485', '21667818']]
routes = n.schedule_routes_nodes()
assert correct_routes == routes
def test_schedule_routes_with_an_empty_service(network_object_from_test_data):
n = network_object_from_test_data
n.schedule._graph.graph['routes']['1'] = {
'route_short_name': '', 'mode': 'bus',
'trips': {},
'arrival_offsets': [], 'departure_offsets': [],
'route_long_name': '', 'id': '1', 'route': [],
'await_departure': [], 'ordered_stops': []}
n.schedule._graph.graph['service_to_route_map']['10314'].append('1')
n.schedule._graph.graph['route_to_service_map']['1'] = '10314'
assert set(n.schedule.service_ids()) == {'10314'}
correct_routes = [['25508485', '21667818']]
routes = n.schedule_routes_nodes()
assert correct_routes == routes
def test_schedule_routes_with_disconnected_routes(network_object_from_test_data):
n = network_object_from_test_data
n.add_link('2', 2345678, 987875)
n.schedule.apply_attributes_to_routes({'VJbd8660f05fe6f744e58a66ae12bd66acbca88b98': {'route': ['1', '2']}})
correct_routes = [['25508485', '21667818'], [2345678, 987875]]
routes = n.schedule_routes_nodes()
assert correct_routes == routes
def test_reads_osm_network_into_the_right_schema(full_fat_default_config_path):
osm_test_file = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "osm", "osm.xml"))
network = read.read_osm(osm_test_file, full_fat_default_config_path, 1, 'epsg:27700')
assert_semantically_equal(dict(network.nodes()), {
'0': {'id': '0', 'x': 622502.8306679451, 'y': -5526117.781903352, 'lat': 0.008554364250688652,
'lon': -0.0006545205888310243, 's2_id': 1152921492875543713},
'1': {'id': '1', 'x': 622502.8132744529, 'y': -5524378.838447345, 'lat': 0.024278505899735615,
'lon': -0.0006545205888310243, 's2_id': 1152921335974974453},
'2': {'id': '2', 'x': 622502.8314014417, 'y': -5527856.725358106, 'lat': -0.00716977739835831,
'lon': -0.0006545205888310243, 's2_id': 384307157539499829}})
assert len(list(network.links())) == 11
from collections import Counter
multi_idx_counts = Counter(edge_map['multi_edge_idx'] for edge_map in network.link_id_mapping.values())
assert multi_idx_counts[0] == 5
assert multi_idx_counts[1] == 4
assert multi_idx_counts[2] == 1
correct_link_attribs = [
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '0', 'to': '1', 's2_from': 1152921492875543713, 's2_to': 1152921335974974453,
'length': 1748.4487354464366,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '0'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '1', 'to': '0', 's2_from': 1152921335974974453, 's2_to': 1152921492875543713,
'length': 1748.4487354464366,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '0'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '0', 'to': '2', 's2_from': 1152921492875543713, 's2_to': 384307157539499829,
'length': 1748.4488584600201,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '100'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '2', 'to': '0', 's2_from': 384307157539499829, 's2_to': 1152921492875543713,
'length': 1748.4488584600201,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '100'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '1', 'to': '0', 's2_from': 1152921335974974453, 's2_to': 1152921492875543713,
'length': 1748.4487354464366,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '400'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '0', 'to': '1', 's2_from': 1152921492875543713, 's2_to': 1152921335974974453,
'length': 1748.4487354464366,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '400'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '2', 'to': '0', 's2_from': 384307157539499829, 's2_to': 1152921492875543713,
'length': 1748.4488584600201,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '700'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '0', 'to': '2', 's2_from': 1152921492875543713, 's2_to': 384307157539499829,
'length': 1748.4488584600201,
'attributes': {'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '700'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'}}},
{'permlanes': 3.0, 'freespeed': 12.5, 'capacity': 1800.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '2', 'to': '1', 's2_from': 384307157539499829, 's2_to': 1152921335974974453,
'length': 3496.897593906457,
'attributes': {'osm:way:lanes': {'name': 'osm:way:lanes', 'class': 'java.lang.String', 'text': '3'},
'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '47007861'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'tertiary'}}},
{'permlanes': 3.0, 'freespeed': 12.5, 'capacity': 1800.0, 'oneway': '1', 'modes': ['walk', 'car', 'bike'],
'from': '1', 'to': '0', 's2_from': 1152921335974974453, 's2_to': 1152921492875543713,
'length': 1748.4487354464366,
'attributes': {'osm:way:lanes': {'name': 'osm:way:lanes', 'class': 'java.lang.String', 'text': '3'},
'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String', 'text': '47007861'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'tertiary'}}},
{'permlanes': 1.0, 'freespeed': 12.5, 'capacity': 600.0, 'oneway': '1',
'modes': ['car', 'walk', 'bike'], 'from': '1', 'to': '0',
's2_from': 1152921335974974453, 's2_to': 1152921492875543713,
'length': 1748.4487354464366, 'attributes': {
'osm:way:osmid': {'name': 'osm:way:osmid', 'class': 'java.lang.String',
'text': '47007862'},
'osm:way:lanes': {'name': 'osm:way:lanes', 'class': 'java.lang.String',
'text': '3;2'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'tertiary'}}}
]
cols = ['permlanes', 'freespeed', 'capacity', 'oneway', 'modes', 'from', 'to', 's2_from', 's2_to', 'length',
'attributes']
assert len(network.link_id_mapping) == 11
for link in network.link_id_mapping.keys():
satisfied = False
attribs_to_test = network.link(link).copy()
del attribs_to_test['id']
for link_attrib in correct_link_attribs:
try:
assert_semantically_equal(attribs_to_test, link_attrib)
satisfied = True
except AssertionError:
pass
assert satisfied
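# The loop above checks that every actual link matches *some* expected record.
# A hedged, self-contained sketch of that "any-match" pattern (illustrative
# helper only, not part of the genet API):
def matches_any(actual, expected_records):
    """Return True if `actual` equals at least one record in `expected_records`."""
    return any(actual == expected for expected in expected_records)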
def test_read_matsim_network_with_duplicated_node_ids_records_removal_in_changelog(mocker):
dup_nodes = {'21667818': [
{'id': '21667818', 'x': 528504.1342843144, 'y': 182155.7435136598, 'lon': -0.14910908709500162,
'lat': 51.52370573323939, 's2_id': 5221390302696205321}]}
mocker.patch.object(matsim_reader, 'read_network', return_value=(nx.MultiDiGraph(), 2, dup_nodes, {}))
network = read.read_matsim(path_to_network=pt2matsim_network_test_file, epsg='epsg:27700')
correct_change_log_df = pd.DataFrame(
{'timestamp': {0: '2020-07-02 11:36:54'}, 'change_event': {0: 'remove'}, 'object_type': {0: 'node'},
'old_id': {0: '21667818'}, 'new_id': {0: None},
'old_attributes': {
0: "{'id': '21667818', 'x': 528504.1342843144, 'y': 182155.7435136598, 'lon': -0.14910908709500162, 'lat': 51.52370573323939, 's2_id': 5221390302696205321}"},
'new_attributes': {0: None},
'diff': {0: [('remove', '', [('id', '21667818'), ('x', 528504.1342843144),
('y', 182155.7435136598),
('lon', -0.14910908709500162),
('lat', 51.52370573323939),
('s2_id', 5221390302696205321)]),
('remove', 'id', '21667818')]}}
)
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(network.change_log[cols_to_compare].tail(1), correct_change_log_df[cols_to_compare],
check_names=False,
check_dtype=False)
def test_read_matsim_network_with_duplicated_link_ids_records_reindexing_in_changelog(mocker):
dup_links = {'1': ['1_1']}
correct_link_id_map = {'1': {'from': '25508485', 'to': '21667818', 'multi_edge_idx': 0},
'1_1': {'from': '25508485', 'to': '21667818', 'multi_edge_idx': 1}}
mocker.patch.object(matsim_reader, 'read_network',
return_value=(nx.MultiDiGraph(), correct_link_id_map, {}, dup_links))
mocker.patch.object(Network, 'link', return_value={'heyooo': '1'})
network = read.read_matsim(path_to_network=pt2matsim_network_test_file, epsg='epsg:27700')
correct_change_log_df = pd.DataFrame(
{'timestamp': {0: '2020-07-02 11:59:00'}, 'change_event': {0: 'modify'}, 'object_type': {0: 'link'},
'old_id': {0: '1'}, 'new_id': {0: '1_1'}, 'old_attributes': {0: "{'heyooo': '1'}"},
'new_attributes': {0: "{'heyooo': '1'}"}, 'diff': {0: [('change', 'id', ('1', '1_1'))]}}
)
cols_to_compare = ['change_event', 'object_type', 'old_id', 'new_id', 'old_attributes', 'new_attributes', 'diff']
assert_frame_equal(network.change_log[cols_to_compare].tail(1), correct_change_log_df[cols_to_compare],
check_names=False,
check_dtype=False)
def test_has_node_when_node_is_in_the_graph():
n = Network('epsg:27700')
n.add_node('1')
assert n.has_node('1')
def test_has_node_when_node_is_not_in_the_graph():
n = Network('epsg:27700')
assert not n.has_node('1')
def test_has_nodes_when_nodes_in_the_graph():
n = Network('epsg:27700')
n.add_node('1')
n.add_node('2')
n.add_node('3')
assert n.has_nodes(['1', '2'])
def test_has_nodes_when_only_some_nodes_in_the_graph():
n = Network('epsg:27700')
n.add_node('1')
n.add_node('2')
n.add_node('3')
assert not n.has_nodes(['1', '4'])
def test_has_nodes_when_none_of_the_nodes_in_the_graph():
n = Network('epsg:27700')
n.add_node('1')
n.add_node('2')
n.add_node('3')
assert not n.has_nodes(['10', '20'])
def test_has_edge_when_edge_is_in_the_graph():
n = Network('epsg:27700')
n.add_link('1', 1, 2)
assert n.has_edge(1, 2)
def test_has_edge_when_edge_is_not_in_the_graph():
n = Network('epsg:27700')
assert not n.has_edge(1, 2)
def test_has_link_when_link_is_in_the_graph():
n = Network('epsg:27700')
n.add_link('1', 1, 2)
assert n.has_link('1')
def test_has_link_when_link_is_not_in_the_graph():
n = Network('epsg:27700')
assert not n.has_link('1')
def test_has_link_when_link_id_is_in_the_network_but_corresponding_edge_is_not():
    # unlikely scenario, but possible if someone uses a non-genet method to manipulate the graph
n = Network('epsg:27700')
n.link_id_mapping['1'] = {'from': 1, 'to': 2, 'multi_edge_idx': 0}
assert not n.has_link('1')
def test_has_links_when_links_in_the_graph():
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 1, 2)
n.add_link('3', 1, 2)
assert n.has_links(['1', '2'])
def test_has_links_when_only_some_links_in_the_graph():
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 1, 2)
n.add_link('3', 1, 2)
assert not n.has_links(['1', '4'])
def test_has_links_when_none_of_the_links_in_the_graph():
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 1, 2)
n.add_link('3', 1, 2)
assert not n.has_links(['10', '20'])
def test_has_links_with_passing_attribute_condition():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'modes': 'car'})
n.add_link('2', 1, 2, attribs={'modes': 'car'})
assert n.has_links(['1', '2'], conditions={'modes': 'car'})
def test_has_links_with_failing_attribute_condition():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'modes': 'bus'})
n.add_link('2', 1, 2, attribs={'modes': 'walk'})
assert not n.has_links(['1', '2'], conditions={'modes': 'car'})
def test_has_links_not_in_graph_with_attribute_condition():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'modes': 'car'})
n.add_link('2', 1, 2, attribs={'modes': 'car'})
assert not n.has_links(['10', '20'], conditions={'modes': 'car'})
def test_has_valid_link_chain_with_a_valid_link_chain():
n = Network('epsg:27700')
n.add_link('1', 1, 3)
n.add_link('2', 3, 4)
assert n.has_valid_link_chain(['1', '2'])
def test_has_valid_link_chain_with_an_invalid_link_chain():
n = Network('epsg:27700')
n.add_link('1', 1, 3)
n.add_link('2', 2, 4)
assert not n.has_valid_link_chain(['1', '2'])
def test_has_valid_link_chain_with_an_empty_link_chain():
n = Network('epsg:27700')
n.add_link('1', 1, 3)
n.add_link('2', 2, 4)
assert not n.has_valid_link_chain([])
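# The three tests above pin down the chain-validity rule: a link chain is valid
# when it is non-empty and each link's 'to' node equals the next link's 'from'
# node. A hedged pure-Python sketch of that rule (not the genet implementation):
def is_valid_chain(links):
    if not links:
        return False
    return all(a['to'] == b['from'] for a, b in zip(links, links[1:]))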
def test_calculate_route_distance_with_links_that_have_length_attrib():
n = Network('epsg:27700')
n.add_link('1', 1, 3, attribs={'length': 2})
n.add_link('2', 3, 4, attribs={'length': 1})
assert n.route_distance(['1', '2']) == 3
def test_calculate_route_distance_with_links_that_dont_have_length_attrib():
n = Network('epsg:27700')
n.add_node(1, attribs={'s2_id': 12345})
n.add_node(3, attribs={'s2_id': 345435})
n.add_node(4, attribs={'s2_id': 568767})
n.add_link('1', 1, 3)
n.add_link('2', 3, 4)
assert round(n.route_distance(['1', '2']), 6) == 0.013918
def test_calculate_route_distance_returns_0_when_route_is_invalid():
n = Network('epsg:27700')
n.add_link('1', 1, 3)
n.add_link('2', 5, 4)
assert n.route_distance(['1', '2']) == 0
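# Hedged sketch of the distance behaviour the tests above exercise: sum the
# 'length' attributes when the chain is valid, return 0 for an invalid route.
# (Illustrative only -- the real method falls back to an s2-cell distance when
# a link has no 'length' attribute, which this sketch does not reproduce.)
def route_distance_sketch(links):
    if not links or any(a['to'] != b['from'] for a, b in zip(links, links[1:])):
        return 0
    return sum(link.get('length', 0) for link in links)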
def test_valid_network_route():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'modes': ['car', 'bus']})
n.add_link('2', 2, 3, attribs={'modes': ['car', 'bus']})
r = Route(route_short_name='', mode='bus', stops=[], trips={}, arrival_offsets=[], departure_offsets=[],
route=['1', '2'])
assert n.is_valid_network_route(r)
def test_network_route_with_wrong_links():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'modes': ['car', 'bus']})
n.add_link('2', 3, 2, attribs={'modes': ['car', 'bus']})
r = Route(route_short_name='', mode='bus', stops=[], trips={}, arrival_offsets=[], departure_offsets=[],
route=['1', '2'])
assert not n.is_valid_network_route(r)
def test_network_route_with_empty_link_list():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'modes': ['car', 'bus']})
n.add_link('2', 3, 2, attribs={'modes': ['car', 'bus']})
r = Route(route_short_name='', mode='bus', stops=[], trips={}, arrival_offsets=[], departure_offsets=[],
route=[])
assert not n.is_valid_network_route(r)
def test_network_route_with_incorrect_modes_on_link():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'modes': ['car']})
n.add_link('2', 3, 2, attribs={'modes': ['car', 'bus']})
r = Route(route_short_name='', mode='bus', stops=[], trips={}, arrival_offsets=[], departure_offsets=[],
route=['1', '2'])
assert not n.is_valid_network_route(r)
def test_generate_index_for_node_gives_next_integer_string_when_you_have_matsim_usual_integer_index():
n = Network('epsg:27700')
n.add_node('1')
assert n.generate_index_for_node() == '2'
def test_generate_index_for_node_gives_string_based_on_length_node_ids_when_you_have_mixed_index():
n = Network('epsg:27700')
n.add_node('1')
n.add_node('1x')
assert n.generate_index_for_node() == '3'
def test_generate_index_for_node_gives_string_based_on_length_node_ids_when_you_have_all_non_int_index():
n = Network('epsg:27700')
n.add_node('1w')
n.add_node('1x')
assert n.generate_index_for_node() == '3'
def test_generate_index_for_node_gives_uuid4_as_last_resort(mocker):
mocker.patch.object(uuid, 'uuid4')
n = Network('epsg:27700')
n.add_node('1w')
n.add_node('1x')
n.add_node('4')
n.generate_index_for_node()
uuid.uuid4.assert_called_once()
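# The four index-generation tests above suggest a strategy roughly like this
# sketch: next integer after the max when all ids are numeric, otherwise one
# past the id count, with uuid4 as a last resort on collision. Assumed logic
# for illustration, not the genet source:
def generate_index_sketch(existing_ids):
    if existing_ids and all(i.isdigit() for i in existing_ids):
        candidate = str(max(int(i) for i in existing_ids) + 1)
    else:
        candidate = str(len(existing_ids) + 1)
    return candidate if candidate not in existing_ids else str(uuid.uuid4())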
def test_generating_n_indices_for_nodes():
n = Network('epsg:27700')
n.add_nodes({str(i): {} for i in range(10)})
idxs = n.generate_indices_for_n_nodes(5)
assert len(idxs) == 5
assert not set(dict(n.nodes()).keys()) & idxs
def test_generate_index_for_edge_gives_next_integer_string_when_you_have_matsim_usual_integer_index():
n = Network('epsg:27700')
n.link_id_mapping = {'1': {}, '2': {}}
new_idx = n.generate_index_for_edge()
assert isinstance(new_idx, str)
assert new_idx not in ['1', '2']
def test_generate_index_for_edge_gives_string_based_on_length_link_id_mapping_when_you_have_mixed_index():
n = Network('epsg:27700')
n.link_id_mapping = {'1': {}, 'x2': {}}
new_idx = n.generate_index_for_edge()
assert isinstance(new_idx, str)
assert new_idx not in ['1', 'x2']
def test_generate_index_for_edge_gives_string_based_on_length_link_id_mapping_when_you_have_all_non_int_index():
n = Network('epsg:27700')
n.link_id_mapping = {'1x': {}, 'x2': {}}
new_idx = n.generate_index_for_edge()
assert isinstance(new_idx, str)
assert new_idx not in ['1x', 'x2']
def test_index_graph_edges_generates_completely_new_index():
n = Network('epsg:27700')
n.add_link('1x', 1, 2)
n.add_link('x2', 1, 2)
n.index_graph_edges()
assert list(n.link_id_mapping.keys()) == ['0', '1']
def test_generating_n_indices_for_edges():
n = Network('epsg:27700')
n.add_links({str(i): {'from': 0, 'to': 1} for i in range(11)})
idxs = n.generate_indices_for_n_edges(7)
assert len(idxs) == 7
for i in idxs:
assert isinstance(i, str)
assert not set(n.link_id_mapping.keys()) & idxs
def test_has_schedule_with_valid_network_routes_with_valid_routes(route):
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={"modes": ['bus']})
n.add_link('2', 2, 3, attribs={"modes": ['car', 'bus']})
route.route = ['1', '2']
n.schedule = Schedule(n.epsg, [Service(id='service', routes=[route])])
route.reindex('service_1')
n.schedule.add_route('service', route)
n.schedule.apply_attributes_to_routes({'service_0': {'route': ['1', '2']}, 'service_1': {'route': ['1', '2']}})
assert n.has_schedule_with_valid_network_routes()
def test_has_schedule_with_valid_network_routes_with_some_valid_routes(route):
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 2, 3)
route.route = ['1', '2']
route_2 = Route(route_short_name='', mode='bus', stops=[],
trips={'trip_id': ['1'], 'trip_departure_time': ['13:00:00'], 'vehicle_id': ['veh_1_bus']},
arrival_offsets=[], departure_offsets=[], route=['10000'])
n.schedule = Schedule(n.epsg, [Service(id='service', routes=[route, route_2])])
assert not n.has_schedule_with_valid_network_routes()
def test_has_schedule_with_valid_network_routes_with_invalid_routes(route):
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 2, 3)
route.route = ['3', '4']
n.schedule = Schedule(n.epsg, [Service(id='service', routes=[route, route])])
assert not n.has_schedule_with_valid_network_routes()
def test_has_schedule_with_valid_network_routes_with_empty_routes(route):
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 2, 3)
route.route = []
n.schedule = Schedule(n.epsg, [Service(id='service', routes=[route, route])])
assert not n.has_schedule_with_valid_network_routes()
def test_invalid_network_routes_with_valid_route(route):
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={"modes": ['car', 'bus']})
n.add_link('2', 2, 3, attribs={"modes": ['bus']})
route.reindex('route')
n.schedule = Schedule(n.epsg, [Service(id='service', routes=[route])])
n.schedule.apply_attributes_to_routes({'route': {'route': ['1', '2']}})
assert n.invalid_network_routes() == []
def test_invalid_network_routes_with_invalid_route(route):
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 2, 3)
route.reindex('route')
n.schedule = Schedule(n.epsg, [Service(id='service', routes=[route])])
n.schedule.apply_attributes_to_routes({'route': {'route': ['3', '4']}})
assert n.invalid_network_routes() == ['route']
def test_invalid_network_routes_with_empty_route(route):
n = Network('epsg:27700')
n.add_link('1', 1, 2)
n.add_link('2', 2, 3)
route.reindex('route')
n.schedule = Schedule(n.epsg, [Service(id='service', routes=[route])])
n.schedule.apply_attributes_to_routes({'route': {'route': []}})
assert n.invalid_network_routes() == ['route']
def test_generate_validation_report_with_pt2matsim_network(network_object_from_test_data):
n = network_object_from_test_data
report = n.generate_validation_report()
correct_report = {
'graph': {
'graph_connectivity': {
'car': {'problem_nodes': {'dead_ends': ['21667818'], 'unreachable_node': ['25508485']},
'number_of_connected_subgraphs': 2},
'walk': {'problem_nodes': {'dead_ends': ['21667818'], 'unreachable_node': ['25508485']},
'number_of_connected_subgraphs': 2},
'bike': {'problem_nodes': {'dead_ends': [], 'unreachable_node': []},
'number_of_connected_subgraphs': 0}},
'link_attributes': {
'links_over_1km_length': {'number_of': 0, 'percentage': 0.0, 'link_ids': []},
'zero_attributes': {}}},
'schedule': {
'schedule_level': {'is_valid_schedule': False, 'invalid_stages': ['not_has_valid_services'],
'has_valid_services': False, 'invalid_services': ['10314']},
'service_level': {
'10314': {'is_valid_service': False, 'invalid_stages': ['not_has_valid_routes'],
'has_valid_routes': False, 'invalid_routes': ['VJbd8660f05fe6f744e58a66ae12bd66acbca88b98']}},
'route_level': {'10314': {'VJbd8660f05fe6f744e58a66ae12bd66acbca88b98': {'is_valid_route': False,
'invalid_stages': [
'not_has_correctly_ordered_route']}}}},
'routing': {'services_have_routes_in_the_graph': False,
'service_routes_with_invalid_network_route': ['VJbd8660f05fe6f744e58a66ae12bd66acbca88b98'],
'route_to_crow_fly_ratio': {
'10314': {'VJbd8660f05fe6f744e58a66ae12bd66acbca88b98': 'Division by zero'}}}}
assert_semantically_equal(report, correct_report)
def test_generate_validation_report_with_correct_schedule(correct_schedule):
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'length': 2, "modes": ['car', 'bus']})
n.add_link('2', 2, 3, attribs={'length': 2, "modes": ['car', 'bus']})
n.schedule = correct_schedule
report = n.generate_validation_report()
correct_report = {
'graph': {
'graph_connectivity': {'car': {'problem_nodes': {'dead_ends': [3], 'unreachable_node': [1]},
'number_of_connected_subgraphs': 3},
'walk': {'problem_nodes': {'dead_ends': [], 'unreachable_node': []},
'number_of_connected_subgraphs': 0},
'bike': {'problem_nodes': {'dead_ends': [], 'unreachable_node': []},
'number_of_connected_subgraphs': 0}},
'link_attributes': {'links_over_1km_length': {'number_of': 0, 'percentage': 0.0, 'link_ids': []},
'zero_attributes': {}}},
'schedule': {'schedule_level': {'is_valid_schedule': True, 'invalid_stages': [], 'has_valid_services': True,
'invalid_services': []}, 'service_level': {
'service': {'is_valid_service': True, 'invalid_stages': [], 'has_valid_routes': True,
'invalid_routes': []}}, 'route_level': {
'service': {'1': {'is_valid_route': True, 'invalid_stages': []},
'2': {'is_valid_route': True, 'invalid_stages': []}}}},
'routing': {'services_have_routes_in_the_graph': True, 'service_routes_with_invalid_network_route': [],
'route_to_crow_fly_ratio': {'service': {'1': 0.037918141839160244, '2': 0.037918141839160244}}}}
assert_semantically_equal(report, correct_report)
def test_zero_value_attributes_show_up_in_validation_report():
n = Network('epsg:27700')
n.add_link('1', 1, 2, attribs={'length': 0, 'capacity': 0.0, 'freespeed': '0.0', "modes": ['car', 'bus']})
n.add_link('2', 2, 3, attribs={'length': 2, 'capacity': 1, 'freespeed': 2, "modes": ['car', 'bus']})
report = n.generate_validation_report()
correct_report = {'graph': {
'graph_connectivity': {
'car': {'problem_nodes': {'dead_ends': [3], 'unreachable_node': [1]}, 'number_of_connected_subgraphs': 3},
'walk': {'problem_nodes': {'dead_ends': [], 'unreachable_node': []}, 'number_of_connected_subgraphs': 0},
'bike': {'problem_nodes': {'dead_ends': [], 'unreachable_node': []}, 'number_of_connected_subgraphs': 0}},
'link_attributes': {
'links_over_1km_length': {'number_of': 0, 'percentage': 0.0, 'link_ids': []},
'zero_attributes': {
'length': {'number_of': 1, 'percentage': 0.5, 'link_ids': ['1']},
'capacity': {'number_of': 1, 'percentage': 0.5, 'link_ids': ['1']},
'freespeed': {'number_of': 1, 'percentage': 0.5, 'link_ids': ['1']}}}}}
assert_semantically_equal(report, correct_report)
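# The report test above treats int 0, float 0.0 and the string '0.0' all as
# zero-valued attributes. A hedged sketch of such a check (illustrative helper,
# not the genet implementation):
def is_zero_value(value):
    try:
        return float(value) == 0.0
    except (TypeError, ValueError):
        return False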
def test_write_to_matsim_generates_three_matsim_files(network_object_from_test_data, tmpdir):
# the correctness of these files is tested elsewhere
expected_network_xml = os.path.join(tmpdir, 'network.xml')
assert not os.path.exists(expected_network_xml)
expected_schedule_xml = os.path.join(tmpdir, 'schedule.xml')
assert not os.path.exists(expected_schedule_xml)
expected_vehicle_xml = os.path.join(tmpdir, 'vehicles.xml')
assert not os.path.exists(expected_vehicle_xml)
network_object_from_test_data.write_to_matsim(tmpdir)
assert os.path.exists(expected_network_xml)
assert os.path.exists(expected_schedule_xml)
assert os.path.exists(expected_vehicle_xml)
def test_write_to_matsim_generates_network_matsim_file_if_network_is_car_only(network_object_from_test_data, tmpdir):
# the correctness of these files is tested elsewhere
expected_network_xml = os.path.join(tmpdir, 'network.xml')
assert not os.path.exists(expected_network_xml)
expected_schedule_xml = os.path.join(tmpdir, 'schedule.xml')
assert not os.path.exists(expected_schedule_xml)
expected_vehicle_xml = os.path.join(tmpdir, 'vehicles.xml')
assert not os.path.exists(expected_vehicle_xml)
n = network_object_from_test_data
n.schedule = Schedule('epsg:27700')
assert not n.schedule
n.write_to_matsim(tmpdir)
assert os.path.exists(expected_network_xml)
assert not os.path.exists(expected_schedule_xml)
assert not os.path.exists(expected_vehicle_xml)
def test_write_to_matsim_generates_change_log_csv(network_object_from_test_data, tmpdir):
expected_change_log_path = os.path.join(tmpdir, 'network_change_log.csv')
expected_schedule_change_log_path = os.path.join(tmpdir, 'schedule_change_log.csv')
assert not os.path.exists(expected_change_log_path)
network_object_from_test_data.write_to_matsim(tmpdir)
assert os.path.exists(expected_change_log_path)
assert os.path.exists(expected_schedule_change_log_path)
benchmark_path_json = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "auxiliary_files", "links_benchmark.json"))
benchmark_path_csv = os.path.abspath(
os.path.join(os.path.dirname(__file__), "test_data", "auxiliary_files", "links_benchmark.csv"))
@pytest.fixture()
def aux_network():
n = Network('epsg:27700')
n.add_nodes({'1': {'x': 1, 'y': 2, 's2_id': 0}, '2': {'x': 1, 'y': 2, 's2_id': 0},
'3': {'x': 1, 'y': 2, 's2_id': 0}, '4': {'x': 1, 'y': 2, 's2_id': 0}})
n.add_links(
{'1': {'from': '1', 'to': '2', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1, 'modes': {'car'}},
'2': {'from': '1', 'to': '3', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1, 'modes': {'car'}},
'3': {'from': '2', 'to': '4', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1, 'modes': {'car'}},
'4': {'from': '3', 'to': '4', 'freespeed': 1, 'capacity': 1, 'permlanes': 1, 'length': 1, 'modes': {'car'}}})
n.read_auxiliary_link_file(benchmark_path_json)
n.read_auxiliary_node_file(benchmark_path_csv)
return n
def test_reindexing_network_node_with_auxiliary_files(aux_network):
aux_network.reindex_node('3', '0')
assert aux_network.auxiliary_files['node']['links_benchmark.csv'].map == {'2': '2', '3': '0', '4': '4', '1': '1'}
assert aux_network.auxiliary_files['link']['links_benchmark.json'].map == {'2': '2', '1': '1', '3': '3', '4': '4'}
def test_reindexing_network_link_with_auxiliary_files(aux_network):
aux_network.reindex_link('2', '0')
assert aux_network.auxiliary_files['node']['links_benchmark.csv'].map == {'2': '2', '3': '3', '4': '4', '1': '1'}
assert aux_network.auxiliary_files['link']['links_benchmark.json'].map == {'2': '0', '1': '1', '3': '3', '4': '4'}
def test_removing_network_node_with_auxiliary_files(aux_network):
aux_network.remove_nodes(['1', '2'])
aux_network.remove_node('3')
assert aux_network.auxiliary_files['node']['links_benchmark.csv'].map == {'2': None, '3': None, '4': '4', '1': None}
assert aux_network.auxiliary_files['link']['links_benchmark.json'].map == {'2': '2', '1': '1', '3': '3', '4': '4'}
def test_removing_network_link_with_auxiliary_files(aux_network):
aux_network.remove_links(['1', '2'])
aux_network.remove_link('3')
assert aux_network.auxiliary_files['node']['links_benchmark.csv'].map == {'2': '2', '3': '3', '4': '4', '1': '1'}
assert aux_network.auxiliary_files['link']['links_benchmark.json'].map == {'2': None, '1': None, '3': None,
'4': '4'}
def test_simplifying_network_with_auxiliary_files(aux_network):
aux_network.simplify()
assert aux_network.auxiliary_files['node']['links_benchmark.csv'].map == {'1': '1', '2': None, '3': None, '4': '4'}
assert aux_network.auxiliary_files['link']['links_benchmark.json'].map == {
'2': aux_network.link_simplification_map['2'],
'1': aux_network.link_simplification_map['1'],
'3': aux_network.link_simplification_map['3'],
'4': aux_network.link_simplification_map['4']}
def test_saving_network_with_auxiliary_files_with_changes(aux_network, tmpdir):
aux_network.auxiliary_files['node']['links_benchmark.csv'].map = {'2': None, '3': None, '4': '04', '1': None}
aux_network.auxiliary_files['link']['links_benchmark.json'].map = {'2': '002', '1': '001', '3': '003', '4': '004'}
expected_json_aux_file = os.path.join(tmpdir, 'auxiliary_files', 'links_benchmark.json')
expected_csv_aux_file = os.path.join(tmpdir, 'auxiliary_files', 'links_benchmark.csv')
assert not os.path.exists(expected_json_aux_file)
assert not os.path.exists(expected_csv_aux_file)
aux_network.write_to_matsim(tmpdir)
assert os.path.exists(expected_json_aux_file)
assert os.path.exists(expected_csv_aux_file)
with open(expected_json_aux_file) as json_file:
assert json.load(json_file)['car']['2']['in']['links'] == ['002']
assert pd.read_csv(expected_csv_aux_file)['links'].to_dict() == {0: '[None]', 1: '[None]', 2: '[None]', 3: "['04']"}
@pytest.fixture()
def network_1_geo_and_json(network1):
nodes = gpd.GeoDataFrame({
'101982': {'id': '101982', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879,
'geometry': Point(528704.1425925883, 182068.78193707118)},
'101986': {'id': '101986', 'x': '528835.203274008', 'y': '182006.27331298392', 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387,
'geometry': Point(528835.203274008, 182006.27331298392)}}).T
nodes.index = nodes.index.set_names(['index'])
links = gpd.GeoDataFrame({
'0': {'id': '0', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0,
'permlanes': 1.0, 'oneway': '1', 'modes': ['car', 'bike'], 's2_from': 5221390329378179879,
's2_to': 5221390328605860387, 'length': 52.765151087870265,
'geometry': LineString(
[(528704.1425925883, 182068.78193707118), (528835.203274008, 182006.27331298392)]),
'u': '101982', 'v': '101986',
'attributes': {
'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String',
'text': 'permissive'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'},
'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'},
'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String',
'text': 'Brunswick Place'}}}}).T
links.index = links.index.set_names(['index'])
# most networks are expected to have complex geometries
network1.apply_attributes_to_links(
{'0': {
'modes': ['car', 'bike'],
'geometry': LineString([(528704.1425925883, 182068.78193707118), (528835.203274008, 182006.27331298392)])}})
return {
'network': network1,
'expected_json': {'nodes': {
'101982': {'id': '101982', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879,
'geometry': [528704.1425925883, 182068.78193707118]},
'101986': {'id': '101986', 'x': '528835.203274008', 'y': '182006.27331298392', 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387,
'geometry': [528835.203274008, 182006.27331298392]}},
'links': {
'0': {'id': '0', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0,
'permlanes': 1.0, 'oneway': '1', 'modes': ['car', 'bike'], 's2_from': 5221390329378179879,
's2_to': 5221390328605860387, 'length': 52.765151087870265,
'geometry': 'ez~hinaBc~sze|`@gx|~W|uo|J', 'u': '101982', 'v': '101986',
'attributes': {
'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'},
'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'},
'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String',
'text': 'Brunswick Place'}}}}},
'expected_sanitised_json': {'nodes': {
'101982': {'id': '101982', 'x': '528704.1425925883', 'y': '182068.78193707118', 'lon': -0.14625948709424305,
'lat': 51.52287873323954, 's2_id': 5221390329378179879,
'geometry': '528704.1425925883,182068.78193707118'},
'101986': {'id': '101986', 'x': '528835.203274008', 'y': '182006.27331298392', 'lon': -0.14439428709377497,
'lat': 51.52228713323965, 's2_id': 5221390328605860387,
'geometry': '528835.203274008,182006.27331298392'}},
'links': {
'0': {'id': '0', 'from': '101982', 'to': '101986', 'freespeed': 4.166666666666667, 'capacity': 600.0,
'permlanes': 1.0, 'oneway': '1', 'modes': 'car,bike', 's2_from': 5221390329378179879,
's2_to': 5221390328605860387, 'length': 52.765151087870265,
'geometry': 'ez~hinaBc~sze|`@gx|~W|uo|J', 'u': '101982', 'v': '101986',
'attributes': {
'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String',
'text': 'permissive'},
'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String',
'text': 'unclassified'},
'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'},
'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String',
'text': 'Brunswick Place'}}}}},
'expected_geodataframe': {'nodes': nodes, 'links': links}
}
def test_transforming_network_to_json(network_1_geo_and_json):
assert_semantically_equal(network_1_geo_and_json['network'].to_json(), network_1_geo_and_json['expected_json'])
def test_saving_network_to_json(network_1_geo_and_json, tmpdir):
network_1_geo_and_json['network'].write_to_json(tmpdir)
expected_network_json = os.path.join(tmpdir, 'network.json')
assert os.path.exists(expected_network_json)
with open(expected_network_json) as json_file:
output_json = json.load(json_file)
assert_semantically_equal(
output_json,
network_1_geo_and_json['expected_sanitised_json'])
def test_transforming_network_to_geodataframe(network_1_geo_and_json):
node_cols = ['id', 'x', 'y', 'lon', 'lat', 's2_id', 'geometry']
link_cols = ['id', 'from', 'to', 'freespeed', 'capacity', 'permlanes', 'oneway', 'modes', 's2_from', 's2_to',
'length', 'geometry', 'attributes', 'u', 'v']
_network = network_1_geo_and_json['network'].to_geodataframe()
assert set(_network['nodes'].columns) == set(node_cols)
assert_frame_equal(_network['nodes'][node_cols], network_1_geo_and_json['expected_geodataframe']['nodes'][node_cols], check_dtype=False)
assert set(_network['links'].columns) == set(link_cols)
assert_frame_equal(_network['links'][link_cols], network_1_geo_and_json['expected_geodataframe']['links'][link_cols], check_dtype=False)
def test_saving_network_to_geojson(network1, correct_schedule, tmpdir):
network1.schedule = correct_schedule
network1.write_to_geojson(tmpdir)
assert set(os.listdir(tmpdir)) == {'network_nodes_geometry_only.geojson', 'network_nodes.geojson',
'network_change_log.csv', 'schedule_links_geometry_only.geojson',
'network_links.geojson', 'network_links_geometry_only.geojson',
'schedule_nodes_geometry_only.geojson', 'schedule_links.geojson',
'schedule_nodes.geojson', 'schedule_change_log.csv'}
def test_saving_network_to_csv(network1, correct_schedule, tmpdir):
network1.schedule = correct_schedule
network1.write_to_csv(tmpdir)
assert set(os.listdir(tmpdir)) == {'network', 'schedule'}
assert set(os.listdir(os.path.join(tmpdir, 'network'))) == {'nodes.csv', 'links.csv', 'network_change_log.csv'}
assert set(os.listdir(os.path.join(tmpdir, 'schedule'))) == {'calendar.csv', 'stop_times.csv', 'trips.csv',
'routes.csv', 'schedule_change_log.csv', 'stops.csv'}
output_nodes = pd.read_csv(os.path.join(tmpdir, 'network', 'nodes.csv'))
assert_semantically_equal(
output_nodes.to_dict(),
{'index': {0: 101982, 1: 101986}, 'lat': {0: 51.52287873323954, 1: 51.52228713323965},
's2_id': {0: 5221390329378179879, 1: 5221390328605860387},
'lon': {0: -0.14625948709424305, 1: -0.14439428709377494}, 'id': {0: 101982, 1: 101986},
'x': {0: 528704.1425925883, 1: 528835.203274008},
'geometry': {0: '[528704.1425925883, 182068.78193707118]',
1: '[528835.203274008, 182006.27331298392]'},
'y': {0: 182068.7819370712, 1: 182006.27331298392}}
)
output_links = pd.read_csv(os.path.join(tmpdir, 'network', 'links.csv'))
assert_semantically_equal(
output_links.to_dict(),
{'index': {0: 0}, 'modes': {0: "['car']"}, 'to': {0: 101986}, 's2_from': {0: 5221390329378179879},
'length': {0: 52.76515108787025}, 'id': {0: 0}, 'from': {0: 101982}, 's2_to': {0: 5221390328605860387},
'capacity': {0: 600.0},
'u': {0: 101982},
'v': {0: 101986},
'geometry': {0: 'ez~hinaBc~sze|`@gx|~W|uo|J'},
'oneway': {0: 1}, 'freespeed': {0: 4.166666666666667}, 'permlanes': {0: 1.0}, 'attributes': {
0: "{'osm:way:access': {'name': 'osm:way:access', 'class': 'java.lang.String', 'text': 'permissive'}, 'osm:way:highway': {'name': 'osm:way:highway', 'class': 'java.lang.String', 'text': 'unclassified'}, 'osm:way:id': {'name': 'osm:way:id', 'class': 'java.lang.Long', 'text': '26997928'}, 'osm:way:name': {'name': 'osm:way:name', 'class': 'java.lang.String', 'text': 'Brunswick Place'}}"}}
)
| 51.930615 | 658 | 0.59772 | 17,722 | 132,475 | 4.21431 | 0.042659 | 0.011943 | 0.017031 | 0.028225 | 0.843893 | 0.803953 | 0.769515 | 0.733096 | 0.702956 | 0.679726 | 0 | 0.162318 | 0.206718 | 132,475 | 2,550 | 659 | 51.95098 | 0.548368 | 0.003903 | 0 | 0.488812 | 0 | 0.019393 | 0.235349 | 0.032444 | 0 | 0 | 0 | 0 | 0.207857 | 1 | 0.093486 | false | 0.001989 | 0.009448 | 0.000995 | 0.105917 | 0.000995 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
# general/recording-and-playing-audio/audio_player_playsound.py
from playsound import playsound
import sys

# Play the audio file given as the first command-line argument (blocks
# until playback finishes)
playsound(sys.argv[1])
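The two-line script above plays whatever path is passed on the command line. A slightly more defensive sketch with argument validation (the helper name is hypothetical, and the playback call is left as a comment so the sketch runs without the `playsound` package installed):

```python
def play_from_argv(argv):
    # Mirror the script above, but fail with a usage message instead of an
    # IndexError when no file path is supplied.
    if len(argv) != 2:
        return "usage: audio_player_playsound.py <audio-file>"
    # playsound(argv[1]) would block here until playback finishes
    return "playing " + argv[1]

print(play_from_argv(["audio_player_playsound.py", "sound.wav"]))
```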
# pybuses/__init__.py
from pybuses.command_bus import CommandBus
from pybuses.event_bus import EventBus
# teste.py
import pytest
from principal import soma
def test_soma():
    assert soma(2, 4) == 6
    assert soma(8, -4) == 4
    assert soma(0, 0) == 0
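The tests above import `soma` from a `principal` module that is not part of this dump. A minimal implementation consistent with the assertions (hypothetical, for illustration only):

```python
# principal.py (sketch): the module test_soma() exercises.
def soma(a, b):
    """Return the sum of a and b."""
    return a + b
```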
# gluetool_modules_framework/tests/test_task_dispatcher.py
# Copyright Contributors to the Testing Farm project.
# SPDX-License-Identifier: Apache-2.0
import pytest
import gluetool
from mock import MagicMock
from gluetool_modules_framework.dispatchers.task_dispatcher import TaskDispatcher
from . import create_module, patch_shared, check_loadable
@pytest.fixture(name='module')
def fixture_module():
module = create_module(TaskDispatcher)[1]
return module
@pytest.fixture(name='module_run_module')
def fixture_module_run_module(module, monkeypatch):
run_module_mock = MagicMock(
return_value=(None, None)
)
monkeypatch.setattr(
'gluetool.Module.run_module',
run_module_mock
)
return module, run_module_mock
def test_loadable(module):
check_loadable(module.glue, 'gluetool_modules_framework/dispatchers/task_dispatcher.py', 'TaskDispatcher')
def test_empty_test_batch(module_run_module, monkeypatch):
module, run_module_mock = module_run_module
patch_shared(monkeypatch, module, {
'plan_test_batch': []
})
module.execute()
assert not run_module_mock.called
def test_execute(module_run_module, monkeypatch):
module, run_module_mock = module_run_module
module_name = 'module1'
options = ['--option1', '--option2']
test_batch = [(module_name, options)]
patch_shared(monkeypatch, module, {
'plan_test_batch': test_batch
})
module.execute()
run_module_mock.assert_called_with(module_name, options)
def test_task_id(module_run_module, monkeypatch):
module, run_module_mock = module_run_module
module_name = 'module1'
options = ['--option1', '--option2']
test_batch = [(module_name, options)]
patch_shared(monkeypatch, module, {
'plan_test_batch': test_batch,
'thread_id': 1234
})
module.execute()
run_module_mock.assert_called_with(module_name, ['--testing-thread-id', '1234-1'] + options)
def test_state_reporter_unknown_options(module_run_module, monkeypatch):
module, run_module_mock = module_run_module
module_name = 'module1'
options = ['--option1', '--option2']
test_batch = [(module_name, options)]
topic = 'dummy_topic'
module._config['pipeline-test-bus-topic'] = topic
report_pipeline_state_mock = MagicMock()
patch_shared(monkeypatch, module, {}, callables={
'report_pipeline_state': report_pipeline_state_mock,
'plan_test_batch': MagicMock(return_value=test_batch)
})
module.execute()
report_pipeline_state_mock.assert_called_with(
'queued',
thread_id=None,
topic=topic,
test_category='unknown',
test_type='unknown'
)
run_module_mock.assert_called_with(module_name, options)
@pytest.mark.parametrize('args', [
'--pipeline-state-reporter-options="--test-category dummy_test_category --test-type dummy_test_type"',
'--pipeline-state-reporter-options="--test-type dummy_test_type --test-category dummy_test_category"',
'--pipeline-state-reporter-options="--test-category=dummy_test_category --test-type=dummy_test_type"',
'--pipeline-state-reporter-options="--test-type=dummy_test_type --test-category=dummy_test_category"',
'--pipeline-state-reporter-options=--test-category dummy_test_category --test-type dummy_test_type',
'--pipeline-state-reporter-options=--test-type dummy_test_type --test-category dummy_test_category',
'--pipeline-state-reporter-options=--test-category=dummy_test_category --test-type=dummy_test_type',
'--pipeline-state-reporter-options=--test-type=dummy_test_type --test-category=dummy_test_category',
'--pipeline-state-reporter-options=--test-category=dummy_test_category --test-type dummy_test_type',
'--pipeline-state-reporter-options=--test-category dummy_test_category --test-type=dummy_test_type'
])
def test_state_reporter_options_from_args(module_run_module, monkeypatch, args):
module, run_module_mock = module_run_module
module_name = 'module1'
options = [args]
test_batch = [(module_name, options)]
topic = 'dummy_topic'
module._config['pipeline-test-bus-topic'] = topic
report_pipeline_state_mock = MagicMock()
patch_shared(monkeypatch, module, {}, callables={
'report_pipeline_state': report_pipeline_state_mock,
'plan_test_batch': MagicMock(return_value=test_batch)
})
module.execute()
report_pipeline_state_mock.assert_called_with(
'queued',
thread_id=None,
topic=topic,
test_category='dummy_test_category',
test_type='dummy_test_type'
)
run_module_mock.assert_called_with(module_name, options)
def test_state_reporter_options_from_mapping(module_run_module, monkeypatch):
module, run_module_mock = module_run_module
module_name = 'module1'
options = ['--option1', '--option2']
test_batch = [(module_name, options)]
topic = 'dummy_topic'
module._config['pipeline-test-bus-topic'] = topic
module._config['pipeline-test-categories'] = 'dummy_topic_categories'
module._config['pipeline-test-types'] = 'dummy_topic_types'
report_pipeline_state_mock = MagicMock()
patch_shared(monkeypatch, module, {}, callables={
'report_pipeline_state': report_pipeline_state_mock,
'plan_test_batch': MagicMock(return_value=test_batch)
})
mapping_mock = MagicMock()
mapping_mock.match = MagicMock(return_value='dummy_value')
monkeypatch.setattr(
'gluetool.utils.SimplePatternMap',
MagicMock(return_value=mapping_mock)
)
module.execute()
report_pipeline_state_mock.assert_called_with(
'queued',
thread_id=None,
topic=topic,
test_category='dummy_value',
test_type='dummy_value'
)
run_module_mock.assert_called_with(module_name, options)
def test_state_reporter_options_mapping_error(module_run_module, monkeypatch):
module, run_module_mock = module_run_module
module_name = 'module1'
options = ['--option1', '--option2']
test_batch = [(module_name, options)]
topic = 'dummy_topic'
module._config['pipeline-test-bus-topic'] = topic
module._config['pipeline-test-categories'] = 'dummy_topic_categories'
module._config['pipeline-test-types'] = 'dummy_topic_types'
report_pipeline_state_mock = MagicMock()
patch_shared(monkeypatch, module, {}, callables={
'report_pipeline_state': report_pipeline_state_mock,
'plan_test_batch': MagicMock(return_value=test_batch)
})
mapping_mock = MagicMock()
mapping_mock.match = MagicMock()
mapping_mock.match.side_effect = gluetool.GlueError('dummy_error')
monkeypatch.setattr(
'gluetool.utils.SimplePatternMap',
MagicMock(return_value=mapping_mock)
)
module.execute()
report_pipeline_state_mock.assert_called_with(
'queued',
thread_id=None,
topic=topic,
test_category='unknown',
test_type='unknown'
)
run_module_mock.assert_called_with(module_name, options)
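The parametrized test above expects `--test-category` and `--test-type` to be recovered from any quoting/separator combination of the `--pipeline-state-reporter-options` argument. A sketch of the kind of parsing being exercised (this is an assumption for illustration, not TaskDispatcher's actual implementation):

```python
import re

def extract_test_info(arg):
    # Split off the option body, tolerating both quoted and unquoted forms.
    body = arg.split('=', 1)[1].strip('"')
    # Accept either '--flag value' or '--flag=value'.
    category = re.search(r'--test-category[= ](\S+)', body)
    test_type = re.search(r'--test-type[= ](\S+)', body)
    return (
        category.group(1) if category else 'unknown',
        test_type.group(1) if test_type else 'unknown',
    )
```

Every variant in the `@pytest.mark.parametrize` list maps to the same `('dummy_test_category', 'dummy_test_type')` pair under this scheme.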
# py_tdlib/constructors/passport_element_type_utility_bill.py
from ..factory import Type
class passportElementTypeUtilityBill(Type):
pass
# MLlib/optimizers.py
from MLlib.loss_func import MeanSquaredError
import numpy as np
import random
class GradientDescent():
'''
A classic gradient descent implementation.
W = W - a * dm
a - learning rate
dm - derivative of loss function wrt x (parameter)
W - Weights
'''
def __init__(self, learning_rate=0.01, loss_func=MeanSquaredError):
self.learning_rate = learning_rate
self.loss_func = loss_func
def iterate(self, X, Y, W):
return W - self.learning_rate * self.loss_func.derivative(X, Y, W)
class StochasticGradientDescent():
def __init__(self, learning_rate=0.01, loss_func=MeanSquaredError):
self.learning_rate = learning_rate
self.loss_func = loss_func
def iterate(self, X, Y, W):
M, N = X.shape
i = random.randint(0, M-1)
x, y = X[i, :], Y[:, i]
x.shape, y.shape = (1, N), (1, 1)
return W - self.learning_rate * self.loss_func.derivative(x, y, W)
class SGD(StochasticGradientDescent):
'''
An abstract class to provide an alias to the
really long class name StochasticGradientDescent.
'''
pass
class MiniBatchGradientDescent():
def __init__(
self, learning_rate=0.01,
loss_func=MeanSquaredError,
batch_size=5
):
self.learning_rate = learning_rate
self.loss_func = loss_func
self.batch_size = batch_size
def iterate(self, X, Y, W):
M, N = X.shape
index = [random.randint(0, M-1) for i in range(self.batch_size)]
x = X[index, :]
y = Y[:, index]
x.shape = (self.batch_size, N)
y.shape = (1, self.batch_size)
return W - self.learning_rate * self.loss_func.derivative(x, y, W)
class MiniBatchGD(MiniBatchGradientDescent):
'''
An abstract class to provide an alias to the
really long class name MiniBatchGradientDescent.
'''
pass
class MomentumGradientDescent():
def __init__(
self, learning_rate=0.01,
loss_func=MeanSquaredError,
batch_size=5,
gamma=0.9
):
self.learning_rate = learning_rate
self.loss_func = loss_func
self.batch_size = batch_size
self.gamma = gamma
self.Vp = 0
self.Vc = 0
def iterate(self, X, Y, W):
M, N = X.shape
index = [random.randint(0, M-1) for i in range(self.batch_size)]
x = X[index, :]
y = Y[:, index]
x.shape = (self.batch_size, N)
y.shape = (1, self.batch_size)
self.Vc = self.gamma * self.Vp + \
self.learning_rate * self.loss_func.derivative(x, y, W)
W = W - self.Vc
self.Vp = self.Vc
return W
class MomentumGD(MomentumGradientDescent):
'''
An abstract class to provide an alias to the
really long class name MomentumGradientDescent.
'''
pass
class NesterovAcceleratedGradientDescent():
def __init__(
self, learning_rate=0.01,
loss_func=MeanSquaredError,
batch_size=5,
gamma=0.9
):
self.learning_rate = learning_rate
self.loss_func = loss_func
self.batch_size = batch_size
self.gamma = gamma
self.Vp = 0
self.Vc = 0
def iterate(self, X, Y, W):
M, N = X.shape
index = [random.randint(0, M-1) for i in range(self.batch_size)]
x = X[index, :]
y = Y[:, index]
x.shape = (self.batch_size, N)
y.shape = (1, self.batch_size)
self.Vc = self.gamma * self.Vp + \
self.learning_rate * \
self.loss_func.derivative(x, y, W - self.gamma * self.Vp)
W = W - self.Vc
self.Vp = self.Vc
return W
class NesterovAccGD(NesterovAcceleratedGradientDescent):
'''
An abstract class to provide an alias to the
really long class name NesterovAcceleratedGradientDescent.
'''
pass
class Adagrad():
def __init__(
self, learning_rate=0.01,
loss_func=MeanSquaredError,
batch_size=5,
epsilon=0.00000001
):
self.learning_rate = learning_rate
self.loss_func = loss_func
self.batch_size = batch_size
self.epsilon = epsilon
self.S = 0
def iterate(self, X, Y, W):
M, N = X.shape
index = [random.randint(0, M-1) for i in range(self.batch_size)]
x = X[index, :]
y = Y[:, index]
x.shape = (self.batch_size, N)
y.shape = (1, self.batch_size)
derivative = self.loss_func.derivative(x, y, W)
self.S += derivative * derivative
W = W - self.learning_rate / \
np.sqrt(self.S + self.epsilon) * derivative
return W
class Adadelta():
def __init__(
self, learning_rate=0.01,
loss_func=MeanSquaredError,
batch_size=5,
gamma=0.9,
epsilon=0.00000001
):
self.learning_rate = learning_rate
self.loss_func = loss_func
self.batch_size = batch_size
self.epsilon = epsilon
self.gamma = gamma
self.S = 0
def iterate(self, X, Y, W):
M, N = X.shape
index = [random.randint(0, M-1) for i in range(self.batch_size)]
x = X[index, :]
y = Y[:, index]
x.shape = (self.batch_size, N)
y.shape = (1, self.batch_size)
derivative = self.loss_func.derivative(x, y, W)
        # Exponential moving average: assign, don't accumulate, or S double-counts
        self.S = self.gamma * self.S + \
            (1 - self.gamma) * derivative * derivative
W = W - self.learning_rate / \
np.sqrt(self.S + self.epsilon) * derivative
return W
class Adam():
def __init__(
self, learning_rate=0.01,
loss_func=MeanSquaredError,
batch_size=5,
epsilon=0.00000001,
beta1=0.9,
beta2=0.999
):
self.learning_rate = learning_rate
self.loss_func = loss_func
self.batch_size = batch_size
self.epsilon = epsilon
self.beta1 = beta1
self.beta2 = beta2
self.S = 0
self.Sc = 0
self.V = 0
        self.Vc = 0
        self.t = 0

    def iterate(self, X, Y, W):
        M, N = X.shape
        index = [random.randint(0, M-1) for i in range(self.batch_size)]
        x = X[index, :]
        y = Y[:, index]
        x.shape = (self.batch_size, N)
        y.shape = (1, self.batch_size)
        derivative = self.loss_func.derivative(x, y, W)
        self.V = self.beta1 * self.V + (1 - self.beta1) * derivative
        self.S = self.beta2 * self.S + \
            (1 - self.beta2) * derivative * derivative
        # Bias-correct with the running powers of beta1/beta2 at step t,
        # not with constant denominators
        self.t += 1
        self.Vc = self.V / (1 - self.beta1 ** self.t)
        self.Sc = self.S / (1 - self.beta2 ** self.t)
W = W - self.learning_rate / \
(np.sqrt(self.Sc) + self.epsilon) * self.Vc
return W
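All of these optimizers share the core update rule `W = W - a * dm` stated in the `GradientDescent` docstring. A self-contained sketch of that rule in plain NumPy (independent of MLlib's loss classes; the shapes and data below are illustrative only, not MLlib's internal convention):

```python
import numpy as np

# d/dW of mean((X @ W - Y)^2) for X: (M, N), Y: (M, 1), W: (N, 1)
def mse_derivative(X, Y, W):
    M = X.shape[0]
    return (2.0 / M) * X.T @ (X @ W - Y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_W = np.array([[1.0], [-2.0], [0.5]])
Y = X @ true_W  # noiseless linear targets, so the loss minimum is true_W

W = np.zeros((3, 1))
lr = 0.05
for _ in range(500):
    W = W - lr * mse_derivative(X, Y, W)  # the classic update: W <- W - a * dm
```

Because this objective is convex, the iterates contract toward `true_W`; the stochastic and mini-batch variants above perform the same update in expectation on sampled rows.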
# petpy/__init__.py
from .petpy import *
from .process import *
# cisco-ios-xr/ydk/models/cisco_ios_xr/_meta/_Cisco_IOS_XR_ipv4_io_oper.py
import re
import collections
from enum import Enum
from ydk._core._dm_meta_info import _MetaInfoClassMember, _MetaInfoClass, _MetaInfoEnum
from ydk.types import Empty, YList, YLeafList, DELETE, Decimal64, FixedBitsDict
from ydk._core._dm_meta_info import ATTRIBUTE, REFERENCE_CLASS, REFERENCE_LIST, REFERENCE_LEAFLIST, REFERENCE_IDENTITY_CLASS, REFERENCE_ENUM_CLASS, REFERENCE_BITS, REFERENCE_UNION
from ydk.errors import YPYError, YPYModelError
from ydk.providers._importer import _yang_ns
_meta_table = {
'RpfModeEnum' : _MetaInfoEnum('RpfModeEnum', 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper',
{
'strict':'strict',
'loose':'loose',
}, 'Cisco-IOS-XR-ipv4-io-oper', _yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper']),
'Ipv4MaOperLineStateEnum' : _MetaInfoEnum('Ipv4MaOperLineStateEnum', 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper',
{
'unknown':'unknown',
'shutdown':'shutdown',
'down':'down',
'up':'up',
}, 'Cisco-IOS-XR-ipv4-io-oper', _yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper']),
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs.Brief' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs.Brief',
False,
[
_MetaInfoClassMember('interface-name', ATTRIBUTE, 'str' , None, None,
[], ['(([a-zA-Z0-9_]*\\d+/){3,4}\\d+)|(([a-zA-Z0-9_]*\\d+/){3,4}\\d+\\.\\d+)|(([a-zA-Z0-9_]*\\d+/){2}([a-zA-Z0-9_]*\\d+))|(([a-zA-Z0-9_]*\\d+/){2}([a-zA-Z0-9_]+))|([a-zA-Z0-9_-]*\\d+)|([a-zA-Z0-9_-]*\\d+\\.\\d+)|(mpls)|(dwdm)'],
''' The name of the interface
''',
'interface_name',
'Cisco-IOS-XR-ipv4-io-oper', True),
_MetaInfoClassMember('line-state', REFERENCE_ENUM_CLASS, 'Ipv4MaOperLineStateEnum' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4MaOperLineStateEnum',
[], [],
''' Line state of the interface
''',
'line_state',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('primary-address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Primary address
''',
'primary_address',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('vrf-id', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' VRF ID of the interface
''',
'vrf_id',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'brief',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs',
False,
[
_MetaInfoClassMember('brief', REFERENCE_LIST, 'Brief' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs.Brief',
[], [],
''' Brief interface IPv4 network operational
data for an interface
''',
'brief',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'briefs',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Acl' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Acl',
False,
[
_MetaInfoClassMember('common-in-bound', ATTRIBUTE, 'str' , None, None,
[], [],
''' Common ACL applied to incoming packets
''',
'common_in_bound',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('common-out-bound', ATTRIBUTE, 'str' , None, None,
[], [],
''' Common ACL applied to outgoing packets
''',
'common_out_bound',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('inbound', ATTRIBUTE, 'str' , None, None,
[], [],
''' ACL applied to incoming packets
''',
'inbound',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('outbound', ATTRIBUTE, 'str' , None, None,
[], [],
''' ACL applied to outgoing packets
''',
'outbound',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'acl',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MultiAcl' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MultiAcl',
False,
[
_MetaInfoClassMember('common', REFERENCE_LEAFLIST, 'str' , None, None,
[], [],
''' Common ACLs
''',
'common',
'Cisco-IOS-XR-ipv4-io-oper', False, max_elements=5),
_MetaInfoClassMember('inbound', REFERENCE_LEAFLIST, 'str' , None, None,
[], [],
''' Inbound ACLs
''',
'inbound',
'Cisco-IOS-XR-ipv4-io-oper', False, max_elements=5),
_MetaInfoClassMember('outbound', REFERENCE_LEAFLIST, 'str' , None, None,
[], [],
''' Outbound ACLs
''',
'outbound',
'Cisco-IOS-XR-ipv4-io-oper', False, max_elements=5),
],
'Cisco-IOS-XR-ipv4-io-oper',
'multi-acl',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.HelperAddress' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.HelperAddress',
False,
[
_MetaInfoClassMember('address-array', REFERENCE_LEAFLIST, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Helper address
''',
'address_array',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'helper-address',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Rpf' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Rpf',
False,
[
_MetaInfoClassMember('allow-default-route', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Allow Default Route
''',
'allow_default_route',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('allow-self-ping', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Allow Self Ping
''',
'allow_self_ping',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('enable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable RPF config
''',
'enable',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mode', REFERENCE_ENUM_CLASS, 'RpfModeEnum' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'RpfModeEnum',
[], [],
''' RPF Mode (loose/strict)
''',
'mode',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'rpf',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Input' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Input',
False,
[
_MetaInfoClassMember('destination', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable destination accouting
''',
'destination',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('enable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable BGP PA for ingress/egress
''',
'enable',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('source', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable source accouting
''',
'source',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'input',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Output' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Output',
False,
[
_MetaInfoClassMember('destination', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable destination accouting
''',
'destination',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('enable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable BGP PA for ingress/egress
''',
'enable',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('source', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable source accouting
''',
'source',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'output',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa',
False,
[
_MetaInfoClassMember('input', REFERENCE_CLASS, 'Input' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Input',
[], [],
''' BGP PA input config
''',
'input',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('output', REFERENCE_CLASS, 'Output' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Output',
[], [],
''' BGP PA output config
''',
'output',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'bgp-pa',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.PubUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.PubUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-io-oper',
'pub-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.IdbUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.IdbUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-io-oper',
'idb-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.CapsUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.CapsUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-io-oper',
'caps-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdEnUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdEnUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-io-oper',
'fwd-en-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdDisUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdDisUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-io-oper',
'fwd-dis-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MulticastGroup' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MulticastGroup',
False,
[
_MetaInfoClassMember('group-address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Address of multicast group
''',
'group_address',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'multicast-group',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.SecondaryAddress' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.SecondaryAddress',
False,
[
_MetaInfoClassMember('address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Address
''',
'address',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('prefix-length', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Prefix length of address
''',
'prefix_length',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('route-tag', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Route Tag associated with this address (0 = no
tag)
''',
'route_tag',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'secondary-address',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail',
False,
[
_MetaInfoClassMember('interface-name', ATTRIBUTE, 'str' , None, None,
[], ['(([a-zA-Z0-9_]*\\d+/){3,4}\\d+)|(([a-zA-Z0-9_]*\\d+/){3,4}\\d+\\.\\d+)|(([a-zA-Z0-9_]*\\d+/){2}([a-zA-Z0-9_]*\\d+))|(([a-zA-Z0-9_]*\\d+/){2}([a-zA-Z0-9_]+))|([a-zA-Z0-9_-]*\\d+)|([a-zA-Z0-9_-]*\\d+\\.\\d+)|(mpls)|(dwdm)'],
''' The name of the interface
''',
'interface_name',
'Cisco-IOS-XR-ipv4-io-oper', True),
_MetaInfoClassMember('acl', REFERENCE_CLASS, 'Acl' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Acl',
[], [],
''' ACLs configured on the interface
''',
'acl',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('bgp-pa', REFERENCE_CLASS, 'BgpPa' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa',
[], [],
''' BGP PA config on the interface
''',
'bgp_pa',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('caps-utime', REFERENCE_CLASS, 'CapsUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.CapsUtime',
[], [],
''' CAPS Add Time
''',
'caps_utime',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('direct-broadcast', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are direct broadcasts sent on the interface?
''',
'direct_broadcast',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('flow-tag-dst', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is BGP Flow Tag Destination enabled?
''',
'flow_tag_dst',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('flow-tag-src', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is BGP Flow Tag Source enabled?
''',
'flow_tag_src',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('fwd-dis-utime', REFERENCE_CLASS, 'FwdDisUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdDisUtime',
[], [],
''' FWD DISABLE Time
''',
'fwd_dis_utime',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('fwd-en-utime', REFERENCE_CLASS, 'FwdEnUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdEnUtime',
[], [],
''' FWD ENABLE Time
''',
'fwd_en_utime',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('helper-address', REFERENCE_CLASS, 'HelperAddress' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.HelperAddress',
[], [],
''' Helper Addresses configured on the interface
''',
'helper_address',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('idb-utime', REFERENCE_CLASS, 'IdbUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.IdbUtime',
[], [],
''' IDB Create Time
''',
'idb_utime',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('line-state', REFERENCE_ENUM_CLASS, 'Ipv4MaOperLineStateEnum' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4MaOperLineStateEnum',
[], [],
''' Line state of the interface
''',
'line_state',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mask-reply', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are mask replies sent on the interface?
''',
'mask_reply',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mlacp-active', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is mLACP state Active (valid if RG ID exists)
''',
'mlacp_active',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mtu', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' IP MTU of the interface
''',
'mtu',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('multi-acl', REFERENCE_CLASS, 'MultiAcl' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MultiAcl',
[], [],
''' Multi ACLs configured on the interface
''',
'multi_acl',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('multicast-group', REFERENCE_LIST, 'MulticastGroup' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MulticastGroup',
[], [],
''' Multicast groups joined on the interface
''',
'multicast_group',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('prefix-length', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Prefix length of primary address
''',
'prefix_length',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('primary-address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Primary address
''',
'primary_address',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('proxy-arp-disabled', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is Proxy ARP disabled on the interface?
''',
'proxy_arp_disabled',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('pub-utime', REFERENCE_CLASS, 'PubUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.PubUtime',
[], [],
''' Address Publish Time
''',
'pub_utime',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('redirect', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are ICMP redirects sent on the interface?
''',
'redirect',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('rg-id-exists', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Does ICCP RG ID exist on the interface?
''',
'rg_id_exists',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('route-tag', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Route tag associated with the primary address (0
= no tag)
''',
'route_tag',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('rpf', REFERENCE_CLASS, 'Rpf' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Rpf',
[], [],
''' RPF config on the interface
''',
'rpf',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('secondary-address', REFERENCE_LIST, 'SecondaryAddress' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.SecondaryAddress',
[], [],
''' Secondary addresses on the interface
''',
'secondary_address',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('unnumbered-interface-name', ATTRIBUTE, 'str' , None, None,
[], [],
''' Name of referenced interface (valid if
unnumbered)
''',
'unnumbered_interface_name',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('unreachable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are ICMP unreachables sent on the interface?
''',
'unreachable',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('vrf-id', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' VRF ID of the interface
''',
'vrf_id',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'detail',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details',
False,
[
_MetaInfoClassMember('detail', REFERENCE_LIST, 'Detail' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail',
[], [],
''' Detail interface IPv4 network operational
data for an interface
''',
'detail',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'details',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf',
False,
[
_MetaInfoClassMember('vrf-name', ATTRIBUTE, 'str' , None, None,
[], ['[\\w\\-\\.:,_@#%$\\+=\\|;]+'],
''' The VRF name
''',
'vrf_name',
'Cisco-IOS-XR-ipv4-io-oper', True),
_MetaInfoClassMember('briefs', REFERENCE_CLASS, 'Briefs' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs',
[], [],
''' Brief interface IPv4 network operational
data for a node
''',
'briefs',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('details', REFERENCE_CLASS, 'Details' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details',
[], [],
''' Detail interface IPv4 network operational
data for a node
''',
'details',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'vrf',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Vrfs' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Vrfs',
False,
[
_MetaInfoClassMember('vrf', REFERENCE_LIST, 'Vrf' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf',
[], [],
''' VRF that an interface belongs to
''',
'vrf',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'vrfs',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpUp' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpUp',
False,
[
_MetaInfoClassMember('ip-assigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of interfaces with explicit addresses
''',
'ip_assigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unassigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unassigned interfaces with explicit
addresses
''',
'ip_unassigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unnumbered', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unnumbered interfaces with explicit
addresses
''',
'ip_unnumbered',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'if-up-up',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpDown' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpDown',
False,
[
_MetaInfoClassMember('ip-assigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of interfaces with explicit addresses
''',
'ip_assigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unassigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unassigned interfaces with explicit
addresses
''',
'ip_unassigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unnumbered', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unnumbered interfaces with explicit
addresses
''',
'ip_unnumbered',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'if-up-down',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfDownDown' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Summary.IfDownDown',
False,
[
_MetaInfoClassMember('ip-assigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of interfaces with explicit addresses
''',
'ip_assigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unassigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unassigned interfaces with explicit
addresses
''',
'ip_unassigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unnumbered', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unnumbered interfaces with explicit
addresses
''',
'ip_unnumbered',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'if-down-down',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfShutdownDown' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Summary.IfShutdownDown',
False,
[
_MetaInfoClassMember('ip-assigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of interfaces with explicit addresses
''',
'ip_assigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unassigned', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unassigned interfaces with explicit
addresses
''',
'ip_unassigned',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ip-unnumbered', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of unnumbered interfaces with explicit
addresses
''',
'ip_unnumbered',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'if-shutdown-down',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData.Summary' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData.Summary',
False,
[
_MetaInfoClassMember('if-down-down', REFERENCE_CLASS, 'IfDownDown' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfDownDown',
[], [],
''' Number of interfaces (down,down)
''',
'if_down_down',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('if-shutdown-down', REFERENCE_CLASS, 'IfShutdownDown' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfShutdownDown',
[], [],
''' Number of interfaces (shutdown,down)
''',
'if_shutdown_down',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('if-up-down', REFERENCE_CLASS, 'IfUpDown' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpDown',
[], [],
''' Number of interfaces (up,down)
''',
'if_up_down',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('if-up-down-basecaps-up', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Number of interfaces (up,down) with basecaps up
''',
'if_up_down_basecaps_up',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('if-up-up', REFERENCE_CLASS, 'IfUpUp' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpUp',
[], [],
''' Number of interfaces (up,up)
''',
'if_up_up',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'summary',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.InterfaceData' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.InterfaceData',
False,
[
_MetaInfoClassMember('summary', REFERENCE_CLASS, 'Summary' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Summary',
[], [],
''' Summary of IPv4 network operational interface
data on a node
''',
'summary',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('vrfs', REFERENCE_CLASS, 'Vrfs' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData.Vrfs',
[], [],
''' VRF specific IPv4 network operational
interface data
''',
'vrfs',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'interface-data',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.Statistics.Traffic.Ipv4Stats' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.Statistics.Traffic.Ipv4Stats',
False,
[
_MetaInfoClassMember('bad-header', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Bad Header
''',
'bad_header',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('bad-hop-count', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Bad Hop Count
''',
'bad_hop_count',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('bad-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Bad Option
''',
'bad_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('bad-security-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Bad Security Option
''',
'bad_security_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('bad-source-address', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Bad Source Address
''',
'bad_source_address',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('basic-security-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Basic Security Option
''',
'basic_security_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('broadcast-in', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Broadcast In
''',
'broadcast_in',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('broadcast-out', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Broadcast Out
''',
'broadcast_out',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('cipso-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' CIPSO Option
''',
'cipso_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('encapsulation-failed', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Encapsulation Failed
''',
'encapsulation_failed',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('end-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' End Option
''',
'end_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('extended-security-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Extended Security Option
''',
'extended_security_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('format-errors', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Format Errors
''',
'format_errors',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('fragment-count', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Fragment Count
''',
'fragment_count',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('input-packets', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Input Packets
''',
'input_packets',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('lisp-decap-error', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Lisp decap errors
''',
'lisp_decap_error',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('lisp-encap-error', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Lisp encap errors
''',
'lisp_encap_error',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('lisp-v4-decap', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Lisp IPv4 decapped packets
''',
'lisp_v4_decap',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('lisp-v4-encap', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Lisp IPv4 encapped packets
''',
'lisp_v4_encap',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('lisp-v6-decap', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Lisp IPv6 decapped packets
''',
'lisp_v6_decap',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('lisp-v6-encap', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Lisp IPv6 encapped packets
''',
'lisp_v6_encap',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('loose-source-route-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Loose Source Route Option
''',
'loose_source_route_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('multicast-in', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Multicast In
''',
'multicast_in',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('multicast-out', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Multicast Out
''',
'multicast_out',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('no-gateway', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' No Gateway
''',
'no_gateway',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('no-protocol', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' No Protocol
''',
'no_protocol',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('no-router', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' No Router
''',
'no_router',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('noop-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Noop Option
''',
'noop_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('options-present', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' IP Options Present
''',
'options_present',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('packet-too-big', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Packet Too Big
''',
'packet_too_big',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('packets-forwarded', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Packets Forwarded
''',
'packets_forwarded',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('packets-fragmented', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Packets Fragmented
''',
'packets_fragmented',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('packets-output', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Packets Output
''',
'packets_output',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('reassemble-failed', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Reassembly Failed
''',
'reassemble_failed',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('reassemble-input', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Reassembly Input
''',
'reassemble_input',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('reassemble-max-drop', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Reassembly Max Drop
''',
'reassemble_max_drop',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('reassemble-timeout', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Reassembly Timeout
''',
'reassemble_timeout',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('reassembled', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Reassembled
''',
'reassembled',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('received-packets', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Received Packets
''',
'received_packets',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('record-route-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Record Route Option
''',
'record_route_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('router-alert-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Router Alert Option
''',
'router_alert_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('sid-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' SID Option
''',
'sid_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('strict-source-route-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Strict Source Route Option
''',
'strict_source_route_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('timestamp-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Timestamp Option
''',
'timestamp_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('unknown-option', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Unknown Option
''',
'unknown_option',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'ipv4-stats',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.Statistics.Traffic.IcmpStats' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.Statistics.Traffic.IcmpStats',
False,
[
_MetaInfoClassMember('admin-unreachable-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Admin Unreachable Received
''',
'admin_unreachable_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('admin-unreachable-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Admin Unreachable Sent
''',
'admin_unreachable_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('checksum-error', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Checksum Errors
''',
'checksum_error',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('echo-reply-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Echo Reply Received
''',
'echo_reply_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('echo-reply-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Echo Reply Sent
''',
'echo_reply_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('echo-request-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Echo Request Received
''',
'echo_request_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('echo-request-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Echo Request Sent
''',
'echo_request_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('fragment-unreachable-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Fragment Unreachable Received
''',
'fragment_unreachable_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('fragment-unreachable-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Fragment Unreachable Sent
''',
'fragment_unreachable_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('hopcount-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Hopcount Received
''',
'hopcount_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('hopcount-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Hopcount Sent
''',
'hopcount_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('host-unreachable-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Host Unreachable Received
''',
'host_unreachable_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('host-unreachable-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Host Unreachable Sent
''',
'host_unreachable_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mask-reply-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Mask Reply Received
''',
'mask_reply_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mask-reply-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Mask Reply Sent
''',
'mask_reply_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mask-request-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Mask Request Received
''',
'mask_request_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('mask-request-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Mask Request Sent
''',
'mask_request_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('network-unreachable-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Network Unreachable Received
''',
'network_unreachable_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('network-unreachable-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Network Unreachable Sent
''',
'network_unreachable_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('output', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Transmitted
''',
'output',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('param-error-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Parameter Error Received
''',
'param_error_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('param-error-send', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Parameter Error Sent
''',
'param_error_send',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('port-unreachable-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Port Unreachable Received
''',
'port_unreachable_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('port-unreachable-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Port Unreachable Sent
''',
'port_unreachable_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('protocol-unreachable-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Protocol Unreachable Received
''',
'protocol_unreachable_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('protocol-unreachable-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Protocol Unreachable Sent
''',
'protocol_unreachable_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('reassebly-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Reassembly Received
''',
'reassebly_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('reassembly-sent', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Reassembly Sent
''',
'reassembly_sent',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Received
''',
'received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('redirect-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Redirect Received
''',
'redirect_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('redirect-send', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Redirect Sent
''',
'redirect_send',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('router-advert-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Router Advertisement Received
''',
'router_advert_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('router-solicit-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Router Solicitation Received
''',
'router_solicit_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('source-quench-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Source Quench Received
''',
'source_quench_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('timestamp-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Timestamp Received
''',
'timestamp_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('timestamp-reply-received', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Timestamp Reply Received
''',
'timestamp_reply_received',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('unknown', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' ICMP Unknown
''',
'unknown',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'icmp-stats',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.Statistics.Traffic' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.Statistics.Traffic',
False,
[
_MetaInfoClassMember('icmp-stats', REFERENCE_CLASS, 'IcmpStats' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.Statistics.Traffic.IcmpStats',
[], [],
''' ICMP Stats
''',
'icmp_stats',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('ipv4-stats', REFERENCE_CLASS, 'Ipv4Stats' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.Statistics.Traffic.Ipv4Stats',
[], [],
''' IPv4 Network Stats
''',
'ipv4_stats',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'traffic',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node.Statistics' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node.Statistics',
False,
[
_MetaInfoClassMember('traffic', REFERENCE_CLASS, 'Traffic' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.Statistics.Traffic',
[], [],
''' Traffic statistics for a node
''',
'traffic',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'statistics',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes.Node' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes.Node',
False,
[
_MetaInfoClassMember('node-name', ATTRIBUTE, 'str' , None, None,
[], ['([a-zA-Z0-9_]*\\d+/){1,2}([a-zA-Z0-9_]*\\d+)'],
''' The node name
''',
'node_name',
'Cisco-IOS-XR-ipv4-io-oper', True),
_MetaInfoClassMember('interface-data', REFERENCE_CLASS, 'InterfaceData' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.InterfaceData',
[], [],
''' IPv4 network operational interface data
''',
'interface_data',
'Cisco-IOS-XR-ipv4-io-oper', False),
_MetaInfoClassMember('statistics', REFERENCE_CLASS, 'Statistics' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node.Statistics',
[], [],
''' Statistical IPv4 network operational data for
a node
''',
'statistics',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'node',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Nodes' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Nodes',
False,
[
_MetaInfoClassMember('node', REFERENCE_LIST, 'Node' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes.Node',
[], [],
''' IPv4 network operational data for a particular
node
''',
'node',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'nodes',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Acl' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Acl',
False,
[
_MetaInfoClassMember('common-in-bound', ATTRIBUTE, 'str' , None, None,
[], [],
''' Common ACL applied to incoming packets
''',
'common_in_bound',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('common-out-bound', ATTRIBUTE, 'str' , None, None,
[], [],
''' Common ACL applied to outgoing packets
''',
'common_out_bound',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('inbound', ATTRIBUTE, 'str' , None, None,
[], [],
''' ACL applied to incoming packets
''',
'inbound',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('outbound', ATTRIBUTE, 'str' , None, None,
[], [],
''' ACL applied to outgoing packets
''',
'outbound',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'acl',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MultiAcl' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MultiAcl',
False,
[
_MetaInfoClassMember('common', REFERENCE_LEAFLIST, 'str' , None, None,
[], [],
''' Common ACLs
''',
'common',
'Cisco-IOS-XR-ipv4-ma-oper', False, max_elements=5),
_MetaInfoClassMember('inbound', REFERENCE_LEAFLIST, 'str' , None, None,
[], [],
''' Inbound ACLs
''',
'inbound',
'Cisco-IOS-XR-ipv4-ma-oper', False, max_elements=5),
_MetaInfoClassMember('outbound', REFERENCE_LEAFLIST, 'str' , None, None,
[], [],
''' Outbound ACLs
''',
'outbound',
'Cisco-IOS-XR-ipv4-ma-oper', False, max_elements=5),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'multi-acl',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.HelperAddress' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.HelperAddress',
False,
[
_MetaInfoClassMember('address-array', REFERENCE_LEAFLIST, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Helper address
''',
'address_array',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'helper-address',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Rpf' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Rpf',
False,
[
_MetaInfoClassMember('allow-default-route', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Allow Default Route
''',
'allow_default_route',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('allow-self-ping', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Allow Self Ping
''',
'allow_self_ping',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('enable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable RPF config
''',
'enable',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('mode', REFERENCE_ENUM_CLASS, 'RpfModeEnum' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_ma_oper', 'RpfModeEnum',
[], [],
''' RPF Mode (loose/strict)
''',
'mode',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'rpf',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Input' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Input',
False,
[
_MetaInfoClassMember('destination', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable destination accounting
''',
'destination',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('enable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable BGP PA for ingress/egress
''',
'enable',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('source', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable source accounting
''',
'source',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'input',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Output' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Output',
False,
[
_MetaInfoClassMember('destination', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable destination accounting
''',
'destination',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('enable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable BGP PA for ingress/egress
''',
'enable',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('source', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Enable source accounting
''',
'source',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'output',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa',
False,
[
_MetaInfoClassMember('input', REFERENCE_CLASS, 'Input' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Input',
[], [],
''' BGP PA input config
''',
'input',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('output', REFERENCE_CLASS, 'Output' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Output',
[], [],
''' BGP PA output config
''',
'output',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'bgp-pa',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.PubUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.PubUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-ma-oper',
'pub-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.IdbUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.IdbUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-ma-oper',
'idb-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.CapsUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.CapsUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-ma-oper',
'caps-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdEnUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdEnUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-ma-oper',
'fwd-en-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdDisUtime' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdDisUtime',
False,
[
],
'Cisco-IOS-XR-ipv4-ma-oper',
'fwd-dis-utime',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MulticastGroup' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MulticastGroup',
False,
[
_MetaInfoClassMember('group-address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Address of multicast group
''',
'group_address',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'multicast-group',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.SecondaryAddress' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.SecondaryAddress',
False,
[
_MetaInfoClassMember('address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Address
''',
'address',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('prefix-length', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Prefix length of address
''',
'prefix_length',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('route-tag', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Route Tag associated with this address (0 = no
tag)
''',
'route_tag',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'secondary-address',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail',
False,
[
_MetaInfoClassMember('acl', REFERENCE_CLASS, 'Acl' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Acl',
[], [],
''' ACLs configured on the interface
''',
'acl',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('bgp-pa', REFERENCE_CLASS, 'BgpPa' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa',
[], [],
''' BGP PA config on the interface
''',
'bgp_pa',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('caps-utime', REFERENCE_CLASS, 'CapsUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.CapsUtime',
[], [],
''' CAPS Add Time
''',
'caps_utime',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('direct-broadcast', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are direct broadcasts sent on the interface?
''',
'direct_broadcast',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('flow-tag-dst', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is BGP Flow Tag Destination enabled?
''',
'flow_tag_dst',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('flow-tag-src', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is BGP Flow Tag Source enabled?
''',
'flow_tag_src',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('fwd-dis-utime', REFERENCE_CLASS, 'FwdDisUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdDisUtime',
[], [],
''' FWD DISABLE Time
''',
'fwd_dis_utime',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('fwd-en-utime', REFERENCE_CLASS, 'FwdEnUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdEnUtime',
[], [],
''' FWD ENABLE Time
''',
'fwd_en_utime',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('helper-address', REFERENCE_CLASS, 'HelperAddress' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.HelperAddress',
[], [],
''' Helper Addresses configured on the interface
''',
'helper_address',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('idb-utime', REFERENCE_CLASS, 'IdbUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.IdbUtime',
[], [],
''' IDB Create Time
''',
'idb_utime',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('line-state', REFERENCE_ENUM_CLASS, 'Ipv4MaOperLineStateEnum' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_ma_oper', 'Ipv4MaOperLineStateEnum',
[], [],
''' Line state of the interface
''',
'line_state',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('mask-reply', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are mask replies sent on the interface?
''',
'mask_reply',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('mlacp-active', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is mLACP state Active (valid if RG ID exists)
''',
'mlacp_active',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('mtu', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' IP MTU of the interface
''',
'mtu',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('multi-acl', REFERENCE_CLASS, 'MultiAcl' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MultiAcl',
[], [],
''' Multi ACLs configured on the interface
''',
'multi_acl',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('multicast-group', REFERENCE_LIST, 'MulticastGroup' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MulticastGroup',
[], [],
''' Multicast groups joined on the interface
''',
'multicast_group',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('prefix-length', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Prefix length of primary address
''',
'prefix_length',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('primary-address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Primary address
''',
'primary_address',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('proxy-arp-disabled', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Is Proxy ARP disabled on the interface?
''',
'proxy_arp_disabled',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('pub-utime', REFERENCE_CLASS, 'PubUtime' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.PubUtime',
[], [],
''' Address Publish Time
''',
'pub_utime',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('redirect', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are ICMP redirects sent on the interface?
''',
'redirect',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('rg-id-exists', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Does ICCP RG ID exist on the interface?
''',
'rg_id_exists',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('route-tag', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Route tag associated with the primary address (0
= no tag)
''',
'route_tag',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('rpf', REFERENCE_CLASS, 'Rpf' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Rpf',
[], [],
''' RPF config on the interface
''',
'rpf',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('secondary-address', REFERENCE_LIST, 'SecondaryAddress' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.SecondaryAddress',
[], [],
''' Secondary addresses on the interface
''',
'secondary_address',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('unnumbered-interface-name', ATTRIBUTE, 'str' , None, None,
[], [],
''' Name of referenced interface (valid if
unnumbered)
''',
'unnumbered_interface_name',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('unreachable', ATTRIBUTE, 'bool' , None, None,
[], [],
''' Are ICMP unreachables sent on the interface?
''',
'unreachable',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('vrf-id', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' VRF ID of the interface
''',
'vrf_id',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'detail',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Brief' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Brief',
False,
[
_MetaInfoClassMember('line-state', REFERENCE_ENUM_CLASS, 'Ipv4MaOperLineStateEnum' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_ma_oper', 'Ipv4MaOperLineStateEnum',
[], [],
''' Line state of the interface
''',
'line_state',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('primary-address', ATTRIBUTE, 'str' , None, None,
[], ['(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'],
''' Primary address
''',
'primary_address',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('vrf-id', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' VRF ID of the interface
''',
'vrf_id',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'brief',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs.Vrf' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs.Vrf',
False,
[
_MetaInfoClassMember('vrf-name', ATTRIBUTE, 'str' , None, None,
[], ['[\\w\\-\\.:,_@#%$\\+=\\|;]+'],
''' The VRF name
''',
'vrf_name',
'Cisco-IOS-XR-ipv4-ma-oper', True),
_MetaInfoClassMember('brief', REFERENCE_CLASS, 'Brief' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Brief',
[], [],
''' Brief IPv4 network operational data for an
interface
''',
'brief',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('detail', REFERENCE_CLASS, 'Detail' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail',
[], [],
''' Detail IPv4 network operational data for an
interface
''',
'detail',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'vrf',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface.Vrfs' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface.Vrfs',
False,
[
_MetaInfoClassMember('vrf', REFERENCE_LIST, 'Vrf' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs.Vrf',
[], [],
''' VRF information on the interface
''',
'vrf',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'vrfs',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces.Interface' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces.Interface',
False,
[
_MetaInfoClassMember('interface-name', ATTRIBUTE, 'str' , None, None,
[], ['(([a-zA-Z0-9_]*\\d+/){3,4}\\d+)|(([a-zA-Z0-9_]*\\d+/){3,4}\\d+\\.\\d+)|(([a-zA-Z0-9_]*\\d+/){2}([a-zA-Z0-9_]*\\d+))|(([a-zA-Z0-9_]*\\d+/){2}([a-zA-Z0-9_]+))|([a-zA-Z0-9_-]*\\d+)|([a-zA-Z0-9_-]*\\d+\\.\\d+)|(mpls)|(dwdm)'],
''' The name of the interface
''',
'interface_name',
'Cisco-IOS-XR-ipv4-ma-oper', True),
_MetaInfoClassMember('vrfs', REFERENCE_CLASS, 'Vrfs' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface.Vrfs',
[], [],
''' List of VRFs on the interface
''',
'vrfs',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'interface',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network.Interfaces' : {
'meta_info' : _MetaInfoClass('Ipv4Network.Interfaces',
False,
[
_MetaInfoClassMember('interface', REFERENCE_LIST, 'Interface' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces.Interface',
[], [],
''' Interface names with VRF
''',
'interface',
'Cisco-IOS-XR-ipv4-ma-oper', False),
],
'Cisco-IOS-XR-ipv4-ma-oper',
'interfaces',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-ma-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
'Ipv4Network' : {
'meta_info' : _MetaInfoClass('Ipv4Network',
False,
[
_MetaInfoClassMember('interfaces', REFERENCE_CLASS, 'Interfaces' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Interfaces',
[], [],
''' IPv4 network operational interface data
''',
'interfaces',
'Cisco-IOS-XR-ipv4-ma-oper', False),
_MetaInfoClassMember('nodes', REFERENCE_CLASS, 'Nodes' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper', 'Ipv4Network.Nodes',
[], [],
''' Node-specific IPv4 network operational data
''',
'nodes',
'Cisco-IOS-XR-ipv4-io-oper', False),
],
'Cisco-IOS-XR-ipv4-io-oper',
'ipv4-network',
_yang_ns._namespaces['Cisco-IOS-XR-ipv4-io-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_io_oper'
),
},
}
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs.Brief']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Input']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa.Output']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Acl']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MultiAcl']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.HelperAddress']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.Rpf']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.BgpPa']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.PubUtime']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.IdbUtime']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.CapsUtime']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdEnUtime']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.FwdDisUtime']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.MulticastGroup']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail.SecondaryAddress']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details.Detail']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Briefs']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf.Details']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs.Vrf']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpUp']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary.IfUpDown']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary.IfDownDown']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary.IfShutdownDown']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Vrfs']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData.Summary']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.InterfaceData']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.Statistics.Traffic.Ipv4Stats']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.Statistics.Traffic']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.Statistics.Traffic.IcmpStats']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.Statistics.Traffic']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.Statistics.Traffic']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node.Statistics']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.InterfaceData']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node']['meta_info']
_meta_table['Ipv4Network.Nodes.Node.Statistics']['meta_info'].parent =_meta_table['Ipv4Network.Nodes.Node']['meta_info']
_meta_table['Ipv4Network.Nodes.Node']['meta_info'].parent =_meta_table['Ipv4Network.Nodes']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Input']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa.Output']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Acl']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MultiAcl']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.HelperAddress']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.Rpf']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.BgpPa']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.PubUtime']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.IdbUtime']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.CapsUtime']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdEnUtime']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.FwdDisUtime']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.MulticastGroup']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail.SecondaryAddress']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Detail']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf.Brief']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs.Vrf']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface.Vrfs']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface.Vrfs']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces.Interface']['meta_info']
_meta_table['Ipv4Network.Interfaces.Interface']['meta_info'].parent =_meta_table['Ipv4Network.Interfaces']['meta_info']
_meta_table['Ipv4Network.Nodes']['meta_info'].parent =_meta_table['Ipv4Network']['meta_info']
_meta_table['Ipv4Network.Interfaces']['meta_info'].parent =_meta_table['Ipv4Network']['meta_info']
# test/encoders/test_conv_encoder.py (lijianhackthon/neural_sp, Apache-2.0)
# -*- coding: utf-8 -*-
"""Test for CNN encoder."""
import importlib
import numpy as np
import pytest
import torch
from neural_sp.models.torch_utils import np2tensor
from neural_sp.models.torch_utils import pad_list
def make_args_2d(**kwargs):
args = dict(
input_dim=80,
in_channel=1,
channels="32_32_32",
kernel_sizes="(3,3)_(3,3)_(3,3)",
strides="(1,1)_(1,1)_(1,1)",
poolings="(2,2)_(2,2)_(2,2)",
dropout=0.1,
batch_norm=False,
layer_norm=False,
residual=False,
bottleneck_dim=0,
param_init=0.1,
layer_norm_eps=1e-12
)
args.update(kwargs)
return args
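# make_args_2d (and make_args_1d below) follows a defaults-plus-overrides
# pattern: build the full default dict, then let each parametrized case
# override only the keys under test. The pattern in isolation, with a
# hypothetical make_args rather than the neural_sp helpers:

```python
def make_args(**kwargs):
    # full set of defaults; callers override any subset via keyword arguments
    args = dict(input_dim=80, channels="32_32_32", dropout=0.1)
    args.update(kwargs)
    return args

base = make_args()
custom = make_args(channels="32_32", dropout=0.0)
print(base["channels"])     # 32_32_32
print(custom["channels"])   # 32_32
print(custom["input_dim"])  # 80 (untouched default)
```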
@pytest.mark.parametrize(
"args",
[
# subsample4
({'channels': "32_32", 'kernel_sizes': "(3,3)_(3,3)",
'strides': "(1,1)_(1,1)", 'poolings': "(2,2)_(2,2)"}),
({'channels': "32_32", 'kernel_sizes': "(3,3)_(3,3)",
'strides': "(1,1)_(1,1)", 'poolings': "(2,2)_(2,1)"}),
# ({'channels': "32_32", 'kernel_sizes': "(3,3)_(3,3)",
# 'strides': "(1,1)_(1,1)", 'poolings': "(2,2)_(1,2)"}),
({'channels': "32_32", 'kernel_sizes': "(3,3)_(3,3)",
'strides': "(1,1)_(1,1)", 'poolings': "(1,1)_(1,1)"}),
# subsample8
({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
'poolings': "(2,2)_(2,2)_(2,2)"}),
({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
'poolings': "(2,2)_(2,2)_(2,1)"}),
# ({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
# 'poolings': "(2,2)_(2,2)_(1,2)"}),
({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
'poolings': "(2,2)_(2,1)_(2,1)"}),
# ({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
# 'poolings': "(2,2)_(1,2)_(1,2)"}),
({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
'poolings': "(2,2)_(1,1)_(1,1)"}),
({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
'poolings': "(2,1)_(1,1)_(1,1)"}),
# ({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
# 'poolings': "(1,2)_(1,1)_(1,1)"}),
({'channels': "32_32_32", 'kernel_sizes': "(3,3)_(3,3)_(3,3)",
'poolings': "(1,1)_(1,1)_(1,1)"}),
# others
({'batch_norm': True}),
({'layer_norm': True}),
({'residual': True}),
({'bottleneck_dim': 8}),
]
)
def test_forward_2d(args):
args = make_args_2d(**args)
batch_size = 4
xmaxs = [40, 45]
device = "cpu"
module = importlib.import_module('neural_sp.models.seq2seq.encoders.conv')
(channels, kernel_sizes, strides, poolings), is_1dconv = module.parse_cnn_config(
args['channels'], args['kernel_sizes'],
args['strides'], args['poolings'])
assert not is_1dconv
enc = module.ConvEncoder(**args)
enc = enc.to(device)
for xmax in xmaxs:
xs = np.random.randn(batch_size, xmax, args['input_dim']).astype(np.float32)
xlens = torch.IntTensor([len(x) - i * enc.subsampling_factor for i, x in enumerate(xs)])
xs = pad_list([np2tensor(x, device).float() for x in xs], 0.)
xs, xlens = enc(xs, xlens)
assert xs.size(0) == batch_size
assert xs.size(1) == xlens.max(), (xs.size(), xlens)
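# The parametrized configs above encode one value per layer joined by
# underscores: "(h,w)" pairs in the 2-D case, bare integers in the 1-D case.
# A minimal hypothetical parser for that string format (a stand-in sketch,
# not the real module.parse_cnn_config):

```python
import re

def parse_layer_conf(conf):
    """Parse "(3,3)_(2,1)" into [(3, 3), (2, 1)], or "3_3" into [3, 3]."""
    layers = conf.split("_")
    if layers and layers[0].startswith("("):
        # 2-D case: extract the integers inside each "(h,w)" group
        return [tuple(int(n) for n in re.findall(r"\d+", layer)) for layer in layers]
    # 1-D case: each layer is a bare integer
    return [int(layer) for layer in layers]

print(parse_layer_conf("(3,3)_(2,1)"))  # [(3, 3), (2, 1)]
print(parse_layer_conf("3_3_3"))        # [3, 3, 3]
```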
def make_args_1d(**kwargs):
args = dict(
input_dim=80,
in_channel=1,
channels="32_32_32",
kernel_sizes="3_3_3",
strides="1_1_1",
poolings="2_2_2",
dropout=0.1,
batch_norm=False,
layer_norm=True,
residual=True,
bottleneck_dim=0,
param_init=0.1,
layer_norm_eps=1e-12
)
args.update(kwargs)
return args
@pytest.mark.parametrize(
"args",
[
# subsample4
({'channels': "32_32", 'kernel_sizes': "3_3",
'strides': "1_1", 'poolings': "2_2"}),
({'channels': "32_32", 'kernel_sizes': "3_3",
'strides': "1_1", 'poolings': "2_1"}),
({'channels': "32_32", 'kernel_sizes': "3_3",
'strides': "1_1", 'poolings': "1_1"}),
# subsample8
({'channels': "32_32_32", 'kernel_sizes': "3_3_3",
'poolings': "2_2_2"}),
({'channels': "32_32_32", 'kernel_sizes': "3_3_3",
'poolings': "2_2_1"}),
({'channels': "32_32_32", 'kernel_sizes': "3_3_3",
'poolings': "2_1_1"}),
({'channels': "32_32_32", 'kernel_sizes': "3_3_3",
'poolings': "1_1_1"}),
# bottleneck
({'bottleneck_dim': 8}),
]
)
def test_forward_1d(args):
args = make_args_1d(**args)
batch_size = 4
xmaxs = [40, 45]
device = "cpu"
module = importlib.import_module('neural_sp.models.seq2seq.encoders.conv')
(channels, kernel_sizes, strides, poolings), is_1dconv = module.parse_cnn_config(
args['channels'], args['kernel_sizes'],
args['strides'], args['poolings'])
assert is_1dconv
enc = module.ConvEncoder(**args)
enc = enc.to(device)
for xmax in xmaxs:
xs = np.random.randn(batch_size, xmax, args['input_dim']).astype(np.float32)
xlens = torch.IntTensor([len(x) - i * enc.subsampling_factor for i, x in enumerate(xs)])
xs = pad_list([np2tensor(x, device).float() for x in xs], 0.)
xs, xlens = enc(xs, xlens)
assert xs.size(0) == batch_size
assert xs.size(1) == xlens.max(), (xs.size(), xlens)
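# Both tests pad a ragged batch with neural_sp's pad_list before the forward
# pass. A pure-Python sketch of the underlying idea, assuming pad_list simply
# right-pads every sequence to the batch maximum:

```python
def pad_list_sketch(seqs, pad_value=0.0):
    """Right-pad variable-length sequences to the longest one in the batch."""
    max_len = max(len(s) for s in seqs)
    return [list(s) + [pad_value] * (max_len - len(s)) for s in seqs]

batch = pad_list_sketch([[1, 2, 3], [4]])
print(batch)  # [[1, 2, 3], [4, 0.0, 0.0]]
```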
# mobo/optimizer/__init__.py (MasashiSode/MOBO, MIT)
from .multi_objective_optimizer import *
# biopymlff/test/lipid.py (saandre15/biopymlff, MIT)
from ..test.general import General_Test
class Lipid_Test(General_Test):
def test_01(self):
        pass
# hello.py (zuiop13/github-action-with-python, MIT)
print("Hello, World! 06")
# tasks.py (fossabot/prjct, MIT)
import invoke
#import minchin.releaser
from minchin.releaser import make_release, vendorize
# info/modules/index/__init__.py (Ljc-xiaohu/information16, MIT)
from flask import Blueprint
# Create the blueprint object
index_blu = Blueprint("index", __name__)

# Import views so the decorated view functions get registered
# from info.modules.index import views
from . import views
# rero/Modules/weeb.py (voqz/rerobot, Apache-2.0)
# coding=utf-8
"""
Copyright 2017 Luxory (@LXYcs)
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import html
import re
import xml.etree.ElementTree

import aiohttp
import yaml

with open("config/settings.yaml") as file:
    settings_file = file.read()
settings = yaml.safe_load(settings_file)
class Weeb:
"""
Contains anime/manga related commands
"""
def __init__(self, context):
super().__init__()
self.ctx = context
async def anime_parser(self, message):
"""
Fetch anime from MAL
:param message:
:return:
"""
q = message.content[7:]
if q == "":
await self.ctx.send_message(message.channel, "(◕ᴗ◕✿) `You need to specify a anime.`")
return
switch = str(q).split('$')
if len(switch) == 2:
q_clean = switch[0].replace(" ", "+")
with aiohttp.ClientSession(auth=aiohttp.BasicAuth(login=settings["MAL_ID"],
password=settings["MAL_PASSWORD"])) as session:
async with session.get(
'http://myanimelist.net/api/anime/search.xml?q={}'.format(q_clean)) as resp:
data = await resp.read()
try:
dats = xml.etree.ElementTree.fromstring(data)
title = dats[0][1].text
episodes = dats[0][4].text
score = dats[0][5].text
synopsis = dats[0][10].text
image = dats[0][11].text
s_clean = html.unescape(synopsis)
ss_clean = re.sub("[\[,\]]", "", s_clean)
ss_clean_s = ss_clean.replace("<br />", '')
await self.ctx.send_message(message.channel, "**{}**"
"\nEpisodes: `{}`"
"\nScore: `{}`"
"\nExcerpt: ```{}``` "
"{}"
.format(title, episodes, score, ss_clean_s, image))
except xml.etree.ElementTree.ParseError:
await self.ctx.send_message(message.channel, "Anime not found. Try again!")
except AttributeError:
await self.ctx.send_message(message.channel, "Anime not found. Try again!")
else:
q_clean = str(q).replace(" ", "+")
with aiohttp.ClientSession(auth=aiohttp.BasicAuth(login=settings["MAL_ID"],
password=settings["MAL_PASSWORD"])) as session:
async with session.get(
'http://myanimelist.net/api/anime/search.xml?q={}'.format(q_clean)) as resp:
data = await resp.read()
try:
dats = xml.etree.ElementTree.fromstring(data)
title = dats[0][1].text
episodes = dats[0][4].text
score = dats[0][5].text
image = dats[0][11].text
await self.ctx.send_message(message.channel, "**{}**"
"\nEpisodes: `{}`"
"\nScore: `{}`"
"\n{}"
"\n*If you want summary, use* `?anime {}$`"
.format(title, episodes, score, image, str(q)))
except xml.etree.ElementTree.ParseError:
await self.ctx.send_message(message.channel, "Anime not found. Try again!")
except AttributeError:
await self.ctx.send_message(message.channel, "Anime not found. Try again!")
async def manga_parser(self, message):
"""
Fetch manga from MAL
:param message:
:return:
"""
q = message.content[7:]
if q == "":
await self.ctx.send_message(message.channel, "(◕ᴗ◕✿) `You need to specify a manga.`")
return
switch = str(q).split('$')
if len(switch) == 2:
q_clean = switch[0].replace(" ", "+")
with aiohttp.ClientSession(auth=aiohttp.BasicAuth(login=settings["MAL_ID"],
password=settings["MAL_PASSWORD"])) as session:
async with session.get(
'http://myanimelist.net/api/manga/search.xml?q={}'.format(q_clean)) as resp:
data = await resp.read()
try:
dats = xml.etree.ElementTree.fromstring(data)
title = dats[0][1].text
chapters = dats[0][4].text
score = dats[0][6].text
synopsis = dats[0][11].text
image = dats[0][12].text
s_clean = html.unescape(synopsis)
ss_clean = re.sub("[\[,\]]", "", s_clean)
ss_clean_s = ss_clean.replace("<br />", '')
await self.ctx.send_message(message.channel, "**{}**"
"\nChapters: `{}`"
"\nScore: `{}`"
"\nExcerpt: ```{}``` "
"{}"
.format(title, chapters, score, ss_clean_s, image))
except xml.etree.ElementTree.ParseError:
await self.ctx.send_message(message.channel, "Manga not found. Try again!")
except AttributeError:
await self.ctx.send_message(message.channel, "Manga not found. Try again!")
else:
q_clean = str(q).replace(" ", "+")
with aiohttp.ClientSession(auth=aiohttp.BasicAuth(login=settings["MAL_ID"],
password=settings["MAL_PASSWORD"])) as session:
async with session.get(
'http://myanimelist.net/api/manga/search.xml?q={}'.format(q_clean)) as resp:
data = await resp.read()
try:
dats = xml.etree.ElementTree.fromstring(data)
title = dats[0][1].text
chapters = dats[0][4].text
score = dats[0][6].text
image = dats[0][12].text
await self.ctx.send_message(message.channel, "**{}**"
"\nChapters: `{}`"
"\nScore: `{}`"
"\n{}"
"\n*If you want summary, use* `?manga {}$`"
.format(title, chapters, score, image, str(q)))
except xml.etree.ElementTree.ParseError:
await self.ctx.send_message(message.channel, "Manga not found. Try again!")
except AttributeError:
await self.ctx.send_message(message.channel, "Manga not found. Try again!")
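# The parsers above pull fields out of MAL's XML by positional index
# (dats[0][1] for the title, and so on). A standalone sketch of that
# positional access against a hypothetical response snippet (the field
# order here is an assumption for illustration, not the real API schema):

```python
import xml.etree.ElementTree

# hypothetical MAL-style search response, trimmed to two fields
sample = """\
<anime>
  <entry>
    <id>1</id>
    <title>Cowboy Bebop</title>
  </entry>
</anime>
"""

dats = xml.etree.ElementTree.fromstring(sample)
entry_id = dats[0][0].text  # first child of the first <entry>
title = dats[0][1].text     # second child of the first <entry>
print(entry_id, title)      # 1 Cowboy Bebop
```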
# hypothesis/stat/__init__.py (boyali/hypothesis-sre, BSD-3-Clause)
from .constraint import highest_density_level
# pygpw_tris.py (russelldavies/mush, Zlib)
# 3D array of character trigraph probability in English:
# tris[i][j][k] weights the trigraph of letters i, j, k (0 = 'a' ... 25 = 'z')
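# GPW-style generators walk a table like this to build pronounceable strings:
# given the last two letters i and j, the next letter k is drawn with
# probability proportional to tris[i][j][k]. A minimal sketch of that weighted
# draw, using a tiny hypothetical 2-letter stand-in table (the real tris below
# is 26x26x26):

```python
import random

def next_letter(tris, i, j, rng=None):
    """Pick index k with probability proportional to tris[i][j][k]."""
    rng = rng or random.Random()
    weights = tris[i][j]
    total = sum(weights)
    if total == 0:
        return rng.randrange(len(weights))  # no data for this bigram: uniform fallback
    r = rng.randrange(total)
    for k, w in enumerate(weights):
        r -= w
        if r < 0:
            return k

# tiny 2-letter stand-in alphabet ('a', 'b') for illustration
demo = [[[3, 1], [0, 2]],
        [[5, 0], [1, 1]]]
print(chr(ord("a") + next_letter(demo, 0, 0)))  # 'a' or 'b', weighted 3:1
```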
tris = [
[
[2,0,3,0,0,0,1,0,0,0,0,1,1,1,0,0,0,3,2,0,0,0,0,0,0,0], # A A
[37,25,2,5,38,0,0,2,46,1,0,304,0,2,49,0,0,24,24,0,19,0,0,0,14,0], # A B
[26,1,64,2,107,0,1,94,67,0,173,13,5,1,35,1,13,32,3,114,23,0,0,0,45,0], #A C
[35,7,3,43,116,6,3,8,75,14,1,16,25,3,44,3,1,35,20,1,10,25,9,0,18,0], # A D
[2,0,2,1,0,1,3,0,0,0,0,10,0,2,3,0,0,12,6,0,2,0,0,0,0,0], # A E
[5,0,0,0,14,50,2,0,3,0,2,5,0,2,7,0,0,5,1,39,1,0,0,0,1,0], # A F
[30,1,0,1,182,0,42,5,30,0,0,7,9,42,51,3,0,24,3,0,21,0,3,0,3,0], # A G
[12,0,0,0,20,0,0,0,3,0,0,5,4,2,13,0,0,2,0,0,1,0,0,0,0,0], # A H
[2,0,10,26,2,1,10,0,2,1,2,87,13,144,0,2,0,93,30,23,0,3,1,0,0,0], # A I
[4,0,0,0,3,0,0,0,0,0,0,0,0,0,4,0,0,0,0,0,0,0,0,0,0,0], # A J
[11,0,1,1,98,1,0,1,15,0,0,3,0,0,5,1,0,3,0,1,2,0,3,0,8,0], # A K
[78,20,34,45,124,21,24,5,109,0,28,237,31,3,53,23,0,7,16,69,29,26,5,0,26,2], # A L
[70,57,1,1,98,3,0,1,68,0,0,3,38,2,43,69,0,3,14,3,12,0,2,0,14,0], # A M
[114,6,156,359,103,8,146,12,141,2,57,4,0,89,61,1,4,1,124,443,29,6,1,3,28,9], # A N
[0,0,1,0,0,0,0,0,0,0,0,3,1,0,0,0,0,3,2,2,2,0,0,0,0,0], # A O
[29,3,0,1,59,1,0,86,25,0,1,14,1,1,37,94,0,9,22,30,8,0,0,0,9,0], # A P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,14,0,0,0,0,0], # A Q
[124,64,101,233,115,12,47,5,188,3,61,55,68,34,46,25,6,94,48,189,5,22,5,1,172,2], # A R
[19,3,32,0,71,0,1,81,49,0,22,3,19,2,19,34,4,0,152,211,12,0,1,0,17,1], # A S
[50,3,41,2,863,4,0,144,352,0,5,14,6,3,144,0,0,60,13,106,57,1,5,0,8,5], # A T
[0,5,23,35,5,5,38,1,0,1,3,33,4,23,0,4,1,35,52,56,0,1,0,7,0,1], # A U
[35,0,0,1,108,0,0,0,49,0,0,1,0,0,19,0,0,0,0,0,3,1,0,0,6,0], # A V
[30,10,0,4,3,6,2,2,2,0,10,13,4,15,3,0,0,6,3,5,0,0,0,0,2,0], # A W
[3,0,0,0,4,0,0,0,22,0,0,1,0,0,7,2,0,0,1,1,0,0,3,0,3,0], # A X
[11,8,1,5,16,5,1,2,2,0,0,10,7,4,13,1,0,3,5,7,3,0,5,0,0,0], # A Y
[10,0,0,1,22,0,0,0,10,0,0,0,0,0,7,0,0,0,0,2,2,0,0,0,4,11], # A Z
],
[
[0,17,74,11,1,2,19,4,8,0,10,68,7,73,1,7,0,110,54,55,9,1,3,1,12,1], # B A
[7,0,0,0,16,0,0,0,10,0,0,24,0,0,9,0,0,2,3,0,2,0,0,0,14,0], # B B
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # B C
[2,0,0,0,2,0,0,0,2,0,0,0,0,0,3,0,0,1,0,0,3,0,0,0,0,0], # B D
[51,1,14,34,18,11,16,7,9,0,1,85,5,48,2,2,2,199,36,41,0,4,5,1,6,2], # B E
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0], # B F
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # B G
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,5,0,0,0,0,0,1,0,0,0,0,0], # B H
[34,8,22,21,8,3,9,1,0,3,1,50,7,45,16,4,2,29,22,59,4,4,0,0,0,3], # B I
[0,0,0,0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # B J
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # B K
[57,0,0,0,519,0,0,0,35,0,0,0,0,0,47,0,0,0,0,0,32,1,0,0,3,0], # B L
[0,0,0,0,1,0,0,0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0], # B M
[1,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # B N
[62,7,4,21,3,2,9,3,8,1,1,46,8,63,58,2,0,55,15,20,46,6,17,10,19,0], # B O
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0], # B P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # B Q
[110,0,0,0,77,0,0,0,100,0,0,0,0,0,78,0,0,0,0,0,28,0,0,0,10,0], # B R
[0,0,6,0,16,0,0,0,7,0,0,0,0,0,12,0,0,0,0,27,2,0,0,0,0,0], # B S
[1,0,0,0,3,1,0,0,0,0,0,4,0,0,1,0,0,3,0,0,0,0,0,0,0,0], # B T
[0,3,21,16,3,5,14,0,12,1,2,52,7,20,2,0,1,104,44,54,0,0,0,3,1,5], # B U
[0,0,0,0,3,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # B V
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # B W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # B X
[1,0,0,0,3,0,1,2,0,0,0,4,0,0,0,3,0,6,8,3,0,0,2,0,0,2], # B Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # B Z
],
[
[1,47,17,33,1,3,4,5,7,1,3,120,40,120,1,59,1,171,60,150,19,20,1,0,5,0], # C A
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0], # C B
[23,0,0,0,22,0,0,5,13,0,0,13,0,0,26,0,0,7,0,0,27,0,0,0,0,0], # C C
[1,0,1,0,1,0,0,0,0,0,0,0,0,0,8,0,0,0,0,0,0,0,0,0,0,0], # C D
[23,6,4,17,6,6,1,2,13,0,0,50,12,109,7,43,0,76,63,22,1,0,4,0,2,1], # C E
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # C F
[0,0,0,0,1,0,0,0,2,0,0,0,0,0,2,0,0,4,1,0,1,0,0,0,0,0], # C G
[165,10,2,3,176,4,3,1,141,0,0,26,20,16,102,1,0,63,8,10,44,0,13,0,20,0], # C H
[76,15,8,33,24,16,3,0,0,0,0,38,5,45,50,28,0,29,38,71,6,8,0,0,0,0], # C I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # C J
[17,16,2,3,90,4,1,7,20,1,1,45,8,8,12,9,0,3,32,6,6,0,13,0,22,0], # C K
[95,0,0,0,84,0,0,0,50,0,0,0,0,0,54,0,0,0,0,0,34,0,0,0,3,0], # C L
[1,0,0,0,2,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # C M
[2,0,0,0,1,0,0,0,4,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0], # C N
[33,16,40,22,14,10,11,12,9,1,1,101,218,421,24,56,2,129,37,40,86,22,25,4,4,2], # C O
[1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # C P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,13,0,0,0,0,0], # C Q
[101,0,0,0,112,0,0,0,75,0,0,0,0,0,88,0,0,0,0,1,41,0,0,0,25,0], # C R
[0,0,0,0,0,0,0,0,3,0,0,0,0,1,2,0,0,0,1,2,0,0,0,0,0,0], # C S
[44,0,0,0,12,2,0,0,113,0,0,0,2,0,94,0,0,46,0,0,42,0,1,0,3,0], # C T
[3,12,2,6,6,6,0,0,8,0,0,102,42,10,9,15,0,72,51,41,1,0,0,0,0,0], # C U
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # C V
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # C W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # C X
[5,1,20,0,0,0,1,0,0,0,0,3,0,2,2,4,0,3,2,9,0,0,0,0,0,0], # C Y
[2,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # C Z
],
[
[0,7,16,7,1,2,13,6,18,0,3,54,23,59,0,10,0,31,6,40,8,13,3,0,32,3], # D A
[9,0,0,0,7,0,0,0,3,0,0,2,0,0,8,0,0,1,0,0,8,0,0,0,2,0], # D B
[5,0,0,0,0,0,0,2,0,0,0,2,0,0,3,0,0,0,0,0,2,0,0,0,0,0], # D C
[8,0,0,0,30,0,0,3,19,0,0,38,0,0,4,0,0,4,0,0,1,0,0,0,16,0], # D D
[34,37,82,14,17,41,11,4,5,2,0,88,62,170,14,40,4,183,99,39,6,20,16,6,1,2], # D E
[6,0,0,0,0,0,0,0,6,0,0,2,0,0,5,0,0,2,0,0,4,0,0,0,0,0], # D F
[4,0,0,0,73,0,0,0,2,0,1,1,1,0,0,0,0,1,0,0,2,0,1,0,3,0], # D G
[8,0,0,0,9,0,0,0,4,0,0,0,0,0,10,0,0,0,0,0,0,0,0,0,0,0], # D H
[100,10,104,12,33,26,31,1,1,0,1,22,22,65,57,15,0,20,138,53,20,31,1,6,0,1], # D I
[4,0,0,0,2,0,0,0,0,0,0,0,0,0,4,0,0,0,0,0,7,0,0,0,0,0], # D J
[0,0,0,0,1,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # D K
[9,0,0,0,79,0,0,0,12,0,0,0,0,0,7,0,0,0,0,0,1,0,0,0,3,0], # D L
[13,0,0,0,3,0,0,0,21,0,0,0,0,0,11,0,0,0,0,0,1,0,0,0,0,0], # D M
[7,0,0,0,9,0,0,0,3,0,0,0,0,0,1,0,0,0,0,6,0,0,0,0,0,0], # D N
[1,5,21,10,6,3,20,1,3,0,0,30,38,54,17,7,0,39,11,10,30,5,54,5,1,3], # D O
[6,0,0,0,1,0,0,1,3,0,0,1,0,0,7,0,0,1,0,0,0,0,0,0,0,0], # D P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0], # D Q
[74,0,0,0,47,0,0,0,53,0,0,0,0,0,80,0,0,0,0,0,22,0,0,0,8,0], # D R
[1,0,3,0,10,0,0,9,5,0,1,3,10,0,16,8,0,0,0,31,1,0,2,0,0,0], # D S
[3,0,0,0,1,0,0,6,1,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # D T
[10,7,52,2,5,3,4,0,2,0,1,33,14,15,5,11,1,19,15,8,1,0,0,0,0,1], # D U
[3,0,0,0,13,0,0,0,7,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # D V
[19,0,0,0,10,0,0,0,19,0,0,0,0,0,8,0,0,2,0,0,0,0,0,0,2,0], # D W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # D X
[4,2,1,2,3,1,2,0,1,0,1,4,4,12,0,0,0,0,8,1,0,0,1,0,0,0], # D Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # D Z
],
[
[0,39,34,110,0,12,13,3,0,0,50,68,38,71,0,13,1,117,80,112,28,19,7,0,0,1], # E A
[32,5,0,0,31,0,0,0,8,0,0,6,0,0,28,0,0,32,2,3,29,0,0,0,4,0], # E B
[33,0,9,2,51,0,0,39,49,0,47,26,0,0,59,0,0,35,2,206,42,0,0,0,2,0], # E C
[29,7,1,16,45,5,22,3,88,0,0,8,9,4,24,2,0,27,8,4,27,0,7,0,13,0], # E D
[2,4,13,63,1,6,1,4,10,0,19,23,13,66,1,42,0,43,9,34,1,4,6,0,0,8], # E E
[14,0,1,2,36,33,0,0,22,0,0,15,0,0,24,0,0,14,1,13,35,0,0,0,5,0], # E F
[48,1,0,0,36,1,15,2,38,0,0,7,4,4,26,0,0,38,0,0,19,0,0,0,4,0], # E G
[14,0,0,0,24,0,0,0,6,0,0,0,1,0,18,0,0,4,0,0,4,0,0,0,3,0], # E H
[8,0,5,13,2,1,42,0,1,1,2,13,7,59,1,1,0,10,25,22,0,7,0,0,0,2], # E I
[4,0,0,0,4,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,3,0,0,0,0,0], # E J
[2,1,0,1,6,0,0,0,4,0,0,0,0,1,1,0,0,0,2,3,0,0,0,0,1,0], # E K
[76,7,6,57,131,19,7,3,125,0,4,238,22,1,48,15,0,4,27,26,17,19,2,0,7,0], # E L
[87,53,1,0,84,0,0,0,102,0,0,3,8,8,56,64,0,0,4,0,19,0,1,0,8,0], # E M
[78,17,68,159,128,8,35,14,96,2,2,4,5,54,57,3,2,9,127,624,33,10,8,0,11,16], # E N
[0,0,8,10,0,6,7,1,2,0,0,23,10,38,0,16,0,14,6,4,41,3,2,2,0,1], # E O
[26,1,1,0,27,0,0,32,45,0,0,21,1,0,35,9,0,35,10,65,13,0,2,0,3,0], # E P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,59,0,0,0,0,0], # E Q
[217,57,66,22,190,41,70,13,200,3,14,40,134,117,113,42,2,123,167,135,23,58,22,1,123,1], # E R
[17,7,74,6,58,1,3,25,82,0,3,6,17,5,34,52,7,0,222,278,18,2,1,0,6,0], # E S
[78,3,19,0,129,4,0,93,105,0,1,3,2,2,50,1,0,73,5,113,17,0,4,0,32,4], # E T
[0,4,7,6,1,0,4,0,0,0,2,3,17,4,0,15,0,46,20,18,0,2,1,0,0,0], # E U
[29,0,0,0,121,0,0,0,56,0,0,0,0,0,26,0,0,2,1,0,2,2,0,0,3,1], # E V
[33,4,3,4,16,2,0,5,24,0,0,3,3,3,23,2,0,3,15,4,0,0,1,0,2,0], # E W
[29,0,43,0,20,0,0,14,21,0,0,0,0,0,15,78,1,0,0,72,12,0,0,1,2,0], # E X
[7,3,1,4,25,2,0,2,0,0,1,4,6,4,4,1,0,2,3,0,0,1,4,0,0,0], # E Y
[1,0,0,0,9,0,0,0,1,0,0,0,0,0,4,0,0,1,0,0,1,1,0,0,2,3], # E Z
],
[
[1,10,39,5,2,1,1,3,18,0,2,35,10,27,0,0,0,36,13,18,10,0,2,3,4,1], # F A
[2,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F B
[1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F C
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # F D
[18,5,24,6,12,0,2,0,6,0,1,25,6,18,2,0,0,114,17,15,4,2,2,0,1,0], # F E
[10,2,0,0,51,0,0,2,45,0,0,21,4,0,13,0,0,9,7,0,7,0,0,0,8,0], # F F
[1,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F G
[2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F H
[9,9,58,18,42,7,11,0,0,0,0,29,2,53,0,0,0,40,41,18,0,2,0,10,0,3], # F I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # F J
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F K
[64,0,0,0,50,0,0,0,21,0,0,0,0,0,60,0,0,0,0,0,42,0,0,0,15,0], # F L
[6,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F M
[0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F N
[5,1,8,2,1,0,7,0,6,0,0,34,1,8,32,2,0,165,5,0,25,1,2,7,1,0], # F O
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F Q
[64,0,0,0,66,0,0,0,35,0,0,0,0,0,35,0,0,0,0,0,11,0,0,0,3,0], # F R
[1,0,0,0,2,0,0,2,0,0,1,0,0,0,1,1,0,0,0,2,0,0,0,0,0,0], # F S
[1,1,1,0,19,0,0,3,1,0,0,0,1,0,3,0,0,1,9,0,0,0,4,0,8,0], # F T
[0,0,4,2,1,0,9,0,0,2,0,119,7,24,0,0,0,28,31,6,0,0,0,0,0,2], # F U
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F V
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F X
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # F Z
],
[
[0,20,5,11,3,2,11,3,13,0,0,68,24,60,1,5,0,63,23,68,15,8,5,0,2,5], # G A
[4,0,0,0,1,0,0,0,3,0,0,0,0,0,5,0,0,0,0,0,0,0,0,0,0,0], # G B
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # G C
[2,0,0,0,1,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0], # G D
[23,3,2,4,12,1,1,3,4,0,0,32,8,141,39,4,0,96,29,33,1,1,4,0,5,0], # G E
[0,0,0,0,1,0,0,0,3,0,0,0,0,0,0,0,0,1,0,0,3,0,0,0,0,0], # G F
[8,0,0,0,20,0,0,1,60,0,0,24,0,0,3,1,0,6,4,0,0,0,0,0,12,0], # G G
[18,4,1,1,12,2,1,1,2,0,1,4,0,3,12,1,0,1,3,153,2,0,3,0,1,0], # G H
[23,21,16,6,7,2,9,0,0,0,0,24,7,103,17,1,0,10,26,19,3,10,0,0,0,1], # G I
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # G J
[0,0,0,0,0,0,0,0,2,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # G K
[49,0,0,0,73,0,0,0,25,0,0,0,0,0,38,0,0,0,0,0,13,0,0,0,17,0], # G L
[23,0,0,0,12,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,3,0,0,0,1,0], # G M
[26,1,0,0,28,0,0,0,20,0,0,0,0,0,26,2,0,0,0,1,7,0,0,0,0,0], # G N
[6,4,3,16,6,1,10,1,5,0,0,22,1,49,20,3,0,34,12,23,16,7,5,0,1,0], # G O
[0,0,0,0,1,0,0,0,3,0,0,2,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # G P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # G Q
[216,0,0,0,97,0,0,0,43,0,0,0,0,0,50,0,0,0,0,0,14,0,0,0,3,0], # G R
[2,2,0,0,0,0,0,2,2,0,1,1,0,0,2,1,0,0,0,18,0,0,1,0,0,0], # G S
[2,0,0,0,0,0,0,8,3,0,0,0,0,0,17,0,0,1,0,0,0,0,0,0,0,0], # G T
[28,1,1,0,49,1,1,0,41,0,0,26,15,24,2,0,0,14,22,6,0,0,0,0,3,1], # G U
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # G V
[5,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0], # G W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # G X
[1,0,0,0,0,0,0,0,0,0,0,0,7,3,0,6,0,5,0,0,0,0,0,0,0,0], # G Y
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # G Z
],
[
[2,26,15,20,6,8,22,3,31,0,11,90,66,171,3,25,0,142,30,49,20,11,20,0,13,8], # H A
[4,0,0,0,3,0,0,0,1,0,0,2,0,0,12,0,0,2,0,0,4,0,0,0,1,0], # H B
[1,0,0,0,0,0,0,1,0,0,0,0,0,0,2,0,0,1,0,0,0,0,0,0,0,0], # H C
[2,0,0,0,0,0,0,0,1,0,0,0,0,0,2,0,0,4,0,0,0,0,0,0,0,0], # H D
[123,5,22,33,37,5,3,0,27,0,0,87,65,86,17,7,1,311,57,42,11,11,14,8,11,2], # H E
[2,0,0,0,0,0,0,0,3,0,0,0,0,0,2,0,0,0,0,0,10,0,0,0,0,0], # H F
[1,0,0,0,1,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0], # H G
[1,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # H H
[22,22,56,15,23,6,19,0,0,1,1,73,20,79,17,41,0,36,53,39,3,11,0,0,0,6], # H I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # H J
[0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # H K
[5,0,0,0,11,0,0,0,8,0,0,0,0,0,22,0,0,1,0,0,1,0,0,0,1,0], # H L
[21,0,0,0,15,0,0,0,6,0,0,0,1,0,7,0,0,0,2,0,1,0,0,0,0,0], # H M
[3,0,0,0,8,0,0,0,9,0,0,0,0,1,3,0,0,0,4,0,2,0,0,0,0,0], # H N
[13,18,13,25,17,5,13,0,7,1,4,101,62,62,44,29,0,130,45,33,81,8,28,0,6,2], # H O
[3,0,0,0,0,0,0,0,2,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0], # H P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # H Q
[20,0,0,0,23,0,0,0,40,0,0,1,0,0,72,0,0,0,0,0,13,0,0,0,3,0], # H R
[3,0,1,0,0,0,0,2,1,0,0,0,0,0,3,0,0,0,0,5,0,0,0,0,0,0], # H S
[3,0,2,1,21,9,1,7,5,0,0,1,4,3,4,1,0,2,7,1,1,0,3,0,6,0], # H T
[3,13,7,6,3,5,12,1,0,0,0,7,37,26,0,3,0,37,24,15,0,0,0,2,2,1], # H U
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # H V
[17,0,0,0,5,0,0,2,5,0,0,0,0,0,9,0,0,0,0,0,0,0,0,0,0,0], # H W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # H X
[5,1,1,39,1,0,3,0,1,0,0,13,9,0,0,25,0,9,29,9,0,0,0,1,0,0], # H Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # H Z
],
[
[0,33,20,8,1,0,17,5,1,0,2,169,20,230,0,3,0,30,13,91,0,1,1,2,0,1], # I A
[11,19,0,0,38,0,0,0,22,0,0,131,1,2,10,0,0,20,1,0,23,0,0,0,2,0], # I B
[161,0,3,0,113,0,0,62,113,0,142,15,0,4,46,0,0,12,5,53,42,0,0,0,7,0], # I C
[51,2,0,31,232,0,30,0,46,1,0,5,1,8,10,1,0,1,10,5,11,0,7,0,9,0], # I D
[0,1,17,6,1,16,11,1,0,0,1,52,4,70,0,1,0,66,18,50,7,17,6,0,0,2], # I E
[7,0,0,0,31,45,0,0,27,0,0,9,0,1,10,0,0,2,0,24,10,0,0,0,71,0], # I F
[48,0,0,0,41,0,30,147,30,0,0,4,15,57,20,1,0,23,3,1,15,0,1,0,2,2], # I G
[1,0,0,0,2,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # I H
[1,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # I I
[3,0,0,0,2,0,0,0,1,0,0,0,0,0,2,0,0,0,0,0,1,0,0,0,0,0], # I J
[6,0,0,0,17,0,0,0,3,0,1,0,0,0,3,0,0,0,0,1,2,0,0,0,1,0], # I K
[60,10,6,36,106,6,5,7,90,0,13,253,14,0,24,1,0,1,10,31,6,6,5,0,10,0], # I L
[76,26,0,0,94,1,0,1,53,0,0,1,38,1,30,133,0,1,8,0,17,0,0,0,2,0], # I M
[212,12,143,168,396,83,435,26,94,8,43,9,6,44,70,3,10,2,139,205,35,46,4,4,15,1], # I N
[2,2,20,10,1,0,9,0,0,0,0,28,12,604,0,8,0,25,13,24,139,3,2,3,0,1], # I O
[20,5,0,0,26,2,0,16,16,1,0,33,6,0,13,39,0,5,19,28,5,0,1,0,1,0], # I P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,36,0,0,0,0,0], # I Q
[41,2,39,24,106,7,9,0,19,0,11,20,24,1,24,8,0,39,11,31,3,5,8,0,10,0], # I R
[35,5,71,4,110,4,2,189,56,1,13,12,93,5,55,33,3,6,85,271,4,1,1,0,8,0], # I S
[136,1,34,1,184,5,0,77,158,0,1,4,6,5,70,1,0,31,2,105,72,0,1,0,142,19], # I T
[0,0,1,0,0,0,0,0,0,0,0,1,121,1,0,0,0,1,19,0,0,0,0,0,0,0], # I U
[57,0,0,0,292,0,0,0,37,0,0,0,0,0,12,0,0,1,0,0,3,0,0,0,2,0], # I V
[3,0,0,0,0,0,0,0,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # I W
[1,0,0,0,2,1,1,0,3,0,0,0,0,0,4,0,0,0,0,9,1,0,0,0,1,0], # I X
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # I Y
[9,0,0,0,13,0,0,0,0,0,0,0,0,0,7,0,0,0,0,0,1,1,0,0,0,16], # I Z
],
[
[0,2,32,1,1,0,3,3,2,0,3,1,8,17,0,2,0,5,2,0,2,3,2,1,1,2], # J A
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J B
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J C
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J D
[4,0,24,1,1,3,0,1,0,2,0,2,0,6,2,0,0,11,9,5,0,0,6,0,0,0], # J E
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J F
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J G
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J H
[0,1,0,0,0,1,4,0,0,0,0,2,4,3,0,0,0,0,0,4,0,1,0,0,0,0], # J I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J J
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J K
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J L
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J M
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J N
[4,2,6,0,3,0,3,12,10,0,1,6,0,5,0,0,0,10,10,1,13,4,2,0,7,0], # J O
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J Q
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J R
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J S
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J T
[3,3,0,19,0,0,8,0,2,2,2,8,5,24,0,1,0,15,9,5,0,1,0,2,0,0], # J U
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J V
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J X
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # J Z
],
[
[0,3,0,6,1,2,8,2,1,1,1,9,4,13,2,3,0,18,4,17,2,1,2,1,5,2], # K A
[3,0,0,0,3,0,0,0,2,0,0,0,0,0,11,0,0,1,0,0,1,0,0,0,0,0], # K B
[2,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0], # K C
[3,0,0,0,1,0,0,0,0,0,0,0,0,0,2,0,0,1,0,0,0,0,0,0,0,0], # K D
[4,3,0,7,28,3,3,2,1,0,0,20,5,55,3,3,0,59,18,56,2,1,4,0,27,0], # K E
[1,0,0,0,1,0,0,0,1,0,0,0,0,0,3,0,0,0,0,0,3,0,0,0,0,0], # K F
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0], # K G
[9,0,0,0,2,0,0,0,0,0,0,0,1,0,8,0,0,1,0,1,0,0,0,0,0,0], # K H
[5,2,3,9,15,1,1,0,0,0,1,10,10,87,2,4,0,11,15,13,0,2,2,0,0,0], # K I
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # K J
[1,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0], # K K
[15,0,0,0,46,0,0,0,13,0,0,0,0,0,3,0,0,0,0,0,1,0,0,0,2,0], # K L
[13,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # K M
[5,0,0,0,11,0,0,0,10,0,0,0,0,0,24,0,0,0,0,0,8,0,0,0,0,0], # K N
[1,1,2,3,2,4,0,2,1,0,1,3,1,7,1,2,0,6,2,1,7,4,5,2,0,0], # K O
[2,0,0,0,0,0,0,0,4,0,0,4,0,0,5,0,0,0,0,0,0,0,0,0,0,0], # K P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # K Q
[10,0,0,0,3,0,0,0,3,0,0,0,0,0,6,0,0,0,0,0,5,0,0,0,2,0], # K R
[2,2,1,0,1,0,1,9,5,0,1,0,4,0,8,3,0,0,0,11,4,0,1,0,1,0], # K S
[3,0,0,0,0,0,0,2,3,0,0,0,0,0,5,0,0,2,0,0,0,0,0,0,0,0], # K T
[0,0,0,2,0,0,0,1,0,0,0,5,1,1,0,8,0,2,1,1,0,0,1,0,1,0], # K U
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # K V
[9,0,0,0,4,0,0,1,2,0,0,0,0,0,7,0,0,0,0,0,0,0,0,0,0,0], # K W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # K X
[2,0,0,0,1,0,0,1,0,1,0,4,0,0,2,0,0,2,1,0,1,0,3,0,0,0], # K Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # K Z
],
[
[1,46,84,43,3,2,46,9,52,0,10,3,64,242,4,23,1,157,92,210,45,21,23,9,42,11], # L A
[12,0,0,0,17,0,0,0,3,0,0,2,0,0,13,0,0,4,0,0,4,0,0,0,2,0], # L B
[9,0,0,0,6,0,0,12,4,0,0,1,1,0,19,0,0,2,0,1,7,0,0,0,2,0], # L C
[2,3,2,0,41,4,0,1,16,0,0,1,2,3,13,1,0,8,9,2,3,0,5,0,3,0], # L D
[94,25,75,44,36,13,55,9,26,1,1,9,55,121,22,22,0,77,84,115,12,29,14,30,75,1], # L E
[9,1,0,0,4,1,1,1,12,0,0,1,0,0,7,0,0,8,1,2,8,0,1,0,0,0], # L F
[16,0,0,0,12,0,0,0,10,0,0,0,0,0,6,0,0,6,0,0,0,0,0,0,0,0], # L G
[7,0,0,0,6,0,0,0,2,0,0,0,0,0,7,0,0,0,0,0,0,0,0,0,0,0], # L H
[82,33,140,26,43,37,73,0,0,1,6,11,46,238,50,40,13,5,90,127,12,36,0,3,0,7], # L I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # L J
[7,0,0,0,4,0,0,3,9,0,0,2,0,1,2,0,0,0,3,0,0,0,3,0,8,0], # L K
[128,12,2,4,169,7,2,4,152,1,0,0,7,0,100,2,0,1,10,2,41,0,7,0,53,0], # L L
[27,0,0,2,11,0,0,2,9,0,0,0,1,0,13,0,0,0,4,0,3,0,0,0,3,0], # L M
[0,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,3,0,0,0,0,0], # L N
[23,23,65,15,7,4,132,3,32,0,2,7,29,69,50,36,11,74,33,53,66,16,80,1,12,1], # L O
[11,0,0,0,3,1,0,21,5,0,0,0,1,0,6,0,0,3,1,4,0,0,0,0,1,0], # L P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # L Q
[2,0,0,0,1,0,0,0,0,0,0,0,0,0,5,0,0,0,0,0,2,0,0,0,6,0], # L R
[7,1,0,0,16,0,0,8,23,0,1,0,1,0,20,3,0,0,1,23,0,0,1,0,2,0], # L S
[22,1,0,0,23,0,0,14,34,0,0,0,2,0,23,0,0,9,3,0,8,1,1,0,18,5], # L T
[5,17,26,18,31,5,13,0,5,2,4,8,68,31,15,5,0,21,68,56,0,4,0,13,0,1], # L U
[19,0,0,1,46,0,0,0,9,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0], # L V
[8,0,0,0,2,0,0,1,2,0,0,0,0,0,9,0,0,0,0,0,0,0,0,0,1,0], # L W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # L X
[2,4,12,2,2,2,3,7,2,0,1,3,13,11,2,11,0,2,31,15,1,0,4,0,0,0], # L Y
[2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # L Z
],
[
[0,10,59,34,3,0,57,7,31,3,25,104,6,326,2,4,0,144,49,192,10,2,3,11,14,7], # M A
[31,1,0,1,44,0,0,0,32,0,0,31,0,1,27,1,0,32,1,0,21,0,0,0,0,0], # M B
[3,1,17,6,2,2,9,3,5,0,9,3,3,4,2,1,0,0,0,0,0,0,0,0,0,0], # M C
[0,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0], # M D
[30,6,8,45,3,2,14,1,4,0,1,51,19,283,10,4,0,125,39,128,0,2,9,3,4,1], # M E
[0,0,0,0,3,0,0,0,3,0,0,2,0,0,4,0,0,0,0,0,4,0,0,0,0,0], # M F
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # M G
[0,0,0,0,3,0,0,0,0,0,0,0,0,0,4,0,0,0,0,0,1,0,0,0,0,0], # M H
[19,0,93,54,8,2,19,0,0,1,2,76,9,194,4,0,1,21,96,109,10,0,0,5,0,1], # M I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # M J
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # M K
[1,0,0,0,3,0,0,0,6,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0], # M L
[40,0,0,0,46,0,0,0,33,0,0,0,0,0,32,0,0,0,0,0,17,0,0,0,12,0], # M M
[12,0,0,0,4,0,0,0,10,0,0,0,0,0,3,0,0,0,0,0,1,0,0,0,1,0], # M N
[4,10,13,28,4,1,14,3,11,0,6,47,10,168,16,3,0,107,40,45,56,8,1,1,1,2], # M O
[52,3,0,0,71,1,1,26,18,0,4,71,0,0,50,0,0,41,9,43,19,0,0,0,7,0], # M P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0], # M Q
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,0,1,0,0,0,0,0,0,0], # M R
[0,1,2,1,5,1,0,2,3,0,1,0,2,0,8,2,0,0,1,10,1,0,0,0,2,0], # M S
[0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0], # M T
[0,0,7,11,6,3,6,0,2,0,2,55,11,29,2,1,0,18,53,30,0,0,0,0,0,3], # M U
[0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # M V
[2,0,0,0,2,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # M W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # M X
[0,0,11,0,5,0,1,0,0,0,0,1,0,2,7,0,0,7,7,4,0,0,0,0,0,0], # M Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # M Z
],
[
[2,24,33,23,6,3,30,6,20,0,9,115,29,59,2,31,0,94,28,159,19,10,5,0,1,5], # N A
[5,0,1,0,20,0,0,0,1,0,0,4,0,0,7,0,0,4,1,0,10,0,0,0,0,0], # N B
[25,0,0,0,190,0,0,87,51,0,1,18,0,0,62,0,0,16,0,36,21,0,0,0,8,0], # N C
[75,11,4,1,162,6,3,7,102,1,1,22,10,2,57,9,2,46,30,4,37,0,11,0,20,0], # N D
[34,12,36,12,29,17,16,4,14,0,0,45,16,20,25,8,6,88,80,84,32,12,37,18,45,3], # N E
[15,0,0,0,30,0,0,0,38,0,0,23,0,0,26,0,0,10,0,0,19,0,0,0,0,0], # N F
[22,8,0,3,114,6,0,15,18,0,3,51,5,0,20,2,0,24,24,28,38,0,2,0,9,0], # N G
[18,0,0,0,16,0,0,0,6,0,0,0,0,0,15,0,0,0,0,0,2,0,0,0,3,0], # N H
[90,9,148,14,33,27,35,4,1,0,5,12,25,44,26,21,7,4,87,94,29,11,0,4,0,4], # N I
[2,0,0,0,3,0,0,0,0,0,0,0,0,0,4,0,0,0,0,0,13,0,0,0,0,0], # N J
[6,0,1,0,22,4,1,1,10,0,0,12,2,0,1,1,0,2,2,3,0,0,0,0,9,0], # N K
[9,0,0,0,8,0,0,0,5,0,0,0,0,0,5,0,0,0,0,0,0,0,0,0,1,0], # N L
[8,0,0,0,5,0,0,0,2,0,0,0,0,0,7,0,0,0,0,0,0,0,0,0,0,0], # N M
[39,0,0,0,74,0,0,0,52,0,1,0,0,0,23,0,0,0,1,0,14,0,1,0,25,0], # N N
[4,18,21,10,4,4,15,0,11,0,0,30,60,34,11,11,0,80,32,47,52,18,24,7,2,2], # N O
[0,0,0,0,1,0,0,0,1,0,0,4,0,0,6,0,0,0,0,0,2,0,0,0,0,0], # N P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,22,0,0,0,0,0], # N Q
[3,0,1,0,1,0,0,0,6,0,0,0,0,0,6,0,0,0,0,0,3,0,0,0,6,0], # N R
[26,4,23,2,73,17,3,12,96,0,5,8,13,0,60,25,0,1,3,79,39,4,4,0,5,0], # N S
[143,1,1,1,175,2,2,64,209,0,0,13,3,1,65,1,0,114,3,0,32,0,2,0,21,1], # N T
[12,6,16,6,11,3,6,0,5,0,1,15,35,9,6,3,0,9,25,31,1,0,0,0,0,1], # N U
[15,0,0,0,43,0,0,0,20,0,0,0,0,0,17,0,0,0,0,0,4,0,0,0,1,0], # N V
[12,0,0,0,3,0,0,2,4,0,0,0,0,0,6,0,0,1,0,0,0,0,0,0,0,0], # N W
[0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0], # N X
[5,3,1,1,0,0,0,1,0,0,0,7,14,0,4,1,1,1,3,1,1,1,2,1,0,0], # N Y
[10,0,0,0,5,0,0,0,5,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,5,0], # N Z
],
[
[1,0,20,30,0,2,5,2,0,0,9,9,8,18,0,4,1,51,13,44,1,1,0,2,0,0], # O A
[17,24,2,2,28,2,0,1,32,4,0,19,0,1,16,0,0,5,26,3,8,3,1,0,2,0], # O B
[50,0,28,0,38,0,0,47,26,0,129,14,0,0,33,0,0,25,0,34,20,0,0,0,8,0], # O C
[17,3,3,15,59,3,13,4,47,0,1,13,2,1,22,3,0,8,11,0,21,0,8,0,35,0], # O D
[0,6,1,7,0,3,0,1,6,0,1,10,3,13,1,0,1,10,15,6,2,7,0,3,1,0], # O E
[7,0,0,0,4,63,0,0,10,0,0,4,1,0,6,0,0,1,0,15,4,0,0,0,1,0], # O F
[34,2,0,1,44,1,22,3,15,1,0,11,3,11,7,0,0,80,1,2,18,0,1,0,83,0], # O G
[10,0,0,0,8,0,0,0,6,0,0,1,5,9,5,0,0,2,0,0,0,0,0,0,1,0], # O H
[3,1,12,53,1,1,2,0,0,0,1,27,0,51,0,0,0,11,39,8,0,0,0,1,0,0], # O I
[1,0,0,0,5,0,0,0,1,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # O J
[5,2,1,0,48,0,0,1,7,0,1,4,0,0,3,1,0,0,5,0,3,0,1,0,6,0], # O K
[71,4,6,83,111,8,5,3,121,0,14,124,16,1,132,6,0,1,18,24,43,16,2,0,46,1], # O L
[89,50,1,0,174,5,0,1,76,0,0,2,64,7,56,125,1,1,4,0,4,0,2,0,22,0], # O M
[129,3,64,82,181,52,86,3,124,10,11,7,3,46,75,1,6,10,107,149,8,38,9,1,54,5], # O N
[0,2,4,92,0,22,4,1,0,0,68,42,42,44,0,19,0,21,21,68,0,3,0,0,0,2], # O O
[28,1,2,0,71,0,2,82,32,1,3,16,1,1,45,29,0,17,14,21,10,0,2,0,19,0], # O P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,14,0,0,0,0,0], # O Q
[122,26,31,96,138,7,34,2,143,0,61,8,85,76,61,59,1,58,46,211,11,4,9,0,116,1], # O R
[31,4,24,0,107,0,3,18,102,0,2,7,9,1,18,42,2,0,63,127,5,1,2,0,8,0], # O S
[45,7,11,0,64,2,1,88,63,0,0,10,3,1,42,4,0,17,7,63,9,0,3,0,11,0], # O T
[3,11,17,13,3,3,62,1,6,0,0,32,1,137,0,11,1,86,445,103,0,7,0,1,0,2], # O U
[26,0,0,0,109,0,0,0,27,0,1,0,0,0,7,0,0,0,0,0,0,0,0,0,2,0], # O V
[18,14,2,13,48,6,0,8,8,0,1,28,7,83,1,8,0,5,13,2,2,0,1,0,4,1], # O W
[2,1,3,0,5,1,1,3,26,0,0,0,0,1,1,0,0,0,0,1,0,1,1,0,14,0], # O X
[15,1,4,6,3,1,0,0,1,0,0,3,0,1,4,1,0,1,2,1,0,0,0,0,0,0], # O Y
[2,0,0,0,9,0,0,0,0,0,0,0,0,0,7,0,0,0,0,0,0,0,0,0,3,1], # O Z
],
[
[0,8,38,11,1,0,18,0,17,0,2,50,5,73,1,23,1,176,50,101,18,5,7,1,10,2], # P A
[3,0,0,0,3,0,0,0,0,0,0,1,0,0,6,0,0,2,1,0,3,0,0,0,0,0], # P B
[0,0,0,0,0,0,0,1,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0], # P C
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,2,0,0,0,0,0,0,0,0], # P D
[51,1,62,34,19,4,8,0,3,1,2,47,2,108,4,10,0,292,22,50,3,1,8,2,2,4], # P E
[0,0,0,0,1,0,0,0,2,0,0,1,0,0,0,0,0,1,0,0,3,0,0,0,0,0], # P F
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0], # P G
[56,0,0,2,88,0,0,0,76,0,0,3,0,1,97,0,0,13,1,3,5,0,0,0,79,0], # P H
[21,0,74,25,33,1,19,0,0,0,6,27,3,74,12,11,2,37,27,57,3,2,0,2,0,2], # P I
[1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # P J
[0,0,0,0,2,0,0,0,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # P K
[150,0,0,0,121,0,0,0,59,0,0,0,0,0,33,0,0,0,0,0,29,0,0,0,11,0], # P L
[6,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,1,0,0,0,0,0], # P M
[0,0,0,0,4,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0], # P N
[2,1,19,10,12,2,7,0,31,0,12,111,14,55,23,17,0,97,126,52,20,3,13,3,2,0], # P O
[16,0,0,0,48,0,0,1,20,0,0,32,1,0,25,0,0,32,3,0,1,0,0,0,16,0], # P P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # P Q
[39,0,0,0,166,0,0,0,104,0,0,0,0,0,273,0,0,0,0,0,12,0,0,0,1,0], # P R
[4,1,3,0,17,0,0,5,22,0,1,1,2,0,13,0,0,0,0,14,6,0,1,0,35,0], # P S
[16,0,1,0,9,0,0,3,107,0,0,0,0,0,33,0,0,3,0,0,19,0,0,0,4,0], # P T
[1,8,4,8,3,6,4,0,1,0,1,41,8,22,0,9,0,39,18,28,0,0,0,0,0,1], # P U
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # P V
[3,0,0,0,0,0,0,0,2,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0], # P W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # P X
[1,2,0,0,0,0,3,0,1,0,1,3,0,0,1,0,0,20,0,3,0,0,1,0,0,0], # P Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # P Z
],
[
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0], # Q A
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q B
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q C
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q D
[0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q E
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q F
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q G
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q H
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q J
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q K
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q L
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q M
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q N
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q O
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q Q
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q R
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q S
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q T
[110,0,0,0,100,0,0,0,128,0,0,0,0,0,13,0,0,0,0,0,0,0,0,0,3,0], # Q U
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q V
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q X
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Q Z
],
[
[0,72,130,95,8,35,73,14,85,3,10,121,95,313,2,119,1,26,66,277,19,45,28,2,28,13], # R A
[32,0,0,0,26,0,0,0,35,0,0,4,0,0,44,0,0,3,1,0,9,0,0,0,5,0], # R B
[18,0,2,0,47,0,0,86,25,0,3,11,0,0,13,0,0,1,2,7,38,0,0,0,4,0], # R C
[22,5,1,0,26,1,0,4,42,0,0,4,0,2,17,1,0,5,9,4,3,0,4,0,7,0], # R D
[166,26,106,99,114,52,55,20,25,4,4,60,69,143,20,72,8,11,257,119,14,56,34,7,23,2], # R E
[11,0,0,0,15,1,0,0,9,0,0,7,0,0,8,0,0,4,0,0,12,0,0,0,0,0], # R F
[26,0,0,0,63,0,0,5,25,0,0,11,1,0,18,0,0,2,2,0,13,0,0,0,11,0], # R G
[11,0,0,0,19,0,0,0,5,0,0,0,0,0,18,0,0,0,0,0,2,0,0,0,3,0], # R H
[182,54,210,87,79,38,65,1,0,1,6,49,65,166,82,61,1,0,151,141,29,44,1,6,1,10], # R I
[0,0,0,0,3,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,2,0,0,0,0,0], # R J
[4,2,0,1,19,0,0,3,9,0,0,6,3,2,5,3,0,1,10,2,0,0,1,0,6,0], # R K
[24,2,0,4,28,0,0,0,36,0,0,0,0,0,14,1,0,0,2,1,2,0,1,0,8,0], # R L
[97,1,2,0,29,2,0,3,65,0,0,2,0,0,39,1,0,0,1,1,10,0,1,0,5,0], # R M
[53,5,0,0,50,4,0,3,29,0,1,0,6,0,16,1,0,0,9,5,7,0,2,0,4,0], # R N
[46,40,79,40,18,22,56,4,32,5,10,76,90,167,84,127,2,14,127,74,127,42,63,17,15,3], # R O
[10,0,0,0,21,0,0,33,10,0,0,5,1,0,25,0,0,12,8,8,5,0,0,0,1,0], # R P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,10,0,0,0,0,0], # R Q
[53,0,0,0,92,0,0,5,85,0,0,0,0,0,47,0,0,0,0,0,14,0,0,0,60,0], # R R
[26,2,2,2,84,1,0,16,44,0,4,2,3,1,43,12,1,0,0,32,14,1,2,0,2,0], # R S
[39,2,2,0,61,5,3,101,99,0,0,11,7,3,32,0,0,17,12,1,27,0,2,0,24,7], # R T
[5,21,30,31,15,6,12,0,18,0,0,10,46,41,1,28,0,3,83,22,0,1,1,1,0,1], # R U
[31,0,0,0,37,0,0,0,28,0,0,0,0,0,5,0,0,0,0,0,1,0,0,0,2,0], # R V
[15,0,0,0,6,0,0,0,12,0,0,0,0,0,15,0,0,0,0,0,0,0,0,0,0,0], # R W
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # R X
[5,3,3,5,3,0,1,0,0,0,0,10,11,4,12,16,0,0,9,4,0,0,2,0,0,0], # R Y
[2,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # R Z
],
[
[2,44,23,16,1,10,21,4,16,1,7,80,17,89,1,10,0,36,10,43,22,10,13,5,7,0], # S A
[9,0,0,0,4,0,0,0,2,0,0,0,0,0,6,0,0,2,0,0,18,0,0,0,3,0], # S B
[81,0,0,0,65,0,1,78,37,0,0,5,1,0,88,0,0,92,0,0,40,0,0,0,3,0], # S C
[11,0,0,0,0,0,0,0,1,0,0,0,0,0,2,0,0,2,0,0,2,0,0,0,0,0], # S D
[38,14,47,18,33,7,8,3,11,0,1,63,39,101,5,28,14,83,28,41,12,19,15,15,19,1], # S E
[3,0,0,0,7,0,0,0,5,0,0,0,0,0,7,0,0,0,0,0,6,0,0,0,1,0], # S F
[0,0,0,0,2,0,0,0,2,0,0,0,0,0,2,0,0,5,1,0,2,0,0,0,0,0], # S G
[97,9,1,0,79,3,0,0,75,0,1,4,16,3,81,2,0,27,0,1,20,1,6,0,17,0], # S H
[55,56,44,80,28,15,38,0,0,0,2,50,40,78,148,7,1,7,99,89,9,76,0,8,0,3], # S I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # S J
[9,0,0,0,24,0,0,0,35,0,0,0,2,0,3,0,0,1,0,0,5,0,0,0,23,0], # S K
[42,0,0,0,35,0,0,0,29,0,0,1,0,0,29,0,0,0,0,0,13,0,0,0,2,0], # S L
[57,0,0,0,30,0,0,0,31,0,0,0,0,0,25,0,0,0,0,0,14,0,0,0,2,0], # S M
[21,0,0,0,12,0,0,0,12,0,0,0,0,0,19,0,0,0,0,4,6,0,0,0,2,0], # S N
[6,4,26,12,6,10,4,1,8,1,0,67,65,190,8,21,0,71,0,11,34,6,3,0,3,1], # S O
[63,1,0,0,116,0,0,41,82,0,0,24,0,0,69,0,0,34,1,0,16,0,0,0,3,0], # S P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,52,0,0,0,0,0], # S Q
[4,0,0,0,1,0,0,0,1,0,0,0,0,0,2,0,0,0,1,0,3,0,0,0,0,0], # S R
[50,3,2,0,77,3,0,4,151,0,0,5,11,1,42,2,0,4,0,4,17,0,13,0,19,0], # S S
[258,6,4,1,291,9,1,11,240,1,0,25,12,2,205,6,0,255,3,0,58,2,7,0,36,0], # S T
[14,38,17,6,7,11,6,0,11,0,0,39,35,37,1,42,0,71,30,4,0,0,0,0,0,4], # S U
[0,0,0,0,5,0,0,0,6,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0], # S V
[37,0,0,0,31,0,0,0,28,0,0,0,0,0,21,0,0,2,0,0,2,0,0,0,0,0], # S W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # S X
[0,2,32,1,1,0,1,0,0,0,1,18,19,30,0,2,0,9,5,1,0,0,0,0,0,1], # S Y
[0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # S Z
],
[
[0,74,44,8,3,9,45,8,68,0,15,130,36,181,1,23,0,128,22,185,13,11,9,13,4,0], # T A
[7,0,0,0,4,0,0,0,4,0,0,0,0,0,6,0,0,3,0,0,3,0,0,0,0,0], # T B
[5,0,0,0,0,0,0,112,0,0,0,2,0,0,5,0,0,1,0,0,1,0,0,0,1,0], # T C
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,1,0,0,0,0,0,0,0,0], # T D
[52,9,29,37,66,9,17,6,16,0,2,65,49,185,18,20,0,588,61,23,9,9,9,16,1,0], # T E
[6,0,0,0,1,0,0,0,5,0,0,1,0,0,6,0,0,1,0,0,24,0,0,0,0,0], # T F
[4,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,1,0,0,1,0,0,0,0,0], # T G
[68,6,1,5,274,8,1,2,62,0,1,9,13,3,90,4,1,61,8,2,31,0,16,0,49,0], # T H
[99,35,342,16,35,45,34,0,0,0,3,67,75,183,419,28,9,18,75,88,9,128,0,0,0,2], # T I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # T J
[2,0,0,0,1,0,0,0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0], # T K
[18,0,0,0,102,0,0,0,5,0,0,2,0,0,3,0,0,0,0,0,2,0,0,0,3,0], # T L
[25,0,0,0,8,0,0,0,3,0,0,0,0,0,11,0,0,0,0,0,3,0,0,0,0,0], # T M
[3,0,0,0,9,0,0,0,5,0,0,0,0,0,2,0,0,0,0,4,1,0,0,0,0,0], # T N
[5,6,34,11,8,7,26,0,14,0,9,38,65,238,26,56,0,319,19,16,36,3,36,7,3,2], # T O
[2,0,0,0,1,0,0,0,1,0,0,2,0,0,3,0,0,5,0,0,0,0,0,0,0,0], # T P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # T Q
[315,0,0,0,98,0,0,0,246,0,0,0,0,0,201,0,0,0,0,0,68,0,1,0,64,0], # T R
[2,2,2,1,10,2,0,3,4,0,1,0,13,0,9,3,0,0,0,8,5,2,5,0,3,0], # T S
[44,0,0,0,154,1,1,2,53,0,1,45,0,0,33,0,0,10,8,0,4,1,0,0,25,0], # T T
[41,14,9,41,8,5,4,0,10,0,0,19,30,29,13,10,0,159,35,22,0,0,0,1,1,0], # T U
[3,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # T V
[14,0,0,0,12,0,0,1,23,0,0,0,0,0,15,0,0,0,0,0,2,0,0,1,0,0], # T W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # T X
[2,1,2,0,0,0,1,0,1,0,0,14,2,0,0,34,0,14,3,0,0,0,2,1,0,0], # T Y
[1,0,0,0,5,0,1,0,2,0,0,1,1,0,1,1,0,1,1,0,0,0,0,0,0,0], # T Z
],
[
[0,4,7,21,0,1,5,1,4,0,5,51,2,26,0,1,0,48,9,37,0,2,4,0,3,0], # U A
[8,18,0,1,20,0,0,2,18,2,0,23,5,0,2,1,0,10,15,8,7,2,0,0,1,0], # U B
[10,0,14,0,23,0,0,31,29,0,55,16,0,0,7,0,0,9,1,47,5,0,0,0,2,0], # U C
[17,1,0,24,67,0,18,0,39,0,0,4,0,0,8,0,0,1,10,0,2,0,2,0,7,1], # U D
[6,9,0,1,5,5,4,1,0,1,0,21,1,33,1,1,0,19,22,15,2,0,0,0,3,6], # U E
[1,0,0,0,0,58,0,0,0,0,0,1,1,0,1,0,0,0,0,3,1,0,0,0,0,0], # U F
[19,1,0,0,21,0,34,80,3,0,0,4,2,2,6,0,0,1,1,0,11,0,0,0,0,0], # U G
[3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0], # U H
[3,2,14,14,6,0,1,0,0,0,0,32,0,31,1,8,0,19,44,64,1,4,0,2,0,3], # U I
[1,0,0,0,0,0,0,0,2,0,0,0,0,0,1,0,0,0,0,0,3,0,0,0,0,0], # U J
[1,0,0,1,12,0,0,0,3,0,1,0,0,0,1,0,0,2,0,0,0,0,0,0,0,0], # U K
[136,4,11,11,46,14,7,0,35,0,10,67,5,2,23,16,0,1,24,73,16,3,1,0,5,1], # U L
[22,52,3,1,51,5,0,1,32,0,0,2,28,11,8,48,1,0,8,1,6,2,0,0,0,0], # U M
[21,6,73,131,25,5,46,2,55,0,33,4,2,13,4,2,0,2,15,82,1,0,2,0,5,0], # U N
[0,0,0,1,0,0,0,0,3,0,0,2,0,3,0,2,0,16,3,5,29,0,0,0,2,0], # U O
[4,4,1,2,31,1,1,14,10,0,1,13,1,0,8,24,0,13,13,24,2,0,2,0,2,0], # U P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4,0,0,0,0,0], # U Q
[75,27,21,17,149,8,60,1,66,2,11,17,11,55,28,15,1,51,43,43,9,15,3,0,28,1], # U R
[31,5,29,2,105,0,1,53,64,0,17,3,0,1,8,12,1,0,34,115,6,0,0,0,4,0], # U S
[45,1,14,1,69,0,1,55,77,0,0,8,3,3,49,0,0,13,7,51,11,0,2,0,6,2], # U T
[0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0], # U U
[0,0,0,0,8,0,0,0,5,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0], # U V
[2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # U W
[0,0,0,0,4,0,0,0,2,0,0,1,0,0,1,0,0,0,0,5,4,0,0,0,0,0], # U X
[1,0,0,0,1,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,1,1,0,0,0,0], # U Y
[2,0,0,0,4,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,1,0,0,0,0,12], # U Z
],
[
[0,9,20,8,1,0,14,2,8,1,3,69,2,57,0,1,0,31,18,36,5,0,0,0,0,0], # V A
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V B
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V C
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V D
[6,2,5,4,4,3,6,4,5,0,1,47,4,120,3,1,0,271,46,24,0,0,1,5,10,0], # V E
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V F
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V G
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V H
[37,4,33,23,21,2,8,0,2,0,3,43,0,47,18,0,0,16,65,30,5,16,0,2,0,1], # V I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V J
[0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V K
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # V L
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V M
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V N
[0,0,23,0,0,0,3,0,9,0,5,48,2,6,1,0,0,10,4,9,10,1,3,0,6,0], # V O
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V Q
[0,0,0,0,5,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # V R
[0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V S
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V T
[0,0,0,0,0,0,0,0,0,0,0,13,0,0,0,0,0,2,2,0,0,0,0,0,0,0], # V U
[0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0], # V V
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # V X
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0], # V Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # V Z
],
[
[1,4,7,8,0,3,12,3,18,0,8,53,5,20,0,4,0,100,27,55,1,9,1,4,71,1], # W A
[6,0,0,0,7,0,0,0,1,0,0,0,0,0,10,0,0,3,0,0,1,0,0,0,0,0], # W B
[3,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0], # W C
[0,0,0,0,5,0,0,0,1,0,0,0,0,0,4,0,0,3,0,0,1,0,0,0,3,0], # W D
[30,5,1,9,33,0,2,1,19,0,0,51,0,11,0,2,0,36,21,7,0,2,0,0,2,0], # W E
[1,0,0,0,0,0,0,0,3,0,0,3,0,0,4,0,0,0,0,0,3,0,0,0,0,0], # W F
[0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # W G
[18,0,0,0,47,0,0,0,52,0,0,0,0,0,19,0,0,0,0,0,1,0,0,0,1,0], # W H
[0,0,14,18,5,5,15,0,0,0,0,40,2,83,0,2,0,8,38,47,0,4,0,1,0,2], # W I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # W J
[0,0,0,0,0,0,0,0,2,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0], # W K
[3,0,0,0,9,0,0,0,5,0,0,0,0,0,1,0,0,0,1,1,0,0,0,0,3,0], # W L
[8,0,0,0,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # W M
[0,1,1,1,6,1,1,2,3,0,0,0,0,0,0,2,0,1,10,4,1,0,2,0,3,0], # W N
[0,1,0,0,3,1,0,0,0,0,3,10,17,8,54,1,0,121,1,1,3,2,1,0,0,0], # W O
[1,0,0,0,1,0,0,0,1,0,0,1,0,0,5,0,0,0,0,0,1,0,0,0,0,0], # W P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # W Q
[7,0,0,0,12,0,0,0,25,0,0,0,0,0,10,0,0,0,0,0,0,0,0,0,6,0], # W R
[0,1,1,0,2,0,0,1,1,0,1,2,2,0,5,3,0,1,1,4,1,0,2,0,1,0], # W S
[1,0,0,0,1,0,0,3,1,0,0,0,0,0,3,0,0,0,0,0,1,0,0,0,0,0], # W T
[0,0,0,0,0,0,0,1,0,0,0,1,1,1,0,1,0,2,0,0,0,0,0,0,0,0], # W U
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # W V
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # W W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # W X
[2,0,0,0,5,0,0,0,0,0,0,1,1,4,1,0,0,0,0,0,0,0,0,0,0,0], # W Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0], # W Z
],
[
[0,0,5,1,0,1,3,0,0,0,0,4,6,6,0,0,0,0,3,6,0,1,0,0,0,0], # X A
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # X B
[3,0,0,0,11,0,0,3,7,0,0,7,0,0,3,0,0,5,0,0,7,0,0,0,0,0], # X C
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X D
[0,0,7,1,0,0,2,0,1,0,0,2,6,9,0,0,0,6,1,1,0,0,0,0,1,0], # X E
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # X F
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # X G
[7,0,0,0,0,0,0,0,4,0,0,0,0,0,4,0,0,0,0,0,2,0,0,0,0,0], # X H
[8,2,12,8,4,2,2,0,0,0,0,2,11,4,8,0,0,0,9,2,0,1,1,0,0,0], # X I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X J
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X K
[0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X L
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X M
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X N
[0,0,0,1,0,0,3,0,0,0,0,1,1,10,0,1,0,6,1,5,0,0,0,0,0,0], # X O
[8,0,0,0,27,0,0,0,5,0,0,18,0,0,12,0,0,7,0,0,3,0,0,0,0,0], # X P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # X Q
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X R
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X S
[6,1,0,0,22,0,0,1,7,0,0,0,0,0,7,0,0,31,0,0,9,0,0,0,1,0], # X T
[4,1,0,2,0,0,0,0,0,0,0,3,0,0,0,1,0,6,0,0,0,0,0,0,0,0], # X U
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X V
[0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0], # X W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # X X
[0,0,0,0,0,0,2,0,0,0,0,6,0,0,0,0,0,2,0,0,0,0,0,0,0,0], # X Y
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # X Z
],
[
[0,0,5,5,0,0,1,1,0,0,2,11,3,29,1,4,1,20,1,3,0,0,3,0,0,0], # Y A
[4,0,0,4,7,0,0,0,2,0,0,0,0,0,9,0,0,3,0,0,3,0,0,0,0,0], # Y B
[4,0,0,0,18,0,0,31,4,0,0,19,0,0,12,0,0,0,0,0,0,0,0,0,0,0], # Y C
[4,1,0,0,12,0,0,0,2,0,0,0,0,2,1,0,0,37,0,0,0,0,0,0,0,0], # Y D
[11,3,0,1,1,1,1,0,1,0,0,13,1,6,2,1,0,19,7,6,0,1,1,0,0,0], # Y E
[1,0,0,0,1,0,0,0,3,0,0,2,0,0,0,0,0,0,0,0,4,0,0,0,0,0], # Y F
[0,0,0,1,2,0,0,0,2,0,0,1,3,1,8,0,0,3,0,0,1,0,0,0,2,0], # Y G
[0,0,0,0,4,0,0,0,0,0,0,0,0,0,10,0,0,0,0,0,0,0,0,0,1,0], # Y H
[0,0,0,1,1,0,0,0,0,0,0,0,0,9,0,2,0,0,2,0,0,0,0,0,0,0], # Y I
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Y J
[0,0,0,0,3,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0], # Y K
[15,0,0,0,22,0,0,0,13,0,1,19,0,0,11,1,0,0,2,0,3,6,0,0,0,0], # Y L
[18,4,1,0,20,0,0,0,5,0,0,0,3,7,11,20,0,0,0,0,2,0,0,0,1,0], # Y M
[14,0,11,3,12,0,3,1,2,0,0,0,0,3,11,0,0,0,0,6,0,0,0,2,1,0], # Y N
[0,0,2,2,0,4,6,0,0,0,5,2,1,18,0,4,0,8,4,5,17,1,1,0,0,1], # Y O
[2,0,0,0,24,0,0,17,5,0,0,2,0,2,21,0,0,5,7,16,3,0,0,0,1,0], # Y P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0], # Y Q
[15,0,0,2,6,1,0,0,21,0,0,0,0,2,29,0,0,2,0,1,4,0,0,0,1,0], # Y R
[3,1,3,0,12,0,0,1,38,0,0,1,2,0,4,3,0,0,6,39,2,0,0,0,0,0], # Y S
[2,0,0,0,16,0,0,16,10,0,0,0,0,0,12,0,0,0,0,2,0,0,0,0,1,0], # Y T
[0,0,3,0,0,0,3,1,0,0,2,1,0,1,0,1,0,0,2,0,0,0,0,0,0,0], # Y U
[1,0,0,0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Y V
[10,0,1,0,3,0,0,2,4,0,0,0,0,0,5,0,0,3,0,0,0,0,0,0,0,0], # Y W
[0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Y X
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Y Y
[2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0], # Y Z
],
[
[1,3,2,0,0,0,5,1,1,0,1,4,1,11,0,1,0,19,0,0,0,1,0,0,0,1], # Z A
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z B
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z C
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z D
[5,1,2,1,1,0,0,0,1,0,1,7,0,12,0,0,0,13,3,3,1,0,1,0,0,0], # Z E
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z F
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z G
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z H
[1,1,2,0,7,0,5,0,0,0,0,5,4,6,1,1,0,2,1,1,1,0,0,0,0,0], # Z I
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z J
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z K
[0,0,0,0,16,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0], # Z L
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z M
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z N
[3,0,0,2,2,0,1,0,7,0,0,0,3,10,5,2,0,5,0,0,1,1,0,0,0,0], # Z O
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z P
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z Q
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # Z R
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z S
[0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z T
[0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,0,0,4,0,0,0,0,0,0,0,0], # Z U
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # Z V
[0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0], # Z W
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z X
[0,1,0,0,0,0,4,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0], # Z Y
[7,0,0,0,1,0,0,0,7,0,0,17,0,0,2,0,0,0,0,0,0,0,1,0,5,0], # Z Z
],
] # end tris
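The nested lists above form a 26 x 26 x 26 letter-trigram frequency table: the comment on each row names the first two letters, and the 26 numbers in the row are the counts for each possible third letter A-Z. A minimal lookup helper for a table shaped like this might look like the sketch below; `tri_count` and the `dummy` table are hypothetical names for illustration, not part of this file.

```python
# Hedged sketch: assumes tris[i][j][k] holds the count for the trigram
# chr(65 + i) + chr(65 + j) + chr(65 + k). Nothing below comes from the
# file itself; it only demonstrates the assumed indexing scheme.
def tri_count(tris, trigram):
    """Return the stored frequency for a three-letter trigram (case-insensitive)."""
    i, j, k = (ord(ch) - ord("A") for ch in trigram.upper())
    return tris[i][j][k]


# Tiny stand-in table for demonstration: all zeros except "ABC" -> 7.
dummy = [[[0] * 26 for _ in range(26)] for _ in range(26)]
dummy[0][1][2] = 7
print(tri_count(dummy, "abc"))  # 7
```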
# tests/test_secondstate.py | repo: fruiti-ltd/secondstate @ 81fe6916b92c7024372a95f0eb9d50f6275dfc69 | license: BSD-3-Clause
from .fixtures import *
pytestmark = [pytest.mark.monitor_skip_test]
def test_init_state(initial_state_one_week):
expected_result = 604800
assert len(initial_state_one_week.get_state()) == expected_result
def test_set(initial_state_one_week, availability_one_hour):
assert availability_one_hour[9533] == 1
assert availability_one_hour[13132] == 1
def test_get_without_minutes_or_seconds(initial_state_one_week, availability_one_hour):
with pytest.raises(ValueError):
initial_state_one_week.get()
def test_get_15_minutes(initial_state_one_week, availability_one_hour):
result = initial_state_one_week.get(seconds=900)
assert result == [
["2021-05-17T12:00:00", "2021-05-17T12:15:00"],
["2021-05-17T12:15:00", "2021-05-17T12:30:00"],
["2021-05-17T12:30:00", "2021-05-17T12:45:00"],
["2021-05-17T12:45:00", "2021-05-17T13:00:00"],
]
def test_get_30_minutes_custom_format(initial_state_one_week, availability_one_hour):
result = initial_state_one_week.get(minutes=30, custom_format=True)
assert result == ["1621252800_1621254600", "1621254600_1621256400"]
def test_get_30_minutes(initial_state_one_week, availability_one_hour):
result = initial_state_one_week.get(minutes=30)
assert result == [
["2021-05-17T12:00:00", "2021-05-17T12:30:00"],
["2021-05-17T12:30:00", "2021-05-17T13:00:00"],
]
def test_unset(
initial_state_one_week,
availability_one_hour,
not_available_fifteen_minutes,
):
result = initial_state_one_week.get(minutes=15)
assert result == [
["2021-05-17T12:00:00", "2021-05-17T12:15:00"],
["2021-05-17T12:30:00", "2021-05-17T12:45:00"],
["2021-05-17T12:45:00", "2021-05-17T13:00:00"],
]
@pytest.mark.monitor_test
def test_get_12_hours(initial_state_one_week, availability_all_week_no_break):
result = initial_state_one_week.get(minutes=60 * 12)
assert result == [
["2021-05-17T18:00:00", "2021-05-18T06:00:00"],
["2021-05-18T06:00:00", "2021-05-18T18:00:00"],
["2021-05-18T18:00:00", "2021-05-19T06:00:00"],
["2021-05-19T06:00:00", "2021-05-19T18:00:00"],
["2021-05-19T18:00:00", "2021-05-20T06:00:00"],
["2021-05-20T06:00:00", "2021-05-20T18:00:00"],
["2021-05-20T18:00:00", "2021-05-21T06:00:00"],
["2021-05-21T06:00:00", "2021-05-21T18:00:00"],
]
@pytest.mark.monitor_test
def test_get_with_iso_datetimes(initial_state_one_week, availability_one_hour):
result = initial_state_one_week.get(minutes=5)
assert result == [
["2021-05-17T12:00:00", "2021-05-17T12:05:00"],
["2021-05-17T12:05:00", "2021-05-17T12:10:00"],
["2021-05-17T12:10:00", "2021-05-17T12:15:00"],
["2021-05-17T12:15:00", "2021-05-17T12:20:00"],
["2021-05-17T12:20:00", "2021-05-17T12:25:00"],
["2021-05-17T12:25:00", "2021-05-17T12:30:00"],
["2021-05-17T12:30:00", "2021-05-17T12:35:00"],
["2021-05-17T12:35:00", "2021-05-17T12:40:00"],
["2021-05-17T12:40:00", "2021-05-17T12:45:00"],
["2021-05-17T12:45:00", "2021-05-17T12:50:00"],
["2021-05-17T12:50:00", "2021-05-17T12:55:00"],
["2021-05-17T12:55:00", "2021-05-17T13:00:00"],
]
| 33.602041 | 87 | 0.655937 | 522 | 3,293 | 3.921456 | 0.149425 | 0.170005 | 0.207132 | 0.215926 | 0.764045 | 0.715681 | 0.711285 | 0.621397 | 0.597948 | 0.372252 | 0 | 0.322943 | 0.162162 | 3,293 | 97 | 88 | 33.948454 | 0.418992 | 0 | 0 | 0.185714 | 0 | 0 | 0.347404 | 0.012754 | 0 | 0 | 0 | 0 | 0.128571 | 1 | 0.128571 | false | 0 | 0.028571 | 0 | 0.157143 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3d1279d9077aef16c67b5016f7d657ceeeb73273 | 31 | py | Python | wrappers/__init__.py | mindpowered/reserved-seating-python | 43500525683d4e7a706fdf6f412398569e2363b4 | [
"MIT"
] | null | null | null | wrappers/__init__.py | mindpowered/reserved-seating-python | 43500525683d4e7a706fdf6f412398569e2363b4 | [
"MIT"
] | null | null | null | wrappers/__init__.py | mindpowered/reserved-seating-python | 43500525683d4e7a706fdf6f412398569e2363b4 | [
"MIT"
] | null | null | null | from .ReservedSeating import *
| 15.5 | 30 | 0.806452 | 3 | 31 | 8.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3d4a5c0ed171038ce2bab6c87cd1396060f569e4 | 142 | py | Python | rabbitmq/lab-5/celery-integration/simple_tasks.py | mithun008/amazonmq-rabbitmq-workshop | 3e40a85f75dd4ad6c5947e6fd850aae52d8c926c | [
"Apache-2.0"
] | null | null | null | rabbitmq/lab-5/celery-integration/simple_tasks.py | mithun008/amazonmq-rabbitmq-workshop | 3e40a85f75dd4ad6c5947e6fd850aae52d8c926c | [
"Apache-2.0"
] | null | null | null | rabbitmq/lab-5/celery-integration/simple_tasks.py | mithun008/amazonmq-rabbitmq-workshop | 3e40a85f75dd4ad6c5947e6fd850aae52d8c926c | [
"Apache-2.0"
] | 2 | 2021-08-11T16:38:32.000Z | 2021-09-03T11:03:52.000Z | from base_app import get_base_celery_app
app = get_base_celery_app(module_name="simple_tasks")
@app.task()
def add(x, y):
return x + y
| 15.777778 | 53 | 0.739437 | 26 | 142 | 3.692308 | 0.615385 | 0.145833 | 0.270833 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15493 | 142 | 8 | 54 | 17.75 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.084507 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.2 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
3d597e2d410aa1c44545972761f393acc69fd4a3 | 38 | py | Python | ut/tests/indirect.py | muhammad-ammar/alpha | 68e6f2ba695d65b904948297c27c327e939e848e | [
"MIT"
] | null | null | null | ut/tests/indirect.py | muhammad-ammar/alpha | 68e6f2ba695d65b904948297c27c327e939e848e | [
"MIT"
] | null | null | null | ut/tests/indirect.py | muhammad-ammar/alpha | 68e6f2ba695d65b904948297c27c327e939e848e | [
"MIT"
] | null | null | null | from ..src.src3 import Feature3 as f3
| 19 | 37 | 0.763158 | 7 | 38 | 4.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 0.157895 | 38 | 1 | 38 | 38 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
182ae9e48ddf6caed93ed1a443989ef2e6d676a3 | 116 | py | Python | helpers.py | sleeyax/PyDeobfuscator | cffc6cb6ac1fea8e9c99d8501c8b95e93272279f | [
"MIT"
] | 10 | 2020-02-13T19:26:38.000Z | 2021-12-13T08:28:04.000Z | helpers.py | sleeyax/PyDeobfuscator | cffc6cb6ac1fea8e9c99d8501c8b95e93272279f | [
"MIT"
] | null | null | null | helpers.py | sleeyax/PyDeobfuscator | cffc6cb6ac1fea8e9c99d8501c8b95e93272279f | [
"MIT"
] | 3 | 2020-10-03T18:43:50.000Z | 2021-04-28T14:12:48.000Z | # read file contents to string
def read_file_contents(file):
with open(file, 'r') as f:
return f.read()
| 23.2 | 30 | 0.655172 | 19 | 116 | 3.894737 | 0.631579 | 0.216216 | 0.432432 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.232759 | 116 | 4 | 31 | 29 | 0.831461 | 0.241379 | 0 | 0 | 0 | 0 | 0.011628 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
182d22273052a572a07bf88d4f39b7cd98f52b6a | 73 | py | Python | lib/backbone/__init__.py | d4l-data4life/BBNOrchestra-for-VQAmed2021 | b083c0c9c2be73419ca43536bf185fff379cf98c | [
"MIT"
] | 2 | 2021-05-31T08:45:42.000Z | 2021-09-08T07:46:34.000Z | lib/backbone/__init__.py | d4l-data4life/BBNOrchestra-for-VQAmed2021 | b083c0c9c2be73419ca43536bf185fff379cf98c | [
"MIT"
] | null | null | null | lib/backbone/__init__.py | d4l-data4life/BBNOrchestra-for-VQAmed2021 | b083c0c9c2be73419ca43536bf185fff379cf98c | [
"MIT"
] | null | null | null | from .resnet import bbn_res50, bbn_res34
from .resnest import bbn_ress50
| 24.333333 | 40 | 0.835616 | 12 | 73 | 4.833333 | 0.666667 | 0.310345 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 0.123288 | 73 | 2 | 41 | 36.5 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
184a031b397e07a4967ebc22e6ecfb412ccd1bbd | 47,359 | py | Python | pirates/leveleditor/worldData/anvil_island_area_barbossa_cave.py | itsyaboyrocket/pirates | 6ca1e7d571c670b0d976f65e608235707b5737e3 | [
"BSD-3-Clause"
] | 3 | 2021-02-25T06:38:13.000Z | 2022-03-22T07:00:15.000Z | pirates/leveleditor/worldData/anvil_island_area_barbossa_cave.py | itsyaboyrocket/pirates | 6ca1e7d571c670b0d976f65e608235707b5737e3 | [
"BSD-3-Clause"
] | null | null | null | pirates/leveleditor/worldData/anvil_island_area_barbossa_cave.py | itsyaboyrocket/pirates | 6ca1e7d571c670b0d976f65e608235707b5737e3 | [
"BSD-3-Clause"
] | 1 | 2021-02-25T06:38:17.000Z | 2021-02-25T06:38:17.000Z | # uncompyle6 version 3.2.0
# Python bytecode 2.4 (62061)
# Decompiled from: Python 2.7.14 (v2.7.14:84471935ed, Sep 16 2017, 20:19:30) [MSC v.1500 32 bit (Intel)]
# Embedded file name: pirates.leveleditor.worldData.anvil_island_area_barbossa_cave
from pandac.PandaModules import Point3, VBase3, Vec4, Vec3
objectStruct = {'Objects': {'1172209006.11sdnaik': {'Type': 'Island Game Area', 'Name': 'anvil_island_area_barbossa_cave', 'File': '', 'Environment': 'Cave', 'Footstep Sound': 'Sand', 'Instanced': True, 'Minimap': False, 'Objects': {'1172209074.56sdnaik': {'Type': 'Locator Node', 'Name': 'portal_interior_1', 'Hpr': VBase3(95.675, 0.0, 0.0), 'Pos': Point3(85.919, -190.083, 24.757), 'Scale': VBase3(1.0, 1.0, 1.0)}, '1172618710.78sdnaik': {'Type': 'Townsperson', 'Category': 'Cast', 'AnimSet': 'cb_apple', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'models/char/cb_2000', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'HelpID': 'NONE', 'Hpr': VBase3(-111.252, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '12.0000', 'Pos': Point3(-21.98, 19.615, 6.041), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PORT_ROYAL_DEFAULTS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Villager', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1173468367.09kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(65.89, 0.0, 10.222), 'Pos': Point3(-22.972, 16.554, 6.811), 'Scale': VBase3(0.863, 0.863, 0.863), 'Visual': {'Model': 'models/props/treasureChest_open'}}, '1173468423.53kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(81.692, 0.0, 0.0), 'Objects': {'1173471720.95kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(175.579, 56.568, 0.0), 'Pos': Point3(-1.241, 2.011, 0.413), 'Scale': VBase3(1.362, 1.362, 1.362), 'Visual': {'Model': 'models/props/treasure_sconce'}}}, 'Pos': Point3(-21.29, 7.239, 3.606), 'Scale': VBase3(0.734, 0.734, 0.734), 'Visual': {'Model': 
'models/props/treasureTrough'}}, '1173468471.78kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-27.286, 0.0, 0.0), 'Objects': {'1173471825.44kmuller': {'Type': 'Furniture - Fancy', 'DisableCollision': False, 'Hpr': VBase3(103.874, -24.165, 0.0), 'Pos': Point3(-0.17, 3.557, 0.148), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/chair_fancy'}}, '1173471860.03kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(27.286, 0.0, 0.0), 'Pos': Point3(-4.353, 2.819, 4.024), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (1.0, 0.71, 0.82, 1.0), 'Model': 'models/props/bottle_red'}}}, 'Pos': Point3(-33.657, -18.259, 3.981), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureTrough'}}, '1173468497.0kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-52.269, 1.207, 1.067), 'Objects': {'1173471924.11kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-176.001, 0.143, -1.083), 'Pos': Point3(-1.148, -7.0, 0.554), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureChest_open'}}, '1173471969.92kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(51.827, -19.244, -1.327), 'Pos': Point3(-3.079, 0.042, 0.331), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasure_chandelier'}}, '1173472001.3kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(54.562, 6.976, -16.417), 'Pos': Point3(-1.915, 3.088, 0.404), 'Scale': VBase3(1.072, 1.072, 1.072), 'Visual': {'Model': 'models/props/bottle_green'}}, '1173473947.56kmuller': {'Type': 'Trunks', 'DisableCollision': False, 'Hpr': VBase3(-177.109, -1.995, -1.679), 'Pos': Point3(1.736, -1.62, 0.559), 'Scale': VBase3(0.761, 0.761, 0.761), 'Visual': {'Color': (0.7200000286102295, 0.699999988079071, 0.5899999737739563, 1.0), 'Model': 'models/props/Trunk_rounded_2'}}}, 'Pos': Point3(-24.383, -15.746, 3.792), 
'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureTrough'}}, '1173471575.44kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(56.363, 0.0, 0.0), 'Objects': {'1173471627.81kmuller': {'Type': 'Barrel', 'DisableCollision': False, 'Hpr': VBase3(-56.363, 4.799, -7.37), 'Pos': Point3(-7.898, -2.057, -0.921), 'Scale': VBase3(0.778, 0.778, 0.778), 'Visual': {'Color': (0.7099999785423279, 0.6700000166893005, 0.6000000238418579, 1.0), 'Model': 'models/props/barrel_worn'}}, '1173471671.51kmuller': {'Type': 'Wall_Hangings', 'DisableCollision': False, 'Hpr': VBase3(-161.949, 16.647, 13.013), 'Pos': Point3(4.633, 5.65, 3.513), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/seascape_battle'}}, '1173472099.22kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(-56.744, -12.669, -6.175), 'Pos': Point3(-1.816, -0.758, 0.362), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.6000000238418579, 0.7200000286102295, 0.6000000238418579, 1.0), 'Model': 'models/props/bottle_tan'}}, '1250877604.84akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(78.429, 1.866, -11.711), 'Pos': Point3(-1.296, 3.362, 0.865), 'Scale': VBase3(0.84, 0.84, 0.84), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}}, 'Pos': Point3(-33.146, -6.186, 3.947), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureTrough'}}, '1173471597.2kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Objects': {}, 'Pos': Point3(-34.728, 2.632, 3.882), 'Scale': VBase3(1.221, 1.221, 1.221), 'Visual': {'Model': 'models/props/treasureTrough_single'}}, '1173471783.73kmuller': {'Type': 'Crate', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-35.795, -19.76, 4.397), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/crates_group_2'}}, '1173472048.89kmuller': {'Type': 'Treasure Chest', 
'DisableCollision': False, 'Hpr': VBase3(65.655, 0.0, 0.0), 'Pos': Point3(-29.55, 1.392, 4.183), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureChest_closed'}}, '1173472175.67kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(0.0, 0.494, 1.263), 'Objects': {'1173472214.83kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(0.0, 0.0, 0.0), 'Pos': Point3(3.135, -4.125, 0.325), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureTrough_single'}}, '1173472843.22kmuller': {'Type': 'Barrel', 'DisableCollision': False, 'Hpr': VBase3(0.0, 1.884, 0.947), 'Pos': Point3(2.816, 3.404, 0.009), 'Scale': VBase3(0.784, 0.784, 0.784), 'Visual': {'Color': (0.7200000286102295, 0.699999988079071, 0.5899999737739563, 1.0), 'Model': 'models/props/barrel_grey'}}, '1173473890.48kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-74.672, -3.73, -2.843), 'Pos': Point3(1.553, -1.297, 0.889), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureChest_open'}}, '1173474959.44kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(0.0, 12.717, 0.0), 'Pos': Point3(-1.903, -0.442, 0.318), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/bottle_red'}}}, 'Pos': Point3(26.083, -7.65, 2.748), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureTrough'}}, '1173472197.86kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-43.152, 0.0, 0.0), 'Objects': {'1173474384.75kmuller': {'Type': 'Trunks', 'DisableCollision': False, 'Hpr': VBase3(-24.158, 0.0, 0.0), 'Pos': Point3(-2.887, -1.469, 0.652), 'Scale': VBase3(0.789, 0.789, 0.789), 'Visual': {'Color': (0.49000000953674316, 0.47999998927116394, 0.4000000059604645, 1.0), 'Model': 'models/props/Trunk_rounded'}}, '1250877328.98akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(34.287, 
-13.693, -15.183), 'Pos': Point3(2.829, 1.448, -0.03), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}}, 'Pos': Point3(25.568, 3.214, 2.794), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureTrough'}}, '1173472392.78kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-17.706, 0.035, -0.11), 'Objects': {'1250877589.53akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(134.792, 1.866, -11.711), 'Pos': Point3(-0.072, -0.955, 1.21), 'Scale': VBase3(0.84, 0.84, 0.84), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}}, 'Pos': Point3(-9.835, -31.083, 3.176), 'Scale': VBase3(1.19, 1.19, 1.19), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173472402.28kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-17.706, 0.035, 0.005), 'Pos': Point3(-10.198, -24.927, 3.4), 'Scale': VBase3(0.931, 0.931, 0.931), 'Visual': {'Model': 'models/props/treasureTrough_single'}}, '1173472651.34kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-0.111, -14.424, -1.428), 'Pos': Point3(-31.521, 9.349, 8.04), 'Scale': VBase3(0.775, 0.775, 0.775), 'Visual': {'Model': 'models/props/treasureTrough_single'}}, '1173472721.78kmuller': {'Type': 'Barrel', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-33.341, 6.35, 9.026), 'Scale': VBase3(0.675, 0.675, 0.675), 'Visual': {'Color': (0.7200000286102295, 0.699999988079071, 0.5899999737739563, 1.0), 'Model': 'models/props/barrel_grey'}}, '1173473917.97kmuller': {'Type': 'Wall_Hangings', 'DisableCollision': False, 'Hpr': VBase3(72.697, 33.022, 29.472), 'Pos': Point3(26.998, 6.135, 4.941), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/portrait_gov'}}, '1173474152.67kmuller': {'Type': 'Crate', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 
'Pos': Point3(-33.123, -4.407, 4.426), 'Scale': VBase3(0.825, 0.825, 0.825), 'Visual': {'Color': (0.7099999785423279, 0.6700000166893005, 0.6000000238418579, 1.0), 'Model': 'models/props/crate_04'}}, '1173474457.5kmuller': {'Type': 'Crate', 'DisableCollision': False, 'Hpr': VBase3(30.719, 0.0, 0.0), 'Pos': Point3(25.244, -4.318, 3.273), 'Scale': VBase3(0.799, 0.799, 0.799), 'Visual': {'Color': (0.49000000953674316, 0.47999998927116394, 0.4000000059604645, 1.0), 'Model': 'models/props/crate'}}, '1173474718.7kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(170.127, 0.0, 0.0), 'Pos': Point3(-19.421, -19.11, 3.925), 'Scale': VBase3(0.738, 0.738, 0.738), 'Visual': {'Model': 'models/props/treasureChest_closed'}}, '1173475002.56kmuller': {'Type': 'Crate', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(24.603, 0.0, 0.0), 'Pos': Point3(-105.99, -27.167, 41.989), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (0.899999976158, 0.899999976158, 0.699999988079, 1.0), 'Model': 'models/props/crates_group_1'}}, '1173475022.66kmuller': {'Type': 'Barrel', 'DisableCollision': True, 'Holiday': '', 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-101.726, -21.398, 40.132), 'Scale': VBase3(0.728, 0.728, 0.728), 'VisSize': '', 'Visual': {'Color': (0.91, 0.86, 0.65, 1.0), 'Model': 'models/props/barrel_worn'}}, '1173475076.05kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(0.175, 13.381, 11.028), 'Objects': {'1173475259.56kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(-5.229, -24.249, -12.155), 'Pos': Point3(2.971, 0.399, -0.891), 'Scale': VBase3(1.792, 1.792, 1.792), 'Visual': {'Model': 'models/props/largejug_A'}}, '1173475343.87kmuller': {'Type': 'Wall_Hangings', 'DisableCollision': False, 'Hpr': VBase3(-54.715, 21.019, -3.286), 'Pos': Point3(-0.355, -3.566, 1.789), 'Scale': VBase3(0.636, 0.636, 0.636), 'Visual': {'Model': 'models/props/seascape_port'}}}, 'Pos': 
Point3(-102.684, -32.061, 40.039), 'Scale': VBase3(1.572, 1.572, 1.572), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173475106.3kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-88.578, -13.952, 0.629), 'Pos': Point3(-97.499, -26.255, 38.394), 'Scale': VBase3(1.53, 1.53, 1.53), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173475150.47kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(102.4, 18.802, 4.057), 'Pos': Point3(-96.706, -21.344, 39.702), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/treasureChest_open'}}, '1173475212.69kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(2.574, -0.798, 14.081), 'Pos': Point3(-97.35, -18.315, 38.277), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173475484.11kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(0.191, -7.866, 3.947), 'Objects': {'1173475560.97kmuller': {'Type': 'Barrel', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-17.717, 8.694, -1.336), 'Pos': Point3(-4.788, -3.67, -0.11), 'Scale': VBase3(0.662, 0.662, 0.662), 'VisSize': '', 'Visual': {'Color': (0.589999973774, 0.589999973774, 0.490000009537, 1.0), 'Model': 'models/props/barrel_group_3'}}, '1173475599.84kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(161.762, -8.706, 1.257), 'Pos': Point3(-1.743, -0.101, 0.606), 'Scale': VBase3(0.724, 0.724, 0.724), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureChest_open'}}, '1173475645.34kmuller': {'Type': 'Furniture - Fancy', 'DisableCollision': False, 'Hpr': VBase3(-27.381, -30.075, -28.026), 'Pos': Point3(3.899, 
1.605, 0.527), 'Scale': VBase3(0.724, 0.724, 0.724), 'Visual': {'Model': 'models/props/stool_fancy'}}, '1173475705.97kmuller': {'Type': 'Trunks', 'DisableCollision': False, 'Hpr': VBase3(-170.091, -1.954, 5.226), 'Pos': Point3(-2.933, 2.714, 0.278), 'Scale': VBase3(0.724, 0.724, 0.724), 'Visual': {'Model': 'models/props/Trunk_rounded_2'}}, '1173494051.56kmuller': {'Type': 'Trunks', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(132.966, -4.395, -7.056), 'Pos': Point3(5.434, -3.625, 0.838), 'Scale': VBase3(0.747, 0.747, 0.747), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/Trunk_rounded_2'}}}, 'Pos': Point3(-71.338, -79.459, 29.129), 'Scale': VBase3(1.38, 1.38, 1.38), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough'}}, '1173475529.08kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(0.923, 0.0, 0.0), 'Pos': Point3(-77.951, -72.216, 28.19), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173476346.23kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-61.004, -84.022, 28.875), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173476370.78kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(0.0, 10.732, 0.0), 'Pos': Point3(-64.089, -80.774, 31.04), 'Scale': VBase3(1.115, 1.115, 1.115), 'Visual': {'Model': 'models/props/bottle_green'}}, '1173476412.98kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(-8.405, 28.067, 6.498), 'Pos': Point3(-61.858, -81.625, 29.208), 'Scale': VBase3(2.558, 2.558, 2.558), 'Visual': {'Model': 'models/props/waterpitcher'}}, '1173476478.11kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 
'Hpr': VBase3(0.0, -8.664, 0.0), 'Objects': {'1173476535.72kmuller': {'Type': 'Crate', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-70.524, -1.407, 8.187), 'Pos': Point3(2.747, -5.673, 0.935), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (0.589999973774, 0.589999973774, 0.490000009537, 1.0), 'Model': 'models/props/crates_group_1'}}, '1173476711.03kmuller': {'Type': 'Barrel', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-0.533, 14.375, 5.277), 'Pos': Point3(-0.415, -3.12, 0.775), 'Scale': VBase3(0.943, 0.943, 0.943), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/barrel_grey'}}, '1173476839.64kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(0.0, 8.664, -10.81), 'Pos': Point3(-1.813, -0.603, 0.396), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/bottle_green'}}}, 'Pos': Point3(79.984, -90.141, 5.219), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough'}}, '1173476568.78kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-26.274, -11.342, -5.545), 'Objects': {'1173476683.0kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-103.181, -7.123, 3.621), 'Pos': Point3(-1.421, -0.953, 0.526), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureChest_open'}}}, 'Pos': Point3(78.207, -95.333, 6.207), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough'}}, '1173476595.0kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-0.003, -10.571, 1.208), 'Objects': {'1173476640.59kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(0.229, 
10.568, -1.229), 'Pos': Point3(-2.054, -0.088, 2.839), 'Scale': VBase3(1.172, 1.172, 1.172), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/bottle_red'}}}, 'Pos': Point3(87.998, -95.619, 6.261), 'Scale': VBase3(1.28, 1.28, 1.28), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173476807.97kmuller': {'Type': 'Furniture - Fancy', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-105.689, -0.169, -86.025), 'Pos': Point3(80.358, -92.174, 6.756), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/chair_fancy'}}, '1173476913.16kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-0.527, -13.011, -4.646), 'Objects': {'1173476992.98kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-159.013, 0.788, 8.363), 'Pos': Point3(5.298, -0.068, 0.456), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureChest_closed'}}, '1173494163.92kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(-0.267, 13.019, 6.306), 'Pos': Point3(-1.023, 2.279, 0.473), 'Scale': VBase3(1.375, 1.375, 1.375), 'Visual': {'Color': (1.0, 0.89, 0.77, 1.0), 'Model': 'models/props/bottle_red'}}}, 'Pos': Point3(-56.248, -79.205, 10.348), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173476951.55kmuller': {'Type': 'Treasure Chest', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-0.296, -14.959, -4.882), 'Pos': Point3(-60.628, -75.756, 9.378), 'Scale': VBase3(0.807, 0.807, 0.807), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/treasureTrough_single'}}, '1173477100.75kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(0.0, 0.0, 0.901), 
'Objects': {'1173477184.05kmuller': {'Type': 'Jugs_and_Jars', 'DisableCollision': False, 'Hpr': VBase3(5.719, 18.512, -32.928), 'Pos': Point3(-0.84, -0.577, 1.416), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.28999999165534973, 0.4000000059604645, 0.46000000834465027, 1.0), 'Model': 'models/props/bottle_tan'}}, '1250877310.59akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-172.606, -0.064, 15.101), 'Pos': Point3(1.441, 0.427, -0.478), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}}, 'Pos': Point3(-2.817, 26.98, 3.312), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureTrough_single'}}, '1173477117.31kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(0.774, 22.338, 3.276), 'Scale': VBase3(0.649, 0.649, 0.649), 'Visual': {'Model': 'models/props/treasureTrough_single'}}, '1173477133.91kmuller': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-7.998, 0.0, 0.0), 'Pos': Point3(-1.361, 23.038, 3.417), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/treasureChest_closed'}}, '1173494260.01kmuller': {'Type': 'Cups', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-23.149, 34.955, -16.484), 'Pos': Point3(-59.985, -74.367, 10.035), 'Scale': VBase3(1.402, 1.402, 1.402), 'VisSize': '', 'Visual': {'Model': 'models/props/beerstein'}}, '1175216546.7kmuller': {'Type': 'Tunnel Cap', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-179.257, 0.0, 0.0), 'Pos': Point3(87.158, -179.257, 26.766), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (0.40000000596, 0.40000000596, 0.40000000596, 1.0), 'Model': 'models/tunnels/pir_m_are_tun_caveInterior_cap'}}, '1175912064.0JB2': {'Type': 'Animal', 'Hpr': Point3(0.0, 0.0, 0.0), 'MinHP': 10, 'Patrol Radius': 12, 'Pos': Point3(-0.907, -7.683, 4.223), 'PoseAnim': '', 'PoseFrame': '', 'Respawns': True, 
'Scale': VBase3(1.0, 1.0, 1.0), 'Species': 'Monkey', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 1, 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1176164583.28dzlu': {'Type': 'Light - Dynamic', 'Attenuation': '0.005', 'ConeAngle': '15.0000', 'DropOff': '90.0000', 'FlickRate': 0.5, 'Flickering': False, 'Hpr': VBase3(-137.198, -12.488, -99.924), 'Intensity': '0.3939', 'LightType': 'SPOT', 'Pos': Point3(-25.129, 29.622, 13.381), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (1, 1, 1, 1), 'Model': 'models/props/light_tool_bulb'}}, '1176165564.08dzlu': {'Type': 'Light - Dynamic', 'Attenuation': '0.005', 'ConeAngle': '108.8636', 'DropOff': '65.4545', 'FlickRate': 0.5, 'Flickering': False, 'Hpr': VBase3(98.927, 18.357, -92.791), 'Intensity': '0.4545', 'LightType': 'SPOT', 'Pos': Point3(-8.182, 23.236, 6.965), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (1, 1, 1, 1), 'Model': 'models/props/light_tool_bulb'}}, '1176167113.18dzlu': {'Type': 'Light - Dynamic', 'Attenuation': '0.005', 'ConeAngle': '60.0000', 'DropOff': '0.0000', 'FlickRate': 0.5, 'Flickering': False, 'Hpr': VBase3(0.0, 0.0, -0.474), 'Intensity': '0.2121', 'LightType': 'POINT', 'Pos': Point3(-6.907, 19.341, 6.928), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (1.0, 0.7799999713897705, 0.5299999713897705, 1.0), 'Model': 'models/props/light_tool_bulb'}}, '1176225288.77dzlu': {'Type': 'Light - Dynamic', 'Attenuation': '0.005', 'ConeAngle': '60.0000', 'DropOff': '12.2727', 'FlickRate': 0.5, 'Flickering': True, 'Hpr': VBase3(1.676, 25.173, 33.894), 'Intensity': '1.6667', 'LightType': 'POINT', 'Pos': Point3(-21.229, 19.109, 9.82), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Color': (0.9900000095367432, 0.8399999737739563, 0.5600000023841858, 1.0), 'Model': 'models/props/light_tool_bulb'}}, '1176342559.48dzlu': {'Type': 'Light - Dynamic', 'Attenuation': '0.005', 'ConeAngle': '65.9091', 'DropOff': '24.5455', 'FlickRate': 0.5, 'Flickering': False, 'Hpr': VBase3(-55.948, -11.457, -89.954), 
'Intensity': '0.6061', 'LightType': 'SPOT', 'Pos': Point3(-26.332, 18.178, 11.598), 'Scale': VBase3(1.0, 1.0, 1.0), 'Visual': {'Model': 'models/props/light_tool_bulb'}}, '1177608371.49dzlu': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(-17.785, -3.09, 1.969), 'Pos': Point3(-4.896, -32.682, 3.386), 'Scale': VBase3(0.59, 0.59, 0.59), 'Visual': {'Model': 'models/props/treasureTrough_single'}}, '1178323712.0dchiappe': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(65.89, 0.0, 10.222), 'Pos': Point3(-8.318, 16.465, 3.695), 'Scale': VBase3(0.863, 0.863, 0.863), 'Visual': {'Model': 'models/props/treasureChest_open'}}, '1178323712.0dchiappe0': {'Type': 'Treasure Chest', 'DisableCollision': False, 'Hpr': VBase3(65.89, 0.0, 10.222), 'Pos': Point3(-8.318, 16.465, 3.695), 'Scale': VBase3(0.863, 0.863, 0.863), 'Visual': {'Model': 'models/props/treasureChest_open'}}, '1213983053.36aapatel': {'Type': 'Townsperson', 'Category': 'PvPRewards', 'AnimSet': 'default', 'AuraFX': 'None', 'Boss': False, 'CustomModel': 'None', 'GhostColor': 'None', 'GhostFX': 0, 'Greeting Animation': '', 'HelpID': 'NONE', 'Holiday': '', 'Hpr': VBase3(169.842, 0.0, 0.0), 'Instanced World': 'None', 'Level': '37', 'Notice Animation 1': '', 'Notice Animation 2': '', 'Patrol Radius': '12.0000', 'Pos': Point3(9.477, 14.543, 3.276), 'PoseAnim': '', 'PoseFrame': '', 'Private Status': 'All', 'PropFXLeft': 'None', 'PropFXRight': 'None', 'PropLeft': 'None', 'PropRight': 'None', 'Respawns': True, 'Scale': VBase3(1.0, 1.0, 1.0), 'ShopID': 'PRIVATEER_TATTOOS', 'Start State': 'Idle', 'StartFrame': '0', 'Team': 'Villager', 'TrailFX': 'None', 'TrailLeft': 'None', 'TrailRight': 'None', 'VisSize': '', 'Zombie': False, 'spawnTimeAlt': '', 'spawnTimeBegin': 0.0, 'spawnTimeEnd': 0.0}, '1250816577.11akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(-70.015, 67.695, 0.088), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': 
'', 'Visual': {'Model': 'models/props/torch'}}, '1250816607.06akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-6.294, 0.0, 0.0), 'Pos': Point3(-21.687, 79.745, -0.936), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250876943.56akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-86.964, 0.0, 0.0), 'Pos': Point3(114.079, 67.157, -0.98), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250877014.44akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-86.964, 0.0, 0.0), 'Pos': Point3(93.212, -32.115, -0.131), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250877050.25akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-86.964, 0.0, 0.0), 'Pos': Point3(150.027, -51.636, -1.113), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250877088.86akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-158.034, 0.0, 0.0), 'Pos': Point3(72.603, -80.586, 0.284), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250877112.28akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-158.034, 0.0, 0.0), 'Pos': Point3(52.532, -98.638, 10.099), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250877144.7akelts': {'Type': 'Light_Fixtures', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(131.997, -11.72, 0.0), 'Pos': Point3(16.811, -90.386, 2.429), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/torch'}}, '1250877166.08akelts': {'Type': 'Light_Fixtures', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(131.997, -0.317, 
0.0), 'Pos': Point3(-35.907, -92.812, -1.333), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/torch'}}, '1250877270.7akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(131.997, -0.317, 0.0), 'Pos': Point3(-18.641, 23.404, 3.92), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250877286.14akelts': {'Type': 'Light_Fixtures', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-172.608, -0.18, 14.208), 'Pos': Point3(-9.33, 37.096, 2.003), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Model': 'models/props/torch'}}, '1250877376.02akelts': {'Type': 'Light_Fixtures', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-79.034, 0.0, -6.327), 'Pos': Point3(32.824, -16.682, 3.201), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/torch'}}, '1250877472.52akelts': {'Type': 'Light_Fixtures', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(117.084, 1.763, -11.659), 'Pos': Point3(7.106, -40.55, 3.447), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/props/torch'}}, '1251140212.95piwanow': {'Type': 'Effect Node', 'EffectName': 'watersplash_effect', 'Hpr': VBase3(-128.191, 0.0, 0.0), 'Pos': Point3(-85.61, 48.499, 2.16), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (0, 0, 0.65, 1), 'Model': 'models/misc/smiley'}}, '1251140289.59piwanow': {'Type': 'Effect Node', 'EffectName': 'watersplash_effect', 'Hpr': VBase3(-176.566, 0.0, 0.0), 'Pos': Point3(8.715, 78.784, 2.884), 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (0, 0, 0.65, 1), 'Model': 'models/misc/smiley'}}, '1251140327.19piwanow': {'Type': 'Effect Node', 'EffectName': 'watersplash_effect', 'Hpr': VBase3(152.045, 0.0, 0.0), 'Pos': Point3(74.63, 91.223, 3.637), 'Scale': VBase3(0.8, 0.8, 0.8), 
'VisSize': '', 'Visual': {'Color': (0, 0, 0.65, 1), 'Model': 'models/misc/smiley'}}, '1251140366.42piwanow': {'Type': 'Effect Node', 'EffectName': 'watersplash_effect', 'Hpr': VBase3(163.301, 0.0, 0.0), 'Pos': Point3(139.836, 48.412, 2.707), 'Scale': VBase3(0.8, 0.8, 0.8), 'VisSize': '', 'Visual': {'Color': (0, 0, 0.65, 1), 'Model': 'models/misc/smiley'}}, '1251148581.42piwanow': {'Type': 'Cave_Props', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(144.817, 0.0, 0.0), 'Objects': {'1251154942.28akelts': {'Type': 'Cave_Props', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-56.912, 0.0, 0.0), 'Pos': Point3(5.302, 0.976, -0.03), 'RenderEffect': False, 'Scale': VBase3(1.0, 1.0, 1.0), 'VisSize': '', 'Visual': {'Color': (0.346, 0.432, 0.391, 1.0), 'Model': 'models/props/pir_m_prp_cav_rockGroup_i'}}}, 'Pos': Point3(8.854, 77.776, -0.496), 'RenderEffect': False, 'Scale': VBase3(1.224, 1.224, 1.224), 'VisSize': '', 'Visual': {'Color': (0.323, 0.404, 0.365, 1.0), 'Model': 'models/props/pir_m_prp_cav_rockGroup_g'}}, '1251148751.73piwanow': {'Type': 'Cave_Props', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(40.355, 7.197, -21.452), 'Pos': Point3(76.426, 90.621, -2.115), 'RenderEffect': False, 'Scale': VBase3(0.2, 0.3, 0.286), 'VisSize': '', 'Visual': {'Color': (0.32, 0.44, 0.41, 1.0), 'Model': 'models/props/pir_m_prp_cav_rockGroup_a'}}, '1251149030.66piwanow': {'Type': 'Cave_Props', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(-47.141, 0.0, 0.0), 'Pos': Point3(137.696, 40.061, -1.582), 'RenderEffect': False, 'Scale': VBase3(0.568, 0.451, 0.451), 'VisSize': '', 'Visual': {'Color': (0.279, 0.349, 0.341, 1.0), 'Model': 'models/props/pir_m_prp_cav_rockGroup_c'}}, '1251149293.97piwanow': {'Type': 'Cave_Props', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-92.315, 0.0, 0.0), 'Pos': Point3(-85.427, 47.662, -1.031), 'RenderEffect': False, 'Scale': VBase3(1.838, 1.838, 1.838), 'VisSize': '', 'Visual': {'Color': (0.27, 0.329, 0.293, 1.0), 
'Model': 'models/props/pir_m_prp_cav_rockGroup_k'}}, '1251154757.73akelts': {'Type': 'Cave_Props', 'DisableCollision': True, 'Holiday': '', 'Hpr': VBase3(41.068, -6.877, -0.44), 'Pos': Point3(86.04, 90.778, -1.571), 'RenderEffect': False, 'Scale': VBase3(1.389, 2.083, 1.986), 'VisSize': '', 'Visual': {'Color': (0.324, 0.438, 0.411, 1.0), 'Model': 'models/props/pir_m_prp_cav_rockGroup_k'}}, '1251155149.92akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': Point3(0.0, 0.0, 0.0), 'Pos': Point3(82.063, 88.344, 1.109), 'Scale': VBase3(1.779, 1.779, 1.779), 'VisSize': '', 'Visual': {'Color': (1.0, 1.0, 1.0, 1.0), 'Model': 'models/misc/pir_m_prp_lev_cambarrier_sphere'}}, '1251161466.69akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-72.323, 0.0, 0.0), 'Pos': Point3(119.014, 52.431, -2.076), 'Scale': VBase3(1.14, 1.354, 1.875), 'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}, '1251161509.13akelts': {'Type': 'Collision Barrier', 'DisableCollision': False, 'Holiday': '', 'Hpr': VBase3(-59.964, 0.0, 0.0), 'Pos': Point3(130.185, 30.776, -2.195), 'Scale': VBase3(3.834, 1.354, 1.875), 'VisSize': '', 'Visual': {'Model': 'models/misc/pir_m_prp_lev_cambarrier_plane'}}}, 'Visibility': 'Grid', 'Visual': {'Model': 'models/caves/pir_m_are_cav_barbossa'}}}, 'TodSettings': {'AmbientColors': {0: Vec4(0.45, 0.53, 0.65, 1), 2: Vec4(0.537255, 0.494118, 0.627451, 1), 4: Vec4(0.4, 0.447059, 0.498039, 1), 6: Vec4(0.439216, 0.447059, 0.556863, 1), 8: Vec4(0.388235, 0.419608, 0.537255, 1), 12: Vec4(0.337255, 0.278431, 0.407843, 1), 13: Vec4(0.337255, 0.278431, 0.407843, 1), 16: Vec4(0.247059, 0.247059, 0.247059, 1), 17: Vec4(0.34, 0.28, 0.41, 1)}, 'DirectionalColors': {0: Vec4(0.55, 0.46, 0.35, 1), 2: Vec4(0.458824, 0.458824, 0.364706, 1), 4: Vec4(0.6, 0.337255, 0.0980392, 1), 6: Vec4(0.458824, 0.478431, 0.447059, 1), 8: Vec4(0.419608, 0.419608, 0.4, 1), 12: Vec4(0.658824, 
0.756863, 0.0470588, 1), 13: Vec4(0.658824, 0.756863, 0.0470588, 1), 16: Vec4(0, 0, 0, 1), 17: Vec4(0.66, 0.76, 0.05, 1)}, 'FogColors': {0: Vec4(0.3, 0.2, 0.15, 0), 2: Vec4(0.6, 0.694118, 0.894118, 1), 4: Vec4(0.298039, 0.176471, 0.14902, 1), 6: Vec4(0.14902, 0.2, 0.34902, 1), 8: Vec4(0.0470588, 0.0588235, 0.168627, 1), 12: Vec4(0.0980392, 0.117647, 0.027451, 1), 13: Vec4(0.0980392, 0.117647, 0.027451, 1), 16: Vec4(0.054902, 0.0392157, 0, 1), 17: Vec4(0.1, 0.12, 0.03, 0)}, 'FogRanges': {0: 0.0001, 2: 9.999999747378752e-05, 4: 9.999999747378752e-05, 6: 9.999999747378752e-05, 8: 0.00019999999494757503, 12: 0.0002500000118743628, 13: 0.0002500000118743628, 16: 9.999999747378752e-05, 17: 0.005}, 'LinearFogRanges': {0: (0.0, 100.0), 2: (0.0, 100.0), 4: (0.0, 100.0), 6: (0.0, 100.0), 8: (0.0, 100.0), 12: (0.0, 100.0), 13: (0.0, 100.0), 16: (150.0, 300.0), 17: (0.0, 100.0)}}, 'Node Links': [], 'Layers': {}, 'ObjectIds': {'1172209006.11sdnaik': '["Objects"]["1172209006.11sdnaik"]', '1172209074.56sdnaik': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1172209074.56sdnaik"]', '1172618710.78sdnaik': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1172618710.78sdnaik"]', '1173468367.09kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468367.09kmuller"]', '1173468423.53kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468423.53kmuller"]', '1173468471.78kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468471.78kmuller"]', '1173468497.0kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468497.0kmuller"]', '1173471575.44kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173471575.44kmuller"]', '1173471597.2kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173471597.2kmuller"]', '1173471627.81kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173471575.44kmuller"]["Objects"]["1173471627.81kmuller"]', '1173471671.51kmuller': 
'["Objects"]["1172209006.11sdnaik"]["Objects"]["1173471575.44kmuller"]["Objects"]["1173471671.51kmuller"]', '1173471720.95kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468423.53kmuller"]["Objects"]["1173471720.95kmuller"]', '1173471783.73kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173471783.73kmuller"]', '1173471825.44kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468471.78kmuller"]["Objects"]["1173471825.44kmuller"]', '1173471860.03kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468471.78kmuller"]["Objects"]["1173471860.03kmuller"]', '1173471924.11kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468497.0kmuller"]["Objects"]["1173471924.11kmuller"]', '1173471969.92kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468497.0kmuller"]["Objects"]["1173471969.92kmuller"]', '1173472001.3kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468497.0kmuller"]["Objects"]["1173472001.3kmuller"]', '1173472048.89kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472048.89kmuller"]', '1173472099.22kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173471575.44kmuller"]["Objects"]["1173472099.22kmuller"]', '1173472175.67kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472175.67kmuller"]', '1173472197.86kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472197.86kmuller"]', '1173472214.83kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472175.67kmuller"]["Objects"]["1173472214.83kmuller"]', '1173472392.78kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472392.78kmuller"]', '1173472402.28kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472402.28kmuller"]', '1173472651.34kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472651.34kmuller"]', '1173472721.78kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472721.78kmuller"]', '1173472843.22kmuller': 
'["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472175.67kmuller"]["Objects"]["1173472843.22kmuller"]', '1173473890.48kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472175.67kmuller"]["Objects"]["1173473890.48kmuller"]', '1173473917.97kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173473917.97kmuller"]', '1173473947.56kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173468497.0kmuller"]["Objects"]["1173473947.56kmuller"]', '1173474152.67kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173474152.67kmuller"]', '1173474384.75kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472197.86kmuller"]["Objects"]["1173474384.75kmuller"]', '1173474457.5kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173474457.5kmuller"]', '1173474718.7kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173474718.7kmuller"]', '1173474959.44kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472175.67kmuller"]["Objects"]["1173474959.44kmuller"]', '1173475002.56kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475002.56kmuller"]', '1173475022.66kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475022.66kmuller"]', '1173475076.05kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475076.05kmuller"]', '1173475106.3kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475106.3kmuller"]', '1173475150.47kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475150.47kmuller"]', '1173475212.69kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475212.69kmuller"]', '1173475259.56kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475076.05kmuller"]["Objects"]["1173475259.56kmuller"]', '1173475343.87kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475076.05kmuller"]["Objects"]["1173475343.87kmuller"]', '1173475484.11kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475484.11kmuller"]', '1173475529.08kmuller': 
'["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475529.08kmuller"]', '1173475560.97kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475484.11kmuller"]["Objects"]["1173475560.97kmuller"]', '1173475599.84kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475484.11kmuller"]["Objects"]["1173475599.84kmuller"]', '1173475645.34kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475484.11kmuller"]["Objects"]["1173475645.34kmuller"]', '1173475705.97kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475484.11kmuller"]["Objects"]["1173475705.97kmuller"]', '1173476346.23kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476346.23kmuller"]', '1173476370.78kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476370.78kmuller"]', '1173476412.98kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476412.98kmuller"]', '1173476478.11kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476478.11kmuller"]', '1173476535.72kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476478.11kmuller"]["Objects"]["1173476535.72kmuller"]', '1173476568.78kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476568.78kmuller"]', '1173476595.0kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476595.0kmuller"]', '1173476640.59kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476595.0kmuller"]["Objects"]["1173476640.59kmuller"]', '1173476683.0kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476568.78kmuller"]["Objects"]["1173476683.0kmuller"]', '1173476711.03kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476478.11kmuller"]["Objects"]["1173476711.03kmuller"]', '1173476807.97kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476807.97kmuller"]', '1173476839.64kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476478.11kmuller"]["Objects"]["1173476839.64kmuller"]', '1173476913.16kmuller': 
'["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476913.16kmuller"]', '1173476951.55kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476951.55kmuller"]', '1173476992.98kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476913.16kmuller"]["Objects"]["1173476992.98kmuller"]', '1173477100.75kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173477100.75kmuller"]', '1173477117.31kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173477117.31kmuller"]', '1173477133.91kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173477133.91kmuller"]', '1173477184.05kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173477100.75kmuller"]["Objects"]["1173477184.05kmuller"]', '1173494051.56kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173475484.11kmuller"]["Objects"]["1173494051.56kmuller"]', '1173494163.92kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173476913.16kmuller"]["Objects"]["1173494163.92kmuller"]', '1173494260.01kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173494260.01kmuller"]', '1175216546.7kmuller': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1175216546.7kmuller"]', '1175912064.0JB2': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1175912064.0JB2"]', '1176164583.28dzlu': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1176164583.28dzlu"]', '1176165564.08dzlu': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1176165564.08dzlu"]', '1176167113.18dzlu': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1176167113.18dzlu"]', '1176225288.77dzlu': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1176225288.77dzlu"]', '1176342559.48dzlu': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1176342559.48dzlu"]', '1177608371.49dzlu': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1177608371.49dzlu"]', '1178323712.0dchiappe': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1178323712.0dchiappe"]', '1178323712.0dchiappe0': 
'["Objects"]["1172209006.11sdnaik"]["Objects"]["1178323712.0dchiappe0"]', '1213983053.36aapatel': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1213983053.36aapatel"]', '1250816577.11akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250816577.11akelts"]', '1250816607.06akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250816607.06akelts"]', '1250876943.56akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250876943.56akelts"]', '1250877014.44akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877014.44akelts"]', '1250877050.25akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877050.25akelts"]', '1250877088.86akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877088.86akelts"]', '1250877112.28akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877112.28akelts"]', '1250877144.7akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877144.7akelts"]', '1250877166.08akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877166.08akelts"]', '1250877270.7akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877270.7akelts"]', '1250877286.14akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877286.14akelts"]', '1250877310.59akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173477100.75kmuller"]["Objects"]["1250877310.59akelts"]', '1250877328.98akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472197.86kmuller"]["Objects"]["1250877328.98akelts"]', '1250877376.02akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877376.02akelts"]', '1250877472.52akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1250877472.52akelts"]', '1250877589.53akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173472392.78kmuller"]["Objects"]["1250877589.53akelts"]', '1250877604.84akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1173471575.44kmuller"]["Objects"]["1250877604.84akelts"]', '1251140212.95piwanow': 
'["Objects"]["1172209006.11sdnaik"]["Objects"]["1251140212.95piwanow"]', '1251140289.59piwanow': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251140289.59piwanow"]', '1251140327.19piwanow': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251140327.19piwanow"]', '1251140366.42piwanow': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251140366.42piwanow"]', '1251148581.42piwanow': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251148581.42piwanow"]', '1251148751.73piwanow': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251148751.73piwanow"]', '1251149030.66piwanow': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251149030.66piwanow"]', '1251149293.97piwanow': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251149293.97piwanow"]', '1251154757.73akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251154757.73akelts"]', '1251154942.28akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251148581.42piwanow"]["Objects"]["1251154942.28akelts"]', '1251155149.92akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251155149.92akelts"]', '1251161466.69akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251161466.69akelts"]', '1251161509.13akelts': '["Objects"]["1172209006.11sdnaik"]["Objects"]["1251161509.13akelts"]'}}
extraInfo = {'camPos': Point3(12.9352, 2.9029, 10.1918), 'camHpr': VBase3(15.8438, -17.1455, 0), 'focalLength': 0.639999985695, 'skyState': -2, 'fog': 1} | 6,765.571429 | 46,900 | 0.651703 | 6,447 | 47,359 | 4.756631 | 0.156972 | 0.020283 | 0.01937 | 0.025696 | 0.574512 | 0.514642 | 0.451608 | 0.353942 | 0.278778 | 0.252984 | 0 | 0.27146 | 0.074833 | 47,359 | 7 | 46,901 | 6,765.571429 | 0.428438 | 0.005004 | 0 | 0 | 0 | 0 | 0.536249 | 0.246042 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
1850044d5d892365950ca734baed9e875d208d90 | 713 | py | Python | fiellclib/mr_project/data/root_config.py | leith-bartrich/fiellc | 1c02690538f442dede5d6afc8926355cb2ac838e | [
"MIT"
] | null | null | null | fiellclib/mr_project/data/root_config.py | leith-bartrich/fiellc | 1c02690538f442dede5d6afc8926355cb2ac838e | [
"MIT"
] | null | null | null | fiellclib/mr_project/data/root_config.py | leith-bartrich/fiellc | 1c02690538f442dede5d6afc8926355cb2ac838e | [
"MIT"
] | null | null | null | import typing
from fiepipelib.rootaspect.data.config import RootAsepctConfiguration
class MRProjectConfig(RootAsepctConfiguration):
    _gitlab_server_name: typing.Optional[str] = None
def get_gitlab_server_name(self) -> str:
return self._gitlab_server_name
    def set_gitlab_server_name(self, gitlab_server_name: str):
self._gitlab_server_name = gitlab_server_name
def get_config_name(self) -> str:
return "mr_project"
def from_json_data(self, data: typing.Dict):
self._gitlab_server_name = data['gitlab_server_name']
return
def to_json_data(self) -> typing.Dict:
ret = {}
ret['gitlab_server_name'] = self._gitlab_server_name
return ret
| 23.766667 | 69 | 0.712482 | 90 | 713 | 5.244444 | 0.288889 | 0.279661 | 0.372881 | 0.211864 | 0.152542 | 0.152542 | 0.152542 | 0 | 0 | 0 | 0 | 0 | 0.210379 | 713 | 29 | 70 | 24.586207 | 0.838366 | 0 | 0 | 0 | 0 | 0 | 0.064698 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0.117647 | 0.117647 | 0.764706 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
43ea952ebbf33b1a8f73a439986810caf67d1126 | 4,469 | py | Python | catalogue/migrations/0002_auto_20170301_1803.py | lh00257/superharris | cc8794ac6a63fa157ed6d0ef75f5089253ff987d | [
"MIT"
] | null | null | null | catalogue/migrations/0002_auto_20170301_1803.py | lh00257/superharris | cc8794ac6a63fa157ed6d0ef75f5089253ff987d | [
"MIT"
] | null | null | null | catalogue/migrations/0002_auto_20170301_1803.py | lh00257/superharris | cc8794ac6a63fa157ed6d0ef75f5089253ff987d | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.5 on 2017-03-01 18:03
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('catalogue', '0001_initial'),
]
operations = [
migrations.RemoveField(
model_name='observation',
name='app_vd_mod',
),
migrations.RemoveField(
model_name='observation',
name='c_LSR',
),
migrations.RemoveField(
model_name='observation',
name='c_LSR_unit',
),
migrations.RemoveField(
model_name='observation',
name='dfgc',
),
migrations.RemoveField(
model_name='observation',
name='dfgc_unit',
),
migrations.RemoveField(
model_name='observation',
name='dfgc_x',
),
migrations.RemoveField(
model_name='observation',
name='dfgc_y',
),
migrations.RemoveField(
model_name='observation',
name='dfgc_z',
),
migrations.RemoveField(
model_name='observation',
name='dfs_unit',
),
migrations.RemoveField(
model_name='observation',
name='eb_v',
),
migrations.RemoveField(
model_name='observation',
name='sig_v_unit',
),
migrations.RemoveField(
model_name='observation',
name='sp_lg_tc',
),
migrations.RemoveField(
model_name='observation',
name='sp_lg_th',
),
migrations.RemoveField(
model_name='observation',
name='sp_mu_V_unit',
),
migrations.RemoveField(
model_name='observation',
name='sp_r_c_unit',
),
migrations.RemoveField(
model_name='observation',
name='sp_r_h_unit',
),
migrations.RemoveField(
model_name='observation',
name='sp_rho_0_unit',
),
migrations.RemoveField(
model_name='observation',
name='sp_t_unit',
),
migrations.RemoveField(
model_name='observation',
name='spt',
),
migrations.RemoveField(
model_name='observation',
name='v_hb',
),
migrations.RemoveField(
model_name='observation',
name='v_r_err',
),
migrations.RemoveField(
model_name='observation',
name='v_r_unit',
),
migrations.RemoveField(
model_name='observation',
name='v_t',
),
migrations.AlterField(
model_name='observation',
name='dec',
field=models.FloatField(blank=True, null=True, verbose_name='Declinations [degree]'),
),
migrations.AlterField(
model_name='observation',
name='dfs',
field=models.FloatField(blank=True, null=True, verbose_name='Distance from the sun [kpc]'),
),
migrations.AlterField(
model_name='observation',
name='gallat',
field=models.FloatField(blank=True, null=True, verbose_name='Latitude [degree]'),
),
migrations.AlterField(
model_name='observation',
name='gallon',
field=models.FloatField(blank=True, null=True, verbose_name='Longitude [degree]'),
),
migrations.AlterField(
model_name='observation',
name='ra',
field=models.FloatField(blank=True, null=True, verbose_name='Right ascension [degree]'),
),
migrations.AlterField(
model_name='observation',
name='sig_err',
field=models.FloatField(blank=True, null=True, verbose_name='Observational uncertainty [km/s]'),
),
migrations.AlterField(
model_name='observation',
name='sig_v',
field=models.FloatField(blank=True, null=True, verbose_name='Velocity dispersion [km/s]'),
),
migrations.AlterField(
model_name='observation',
name='v_r',
field=models.FloatField(blank=True, null=True, verbose_name='Heliocentric radial velocity [km/s]'),
),
]
| 30.195946 | 111 | 0.533453 | 391 | 4,469 | 5.879795 | 0.227621 | 0.121357 | 0.269682 | 0.323619 | 0.836451 | 0.833841 | 0.734232 | 0.475859 | 0.238799 | 0 | 0 | 0.007571 | 0.349743 | 4,469 | 147 | 112 | 30.401361 | 0.783551 | 0.015216 | 0 | 0.664286 | 1 | 0 | 0.175534 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.014286 | 0 | 0.035714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
43ead0c0a7ecb0a182b620c7e2d595485215d930 | 50,408 | py | Python | tests/software_tests/can/test_single_frame.py | mdabrowski1990/uds | 1aee0c1de446ee3dd461706949504f2c218db1e8 | [
"MIT"
] | 18 | 2021-03-28T22:39:18.000Z | 2022-02-13T21:50:37.000Z | tests/software_tests/can/test_single_frame.py | mdabrowski1990/uds | 1aee0c1de446ee3dd461706949504f2c218db1e8 | [
"MIT"
] | 153 | 2021-02-09T09:27:05.000Z | 2022-03-29T06:09:15.000Z | tests/software_tests/can/test_single_frame.py | mdabrowski1990/uds | 1aee0c1de446ee3dd461706949504f2c218db1e8 | [
"MIT"
] | 1 | 2021-05-13T16:01:46.000Z | 2021-05-13T16:01:46.000Z | import pytest
from mock import patch
from uds.can.single_frame import CanSingleFrameHandler, \
InconsistentArgumentsError, CanDlcHandler, DEFAULT_FILLER_BYTE
from uds.can import CanAddressingFormat
class TestCanSingleFrameHandler:
"""Unit tests for `CanSingleFrameHandler` class."""
SCRIPT_LOCATION = "uds.can.single_frame"
def setup(self):
self._patcher_validate_nibble = patch(f"{self.SCRIPT_LOCATION}.validate_nibble")
self.mock_validate_nibble = self._patcher_validate_nibble.start()
self._patcher_validate_raw_byte = patch(f"{self.SCRIPT_LOCATION}.validate_raw_byte")
self.mock_validate_raw_byte = self._patcher_validate_raw_byte.start()
self._patcher_validate_raw_bytes = patch(f"{self.SCRIPT_LOCATION}.validate_raw_bytes")
self.mock_validate_raw_bytes = self._patcher_validate_raw_bytes.start()
self._patcher_encode_dlc = patch(f"{self.SCRIPT_LOCATION}.CanDlcHandler.encode_dlc")
self.mock_encode_dlc = self._patcher_encode_dlc.start()
self._patcher_decode_dlc = patch(f"{self.SCRIPT_LOCATION}.CanDlcHandler.decode_dlc")
self.mock_decode_dlc = self._patcher_decode_dlc.start()
self._patcher_get_min_dlc = patch(f"{self.SCRIPT_LOCATION}.CanDlcHandler.get_min_dlc")
self.mock_get_min_dlc = self._patcher_get_min_dlc.start()
self._patcher_validate_dlc = patch(f"{self.SCRIPT_LOCATION}.CanDlcHandler.validate_dlc")
self.mock_validate_dlc = self._patcher_validate_dlc.start()
self._patcher_encode_ai_data_bytes = \
patch(f"{self.SCRIPT_LOCATION}.CanAddressingInformationHandler.encode_ai_data_bytes")
self.mock_encode_ai_data_bytes = self._patcher_encode_ai_data_bytes.start()
self._patcher_get_ai_data_bytes_number = \
patch(f"{self.SCRIPT_LOCATION}.CanAddressingInformationHandler.get_ai_data_bytes_number")
self.mock_get_ai_data_bytes_number = self._patcher_get_ai_data_bytes_number.start()
def teardown(self):
self._patcher_validate_nibble.stop()
self._patcher_validate_raw_byte.stop()
self._patcher_validate_raw_bytes.stop()
self._patcher_encode_dlc.stop()
self._patcher_decode_dlc.stop()
self._patcher_get_min_dlc.stop()
self._patcher_validate_dlc.stop()
self._patcher_encode_ai_data_bytes.stop()
self._patcher_get_ai_data_bytes_number.stop()
# create_valid_frame_data
@pytest.mark.parametrize("addressing_format, target_address, address_extension", [
("some format", "TA", "SA"),
("another format", None, None),
])
@pytest.mark.parametrize("dlc, filler_byte", [
(CanDlcHandler.MIN_BASE_UDS_DLC, 0x66),
(CanDlcHandler.MIN_BASE_UDS_DLC + 2, 0x99),
])
@pytest.mark.parametrize("payload, data_bytes_number, ai_bytes, sf_dl_bytes", [
([0x54], 2, [], [0xFA]),
(range(50, 110), 64, [0x98], [0x12, 0x34]),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__encode_valid_sf_dl")
def test_create_valid_frame_data__valid_with_dlc(self, mock_encode_sf_dl,
addressing_format, target_address, address_extension,
payload, dlc, filler_byte,
data_bytes_number, ai_bytes, sf_dl_bytes):
self.mock_encode_ai_data_bytes.return_value = ai_bytes
self.mock_decode_dlc.return_value = data_bytes_number
mock_encode_sf_dl.return_value = sf_dl_bytes
sf_frame_data = CanSingleFrameHandler.create_valid_frame_data(addressing_format=addressing_format,
payload=payload,
dlc=dlc,
filler_byte=filler_byte,
target_address=target_address,
address_extension=address_extension)
self.mock_validate_raw_bytes.assert_called_once_with(payload, allow_empty=False)
self.mock_validate_raw_byte.assert_called_once_with(filler_byte)
self.mock_encode_ai_data_bytes.assert_called_once_with(addressing_format=addressing_format,
target_address=target_address,
address_extension=address_extension)
self.mock_decode_dlc.assert_called_once_with(dlc)
mock_encode_sf_dl.assert_called_once_with(sf_dl=len(payload),
dlc=dlc,
addressing_format=addressing_format)
assert isinstance(sf_frame_data, list)
assert len(sf_frame_data) == data_bytes_number
@pytest.mark.parametrize("addressing_format, target_address, address_extension", [
("some format", "TA", "SA"),
("another format", None, None),
])
@pytest.mark.parametrize("dlc, filler_byte", [
("some DLC", 0x66),
(8, 0x99),
])
@pytest.mark.parametrize("payload, data_bytes_number, ai_bytes, sf_dl_bytes", [
([0x54], 2, [], [0xFA]),
(range(50, 110), 64, [0x98], [0x12, 0x34]),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_min_dlc")
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__encode_valid_sf_dl")
def test_create_valid_frame_data__valid_without_dlc(self, mock_encode_sf_dl, mock_get_min_dlc,
addressing_format, target_address, address_extension,
payload, dlc, filler_byte, data_bytes_number, ai_bytes,
sf_dl_bytes):
self.mock_encode_ai_data_bytes.return_value = ai_bytes
self.mock_decode_dlc.return_value = data_bytes_number
mock_encode_sf_dl.return_value = sf_dl_bytes
mock_get_min_dlc.return_value = dlc
sf_frame_data = CanSingleFrameHandler.create_valid_frame_data(addressing_format=addressing_format,
payload=payload,
dlc=None,
filler_byte=filler_byte,
target_address=target_address,
address_extension=address_extension)
self.mock_validate_raw_bytes.assert_called_once_with(payload, allow_empty=False)
self.mock_validate_raw_byte.assert_called_once_with(filler_byte)
self.mock_encode_ai_data_bytes.assert_called_once_with(addressing_format=addressing_format,
target_address=target_address,
address_extension=address_extension)
mock_get_min_dlc.assert_called_once_with(addressing_format=addressing_format,
payload_length=len(payload))
self.mock_decode_dlc.assert_called_once_with(dlc)
mock_encode_sf_dl.assert_called_once_with(sf_dl=len(payload),
dlc=dlc,
addressing_format=addressing_format)
assert isinstance(sf_frame_data, list)
assert len(sf_frame_data) == data_bytes_number
@pytest.mark.parametrize("addressing_format, target_address, address_extension", [
("some format", "TA", "SA"),
("another format", None, None),
])
@pytest.mark.parametrize("filler_byte", [0x66, 0x99])
@pytest.mark.parametrize("dlc, payload, data_bytes_number, ai_bytes, sf_dl_bytes", [
(CanDlcHandler.MIN_BASE_UDS_DLC - 1, range(60), 100, [0xFF], [0x00, 0xFA]),
(CanDlcHandler.MIN_BASE_UDS_DLC, [0x20, 0x30, 0x44], 3, [], [0x03]),
(CanDlcHandler.MIN_BASE_UDS_DLC + 1, range(20), 21, [0xAA], [0x03]),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__encode_valid_sf_dl")
def test_create_valid_frame_data__inconsistent_args(self, mock_encode_sf_dl,
addressing_format, target_address, address_extension,
payload, dlc, filler_byte,
data_bytes_number, ai_bytes, sf_dl_bytes):
self.mock_encode_ai_data_bytes.return_value = ai_bytes
self.mock_decode_dlc.return_value = data_bytes_number
mock_encode_sf_dl.return_value = sf_dl_bytes
with pytest.raises(InconsistentArgumentsError):
CanSingleFrameHandler.create_valid_frame_data(addressing_format=addressing_format,
payload=payload,
dlc=dlc,
filler_byte=filler_byte,
target_address=target_address,
address_extension=address_extension)
self.mock_validate_raw_bytes.assert_called_once_with(payload, allow_empty=False)
self.mock_validate_raw_byte.assert_called_once_with(filler_byte)
self.mock_encode_ai_data_bytes.assert_called_once_with(addressing_format=addressing_format,
target_address=target_address,
address_extension=address_extension)
self.mock_decode_dlc.assert_called_once_with(dlc)
mock_encode_sf_dl.assert_called_once_with(sf_dl=len(payload),
dlc=dlc,
addressing_format=addressing_format)
# create_any_frame_data
@pytest.mark.parametrize("addressing_format, target_address, address_extension", [
("some format", "TA", "SA"),
("another format", None, None),
])
@pytest.mark.parametrize("dlc, filler_byte", [
("some DLC", 0x66),
(8, 0x99),
])
@pytest.mark.parametrize("payload, data_bytes_number, ai_bytes, sf_dl_bytes", [
([0x54], 2, [], [0xFA]),
(range(50, 110), 64, [0x98], [0x12, 0x34]),
])
@pytest.mark.parametrize("sf_dl_short, sf_dl_long", [
(5, None),
("short", "long"),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__encode_any_sf_dl")
def test_create_any_frame_data__valid(self, mock_encode_any_sf_dl,
addressing_format, target_address, address_extension,
payload, dlc, filler_byte, sf_dl_short, sf_dl_long,
data_bytes_number, ai_bytes, sf_dl_bytes):
self.mock_encode_ai_data_bytes.return_value = ai_bytes
self.mock_decode_dlc.return_value = data_bytes_number
mock_encode_any_sf_dl.return_value = sf_dl_bytes
sf_frame_data = CanSingleFrameHandler.create_any_frame_data(addressing_format=addressing_format,
payload=payload,
dlc=dlc,
sf_dl_short=sf_dl_short,
sf_dl_long=sf_dl_long,
filler_byte=filler_byte,
target_address=target_address,
address_extension=address_extension)
self.mock_validate_raw_bytes.assert_called_once_with(payload, allow_empty=True)
self.mock_validate_raw_byte.assert_called_once_with(filler_byte)
self.mock_encode_ai_data_bytes.assert_called_once_with(addressing_format=addressing_format,
target_address=target_address,
address_extension=address_extension)
self.mock_decode_dlc.assert_called_once_with(dlc)
mock_encode_any_sf_dl.assert_called_once_with(sf_dl_short=sf_dl_short,
sf_dl_long=sf_dl_long)
assert isinstance(sf_frame_data, list)
assert len(sf_frame_data) == data_bytes_number
@pytest.mark.parametrize("addressing_format, target_address, address_extension", [
("some format", "TA", "SA"),
("another format", None, None),
])
@pytest.mark.parametrize("dlc, filler_byte", [
("some DLC", 0x66),
(8, 0x99),
])
@pytest.mark.parametrize("payload, data_bytes_number, ai_bytes, sf_dl_bytes", [
([0x54], 2, [0x11], [0xFA]),
(range(50, 112), 64, [0x98], [0x12, 0x34]),
])
@pytest.mark.parametrize("sf_dl_short, sf_dl_long", [
(5, None),
("short", "long"),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__encode_any_sf_dl")
def test_create_any_frame_data__inconsistent_args(self, mock_encode_any_sf_dl,
addressing_format, target_address, address_extension,
payload, dlc, filler_byte, sf_dl_short, sf_dl_long,
data_bytes_number, ai_bytes, sf_dl_bytes):
self.mock_encode_ai_data_bytes.return_value = ai_bytes
self.mock_decode_dlc.return_value = data_bytes_number
mock_encode_any_sf_dl.return_value = sf_dl_bytes
with pytest.raises(InconsistentArgumentsError):
CanSingleFrameHandler.create_any_frame_data(addressing_format=addressing_format,
payload=payload,
dlc=dlc,
sf_dl_short=sf_dl_short,
sf_dl_long=sf_dl_long,
filler_byte=filler_byte,
target_address=target_address,
address_extension=address_extension)
self.mock_validate_raw_bytes.assert_called_once_with(payload, allow_empty=True)
self.mock_validate_raw_byte.assert_called_once_with(filler_byte)
self.mock_encode_ai_data_bytes.assert_called_once_with(addressing_format=addressing_format,
target_address=target_address,
address_extension=address_extension)
self.mock_decode_dlc.assert_called_once_with(dlc)
mock_encode_any_sf_dl.assert_called_once_with(sf_dl_short=sf_dl_short,
sf_dl_long=sf_dl_long)
# is_single_frame
@pytest.mark.parametrize("addressing_format", ["some addressing format", "another format"])
@pytest.mark.parametrize("raw_frame_data, ai_bytes_number", [
([0x01, 0xFE, 0xDC], 0),
([0xFE, 0x05, 0xDC, 0xBA, 0x98, 0x76, 0x54], 1),
])
def test_is_single_frame__true(self, addressing_format, raw_frame_data, ai_bytes_number):
self.mock_get_ai_data_bytes_number.return_value = ai_bytes_number
assert CanSingleFrameHandler.is_single_frame(addressing_format=addressing_format,
raw_frame_data=raw_frame_data) is True
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
@pytest.mark.parametrize("addressing_format", ["some addressing format", "another format"])
@pytest.mark.parametrize("raw_frame_data, ai_bytes_number", [
([0x01, 0xFE, 0xDC], 1),
([0xFE, 0x15, 0xDC, 0xBA, 0x98, 0x76, 0x54], 1),
])
def test_is_single_frame__false(self, addressing_format, raw_frame_data, ai_bytes_number):
self.mock_get_ai_data_bytes_number.return_value = ai_bytes_number
assert CanSingleFrameHandler.is_single_frame(addressing_format=addressing_format,
raw_frame_data=raw_frame_data) is False
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
# decode_payload
@pytest.mark.parametrize("addressing_format", ["some addressing format", "another format"])
@pytest.mark.parametrize("raw_frame_data, sf_dl, sf_dl_bytes_number, ai_bytes_number", [
(range(8), 4, 1, 0),
(tuple(range(10, 74)), 20, 2, 1),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_sf_dl_bytes_number")
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.decode_sf_dl")
def test_decode_payload(self, mock_decode_sf_dl, mock_get_sf_dl_bytes_number,
addressing_format, raw_frame_data,
sf_dl, sf_dl_bytes_number, ai_bytes_number):
mock_decode_sf_dl.return_value = sf_dl
mock_get_sf_dl_bytes_number.return_value = sf_dl_bytes_number
self.mock_get_ai_data_bytes_number.return_value = ai_bytes_number
payload = CanSingleFrameHandler.decode_payload(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
mock_decode_sf_dl.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_encode_dlc.assert_called_once_with(len(raw_frame_data))
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
mock_get_sf_dl_bytes_number.assert_called_once_with(self.mock_encode_dlc.return_value)
assert isinstance(payload, list)
assert len(payload) == sf_dl
assert payload == list(raw_frame_data)[ai_bytes_number+sf_dl_bytes_number:][:sf_dl]
# decode_sf_dl
@pytest.mark.parametrize("addressing_format", ["some addressing format", "another format"])
@pytest.mark.parametrize("raw_frame_data", [list(range(64)), (0x12, 0x34, 0x56)])
@pytest.mark.parametrize("sf_dl", [1, 7])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__extract_sf_dl_data_bytes")
def test_decode_sf_dl__valid_short(self, mock_extract_sf_dl_data_bytes,
addressing_format, raw_frame_data, sf_dl):
mock_extract_sf_dl_data_bytes.return_value = [(CanSingleFrameHandler.SINGLE_FRAME_N_PCI << 4) + sf_dl]
assert CanSingleFrameHandler.decode_sf_dl(addressing_format=addressing_format,
raw_frame_data=raw_frame_data) == sf_dl
mock_extract_sf_dl_data_bytes.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
@pytest.mark.parametrize("addressing_format", ["some addressing format", "another format"])
@pytest.mark.parametrize("raw_frame_data", [list(range(64)), (0x12, 0x34, 0x56)])
@pytest.mark.parametrize("sf_dl", [8, 0x3E])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__extract_sf_dl_data_bytes")
def test_decode_sf_dl__valid_long(self, mock_extract_sf_dl_data_bytes,
addressing_format, raw_frame_data, sf_dl):
mock_extract_sf_dl_data_bytes.return_value = [(CanSingleFrameHandler.SINGLE_FRAME_N_PCI << 4), sf_dl]
assert CanSingleFrameHandler.decode_sf_dl(addressing_format=addressing_format,
raw_frame_data=raw_frame_data) == sf_dl
mock_extract_sf_dl_data_bytes.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
@pytest.mark.parametrize("addressing_format", ["some addressing format", "another format"])
@pytest.mark.parametrize("raw_frame_data", [list(range(64)), (0x12, 0x34, 0x56)])
@pytest.mark.parametrize("sf_dl_data_bytes", [[], [0x00, 0x00, 0x01]])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__extract_sf_dl_data_bytes")
def test_decode_sf_dl__not_implemented(self, mock_extract_sf_dl_data_bytes,
addressing_format, raw_frame_data, sf_dl_data_bytes):
mock_extract_sf_dl_data_bytes.return_value = sf_dl_data_bytes
with pytest.raises(NotImplementedError):
CanSingleFrameHandler.decode_sf_dl(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
mock_extract_sf_dl_data_bytes.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
# get_min_dlc
@pytest.mark.parametrize("addressing_format", ["some addressing format", "something else"])
@pytest.mark.parametrize("ai_data_bytes, payload_length", [
(1, 7),
(0, 62),
])
@pytest.mark.parametrize("decoded_dlc", [CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL,
CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL - 2])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__validate_payload_length")
def test_get_min_dlc__short_dlc(self, mock_validate_payload_length,
addressing_format, payload_length, ai_data_bytes,
decoded_dlc):
self.mock_get_ai_data_bytes_number.return_value = ai_data_bytes
self.mock_get_min_dlc.return_value = decoded_dlc
assert CanSingleFrameHandler.get_min_dlc(addressing_format=addressing_format,
payload_length=payload_length) == self.mock_get_min_dlc.return_value
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
mock_validate_payload_length.assert_called_once_with(ai_data_bytes_number=ai_data_bytes,
payload_length=payload_length)
data_bytes_number = payload_length + CanSingleFrameHandler.SHORT_SF_DL_BYTES_USED + ai_data_bytes
self.mock_get_min_dlc.assert_called_once_with(data_bytes_number)
@pytest.mark.parametrize("addressing_format", ["some addressing format", "something else"])
@pytest.mark.parametrize("ai_data_bytes, payload_length", [
(1, 7),
(0, 62),
])
@pytest.mark.parametrize("decoded_dlc", [CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL + 1,
CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL + 5])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__validate_payload_length")
def test_get_min_dlc__long_dlc(self, mock_validate_payload_length,
addressing_format, payload_length, ai_data_bytes,
decoded_dlc):
self.mock_get_ai_data_bytes_number.return_value = ai_data_bytes
self.mock_get_min_dlc.return_value = decoded_dlc
assert CanSingleFrameHandler.get_min_dlc(addressing_format=addressing_format,
payload_length=payload_length) == self.mock_get_min_dlc.return_value
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
mock_validate_payload_length.assert_called_once_with(ai_data_bytes_number=ai_data_bytes,
payload_length=payload_length)
data_bytes_number = payload_length + CanSingleFrameHandler.LONG_SF_DL_BYTES_USED + ai_data_bytes
self.mock_get_min_dlc.assert_called_once_with(data_bytes_number)
# get_max_payload_size
@pytest.mark.parametrize("addressing_format", ["some addressing format", "something else"])
@pytest.mark.parametrize("dlc", ["some DLC", 8])
@pytest.mark.parametrize("frame_data_bytes_number, ai_data_bytes_number, sf_dl_bytes_number", [
(10, 1, 1),
(6, 0, 2),
(64, 1, 2),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_sf_dl_bytes_number")
def test_get_max_payload_size__with_addressing_dlc(self, mock_get_sf_dl_bytes_number,
addressing_format, dlc,
frame_data_bytes_number, ai_data_bytes_number,
sf_dl_bytes_number):
self.mock_decode_dlc.return_value = frame_data_bytes_number
self.mock_get_ai_data_bytes_number.return_value = ai_data_bytes_number
mock_get_sf_dl_bytes_number.return_value = sf_dl_bytes_number
max_value = CanSingleFrameHandler.get_max_payload_size(addressing_format=addressing_format, dlc=dlc)
self.mock_decode_dlc.assert_called_once_with(dlc)
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
mock_get_sf_dl_bytes_number.assert_called_once_with(dlc)
assert isinstance(max_value, int)
assert max_value == frame_data_bytes_number - ai_data_bytes_number - sf_dl_bytes_number
@pytest.mark.parametrize("addressing_format", ["some addressing format", "something else"])
@pytest.mark.parametrize("dlc", ["some DLC", 8])
@pytest.mark.parametrize("frame_data_bytes_number, ai_data_bytes_number, sf_dl_bytes_number", [
(2, 1, 1),
(1, 0, 2),
(2, 1, 2),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_sf_dl_bytes_number")
def test_get_max_payload_size__too_short(self, mock_get_sf_dl_bytes_number,
addressing_format, dlc,
frame_data_bytes_number, ai_data_bytes_number, sf_dl_bytes_number):
self.mock_decode_dlc.return_value = frame_data_bytes_number
self.mock_get_ai_data_bytes_number.return_value = ai_data_bytes_number
mock_get_sf_dl_bytes_number.return_value = sf_dl_bytes_number
with pytest.raises(InconsistentArgumentsError):
CanSingleFrameHandler.get_max_payload_size(addressing_format=addressing_format, dlc=dlc)
self.mock_decode_dlc.assert_called_once_with(dlc)
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
mock_get_sf_dl_bytes_number.assert_called_once_with(dlc)
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_sf_dl_bytes_number")
def test_get_max_payload_size__without_args(self, mock_get_sf_dl_bytes_number):
max_value = CanSingleFrameHandler.get_max_payload_size()
self.mock_decode_dlc.assert_not_called()
self.mock_get_ai_data_bytes_number.assert_not_called()
mock_get_sf_dl_bytes_number.assert_not_called()
assert isinstance(max_value, int)
assert max_value == CanDlcHandler.MAX_DATA_BYTES_NUMBER - CanSingleFrameHandler.LONG_SF_DL_BYTES_USED
# get_sf_dl_bytes_number
@pytest.mark.parametrize("dlc, expected_sf_dl_bytes_number", [
(CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL - 100, CanSingleFrameHandler.SHORT_SF_DL_BYTES_USED),
(CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL, CanSingleFrameHandler.SHORT_SF_DL_BYTES_USED),
(CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL + 1, CanSingleFrameHandler.LONG_SF_DL_BYTES_USED),
(CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL + 100, CanSingleFrameHandler.LONG_SF_DL_BYTES_USED),
])
def test_get_sf_dl_bytes_number(self, dlc, expected_sf_dl_bytes_number):
assert CanSingleFrameHandler.get_sf_dl_bytes_number(dlc) == expected_sf_dl_bytes_number
self.mock_validate_dlc.assert_called_once_with(dlc)
# validate_frame_data
@pytest.mark.parametrize("addressing_format", ["some addressing format", "something else"])
@pytest.mark.parametrize("raw_frame_data, dlc, ai_bytes_number, sf_dl_data_bytes", [
(list(range(8)), 8, 0, [0x07]),
(list(range(64)), 10, 0, [0x00, 0x3E]),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__extract_sf_dl_data_bytes")
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.is_single_frame")
def test_validate_frame_data__valid(self, mock_is_single_frame, mock_extract_sf_dl_data_bytes,
addressing_format, raw_frame_data,
ai_bytes_number, sf_dl_data_bytes, dlc):
mock_is_single_frame.return_value = True
self.mock_get_ai_data_bytes_number.return_value = ai_bytes_number
mock_extract_sf_dl_data_bytes.return_value = sf_dl_data_bytes
self.mock_encode_dlc.return_value = dlc
CanSingleFrameHandler.validate_frame_data(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_validate_raw_bytes.assert_called_once_with(raw_frame_data)
mock_is_single_frame.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
mock_extract_sf_dl_data_bytes.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_encode_dlc.assert_called_once_with(len(raw_frame_data))
@pytest.mark.parametrize("addressing_format", ["some addressing format", "something else"])
@pytest.mark.parametrize("raw_frame_data", ["some raw bbytes", range(6)])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.is_single_frame")
def test_validate_frame_data__invalid_type(self, mock_is_single_frame, addressing_format, raw_frame_data):
mock_is_single_frame.return_value = False
with pytest.raises(ValueError):
CanSingleFrameHandler.validate_frame_data(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_validate_raw_bytes.assert_called_once_with(raw_frame_data)
mock_is_single_frame.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
@pytest.mark.parametrize("addressing_format", ["some addressing format", "something else"])
@pytest.mark.parametrize("raw_frame_data, dlc, ai_bytes_number, sf_dl_data_bytes", [
(list(range(7)), 8, 0, [0x07]),
(list(range(9)), 9, 1, [0x00, 0x08]),
(list(range(64)), 10, 0, [0x0A, 0x3F]),
])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__extract_sf_dl_data_bytes")
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.is_single_frame")
def test_validate_frame_data__sf_dl_bytes(self, mock_is_single_frame, mock_extract_sf_dl_data_bytes,
addressing_format, raw_frame_data,
ai_bytes_number, sf_dl_data_bytes, dlc):
mock_is_single_frame.return_value = True
self.mock_get_ai_data_bytes_number.return_value = ai_bytes_number
mock_extract_sf_dl_data_bytes.return_value = sf_dl_data_bytes
self.mock_encode_dlc.return_value = dlc
with pytest.raises(InconsistentArgumentsError):
CanSingleFrameHandler.validate_frame_data(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_validate_raw_bytes.assert_called_once_with(raw_frame_data)
mock_is_single_frame.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
mock_extract_sf_dl_data_bytes.assert_called_once_with(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
self.mock_encode_dlc.assert_called_once_with(len(raw_frame_data))
# validate_sf_dl
@pytest.mark.parametrize("sf_dl", ["some SF_DL", None, 5.])
@pytest.mark.parametrize("dlc, addressing_format", [
("dlc", "some addressing"),
(8, None),
])
def test_validate_sf_dl__type_error(self, sf_dl, dlc, addressing_format):
with pytest.raises(TypeError):
CanSingleFrameHandler.validate_sf_dl(sf_dl=sf_dl, dlc=dlc, addressing_format=addressing_format)
@pytest.mark.parametrize("sf_dl", [0, -1, -6])
@pytest.mark.parametrize("dlc, addressing_format", [
("dlc", "some addressing"),
(8, None),
])
def test_validate_sf_dl__value_error(self, sf_dl, dlc, addressing_format):
with pytest.raises(ValueError):
CanSingleFrameHandler.validate_sf_dl(sf_dl=sf_dl, dlc=dlc, addressing_format=addressing_format)
@pytest.mark.parametrize("sf_dl, max_sf_dl", [
(8, 7),
(14, 9),
(63, 62),
])
@pytest.mark.parametrize("dlc", [8, 0xF])
@pytest.mark.parametrize("addressing_format", [None, "some addressing"])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_max_payload_size")
def test_validate_sf_dl__inconsistent(self, mock_get_max_payload_size,
sf_dl, dlc, addressing_format, max_sf_dl):
mock_get_max_payload_size.return_value = max_sf_dl
with pytest.raises(InconsistentArgumentsError):
CanSingleFrameHandler.validate_sf_dl(sf_dl=sf_dl, dlc=dlc, addressing_format=addressing_format)
mock_get_max_payload_size.assert_called_once_with(addressing_format=addressing_format, dlc=dlc)
@pytest.mark.parametrize("sf_dl, max_sf_dl", [
(8, 8),
(14, 16),
(62, 62),
])
@pytest.mark.parametrize("dlc", [8, 0xF])
@pytest.mark.parametrize("addressing_format", [None, "some addressing"])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_max_payload_size")
def test_validate_sf_dl__valid(self, mock_get_max_payload_size,
sf_dl, dlc, addressing_format, max_sf_dl):
mock_get_max_payload_size.return_value = max_sf_dl
CanSingleFrameHandler.validate_sf_dl(sf_dl=sf_dl, dlc=dlc, addressing_format=addressing_format)
mock_get_max_payload_size.assert_called_once_with(addressing_format=addressing_format, dlc=dlc)
# __validate_payload_length
@pytest.mark.parametrize("payload_length", [None, 6.5, "not a payload length"])
@pytest.mark.parametrize("ai_data_bytes_number", [0, 1])
def test_validate_payload_length__type_error(self, payload_length, ai_data_bytes_number):
with pytest.raises(TypeError):
CanSingleFrameHandler._CanSingleFrameHandler__validate_payload_length(payload_length=payload_length,
ai_data_bytes_number=ai_data_bytes_number)
@pytest.mark.parametrize("payload_length", [0, -1])
@pytest.mark.parametrize("ai_data_bytes_number", [0, 1])
def test_validate_payload_length__value_error(self, payload_length, ai_data_bytes_number):
with pytest.raises(ValueError):
CanSingleFrameHandler._CanSingleFrameHandler__validate_payload_length(payload_length=payload_length,
ai_data_bytes_number=ai_data_bytes_number)
@pytest.mark.parametrize("payload_length, ai_data_bytes_number", [
(CanDlcHandler.MAX_DATA_BYTES_NUMBER - CanSingleFrameHandler.LONG_SF_DL_BYTES_USED, 1),
(CanDlcHandler.MAX_DATA_BYTES_NUMBER - CanSingleFrameHandler.LONG_SF_DL_BYTES_USED + 42, 1),
(CanDlcHandler.MAX_DATA_BYTES_NUMBER - CanSingleFrameHandler.LONG_SF_DL_BYTES_USED + 1, 0),
(CanDlcHandler.MAX_DATA_BYTES_NUMBER - CanSingleFrameHandler.LONG_SF_DL_BYTES_USED + 5, 0),
])
def test_validate_payload_length__inconsistency_error(self, payload_length, ai_data_bytes_number):
with pytest.raises(InconsistentArgumentsError):
CanSingleFrameHandler._CanSingleFrameHandler__validate_payload_length(payload_length=payload_length,
ai_data_bytes_number=ai_data_bytes_number)
@pytest.mark.parametrize("payload_length, ai_data_bytes_number", [
(CanDlcHandler.MAX_DATA_BYTES_NUMBER - CanSingleFrameHandler.LONG_SF_DL_BYTES_USED - 1, 1),
(CanDlcHandler.MAX_DATA_BYTES_NUMBER - CanSingleFrameHandler.LONG_SF_DL_BYTES_USED, 0),
(1, 0),
])
def test_validate_payload_length__valid(self, payload_length, ai_data_bytes_number):
CanSingleFrameHandler._CanSingleFrameHandler__validate_payload_length(payload_length=payload_length,
ai_data_bytes_number=ai_data_bytes_number)
# __extract_sf_dl_data_bytes
@pytest.mark.parametrize("addressing_format, ai_data_bytes", [
("some addressing format", 0),
("another format", 1),
])
@pytest.mark.parametrize("raw_frame_data", [range(10), range(10, 16)])
@pytest.mark.parametrize("sf_dl_bytes_number", [1, 2])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.get_sf_dl_bytes_number")
def test_extract_sf_dl_data_bytes(self, mock_get_sf_dl_bytes_number,
addressing_format, raw_frame_data, sf_dl_bytes_number, ai_data_bytes):
mock_get_sf_dl_bytes_number.return_value = sf_dl_bytes_number
self.mock_get_ai_data_bytes_number.return_value = ai_data_bytes
sf_dl_bytes = CanSingleFrameHandler._CanSingleFrameHandler__extract_sf_dl_data_bytes(
addressing_format=addressing_format, raw_frame_data=raw_frame_data)
self.mock_get_ai_data_bytes_number.assert_called_once_with(addressing_format)
self.mock_encode_dlc.assert_called_once_with(len(raw_frame_data))
mock_get_sf_dl_bytes_number.assert_called_once_with(self.mock_encode_dlc.return_value)
assert isinstance(sf_dl_bytes, list)
assert len(sf_dl_bytes) == sf_dl_bytes_number
assert sf_dl_bytes == list(raw_frame_data)[ai_data_bytes:][:sf_dl_bytes_number]
# __encode_valid_sf_dl
@pytest.mark.parametrize("sf_dl", [0, 8, 0xF])
@pytest.mark.parametrize("dlc", [CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL,
CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL - 1])
@pytest.mark.parametrize("addressing_format", [None, "some addressing format"])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__encode_any_sf_dl")
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.validate_sf_dl")
def test_encode_valid_sf_dl__short(self, mock_validate_sf_dl, mock_encode_any_sf_dl,
sf_dl, dlc, addressing_format):
assert CanSingleFrameHandler._CanSingleFrameHandler__encode_valid_sf_dl(
sf_dl=sf_dl,
addressing_format=addressing_format,
dlc=dlc) == mock_encode_any_sf_dl.return_value
mock_validate_sf_dl.assert_called_once_with(sf_dl=sf_dl,
dlc=dlc,
addressing_format=addressing_format)
mock_encode_any_sf_dl.assert_called_once_with(sf_dl_short=sf_dl)
@pytest.mark.parametrize("sf_dl", [0, 8, 0xF])
@pytest.mark.parametrize("dlc", [CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL + 1,
CanSingleFrameHandler.MAX_DLC_VALUE_SHORT_SF_DL + 2])
@pytest.mark.parametrize("addressing_format", [None, "some addressing format"])
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler._CanSingleFrameHandler__encode_any_sf_dl")
@patch(f"{SCRIPT_LOCATION}.CanSingleFrameHandler.validate_sf_dl")
def test_encode_valid_sf_dl__long(self, mock_validate_sf_dl, mock_encode_any_sf_dl,
sf_dl, dlc, addressing_format):
assert CanSingleFrameHandler._CanSingleFrameHandler__encode_valid_sf_dl(
sf_dl=sf_dl,
addressing_format=addressing_format,
dlc=dlc) == mock_encode_any_sf_dl.return_value
mock_validate_sf_dl.assert_called_once_with(sf_dl=sf_dl,
dlc=dlc,
addressing_format=addressing_format)
mock_encode_any_sf_dl.assert_called_once_with(sf_dl_long=sf_dl)
# __encode_any_sf_dl
@pytest.mark.parametrize("sf_dl_short", [0, 7, 0xF])
def test_encode_any_sf_dl__short(self, sf_dl_short):
assert CanSingleFrameHandler._CanSingleFrameHandler__encode_any_sf_dl(sf_dl_short=sf_dl_short) \
== [(CanSingleFrameHandler.SINGLE_FRAME_N_PCI << 4) + sf_dl_short]
self.mock_validate_nibble.assert_called_once_with(sf_dl_short)
self.mock_validate_raw_byte.assert_not_called()
@pytest.mark.parametrize("sf_dl_short", [0, 5, 0xF])
@pytest.mark.parametrize("sf_dl_long", [0, 7, 0xF])
def test_encode_any_sf_dl__long(self, sf_dl_short, sf_dl_long):
assert CanSingleFrameHandler._CanSingleFrameHandler__encode_any_sf_dl(
sf_dl_short=sf_dl_short,
sf_dl_long=sf_dl_long) == [(CanSingleFrameHandler.SINGLE_FRAME_N_PCI << 4) + sf_dl_short, sf_dl_long]
self.mock_validate_nibble.assert_called_once_with(sf_dl_short)
self.mock_validate_raw_byte.assert_called_once_with(sf_dl_long)
@pytest.mark.integration
class TestCanSingleFrameHandlerIntegration:
"""Integration tests for `CanSingleFrameHandler` class."""
# create_valid_frame_data
@pytest.mark.parametrize("kwargs, expected_raw_frame_data", [
({"addressing_format": CanAddressingFormat.NORMAL_11BIT_ADDRESSING,
"payload": [0x3E]},
[0x01, 0x3E]),
({"addressing_format": CanAddressingFormat.NORMAL_FIXED_ADDRESSING,
"payload": [0x3E],
"dlc": 8,
"target_address": 0xFF},
[0x01, 0x3E] + ([DEFAULT_FILLER_BYTE] * 6)),
({"addressing_format": CanAddressingFormat.EXTENDED_ADDRESSING,
"payload": list(range(54)),
"filler_byte": 0x66,
"target_address": 0xF2},
[0xF2, 0x00, 0x36] + list(range(54)) + ([0x66] * 7)),
({"addressing_format": CanAddressingFormat.MIXED_11BIT_ADDRESSING,
"payload": [0x9A, 0xB8, 0xC4, 0x67, 0x10, 0x00],
"dlc": 8,
"filler_byte": 0x66,
"address_extension": 0x12},
[0x12, 0x06, 0x9A, 0xB8, 0xC4, 0x67, 0x10, 0x00]),
({"addressing_format": CanAddressingFormat.MIXED_29BIT_ADDRESSING,
"payload": [0x9A, 0xB8, 0xC4, 0x67, 0x10, 0x00, 0x01],
"filler_byte": 0x99,
"target_address": 0xF2,
"address_extension": 0x12},
[0x12, 0x00, 0x07, 0x9A, 0xB8, 0xC4, 0x67, 0x10, 0x00, 0x01, 0x99, 0x99]),
])
def test_create_valid_frame_data__valid(self, kwargs, expected_raw_frame_data):
assert CanSingleFrameHandler.create_valid_frame_data(**kwargs) == expected_raw_frame_data
@pytest.mark.parametrize("kwargs", [
{"addressing_format": CanAddressingFormat.NORMAL_11BIT_ADDRESSING,
"payload": [0x3E],
"dlc": 1},
{"addressing_format": CanAddressingFormat.NORMAL_FIXED_ADDRESSING,
"payload": list(range(63))},
{"addressing_format": CanAddressingFormat.EXTENDED_ADDRESSING,
"payload": list(range(50, 112)),
"target_address": 0xF2},
{"addressing_format": CanAddressingFormat.MIXED_11BIT_ADDRESSING,
"payload": [0xAB],
"dlc": 7,
"address_extension": 0x12},
{"addressing_format": CanAddressingFormat.MIXED_29BIT_ADDRESSING,
"payload": [],
"target_address": 0xF2,
"address_extension": 0x12},
])
def test_create_valid_frame_data__invalid(self, kwargs):
with pytest.raises(ValueError):
CanSingleFrameHandler.create_valid_frame_data(**kwargs)
# create_any_frame_data
@pytest.mark.parametrize("kwargs, expected_raw_frame_data", [
({"addressing_format": CanAddressingFormat.NORMAL_11BIT_ADDRESSING,
"payload": [0x3E],
"dlc": 2,
"sf_dl_short": 0xF},
[0x0F, 0x3E]),
({"addressing_format": CanAddressingFormat.NORMAL_FIXED_ADDRESSING,
"payload": [0x3E],
"dlc": 8,
"target_address": 0xFF,
"sf_dl_short": 0x0,
"sf_dl_long": 0xAB},
[0x00, 0xAB, 0x3E] + ([DEFAULT_FILLER_BYTE] * 5)),
({"addressing_format": CanAddressingFormat.EXTENDED_ADDRESSING,
"payload": [],
"dlc": 0xF,
"filler_byte": 0x66,
"target_address": 0xF2,
"sf_dl_short": 0xE},
[0xF2, 0x0E] + ([0x66] * 62)),
({"addressing_format": CanAddressingFormat.MIXED_11BIT_ADDRESSING,
"payload": [0x9A, 0xB8, 0xC4, 0x67, 0x10, 0x00],
"dlc": 8,
"filler_byte": 0x66,
"address_extension": 0x12,
"sf_dl_short": 0x5},
[0x12, 0x05, 0x9A, 0xB8, 0xC4, 0x67, 0x10, 0x00]),
({"addressing_format": CanAddressingFormat.MIXED_29BIT_ADDRESSING,
"payload": [0x9A, 0xB8],
"dlc": 7,
"filler_byte": 0x99,
"target_address": 0xF2,
"address_extension": 0x12,
"sf_dl_short": 0x2},
[0x12, 0x02, 0x9A, 0xB8, 0x99, 0x99, 0x99]),
])
def test_create_any_frame_data__valid(self, kwargs, expected_raw_frame_data):
assert CanSingleFrameHandler.create_any_frame_data(**kwargs) == expected_raw_frame_data
@pytest.mark.parametrize("kwargs", [
{"addressing_format": CanAddressingFormat.NORMAL_11BIT_ADDRESSING,
"payload": [0x3E],
"dlc": 1,
"sf_dl_short": 0xF},
{"addressing_format": CanAddressingFormat.NORMAL_FIXED_ADDRESSING,
"payload": list(range(63)),
"dlc": 0xF,
"sf_dl_short": 0x0,
"sf_dl_long": 0xAB},
{"addressing_format": CanAddressingFormat.EXTENDED_ADDRESSING,
"payload": [],
"dlc": 0xF,
"target_address": 0xF2,
"sf_dl_short": 0x10},
{"addressing_format": CanAddressingFormat.MIXED_11BIT_ADDRESSING,
"payload": list(range(5)),
"dlc": 8,
"address_extension": 0x12,
"sf_dl_short": 0,
"sf_dl_long": 0x100},
])
def test_create_any_frame_data__invalid(self, kwargs):
with pytest.raises(ValueError):
CanSingleFrameHandler.create_any_frame_data(**kwargs)
# decode_sf_dl
@pytest.mark.parametrize("addressing_format, raw_frame_data, expected_sf_dl", [
(CanAddressingFormat.NORMAL_11BIT_ADDRESSING, (0x01, 0x3E), 1),
(CanAddressingFormat.NORMAL_FIXED_ADDRESSING, [0x00, 0x32] + list(range(20, 82)), 0x32),
(CanAddressingFormat.EXTENDED_ADDRESSING, [0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xFF, 0xFE], 4),
(CanAddressingFormat.MIXED_11BIT_ADDRESSING, [0x0A, 0x00, 0x3D] + list(range(100, 161)), 0x3D),
(CanAddressingFormat.MIXED_29BIT_ADDRESSING, [0xDA, 0x00, 0x10] + list(range(21)), 0x10)
])
def test_decode_sf_dl(self, addressing_format, raw_frame_data, expected_sf_dl):
assert CanSingleFrameHandler.decode_sf_dl(addressing_format=addressing_format,
raw_frame_data=raw_frame_data) == expected_sf_dl
# validate_frame_data
@pytest.mark.parametrize("addressing_format, raw_frame_data", [
(CanAddressingFormat.NORMAL_11BIT_ADDRESSING, (0x07, 0xFE, 0xDC, 0xBA, 0x98, 0x76, 0x54, 0x32)),
(CanAddressingFormat.NORMAL_FIXED_ADDRESSING, [0x00, 0x3E] + list(range(0x10, 0x4E))),
(CanAddressingFormat.EXTENDED_ADDRESSING, [0x00, 0x00, 0x01] + list(range(0x10, 0x4D))),
(CanAddressingFormat.MIXED_11BIT_ADDRESSING, (0x02, 0x01, 0xFF)),
(CanAddressingFormat.MIXED_29BIT_ADDRESSING, [0x04, 0x00, 0x05] + list(range(100, 113))),
])
def test_validate_frame_data__valid(self, addressing_format, raw_frame_data):
assert CanSingleFrameHandler.validate_frame_data(addressing_format=addressing_format,
raw_frame_data=raw_frame_data) is None
@pytest.mark.parametrize("addressing_format, raw_frame_data", [
(CanAddressingFormat.NORMAL_11BIT_ADDRESSING, (0x05, 0xFE, 0xDC, 0xBA, 0x98, 0x76, 0x54)),
(CanAddressingFormat.NORMAL_FIXED_ADDRESSING, [0x00, 0x3F] + list(range(0x10, 0x4E))),
(CanAddressingFormat.EXTENDED_ADDRESSING, [0x00, 0x01, 0x01] + list(range(0x10, 0x4D))),
(CanAddressingFormat.MIXED_11BIT_ADDRESSING, (0x02, 0x07, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF)),
(CanAddressingFormat.MIXED_29BIT_ADDRESSING, [0xB0, 0x09] + list(range(100, 114))),
])
def test_validate_frame_data__invalid(self, addressing_format, raw_frame_data):
with pytest.raises(ValueError):
CanSingleFrameHandler.validate_frame_data(addressing_format=addressing_format,
raw_frame_data=raw_frame_data)
| 60.224612 | 124 | 0.649976 | 5,736 | 50,408 | 5.213215 | 0.035391 | 0.039728 | 0.043641 | 0.047487 | 0.920041 | 0.878775 | 0.838311 | 0.800288 | 0.771862 | 0.757349 | 0 | 0.02594 | 0.264283 | 50,408 | 836 | 125 | 60.296651 | 0.780375 | 0.008967 | 0 | 0.653333 | 0 | 0 | 0.125744 | 0.052336 | 0 | 0 | 0.021551 | 0 | 0.142667 | 1 | 0.056 | false | 0 | 0.005333 | 0 | 0.065333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
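The parametrized cases above pin down the ISO-TP single-frame layout: a short SF_DL nibble in the N_PCI byte for classic CAN frames, and the `0x00` escape followed by a full length byte for longer (CAN FD) frames, with filler-byte padding up to the frame length. A minimal sketch of that encoding, matching the test vectors (the function name and the `0xCC` default filler are illustrative, not the uds library's actual API, and addressing-format prefix bytes such as `0xF2` are left out):

```python
def encode_single_frame(payload, frame_length, filler_byte=0xCC):
    """Build single-frame data bytes per ISO-TP.

    Short form (frame <= 8 bytes): high nibble 0x0 (single frame N_PCI)
    with SF_DL in the low nibble. Long form: 0x00 escape byte followed
    by SF_DL as a full byte. Remaining bytes are padded with the filler.
    """
    if frame_length <= 8:
        n_pci = [0x00 | len(payload)]        # SF_DL packed into low nibble
    else:
        n_pci = [0x00, len(payload)]         # escape: 0x00, then SF_DL byte
    data = n_pci + list(payload)
    data += [filler_byte] * (frame_length - len(data))  # pad to frame length
    return data
```

Checked against the first test vector: a `[0x3E]` payload in a 2-byte frame yields `[0x01, 0x3E]`, and a 54-byte payload in a long frame starts with `[0x00, 0x36]`.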
b808bfcdffa17832a3f8864867347afef3e3488e | 39 | py | Python | Python Codewars Answers/Functional Addition.py | OreOlad/github-slideshow | 6c4ef5df6971e3dbd79c9ef7cbb141b2ff682006 | [
"MIT"
] | null | null | null | Python Codewars Answers/Functional Addition.py | OreOlad/github-slideshow | 6c4ef5df6971e3dbd79c9ef7cbb141b2ff682006 | [
"MIT"
] | 4 | 2020-05-14T15:28:52.000Z | 2021-09-28T02:43:53.000Z | Python Codewars Answers/Functional Addition.py | OreOlad/github-slideshow | 6c4ef5df6971e3dbd79c9ef7cbb141b2ff682006 | [
"MIT"
] | null | null | null | def add(n):
return lambda x: x + n
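The curried `add` above returns a closure over `n`, so it can be partially applied before supplying the second operand (sketch for illustration; `add_five` is just a local name):

```python
def add(n):
    return lambda x: x + n

add_five = add(5)   # partially apply: a closure capturing n=5
print(add_five(3))  # 8
print(add(2)(10))   # 12
```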
| 13 | 26 | 0.564103 | 8 | 39 | 2.75 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.307692 | 39 | 2 | 27 | 19.5 | 0.814815 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
62c32f4957b142ce9152cee3f188aea5b236e2bf | 5,302 | py | Python | rdmo/projects/tests/test_viewset_value.py | berkerY/rdmo | c0500f9b6caff9106a254a05e0d0e8018fc8db28 | [
"Apache-2.0"
] | 77 | 2016-08-09T11:40:20.000Z | 2022-03-06T11:03:26.000Z | rdmo/projects/tests/test_viewset_value.py | MSpenger/rdmo | c0500f9b6caff9106a254a05e0d0e8018fc8db28 | [
"Apache-2.0"
] | 377 | 2016-07-01T13:59:36.000Z | 2022-03-30T13:53:19.000Z | rdmo/projects/tests/test_viewset_value.py | MSpenger/rdmo | c0500f9b6caff9106a254a05e0d0e8018fc8db28 | [
"Apache-2.0"
] | 47 | 2016-06-23T11:32:19.000Z | 2022-03-01T11:34:37.000Z | import pytest
from django.urls import reverse
from rdmo.core.constants import VALUE_TYPE_FILE
from ..models import Value
users = (
('owner', 'owner'),
('manager', 'manager'),
('author', 'author'),
('guest', 'guest'),
('api', 'api'),
('user', 'user'),
('site', 'site'),
('anonymous', None),
)
view_value_permission_map = {
'owner': [1, 2, 3, 4, 5],
'manager': [1, 3, 5],
'author': [1, 3, 5],
'guest': [1, 3, 5],
'api': [1, 2, 3, 4, 5],
'site': [1, 2, 3, 4, 5]
}
urlnames = {
'list': 'v1-projects:value-list',
'detail': 'v1-projects:value-detail',
'file': 'v1-projects:value-file'
}
values = [1, 2, 3, 4, 5, 6, 7, 238, 242, 243, 244, 245]
snapshots = [1, 3, 7, 4, 5, 6]
@pytest.mark.parametrize('username,password', users)
def test_list(db, client, username, password):
client.login(username=username, password=password)
url = reverse(urlnames['list'])
response = client.get(url)
if password:
assert response.status_code == 200
assert isinstance(response.json(), list)
if username == 'user':
assert sorted([item['id'] for item in response.json()]) == []
else:
values_list = Value.objects.filter(project__in=view_value_permission_map.get(username, [])) \
.filter(snapshot_id=None) \
.order_by('id').values_list('id', flat=True)
assert sorted([item['id'] for item in response.json()]) == list(values_list)
else:
assert response.status_code == 401
@pytest.mark.parametrize('username,password', users)
@pytest.mark.parametrize('snapshot_id', snapshots)
def test_list_snapshot(db, client, username, password, snapshot_id):
client.login(username=username, password=password)
url = reverse(urlnames['list']) + '?snapshot={}'.format(snapshot_id)
response = client.get(url)
if password:
assert response.status_code == 200
assert isinstance(response.json(), list)
if username == 'user':
assert sorted([item['id'] for item in response.json()]) == []
else:
values_list = Value.objects.filter(project__in=view_value_permission_map.get(username, [])) \
.filter(snapshot_id=snapshot_id) \
.order_by('id').values_list('id', flat=True)
assert sorted([item['id'] for item in response.json()]) == list(values_list)
else:
assert response.status_code == 401
@pytest.mark.parametrize('username,password', users)
@pytest.mark.parametrize('value_id', values)
def test_detail(db, client, username, password, value_id):
client.login(username=username, password=password)
value = Value.objects.get(pk=value_id)
url = reverse(urlnames['detail'], args=[value_id])
response = client.get(url)
if value.project.id in view_value_permission_map.get(username, []):
assert response.status_code == 200
assert isinstance(response.json(), dict)
assert response.json().get('id') == value_id
elif password:
assert response.status_code == 404
else:
assert response.status_code == 401
@pytest.mark.parametrize('username,password', users)
def test_create(db, client, username, password):
client.login(username=username, password=password)
url = reverse(urlnames['list'])
response = client.post(url)
if password:
assert response.status_code == 405
else:
assert response.status_code == 401
@pytest.mark.parametrize('username,password', users)
@pytest.mark.parametrize('value_id', values)
def test_update(db, client, username, password, value_id):
client.login(username=username, password=password)
url = reverse(urlnames['detail'], args=[value_id])
data = {}
response = client.put(url, data, content_type='application/json')
if password:
assert response.status_code == 405
else:
assert response.status_code == 401
@pytest.mark.parametrize('username,password', users)
@pytest.mark.parametrize('value_id', values)
def test_delete(db, client, username, password, value_id):
client.login(username=username, password=password)
url = reverse(urlnames['detail'], args=[value_id])
response = client.delete(url)
if password:
assert response.status_code == 405
else:
assert response.status_code == 401
@pytest.mark.parametrize('username,password', users)
@pytest.mark.parametrize('value_id', values)
def test_file(db, client, files, username, password, value_id):
client.login(username=username, password=password)
value = Value.objects.get(pk=value_id)
url = reverse(urlnames['file'], args=[value_id])
response = client.get(url)
if value.value_type == VALUE_TYPE_FILE and value.project.id in view_value_permission_map.get(username, []):
assert response.status_code == 200
assert response['Content-Type'] == value.file_type
assert response['Content-Disposition'] == 'attachment; filename={}'.format(value.file_name)
assert response.content == value.file.read()
elif password:
assert response.status_code == 404
else:
assert response.status_code == 401
| 32.728395 | 111 | 0.641645 | 649 | 5,302 | 5.118644 | 0.155624 | 0.101144 | 0.096328 | 0.115593 | 0.765503 | 0.759482 | 0.75888 | 0.75888 | 0.742023 | 0.708007 | 0 | 0.024694 | 0.213316 | 5,302 | 161 | 112 | 32.931677 | 0.771757 | 0 | 0 | 0.584 | 0 | 0 | 0.092984 | 0.012825 | 0 | 0 | 0 | 0 | 0.216 | 1 | 0.056 | false | 0.224 | 0.032 | 0 | 0.088 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
62c3bd1eac6952ba5258aeeb003d762d915208f2 | 86 | py | Python | fabfile.py | lallamilagro/yafas-server | 8bd8b64c7bfdaeeadffe0daf91eecf91091fcb94 | [
"MIT"
] | null | null | null | fabfile.py | lallamilagro/yafas-server | 8bd8b64c7bfdaeeadffe0daf91eecf91091fcb94 | [
"MIT"
] | null | null | null | fabfile.py | lallamilagro/yafas-server | 8bd8b64c7bfdaeeadffe0daf91eecf91091fcb94 | [
"MIT"
] | null | null | null | from fabric.api import local
def dev():
local('gunicorn --reload app:api -t 0')
| 14.333333 | 43 | 0.662791 | 14 | 86 | 4.071429 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014493 | 0.197674 | 86 | 5 | 44 | 17.2 | 0.811594 | 0 | 0 | 0 | 0 | 0 | 0.348837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
62c8e979a92e5aadf07d790090067e540f852e86 | 96 | py | Python | quiz/tests.py | TheShubhendra/quizzes-backend | 90e161281ec02599b17c6f0a9fdd20bddf25bfe7 | [
"MIT"
] | null | null | null | quiz/tests.py | TheShubhendra/quizzes-backend | 90e161281ec02599b17c6f0a9fdd20bddf25bfe7 | [
"MIT"
] | null | null | null | quiz/tests.py | TheShubhendra/quizzes-backend | 90e161281ec02599b17c6f0a9fdd20bddf25bfe7 | [
"MIT"
] | null | null | null | from django.test import TestCase
# Create your tests here.
def test_nothing():
assert True
| 16 | 32 | 0.75 | 14 | 96 | 5.071429 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 96 | 5 | 33 | 19.2 | 0.910256 | 0.239583 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
62eafafa26205536075ef12c7f241213269bc5c2 | 196 | py | Python | monitor/app/models/__init__.py | hellckt/monitor | 20be8395e55500e21937aabe227c91e0d70fcc49 | [
"WTFPL"
] | 3 | 2017-10-25T16:43:10.000Z | 2019-12-05T09:12:30.000Z | monitor/app/models/__init__.py | hellckt/monitor | 20be8395e55500e21937aabe227c91e0d70fcc49 | [
"WTFPL"
] | null | null | null | monitor/app/models/__init__.py | hellckt/monitor | 20be8395e55500e21937aabe227c91e0d70fcc49 | [
"WTFPL"
] | 2 | 2017-03-14T08:15:18.000Z | 2021-12-30T04:26:19.000Z | # -*- encoding: utf-8 -*-
from monitor.app.models.ban import Ban, Path
from monitor.app.models.flow import Flow
from monitor.app.models.user import User
__all__ = ['Ban', 'Path', 'Flow', 'User']
| 28 | 44 | 0.704082 | 30 | 196 | 4.466667 | 0.433333 | 0.246269 | 0.313433 | 0.447761 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005848 | 0.127551 | 196 | 6 | 45 | 32.666667 | 0.777778 | 0.117347 | 0 | 0 | 0 | 0 | 0.087719 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
62fab7367a3fc326921eaa194387b3b106eef594 | 97 | py | Python | src/sfml/__init__.py | vmstarchenko/python-sfml | 3105bbc7ae405633702fc22c1725d7941d098488 | [
"Zlib"
] | 121 | 2015-01-05T17:31:46.000Z | 2019-09-11T02:28:58.000Z | src/sfml/__init__.py | vmstarchenko/python-sfml | 3105bbc7ae405633702fc22c1725d7941d098488 | [
"Zlib"
] | 86 | 2015-01-23T04:50:47.000Z | 2019-04-04T07:23:57.000Z | src/sfml/__init__.py | vmstarchenko/python-sfml | 3105bbc7ae405633702fc22c1725d7941d098488 | [
"Zlib"
] | 37 | 2015-01-18T13:09:02.000Z | 2019-05-06T18:13:44.000Z | import sfml.system
import sfml.window
import sfml.graphics
import sfml.audio
import sfml.network
| 16.166667 | 20 | 0.845361 | 15 | 97 | 5.466667 | 0.466667 | 0.609756 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103093 | 97 | 5 | 21 | 19.4 | 0.942529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1a0d2f257bad7aa114c05195461076865262b1d9 | 107 | py | Python | LAUG/util/dataloader/__init__.py | wise-east/LAUG | c5fc674e76a0a20622a77301f9986ad58713d58d | [
"Apache-2.0"
] | 10 | 2021-07-10T12:40:42.000Z | 2022-03-14T07:51:06.000Z | LAUG/util/dataloader/__init__.py | wise-east/LAUG | c5fc674e76a0a20622a77301f9986ad58713d58d | [
"Apache-2.0"
] | 5 | 2021-07-01T11:23:58.000Z | 2021-09-09T05:51:02.000Z | LAUG/util/dataloader/__init__.py | wise-east/LAUG | c5fc674e76a0a20622a77301f9986ad58713d58d | [
"Apache-2.0"
] | 2 | 2021-09-13T16:26:42.000Z | 2021-11-16T09:26:54.000Z | from LAUG.util.dataloader.dataset_dataloader import *
from LAUG.util.dataloader.module_dataloader import *
| 35.666667 | 53 | 0.850467 | 14 | 107 | 6.357143 | 0.5 | 0.179775 | 0.269663 | 0.494382 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074766 | 107 | 2 | 54 | 53.5 | 0.89899 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
c53e95294466b22afa904fda2dbc54e00c164fe0 | 92 | py | Python | tests/testapp/views.py | PaesslerAG/django-httpxforwardedfor | d76f94ed1e16820409964790cfa90023d7df3916 | [
"BSD-3-Clause"
] | 3 | 2017-04-12T08:37:12.000Z | 2019-06-06T19:12:16.000Z | tests/testapp/views.py | PaesslerAG/django-httpxforwardedfor | d76f94ed1e16820409964790cfa90023d7df3916 | [
"BSD-3-Clause"
] | 2 | 2019-01-17T19:18:46.000Z | 2020-04-06T13:03:04.000Z | tests/testapp/views.py | PaesslerAG/django-httpxforwardedfor | d76f94ed1e16820409964790cfa90023d7df3916 | [
"BSD-3-Clause"
] | null | null | null | from django.http import HttpResponse
def empty_page(request):
return HttpResponse("")
| 15.333333 | 36 | 0.76087 | 11 | 92 | 6.272727 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152174 | 92 | 5 | 37 | 18.4 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
c55cd47deff25e4c1c8f0e744b1ccadb2d1e4557 | 32,370 | py | Python | pybind/nos/v7_1_0/brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/nos/v7_1_0/brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/nos/v7_1_0/brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import node_info
class show_firmware_version(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-firmware-ext - based on the path /brocade_firmware_ext_rpc/show-firmware-version/output/show-firmware-version. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__switchid','__os_name','__os_version','__copy_right_info','__build_time','__firmware_full_version','__control_processor_vendor','__control_processor_chipset','__control_processor_memory','__node_info',)
_yang_name = 'show-firmware-version'
_rest_name = 'show-firmware-version'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__os_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="os-name", rest_name="os-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
self.__copy_right_info = YANGDynClass(base=unicode, is_leaf=True, yang_name="copy-right-info", rest_name="copy-right-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
self.__build_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="build-time", rest_name="build-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
self.__control_processor_chipset = YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-chipset", rest_name="control-processor-chipset", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
self.__os_version = YANGDynClass(base=unicode, is_leaf=True, yang_name="os-version", rest_name="os-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
self.__node_info = YANGDynClass(base=YANGListType(False,node_info.node_info, yang_name="node-info", rest_name="node-info", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="node-info", rest_name="node-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='list', is_config=True)
self.__switchid = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..3']}), is_leaf=True, yang_name="switchid", rest_name="switchid", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='ras-extensions:switchid-type', is_config=True)
self.__firmware_full_version = YANGDynClass(base=unicode, is_leaf=True, yang_name="firmware-full-version", rest_name="firmware-full-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
self.__control_processor_vendor = YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-vendor", rest_name="control-processor-vendor", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
self.__control_processor_memory = YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-memory", rest_name="control-processor-memory", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'brocade_firmware_ext_rpc', u'show-firmware-version', u'output', u'show-firmware-version']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'show-firmware-version', u'output', u'show-firmware-version']
def _get_switchid(self):
"""
Getter method for switchid, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/switchid (ras-extensions:switchid-type)
YANG Description: Switch id specifies the particular switch to fetch firmware version info.
"""
return self.__switchid
def _set_switchid(self, v, load=False):
"""
Setter method for switchid, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/switchid (ras-extensions:switchid-type)
If this variable is read-only (config: false) in the
source YANG file, then _set_switchid is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_switchid() directly.
YANG Description: Switch id specifies the particular switch to fetch firmware version info.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..3']}), is_leaf=True, yang_name="switchid", rest_name="switchid", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='ras-extensions:switchid-type', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """switchid must be of a type compatible with ras-extensions:switchid-type""",
'defined-type': "ras-extensions:switchid-type",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..3']}), is_leaf=True, yang_name="switchid", rest_name="switchid", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='ras-extensions:switchid-type', is_config=True)""",
})
self.__switchid = t
if hasattr(self, '_set'):
self._set()
def _unset_switchid(self):
self.__switchid = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'length': [u'1..3']}), is_leaf=True, yang_name="switchid", rest_name="switchid", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='ras-extensions:switchid-type', is_config=True)
def _get_os_name(self):
"""
Getter method for os_name, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/os_name (string)
YANG Description: Name of the Firmware version. Example: NOS, FOS etc.
"""
return self.__os_name
def _set_os_name(self, v, load=False):
"""
Setter method for os_name, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/os_name (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_os_name is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_os_name() directly.
YANG Description: Name of the Firmware version. Example: NOS, FOS etc.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="os-name", rest_name="os-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """os_name must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="os-name", rest_name="os-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__os_name = t
if hasattr(self, '_set'):
self._set()
def _unset_os_name(self):
self.__os_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="os-name", rest_name="os-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_os_version(self):
"""
Getter method for os_version, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/os_version (string)
YANG Description: Version of the Firmware.
"""
return self.__os_version
def _set_os_version(self, v, load=False):
"""
Setter method for os_version, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/os_version (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_os_version is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_os_version() directly.
YANG Description: Version of the Firmware.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="os-version", rest_name="os-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """os_version must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="os-version", rest_name="os-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__os_version = t
if hasattr(self, '_set'):
self._set()
def _unset_os_version(self):
self.__os_version = YANGDynClass(base=unicode, is_leaf=True, yang_name="os-version", rest_name="os-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_copy_right_info(self):
"""
Getter method for copy_right_info, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/copy_right_info (string)
YANG Description: Copy right information of the Firmware.
"""
return self.__copy_right_info
def _set_copy_right_info(self, v, load=False):
"""
Setter method for copy_right_info, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/copy_right_info (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_copy_right_info is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_copy_right_info() directly.
YANG Description: Copy right information of the Firmware.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="copy-right-info", rest_name="copy-right-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """copy_right_info must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="copy-right-info", rest_name="copy-right-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__copy_right_info = t
if hasattr(self, '_set'):
self._set()
def _unset_copy_right_info(self):
self.__copy_right_info = YANGDynClass(base=unicode, is_leaf=True, yang_name="copy-right-info", rest_name="copy-right-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_build_time(self):
"""
Getter method for build_time, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/build_time (string)
YANG Description: Time information on the build of Firmware.
"""
return self.__build_time
def _set_build_time(self, v, load=False):
"""
Setter method for build_time, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/build_time (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_build_time is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_build_time() directly.
YANG Description: Time information on the build of Firmware.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="build-time", rest_name="build-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """build_time must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="build-time", rest_name="build-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__build_time = t
if hasattr(self, '_set'):
self._set()
def _unset_build_time(self):
self.__build_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="build-time", rest_name="build-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_firmware_full_version(self):
"""
Getter method for firmware_full_version, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/firmware_full_version (string)
YANG Description: Full version string of Firmware.
"""
return self.__firmware_full_version
def _set_firmware_full_version(self, v, load=False):
"""
Setter method for firmware_full_version, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/firmware_full_version (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_firmware_full_version is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_firmware_full_version() directly.
YANG Description: Full version string of Firmware.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="firmware-full-version", rest_name="firmware-full-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """firmware_full_version must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="firmware-full-version", rest_name="firmware-full-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__firmware_full_version = t
if hasattr(self, '_set'):
self._set()
def _unset_firmware_full_version(self):
self.__firmware_full_version = YANGDynClass(base=unicode, is_leaf=True, yang_name="firmware-full-version", rest_name="firmware-full-version", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_control_processor_vendor(self):
"""
Getter method for control_processor_vendor, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/control_processor_vendor (string)
YANG Description: Information on the control processor.
"""
return self.__control_processor_vendor
def _set_control_processor_vendor(self, v, load=False):
"""
Setter method for control_processor_vendor, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/control_processor_vendor (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_control_processor_vendor is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_control_processor_vendor() directly.
YANG Description: Information on the control processor.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="control-processor-vendor", rest_name="control-processor-vendor", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """control_processor_vendor must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-vendor", rest_name="control-processor-vendor", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__control_processor_vendor = t
if hasattr(self, '_set'):
self._set()
def _unset_control_processor_vendor(self):
self.__control_processor_vendor = YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-vendor", rest_name="control-processor-vendor", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_control_processor_chipset(self):
"""
Getter method for control_processor_chipset, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/control_processor_chipset (string)
YANG Description: Information on the control processor.
"""
return self.__control_processor_chipset
def _set_control_processor_chipset(self, v, load=False):
"""
Setter method for control_processor_chipset, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/control_processor_chipset (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_control_processor_chipset is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_control_processor_chipset() directly.
YANG Description: Information on the control processor.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="control-processor-chipset", rest_name="control-processor-chipset", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """control_processor_chipset must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-chipset", rest_name="control-processor-chipset", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__control_processor_chipset = t
if hasattr(self, '_set'):
self._set()
def _unset_control_processor_chipset(self):
self.__control_processor_chipset = YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-chipset", rest_name="control-processor-chipset", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_control_processor_memory(self):
"""
Getter method for control_processor_memory, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/control_processor_memory (string)
YANG Description: Memory of the control processor.
"""
return self.__control_processor_memory
def _set_control_processor_memory(self, v, load=False):
"""
Setter method for control_processor_memory, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/control_processor_memory (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_control_processor_memory is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_control_processor_memory() directly.
YANG Description: Memory of the control processor.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="control-processor-memory", rest_name="control-processor-memory", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """control_processor_memory must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-memory", rest_name="control-processor-memory", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)""",
})
self.__control_processor_memory = t
if hasattr(self, '_set'):
self._set()
def _unset_control_processor_memory(self):
self.__control_processor_memory = YANGDynClass(base=unicode, is_leaf=True, yang_name="control-processor-memory", rest_name="control-processor-memory", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='string', is_config=True)
def _get_node_info(self):
"""
Getter method for node_info, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/node_info (list)
"""
return self.__node_info
def _set_node_info(self, v, load=False):
"""
Setter method for node_info, mapped from YANG variable /brocade_firmware_ext_rpc/show_firmware_version/output/show_firmware_version/node_info (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_node_info is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_node_info() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType(False,node_info.node_info, yang_name="node-info", rest_name="node-info", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="node-info", rest_name="node-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """node_info must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType(False,node_info.node_info, yang_name="node-info", rest_name="node-info", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="node-info", rest_name="node-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='list', is_config=True)""",
})
self.__node_info = t
if hasattr(self, '_set'):
self._set()
def _unset_node_info(self):
self.__node_info = YANGDynClass(base=YANGListType(False,node_info.node_info, yang_name="node-info", rest_name="node-info", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="node-info", rest_name="node-info", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-firmware-ext', defining_module='brocade-firmware-ext', yang_type='list', is_config=True)
switchid = __builtin__.property(_get_switchid, _set_switchid)
os_name = __builtin__.property(_get_os_name, _set_os_name)
os_version = __builtin__.property(_get_os_version, _set_os_version)
copy_right_info = __builtin__.property(_get_copy_right_info, _set_copy_right_info)
build_time = __builtin__.property(_get_build_time, _set_build_time)
firmware_full_version = __builtin__.property(_get_firmware_full_version, _set_firmware_full_version)
control_processor_vendor = __builtin__.property(_get_control_processor_vendor, _set_control_processor_vendor)
control_processor_chipset = __builtin__.property(_get_control_processor_chipset, _set_control_processor_chipset)
control_processor_memory = __builtin__.property(_get_control_processor_memory, _set_control_processor_memory)
node_info = __builtin__.property(_get_node_info, _set_node_info)
_pyangbind_elements = {'switchid': switchid, 'os_name': os_name, 'os_version': os_version, 'copy_right_info': copy_right_info, 'build_time': build_time, 'firmware_full_version': firmware_full_version, 'control_processor_vendor': control_processor_vendor, 'control_processor_chipset': control_processor_chipset, 'control_processor_memory': control_processor_memory, 'node_info': node_info, }
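Each leaf in the generated class above is exposed through paired `_get_*`/`_set_*` methods wired together with `property`. A minimal self-contained sketch of that pattern (the `FirmwareVersion` class and `str` coercion are illustrative stand-ins; pyangbind's `YANGDynClass` additionally enforces the YANG type constraints):

```python
# Sketch of the getter/setter/property pattern used by pyangbind-generated
# classes. str() stands in for YANGDynClass's type coercion/validation.
class FirmwareVersion:
    def __init__(self):
        self.__os_name = ""

    def _get_os_name(self):
        return self.__os_name

    def _set_os_name(self, v):
        try:
            t = str(v)  # YANGDynClass would also validate YANG constraints here
        except (TypeError, ValueError):
            raise ValueError({'error-string': 'os_name must be string-compatible'})
        self.__os_name = t

    os_name = property(_get_os_name, _set_os_name)

fw = FirmwareVersion()
fw.os_name = "Network Operating System"
print(fw.os_name)
```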
| 68.147368 | 568 | 0.75882 | 4,405 | 32,370 | 5.274007 | 0.043587 | 0.066503 | 0.079804 | 0.034091 | 0.880424 | 0.849432 | 0.83329 | 0.822788 | 0.804623 | 0.795412 | 0 | 0.000494 | 0.125147 | 32,370 | 474 | 569 | 68.291139 | 0.819874 | 0.220142 | 0 | 0.475836 | 0 | 0.037175 | 0.346483 | 0.205864 | 0 | 0 | 0 | 0 | 0 | 1 | 0.122677 | false | 0 | 0.033457 | 0 | 0.271375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c575874d8ea0dbfc16f46c76e96f8f4490f0f2fe | 26 | py | Python | venv/Lib/site-packages/JEGmail/__init__.py | JE-Chen/finalProject_nknySystem | bc8b15f0d9ef416c9a5a2aea4955a7f34e916cc7 | [
"MIT"
] | 3 | 2020-12-30T06:37:10.000Z | 2021-03-05T11:56:04.000Z | venv/Lib/site-packages/JEGmail/__init__.py | JE-Chen/finalProject_nknySystem | bc8b15f0d9ef416c9a5a2aea4955a7f34e916cc7 | [
"MIT"
] | null | null | null | venv/Lib/site-packages/JEGmail/__init__.py | JE-Chen/finalProject_nknySystem | bc8b15f0d9ef416c9a5a2aea4955a7f34e916cc7 | [
"MIT"
] | 1 | 2020-12-21T03:53:03.000Z | 2020-12-21T03:53:03.000Z | from JEGmail import Core
| 13 | 25 | 0.807692 | 4 | 26 | 5.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192308 | 26 | 1 | 26 | 26 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3d90f9a8c1a30f0272eaaef480a1b052edc19fe6 | 9,815 | py | Python | scripts/rnn_models.py | epfl-ml4ed/meta-transfer-learning | 2868ef0d3768b8089de7c06d8e557bddd27f5ca6 | [
"MIT"
] | null | null | null | scripts/rnn_models.py | epfl-ml4ed/meta-transfer-learning | 2868ef0d3768b8089de7c06d8e557bddd27f5ca6 | [
"MIT"
] | null | null | null | scripts/rnn_models.py | epfl-ml4ed/meta-transfer-learning | 2868ef0d3768b8089de7c06d8e557bddd27f5ca6 | [
"MIT"
] | null | null | null | import numpy as np
import pandas as pd
import tensorflow as tf
import sklearn as sk
from keras.models import Sequential
from keras.layers import Dense, Masking, LSTM, Bidirectional
from sklearn.metrics import balanced_accuracy_score, precision_score, recall_score, roc_auc_score, f1_score, accuracy_score
import os.path
import matplotlib.pyplot as pyplot
def evaluate(model, x_test, y_test, week_type, feature_type, course, percentile=0.4, current_timestamp=0, model_name=None, y_pred=None, model_params=None):
    scores = {}
    if y_pred is None:
        y_pred = model.predict(x_test)
        y_pred = [1 if y[0] >= 0.5 else 0 for y in y_pred]
    scores['acc'] = accuracy_score(y_test, y_pred)
    scores['bac'] = balanced_accuracy_score(y_test, y_pred)
    scores['prec'] = precision_score(y_test, y_pred)
    scores['rec'] = recall_score(y_test, y_pred)
    scores['f1'] = f1_score(y_test, y_pred)
    scores['auc'] = roc_auc_score(y_test, y_pred)
    scores['feature_type'] = feature_type
    scores['week_type'] = week_type
    scores['course'] = course
    if model_name is None:
        scores['model_name'] = type(model).__name__
    else:
        scores['model_name'] = model_name
    scores['timestamp'] = current_timestamp
    scores['percentile'] = percentile
    scores['data_balance'] = sum(y_test) / len(y_test)
    # callers pass the training configuration through model_params; record it
    scores['model_params'] = model_params
    return scores
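evaluate() thresholds raw sigmoid outputs at 0.5 before handing them to the sklearn metrics. A minimal stdlib-only sketch of that thresholding plus a hand-rolled accuracy (the helper names are illustrative; sklearn's `accuracy_score` would return the same number):

```python
# Threshold sigmoid outputs at 0.5, as evaluate() does, then compute a
# hand-rolled accuracy (stdlib only, so it runs without sklearn).
def threshold(probs, cutoff=0.5):
    return [1 if p[0] >= cutoff else 0 for p in probs]

def accuracy(y_true, y_pred):
    return sum(int(t == p) for t, p in zip(y_true, y_pred)) / len(y_true)

y_raw = [[0.9], [0.2], [0.6], [0.4]]   # raw model outputs, one per sample
y_test = [1, 0, 0, 0]
y_pred = threshold(y_raw)              # -> [1, 0, 1, 0]
print(accuracy(y_test, y_pred))        # -> 0.75
```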
def bidirectional_lstm_64(x_train, y_train, x_test, y_test, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, num_epochs=10):
    n_weeks = x_train.shape[1]
    n_features = x_train.shape[2]
    # define the bidirectional LSTM model
    lstm = Sequential()
    # input_shape excludes the batch dimension: (timesteps, features)
    lstm.add(Masking(mask_value=-1., input_shape=(n_weeks, n_features)))
    lstm.add(Bidirectional(LSTM(64)))
    lstm.add(Dense(1, activation='sigmoid'))
    # compile the model
    lstm.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    checkpoint_filepath = 'checkpoints/lstm-bi-64-' + current_timestamp
    os.mkdir(checkpoint_filepath)
    model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(
        filepath=checkpoint_filepath,
        monitor='val_accuracy',
        mode='max',
        save_best_only=True)
    # fit the model, keeping the checkpoint with the best validation accuracy
    history = lstm.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=num_epochs, batch_size=64, verbose=1, callbacks=[model_checkpoint_callback])
    lstm = tf.keras.models.load_model(checkpoint_filepath)
    # evaluate the best checkpoint on the test set
    y_pred = lstm.predict(x_test)
    y_pred = [1 if y[0] >= 0.5 else 0 for y in y_pred]
    model_params = {'model': 'LSTM-bi', 'epochs': num_epochs, 'batch_size': 64, 'loss': 'binary_cross_entropy'}
    scores = evaluate(None, x_test, y_test, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_pred, model_name="TF-LSTM-bi-64", model_params=model_params)
    # evaluate on the validation set as well
    y_val_pred = lstm.predict(x_val)
    y_val_pred = [1 if y[0] >= 0.5 else 0 for y in y_val_pred]
    val_scores = evaluate(None, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_val_pred, model_name="TF-LSTM-bi-64", model_params=model_params)
    lstm.save(checkpoint_filepath + '_final_e')
    return history, scores, val_scores, lstm
def bidirectional_lstm_32_32(x_train, y_train, x_test, y_test, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, num_epochs=10):
    n_weeks = x_train.shape[1]
    n_features = x_train.shape[2]
    # define the stacked bidirectional LSTM model
    lstm = Sequential()
    # input_shape excludes the batch dimension: (timesteps, features)
    lstm.add(Masking(mask_value=-1., input_shape=(n_weeks, n_features)))
    lstm.add(Bidirectional(LSTM(32, return_sequences=True)))
    lstm.add(Bidirectional(LSTM(32)))
    lstm.add(Dense(1, activation='sigmoid'))
    # compile the model
    lstm.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    checkpoint_filepath = 'checkpoints/lstm-bi-32-32-' + current_timestamp
    os.mkdir(checkpoint_filepath)
    model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(
        filepath=checkpoint_filepath,
        monitor='val_accuracy',
        mode='max',
        save_best_only=True)
    # fit the model, keeping the checkpoint with the best validation accuracy
    history = lstm.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=num_epochs, batch_size=64, verbose=1, callbacks=[model_checkpoint_callback])
    lstm = tf.keras.models.load_model(checkpoint_filepath)
    # evaluate the best checkpoint on the test set
    y_pred = lstm.predict(x_test)
    y_pred = [1 if y[0] >= 0.5 else 0 for y in y_pred]
    model_params = {'model': 'LSTM-bi', 'epochs': num_epochs, 'batch_size': 64, 'loss': 'binary_cross_entropy'}
    scores = evaluate(None, x_test, y_test, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_pred, model_name="TF-LSTM-bi-32-32", model_params=model_params)
    # evaluate on the validation set as well
    y_val_pred = lstm.predict(x_val)
    y_val_pred = [1 if y[0] >= 0.5 else 0 for y in y_val_pred]
    val_scores = evaluate(None, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_val_pred, model_name="TF-LSTM-bi-32-32", model_params=model_params)
    lstm.save(checkpoint_filepath + '_final_e')
    return history, scores, val_scores, lstm
def bidirectional_lstm_32(x_train, y_train, x_test, y_test, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, num_epochs=10):
    n_weeks = x_train.shape[1]
    n_features = x_train.shape[2]
    # define the bidirectional LSTM model
    lstm = Sequential()
    # input_shape excludes the batch dimension: (timesteps, features)
    lstm.add(Masking(mask_value=-1., input_shape=(n_weeks, n_features)))
    lstm.add(Bidirectional(LSTM(32)))
    lstm.add(Dense(1, activation='sigmoid'))
    # compile the model
    lstm.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    checkpoint_filepath = 'checkpoints/lstm-bi-32-' + current_timestamp
    os.mkdir(checkpoint_filepath)
    model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(
        filepath=checkpoint_filepath,
        monitor='val_accuracy',
        mode='max',
        save_best_only=True)
    # fit the model, keeping the checkpoint with the best validation accuracy
    history = lstm.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=num_epochs, batch_size=64, verbose=1, callbacks=[model_checkpoint_callback])
    lstm = tf.keras.models.load_model(checkpoint_filepath)
    # evaluate the best checkpoint on the test set
    y_pred = lstm.predict(x_test)
    y_pred = [1 if y[0] >= 0.5 else 0 for y in y_pred]
    model_params = {'model': 'LSTM-bi', 'epochs': num_epochs, 'batch_size': 64, 'loss': 'binary_cross_entropy'}
    scores = evaluate(None, x_test, y_test, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_pred, model_name="TF-LSTM-bi-32", model_params=model_params)
    # evaluate on the validation set as well
    y_val_pred = lstm.predict(x_val)
    y_val_pred = [1 if y[0] >= 0.5 else 0 for y in y_val_pred]
    val_scores = evaluate(None, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_val_pred, model_name="TF-LSTM-bi-32", model_params=model_params)
    lstm.save(checkpoint_filepath + '_final_e')
    return history, scores, val_scores, lstm
def bidirectional_lstm_128(x_train, y_train, x_test, y_test, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, num_epochs=10):
    n_weeks = x_train.shape[1]
    n_features = x_train.shape[2]
    # define the bidirectional LSTM model
    lstm = Sequential()
    # input_shape excludes the batch dimension: (timesteps, features)
    lstm.add(Masking(mask_value=-1., input_shape=(n_weeks, n_features)))
    lstm.add(Bidirectional(LSTM(128)))
    lstm.add(Dense(1, activation='sigmoid'))
    # compile the model
    lstm.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    checkpoint_filepath = 'checkpoints/lstm-bi-128-' + current_timestamp
    os.mkdir(checkpoint_filepath)
    model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(
        filepath=checkpoint_filepath,
        monitor='val_accuracy',
        mode='max',
        save_best_only=True)
    # fit the model, keeping the checkpoint with the best validation accuracy
    history = lstm.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=num_epochs, batch_size=64, verbose=1, callbacks=[model_checkpoint_callback])
    lstm = tf.keras.models.load_model(checkpoint_filepath)
    # evaluate the best checkpoint on the test set
    y_pred = lstm.predict(x_test)
    y_pred = [1 if y[0] >= 0.5 else 0 for y in y_pred]
    model_params = {'model': 'LSTM-bi', 'epochs': num_epochs, 'batch_size': 64, 'loss': 'binary_cross_entropy'}
    scores = evaluate(None, x_test, y_test, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_pred, model_name="TF-LSTM-bi-128", model_params=model_params)
    # evaluate on the validation set as well
    y_val_pred = lstm.predict(x_val)
    y_val_pred = [1 if y[0] >= 0.5 else 0 for y in y_val_pred]
    val_scores = evaluate(None, x_val, y_val, week_type, feature_types, course, percentile, current_timestamp, y_pred=y_val_pred, model_name="TF-LSTM-bi-128", model_params=model_params)
    lstm.save(checkpoint_filepath + '_final_e')
    return history, scores, val_scores, lstm
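The four `bidirectional_lstm_*` builders differ only in their recurrent stack ([64], [32, 32], [32], [128]); that difference can be captured as data. A hypothetical helper (plain stdlib, so it runs without Keras; a real version would construct the corresponding Keras layers) that maps a units list to the layer specification each function hard-codes:

```python
# Map a units list like [32, 32] to the layer spec the functions above
# hard-code. bilstm_spec is a hypothetical name, not part of this module.
def bilstm_spec(units_list, n_weeks, n_features):
    layers = [('Masking', {'mask_value': -1.0, 'input_shape': (n_weeks, n_features)})]
    for i, units in enumerate(units_list):
        # every stacked LSTM except the last must return full sequences
        return_seq = i < len(units_list) - 1
        layers.append(('BiLSTM', {'units': units, 'return_sequences': return_seq}))
    layers.append(('Dense', {'units': 1, 'activation': 'sigmoid'}))
    return layers

spec = bilstm_spec([32, 32], n_weeks=10, n_features=5)
```

One such factory would replace the four near-identical functions, leaving only the checkpoint name and units list as parameters.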
def plot_history(history, file_name, counter):
    # plot loss during training
    pyplot.figure(counter * 2)
    pyplot.title('Loss ' + file_name)
    pyplot.plot(history.history['loss'], label='train')
    pyplot.plot(history.history['val_loss'], label='test')
    pyplot.legend()
    pyplot.savefig(file_name + "_loss.png")
    # plot accuracy during training
    pyplot.figure(counter * 2 + 1)
    pyplot.title('Accuracy ' + file_name)
    pyplot.plot(history.history['accuracy'], label='train')
    pyplot.plot(history.history['val_accuracy'], label='test')
    pyplot.legend()
    pyplot.savefig(file_name + "_acc.png")
3da7c2d17c4fb1a2297796ee1c0d4230379bfd61 | 100 | py | Python | greydot/__init__.py | TralahM/greydot-api | 34982c34bf878b2728e934147ca4ea38a78523e4 | [
"MIT"
] | null | null | null | greydot/__init__.py | TralahM/greydot-api | 34982c34bf878b2728e934147ca4ea38a78523e4 | [
"MIT"
] | null | null | null | greydot/__init__.py | TralahM/greydot-api | 34982c34bf878b2728e934147ca4ea38a78523e4 | [
"MIT"
] | null | null | null | from . import airtime
from . import sms
from . import wallet
from . import b2c
from . import signup
| 16.666667 | 21 | 0.75 | 15 | 100 | 5 | 0.466667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0125 | 0.2 | 100 | 5 | 22 | 20 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3dd72598c8539775e879fb93fd8b4877fbd64555 | 35 | py | Python | Step1-PythonBasic/Practices/liul/1-5/py00101.py | Jumpers/MysoftAutoTest | 50efc385a96532fc0777061d6c5e7201a4991f04 | [
"Apache-2.0"
] | null | null | null | Step1-PythonBasic/Practices/liul/1-5/py00101.py | Jumpers/MysoftAutoTest | 50efc385a96532fc0777061d6c5e7201a4991f04 | [
"Apache-2.0"
] | null | null | null | Step1-PythonBasic/Practices/liul/1-5/py00101.py | Jumpers/MysoftAutoTest | 50efc385a96532fc0777061d6c5e7201a4991f04 | [
"Apache-2.0"
] | null | null | null | print "hello, welcome to python!"
| 11.666667 | 33 | 0.714286 | 5 | 35 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 35 | 2 | 34 | 17.5 | 0.862069 | 0 | 0 | 0 | 0 | 0 | 0.735294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
3ddff74e38fdfac674e2e297ed871ef772679665 | 56 | py | Python | icevision/metrics/coco_metric/__init__.py | ai-fast-track/mantisshrimp | cc6d6a4a048f6ddda2782b6593dcd6b083a673e4 | [
"Apache-2.0"
] | 580 | 2020-09-10T06:29:57.000Z | 2022-03-29T19:34:54.000Z | icevision/metrics/coco_metric/__init__.py | ai-fast-track/mantisshrimp | cc6d6a4a048f6ddda2782b6593dcd6b083a673e4 | [
"Apache-2.0"
] | 691 | 2020-09-05T03:08:34.000Z | 2022-03-31T23:47:06.000Z | icevision/metrics/coco_metric/__init__.py | lgvaz/mantisshrimp2 | 743cb7df0dae7eb1331fc2bb66fc9ca09db496cd | [
"Apache-2.0"
] | 105 | 2020-09-09T10:41:35.000Z | 2022-03-25T17:16:49.000Z | from icevision.metrics.coco_metric.coco_metric import *
| 28 | 55 | 0.857143 | 8 | 56 | 5.75 | 0.75 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 56 | 1 | 56 | 56 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3ded6d6077c766603523b13300327f96211b016b | 72 | py | Python | datatrans/utils/classes/__init__.py | KooCook/datatrans | 65c80da4d8a1ed67963b9d704b361c864cb1151b | [
"BSD-3-Clause"
] | 1 | 2020-10-24T04:07:42.000Z | 2020-10-24T04:07:42.000Z | datatrans/utils/classes/__init__.py | KooCook/datatrans | 65c80da4d8a1ed67963b9d704b361c864cb1151b | [
"BSD-3-Clause"
] | null | null | null | datatrans/utils/classes/__init__.py | KooCook/datatrans | 65c80da4d8a1ed67963b9d704b361c864cb1151b | [
"BSD-3-Clause"
] | null | null | null | from .dataclass import *
from .jsonenum import *
from .encoder import *
| 18 | 24 | 0.75 | 9 | 72 | 6 | 0.555556 | 0.37037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 72 | 3 | 25 | 24 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9a93d2f5035fd1e80f5ae47f8b087289fa14a31e | 16,265 | py | Python | feeds/tests.py | StevenMonty/django-feed-reader | 95731a4178f028f6539c55dce77928a377a9cdd9 | [
"MIT"
] | 21 | 2019-06-21T09:17:32.000Z | 2022-03-04T06:55:36.000Z | feeds/tests.py | StevenMonty/django-feed-reader | 95731a4178f028f6539c55dce77928a377a9cdd9 | [
"MIT"
] | 6 | 2019-09-01T03:40:04.000Z | 2021-01-30T01:07:19.000Z | feeds/tests.py | StevenMonty/django-feed-reader | 95731a4178f028f6539c55dce77928a377a9cdd9 | [
"MIT"
] | 15 | 2019-09-01T03:58:17.000Z | 2022-01-16T05:25:40.000Z | from django.test import TestCase, Client
from django.conf import settings
# Create your tests here.
from feeds.models import Source, Post, Enclosure, WebProxy
from feeds.utils import read_feed, find_proxies, get_proxy, fix_relative
from django.utils import timezone
from django.urls import reverse
from datetime import timedelta
import mock
import os
import requests_mock
TEST_FILES_FOLDER = os.path.join(os.path.dirname(os.path.abspath(__file__)),"testdata")
BASE_URL = 'http://feed.com/'
class UtilsTest(TestCase):
def test_fix_relative(self):
url = "https://example.com/rss.xml"
html= "<a href='/'><img src='/image.jpg'></a>"
html = fix_relative(html, url)
self.assertEqual(html, "<a href='https://example.com/'><img src='https://example.com/image.jpg'></a>")
class BaseTest(TestCase):
def _populate_mock(self, mock, test_file, status, content_type, etag=None, headers=None, url=BASE_URL, is_cloudflare=False):
content = open(os.path.join(TEST_FILES_FOLDER, test_file), "rb").read()
ret_headers = {"Content-Type": content_type, "etag":"an-etag"}
if headers is not None:
ret_headers = {**ret_headers, **headers}
if is_cloudflare:
agent = "{user_agent} (+{server}; Updater; {subs} subscribers)".format(user_agent=settings.FEEDS_USER_AGENT, server=settings.FEEDS_SERVER, subs=1)
mock.register_uri('GET', url, request_headers={"User-Agent": agent}, status_code=status, content=content, headers=ret_headers)
else:
if etag is None:
mock.register_uri('GET', url, status_code=status, content=content, headers=ret_headers)
else:
mock.register_uri('GET', url, request_headers={'If-None-Match': etag}, status_code=status, content=content, headers=ret_headers)
@requests_mock.Mocker()
class XMLFeedsTest(BaseTest):
def test_simple_xml(self, mock):
self._populate_mock(mock, status=200, test_file="rss_xhtml_body.xml", content_type="application/rss+xml")
ls = timezone.now()
src = Source(name="test1", feed_url=BASE_URL, interval=0, last_success=ls, last_change=ls)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.posts.count(), 1) # got the one post
self.assertEqual(src.interval, 60)
self.assertEqual(src.etag, "an-etag")
self.assertNotEqual(src.last_success, ls)
self.assertNotEqual(src.last_change, ls)
def test_podcast(self, mock):
self._populate_mock(mock, status=200, test_file="podcast.xml", content_type="application/rss+xml")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.description, 'SU: Three nerds discussing tech, Apple, programming, and loosely related matters.')
self.assertEqual(src.posts.all()[0].enclosures.count(), 1)
def test_sanitize_1(self, mock):
"""
Make sure feedparser's sanitization is running
"""
self._populate_mock(mock, status=200, test_file="rss_xhtml_body.xml", content_type="application/rss+xml")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
p = src.posts.all()[0]
self.assertFalse("<script>" in p.body)
def test_sanitize_2(self, mock):
"""
Another test that the sanitization is going on. This time we have
stolen a test case from the feedparser library
"""
self._populate_mock(mock, status=200, test_file="sanitizer_bad_comment.xml", content_type="application/rss+xml")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# read the feed to update the name
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.name, "safe")
def test_sanitize_attrs(self, mock):
self._populate_mock(mock, status=200, test_file="sanitizer_img_attrs.xml", content_type="application/rss+xml")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# read the feed to update the name
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
body = src.posts.all()[0].body
self.assertTrue("<img" in body)
self.assertFalse("align=" in body)
self.assertFalse("hspace=" in body)
@requests_mock.Mocker()
class JSONFeedTest(BaseTest):
def test_simple_json(self, mock):
self._populate_mock(mock, status=200, test_file="json_simple_two_entry.json", content_type="application/json")
ls = timezone.now()
src = Source(name="test1", feed_url=BASE_URL, interval=0, last_success=ls, last_change=ls)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.posts.count(), 2) # got the two posts
self.assertEqual(src.interval, 60)
self.assertEqual(src.etag, "an-etag")
self.assertNotEqual(src.last_success, ls)
self.assertNotEqual(src.last_change, ls)
def test_sanitize_1(self, mock):
self._populate_mock(mock, status=200, test_file="json_simple_two_entry.json", content_type="application/json")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
p = src.posts.all()[0]
self.assertFalse("<script>" in p.body)
def test_sanitize_2(self, mock):
"""
Another test that the sanitization is going on. This time we have
stolen a test case from the feedparser library
"""
self._populate_mock(mock, status=200, test_file="sanitizer_bad_comment.json", content_type="application/json")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# read the feed to update the name
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.name, "safe")
def test_podcast(self, mock):
self._populate_mock(mock, status=200, test_file="podcast.json", content_type="application/json")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# read the feed to update the name
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
post = src.posts.all()[0]
self.assertEqual(post.enclosures.count(), 1)
@requests_mock.Mocker()
class HTTPStuffTest(BaseTest):
def test_fucking_cloudflare(self, mock):
self._populate_mock(mock, status=200, test_file="json_simple_two_entry.json", content_type="application/json")
self._populate_mock(mock, status=403, test_file="json_simple_two_entry.json", content_type="application/json", is_cloudflare=True)
src = Source(name="test1", feed_url=BASE_URL, interval=0, is_cloudflare=False)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 403)
src = Source(name="test1", feed_url=BASE_URL, interval=0, is_cloudflare=True)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
def test_find_proxies(self, mock):
self._populate_mock(mock, status=200, test_file="proxy_list.html", content_type="text/html", url="http://www.workingproxies.org")
find_proxies()
self.assertEqual(WebProxy.objects.count(), 20)
def test_get_proxy(self, mock):
self._populate_mock(mock, status=200, test_file="proxy_list.html", content_type="text/html", url="http://www.workingproxies.org")
p = get_proxy()
self.assertIsNotNone(p)
def test_etags(self, mock):
self._populate_mock(mock, status=200, test_file="rss_xhtml_body.xml", content_type="application/xml+rss")
self._populate_mock(mock, status=304, test_file="empty_file.txt", content_type="application/xml+rss", etag="an-etag")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
# Read the feed once to get the 1 post and the etag
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.posts.count(), 1) # got the one post
self.assertEqual(src.interval, 60)
self.assertEqual(src.etag, "an-etag")
# Read the feed again to get a 304 and a small increment to the interval
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.posts.count(), 1) # should have no more
self.assertEqual(src.status_code, 304)
self.assertEqual(src.interval, 70)
self.assertTrue(src.live)
def test_not_a_feed(self, mock):
self._populate_mock(mock, status=200, test_file="spurious_text_file.txt", content_type="text/plain")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200) # it returned a page, but not a feed
self.assertEqual(src.posts.count(), 0) # can't have got any
self.assertEqual(src.interval, 120)
self.assertTrue(src.live)
def test_permission_denied(self, mock):
self._populate_mock(mock, status=403, test_file="empty_file.txt", content_type="text/plain")
ls = timezone.now()
src = Source(name="test1", feed_url=BASE_URL, interval=0, last_success=ls)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 403) # permission denied
self.assertEqual(src.posts.count(), 0) # can't have got any
self.assertFalse(src.live)
def test_feed_gone(self, mock):
self._populate_mock(mock, status=410, test_file="empty_file.txt", content_type="text/plain")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 410) # the feed is gone
self.assertEqual(src.posts.count(), 0) # can't have got any
self.assertFalse(src.live)
def test_feed_not_found(self, mock):
self._populate_mock(mock, status=404, test_file="empty_file.txt", content_type="text/plain")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 404) # not found
self.assertEqual(src.posts.count(), 0) # can't have got any
self.assertTrue(src.live)
self.assertEqual(src.interval, 120)
def test_temp_redirect(self, mock):
new_url = "http://new.feed.com/"
self._populate_mock(mock, status=302, test_file="empty_file.txt", content_type="text/plain", headers={"Location": new_url})
self._populate_mock(mock, status=200, test_file="rss_xhtml_body.xml", content_type="application/xml+rss", url=new_url)
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
self.assertIsNone(src.last_302_start)
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.last_302_url, new_url) # this is where it went
self.assertIsNotNone(src.last_302_start)
self.assertEqual(src.posts.count(), 1) # after following redirect will have 1 post
self.assertEqual(src.interval, 60)
self.assertTrue(src.live)
# do it all again - shouldn't change
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.last_302_url, new_url) # this is where it went
self.assertIsNotNone(src.last_302_start)
self.assertEqual(src.posts.count(), 1) # after following redirect will have 1 post
self.assertEqual(src.interval, 80)
self.assertTrue(src.live)
# now we test making it permanent
src.last_302_start = timezone.now() - timedelta(days=365)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.last_302_url, ' ')
self.assertIsNone(src.last_302_start)
self.assertEqual(src.posts.count(), 1)
self.assertEqual(src.interval, 100)
self.assertEqual(src.feed_url, new_url)
self.assertTrue(src.live)
def test_perm_redirect(self, mock):
new_url = "http://new.feed.com/"
self._populate_mock(mock, status=301, test_file="empty_file.txt", content_type="text/plain", headers={"Location": new_url})
self._populate_mock(mock, status=200, test_file="rss_xhtml_body.xml", content_type="application/xml+rss", url=new_url)
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 301)
self.assertEqual(src.interval, 60)
self.assertEqual(src.feed_url, new_url)
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 200)
self.assertEqual(src.posts.count(), 1)
self.assertEqual(src.interval, 60)
self.assertTrue(src.live)
def test_server_error_1(self, mock):
self._populate_mock(mock, status=500, test_file="empty_file.txt", content_type="text/plain")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 500) # error
self.assertEqual(src.posts.count(), 0) # can't have got any
self.assertTrue(src.live)
self.assertEqual(src.interval, 120)
def test_server_error_2(self, mock):
self._populate_mock(mock, status=503, test_file="empty_file.txt", content_type="text/plain")
src = Source(name="test1", feed_url=BASE_URL, interval=0)
src.save()
read_feed(src)
src.refresh_from_db()
self.assertEqual(src.status_code, 503) # error!
self.assertEqual(src.posts.count(), 0) # can't have got any
self.assertTrue(src.live)
self.assertEqual(src.interval, 120)
| 33.398357 | 158 | 0.61869 | 2,137 | 16,265 | 4.517548 | 0.111371 | 0.100994 | 0.1156 | 0.051792 | 0.800083 | 0.786306 | 0.767557 | 0.741144 | 0.721773 | 0.692873 | 0 | 0.023933 | 0.267876 | 16,265 | 486 | 159 | 33.467078 | 0.786782 | 0.091177 | 0 | 0.665455 | 0 | 0.003636 | 0.101298 | 0.015096 | 0 | 0 | 0 | 0 | 0.327273 | 1 | 0.083636 | false | 0 | 0.036364 | 0 | 0.138182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
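The `test_etags` case in the feeds/tests.py entry above asserts a conditional-GET flow: the first read returns 200 with an etag, and a repeat read sending `If-None-Match` gets 304 while the poll interval backs off. A minimal stdlib-only sketch of that flow — `FakeServer` and `poll()` are illustrative stand-ins, not the django-feed-reader API, whose real behaviour lives in `feeds.utils.read_feed`:

```python
class FakeServer:
    def __init__(self, body, etag):
        self.body, self.etag = body, etag

    def get(self, if_none_match=None):
        # Honour If-None-Match exactly like the mocked endpoint in the tests.
        if if_none_match == self.etag:
            return 304, None, self.etag
        return 200, self.body, self.etag

def poll(server, state):
    status, body, etag = server.get(if_none_match=state.get("etag"))
    state["status"] = status
    if status == 200:
        state["etag"] = etag
        state["interval"] = 60                      # fresh content: base interval
    elif status == 304:
        state["interval"] = state["interval"] + 10  # unchanged: back off a little

server = FakeServer("<rss/>", "an-etag")
state = {}
poll(server, state)   # first read: 200, etag stored, interval 60
poll(server, state)   # second read: 304, interval grows to 70
```

The concrete numbers (base interval 60, +10 backoff) mirror what the test asserts, not a documented contract of the library.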
9ad379c665f28869e3ddadbec28a1b910979fa4a | 2,909 | py | Python | wiki_crosslingual/utils.py | iacercalixto/wiki_crosslingual | 74352cd5d164a8b6113f3ffed957b7eb779c763b | [
"MIT"
] | 1 | 2021-12-27T16:00:51.000Z | 2021-12-27T16:00:51.000Z | wiki_crosslingual/utils.py | iacercalixto/wiki_crosslingual | 74352cd5d164a8b6113f3ffed957b7eb779c763b | [
"MIT"
] | null | null | null | wiki_crosslingual/utils.py | iacercalixto/wiki_crosslingual | 74352cd5d164a8b6113f3ffed957b7eb779c763b | [
"MIT"
] | 1 | 2021-12-15T13:36:21.000Z | 2021-12-15T13:36:21.000Z | def get_sampling_probability_from_counts(datasets_counts_list):
# following: https://papers.nips.cc/paper/8928-cross-lingual-language-model-pretraining.pdf (Section 3.1)
# given: n_i is the number of examples in dataset i
# compute: q_i = p_i^{\alpha} / \sum_{j=1}^{N}{ p_j^{\alpha} }
# p_i = n_i / \sum_{k=1}^{N}{ n_k }
alpha = 0.5
n_datasets = len(datasets_counts_list)
count = datasets_counts_list
N = float(sum(count))
p = [0.] * n_datasets
for i in range(n_datasets):
p[i] = count[i] / N
# create an output vector where the probability of sampling an example from a dataset is upsampled for the low-resource language
# but not too much.
final_weights_per_example = []
final_weights_per_dataset = []
for i in range(n_datasets):
p_sum = 0.
for ii in range(n_datasets):
p_sum += (p[ii] ** alpha)
qi = (p[i] ** alpha) / p_sum
for _ in range(count[i]):
final_weights_per_example.append( qi )
final_weights_per_dataset.append( qi )
return final_weights_per_example, final_weights_per_dataset
def get_sampling_probability(datasets_list):
# following: https://papers.nips.cc/paper/8928-cross-lingual-language-model-pretraining.pdf (Section 3.1)
# given: n_i is the number of examples in dataset i
# compute: q_i = p_i^{\alpha} / \sum_{j=1}^{N}{ p_j^{\alpha} }
# p_i = n_i / \sum_{k=1}^{N}{ n_k }
alpha = 0.5
n_datasets = len(datasets_list)
count = [0] * n_datasets
for i in range(n_datasets):
count[i] = len(datasets_list[i])
N = float(sum(count))
p = [0.] * n_datasets
for i in range(n_datasets):
p[i] = count[i] / N
# create an output vector where the probability of sampling an example from a dataset is upsampled for the low-resource language
# but not too much.
final_weights_per_example = []
final_weights_per_dataset = []
for i in range(n_datasets):
p_sum = 0.
for ii in range(n_datasets):
p_sum += (p[ii] ** alpha)
qi = (p[i] ** alpha) / p_sum
for _ in range(count[i]):
final_weights_per_example.append( qi )
final_weights_per_dataset.append( qi )
return final_weights_per_example, final_weights_per_dataset
def get_datasets_sampling_probability(datasets_list):
_, res = get_sampling_probability(datasets_list)
return res
if __name__=="__main__":
datasets = [ list(range(100)), list(range(85)), list(range(120)) ]
_, res = get_sampling_probability( datasets )
print(res)
res1 = get_datasets_sampling_probability( datasets )
print(res1)
datasets_counts = [ 100, 85, 120 ]
_, res2 = get_sampling_probability_from_counts( datasets_counts )
print(res2)
| 39.849315 | 132 | 0.647989 | 434 | 2,909 | 4.064516 | 0.182028 | 0.071429 | 0.102041 | 0.072562 | 0.912698 | 0.816327 | 0.816327 | 0.764172 | 0.764172 | 0.764172 | 0 | 0.02073 | 0.237195 | 2,909 | 72 | 133 | 40.402778 | 0.774223 | 0.299759 | 0 | 0.660377 | 0 | 0 | 0.003958 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056604 | false | 0 | 0 | 0 | 0.113208 | 0.056604 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
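The formula implemented in wiki_crosslingual/utils.py above can be sanity-checked by hand for the counts used in its `__main__` block. This sketch recomputes the weights independently from the same definition, q_i = p_i^alpha / sum_j p_j^alpha with p_i = n_i / sum_k n_k:

```python
alpha = 0.5
counts = [100, 85, 120]
N = float(sum(counts))
p = [c / N for c in counts]                 # plain proportional probabilities
denom = sum(pi ** alpha for pi in p)
q = [(pi ** alpha) / denom for pi in p]     # exponentiated, renormalized

# The weights form a proper distribution and preserve the size ordering...
assert abs(sum(q) - 1.0) < 1e-9
assert q[2] > q[0] > q[1]
# ...but alpha < 1 compresses the ratios toward uniform, upsampling the
# smallest dataset relative to plain proportional sampling:
assert q[1] / q[2] > counts[1] / counts[2]
```

This is the effect the comments in the file describe: the low-resource dataset is upsampled, "but not too much."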
b1093bba07e942a88d21b582049bdb55359891ea | 1,649 | py | Python | tests/test_backend.py | hgw3lls/Quado | 4e107afbce7663144dd449a23b068fb29b9f221a | [
"MIT"
] | 1 | 2021-12-30T20:53:30.000Z | 2021-12-30T20:53:30.000Z | tests/test_backend.py | hgw3lls/Quado | 4e107afbce7663144dd449a23b068fb29b9f221a | [
"MIT"
] | null | null | null | tests/test_backend.py | hgw3lls/Quado | 4e107afbce7663144dd449a23b068fb29b9f221a | [
"MIT"
] | 1 | 2021-12-30T20:53:32.000Z | 2021-12-30T20:53:32.000Z | from starlette.testclient import TestClient
import json
# import pytest
from backend.main import app
client = TestClient(app)
def test_process_text(test_app, monkeypatch):
test_request_payload = {
"text": "test text",
"description": "test description",
}
test_response_payload = {
"text": "test text",
"description": "test description",
}
async def mock_post(payload):
return 1
monkeypatch.setattr(app, "post", mock_post)
response = test_app.post(
"/process_text/",
data=json.dumps(test_request_payload),
)
assert response.status_code == 200
assert response.json() == test_response_payload
def test_process_text_invalid_json(test_app):
response = test_app.post(
"/process_text/", data=json.dumps({"title": "something"})
)
assert response.status_code == 422
def test_generate_report(test_app, monkeypatch):
test_request_payload = {
"text": "test text",
"description": "test description",
}
test_response_payload = {
"text": "test text",
"description": "test description",
}
async def mock_post(payload):
return 1
monkeypatch.setattr(app, "post", mock_post)
response = test_app.post(
"/generate_report/",
data=json.dumps(test_request_payload),
)
assert response.status_code == 200
assert response.json() == test_response_payload
def test_generate_report_invalid_json(test_app):
response = test_app.post(
"/generate_report/", data=json.dumps({"title": "something"})
)
assert response.status_code == 422
| 23.898551 | 68 | 0.65373 | 187 | 1,649 | 5.508021 | 0.197861 | 0.054369 | 0.069903 | 0.073786 | 0.836893 | 0.836893 | 0.836893 | 0.836893 | 0.801942 | 0.716505 | 0 | 0.011041 | 0.231049 | 1,649 | 68 | 69 | 24.25 | 0.801262 | 0.007884 | 0 | 0.6 | 0 | 0 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 0.12 | 1 | 0.08 | false | 0 | 0.06 | 0 | 0.18 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b131253d18aef54a7fd8b450dc354a6e743eeb4e | 204 | py | Python | kuwala/pipelines/google-poi/src/pipeline/main.py | Guutch/kuwala | b34083e5cad6a27aff061655c8f9ea2f643550fa | [
"Apache-2.0"
] | null | null | null | kuwala/pipelines/google-poi/src/pipeline/main.py | Guutch/kuwala | b34083e5cad6a27aff061655c8f9ea2f643550fa | [
"Apache-2.0"
] | null | null | null | kuwala/pipelines/google-poi/src/pipeline/main.py | Guutch/kuwala | b34083e5cad6a27aff061655c8f9ea2f643550fa | [
"Apache-2.0"
] | null | null | null | from SearchScraper import SearchScraper
from search_string_generator import generate_search_strings
if __name__ == '__main__':
generate_search_strings()
SearchScraper.scrape_with_search_string()
| 29.142857 | 59 | 0.838235 | 23 | 204 | 6.695652 | 0.565217 | 0.155844 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112745 | 204 | 6 | 60 | 34 | 0.850829 | 0 | 0 | 0 | 1 | 0 | 0.039216 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
b190a1444fce13e6da7f497f9946b47c579ecde6 | 162 | py | Python | taskcat/testing/__init__.py | neil-greenwood/taskcat | e47f97714a01bd3c3b5040a05380a5f97f158d1b | [
"Apache-2.0"
] | 920 | 2016-12-03T01:41:25.000Z | 2021-11-04T13:52:21.000Z | taskcat/testing/__init__.py | neil-greenwood/taskcat | e47f97714a01bd3c3b5040a05380a5f97f158d1b | [
"Apache-2.0"
] | 544 | 2017-02-23T22:41:25.000Z | 2021-11-03T23:02:25.000Z | taskcat/testing/__init__.py | neil-greenwood/taskcat | e47f97714a01bd3c3b5040a05380a5f97f158d1b | [
"Apache-2.0"
] | 225 | 2016-12-11T13:36:21.000Z | 2021-11-04T14:43:53.000Z | from ._cfn_test import CFNTest # noqa: F401
from ._lint_test import LintTest # noqa: F401
from ._unit_test import UnitTest # noqa: F401
__all__ = ["CFNTest"]
| 27 | 46 | 0.740741 | 23 | 162 | 4.782609 | 0.521739 | 0.272727 | 0.218182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067164 | 0.17284 | 162 | 5 | 47 | 32.4 | 0.753731 | 0.197531 | 0 | 0 | 0 | 0 | 0.055556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
492648dbad4e709c072b8d42757bb70eff390a0a | 2,880 | py | Python | src/app/tests/communitycards_test.py | mikeshoe/texas-holdem-tutor | acc92e2c4f1fa1e815d518a87413940ad2f6deee | [
"MIT"
] | 2 | 2016-10-07T16:14:38.000Z | 2016-10-07T21:36:11.000Z | src/app/tests/communitycards_test.py | mikeshoe/texas-holdem-pi-tutor | acc92e2c4f1fa1e815d518a87413940ad2f6deee | [
"MIT"
] | null | null | null | src/app/tests/communitycards_test.py | mikeshoe/texas-holdem-pi-tutor | acc92e2c4f1fa1e815d518a87413940ad2f6deee | [
"MIT"
] | null | null | null | '''
Created on Sep 20, 2016
@author: mike
'''
import unittest
from app.communitycards import CommunityCards
from app.tests.utparent import UTParent
class Test(UTParent):
def setUp(self):
pass
def tearDown(self):
pass
def test_add_community_cards(self):
ace_spades = self.get_ace_spades()
two_diamonds = self.get_two_diamonds()
three_hearts = self.get_three_hearts()
four_clubs = self.get_four_clubs()
five_spades = self.get_five_spades()
community_cards = CommunityCards()
assert 0 == community_cards.num_community_cards()
community_cards.add_community_card(ace_spades)
assert 1 == community_cards.num_community_cards()
community_cards.add_community_card(two_diamonds)
assert 2 == community_cards.num_community_cards()
community_cards.add_community_card(three_hearts)
assert 3 == community_cards.num_community_cards()
community_cards.add_community_card(four_clubs)
assert 4 == community_cards.num_community_cards()
community_cards.add_community_card(five_spades)
assert 5 == community_cards.num_community_cards()
@unittest.expectedFailure
def test_too_many_community_cards(self):
ace_spades = self.get_ace_spades()
two_diamonds = self.get_two_diamonds()
three_hearts = self.get_three_hearts()
four_clubs = self.get_four_clubs()
five_spades = self.get_five_spades()
six_diamonds = self.get_six_diamonds()
community_cards = CommunityCards()
assert 0 == community_cards.num_community_cards()
community_cards.add_community_card(ace_spades)
assert 1 == community_cards.num_community_cards()
community_cards.add_community_card(two_diamonds)
assert 2 == community_cards.num_community_cards()
community_cards.add_community_card(three_hearts)
assert 3 == community_cards.num_community_cards()
community_cards.add_community_card(four_clubs)
assert 4 == community_cards.num_community_cards()
community_cards.add_community_card(five_spades)
assert 5 == community_cards.num_community_cards()
'This should throw an exception & fail as hold-em is limited to 5 community cards'
community_cards.add_community_card(six_diamonds)
@unittest.expectedFailure
def test_community_cards_must_be_different(self):
black_bullet = self.get_ace_spades()
community_cards = CommunityCards()
community_cards.add_community_card(black_bullet)
'This should throw an exception & fail - community cards must be different'
community_cards.add_community_card(black_bullet)
if __name__ == "__main__":
#import sys;sys.argv = ['', 'Test.testName']
unittest.main() | 36.455696 | 90 | 0.702778 | 349 | 2,880 | 5.378224 | 0.200573 | 0.335642 | 0.117741 | 0.180075 | 0.778903 | 0.748002 | 0.716036 | 0.648908 | 0.648908 | 0.648908 | 0 | 0.008475 | 0.221528 | 2,880 | 79 | 91 | 36.455696 | 0.828724 | 0.028472 | 0 | 0.719298 | 0 | 0 | 0.057685 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 1 | 0.087719 | false | 0.035088 | 0.052632 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
494b9fa2f2d69731f1f230d52885c88d9dea9773 | 4,210 | py | Python | httpGepClient/gepClient.py | TopToy/Geppetto | 69600a280dd620bcacc97102c7bb1d3269e0929f | [
"Apache-2.0"
] | null | null | null | httpGepClient/gepClient.py | TopToy/Geppetto | 69600a280dd620bcacc97102c7bb1d3269e0929f | [
"Apache-2.0"
] | null | null | null | httpGepClient/gepClient.py | TopToy/Geppetto | 69600a280dd620bcacc97102c7bb1d3269e0929f | [
"Apache-2.0"
] | null | null | null | import requests
from termcolor import colored
from settings import BASE_URL
def height(port, ip='127.0.0.1'):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) + 'state/height')
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return -1
return int(response.json()['num'])
def liveness(port, ip='127.0.0.1'):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) + 'state/liveness')
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.text
def pool_size(port, ip='127.0.0.1'):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) + 'state/pool_size')
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return -1
return int(response.json()['num'])
def pending_size(port, ip='127.0.0.1'):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) + 'state/pending_size')
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return -1
return int(response.json()['num'])
def validators(port, ip='127.0.0.1'):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) + 'state/validators')
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.json()['ips']
def info(port, ip='127.0.0.1'):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) + 'state/info')
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.json()
def get_tx(port,ip, c, w, p, b, t, blocking=0):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) +
'transactions/cid={}&worker={}&pid={}&bid={}&tx_num={}&blocking={}'
.format(c, w, p, b, t, blocking))
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.json()
def get_tx_data(port,ip, c, w, p, b, t, blocking=0):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) +
'transactions/cid={}&worker={}&pid={}&bid={}&tx_num={}&blocking={}/data'
.format(c, w, p, b, t, blocking))
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.json()
def tx_status(port,ip, c, w, p, b, t):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) +
'transactions/cid={}&worker={}&pid={}&bid={}&tx_num={}/status'
.format(c, w, p, b, t))
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.json()['status']
def get_block(port,ip, c, h, blocking=0):
response = requests.get(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) +
'blocks/cid={}&height={}&blocking={}'
.format(c, h, blocking))
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.json()
def write_tx(port,ip, c, data):
response = requests.post(BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) +
'transactions/cid={}?data={}'.format(c, data))#, data={'data': data})
if response.status_code != 200:
print(colored('Unsuccessful call, code {}'.format(response.status_code), 'red'))
return ''
return response.json()
| 41.683168 | 109 | 0.586223 | 529 | 4,210 | 4.57656 | 0.111531 | 0.054523 | 0.156134 | 0.072697 | 0.86741 | 0.856258 | 0.851714 | 0.847171 | 0.847171 | 0.847171 | 0 | 0.022556 | 0.210214 | 4,210 | 100 | 110 | 42.1 | 0.705564 | 0.005226 | 0 | 0.589744 | 0 | 0 | 0.217101 | 0.06138 | 0 | 0 | 0 | 0 | 0 | 1 | 0.141026 | false | 0 | 0.038462 | 0 | 0.461538 | 0.141026 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
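Every helper in gepClient.py above repeats the same `BASE_URL.replace('[ip]:[port]', ...)` substitution inline. The actual value of `settings.BASE_URL` is not shown in this file; assuming it has the shape `'http://[ip]:[port]/'`, the pattern could be factored into a single hypothetical helper:

```python
# Assumed shape of the settings constant; the real settings.BASE_URL is not
# shown here, so treat this value as a placeholder for illustration.
BASE_URL = 'http://[ip]:[port]/'

def endpoint(port, ip, path):
    """Substitute host and port into the URL template, as each helper does inline."""
    return BASE_URL.replace('[ip]:[port]', '{}:{}'.format(ip, port)) + path

url = endpoint(8080, '127.0.0.1', 'state/height')
```

With a helper like this, each client function reduces to one `requests.get(endpoint(...))` call plus the shared error handling.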
4959d9fb308e6553285537073ec9538e4e6f3daa | 70 | py | Python | models/__init__.py | PrendiProgramming/CTRL-C | 33649170f94f1bdc568163762ab2c1787f2fc948 | [
"Apache-2.0"
] | 43 | 2021-09-06T08:00:41.000Z | 2022-03-24T08:14:31.000Z | models/__init__.py | PrendiProgramming/CTRL-C | 33649170f94f1bdc568163762ab2c1787f2fc948 | [
"Apache-2.0"
] | 4 | 2021-09-10T17:11:42.000Z | 2022-03-31T02:04:12.000Z | models/__init__.py | PrendiProgramming/CTRL-C | 33649170f94f1bdc568163762ab2c1787f2fc948 | [
"Apache-2.0"
] | 6 | 2021-09-07T04:04:27.000Z | 2021-11-30T10:48:22.000Z | from .ctrlc import build
def build_model(cfg):
    return build(cfg)
| 14 | 24 | 0.728571 | 11 | 70 | 4.545455 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185714 | 70 | 4 | 25 | 17.5 | 0.877193 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
773d1f0601413c5aeaa166bdb2233368efa7429d | 33 | py | Python | Hello.py | PandaCoding2020/NextSimpleStarter | 663a9adc22be057d1bb9dc9604e776535cab354f | [
"MIT"
] | null | null | null | Hello.py | PandaCoding2020/NextSimpleStarter | 663a9adc22be057d1bb9dc9604e776535cab354f | [
"MIT"
] | null | null | null | Hello.py | PandaCoding2020/NextSimpleStarter | 663a9adc22be057d1bb9dc9604e776535cab354f | [
"MIT"
] | null | null | null | import os
print("Hello,python")
| 8.25 | 21 | 0.727273 | 5 | 33 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 33 | 3 | 22 | 11 | 0.827586 | 0 | 0 | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
774f39c476b18d16573527ced50013188e9fe462 | 138 | py | Python | tests/test_tool.py | TheFinalJoke/bluetool | 09f414fc4ca1bf299541cbf4f9b533568c675024 | [
"Apache-2.0"
] | null | null | null | tests/test_tool.py | TheFinalJoke/bluetool | 09f414fc4ca1bf299541cbf4f9b533568c675024 | [
"Apache-2.0"
] | null | null | null | tests/test_tool.py | TheFinalJoke/bluetool | 09f414fc4ca1bf299541cbf4f9b533568c675024 | [
"Apache-2.0"
] | null | null | null | from lib import tool
def test_haversine():
    assert tool.haversine(52.370216, 4.895168, 52.520008,
                          13.404954) == 945793.4375088713
| 23 | 57 | 0.724638 | 20 | 138 | 4.95 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.405172 | 0.15942 | 138 | 5 | 58 | 27.6 | 0.448276 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | true | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
620880ffe5175916158ed19586e380643032b018 | 46 | py | Python | egs/avspeech/looking-to-listen/local/__init__.py | tux-coder/asteroid | d9c76ec728dc95ef695daf5743f21750c3218e0a | [
"MIT"
] | 1 | 2020-07-15T08:32:56.000Z | 2020-07-15T08:32:56.000Z | egs/avspeech/looking-to-listen/local/__init__.py | tux-coder/asteroid | d9c76ec728dc95ef695daf5743f21750c3218e0a | [
"MIT"
] | null | null | null | egs/avspeech/looking-to-listen/local/__init__.py | tux-coder/asteroid | d9c76ec728dc95ef695daf5743f21750c3218e0a | [
"MIT"
] | null | null | null | from .postprocess import filter_audio, shelf
| 15.333333 | 44 | 0.826087 | 6 | 46 | 6.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 46 | 2 | 45 | 23 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
622d9aea600cc305e2adda6812cdd2eb8ab65404 | 30 | py | Python | src/keywords/__init__.py | macaquedev/cobralang | af070ee96cf94970d97a413798e3e9e022749c20 | [
"MIT"
] | null | null | null | src/keywords/__init__.py | macaquedev/cobralang | af070ee96cf94970d97a413798e3e9e022749c20 | [
"MIT"
] | null | null | null | src/keywords/__init__.py | macaquedev/cobralang | af070ee96cf94970d97a413798e3e9e022749c20 | [
"MIT"
] | null | null | null | from .keywords import keywords | 30 | 30 | 0.866667 | 4 | 30 | 6.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 30 | 1 | 30 | 30 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
623c43750b1a23474db3ffdcdee9925d5cd31dfb | 69 | py | Python | run.py | fatihaltinpinar/discord-bot | 54bbe1b884366299320d0fd860ab193c3cc03e98 | [
"MIT"
] | null | null | null | run.py | fatihaltinpinar/discord-bot | 54bbe1b884366299320d0fd860ab193c3cc03e98 | [
"MIT"
] | null | null | null | run.py | fatihaltinpinar/discord-bot | 54bbe1b884366299320d0fd860ab193c3cc03e98 | [
"MIT"
] | null | null | null | from bot.database import Database
from bot import bot
import config
| 13.8 | 33 | 0.826087 | 11 | 69 | 5.181818 | 0.454545 | 0.245614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15942 | 69 | 4 | 34 | 17.25 | 0.982759 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
624461f5bcbae4eed07e01eb2cd0bd239a1fa026 | 4,145 | py | Python | worklog/parsing/WorklogListener.py | ssaamm/worklog.md | f531585159a43f5b5e3b0731612adcf003b92ed5 | [
"MIT"
] | null | null | null | worklog/parsing/WorklogListener.py | ssaamm/worklog.md | f531585159a43f5b5e3b0731612adcf003b92ed5 | [
"MIT"
] | null | null | null | worklog/parsing/WorklogListener.py | ssaamm/worklog.md | f531585159a43f5b5e3b0731612adcf003b92ed5 | [
"MIT"
] | null | null | null | # Generated from Worklog.g4 by ANTLR 4.5.1
from antlr4 import *
if __name__ is not None and "." in __name__:
    from .WorklogParser import WorklogParser
else:
    from WorklogParser import WorklogParser
# This class defines a complete listener for a parse tree produced by WorklogParser.
class WorklogListener(ParseTreeListener):
    # Enter a parse tree produced by WorklogParser#wl.
    def enterWl(self, ctx:WorklogParser.WlContext):
        pass

    # Exit a parse tree produced by WorklogParser#wl.
    def exitWl(self, ctx:WorklogParser.WlContext):
        pass

    # Enter a parse tree produced by WorklogParser#week.
    def enterWeek(self, ctx:WorklogParser.WeekContext):
        pass

    # Exit a parse tree produced by WorklogParser#week.
    def exitWeek(self, ctx:WorklogParser.WeekContext):
        pass

    # Enter a parse tree produced by WorklogParser#weekHeader.
    def enterWeekHeader(self, ctx:WorklogParser.WeekHeaderContext):
        pass

    # Exit a parse tree produced by WorklogParser#weekHeader.
    def exitWeekHeader(self, ctx:WorklogParser.WeekHeaderContext):
        pass

    # Enter a parse tree produced by WorklogParser#weekBody.
    def enterWeekBody(self, ctx:WorklogParser.WeekBodyContext):
        pass

    # Exit a parse tree produced by WorklogParser#weekBody.
    def exitWeekBody(self, ctx:WorklogParser.WeekBodyContext):
        pass

    # Enter a parse tree produced by WorklogParser#day.
    def enterDay(self, ctx:WorklogParser.DayContext):
        pass

    # Exit a parse tree produced by WorklogParser#day.
    def exitDay(self, ctx:WorklogParser.DayContext):
        pass

    # Enter a parse tree produced by WorklogParser#dayHeader.
    def enterDayHeader(self, ctx:WorklogParser.DayHeaderContext):
        pass

    # Exit a parse tree produced by WorklogParser#dayHeader.
    def exitDayHeader(self, ctx:WorklogParser.DayHeaderContext):
        pass

    # Enter a parse tree produced by WorklogParser#dayInfo.
    def enterDayInfo(self, ctx:WorklogParser.DayInfoContext):
        pass

    # Exit a parse tree produced by WorklogParser#dayInfo.
    def exitDayInfo(self, ctx:WorklogParser.DayInfoContext):
        pass

    # Enter a parse tree produced by WorklogParser#start.
    def enterStart(self, ctx:WorklogParser.StartContext):
        pass

    # Exit a parse tree produced by WorklogParser#start.
    def exitStart(self, ctx:WorklogParser.StartContext):
        pass

    # Enter a parse tree produced by WorklogParser#extra.
    def enterExtra(self, ctx:WorklogParser.ExtraContext):
        pass

    # Exit a parse tree produced by WorklogParser#extra.
    def exitExtra(self, ctx:WorklogParser.ExtraContext):
        pass

    # Enter a parse tree produced by WorklogParser#lunch.
    def enterLunch(self, ctx:WorklogParser.LunchContext):
        pass

    # Exit a parse tree produced by WorklogParser#lunch.
    def exitLunch(self, ctx:WorklogParser.LunchContext):
        pass

    # Enter a parse tree produced by WorklogParser#stop.
    def enterStop(self, ctx:WorklogParser.StopContext):
        pass

    # Exit a parse tree produced by WorklogParser#stop.
    def exitStop(self, ctx:WorklogParser.StopContext):
        pass

    # Enter a parse tree produced by WorklogParser#dayBody.
    def enterDayBody(self, ctx:WorklogParser.DayBodyContext):
        pass

    # Exit a parse tree produced by WorklogParser#dayBody.
    def exitDayBody(self, ctx:WorklogParser.DayBodyContext):
        pass

    # Enter a parse tree produced by WorklogParser#additionalLine.
    def enterAdditionalLine(self, ctx:WorklogParser.AdditionalLineContext):
        pass

    # Exit a parse tree produced by WorklogParser#additionalLine.
    def exitAdditionalLine(self, ctx:WorklogParser.AdditionalLineContext):
        pass

    # Enter a parse tree produced by WorklogParser#wordWithMaybeSpace.
    def enterWordWithMaybeSpace(self, ctx:WorklogParser.WordWithMaybeSpaceContext):
        pass

    # Exit a parse tree produced by WorklogParser#wordWithMaybeSpace.
    def exitWordWithMaybeSpace(self, ctx:WorklogParser.WordWithMaybeSpaceContext):
        pass
| 30.255474 | 84 | 0.722799 | 465 | 4,145 | 6.425806 | 0.202151 | 0.058233 | 0.097055 | 0.174699 | 0.816265 | 0.494311 | 0.483266 | 0.481593 | 0 | 0 | 0 | 0.001532 | 0.212545 | 4,145 | 136 | 85 | 30.477941 | 0.913909 | 0.385766 | 0 | 0.451613 | 1 | 0 | 0.000402 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.451613 | false | 0.451613 | 0.048387 | 0 | 0.516129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
6256220cbd1d2c08685a329f374fdfe730c2ea18 | 61 | py | Python | elcano/__init__.py | juancroldan/elcano | eb01f41c6262ac622542bdee87f6f501b0f5ed1c | [
"MIT"
] | null | null | null | elcano/__init__.py | juancroldan/elcano | eb01f41c6262ac622542bdee87f6f501b0f5ed1c | [
"MIT"
] | null | null | null | elcano/__init__.py | juancroldan/elcano | eb01f41c6262ac622542bdee87f6f501b0f5ed1c | [
"MIT"
] | null | null | null | from elcano.utils import *
def explore(dataset_fname):
    pass
625c408e508e9a9e13fae884a870a6c45d4f8846 | 49 | py | Python | src/onlyuserclient/api/__init__.py | tangdyy/onlyuserclient | d93b4e1077afda6b58bba002729f6bc89b988c7a | [
"MIT"
] | 2 | 2021-03-12T00:42:13.000Z | 2021-05-24T06:31:13.000Z | src/onlyuserclient/api/__init__.py | tangdyy/onlyuserclient | d93b4e1077afda6b58bba002729f6bc89b988c7a | [
"MIT"
] | null | null | null | src/onlyuserclient/api/__init__.py | tangdyy/onlyuserclient | d93b4e1077afda6b58bba002729f6bc89b988c7a | [
"MIT"
] | null | null | null | from .base import BaseAPI
from .onlyuser import * | 24.5 | 25 | 0.795918 | 7 | 49 | 5.571429 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 49 | 2 | 26 | 24.5 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6286a0397c66a804dac6e5d58f9757b1b948397d | 157 | py | Python | zai-pai-xu-shu-zu-zhong-cha-zhao-shu-zi-lcof.py | tsonglew/leetcode-solution | abce0c36def55a8d3bf86fca531246a29920e771 | [
"Unlicense"
] | null | null | null | zai-pai-xu-shu-zu-zhong-cha-zhao-shu-zi-lcof.py | tsonglew/leetcode-solution | abce0c36def55a8d3bf86fca531246a29920e771 | [
"Unlicense"
] | null | null | null | zai-pai-xu-shu-zu-zhong-cha-zhao-shu-zi-lcof.py | tsonglew/leetcode-solution | abce0c36def55a8d3bf86fca531246a29920e771 | [
"Unlicense"
] | null | null | null | import bisect
from typing import List


class Solution:
    def search(self, nums: List[int], target: int) -> int:
        return bisect.bisect_right(nums, target) - bisect.bisect_left(nums, target)
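The one-liner above counts how many times `target` occurs in the sorted list `nums`: `bisect_right` returns the insertion point after the last occurrence and `bisect_left` the point before the first, so their difference is the count. The same idea as a standalone sketch (the function name here is illustrative):

```python
import bisect


def count_occurrences(nums, target):
    # In a sorted list, the right insertion point minus the left insertion
    # point equals the number of elements equal to target.
    return bisect.bisect_right(nums, target) - bisect.bisect_left(nums, target)


print(count_occurrences([5, 7, 7, 8, 8, 10], 8))  # -> 2
```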
| 39.25 | 81 | 0.700637 | 22 | 157 | 4.909091 | 0.590909 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.165605 | 157 | 3 | 82 | 52.333333 | 0.824427 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
628f9ed3d61513e6634592c82ff1412d1cbb2807 | 43 | py | Python | mvqag/data/__init__.py | VirkSaab/Medical-VQA | 6d77963cc81940fc680a18d931e0d88a3264f5fa | [
"MIT"
] | null | null | null | mvqag/data/__init__.py | VirkSaab/Medical-VQA | 6d77963cc81940fc680a18d931e0d88a3264f5fa | [
"MIT"
] | null | null | null | mvqag/data/__init__.py | VirkSaab/Medical-VQA | 6d77963cc81940fc680a18d931e0d88a3264f5fa | [
"MIT"
] | null | null | null | from .data import *
from .generate import * | 21.5 | 23 | 0.744186 | 6 | 43 | 5.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162791 | 43 | 2 | 23 | 21.5 | 0.888889 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
6570c71502a0a3aade5a125f6ba68d75aef3569a | 47,022 | py | Python | Packs/SophosCentral/Integrations/SophosCentral/SophosCentral_test.py | matan-xmcyber/content | 7f02301c140b35956af3cd20cb8dfc64f34afb3e | [
"MIT"
] | 2 | 2020-07-27T10:35:41.000Z | 2020-12-14T15:44:18.000Z | Packs/SophosCentral/Integrations/SophosCentral/SophosCentral_test.py | matan-xmcyber/content | 7f02301c140b35956af3cd20cb8dfc64f34afb3e | [
"MIT"
] | 48 | 2022-03-08T13:45:00.000Z | 2022-03-31T14:32:05.000Z | Packs/SophosCentral/Integrations/SophosCentral/SophosCentral_test.py | adambaumeister/content | c6808d0b13d00edc4cd6268793c2ae0c2e39aed6 | [
"MIT"
] | 2 | 2019-08-29T10:20:55.000Z | 2019-09-01T12:16:09.000Z | import json
from datetime import datetime

from pytest import raises

from CommonServerPython import DemistoException

BASE_URL = 'https://api-eu02.central.sophos.com'
DATE_FORMAT = '%Y-%m-%dT%H:%M:%S.%fZ'


def load_mock_response(file_name: str) -> dict:
    """
    Load one of the mock responses to be used for assertion.
    Args:
        file_name (str): Name of the mock response JSON file to return.
    """
    with open(f'test_data/{file_name}', mode='r', encoding='utf-8') as json_file:
        return json.loads(json_file.read())


def test_sophos_central_alert_list_command(requests_mock) -> None:
    """
    Scenario: List alerts
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_alert_list is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_alert_list_command
    mock_response = load_mock_response('alert_list.json')
    requests_mock.get(f'{BASE_URL}/common/v1/alerts', json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_alert_list_command(client, {'limit': '14'})
    assert len(result.outputs) == 3
    assert result.outputs_prefix == 'SophosCentral.Alert'
    assert result.outputs[0].get('id') == '56931431-9faf-480c-ba1d-8d7541eae259'


def test_sophos_central_alert_get_command(requests_mock) -> None:
    """
    Scenario: Get a single alert.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_alert_get is called.
    Then:
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_alert_get_command
    mock_response = load_mock_response('alert_single.json')
    alert_id = '56931431-9faf-480c-ba1d-8d7541eae259'
    requests_mock.get(f'{BASE_URL}/common/v1/alerts/{alert_id}', json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_alert_get_command(client, {'alert_id': alert_id})
    assert result.outputs_prefix == 'SophosCentral.Alert'
    assert result.outputs.get('id') == '70e3781d-c0f6-4e72-b6aa-3c3ef21f3dbb'


def test_sophos_central_alert_action_command(requests_mock) -> None:
    """
    Scenario: Take an action against one or more alerts.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_alert_action is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_alert_action_command
    mock_response = load_mock_response('alert_action.json')
    alert_id = '56931431-9faf-480c-ba1d-8d7541eae259'
    requests_mock.post(f'{BASE_URL}/common/v1/alerts/{alert_id}/actions', json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_alert_action_command(client, {'alert_id': alert_id,
                                                          'action': 'clearThreat', 'message': 'b'})
    assert len(result.outputs) == 1
    assert result.outputs_prefix == 'SophosCentral.AlertAction'
    assert result.outputs[0].get('alertId') == '25c7b132-56d8-4bce-9d1b-6c51a7eb3c78'

    alert_ids = ['56931431-9faf-480c-ba1d-8d7541eae259'] * 3
    result = sophos_central_alert_action_command(client, {'alert_id': alert_ids,
                                                          'action': 'clearThreat', 'message': 'b'})
    assert len(result.outputs) == 3


def test_sophos_central_alert_search_command(requests_mock) -> None:
    """
    Scenario: Search for specific alerts.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_alert_search is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_alert_search_command
    mock_response = load_mock_response('alert_list.json')
    requests_mock.post(f'{BASE_URL}/common/v1/alerts/search', json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_alert_search_command(client, {'limit': '14'})
    assert len(result.outputs) == 3
    assert result.outputs_prefix == 'SophosCentral.Alert'
    assert result.outputs[0].get('id') == '56931431-9faf-480c-ba1d-8d7541eae259'


def test_sophos_central_endpoint_list_command(requests_mock) -> None:
    """
    Scenario: List endpoints.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_endpoint_scan is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_endpoint_list_command
    mock_response = load_mock_response('endpoint_list.json')
    requests_mock.get(f'{BASE_URL}/endpoint/v1/endpoints', json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_endpoint_list_command(client, {'limit': '17'})
    assert len(result.outputs) == 2
    assert result.outputs_prefix == 'SophosCentral.Endpoint'
    assert result.outputs[0].get('id') == '6e9567ea-bb50-40c5-9f12-42eb308e4c9b'


def test_sophos_central_endpoint_scan_command(requests_mock) -> None:
    """
    Scenario: Scan one or more endpoints.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_endpoint_scan is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_endpoint_scan_command
    mock_response = load_mock_response('endpoint_scan.json')
    endpoint_id = '6e9567ea-bb50-40c5-9f12-42eb308e4c9b'
    requests_mock.post(f'{BASE_URL}/endpoint/v1/endpoints/{endpoint_id}/scans', json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_endpoint_scan_command(client, {'endpoint_id': endpoint_id})
    assert len(result.outputs) == 1
    assert result.outputs_prefix == 'SophosCentral.EndpointScan'
    assert result.outputs[0].get('id') == '6e9567ea-bb50-40c5-9f12-42eb308e4c9b'

    endpoint_ids = ['6e9567ea-bb50-40c5-9f12-42eb308e4c9b'] * 3
    result = sophos_central_endpoint_scan_command(client, {'endpoint_id': endpoint_ids})
    assert len(result.outputs) == 3


def test_sophos_central_endpoint_tamper_get_command(requests_mock) -> None:
    """
    Scenario: Get tamper protection information for one or more endpoints.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_endpoint_tamper_get is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_endpoint_tamper_get_command
    mock_response = load_mock_response('endpoint_tamper.json')
    endpoint_id = '6e9567ea-bb50-40c5-9f12-42eb308e4c9b'
    requests_mock.get(f'{BASE_URL}/endpoint/v1/endpoints/{endpoint_id}/tamper-protection',
                      json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_endpoint_tamper_get_command(client, {'endpoint_id': endpoint_id,
                                                                 'get_password': True})
    assert len(result.outputs) == 1
    assert result.outputs_prefix == 'SophosCentral.EndpointTamper'
    assert result.outputs[0].get('password') == '1234567890'

    endpoint_ids = ['6e9567ea-bb50-40c5-9f12-42eb308e4c9b'] * 3
    result = sophos_central_endpoint_tamper_get_command(client, {'endpoint_id': endpoint_ids,
                                                                 'get_password': True})
    assert len(result.outputs) == 3


def test_sophos_central_endpoint_tamper_update_command(requests_mock) -> None:
    """
    Scenario: Update tamper protection information for one or more endpoints.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_endpoint_tamper_update is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_endpoint_tamper_update_command
    mock_response = load_mock_response('endpoint_tamper.json')
    endpoint_id = '6e9567ea-bb50-40c5-9f12-42eb308e4c9b'
    requests_mock.post(f'{BASE_URL}/endpoint/v1/endpoints/{endpoint_id}/tamper-protection',
                       json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_endpoint_tamper_update_command(client, {'endpoint_id': endpoint_id,
                                                                    'get_password': True})
    assert len(result.outputs) == 1
    assert result.outputs_prefix == 'SophosCentral.EndpointTamper'
    assert result.outputs[0].get('password') == '1234567890'

    endpoint_ids = ['6e9567ea-bb50-40c5-9f12-42eb308e4c9b'] * 3
    result = sophos_central_endpoint_tamper_update_command(client, {'endpoint_id': endpoint_ids,
                                                                    'get_password': True})
    assert len(result.outputs) == 3


def test_sophos_central_allowed_item_list_command(requests_mock) -> None:
    """
    Scenario: List allowed items.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_allowed_item_list is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_allowed_item_list_command
    mock_response = load_mock_response('allowed_item_list.json')
    requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/allowed-items',
                      json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_allowed_item_list_command(client, {'page_size': '30',
                                                               'page': '1'})
    assert len(result.outputs) == 3
    assert result.outputs_prefix == 'SophosCentral.AllowedItem'
    assert result.outputs[0].get('id') == 'a28c7ee1-8ad9-4b5c-8f15-4d913436ce18'


def test_sophos_central_allowed_item_get_command(requests_mock) -> None:
    """
    Scenario: Get a single allowed item.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_allowed_item_get is called.
    Then:
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_allowed_item_get_command
    mock_response = load_mock_response('allowed_item_single.json')
    allowed_item_id = 'a28c7ee1-8ad9-4b5c-8f15-4d913436ce18'
    requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/allowed-items/{allowed_item_id}',
                      json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_allowed_item_get_command(client, {'allowed_item_id': allowed_item_id})
    assert result.outputs_prefix == 'SophosCentral.AllowedItem'
    assert result.outputs.get('id') == '811fa316-d485-4499-a979-3e1c0a89f1fd'


def test_sophos_central_allowed_item_add_command(requests_mock) -> None:
    """
    Scenario: Add an allowed item.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_allowed_item_add is called.
    Then:
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_allowed_item_add_command
    mock_response = load_mock_response('allowed_item_single.json')
    requests_mock.post(f'{BASE_URL}/endpoint/v1/settings/allowed-items',
                       json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_allowed_item_add_command(client, {})
    assert result.outputs_prefix == 'SophosCentral.AllowedItem'
    assert result.outputs.get('id') == '811fa316-d485-4499-a979-3e1c0a89f1fd'


def test_sophos_central_allowed_item_update_command(requests_mock) -> None:
    """
    Scenario: Update an existing allowed item.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_allowed_item_update is called.
    Then:
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_allowed_item_update_command
    mock_response = load_mock_response('allowed_item_single.json')
    allowed_item_id = 'a28c7ee1-8ad9-4b5c-8f15-4d913436ce18'
    requests_mock.patch(f'{BASE_URL}/endpoint/v1/settings/allowed-items/{allowed_item_id}',
                        json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_allowed_item_update_command(client,
                                                        {'allowed_item_id': allowed_item_id})
    assert result.outputs_prefix == 'SophosCentral.AllowedItem'
    assert result.outputs.get('id') == '811fa316-d485-4499-a979-3e1c0a89f1fd'


def test_sophos_central_allowed_item_delete_command(requests_mock) -> None:
    """
    Scenario: Delete an existing allowed item.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_allowed_item_delete is called.
    Then:
    - Ensure the output is correct.
    - Ensure outputs prefix is correct.
    """
    from SophosCentral import Client, sophos_central_allowed_item_delete_command
    mock_response = load_mock_response('deleted.json')
    allowed_item_id = 'a28c7ee1-8ad9-4b5c-8f15-4d913436ce18'
    requests_mock.delete(f'{BASE_URL}/endpoint/v1/settings/allowed-items/{allowed_item_id}',
                         json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_allowed_item_delete_command(client,
                                                        {'allowed_item_id': allowed_item_id})
    assert result.outputs == {'deletedItemId': allowed_item_id}
    assert result.outputs_prefix == 'SophosCentral.DeletedAllowedItem'


def test_sophos_central_blocked_item_list_command(requests_mock) -> None:
    """
    Scenario: List blocked items.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_blocked_item_list is called.
    Then:
    - Ensure number of items is correct.
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_blocked_item_list_command
    mock_response = load_mock_response('blocked_item_list.json')
    requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/blocked-items',
                      json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_blocked_item_list_command(client, {'page_size': '30',
                                                               'page': '1'})
    assert len(result.outputs) == 3
    assert result.outputs_prefix == 'SophosCentral.BlockedItem'
    assert result.outputs[0].get('id') == '6b0d0fb1-4254-45b0-896a-2eb36d0e2368'


def test_sophos_central_blocked_item_get_command(requests_mock) -> None:
    """
    Scenario: Get a single blocked item.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_blocked_item_get is called.
    Then:
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_blocked_item_get_command
    mock_response = load_mock_response('blocked_item_single.json')
    blocked_item_id = 'a28c7ee1-8ad9-4b5c-8f15-4d913436ce18'
    requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/blocked-items/{blocked_item_id}',
                      json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_blocked_item_get_command(client, {'blocked_item_id': blocked_item_id})
    assert result.outputs_prefix == 'SophosCentral.BlockedItem'
    assert result.outputs.get('id') == '998ffd3d-4a44-40da-8c1f-b18ace4ff735'


def test_sophos_central_blocked_item_add_command(requests_mock) -> None:
    """
    Scenario: Add a new blocked item.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_blocked_item_add is called.
    Then:
    - Ensure outputs prefix is correct.
    - Ensure a sample value from the API matches what is generated in the context.
    """
    from SophosCentral import Client, sophos_central_blocked_item_add_command
    mock_response = load_mock_response('blocked_item_single.json')
    requests_mock.post(f'{BASE_URL}/endpoint/v1/settings/blocked-items',
                       json=mock_response)
    mock_client_data = load_mock_response('client_data.json')
    requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
    client = Client(bearer_token='a', verify=False, client_id='a',
                    client_secret='b', proxy=False)
    result = sophos_central_blocked_item_add_command(client, {})
    assert result.outputs_prefix == 'SophosCentral.BlockedItem'
    assert result.outputs.get('id') == '998ffd3d-4a44-40da-8c1f-b18ace4ff735'


def test_sophos_central_blocked_item_delete_command(requests_mock) -> None:
    """
    Scenario: Delete an existing blocked item.
    Given:
    - User has provided valid credentials.
    - Headers and JWT token have been set.
    When:
    - sophos_central_blocked_item_delete is called.
    Then:
    - Ensure the output is correct.
    - Ensure outputs prefix is correct.
    """
from SophosCentral import Client, sophos_central_blocked_item_delete_command
mock_response = load_mock_response('deleted.json')
blocked_item_id = 'a28c7ee1-8ad9-4b5c-8f15-4d913436ce18'
requests_mock.delete(f'{BASE_URL}/endpoint/v1/settings/blocked-items/{blocked_item_id}',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_blocked_item_delete_command(client,
{'blocked_item_id': blocked_item_id})
assert result.outputs == {'deletedItemId': blocked_item_id}
assert result.outputs_prefix == 'SophosCentral.DeletedBlockedItem'
def test_sophos_central_scan_exclusion_list_command(requests_mock) -> None:
"""
Scenario: List scan exclusions.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_scan_exclusion_list is called.
Then:
- Ensure number of items is correct.
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_scan_exclusion_list_command
mock_response = load_mock_response('scan_exclusion_list.json')
requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/exclusions/scanning',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_scan_exclusion_list_command(client, {'page_size': '30',
'page': '1'})
assert len(result.outputs) == 3
assert result.outputs_prefix == 'SophosCentral.ScanExclusion'
assert result.outputs[0].get('id') == '369b0956-a7b6-44fc-b1cc-bd7b3279c663'
def test_sophos_central_scan_exclusion_get_command(requests_mock) -> None:
"""
Scenario: Get a single scan exclusion.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_scan_exclusion_get is called.
Then:
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_scan_exclusion_get_command
mock_response = load_mock_response('scan_exclusion_single.json')
scan_exclusion_id = '16bac29f-17a4-4c3a-9370-8c5968c5ac7d'
requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/exclusions/scanning/{scan_exclusion_id}',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_scan_exclusion_get_command(client, {'exclusion_id': scan_exclusion_id})
assert result.outputs_prefix == 'SophosCentral.ScanExclusion'
assert result.outputs.get('id') == '16bac29f-17a4-4c3a-9370-8c5968c5ac7d'
def test_sophos_central_scan_exclusion_add_command(requests_mock) -> None:
"""
Scenario: Add a new scan exclusion.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_scan_exclusion_add is called.
Then:
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_scan_exclusion_add_command
mock_response = load_mock_response('scan_exclusion_single.json')
requests_mock.post(f'{BASE_URL}/endpoint/v1/settings/exclusions/scanning',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_scan_exclusion_add_command(client, {})
assert result.outputs_prefix == 'SophosCentral.ScanExclusion'
assert result.outputs.get('id') == '16bac29f-17a4-4c3a-9370-8c5968c5ac7d'
def test_sophos_central_scan_exclusion_update_command(requests_mock) -> None:
"""
Scenario: Update an existing scan exclusion.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_scan_exclusion_update is called.
Then:
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_scan_exclusion_update_command
mock_response = load_mock_response('scan_exclusion_single.json')
scan_exclusion_id = '16bac29f-17a4-4c3a-9370-8c5968c5ac7d'
requests_mock.patch(f'{BASE_URL}/endpoint/v1/settings/exclusions/scanning/{scan_exclusion_id}',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_scan_exclusion_update_command(client,
{'exclusion_id': scan_exclusion_id})
assert result.outputs_prefix == 'SophosCentral.ScanExclusion'
assert result.outputs.get('id') == '16bac29f-17a4-4c3a-9370-8c5968c5ac7d'
def test_sophos_central_scan_exclusion_delete_command(requests_mock) -> None:
"""
Scenario: Delete an existing scan exclusion.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_scan_exclusion_delete is called.
Then:
- Ensure the output is correct.
- Ensure outputs prefix is correct.
"""
from SophosCentral import Client, sophos_central_scan_exclusion_delete_command
mock_response = load_mock_response('deleted.json')
scan_exclusion_id = '16bac29f-17a4-4c3a-9370-8c5968c5ac7d'
requests_mock.delete(f'{BASE_URL}/endpoint/v1/settings/exclusions/scanning/{scan_exclusion_id}',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_scan_exclusion_delete_command(client,
{'exclusion_id': scan_exclusion_id})
assert result.outputs == {'deletedExclusionId': scan_exclusion_id}
assert result.outputs_prefix == 'SophosCentral.DeletedScanExclusion'
def test_sophos_central_exploit_mitigation_list_command(requests_mock) -> None:
"""
Scenario: List all exploit mitigations.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_exploit_mitigation_list is called.
Then:
- Ensure number of items is correct.
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_exploit_mitigation_list_command
mock_response = load_mock_response('exploit_mitigation_list.json')
requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/exploit-mitigation/applications',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_exploit_mitigation_list_command(client, {'page_size': '30',
'page': '1'})
assert len(result.outputs) == 3
assert result.outputs_prefix == 'SophosCentral.ExploitMitigation'
assert result.outputs[0].get('id') == '30fbb4cf-2961-4ffc-937e-97c57f468838'
def test_sophos_central_exploit_mitigation_get_command(requests_mock) -> None:
"""
Scenario: Get a single exploit mitigation.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_exploit_mitigation_get is called.
Then:
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_exploit_mitigation_get_command
mock_response = load_mock_response('exploit_mitigation_single.json')
exploit_id = 'c2824651-26c1-4470-addf-7b6bb6ac90b4'
requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/'
f'exploit-mitigation/applications/{exploit_id}', json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_exploit_mitigation_get_command(client, {'mitigation_id': exploit_id})
assert result.outputs_prefix == 'SophosCentral.ExploitMitigation'
assert result.outputs.get('id') == 'c2824651-26c1-4470-addf-7b6bb6ac90b4'
def test_sophos_central_exploit_mitigation_add_command(requests_mock) -> None:
"""
Scenario: Add a new exploit mitigation.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_exploit_mitigation_add is called.
Then:
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_exploit_mitigation_add_command
mock_response = load_mock_response('exploit_mitigation_single.json')
exploit_id = 'c2824651-26c1-4470-addf-7b6bb6ac90b4'
requests_mock.post(f'{BASE_URL}/endpoint/v1/settings/exploit-mitigation/applications',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_exploit_mitigation_add_command(client, {'mitigation_id': exploit_id})
assert result.outputs_prefix == 'SophosCentral.ExploitMitigation'
assert result.outputs.get('id') == 'c2824651-26c1-4470-addf-7b6bb6ac90b4'
def test_sophos_central_exploit_mitigation_update_command(requests_mock) -> None:
"""
Scenario: Update an existing exploit mitigation.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_exploit_mitigation_update is called.
Then:
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_exploit_mitigation_update_command
mock_response = load_mock_response('exploit_mitigation_single.json')
exploit_id = 'c2824651-26c1-4470-addf-7b6bb6ac90b4'
requests_mock.patch(f'{BASE_URL}/endpoint/v1/settings/'
f'exploit-mitigation/applications/{exploit_id}',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_exploit_mitigation_update_command(client, {'mitigation_id': exploit_id})
assert result.outputs_prefix == 'SophosCentral.ExploitMitigation'
assert result.outputs.get('id') == 'c2824651-26c1-4470-addf-7b6bb6ac90b4'
def test_sophos_central_exploit_mitigation_delete_command(requests_mock) -> None:
"""
Scenario: Delete an existing exploit mitigation.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_exploit_mitigation_delete is called.
Then:
- Ensure the output is correct.
- Ensure outputs prefix is correct.
"""
from SophosCentral import Client, sophos_central_exploit_mitigation_delete_command
mock_response = load_mock_response('deleted.json')
exploit_id = 'c2824651-26c1-4470-addf-7b6bb6ac90b4'
requests_mock.delete(f'{BASE_URL}/endpoint/v1/settings/'
f'exploit-mitigation/applications/{exploit_id}',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_exploit_mitigation_delete_command(client, {'mitigation_id': exploit_id})
assert result.outputs == {'deletedMitigationId': exploit_id}
assert result.outputs_prefix == 'SophosCentral.DeletedExploitMitigation'
def test_sophos_central_detected_exploit_list_command(requests_mock) -> None:
"""
Scenario: List all detected exploits.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_detected_exploit_list is called.
Then:
- Ensure number of items is correct.
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_detected_exploit_list_command
mock_response = load_mock_response('detected_exploit_list.json')
requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/exploit-mitigation/detected-exploits',
json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_detected_exploit_list_command(client, {'page_size': '30',
'page': '1'})
assert len(result.outputs) == 3
assert result.outputs_prefix == 'SophosCentral.DetectedExploit'
assert result.outputs[0].get('id') == 'b81aac51-2fc0-ab6a-asdf-7b6bb6ac90b4'
def test_sophos_central_detected_exploit_get_command(requests_mock) -> None:
"""
Scenario: Get a single detected exploit.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- sophos_central_detected_exploit_get is called.
Then:
- Ensure outputs prefix is correct.
- Ensure a sample value from the API matches what is generated in the context.
"""
from SophosCentral import Client, sophos_central_detected_exploit_get_command
mock_response = load_mock_response('detected_exploit_single.json')
exploit_id = 'b81aac51-2fc0-ab6a-asdf-7b6bb6ac90b4'
requests_mock.get(f'{BASE_URL}/endpoint/v1/settings/'
f'exploit-mitigation/detected-exploits/{exploit_id}', json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
result = sophos_central_detected_exploit_get_command(client,
{'detected_exploit_id': exploit_id})
assert result.outputs_prefix == 'SophosCentral.DetectedExploit'
assert result.outputs.get('id') == 'b81aac51-2fc0-ab6a-asdf-7b6bb6ac90b4'
def test_retrieve_jwt_token(requests_mock) -> None:
"""
Scenario: Get a JWT token with or without a saved one in the integration context.
Given:
- User has provided valid credentials.
When:
- Every time before a command is run.
Then:
- Ensure the JWT token is correct (same as either the mocked integration context or the mocked response).
"""
from SophosCentral import retrieve_jwt_token
mock_response = load_mock_response('auth_token.json')
requests_mock.post('https://id.sophos.com/api/v2/oauth2/token', json=mock_response)
result = retrieve_jwt_token('a', 'b', {})
assert result == 'xxxxxxx'
result = retrieve_jwt_token('a', 'b', {'bearer_token': 'aaaa', 'valid_until': 999999999999999})
assert result == 'aaaa'
def test_get_client_data(requests_mock) -> None:
"""
Scenario: Get the client data before executing a command.
Given:
- User has provided valid credentials.
- JWT token has been returned by retrieve_jwt_token().
When:
- Every time after retrieve_jwt_token() and before any command.
Then:
- Ensure base URL is correct according to mock response.
- Ensure headers are correct according to given fake JWT token and mock response.
"""
from SophosCentral import Client
mock_response = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_response)
headers, base_url = Client.get_client_data('aaaa')
assert base_url == 'https://api-eu02.central.sophos.com/'
assert headers == {'Authorization': 'Bearer aaaa',
'X-Tenant-ID': '11f104c5-cc4a-4a9f-bb9c-632c936dfb9f'}
def test_fetch_incidents(requests_mock) -> None:
"""
Scenario: Fetch incidents.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- Every time fetch_incidents is called (either timed or by command).
Then:
- Ensure number of incidents is correct.
- Ensure last_fetch is correctly configured according to mock response.
"""
from SophosCentral import Client, fetch_incidents
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
mock_response = load_mock_response('alert_list.json')
requests_mock.post(f'{BASE_URL}/common/v1/alerts/search', json=mock_response)
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
last_fetch, incidents = fetch_incidents(client, {'last_fetch': 1}, '1 days', ['x'], ['x'], '50')
wanted_time = datetime.timestamp(datetime.strptime('2020-11-04T09:31:19.895Z', DATE_FORMAT))
assert last_fetch.get('last_fetch') == wanted_time * 1000
assert len(incidents) == 3
assert incidents[0].get('name') == 'Sophos Central Alert 56931431-9faf-480c-ba1d-8d7541eae259'
def test_fetch_incidents_no_last_fetch(requests_mock):
"""
Scenario: Fetch incidents for the first time, so there is no last_fetch available.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
- First time running fetch incidents.
When:
- Every time fetch_incidents is called (either timed or by command).
Then:
- Ensure number of incidents is correct.
- Ensure last_fetch is correctly configured according to mock response.
"""
from SophosCentral import Client, fetch_incidents
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
mock_response = load_mock_response('alert_list.json')
requests_mock.post(f'{BASE_URL}/common/v1/alerts/search', json=mock_response)
last_fetch, incidents = fetch_incidents(client, {}, '12 years', ['x'], ['x'], '50')
wanted_time = datetime.timestamp(datetime.strptime('2020-11-04T09:31:19.895Z', DATE_FORMAT))
assert last_fetch.get('last_fetch') == wanted_time * 1000
assert len(incidents) == 3
assert incidents[0].get('name') == 'Sophos Central Alert 56931431-9faf-480c-ba1d-8d7541eae259'
def test_fetch_incidents_empty_response(requests_mock):
"""
Scenario: Fetch incidents but there are no incidents to return.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- Every time fetch_incidents is called (either timed or by command).
- There are no incidents to return.
Then:
- Ensure the number of incidents is correct (zero).
- Ensure last_fetch is correctly configured according to mock response.
"""
from SophosCentral import Client, fetch_incidents
mock_client_data = load_mock_response('client_data.json')
requests_mock.get('https://api.central.sophos.com/whoami/v1', json=mock_client_data)
client = Client(bearer_token='a', verify=False, client_id='a',
client_secret='b', proxy=False)
mock_response = load_mock_response('empty.json')
requests_mock.post(f'{BASE_URL}/common/v1/alerts/search', json=mock_response)
last_fetch, incidents = fetch_incidents(client, {'last_fetch': 100000000}, '3 days', ['x'],
['x'], '50')
assert last_fetch.get('last_fetch') == 100000001
assert len(incidents) == 0
def test_validate_item_fields() -> None:
"""
Scenario: Validate arguments for creating / updating items before sending to API.
Given:
- User has provided valid credentials.
- Headers and JWT token have been set.
When:
- add_item or update_item is called.
Then:
- Ensure valid arguments do not raise an error (e.g. a certificateSigner item type with a
value in the corresponding certificate_signer field).
- Ensure invalid arguments do raise an error (e.g. a certificateSigner item type
without a corresponding certificate_signer value).
"""
from SophosCentral import validate_item_fields
args = {'item_type': 'certificateSigner', 'certificate_signer': 'xxx'}
validate_item_fields(args)
args = {'item_type': 'certificateSigner', 'path': 'xxx'}
with raises(DemistoException):
validate_item_fields(args)
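Every test above loads its fixture through the `load_mock_response` helper, which is defined earlier in this file and not shown in this excerpt. A minimal sketch of such a helper is below; the `test_data/` directory name and the exact signature are assumptions for illustration, not the file's actual implementation.

```python
import json
import os


def load_mock_response(file_name: str) -> dict:
    """Load a mock API response from the local test_data directory.

    Sketch only: the real helper in this file may resolve paths or handle
    encodings differently.
    """
    path = os.path.join('test_data', file_name)
    with open(path, encoding='utf-8') as mock_file:
        return json.loads(mock_file.read())
```

Keeping fixtures as JSON files next to the tests lets the same payload drive both the requests_mock response and the context-output assertion.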
# ==== iotfunctions/dbtables.py (repo: TheTheseus/functions, license: Apache-2.0) ====
# *****************************************************************************
# © Copyright IBM Corp. 2020. All Rights Reserved.
#
# This program and the accompanying materials
# are made available under the terms of the Apache V2.0
# which accompanies this distribution, and is available at
# http://www.apache.org/licenses/LICENSE-2.0
#
# *****************************************************************************
import logging
import os
from pathlib import Path
import dill as pickle
import ibm_db
import pandas as pd
import psycopg2
import pyarrow
import pyarrow.parquet
from iotfunctions import dbhelper
logger = logging.getLogger(__name__)
class DBDataCache:
PARQUET_DIRECTORY = 'parquet'
CACHE_TABLENAME = 'KPI_DATA_CACHE'
CACHE_FILE_STEM = 'df_parquet'
def __init__(self, tenant_id, entity_type_id, schema, db_connection, db_type):
self.tenant_id = tenant_id
self.entity_type_id = entity_type_id
self.schema = schema
self.db_connection = db_connection
self.db_type = db_type
if self.db_type == 'db2':
self.is_postgre_sql = False
self.schema = schema.upper()
self.cache_tablename = DBDataCache.CACHE_TABLENAME.upper()
elif self.db_type == 'postgresql':
self.is_postgre_sql = True
self.schema = schema.lower()
self.cache_tablename = DBDataCache.CACHE_TABLENAME.lower()
else:
raise Exception('Initialization of %s failed because the database type %s is unknown.' % (
self.__class__.__name__, self.db_type))
self.quoted_schema = dbhelper.quotingSchemaName(self.schema, self.is_postgre_sql)
self.quoted_cache_tablename = dbhelper.quotingTableName(self.cache_tablename, self.is_postgre_sql)
self._handle_cache_table()
def _create_cache_table(self):
if not self.is_postgre_sql:
sql_statement = "CREATE TABLE %s.%s ( " \
"ENTITY_TYPE_ID BIGINT NOT NULL, " \
"PARQUET_NAME VARCHAR(2048) NOT NULL, " \
"PARQUET_FILE BLOB(2G), " \
"UPDATED_TS TIMESTAMP NOT NULL DEFAULT CURRENT TIMESTAMP, " \
"CONSTRAINT %s UNIQUE(ENTITY_TYPE_ID, PARQUET_NAME) ENFORCED ) " \
"ORGANIZE BY ROW" % (self.quoted_schema, self.quoted_cache_tablename,
dbhelper.quotingTableName('uc_%s' % self.cache_tablename,
self.is_postgre_sql))
try:
stmt = ibm_db.exec_immediate(self.db_connection, sql_statement)
ibm_db.free_result(stmt)
except Exception as ex:
raise Exception('Execution of sql statement "%s" failed.' % sql_statement) from ex
else:
sql_statement = "CREATE TABLE %s.%s ( " \
"entity_type_id BIGINT NOT NULL, " \
"parquet_name VARCHAR(2048) NOT NULL, " \
"parquet_file BYTEA, " \
"updated_ts TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP, " \
"CONSTRAINT %s UNIQUE(entity_type_id, parquet_name))" % (
self.quoted_schema, self.quoted_cache_tablename,
dbhelper.quotingTableName('uc_%s' % self.cache_tablename, self.is_postgre_sql))
try:
dbhelper.execute_postgre_sql_query(self.db_connection, sql_statement)
except Exception as ex:
raise Exception('Execution of sql statement "%s" failed.' % sql_statement) from ex
logger.info('Table %s.%s has been created.' % (self.quoted_schema, self.quoted_cache_tablename))
def _cache_table_exists(self):
exists = False
try:
if not self.is_postgre_sql:
stmt = ibm_db.tables(self.db_connection, None, self.schema, self.cache_tablename, None)
try:
fetch_value = ibm_db.fetch_row(stmt, 0)
if fetch_value:
exists = True
finally:
ibm_db.free_result(stmt)
else:
exists = dbhelper.check_table_exist(self.db_connection, self.db_type, self.schema, self.cache_tablename)
except Exception as ex:
raise Exception(
'Error while probing for table %s.%s' % (self.quoted_schema, self.quoted_cache_tablename)) from ex
logger.debug('Table %s.%s %s.' % (
self.quoted_schema, self.quoted_cache_tablename, 'exists' if exists else 'does not exist'))
return exists
def _handle_cache_table(self):
if not self._cache_table_exists():
self._create_cache_table()
def _push_cache(self, cache_filename, cache_pathname):
if not self.is_postgre_sql:
sql_statement = "MERGE INTO %s.%s AS TARGET " \
"USING (VALUES (?, ?, ?, CURRENT_TIMESTAMP)) " \
"AS SOURCE (ENTITY_TYPE_ID, PARQUET_NAME, PARQUET_FILE, UPDATED_TS) " \
"ON TARGET.ENTITY_TYPE_ID = SOURCE.ENTITY_TYPE_ID " \
"AND TARGET.PARQUET_NAME = SOURCE.PARQUET_NAME " \
"WHEN MATCHED THEN " \
"UPDATE SET TARGET.PARQUET_FILE = SOURCE.PARQUET_FILE, " \
"TARGET.UPDATED_TS = SOURCE.UPDATED_TS " \
"WHEN NOT MATCHED THEN " \
"INSERT (ENTITY_TYPE_ID, PARQUET_NAME, PARQUET_FILE, UPDATED_TS) " \
"VALUES (SOURCE.ENTITY_TYPE_ID, SOURCE.PARQUET_NAME, SOURCE.PARQUET_FILE, " \
"SOURCE.UPDATED_TS)" % (self.quoted_schema, self.quoted_cache_tablename)
try:
stmt = ibm_db.prepare(self.db_connection, sql_statement)
try:
ibm_db.bind_param(stmt, 1, self.entity_type_id)
ibm_db.bind_param(stmt, 2, cache_filename)
ibm_db.bind_param(stmt, 3, cache_pathname, ibm_db.PARAM_FILE, ibm_db.SQL_BLOB)
ibm_db.execute(stmt)
finally:
ibm_db.free_result(stmt)
except Exception as ex:
raise Exception('Storing cache file %s under name %s failed with sql statement "%s"' % (
cache_pathname, cache_filename, sql_statement)) from ex
else:
try:
f = open(cache_pathname, 'rb')
try:
blob = f.read()
finally:
f.close()
except Exception as ex:
raise Exception('The cache file %s could not be read from disc.' % cache_pathname) from ex
else:
statement1 = "INSERT INTO %s.%s (entity_type_id, parquet_name, parquet_file, updated_ts) " % (
self.quoted_schema, self.quoted_cache_tablename)
statement3 = "ON CONFLICT ON CONSTRAINT %s DO update set entity_type_id = EXCLUDED.entity_type_id, " \
"parquet_name = EXCLUDED.parquet_name, parquet_file = EXCLUDED.parquet_file, " \
"updated_ts = EXCLUDED.updated_ts" % dbhelper.quotingTableName(
('uc_%s' % self.cache_tablename), self.is_postgre_sql)
sql_statement = statement1 + " values (%s, %s, %s, current_timestamp) " + statement3
try:
dbhelper.execute_postgre_sql_query(self.db_connection, sql_statement,
(self.entity_type_id, cache_filename, psycopg2.Binary(blob)))
except Exception as ex:
raise Exception('Storing cache under name %s failed with sql statement "%s"' % (
cache_filename, sql_statement)) from ex
logger.info('Cache has been stored under name %s in table %s.%s' % (
cache_filename, self.quoted_schema, self.quoted_cache_tablename))
def _get_cache(self, cache_filename, cache_pathname):
# Remove file on disc if there is one
try:
if os.path.exists(cache_pathname):
os.remove(cache_pathname)
except Exception as ex:
raise Exception('Removal of old cache file %s failed.' % cache_pathname) from ex
if not self.is_postgre_sql:
sql_statement = "SELECT PARQUET_FILE FROM %s.%s WHERE ENTITY_TYPE_ID = ? AND PARQUET_NAME = ?" % (
self.quoted_schema, self.quoted_cache_tablename)
stmt = ibm_db.prepare(self.db_connection, sql_statement)
try:
ibm_db.bind_param(stmt, 1, self.entity_type_id)
ibm_db.bind_param(stmt, 2, cache_filename)
ibm_db.execute(stmt)
row = ibm_db.fetch_tuple(stmt)
if row is False:
row = None
except Exception as ex:
raise Exception(
'Retrieval of cache %s failed with sql statement "%s"' % (cache_filename, sql_statement)) from ex
finally:
ibm_db.free_result(stmt)
else:
sql_statement = 'SELECT parquet_file FROM %s.%s' % (self.quoted_schema, self.quoted_cache_tablename)
sql_statement += ' WHERE entity_type_id = %s AND parquet_name = %s'
try:
row = dbhelper.execute_postgre_sql_select_query(self.db_connection, sql_statement,
(self.entity_type_id, cache_filename),
fetch_one_only=True)
except Exception as ex:
raise Exception(
'Retrieval of cache %s failed with sql statement "%s"' % (cache_filename, sql_statement)) from ex
cache_found = False
if row is not None:
cache_found = True
parquet = row[0]
if parquet is not None and len(parquet) > 0:
try:
f = open(cache_pathname, "wb")
try:
f.write(parquet)
logger.info('Cache %s has been retrieved from table %s.%s and stored under %s' % (
cache_filename, self.quoted_schema, self.quoted_cache_tablename, cache_pathname))
finally:
f.close()
except Exception as ex:
raise Exception('Writing cache file %s to disc failed.' % cache_pathname) from ex
else:
logger.info('The cache %s is empty' % cache_filename)
else:
logger.info('No cache found for %s' % cache_filename)
return cache_found

    def _get_cache_filename(self, dep_grain, grain, old_name=False):
        # Create local path for cache file on disk.
        base_path = '%s/%s/%d' % (DBDataCache.PARQUET_DIRECTORY, self.tenant_id, self.entity_type_id)
        Path(base_path).mkdir(parents=True, exist_ok=True)

        # Assemble filename and full pathname of cache file
        if old_name is False:
            filename = '%s__%s__%s' % (
                DBDataCache.CACHE_FILE_STEM, str(dep_grain[3]) if dep_grain is not None else str(None),
                str(grain[3]) if grain is not None else str(None))
            local_path = '%s/%s' % (base_path, filename)
        else:
            src = '%s_%s_%s' % (
                str(dep_grain[0]), str('_'.join(dep_grain[1])), str(dep_grain[2])) if dep_grain is not None else str(
                None)
            tar = '%s_%s_%s' % (str(grain[0]), str('_'.join(grain[1])), str(grain[2])) if grain is not None else str(
                None)
            filename = '%s__%s__%s' % (DBDataCache.CACHE_FILE_STEM, src, tar)
            local_path = '%s/%s' % (base_path, filename)

        return filename, local_path, base_path
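The new-style naming scheme above can be exercised standalone; a minimal sketch (the stem value and the grain tuple layout used here are illustrative assumptions, the real stem comes from `DBDataCache.CACHE_FILE_STEM`):

```python
CACHE_FILE_STEM = 'parquet_cache'  # assumed stand-in for DBDataCache.CACHE_FILE_STEM

def cache_filename(dep_grain, grain):
    # New-style name: the stem plus element 3 of each grain tuple,
    # with the literal string 'None' when a grain is absent.
    return '%s__%s__%s' % (CACHE_FILE_STEM,
                           str(dep_grain[3]) if dep_grain is not None else str(None),
                           str(grain[3]) if grain is not None else str(None))
```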

    def store_cache(self, dep_grain, grain, df):
        cache_filename, cache_pathname, base_path = self._get_cache_filename(dep_grain, grain)
        if df is not None:
            try:
                pyarrow_table = pyarrow.Table.from_pandas(df, schema=pyarrow.Schema.from_pandas(df))
                pyarrow.parquet.write_table(pyarrow_table, cache_pathname, version='2.0')
                logger.info(
                    'Cache %s of size %s has been saved to file %s' % (cache_filename, str(df.shape), cache_pathname))
            except pyarrow.lib.ArrowInvalid as ex:
                raise Exception(
                    'The dataframe could not be saved to file %s because pyarrow threw an exception.' % cache_pathname) from ex
            except Exception as ex:
                raise Exception('The dataframe could not be saved to file %s.' % cache_pathname) from ex
            else:
                self._push_cache(cache_filename, cache_pathname)
        else:
            logger.warning('Dataframe is None. Therefore no cache has been stored in database.')

    def retrieve_cache(self, dep_grain, grain, old_name=False):
        cache_filename, cache_pathname, base_path = self._get_cache_filename(dep_grain, grain, old_name)
        self._get_cache(cache_filename, cache_pathname)
        df_loaded = None
        if os.path.exists(cache_pathname):
            try:
                df_loaded = pd.read_parquet(cache_pathname)
                if df_loaded is not None:
                    logger.info('Cache %s of size %s has been retrieved from file %s' % (
                        cache_filename, str(df_loaded.shape), cache_pathname))
            except Exception as ex:
                raise Exception('The dataframe could not be loaded from parquet file %s' % cache_pathname) from ex

        return df_loaded

    def delete_cache(self, dep_grain, grain, old_name=False):
        # Delete single cache entry locally
        cache_filename, cache_pathname, base_path = self._get_cache_filename(dep_grain, grain, old_name)
        if os.path.exists(cache_pathname):
            try:
                os.remove(cache_pathname)
            except Exception as ex:
                raise Exception('Removal of cache file %s failed' % cache_pathname) from ex

        # Delete single cache entry in database
        if not self.is_postgre_sql:
            sql_statement = "DELETE FROM %s.%s WHERE ENTITY_TYPE_ID = ? AND PARQUET_NAME = ?" % (
                self.quoted_schema, self.quoted_cache_tablename)
            try:
                stmt = ibm_db.prepare(self.db_connection, sql_statement)
                try:
                    ibm_db.bind_param(stmt, 1, self.entity_type_id)
                    ibm_db.bind_param(stmt, 2, cache_filename)
                    ibm_db.execute(stmt)
                finally:
                    ibm_db.free_result(stmt)
            except Exception as ex:
                raise Exception('Deletion of cache file %s failed with sql statement "%s"' % (
                    cache_filename, sql_statement)) from ex
        else:
            sql_statement = "DELETE FROM %s.%s" % (self.quoted_schema, self.quoted_cache_tablename)
            sql_statement += ' where entity_type_id = %s and parquet_name = %s'
            try:
                dbhelper.execute_postgre_sql_query(self.db_connection, sql_statement,
                                                   (self.entity_type_id, cache_filename))
            except Exception as ex:
                raise Exception(
                    'Deletion of cache file %s failed with sql statement %s' % (cache_filename, sql_statement)) from ex

        logger.info('Cache file %s has been deleted from table %s.%s' % (
            cache_filename, self.quoted_schema, self.quoted_cache_tablename))

    def delete_all_caches(self):
        # Delete all cache entries for this entity type locally
        cache_filename, cache_pathname, base_path = self._get_cache_filename(None, None)
        if os.path.exists(base_path):
            try:
                file_listing = os.listdir(base_path)
            except Exception as ex:
                raise Exception('Failure to list content of directory %s' % base_path) from ex

            for filename in file_listing:
                if filename.startswith(DBDataCache.CACHE_FILE_STEM):
                    full_path = '%s/%s' % (base_path, filename)
                    try:
                        os.remove(full_path)
                    except Exception as ex:
                        raise Exception('Removal of file %s failed' % full_path) from ex

        # Delete all cache entries for this entity type in database
        if not self.is_postgre_sql:
            sql_statement = "DELETE FROM %s.%s where ENTITY_TYPE_ID = ?" % (
                self.quoted_schema, self.quoted_cache_tablename)
            try:
                stmt = ibm_db.prepare(self.db_connection, sql_statement)
                try:
                    ibm_db.bind_param(stmt, 1, self.entity_type_id)
                    ibm_db.execute(stmt)
                finally:
                    ibm_db.free_result(stmt)
            except Exception as ex:
                raise Exception('Deletion of cache files failed with sql statement "%s"' % sql_statement) from ex
        else:
            sql_statement = "DELETE FROM %s.%s" % (self.quoted_schema, self.quoted_cache_tablename)
            sql_statement += ' where entity_type_id = %s'
            try:
                dbhelper.execute_postgre_sql_query(self.db_connection, sql_statement, (self.entity_type_id,))
            except Exception as ex:
                raise Exception('Deletion of cache files failed with sql statement %s' % sql_statement) from ex

        logger.info('All caches have been deleted from table %s.%s for entity type id %d' % (
            self.quoted_schema, self.quoted_cache_tablename, self.entity_type_id))
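The class keeps two SQL branches throughout because ibm_db (Db2) uses the qmark paramstyle (`?`) while psycopg2 (PostgreSQL) uses the format paramstyle (`%s`). A minimal sketch of that placeholder difference (a naive translation that ignores `?` inside quoted literals, which is fine for the simple statements above):

```python
def qmark_to_format(sql):
    # ibm_db prepares 'WHERE X = ?' while psycopg2 expects 'WHERE x = %s';
    # a plain character swap covers simple statements without string literals.
    return sql.replace('?', '%s')

db2_sql = "DELETE FROM store WHERE ENTITY_TYPE_ID = ? AND PARQUET_NAME = ?"
pg_sql = qmark_to_format(db2_sql)
```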


class FileModelStore:

    STORE_TABLENAME = 'KPI_MODEL_STORE'

    def __init__(self):
        logger.info('Init FileModelStore')

    def store_model(self, model_name, model, user_name=None, serialize=True):
        if serialize:
            try:
                model = pickle.dumps(model)
            except Exception as ex:
                raise Exception(
                    'Serialization of model %s that is supposed to be stored in ModelStore failed.' % model_name) from ex
        filename = self.STORE_TABLENAME + model_name
        with open(filename, "wb") as f:
            f.write(model)

    def retrieve_model(self, model_name, deserialize=True):
        filename = self.STORE_TABLENAME + model_name
        model = None
        if os.path.exists(filename):
            with open(filename, "rb") as f:
                model = f.read()
        if model is not None:
            logger.info('Model %s of size %d bytes has been retrieved from filesystem' % (
                model_name, len(model)))
        else:
            logger.info('Model %s does not exist in filesystem' % model_name)
        if model is not None and deserialize:
            try:
                model = pickle.loads(model)
            except Exception as ex:
                raise Exception(
                    'Deserialization of model %s that has been retrieved from ModelStore failed.' % model_name) from ex
        return model

    def delete_model(self, model_name):
        filename = self.STORE_TABLENAME + model_name
        if os.path.exists(filename):
            os.remove(filename)
            logger.info('Model %s has been deleted from filesystem' % model_name)
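The one-file-per-model pattern used by `FileModelStore` can be sketched standalone with the stdlib only. The class below is a hypothetical stand-in (directory handling and names are illustrative, not the real API):

```python
import os
import pickle
import tempfile

class TinyFileStore:
    # Minimal stand-in for FileModelStore: one file per model, pickled payload.
    def __init__(self, directory):
        self.directory = directory

    def _path(self, model_name):
        return os.path.join(self.directory, 'KPI_MODEL_STORE' + model_name)

    def store_model(self, model_name, model):
        with open(self._path(model_name), 'wb') as f:
            f.write(pickle.dumps(model))

    def retrieve_model(self, model_name):
        path = self._path(model_name)
        if not os.path.exists(path):
            return None
        with open(path, 'rb') as f:
            return pickle.loads(f.read())

tmp = tempfile.mkdtemp()
store = TinyFileStore(tmp)
store.store_model('demo', {'weights': [1, 2, 3]})
restored = store.retrieve_model('demo')
```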


class DBModelStore:

    STORE_TABLENAME = 'KPI_MODEL_STORE'

    def __init__(self, tenant_id, entity_type_id, schema, db_connection, db_type):
        self.tenant_id = tenant_id
        self.entity_type_id = entity_type_id
        self.schema = schema
        self.db_connection = db_connection
        self.db_type = db_type

        if self.db_type == 'db2':
            self.is_postgre_sql = False
            self.schema = schema.upper()
            self.store_tablename = DBModelStore.STORE_TABLENAME.upper()
        elif self.db_type == 'postgresql':
            self.is_postgre_sql = True
            self.schema = schema.lower()
            self.store_tablename = DBModelStore.STORE_TABLENAME.lower()
        else:
            raise Exception('Initialization of %s failed because the database type %s is unknown.' % (
                self.__class__.__name__, self.db_type))

        self.quoted_schema = dbhelper.quotingSchemaName(self.schema, self.is_postgre_sql)
        self.quoted_store_tablename = dbhelper.quotingTableName(self.store_tablename, self.is_postgre_sql)
        self._handle_store_table()

    def _create_store_table(self):
        if not self.is_postgre_sql:
            sql_statement = "CREATE TABLE %s.%s ( " \
                            "ENTITY_TYPE_ID BIGINT NOT NULL, " \
                            "MODEL_NAME VARCHAR(2048) NOT NULL, " \
                            "MODEL BLOB(2G), " \
                            "UPDATED_TS TIMESTAMP NOT NULL DEFAULT CURRENT TIMESTAMP, " \
                            "LAST_UPDATED_BY VARCHAR(256), " \
                            "CONSTRAINT %s UNIQUE(ENTITY_TYPE_ID, MODEL_NAME) ENFORCED) " \
                            "ORGANIZE BY ROW" % (self.quoted_schema, self.quoted_store_tablename,
                                                 dbhelper.quotingTableName('uc_%s' % self.store_tablename,
                                                                           self.is_postgre_sql))
            try:
                stmt = ibm_db.exec_immediate(self.db_connection, sql_statement)
                ibm_db.free_result(stmt)
            except Exception as ex:
                raise Exception('Execution of sql statement "%s" failed.' % sql_statement) from ex
        else:
            sql_statement = "CREATE TABLE %s.%s ( " \
                            "entity_type_id BIGINT NOT NULL, " \
                            "model_name VARCHAR(2048) NOT NULL, " \
                            "model BYTEA, " \
                            "updated_ts TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP, " \
                            "last_updated_by VARCHAR(256), " \
                            "CONSTRAINT %s UNIQUE(entity_type_id, model_name))" % (
                                self.quoted_schema, self.quoted_store_tablename,
                                dbhelper.quotingTableName('uc_%s' % self.store_tablename, self.is_postgre_sql))
            try:
                dbhelper.execute_postgre_sql_query(self.db_connection, sql_statement)
            except Exception as ex:
                raise Exception('Execution of sql statement "%s" failed.' % sql_statement) from ex

        logger.info('Table %s.%s has been created.' % (self.quoted_schema, self.quoted_store_tablename))

    def _store_table_exists(self):
        exists = False
        try:
            if not self.is_postgre_sql:
                stmt = ibm_db.tables(self.db_connection, None, self.schema, self.store_tablename, None)
                try:
                    fetch_value = ibm_db.fetch_row(stmt, 0)
                    if fetch_value:
                        exists = True
                finally:
                    ibm_db.free_result(stmt)
            else:
                exists = dbhelper.check_table_exist(self.db_connection, self.db_type, self.schema, self.store_tablename)
        except Exception as ex:
            raise Exception(
                'Error while probing for table %s.%s' % (self.quoted_schema, self.quoted_store_tablename)) from ex

        logger.debug('Table %s.%s %s.' % (
            self.quoted_schema, self.quoted_store_tablename, 'exists' if exists else 'does not exist'))
        return exists
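The probe-the-catalog pattern in `_store_table_exists` (ask the database's catalog rather than attempting a SELECT and catching errors) looks like this with sqlite3 standing in for the Db2/PostgreSQL catalogs; an illustrative sketch only:

```python
import sqlite3

def store_table_exists(conn, tablename):
    # sqlite_master is sqlite's catalog table, analogous to the Db2
    # catalog queried via ibm_db.tables() above.
    row = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' AND name = ?",
        (tablename,)).fetchone()
    return row is not None

conn = sqlite3.connect(':memory:')
before = store_table_exists(conn, 'kpi_model_store')
conn.execute('CREATE TABLE kpi_model_store (entity_type_id INTEGER, model_name TEXT, model BLOB)')
after = store_table_exists(conn, 'kpi_model_store')
```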

    def _handle_store_table(self):
        if not self._store_table_exists():
            self._create_store_table()

    def store_model(self, model_name, model, user_name=None, serialize=True):
        if serialize:
            try:
                model = pickle.dumps(model)
            except Exception as ex:
                raise Exception(
                    'Serialization of model %s that is supposed to be stored in ModelStore failed.' % model_name) from ex

        if not self.is_postgre_sql:
            sql_statement = "MERGE INTO %s.%s AS TARGET " \
                            "USING (VALUES (?, ?, ?, CURRENT_TIMESTAMP, ?)) " \
                            "AS SOURCE (ENTITY_TYPE_ID, MODEL_NAME, MODEL, UPDATED_TS, LAST_UPDATED_BY) " \
                            "ON TARGET.ENTITY_TYPE_ID = SOURCE.ENTITY_TYPE_ID " \
                            "AND TARGET.MODEL_NAME = SOURCE.MODEL_NAME " \
                            "WHEN MATCHED THEN " \
                            "UPDATE SET TARGET.MODEL = SOURCE.MODEL, " \
                            "TARGET.UPDATED_TS = SOURCE.UPDATED_TS " \
                            "WHEN NOT MATCHED THEN " \
                            "INSERT (ENTITY_TYPE_ID, MODEL_NAME, MODEL, UPDATED_TS, LAST_UPDATED_BY) " \
                            "VALUES (SOURCE.ENTITY_TYPE_ID, SOURCE.MODEL_NAME, SOURCE.MODEL, " \
                            "SOURCE.UPDATED_TS, SOURCE.LAST_UPDATED_BY)" % (
                                self.quoted_schema, self.quoted_store_tablename)
            try:
                stmt = ibm_db.prepare(self.db_connection, sql_statement)
                try:
                    ibm_db.bind_param(stmt, 1, self.entity_type_id)
                    ibm_db.bind_param(stmt, 2, model_name)
                    ibm_db.bind_param(stmt, 3, model)
                    ibm_db.bind_param(stmt, 4, user_name)
                    ibm_db.execute(stmt)
                finally:
                    ibm_db.free_result(stmt)
            except Exception as ex:
                raise Exception('Storing model %s failed with sql statement "%s"' % (model_name, sql_statement)) from ex
        else:
            statement1 = "INSERT INTO %s.%s (entity_type_id, model_name, model, updated_ts, last_updated_by) " % (
                self.quoted_schema, self.quoted_store_tablename)
            statement3 = "ON CONFLICT ON CONSTRAINT %s DO update set entity_type_id = EXCLUDED.entity_type_id, " \
                         "model_name = EXCLUDED.model_name, model = EXCLUDED.model, " \
                         "updated_ts = EXCLUDED.updated_ts, last_updated_by = EXCLUDED.last_updated_by" \
                         % dbhelper.quotingTableName('uc_%s' % self.store_tablename, self.is_postgre_sql)
            sql_statement = statement1 + " values (%s, %s, %s, current_timestamp, %s) " + statement3
            try:
                dbhelper.execute_postgre_sql_query(self.db_connection, sql_statement,
                                                   (self.entity_type_id, model_name, psycopg2.Binary(model), user_name))
            except Exception as ex:
                raise Exception('Storing model %s failed with sql statement "%s"' % (model_name, sql_statement)) from ex

        logger.info('Model %s of size %d bytes has been stored in table %s.%s.' % (
            model_name, len(model) if model is not None else 0, self.quoted_schema, self.quoted_store_tablename))

    def retrieve_model(self, model_name, deserialize=True):
        if not self.is_postgre_sql:
            sql_statement = "SELECT MODEL FROM %s.%s WHERE ENTITY_TYPE_ID = ? AND MODEL_NAME = ?" % (
                self.quoted_schema, self.quoted_store_tablename)
            stmt = ibm_db.prepare(self.db_connection, sql_statement)
            try:
                ibm_db.bind_param(stmt, 1, self.entity_type_id)
                ibm_db.bind_param(stmt, 2, model_name)
                ibm_db.execute(stmt)
                row = ibm_db.fetch_tuple(stmt)
                if row is False:
                    model = None
                else:
                    model = row[0]
            except Exception as ex:
                raise Exception(
                    'Retrieval of model %s failed with sql statement "%s"' % (model_name, sql_statement)) from ex
            finally:
                ibm_db.free_result(stmt)
        else:
            sql_statement = 'SELECT model FROM %s.%s' % (self.quoted_schema, self.quoted_store_tablename)
            sql_statement += ' WHERE entity_type_id = %s AND model_name = %s'
            try:
                row = dbhelper.execute_postgre_sql_select_query(self.db_connection, sql_statement,
                                                                (self.entity_type_id, model_name), fetch_one_only=True)
                if row is None:
                    model = None
                else:
                    model = bytes(row[0])
            except Exception as ex:
                raise Exception(
                    'Retrieval of model %s failed with sql statement "%s"' % (model_name, sql_statement)) from ex

        if model is not None:
            logger.info('Model %s of size %d bytes has been retrieved from table %s.%s' % (
                model_name, len(model) if model is not None else 0, self.quoted_schema, self.quoted_store_tablename))
        else:
            logger.info('Model %s does not exist in table %s.%s' % (
                model_name, self.quoted_schema, self.quoted_store_tablename))

        if model is not None and deserialize:
            try:
                model = pickle.loads(model)
            except Exception as ex:
                raise Exception(
                    'Deserialization of model %s that has been retrieved from ModelStore failed.' % model_name) from ex

        return model

    def delete_model(self, model_name):
        if not self.is_postgre_sql:
            sql_statement = "DELETE FROM %s.%s where ENTITY_TYPE_ID = ? and MODEL_NAME = ?" % (
                self.quoted_schema, self.quoted_store_tablename)
            try:
                stmt = ibm_db.prepare(self.db_connection, sql_statement)
                try:
                    ibm_db.bind_param(stmt, 1, self.entity_type_id)
                    ibm_db.bind_param(stmt, 2, model_name)
                    ibm_db.execute(stmt)
                finally:
                    ibm_db.free_result(stmt)
            except Exception as ex:
                raise Exception(
                    'Deletion of model %s failed with sql statement "%s"' % (model_name, sql_statement)) from ex
        else:
            sql_statement = "DELETE FROM %s.%s" % (self.quoted_schema, self.quoted_store_tablename)
            sql_statement += ' where entity_type_id = %s and model_name = %s'
            try:
                dbhelper.execute_postgre_sql_query(self.db_connection, sql_statement, (self.entity_type_id, model_name))
            except Exception as ex:
                raise Exception(
                    'Deletion of model %s failed with sql statement "%s"' % (model_name, sql_statement)) from ex

        logger.info('Model %s has been deleted from table %s.%s' % (
            model_name, self.quoted_schema, self.quoted_store_tablename))
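The store/retrieve round trip above (MERGE on Db2, ON CONFLICT on PostgreSQL, a pickled blob either way) can be sketched with sqlite3, where `INSERT OR REPLACE` plays the upsert role. A hypothetical stand-in, not this module's API:

```python
import pickle
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE kpi_model_store ("
             "entity_type_id INTEGER NOT NULL, "
             "model_name TEXT NOT NULL, "
             "model BLOB, "
             "UNIQUE (entity_type_id, model_name))")

def store_model(entity_type_id, model_name, model):
    # The UNIQUE constraint decides update vs insert, mirroring the
    # MERGE / ON CONFLICT branches of DBModelStore.store_model.
    conn.execute("INSERT OR REPLACE INTO kpi_model_store VALUES (?, ?, ?)",
                 (entity_type_id, model_name, pickle.dumps(model)))

def retrieve_model(entity_type_id, model_name):
    row = conn.execute(
        "SELECT model FROM kpi_model_store WHERE entity_type_id = ? AND model_name = ?",
        (entity_type_id, model_name)).fetchone()
    return pickle.loads(row[0]) if row is not None else None

store_model(1, 'regressor', {'coef': 0.5})
store_model(1, 'regressor', {'coef': 0.7})  # second call updates in place
```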


# === file: create_model/splite_new_img.py (yokoyang/img-class, MIT) ===
import os
import cv2
import numpy as np
import pandas as pd
import tifffile as tiff
from PIL import Image
# 一般建筑&农村&工厂&阴影
# 水体&植被
# 建筑场地&裸地
# 运动场&道路
name_list = ['split-mask-data']
# name_list = [ 'bare_land', 'building_yard', 'countryside', 'factory', 'general_building', 'playground',
# 'road', 'shadow', 'tree', 'water']
# name = 'split-data'
# name = 'bare_land'
# name = 'building_yard'
# name = 'countryside'
# name = 'factory'
# name = 'general_building'
# name = 'playground'
# name = 'road'
# name = 'shadow'
# name = 'tree'
# name = 'water'
# for name in name_list:
# # /home/yokoyang/PycharmProjects/untitled/biaozhu/water/0_0_0.tif
# # /home/yokoyang/PycharmProjects/untitled/biaozhu/water/0_0_0.tif
# Dir = "/home/yokoyang/PycharmProjects/untitled/new_data/split-data"
# target_size = 640
# folder_name = "/home/yokoyang/PycharmProjects/untitled/640_biaozhu/" + name + "/"
#
# train_img = pd.read_csv('/home/yokoyang/PycharmProjects/untitled/new_data/data_imageID.csv')
#
# Image_ID = sorted(train_img.ImageId.unique())
# i = 0
# for i in Image_ID:
# filename = os.path.join(Dir, name, '{}.tif'.format(i))
# cv2_im = cv2.imread(filename)
# cv2_im = cv2.cvtColor(cv2_im, cv2.COLOR_BGR2RGB)
# cv2_im = cv2_im.astype(np.uint8)
# # cv2_im ^= 255
# img = Image.fromarray(cv2_im)
# img2 = img.crop((0, 0, target_size, target_size))
# pic_name = folder_name + i + ".tif"
# tiff.imsave(pic_name, np.array(img2))
for name in name_list:
    # /home/yokoyang/PycharmProjects/untitled/biaozhu/water/0_0_0.tif
    Dir = "/home/yokoyang/PycharmProjects/untitled/new_data"
    target_size = 640
    folder_name = "/home/yokoyang/PycharmProjects/untitled/640_biaozhu/" + name + "/"
    train_img = pd.read_csv('/home/yokoyang/PycharmProjects/untitled/new_data/data_imageID.csv')
    Image_ID = sorted(train_img.ImageId.unique())
    for i in Image_ID:
        filename = os.path.join(Dir, name, '{}.tif'.format(i))
        cv2_im = cv2.imread(filename)
        cv2_im = cv2.cvtColor(cv2_im, cv2.COLOR_BGR2RGB)
        cv2_im = cv2_im.astype(np.uint8)
        # cv2_im ^= 255
        img = Image.fromarray(cv2_im)
        img2 = img.crop((0, 0, target_size, target_size))
        pic_name = folder_name + i + ".tif"
        tiff.imsave(pic_name, np.array(img2))


# === file: src/docs/__init__.py (kunansy/Vocabulary, MIT) ===
# from .create_doc import create_pdf
from .create_doc import create_docx
from .create_doc import visual_info
__all__ = 'create_docx', 'visual_info'


# === file: common/utils/utils.py (ltxhh/course, Apache-2.0) ===
# -*- coding: utf-8 -*-
# @Time : 2022/3/30 10:25
# @Author : linyaxuan
# @File : utils.py
# @Software : PyCharm
import json


# === file: src/main.py (Plasticoo/YAAF, MIT) ===
import argparse
from .funcs import threads


def main_function():
    parse = argparse.ArgumentParser(description="YAAF - Yet Another Admin Finder!")

    parse.add_argument("-u", "--url=", action="store", dest="URL", help="Website url")
    parse.add_argument("-t", "--threads=", action="store", dest="NUMBER", help="Number of threads to use")
    parse.add_argument("-w", "--wordlist=", action="store", dest="NAME", help="Wordlist to use")
    parse.add_argument("-e", "--ext=", action="store", dest="EXT", help="Extension to append to each word")
    parse.add_argument("-no", "--no-output", action="store_true", dest="NO_OUTPUT", help="Suppress output of every attempt")
    parse.add_argument("-l", "--log", action="store_true", help="Saves the results to a file")
    parse.add_argument("-ua", "--user-agent=", action="store", dest="UAGENT", help="Custom user-agent")
    parse.add_argument("-s", "--sanitize", action="store_true", help="Sanitizes the url given")

    parsed_args = parse.parse_args()

    if parsed_args.URL and parsed_args.NAME and not parsed_args.EXT and not parsed_args.UAGENT:
        if parsed_args.NUMBER:
            if 1 <= int(parsed_args.NUMBER) <= 10:
                threads.start_threads(parsed_args.URL, parsed_args.NAME, num_threads=int(parsed_args.NUMBER),
                                      log_res=parsed_args.log, output=parsed_args.NO_OUTPUT, sanitize=parsed_args.sanitize)
            else:
                print("Number of threads must be between 1 and 10.")
        else:
            threads.start_threads(parsed_args.URL, parsed_args.NAME, log_res=parsed_args.log, output=parsed_args.NO_OUTPUT,
                                  sanitize=parsed_args.sanitize)
    elif parsed_args.URL and parsed_args.NAME and parsed_args.EXT and not parsed_args.UAGENT:
        if parsed_args.NUMBER:
            if 1 <= int(parsed_args.NUMBER) <= 10:
                threads.start_threads(parsed_args.URL, parsed_args.NAME, num_threads=int(parsed_args.NUMBER),
                                      log_res=parsed_args.log, output=parsed_args.NO_OUTPUT, extension=parsed_args.EXT,
                                      sanitize=parsed_args.sanitize)
            else:
                print("Number of threads must be between 1 and 10.")
        else:
            threads.start_threads(parsed_args.URL, parsed_args.NAME, log_res=parsed_args.log, output=parsed_args.NO_OUTPUT,
                                  extension=parsed_args.EXT, sanitize=parsed_args.sanitize)
    elif parsed_args.URL and parsed_args.NAME and not parsed_args.EXT and parsed_args.UAGENT:
        if parsed_args.NUMBER:
            if 1 <= int(parsed_args.NUMBER) <= 10:
                threads.start_threads(parsed_args.URL, parsed_args.NAME, num_threads=int(parsed_args.NUMBER),
                                      log_res=parsed_args.log, output=parsed_args.NO_OUTPUT, u_agent=parsed_args.UAGENT,
                                      sanitize=parsed_args.sanitize)
            else:
                print("Number of threads must be between 1 and 10.")
        else:
            threads.start_threads(parsed_args.URL, parsed_args.NAME, log_res=parsed_args.log, output=parsed_args.NO_OUTPUT,
                                  u_agent=parsed_args.UAGENT, sanitize=parsed_args.sanitize)
    elif parsed_args.URL and parsed_args.NAME and parsed_args.EXT and parsed_args.UAGENT:
        if parsed_args.NUMBER:
            if 1 <= int(parsed_args.NUMBER) <= 10:
                threads.start_threads(parsed_args.URL, parsed_args.NAME, num_threads=int(parsed_args.NUMBER),
                                      log_res=parsed_args.log, output=parsed_args.NO_OUTPUT, extension=parsed_args.EXT,
                                      u_agent=parsed_args.UAGENT, sanitize=parsed_args.sanitize)
            else:
                print("Number of threads must be between 1 and 10.")
        else:
            threads.start_threads(parsed_args.URL, parsed_args.NAME, log_res=parsed_args.log, output=parsed_args.NO_OUTPUT,
                                  extension=parsed_args.EXT, u_agent=parsed_args.UAGENT, sanitize=parsed_args.sanitize)
    else:
        print("URL and WordList are mandatory.")
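The argument wiring above can be checked without touching the network, because `parse_args` accepts an explicit argv list. A trimmed sketch (option names shortened, `type=int` replacing the repeated `int(...)` casts; an illustration, not the script's actual parser):

```python
import argparse

parser = argparse.ArgumentParser(description="YAAF - Yet Another Admin Finder!")
parser.add_argument("-u", "--url", dest="URL", help="Website url")
parser.add_argument("-t", "--threads", dest="NUMBER", type=int, default=1,
                    help="Number of threads to use")
parser.add_argument("-w", "--wordlist", dest="NAME", help="Wordlist to use")
parser.add_argument("-l", "--log", action="store_true", help="Save results to a file")

# Passing a list instead of reading sys.argv makes the parser testable.
args = parser.parse_args(["-u", "http://example.com", "-w", "admin.txt", "-t", "3"])
```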


# === file: example_checkout/tests/test_object_payment_plan_instalment.py (pkimber/checkout, Apache-2.0) ===
# -*- encoding: utf-8 -*-
import pytest
from datetime import date
from dateutil.relativedelta import relativedelta
from decimal import Decimal
from unittest import mock
from django.db import transaction
from checkout.models import (
    CheckoutError,
    CheckoutAction,
    CheckoutState,
    ObjectPaymentPlan,
    ObjectPaymentPlanInstalment,
)
from checkout.tests.factories import (
    CheckoutFactory,
    CustomerFactory,
    ObjectPaymentPlanFactory,
    ObjectPaymentPlanInstalmentFactory,
    PaymentPlanFactory,
)
from checkout.tests.helper import check_checkout
from login.tests.factories import UserFactory
from mail.models import Message
from mail.tests.factories import NotifyFactory
from .factories import ContactFactory


@pytest.mark.django_db
def test_can_charge_deposit():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        deposit=True,
        due=date.today(),
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.pending
    )
    assert obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_deposit_not_pending():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        deposit=True,
        due=date.today(),
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.request
    )
    assert not obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_due():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        due=date.today(),
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.request
    )
    assert obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_overdue():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        due=date.today()+relativedelta(days=-10),
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.request
    )
    assert obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_due_not_yet():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        due=date.today()+relativedelta(days=10),
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.request
    )
    assert not obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_fail():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.fail
    )
    assert not obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_not_deposit():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        due=date.today(),
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.pending
    )
    assert not obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_pending():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.pending
    )
    assert not obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_request():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.request
    )
    assert obj.checkout_can_charge


@pytest.mark.django_db
def test_can_charge_success():
    object_payment_plan = ObjectPaymentPlanFactory(
        content_object=ContactFactory()
    )
    obj = ObjectPaymentPlanInstalmentFactory(
        object_payment_plan=object_payment_plan,
        state=CheckoutState.objects.success
    )
    assert not obj.checkout_can_charge


@pytest.mark.django_db
def test_check_checkout():
    with transaction.atomic():
        # this must be run within a transaction
        obj = ObjectPaymentPlan.objects.create_object_payment_plan(
            ContactFactory(),
            PaymentPlanFactory(),
            Decimal('100')
        )
    instalment = ObjectPaymentPlanInstalment.objects.get(
        object_payment_plan=obj
    )
    check_checkout(instalment)


@pytest.mark.django_db
def test_checkout_description():
    payment_plan = PaymentPlanFactory(
        name='pkimber',
        deposit=50,
        count=2,
        interval=1
    )
    # create the plan and the deposit
    contact_pp = ObjectPaymentPlan.objects.create_object_payment_plan(
        ContactFactory(),
        payment_plan,
        Decimal('100'),
    )
    # check deposit description
    deposit = ObjectPaymentPlanInstalment.objects.filter(
        object_payment_plan=contact_pp
    )
    assert 1 == deposit.count()
    assert [
        'pkimber', 'Deposit'
    ] == deposit[0].checkout_description
    # create the instalments
    contact_pp.create_instalments()
    # check
    instalments = ObjectPaymentPlanInstalment.objects.filter(
        object_payment_plan=contact_pp
    )
    assert 3 == instalments.count()
    assert [
        'pkimber', 'Instalment 2 of 3'
    ] == instalments[1].checkout_description


@pytest.mark.django_db
def test_checkout_fail():
    with transaction.atomic():
        # this must be run within a transaction
        obj = ObjectPaymentPlan.objects.create_object_payment_plan(
            ContactFactory(),
            PaymentPlanFactory(),
            Decimal('100')
        )
    obj = ObjectPaymentPlanInstalment.objects.get(
        object_payment_plan=obj
    )
    assert CheckoutState.objects.pending == obj.state
    obj.checkout_fail()
    assert CheckoutState.objects.fail == obj.state


@pytest.mark.django_db
def test_checkout_name():
    user = UserFactory(first_name='Patrick', last_name='Kimber')
    contact_pp = ObjectPaymentPlan.objects.create_object_payment_plan(
        ContactFactory(user=user),
        PaymentPlanFactory(),
        Decimal('100'),
    )
    obj = ObjectPaymentPlanInstalment.objects.get(
        object_payment_plan=contact_pp
    )
    assert 'Patrick Kimber' == obj.checkout_name


@pytest.mark.django_db
def test_checkout_success():
    contact_pp = ObjectPaymentPlan.objects.create_object_payment_plan(
        ContactFactory(),
        PaymentPlanFactory(),
        Decimal('100'),
    )
    obj = ObjectPaymentPlanInstalment.objects.get(
        object_payment_plan=contact_pp
    )
    assert CheckoutState.objects.pending == obj.state
    checkout = CheckoutFactory(
        action=CheckoutAction.objects.charge,
        content_object=obj,
        state=CheckoutState.objects.success,
    )
    obj.checkout_success(checkout)
    assert CheckoutState.objects.success == obj.state


@pytest.mark.django_db
def test_checkout_total():
    payment_plan = PaymentPlanFactory(
        name='pkimber',
        deposit=50,
        count=2,
        interval=1
    )
    user = UserFactory(first_name='Patrick', last_name='Kimber')
    contact_pp = ObjectPaymentPlan.objects.create_object_payment_plan(
        ContactFactory(user=user),
        payment_plan,
        Decimal('100'),
    )
    contact_pp.create_instalments()
    assert Decimal('50') == ObjectPaymentPlanInstalment.objects.get(
        count=1
    ).checkout_total
    assert Decimal('25') == ObjectPaymentPlanInstalment.objects.get(
        count=2
    ).checkout_total
    assert Decimal('25') == ObjectPaymentPlanInstalment.objects.get(
        count=3
    ).checkout_total
@pytest.mark.django_db
def test_checkout_email():
user = UserFactory(email='me@test.com')
contact_pp = ObjectPaymentPlan.objects.create_object_payment_plan(
ContactFactory(user=user),
PaymentPlanFactory(),
Decimal('100'),
)
obj = ObjectPaymentPlanInstalment.objects.get(
object_payment_plan=contact_pp
)
assert 'me@test.com' == obj.checkout_email
# @pytest.mark.django_db
# def test_checkout_list():
#     c1 = CheckoutFactory(
#         action=CheckoutAction.objects.charge,
#         content_object=SalesLedgerFactory(),
#     )
#     c2 = CheckoutFactory(
#         action=CheckoutAction.objects.charge,
#         content_object=ObjectPaymentPlanInstalmentFactory(
#             object_payment_plan=ObjectPaymentPlanFactory(
#                 content_object=ContactFactory(),
#             ),
#         ),
#     )
#     c3 = CheckoutFactory(
#         action=CheckoutAction.objects.charge,
#         content_object=ObjectPaymentPlanInstalmentFactory(
#             object_payment_plan=ObjectPaymentPlanFactory(
#                 content_object=ContactFactory(),
#             ),
#         ),
#     )
#     checkout_list = ObjectPaymentPlanInstalment.objects.checkout_list
#     assert 2 == checkout_list.count()
#     assert c1 not in checkout_list
#     assert c2 in checkout_list
#     assert c3 in checkout_list
@pytest.mark.django_db
def test_due():
    today = date.today()
    ObjectPaymentPlanInstalmentFactory(
        count=1,
        due=today+relativedelta(days=-1),
        amount=Decimal('1'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    ObjectPaymentPlanInstalmentFactory(
        count=2,
        due=today+relativedelta(days=-2),
        amount=Decimal('2'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    result = [p.amount for p in ObjectPaymentPlanInstalment.objects.due]
    assert [Decimal('1'), Decimal('2')] == result


@pytest.mark.django_db
def test_due_plan_deleted():
    today = date.today()
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-1),
        amount=Decimal('1'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    object_payment_plan = ObjectPaymentPlanFactory(
        deleted=True,
        content_object=ContactFactory(),
    )
    ObjectPaymentPlanInstalmentFactory(
        object_payment_plan=object_payment_plan,
        due=today+relativedelta(days=2),
        amount=Decimal('2'),
    )
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-3),
        amount=Decimal('3'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    result = [p.amount for p in ObjectPaymentPlanInstalment.objects.due]
    assert [Decimal('1'), Decimal('3')] == result


@pytest.mark.django_db
def test_due_plan_deposit():
    today = date.today()
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-1),
        amount=Decimal('1'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    ObjectPaymentPlanInstalmentFactory(
        deposit=True,
        due=today+relativedelta(days=-2),
        amount=Decimal('2'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-3),
        amount=Decimal('3'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    result = [p.amount for p in ObjectPaymentPlanInstalment.objects.due]
    assert [Decimal('1'), Decimal('3')] == result


@pytest.mark.django_db
def test_due_not_due():
    today = date.today()
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-1),
        amount=Decimal('1'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=1),
        amount=Decimal('2'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-3),
        amount=Decimal('3'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    result = [p.amount for p in ObjectPaymentPlanInstalment.objects.due]
    assert [Decimal('1'), Decimal('3')] == result


@pytest.mark.django_db
def test_due_not_pending():
    today = date.today()
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-1),
        amount=Decimal('1'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-2),
        state=CheckoutState.objects.fail,
        amount=Decimal('2'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-3),
        amount=Decimal('3'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    result = [p.amount for p in ObjectPaymentPlanInstalment.objects.due]
    assert [Decimal('1'), Decimal('3')] == result
@pytest.mark.django_db
def test_factory():
    ObjectPaymentPlanInstalmentFactory(
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )


@pytest.mark.django_db
def test_create_instalments_first_of_month():
    obj = ObjectPaymentPlanInstalmentFactory(
        amount=Decimal('50'),
        count=1,
        deposit=True,
        due=date(2015, 3, 10),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    obj.object_payment_plan.create_instalments()
    assert date(2015, 3, 10) == ObjectPaymentPlanInstalment.objects.get(
        count=1
    ).due
    assert date(2015, 4, 1) == ObjectPaymentPlanInstalment.objects.get(
        count=2
    ).due
    assert date(2015, 5, 1) == ObjectPaymentPlanInstalment.objects.get(
        count=3
    ).due


@pytest.mark.django_db
def test_create_instalments_first_of_month_after_15th():
    obj = ObjectPaymentPlanInstalmentFactory(
        amount=Decimal('50'),
        count=1,
        deposit=True,
        due=date(2015, 3, 17),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    obj.object_payment_plan.create_instalments()
    assert date(2015, 3, 17) == ObjectPaymentPlanInstalment.objects.get(
        count=1
    ).due
    assert date(2015, 5, 1) == ObjectPaymentPlanInstalment.objects.get(
        count=2
    ).due
    assert date(2015, 6, 1) == ObjectPaymentPlanInstalment.objects.get(
        count=3
    ).due
@pytest.mark.django_db
def test_process_payments(mocker):
    """Process payments."""
    mocker.patch('stripe.Charge.create')
    mocker.patch('stripe.Customer.create')
    today = date.today()
    install_1 = ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=1),
        amount=Decimal('2'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    install_2 = ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-1),
        amount=Decimal('1'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    install_3 = ObjectPaymentPlanInstalmentFactory(
        due=today+relativedelta(days=-2),
        amount=Decimal('2'),
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    )
    CustomerFactory(
        email=install_2.object_payment_plan.content_object.checkout_email
    )
    CustomerFactory(
        email=install_3.object_payment_plan.content_object.checkout_email
    )
    ObjectPaymentPlanInstalment.objects.process_payments()
    # check
    install_1.refresh_from_db()
    assert install_1.state == CheckoutState.objects.pending
    install_2.refresh_from_db()
    assert install_2.state == CheckoutState.objects.success
    install_3.refresh_from_db()
    assert install_3.state == CheckoutState.objects.success


@pytest.mark.django_db
def test_process_payments_fail(mocker):
    """Process payments."""
    with mock.patch('stripe.Customer.create') as mock_customer:
        mock_customer.side_effect = CheckoutError('Mock')
        today = date.today()
        install = ObjectPaymentPlanInstalmentFactory(
            due=today+relativedelta(days=-1),
            amount=Decimal('1'),
            object_payment_plan=ObjectPaymentPlanFactory(
                content_object=ContactFactory(),
            ),
        )
        CustomerFactory(
            email=install.object_payment_plan.content_object.checkout_email
        )
        NotifyFactory()
        ObjectPaymentPlanInstalment.objects.process_payments()
        # check
        install.refresh_from_db()
        assert install.state == CheckoutState.objects.fail
        assert 1 == Message.objects.count()
        assert 'FAIL' in Message.objects.first().subject


@pytest.mark.django_db
def test_str():
    str(ObjectPaymentPlanInstalmentFactory(
        object_payment_plan=ObjectPaymentPlanFactory(
            content_object=ContactFactory(),
        ),
    ))
# file: day1_1.py | repo: v-ek/advofcode | license: MIT

from __future__ import print_function
string_to_count = '(((())))()((((((((())()(()))(()((((()(()(((()((()((()(()()()()()))(((()(()((((((((((())(()()((())()(((())))()(()(()((()(()))(()()()()((()((()(((()()(((((((()()())()((((()()(((((()(())()(())((())()()))()(((((((())(()())(()(((())(()))((())))(()((()())))()())((((())))(()(((((()(())(((()()((()((()((((((((((())(()())))))()))())()()((((()()()()()()((((((())())(((()())()((()()(((()()()))(((((()))(((()(()()()(()(()(((())()))(()(((()((())()(()())())))((()()()(()()(((()))(((()((((()(((((()()(()())((()())())(()((((((()(()()))((((()))))())((())()()((()(()))))((((((((()))(()()(((())())(())()((()()()()((()((()((()()(((())))(()((())()((((((((()((()(()()(((())())())))(())())))()((((()))))))())))()()))()())((()())()((()()()))(()()(((()(())((((())())((((((((()()()()())))()()()((((()()))))))()((((()(((()))(()()())))((()()(((()))()()())())(((())((()()(())()()()(((())))))()())((()))()))((())()()())()())()()(()))())))())()))(())((()(())))(()(())(()))))(()(())())(()(())(()(()))))((()())()))()((((()()))))())))()()())((())()((()()()))()(((()(()))))(())()()))(((()())))))))))(((())))()))())()))))()()(((())))))))()(()()(()))((()))))((())))((()((())))())))()()(()))())()(()((()())(()(()()())())(()()))()))))(()())()()))()()()()))(()(()(()))))))()(()))()))()()(()((())(()(())))()(((())(())())))))()(()(()))))()))(()()()(())()(()(())))()))))()()(((((())))))())()())())())()())()))))()))))))))())()()()()()()())))()))((())()))())))()((())()))))()))())))))))())()()()))()()(()((((()(((((((()(())((()())((()()))()))))(())))()()()(())((())()())))(())))(())))(((()()))()(())(((()(()))((())))())()))((((()))())()))))))))()(())())))(()))()(()()))())()()(())())))())()()(()())))()((()())(()(())(())))))))))))))(()))))()))))))()()())(()(((((()(()())))())()))(()))()))(()()))()())(()))())()(())((()()))))))())))())()(((())))(()(()))()()))()(()))))))((()())(()))))))()())))()()))))))))((((((((()()()(()))))))()())))())))()()((())()))((())(())))())())))()()()((()((()(())))())()(())))))))))()())))()()()()(
)()))()))((())())(()(()))))))(()()))()))(())))()))))))))))))(()))))))))()))))()))()())()))()()))))))()))))((()))))(()))())()(())))(()())((((()())))()))))(()))()(()()(())))))())))))()))))))())))())))))())))())())))())(()))))(())()(())))())()))((()()))))))())))((())))))))())))(())))))()()())))))())))))()))))))()))()()()(()(((()())())())(()))())))))((()(())(()))))))))(())))()()()())())(()))))()()()))()))())())())()(())))()(((()((((())))))))()))))))))))))))))))))((())()())(()))))()()))))))(()()(())())))())))((())))((())))))))))))))()))))()(()))))))())))))()))(()()())(()())))))))))()))))))(())))))()()))()())(((())))()))(()))))))))(())())))())))())())())()()))((())()(())()())()))()())(())(()))))()())))(()(((()))))))()(()())()()()))()))))))))()()()(())()())()(((((()))()())())(()))))()()()(())))())))()((()())))(()))())()(()())())(()))()()))((()()))((()()()()())))(())()))(()(())))((()()))))))))())))))))())()()))))))))))))))))(())()(())(())()())())()))()(()))))())())))))()())()(()))()()(())))(())())))))(()))))))))))))))())())(())(())))(((()))()))))())((())(()))())))))))())))))())))()))()))))))))))))())()))))()))))((()))(())))()(())))(())()))()))())))())))))))()(()())())))()()())))(())))))(()))))))))))))(()))()))()))())))(((()()()(())((()())))()())(((()))(())()))((()()()())))())(())(()))))()(((((())))(()))())())))))))((((()()()))())())()(()(()())))))))))()())())))(())))()())(((()(())())()()))())())))))))((()())((()()(()))(()(())))()))()))(()))(()))()()(()(((())((((()))()(()))((())()(()(()())()(()))()())))))(()))()))())()())))())))(())))((())(()())))))()))(())(()))()())()(()()((()(()))))))()(())(()())(())()))(((())()))(()()(()()()))))(()(())))()))))())))))())(()()()()()()(((())))(()()))()((())(((((()()())))(()))(()))()()))(((())())()(((()()()()))))(()))(())())))()())(()()())())))))))()))))((())))()())(()))(()(()))())))))())(())))))()()())())()))()()(())))(()))(())((((((())(()))(()))())()))(()()(())))()))(()()))()))()(())))(())))((()(()))(())()()())())))(((()()())(())()))))))()(
((()(((((()()(((())(())))())()((()))))((()())()(())(((())))(((()((()(()(()))(()()))())(()))(())(())))()))))))((((()))()((((()(()))()))()()))))()(()(()))()(()((()(((()(()()(((()))))()(((()(()(()(((()(()())())()()(()(()())())(()((((())(()))()))(((((()()())(())()((()()())))()()(((()()))()((((((((()(())))())((()))))(())))(()))))((()((((()()(())(((((()))(((((((((((((()())))((((()(((()((())())()))((()))()(()()((()()()()(()()(()(()(((())()(()((((((()((()()((())()((((()((()()(()()())((()()()((()((())()(()(((()((())((((())(()))((()(()))(()())()((((((((()(((((((((((()))(()(((()(()()()((((())((())()())()))(())((())(()))(((()((()(())))(()))))((()()))))((((()(()(()())(()(())((((((((()((((()((()(((((()))())()(()))(()()((()(())(((((()(())()(((((()()))))))()(((())()(()()((((())()((())((()(((())(((()))((()()((((()(())))))((()((((()((()((()(((())((()))(((((((()(((()((((((((())()))((((())(((((()((((((((()(((()((()(((()()(((()((((((()()(()((((((((()()(()(()(())((((()())()))))(((()))((((())((((()())((()(())()((()((((((()((((((()(())))()())(((())())())()(())()(()())((()()((((())((((((())(()(((((()((((())()((((()(()(())(()())(((())()((())((((()))()((((((())(()(((()(((()((((((()(((()))(()()())())((()((()())()((((())(((()(()(((((((((())(())))()((()()()()(())((()))(((((((()(((((((((()(()))))(()((((((((()((((()((()()((((((()()(((((((()(()(())()(())((()()()((()(((((()())()(((((()())()()((()(()())(()()()(((()()(((((()((((((()()((()(()()()((((((((((((()((((((((()()(((()())))()(((()()(())())((((()((((()((((()()()(())(())((()(()(((((((((((((((()(())(())))))()()))((()(((()(())((()(((()(()()((((()()(((()(((()(((((()()((()(()(((()))((((((()((((((((()((()((())(((((()(((())(())())((()()))((((())()()((()(((()(((((()()(((()))(((()(()(((((((((((((()))((((((((()(((()))))())((((((((((((())((())((()())(((())((())(()((((((((((()(((())((()()(()((())(((((((((((()))((((((((((((()(()())((()((()((()(()(((()((((((((()()(()((()(()(((()))((()))(((((((((((((()(())((((((())(((()(())(()(()(()((()()))((((()((((()((((()))
)())((((()((((()))((((((()((((((()((()(((())))((())(()))(()((()((((()((()(((()()))((((()()()(((((((())(((())(()))())((((()())(((()(((((((((((()(()(()((()(((((((((((((((()()((((()((((((((()(((()()((()((((()))(((()(())((((((()((((())()((((()((()))(())()(()(((()((())())((((((()(()(())())(((())(()(()())(((((()((()((())()())(())))(((()(())))))))(((()(((()))()((()(((()()((()())()()))())))(((()))(()(((()(((((((((()(()(((((()()(((()())()()))))()(((()))(((()(()(()(()(()))()(())()))(()(((())))(()))))))))))(())((()((())((()(())()(())((()()((((()()((()()))((())(((()((()(())(())))()(()(((((()((()))())()(((((()()(((()(()((((((())(()))(())()))((()(()()))(())())()))(((())))(()((()(((())(())())))((()()((((((((((((((()((()(()()(()(((()))())()()((()()()(())(()))(()())(((())((())()(())()()(()()(())))((()(((()))))(((()()(()()))())((()((())()))((((()()()())((())))(((()(())(((((()(((((()((()(()((((()()(((()()()(((()())(((()()((((())(()))(((()))(())())((()))(((()((()))(((()()((())((()(((((()((((()()())((()))()((((()((()(()()()('
num_chars = len(string_to_count)
floor = 0
first_time = True
for ii in range(0, num_chars):
    if string_to_count[ii] == '(':
        floor += 1
    elif string_to_count[ii] == ')':
        floor -= 1
    if floor == -1 and first_time:
        print('First index:')
        print(ii)
        print('First pos')
        print(ii + 1)
        first_time = False
print(floor)

# file: moto/logs/__init__.py | repo: jonnangle/moto-1 | license: Apache-2.0

from .models import logs_backends
from ..core.models import base_decorator, deprecated_base_decorator
mock_logs = base_decorator(logs_backends)
mock_logs_deprecated = deprecated_base_decorator(logs_backends)
# file: code/UI/OpenAPI/python-flask-server/swagger_server/models/__init__.py | repo: ramseylab/RTX | license: MIT

# coding: utf-8
# flake8: noqa
from __future__ import absolute_import
# import models into model package
from swagger_server.models.edge import Edge
from swagger_server.models.edge_attribute import EdgeAttribute
from swagger_server.models.feedback import Feedback
from swagger_server.models.mesh_ngd_response import MeshNgdResponse
from swagger_server.models.node import Node
from swagger_server.models.node_attribute import NodeAttribute
from swagger_server.models.query import Query
from swagger_server.models.query_terms import QueryTerms
from swagger_server.models.question import Question
from swagger_server.models.response import Response
from swagger_server.models.response_feedback import ResponseFeedback
from swagger_server.models.result import Result
from swagger_server.models.result_feedback import ResultFeedback
from swagger_server.models.result_graph import ResultGraph
# file: pytimize/programs/tests/linear/test_init.py | repo: TerrayTM/pytimize | license: Apache-2.0

import numpy as np
import math
from ... import LinearProgram
from unittest import TestCase, main
class TestInit(TestCase):
    def setUp(self):
        self.A = np.array([
            [1, 2, 3],
            [4, 5, 6],
            [7, 8, 9]
        ])
        self.b = np.array([6, 15, 24])
        self.c = np.array([100, 200, 300])
        self.z = 5

    def test_init(self):
        inequalities = np.array(["=", "=", "="])
        p = LinearProgram(self.A, self.b, self.c, self.z, "min", inequalities)

        self.assertTrue(np.allclose(p.A, self.A), "Should construct coefficient matrix.")
        self.assertTrue(np.allclose(p.b, self.b), "Should construct constraint values.")
        self.assertTrue(np.allclose(p.c, self.c), "Should construct coefficient vector.")
        self.assertTrue(math.isclose(p.z, self.z), "Should construct constant.")
        self.assertEqual(p.objective, "min", "Should construct objective.")
        self.assertEqual(p.inequalities, ["=", "=", "="], "Should construct inequalities.")
        self.assertFalse(p.is_sef, "Should detect non-SEF.")
        self.assertTrue(np.issubdtype(p.A.dtype, np.floating), "Should be of type float.")
        self.assertTrue(np.issubdtype(p.b.dtype, np.floating), "Should be of type float.")
        self.assertTrue(np.issubdtype(p.c.dtype, np.floating), "Should be of type float.")
        self.assertIn(type(p.z), [float, int], "Should be of type float or int.")
        self.assertIsInstance(p.objective, str, "Should be of type string.")
        self.assertIsInstance(p.inequalities, list, "Should be of type list.")

        p = LinearProgram(self.A.tolist(), self.b.tolist(), self.c.tolist(), self.z)

        self.assertTrue(np.allclose(p.A, self.A), "Should construct coefficient matrix.")
        self.assertTrue(np.allclose(p.b, self.b), "Should construct constraint values.")
        self.assertTrue(np.allclose(p.c, self.c), "Should construct coefficient vector.")
        self.assertTrue(math.isclose(p.z, self.z), "Should construct constant.")
        self.assertTrue(p.is_sef, "Should detect SEF.")
        self.assertTrue(np.issubdtype(p.A.dtype, np.floating), "Should be of type float.")
        self.assertTrue(np.issubdtype(p.b.dtype, np.floating), "Should be of type float.")
        self.assertTrue(np.issubdtype(p.c.dtype, np.floating), "Should be of type float.")
        self.assertIn(type(p.z), [float, int], "Should be of type float or int.")
    def test_invalid_dimensions(self):
        A = np.array([
            [1, 2, 3, 4],
            [5, 6, 7, 8],
            [9, 10, 11, 12]
        ])

        with self.assertRaises(ValueError, msg="Should throw exception if dimension mismatch between A and c."):
            p = LinearProgram(A, self.b, self.c, self.z)

        A = [[1, 2, 3, 4], [5, 6, 8], [9, 10, 11, 12]]

        with self.assertRaises(ValueError, msg="Should throw exception if A is a jagged array."):
            p = LinearProgram(A, self.b, self.c, self.z)

        A = [[1, 2, 3, 4], [5, 6, 8, 10], [[9, 10], 10, 11, 12]]

        with self.assertRaises(ValueError, msg="Should throw exception if A has more than 2 dimensions."):
            p = LinearProgram(A, self.b, self.c, self.z)

        A = np.array([1, 2, 3, 4])

        with self.assertRaises(ValueError, msg="Should throw exception if A is one dimensional."):
            p = LinearProgram(A, self.b, self.c, self.z)

        b = np.array([
            [10, 20, 30]
        ])

        with self.assertRaises(ValueError, msg="Should throw exception if b is two dimensional."):
            p = LinearProgram(self.A, b, self.c, self.z)

        c = np.array([
            [1, 2, 3],
            [4, 5, 6]
        ])

        with self.assertRaises(ValueError, msg="Should throw exception if c is two dimensional."):
            p = LinearProgram(self.A, self.b, c, self.z)

        b = np.array([10, 20])

        with self.assertRaises(ValueError, msg="Should throw exception if dimension mismatch between A and b."):
            p = LinearProgram(self.A, b, self.c, self.z)

        with self.assertRaises(ValueError, msg="Should throw exception if arrays are empty."):
            p = LinearProgram([], [], [], self.z)

        with self.assertRaises(ValueError, msg="Should throw exception if dimension mismatch between inequalities and b."):
            p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=["="])

        with self.assertRaises(ValueError, msg="Should throw exception if dimension mismatch between inequalities and b."):
            p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=["=", "<=", ">=", ">="])

        with self.assertRaises(ValueError, msg="Should throw exception if dimension mismatch between inequalities and b."):
            p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=[])

        with self.assertRaises(ValueError, msg="Should throw exception if number of free variables is more than implied."):
            p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=None, free_variables=[1, 2, 3, 4])
    def test_invalid_values(self):
        A = np.array([
            ["a", 2, "b"],
            [5, 6, "c"],
            [9, "d", 12]
        ])

        with self.assertRaises(ValueError, msg="Should throw exception if type of A is incorrect."):
            p = LinearProgram(A, self.b, self.c, self.z)

        b = "test"

        with self.assertRaises(ValueError, msg="Should throw exception if type of b is incorrect."):
            p = LinearProgram(self.A, b, self.c, self.z)

        c = 6

        with self.assertRaises(ValueError, msg="Should throw exception if type of c is incorrect."):
            p = LinearProgram(self.A, self.b, c, self.z)

        with self.assertRaises(ValueError, msg="Should throw exception if type of inequalities is incorrect."):
            p = LinearProgram(self.A, self.b, self.c, self.z, inequalities="test")

        with self.assertRaises(ValueError, msg="Should throw exception if inequalities have invalid values."):
            p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=["+", 23])

        with self.assertRaises(ValueError, msg="Should throw exception if free variables have invalid indices."):
            p = LinearProgram(self.A, self.b, self.c, self.z, free_variables=[0, 1])

        with self.assertRaises(ValueError, msg="Should throw exception if free variables have invalid indices."):
            p = LinearProgram(self.A, self.b, self.c, self.z, free_variables=[4])
    def test_sef_detection(self):
        p = LinearProgram(self.A, self.b, self.c, self.z)
        self.assertTrue(p.is_sef, "Should detect SEF.")

        p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=["=", "=", "="])
        self.assertTrue(p.is_sef, "Should detect SEF.")

        p = LinearProgram(self.A, self.b, self.c, self.z, "min")
        self.assertFalse(p.is_sef, "Should detect non-SEF.")

        p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=["<=", "=", "="])
        self.assertFalse(p.is_sef, "Should detect non-SEF.")

        p = LinearProgram(self.A, self.b, self.c, self.z, inequalities=None, free_variables=[1])
        self.assertFalse(p.is_sef, "Should detect non-SEF.")

        p = LinearProgram(self.A, self.b, self.c, self.z, "min", None, [1])
        self.assertFalse(p.is_sef, "Should detect non-SEF.")


if __name__ == "__main__":
    main()
# file: doubleml_py_vs_r/tests/test_pliv_pyvsr.py | repo: DoubleML/doubleml-py-vs-r | license: MIT

import numpy as np
import pytest
import math
from sklearn.base import clone
from sklearn.linear_model import LinearRegression
import doubleml as dml
from _utils_pyvsr import export_smpl_split_to_r, r_MLPLIV, \
    r_MLPLIV_PARTIAL_X, r_MLPLIV_PARTIAL_Z, r_MLPLIV_PARTIAL_XZ
rpy2 = pytest.importorskip("rpy2")
from rpy2.robjects import pandas2ri
pandas2ri.activate()
@pytest.fixture(scope='module',
                params=['partialling out'])
def score(request):
    return request.param


@pytest.fixture(scope='module',
                params=['dml1', 'dml2'])
def dml_procedure(request):
    return request.param
@pytest.fixture(scope='module')
def dml_pliv_pyvsr_fixture(generate_data_pliv, score, dml_procedure):
    n_folds = 2

    # collect data
    obj_dml_data = generate_data_pliv

    # Set machine learning methods for g, m & r
    learner = LinearRegression()
    ml_g = clone(learner)
    ml_m = clone(learner)
    ml_r = clone(learner)

    np.random.seed(3141)
    dml_pliv_obj = dml.DoubleMLPLIV(obj_dml_data,
                                    ml_g, ml_m, ml_r,
                                    n_folds,
                                    dml_procedure=dml_procedure)
    dml_pliv_obj.fit()

    # fit the DML model in R
    all_train, all_test = export_smpl_split_to_r(dml_pliv_obj.smpls[0])

    r_dataframe = pandas2ri.py2rpy(obj_dml_data.data)
    res_r = r_MLPLIV(r_dataframe, 'partialling out', dml_procedure,
                     all_train, all_test)
    print(res_r)

    res_dict = {'coef_py': dml_pliv_obj.coef,
                'coef_r': res_r[0],
                'se_py': dml_pliv_obj.se,
                'se_r': res_r[1]}

    return res_dict


def test_dml_pliv_pyvsr_coef(dml_pliv_pyvsr_fixture):
    assert math.isclose(dml_pliv_pyvsr_fixture['coef_py'],
                        dml_pliv_pyvsr_fixture['coef_r'],
                        rel_tol=1e-9, abs_tol=1e-4)


def test_dml_pliv_pyvsr_se(dml_pliv_pyvsr_fixture):
    assert math.isclose(dml_pliv_pyvsr_fixture['se_py'],
                        dml_pliv_pyvsr_fixture['se_r'],
                        rel_tol=1e-9, abs_tol=1e-4)
@pytest.fixture(scope='module')
def dml_pliv_partial_x_pyvsr_fixture(generate_data_pliv_partialX, score, dml_procedure):
    n_folds = 2

    # collect data
    obj_dml_data = generate_data_pliv_partialX

    # Set machine learning methods for g, m & r
    learner = LinearRegression()
    ml_g = clone(learner)
    ml_m = clone(learner)
    ml_r = clone(learner)

    np.random.seed(3141)
    dml_pliv_obj = dml.DoubleMLPLIV(obj_dml_data,
                                    ml_g, ml_m, ml_r,
                                    n_folds,
                                    dml_procedure=dml_procedure)
    dml_pliv_obj.fit()

    # fit the DML model in R
    all_train, all_test = export_smpl_split_to_r(dml_pliv_obj.smpls[0])

    r_dataframe = pandas2ri.py2rpy(obj_dml_data.data)
    res_r = r_MLPLIV_PARTIAL_X(r_dataframe, 'partialling out', dml_procedure,
                               all_train, all_test)
    print(res_r)

    res_dict = {'coef_py': dml_pliv_obj.coef,
                'coef_r': res_r[0],
                'se_py': dml_pliv_obj.se,
                'se_r': res_r[1]}

    return res_dict


def test_dml_pliv_partial_x_pyvsr_coef(dml_pliv_partial_x_pyvsr_fixture):
    assert math.isclose(dml_pliv_partial_x_pyvsr_fixture['coef_py'],
                        dml_pliv_partial_x_pyvsr_fixture['coef_r'],
                        rel_tol=1e-9, abs_tol=1e-4)


def test_dml_pliv_partial_x_pyvsr_se(dml_pliv_partial_x_pyvsr_fixture):
    assert math.isclose(dml_pliv_partial_x_pyvsr_fixture['se_py'],
                        dml_pliv_partial_x_pyvsr_fixture['se_r'],
                        rel_tol=1e-9, abs_tol=1e-4)
@pytest.fixture(scope='module')
def dml_pliv_partial_z_pyvsr_fixture(generate_data_pliv_partialZ, score, dml_procedure):
    n_folds = 2

    # collect data
    obj_dml_data = generate_data_pliv_partialZ

    # Set machine learning method for r (only r is needed for partialling out Z)
    learner = LinearRegression()
    ml_r = clone(learner)

    np.random.seed(3141)
    dml_pliv_obj = dml.DoubleMLPLIV._partialZ(obj_dml_data,
                                              ml_r,
                                              n_folds,
                                              dml_procedure=dml_procedure)
    dml_pliv_obj.fit()

    # fit the DML model in R
    all_train, all_test = export_smpl_split_to_r(dml_pliv_obj.smpls[0])

    r_dataframe = pandas2ri.py2rpy(obj_dml_data.data)
    res_r = r_MLPLIV_PARTIAL_Z(r_dataframe, 'partialling out', dml_procedure,
                               all_train, all_test)
    print(res_r)

    res_dict = {'coef_py': dml_pliv_obj.coef,
                'coef_r': res_r[0],
                'se_py': dml_pliv_obj.se,
                'se_r': res_r[1]}

    return res_dict


def test_dml_pliv_partial_z_pyvsr_coef(dml_pliv_partial_z_pyvsr_fixture):
    assert math.isclose(dml_pliv_partial_z_pyvsr_fixture['coef_py'],
                        dml_pliv_partial_z_pyvsr_fixture['coef_r'],
                        rel_tol=1e-9, abs_tol=1e-4)


def test_dml_pliv_partial_z_pyvsr_se(dml_pliv_partial_z_pyvsr_fixture):
    assert math.isclose(dml_pliv_partial_z_pyvsr_fixture['se_py'],
                        dml_pliv_partial_z_pyvsr_fixture['se_r'],
                        rel_tol=1e-9, abs_tol=1e-4)
@pytest.fixture(scope='module')
def dml_pliv_partial_xz_pyvsr_fixture(generate_data_pliv_partialXZ, score, dml_procedure):
n_folds = 2
# collect data
obj_dml_data = generate_data_pliv_partialXZ
# Set machine learning methods for g, m & r
learner = LinearRegression()
ml_g = clone(learner)
ml_m = clone(learner)
ml_r = clone(learner)
np.random.seed(3141)
dml_pliv_obj = dml.DoubleMLPLIV._partialXZ(obj_dml_data,
ml_g, ml_m, ml_r,
n_folds,
dml_procedure=dml_procedure)
dml_pliv_obj.fit()
# fit the DML model in R
all_train, all_test = export_smpl_split_to_r(dml_pliv_obj.smpls[0])
r_dataframe = pandas2ri.py2rpy(obj_dml_data.data)
res_r = r_MLPLIV_PARTIAL_XZ(r_dataframe, 'partialling out', dml_procedure,
all_train, all_test)
print(res_r)
res_dict = {'coef_py': dml_pliv_obj.coef,
'coef_r': res_r[0],
'se_py': dml_pliv_obj.se,
'se_r': res_r[1]}
return res_dict
def test_dml_pliv_partial_xz_pyvsr_coef(dml_pliv_partial_xz_pyvsr_fixture):
assert math.isclose(dml_pliv_partial_xz_pyvsr_fixture['coef_py'],
dml_pliv_partial_xz_pyvsr_fixture['coef_r'],
rel_tol=1e-9, abs_tol=1e-4)
def test_dml_pliv_partial_xz_pyvsr_se(dml_pliv_partial_xz_pyvsr_fixture):
assert math.isclose(dml_pliv_partial_xz_pyvsr_fixture['se_py'],
dml_pliv_partial_xz_pyvsr_fixture['se_r'],
rel_tol=1e-9, abs_tol=1e-4)
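The comparisons above pass `math.isclose` both a relative and an absolute tolerance. Per the stdlib semantics, the check succeeds when `abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)`, so `abs_tol=1e-4` is what keeps the assertions usable when the Python and R estimates are both near zero. A minimal sketch of that behavior:

```python
import math

# math.isclose(a, b, rel_tol=..., abs_tol=...) passes when
#   abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)
# Near zero, a relative tolerance alone is useless, which is why the
# tests above also set abs_tol=1e-4.
print(math.isclose(0.0, 5e-5, rel_tol=1e-9))                # False
print(math.isclose(0.0, 5e-5, rel_tol=1e-9, abs_tol=1e-4))  # True
```

The second call succeeds because the absolute difference (5e-5) is below `abs_tol`, even though it is huge relative to the magnitudes involved.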
b275d1dfd23a520b62b7b9348871093a5043b1e6 | 130 | py | Python | pgdrive/__init__.py | decisionforce/pgdrive | 19af5d09a40a68a2a5f8b3ac8b40f109e71c26ee | ["Apache-2.0"]
import pgdrive.register
from pgdrive.envs import PGDriveEnv, TopDownPGDriveEnv, TopDownSingleFramePGDriveEnv, TopDownPGDriveEnvV2
b2808b96aae7f7b1d8ee5385521c46c558330b28 | 99 | py | Python | quickspy/net/__init__.py | kirte2849/Quickspy | 767d0fb8ded283aa0d8122d77e15dc411f553994 | ["MIT"]
from .netengine import Response
from .netengine import NetEngine
from .urlmanager import UrlManager
b2ad56ad9dd418278e94f8b87c47e83887f695a4 | 28,415 | py | Python | tests/monitoring/monitoring_service/test_handlers.py | hackaugusto/raiden-services | a25794ee1448534e5234f196b01a9bc13fab7ca0 | ["MIT"]
# pylint: disable=redefined-outer-name
import pytest
from eth_utils import to_checksum_address
from tests.libs.mocks.web3 import Web3Mock
from tests.monitoring.monitoring_service.factories import (
DEFAULT_CHANNEL_IDENTIFIER,
DEFAULT_PARTICIPANT1,
DEFAULT_PARTICIPANT2,
DEFAULT_SETTLE_TIMEOUT,
DEFAULT_TOKEN_NETWORK_ADDRESS,
create_signed_monitor_request,
)
from monitoring_service.database import Database
from monitoring_service.events import (
ActionClaimRewardTriggeredEvent,
ActionMonitoringTriggeredEvent,
)
from monitoring_service.handlers import (
Context,
action_claim_reward_triggered_event_handler,
action_monitoring_triggered_event_handler,
channel_closed_event_handler,
channel_opened_event_handler,
channel_settled_event_handler,
monitor_new_balance_proof_event_handler,
monitor_reward_claim_event_handler,
non_closing_balance_proof_updated_event_handler,
updated_head_block_event_handler,
)
from monitoring_service.states import OnChainUpdateStatus
from raiden.utils.typing import Address, BlockNumber, ChannelID, Nonce, TokenAmount
from raiden_contracts.constants import ChannelState
from raiden_contracts.tests.utils import get_random_privkey
from raiden_libs.constants import UDC_SECURITY_MARGIN_FACTOR_MS
from raiden_libs.events import (
Event,
ReceiveChannelClosedEvent,
ReceiveChannelOpenedEvent,
ReceiveChannelSettledEvent,
ReceiveMonitoringNewBalanceProofEvent,
ReceiveMonitoringRewardClaimedEvent,
ReceiveNonClosingBalanceProofUpdatedEvent,
)
@pytest.fixture(autouse=True)
def mock_first_allowed_block(monkeypatch):
monkeypatch.setattr(
"monitoring_service.handlers._first_allowed_block_to_monitor", Mock(return_value=1)
)
def assert_channel_state(context: Context, state: ChannelState):
channel = context.db.get_channel(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS, channel_id=DEFAULT_CHANNEL_IDENTIFIER
)
assert channel
assert channel.state == state
def create_default_token_network(context):
context.db.conn.execute(
"INSERT INTO token_network (address) VALUES (?)",
[to_checksum_address(DEFAULT_TOKEN_NETWORK_ADDRESS)],
)
def setup_state_with_open_channel(context: Context) -> Context:
create_default_token_network(context)
event = ReceiveChannelOpenedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
participant1=DEFAULT_PARTICIPANT1,
participant2=DEFAULT_PARTICIPANT2,
settle_timeout=DEFAULT_SETTLE_TIMEOUT,
block_number=BlockNumber(42),
)
assert context.db.channel_count() == 0
channel_opened_event_handler(event, context)
return context
def setup_state_with_closed_channel(context: Context) -> Context:
context = setup_state_with_open_channel(context)
event = ReceiveChannelClosedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(52),
)
channel_closed_event_handler(event, context)
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.CLOSED)
return context
@pytest.fixture
def context(ms_database: Database):
return Context(
ms_state=ms_database.load_state(),
db=ms_database,
web3=Web3Mock(),
monitoring_service_contract=Mock(),
user_deposit_contract=Mock(),
min_reward=1,
required_confirmations=1,
)
def test_event_handler_ignore_other_events(context: Context):
event = Event()
for handler in [
channel_opened_event_handler,
channel_closed_event_handler,
non_closing_balance_proof_updated_event_handler,
channel_settled_event_handler,
monitor_new_balance_proof_event_handler,
monitor_reward_claim_event_handler,
action_monitoring_triggered_event_handler,
action_claim_reward_triggered_event_handler,
updated_head_block_event_handler,
]:
with pytest.raises(AssertionError):
handler(event=event, context=context)
def test_channel_opened_event_handler_adds_channel(context: Context):
create_default_token_network(context)
event = ReceiveChannelOpenedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
participant1=DEFAULT_PARTICIPANT1,
participant2=DEFAULT_PARTICIPANT2,
settle_timeout=100,
block_number=BlockNumber(42),
)
assert context.db.channel_count() == 0
channel_opened_event_handler(event, context)
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.OPENED)
def test_channel_closed_event_handler_closes_existing_channel(context: Context):
context = setup_state_with_open_channel(context)
context.web3.eth.blockNumber = BlockNumber(60)
event = ReceiveChannelClosedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(52),
)
channel_closed_event_handler(event, context)
# ActionMonitoringTriggeredEvent has been triggered
assert context.db.scheduled_event_count() == 1
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.CLOSED)
def test_channel_closed_event_handler_idempotency(context: Context):
context = setup_state_with_open_channel(context)
context.web3.eth.blockNumber = BlockNumber(60)
event = ReceiveChannelClosedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(52),
)
channel_closed_event_handler(event, context)
# ActionMonitoringTriggeredEvent has been triggered
assert context.db.scheduled_event_count() == 1
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.CLOSED)
# run handler again, check idempotency
channel_closed_event_handler(event, context)
assert context.db.scheduled_event_count() == 1
def test_channel_closed_event_handler_ignores_existing_channel_after_timeout(context: Context):
context = setup_state_with_open_channel(context)
context.web3.eth.blockNumber = BlockNumber(200)
event = ReceiveChannelClosedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(52),
)
channel_closed_event_handler(event, context)
# no ActionMonitoringTriggeredEvent has been triggered
assert context.db.scheduled_event_count() == 0
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.CLOSED)
def test_channel_closed_event_handler_leaves_existing_channel(context: Context):
context = setup_state_with_open_channel(context)
event = ReceiveChannelClosedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=ChannelID(4),
closing_participant=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(52),
)
channel_closed_event_handler(event, context)
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.OPENED)
def test_channel_closed_event_handler_trigger_action_monitor_event_with_monitor_request(
context: Context,
):
context = setup_state_with_open_channel(context)
# add MR to DB
context.db.upsert_monitor_request(create_signed_monitor_request())
event = ReceiveChannelClosedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(52),
)
channel_closed_event_handler(event, context)
assert context.db.scheduled_event_count() == 1
def test_channel_closed_event_handler_trigger_action_monitor_event_without_monitor_request(
context: Context,
):
context = setup_state_with_open_channel(context)
event = ReceiveChannelClosedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(52),
)
channel_closed_event_handler(event, context)
assert context.db.scheduled_event_count() == 1
def test_channel_settled_event_handler_settles_existing_channel(context: Context):
context = setup_state_with_closed_channel(context)
event = ReceiveChannelSettledEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
block_number=BlockNumber(52),
)
channel_settled_event_handler(event, context)
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.SETTLED)
def test_channel_settled_event_handler_leaves_existing_channel(context: Context):
context = setup_state_with_closed_channel(context)
event = ReceiveChannelSettledEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=ChannelID(4),
block_number=BlockNumber(52),
)
channel_settled_event_handler(event, context)
assert context.db.channel_count() == 1
assert_channel_state(context, ChannelState.CLOSED)
def test_channel_bp_updated_event_handler_sets_update_status_if_not_set(context: Context):
context = setup_state_with_closed_channel(context)
event_bp = ReceiveNonClosingBalanceProofUpdatedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
nonce=Nonce(2),
block_number=BlockNumber(23),
)
channel = context.db.get_channel(event_bp.token_network_address, event_bp.channel_identifier)
assert channel
assert channel.update_status is None
non_closing_balance_proof_updated_event_handler(event_bp, context)
assert context.db.channel_count() == 1
channel = context.db.get_channel(event_bp.token_network_address, event_bp.channel_identifier)
assert channel
assert channel.update_status is not None
assert channel.update_status.nonce == 2
assert channel.update_status.update_sender_address == DEFAULT_PARTICIPANT1
event_bp2 = ReceiveNonClosingBalanceProofUpdatedEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
closing_participant=DEFAULT_PARTICIPANT2,
nonce=Nonce(5),
block_number=BlockNumber(53),
)
non_closing_balance_proof_updated_event_handler(event_bp2, context)
assert context.db.channel_count() == 1
channel = context.db.get_channel(event_bp.token_network_address, event_bp.channel_identifier)
assert channel
assert channel.update_status is not None
assert channel.update_status.nonce == 5
assert channel.update_status.update_sender_address == DEFAULT_PARTICIPANT1
def test_monitor_new_balance_proof_event_handler_sets_update_status(context: Context):
context = setup_state_with_closed_channel(context)
new_balance_event = ReceiveMonitoringNewBalanceProofEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
reward_amount=TokenAmount(1),
nonce=Nonce(2),
ms_address=Address(bytes([4] * 20)),
raiden_node_address=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(23),
)
channel = context.db.get_channel(
new_balance_event.token_network_address, new_balance_event.channel_identifier
)
assert channel
assert channel.update_status is None
monitor_new_balance_proof_event_handler(new_balance_event, context)
assert context.db.channel_count() == 1
channel = context.db.get_channel(
new_balance_event.token_network_address, new_balance_event.channel_identifier
)
assert channel
assert channel.update_status is not None
assert channel.update_status.nonce == 2
assert channel.update_status.update_sender_address == bytes([4] * 20)
new_balance_event2 = ReceiveMonitoringNewBalanceProofEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
reward_amount=TokenAmount(1),
nonce=Nonce(5),
ms_address=Address(bytes([4] * 20)),
raiden_node_address=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(23),
)
monitor_new_balance_proof_event_handler(new_balance_event2, context)
assert context.db.channel_count() == 1
channel = context.db.get_channel(
new_balance_event.token_network_address, new_balance_event.channel_identifier
)
assert channel
assert channel.update_status is not None
assert channel.update_status.nonce == 5
assert channel.update_status.update_sender_address == bytes([4] * 20)
def test_monitor_new_balance_proof_event_handler_idempotency(context: Context):
context = setup_state_with_closed_channel(context)
new_balance_event = ReceiveMonitoringNewBalanceProofEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
reward_amount=TokenAmount(1),
nonce=Nonce(2),
ms_address=Address(bytes([3] * 20)),
raiden_node_address=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(23),
)
channel = context.db.get_channel(
new_balance_event.token_network_address, new_balance_event.channel_identifier
)
assert channel
assert channel.update_status is None
monitor_new_balance_proof_event_handler(new_balance_event, context)
assert context.db.scheduled_event_count() == 1
assert context.db.channel_count() == 1
channel = context.db.get_channel(
new_balance_event.token_network_address, new_balance_event.channel_identifier
)
assert channel
assert channel.update_status is not None
assert channel.update_status.nonce == 2
assert channel.update_status.update_sender_address == bytes([3] * 20)
monitor_new_balance_proof_event_handler(new_balance_event, context)
assert context.db.scheduled_event_count() == 1
assert context.db.channel_count() == 1
channel = context.db.get_channel(
new_balance_event.token_network_address, new_balance_event.channel_identifier
)
assert channel
assert channel.update_status is not None
assert channel.update_status.nonce == 2
assert channel.update_status.update_sender_address == bytes([3] * 20)
def test_monitor_reward_claimed_event_handler(context: Context, log):
context = setup_state_with_closed_channel(context)
claim_event = ReceiveMonitoringRewardClaimedEvent(
ms_address=context.ms_state.address,
amount=TokenAmount(1),
reward_identifier="REWARD",
block_number=BlockNumber(23),
)
monitor_reward_claim_event_handler(claim_event, context)
assert log.has("Successfully claimed reward")
claim_event.ms_address = Address(bytes([3] * 20))
monitor_reward_claim_event_handler(claim_event, context)
assert log.has("Another MS claimed reward")
def test_action_monitoring_triggered_event_handler_does_not_trigger_monitor_call_when_nonce_to_small( # noqa
context: Context,
):
context = setup_state_with_closed_channel(context)
event3 = ReceiveMonitoringNewBalanceProofEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
reward_amount=TokenAmount(1),
nonce=Nonce(5),
ms_address=Address(bytes([3] * 20)),
raiden_node_address=DEFAULT_PARTICIPANT2,
block_number=BlockNumber(23),
)
channel = context.db.get_channel(event3.token_network_address, event3.channel_identifier)
assert channel
assert channel.update_status is None
monitor_new_balance_proof_event_handler(event3, context)
# add MR to DB, with nonce being smaller than in event3
context.db.upsert_monitor_request(create_signed_monitor_request(nonce=Nonce(4)))
event4 = ActionMonitoringTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
channel = context.db.get_channel(event4.token_network_address, event4.channel_identifier)
assert channel
assert channel.update_status is not None
assert channel.monitor_tx_hash is None
action_monitoring_triggered_event_handler(event4, context)
assert context.db.channel_count() == 1
assert channel
assert channel.monitor_tx_hash is None
def test_action_monitoring_rescheduling_when_user_lacks_funds(context: Context):
reward_amount = TokenAmount(10)
context = setup_state_with_closed_channel(context)
context.db.upsert_monitor_request(
create_signed_monitor_request(nonce=Nonce(6), reward_amount=reward_amount)
)
event = ActionMonitoringTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
scheduled_events_before = context.db.get_scheduled_events(max_trigger_block=BlockNumber(10000))
# Try to call monitor when the user has insufficient funds
with patch("monitoring_service.handlers.get_pessimistic_udc_balance", Mock(return_value=0)):
action_monitoring_triggered_event_handler(event, context)
assert not context.monitoring_service_contract.functions.monitor.called
# Now the event must have been rescheduled
# TODO: check that the event is rescheduled to trigger at the right block
scheduled_events_after = context.db.get_scheduled_events(max_trigger_block=BlockNumber(10000))
new_events = set(scheduled_events_after) - set(scheduled_events_before)
assert len(new_events) == 1
assert new_events.pop().event == event
# With sufficient funds it must succeed
with patch(
"monitoring_service.handlers.get_pessimistic_udc_balance",
Mock(return_value=reward_amount * UDC_SECURITY_MARGIN_FACTOR_MS),
):
action_monitoring_triggered_event_handler(event, context)
assert context.monitoring_service_contract.functions.monitor.called
def test_action_monitoring_triggered_event_handler_with_sufficient_balance_does_trigger_monitor_call( # noqa
context: Context,
):
""" Tests that `monitor` is called when the ActionMonitoringTriggeredEvent is triggered and
user has sufficient balance in user deposit contract
Also a test for https://github.com/raiden-network/raiden-services/issues/29 , as the MR
is sent after the channel has been closed.
"""
context = setup_state_with_closed_channel(context)
context.db.upsert_monitor_request(
create_signed_monitor_request(nonce=Nonce(6), reward_amount=TokenAmount(10))
)
trigger_event = ActionMonitoringTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
channel = context.db.get_channel(
trigger_event.token_network_address, trigger_event.channel_identifier
)
assert channel
assert channel.monitor_tx_hash is None
context.user_deposit_contract.functions.effectiveBalance(
DEFAULT_PARTICIPANT2
).call.return_value = 21
action_monitoring_triggered_event_handler(trigger_event, context)
# check that the monitor call has been done
assert context.monitoring_service_contract.functions.monitor.called is True
def test_action_monitoring_triggered_event_handler_with_insufficient_reward_amount_does_not_trigger_monitor_call( # noqa
context: Context,
):
""" Tests that `monitor` is not called when the ActionMonitoringTriggeredEvent is triggered but
the monitor request shows an insufficient reward amount
"""
context = setup_state_with_closed_channel(context)
context.db.upsert_monitor_request(
create_signed_monitor_request(nonce=Nonce(6), reward_amount=TokenAmount(0))
)
trigger_event = ActionMonitoringTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
channel = context.db.get_channel(
trigger_event.token_network_address, trigger_event.channel_identifier
)
assert channel
assert channel.monitor_tx_hash is None
context.user_deposit_contract.functions.effectiveBalance(
DEFAULT_PARTICIPANT2
).call.return_value = 21
action_monitoring_triggered_event_handler(trigger_event, context)
# check that the monitor call has been done
assert context.monitoring_service_contract.functions.monitor.called is False
def test_action_monitoring_triggered_event_handler_without_sufficient_balance_doesnt_trigger_monitor_call( # noqa
context: Context,
):
""" Tests that `monitor` is not called when user has insufficient balance in user deposit contract
Also a test for https://github.com/raiden-network/raiden-services/issues/29 , as the MR
is sent after the channel has been closed.
"""
context = setup_state_with_closed_channel(context)
context.db.upsert_monitor_request(
create_signed_monitor_request(nonce=Nonce(6), reward_amount=TokenAmount(10))
)
trigger_event = ActionMonitoringTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
channel = context.db.get_channel(
trigger_event.token_network_address, trigger_event.channel_identifier
)
assert channel
assert channel.monitor_tx_hash is None
context.user_deposit_contract.functions.effectiveBalance(
DEFAULT_PARTICIPANT2
).call.return_value = 0
action_monitoring_triggered_event_handler(trigger_event, context)
# check that the monitor call has been done
assert context.monitoring_service_contract.functions.monitor.called is False
def test_mr_available_before_channel_triggers_monitor_call(context: Context):
""" Tests that the MR is read from the DB, even if it is supplied before the channel was opened.
See https://github.com/raiden-network/raiden-services/issues/26
"""
# add MR to DB
context.db.upsert_monitor_request(create_signed_monitor_request())
context = setup_state_with_closed_channel(context)
event = ActionMonitoringTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
context.user_deposit_contract.functions.effectiveBalance(
DEFAULT_PARTICIPANT2
).call.return_value = 100
action_monitoring_triggered_event_handler(event, context)
# check that the monitor call has been done
assert context.monitoring_service_contract.functions.monitor.called is True
def test_mr_with_unknown_signatures(context: Context):
""" The signatures are valid but don't belong to the participants.
"""
context = setup_state_with_closed_channel(context)
def assert_mr_is_ignored(mr):
context.db.upsert_monitor_request(mr)
event = ActionMonitoringTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
action_monitoring_triggered_event_handler(event, context)
assert not context.monitoring_service_contract.functions.monitor.called
assert_mr_is_ignored(create_signed_monitor_request(closing_privkey=get_random_privkey()))
assert_mr_is_ignored(create_signed_monitor_request(nonclosing_privkey=get_random_privkey()))
def test_action_claim_reward_triggered_event_handler_does_trigger_claim_call( # noqa
context: Context,
):
""" Tests that `claimReward` is called when the ActionMonitoringTriggeredEvent is triggered and
user has sufficient balance in user deposit contract
"""
context = setup_state_with_closed_channel(context)
context.db.upsert_monitor_request(
create_signed_monitor_request(nonce=Nonce(6), reward_amount=TokenAmount(10))
)
trigger_event = ActionClaimRewardTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
channel = context.db.get_channel(
trigger_event.token_network_address, trigger_event.channel_identifier
)
assert channel
assert channel.claim_tx_hash is None
# Set update state
channel.update_status = OnChainUpdateStatus(
update_sender_address=context.ms_state.address, nonce=Nonce(6)
)
context.db.upsert_channel(channel)
action_claim_reward_triggered_event_handler(trigger_event, context)
# check that the monitor call has been done
assert context.monitoring_service_contract.functions.claimReward.called is True
def test_action_claim_reward_triggered_event_handler_without_reward_doesnt_trigger_claim_call( # noqa
context: Context,
):
""" Tests that `claimReward` is called when the ActionMonitoringTriggeredEvent is triggered and
user has sufficient balance in user deposit contract
"""
context = setup_state_with_closed_channel(context)
context.db.upsert_monitor_request(
create_signed_monitor_request(nonce=Nonce(6), reward_amount=TokenAmount(0))
)
trigger_event = ActionClaimRewardTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
channel = context.db.get_channel(
trigger_event.token_network_address, trigger_event.channel_identifier
)
assert channel
assert channel.claim_tx_hash is None
# Set update state
channel.update_status = OnChainUpdateStatus(
update_sender_address=context.ms_state.address, nonce=Nonce(6)
)
context.db.upsert_channel(channel)
action_claim_reward_triggered_event_handler(trigger_event, context)
# check that the monitor call has been done
assert context.monitoring_service_contract.functions.claimReward.called is False
def test_action_claim_reward_triggered_event_handler_without_update_state_doesnt_trigger_claim_call( # noqa
context: Context,
):
""" Tests that `claimReward` is called when the ActionMonitoringTriggeredEvent is triggered and
user has sufficient balance in user deposit contract
"""
context = setup_state_with_closed_channel(context)
context.db.upsert_monitor_request(
create_signed_monitor_request(nonce=Nonce(6), reward_amount=TokenAmount(0))
)
trigger_event = ActionClaimRewardTriggeredEvent(
token_network_address=DEFAULT_TOKEN_NETWORK_ADDRESS,
channel_identifier=DEFAULT_CHANNEL_IDENTIFIER,
non_closing_participant=DEFAULT_PARTICIPANT2,
)
channel = context.db.get_channel(
trigger_event.token_network_address, trigger_event.channel_identifier
)
assert channel
assert channel.claim_tx_hash is None
# Set update state
channel.update_status = OnChainUpdateStatus(
update_sender_address=Address(bytes([1] * 20)), nonce=Nonce(6)
)
context.db.upsert_channel(channel)
action_claim_reward_triggered_event_handler(trigger_event, context)
# check that the monitor call has been done
assert context.monitoring_service_contract.functions.claimReward.called is False
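The tests above stub the on-chain contracts with `unittest.mock.Mock` and then assert on `.called` to verify whether a handler reached the contract call. A minimal self-contained sketch of that pattern (the `handle` function and its balance guard are illustrative, not part of raiden-services):

```python
from unittest.mock import Mock

# Hypothetical handler mirroring the pattern used in the tests above:
# it checks a precondition and only then touches the contract mock.
def handle(contract, balance):
    if balance > 0:
        contract.functions.monitor().transact()

contract = Mock()

handle(contract, balance=0)
print(contract.functions.monitor.called)   # False: guarded call was skipped

handle(contract, balance=21)
print(contract.functions.monitor.called)   # True: the Mock recorded the call
```

Because `Mock` auto-creates attributes, `contract.functions.monitor` exists without any setup, and every invocation is recorded, which is exactly what the `.called is True` / `.called is False` assertions in the tests rely on.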
a24637c3ebb8c3b8d0752641d8358ed8a1440231 | 26 | py | Python | ephypype/interfaces/__init__.py | wmvanvliet/ephypype | bb4226eda7f892a9564b9503a6ba7d22a46ce2db | ["BSD-3-Clause"]
from . import mne  # noqa
a25b33c74bf4817798abdcbe96b9bcf574dbb9be | 2,644 | py | Python | vggface/resnet50/layers.py | claudiourbina/VGGFace | 362cc8f805c1fd4135fddf8d602026735bcfdf5a | ["MIT"]
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import (
Layer,
Conv2D,
BatchNormalization,
Activation
)
def conv2d_block(
inp,
filters:list,
kernel_size:tuple,
stage:int,
block:int,
strides:tuple=(2, 2),
use_bias:bool=False
):
bn_axis = 3 if K.image_data_format() == 'channels_last' else 1
name = f"conv{stage}_{block}"
x = Conv2D(
filters=filters[0],
kernel_size=(1, 1),
strides=strides,
use_bias=use_bias,
name=f"{name}_red"
)(inp)
x = BatchNormalization(
axis=bn_axis,
name=f"{name}_bn1"
)(x)
x = Activation('relu')(x)
x = Conv2D(
filters=filters[1],
kernel_size=kernel_size,
padding='same',
use_bias=use_bias,
name=f"{name}"
)(x)
x = BatchNormalization(
axis=bn_axis,
name=f"{name}_bn2"
)(x)
x = Activation('relu')(x)
x = Conv2D(
filters=filters[2],
kernel_size=(1, 1),
name=f"{name}_inc",
use_bias=use_bias
)(x)
x = BatchNormalization(
axis=bn_axis,
name=f"{name}_bn3"
)(x)
short = Conv2D(
filters=filters[2],
kernel_size=(1, 1),
strides=strides,
use_bias=use_bias,
name=f"{name}_short",
)(inp)
short = BatchNormalization(
axis=bn_axis,
name=f"{name}_bn4"
)(short)
x = tf.keras.layers.add([x, short])
x = Activation('relu')(x)
return x
def identity_block(
inp,
filters:list,
kernel_size:tuple,
stage:int,
block:int,
use_bias:bool=False):
bn_axis = 3 if K.image_data_format() == 'channels_last' else 1
name = f"conv{stage}_{block}"
x = Conv2D(
filters=filters[0],
kernel_size=(1, 1),
use_bias=use_bias,
name=f"{name}_red"
)(inp)
x = BatchNormalization(
axis=bn_axis,
name=f"{name}_bn1"
)(x)
x = Activation('relu')(x)
x = Conv2D(
filters=filters[1],
kernel_size=kernel_size,
use_bias=use_bias,
padding='same',
name=f"{name}"
)(x)
x = BatchNormalization(
axis=bn_axis,
name=f"{name}_bn2"
)(x)
x = Activation('relu')(x)
x = Conv2D(
filters=filters[2],
kernel_size=(1, 1),
use_bias=use_bias,
name=f"{name}_inc"
)(x)
x = BatchNormalization(
axis=bn_axis,
name=f"{name}_bn3"
)(x)
x = tf.keras.layers.add([x, inp])
x = Activation('relu')(x)
return x | 21.152 | 66 | 0.539334 | 336 | 2,644 | 4.071429 | 0.172619 | 0.081871 | 0.092105 | 0.071637 | 0.811404 | 0.811404 | 0.751462 | 0.714181 | 0.69883 | 0.69883 | 0 | 0.021715 | 0.320726 | 2,644 | 125 | 67 | 21.152 | 0.739978 | 0 | 0 | 0.785714 | 0 | 0 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017857 | false | 0 | 0.026786 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a263ac689ab7f1d5e823a75ea35dfd68b19ae412 | 39 | py | Python | expanse_su_estimator/__init__.py | alex-wenzel/expanse-su-estimator | 9616298aa78bf5db7275215a93385e16b3b97720 | [
"MIT"
] | null | null | null | expanse_su_estimator/__init__.py | alex-wenzel/expanse-su-estimator | 9616298aa78bf5db7275215a93385e16b3b97720 | [
"MIT"
] | null | null | null | expanse_su_estimator/__init__.py | alex-wenzel/expanse-su-estimator | 9616298aa78bf5db7275215a93385e16b3b97720 | [
"MIT"
] | null | null | null | from .sbatch_parser import SBATCHScript | 39 | 39 | 0.897436 | 5 | 39 | 6.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 39 | 1 | 39 | 39 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a2c369352f3aa7bd49a2dabacd0d34b033718649 | 157 | py | Python | napari_zelda/_reader.py | DragaDoncila/napari-zelda | 78f8d25fbc04bc53493f186663c029e6469af7de | [
"BSD-3-Clause"
] | 9 | 2021-09-10T18:16:31.000Z | 2022-02-27T09:56:40.000Z | napari_zelda/_reader.py | DragaDoncila/napari-zelda | 78f8d25fbc04bc53493f186663c029e6469af7de | [
"BSD-3-Clause"
] | 1 | 2021-11-18T17:47:01.000Z | 2021-11-18T17:50:57.000Z | napari_zelda/_reader.py | DragaDoncila/napari-zelda | 78f8d25fbc04bc53493f186663c029e6469af7de | [
"BSD-3-Clause"
] | 2 | 2021-12-21T00:08:20.000Z | 2022-01-30T19:50:57.000Z | from napari_plugin_engine import napari_hook_implementation
@napari_hook_implementation
def napari_get_reader():
pass
def reader_function():
pass
| 15.7 | 59 | 0.815287 | 20 | 157 | 5.95 | 0.6 | 0.168067 | 0.403361 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140127 | 157 | 9 | 60 | 17.444444 | 0.881481 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0.333333 | 0.166667 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
a2dfa34834d519f7b23550e498f3b027300e1b0f | 2,247 | py | Python | UI/app/models.py | sanilrod/Contactless-Attendence-System | a0246047df139807dd636ca102b8151c456292ae | [
"MIT"
] | 1 | 2021-05-30T11:15:39.000Z | 2021-05-30T11:15:39.000Z | UI/app/models.py | sanilrod/Contactless-Attendence-System | a0246047df139807dd636ca102b8151c456292ae | [
"MIT"
] | null | null | null | UI/app/models.py | sanilrod/Contactless-Attendence-System | a0246047df139807dd636ca102b8151c456292ae | [
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.models import User
# Create your models here.
class Teacher(models.Model):
name = models.CharField(max_length=255)
face_image1 = models.ImageField(max_length=255,upload_to="media/uploaded_images/")
face_image2 = models.ImageField(max_length=255,blank=True,upload_to="media/uploaded_images/")
face_image3 = models.ImageField(max_length=255,blank=True,upload_to="media/uploaded_images/")
details = models.TextField(null=True,blank=True)
address = models.TextField(null=True,blank=True)
username = models.CharField(max_length=255,null=True)
password = models.CharField(max_length=255, null=True)
email = models.CharField(max_length=255, null=True)
def __str__(self):
return str(self.name)
class AttendanceSetting(models.Model):
college_time = models.CharField(max_length=255,null=True,help_text="In Am")
attendance_time = models.CharField(max_length=255,null=True,help_text="In Am")
no_late_marks = models.IntegerField(null=True,help_text="After which Absentee will be marked")
no_early_marks = models.IntegerField(null=True,help_text="After which Absentee will be marked")
half_day_time = models.CharField(max_length=255,null=True,help_text="In Am")
checkout_time = models.CharField(max_length=255,null=True,help_text="In Pm")
class Attendance(models.Model):
date = models.DateTimeField(auto_now_add=True, blank=True)
teacher = models.CharField(max_length=255)
attendance_marked = models.BooleanField(default=False)
late_mark = models.BooleanField(default=False)
half_day = models.BooleanField(default=False)
image_path = models.CharField(max_length=255,null=True)
def __str__(self):
return str(self.teacher)+"==>"+str(self.date)
class Checkout(models.Model):
date = models.DateTimeField(auto_now_add=True, blank=True)
teacher = models.CharField(max_length=255)
attendance_marked = models.BooleanField(default=False)
early_mark = models.BooleanField(default=False)
half_day = models.BooleanField(default=False)
image_path = models.CharField(max_length=255, null=True)
def __str__(self):
return str(self.teacher) + "==>" + str(self.date)
| 40.854545 | 99 | 0.749889 | 310 | 2,247 | 5.232258 | 0.245161 | 0.083231 | 0.110974 | 0.177559 | 0.823674 | 0.789766 | 0.72873 | 0.685573 | 0.685573 | 0.685573 | 0 | 0.02459 | 0.131286 | 2,247 | 54 | 100 | 41.611111 | 0.806352 | 0.010681 | 0 | 0.384615 | 0 | 0 | 0.072973 | 0.02973 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.025641 | 0.051282 | 0.076923 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
0c2eaf6aa6156fc340f9372f3679d9e31835edb0 | 34 | py | Python | parser/calc/__init__.py | tehmaze/parser | ccc69236304b2f00671f14c62433e8830b838101 | [
"MIT"
] | 2 | 2019-02-02T06:35:28.000Z | 2019-03-08T07:22:58.000Z | parser/calc/__init__.py | tehmaze/parser | ccc69236304b2f00671f14c62433e8830b838101 | [
"MIT"
] | null | null | null | parser/calc/__init__.py | tehmaze/parser | ccc69236304b2f00671f14c62433e8830b838101 | [
"MIT"
] | null | null | null | from parser.calc.base import calc
| 17 | 33 | 0.823529 | 6 | 34 | 4.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0c5d3cd18109e9c58cdcb4789f23f670f1b0778f | 27 | py | Python | pbj/electrostatics/__init__.py | bem4solvation/pbj | 4fa9c111596359192539787ae241a79d4316b15b | [
"MIT"
] | null | null | null | pbj/electrostatics/__init__.py | bem4solvation/pbj | 4fa9c111596359192539787ae241a79d4316b15b | [
"MIT"
] | 1 | 2022-02-18T17:34:37.000Z | 2022-02-18T17:34:37.000Z | pbj/electrostatics/__init__.py | bem4solvation/pbj | 4fa9c111596359192539787ae241a79d4316b15b | [
"MIT"
] | null | null | null | from .solute import Solute
| 13.5 | 26 | 0.814815 | 4 | 27 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 27 | 1 | 27 | 27 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a73d5d520d3f591e84e0442f600716d1c8d127ab | 5,940 | py | Python | tests/odm/query/test_update.py | yo-mo/beanie | 1641dd81be64dd1dc11af667deb2e50feb2de2be | [
"Apache-2.0"
] | 574 | 2021-03-16T12:49:12.000Z | 2022-03-30T11:45:33.000Z | tests/odm/query/test_update.py | yo-mo/beanie | 1641dd81be64dd1dc11af667deb2e50feb2de2be | [
"Apache-2.0"
] | 148 | 2021-03-16T22:02:37.000Z | 2022-03-31T21:04:47.000Z | tests/odm/query/test_update.py | yo-mo/beanie | 1641dd81be64dd1dc11af667deb2e50feb2de2be | [
"Apache-2.0"
] | 53 | 2021-03-16T21:53:14.000Z | 2022-03-31T12:51:51.000Z | import asyncio
import pytest
from beanie.odm.operators.update.general import Set, Max
from tests.odm.models import Sample
async def test_update_query():
q = (
Sample.find_many(Sample.integer == 1)
.update(Set({Sample.integer: 10}))
.update_query
)
assert q == {"$set": {"integer": 10}}
q = (
Sample.find_many(Sample.integer == 1)
.update(Max({Sample.integer: 10}), Set({Sample.optional: None}))
.update_query
)
assert q == {"$max": {"integer": 10}, "$set": {"optional": None}}
q = (
Sample.find_many(Sample.integer == 1)
.update(Set({Sample.integer: 10}), Set({Sample.optional: None}))
.update_query
)
assert q == {"$set": {"optional": None}}
q = (
Sample.find_many(Sample.integer == 1)
.update(Max({Sample.integer: 10}))
.update(Set({Sample.optional: None}))
.update_query
)
assert q == {"$max": {"integer": 10}, "$set": {"optional": None}}
q = (
Sample.find_many(Sample.integer == 1)
.update(Set({Sample.integer: 10}))
.update(Set({Sample.optional: None}))
.update_query
)
assert q == {"$set": {"optional": None}}
with pytest.raises(TypeError):
Sample.find_many(Sample.integer == 1).update(40).update_query
async def test_update_many(preset_documents):
await Sample.find_many(Sample.increment > 4).find_many(
Sample.nested.optional == None
).update(
Set({Sample.increment: 100})
) # noqa
result = await Sample.find_many(Sample.increment == 100).to_list()
assert len(result) == 3
for sample in result:
assert sample.increment == 100
async def test_update_many_linked_method(preset_documents):
await Sample.find_many(Sample.increment > 4).find_many(
Sample.nested.optional == None
).update_many(
Set({Sample.increment: 100})
) # noqa
result = await Sample.find_many(Sample.increment == 100).to_list()
assert len(result) == 3
for sample in result:
assert sample.increment == 100
async def test_update_all(preset_documents):
await Sample.update_all(Set({Sample.integer: 100}))
result = await Sample.find_all().to_list()
for sample in result:
assert sample.integer == 100
await Sample.find_all().update(Set({Sample.integer: 101}))
result = await Sample.find_all().to_list()
for sample in result:
assert sample.integer == 101
async def test_update_one(preset_documents):
await Sample.find_one(Sample.integer == 1).update(
Set({Sample.integer: 100})
)
result = await Sample.find_many(Sample.integer == 100).to_list()
assert len(result) == 1
assert result[0].integer == 100
await Sample.find_one(Sample.integer == 1).update_one(
Set({Sample.integer: 101})
)
result = await Sample.find_many(Sample.integer == 101).to_list()
assert len(result) == 1
assert result[0].integer == 101
async def test_update_self(preset_documents):
sample = await Sample.find_one(Sample.integer == 1)
await sample.update(Set({Sample.integer: 100}))
assert sample.integer == 100
result = await Sample.find_many(Sample.integer == 100).to_list()
assert len(result) == 1
assert result[0].integer == 100
async def test_update_many_with_session(preset_documents, session):
q = (
Sample.find_many(Sample.increment > 4)
.find_many(Sample.nested.optional == None)
.update(Set({Sample.increment: 100}))
.set_session(session=session)
)
assert q.session == session
q = (
Sample.find_many(Sample.increment > 4)
.find_many(Sample.nested.optional == None)
.update(Set({Sample.increment: 100}), session=session)
)
assert q.session == session
q = (
Sample.find_many(Sample.increment > 4)
.find_many(Sample.nested.optional == None, session=session)
.update(Set({Sample.increment: 100}))
)
assert q.session == session
await q # noqa
result = await Sample.find_many(Sample.increment == 100).to_list()
assert len(result) == 3
for sample in result:
assert sample.increment == 100
async def test_update_many_upsert_with_insert(
preset_documents, sample_doc_not_saved
):
await Sample.find_many(Sample.integer > 100000).upsert(
Set({Sample.integer: 100}), on_insert=sample_doc_not_saved
)
await asyncio.sleep(2)
new_docs = await Sample.find_many(
Sample.string == sample_doc_not_saved.string
).to_list()
assert len(new_docs) == 1
doc = new_docs[0]
assert doc.integer == sample_doc_not_saved.integer
async def test_update_many_upsert_without_insert(
preset_documents, sample_doc_not_saved
):
await Sample.find_many(Sample.integer > 1).upsert(
Set({Sample.integer: 100}), on_insert=sample_doc_not_saved
)
await asyncio.sleep(2)
new_docs = await Sample.find_many(
Sample.string == sample_doc_not_saved.string
).to_list()
assert len(new_docs) == 0
async def test_update_one_upsert_with_insert(
preset_documents, sample_doc_not_saved
):
await Sample.find_one(Sample.integer > 100000).upsert(
Set({Sample.integer: 100}), on_insert=sample_doc_not_saved
)
await asyncio.sleep(2)
new_docs = await Sample.find_many(
Sample.string == sample_doc_not_saved.string
).to_list()
assert len(new_docs) == 1
doc = new_docs[0]
assert doc.integer == sample_doc_not_saved.integer
async def test_update_one_upsert_without_insert(
preset_documents, sample_doc_not_saved
):
await Sample.find_one(Sample.integer > 1).upsert(
Set({Sample.integer: 100}), on_insert=sample_doc_not_saved
)
await asyncio.sleep(2)
new_docs = await Sample.find_many(
Sample.string == sample_doc_not_saved.string
).to_list()
assert len(new_docs) == 0
| 30.152284 | 72 | 0.655556 | 782 | 5,940 | 4.772379 | 0.085678 | 0.114952 | 0.105038 | 0.123258 | 0.903537 | 0.870043 | 0.845391 | 0.823151 | 0.781083 | 0.781083 | 0 | 0.032674 | 0.216835 | 5,940 | 196 | 73 | 30.306122 | 0.769561 | 0.002357 | 0 | 0.639752 | 0 | 0 | 0.013678 | 0 | 0 | 0 | 0 | 0 | 0.180124 | 1 | 0 | false | 0 | 0.024845 | 0 | 0.024845 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a7767bbee007cafd9951a8eea161cb8ccd9a83c0 | 183 | py | Python | tpudiepie/__init__.py | jeffhsu3/tpudiepie | ff3257d1dc5898cd1520b524885f0fec0cde378c | [
"BSD-2-Clause"
] | null | null | null | tpudiepie/__init__.py | jeffhsu3/tpudiepie | ff3257d1dc5898cd1520b524885f0fec0cde378c | [
"BSD-2-Clause"
] | null | null | null | tpudiepie/__init__.py | jeffhsu3/tpudiepie | ff3257d1dc5898cd1520b524885f0fec0cde378c | [
"BSD-2-Clause"
] | null | null | null | from .tpu import get_tpu, get_tpus, format, format_widths, format_headers, create_tpu_command, delete_tpu_command, reimage_tpu_command, logger
from . import tpu
from . import program
| 45.75 | 142 | 0.830601 | 28 | 183 | 5.071429 | 0.5 | 0.211268 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10929 | 183 | 3 | 143 | 61 | 0.871166 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a7b5c2a81e3aae5330496b2d1c0eef731c3b700b | 95 | py | Python | discard/__init__.py | JesseWeinstein/discard | 40d13d7d9beb7b08367f3aaad53e34ecee442eb2 | [
"0BSD"
] | 10 | 2021-02-03T02:54:39.000Z | 2022-03-30T17:32:02.000Z | discard/__init__.py | JesseWeinstein/discard | 40d13d7d9beb7b08367f3aaad53e34ecee442eb2 | [
"0BSD"
] | 11 | 2021-02-01T23:04:26.000Z | 2022-02-11T14:44:55.000Z | discard/__init__.py | JesseWeinstein/discard | 40d13d7d9beb7b08367f3aaad53e34ecee442eb2 | [
"0BSD"
] | 1 | 2021-04-12T22:42:59.000Z | 2021-04-12T22:42:59.000Z | from discard.discard import Discard
from discard.cli import cli
import discard.reader as reader | 31.666667 | 35 | 0.852632 | 15 | 95 | 5.4 | 0.4 | 0.271605 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115789 | 95 | 3 | 36 | 31.666667 | 0.964286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a7baf1c9256ba6de69e97ffb2b54c5acd739307c | 29 | py | Python | src/daos/base_dao/__init__.py | taonguyen740/flask_based_3tier_framework | f02e492eff0206e661925dddcf0ba978ead38b5e | [
"MIT"
] | null | null | null | src/daos/base_dao/__init__.py | taonguyen740/flask_based_3tier_framework | f02e492eff0206e661925dddcf0ba978ead38b5e | [
"MIT"
] | null | null | null | src/daos/base_dao/__init__.py | taonguyen740/flask_based_3tier_framework | f02e492eff0206e661925dddcf0ba978ead38b5e | [
"MIT"
] | null | null | null | from .base_dao import BaseDao | 29 | 29 | 0.862069 | 5 | 29 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 29 | 1 | 29 | 29 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a7c5a95c3d907e63d1110d6428aaaa3e7e02f5ca | 48 | py | Python | losses/__init__.py | janhenrikbern/information_gain | 69880f8a5ab83c04a4ae99878752c60edbac92ce | [
"MIT"
] | null | null | null | losses/__init__.py | janhenrikbern/information_gain | 69880f8a5ab83c04a4ae99878752c60edbac92ce | [
"MIT"
] | null | null | null | losses/__init__.py | janhenrikbern/information_gain | 69880f8a5ab83c04a4ae99878752c60edbac92ce | [
"MIT"
] | null | null | null | from .deschaintre import SVBRDFL1Loss, MixedLoss | 48 | 48 | 0.875 | 5 | 48 | 8.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022727 | 0.083333 | 48 | 1 | 48 | 48 | 0.931818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a7d339c931675592c1cfee92a95a4a6fa71b7450 | 148 | py | Python | src/syn_reports/commands/benefactor_permissions_report/__init__.py | pcstout/syn-reports | 9b2692fbc38e5596e62d8a415536483f2d05ee78 | [
"Apache-2.0"
] | 1 | 2020-02-27T02:15:38.000Z | 2020-02-27T02:15:38.000Z | src/syn_reports/commands/benefactor_permissions_report/__init__.py | pcstout/syn-reports | 9b2692fbc38e5596e62d8a415536483f2d05ee78 | [
"Apache-2.0"
] | 7 | 2020-03-24T18:21:31.000Z | 2021-06-22T14:22:11.000Z | src/syn_reports/commands/benefactor_permissions_report/__init__.py | pcstout/syn-reports | 9b2692fbc38e5596e62d8a415536483f2d05ee78 | [
"Apache-2.0"
] | 2 | 2020-03-02T21:30:50.000Z | 2020-03-13T22:03:43.000Z | from .cli import create, execute
from .benefactor_permissions_report import BenefactorPermissionsReport
from .benefactor_view import BenefactorView
| 37 | 70 | 0.885135 | 16 | 148 | 8 | 0.6875 | 0.21875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087838 | 148 | 3 | 71 | 49.333333 | 0.948148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ac0ebc4426db7df3387de85547ff42e8ce09c1f7 | 62 | py | Python | simplepy/convert.py | Kevys/python-games | 4949c122c9ccc69166afd106bedf8d618fe0f12f | [
"MIT"
] | 1 | 2021-12-01T13:38:28.000Z | 2021-12-01T13:38:28.000Z | simplepy/convert.py | Kevys/python-games | 4949c122c9ccc69166afd106bedf8d618fe0f12f | [
"MIT"
] | null | null | null | simplepy/convert.py | Kevys/python-games | 4949c122c9ccc69166afd106bedf8d618fe0f12f | [
"MIT"
] | null | null | null | def convert(value, to):
value = to(value)
return value | 20.666667 | 23 | 0.645161 | 9 | 62 | 4.444444 | 0.555556 | 0.35 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.241935 | 62 | 3 | 24 | 20.666667 | 0.851064 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
ac46ab152fa8aff7270233b85e7d243eb1d0b69a | 7,171 | py | Python | tests/test_34_population.py | skanct/pysaml2 | 0c1e26a6dd8759962857a30ebd67f63fe9e881ee | [
"Apache-2.0"
] | 249 | 2018-03-01T09:47:04.000Z | 2022-03-26T04:51:26.000Z | tests/test_34_population.py | skanct/pysaml2 | 0c1e26a6dd8759962857a30ebd67f63fe9e881ee | [
"Apache-2.0"
] | 416 | 2018-02-21T15:18:35.000Z | 2022-03-04T16:59:36.000Z | tests/test_34_population.py | skanct/pysaml2 | 0c1e26a6dd8759962857a30ebd67f63fe9e881ee | [
"Apache-2.0"
] | 203 | 2018-02-21T13:53:12.000Z | 2022-03-08T22:22:17.000Z | #!/usr/bin/env python
from saml2.ident import code
from saml2.saml import NAMEID_FORMAT_TRANSIENT, NameID
from saml2.population import Population
from saml2.time_util import in_a_while
IDP_ONE = "urn:mace:example.com:saml:one:idp"
IDP_OTHER = "urn:mace:example.com:saml:other:idp"
nid = NameID(name_qualifier="foo", format=NAMEID_FORMAT_TRANSIENT,
text="123456")
nida = NameID(name_qualifier="foo", format=NAMEID_FORMAT_TRANSIENT,
text="abcdef")
cnid = code(nid)
cnida = code(nida)
def _eq(l1, l2):
return set(l1) == set(l2)
class TestPopulationMemoryBased():
def setup_class(self):
self.population = Population()
def test_add_person(self):
session_info = {
"name_id": nid,
"issuer": IDP_ONE,
"not_on_or_after": in_a_while(minutes=15),
"ava": {
"givenName": "Anders",
"surName": "Andersson",
"mail": "anders.andersson@example.com"
}
}
self.population.add_information_about_person(session_info)
issuers = self.population.issuers_of_info(nid)
assert list(issuers) == [IDP_ONE]
subjects = [code(c) for c in self.population.subjects()]
assert subjects == [cnid]
# Are any of the sources gone stale
stales = self.population.stale_sources_for_person(nid)
assert stales == []
# are any of the possible sources not used or gone stale
possible = [IDP_ONE, IDP_OTHER]
stales = self.population.stale_sources_for_person(nid, possible)
assert stales == [IDP_OTHER]
(identity, stale) = self.population.get_identity(nid)
assert stale == []
assert identity == {'mail': 'anders.andersson@example.com',
'givenName': 'Anders',
'surName': 'Andersson'}
info = self.population.get_info_from(nid, IDP_ONE)
assert sorted(list(info.keys())) == sorted(["not_on_or_after",
"name_id", "ava"])
assert info["name_id"] == nid
assert info["ava"] == {'mail': 'anders.andersson@example.com',
'givenName': 'Anders',
'surName': 'Andersson'}
def test_extend_person(self):
session_info = {
"name_id": nid,
"issuer": IDP_OTHER,
"not_on_or_after": in_a_while(minutes=15),
"ava": {
"eduPersonEntitlement": "Anka"
}
}
self.population.add_information_about_person(session_info)
issuers = self.population.issuers_of_info(nid)
assert _eq(issuers, [IDP_ONE, IDP_OTHER])
subjects = [code(c) for c in self.population.subjects()]
assert subjects == [cnid]
# Are any of the sources gone stale
stales = self.population.stale_sources_for_person(nid)
assert stales == []
# are any of the possible sources not used or gone stale
possible = [IDP_ONE, IDP_OTHER]
stales = self.population.stale_sources_for_person(nid, possible)
assert stales == []
(identity, stale) = self.population.get_identity(nid)
assert stale == []
assert identity == {'mail': 'anders.andersson@example.com',
'givenName': 'Anders',
'surName': 'Andersson',
"eduPersonEntitlement": "Anka"}
info = self.population.get_info_from(nid, IDP_OTHER)
assert sorted(list(info.keys())) == sorted(["not_on_or_after",
"name_id", "ava"])
assert info["name_id"] == nid
assert info["ava"] == {"eduPersonEntitlement": "Anka"}
def test_add_another_person(self):
session_info = {
"name_id": nida,
"issuer": IDP_ONE,
"not_on_or_after": in_a_while(minutes=15),
"ava": {
"givenName": "Bertil",
"surName": "Bertilsson",
"mail": "bertil.bertilsson@example.com"
}
}
self.population.add_information_about_person(session_info)
issuers = self.population.issuers_of_info(nida)
assert list(issuers) == [IDP_ONE]
subjects = [code(c) for c in self.population.subjects()]
assert _eq(subjects, [cnid, cnida])
stales = self.population.stale_sources_for_person(nida)
assert stales == []
# are any of the possible sources not used or gone stale
possible = [IDP_ONE, IDP_OTHER]
stales = self.population.stale_sources_for_person(nida, possible)
assert stales == [IDP_OTHER]
(identity, stale) = self.population.get_identity(nida)
assert stale == []
assert identity == {"givenName": "Bertil",
"surName": "Bertilsson",
"mail": "bertil.bertilsson@example.com"
}
info = self.population.get_info_from(nida, IDP_ONE)
assert sorted(list(info.keys())) == sorted(["not_on_or_after",
"name_id", "ava"])
assert info["name_id"] == nida
assert info["ava"] == {"givenName": "Bertil",
"surName": "Bertilsson",
"mail": "bertil.bertilsson@example.com"
}
def test_modify_person(self):
session_info = {
"name_id": nid,
"issuer": IDP_ONE,
"not_on_or_after": in_a_while(minutes=15),
"ava": {
"givenName": "Arne",
"surName": "Andersson",
"mail": "arne.andersson@example.com"
}
}
self.population.add_information_about_person(session_info)
issuers = self.population.issuers_of_info(nid)
assert _eq(issuers, [IDP_ONE, IDP_OTHER])
subjects = [code(c) for c in self.population.subjects()]
assert _eq(subjects, [cnid, cnida])
# Are any of the sources gone stale
stales = self.population.stale_sources_for_person(nid)
assert stales == []
# are any of the possible sources not used or gone stale
possible = [IDP_ONE, IDP_OTHER]
stales = self.population.stale_sources_for_person(nid, possible)
assert stales == []
(identity, stale) = self.population.get_identity(nid)
assert stale == []
assert identity == {'mail': 'arne.andersson@example.com',
'givenName': 'Arne',
'surName': 'Andersson',
"eduPersonEntitlement": "Anka"}
info = self.population.get_info_from(nid, IDP_OTHER)
assert sorted(list(info.keys())) == sorted(["not_on_or_after",
"name_id", "ava"])
assert info["name_id"] == nid
assert info["ava"] == {"eduPersonEntitlement": "Anka"}
| 39.618785 | 73 | 0.549017 | 751 | 7,171 | 5.029294 | 0.126498 | 0.107493 | 0.021181 | 0.025417 | 0.864972 | 0.842997 | 0.828171 | 0.828171 | 0.806725 | 0.736034 | 0 | 0.004613 | 0.33496 | 7,171 | 180 | 74 | 39.838889 | 0.787377 | 0.047692 | 0 | 0.653061 | 0 | 0 | 0.151737 | 0.046767 | 0 | 0 | 0 | 0 | 0.244898 | 1 | 0.040816 | false | 0 | 0.027211 | 0.006803 | 0.081633 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3be0421755e994d936c91e19b759ee0f164a1731 | 90 | py | Python | cvstudio/decor/__init__.py | haruiz/PytorchCvStudio | ccf79dd0cc0d61f3fd01b1b5d96f7cda7b681eef | [
"MIT"
] | 32 | 2019-10-31T03:10:52.000Z | 2020-12-23T11:50:53.000Z | cvstudio/decor/__init__.py | haruiz/CvStudio | ccf79dd0cc0d61f3fd01b1b5d96f7cda7b681eef | [
"MIT"
] | 19 | 2019-10-31T15:06:05.000Z | 2020-06-15T02:21:55.000Z | cvstudio/decor/__init__.py | haruiz/PytorchCvStudio | ccf79dd0cc0d61f3fd01b1b5d96f7cda7b681eef | [
"MIT"
] | 8 | 2019-10-31T03:32:50.000Z | 2020-07-17T20:47:37.000Z | from .gui_exception_decor import gui_exception
from .work_exception import work_exception
| 30 | 46 | 0.888889 | 13 | 90 | 5.769231 | 0.461538 | 0.32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 90 | 2 | 47 | 45 | 0.914634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ce105ae4204a2601fa170d75f25d93ddef6de8b7 | 71 | py | Python | tests/__init__.py | gulaki/code-timer | 979eacc8fb98b3f6fef46b1099492aa081c9d619 | [
"MIT"
] | 1 | 2018-03-10T08:39:48.000Z | 2018-03-10T08:39:48.000Z | tests/__init__.py | gulaki/code-timer | 979eacc8fb98b3f6fef46b1099492aa081c9d619 | [
"MIT"
] | 4 | 2018-03-07T13:38:24.000Z | 2018-03-15T04:55:10.000Z | tests/__init__.py | gulaki/code-timer | 979eacc8fb98b3f6fef46b1099492aa081c9d619 | [
"MIT"
] | 2 | 2018-03-07T13:39:06.000Z | 2018-03-10T08:56:22.000Z | from test_getsize import test_getsize
from test_stats import test_stats | 35.5 | 37 | 0.901408 | 12 | 71 | 5 | 0.416667 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098592 | 71 | 2 | 38 | 35.5 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ce277212d7ea05b697c6518c2a9685d6dee60fa8 | 654 | py | Python | training_gym/level.py | TrackerSB/reinforcement-learning | 83f89e59b95a007332bdf733f77715843f9159d9 | [
"MIT"
] | null | null | null | training_gym/level.py | TrackerSB/reinforcement-learning | 83f89e59b95a007332bdf733f77715843f9159d9 | [
"MIT"
] | null | null | null | training_gym/level.py | TrackerSB/reinforcement-learning | 83f89e59b95a007332bdf733f77715843f9159d9 | [
"MIT"
] | null | null | null | from enum import Enum
class Level(Enum):
GRASS = 0
PURE = 1
def vae(self):
from os.path import join, dirname
if self is Level.GRASS:
return join(dirname(__file__), "..", "models", "lines-vae.pkl")
elif self is Level.PURE:
return join(dirname(__file__), "..", "models", "smallgrid-lines-vae.pkl")
def model(self):
from os.path import join, dirname
if self is Level.GRASS:
return join(dirname(__file__), "..", "models", "lines-agent.pkl")
elif self is Level.PURE:
return join(dirname(__file__), "..", "models", "smallgrid-lines-agent.pkl") | 31.142857 | 87 | 0.585627 | 83 | 654 | 4.421687 | 0.325301 | 0.179837 | 0.119891 | 0.228883 | 0.784741 | 0.784741 | 0.784741 | 0.784741 | 0.784741 | 0.784741 | 0 | 0.004202 | 0.272171 | 654 | 21 | 87 | 31.142857 | 0.766807 | 0 | 0 | 0.375 | 0 | 0 | 0.164886 | 0.073282 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.1875 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |