hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
79dcd5238b67211a58d3e2e79fefde96170704d7 | 1,391 | py | Python | logos/migrations/0004_auto_20171112_0912.py | ppik/loxodon | c9d3148ec70f281ba28b3b39e1d843db2fd9a3ac | [
"MIT"
] | null | null | null | logos/migrations/0004_auto_20171112_0912.py | ppik/loxodon | c9d3148ec70f281ba28b3b39e1d843db2fd9a3ac | [
"MIT"
] | 1 | 2018-01-18T09:04:47.000Z | 2018-01-18T14:29:13.000Z | logos/migrations/0004_auto_20171112_0912.py | ppik/loxodon | c9d3148ec70f281ba28b3b39e1d843db2fd9a3ac | [
"MIT"
] | 1 | 2018-01-17T14:14:52.000Z | 2018-01-17T14:14:52.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.7 on 2017-11-12 09:12
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logos', '0003_auto_20171111_2123'),
]
operations = [
migrations.AddField(
model_name='logoanalyze',
name='pop_logo_name_1',
field=models.CharField(default='Unknown', max_length=200),
),
migrations.AddField(
model_name='logoanalyze',
name='pop_logo_name_2',
field=models.CharField(default='Unknown', max_length=200),
),
migrations.AddField(
model_name='logoanalyze',
name='pop_logo_name_3',
field=models.CharField(default='Unknown', max_length=200),
),
migrations.AddField(
model_name='logoanalyze',
name='pop_precision_1',
field=models.CharField(default='0.0', max_length=20),
),
migrations.AddField(
model_name='logoanalyze',
name='pop_precision_2',
field=models.CharField(default='0.0', max_length=20),
),
migrations.AddField(
model_name='logoanalyze',
name='pop_precision_3',
field=models.CharField(default='0.0', max_length=20),
),
]
| 30.23913 | 70 | 0.582315 | 145 | 1,391 | 5.344828 | 0.331034 | 0.139355 | 0.178065 | 0.209032 | 0.754839 | 0.747097 | 0.747097 | 0.747097 | 0.735484 | 0.615484 | 0 | 0.061224 | 0.295471 | 1,391 | 45 | 71 | 30.911111 | 0.729592 | 0.048886 | 0 | 0.631579 | 1 | 0 | 0.162121 | 0.017424 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.131579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8dc56e0555a7d1ce2911f975aac7200e1c5c2237 | 141 | py | Python | reco_gym/envs/__init__.py | NunoEdgarGFlowHub/reco-gym | 42701b7ae115b879edf6881f368878c458a2a368 | [
"Apache-2.0"
] | null | null | null | reco_gym/envs/__init__.py | NunoEdgarGFlowHub/reco-gym | 42701b7ae115b879edf6881f368878c458a2a368 | [
"Apache-2.0"
] | null | null | null | reco_gym/envs/__init__.py | NunoEdgarGFlowHub/reco-gym | 42701b7ae115b879edf6881f368878c458a2a368 | [
"Apache-2.0"
] | null | null | null | from .reco_env_v0 import RecoEnv0
from .reco_env_v1 import RecoEnv1
from .reco_env_v0 import env_0_args
from .reco_env_v1 import env_1_args
| 23.5 | 35 | 0.851064 | 28 | 141 | 3.857143 | 0.392857 | 0.296296 | 0.407407 | 0.240741 | 0.703704 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 0.120567 | 141 | 5 | 36 | 28.2 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
5c068604ad3e034a9f0f0eb352b46e0a6b70e26a | 84 | py | Python | virustotalpy/__init__.py | maxmmueller/virustotalpy | c3ffa8b91ae625f2ca73a3ddc61c7662846efdb5 | [
"Apache-2.0"
] | 5 | 2022-01-15T19:21:39.000Z | 2022-03-04T21:49:44.000Z | virustotalpy/__init__.py | maxmmueller/virustotalpy | c3ffa8b91ae625f2ca73a3ddc61c7662846efdb5 | [
"Apache-2.0"
] | null | null | null | virustotalpy/__init__.py | maxmmueller/virustotalpy | c3ffa8b91ae625f2ca73a3ddc61c7662846efdb5 | [
"Apache-2.0"
] | null | null | null | from virustotalpy.wrapper import Virustotal
from virustotalpy.wrapper import vtError | 42 | 43 | 0.892857 | 10 | 84 | 7.5 | 0.6 | 0.426667 | 0.613333 | 0.773333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 84 | 2 | 44 | 42 | 0.974026 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
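Both `reco_gym/envs/__init__.py` and `virustotalpy/__init__.py` above use the same re-export pattern: names defined in a submodule are imported into the package `__init__` so callers can reach them at package level. A minimal sketch of the pattern, built in a temporary directory (the `demo_pkg`, `wrapper`, and `Client` names are illustrative, not from either project):

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway package on disk to demonstrate the re-export.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "demo_pkg")
os.makedirs(pkg)

# A submodule defining the "real" implementation.
with open(os.path.join(pkg, "wrapper.py"), "w") as f:
    f.write("class Client:\n    name = 'client'\n")

# The package __init__ lifts Client to package level, exactly like
# `from virustotalpy.wrapper import Virustotal` does above.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from demo_pkg.wrapper import Client\n")

sys.path.insert(0, root)
demo_pkg = importlib.import_module("demo_pkg")

# Callers use the short path without naming the submodule.
print(demo_pkg.Client.name)
```

The benefit is a stable public import path: the submodule layout can change later without breaking `from demo_pkg import Client` for downstream users.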
30bd1c66e6184ed2e45b03db0dfca067208cdca3 | 250 | py | Python | pset_challenging_ext/exercises/p62.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 5 | 2019-04-08T20:05:37.000Z | 2019-12-04T20:48:45.000Z | pset_challenging_ext/exercises/p62.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 8 | 2019-04-15T15:16:05.000Z | 2022-02-12T10:33:32.000Z | pset_challenging_ext/exercises/p62.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 2 | 2019-04-10T00:14:42.000Z | 2020-02-26T20:35:21.000Z | """
Write a program to read an ASCII string and to convert it to a unicode string encoded by utf-8.
"""
"""Write a program to read an ASCII string and to convert it to a unicode string encoded by utf-8.
Hints:
Use unicode() function to convert.
""" | 27.777778 | 98 | 0.728 | 46 | 250 | 3.956522 | 0.413043 | 0.148352 | 0.142857 | 0.164835 | 0.824176 | 0.824176 | 0.824176 | 0.824176 | 0.824176 | 0.824176 | 0 | 0.009901 | 0.192 | 250 | 9 | 99 | 27.777778 | 0.891089 | 0.38 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
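The exercise above dates from Python 2, where the hinted `unicode()` built-in converted byte strings to text. In Python 3, `str` is already Unicode, so a minimal sketch of the same task just encodes with `str.encode` (the sample text is illustrative, standing in for input read from the user):

```python
# Python 3 take on the exercise: an ASCII str is already Unicode text;
# encoding it with UTF-8 yields the requested byte representation.
ascii_text = "hello world"            # stand-in for input read from the user
assert ascii_text.isascii()           # guard: the exercise assumes ASCII input

utf8_bytes = ascii_text.encode("utf-8")    # text -> UTF-8 bytes
round_trip = utf8_bytes.decode("utf-8")    # decoding recovers the same text

print(utf8_bytes)                  # b'hello world'
print(round_trip == ascii_text)    # True
```

For pure-ASCII input the UTF-8 bytes are identical to the ASCII bytes, since UTF-8 is a superset of ASCII.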
30fbe348c8212cb51756a55950a99dd81a6805a9 | 4,945 | py | Python | tests/test_combined.py | EsoNes0/tea | 26340af2ced714929ac41123aa98215646a8c10c | [
"MIT"
] | null | null | null | tests/test_combined.py | EsoNes0/tea | 26340af2ced714929ac41123aa98215646a8c10c | [
"MIT"
] | 3 | 2019-12-07T04:40:23.000Z | 2019-12-12T05:35:47.000Z | tests/test_combined.py | EsoNes0/tea | 26340af2ced714929ac41123aa98215646a8c10c | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Basic tests for tiny encryption algorithm encryption"""
import tea.combined
DELTA_ONE = "DeltaOne = 0x11111111\n"
DELTA_TWO = "DeltaTwo = 0x22222222\n"
L_0 = "L[0] = 0xa0000009\n"
R_0 = "R[0] = 0x8000006b\n"
L_1 = "L[1] = 0x8000006b\n"
R_1 = "R[1] = 0xb72599b2\n"
L_2 = "L[2] = 0xb72599b2\n"
R_2 = "R[2] = 0xcf8e5a4c\n"
L_0_2 = "L[0] = 0x6d792074\n"
R_0_2 = "R[0] = 0x6578740a\n"
L_1_2 = "L[1] = 0x6578740a\n"
R_1_2 = "R[1] = 0xeee21246\n"
L_2_2 = "L[2] = 0xeee21246\n"
R_2_2 = "R[2] = 0xbf121dc5\n"
def test_encryption(capsys):
"""Testing encryption algorithm"""
tea.combined.main('encrypt', '90001C55', '1234ABCD', 'FEDCBA98',
'E2468AC0', 'A0000009', '8000006B')
out, err = capsys.readouterr()
assert DELTA_ONE in out
assert DELTA_TWO in out
assert L_0 in out
assert R_0 in out
assert L_1 in out
assert R_1 in out
assert L_2 in out
assert R_2 in out
assert err == ""
def test_decryption(capsys):
"""Testing decryption algorithm"""
tea.combined.main("decrypt", "90001C55", "1234ABCD", "FEDCBA98",
"E2468AC0", "B72599B2", "CF8E5A4C")
out, err = capsys.readouterr()
assert DELTA_ONE in out
assert DELTA_TWO in out
assert L_2 in out
assert R_2 in out
assert L_1 in out
assert R_1 in out
assert L_0 in out
assert R_0 in out
assert err == ""
def test_encryption_2(capsys):
"""Testing encryption algorithm"""
tea.combined.main('encrypt', '9ff579e5', 'fd720aac', '36629fc9',
'64a74968', '6d792074', '6578740a')
out, err = capsys.readouterr()
assert DELTA_ONE in out
assert DELTA_TWO in out
assert L_0_2 in out
assert R_0_2 in out
assert L_1_2 in out
assert R_1_2 in out
assert L_2_2 in out
assert R_2_2 in out
assert err == ""
def test_decryption_2(capsys):
"""Testing decryption algorithm"""
tea.combined.main("decrypt", "9ff579e5", "fd720aac", "36629fc9",
"64a74968", "eee21246", "bf121dc5")
out, err = capsys.readouterr()
assert DELTA_ONE in out
assert DELTA_TWO in out
assert L_2_2 in out
assert R_2_2 in out
assert L_1_2 in out
assert R_1_2 in out
assert L_0_2 in out
assert R_0_2 in out
assert err == ""
def test_set_variables_encrypt_with_input(capsys, mocker):
"""Testing set variables method encrypt"""
_args = []
mocked_input = mocker.patch('builtins.input')
mocked_input.side_effect = [1, '90001C55', '1234ABCD', 'FEDCBA98',
'E2468AC0', 'A0000009', '8000006B']
tea.combined.set_variables(_args)
captured = capsys.readouterr()
assert DELTA_ONE in captured.out
assert DELTA_TWO in captured.out
assert L_0 in captured.out
assert R_0 in captured.out
assert L_1 in captured.out
assert R_1 in captured.out
assert R_2 in captured.out
assert R_2 in captured.out
assert captured.err == ""
assert mocked_input.call_count == 7
def test_set_variables_decrypt_with_input(capsys, mocker):
"""Testing set variables method decrypt"""
_args = []
mocked_input = mocker.patch('builtins.input')
mocked_input.side_effect = [0, "9ff579e5", "fd720aac", "36629fc9",
"64a74968", "eee21246", "bf121dc5"]
tea.combined.set_variables(_args)
captured = capsys.readouterr()
assert DELTA_ONE in captured.out
assert DELTA_TWO in captured.out
assert L_2_2 in captured.out
assert R_2_2 in captured.out
assert L_1_2 in captured.out
assert R_1_2 in captured.out
assert L_0_2 in captured.out
assert R_0_2 in captured.out
assert captured.err == ""
assert mocked_input.call_count == 7
def test_set_variables_encrypt_with_args(capsys):
"""Testing set variables method encrypt with args"""
_args = ["encrypt", '90001C55', '1234ABCD', 'FEDCBA98',
'E2468AC0', 'A0000009', '8000006B']
tea.combined.set_variables(_args)
captured = capsys.readouterr()
assert DELTA_ONE in captured.out
assert DELTA_TWO in captured.out
assert L_0 in captured.out
assert R_0 in captured.out
assert L_1 in captured.out
assert R_1 in captured.out
assert L_2 in captured.out
assert R_2 in captured.out
assert captured.err == ""
def test_set_variables_decrypt_with_args(capsys):
"""Testing set variables method decrypt with args"""
_args = ["decrypt", "9ff579e5", "fd720aac", "36629fc9",
"64a74968", "eee21246", "bf121dc5"]
tea.combined.set_variables(_args)
captured = capsys.readouterr()
assert DELTA_ONE in captured.out
assert DELTA_TWO in captured.out
assert L_2_2 in captured.out
assert R_2_2 in captured.out
assert L_1_2 in captured.out
assert R_1_2 in captured.out
assert L_0_2 in captured.out
assert R_0_2 in captured.out
assert captured.err == ""
| 29.789157 | 70 | 0.65905 | 737 | 4,945 | 4.210312 | 0.105834 | 0.185627 | 0.113439 | 0.195939 | 0.850467 | 0.841444 | 0.818885 | 0.755076 | 0.642926 | 0.642926 | 0 | 0.121325 | 0.236603 | 4,945 | 165 | 71 | 29.969697 | 0.700662 | 0.072599 | 0 | 0.72093 | 0 | 0 | 0.160352 | 0 | 0 | 0 | 0.030837 | 0 | 0.573643 | 1 | 0.062016 | false | 0 | 0.007752 | 0 | 0.069767 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
eb7a0fd85569fbb0b0293512bfc203bfc55e49d0 | 72,413 | py | Python | pymatflow/qe/static.py | DeqiTang/pymatflow | bd8776feb40ecef0e6704ee898d9f42ded3b0186 | [
"MIT"
] | 6 | 2020-03-06T16:13:08.000Z | 2022-03-09T07:53:34.000Z | pymatflow/qe/static.py | DeqiTang/pymatflow | bd8776feb40ecef0e6704ee898d9f42ded3b0186 | [
"MIT"
] | 1 | 2021-10-02T02:23:08.000Z | 2021-11-08T13:29:37.000Z | pymatflow/qe/static.py | DeqiTang/pymatflow | bd8776feb40ecef0e6704ee898d9f42ded3b0186 | [
"MIT"
] | 1 | 2021-07-10T16:28:14.000Z | 2021-07-10T16:28:14.000Z | """
Static calc
"""
import os
import re
import sys
import shutil
from pymatflow.remote.server import server_handle
from pymatflow.qe.pwscf import PwScf
class StaticRun(PwScf):
"""
About:
static_run implements the control over scf, nscf and
calculations based on them, like dos, pdos, bands, epsilon
Status:
currently implemented calculations include:
scf, nscf, dos, bands, projwfc(pdos), ir_raman, elf, fermi_surface,
difference_charge_density, electron_density,
converge test:
ecutwfc, ecutrho, kpoints, degauss
"""
def __init__(self):
super().__init__()
self.control.basic_setting("scf")
self.arts.ifstatic = True
def scf(self, directory="tmp-qe-static", inpname="static-scf.in", output="static-scf.out", runopt="gen", auto=0):
"""
directory: a place for all the generated files
:param directory: the overall static calculation directory
:param runopt: determines whether the calculation is executed.
there are three values: 'gen', 'genrun', 'run'
'gen': only generate the input files
'genrun': generate input files and run
'run': run from the previously generated input files
Note:
only scf can generate the overall directory for static
calculation(except the converge test for parameters like
ecutwfc, kpoints, degauss)! Other calculations are based
on scf or nscf (which is itself based on scf), so logically when
doing these calculations there should already be the
directory where scf calculation has been conducted.
"""
self.control.calculation("scf")
if runopt == 'gen' or runopt == 'genrun':
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
#os.system("cp *.UPF %s/" % directory)
#os.system("cp %s %s/" % (self.arts.xyz.file, directory))
# do not copy too many files at the same time or it will be slow
# so we do not copy all UPF files in the directory but just copy
# those used in the calculation.
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
#
with open(os.path.join(directory, inpname), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# gen yhbatch script
self.gen_llhpc(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
# gen pbs scripts
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
if runopt == 'genrun' or runopt == 'run':
os.chdir(directory)
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="static-scf", server=self.run_params["server"])
def nscf(self, directory="tmp-qe-static", inpname="static-nscf.in", output="static-nscf.out", runopt='gen', auto=0):
"""
:param directory: the overall static calculation directory
:param runopt: determines whether the calculation is executed.
there are three values: 'gen', 'genrun', 'run'
'gen': only generate the input files
'genrun': generate input files and run
'run': run from the previously generated input files
"""
self.control.calculation("nscf")
# first check whether there is a previous scf running
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("non-scf calculation:\n")
print(" directory of previous scf calculation not found!\n")
sys.exit(1)
if runopt == 'gen' or runopt == 'genrun':
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
with open(os.path.join(directory, inpname), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# gen yhbatch script
self.gen_llhpc(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
# gen pbs scripts
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
if runopt == 'genrun' or runopt == 'run':
os.chdir(directory)
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="static-nscf", server=self.run_params["server"])
def converge_ecutwfc(self, emin, emax, step, directory="tmp-qe-ecutwfc", runopt="gen", auto=0):
if runopt == "gen" or runopt == "genrun":
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
os.system("cp %s %s/" % (self.arts.xyz.file, directory))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
os.chdir(directory)
n_test = int((emax - emin) / step)
for i in range(n_test + 1):
ecut_wfc = int(emin + i * step)
ecut_rho = ecut_wfc * 4 # using default value for ecut_rho: 4 * ecutwfc
inp_name = "ecutwfc-%d.in" % ecut_wfc
self.control.set_params({'outdir': './tmp-' + str(ecut_wfc)})
self.system.set_params({'ecutwfc': ecut_wfc})
with open(inp_name, 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# gen yhbatch running script
with open("converge-ecutwfc.slurm", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
for i in range(n_test + 1):
ecut_wfc = int(emin + i * step)
inp_name = "ecutwfc-%d.in" % ecut_wfc
out_f_name = "ecutwfc-%d.out" % ecut_wfc
fout.write("yhrun $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
# gen pbs running script
with open("converge-ecutwfc.pbs", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s\n" % self.run_params["jobname"])
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
for i in range(n_test + 1):
ecut_wfc = int(emin + i * step)
inp_name = "ecutwfc-%d.in" % ecut_wfc
out_f_name = "ecutwfc-%d.out" % ecut_wfc
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
os.chdir("../")
if runopt == "run" or runopt == "genrun":
# run the simulation
os.chdir(directory)
for i in range(n_test + 1):
ecut_wfc = int(emin + i * step)
inp_name = "ecutwfc-%d.in" % ecut_wfc
out_f_name = "ecutwfc-%d.out" % ecut_wfc
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inp_name, out_f_name))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="converge-ecutwfc", server=self.run_params["server"])
def converge_ecutrho(self, emin, emax, step, ecutwfc, directory="tmp-qe-ecutrho", runopt="gen", auto=0):
if runopt == "gen" or runopt == "genrun":
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
os.chdir(directory)
n_test = int((emax - emin) / step)
for i in range(n_test + 1):
ecut_rho = int(emin + i * step)
inp_name = "ecutrho-%d.in" % ecut_rho
self.control.params['outdir'] = './tmp-' + str(ecut_rho)
self.system.params['ecutwfc'] = ecutwfc
self.system.params["ecutrho"] = ecut_rho
with open(inp_name, 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# gen yhbatch running script
with open("converge-ecutrho.slurm", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
for i in range(n_test + 1):
ecut_rho = int(emin + i * step)
inp_name = "ecutrho-%d.in" % ecut_rho
out_f_name = "ecutrho-%d.out" % ecut_rho
fout.write("yhrun $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
# gen pbs running script
with open("converge-ecutrho.pbs", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s\n" % self.run_params["jobname"])
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
for i in range(n_test + 1):
ecut_rho = int(emin + i * step)
inp_name = "ecutrho-%d.in" % ecut_rho
out_f_name = "ecutrho-%d.out" % ecut_rho
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
os.chdir("../")
if runopt == "run" or runopt == "genrun":
# run the simulation
os.chdir(directory)
for i in range(n_test + 1):
ecut_rho = int(emin + i * step)
inp_name = "ecutrho-%d.in" % ecut_rho
out_f_name = "ecutrho-%d.out" % ecut_rho
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inp_name, out_f_name))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="converge-ecutrho", server=self.run_params["server"])
#
def converge_kpoints(self, nk_min, nk_max, step=1, directory="tmp-qe-kpoints", runopt="gen", auto=0):
"""
test the energy convergence against k-points
currently only supports the automatic scheme of K_POINTS
and only nk1 = nk2 = nk3 is supported
Note:
if you converged the ecutwfc previously, you should
specify the converged ecutwfc through the system
parameters
"""
if runopt == "gen" or runopt == "genrun":
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
os.chdir(directory)
n_test = int((nk_max - nk_min) / step)
for i in range(n_test + 1):
nk = nk_min + i * step # nk1 = nk2 = nk3 = nk
inp_name = "kpoints-%d.in" % nk
self.control.set_params({'outdir': './tmp-' + str(nk)})
self.arts.set_kpoints([nk, nk, nk, 0, 0, 0])
with open(inp_name, 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# gen yhbatch running script
with open("converge-kpoints.slurm", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
for i in range(n_test + 1):
nk = nk_min + i * step # nk1 = nk2 = nk3 = nk
inp_name = "kpoints-%d.in" % nk
out_f_name = "kpoints-%d.out" % nk
fout.write("yhrun $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
# gen pbs running script
with open("converge-kpoints.pbs", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s\n" % self.run_params["jobname"])
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
for i in range(n_test + 1):
nk = nk_min + i * step # nk1 = nk2 = nk3 = nk
inp_name = "kpoints-%d.in" % nk
out_f_name = "kpoints-%d.out" % nk
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
os.chdir("../")
if runopt == "run" or runopt == "genrun":
# run the simulation
os.chdir(directory)
for i in range(n_test + 1):
nk = nk_min + i * step
inp_name = "kpoints-%d.in" % nk
out_f_name = "kpoints-%d.out" % nk
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inp_name, out_f_name))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="converge-kpoints", server=self.run_params["server"])
def converge_degauss(self,degauss_min, degauss_max, step=0.01, directory="tmp-qe-degauss", runopt="gen", auto=0):
"""
Convergence with respect to degauss/smearing
smearing:
(a) 'gauss'
(b) 'marzari-vanderbilt'
(c) 'methfessel-paxton'
degauss:
suggested values:
0.06, 0.07, 0.08, 0.09, 0.10 (in Ry)
Note:
here we do the testing of degauss on energy.
however quantities like the force on an atom
may be more suited for this kind of testing.
smearing is in fact part of the system setting
however I set it as an independent parameter in
this function, to provide the user a direct way
to set the type of gauss smearing for testing.
And of course we should not set smearing and
occupations through system parameters.
occupations should always be set to smearing when
testing degauss.
the user had better set the previously converged
ecutwfc through system parameters.
"""
if runopt == "gen" or runopt == "genrun":
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
os.chdir(directory)
n_test = int((degauss_max - degauss_min) / step)
for i in range(n_test + 1):
degauss = degauss_min + i * step
inp_name = "degauss-%f.in" % degauss
self.control.set_params({'outdir': './tmp-%f' % degauss})
#self.arts.set_kpoints([nk, nk, nk, 0, 0, 0]) # use the previously convered kpoints(automatic)
self.system.set_params({'degauss': degauss})
with open(inp_name, 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# gen yhbatch running script
with open("converge-degauss.slurm", 'w') as fout:
fout.write("#!/bin/bash\n")
for i in range(n_test + 1):
degauss = degauss_min + i * step
inp_name = "degauss-%f.in" % degauss
out_f_name = "degauss-%f.out" % degauss
fout.write("yhrun $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
# gen pbs running script
with open("converge-degauss.pbs", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s\n" % self.run_params["jobname"])
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
for i in range(n_test + 1):
degauss = degauss_min + i * step
inp_name = "degauss-%f.in" % degauss
out_f_name = "degauss-%f.out" % degauss
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < %s > %s\n" % (inp_name, out_f_name))
os.chdir("../")
if runopt == "run" or runopt == "genrun":
# run the simulation
os.chdir(directory)
for i in range(n_test + 1):
degauss = degauss_min + i * step
inp_name = "degauss-%f.in" % degauss
out_f_name = "degauss-%f.out" % degauss
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inp_name, out_f_name))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="converge-degauss", server=self.run_params["server"])
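The converge_* methods above only generate and launch the runs; they do not analyze the results. A minimal, class-independent sketch of how the resulting degauss-*.out (or kpoints-*.out) files could be scanned for the converged total energy, assuming the standard pw.x "!    total energy = ... Ry" marker line (the function name is illustrative, not part of this class):

```python
import re

def parse_final_total_energy(text):
    """Return the last '!    total energy' value (in Ry) found in a
    pw.x output text, or None if no converged energy line is present."""
    matches = re.findall(r"!\s*total energy\s*=\s*(-?\d+\.\d+)\s*Ry", text)
    return float(matches[-1]) if matches else None
```

Reading each output file and feeding its contents through this helper gives an energy-vs-degauss (or energy-vs-nk) curve from which convergence can be judged.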
def dos(self, directory="tmp-qe-static", inpname="static-dos.in", output="static-dos.out",
fildos="dosx.dos", bz_sum='smearing', ngauss='default', degauss='default', emin='default', emax='default',
deltae='default', runopt="gen", auto=0):
"""
Reference:
http://www.quantum-espresso.org/Doc/INPUT_DOS.html
:param bz_sum:
'smearing' :
'tetrahedra' :
'tetrahedra_lin' :
'tetrahedra_opt' :
:param ngauss:
'default': read from saved input for pw.x
0: Simple Gaussian (default)
1: Methfessel-Paxton of order 1
-1: Marzari-Vanderbilt "cold smearing"
-99: Fermi-Dirac function
:param degauss:
gaussian broadening, Ry (not eV!)
'default':
a floating number
Note:
the degauss in dos.x can significantly affect
the plotting of the dos,
but I don't know whether the degauss in scf
and nscf also has such a significant effect. if
so, I might need to provide more ability to set
an appropriate degauss in the scf and nscf runs.
"""
# first check whether there is a previous scf running
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("dos calculation:\n")
print("  directory of previous scf or nscf calculation not found!\n")
sys.exit(1)
if runopt == "gen" or runopt == "genrun":
with open(os.path.join(directory, inpname), 'w') as fout:
fout.write("&DOS\n")
fout.write("prefix = '%s'\n" % self.control.params["prefix"])
fout.write("outdir = '%s'\n" % self.control.params["outdir"])
fout.write("fildos = '%s'\n" % fildos)
#fout.write("bz_sum = '%s'\n" % bz_sum)
if bz_sum == 'smearing':
if ngauss == 'default':
fout.write("! use ngauss read from input for pw.x stored in xxx.save\n")
else:
fout.write("ngauss = %d\n" % ngauss)
if degauss == 'default':
fout.write("! use degauss read from input for pw.x stored in xxx.save\n")
fout.write("! or degauss = DeltaE, if DeltaE is specified\n")
fout.write("! we better set degauss and ngauss ourselves!\n")
else:
fout.write("degauss = %f\n" % degauss)
if emin == 'default':
fout.write("!using default Emin: lower band value minus 3 times the gaussian smearing\n")
else:
fout.write("emin = %f\n" % emin)
if emax == 'default':
fout.write("!using default Emax: upper band value plus 3 times the gaussian smearing\n")
else:
fout.write("emax = %f\n" % emax)
if deltae == 'default':
fout.write("!using default DeltaE value\n")
else:
fout.write("deltae = %f\n" % deltae)
fout.write("/\n")
fout.write("\n")
# gen yhbatch script
self.gen_llhpc(directory=directory, inpname=inpname, output=output, cmd="$PMF_DOSX")
# gen pbs script
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_DOSX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_DOSX")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_DOSX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="static-dos", server=self.run_params["server"])
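Once dos.x has produced fildos, its contents can be loaded for plotting. A small sketch (not part of this class) that parses the usual dos.x layout — '#' header lines followed by whitespace-separated columns E(eV), dos(E), Int dos(E) — from any iterable of text lines, so an open file object works directly:

```python
def read_dos(lines):
    """Parse dos.x fildos content from an iterable of text lines.
    Skips blank and '#' header lines and returns (energies, dos)
    as two parallel lists of floats."""
    energies, dos = [], []
    for line in lines:
        s = line.strip()
        if not s or s.startswith("#"):
            continue
        cols = s.split()
        energies.append(float(cols[0]))
        dos.append(float(cols[1]))
    return energies, dos
```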
def set_bands(self, bands_input={}):
self.bands_input = {
"prefix": self.control.params["prefix"].as_val(t=str, dim=0),
"outdir": self.control.params["outdir"].as_val(t=str, dim=0),
"filband": "bands.dat",
"lsym": ".true."
}
for item in bands_input:
self.bands_input[item] = bands_input[item]
def bands(self, directory="tmp-qe-static", inpname1="static-bands.in", output1="static-bands.out",
inpname2="bands.in", output2="bands.out", runopt="gen", auto=0):
"""
first check whether there is a previous scf run
Note:
the 'bands' calculation is based on a previous scf or nscf run,
namely there must be an xxx.save/charge-density.dat for pw.x to read
before doing the bands calculation
Warning:
for now it is better to use the tpiba_b type k-points setting, as only
the post-processing of that type of band structure calculation is implemented
"""
self.control.calculation('bands')
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("bands calculation:\n")
print("  directory of previous scf or nscf calculation not found!\n")
sys.exit(1)
if runopt == "gen" or runopt == "genrun":
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.pseudo_dir = os.path.abspath(directory)
with open(os.path.join(directory, inpname1), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
#
with open(os.path.join(directory, inpname2), 'w') as fout:
fout.write("&BANDS\n")
for item in self.bands_input:
if self.bands_input[item] is not None:
if type(self.bands_input[item]) == str and self.bands_input[item].lower() not in [".true.", ".false."]:
fout.write("%s = '%s'\n" % (item, self.bands_input[item]))
else:
fout.write("%s = %s\n" % (item, self.bands_input[item]))
fout.write("/\n")
fout.write("\n")
# gen yhbatch script
with open(os.path.join(directory, "band-structure.slurm"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("yhrun %s < %s > %s\n" % ("$PMF_PWX", inpname1, output1))
fout.write("yhrun %s < %s > %s\n" % ("$PMF_BANDSX", inpname2, output2))
# gen pbs script
with open(os.path.join(directory, "band-structure.pbs"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s\n" % self.run_params["jobname"])
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % ("$PMF_PWX", inpname1, output1))
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % ("$PMF_BANDSX", inpname2, output2))
# gen cdcloud script
with open(os.path.join(directory, "band-structure.slurm_cd"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("#\n")
fout.write("export I_MPI_PMI_LIBRARY=/opt/gridview/slurm/lib/libpmi.so\n")
fout.write("srun --mpi=pmix_v3 %s < %s > %s\n" % ("$PMF_PWX", inpname1, output1))
fout.write("srun --mpi=pmix_v3 %s < %s > %s\n" % ("$PMF_BANDSX", inpname2, output2))
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inpname1, output1))
os.system("%s $PMF_BANDSX < %s | tee %s" % (self.run_params["mpi"], inpname2, output2))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="band-structure", server=self.run_params["server"])
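Besides filband, bands.x also writes a gnuplot-friendly companion file (filband with a .gnu suffix): two columns, the k-path coordinate and the energy, with consecutive bands separated by blank lines. A standalone sketch of splitting that file into per-band segments, assuming the blank-line separator convention:

```python
def read_bands_gnu(lines):
    """Split a bands.dat.gnu-style file into a list of bands, each a
    list of (k, E) tuples. Accepts any iterable of text lines, e.g.
    an open file object; blank lines delimit successive bands."""
    bands, current = [], []
    for line in lines:
        if not line.strip():
            if current:
                bands.append(current)
                current = []
            continue
        k, e = (float(x) for x in line.split()[:2])
        current.append((k, e))
    if current:  # flush the last band if the file lacks a trailing blank line
        bands.append(current)
    return bands
```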
def set_projwfc(self, projwfc_input={}):
"""
"""
self.projwfc_input = {
"prefix": self.control.params["prefix"].as_val(t=str, dim=0),
"outdir": self.control.params["outdir"].as_val(t=str, dim=0),
"filpdos": "projwfc",
"ngauss": "default",
"degauss": "default",
"emin": "default",
"emax": "default",
"deltae": "default",
}
for item in projwfc_input:
self.projwfc_input[item] = projwfc_input[item]
def projwfc(self, directory="tmp-qe-static", inpname="static-projwfc.in", output="static-projwfc.out", runopt="gen", auto=0):
"""
Reference:
http://www.quantum-espresso.org/Doc/INPUT_PROJWFC.html
projwfc.x can be used to calculate Lowdin charges, the spilling
parameter, and the projected DOS
ngauss:
'default': read from saved input for pw.x
0: Simple Gaussian (default)
1: Methfessel-Paxton of order 1
-1: Marzari-Vanderbilt "cold smearing"
-99: Fermi-Dirac function
degauss:
gaussian broadening, Ry (not eV!)
'default':
a floating number
Note:
the degauss in projwfc.x can significantly affect
the plotting of the dos,
but I don't know whether the degauss in scf
and nscf also has such a significant effect. if
so, I might need to provide more ability to set
an appropriate degauss in the scf and nscf runs.
"""
# first check whether there is a previous scf running
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("projwfc calculation:\n")
print("  directory of previous scf or nscf calculation not found!\n")
sys.exit(1)
if runopt == "gen" or runopt == "genrun":
with open(os.path.join(directory, inpname), 'w') as fout:
fout.write("&PROJWFC\n")
for item in self.projwfc_input:
if item in ["ngauss", "degauss", "emin", "emax", "deltae"]:
continue
if self.projwfc_input[item] is not None:
if type(self.projwfc_input[item]) == str:
fout.write("%s = '%s'\n" % (item, self.projwfc_input[item]))
else:
fout.write("%s = %s\n" % (item, self.projwfc_input[item]))
if self.projwfc_input["ngauss"] == 'default':
fout.write("! use ngauss read from input for pw.x stored in xxx.save\n")
else:
fout.write("ngauss = %d\n" % self.projwfc_input["ngauss"])
if self.projwfc_input["degauss"] == 'default':
fout.write("! use degauss read from input for pw.x stored in xxx.save\n")
fout.write("! or degauss = DeltaE, if DeltaE is specified\n")
fout.write("! we better set degauss and ngauss ourselves!\n")
else:
fout.write("degauss = %f\n" % self.projwfc_input["degauss"])
if self.projwfc_input["emin"] == 'default':
fout.write("!using default Emin: lower band value minus 3 times the gaussian smearing\n")
else:
fout.write("emin = %f\n" % self.projwfc_input["emin"])
if self.projwfc_input["emax"] == 'default':
fout.write("!using default Emax: upper band value plus 3 times the gaussian smearing\n")
else:
fout.write("emax = %f\n" % self.projwfc_input["emax"])
if self.projwfc_input["deltae"] == 'default':
fout.write("!using default DeltaE value\n")
else:
fout.write("deltae = %f\n" % self.projwfc_input["deltae"])
fout.write("/\n")
fout.write("\n")
# gen yhbatch script
self.gen_yh(directory=directory, inpname=inpname, output=output, cmd="$PMF_PROJWFCX")
# gen pbs script
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_PROJWFCX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_PROJWFCX")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_PROJWFCX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="static-projwfc", server=self.run_params["server"])
def set_molecularpdos(self, inputmopdos={}):
"""
Reference:
http://www.quantum-espresso.org/Doc/INPUT_molecularpdos.html
ngauss:
0: Simple Gaussian (default)
1: Methfessel-Paxton of order 1
-1: Marzari-Vanderbilt "cold smearing"
-99: Fermi-Dirac function
degauss:
gaussian broadening, Ry (not eV!)
a floating number
Note:
I don't know why running molecularpdos.x on my machine is not stable:
with all conditions the same, it sometimes runs successfully, and when
you execute it again it might give 'STOP error reading file', and when
you execute it yet again, it might work. unbelievable
"""
self.inputmopdos = {
"fileout": "molecularpdos",
"ngauss": 0,
"degauss": 0.001,
"emin": "default",
"emax": "default",
"deltae": "default",
}
for item in inputmopdos:
if item in self.inputmopdos:
self.inputmopdos[item] = inputmopdos[item]
def molecularpdos(self, directory="tmp-qe-static", inpname="static-molecularpdos.in", output="static-molecularpdos.out",
runopt="gen", auto=0):
"""
"""
# first check whether there is a previous scf running
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("molecularpdos calculation:\n")
print("  directory of previous scf or nscf calculation not found!\n")
sys.exit(1)
if runopt == "gen" or runopt == "genrun":
with open(os.path.join(directory, inpname), 'w') as fout:
fout.write("&INPUTMOPDOS\n")
fout.write("xmlfile_full = '%s'\n" % "./tmp/pwscf.save/atomic_proj.xml")
fout.write("xmlfile_part = '%s'\n" % "./tmp/pwscf.save/atomic_proj.xml")
fout.write("fileout = '%s'\n" % self.inputmopdos["fileout"])
fout.write("ngauss = %d\n" % self.inputmopdos["ngauss"])
fout.write("! the default degauss is 0.0, which will cause floating point errors\n")
fout.write("! we better set degauss and ngauss ourselves!\n")
fout.write("degauss = %f\n" % self.inputmopdos["degauss"])
if self.inputmopdos["emin"] == 'default':
fout.write("!using default Emin: band extrema\n")
else:
fout.write("emin = %f\n" % self.inputmopdos["emin"])
if self.inputmopdos["emax"] == 'default':
fout.write("!using default Emax: band extrema\n")
else:
fout.write("emax = %f\n" % self.inputmopdos["emax"])
fout.write("! Note: deltae is in units of eV while other variables like degauss are in Rydberg\n")
if self.inputmopdos["deltae"] == 'default':
fout.write("!using default DeltaE value: 0.01 in unit of eV\n")
else:
fout.write("deltae = %f\n" % self.inputmopdos["deltae"])
fout.write("/\n")
fout.write("\n")
# gen yhbatch script
self.gen_yh(directory=directory, inpname=inpname, output=output, cmd="$PMF_MOLECULARPDOSX")
# gen pbs script
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_MOLECULARPDOSX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_MOLECULARPDOSX")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_MOLECULARPDOSX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="static-molecular-pdos", server=self.run_params["server"])
def fermi_surface(self, directory="tmp-qe-static", inpname="fermi-surface.in", output="fermi-surface.out", runopt="gen", auto=0):
"""
scf->nscf(with denser k points)->fs.x
"""
# first check whether there is a previous scf running
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("fs.x calculation:\n")
print("  directory of previous scf or nscf calculation not found!\n")
sys.exit(1)
if runopt == "gen" or runopt == "genrun":
with open(os.path.join(directory, inpname), 'w') as fout:
fout.write("&fermi\n")
fout.write("prefix = '%s'\n" % self.control.params["prefix"])
fout.write("outdir = '%s'\n" % self.control.params["outdir"])
fout.write("/\n")
fout.write("\n")
# gen yhbatch script
self.gen_yh(directory=directory, inpname=inpname, output=output, cmd="$PMF_FSX")
# gen pbs script
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_FSX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_FSX")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_FSX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="fermi-surface", server=self.run_params["server"])
def set_pp(self, inputpp={}, plotpp={}):
self.inputpp = {
"plot_num": [0],
}
for item in inputpp:
if item in self.inputpp:
self.inputpp[item] = inputpp[item]
self.plotpp = {
"iflag": 3,
"output_format": 5,
}
for item in plotpp:
if item in self.plotpp:
self.plotpp[item] = plotpp[item]
def pp(self, directory="tmp-qe-static", prefix="pp", runopt="gen", auto=0):
"""
Note:
3D charge plots such as the electron localization function and the charge
density can be turned into 2D plots with the VESTA software
(Utilities -> '2D Data Display'), where you can set the (hkl) plane
and the depth to plot.
"""
# first check whether there is a previous scf running
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("pp.x calculation:\n")
print("  directory of previous scf or nscf calculation not found!\n")
sys.exit(1)
if runopt == "gen" or runopt == "genrun":
table = {
0: "electron-pseudo-charge-density",
1: "total-potential",
2: "local-ionic-potential",
3: "ldos",
4: "local-density-of-electronic-entropy",
5: "stm",
6: "spin-polar",
7: "molecular-orbitals",
8: "electron-local-function",
9: "charge-density-minus-superposition-of-atomic-densities",
10: "ILDOS",
11: "v_bare+v_H-potential",
12: "sawtooth-electric-field-potential",
13: "noncollinear-magnetization",
17: "all-electron-charge-density-paw-only",
18: "exchange-correlation-magnetic-field-noncollinear-case",
19: "reduced-density-gradient",
20: "product-of-charge-density-with-hessian",
21: "all-electron-density-paw-only",
}
for plot_num_i in self.inputpp["plot_num"]:
with open(os.path.join(directory, prefix+"-"+table[plot_num_i]+".in"), 'w') as fout:
self._pp_inputpp(fout, plot_num=plot_num_i, filplot=table[plot_num_i]+".dat")
self._pp_plot(fout, output_format=self.plotpp["output_format"], iflag=self.plotpp["iflag"], filepp=table[plot_num_i]+".dat")
# gen yhbatch script
with open(os.path.join(directory, "pp.x.slurm"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
for plot_num_i in self.inputpp["plot_num"]:
fout.write("yhrun %s < %s > %s\n" % ("$PMF_PPX", prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
# gen pbs script
with open(os.path.join(directory, "pp.x.pbs"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s\n" % self.run_params["jobname"])
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("\n")
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
for plot_num_i in self.inputpp["plot_num"]:
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % ("$PMF_PPX", prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
# gen cdcloud script
with open(os.path.join(directory, "pp.x.slurm_cd"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("#\n")
fout.write("export I_MPI_PMI_LIBRARY=/opt/gridview/slurm/lib/libpmi.so\n")
for plot_num_i in self.inputpp["plot_num"]:
fout.write("srun --mpi=pmix_v3 %s < %s > %s\n" % ("$PMF_PPX", prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
for plot_num_i in self.inputpp["plot_num"]:
os.system("%s $PMF_PPX < %s | tee %s" % (self.run_params["mpi"], prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="pp.x", server=self.run_params["server"])
def _pp_inputpp(self, fout, plot_num, filplot):
"""
:param fout: a file stream for writing
:param plot_num -> selects what to save in filplot:
0 = electron (pseudo-)charge density
1 = total potential V_bare + V_H + V_xc
2 = local ionic potential V_bare
3 = local density of states at specific energy or grid of energies
(number of states per volume, in bohr^3, per energy unit, in Ry)
4 = local density of electronic entropy
5 = STM images
Tersoff and Hamann, PRB 31, 805 (1985)
8 = electron localization function (ELF)
9 = charge density minus superposition of atomic densities
13 = the noncollinear magnetization.
For other values of plot_num, refer to the input manual
of pp.x:
http://www.quantum-espresso.org/Doc/INPUT_PP.html
"""
fout.write("&inputpp\n")
fout.write("prefix = '%s'\n" % self.control.params["prefix"])
fout.write("outdir = '%s'\n" % self.control.params["outdir"])
fout.write("filplot = '%s'\n" % (filplot))
fout.write("plot_num = %d\n" % plot_num)
if plot_num == 0:
fout.write("spin_component = %d\n" % 0)
elif plot_num == 1:
fout.write("spin_component = %d\n" % 0)
elif plot_num == 3:
pass
elif plot_num == 5:
pass
elif plot_num == 7:
fout.write("kpoint(1) = 1\n")
fout.write("kpoint(2) = 2\n")
fout.write("kband(1) = 1\n")
fout.write('kband(2) = 2\n')
elif plot_num == 10:
pass
elif plot_num == 17:
pass
fout.write("/\n")
def _pp_plot(self, fout, filepp, iflag=3, output_format=5,
e1=[2.0, 0.0, 0.0], e2=[0.0, 2.0, 0.0], e3=[0.0, 0.0, 2.0],
x0=[0.0, 0.0, 0.0], nx=1000, ny=1000, nz=1000, radius=1.0):
"""
:param fout: a file stream for writing
"""
#fout.write("&inputpp\n")
#fout.write("/\n\n")
fout.write("&plot\n")
fout.write("nfile = 1\n")
fout.write("filepp(1) = '%s'\n" % (filepp))
fout.write("weight(1) = 1.0\n")
fout.write("iflag = %d\n" % iflag)
fout.write("output_format = %d\n" % output_format)
if iflag == 0 or iflag == 1:
fout.write("e1(1) = %f, e1(2) = %f, e1(3) = %f\n" % (e1[0], e1[1], e1[2]))
fout.write("x0(1) = %f, x0(2) = %f, x0(3) = %f\n" % (x0[0], x0[1], x0[2]))
fout.write("nx = %d\n" % nx)
elif iflag == 2:
fout.write("e1(1) = %f, e1(2) = %f, e1(3) = %f\n" % (e1[0], e1[1], e1[2]))
fout.write("e2(1) = %f, e2(2) = %f, e2(3) = %f\n" % (e2[0], e2[1], e2[2]))
fout.write("x0(1) = %f, x0(2) = %f, x0(3) = %f\n" % (x0[0], x0[1], x0[2]))
fout.write("nx = %d, ny = %d\n" % (nx, ny))
elif iflag == 3:
fout.write("e1(1) = %f, e1(2) = %f, e1(3) = %f\n" % (e1[0], e1[1], e1[2]))
fout.write("e2(1) = %f, e2(2) = %f, e2(3) = %f\n" % (e2[0], e2[1], e2[2]))
fout.write("e3(1) = %f, e3(2) = %f, e3(3) = %f\n" % (e3[0], e3[1], e3[2]))
fout.write("x0(1) = %f, x0(2) = %f, x0(3) = %f\n" % (x0[0], x0[1], x0[2]))
fout.write("nx = %d, ny = %d, nz = %d\n" % (nx, ny, nz))
elif iflag == 4:
fout.write("radius = %f\n" % radius)
fout.write("nx = %d, ny = %d\n" % (nx, ny))
if output_format == 0:
fout.write("fileout = '%s'\n" % (filepp.split(".")[0]+".1d.gp"))
elif output_format == 2:
fout.write("fileout = '%s'\n" % (filepp.split(".")[0]+".plotrho"))
elif output_format == 3:
fout.write("fileout = '%s'\n" % (filepp.split(".")[0]+".2d.xsf"))
elif output_format == 5:
fout.write("fileout = '%s'\n" % (filepp.split(".")[0]+".3d.xsf"))
elif output_format == 6:
fout.write("fileout = '%s'\n" % (filepp.split(".")[0]+".cube"))
elif output_format == 7:
fout.write("fileout = '%s'\n" % (filepp.split(".")[0]+".2d.gp"))
fout.write("/\n")
fout.write("\n")
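For reference outside this class, the fileout naming convention that _pp_plot follows can be expressed as a small standalone helper (the function name is illustrative; the 2D gnuplot suffix is normalized here to '.2d.gp'):

```python
def pp_fileout_name(filepp, output_format):
    """Derive the plot output file name from the filplot data file name:
    the base name of filepp plus an extension keyed by output_format.
    Returns None for formats without an explicit naming rule."""
    extensions = {
        0: ".1d.gp",    # gnuplot, 1D
        2: ".plotrho",  # plotrho format
        3: ".2d.xsf",   # XCrySDen, 2D
        5: ".3d.xsf",   # XCrySDen, 3D
        6: ".cube",     # Gaussian cube
        7: ".2d.gp",    # gnuplot, 2D
    }
    ext = extensions.get(output_format)
    return filepp.split(".")[0] + ext if ext is not None else None
```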
def xspectra(self, directory="tmp-qe-static", inpname="xspectra.in", output="xspectra.out", runopt="gen", auto=0):
"""
Reference:
http://www.quantum-espresso.org/Doc/INPUT_XSpectra.txt
"""
# first check whether there is a previous scf running
if not os.path.exists(directory):
print("===================================================\n")
print(" Warning !!!\n")
print("===================================================\n")
print("xspectra.x calculation:\n")
print("  directory of previous scf or nscf calculation not found!\n")
sys.exit(1)
if runopt == "gen" or runopt == "genrun":
with open(os.path.join(directory, inpname), 'w') as fout:
fout.write("&input_xspectra\n")
fout.write("/\n")
fout.write("&plot\n")
fout.write("/\n")
# gen yhbatch script
self.gen_yh(directory=directory, inpname=inpname, output=output, cmd="$PMF_XSPECTRAX")
# gen pbs script
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_XSPECTRAX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_XSPECTRAX")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_XSPECTRAX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="xspectra", server=self.run_params["server"])
#
def run(self, directory="tmp-qe-static", runopt="gen", auto=0, kpath=None,
kpoints_mp_scf=[1, 1, 1, 0, 0, 0], kpoints_mp_nscf=[3, 3, 3, 0, 0, 0]):
"""
:param directory: the overall static calculation directory; a place for all the generated files
:param runopt: determine whether the calculation is executed.
there are three values: 'gen', 'genrun', 'run'
'gen': only generate the input files
'genrun': generate input files and run
'run': run from the previously generated input files
Note:
scf, nscf, pdos, bands in a single run
"""
if runopt == 'gen' or runopt == 'genrun':
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
#os.system("cp *.UPF %s/" % directory)
#os.system("cp %s %s/" % (self.arts.xyz.file, directory))
# do not copy too many files at the same time or it will be slow
# so we do not copy all UPF files in the directory but just copy
# those used in the calculation.
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
#
# check hybrid functional
# in pw.x non-scf calc, hybrid functional is not allowed
input_dft = self.system.params["input_dft"].as_val(t=str, dim=0) if self.system.params["input_dft"].as_val() is not None else None
self.control.set_params({"nstep": None})
# 1) scf
self.control.calculation("scf")
self.set_kpoints(kpoints_option="automatic", kpoints_mp=kpoints_mp_scf)
with open(os.path.join(directory, "static-scf.in"), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# 2) nscf
self.control.calculation("nscf")
# hybrid functional calc is not allowed in non-scf calc
self.set_kpoints(kpoints_option="automatic", kpoints_mp=kpoints_mp_nscf)
if input_dft is not None and input_dft.lower() in ["pbe0", "b3lyp", "hse"]:
self.system.set_params({"input_dft": "pbe"})
with open(os.path.join(directory, "static-nscf.in"), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
# 3) projwfc
with open(os.path.join(directory, "static-projwfc.in"), 'w') as fout:
fout.write("&PROJWFC\n")
for item in self.projwfc_input:
if item in ["ngauss", "degauss", "emin", "emax", "deltae"]:
continue
if self.projwfc_input[item] is not None:
if type(self.projwfc_input[item]) == str:
fout.write("%s = '%s'\n" % (item, self.projwfc_input[item]))
else:
fout.write("%s = %s\n" % (item, self.projwfc_input[item]))
if self.projwfc_input["ngauss"] == 'default':
fout.write("! use ngauss read from input for pw.x stored in xxx.save\n")
else:
fout.write("ngauss = %d\n" % self.projwfc_input["ngauss"])
if self.projwfc_input["degauss"] == 'default':
fout.write("! use degauss read from input for pw.x stored in xxx.save\n")
fout.write("! or degauss = DeltaE, if DeltaE is specified\n")
fout.write("! we better set degauss and ngauss ourselves!\n")
else:
fout.write("degauss = %f\n" % self.projwfc_input["degauss"])
if self.projwfc_input["emin"] == 'default':
fout.write("!using default Emin: lower band value minus 3 times the gaussian smearing\n")
else:
fout.write("emin = %f\n" % self.projwfc_input["emin"])
if self.projwfc_input["emax"] == 'default':
fout.write("!using default Emax: upper band value plus 3 times the gaussian smearing\n")
else:
fout.write("emax = %f\n" % self.projwfc_input["emax"])
if self.projwfc_input["deltae"] == 'default':
fout.write("!using default DeltaE value\n")
else:
fout.write("deltae = %f\n" % self.projwfc_input["deltae"])
fout.write("/\n")
fout.write("\n")
# 4) band structure
self.control.calculation('bands')
self.set_kpoints(kpoints_option="crystal_b", crystal_b=kpath)
if input_dft.lower() in ("pbe0", "b3lyp", "hse"):
self.system.params["input_dft"] = "pbe"
with open(os.path.join(directory, "static-bands.in"), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.arts.to_in(fout)
#
with open(os.path.join(directory, "bands.in"), 'w') as fout:
fout.write("&BANDS\n")
for item in self.bands_input:
if self.bands_input[item] is not None:
if isinstance(self.bands_input[item], str) and self.bands_input[item].lower() not in [".true.", ".false."]:
fout.write("%s = '%s'\n" % (item, self.bands_input[item]))
else:
fout.write("%s = %s\n" % (item, self.bands_input[item]))
fout.write("/\n")
fout.write("\n")
# 5) pp.x
prefix="pp"
table = {
0: "electron-pseudo-charge-density",
1: "total-potential",
2: "local-ionic-potential",
3: "ldos",
4: "local-density-of-electronic-entropy",
5: "stm",
6: "spin-polar",
7: "molecular-orbitals",
8: "electron-local-function",
9: "charge-density-minus-superposition-of-atomic-densities",
10: "ILDOS",
11: "v_bare+v_H-potential",
12: "sawtooth-electric-field-potential",
13: "noncollinear-magnetization",
17: "all-electron-charge-density-paw-only",
18: "exchange-correlation-magnetic-field-noncollinear-case",
19: "reduced-density-gradient",
20: "product-of-charge-density-with-hessian",
21: "all-electron-density-paw-only",
}
for plot_num_i in self.inputpp["plot_num"]:
with open(os.path.join(directory, prefix+"-"+table[plot_num_i]+".in"), 'w') as fout:
self._pp_inputpp(fout, plot_num=plot_num_i, filplot=table[plot_num_i]+".dat")
self._pp_plot(fout, output_format=self.plotpp["output_format"], iflag=self.plotpp["iflag"], filepp=table[plot_num_i]+".dat")
# gen yhbatch script
with open(os.path.join(directory, "static.slurm"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("yhrun $PMF_PWX < static-scf.in > static-scf.out\n")
fout.write("yhrun $PMF_PWX < static-nscf.in > static-nscf.out\n")
fout.write("yhrun $PMF_PROJWFCX < static-projwfc.in > static-projwfc.out\n")
fout.write("yhrun $PMF_PWX < static-bands.in > static-bands.out\n")
fout.write("yhrun $PMF_BANDSX < bands.in > bands.out\n")
for plot_num_i in self.inputpp["plot_num"]:
fout.write("yhrun %s < %s > %s\n" % ("$PMF_PPX", prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
# gen pbs script
with open(os.path.join(directory, "static.pbs"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s\n" % self.run_params["jobname"])
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("\n")
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < static-scf.in > static-scf.out\n")
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < static-nscf.in > static-nscf.out\n")
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PROJWFCX < static-projwfc.in > static-projwfc.out\n")
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < static-bands.in > static-bands.out\n")
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_BANDSX < bands.in > bands.out\n")
for plot_num_i in self.inputpp["plot_num"]:
fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % ("$PMF_PPX", prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
# gen local bash script
with open(os.path.join(directory, "static.sh"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("%s $PMF_PWX < static-scf.in | tee static-scf.out\n" % self.run_params["mpi"])
fout.write("%s $PMF_PWX < static-nscf.in | tee static-nscf.out\n" % self.run_params["mpi"])
fout.write("%s $PMF_PROJWFCX < static-projwfc.in | tee static-projwfc.out\n" % self.run_params["mpi"])
fout.write("%s $PMF_PWX < static-bands.in > static-bands.out\n" % self.run_params["mpi"])
fout.write("%s $PMF_BANDSX < bands.in | tee bands.out\n" % self.run_params["mpi"])
for plot_num_i in self.inputpp["plot_num"]:
fout.write("%s $PMF_PPX < %s | tee %s\n" % (self.run_params["mpi"], prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
# gen cdcloud script
with open(os.path.join(directory, "static.slurm_cd"), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s\n" % self.run_params["jobname"])
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("#\n")
fout.write("export I_MPI_PMI_LIBRARY=/opt/gridview/slurm/lib/libpmi.so\n")
fout.write("srun --mpi=pmix_v3 $PMF_PWX < static-scf.in > static-scf.out\n")
fout.write("srun --mpi=pmix_v3 $PMF_PWX < static-nscf.in > static-nscf.out\n")
fout.write("srun --mpi=pmix_v3 $PMF_PROJWFCX < static-projwfc.in > static-projwfc.out\n")
fout.write("srun --mpi=pmix_v3 $PMF_PWX < static-bands.in > static-bands.out\n")
fout.write("srun --mpi=pmix_v3 $PMF_BANDSX < bands.in > bands.out\n")
for plot_num_i in self.inputpp["plot_num"]:
fout.write("srun --mpi=pmix_v3 %s < %s > %s\n" % ("$PMF_PPX", prefix+"-"+table[plot_num_i]+".in", prefix+"-"+table[plot_num_i]+".out"))
if runopt == 'genrun' or runopt == 'run':
os.chdir(directory)
os.system("bash static.sh")
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="static", server=self.run_params["server"])
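# The four job-script generators above repeat one header-writing pattern per scheduler.
# A minimal standalone sketch of the SLURM variant (hypothetical run_params values, illustration only,
# not part of the class):

```python
import io

def write_slurm_header(fout, run_params):
    # Same "#SBATCH" preamble the generators above emit.
    fout.write("#!/bin/bash\n")
    fout.write("#SBATCH -p %s\n" % run_params["partition"])
    fout.write("#SBATCH -N %d\n" % run_params["nodes"])
    fout.write("#SBATCH -n %d\n" % run_params["ntask"])
    fout.write("#SBATCH -J %s\n" % run_params["jobname"])

buf = io.StringIO()
write_slurm_header(buf, {"partition": "debug", "nodes": 1, "ntask": 4, "jobname": "static"})
print(buf.getvalue().splitlines()[0])  # → #!/bin/bash
```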
| 54.651321 | 233 | 0.510765 | 8,762 | 72,413 | 4.126569 | 0.06551 | 0.071687 | 0.056808 | 0.031363 | 0.836519 | 0.824598 | 0.81218 | 0.787399 | 0.771967 | 0.753879 | 0 | 0.009602 | 0.334111 | 72,413 | 1,324 | 234 | 54.692598 | 0.740248 | 0.142184 | 0 | 0.71024 | 0 | 0.019608 | 0.242156 | 0.03462 | 0 | 0 | 0 | 0 | 0 | 1 | 0.022876 | false | 0.004357 | 0.006536 | 0 | 0.030501 | 0.043573 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
eb8fcd8bdc58c637032c120a0377ec709a1c7f14 | 22,931 | py | Python | Lista4.py | EnzoItaliano/calculoNumericoEmPython | be3161b823955620be71e0f94a3421288fd28ef0 | ["MIT"] | 1 | 2019-12-28T21:23:00.000Z | 2019-12-28T21:23:00.000Z | Lista4.py | EnzoItaliano/calculoNumericoEmPython | be3161b823955620be71e0f94a3421288fd28ef0 | ["MIT"] | null | null | null | Lista4.py | EnzoItaliano/calculoNumericoEmPython | be3161b823955620be71e0f94a3421288fd28ef0 | ["MIT"] | null | null | null |
import copy
import numpy as np
from sympy import *
import prettymatrix
import matplotlib.pyplot as plt
from prettytable import PrettyTable
from numpy.polynomial import Polynomial as P
x = symbols('x')
# Interpolation
## Lagrange polynomial
def Lagrange(pontos, valor, f):
Pn = 0
print("Polinômios coeficientes")
for i in range(len(pontos)):
mult = 1
multp = 1
div = 1
for j in range(len(pontos)):
if i == j: continue
mult *= P([-pontos[j][0], 1])
multp *= x - pontos[j][0]
div *= pontos[i][0] - pontos[j][0]
print("\n>>>>>>>L[%a]<<<<<<<" % i)
pprint(multp/div)
Pn = Pn + pontos[i][1] * (mult // div)
print("Polinômio interpolador de Lagrange p(x) = ", end="")
poli = list(Pn)
for i in range(len(poli)):
print(abs(round(poli[i],8)),end="")
if i == 0: print(" ",end="")
elif i == 1: print("x ", end="")
else: print("x**%d" % i, end=" ")
if i != len(poli)-1:
if poli[i+1] >= 0:
print("+ ", end="")
else:
print("- ", end="")
print("\n")
print("Polinômio interpolador avaliado em x =",valor,", é P("+str(valor)+") =" ,Pn(valor))
if f != 0:
f = diff(f,x,len(poli))
# print(simplify(f))
maior = abs(f.subs(x,pontos[0][0]))
if abs(f.subs(x,pontos[len(pontos)-1][0])) > maior:
maior = abs(f.subs(x,pontos[len(pontos)-1][0]))
mult = 1
for i in range(len(pontos)):
mult *= abs(valor-pontos[i][0])
E = mult * maior / factorial(len(poli))
print("\nLimitante")
print("|E("+str(valor)+")| <= ",E.evalf())
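# The construction above can be cross-checked with a dependency-free evaluator
# (hypothetical helper lagrange_eval, illustration only): p(x) = sum_i y_i * prod_{j!=i} (x-x_j)/(x_i-x_j),
# which by construction reproduces every tabulated point.

```python
def lagrange_eval(points, xv):
    # points: (x_i, y_i) pairs with distinct x_i
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        li = 1.0  # Lagrange basis polynomial L_i evaluated at xv
        for j, (xj, _) in enumerate(points):
            if i != j:
                li *= (xv - xj) / (xi - xj)
        total += yi * li
    return total

pts = [(0.1, 2.82), (0.2, 2.67), (0.4, 2.43)]  # table from the commented example below
print(round(lagrange_eval(pts, 0.25), 6))  # → 2.6025
```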
def plotL(pontos, xi, xf):
l = []
for i in range(len(pontos)):
multp = 1
div = 1
for j in range(len(pontos)):
if i == j: continue
multp *= x - pontos[j][0]
div *= pontos[i][0] - pontos[j][0]
l.append(multp/div)
fig, ax = plt.subplots()
z = np.arange(xi,xf,0.01)
y = np.zeros((len(l),len(z)))
for i in range(len(l)):
for j in range(len(z)):
y[i][j] = (l[i].subs(x,z[j]))
ax.plot(z,y[i], label=str(l[i]))
ax.legend()
ax.grid()
plt.show()
def graficoLagrange(pontos, valor):
Pn = 0
for i in range(len(pontos)):
mult = 1
div = 1
for j in range(len(pontos)):
if i == j: continue
mult *= P([-pontos[j][0], 1])
div *= pontos[i][0] - pontos[j][0]
Pn = Pn + pontos[i][1] * (mult // div)
fig, ax = plt.subplots()
z = np.arange(-4,4,0.01)
y = []
for i in range(len(z)):
y.append(Pn(z[i]))
a = []
w = []
for i in range(len(pontos)):
a.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(z,y, label='Polinômio Interpolador P(x)')
ax.plot(a,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(valor,Pn(valor), "g*", markersize=6, label="Estimativa")
ax.legend()
ax.grid()
plt.show()
def graficofLagrange(pontos, valor, f):
Pn = 0
for i in range(len(pontos)):
mult = 1
div = 1
for j in range(len(pontos)):
if i == j: continue
mult *= P([-pontos[j][0], 1])
div *= pontos[i][0] - pontos[j][0]
Pn = Pn + pontos[i][1] * (mult // div)
fig, ax = plt.subplots()
z = np.arange(-0.5,1.5,0.01)
y = []
for i in range(len(z)):
y.append(Pn(z[i]))
a = []
for i in range(len(z)):
a.append(f.subs(x,z[i]))
b = []
w = []
for i in range(len(pontos)):
b.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(b,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(z,y, label='Polinômio Interpolador P(x)')
ax.plot(z,a, label="Função f(x)")
ax.plot(valor,Pn(valor), "g*", markersize=6, label="Estimativa")
ax.plot(valor,f.subs(x,valor), "yo", label="Valor exato")
ax.legend()
ax.grid()
plt.show()
# def f(x): return (3+x)/(1+x)
# pontos = [[0.1, 2.82],[0.2, 2.67], [0.4, 2.43]]
# Lagrange(pontos, 0.25, f(x))
# graficoLagrange(pontos, 0.25)
# graficofLagrange(pontos, 0.25, f(x))
def Newton(pontos, valor, f):
dif = []
for i in range(len(pontos)):
dif.append([])
for i in range(len(pontos)):
dif[0].append(pontos[i][1])
for i in range(len(pontos)-1):
for j in range(len(pontos)-(i+1)):
dif[i+1].append((dif[i][j+1]-dif[i][j])/(pontos[j+i+1][0]-pontos[j][0]))
Table = PrettyTable()
points=[]
for i in range(len(pontos)):
points.append(pontos[i][0])
Table.add_column("xk", points)
for k in range(len(dif)):
while len(dif[k]) < len(pontos):
dif[k].append("-")
Table.add_column("Dif_"+str(k),dif[k])
print("Tabela")
print(Table)
Pn = dif[0][0]
for i in range(1,len(dif)):
temp = 1
for j in range(i):
temp *= (x-pontos[j][0])
temp *= dif[i][0]
Pn += temp
print("Polinômio interpolador p(x) = ",end="")
print(simplify(Pn))
print("Polinômio interpolador avaliado em x = "+str(valor)+" é p("+str(valor)+") = ", end="")
print(round(Pn.subs(x,valor),8))
if f != 0:
f = diff(f,x,degree(Pn,x)+1)
# print(simplify(f))
maior = abs(f.subs(x,pontos[0][0]))
if abs(f.subs(x,pontos[len(pontos)-1][0])) > maior:
maior = abs(f.subs(x,pontos[len(pontos)-1][0]))
mult = 1
for i in range(len(pontos)):
mult *= abs(valor-pontos[i][0])
E = mult * maior / factorial(degree(Pn,x)+1)
print("\nLimitante")
print("|E("+str(valor)+")| <= ",E.evalf())
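# The divided-difference table printed by Newton() can also be built without sympy.
# A compact sketch (hypothetical names, illustration only): coef[k] ends up holding f[x_0,...,x_k],
# and Horner-style nesting evaluates the interpolant.

```python
def divided_differences(xs, ys):
    coef = list(ys)
    n = len(xs)
    for k in range(1, n):
        # update in place, from the bottom up, so lower-order entries survive
        for i in range(n - 1, k - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - k])
    return coef  # coef[k] = f[x0, ..., xk]

def newton_eval(xs, coef, xv):
    # nested (Horner-like) evaluation of the Newton form
    result = coef[-1]
    for k in range(len(coef) - 2, -1, -1):
        result = result * (xv - xs[k]) + coef[k]
    return result

xs, ys = [0.0, 0.5, 1.0], [1.0, 2.12, 3.55]  # table from the commented example below
c = divided_differences(xs, ys)
print(round(newton_eval(xs, c, 0.7), 4))  # → 2.6548
```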
def graficoNewton(pontos, valor):
dif = []
for i in range(len(pontos)):
dif.append([])
for i in range(len(pontos)):
dif[0].append(pontos[i][1])
for i in range(len(pontos)-1):
for j in range(len(pontos)-(i+1)):
dif[i+1].append((dif[i][j+1]-dif[i][j])/(pontos[j+i+1][0]-pontos[j][0]))
Pn = dif[0][0]
for i in range(1,len(dif)):
temp = 1
for j in range(i):
temp *= (x-pontos[j][0])
temp *= dif[i][0]
Pn += temp
fig, ax = plt.subplots()
z = np.arange(-1,1.5,0.001)
y = []
for i in range(len(z)):
y.append(Pn.subs(x,z[i]))
a = []
w = []
for i in range(len(pontos)):
a.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(z,y, label='Polinômio Interpolador P(x)')
ax.plot(a,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(valor,Pn.subs(x,valor), "g*", markersize=6, label="Estimativa")
ax.legend()
ax.grid()
plt.show()
def graficofNewton(pontos, valor, f):
dif = []
for i in range(len(pontos)):
dif.append([])
for i in range(len(pontos)):
dif[0].append(pontos[i][1])
for i in range(len(pontos)-1):
for j in range(len(pontos)-(i+1)):
dif[i+1].append((dif[i][j+1]-dif[i][j])/(pontos[j+i+1][0]-pontos[j][0]))
Pn = dif[0][0]
for i in range(1,len(dif)):
temp = 1
for j in range(i):
temp *= (x-pontos[j][0])
temp *= dif[i][0]
Pn += temp
fig, ax = plt.subplots()
z = np.arange(-1,1.5,0.001)
y = []
for i in range(len(z)):
y.append(Pn.subs(x,z[i]))
a = []
for i in range(len(z)):
a.append(f.subs(x,z[i]))
b = []
w = []
for i in range(len(pontos)):
b.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(b,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(z,y, label='Polinômio Interpolador P(x)')
ax.plot(z,a, label="Função f(x)")
ax.plot(valor,Pn.subs(x,valor), "g*", markersize=6, label="Estimativa")
ax.plot(valor,f.subs(x,valor), "yo", label="Valor exato")
ax.legend()
ax.grid()
plt.show()
# def f(x): return exp(x) + sin(x)
# pontos = [[0,1],[.5,2.12],[1,3.55]]
# Newton(pontos,0.7,f(x))
# graficoNewton(pontos, 0.7)
# graficofLagrange(pontos, 0.7, f(x))
def NewtonGregory(pontos, valor, f):
intervalo = pontos[1][0] - pontos[0][0]
for i in range(1,len(pontos)):
if pontos[i][0] - pontos[i-1][0] != intervalo:
return print("Valores de X não são equidistantes")
dif = []
for i in range(len(pontos)):
dif.append([])
for i in range(len(pontos)):
dif[0].append(pontos[i][1])
for i in range(len(pontos)-1):
for j in range(len(pontos)-(i+1)):
dif[i+1].append((dif[i][j+1]-dif[i][j]))
Table = PrettyTable()
points=[]
for i in range(len(pontos)):
points.append(pontos[i][0])
Table.add_column("xk", points)
for k in range(len(dif)):
while len(dif[k]) < len(pontos):
dif[k].append("-")
Table.add_column("Dif_"+str(k),dif[k])
print("Tabela")
print(Table)
Pn = dif[0][0]
for i in range(1,len(dif)):
temp = 1
for j in range(i):
temp *= (x-pontos[j][0])
temp *= (dif[i][0]/(factorial(i)*intervalo**i))
Pn += temp
print("Polinômio interpolador p(x) = ",end="")
print(Pn)
print("Polinômio interpolador avaliado em x = "+str(valor)+" é p("+str(valor)+") = ", end="")
print(round(Pn.subs(x,valor),8))
if f != 0:
f = diff(f,x,degree(Pn,x)+1)
# print(simplify(f))
maior = abs(f.subs(x,pontos[0][0]))
if abs(f.subs(x,pontos[len(pontos)-1][0])) > maior:
maior = abs(f.subs(x,pontos[len(pontos)-1][0]))
mult = 1
for i in range(len(pontos)):
mult *= abs(valor-pontos[i][0])
E = mult * maior / factorial(degree(Pn,x)+1)
print("\nLimitante")
print("|E("+str(valor)+")| <= ",E.evalf())
def graficoNG(pontos, valor):
intervalo = pontos[1][0] - pontos[0][0]
for i in range(1,len(pontos)):
if pontos[i][0] - pontos[i-1][0] != intervalo:
return print("Valores de X não são equidistantes")
dif = []
for i in range(len(pontos)):
dif.append([])
for i in range(len(pontos)):
dif[0].append(pontos[i][1])
for i in range(len(pontos)-1):
for j in range(len(pontos)-(i+1)):
dif[i+1].append((dif[i][j+1]-dif[i][j]))
Pn = dif[0][0]
for i in range(1,len(dif)):
temp = 1
for j in range(i):
temp *= (x-pontos[j][0])
temp *= (dif[i][0]/(factorial(i)*intervalo**i))
Pn += temp
fig, ax = plt.subplots()
z = np.arange(-1,3.5,0.001)
y = []
for i in range(len(z)):
y.append(Pn.subs(x,z[i]))
a = []
w = []
for i in range(len(pontos)):
a.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(z,y, label='Polinômio Interpolador P(x)')
ax.plot(a,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(valor,Pn.subs(x,valor), "g*", markersize=6, label="Estimativa")
ax.legend()
ax.grid()
plt.show()
def graficofNG(pontos, valor, f):
intervalo = pontos[1][0] - pontos[0][0]
for i in range(1,len(pontos)):
if pontos[i][0] - pontos[i-1][0] != intervalo:
return print("Valores de X não são equidistantes")
dif = []
for i in range(len(pontos)):
dif.append([])
for i in range(len(pontos)):
dif[0].append(pontos[i][1])
for i in range(len(pontos)-1):
for j in range(len(pontos)-(i+1)):
dif[i+1].append((dif[i][j+1]-dif[i][j]))
Pn = dif[0][0]
for i in range(1,len(dif)):
temp = 1
for j in range(i):
temp *= (x-pontos[j][0])
temp *= (dif[i][0]/(factorial(i)*intervalo**i))
Pn += temp
fig, ax = plt.subplots()
z = np.arange(-0.4,3,0.001)
y = []
for i in range(len(z)):
y.append(Pn.subs(x,z[i]))
a = []
for i in range(len(z)):
a.append(f.subs(x,z[i]))
b = []
w = []
for i in range(len(pontos)):
b.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(b,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(z,y, label='Polinômio Interpolador P(x)')
ax.plot(z,a, label="Função f(x)")
ax.plot(valor,Pn.subs(x,valor), "g*", markersize=6, label="Estimativa")
ax.plot(valor,f.subs(x,valor), "yo", label="Valor exato")
print(valor,Pn.subs(x,valor))
print(valor,f.subs(x,valor))
ax.legend()
ax.grid()
plt.show()
# def f(x): return x/(1+x)
# pontos = [[0,1],[1,.5],[2,2/3]]
# NewtonGregory(pontos, 1.3, f(x))
# graficoNG(pontos, 1.3)
# graficofNG(pontos, 1.3, f(x))
def sistLinear(G, B, ordem):
y = symbols('y:'+str(ordem))
mY = []
for i in range(len(y)):
mY.append(y[i])
D = np.linalg.det(G)
tempG = G.copy()
for j in range(ordem):
for i in range(ordem):
tempG[i][j] = B[i]
tempD = np.linalg.det(tempG)
tempG = G.copy()
mY[j] = round(tempD/D, 8)
mTemp = []
for i in range(len(mY)):
mTemp.append([mY[i]])
mY = mTemp.copy()
mY = np.asarray(mY)
return mY
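# Note that sistLinear solves B*Y = D by Cramer's rule (ratios of determinants),
# not by elimination. A quick standalone check of that approach against
# numpy.linalg.solve (illustration only):

```python
import numpy as np

def cramer_solve(A, b):
    # Solve A @ y = b one unknown at a time via Cramer's rule,
    # mirroring the determinant-ratio approach in sistLinear above.
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1)
    det_a = np.linalg.det(A)
    y = np.empty(len(b))
    for j in range(len(b)):
        Aj = A.copy()
        Aj[:, j] = b  # replace column j with the right-hand side
        y[j] = np.linalg.det(Aj) / det_a
    return y

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(np.allclose(cramer_solve(A, b), np.linalg.solve(A, b)))  # → True
```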
def spline(pontos, valor):
h = []
for i in range(1,len(pontos)):
h.append(pontos[i][0] - pontos[i-1][0])
M = np.zeros((len(h)-1,len(h)-1))
for i in range(len(h)-1):
if i == 0:
M[i][i] = 2*(h[i]+h[i+1])
M[i][i+1] = h[i+1]
elif i == len(h)-2:
M[i][i] = 2*(h[i]+h[i+1])
M[i][i-1] = h[i]
else:
M[i][i] = 2*(h[i]+h[i+1])
M[i][i-1] = h[i]
M[i][i+1] = h[i+1]
print(prettymatrix.matrix_to_string(M, name='Matriz = '))
B = np.zeros((len(h)-1,1))
for i in range(1,len(h)):
B[i-1][0] = 6*((pontos[i+1][1]-pontos[i][1])/h[i]) - 6*((pontos[i][1]-pontos[i-1][1])/h[i-1])
print(prettymatrix.matrix_to_string(B, name='B = '))
mu = sistLinear(M, B, len(h)-1)
print("Spline natural: \u03BC0 = 0, \u03BC"+str(len(h))+" = 0\n")
print("Resolvendo o sistema linear M*Y=B, temos:")
print('\u03BC1 = ', mu[0][0])
print('\u03BC2 = ', mu[1][0])
alpha = np.zeros(len(h))
beta = np.zeros(len(h))
gamma = np.zeros(len(h))
for i in range(len(h)):
if i == 0:
alpha[i] = ((pontos[i+1][1]-pontos[i][1])/h[i]) - ((mu[i][0]/6)*h[i]) - ((0/3)*h[i])
beta[i] = 0/2
gamma[i] = (mu[i][0]-0)/(6*h[i])
elif i == len(mu):
alpha[i] = ((pontos[i+1][1]-pontos[i][1])/h[i]) - ((0/6)*h[i]) - ((mu[i-1][0]/3)*h[i])
beta[i] = mu[i-1][0]/2
gamma[i] = (0-mu[i-1][0])/(6*h[i])
else:
alpha[i] = ((pontos[i+1][1]-pontos[i][1])/h[i]) - ((mu[i][0]/6)*h[i]) - ((mu[i-1][0]/3)*h[i])
beta[i] = mu[i-1][0]/2
gamma[i] = (mu[i][0]-mu[i-1][0])/(6*h[i])
i = np.linspace(0,len(alpha)-1,len(alpha))
Table = PrettyTable()
Table.add_column("i",i)
Table.add_column("\u03B1",alpha)
Table.add_column("\u03B2",beta)
Table.add_column("\u03B3",gamma)
print("\nCoeficientes dos polinomios da spline:")
print(Table)
S = []
for i in range(len(alpha)):
# print(pontos[i][1] +(alpha[i]*(x-pontos[i][0])))
S.append(pontos[i][1] + (alpha[i]*(x-pontos[i][0])) + (beta[i]*(x-pontos[i][0])**2) + (gamma[i]*(x-pontos[i][0])**3))
print("\nSpline cúbica natural:\n")
for i in range(len(S)):
print("P"+str(i)+"(x) = "+str(simplify(S[i]))+" , Intervalo=["+str(pontos[i][0])+","+str(pontos[i+1][0])+"]")
print("")
c = 0
for i in range(1,len(pontos)):
intervalo = [pontos[i-1][0],pontos[i][0]]
if valor >= intervalo[0] and valor < intervalo[1]:
c = copy.copy(i)
break
print("Queremos encontrar o valor para f("+str(valor)+") então devemos usar P"+str(c-1)+" pois x = "+str(valor)+" está contido no intervalo = ",intervalo)
print("\nLogo, a função em x = "+str(valor)+" é aproximadamente: ",S[c-1].subs(x,valor))
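# Sanity check for the natural-spline system assembled above (diagonal 2*(h_i+h_{i+1}),
# off-diagonals h_i): for collinear data every interior second derivative mu_i must vanish.
# A sketch using numpy.linalg.solve in place of the determinant-based solver (illustration only):

```python
import numpy as np

def natural_spline_mu(xs, ys):
    # Interior second derivatives of a natural cubic spline:
    # same M and B assembly as spline() above, solved with numpy.
    h = np.diff(xs)
    n = len(h) - 1  # number of interior knots
    M = np.zeros((n, n))
    B = np.zeros(n)
    for i in range(n):
        M[i, i] = 2.0 * (h[i] + h[i + 1])
        if i > 0:
            M[i, i - 1] = h[i]
        if i < n - 1:
            M[i, i + 1] = h[i + 1]
        B[i] = 6.0 * ((ys[i + 2] - ys[i + 1]) / h[i + 1] - (ys[i + 1] - ys[i]) / h[i])
    return np.linalg.solve(M, B)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # collinear points: all mu_i should vanish
print(np.allclose(natural_spline_mu(xs, ys), 0.0))  # → True
```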
def graficoSpline(pontos, valor):
h = []
for i in range(1,len(pontos)):
h.append(pontos[i][0] - pontos[i-1][0])
# print(len(h))
M = np.zeros((len(h)-1,len(h)-1))
for i in range(len(h)-1):
if i == 0:
M[i][i] = 2*(h[i]+h[i+1])
M[i][i+1] = h[i+1]
elif i == len(h)-2:
M[i][i] = 2*(h[i]+h[i+1])
M[i][i-1] = h[i]
else:
M[i][i] = 2*(h[i]+h[i+1])
M[i][i-1] = h[i]
M[i][i+1] = h[i+1]
B = np.zeros((len(h)-1,1))
for i in range(1,len(h)):
B[i-1][0] = 6*((pontos[i+1][1]-pontos[i][1])/h[i]) - 6*((pontos[i][1]-pontos[i-1][1])/h[i-1])
mu = sistLinear(M, B, len(h)-1)
alpha = np.zeros(len(h))
beta = np.zeros(len(h))
gamma = np.zeros(len(h))
for i in range(len(h)):
if i == 0:
alpha[i] = ((pontos[i+1][1]-pontos[i][1])/h[i]) - ((mu[i][0]/6)*h[i]) - ((0/3)*h[i])
beta[i] = 0/2
gamma[i] = (mu[i][0]-0)/(6*h[i])
elif i == len(mu):
alpha[i] = ((pontos[i+1][1]-pontos[i][1])/h[i]) - ((0/6)*h[i]) - ((mu[i-1][0]/3)*h[i])
beta[i] = mu[i-1][0]/2
gamma[i] = (0-mu[i-1][0])/(6*h[i])
else:
alpha[i] = ((pontos[i+1][1]-pontos[i][1])/h[i]) - ((mu[i][0]/6)*h[i]) - ((mu[i-1][0]/3)*h[i])
beta[i] = mu[i-1][0]/2
gamma[i] = (mu[i][0]-mu[i-1][0])/(6*h[i])
S = []
for i in range(len(alpha)):
# print(pontos[i][1] +(alpha[i]*(x-pontos[i][0])))
S.append(pontos[i][1] + (alpha[i]*(x-pontos[i][0])) + (beta[i]*(x-pontos[i][0])**2) + (gamma[i]*(x-pontos[i][0])**3))
c = 0
for i in range(1,len(pontos)):
intervalo = [pontos[i-1][0],pontos[i][0]]
if valor >= intervalo[0] and valor < intervalo[1]:
c = copy.copy(i)
break
Pn = S
fig, ax = plt.subplots()
for i in range(len(pontos)-1):
z = np.arange(pontos[i][0],pontos[i+1][0],0.001)
y = []
for j in range(len(z)):
y.append(Pn[i].subs(x,z[j]))
ax.plot(z,y, label='Polinômio Interpolador P'+str(i)+'(x)')
a = []
w = []
for i in range(len(pontos)):
a.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(a,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(valor,Pn[c-1].subs(x,valor), "g*", markersize=6, label="Estimativa")
ax.legend()
ax.grid()
plt.show()
# pontos = [[0,3.4422],[0.5,2.2302],[1,-0.8228],[1.5,-4.6133],[2,-9.0841]]
# pontos = [[3,2.5],[4.5,1],[7,2.5],[9,.5]]
# spline(pontos,5)
# graficoSpline(pontos, 5)
def minquaddis(pontos, grau):
pts = len(pontos)
g = np.zeros((grau+1,pts))
f = []
for j in range(pts):
for i in range(grau+1):
g[i][j] = pontos[j][0]**i
f.append(pontos[j][1])
print("Vetores")
for i in range(grau+1):
print("g"+str(i+1)+" = ", g[i])
print("f = ", f)
print("")
B = np.zeros((grau+1,grau+1))
for i in range(grau+1):
for j in range(grau+1):
soma = 0
for k in range(pts):
soma += g[i][k] * g[j][k]
B[i][j] = soma
print("A matriz dos coeficientes do sistema, no qual denotamos por B é")
print(prettymatrix.matrix_to_string(B, name='B = '))
print("E a matriz coluna cuja cada entrada é <g_i,f> é:")
D = []
for i in range(grau+1):
soma = 0
for k in range(pts):
soma += g[i][k] * f[k]
D.append([soma])
D = np.asarray(D)
print(prettymatrix.matrix_to_string(D, name='D = '))
print("Solução do sistema B*Y=D via eliminação de Gauss com pivotamento parcial:")
Y = sistLinear(B,D,grau+1)
print(prettymatrix.matrix_to_string(Y, name='Y = '))
p = 0
for i in range(grau+1):
p += Y[i][0]*x**i
print("Polinômio g(x) = ",p)
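# The normal equations above (B[i][j] = <g_i, g_j>, D[i] = <g_i, f>) amount to an ordinary
# polynomial least-squares fit; a sketch checking them against numpy.polyfit for the
# commented example points below (illustration only):

```python
import numpy as np

def normal_eq_fit(points, degree):
    xs = np.array([p[0] for p in points], dtype=float)
    fs = np.array([p[1] for p in points], dtype=float)
    G = np.vander(xs, degree + 1, increasing=True)  # columns g_i = x**i
    B = G.T @ G    # B[i][j] = <g_i, g_j>
    D = G.T @ fs   # D[i]    = <g_i, f>
    return np.linalg.solve(B, D)  # coefficients, lowest degree first

pts = [[-2, 1], [-1, -3], [1, 1], [2, 9]]
coeffs = normal_eq_fit(pts, 2)  # fit g(x) = a0 + a1*x + a2*x**2
```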
def graficodis(pontos,grau):
pts = len(pontos)
g = np.zeros((grau+1,pts))
f = []
for j in range(pts):
for i in range(grau+1):
g[i][j] = pontos[j][0]**i
f.append(pontos[j][1])
B = np.zeros((grau+1,grau+1))
for i in range(grau+1):
for j in range(grau+1):
soma = 0
for k in range(pts):
soma += g[i][k] * g[j][k]
B[i][j] = soma
D = []
for i in range(grau+1):
soma = 0
for k in range(pts):
soma += g[i][k] * f[k]
D.append([soma])
D = np.asarray(D)
Y = sistLinear(B,D,grau+1)
P = 0
for i in range(grau+1):
P += Y[i][0]*x**i
fig, ax = plt.subplots()
z = np.arange(-4,4,0.001)
y = []
for i in range(len(z)):
y.append(P.subs(x,z[i]))
b = []
w = []
for i in range(len(pontos)):
b.append(pontos[i][0])
w.append(pontos[i][1])
ax.plot(b,w, "r*", markersize=6, label="Pontos da tabela")
ax.plot(z,y, label='Função g(x)')
ax.legend()
ax.grid()
plt.show()
# pontos = [[-2,1],[-1,-3],[1,1],[2,9]]
# minquaddis(pontos,2)
# graficodis(pontos,2)
def minquadcont(f, a, b, grau):
grau += 1
g = []
for i in range(grau):
g.append(x**i)
B = np.zeros((grau,grau))
D = np.zeros((grau,1))
for i in range(grau):
for j in range(grau):
B[i][j] = integrate(g[i]*g[j], (x, a, b))
D[i][0] = integrate(g[i]*f, (x, a, b))
print("A matriz dos coeficientes do sistema, no qual denotamos por B é")
print(prettymatrix.matrix_to_string(B, name='B = '))
print("E a matriz coluna cuja cada entrada é <g_i,f> é:")
print(prettymatrix.matrix_to_string(D, name='D = '))
Y = sistLinear(B, D, grau)
print("Solução do sistema B*Y=D via eliminação de Gauss com pivotamento parcial:")
print(prettymatrix.matrix_to_string(Y, name='Y = '))
P = 0
for i in range(grau):
P += Y[i][0]*x**i
print("Polinômio g(x) = ", P)
def graficocont(f, a, b, grau):
grau += 1
g = []
for i in range(grau):
g.append(x**i)
B = np.zeros((grau,grau))
D = np.zeros((grau,1))
for i in range(grau):
for j in range(grau):
B[i][j] = integrate(g[i]*g[j], (x, a, b))
D[i][0] = integrate(g[i]*f, (x, a, b))
Y = sistLinear(B, D, grau)
P = 0
for i in range(grau):
P += Y[i][0]*x**i
fig, ax = plt.subplots()
z = np.arange(0,4,0.001)
y = []
for i in range(len(z)):
y.append(P.subs(x,z[i]))
b = np.arange(0,4,0.001)
w = []
for i in range(len(b)):
w.append(f.subs(x,b[i]))
ax.plot(b,w, label="Função f(x)")
ax.plot(z,y, label='Função g(x)')
ax.legend()
ax.grid()
plt.show()
def f(x): return exp(-x)
# minquadcont(f(x),1,3,1)
# graficocont(f(x),1,3,1) | 29.211465 | 158 | 0.494396 | 3,846 | 22,931 | 2.940458 | 0.057722 | 0.074896 | 0.04775 | 0.087541 | 0.832434 | 0.809002 | 0.792201 | 0.776108 | 0.761871 | 0.743567 | 0 | 0.039854 | 0.284375 | 22,931 | 785 | 159 | 29.211465 | 0.649299 | 0.041821 | 0 | 0.829421 | 0 | 0 | 0.083045 | 0.000957 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028169 | false | 0 | 0.010955 | 0.001565 | 0.045383 | 0.103286 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
691c5748787d1aee91eb32b82e037249a736d99a | 2,024 | py | Python | setup.py | karimbahgat/PyPi | 89aaea980fab89c944e1f218fcb1c6b754c21f79 | ["MIT"] | 3 | 2018-02-22T19:29:05.000Z | 2021-04-24T03:01:18.000Z | setup.py | karimbahgat/PyPi | 89aaea980fab89c944e1f218fcb1c6b754c21f79 | ["MIT"] | 1 | 2019-12-10T07:08:49.000Z | 2021-04-24T03:03:59.000Z | setup.py | karimbahgat/Pipy | 89aaea980fab89c944e1f218fcb1c6b754c21f79 | ["MIT"] | null | null | null |
try: from setuptools import setup
except ImportError: from distutils.core import setup
setup( long_description=open("README.rst").read(),
name="""Pipy""",
license="""MIT""",
author="""Karim Bahgat""",
author_email="""karim.bahgat.norway@gmail.com""",
url="""http://github.com/karimbahgat/Pipy""",
package_data={'pipy': ['pip/_vendor/Makefile', 'pip/_vendor/README.rst', 'pip/_vendor/vendor.txt', 'pip/_vendor/certifi/cacert.pem', 'pip/_vendor/distlib/t32.exe', 'pip/_vendor/distlib/t64.exe', 'pip/_vendor/distlib/w32.exe', 'pip/_vendor/distlib/w64.exe', 'pip/_vendor/distlib/_backport/sysconfig.cfg', 'pip/_vendor/requests/cacert.pem']},
version="""0.1.0""",
keywords="""bla bla""",
packages=['pipy', 'pipy/pip', 'pipy/pip/commands', 'pipy/pip/compat', 'pipy/pip/req', 'pipy/pip/utils', 'pipy/pip/vcs', 'pipy/pip/_vendor', 'pipy/pip/_vendor/cachecontrol', 'pipy/pip/_vendor/cachecontrol/caches', 'pipy/pip/_vendor/certifi', 'pipy/pip/_vendor/colorama', 'pipy/pip/_vendor/distlib', 'pipy/pip/_vendor/distlib/_backport', 'pipy/pip/_vendor/html5lib', 'pipy/pip/_vendor/html5lib/filters', 'pipy/pip/_vendor/html5lib/serializer', 'pipy/pip/_vendor/html5lib/treeadapters', 'pipy/pip/_vendor/html5lib/treebuilders', 'pipy/pip/_vendor/html5lib/treewalkers', 'pipy/pip/_vendor/html5lib/trie', 'pipy/pip/_vendor/lockfile', 'pipy/pip/_vendor/progress', 'pipy/pip/_vendor/requests', 'pipy/pip/_vendor/requests/packages', 'pipy/pip/_vendor/requests/packages/chardet', 'pipy/pip/_vendor/requests/packages/urllib3', 'pipy/pip/_vendor/requests/packages/urllib3/contrib', 'pipy/pip/_vendor/requests/packages/urllib3/packages', 'pipy/pip/_vendor/requests/packages/urllib3/packages/ssl_match_hostname', 'pipy/pip/_vendor/requests/packages/urllib3/util', 'pipy/pip/_vendor/_markerlib'],
classifiers=['License :: OSI Approved', 'Programming Language :: Python', 'Development Status :: 4 - Beta', 'Intended Audience :: Developers', 'Intended Audience :: Science/Research', 'Intended Audience :: End Users/Desktop'],
description="""Blabla""",
)
| 119.058824 | 1,084 | 0.743577 | 266 | 2,024 | 5.496241 | 0.368421 | 0.215458 | 0.23119 | 0.114911 | 0.173735 | 0.159371 | 0.060192 | 0 | 0 | 0 | 0 | 0.01252 | 0.052866 | 2,024 | 16 | 1,085 | 126.5 | 0.75013 | 0 | 0 | 0 | 0 | 0 | 0.752964 | 0.559289 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.133333 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6938fd50dcead5f861e2bf08d92d0a40f777b869 | 1,857 | py | Python | examples/method_examples/init.py | c60evaporator/param-tuning-utility | 80625f875428badac37d8439195a9327a565b040 | ["BSD-3-Clause"] | null | null | null | examples/method_examples/init.py | c60evaporator/param-tuning-utility | 80625f875428badac37d8439195a9327a565b040 | ["BSD-3-Clause"] | null | null | null | examples/method_examples/init.py | c60evaporator/param-tuning-utility | 80625f875428badac37d8439195a9327a565b040 | ["BSD-3-Clause"] | 1 | 2022-01-06T05:13:07.000Z | 2022-01-06T05:13:07.000Z |
# %% __init__(), no argument
import parent_import
from tune_easy import LGBMRegressorTuning
import pandas as pd
# Load dataset
df_reg = pd.read_csv('../sample_data/osaka_metropolis_english.csv')
TARGET_VARIABLE = 'approval_rate' # Target variable
USE_EXPLANATORY = ['2_between_30to60', '3_male_ratio', '5_household_member', 'latitude'] # Explanatory variables
y = df_reg[TARGET_VARIABLE].values
X = df_reg[USE_EXPLANATORY].values
###### __init() ######
tuning = LGBMRegressorTuning(X, y, USE_EXPLANATORY)
# %% __init__(), for LeaveOneGroupOut
import parent_import
from tune_easy import XGBRegressorTuning
from sklearn.model_selection import LeaveOneGroupOut
import pandas as pd
# Load dataset
df_reg = pd.read_csv('../sample_data/osaka_metropolis_english.csv')
TARGET_VARIABLE = 'approval_rate' # Target variable
USE_EXPLANATORY = ['2_between_30to60', '3_male_ratio', '5_household_member', 'latitude'] # Explanatory variables
X = df_reg[USE_EXPLANATORY].values
y = df_reg[TARGET_VARIABLE].values
###### __init() ######
tuning = XGBRegressorTuning(X, y, USE_EXPLANATORY, # Required argument
cv_group=df_reg['ward_after'].values) # Grouping data for LeaveOneGroupOut
# %% __init__(), use validation data as eval_data in fit_params
import parent_import
from tune_easy import LGBMRegressorTuning
import pandas as pd
# Load dataset
df_reg = pd.read_csv('../sample_data/osaka_metropolis_english.csv')
TARGET_VARIABLE = 'approval_rate' # Target variable
USE_EXPLANATORY = ['2_between_30to60', '3_male_ratio', '5_household_member', 'latitude'] # Explanatory variables
X = df_reg[USE_EXPLANATORY].values
y = df_reg[TARGET_VARIABLE].values
###### __init() ######
tuning = LGBMRegressorTuning(X, y, USE_EXPLANATORY, # Required argument
eval_data_source='valid') # Use valid data as eval_set
# %%
| 43.186047 | 113 | 0.752289 | 242 | 1,857 | 5.400826 | 0.268595 | 0.038256 | 0.041316 | 0.050497 | 0.801071 | 0.801071 | 0.729151 | 0.701607 | 0.635807 | 0.635807 | 0 | 0.01306 | 0.134087 | 1,857 | 42 | 114 | 44.214286 | 0.799751 | 0.219709 | 0 | 0.766667 | 0 | 0 | 0.247312 | 0.092473 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
""" Cisco_IOS_XR_Ethernet_SPAN_oper
This module contains a collection of YANG definitions
for Cisco IOS\-XR Ethernet\-SPAN package operational data.
This module contains definitions
for the following management objects\:
span\-monitor\-session\: Monitor Session operational data
Copyright (c) 2013\-2018 by Cisco Systems, Inc.
All rights reserved.
"""
import sys
from collections import OrderedDict
from ydk.types import Entity as _Entity_
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
class DestinationClass(Enum):
"""
DestinationClass (Enum Class)
Destination class
.. data:: interface_class = 0
Destination is an interface
.. data:: pseudowire_class = 1
Destination is a pseudowire
.. data:: next_hop_ipv4_class = 2
Destination is a next-hop IPv4 address
.. data:: next_hop_ipv6_class = 3
Destination is a next-hop IPv6 address
.. data:: invalid_class = 255
Destination is not specified
"""
interface_class = Enum.YLeaf(0, "interface-class")
pseudowire_class = Enum.YLeaf(1, "pseudowire-class")
next_hop_ipv4_class = Enum.YLeaf(2, "next-hop-ipv4-class")
next_hop_ipv6_class = Enum.YLeaf(3, "next-hop-ipv6-class")
invalid_class = Enum.YLeaf(255, "invalid-class")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['DestinationClass']
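Each `Enum.YLeaf` member above binds a Python attribute name to an (integer value, YANG identifier) pair. A dependency-free sketch of that pairing for `DestinationClass`, with a plain dict standing in for the YDK types (illustrative only, not the YDK API):

```python
# Hypothetical stand-in for the Enum.YLeaf pairs defined above.
DESTINATION_CLASS = {
    'interface_class':     (0,   'interface-class'),
    'pseudowire_class':    (1,   'pseudowire-class'),
    'next_hop_ipv4_class': (2,   'next-hop-ipv4-class'),
    'next_hop_ipv6_class': (3,   'next-hop-ipv6-class'),
    'invalid_class':       (255, 'invalid-class'),
}

def yang_name(member: str) -> str:
    """Return the YANG identifier for a Python member name."""
    return DESTINATION_CLASS[member][1]

print(yang_name('next_hop_ipv4_class'))  # → next-hop-ipv4-class
```

The integer is what appears on the wire in encoded payloads; the hyphenated string is the identifier used in the YANG model and in XML/JSON encodings.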
class ImStateEnum(Enum):
"""
ImStateEnum (Enum Class)
Im state enum
.. data:: im_state_not_ready = 0
im state not ready
.. data:: im_state_admin_down = 1
im state admin down
.. data:: im_state_down = 2
im state down
.. data:: im_state_up = 3
im state up
.. data:: im_state_shutdown = 4
im state shutdown
.. data:: im_state_err_disable = 5
im state err disable
.. data:: im_state_down_immediate = 6
im state down immediate
.. data:: im_state_down_immediate_admin = 7
im state down immediate admin
.. data:: im_state_down_graceful = 8
im state down graceful
.. data:: im_state_begin_shutdown = 9
im state begin shutdown
.. data:: im_state_end_shutdown = 10
im state end shutdown
.. data:: im_state_begin_error_disable = 11
im state begin error disable
.. data:: im_state_end_error_disable = 12
im state end error disable
.. data:: im_state_begin_down_graceful = 13
im state begin down graceful
.. data:: im_state_reset = 14
im state reset
.. data:: im_state_operational = 15
im state operational
.. data:: im_state_not_operational = 16
im state not operational
.. data:: im_state_unknown = 17
im state unknown
.. data:: im_state_last = 18
im state last
"""
im_state_not_ready = Enum.YLeaf(0, "im-state-not-ready")
im_state_admin_down = Enum.YLeaf(1, "im-state-admin-down")
im_state_down = Enum.YLeaf(2, "im-state-down")
im_state_up = Enum.YLeaf(3, "im-state-up")
im_state_shutdown = Enum.YLeaf(4, "im-state-shutdown")
im_state_err_disable = Enum.YLeaf(5, "im-state-err-disable")
im_state_down_immediate = Enum.YLeaf(6, "im-state-down-immediate")
im_state_down_immediate_admin = Enum.YLeaf(7, "im-state-down-immediate-admin")
im_state_down_graceful = Enum.YLeaf(8, "im-state-down-graceful")
im_state_begin_shutdown = Enum.YLeaf(9, "im-state-begin-shutdown")
im_state_end_shutdown = Enum.YLeaf(10, "im-state-end-shutdown")
im_state_begin_error_disable = Enum.YLeaf(11, "im-state-begin-error-disable")
im_state_end_error_disable = Enum.YLeaf(12, "im-state-end-error-disable")
im_state_begin_down_graceful = Enum.YLeaf(13, "im-state-begin-down-graceful")
im_state_reset = Enum.YLeaf(14, "im-state-reset")
im_state_operational = Enum.YLeaf(15, "im-state-operational")
im_state_not_operational = Enum.YLeaf(16, "im-state-not-operational")
im_state_unknown = Enum.YLeaf(17, "im-state-unknown")
im_state_last = Enum.YLeaf(18, "im-state-last")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['ImStateEnum']
class MirrorInterval(Enum):
"""
MirrorInterval (Enum Class)
Monitor\-session mirror intervals
.. data:: mirror_interval_all = 0
Mirror all packets
.. data:: mirror_interval512 = 1
Mirror Interval 512
.. data:: mirror_interval1k = 2
Mirror Interval 1K
.. data:: mirror_interval2k = 3
Mirror Interval 2K
.. data:: mirror_interval4k = 4
Mirror Interval 4K
.. data:: mirror_interval8k = 5
Mirror Interval 8K
.. data:: mirror_interval16k = 6
Mirror Interval 16K
"""
mirror_interval_all = Enum.YLeaf(0, "mirror-interval-all")
mirror_interval512 = Enum.YLeaf(1, "mirror-interval512")
mirror_interval1k = Enum.YLeaf(2, "mirror-interval1k")
mirror_interval2k = Enum.YLeaf(3, "mirror-interval2k")
mirror_interval4k = Enum.YLeaf(4, "mirror-interval4k")
mirror_interval8k = Enum.YLeaf(5, "mirror-interval8k")
mirror_interval16k = Enum.YLeaf(6, "mirror-interval16k")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['MirrorInterval']
class SessionClass(Enum):
"""
SessionClass (Enum Class)
Session class
.. data:: ethernet_class = 0
Ethernet mirroring session
.. data:: ipv4_class = 1
IPv4 mirroring session
.. data:: ipv6_class = 2
IPv6 mirroring session
.. data:: mplsipv4_class = 3
MPLS-IPv4 mirroring session
.. data:: mplsipv6_class = 4
MPLS-IPv6 mirroring session
.. data:: invalid_class = 65535
Invalid session class
"""
ethernet_class = Enum.YLeaf(0, "ethernet-class")
ipv4_class = Enum.YLeaf(1, "ipv4-class")
ipv6_class = Enum.YLeaf(2, "ipv6-class")
mplsipv4_class = Enum.YLeaf(3, "mplsipv4-class")
mplsipv6_class = Enum.YLeaf(4, "mplsipv6-class")
invalid_class = Enum.YLeaf(65535, "invalid-class")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SessionClass']
class TrafficDirection(Enum):
"""
TrafficDirection (Enum Class)
Monitor\-session traffic directions
.. data:: invalid = 0
Invalid
.. data:: rx_only = 1
Received
.. data:: tx_only = 2
Transmitted
.. data:: both = 3
Both
"""
invalid = Enum.YLeaf(0, "invalid")
rx_only = Enum.YLeaf(1, "rx-only")
tx_only = Enum.YLeaf(2, "tx-only")
both = Enum.YLeaf(3, "both")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['TrafficDirection']
class SpanMonitorSession(_Entity_):
"""
Monitor Session operational data
.. attribute:: global_
Global operational data
**type**\: :py:class:`Global <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global>`
**config**\: False
.. attribute:: nodes
Node table for node\-specific operational data
**type**\: :py:class:`Nodes <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession, self).__init__()
self._top_entity = None
self.yang_name = "span-monitor-session"
self.yang_parent_name = "Cisco-IOS-XR-Ethernet-SPAN-oper"
self.is_top_level_class = True
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("global", ("global_", SpanMonitorSession.Global)), ("nodes", ("nodes", SpanMonitorSession.Nodes))])
self._leafs = OrderedDict()
self.global_ = SpanMonitorSession.Global()
self.global_.parent = self
self._children_name_map["global_"] = "global"
self.nodes = SpanMonitorSession.Nodes()
self.nodes.parent = self
self._children_name_map["nodes"] = "nodes"
self._segment_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession, [], name, value)
class Global(_Entity_):
"""
Global operational data
.. attribute:: statistics
Table of statistics for source interfaces
**type**\: :py:class:`Statistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.Statistics>`
**config**\: False
.. attribute:: global_sessions
Global Monitor Sessions table
**type**\: :py:class:`GlobalSessions <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global, self).__init__()
self.yang_name = "global"
self.yang_parent_name = "span-monitor-session"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("statistics", ("statistics", SpanMonitorSession.Global.Statistics)), ("global-sessions", ("global_sessions", SpanMonitorSession.Global.GlobalSessions))])
self._leafs = OrderedDict()
self.statistics = SpanMonitorSession.Global.Statistics()
self.statistics.parent = self
self._children_name_map["statistics"] = "statistics"
self.global_sessions = SpanMonitorSession.Global.GlobalSessions()
self.global_sessions.parent = self
self._children_name_map["global_sessions"] = "global-sessions"
self._segment_path = lambda: "global"
self._absolute_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global, [], name, value)
class Statistics(_Entity_):
"""
Table of statistics for source interfaces
.. attribute:: statistic
Statistics for a particular source interface
**type**\: list of :py:class:`Statistic <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.Statistics.Statistic>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.Statistics, self).__init__()
self.yang_name = "statistics"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("statistic", ("statistic", SpanMonitorSession.Global.Statistics.Statistic))])
self._leafs = OrderedDict()
self.statistic = YList(self)
self._segment_path = lambda: "statistics"
self._absolute_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.Statistics, [], name, value)
class Statistic(_Entity_):
"""
Statistics for a particular source interface
.. attribute:: session (key)
Session Name
**type**\: str
**length:** 1..79
**config**\: False
.. attribute:: interface (key)
Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: rx_packets_mirrored
RX Packets Mirrored
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: rx_octets_mirrored
RX Octets Mirrored
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: tx_packets_mirrored
TX Packets Mirrored
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: tx_octets_mirrored
TX Octets Mirrored
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: packets_not_mirrored
Packets Not Mirrored
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: octets_not_mirrored
Octets Not Mirrored
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.Statistics.Statistic, self).__init__()
self.yang_name = "statistic"
self.yang_parent_name = "statistics"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['session','interface']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('session', (YLeaf(YType.str, 'session'), ['str'])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('rx_packets_mirrored', (YLeaf(YType.uint64, 'rx-packets-mirrored'), ['int'])),
('rx_octets_mirrored', (YLeaf(YType.uint64, 'rx-octets-mirrored'), ['int'])),
('tx_packets_mirrored', (YLeaf(YType.uint64, 'tx-packets-mirrored'), ['int'])),
('tx_octets_mirrored', (YLeaf(YType.uint64, 'tx-octets-mirrored'), ['int'])),
('packets_not_mirrored', (YLeaf(YType.uint64, 'packets-not-mirrored'), ['int'])),
('octets_not_mirrored', (YLeaf(YType.uint64, 'octets-not-mirrored'), ['int'])),
])
self.session = None
self.interface = None
self.rx_packets_mirrored = None
self.rx_octets_mirrored = None
self.tx_packets_mirrored = None
self.tx_octets_mirrored = None
self.packets_not_mirrored = None
self.octets_not_mirrored = None
self._segment_path = lambda: "statistic" + "[session='" + str(self.session) + "']" + "[interface='" + str(self.interface) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session/global/statistics/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.Statistics.Statistic, ['session', 'interface', 'rx_packets_mirrored', 'rx_octets_mirrored', 'tx_packets_mirrored', 'tx_octets_mirrored', 'packets_not_mirrored', 'octets_not_mirrored'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.Statistics.Statistic']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.Statistics']['meta_info']
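The `_segment_path` lambdas in keyed list entries splice the list keys into XPath-style predicates. A standalone sketch of the same string construction used by `Statistic.__init__` above (plain function, not the YDK API; the example key values are made up):

```python
def statistic_segment_path(session: str, interface: str) -> str:
    """Mimic the keyed segment path built in Statistic.__init__ above."""
    return "statistic" + f"[session='{session}']" + f"[interface='{interface}']"

path = statistic_segment_path('mon1', 'GigabitEthernet0/0/0/0')
print(path)  # → statistic[session='mon1'][interface='GigabitEthernet0/0/0/0']
```

Because both keys appear in the predicate, each list entry gets a unique path segment, which is how the runtime addresses individual `statistic` entries under `.../global/statistics`.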
class GlobalSessions(_Entity_):
"""
Global Monitor Sessions table
.. attribute:: global_session
Information about a globally\-configured monitor session
**type**\: list of :py:class:`GlobalSession <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions, self).__init__()
self.yang_name = "global-sessions"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("global-session", ("global_session", SpanMonitorSession.Global.GlobalSessions.GlobalSession))])
self._leafs = OrderedDict()
self.global_session = YList(self)
self._segment_path = lambda: "global-sessions"
self._absolute_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions, [], name, value)
class GlobalSession(_Entity_):
"""
Information about a globally\-configured
monitor session
.. attribute:: session (key)
Session Name
**type**\: str
**length:** 1..79
**config**\: False
.. attribute:: destination_data
Destination data
**type**\: :py:class:`DestinationData <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData>`
**config**\: False
.. attribute:: destination_id
Destination ID
**type**\: :py:class:`DestinationId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId>`
**config**\: False
.. attribute:: inject_interface
Inject interface data
**type**\: :py:class:`InjectInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.InjectInterface>`
**config**\: False
.. attribute:: name
Session Name
**type**\: str
**config**\: False
.. attribute:: session_class
Session class
**type**\: :py:class:`SessionClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SessionClass>`
**config**\: False
.. attribute:: id
Numerical ID assigned to session
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: destination_error
Last error observed for the destination
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: destination_interface_name
Destination interface name (deprecated by DestinationData, invalid for pseudowires)
**type**\: str
**config**\: False
.. attribute:: destination_interface_handle
Destination interface handle (deprecated by DestinationID, invalid for pseudowires)
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: interface_error
Last error observed for the destination interface (deprecated by DestinationError)
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession, self).__init__()
self.yang_name = "global-session"
self.yang_parent_name = "global-sessions"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['session']
self._child_classes = OrderedDict([("destination-data", ("destination_data", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData)), ("destination-id", ("destination_id", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId)), ("inject-interface", ("inject_interface", SpanMonitorSession.Global.GlobalSessions.GlobalSession.InjectInterface))])
self._leafs = OrderedDict([
('session', (YLeaf(YType.str, 'session'), ['str'])),
('name', (YLeaf(YType.str, 'name'), ['str'])),
('session_class', (YLeaf(YType.enumeration, 'session-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'SessionClass', '')])),
('id', (YLeaf(YType.uint32, 'id'), ['int'])),
('destination_error', (YLeaf(YType.uint32, 'destination-error'), ['int'])),
('destination_interface_name', (YLeaf(YType.str, 'destination-interface-name'), ['str'])),
('destination_interface_handle', (YLeaf(YType.str, 'destination-interface-handle'), ['str'])),
('interface_error', (YLeaf(YType.uint32, 'interface-error'), ['int'])),
])
self.session = None
self.name = None
self.session_class = None
self.id = None
self.destination_error = None
self.destination_interface_name = None
self.destination_interface_handle = None
self.interface_error = None
self.destination_data = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData()
self.destination_data.parent = self
self._children_name_map["destination_data"] = "destination-data"
self.destination_id = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId()
self.destination_id.parent = self
self._children_name_map["destination_id"] = "destination-id"
self.inject_interface = SpanMonitorSession.Global.GlobalSessions.GlobalSession.InjectInterface()
self.inject_interface.parent = self
self._children_name_map["inject_interface"] = "inject-interface"
self._segment_path = lambda: "global-session" + "[session='" + str(self.session) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session/global/global-sessions/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession, ['session', 'name', 'session_class', 'id', 'destination_error', 'destination_interface_name', 'destination_interface_handle', 'interface_error'], name, value)
class DestinationData(_Entity_):
"""
Destination data
.. attribute:: interface_data
Interface data
**type**\: :py:class:`InterfaceData <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.InterfaceData>`
**config**\: False
.. attribute:: pseudowire_data
Pseudowire data
**type**\: :py:class:`PseudowireData <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.PseudowireData>`
**config**\: False
.. attribute:: next_hop_ipv4_data
Next\-hop IPv4 data
**type**\: :py:class:`NextHopIpv4Data <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv4Data>`
**config**\: False
.. attribute:: next_hop_ipv6_data
Next\-hop IPv6 data
**type**\: :py:class:`NextHopIpv6Data <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv6Data>`
**config**\: False
.. attribute:: destination_class
DestinationClass
**type**\: :py:class:`DestinationClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.DestinationClass>`
**config**\: False
.. attribute:: invalid_value
Invalid Parameter
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData, self).__init__()
self.yang_name = "destination-data"
self.yang_parent_name = "global-session"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface-data", ("interface_data", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.InterfaceData)), ("pseudowire-data", ("pseudowire_data", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.PseudowireData)), ("next-hop-ipv4-data", ("next_hop_ipv4_data", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv4Data)), ("next-hop-ipv6-data", ("next_hop_ipv6_data", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv6Data))])
self._leafs = OrderedDict([
('destination_class', (YLeaf(YType.enumeration, 'destination-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'DestinationClass', '')])),
('invalid_value', (YLeaf(YType.uint32, 'invalid-value'), ['int'])),
])
self.destination_class = None
self.invalid_value = None
self.interface_data = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.InterfaceData()
self.interface_data.parent = self
self._children_name_map["interface_data"] = "interface-data"
self.pseudowire_data = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.PseudowireData()
self.pseudowire_data.parent = self
self._children_name_map["pseudowire_data"] = "pseudowire-data"
self.next_hop_ipv4_data = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv4Data()
self.next_hop_ipv4_data.parent = self
self._children_name_map["next_hop_ipv4_data"] = "next-hop-ipv4-data"
self.next_hop_ipv6_data = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv6Data()
self.next_hop_ipv6_data.parent = self
self._children_name_map["next_hop_ipv6_data"] = "next-hop-ipv6-data"
self._segment_path = lambda: "destination-data"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData, ['destination_class', 'invalid_value'], name, value)
class InterfaceData(_Entity_):
"""
Interface data
.. attribute:: interface_name
Interface Name
**type**\: str
**config**\: False
.. attribute:: interface_state
Interface State
**type**\: :py:class:`ImStateEnum <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.ImStateEnum>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.InterfaceData, self).__init__()
self.yang_name = "interface-data"
self.yang_parent_name = "destination-data"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('interface_state', (YLeaf(YType.enumeration, 'interface-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'ImStateEnum', '')])),
])
self.interface_name = None
self.interface_state = None
self._segment_path = lambda: "interface-data"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.InterfaceData, ['interface_name', 'interface_state'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.InterfaceData']['meta_info']
class PseudowireData(_Entity_):
"""
Pseudowire data
.. attribute:: pseudowire_name
Pseudowire Name
**type**\: str
**config**\: False
.. attribute:: pseudowire_is_up
Pseudowire State
**type**\: bool
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.PseudowireData, self).__init__()
self.yang_name = "pseudowire-data"
self.yang_parent_name = "destination-data"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('pseudowire_name', (YLeaf(YType.str, 'pseudowire-name'), ['str'])),
('pseudowire_is_up', (YLeaf(YType.boolean, 'pseudowire-is-up'), ['bool'])),
])
self.pseudowire_name = None
self.pseudowire_is_up = None
self._segment_path = lambda: "pseudowire-data"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.PseudowireData, ['pseudowire_name', 'pseudowire_is_up'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.PseudowireData']['meta_info']
class NextHopIpv4Data(_Entity_):
"""
Next\-hop IPv4 data
.. attribute:: ipv4_address
IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF name
**type**\: str
**config**\: False
.. attribute:: address_is_reachable
Address is reachable
**type**\: bool
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv4Data, self).__init__()
self.yang_name = "next-hop-ipv4-data"
self.yang_parent_name = "destination-data"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv4_address', (YLeaf(YType.str, 'ipv4-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
('address_is_reachable', (YLeaf(YType.boolean, 'address-is-reachable'), ['bool'])),
])
self.ipv4_address = None
self.vrf_name = None
self.address_is_reachable = None
self._segment_path = lambda: "next-hop-ipv4-data"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv4Data, ['ipv4_address', 'vrf_name', 'address_is_reachable'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv4Data']['meta_info']
class NextHopIpv6Data(_Entity_):
"""
Next\-hop IPv6 data
.. attribute:: ipv6_address
IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF name
**type**\: str
**config**\: False
.. attribute:: address_is_reachable
Address is reachable
**type**\: bool
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv6Data, self).__init__()
self.yang_name = "next-hop-ipv6-data"
self.yang_parent_name = "destination-data"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
('address_is_reachable', (YLeaf(YType.boolean, 'address-is-reachable'), ['bool'])),
])
self.ipv6_address = None
self.vrf_name = None
self.address_is_reachable = None
self._segment_path = lambda: "next-hop-ipv6-data"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv6Data, ['ipv6_address', 'vrf_name', 'address_is_reachable'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData.NextHopIpv6Data']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationData']['meta_info']
class DestinationId(_Entity_):
"""
Destination ID
.. attribute:: ipv4_address_and_vrf
IPv4 address
**type**\: :py:class:`Ipv4AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv4AddressAndVrf>`
**config**\: False
.. attribute:: ipv6_address_and_vrf
IPv6 address
**type**\: :py:class:`Ipv6AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv6AddressAndVrf>`
**config**\: False
.. attribute:: destination_class
DestinationClass
**type**\: :py:class:`DestinationClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.DestinationClass>`
**config**\: False
.. attribute:: interface
Interface Handle
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: pseudowire_id
Pseudowire XCID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_value
Invalid Parameter
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId, self).__init__()
self.yang_name = "destination-id"
self.yang_parent_name = "global-session"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("ipv4-address-and-vrf", ("ipv4_address_and_vrf", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv4AddressAndVrf)), ("ipv6-address-and-vrf", ("ipv6_address_and_vrf", SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv6AddressAndVrf))])
self._leafs = OrderedDict([
('destination_class', (YLeaf(YType.enumeration, 'destination-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'DestinationClass', '')])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('pseudowire_id', (YLeaf(YType.uint32, 'pseudowire-id'), ['int'])),
('invalid_value', (YLeaf(YType.uint32, 'invalid-value'), ['int'])),
])
self.destination_class = None
self.interface = None
self.pseudowire_id = None
self.invalid_value = None
self.ipv4_address_and_vrf = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv4AddressAndVrf()
self.ipv4_address_and_vrf.parent = self
self._children_name_map["ipv4_address_and_vrf"] = "ipv4-address-and-vrf"
self.ipv6_address_and_vrf = SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv6AddressAndVrf()
self.ipv6_address_and_vrf.parent = self
self._children_name_map["ipv6_address_and_vrf"] = "ipv6-address-and-vrf"
self._segment_path = lambda: "destination-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId, ['destination_class', 'interface', 'pseudowire_id', 'invalid_value'], name, value)
class Ipv4AddressAndVrf(_Entity_):
"""
IPv4 address
.. attribute:: ipv4_address
IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv4AddressAndVrf, self).__init__()
self.yang_name = "ipv4-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv4_address', (YLeaf(YType.str, 'ipv4-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv4_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv4-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv4AddressAndVrf, ['ipv4_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv4AddressAndVrf']['meta_info']
class Ipv6AddressAndVrf(_Entity_):
"""
IPv6 address
.. attribute:: ipv6_address
IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv6AddressAndVrf, self).__init__()
self.yang_name = "ipv6-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv6_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv6-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv6AddressAndVrf, ['ipv6_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId.Ipv6AddressAndVrf']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.DestinationId']['meta_info']
class InjectInterface(_Entity_):
"""
Inject interface data
.. attribute:: name
Interface Name
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Global.GlobalSessions.GlobalSession.InjectInterface, self).__init__()
self.yang_name = "inject-interface"
self.yang_parent_name = "global-session"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self._segment_path = lambda: "inject-interface"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Global.GlobalSessions.GlobalSession.InjectInterface, ['name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession.InjectInterface']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions.GlobalSession']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global.GlobalSessions']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Global']['meta_info']
class Nodes(_Entity_):
"""
Node table for node\-specific operational data
.. attribute:: node
Node\-specific data for a particular node
**type**\: list of :py:class:`Node <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes, self).__init__()
self.yang_name = "nodes"
self.yang_parent_name = "span-monitor-session"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("node", ("node", SpanMonitorSession.Nodes.Node))])
self._leafs = OrderedDict()
self.node = YList(self)
self._segment_path = lambda: "nodes"
self._absolute_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes, [], name, value)
class Node(_Entity_):
"""
Node\-specific data for a particular node
.. attribute:: node (key)
Node
**type**\: str
**pattern:** ([a\-zA\-Z0\-9\_]\*\\d+/){1,2}([a\-zA\-Z0\-9\_]\*\\d+)
**config**\: False
.. attribute:: attachments
Table of source interfaces configured as attached to a session
**type**\: :py:class:`Attachments <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Attachments>`
**config**\: False
.. attribute:: hardware_sessions
Table of sessions set up in the hardware. When all sessions are operating correctly the entries in this table should match those entries in GlobalSessionTable that have a destination configured
**type**\: :py:class:`HardwareSessions <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.HardwareSessions>`
**config**\: False
.. attribute:: interfaces
Table of source interfaces set up in the hardware. The entries in this table should match the entries in AttachmentTable when all sessions are operating correctly
**type**\: :py:class:`Interfaces <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node, self).__init__()
self.yang_name = "node"
self.yang_parent_name = "nodes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['node']
self._child_classes = OrderedDict([("attachments", ("attachments", SpanMonitorSession.Nodes.Node.Attachments)), ("hardware-sessions", ("hardware_sessions", SpanMonitorSession.Nodes.Node.HardwareSessions)), ("interfaces", ("interfaces", SpanMonitorSession.Nodes.Node.Interfaces))])
self._leafs = OrderedDict([
('node', (YLeaf(YType.str, 'node'), ['str'])),
])
self.node = None
self.attachments = SpanMonitorSession.Nodes.Node.Attachments()
self.attachments.parent = self
self._children_name_map["attachments"] = "attachments"
self.hardware_sessions = SpanMonitorSession.Nodes.Node.HardwareSessions()
self.hardware_sessions.parent = self
self._children_name_map["hardware_sessions"] = "hardware-sessions"
self.interfaces = SpanMonitorSession.Nodes.Node.Interfaces()
self.interfaces.parent = self
self._children_name_map["interfaces"] = "interfaces"
self._segment_path = lambda: "node" + "[node='" + str(self.node) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-Ethernet-SPAN-oper:span-monitor-session/nodes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node, ['node'], name, value)
class Attachments(_Entity_):
"""
Table of source interfaces configured as
attached to a session
.. attribute:: attachment
Information about a particular source interface configured as attached to monitor session
**type**\: list of :py:class:`Attachment <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Attachments.Attachment>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Attachments, self).__init__()
self.yang_name = "attachments"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("attachment", ("attachment", SpanMonitorSession.Nodes.Node.Attachments.Attachment))])
self._leafs = OrderedDict()
self.attachment = YList(self)
self._segment_path = lambda: "attachments"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Attachments, [], name, value)
class Attachment(_Entity_):
"""
Information about a particular source
interface configured as attached to monitor
session
.. attribute:: session (key)
Session Name
**type**\: str
**length:** 1..79
**config**\: False
.. attribute:: interface (key)
Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: traffic_parameters
Traffic mirroring parameters
**type**\: :py:class:`TrafficParameters <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Attachments.Attachment.TrafficParameters>`
**config**\: False
.. attribute:: destination_id
Destination ID
**type**\: :py:class:`DestinationId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId>`
**config**\: False
.. attribute:: name
Session Name
**type**\: str
**config**\: False
.. attribute:: local_class
Local attachment class
**type**\: :py:class:`SessionClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SessionClass>`
**config**\: False
.. attribute:: id
Numerical ID assigned to session
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: global_class
Global session class
**type**\: :py:class:`SessionClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SessionClass>`
**config**\: False
.. attribute:: session_is_configured
The Session is configured globally
**type**\: bool
**config**\: False
.. attribute:: source_interface
Source interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: source_interface_state
Source interface state
**type**\: :py:class:`ImStateEnum <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.ImStateEnum>`
**config**\: False
.. attribute:: pfi_error
Last error returned from PFI for this interface
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dest_pw_type_not_supported
The destination PW type is not supported
**type**\: bool
**config**\: False
.. attribute:: source_interface_is_a_destination
This source interface is a destination for another monitor\-session
**type**\: bool
**config**\: False
.. attribute:: destination_interface
Destination interface (deprecated by DestinationID, invalid for pseudowires)
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: traffic_direction
Traffic mirroring direction (deprecated by TrafficParameters)
**type**\: :py:class:`TrafficDirection <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.TrafficDirection>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Attachments.Attachment, self).__init__()
self.yang_name = "attachment"
self.yang_parent_name = "attachments"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['session','interface']
self._child_classes = OrderedDict([("traffic-parameters", ("traffic_parameters", SpanMonitorSession.Nodes.Node.Attachments.Attachment.TrafficParameters)), ("destination-id", ("destination_id", SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId))])
self._leafs = OrderedDict([
('session', (YLeaf(YType.str, 'session'), ['str'])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('name', (YLeaf(YType.str, 'name'), ['str'])),
('local_class', (YLeaf(YType.enumeration, 'local-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'SessionClass', '')])),
('id', (YLeaf(YType.uint32, 'id'), ['int'])),
('global_class', (YLeaf(YType.enumeration, 'global-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'SessionClass', '')])),
('session_is_configured', (YLeaf(YType.boolean, 'session-is-configured'), ['bool'])),
('source_interface', (YLeaf(YType.str, 'source-interface'), ['str'])),
('source_interface_state', (YLeaf(YType.enumeration, 'source-interface-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'ImStateEnum', '')])),
('pfi_error', (YLeaf(YType.uint32, 'pfi-error'), ['int'])),
('dest_pw_type_not_supported', (YLeaf(YType.boolean, 'dest-pw-type-not-supported'), ['bool'])),
('source_interface_is_a_destination', (YLeaf(YType.boolean, 'source-interface-is-a-destination'), ['bool'])),
('destination_interface', (YLeaf(YType.str, 'destination-interface'), ['str'])),
('traffic_direction', (YLeaf(YType.enumeration, 'traffic-direction'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'TrafficDirection', '')])),
])
self.session = None
self.interface = None
self.name = None
self.local_class = None
self.id = None
self.global_class = None
self.session_is_configured = None
self.source_interface = None
self.source_interface_state = None
self.pfi_error = None
self.dest_pw_type_not_supported = None
self.source_interface_is_a_destination = None
self.destination_interface = None
self.traffic_direction = None
self.traffic_parameters = SpanMonitorSession.Nodes.Node.Attachments.Attachment.TrafficParameters()
self.traffic_parameters.parent = self
self._children_name_map["traffic_parameters"] = "traffic-parameters"
self.destination_id = SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId()
self.destination_id.parent = self
self._children_name_map["destination_id"] = "destination-id"
self._segment_path = lambda: "attachment" + "[session='" + str(self.session) + "']" + "[interface='" + str(self.interface) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Attachments.Attachment, ['session', 'interface', 'name', 'local_class', 'id', 'global_class', 'session_is_configured', 'source_interface', 'source_interface_state', 'pfi_error', 'dest_pw_type_not_supported', 'source_interface_is_a_destination', 'destination_interface', 'traffic_direction'], name, value)
class TrafficParameters(_Entity_):
"""
Traffic mirroring parameters
.. attribute:: traffic_direction
Direction
**type**\: :py:class:`TrafficDirection <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.TrafficDirection>`
**config**\: False
.. attribute:: port_level
Port level mirroring
**type**\: bool
**config**\: False
.. attribute:: is_acl_enabled
ACL enabled
**type**\: bool
**config**\: False
.. attribute:: mirror_bytes
Number of bytes to mirror
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: byte
.. attribute:: mirror_interval
Interval between mirrored packets
**type**\: :py:class:`MirrorInterval <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.MirrorInterval>`
**config**\: False
.. attribute:: acl_name
ACL name
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Attachments.Attachment.TrafficParameters, self).__init__()
self.yang_name = "traffic-parameters"
self.yang_parent_name = "attachment"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('traffic_direction', (YLeaf(YType.enumeration, 'traffic-direction'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'TrafficDirection', '')])),
('port_level', (YLeaf(YType.boolean, 'port-level'), ['bool'])),
('is_acl_enabled', (YLeaf(YType.boolean, 'is-acl-enabled'), ['bool'])),
('mirror_bytes', (YLeaf(YType.uint32, 'mirror-bytes'), ['int'])),
('mirror_interval', (YLeaf(YType.enumeration, 'mirror-interval'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'MirrorInterval', '')])),
('acl_name', (YLeaf(YType.str, 'acl-name'), ['str'])),
])
self.traffic_direction = None
self.port_level = None
self.is_acl_enabled = None
self.mirror_bytes = None
self.mirror_interval = None
self.acl_name = None
self._segment_path = lambda: "traffic-parameters"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Attachments.Attachment.TrafficParameters, ['traffic_direction', 'port_level', 'is_acl_enabled', 'mirror_bytes', 'mirror_interval', 'acl_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Attachments.Attachment.TrafficParameters']['meta_info']
class DestinationId(_Entity_):
"""
Destination ID
.. attribute:: ipv4_address_and_vrf
IPv4 address
**type**\: :py:class:`Ipv4AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv4AddressAndVrf>`
**config**\: False
.. attribute:: ipv6_address_and_vrf
IPv6 address
**type**\: :py:class:`Ipv6AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv6AddressAndVrf>`
**config**\: False
.. attribute:: destination_class
DestinationClass
**type**\: :py:class:`DestinationClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.DestinationClass>`
**config**\: False
.. attribute:: interface
Interface Handle
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: pseudowire_id
Pseudowire XCID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_value
Invalid Parameter
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId, self).__init__()
self.yang_name = "destination-id"
self.yang_parent_name = "attachment"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("ipv4-address-and-vrf", ("ipv4_address_and_vrf", SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv4AddressAndVrf)), ("ipv6-address-and-vrf", ("ipv6_address_and_vrf", SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv6AddressAndVrf))])
self._leafs = OrderedDict([
('destination_class', (YLeaf(YType.enumeration, 'destination-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'DestinationClass', '')])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('pseudowire_id', (YLeaf(YType.uint32, 'pseudowire-id'), ['int'])),
('invalid_value', (YLeaf(YType.uint32, 'invalid-value'), ['int'])),
])
self.destination_class = None
self.interface = None
self.pseudowire_id = None
self.invalid_value = None
self.ipv4_address_and_vrf = SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv4AddressAndVrf()
self.ipv4_address_and_vrf.parent = self
self._children_name_map["ipv4_address_and_vrf"] = "ipv4-address-and-vrf"
self.ipv6_address_and_vrf = SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv6AddressAndVrf()
self.ipv6_address_and_vrf.parent = self
self._children_name_map["ipv6_address_and_vrf"] = "ipv6-address-and-vrf"
self._segment_path = lambda: "destination-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId, ['destination_class', 'interface', 'pseudowire_id', 'invalid_value'], name, value)
class Ipv4AddressAndVrf(_Entity_):
"""
IPv4 address
.. attribute:: ipv4_address
IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv4AddressAndVrf, self).__init__()
self.yang_name = "ipv4-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv4_address', (YLeaf(YType.str, 'ipv4-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv4_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv4-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv4AddressAndVrf, ['ipv4_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv4AddressAndVrf']['meta_info']
class Ipv6AddressAndVrf(_Entity_):
"""
IPv6 address
.. attribute:: ipv6_address
IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv6AddressAndVrf, self).__init__()
self.yang_name = "ipv6-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv6_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv6-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv6AddressAndVrf, ['ipv6_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId.Ipv6AddressAndVrf']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Attachments.Attachment.DestinationId']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Attachments.Attachment']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Attachments']['meta_info']
class HardwareSessions(_Entity_):
"""
Table of sessions set up in the hardware.
When all sessions are operating correctly the
entries in this table should match those
entries in GlobalSessionTable that have a
destination configured
.. attribute:: hardware_session
Information about a particular session that is set up in the hardware
**type**\: list of :py:class:`HardwareSession <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.HardwareSessions, self).__init__()
self.yang_name = "hardware-sessions"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("hardware-session", ("hardware_session", SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession))])
self._leafs = OrderedDict()
self.hardware_session = YList(self)
self._segment_path = lambda: "hardware-sessions"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.HardwareSessions, [], name, value)
class HardwareSession(_Entity_):
"""
Information about a particular session that
is set up in the hardware
.. attribute:: session_class
Session class
**type**\: :py:class:`SpanSessionClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_datatypes.SpanSessionClass>`
**config**\: False
.. attribute:: session_id
Session ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: destination_id
Destination ID
**type**\: :py:class:`DestinationId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId>`
**config**\: False
.. attribute:: id
Assigned numerical ID for this session
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: name
Configured Session Name
**type**\: str
**config**\: False
.. attribute:: session_class_xr
Session class
**type**\: :py:class:`SessionClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SessionClass>`
**config**\: False
.. attribute:: destination_interface
Destination interface (deprecated by DestinationID, invalid for pseudowires)
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: platform_error
Last error observed for this session while programming the hardware
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: inject_interface_ifh
Inject Interface ifhandle
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: inject_interface_mac
Inject Interface MAC address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: inject_interface_invalid
An inject interface is flagged as invalid on a particular node if the interface exists on that node, and there is no attachment interface config for it
**type**\: bool
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession, self).__init__()
self.yang_name = "hardware-session"
self.yang_parent_name = "hardware-sessions"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("destination-id", ("destination_id", SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId))])
self._leafs = OrderedDict([
('session_class', (YLeaf(YType.enumeration, 'session-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_datatypes', 'SpanSessionClass', '')])),
('session_id', (YLeaf(YType.uint32, 'session-id'), ['int'])),
('id', (YLeaf(YType.uint32, 'id'), ['int'])),
('name', (YLeaf(YType.str, 'name'), ['str'])),
('session_class_xr', (YLeaf(YType.enumeration, 'session-class-xr'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'SessionClass', '')])),
('destination_interface', (YLeaf(YType.str, 'destination-interface'), ['str'])),
('platform_error', (YLeaf(YType.uint32, 'platform-error'), ['int'])),
('inject_interface_ifh', (YLeaf(YType.str, 'inject-interface-ifh'), ['str'])),
('inject_interface_mac', (YLeaf(YType.str, 'inject-interface-mac'), ['str'])),
('inject_interface_invalid', (YLeaf(YType.boolean, 'inject-interface-invalid'), ['bool'])),
])
self.session_class = None
self.session_id = None
self.id = None
self.name = None
self.session_class_xr = None
self.destination_interface = None
self.platform_error = None
self.inject_interface_ifh = None
self.inject_interface_mac = None
self.inject_interface_invalid = None
self.destination_id = SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId()
self.destination_id.parent = self
self._children_name_map["destination_id"] = "destination-id"
self._segment_path = lambda: "hardware-session"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession, ['session_class', 'session_id', 'id', 'name', 'session_class_xr', 'destination_interface', 'platform_error', 'inject_interface_ifh', 'inject_interface_mac', 'inject_interface_invalid'], name, value)
class DestinationId(_Entity_):
"""
Destination ID
.. attribute:: ipv4_address_and_vrf
IPv4 address
**type**\: :py:class:`Ipv4AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv4AddressAndVrf>`
**config**\: False
.. attribute:: ipv6_address_and_vrf
IPv6 address
**type**\: :py:class:`Ipv6AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv6AddressAndVrf>`
**config**\: False
.. attribute:: destination_class
DestinationClass
**type**\: :py:class:`DestinationClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.DestinationClass>`
**config**\: False
.. attribute:: interface
Interface Handle
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: pseudowire_id
Pseudowire XCID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_value
Invalid Parameter
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId, self).__init__()
self.yang_name = "destination-id"
self.yang_parent_name = "hardware-session"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("ipv4-address-and-vrf", ("ipv4_address_and_vrf", SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv4AddressAndVrf)), ("ipv6-address-and-vrf", ("ipv6_address_and_vrf", SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv6AddressAndVrf))])
self._leafs = OrderedDict([
('destination_class', (YLeaf(YType.enumeration, 'destination-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'DestinationClass', '')])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('pseudowire_id', (YLeaf(YType.uint32, 'pseudowire-id'), ['int'])),
('invalid_value', (YLeaf(YType.uint32, 'invalid-value'), ['int'])),
])
self.destination_class = None
self.interface = None
self.pseudowire_id = None
self.invalid_value = None
self.ipv4_address_and_vrf = SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv4AddressAndVrf()
self.ipv4_address_and_vrf.parent = self
self._children_name_map["ipv4_address_and_vrf"] = "ipv4-address-and-vrf"
self.ipv6_address_and_vrf = SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv6AddressAndVrf()
self.ipv6_address_and_vrf.parent = self
self._children_name_map["ipv6_address_and_vrf"] = "ipv6-address-and-vrf"
self._segment_path = lambda: "destination-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId, ['destination_class', 'interface', 'pseudowire_id', 'invalid_value'], name, value)
class Ipv4AddressAndVrf(_Entity_):
"""
IPv4 address
.. attribute:: ipv4_address
IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv4AddressAndVrf, self).__init__()
self.yang_name = "ipv4-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv4_address', (YLeaf(YType.str, 'ipv4-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv4_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv4-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv4AddressAndVrf, ['ipv4_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv4AddressAndVrf']['meta_info']
class Ipv6AddressAndVrf(_Entity_):
"""
IPv6 address
.. attribute:: ipv6_address
IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv6AddressAndVrf, self).__init__()
self.yang_name = "ipv6-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv6_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv6-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv6AddressAndVrf, ['ipv6_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId.Ipv6AddressAndVrf']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession.DestinationId']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.HardwareSessions.HardwareSession']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.HardwareSessions']['meta_info']
class Interfaces(_Entity_):
"""
Table of source interfaces set up in the
hardware. The entries in this table should
match the entries in AttachmentTable when all
sessions are operating correctly
.. attribute:: interface
Information about a particular interface that is set up in the hardware
**type**\: list of :py:class:`Interface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces, self).__init__()
self.yang_name = "interfaces"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", SpanMonitorSession.Nodes.Node.Interfaces.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "interfaces"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces, [], name, value)
class Interface(_Entity_):
"""
Information about a particular interface that
is set up in the hardware
.. attribute:: interface (key)
Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: destination_id
Destination ID (deprecated by Attachment)
**type**\: :py:class:`DestinationId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId>`
**config**\: False
.. attribute:: traffic_mirroring_parameters
Traffic mirroring parameters (deprecated by Attachment)
**type**\: :py:class:`TrafficMirroringParameters <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.TrafficMirroringParameters>`
**config**\: False
.. attribute:: source_interface
Source interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: platform_error
Last error observed for this interface while programming the hardware
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: destination_interface
Destination interface (deprecated by Attachment)
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: traffic_direction
Traffic mirroring direction (deprecated by Attachment)
**type**\: :py:class:`TrafficDirection <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.TrafficDirection>`
**config**\: False
.. attribute:: attachment
Attachment information
**type**\: list of :py:class:`Attachment <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "interfaces"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['interface']
self._child_classes = OrderedDict([("destination-id", ("destination_id", SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId)), ("traffic-mirroring-parameters", ("traffic_mirroring_parameters", SpanMonitorSession.Nodes.Node.Interfaces.Interface.TrafficMirroringParameters)), ("attachment", ("attachment", SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment))])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('source_interface', (YLeaf(YType.str, 'source-interface'), ['str'])),
('platform_error', (YLeaf(YType.uint32, 'platform-error'), ['int'])),
('destination_interface', (YLeaf(YType.str, 'destination-interface'), ['str'])),
('traffic_direction', (YLeaf(YType.enumeration, 'traffic-direction'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'TrafficDirection', '')])),
])
self.interface = None
self.source_interface = None
self.platform_error = None
self.destination_interface = None
self.traffic_direction = None
self.destination_id = SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId()
self.destination_id.parent = self
self._children_name_map["destination_id"] = "destination-id"
self.traffic_mirroring_parameters = SpanMonitorSession.Nodes.Node.Interfaces.Interface.TrafficMirroringParameters()
self.traffic_mirroring_parameters.parent = self
self._children_name_map["traffic_mirroring_parameters"] = "traffic-mirroring-parameters"
self.attachment = YList(self)
self._segment_path = lambda: "interface" + "[interface='" + str(self.interface) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface, ['interface', 'source_interface', 'platform_error', 'destination_interface', 'traffic_direction'], name, value)
class DestinationId(_Entity_):
"""
Destination ID (deprecated by Attachment)
.. attribute:: ipv4_address_and_vrf
IPv4 address
**type**\: :py:class:`Ipv4AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv4AddressAndVrf>`
**config**\: False
.. attribute:: ipv6_address_and_vrf
IPv6 address
**type**\: :py:class:`Ipv6AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv6AddressAndVrf>`
**config**\: False
.. attribute:: destination_class
DestinationClass
**type**\: :py:class:`DestinationClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.DestinationClass>`
**config**\: False
.. attribute:: interface
Interface Handle
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: pseudowire_id
Pseudowire XCID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_value
Invalid Parameter
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId, self).__init__()
self.yang_name = "destination-id"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("ipv4-address-and-vrf", ("ipv4_address_and_vrf", SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv4AddressAndVrf)), ("ipv6-address-and-vrf", ("ipv6_address_and_vrf", SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv6AddressAndVrf))])
self._leafs = OrderedDict([
('destination_class', (YLeaf(YType.enumeration, 'destination-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'DestinationClass', '')])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('pseudowire_id', (YLeaf(YType.uint32, 'pseudowire-id'), ['int'])),
('invalid_value', (YLeaf(YType.uint32, 'invalid-value'), ['int'])),
])
self.destination_class = None
self.interface = None
self.pseudowire_id = None
self.invalid_value = None
self.ipv4_address_and_vrf = SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv4AddressAndVrf()
self.ipv4_address_and_vrf.parent = self
self._children_name_map["ipv4_address_and_vrf"] = "ipv4-address-and-vrf"
self.ipv6_address_and_vrf = SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv6AddressAndVrf()
self.ipv6_address_and_vrf.parent = self
self._children_name_map["ipv6_address_and_vrf"] = "ipv6-address-and-vrf"
self._segment_path = lambda: "destination-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId, ['destination_class', 'interface', 'pseudowire_id', 'invalid_value'], name, value)
class Ipv4AddressAndVrf(_Entity_):
"""
IPv4 address
.. attribute:: ipv4_address
IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv4AddressAndVrf, self).__init__()
self.yang_name = "ipv4-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv4_address', (YLeaf(YType.str, 'ipv4-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv4_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv4-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv4AddressAndVrf, ['ipv4_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv4AddressAndVrf']['meta_info']
class Ipv6AddressAndVrf(_Entity_):
"""
IPv6 address
.. attribute:: ipv6_address
IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv6AddressAndVrf, self).__init__()
self.yang_name = "ipv6-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv6_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv6-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv6AddressAndVrf, ['ipv6_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId.Ipv6AddressAndVrf']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.DestinationId']['meta_info']
class TrafficMirroringParameters(_Entity_):
"""
Traffic mirroring parameters (deprecated by
Attachment)
.. attribute:: traffic_direction
Direction
**type**\: :py:class:`TrafficDirection <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.TrafficDirection>`
**config**\: False
.. attribute:: port_level
Port level mirroring
**type**\: bool
**config**\: False
.. attribute:: is_acl_enabled
ACL enabled
**type**\: bool
**config**\: False
.. attribute:: mirror_bytes
Number of bytes to mirror
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: byte
.. attribute:: mirror_interval
Interval between mirrored packets
**type**\: :py:class:`MirrorInterval <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.MirrorInterval>`
**config**\: False
.. attribute:: acl_name
ACL name
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.TrafficMirroringParameters, self).__init__()
self.yang_name = "traffic-mirroring-parameters"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('traffic_direction', (YLeaf(YType.enumeration, 'traffic-direction'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'TrafficDirection', '')])),
('port_level', (YLeaf(YType.boolean, 'port-level'), ['bool'])),
('is_acl_enabled', (YLeaf(YType.boolean, 'is-acl-enabled'), ['bool'])),
('mirror_bytes', (YLeaf(YType.uint32, 'mirror-bytes'), ['int'])),
('mirror_interval', (YLeaf(YType.enumeration, 'mirror-interval'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'MirrorInterval', '')])),
('acl_name', (YLeaf(YType.str, 'acl-name'), ['str'])),
])
self.traffic_direction = None
self.port_level = None
self.is_acl_enabled = None
self.mirror_bytes = None
self.mirror_interval = None
self.acl_name = None
self._segment_path = lambda: "traffic-mirroring-parameters"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.TrafficMirroringParameters, ['traffic_direction', 'port_level', 'is_acl_enabled', 'mirror_bytes', 'mirror_interval', 'acl_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.TrafficMirroringParameters']['meta_info']
class Attachment(_Entity_):
"""
Attachment information
.. attribute:: destination_id
Destination ID
**type**\: :py:class:`DestinationId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId>`
**config**\: False
.. attribute:: traffic_mirroring_parameters
Traffic mirroring parameters
**type**\: :py:class:`TrafficMirroringParameters <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.TrafficMirroringParameters>`
**config**\: False
.. attribute:: class_
Attachment class
**type**\: :py:class:`SessionClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SessionClass>`
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment, self).__init__()
self.yang_name = "attachment"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("destination-id", ("destination_id", SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId)), ("traffic-mirroring-parameters", ("traffic_mirroring_parameters", SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.TrafficMirroringParameters))])
self._leafs = OrderedDict([
('class_', (YLeaf(YType.enumeration, 'class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'SessionClass', '')])),
])
self.class_ = None
self.destination_id = SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId()
self.destination_id.parent = self
self._children_name_map["destination_id"] = "destination-id"
self.traffic_mirroring_parameters = SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.TrafficMirroringParameters()
self.traffic_mirroring_parameters.parent = self
self._children_name_map["traffic_mirroring_parameters"] = "traffic-mirroring-parameters"
self._segment_path = lambda: "attachment"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment, ['class_'], name, value)
class DestinationId(_Entity_):
"""
Destination ID
.. attribute:: ipv4_address_and_vrf
IPv4 address
**type**\: :py:class:`Ipv4AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv4AddressAndVrf>`
**config**\: False
.. attribute:: ipv6_address_and_vrf
IPv6 address
**type**\: :py:class:`Ipv6AddressAndVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv6AddressAndVrf>`
**config**\: False
.. attribute:: destination_class
DestinationClass
**type**\: :py:class:`DestinationClass <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.DestinationClass>`
**config**\: False
.. attribute:: interface
Interface Handle
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: pseudowire_id
Pseudowire XCID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_value
Invalid Parameter
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId, self).__init__()
self.yang_name = "destination-id"
self.yang_parent_name = "attachment"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("ipv4-address-and-vrf", ("ipv4_address_and_vrf", SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv4AddressAndVrf)), ("ipv6-address-and-vrf", ("ipv6_address_and_vrf", SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv6AddressAndVrf))])
self._leafs = OrderedDict([
('destination_class', (YLeaf(YType.enumeration, 'destination-class'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'DestinationClass', '')])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('pseudowire_id', (YLeaf(YType.uint32, 'pseudowire-id'), ['int'])),
('invalid_value', (YLeaf(YType.uint32, 'invalid-value'), ['int'])),
])
self.destination_class = None
self.interface = None
self.pseudowire_id = None
self.invalid_value = None
self.ipv4_address_and_vrf = SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv4AddressAndVrf()
self.ipv4_address_and_vrf.parent = self
self._children_name_map["ipv4_address_and_vrf"] = "ipv4-address-and-vrf"
self.ipv6_address_and_vrf = SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv6AddressAndVrf()
self.ipv6_address_and_vrf.parent = self
self._children_name_map["ipv6_address_and_vrf"] = "ipv6-address-and-vrf"
self._segment_path = lambda: "destination-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId, ['destination_class', 'interface', 'pseudowire_id', 'invalid_value'], name, value)
class Ipv4AddressAndVrf(_Entity_):
"""
IPv4 address
.. attribute:: ipv4_address
IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv4AddressAndVrf, self).__init__()
self.yang_name = "ipv4-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv4_address', (YLeaf(YType.str, 'ipv4-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv4_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv4-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv4AddressAndVrf, ['ipv4_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv4AddressAndVrf']['meta_info']
class Ipv6AddressAndVrf(_Entity_):
"""
IPv6 address
.. attribute:: ipv6_address
IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: vrf_name
VRF
**type**\: str
**config**\: False
"""
_prefix = 'ethernet-span-oper'
_revision = '2015-11-09'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv6AddressAndVrf, self).__init__()
self.yang_name = "ipv6-address-and-vrf"
self.yang_parent_name = "destination-id"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
('vrf_name', (YLeaf(YType.str, 'vrf-name'), ['str'])),
])
self.ipv6_address = None
self.vrf_name = None
self._segment_path = lambda: "ipv6-address-and-vrf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv6AddressAndVrf, ['ipv6_address', 'vrf_name'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId.Ipv6AddressAndVrf']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.DestinationId']['meta_info']
class TrafficMirroringParameters(_Entity_):
"""
Traffic mirroring parameters
.. attribute:: traffic_direction
Direction
**type**\: :py:class:`TrafficDirection <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.TrafficDirection>`
**config**\: False
.. attribute:: port_level
Port level mirroring
**type**\: bool
**config**\: False
                            .. attribute:: is_acl_enabled
                                ACL enabled
                                **type**\: bool
                                **config**\: False

                            .. attribute:: mirror_bytes
                                Number of bytes to mirror
                                **type**\: int
                                **range:** 0..4294967295
                                **config**\: False
                                **units**\: byte

                            .. attribute:: mirror_interval
                                Interval between mirrored packets
                                **type**\: :py:class:`MirrorInterval <ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper.MirrorInterval>`
                                **config**\: False

                            .. attribute:: acl_name
                                ACL name
                                **type**\: str
                                **config**\: False

                            """

                            _prefix = 'ethernet-span-oper'
                            _revision = '2015-11-09'

                            def __init__(self):
                                if sys.version_info > (3,):
                                    super().__init__()
                                else:
                                    super(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.TrafficMirroringParameters, self).__init__()

                                self.yang_name = "traffic-mirroring-parameters"
                                self.yang_parent_name = "attachment"
                                self.is_top_level_class = False
                                self.has_list_ancestor = True
                                self.ylist_key_names = []
                                self._child_classes = OrderedDict([])
                                self._leafs = OrderedDict([
                                    ('traffic_direction', (YLeaf(YType.enumeration, 'traffic-direction'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'TrafficDirection', '')])),
                                    ('port_level', (YLeaf(YType.boolean, 'port-level'), ['bool'])),
                                    ('is_acl_enabled', (YLeaf(YType.boolean, 'is-acl-enabled'), ['bool'])),
                                    ('mirror_bytes', (YLeaf(YType.uint32, 'mirror-bytes'), ['int'])),
                                    ('mirror_interval', (YLeaf(YType.enumeration, 'mirror-interval'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_Ethernet_SPAN_oper', 'MirrorInterval', '')])),
                                    ('acl_name', (YLeaf(YType.str, 'acl-name'), ['str'])),
                                ])
                                self.traffic_direction = None
                                self.port_level = None
                                self.is_acl_enabled = None
                                self.mirror_bytes = None
                                self.mirror_interval = None
                                self.acl_name = None
                                self._segment_path = lambda: "traffic-mirroring-parameters"
                                self._is_frozen = True

                            def __setattr__(self, name, value):
                                self._perform_setattr(SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.TrafficMirroringParameters, ['traffic_direction', 'port_level', 'is_acl_enabled', 'mirror_bytes', 'mirror_interval', 'acl_name'], name, value)

                            @staticmethod
                            def _meta_info():
                                from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
                                return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment.TrafficMirroringParameters']['meta_info']

                        @staticmethod
                        def _meta_info():
                            from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
                            return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface.Attachment']['meta_info']

                    @staticmethod
                    def _meta_info():
                        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
                        return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces.Interface']['meta_info']

                @staticmethod
                def _meta_info():
                    from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
                    return meta._meta_table['SpanMonitorSession.Nodes.Node.Interfaces']['meta_info']

            @staticmethod
            def _meta_info():
                from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
                return meta._meta_table['SpanMonitorSession.Nodes.Node']['meta_info']

        @staticmethod
        def _meta_info():
            from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
            return meta._meta_table['SpanMonitorSession.Nodes']['meta_info']

    def clone_ptr(self):
        self._top_entity = SpanMonitorSession()
        return self._top_entity

    @staticmethod
    def _meta_info():
        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_Ethernet_SPAN_oper as meta
        return meta._meta_table['SpanMonitorSession']['meta_info']
# File: __init__.py | Repo: lcremer/Maya-VertexColor | License: Unlicense
import Maya_VertexColor.Gradient
from Maya_VertexColor.Menu import menu_item, add_menu_item
add_menu_item()
# File: voipms/entities/didsorder.py | Repo: 4doom4/python-voipms | License: MIT
# coding=utf-8
"""
The Dids API endpoint order
Documentation: https://voip.ms/m/apidocs.php
"""
from voipms.baseapi import BaseApi
from voipms.helpers import order
class DidsOrder(BaseApi):
    """
    Order for the Dids endpoint.
    """
    def __init__(self, *args, **kwargs):
        """
        Initialize the endpoint
        """
        super(DidsOrder, self).__init__(*args, **kwargs)
        self.endpoint = 'dids'

    def order_did(self, did, routing, pop, dialtime, cnam, billing_type, **kwargs):
        """
Orders and Adds a new DID Number to the Account
:param did: [Required] DID to be Ordered (Example: 5552223333)
:type did: :py:class:`int`
:param routing: [Required] Main Routing for the DID
:type routing: :py:class:`str`
:param pop: [Required] Point of Presence for the DID (Example: 5)
:type pop: :py:class:`int`
:param dialtime: [Required] Dial Time Out for the DID (Example: 60 -> in seconds)
:type dialtime: :py:class:`int`
:param cnam: [Required] CNAM for the DID (Boolean: True/False)
:type cnam: :py:class:`bool`
:param billing_type: [Required] Billing type for the DID (1 = Per Minute, 2 = Flat)
:type billing_type: :py:class:`int`
:param **kwargs: All optional parameters
:type **kwargs: :py:class:`dict`
:param failover_busy: Busy Routing for the DID
:type failover_busy: :py:class:`str`
:param failover_unreachable: Unreachable Routing for the DID
:type failover_unreachable: :py:class:`str`
:param failover_noanswer: NoAnswer Routing for the DID
:type failover_noanswer: :py:class:`str`
:param voicemail: Voicemail for the DID (Example: 101)
:type voicemail: :py:class:`int`
:param callerid_prefix: Caller ID Prefix for the DID
:type callerid_prefix: :py:class:`str`
:param note: Note for the DID
:type note: :py:class:`str`
:param account: Reseller Sub Account (Example: '100001_VoIP')
:type account: :py:class:`str`
:param monthly: Monthly Fee for Reseller Client (Example: 3.50)
:type monthly: :py:class:`float`
:param setup: Setup Fee for Reseller Client (Example: 1.99)
:type setup: :py:class:`float`
:param minute: Minute Rate for Reseller Client (Example: 0.03)
:type minute: :py:class:`float`
:param test: Set to True if testing how Orders work
- Orders can not be undone
- When testing, no Orders are made
:type test: :py:class:`bool`
:returns: :py:class:`dict`
routing, failover_busy, failover_unreachable and failover_noanswer
can receive values in the following format => header:record_id
Where header could be: account, fwd, vm, sip, grp, ivr, sys, recording, queue, cb, tc, disa, none.
Examples:
account Used for routing calls to Sub Accounts
You can get all sub accounts using the accounts.get_sub_accounts function
fwd Used for routing calls to Forwarding entries.
You can get the ID right after creating a Forwarding with setForwarding
or by requesting all forwardings entries with getForwardings.
vm Used for routing calls to a Voicemail.
You can get all voicemails and their IDs using the voicemail.get_voicemails function
sys System Options:
hangup = Hangup the Call
busy = Busy tone
noservice = System Recording: Number not in service
disconnected = System Recording: Number has been disconnected
dtmf = DTMF Test
echo = ECHO Test
none Used to route calls to no action
Examples:
'account:100001_VoIP'
'fwd:1026'
'vm:101'
'none:'
'sys:echo'
"""
method = "orderDID"
kwargs.update({
"method": method,
"did": did,
"routing": routing,
"pop": pop,
"dialtime": dialtime,
"cnam": cnam,
"billing_type": billing_type,
})
return self._voipms_client._get(order(**kwargs))
def order_did_international_geographic(self, location_id, quantity, routing, pop, dialtime, cnam, billing_type, **kwargs):
"""
Orders and Adds new International Geographic DID Numbers to the Account
:param location_id: [Required] ID for a specific International Location (Values from dids.get_dids_international_geographic)
:type location_id: :py:class:`int`
:param quantity: [Required] Number of dids to be purchased (Example: 2)
:type quantity: :py:class:`int`
:param routing: [Required] Main Routing for the DID
:type routing: :py:class:`str`
:param pop: [Required] Point of Presence for the DID (Example: 5)
:type pop: :py:class:`int`
:param dialtime: [Required] Dial Time Out for the DID (Example: 60 -> in seconds)
:type dialtime: :py:class:`int`
:param cnam: [Required] CNAM for the DID (Boolean: True/False)
:type cnam: :py:class:`bool`
:param billing_type: [Required] Billing type for the DID (1 = Per Minute, 2 = Flat)
:type billing_type: :py:class:`int`
:param **kwargs: All optional parameters
:type **kwargs: :py:class:`dict`
:param failover_busy: Busy Routing for the DID
:type failover_busy: :py:class:`str`
:param failover_unreachable: Unreachable Routing for the DID
:type failover_unreachable: :py:class:`str`
:param failover_noanswer: NoAnswer Routing for the DID
:type failover_noanswer: :py:class:`str`
:param voicemail: Voicemail for the DID (Example: 101)
:type voicemail: :py:class:`int`
:param callerid_prefix: Caller ID Prefix for the DID
:type callerid_prefix: :py:class:`str`
:param note: Note for the DID
:type note: :py:class:`str`
:param account: Reseller Sub Account (Example: '100001_VoIP')
:type account: :py:class:`str`
:param monthly: Monthly Fee for Reseller Client (Example: 3.50)
:type monthly: :py:class:`float`
:param setup: Setup Fee for Reseller Client (Example: 1.99)
:type setup: :py:class:`float`
:param minute: Minute Rate for Reseller Client (Example: 0.03)
:type minute: :py:class:`float`
:param test: Set to True if testing how Orders work
- Orders can not be undone
- When testing, no Orders are made
:type test: :py:class:`bool`
:returns: :py:class:`dict`
routing, failover_busy, failover_unreachable and failover_noanswer
can receive values in the following format => header:record_id
Where header could be: account, fwd, vm, sip, grp, ivr, sys, recording, queue, cb, tc, disa, none.
Examples:
account Used for routing calls to Sub Accounts
You can get all sub accounts using the accounts.get_sub_accounts function
fwd Used for routing calls to Forwarding entries.
You can get the ID right after creating a Forwarding with setForwarding
or by requesting all forwardings entries with getForwardings.
vm Used for routing calls to a Voicemail.
You can get all voicemails and their IDs using the voicemail.get_voicemails function
sys System Options:
hangup = Hangup the Call
busy = Busy tone
noservice = System Recording: Number not in service
disconnected = System Recording: Number has been disconnected
dtmf = DTMF Test
echo = ECHO Test
none Used to route calls to no action
Examples:
'account:100001_VoIP'
'fwd:1026'
'vm:101'
'none:'
'sys:echo'
"""
method = "orderDIDInternationalGeographic"
kwargs.update({
"method": method,
"location_id": location_id,
"quantity": quantity,
"routing": routing,
"pop": pop,
"dialtime": dialtime,
"cnam": cnam,
"billing_type": billing_type,
})
return self._voipms_client._get(order(**kwargs))
def order_did_international_national(self, location_id, quantity, routing, pop, dialtime, cnam, billing_type, **kwargs):
"""
Orders and Adds new International National DID Numbers to the Account
:param location_id: [Required] ID for a specific International Location (Values from dids.get_dids_international_geographic)
:type location_id: :py:class:`int`
:param quantity: [Required] Number of dids to be purchased (Example: 2)
:type quantity: :py:class:`int`
:param routing: [Required] Main Routing for the DID
:type routing: :py:class:`str`
:param pop: [Required] Point of Presence for the DID (Example: 5)
:type pop: :py:class:`int`
:param dialtime: [Required] Dial Time Out for the DID (Example: 60 -> in seconds)
:type dialtime: :py:class:`int`
:param cnam: [Required] CNAM for the DID (Boolean: True/False)
:type cnam: :py:class:`bool`
:param billing_type: [Required] Billing type for the DID (1 = Per Minute, 2 = Flat)
:type billing_type: :py:class:`int`
:param **kwargs: All optional parameters
:type **kwargs: :py:class:`dict`
:param failover_busy: Busy Routing for the DID
:type failover_busy: :py:class:`str`
:param failover_unreachable: Unreachable Routing for the DID
:type failover_unreachable: :py:class:`str`
:param failover_noanswer: NoAnswer Routing for the DID
:type failover_noanswer: :py:class:`str`
:param voicemail: Voicemail for the DID (Example: 101)
:type voicemail: :py:class:`int`
:param callerid_prefix: Caller ID Prefix for the DID
:type callerid_prefix: :py:class:`str`
:param note: Note for the DID
:type note: :py:class:`str`
:param account: Reseller Sub Account (Example: '100001_VoIP')
:type account: :py:class:`str`
:param monthly: Monthly Fee for Reseller Client (Example: 3.50)
:type monthly: :py:class:`float`
:param setup: Setup Fee for Reseller Client (Example: 1.99)
:type setup: :py:class:`float`
:param minute: Minute Rate for Reseller Client (Example: 0.03)
:type minute: :py:class:`float`
:param test: Set to True if testing how Orders work
- Orders can not be undone
- When testing, no Orders are made
:type test: :py:class:`bool`
:returns: :py:class:`dict`
routing, failover_busy, failover_unreachable and failover_noanswer
can receive values in the following format => header:record_id
Where header could be: account, fwd, vm, sip, grp, ivr, sys, recording, queue, cb, tc, disa, none.
Examples:
account Used for routing calls to Sub Accounts
You can get all sub accounts using the accounts.get_sub_accounts function
fwd Used for routing calls to Forwarding entries.
You can get the ID right after creating a Forwarding with setForwarding
or by requesting all forwardings entries with getForwardings.
vm Used for routing calls to a Voicemail.
You can get all voicemails and their IDs using the voicemail.get_voicemails function
sys System Options:
hangup = Hangup the Call
busy = Busy tone
noservice = System Recording: Number not in service
disconnected = System Recording: Number has been disconnected
dtmf = DTMF Test
echo = ECHO Test
none Used to route calls to no action
Examples:
'account:100001_VoIP'
'fwd:1026'
'vm:101'
'none:'
'sys:echo'
"""
method = "orderDIDInternationalNational"
kwargs.update({
"method": method,
"location_id": location_id,
"quantity": quantity,
"routing": routing,
"pop": pop,
"dialtime": dialtime,
"cnam": cnam,
"billing_type": billing_type,
})
return self._voipms_client._get(order(**kwargs))
def order_did_international_toll_free(self, location_id, quantity, routing, pop, dialtime, cnam, billing_type, **kwargs):
"""
Orders and Adds new International TollFree DID Numbers to the Account
:param location_id: [Required] ID for a specific International Location (Values from dids.get_dids_international_geographic)
:type location_id: :py:class:`int`
:param quantity: [Required] Number of dids to be purchased (Example: 2)
:type quantity: :py:class:`int`
:param routing: [Required] Main Routing for the DID
:type routing: :py:class:`str`
:param pop: [Required] Point of Presence for the DID (Example: 5)
:type pop: :py:class:`int`
:param dialtime: [Required] Dial Time Out for the DID (Example: 60 -> in seconds)
:type dialtime: :py:class:`int`
:param cnam: [Required] CNAM for the DID (Boolean: True/False)
:type cnam: :py:class:`bool`
:param billing_type: [Required] Billing type for the DID (1 = Per Minute, 2 = Flat)
:type billing_type: :py:class:`int`
:param **kwargs: All optional parameters
:type **kwargs: :py:class:`dict`
:param failover_busy: Busy Routing for the DID
:type failover_busy: :py:class:`str`
:param failover_unreachable: Unreachable Routing for the DID
:type failover_unreachable: :py:class:`str`
:param failover_noanswer: NoAnswer Routing for the DID
:type failover_noanswer: :py:class:`str`
:param voicemail: Voicemail for the DID (Example: 101)
:type voicemail: :py:class:`int`
:param callerid_prefix: Caller ID Prefix for the DID
:type callerid_prefix: :py:class:`str`
:param note: Note for the DID
:type note: :py:class:`str`
:param account: Reseller Sub Account (Example: '100001_VoIP')
:type account: :py:class:`str`
:param monthly: Monthly Fee for Reseller Client (Example: 3.50)
:type monthly: :py:class:`float`
:param setup: Setup Fee for Reseller Client (Example: 1.99)
:type setup: :py:class:`float`
:param minute: Minute Rate for Reseller Client (Example: 0.03)
:type minute: :py:class:`float`
:param test: Set to True if testing how Orders work
- Orders can not be undone
- When testing, no Orders are made
:type test: :py:class:`bool`
:returns: :py:class:`dict`
routing, failover_busy, failover_unreachable and failover_noanswer
can receive values in the following format => header:record_id
Where header could be: account, fwd, vm, sip, grp, ivr, sys, recording, queue, cb, tc, disa, none.
Examples:
account Used for routing calls to Sub Accounts
You can get all sub accounts using the accounts.get_sub_accounts function
fwd Used for routing calls to Forwarding entries.
You can get the ID right after creating a Forwarding with setForwarding
or by requesting all forwardings entries with getForwardings.
vm Used for routing calls to a Voicemail.
You can get all voicemails and their IDs using the voicemail.get_voicemails function
sys System Options:
hangup = Hangup the Call
busy = Busy tone
noservice = System Recording: Number not in service
disconnected = System Recording: Number has been disconnected
dtmf = DTMF Test
echo = ECHO Test
none Used to route calls to no action
Examples:
'account:100001_VoIP'
'fwd:1026'
'vm:101'
'none:'
'sys:echo'
"""
method = "orderDIDInternationalTollFree"
kwargs.update({
"method": method,
"location_id": location_id,
"quantity": quantity,
"routing": routing,
"pop": pop,
"dialtime": dialtime,
"cnam": cnam,
"billing_type": billing_type,
})
return self._voipms_client._get(order(**kwargs))
def order_did_virtual(self, digits, routing, pop, dialtime, cnam, billing_type, **kwargs):
"""
Orders and Adds a new Virtual DID Number to the Account
:param digits: [Required] Three Digits for the new Virtual DID (Example: 001)
:type digits: :py:class:`int`
:param routing: [Required] Main Routing for the DID
:type routing: :py:class:`str`
:param pop: [Required] Point of Presence for the DID (Example: 5)
:type pop: :py:class:`int`
:param dialtime: [Required] Dial Time Out for the DID (Example: 60 -> in seconds)
:type dialtime: :py:class:`int`
:param cnam: [Required] CNAM for the DID (Boolean: True/False)
:type cnam: :py:class:`bool`
:param billing_type: [Required] Billing type for the DID (1 = Per Minute, 2 = Flat)
:type billing_type: :py:class:`int`
:param **kwargs: All optional parameters
:type **kwargs: :py:class:`dict`
:param failover_busy: Busy Routing for the DID
:type failover_busy: :py:class:`str`
:param failover_unreachable: Unreachable Routing for the DID
:type failover_unreachable: :py:class:`str`
:param failover_noanswer: NoAnswer Routing for the DID
:type failover_noanswer: :py:class:`str`
:param voicemail: Voicemail for the DID (Example: 101)
:type voicemail: :py:class:`int`
:param callerid_prefix: Caller ID Prefix for the DID
:type callerid_prefix: :py:class:`str`
:param note: Note for the DID
:type note: :py:class:`str`
:param account: Reseller Sub Account (Example: '100001_VoIP')
:type account: :py:class:`str`
:param monthly: Monthly Fee for Reseller Client (Example: 3.50)
:type monthly: :py:class:`float`
:param setup: Setup Fee for Reseller Client (Example: 1.99)
:type setup: :py:class:`float`
:param minute: Minute Rate for Reseller Client (Example: 0.03)
:type minute: :py:class:`float`
:param test: Set to True if testing how Orders work
- Orders can not be undone
- When testing, no Orders are made
:type test: :py:class:`bool`
:returns: :py:class:`dict`
routing, failover_busy, failover_unreachable and failover_noanswer
can receive values in the following format => header:record_id
Where header could be: account, fwd, vm, sip, grp, ivr, sys, recording, queue, cb, tc, disa, none.
Examples:
account Used for routing calls to Sub Accounts
You can get all sub accounts using the accounts.get_sub_accounts function
fwd Used for routing calls to Forwarding entries.
You can get the ID right after creating a Forwarding with setForwarding
or by requesting all forwardings entries with getForwardings.
vm Used for routing calls to a Voicemail.
You can get all voicemails and their IDs using the voicemail.get_voicemails function
sys System Options:
hangup = Hangup the Call
busy = Busy tone
noservice = System Recording: Number not in service
disconnected = System Recording: Number has been disconnected
dtmf = DTMF Test
echo = ECHO Test
none Used to route calls to no action
Examples:
'account:100001_VoIP'
'fwd:1026'
'vm:101'
'none:'
'sys:echo'
"""
method = "orderDIDVirtual"
kwargs.update({
"method": method,
"digits": digits,
"routing": routing,
"pop": pop,
"dialtime": dialtime,
"cnam": cnam,
"billing_type": billing_type,
})
return self._voipms_client._get(order(**kwargs))
def order_toll_free(self, did, routing, pop, dialtime, cnam, billing_type, **kwargs):
"""
Orders and Adds a new Toll Free Number to the Account
:param did: [Required] DID to be Ordered (Example: 8772223333)
:type did: :py:class:`int`
:param routing: [Required] Main Routing for the DID
:type routing: :py:class:`str`
:param pop: [Required] Point of Presence for the DID (Example: 5)
:type pop: :py:class:`int`
:param dialtime: [Required] Dial Time Out for the DID (Example: 60 -> in seconds)
:type dialtime: :py:class:`int`
:param cnam: [Required] CNAM for the DID (Boolean: True/False)
:type cnam: :py:class:`bool`
:param billing_type: [Required] Billing type for the DID (1 = Per Minute, 2 = Flat)
:type billing_type: :py:class:`int`
:param **kwargs: All optional parameters
:type **kwargs: :py:class:`dict`
:param failover_busy: Busy Routing for the DID
:type failover_busy: :py:class:`str`
:param failover_unreachable: Unreachable Routing for the DID
:type failover_unreachable: :py:class:`str`
:param failover_noanswer: NoAnswer Routing for the DID
:type failover_noanswer: :py:class:`str`
:param voicemail: Voicemail for the DID (Example: 101)
:type voicemail: :py:class:`int`
:param callerid_prefix: Caller ID Prefix for the DID
:type callerid_prefix: :py:class:`str`
:param note: Note for the DID
:type note: :py:class:`str`
:param account: Reseller Sub Account (Example: '100001_VoIP')
:type account: :py:class:`str`
:param monthly: Monthly Fee for Reseller Client (Example: 3.50)
:type monthly: :py:class:`float`
:param setup: Setup Fee for Reseller Client (Example: 1.99)
:type setup: :py:class:`float`
:param minute: Minute Rate for Reseller Client (Example: 0.03)
:type minute: :py:class:`float`
:param test: Set to True if testing how Orders work
- Orders can not be undone
- When testing, no Orders are made
:type test: :py:class:`bool`
:returns: :py:class:`dict`
routing, failover_busy, failover_unreachable and failover_noanswer
can receive values in the following format => header:record_id
Where header could be: account, fwd, vm, sip, grp, ivr, sys, recording, queue, cb, tc, disa, none.
Examples:
account Used for routing calls to Sub Accounts
You can get all sub accounts using the accounts.get_sub_accounts function
fwd Used for routing calls to Forwarding entries.
You can get the ID right after creating a Forwarding with setForwarding
or by requesting all forwardings entries with getForwardings.
vm Used for routing calls to a Voicemail.
You can get all voicemails and their IDs using the voicemail.get_voicemails function
sys System Options:
hangup = Hangup the Call
busy = Busy tone
noservice = System Recording: Number not in service
disconnected = System Recording: Number has been disconnected
dtmf = DTMF Test
echo = ECHO Test
none Used to route calls to no action
Examples:
'account:100001_VoIP'
'fwd:1026'
'vm:101'
'none:'
'sys:echo'
"""
method = "orderTollFree"
kwargs.update({
"method": method,
"did": did,
"routing": routing,
"pop": pop,
"dialtime": dialtime,
"cnam": cnam,
"billing_type": billing_type,
})
return self._voipms_client._get(order(**kwargs))
def order_vanity(self, did, routing, pop, dialtime, cnam, billing_type, carrier, **kwargs):
"""
Orders and Adds a new Vanity Toll Free Number to the Account
:param did: [Required] DID to be Ordered (Example: 8772223333)
:type did: :py:class:`int`
:param routing: [Required] Main Routing for the DID
:type routing: :py:class:`str`
:param pop: [Required] Point of Presence for the DID (Example: 5)
:type pop: :py:class:`int`
:param dialtime: [Required] Dial Time Out for the DID (Example: 60 -> in seconds)
:type dialtime: :py:class:`int`
:param cnam: [Required] CNAM for the DID (Boolean: True/False)
:type cnam: :py:class:`bool`
:param billing_type: [Required] Billing type for the DID (1 = Per Minute, 2 = Flat)
:type billing_type: :py:class:`int`
:param carrier: [Required] Carrier for the DID (Values from dids.get_carriers)
:type carrier: :py:class:`int`
:param **kwargs: All optional parameters
:type **kwargs: :py:class:`dict`
:param failover_busy: Busy Routing for the DID
:type failover_busy: :py:class:`str`
:param failover_unreachable: Unreachable Routing for the DID
:type failover_unreachable: :py:class:`str`
:param failover_noanswer: NoAnswer Routing for the DID
:type failover_noanswer: :py:class:`str`
:param voicemail: Voicemail for the DID (Example: 101)
:type voicemail: :py:class:`int`
:param callerid_prefix: Caller ID Prefix for the DID
:type callerid_prefix: :py:class:`str`
:param note: Note for the DID
:type note: :py:class:`str`
:param account: Reseller Sub Account (Example: '100001_VoIP')
:type account: :py:class:`str`
:param monthly: Monthly Fee for Reseller Client (Example: 3.50)
:type monthly: :py:class:`float`
:param setup: Setup Fee for Reseller Client (Example: 1.99)
:type setup: :py:class:`float`
:param minute: Minute Rate for Reseller Client (Example: 0.03)
:type minute: :py:class:`float`
:param test: Set to True if testing how Orders work
- Orders can not be undone
- When testing, no Orders are made
:type test: :py:class:`bool`
:returns: :py:class:`dict`
routing, failover_busy, failover_unreachable and failover_noanswer
can receive values in the following format => header:record_id
Where header could be: account, fwd, vm, sip, grp, ivr, sys, recording, queue, cb, tc, disa, none.
Examples:
account Used for routing calls to Sub Accounts
You can get all sub accounts using the accounts.get_sub_accounts function
fwd Used for routing calls to Forwarding entries.
You can get the ID right after creating a Forwarding with setForwarding
or by requesting all forwardings entries with getForwardings.
vm Used for routing calls to a Voicemail.
You can get all voicemails and their IDs using the voicemail.get_voicemails function
sys System Options:
hangup = Hangup the Call
busy = Busy tone
noservice = System Recording: Number not in service
disconnected = System Recording: Number has been disconnected
dtmf = DTMF Test
echo = ECHO Test
none Used to route calls to no action
Examples:
'account:100001_VoIP'
'fwd:1026'
'vm:101'
'none:'
'sys:echo'
"""
method = "orderVanity"
kwargs.update({
"method": method,
"did": did,
"routing": routing,
"pop": pop,
"dialtime": dialtime,
"cnam": cnam,
"billing_type": billing_type,
"carrier": carrier,
})
return self._voipms_client._get(order(**kwargs))
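The docstrings in this module describe routing values (`routing`, `failover_busy`, etc.) as `header:record_id` strings such as `'vm:101'`, `'fwd:1026'`, or `'sys:echo'`. A minimal, hypothetical validator for that format is sketched below; it is not part of python-voipms, just an illustration of the documented string shape:

```python
# Hypothetical helper: check that a routing value matches the
# "header:record_id" format listed in the docstrings above.
# Not part of the python-voipms library.
VALID_HEADERS = {
    "account", "fwd", "vm", "sip", "grp", "ivr", "sys",
    "recording", "queue", "cb", "tc", "disa", "none",
}


def is_valid_routing(value):
    """Return True if value looks like 'header:record_id' with a known header."""
    header, sep, _record_id = value.partition(":")
    return sep == ":" and header in VALID_HEADERS


print(is_valid_routing("vm:101"))    # True
print(is_valid_routing("none:"))     # True ('none' takes an empty record_id)
print(is_valid_routing("hangup"))    # False (missing the colon separator)
```

Such a check could be run client-side before calling `order_did` to fail fast on malformed routing strings instead of waiting for an API error.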
# File: tests/test_base.py | Repo: sashgorokhov-forks/stories | License: BSD-2-Clause
import pytest
import examples
from stories.exceptions import FailureError
def test_empty():
result = examples.methods.Empty().x()
assert result is None
result = examples.methods.Empty().x.run()
assert result.is_success
assert not result.is_failure
assert result.value is None
result = examples.methods.EmptySubstory().y()
assert result is None
result = examples.methods.EmptySubstory().y.run()
assert result.is_success
assert not result.is_failure
assert result.value is None
result = examples.methods.SubstoryDI(examples.methods.Empty().x).y(3)
assert result == 6
result = examples.methods.SubstoryDI(examples.methods.Empty().x).y.run(3)
assert result.is_success
assert not result.is_failure
assert result.value == 6
def test_failure():
# Simple.
with pytest.raises(FailureError) as exc_info:
examples.methods.Simple().x(2, 2)
assert not exc_info.value.reason
result = examples.methods.Simple().x.run(2, 2)
assert not result.is_success
assert result.is_failure
assert not result.failure_reason
assert result.ctx == {"foo": 2, "bar": 2}
assert result.failed_on("two")
assert not result.failed_on("one")
with pytest.raises(AssertionError):
result.value
# Simple substory.
with pytest.raises(FailureError) as exc_info:
examples.methods.SimpleSubstory().y(3)
assert not exc_info.value.reason
result = examples.methods.SimpleSubstory().y.run(3)
assert not result.is_success
assert result.is_failure
assert not result.failure_reason
assert result.ctx == {"foo": 2, "bar": 4, "spam": 3}
assert result.failed_on("two")
assert not result.failed_on("one")
with pytest.raises(AssertionError):
result.value
# Substory DI.
with pytest.raises(FailureError) as exc_info:
examples.methods.SubstoryDI(examples.methods.Simple().x).y(3)
assert not exc_info.value.reason
result = examples.methods.SubstoryDI(examples.methods.Simple().x).y.run(3)
assert not result.is_success
assert result.is_failure
assert not result.failure_reason
assert result.ctx == {"foo": 2, "bar": 4, "spam": 3}
assert result.failed_on("two")
assert not result.failed_on("one")
with pytest.raises(AssertionError):
result.value
def test_result():
result = examples.methods.Simple().x(1, 3)
assert result == -1
result = examples.methods.Simple().x.run(1, 3)
assert result.is_success
assert not result.is_failure
assert not result.failed_on("two")
assert result.value == -1
result = examples.methods.SimpleSubstory().y(2)
assert result == -1
result = examples.methods.SimpleSubstory().y.run(2)
assert result.is_success
assert not result.is_failure
assert not result.failed_on("two")
assert result.value == -1
result = examples.methods.SubstoryDI(examples.methods.Simple().x).y(2)
assert result == -1
result = examples.methods.SubstoryDI(examples.methods.Simple().x).y.run(2)
assert result.is_success
assert not result.is_failure
assert not result.failed_on("two")
assert result.value == -1
def test_skip():
result = examples.methods.Simple().x(1, -1)
assert result is None
result = examples.methods.Simple().x.run(1, -1)
assert result.is_success
assert not result.is_failure
assert result.value is None
result = examples.methods.SimpleSubstory().y(-2)
assert result == -4
result = examples.methods.SimpleSubstory().y.run(-2)
assert result.is_success
assert not result.is_failure
assert result.value == -4
result = examples.methods.SubstoryDI(examples.methods.Simple().x).y(-2)
assert result == -4
result = examples.methods.SubstoryDI(examples.methods.Simple().x).y.run(-2)
assert result.is_success
assert not result.is_failure
assert result.value == -4
result = examples.methods.SubstoryDI(examples.methods.SimpleSubstory().z).y(2)
assert result == 4
result = examples.methods.SubstoryDI(examples.methods.SimpleSubstory().z).y.run(2)
assert result.is_success
assert not result.is_failure
assert result.value == 4
result = examples.methods.SubstoryDI(examples.methods.Pipe().y).y(-2)
assert result == -4
result = examples.methods.SubstoryDI(examples.methods.Pipe().y).y.run(-2)
assert result.is_success
assert not result.is_failure
assert result.value == -4
def test_return_type():
with pytest.raises(AssertionError):
examples.methods.WrongResult().x()
with pytest.raises(AssertionError):
examples.methods.WrongResult().x.run()
def test_attribute_access():
with pytest.raises(AssertionError):
examples.methods.AttributeAccess().x()
with pytest.raises(AssertionError):
examples.methods.AttributeAccess().x.run()
def test_inject_implementation():
result = examples.methods.ImplementationDI(f=lambda arg: arg + 1).x(1)
assert result == 2
result = examples.methods.ImplementationDI(f=lambda arg: arg + 1).x.run(1)
assert result.is_success
assert not result.is_failure
assert result.value == 2
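The `test_inject_implementation` case above exercises a dependency-injection pattern: the story object receives its step implementation as a constructor argument. A minimal, library-free sketch of the same idea (the class and method names here are illustrative stand-ins, not the real `examples.methods` classes):

```python
class ImplementationDI:
    """Accepts the step implementation `f` at construction time."""

    def __init__(self, f):
        self.f = f

    def x(self, arg):
        # Delegate the actual computation to the injected callable.
        return self.f(arg)


result = ImplementationDI(f=lambda arg: arg + 1).x(1)
print(result)  # 2
```

Swapping `f` for a different callable changes the behavior of `x` without touching the class, which is exactly what the test verifies against the real library.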
| 28.216216 | 86 | 0.693487 | 697 | 5,220 | 5.110473 | 0.090387 | 0.193711 | 0.15918 | 0.088433 | 0.922235 | 0.911847 | 0.885177 | 0.832398 | 0.740876 | 0.659742 | 0 | 0.014148 | 0.187548 | 5,220 | 184 | 87 | 28.369565 | 0.825749 | 0.007088 | 0 | 0.604651 | 0 | 0 | 0.010234 | 0 | 0 | 0 | 0 | 0 | 0.612403 | 1 | 0.054264 | false | 0 | 0.023256 | 0 | 0.077519 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d694861f4d647a1f06d76fe7b342f526d0895362 | 206 | py | Python | Crawlers/Tiktok/config.py | BingquLee/CrawlerSchedulerFBV | d4944470733c3639e3c4712769b3eda161895c4a | [
"Apache-2.0"
] | null | null | null | Crawlers/Tiktok/config.py | BingquLee/CrawlerSchedulerFBV | d4944470733c3639e3c4712769b3eda161895c4a | [
"Apache-2.0"
] | null | null | null | Crawlers/Tiktok/config.py | BingquLee/CrawlerSchedulerFBV | d4944470733c3639e3c4712769b3eda161895c4a | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
ACCOUNT_TO_SID = {
'BingquLee': '103b465c87289529651457a0ae58a7e1',
'bingquli': '3d909a93c9b9c887f545628fc4c76665',
'13269704912': 'eb655110c6c0fdf5aa33a5e86797ca9b',
}
| 25.75 | 54 | 0.718447 | 12 | 206 | 12.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.438202 | 0.135922 | 206 | 7 | 55 | 29.428571 | 0.382022 | 0.101942 | 0 | 0 | 0 | 0 | 0.677596 | 0.52459 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d6eda2a784a6a8b449642645a07c348b97f3477b | 163 | py | Python | homework-3/hw3p1/hw3/mc.py | neelpawarcmu/deep-learning-library | 401483fce40e3a025054596cbec368ff4f647661 | [
"MIT"
] | null | null | null | homework-3/hw3p1/hw3/mc.py | neelpawarcmu/deep-learning-library | 401483fce40e3a025054596cbec368ff4f647661 | [
"MIT"
] | null | null | null | homework-3/hw3p1/hw3/mc.py | neelpawarcmu/deep-learning-library | 401483fce40e3a025054596cbec368ff4f647661 | [
"MIT"
] | null | null | null | # You know the drill...
def question_1():
return 'b'
def question_2():
return 'b'
def question_3():
return 'b'
def question_4():
return 'a' | 9.588235 | 23 | 0.582822 | 24 | 163 | 3.791667 | 0.541667 | 0.483516 | 0.32967 | 0.593407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033613 | 0.269939 | 163 | 17 | 24 | 9.588235 | 0.731092 | 0.128834 | 0 | 0.375 | 0 | 0 | 0.028369 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
2403a3619ebdcaa44364dfe531a4796c462ee58e | 1,003 | py | Python | functions.py | philrosenfield/ResolvedStellarPops | ab24083ae5080545165ccf7589d5a22c7989ce75 | [
"BSD-3-Clause"
] | null | null | null | functions.py | philrosenfield/ResolvedStellarPops | ab24083ae5080545165ccf7589d5a22c7989ce75 | [
"BSD-3-Clause"
] | null | null | null | functions.py | philrosenfield/ResolvedStellarPops | ab24083ae5080545165ccf7589d5a22c7989ce75 | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
def gaussian(x, p):
'''
gaussian(arr,p): p[0] = norm, p[1] = mean, p[2]=sigma
'''
return p[0] * np.exp(-1 * (x - p[1]) ** 2 / (2 * p[2] ** 2))
def double_gaussian(x, p):
'''
gaussian(arr,p): p[0] = norm1, p[1] = mean1, p[2]=sigma1
p[3] = norm2, p[4] = mean2, p[5]=sigma2
'''
return gaussian(x, p[:3]) + gaussian(x, p[3:])
def mp_double_gauss(p, fjac=None, x=None, y=None, err=None):
'''
double gaussian for mpfit
'''
model = double_gaussian(x, p)
status = 0
return [status, (y - model) / err]
def mp_gauss(p, fjac=None, x=None, y=None, err=None):
'''
gaussian for mpfit
'''
model = gaussian(x, p)
status = 0
return [status, (y - model) / err]
def diff_gaussian(x, p):
'''
gaussian(arr,p): p[0] = norm1, p[1] = mean1, p[2]=sigma1
p[3] = norm2, p[4] = mean2, p[5]=sigma2
'''
return gaussian(x, p[:3]) - gaussian(x, p[3:])
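As a quick sanity check of the helpers above (restated here so the snippet is self-contained), a gaussian evaluated at its mean returns its norm, and `double_gaussian` is just the sum of two single gaussians; the parameter values are illustrative:

```python
import numpy as np

def gaussian(x, p):
    # p[0] = norm, p[1] = mean, p[2] = sigma
    return p[0] * np.exp(-1 * (x - p[1]) ** 2 / (2 * p[2] ** 2))

def double_gaussian(x, p):
    return gaussian(x, p[:3]) + gaussian(x, p[3:])

x = np.array([-1.0, 2.0])
p = [1.0, -1.0, 0.5, 0.8, 2.0, 1.0]  # norm1, mean1, sigma1, norm2, mean2, sigma2
# At each component's mean, that component contributes its full norm.
print(gaussian(x, p[:3])[0])     # 1.0
print(double_gaussian(x, p)[1])  # ~0.8 (component 2's norm, plus a tiny tail from component 1)
```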
| 22.288889 | 64 | 0.508475 | 162 | 1,003 | 3.111111 | 0.240741 | 0.039683 | 0.178571 | 0.087302 | 0.813492 | 0.813492 | 0.813492 | 0.813492 | 0.765873 | 0.765873 | 0 | 0.055944 | 0.287139 | 1,003 | 44 | 65 | 22.795455 | 0.648951 | 0.338983 | 0 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.066667 | 0 | 0.733333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
2425caeecd4884e6020ec7bc71c821834bea471a | 175 | py | Python | src/simple_playgrounds/playground/playgrounds/__init__.py | embaba/simple-playgrounds | 74225a032cc20ad83ae1ce39811b1fde29e44cc4 | [
"MIT"
] | 12 | 2022-01-13T09:33:49.000Z | 2022-02-10T12:10:51.000Z | src/simple_playgrounds/playground/playgrounds/__init__.py | embaba/simple-playgrounds | 74225a032cc20ad83ae1ce39811b1fde29e44cc4 | [
"MIT"
] | 31 | 2020-07-19T21:47:02.000Z | 2021-11-11T23:09:18.000Z | src/simple_playgrounds/playground/playgrounds/__init__.py | embaba/simple-playgrounds | 74225a032cc20ad83ae1ce39811b1fde29e44cc4 | [
"MIT"
] | 4 | 2020-11-03T17:38:52.000Z | 2021-09-02T12:04:26.000Z | from .rl.basic import *
from simple_playgrounds.playground.playgrounds.demo_playgrounds import *
from simple_playgrounds.playground.playgrounds.profiling_playgrounds import *
| 43.75 | 77 | 0.868571 | 20 | 175 | 7.4 | 0.45 | 0.135135 | 0.216216 | 0.364865 | 0.648649 | 0.648649 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068571 | 175 | 3 | 78 | 58.333333 | 0.907975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
242fafbcde05bb92b4a03b1a3c9b4e2fe61a48f4 | 71,749 | py | Python | xoa_driver/internals/core/commands/p4_commands.py | xenadevel/xena-open-automation-python-api | b17e512aa14eee7c51677004b4c91712005edcd0 | [
"Apache-2.0"
] | 1 | 2022-03-18T17:17:59.000Z | 2022-03-18T17:17:59.000Z | xoa_driver/internals/core/commands/p4_commands.py | xenadevel/xena-open-automation-python-api | b17e512aa14eee7c51677004b4c91712005edcd0 | [
"Apache-2.0"
] | null | null | null | xoa_driver/internals/core/commands/p4_commands.py | xenadevel/xena-open-automation-python-api | b17e512aa14eee7c51677004b4c91712005edcd0 | [
"Apache-2.0"
] | null | null | null | #: L47 Port Commands
from dataclasses import dataclass
import typing
import functools
from ..protocol.command_builders import (
build_get_request,
build_set_request
)
from .. import interfaces
from ..transporter.token import Token
from ..protocol.fields.data_types import *
from ..protocol.fields.field import XmpField
from ..registry import register_command
from .enums import *
@register_command
@dataclass
class P4_TRAFFIC:
"""
Gives a traffic state command to an L47 port.
"""
code: typing.ClassVar[int] = 700
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
traffic_state: XmpField[XmpByte] = XmpField(XmpByte, choices=L47TrafficState) # coded byte, the traffic state command issued to the port.
def set(self, traffic_state: L47TrafficState) -> "Token":
"""Set L47 port traffic state.
:param traffic_state: the traffic state command issued to the port
:type traffic_state: L47TrafficState
"""
return Token(self._connection, build_set_request(self, module=self._module, port=self._port, traffic_state=traffic_state))
set_off = functools.partialmethod(set, L47TrafficState.OFF)
"""Set L47 port traffic state to Off."""
set_on = functools.partialmethod(set, L47TrafficState.ON)
"""Set L47 port traffic state to On."""
set_stop = functools.partialmethod(set, L47TrafficState.STOP)
"""Set L47 port traffic state to Stop."""
set_prepare = functools.partialmethod(set, L47TrafficState.PREPARE)
"""Set L47 port traffic state to Prepare."""
set_prerun = functools.partialmethod(set, L47TrafficState.PRERUN)
"""Set L47 port traffic state to Prerun."""
@register_command
@dataclass
class P4_STATE:
"""
Display the current state of the L47 port.
"""
code: typing.ClassVar[int] = 701
pushed: typing.ClassVar[bool] = True
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
state: XmpField[XmpByte] = XmpField(XmpByte, choices=L47PortState) # coded byte, specifying the current state for this port.
def get(self) -> "Token[GetDataAttr]":
"""Get the current state of the L47 port.
:return: the current state of the L47 port
:rtype: P4_STATE.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_CAPABILITIES:
"""
Report the speeds supported by the L47 port.
"""
code: typing.ClassVar[int] = 702
pushed: typing.ClassVar[bool] = True
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
auto: XmpField[XmpByte] = XmpField(XmpByte) # byte, autoneg supported
N100_mbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 100M speed supported
N1_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 1G speed supported
N2_5_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 2.5G speed supported
N5_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 5G speed supported
N10_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 10G speed supported
N25_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 25G speed supported
N40_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 40G speed supported
N50_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 50G speed supported
N100_gbps: XmpField[XmpByte] = XmpField(XmpByte) # byte, 100G speed supported
def get(self) -> "Token[GetDataAttr]":
"""Get the speeds supported by the L47 port.
:return: the speeds supported by the L47 port
:rtype: P4_CAPABILITIES.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_STATE_STATUS:
"""
Returns status of the last port state change. If the port state has changed to
PREPARE_FAIL, the status contains information about the reason for the failure.
Currently the status will be "OK" in all other states.
"""
code: typing.ClassVar[int] = 703
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
status: XmpField[XmpStr] = XmpField(XmpStr) # string, status for the last port state change
def get(self) -> "Token[GetDataAttr]":
"""Get status of the last port state change.
:return: status of the last port state change
:rtype: P4_STATE_STATUS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_VLAN_OFFLOAD:
"""
Specifies if 802.1Q VLAN tag should be inserted and stripped by the Ethernet
device. If VLAN Offload is switched ON, VLAN tags will not be present in frames
captured by the L47 Server.
"""
code: typing.ClassVar[int] = 704
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
offload: XmpField[XmpByte] = XmpField(XmpByte, choices=OnOff) # coded byte, specifies if VLAN Offload is switched ON
@dataclass(frozen=True)
class GetDataAttr:
offload: XmpField[XmpByte] = XmpField(XmpByte, choices=OnOff) # coded byte, specifies if VLAN Offload is switched ON
def get(self) -> "Token[GetDataAttr]":
"""Get the VLAN offload status.
:return: VLAN offload status
:rtype: P4_VLAN_OFFLOAD.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
def set(self, offload: OnOff) -> "Token":
"""Set the VLAN offload state.
:param offload: specifies if VLAN Offload is enabled
:type offload: OnOff
"""
return Token(self._connection, build_set_request(self, module=self._module, port=self._port, offload=offload))
set_off = functools.partialmethod(set, OnOff.OFF)
"""Disable VLAN offload."""
set_on = functools.partialmethod(set, OnOff.ON)
"""Enable VLAN offload."""
@register_command
@dataclass
class P4_ARP_CONFIG:
"""
Configure the value of the ARP request transmission rate, retransmission timeout
and max. retries.
"""
code: typing.ClassVar[int] = 705
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
rate: XmpField[XmpInt] = XmpField(XmpInt) # integer, ARP Request transmission rate (requests / sec) - must be larger than 0
retrans_timeout: XmpField[XmpInt] = XmpField(XmpInt) # integer, ARP Request retransmission timeout [ms] - must be larger than 0
retries: XmpField[XmpByte] = XmpField(XmpByte) # byte, maximum ARP Request retransmission retries
@dataclass(frozen=True)
class GetDataAttr:
rate: XmpField[XmpInt] = XmpField(XmpInt) # integer, ARP Request transmission rate (requests / sec) - must be larger than 0
retrans_timeout: XmpField[XmpInt] = XmpField(XmpInt) # integer, ARP Request retransmission timeout [ms] - must be larger than 0
retries: XmpField[XmpByte] = XmpField(XmpByte) # byte, maximum ARP Request retransmission retries
def get(self) -> "Token[GetDataAttr]":
"""Get the ARP configuration on the port.
:return: the ARP configuration on the port
:rtype: P4_ARP_CONFIG.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
def set(self, rate: int, retrans_timeout: int, retries: int) -> "Token":
"""Set the ARP configuration on the port.
:param rate: ARP Request transmission rate (requests/sec) - must be larger than 0
:type rate: int
:param retrans_timeout: ARP Request retransmission timeout [ms] - must be larger than 0
:type retrans_timeout: int
:param retries: maximum ARP Request retransmission retries
:type retries: int
"""
return Token(self._connection, build_set_request(self, module=self._module, port=self._port, rate=rate, retrans_timeout=retrans_timeout, retries=retries))
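The three parameters interact: one resolution attempt can block for roughly one timeout per try before giving up. A back-of-the-envelope helper for the worst-case wait (purely illustrative, and it assumes the initial request waits the same `retrans_timeout` as each retry, which the command description does not spell out):

```python
def worst_case_arp_wait_ms(retrans_timeout: int, retries: int) -> int:
    """Upper bound on the wait for one ARP resolution attempt:
    the initial request plus `retries` retransmissions, each waiting
    `retrans_timeout` ms before the next try (assumed semantics)."""
    return retrans_timeout * (retries + 1)


print(worst_case_arp_wait_ms(500, 3))  # 2000
```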
@register_command
@dataclass
class P4_NDP_CONFIG:
"""
Configure the value of the NDP Neighbor Solicitation transmission rate,
retransmission timeout and max. retries.
"""
code: typing.ClassVar[int] = 706
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
rate: XmpField[XmpInt] = XmpField(XmpInt) # integer, NDP Neighbor Solicitation transmission rate (requests / sec) - must be larger than 0
retrans_timeout: XmpField[XmpInt] = XmpField(XmpInt) # integer, NDP Neighbor Solicitation retransmission timeout [ms] - must be larger than 0
retries: XmpField[XmpByte] = XmpField(XmpByte) # byte, Max. NDP Neighbor Solicitation retransmission retries
@dataclass(frozen=True)
class GetDataAttr:
rate: XmpField[XmpInt] = XmpField(XmpInt) # integer, NDP Neighbor Solicitation transmission rate (requests / sec) - must be larger than 0
retrans_timeout: XmpField[XmpInt] = XmpField(XmpInt) # integer, NDP Neighbor Solicitation retransmission timeout [ms] - must be larger than 0
retries: XmpField[XmpByte] = XmpField(XmpByte) # byte, Max. NDP Neighbor Solicitation retransmission retries
def get(self) -> "Token[GetDataAttr]":
"""Get the NDP configuration on the port.
:return: the NDP configuration on the port
:rtype: P4_NDP_CONFIG.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
def set(self, rate: int, retrans_timeout: int, retries: int) -> "Token":
"""Set the NDP configuration on the port.
:param rate: NDP Neighbor Solicitation transmission rate (requests/sec) - must be larger than 0
:type rate: int
:param retrans_timeout: NDP Neighbor Solicitation retransmission timeout [ms] - must be larger than 0
:type retrans_timeout: int
:param retries: maximum NDP Neighbor Solicitation retransmission retries
:type retries: int
"""
return Token(self._connection, build_set_request(self, module=self._module, port=self._port, rate=rate, retrans_timeout=retrans_timeout, retries=retries))
@register_command
@dataclass
class P4_CAPTURE:
"""
Starts or stops packet capture on this port.
"""
code: typing.ClassVar[int] = 707
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
on_off: XmpField[XmpByte] = XmpField(XmpByte, choices=OnOff) # coded byte, specifying whether to capture traffic on this port
@dataclass(frozen=True)
class GetDataAttr:
on_off: XmpField[XmpByte] = XmpField(XmpByte, choices=OnOff) # coded byte, specifying whether to capture traffic on this port
def get(self) -> "Token[GetDataAttr]":
"""Get packet capture state on this port.
:return: packet capture state on this port
:rtype: P4_CAPTURE.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
def set(self, on_off: OnOff) -> "Token":
"""Set packet capture state on this port.
:param on_off: specifying whether to capture traffic on this port
:type on_off: OnOff
"""
return Token(self._connection, build_set_request(self, module=self._module, port=self._port, on_off=on_off))
set_off = functools.partialmethod(set, OnOff.OFF)
"""Stop packet capture on this port."""
set_on = functools.partialmethod(set, OnOff.ON)
"""Start packet capture on this port."""
@register_command
@dataclass
class P4_CAPTURE_GET_FIRST:
"""
Returns the first captured frame on the port. This command is only valid when the
port is in state STOPPED.
"""
code: typing.ClassVar[int] = 708
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
index: XmpField[XmpInt] = XmpField(XmpInt) # integer, index of frame returned
second: XmpField[XmpInt] = XmpField(XmpInt) # integer, second value of frame capture timestamp
microsecond: XmpField[XmpInt] = XmpField(XmpInt) # integer, microsecond value of frame capture timestamp
capture_length: XmpField[XmpInt] = XmpField(XmpInt) # integer, length of captured portion of the frame
frame_length: XmpField[XmpInt] = XmpField(XmpInt) # integer, length of the frame
frame: XmpField[XmpHexList] = XmpField(XmpHexList) # list of hex bytes, the captured frame (capture_len bytes)
def get(self) -> "Token[GetDataAttr]":
"""Get the first captured frame on the port
:return: the first captured frame on the port
:rtype: P4_CAPTURE_GET_FIRST.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_CAPTURE_GET_NEXT:
"""
Returns the next captured frame on the port. This command is only valid when the
port is in state STOPPED.
"""
code: typing.ClassVar[int] = 709
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
index: XmpField[XmpInt] = XmpField(XmpInt) # integer, index of frame returned
second: XmpField[XmpInt] = XmpField(XmpInt) # integer, second value of frame capture timestamp
microsecond: XmpField[XmpInt] = XmpField(XmpInt) # integer, microsecond value of frame capture timestamp
capture_length: XmpField[XmpInt] = XmpField(XmpInt) # integer, length of captured portion of the frame
frame_length: XmpField[XmpInt] = XmpField(XmpInt) # integer, length of the frame
frame: XmpField[XmpHexList] = XmpField(XmpHexList) # hexdata, the captured frame (capture_len bytes)
def get(self) -> "Token[GetDataAttr]":
"""Get the next captured frame on the port
:return: the next captured frame on the port
:rtype: P4_CAPTURE_GET_NEXT.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
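P4_CAPTURE_GET_FIRST and P4_CAPTURE_GET_NEXT together suggest cursor-style iteration over the capture buffer: fetch the first frame, then repeatedly fetch the next one until the buffer is exhausted. A library-free sketch of that access pattern (the buffer class and its return-`None`-when-exhausted behavior are assumptions for illustration, not the real server semantics):

```python
class MockCaptureBuffer:
    """Mimics get-first / get-next access to a list of captured frames."""

    def __init__(self, frames):
        self._frames = list(frames)
        self._cursor = 0

    def get_first(self):
        self._cursor = 0
        return self.get_next()

    def get_next(self):
        # Return None once the buffer is exhausted (assumed convention).
        if self._cursor >= len(self._frames):
            return None
        frame = self._frames[self._cursor]
        self._cursor += 1
        return frame


buf = MockCaptureBuffer([b"\x01\x02", b"\x03\x04", b"\x05"])
frames = []
frame = buf.get_first()
while frame is not None:
    frames.append(frame)
    frame = buf.get_next()
print(len(frames))  # 3
```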
@register_command
@dataclass
class P4_ETH_TX_COUNTERS:
"""
Return total port Ethernet transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 710
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
bits_per_sec: XmpField[XmpLong] = XmpField(XmpLong) # long integer, bit/second of (layer 2) bytes transmitted
packets_per_sec: XmpField[XmpLong] = XmpField(XmpLong) # long integer, packets/second of packets transmitted
byte_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of (layer 2) bytes transmitted
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of packets transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total port Ethernet transmit statistics since last clear.
:return: total port Ethernet transmit statistics since last clear.
:rtype: P4_ETH_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_ETH_RX_COUNTERS:
"""
Return total port Ethernet receive statistics since last clear.
"""
code: typing.ClassVar[int] = 711
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
bits_per_sec: XmpField[XmpLong] = XmpField(XmpLong) # long integer, bit/second of (layer 2) bytes received
packets_per_sec: XmpField[XmpLong] = XmpField(XmpLong) # long integer, packets/second of packets received
byte_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of (layer 2) bytes received
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of packets received
def get(self) -> "Token[GetDataAttr]":
"""Get total port Ethernet receive statistics since last clear.
:return: total port Ethernet receive statistics since last clear.
:rtype: P4_ETH_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_PORT_TX_COUNTERS:
"""
Return total port transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 712
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
vlan_packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of 802.1Q VLAN tagged packets transmitted
bits_per_sec: XmpField[XmpLong] = XmpField(XmpLong) # long integer, bit/second of (layer 1) bits transmitted.
byte_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of (layer 1) bytes received.
def get(self) -> "Token[GetDataAttr]":
"""Get total port transmit statistics since last clear.
:return: total port transmit statistics since last clear.
:rtype: P4_PORT_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_PORT_RX_COUNTERS:
"""
Return total port receive statistics since last clear.
"""
code: typing.ClassVar[int] = 713
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
vlan_packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of 802.1Q VLAN tagged packets received
bits_per_sec: XmpField[XmpLong] = XmpField(XmpLong) # long integer, bit/second of (layer 1) bits received.
byte_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of (layer 1) bytes received.
def get(self) -> "Token[GetDataAttr]":
"""Get total port receive statistics since last clear.
:return: total port receive statistics since last clear.
:rtype: P4_PORT_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_PORT_COUNTERS:
"""
Return total port transmit error statistics since last clear.
"""
code: typing.ClassVar[int] = 714
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
invalid_eth_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of invalid (e.g. short) Ethernet packets received
unknown_eth_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of unknown or unsupported Ethernet packets received
mismatch_vlan_error_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of packets with mismatching vlan info received
pkt_rate_limit_count: XmpField[XmpLong] = XmpField(
XmpLong
) # long integer, number of times that number of packets transmitted has been limited by the maximum packet rate limiter.
def get(self) -> "Token[GetDataAttr]":
"""Get total port transmit error statistics since last clear.
:return: total port transmit error statistics since last clear.
:rtype: P4_PORT_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_TX_PACKET_SIZE:
"""
Return a histogram over transmitted (layer 2) packet sizes in 100-byte intervals.
"""
code: typing.ClassVar[int] = 715
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
bin_00: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_01: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_02: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_03: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_04: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_05: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_06: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_07: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_08: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_09: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_10: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_11: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_12: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_13: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_14: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_15: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
def get(self) -> "Token[GetDataAttr]":
"""Get a histogram over transmitted (layer 2) packets sizes in 100 bytes intervals.
:return: histogram over transmitted (layer 2) packets sizes in 100 bytes intervals.
:rtype: P4_TX_PACKET_SIZE.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_RX_PACKET_SIZE:
"""
Return a histogram over received (layer 2) packet sizes in 100-byte intervals.
"""
code: typing.ClassVar[int] = 716
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
bin_00: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_01: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_02: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_03: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_04: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_05: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_06: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_07: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_08: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_09: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_10: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_11: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_12: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_13: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_14: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
bin_15: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of packets received with a (layer 2) size in the given interval.
def get(self) -> "Token[GetDataAttr]":
"""Get a histogram over received (layer 2) packets sizes in 100 bytes intervals.
:return: a histogram over received (layer 2) packets sizes in 100 bytes intervals.
:rtype: P4_RX_PACKET_SIZE.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
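The bin fields above are plain counters, one per 100-byte size interval. As a minimal standalone sketch of how such bins are usually indexed (`size_to_bin_index` is illustrative and not part of this module; it assumes bin *i* covers sizes `[i*100 .. i*100+99]`, which may differ from the exact boundaries used by P4_RX_PACKET_SIZE):

```python
def size_to_bin_index(packet_size: int, bin_width: int = 100) -> int:
    """Map a layer-2 packet size to a histogram bin index.

    Assumes bin i covers sizes [i * bin_width .. (i + 1) * bin_width - 1];
    the actual boundaries used by the command may differ.
    """
    if packet_size < 0:
        raise ValueError("packet size must be non-negative")
    return packet_size // bin_width
```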
@register_command
@dataclass
class P4_TX_MTU:
"""
Return a histogram over transmitted (layer 3) packet sizes in 1-byte intervals.
Each bin represents a packet size in the interval [576..1500] bytes.
"""
code: typing.ClassVar[int] = 717
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
bins: XmpField[XmpByteList] = XmpField(XmpByteList) # 925 x byte, '1' if any packets were transmitted with the specified layer 3 size, otherwise '0'.
def get(self) -> "Token[GetDataAttr]":
"""Get histogram over transmitted (layer 3) packets sizes in 1 byte intervals.
:return: histogram over transmitted (layer 3) packets sizes in 1 byte intervals.
:rtype: P4_TX_MTU.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
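The 925-byte bitmap returned by P4_TX_MTU (and P4_RX_MTU below) maps index 0 to layer-3 size 576 and index 924 to size 1500. A small helper to expand it into the observed sizes (`bins_to_sizes` is an illustrative sketch, not part of this module):

```python
MTU_BIN_BASE = 576   # first layer-3 size covered by the histogram
MTU_BIN_COUNT = 925  # one bin per size in [576..1500]


def bins_to_sizes(bins: list) -> list:
    """Expand the P4_TX_MTU / P4_RX_MTU bitmap into observed layer-3 sizes.

    bins[i] is 1 if any packet of size MTU_BIN_BASE + i was seen, else 0.
    """
    return [MTU_BIN_BASE + i for i, flag in enumerate(bins) if flag]
```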
@register_command
@dataclass
class P4_RX_MTU:
"""
Return a histogram over received (layer 3) packet sizes in 1-byte intervals. Each
bin represents a packet size in the interval [576..1500] bytes.
"""
code: typing.ClassVar[int] = 718
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
bins: XmpField[XmpByteList] = XmpField(XmpByteList) # 925 x byte, '1' if any packets were received with the specified layer 3 size, otherwise '0'.
def get(self) -> "Token[GetDataAttr]":
"""Get histogram over received (layer 3) packets sizes in 1 byte intervals.
:return: histogram over received (layer 3) packets sizes in 1 byte intervals.
:rtype: P4_RX_MTU.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_IPV4_RX_COUNTERS:
"""
Return total Port IPv4 protocol receive statistics since last clear.
"""
code: typing.ClassVar[int] = 719
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv4 packets received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port IPv4 protocol receive statistics since last clear.
:return: total Port IPv4 protocol receive statistics since last clear.
:rtype: P4_IPV4_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
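All the P4_*_COUNTERS commands below return the same current_time/ref_time pair; their difference is the elapsed traffic time in milliseconds, from which an average rate can be derived. A hedged arithmetic sketch based on the field comments above (`average_rate_pps` is a hypothetical helper, not part of this module):

```python
def average_rate_pps(packet_count: int, current_time_ms: int, ref_time_ms: int) -> float:
    """Average packets per second since traffic was turned on.

    current_time_ms and ref_time_ms correspond to the current_time and
    ref_time fields of the P4_*_COUNTERS replies (both in milliseconds).
    Returns 0.0 when no time has elapsed.
    """
    elapsed_ms = current_time_ms - ref_time_ms
    if elapsed_ms <= 0:
        return 0.0
    return packet_count * 1000.0 / elapsed_ms
```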
@register_command
@dataclass
class P4_IPV4_TX_COUNTERS:
"""
Return total Port IPv4 protocol transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 720
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv4 packets transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total Port IPv4 protocol transmit statistics since last clear.
:return: total Port IPv4 protocol transmit statistics since last clear.
:rtype: P4_IPV4_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_IPV4_COUNTERS:
"""
Return total Port IPv4 protocol error statistics since last clear.
"""
code: typing.ClassVar[int] = 721
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
checksum_error_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv4 packets with an IP header checksum error
invalid_packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv4 packets which are malformed
unknown_packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv4 packets with unknown protocol
def get(self) -> "Token[GetDataAttr]":
"""Get total Port IPv4 protocol error statistics since last clear.
:return: total Port IPv4 protocol error statistics since last clear.
:rtype: P4_IPV4_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_IPV6_RX_COUNTERS:
"""
Return total Port IPv6 protocol receive statistics since last clear.
"""
code: typing.ClassVar[int] = 722
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv6 packets received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port IPv6 protocol receive statistics since last clear.
:return: total Port IPv6 protocol receive statistics since last clear.
:rtype: P4_IPV6_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_IPV6_TX_COUNTERS:
"""
Return total Port IPv6 protocol transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 723
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv6 packets transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total Port IPv6 protocol transmit statistics since last clear.
:return: total Port IPv6 protocol transmit statistics since last clear.
:rtype: P4_IPV6_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_IPV6_COUNTERS:
"""
Return total Port IPv6 protocol error statistics since last clear.
"""
code: typing.ClassVar[int] = 724
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
invalid_packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv6 packets which are malformed
unknown_packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of IPv6 packets with unknown protocol
def get(self) -> "Token[GetDataAttr]":
"""Get total Port IPv6 protocol error statistics since last clear.
:return: total Port IPv6 protocol error statistics since last clear.
:rtype: P4_IPV6_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_ARP_RX_COUNTERS:
"""
Return total Port ARP protocol receive statistics since last clear.
"""
code: typing.ClassVar[int] = 725
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
arp_request_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ARP Requests received
arp_reply_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ARP Replies received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port ARP protocol receive statistics since last clear.
:return: total Port ARP protocol receive statistics since last clear.
:rtype: P4_ARP_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_ARP_TX_COUNTERS:
"""
Return total Port ARP protocol transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 726
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
arp_request_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ARP Requests transmitted
arp_reply_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ARP Replies transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total Port ARP protocol transmit statistics since last clear.
:return: total Port ARP protocol transmit statistics since last clear.
:rtype: P4_ARP_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_ARP_COUNTERS:
"""
Return total Port ARP protocol error statistics since last clear.
"""
code: typing.ClassVar[int] = 727
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
invalid_arp_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of invalid ARP packets received
arp_request_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of ARP requests received that could not be resolved
arp_reply_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of ARP replies received that could not be resolved
arp_request_retrans_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of retransmitted ARP requests
arp_resolved_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of correctly resolved IP addresses
arp_failed_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of IP addresses that could not be resolved
arp_table_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of destination IP addresses not found in the ARP table
def get(self) -> "Token[GetDataAttr]":
"""Get total Port ARP protocol error statistics since last clear.
:return: total Port ARP protocol error statistics since last clear.
:rtype: P4_ARP_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
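The arp_resolved_count / arp_failed_count pair above (and the NDP equivalents below) lend themselves to a simple success ratio. A minimal sketch under that interpretation (`resolution_success_ratio` is illustrative, not part of this module):

```python
def resolution_success_ratio(resolved: int, failed: int) -> float:
    """Fraction of address resolutions that succeeded.

    resolved and failed correspond to arp_resolved_count / arp_failed_count
    (or the ndp_* counterparts). Returns 0.0 when nothing was attempted.
    """
    attempts = resolved + failed
    return resolved / attempts if attempts else 0.0
```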
@register_command
@dataclass
class P4_NDP_RX_COUNTERS:
"""
Return total Port NDP protocol receive statistics since last clear.
"""
code: typing.ClassVar[int] = 728
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
ndp_request_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of NDP Requests received
ndp_reply_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of NDP Replies received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port NDP protocol receive statistics since last clear.
:return: total Port NDP protocol receive statistics since last clear.
:rtype: P4_NDP_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_NDP_TX_COUNTERS:
"""
Return total Port NDP protocol transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 729
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
ndp_request_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of NDP Requests transmitted
ndp_reply_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of NDP Replies transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total Port NDP protocol transmit statistics since last clear.
:return: total Port NDP protocol transmit statistics since last clear.
:rtype: P4_NDP_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_NDP_COUNTERS:
"""
Return total Port NDP protocol error statistics since last clear.
"""
code: typing.ClassVar[int] = 730
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
invalid_ndp_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of invalid NDP packets received
ndp_request_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of NDP requests received that could not be resolved
ndp_reply_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of NDP replies received that could not be resolved
ndp_request_retrans_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of retransmitted NDP requests
ndp_resolved_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of correctly resolved IP addresses
ndp_failed_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of IP addresses that could not be resolved
ndp_table_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of destination IP addresses not found in the NDP table
def get(self) -> "Token[GetDataAttr]":
"""Get total Port NDP protocol error statistics since last clear.
:return: total Port NDP protocol error statistics since last clear.
:rtype: P4_NDP_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_ICMP_RX_COUNTERS:
"""
Return total Port ICMP protocol receive statistics since last clear.
"""
code: typing.ClassVar[int] = 731
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
icmp_echo_reqest_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Echo requests received
icmp_echo_reply_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Echo replies received
icmp_dest_unknown_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Destination unknown received
icmp_time_excessive_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Time exceeded received
icmpv6_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMPv6 packets received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port ICMP protocol receive statistics since last clear.
:return: total Port ICMP protocol receive statistics since last clear.
:rtype: P4_ICMP_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_ICMP_TX_COUNTERS:
"""
Return total Port ICMP protocol transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 732
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
icmp_echo_reqest_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Echo requests transmitted
icmp_echo_reply_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Echo replies transmitted
icmp_dest_unknown_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Destination unknown transmitted
icmp_time_excessive_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMP Time exceeded transmitted
icmpv6_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of ICMPv6 packets transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total Port ICMP protocol transmit statistics since last clear.
:return: total Port ICMP protocol transmit statistics since last clear.
:rtype: P4_ICMP_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_ICMP_COUNTERS:
"""
Return total Port ICMP protocol error statistics since last clear.
"""
code: typing.ClassVar[int] = 733
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
invalid_icmp_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of invalid ICMP packets received
unknown_icmp_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of unknown or unsupported ICMP packets received
invalid_icmpv6_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of invalid ICMPv6 packets received
unknown_icmpv6_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of unknown or unsupported ICMPv6 packets received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port ICMP protocol error statistics since last clear.
:return: total Port ICMP protocol error statistics since last clear.
:rtype: P4_ICMP_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_TCP_RX_COUNTERS:
"""
Return total Port TCP protocol receive statistics since last clear.
"""
code: typing.ClassVar[int] = 734
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of TCP packets received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port TCP protocol receive statistics since last clear.
:return: total Port TCP protocol receive statistics since last clear.
:rtype: P4_TCP_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_TCP_TX_COUNTERS:
"""
Return total Port TCP protocol transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 735
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of TCP packets transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total Port TCP protocol transmit statistics since last clear.
:return: total Port TCP protocol transmit statistics since last clear.
:rtype: P4_TCP_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_TCP_COUNTERS:
"""
Return total Port TCP protocol error statistics since last clear.
"""
code: typing.ClassVar[int] = 736
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
checksum_error_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of TCP packets with a TCP header checksum error
invalid_tcp_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of TCP packets which are malformed
tcp_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of TCP packets received that could not be resolved
def get(self) -> "Token[GetDataAttr]":
"""Get total Port TCP protocol error statistics since last clear.
:return: total Port TCP protocol error statistics since last clear.
:rtype: P4_TCP_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_UDP_RX_COUNTERS:
"""
Return total Port UDP protocol receive statistics since last clear.
"""
code: typing.ClassVar[int] = 737
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of UDP packets received
def get(self) -> "Token[GetDataAttr]":
"""Get total Port UDP protocol receive statistics since last clear.
:return: total Port UDP protocol receive statistics since last clear.
:rtype: P4_UDP_RX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_UDP_TX_COUNTERS:
"""
Return total Port UDP protocol transmit statistics since last clear.
"""
code: typing.ClassVar[int] = 738
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
packet_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of UDP packets transmitted
def get(self) -> "Token[GetDataAttr]":
"""Get total Port UDP protocol transmit statistics since last clear.
:return: total Port UDP protocol transmit statistics since last clear.
:rtype: P4_UDP_TX_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_UDP_COUNTERS:
"""
Return total Port UDP protocol error statistics since last clear.
"""
code: typing.ClassVar[int] = 739
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
checksum_error_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of UDP packets with a UDP header checksum error
invalid_udp_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, total number of UDP packets which are malformed
udp_lookup_failure_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, number of UDP packets received that could not be resolved
def get(self) -> "Token[GetDataAttr]":
"""Get total Port UDP protocol error statistics since last clear.
:return: total Port UDP protocol error statistics since last clear.
:rtype: P4_UDP_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_CLEAR_COUNTERS:
"""
Clears all run-time port counters.
"""
code: typing.ClassVar[int] = 740
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
pass
def set(self) -> "Token":
"""Clears all run-time port counters.
"""
return Token(
self._connection,
build_set_request(
self,
module=self._module,
port=self._port,
),
)
@register_command
@dataclass
class P4_ETH_COUNTERS:
"""
Return total port Ethernet statistics since last clear.
"""
code: typing.ClassVar[int] = 765
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
current_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, the current time (mSec since module restart)
ref_time: XmpField[XmpLong] = XmpField(XmpLong) # long integer, reference time (mSec for P4_TRAFFIC on)
tx_error_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, TX errors
rx_error_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, RX errors
rx_packet_lost_count: XmpField[XmpLong] = XmpField(XmpLong) # long integer, packets lost by the Ethernet driver due to RX queue overflow
def get(self) -> "Token[GetDataAttr]":
"""Get total port Ethernet statistics since last clear.
:return: total port Ethernet statistics since last clear.
:rtype: P4_ETH_COUNTERS.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_CLEAR:
"""
Set the Port State to OFF and delete all configured Connection Groups for the port.
"""
code: typing.ClassVar[int] = 766
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
pass
def set(self) -> "Token":
"""Set the Port State to OFF and delete all configured Connection Groups for the port.
"""
return Token(
self._connection,
build_set_request(
self,
module=self._module,
port=self._port,
),
)
@register_command
@dataclass
class P4_SPEEDSELECTION:
"""
Sets the port speed. The selected speed must be one of the speeds supported by
the port, which can be retrieved with :class:`~xoa_driver.internals.core.commands.p4_commands.P4_CAPABILITIES`.
"""
code: typing.ClassVar[int] = 767
pushed: typing.ClassVar[bool] = True
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
speed: XmpField[XmpByte] = XmpField(XmpByte, choices=L47PortSpeed) # coded byte, specifies the speed of the port
@dataclass(frozen=True)
class GetDataAttr:
speed: XmpField[XmpByte] = XmpField(XmpByte, choices=L47PortSpeed) # coded byte, specifies the speed of the port
def get(self) -> "Token[GetDataAttr]":
"""Get the port speed mode.
:return: the port speed mode.
:rtype: P4_SPEEDSELECTION.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
def set(self, speed: L47PortSpeed) -> "Token":
"""Set the port speed mode.
:param speed: specifies the speed mode of the port
:type speed: L47PortSpeed
"""
return Token(self._connection, build_set_request(self, module=self._module, port=self._port, speed=speed))
set_auto = functools.partialmethod(set, L47PortSpeed.AUTO)
"""Set the port speed mode to Auto."""
set_f100m = functools.partialmethod(set, L47PortSpeed.F100M)
"""Set the port speed mode to 100 Mbit/s."""
set_f1g = functools.partialmethod(set, L47PortSpeed.F1G)
"""Set the port speed mode to 1 Gbit/s."""
set_f2_5g = functools.partialmethod(set, L47PortSpeed.F2_5G)
"""Set the port speed mode to 2.5 Gbit/s."""
set_f5g = functools.partialmethod(set, L47PortSpeed.F5G)
"""Set the port speed mode to 5 Gbit/s."""
set_f10g = functools.partialmethod(set, L47PortSpeed.F10G)
"""Set the port speed mode to 10 Gbit/s."""
set_f25g = functools.partialmethod(set, L47PortSpeed.F25G)
"""Set the port speed mode to 25 Gbit/s."""
set_f40g = functools.partialmethod(set, L47PortSpeed.F40G)
"""Set the port speed mode to 40 Gbit/s."""
set_f50g = functools.partialmethod(set, L47PortSpeed.F50G)
"""Set the port speed mode to 50 Gbit/s."""
set_f100g = functools.partialmethod(set, L47PortSpeed.F100G)
"""Set the port speed mode to 100 Gbit/s."""
@register_command
@dataclass
class P4_MAX_PACKET_RATE:
"""
Specifies the maximum number of packets per second allowed to be transmitted on the port.
"""
code: typing.ClassVar[int] = 950
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class SetDataAttr:
mode: XmpField[XmpByte] = XmpField(XmpByte, choices=AutoOrManual) # coded byte, specifies the mode of the max. pps mechanism
rate: XmpField[XmpInt] = XmpField(XmpInt) # integer, maximum number of packets per second to transmit on this port
time_window: XmpField[XmpInt] = XmpField(XmpInt) # integer, time window [us] to measure the pps rate
@dataclass(frozen=True)
class GetDataAttr:
mode: XmpField[XmpByte] = XmpField(XmpByte, choices=AutoOrManual) # coded byte, specifies the mode of the max. pps mechanism
rate: XmpField[XmpInt] = XmpField(XmpInt) # integer, maximum number of packets per second to transmit on this port
time_window: XmpField[XmpInt] = XmpField(XmpInt) # integer, time window [us] to measure the pps rate
def get(self) -> "Token[GetDataAttr]":
"""Get the maximum number of packets per second allowed to be transmitted on the port.
:return: the maximum number of packets per second allowed to be transmitted on the port.
:rtype: P4_MAX_PACKET_RATE.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
def set(self, mode: AutoOrManual, rate: int, time_window: int) -> "Token":
"""Set the maximum number of packets per second allowed to be transmitted on the port.
:param mode: specifies the mode of the max. pps mechanism
:type mode: AutoOrManual
:param rate: maximum number of packets per second to transmit on this port
:type rate: int
:param time_window: time window [us] to measure the pps rate
:type time_window: int
"""
return Token(self._connection, build_set_request(self, module=self._module, port=self._port, mode=mode, rate=rate, time_window=time_window))
set_automatic = functools.partialmethod(set, AutoOrManual.AUTOMATIC)
"""Set port max packet rate mode to Automatic."""
set_manual = functools.partialmethod(set, AutoOrManual.MANUAL)
"""Set port max packet rate mode to Manual."""
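The `set_automatic`/`set_manual` helpers above pre-bind the first argument of `set` with `functools.partialmethod`. A minimal, self-contained sketch of the same pattern (the toy `Port` class and mode strings are illustrative, not the driver's real types):

```python
import functools

class Port:
    def set(self, mode, rate=0):
        self.mode, self.rate = mode, rate
        return (mode, rate)

    # pre-bound variants, mirroring set_automatic/set_manual above
    set_automatic = functools.partialmethod(set, "AUTOMATIC")
    set_manual = functools.partialmethod(set, "MANUAL")

p = Port()
p.set_manual(rate=1000)  # same as p.set("MANUAL", rate=1000)
```

Each `partialmethod` still binds `self` normally, so the remaining parameters can be passed positionally or by keyword.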
@register_command
@dataclass
class P4_PCI_INFO:
"""
Report the port PCI info.
"""
code: typing.ClassVar[int] = 960
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
vendor_id: XmpField[XmpHex4] = XmpField(XmpHex4) # four hex bytes, PCI Vendor ID
device_id: XmpField[XmpHex4] = XmpField(XmpHex4) # four hex bytes, PCI Device ID
sub_vendor_id: XmpField[XmpHex4] = XmpField(XmpHex4) # four hex bytes, PCI Subsystem Vendor ID
sub_device_id: XmpField[XmpHex4] = XmpField(XmpHex4) # four hex bytes, PCI Subsystem Device ID
rev: XmpField[XmpInt] = XmpField(XmpInt) # integer, Revision
def get(self) -> "Token[GetDataAttr]":
"""Get the port PCI info.
:return: the port PCI info
:rtype: P4_PCI_INFO.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_FW_VER:
"""
Report the firmware version of the port (NIC).
"""
code: typing.ClassVar[int] = 961
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
major: XmpField[XmpInt] = XmpField(XmpInt) # integer, Major firmware version
minor: XmpField[XmpInt] = XmpField(XmpInt) # integer, Minor firmware version
def get(self) -> "Token[GetDataAttr]":
"""Get the firmware version of the port (NIC).
:return: the firmware version of the port (NIC)
:rtype: P4_FW_VER.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_DEV_NAME:
"""
Report the name of the device (NIC) on which the port is located.
"""
code: typing.ClassVar[int] = 962
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
name: XmpField[XmpStr] = XmpField(XmpStr) # string, name of the device (NIC) on which the port is located
def get(self) -> "Token[GetDataAttr]":
"""Get the name of the device (NIC) on which the port is located.
:return: the name of the device (NIC) on which the port is located.
:rtype: P4_DEV_NAME.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_PORT_TYPE:
"""
Report the port type. The different possible ports are divided into types.
"""
code: typing.ClassVar[int] = 963
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
type_number: XmpField[XmpInt] = XmpField(XmpInt) # integer, enumerated port type
type_string: XmpField[XmpStr] = XmpField(XmpStr) # string, textual representation of the port type
def get(self) -> "Token[GetDataAttr]":
"""Get the L47 port type.
:return: the L47 port type
:rtype: P4_PORT_TYPE.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
@register_command
@dataclass
class P4_LICENSE_INFO:
"""
Returns the information on the license assigned to the port - if any.
"""
code: typing.ClassVar[int] = 964
pushed: typing.ClassVar[bool] = True
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
present: XmpField[XmpByte] = XmpField(XmpByte, choices=IsPresent) # coded byte, specifies if a license is assigned to the port
speed: XmpField[XmpByte] = XmpField(XmpByte, choices=LicenseSpeed) # coded byte, if a license is assigned to the port, specifies the speed of the license
permanency: XmpField[XmpByte] = XmpField(XmpByte, choices=IsPermanent) # coded byte, if a license is assigned to the port, specifies if the license is permanent
expiration: XmpField[XmpLong] = XmpField(
XmpLong
) # long integer, if a license is assigned to the port and it is not permanent, specifies the expiration date of the license - in seconds since Jan 1, 1970.
def get(self) -> "Token[GetDataAttr]":
"""Get the information on the license assigned to the port.
:return: the information on the license assigned to the port
:rtype: P4_LICENSE_INFO.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
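Since `expiration` is plain Unix time (seconds since Jan 1, 1970), converting a non-permanent license's expiry into a calendar date needs only the standard library; the timestamp below is a made-up example value:

```python
from datetime import datetime, timezone

expiration = 1735689600  # hypothetical GetDataAttr.expiration value
expiry = datetime.fromtimestamp(expiration, tz=timezone.utc)
print(expiry.isoformat())  # 2025-01-01T00:00:00+00:00
```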
@register_command
@dataclass
class P4_APTITUDES:
"""
Returns the port's aptitudes - i.e. what it is possible to configure on the port in
terms of features and performance.
Current schema of the BSON document:
.. code-block::
schema = {
'chassis': {
'type': 'int32',
'required': True,
'enum': ['CHASSIS_TYPE_UNKNOWN',
'CHASSIS_TYPE_APPLIANCE',
'CHASSIS_TYPE_BAY',
'CHASSIS_TYPE_COMPACT',
'CHASSIS_TYPE_SAFIRE']
},
'tcp_udp': {
'type': 'document',
'required': True,
'properties': {
'cc': {
'type': 'int32',
'required': True,
},
}
},
'tls': {
'type': 'document',
'required': True,
'properties': {
'supported': {
'type': 'bool',
'required': True,
},
'cc': {
'type': 'int32',
'required': True,
}
}
}
}
"""
code: typing.ClassVar[int] = 1200
pushed: typing.ClassVar[bool] = False
_connection: "interfaces.IConnection"
_module: int
_port: int
@dataclass(frozen=True)
class GetDataAttr:
bson: XmpField[XmpByteList] = XmpField(XmpByteList) # list of hex bytes, BSON document containing the port's aptitudes
def get(self) -> "Token[GetDataAttr]":
"""Get the port's aptitudes.
:return: the port's aptitudes in BSON format
:rtype: P4_APTITUDES.GetDataAttr
"""
return Token(self._connection, build_get_request(self, module=self._module, port=self._port))
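`P4_APTITUDES` returns the document as a raw byte list. The outer framing of any BSON document is a 4-byte little-endian int32 total length, so the payload can be sanity-checked without a BSON library. A minimal sketch over a hand-built document (the bytes below encode `{'cc': 100}` and are illustrative only):

```python
import struct

# {'cc': 100}: 4-byte length, int32 element (type 0x10), name "cc", value 100, trailing NUL
doc = b"\x0d\x00\x00\x00" + b"\x10cc\x00" + struct.pack("<i", 100) + b"\x00"

total_len = struct.unpack("<i", doc[:4])[0]  # little-endian int32 length prefix
elem_type = doc[4]                           # 0x10 -> int32 element
name_end = doc.index(b"\x00", 5)
name = doc[5:name_end].decode()
value = struct.unpack("<i", doc[name_end + 1:name_end + 5])[0]
```

A full decode of the real reply would hand the joined bytes of `GetDataAttr.bson` to a proper BSON library (e.g. the `bson` module that ships with pymongo).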
| 40.582014 | 169 | 0.693891 | 8,867 | 71,749 | 5.490019 | 0.053908 | 0.102917 | 0.078903 | 0.102917 | 0.896056 | 0.87235 | 0.839441 | 0.812161 | 0.789647 | 0.757087 | 0 | 0.012661 | 0.2184 | 71,749 | 1,767 | 170 | 40.60498 | 0.8554 | 0.394235 | 0 | 0.779826 | 0 | 0 | 0.051239 | 0.028306 | 0 | 0 | 0 | 0 | 0 | 1 | 0.061822 | false | 0.002169 | 0.010846 | 0 | 0.550976 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
79e9e5c3e8a07b22a92fc529e92e6767acc99c14 | 17 | py | Python | Coursera_HW/Week_1/task_9.py | IsFelix/ML-HLS_python_course_fall21 | 3119482d2951feb85d8a7819d2beaac947bca5db | [
"MIT"
] | null | null | null | Coursera_HW/Week_1/task_9.py | IsFelix/ML-HLS_python_course_fall21 | 3119482d2951feb85d8a7819d2beaac947bca5db | [
"MIT"
] | null | null | null | Coursera_HW/Week_1/task_9.py | IsFelix/ML-HLS_python_course_fall21 | 3119482d2951feb85d8a7819d2beaac947bca5db | [
"MIT"
] | null | null | null | print(100 * "A")
| 8.5 | 16 | 0.529412 | 3 | 17 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214286 | 0.176471 | 17 | 1 | 17 | 17 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
79fe0c28f16dbe8fd7235cb098a6d84b53115cbb | 46 | py | Python | covmatest/__init__.py | gcattan/covmatTest | e2d1c11d53270dcac9ccf82515d6f00f10c0ac9a | [
"Apache-2.0"
] | null | null | null | covmatest/__init__.py | gcattan/covmatTest | e2d1c11d53270dcac9ccf82515d6f00f10c0ac9a | [
"Apache-2.0"
] | null | null | null | covmatest/__init__.py | gcattan/covmatTest | e2d1c11d53270dcac9ccf82515d6f00f10c0ac9a | [
"Apache-2.0"
] | null | null | null | from .get_covmat import CovmatGen, get_covmat
| 23 | 45 | 0.847826 | 7 | 46 | 5.285714 | 0.714286 | 0.486486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108696 | 46 | 1 | 46 | 46 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
030b9c8fe7121093d2bc7afcf4f61759bab4abc1 | 7,712 | py | Python | maxnorm/optimization.py | kharris/max-qnorm-tensor-completion | 78a6a5be123cebe670ad1b82d2738d8537e9edfd | [
"BSD-3-Clause-Clear"
] | null | null | null | maxnorm/optimization.py | kharris/max-qnorm-tensor-completion | 78a6a5be123cebe670ad1b82d2738d8537e9edfd | [
"BSD-3-Clause-Clear"
] | null | null | null | maxnorm/optimization.py | kharris/max-qnorm-tensor-completion | 78a6a5be123cebe670ad1b82d2738d8537e9edfd | [
"BSD-3-Clause-Clear"
] | null | null | null | import numpy as np
#import jax.numpy as np
from typing import Callable
import warnings
'''
optimization module
Provides a matrix-valued prox-gradient method and its accelerated version.
Modified from the code in https://github.com/harrispopgen/mushi/
'''
def hs_dot(A, B):
return (A*B).flatten().sum()
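`hs_dot` is the Hilbert-Schmidt (Frobenius) inner product; for matrices it equals `trace(A @ B.T)`. A quick numerical check of that identity:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

hs = (A * B).flatten().sum()  # what hs_dot computes
```

So the Armijo tests below are ordinary Euclidean inner products on the flattened matrices.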
def prox_grad_method(x: np.ndarray,
g: Callable[[np.ndarray], np.float64],
grad_g: Callable[[np.ndarray], np.float64],
h: Callable[[np.ndarray], np.float64],
prox: Callable[[np.ndarray, np.float64], np.float64],
tol: np.float64 = 1e-6,
max_iter: int = 100,
s0: np.float64 = 1,
max_line_iter: int = 100,
gamma: np.float64 = 0.8,
verbosity = 0) -> np.ndarray:
u"""Proximal gradient method with Armijo backtracking line search
https://people.eecs.berkeley.edu/~elghaoui/Teaching/EE227A/lecture18.pdf
x: initial point
g: differentiable term in objective function
grad_g: gradient of g
h: non-differentiable term in objective function
prox: proximal operator corresponding to h
tol: relative tolerance in objective function for convergence
max_iter: maximum number of proximal gradient steps
s0: initial step size
max_line_iter: maximum number of line search steps
gamma: step size shrinkage rate for line search
"""
# initialize step size
s = s0
# initial objective value
f = g(x) + h(x)
if verbosity > 0:
print(f'initial objective {f:.6e}', flush=True)
print(f'initial smooth part {g(x):.6e}', flush=True)
for k in range(1, max_iter + 1):
# evaluate differentiable part of objective at current point
g1 = g(x)
grad_g1 = grad_g(x)
# check for errors
if not np.all(np.isfinite(grad_g1)):
warnings.warn("gradient contains invalid values", RuntimeWarning)
return np.nan
if np.all(grad_g1 == 0):
warnings.warn("zero gradient, breaking", RuntimeWarning)
break
# store old iterate
x_old = x
# Armijo line search
for line_iter in range(max_line_iter):
# new point via a prox-gradient step from the current point
x = prox(x - s * grad_g1, s)
# G_s(x) as in the notes linked above
G = (1 / s) * (x_old - x)
# test g(x - s*G_s(x)) for sufficient decrease
if g(x) <= (g1 - s * hs_dot(grad_g1, G) + (s / 2) * hs_dot(G, G)):
# Armijo satisfied
break
else:
# Armijo not satisfied
s *= gamma # shrink step size
if line_iter == max_line_iter - 1:
warnings.warn("line search failed", RuntimeWarning)
s = s0
if not np.all(np.isfinite(x)):
warnings.warn("x contains invalid values", RuntimeWarning)
# terminate if objective function is constant within tolerance
f_old = f
f = g(x) + h(x)
rel_change = np.abs((f - f_old) / f_old)
if verbosity > 0:
print(f'iteration {k}, objective {f:.3e}, '
f'relative change {rel_change:.3e}', flush=True)
# print(f'iteration {k}, objective {f:.3e}, '
# f'relative change {rel_change:.3e}',
# end=' \r', flush=True)
if rel_change < tol:
if verbosity > 0:
print(f'\nrelative change in objective function {rel_change:.2g} '
f'is within tolerance {tol} after {k} iterations',
flush=True)
break
if k == max_iter and verbosity > 0:
print(f'\nmaximum iteration {max_iter} reached with relative '
f'change in objective function {rel_change:.2g}', flush=True)
return x
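A hedged usage sketch for `prox_grad_method`: minimizing `0.5*||A x - b||^2 + lam*||x||_1` (LASSO) supplies exactly the four pieces the function expects - `g`, `grad_g`, `h`, and soft-thresholding as the prox of the l1 term. The loop below runs the same prox-gradient step with a fixed safe step size instead of the Armijo search; `A`, `b`, and `lam` are made-up test data:

```python
import numpy as np

# made-up LASSO instance: min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
lam = 0.1

g = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_g = lambda x: A.T @ (A @ x - b)
h = lambda x: lam * np.abs(x).sum()
prox = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)  # soft-threshold

# the prox-gradient step itself, with step 1/L instead of line search
x = np.zeros(5)
s = 1.0 / np.linalg.norm(A.T @ A, 2)  # L = spectral norm of A^T A
for _ in range(500):
    x = prox(x - s * grad_g(x), s)
```

Passing the same `g`, `grad_g`, `h`, and `prox` callables to `prox_grad_method` adds the line search and the relative-change stopping rule on top of this iteration.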
def acc_prox_grad_method(x: np.ndarray,
g: Callable[[np.ndarray], np.float64],
grad_g: Callable[[np.ndarray], np.float64],
h: Callable[[np.ndarray], np.float64],
prox: Callable[[np.ndarray, np.float64], np.float64],
tol: np.float64 = 1e-6,
max_iter: int = 100,
s0: np.float64 = 1,
max_line_iter: int = 100,
gamma: np.float64 = 0.8,
verbosity = 0) -> np.ndarray:
u"""Nesterov accelerated proximal gradient method
https://people.eecs.berkeley.edu/~elghaoui/Teaching/EE227A/lecture18.pdf
x: initial point
g: differentiable term in objective function
grad_g: gradient of g
h: non-differentiable term in objective function
prox: proximal operator corresponding to h
tol: relative tolerance in objective function for convergence
max_iter: maximum number of proximal gradient steps
s0: initial step size
max_line_iter: maximum number of line search steps
gamma: step size shrinkage rate for line search
"""
# initialize step size
s = s0
# initialize momentum iterate
q = x
# initial objective value
f = g(x) + h(x)
if verbosity > 0:
print(f'initial objective {f:.6e}', flush=True)
for k in range(1, max_iter + 1):
# evaluate differentiable part of objective at momentum point
g1 = g(q)
grad_g1 = grad_g(q)
if not np.all(np.isfinite(grad_g1)):
warnings.warn("gradient contains invalid values", RuntimeWarning)
return np.nan
if np.all(grad_g1 == 0):
warnings.warn("zero gradient, breaking", RuntimeWarning)
break
# store old iterate
x_old = x
# Armijo line search
for line_iter in range(max_line_iter):
# new point via prox-gradient of momentum point
x = prox(q - s * grad_g1, s)
# G_s(q) as in the notes linked above
G = (1 / s) * (q - x)
# test g(q - sG_s(q)) for sufficient decrease
if g(q - s * G) <= (g1 - s * hs_dot(grad_g1, G) + (s / 2) * hs_dot(G, G)):
# Armijo satisfied
break
else:
# Armijo not satisfied
s *= gamma # shrink step size
# update momentum point
q = x + ((k - 1) / (k + 2)) * (x - x_old)
if line_iter == max_line_iter - 1:
warnings.warn("line search failed", RuntimeWarning)
s = s0
if not np.all(np.isfinite(x)):
warnings.warn("x contains invalid values", RuntimeWarning)
return np.nan
# terminate if objective function is constant within tolerance
f_old = f
f = g(x) + h(x)
rel_change = np.abs((f - f_old) / f_old)
# print(f'iteration {k}, objective {f:.3e}, '
# f'relative change {rel_change:.3e}', flush=True)
if verbosity > 0:
print(f'iteration {k}, objective {f:.3e}, '
f'relative change {rel_change:.3e}',
end=' \r', flush=True)
if rel_change < tol:
if verbosity > 0:
print(f'\nrelative change in objective function {rel_change:.2g} '
f'is within tolerance {tol} after {k} iterations',
flush=True)
break
if k == max_iter:
if verbosity > 0:
print(f'\nmaximum iteration {max_iter} reached with relative '
f'change in objective function {rel_change:.2g}', flush=True)
return x
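The only change relative to `prox_grad_method` is that gradients are evaluated at the momentum point, updated as `q = x + ((k-1)/(k+2))*(x - x_old)`. On an ill-conditioned quadratic (with `h = 0`, so the prox is the identity) this visibly outruns the plain iteration; the diagonal below is made-up test data:

```python
import numpy as np

d = np.array([1.0, 100.0])  # eigenvalues of g(x) = 0.5 * sum(d * x**2)
grad = lambda x: d * x
s = 1.0 / d.max()           # step 1/L

# plain prox-gradient (prox = identity because h = 0)
x_plain = np.array([1.0, 1.0])
for _ in range(100):
    x_plain = x_plain - s * grad(x_plain)

# same number of iterations with the Nesterov momentum update used above
x = q = np.array([1.0, 1.0])
for k in range(1, 101):
    x_old = x
    x = q - s * grad(q)
    q = x + ((k - 1) / (k + 2)) * (x - x_old)
```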
| 40.589474 | 86 | 0.550571 | 980 | 7,712 | 4.24898 | 0.168367 | 0.034582 | 0.045629 | 0.036503 | 0.902017 | 0.901537 | 0.901537 | 0.898895 | 0.898895 | 0.898895 | 0 | 0.025005 | 0.351789 | 7,712 | 189 | 87 | 40.804233 | 0.807962 | 0.276712 | 0 | 0.773109 | 0 | 0 | 0.155627 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02521 | false | 0 | 0.02521 | 0.008403 | 0.10084 | 0.07563 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
033e7d586169458c4a97fe7914da794a22bbc804 | 16,461 | py | Python | test/dataset_3d_lc.py | nishantrai18/homage | 4da1e3836b231d4915683a38502f7b5e0c14a1ea | [
"MIT"
] | 10 | 2021-11-09T02:40:54.000Z | 2022-03-10T11:22:42.000Z | test/dataset_3d_lc.py | nishantrai18/homage | 4da1e3836b231d4915683a38502f7b5e0c14a1ea | [
"MIT"
] | 4 | 2021-07-03T04:44:34.000Z | 2022-03-24T04:46:06.000Z | test/dataset_3d_lc.py | nishantrai18/homage | 4da1e3836b231d4915683a38502f7b5e0c14a1ea | [
"MIT"
] | 4 | 2021-11-09T02:40:56.000Z | 2022-01-21T15:19:46.000Z | import torch
from torch.utils import data
from torchvision import transforms
import os
import sys
import time
import pickle
import csv
import glob
import pandas as pd
import numpy as np
import cv2
sys.path.append('../train')
import model_utils as mu
sys.path.append('../utils')
from augmentation import *
from tqdm import tqdm
from joblib import Parallel, delayed
def pil_loader(path):
with open(path, 'rb') as f:
with Image.open(f) as img:
return img.convert('RGB')
toTensor = transforms.ToTensor()
toPILImage = transforms.ToPILImage()
def flow_loader(path):
try:
img = Image.open(path)
except:
return None
f = toTensor(img)
if f.mean() > 0.3:
f -= 0.5
return f
def fetch_imgs_seq(vpath, idx_block):
seq = [pil_loader(os.path.join(vpath, 'image_%05d.jpg' % (i+1))) for i in idx_block]
return seq
def fill_nones(l):
l = [l[i-1] if l[i] is None else l[i] for i in range(len(l))]
l = [l[i-1] if l[i] is None else l[i] for i in range(len(l))]
try:
nonNoneL = [item for item in l if item is not None][0]
except:
nonNoneL = torch.zeros((1, 256, 256))
return [torch.zeros(nonNoneL.shape) if l[i] is None else l[i] for i in range(len(l))]
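`fill_nones` makes two identical left-to-right passes, each copying the left neighbour into a `None` slot (index 0 wraps to the last element via Python's negative indexing), so runs of up to two interior `None`s get filled before the zero fallback kicks in. The same logic on plain numbers, torch-free for illustration:

```python
def fill_nones_demo(l):
    # pass 1 and pass 2: copy the left neighbour into each None slot
    # (each comprehension reads the previous list, so one None per pass)
    l = [l[i - 1] if l[i] is None else l[i] for i in range(len(l))]
    l = [l[i - 1] if l[i] is None else l[i] for i in range(len(l))]
    # whatever is still None (e.g. an all-None input) becomes a zero placeholder
    return [0 if x is None else x for x in l]
```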
def get_u_flow_path_list(vpath, idx_block):
dataset = 'ucf101' if 'ucf101' in vpath else 'hmdb51'
flow_base_path = os.path.join('/dev/shm/data/nishantr/flow/', dataset + '_flow/')
vid_name = os.path.basename(os.path.normpath(vpath))
return [os.path.join(flow_base_path, 'u', vid_name, 'frame%06d.jpg' % (i + 1)) for i in idx_block]
def get_v_flow_path_list(vpath, idx_block):
dataset = 'ucf101' if 'ucf101' in vpath else 'hmdb51'
flow_base_path = os.path.join('/dev/shm/data/nishantr/flow/', dataset + '_flow/')
vid_name = os.path.basename(os.path.normpath(vpath))
return [os.path.join(flow_base_path, 'v', vid_name, 'frame%06d.jpg' % (i + 1)) for i in idx_block]
def fetch_flow_seq(vpath, idx_block):
u_flow_list = get_u_flow_path_list(vpath, idx_block)
v_flow_list = get_v_flow_path_list(vpath, idx_block)
u_seq = fill_nones([flow_loader(f) for f in u_flow_list])
v_seq = fill_nones([flow_loader(f) for f in v_flow_list])
seq = [toPILImage(torch.cat([u, v])) for u, v in zip(u_seq, v_seq)]
return seq
def get_class_vid(vpath):
return os.path.normpath(vpath).split('/')[-2:]
def load_detectron_feature(fdir, idx, opt):
# opt is either hm or seg
shape = (192, 256)
num_channels = 17 if opt == 'hm' else 1
def load_feature(path):
try:
x = np.load(path)[opt]
except:
x = np.zeros((0, 0, 0))
# Match non-existent values
if x.shape[1] == 0:
x = np.zeros((num_channels, shape[0], shape[1]))
x = torch.tensor(x, dtype=torch.float) / 255.0
# Add extra channel in case it's not present
if len(x.shape) < 3:
x = x.unsqueeze(0)
return x
suffix = 'heatmap' if opt == 'hm' else 'segmask'
fpath = os.path.join(fdir, suffix + '_%05d.npz' % idx)
if os.path.isfile(fpath):
return load_feature(fpath)
else:
# We do not have results lower than idx=2
idx = max(3, idx)
# We assume having all results for every two frames
fpath0 = os.path.join(fdir, suffix + '_%05d.npz' % (idx - 1))
fpath1 = os.path.join(fdir, suffix + '_%05d.npz' % (idx + 1))
# This is not guaranteed to exist
if not os.path.isfile(fpath1):
fpath1 = fpath0
a0, a1 = load_feature(fpath0), load_feature(fpath1)
try:
a_avg = (a0 + a1) / 2.0
except:
a_avg = None
return a_avg
def fetch_kp_heatmap_seq(vpath, idx_block):
assert '/frame/' in vpath, "Incorrect vpath received: {}".format(vpath)
feature_vpath = vpath.replace('/frame/', '/heatmaps/')
seq = fill_nones([load_detectron_feature(feature_vpath, idx, opt='hm') for idx in idx_block])
if len(set([x.shape for x in seq])) > 1:
# We now know the invalid paths, so no need to print them
# print("Invalid path:", vpath)
seq = [seq[len(seq) // 2] for _ in seq]
return seq
def fetch_seg_mask_seq(vpath, idx_block):
assert '/frame/' in vpath, "Incorrect vpath received: {}".format(vpath)
feature_vpath = vpath.replace('/frame/', '/segmasks/')
seq = fill_nones([load_detectron_feature(feature_vpath, idx, opt='seg') for idx in idx_block])
return seq
class UCF101_3d(data.Dataset):
def __init__(self,
mode='train',
transform=None,
seq_len=10,
num_seq =1,
downsample=3,
epsilon=5,
which_split=1,
modality=mu.ImgMode):
self.mode = mode
self.transform = transform
self.seq_len = seq_len
self.num_seq = num_seq
self.downsample = downsample
self.epsilon = epsilon
self.which_split = which_split
self.modality = modality
# splits
if mode == 'train':
split = '../data/ucf101/train_split%02d.csv' % self.which_split
video_info = pd.read_csv(split, header=None)
elif (mode == 'val') or (mode == 'test'):
split = '../data/ucf101/test_split%02d.csv' % self.which_split # use test for val, temporary
video_info = pd.read_csv(split, header=None)
else: raise ValueError('wrong mode')
# get action list
self.action_dict_encode = {}
self.action_dict_decode = {}
action_file = os.path.join('../data/ucf101', 'classInd.txt')
action_df = pd.read_csv(action_file, sep=' ', header=None)
for _, row in action_df.iterrows():
act_id, act_name = row
act_id = int(act_id) - 1 # let id start from 0
self.action_dict_decode[act_id] = act_name
self.action_dict_encode[act_name] = act_id
# filter out too short videos:
drop_idx = []
for idx, row in video_info.iterrows():
vpath, vlen = row
if vlen <= 0:
drop_idx.append(idx)
self.video_info = video_info.drop(drop_idx, axis=0)
# if mode == 'val': self.video_info = self.video_info.sample(frac=0.3)
# shuffle not required
def idx_sampler(self, vlen, vpath):
'''sample index from a video'''
downsample = self.downsample
if (vlen - (self.num_seq * self.seq_len * self.downsample)) <= 0:
downsample = ((vlen - 1) / (self.num_seq * self.seq_len * 1.0)) * 0.9
n = 1
if self.mode == 'test':
seq_idx_block = np.arange(0, vlen, downsample) # all possible frames with downsampling
seq_idx_block = seq_idx_block.astype(int)
return [seq_idx_block, vpath]
start_idx = np.random.choice(range(vlen-int(self.num_seq*self.seq_len*downsample)), n)
seq_idx = np.expand_dims(np.arange(self.num_seq), -1)*downsample*self.seq_len + start_idx
seq_idx_block = seq_idx + np.expand_dims(np.arange(self.seq_len),0)*downsample
seq_idx_block = seq_idx_block.astype(int)
return [seq_idx_block, vpath]
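The broadcasting in `idx_sampler` builds a `(num_seq, seq_len)` grid of frame indices: rows advance by `downsample * seq_len` from `start_idx`, and columns advance by `downsample`. A worked toy case with made-up sizes:

```python
import numpy as np

num_seq, seq_len, downsample, start_idx = 2, 3, 2, 5
seq_idx = np.expand_dims(np.arange(num_seq), -1) * downsample * seq_len + start_idx
seq_idx_block = seq_idx + np.expand_dims(np.arange(seq_len), 0) * downsample
# rows start at 5 and 11; each row samples every 2nd frame
```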
def __getitem__(self, index):
vpath, vlen = self.video_info.iloc[index]
items = self.idx_sampler(vlen, vpath)
if items is None: print(vpath)
idx_block, vpath = items
if self.mode != 'test':
assert idx_block.shape == (self.num_seq, self.seq_len)
idx_block = idx_block.reshape(self.num_seq*self.seq_len)
seq = None
if self.modality == mu.ImgMode:
seq = fetch_imgs_seq(vpath, idx_block)
elif self.modality == mu.FlowMode:
seq = fetch_flow_seq(vpath, idx_block)
elif self.modality == mu.KeypointHeatmap:
seq = fetch_kp_heatmap_seq(vpath, idx_block)
elif self.modality == mu.SegMask:
seq = fetch_seg_mask_seq(vpath, idx_block)
if self.modality in [mu.KeypointHeatmap, mu.SegMask]:
seq = torch.stack(seq)
# if self.mode == 'test':
# # apply same transform
# t_seq = [self.transform(seq) for _ in range(5)]
# else:
t_seq = self.transform(seq) # apply same transform
# Convert tensor into list of tensors
if self.modality in [mu.KeypointHeatmap, mu.SegMask]:
t_seq = [t_seq[idx] for idx in range(t_seq.shape[0])]
num_crop = None
try:
(C, H, W) = t_seq[0].size()
t_seq = torch.stack(t_seq, 0)
except:
(C, H, W) = t_seq[0][0].size()
tmp = [torch.stack(i, 0) for i in t_seq]
assert len(tmp) == 5
num_crop = 5
t_seq = torch.stack(tmp, 1)
if self.mode == 'test':
# return all available clips, but cut into length = num_seq
SL = t_seq.size(0)
clips = []; i = 0
while i+self.seq_len <= SL:
clips.append(t_seq[i:i+self.seq_len, :])
# i += self.seq_len//2
i += self.seq_len
if num_crop:
# half overlap:
clips = [torch.stack(clips[i:i+self.num_seq], 0).permute(2,0,3,1,4,5) for i in range(0,len(clips)+1-self.num_seq,self.num_seq//2)]
NC = len(clips)
t_seq = torch.stack(clips, 0).view(NC*num_crop, self.num_seq, C, self.seq_len, H, W)
else:
# half overlap:
clips = [torch.stack(clips[i:i+self.num_seq], 0).transpose(1,2) for i in range(0,len(clips)+1-self.num_seq,self.num_seq//2)]
t_seq = torch.stack(clips, 0)
else:
t_seq = t_seq.view(self.num_seq, self.seq_len, C, H, W).transpose(1,2)
try:
vname = vpath.split('/')[-3]
vid = self.encode_action(vname)
except:
vname = vpath.split('/')[-2]
vid = self.encode_action(vname)
label = torch.LongTensor([vid])
idx = torch.LongTensor([index])
return t_seq, label, idx
def __len__(self):
return len(self.video_info)
def encode_action(self, action_name):
'''give action name, return category'''
return self.action_dict_encode[action_name]
def decode_action(self, action_code):
'''give action code, return action name'''
return self.action_dict_decode[action_code]
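The label mapping built in `__init__` is a pair of mirrored dicts over `classInd.txt` rows, with ids shifted to start from 0. A self-contained sketch with two made-up rows:

```python
rows = [(1, "ApplyEyeMakeup"), (2, "Archery")]  # (id, name) pairs as in classInd.txt
encode, decode = {}, {}
for act_id, act_name in rows:
    act_id = int(act_id) - 1  # let id start from 0, as in __init__
    decode[act_id] = act_name
    encode[act_name] = act_id
```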
class HMDB51_3d(data.Dataset):
def __init__(self,
mode='train',
transform=None,
seq_len=10,
num_seq=1,
downsample=1,
epsilon=5,
which_split=1,
modality=mu.ImgMode):
self.mode = mode
self.transform = transform
self.seq_len = seq_len
self.num_seq = num_seq
self.downsample = downsample
self.epsilon = epsilon
self.which_split = which_split
self.modality = modality
# splits
if mode == 'train':
split = '../data/hmdb51/train_split%02d.csv' % self.which_split
video_info = pd.read_csv(split, header=None)
elif (mode == 'val') or (mode == 'test'):
split = '../data/hmdb51/test_split%02d.csv' % self.which_split # use test for val, temporary
video_info = pd.read_csv(split, header=None)
else: raise ValueError('wrong mode')
# get action list
self.action_dict_encode = {}
self.action_dict_decode = {}
action_file = os.path.join('../data/hmdb51', 'classInd.txt')
action_df = pd.read_csv(action_file, sep=' ', header=None)
for _, row in action_df.iterrows():
act_id, act_name = row
act_id = int(act_id) - 1 # let id start from 0
self.action_dict_decode[act_id] = act_name
self.action_dict_encode[act_name] = act_id
# filter out too short videos:
drop_idx = []
for idx, row in video_info.iterrows():
vpath, vlen = row
if vlen <= 0:
drop_idx.append(idx)
self.video_info = video_info.drop(drop_idx, axis=0)
# if mode == 'val': self.video_info = self.video_info.sample(frac=0.3)
# shuffle not required
def idx_sampler(self, vlen, vpath):
'''sample index from a video'''
downsample = self.downsample
if (vlen - (self.num_seq * self.seq_len * self.downsample)) <= 0:
downsample = ((vlen - 1) / (self.num_seq * self.seq_len * 1.0)) * 0.9
n=1
if self.mode == 'test':
seq_idx_block = np.arange(0, vlen, downsample) # all possible frames with downsampling
seq_idx_block = seq_idx_block.astype(int)
return [seq_idx_block, vpath]
start_idx = np.random.choice(range(vlen-int(self.num_seq*self.seq_len*downsample)), n)
seq_idx = np.expand_dims(np.arange(self.num_seq), -1)*downsample*self.seq_len + start_idx
seq_idx_block = seq_idx + np.expand_dims(np.arange(self.seq_len),0)*downsample
seq_idx_block = seq_idx_block.astype(int)
return [seq_idx_block, vpath]
def __getitem__(self, index):
vpath, vlen = self.video_info.iloc[index]
items = self.idx_sampler(vlen, vpath)
if items is None: print(vpath)
idx_block, vpath = items
if self.mode != 'test':
assert idx_block.shape == (self.num_seq, self.seq_len)
idx_block = idx_block.reshape(self.num_seq*self.seq_len)
seq = None
if self.modality == mu.ImgMode:
seq = fetch_imgs_seq(vpath, idx_block)
elif self.modality == mu.FlowMode:
seq = fetch_flow_seq(vpath, idx_block)
elif self.modality == mu.KeypointHeatmap:
seq = fetch_kp_heatmap_seq(vpath, idx_block)
elif self.modality == mu.SegMask:
seq = fetch_seg_mask_seq(vpath, idx_block)
if self.modality in [mu.KeypointHeatmap, mu.SegMask]:
seq = torch.stack(seq)
t_seq = self.transform(seq) # apply same transform
# Convert tensor into list of tensors
if self.modality in [mu.KeypointHeatmap, mu.SegMask]:
t_seq = [t_seq[idx] for idx in range(t_seq.shape[0])]
num_crop = None
try:
(C, H, W) = t_seq[0].size()
t_seq = torch.stack(t_seq, 0)
except:
(C, H, W) = t_seq[0][0].size()
tmp = [torch.stack(i, 0) for i in t_seq]
assert len(tmp) == 5
num_crop = 5
t_seq = torch.stack(tmp, 1)
# print(t_seq.size())
# import ipdb; ipdb.set_trace()
if self.mode == 'test':
# return all available clips, but cut into length = num_seq
SL = t_seq.size(0)
clips = []; i = 0
while i+self.seq_len <= SL:
clips.append(t_seq[i:i+self.seq_len, :])
# i += self.seq_len//2
i += self.seq_len
if num_crop:
# half overlap:
clips = [torch.stack(clips[i:i+self.num_seq], 0).permute(2,0,3,1,4,5) for i in range(0,len(clips)+1-self.num_seq,self.num_seq//2)]
NC = len(clips)
t_seq = torch.stack(clips, 0).view(NC*num_crop, self.num_seq, C, self.seq_len, H, W)
else:
# half overlap:
clips = [torch.stack(clips[i:i+self.num_seq], 0).transpose(1,2) for i in range(0,len(clips)+1-self.num_seq,3*self.num_seq//4)]
t_seq = torch.stack(clips, 0)
else:
t_seq = t_seq.view(self.num_seq, self.seq_len, C, H, W).transpose(1,2)
try:
vname = vpath.split('/')[-3]
vid = self.encode_action(vname)
except:
vname = vpath.split('/')[-2]
vid = self.encode_action(vname)
label = torch.LongTensor([vid])
idx = torch.LongTensor([index])
return t_seq, label, idx
def __len__(self):
return len(self.video_info)
def encode_action(self, action_name):
'''give action name, return category'''
return self.action_dict_encode[action_name]
def decode_action(self, action_code):
'''give action code, return action name'''
return self.action_dict_decode[action_code]
| 36.178022 | 146 | 0.581921 | 2,355 | 16,461 | 3.882378 | 0.121019 | 0.039374 | 0.032812 | 0.022968 | 0.813628 | 0.805972 | 0.800503 | 0.797659 | 0.785191 | 0.772285 | 0 | 0.020016 | 0.295851 | 16,461 | 454 | 147 | 36.257709 | 0.768786 | 0.087297 | 0 | 0.721068 | 0 | 0 | 0.039599 | 0.012709 | 0 | 0 | 0 | 0 | 0.017804 | 1 | 0.071217 | false | 0 | 0.047478 | 0.008902 | 0.20178 | 0.005935 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
307f789a9b4a520223bd65bd6d6624a76265ba45 | 133 | py | Python | coaching/core/admin.py | paurushofficial/coachingWebapp | 314a7025d1bf73d6f383709085c9cd6901ba0887 | [
"MIT"
] | null | null | null | coaching/core/admin.py | paurushofficial/coachingWebapp | 314a7025d1bf73d6f383709085c9cd6901ba0887 | [
"MIT"
] | 5 | 2021-03-30T14:00:24.000Z | 2021-06-10T19:45:13.000Z | coaching/core/admin.py | paurushofficial/coachingWebapp | 314a7025d1bf73d6f383709085c9cd6901ba0887 | [
"MIT"
] | null | null | null | from django.contrib import admin
from . models import student, subject
admin.site.register(subject)
admin.site.register(student) | 26.6 | 38 | 0.796992 | 18 | 133 | 5.888889 | 0.555556 | 0.226415 | 0.301887 | 0.45283 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120301 | 133 | 5 | 39 | 26.6 | 0.905983 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
061896ee0dfa8e01e5aae0ee475216107fb6fa1b | 164 | py | Python | ml_yara_generator/__init__.py | ysillam/ml_yara_generator | 772a688d71ede6ad9cbea845e24bf1e9ffe81908 | [
"MIT"
] | null | null | null | ml_yara_generator/__init__.py | ysillam/ml_yara_generator | 772a688d71ede6ad9cbea845e24bf1e9ffe81908 | [
"MIT"
] | null | null | null | ml_yara_generator/__init__.py | ysillam/ml_yara_generator | 772a688d71ede6ad9cbea845e24bf1e9ffe81908 | [
"MIT"
] | null | null | null | from .src import classifiers, extractors, yara_generator
from .src.yara_generator import yara_generator
from .src.yara_generator.yara_generator import YaraGenerator | 54.666667 | 60 | 0.871951 | 22 | 164 | 6.272727 | 0.363636 | 0.471014 | 0.246377 | 0.289855 | 0.478261 | 0.478261 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079268 | 164 | 3 | 60 | 54.666667 | 0.913907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
2317603c3896e21da146929d245aafecacfa84b3 | 106 | py | Python | extensions/.stubs/clrclasses/__clrclasses__/System/Windows/__init__.py | vicwjb/Pycad | 7391cd694b7a91ad9f9964ec95833c1081bc1f84 | [
"MIT"
] | 1 | 2020-03-25T03:27:24.000Z | 2020-03-25T03:27:24.000Z | extensions/.stubs/clrclasses/__clrclasses__/System/Windows/__init__.py | vicwjb/Pycad | 7391cd694b7a91ad9f9964ec95833c1081bc1f84 | [
"MIT"
] | null | null | null | extensions/.stubs/clrclasses/__clrclasses__/System/Windows/__init__.py | vicwjb/Pycad | 7391cd694b7a91ad9f9964ec95833c1081bc1f84 | [
"MIT"
] | null | null | null | import __clrclasses__.System.Windows.Input as Input
import __clrclasses__.System.Windows.Markup as Markup
| 35.333333 | 53 | 0.867925 | 14 | 106 | 6 | 0.5 | 0.380952 | 0.52381 | 0.690476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 106 | 2 | 54 | 53 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
001d6419d5e7ee98347e5a14ee7260464280a81f | 1,604 | py | Python | django/smartcity/vagent/hook.py | nvitha/Smart-Cities | 9a4cb29b143956bb73789e4af2619cde681393be | [
"MIT"
] | null | null | null | django/smartcity/vagent/hook.py | nvitha/Smart-Cities | 9a4cb29b143956bb73789e4af2619cde681393be | [
"MIT"
] | null | null | null | django/smartcity/vagent/hook.py | nvitha/Smart-Cities | 9a4cb29b143956bb73789e4af2619cde681393be | [
"MIT"
] | null | null | null | import sys
import os
import subprocess
def start_thrash_test():
process = subprocess.Popen(['python', '/var/www/html/smartcity/smartcity/vagent/start_thrash.py'], env={'PYTHONPATH': os.pathsep.join(sys.path)})
    stdout, stderr = process.communicate()
def stop_thrash_test():
process = subprocess.Popen(['python', '/var/www/html/smartcity/smartcity/vagent/stop_thrash.py'], env={'PYTHONPATH': os.pathsep.join(sys.path)})
    stdout, stderr = process.communicate()
def start_uniform_test():
process = subprocess.Popen(['python', '/var/www/html/smartcity/smartcity/vagent/start_uniform.py'], env={'PYTHONPATH': os.pathsep.join(sys.path)})
    stdout, stderr = process.communicate()
def stop_uniform_test():
process = subprocess.Popen(['python', '/var/www/html/smartcity/smartcity/vagent/stop_uniform.py'], env={'PYTHONPATH': os.pathsep.join(sys.path)})
    stdout, stderr = process.communicate()
def start_random_test():
process = subprocess.Popen(['python', '/var/www/html/smartcity/smartcity/vagent/start_random.py'], env={'PYTHONPATH': os.pathsep.join(sys.path)})
    stdout, stderr = process.communicate()
def stop_random_test():
process = subprocess.Popen(['python', '/var/www/html/smartcity/smartcity/vagent/stop_random.py'], env={'PYTHONPATH': os.pathsep.join(sys.path)})
    stdout, stderr = process.communicate()
#
# def main():
# process = subprocess.Popen(['python', '/var/www/html/smartcity/smartcity/vagent/tester_vagent.py'], env={'PYTHONPATH': os.pathsep.join(sys.path)})
# error = process.communicate(process)
#
#
# if __name__ == '__main__':
# main()
| 37.302326 | 152 | 0.719451 | 204 | 1,604 | 5.52451 | 0.151961 | 0.10559 | 0.136646 | 0.173913 | 0.937001 | 0.937001 | 0.937001 | 0.937001 | 0.937001 | 0.937001 | 0 | 0 | 0.105362 | 1,604 | 42 | 153 | 38.190476 | 0.785366 | 0.150249 | 0 | 0.285714 | 0 | 0 | 0.318316 | 0.247415 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
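The hook module passes the `Popen` object into `communicate()`, but `Popen.communicate` only takes optional bytes for stdin and returns an `(stdout, stderr)` tuple. A minimal sketch of the intended pattern, using `sys.executable` instead of a bare `'python'` and a throwaway script in place of the hard-coded `/var/www` paths:

```python
import os
import subprocess
import sys
import tempfile

def run_script(script_path):
    """Launch a child interpreter with PYTHONPATH set and wait for it."""
    process = subprocess.Popen(
        [sys.executable, script_path],
        env={**os.environ, "PYTHONPATH": os.pathsep.join(sys.path)},
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    # communicate() takes no process argument; it optionally accepts bytes
    # for stdin and returns an (stdout, stderr) tuple once the child exits.
    stdout, stderr = process.communicate()
    return process.returncode, stdout, stderr

# Demo with a temporary script standing in for start_thrash.py and friends.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as handle:
    handle.write("print('thrash test started')")

code, out, err = run_script(handle.name)
os.unlink(handle.name)
print(code, out.decode().strip())
```

Merging `os.environ` into the child environment (rather than replacing it outright, as the original does) keeps variables like `PATH` available to the child process.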
cc81230cf88cb286c1df4f329156e2c7fef5a3e0 | 6,011 | py | Python | pyquanttrade/features/indicators.py | marqueles/pyquanttrade | a9b5e04f0d8a5ee385f60024f8b74bd363d7c681 | [
"BSD-2-Clause"
] | null | null | null | pyquanttrade/features/indicators.py | marqueles/pyquanttrade | a9b5e04f0d8a5ee385f60024f8b74bd363d7c681 | [
"BSD-2-Clause"
] | null | null | null | pyquanttrade/features/indicators.py | marqueles/pyquanttrade | a9b5e04f0d8a5ee385f60024f8b74bd363d7c681 | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""Functions from market data"""
__author__ = "Miguel Martin"
__version__ = "1"
def always_true():
def return_function(when, ticket, trades, data):
return True
return return_function
def apply_all(criteria):
def return_function(when, ticket, trades, data):
return all([f(when, ticket, trades, data) for f in criteria])
return return_function
def apply_any(criteria):
def return_function(when, ticket, trades, data):
return any([f(when, ticket, trades, data) for f in criteria])
return return_function
def apply_not(criterium):
def return_function(when, ticket, trades, data):
return not criterium(when, ticket, trades, data)
return return_function
def not_trade():
def return_function(when, ticket, trades, data):
return not trades.any_open_trade()
return return_function
def eq(element, than):
def return_function(when, ticket, trades, data):
return element(when, ticket, trades, data) == than
return return_function
def leq(element, than):
def return_function(when, ticket, trades, data):
return element(when, ticket, trades, data) <= than
return return_function
def geq(element, than):
def return_function(when, ticket, trades, data):
return element(when, ticket, trades, data) >= than
return return_function
def gt(element, than):
def return_function(when, ticket, trades, data):
return element(when, ticket, trades, data) > than
return return_function
def lt(element, than):
def return_function(when, ticket, trades, data):
return element(when, ticket, trades, data) < than
return return_function
def TF_indicator(func):
def return_function(when, ticker, trades, data):
data1 = func(data)
column_name = data1.name
        return data.loc[when, column_name]
return return_function
def unit_indicator(func):
def return_function(when, ticker, trades, data):
data1 = func(data)
column_name = f"unit_indicator_{data1.name}"
if column_name not in data.columns:
data[column_name] = data1 == 1
return data[column_name][when]
return return_function
def cross_of_values(func1, func2):
def return_function(when, ticker, trades, data):
data1 = func1(data)
data2 = func2(data)
column_name = f"diff_rows_{data1.name}_vs_{data2.name}"
if column_name not in data.columns:
data[column_name] = 0
data.loc[data1 <= data2, [column_name]] = 1
data[column_name] = data[column_name].diff()
data[column_name].iloc[0] = 0
            data.loc[data[column_name] == 1, [column_name]] = True
            data.loc[data[column_name] != 1, [column_name]] = False
return data[column_name][when]
return return_function
def greater_than(func1, func2):
def return_function(when, ticker, trades, data):
data1 = func1(data)
data2 = func2(data)
column_name = f"{data1.name}_greater_than_{data2.name}"
if column_name not in data.columns:
data[column_name] = 0
data.loc[data1 > data2, [column_name]] = 1
data[column_name] = data[column_name].astype(bool)
return data[column_name][when]
return return_function
def lower_than(func1, func2):
def return_function(when, ticker, trades, data):
data1 = func1(data)
data2 = func2(data)
column_name = f"{data1.name}_lower_than_{data2.name}"
if column_name not in data.columns:
data[column_name] = 0
data.loc[data1 < data2, [column_name]] = 1
data[column_name] = data[column_name].astype(bool)
return data[column_name][when]
return return_function
def lower_than_value(func1, value):
def return_function(when, ticker, trades, data):
data1 = func1(data)
column_name = f"{data1.name}_lower_than_{value}"
if column_name not in data.columns:
data[column_name] = 0
data.loc[data1 < value, [column_name]] = 1
data[column_name] = data[column_name].astype(bool)
return data[column_name][when]
return return_function
def greater_than_value(func1, value):
def return_function(when, ticker, trades, data):
data1 = func1(data)
        column_name = f"{data1.name}_greater_than_{value}"
if column_name not in data.columns:
data[column_name] = 0
data.loc[data1 > value, [column_name]] = 1
data[column_name] = data[column_name].astype(bool)
return data[column_name][when]
return return_function
def upwards_turn(func1):
def return_function(when, ticker, trades, data):
data1 = func1(data)
column_name = f"upwards_turn_{data1.name}"
if column_name not in data.columns:
data[column_name] = 0
data.loc[data1.diff(1) < 0, [column_name]] = -1
data.loc[data1.diff(1) > 0, [column_name]] = 1
data[column_name] = data[column_name].diff(1)
data[column_name].iloc[0] = 0
data.loc[data[column_name] > 0, [column_name]] = True
data.loc[data[column_name] <= 0, [column_name]] = False
return data[column_name][when]
return return_function
def downwards_turn(func1):
def return_function(when, ticker, trades, data):
data1 = func1(data)
column_name = f"downwards_turn_{data1.name}"
if column_name not in data.columns:
data[column_name] = 0
data.loc[data1.diff(1) < 0, [column_name]] = -1
data.loc[data1.diff(1) > 0, [column_name]] = 1
data[column_name] = data[column_name].diff(1)
data[column_name].iloc[0] = 0
data.loc[data[column_name] >= 0, [column_name]] = False
data.loc[data[column_name] < 0, [column_name]] = True
return data[column_name][when]
return return_function
| 30.20603 | 69 | 0.638163 | 793 | 6,011 | 4.636822 | 0.087011 | 0.195812 | 0.190373 | 0.108512 | 0.906445 | 0.89176 | 0.89176 | 0.88632 | 0.843894 | 0.757411 | 0 | 0.021797 | 0.252038 | 6,011 | 198 | 70 | 30.358586 | 0.796041 | 0.008152 | 0 | 0.628571 | 0 | 0 | 0.044829 | 0.042478 | 0 | 0 | 0 | 0 | 0 | 1 | 0.271429 | false | 0 | 0 | 0.071429 | 0.542857 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 |
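The indicators module is built from higher-order predicate combinators: every builder returns a closure with the signature `return_function(when, ticket, trades, data)`, and `apply_all` / `apply_any` / `apply_not` compose those closures into larger signals. A pandas-free sketch of how such predicates compose (the `threshold_above` builder and the stub data are illustrative, not part of the module):

```python
def threshold_above(key, limit):
    """Predicate builder: True when data[key] at `when` exceeds `limit`."""
    def return_function(when, ticket, trades, data):
        return data[key][when] > limit
    return return_function

def apply_all(criteria):
    """True only when every criterion holds at `when`."""
    def return_function(when, ticket, trades, data):
        return all(f(when, ticket, trades, data) for f in criteria)
    return return_function

def apply_not(criterium):
    """Invert a single criterion."""
    def return_function(when, ticket, trades, data):
        return not criterium(when, ticket, trades, data)
    return return_function

# Stub market data: plain dicts standing in for DataFrame columns.
data = {"close": {"2021-01-04": 101.0}, "volume": {"2021-01-04": 5_000}}

signal = apply_all([
    threshold_above("close", 100.0),
    apply_not(threshold_above("volume", 10_000)),
])
print(signal("2021-01-04", "ACME", None, data))  # True
```

Because each combinator defers evaluation until called with `(when, ticket, trades, data)`, strategies can be declared once and re-evaluated cheaply at every bar.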
ccb1bbf800520c651dcdfaa9272cb06d736d243b | 51,452 | py | Python | simscale_sdk/api/simulations_api.py | slainesimscale/simscale-python-sdk | db483eeabe558e55d020f5f829a3bf13c9c287a7 | [
"MIT"
] | 8 | 2021-01-22T13:41:03.000Z | 2022-01-03T09:00:10.000Z | simscale_sdk/api/simulations_api.py | slainesimscale/simscale-python-sdk | db483eeabe558e55d020f5f829a3bf13c9c287a7 | [
"MIT"
] | null | null | null | simscale_sdk/api/simulations_api.py | slainesimscale/simscale-python-sdk | db483eeabe558e55d020f5f829a3bf13c9c287a7 | [
"MIT"
] | 3 | 2021-03-18T15:52:52.000Z | 2022-01-03T08:59:30.000Z | # coding: utf-8
"""
SimScale API
The version of the OpenAPI document: 0.0.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from simscale_sdk.api_client import ApiClient
from simscale_sdk.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class SimulationsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def check_simulation_setup(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Check the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.check_simulation_setup(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: CheckResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.check_simulation_setup_with_http_info(project_id, simulation_id, **kwargs) # noqa: E501
def check_simulation_setup_with_http_info(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Check the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.check_simulation_setup_with_http_info(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(CheckResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'simulation_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method check_simulation_setup" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `check_simulation_setup`") # noqa: E501
# verify the required parameter 'simulation_id' is set
if self.api_client.client_side_validation and ('simulation_id' not in local_var_params or # noqa: E501
local_var_params['simulation_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `simulation_id` when calling `check_simulation_setup`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
if 'simulation_id' in local_var_params:
path_params['simulationId'] = local_var_params['simulation_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/simulations/{simulationId}/check', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CheckResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def create_geometry_primitive(self, project_id, geometry_primitive, **kwargs): # noqa: E501
"""Create a geometry primitive for reference within a Simulation spec. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_geometry_primitive(project_id, geometry_primitive, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param GeometryPrimitive geometry_primitive: Geometry primitive specification. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: GeometryPrimitiveResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.create_geometry_primitive_with_http_info(project_id, geometry_primitive, **kwargs) # noqa: E501
def create_geometry_primitive_with_http_info(self, project_id, geometry_primitive, **kwargs): # noqa: E501
"""Create a geometry primitive for reference within a Simulation spec. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_geometry_primitive_with_http_info(project_id, geometry_primitive, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param GeometryPrimitive geometry_primitive: Geometry primitive specification. (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(GeometryPrimitiveResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'geometry_primitive'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_geometry_primitive" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `create_geometry_primitive`") # noqa: E501
# verify the required parameter 'geometry_primitive' is set
if self.api_client.client_side_validation and ('geometry_primitive' not in local_var_params or # noqa: E501
local_var_params['geometry_primitive'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `geometry_primitive` when calling `create_geometry_primitive`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'geometry_primitive' in local_var_params:
body_params = local_var_params['geometry_primitive']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/geometryprimitives', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GeometryPrimitiveResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def create_simulation(self, project_id, simulation_spec, **kwargs): # noqa: E501
"""Create a simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_simulation(project_id, simulation_spec, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param SimulationSpec simulation_spec: Simulation to be created (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Simulation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.create_simulation_with_http_info(project_id, simulation_spec, **kwargs) # noqa: E501
def create_simulation_with_http_info(self, project_id, simulation_spec, **kwargs): # noqa: E501
"""Create a simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_simulation_with_http_info(project_id, simulation_spec, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param SimulationSpec simulation_spec: Simulation to be created (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(Simulation, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'simulation_spec'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_simulation" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `create_simulation`") # noqa: E501
# verify the required parameter 'simulation_spec' is set
if self.api_client.client_side_validation and ('simulation_spec' not in local_var_params or # noqa: E501
local_var_params['simulation_spec'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `simulation_spec` when calling `create_simulation`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'simulation_spec' in local_var_params:
body_params = local_var_params['simulation_spec']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/simulations', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Simulation', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def estimate_simulation_setup(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Estimate the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.estimate_simulation_setup(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Estimation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.estimate_simulation_setup_with_http_info(project_id, simulation_id, **kwargs) # noqa: E501
def estimate_simulation_setup_with_http_info(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Estimate the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.estimate_simulation_setup_with_http_info(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(Estimation, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'simulation_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method estimate_simulation_setup" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `estimate_simulation_setup`") # noqa: E501
# verify the required parameter 'simulation_id' is set
if self.api_client.client_side_validation and ('simulation_id' not in local_var_params or # noqa: E501
local_var_params['simulation_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `simulation_id` when calling `estimate_simulation_setup`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
if 'simulation_id' in local_var_params:
path_params['simulationId'] = local_var_params['simulation_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/simulations/{simulationId}/estimate', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Estimation', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_simulation(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Get information about the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_simulation(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param str simulation_spec_schema_version: Version of the schema the simulation spec should conform to
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: SimulationSpec
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_simulation_with_http_info(project_id, simulation_id, **kwargs) # noqa: E501
def get_simulation_with_http_info(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Get information about the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_simulation_with_http_info(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param str simulation_spec_schema_version: Version of the schema the simulation spec should conform to
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(SimulationSpec, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'simulation_id',
'simulation_spec_schema_version'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_simulation" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `get_simulation`") # noqa: E501
# verify the required parameter 'simulation_id' is set
if self.api_client.client_side_validation and ('simulation_id' not in local_var_params or # noqa: E501
local_var_params['simulation_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `simulation_id` when calling `get_simulation`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
if 'simulation_id' in local_var_params:
path_params['simulationId'] = local_var_params['simulation_id'] # noqa: E501
query_params = []
if 'simulation_spec_schema_version' in local_var_params and local_var_params['simulation_spec_schema_version'] is not None: # noqa: E501
query_params.append(('simulationSpecSchemaVersion', local_var_params['simulation_spec_schema_version'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/simulations/{simulationId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SimulationSpec', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
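The docstrings above describe the generated client's calling convention: synchronous by default, and a thread-like handle whose `.get()` blocks for the result when `async_req=True`. A minimal stand-in sketching that convention (the class `FakeSimulationApi` is hypothetical, not the real generated client, though generated clients of this style typically use `multiprocessing.pool.ThreadPool` internally):

```python
from multiprocessing.pool import ThreadPool


class FakeSimulationApi:
    """Hypothetical stub illustrating the async_req=True calling pattern."""

    def __init__(self):
        self._pool = ThreadPool(1)

    def get_simulation(self, project_id, simulation_id, async_req=False):
        if async_req:
            # apply_async returns an AsyncResult; its .get() blocks for the
            # value, mirroring the "thread.get()" usage in the docstrings.
            return self._pool.apply_async(self._fetch, (project_id, simulation_id))
        return self._fetch(project_id, simulation_id)

    def _fetch(self, project_id, simulation_id):
        return {'projectId': project_id, 'simulationId': simulation_id}


api = FakeSimulationApi()
thread = api.get_simulation('p1', 's1', async_req=True)
result = thread.get()
print(result['simulationId'])  # s1
```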
def get_simulation_sdk_code(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Get Python SDK code for the simulation # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_simulation_sdk_code(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param str sdk_version: Version of the SDK to generate code for
:param str sdk_language: Language of the SDK to generate code for. Only Python is currently supported.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_simulation_sdk_code_with_http_info(project_id, simulation_id, **kwargs) # noqa: E501
def get_simulation_sdk_code_with_http_info(self, project_id, simulation_id, **kwargs): # noqa: E501
"""Get Python SDK code for the simulation # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_simulation_sdk_code_with_http_info(project_id, simulation_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param str sdk_version: Version of the SDK to generate code for
:param str sdk_language: Language of the SDK to generate code for. Only Python is currently supported.
:param _return_http_data_only: return the response data only, without
the status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(str, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'simulation_id',
'sdk_version',
'sdk_language'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_simulation_sdk_code" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `get_simulation_sdk_code`") # noqa: E501
# verify the required parameter 'simulation_id' is set
if self.api_client.client_side_validation and ('simulation_id' not in local_var_params or # noqa: E501
local_var_params['simulation_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `simulation_id` when calling `get_simulation_sdk_code`") # noqa: E501
if self.api_client.client_side_validation and 'sdk_version' in local_var_params and not re.search(r'latest|\d+\.\d+\.\d+', local_var_params['sdk_version']): # noqa: E501
raise ApiValueError(r"Invalid value for parameter `sdk_version` when calling `get_simulation_sdk_code`, must conform to the pattern `/latest|\d+\.\d+\.\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
if 'simulation_id' in local_var_params:
path_params['simulationId'] = local_var_params['simulation_id'] # noqa: E501
query_params = []
if 'sdk_version' in local_var_params and local_var_params['sdk_version'] is not None: # noqa: E501
query_params.append(('sdkVersion', local_var_params['sdk_version'])) # noqa: E501
if 'sdk_language' in local_var_params and local_var_params['sdk_language'] is not None: # noqa: E501
query_params.append(('sdkLanguage', local_var_params['sdk_language'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain;charset=UTF-8', 'application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/simulations/{simulationId}/sdkcode', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
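The `sdk_version` check above relies on `re.search` with the unanchored pattern `latest|\d+\.\d+\.\d+`. A small sketch of what that validation accepts; note that because `re.search` matches anywhere in the string, inputs merely containing a version-like substring also pass:

```python
import re

# Pattern as used by the client-side validation above.
SDK_VERSION_RE = re.compile(r'latest|\d+\.\d+\.\d+')


def is_valid_sdk_version(value):
    """Mimic the client-side check: any match anywhere in the string passes."""
    return bool(SDK_VERSION_RE.search(value))


print(is_valid_sdk_version('latest'))      # True
print(is_valid_sdk_version('1.2.3'))       # True
print(is_valid_sdk_version('v1.2.3-rc1'))  # True -- substring match, not a full match
print(is_valid_sdk_version('stable'))      # False
```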
def get_simulations(self, project_id, **kwargs): # noqa: E501
"""List simulation setups within a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_simulations(project_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param int limit: The number of items to return.
:param int page: The page number. Use in combination with limit.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Simulations
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_simulations_with_http_info(project_id, **kwargs) # noqa: E501
def get_simulations_with_http_info(self, project_id, **kwargs): # noqa: E501
"""List simulation setups within a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_simulations_with_http_info(project_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param int limit: The number of items to return.
:param int page: The page number. Use in combination with limit.
:param _return_http_data_only: return the response data only, without
the status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(Simulations, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'limit',
'page'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_simulations" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `get_simulations`") # noqa: E501
if self.api_client.client_side_validation and 'limit' in local_var_params and local_var_params['limit'] > 1000: # noqa: E501
raise ApiValueError("Invalid value for parameter `limit` when calling `get_simulations`, must be a value less than or equal to `1000`") # noqa: E501
if self.api_client.client_side_validation and 'limit' in local_var_params and local_var_params['limit'] < 10: # noqa: E501
raise ApiValueError("Invalid value for parameter `limit` when calling `get_simulations`, must be a value greater than or equal to `10`") # noqa: E501
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] > 1000: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `get_simulations`, must be a value less than or equal to `1000`") # noqa: E501
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] < 1: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `get_simulations`, must be a value greater than or equal to `1`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
query_params = []
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/simulations', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Simulations', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
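The client-side checks above bound `limit` to [10, 1000] and `page` to [1, 1000]. A hypothetical standalone helper (`validate_paging` is not part of the generated client) condensing the same rules:

```python
def validate_paging(limit=None, page=None):
    """Raise ValueError when paging parameters fall outside the API's bounds."""
    if limit is not None and not 10 <= limit <= 1000:
        raise ValueError('`limit` must be between 10 and 1000')
    if page is not None and not 1 <= page <= 1000:
        raise ValueError('`page` must be between 1 and 1000')


validate_paging(limit=10, page=1000)  # within bounds: no error
for bad in ({'limit': 5}, {'limit': 1001}, {'page': 0}):
    try:
        validate_paging(**bad)
    except ValueError as exc:
        print(exc)
```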
def update_simulation(self, project_id, simulation_id, simulation_spec, **kwargs): # noqa: E501
"""Update information about the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_simulation(project_id, simulation_id, simulation_spec, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param SimulationSpec simulation_spec: Simulation to be updated (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.update_simulation_with_http_info(project_id, simulation_id, simulation_spec, **kwargs) # noqa: E501
def update_simulation_with_http_info(self, project_id, simulation_id, simulation_spec, **kwargs): # noqa: E501
"""Update information about the simulation setup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_simulation_with_http_info(project_id, simulation_id, simulation_spec, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str project_id: The project ID (required)
:param str simulation_id: The simulation ID (required)
:param SimulationSpec simulation_spec: Simulation to be updated (required)
:param _return_http_data_only: return the response data only, without
the status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'project_id',
'simulation_id',
'simulation_spec'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method update_simulation" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `update_simulation`") # noqa: E501
# verify the required parameter 'simulation_id' is set
if self.api_client.client_side_validation and ('simulation_id' not in local_var_params or # noqa: E501
local_var_params['simulation_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `simulation_id` when calling `update_simulation`") # noqa: E501
# verify the required parameter 'simulation_spec' is set
if self.api_client.client_side_validation and ('simulation_spec' not in local_var_params or # noqa: E501
local_var_params['simulation_spec'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `simulation_spec` when calling `update_simulation`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['projectId'] = local_var_params['project_id'] # noqa: E501
if 'simulation_id' in local_var_params:
path_params['simulationId'] = local_var_params['simulation_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'simulation_spec' in local_var_params:
body_params = local_var_params['simulation_spec']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apiKey'] # noqa: E501
return self.api_client.call_api(
'/projects/{projectId}/simulations/{simulationId}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
| 48.955281 | 183 | 0.609481 | 5,763 | 51,452 | 5.179594 | 0.040083 | 0.045293 | 0.071759 | 0.022513 | 0.958861 | 0.954037 | 0.948643 | 0.938827 | 0.932127 | 0.920201 | 0 | 0.015045 | 0.317908 | 51,452 | 1,050 | 184 | 49.001905 | 0.835504 | 0.415416 | 0 | 0.717557 | 1 | 0.009542 | 0.220506 | 0.052989 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032443 | false | 0 | 0.009542 | 0 | 0.074427 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
aeab5e9d63f47fd401893cc6febb2a9028b54dff | 18,640 | py | Python | src/sim/14-covid19/ext_rule.py | momacs/pram | d2de43ea447d13a65d814f781ec86889754f76fe | [
"BSD-3-Clause"
] | 10 | 2019-01-18T19:11:54.000Z | 2022-03-16T08:39:36.000Z | src/sim/14-covid19/ext_rule.py | momacs/pram | d2de43ea447d13a65d814f781ec86889754f76fe | [
"BSD-3-Clause"
] | 2 | 2019-02-19T15:10:44.000Z | 2019-02-26T04:26:24.000Z | src/sim/14-covid19/ext_rule.py | momacs/pram | d2de43ea447d13a65d814f781ec86889754f76fe | [
"BSD-3-Clause"
] | 3 | 2019-02-19T15:11:08.000Z | 2021-08-20T11:51:04.000Z | from pram.entity import Group, GroupQry, GroupSplitSpec, Site
from pram.io.pop import PopulationLocation
from pram.rule import Rule, SimRule
from ext_group_qry import gq_S, gq_E, gq_IA, gq_IS, gq_R
from ext_data import disease_name
# ----------------------------------------------------------------------------------------------------------------------
class DailyBehaviorRule(Rule):
"""
Model of daily human behavior.
Args:
default_dst_site (Site): Default destination site.
p_home_IS (float): Probability of staying at home when in infected symptomatic (IS) state.
UI:
p_home_IS (type:float, mode:range, none:False, min:0.00, max:1.00, init:0.80)
"""
def __init__(self, default_dst_site, p_home_IS):
self.default_dst_site = default_dst_site
self.p_home_IS = p_home_IS
super().__init__(f'daily-behavior-{self.default_dst_site}', group_qry=GroupQry(cond=[lambda g: g.has_rel(self.default_dst_site)]))
def apply(self, pop, group, iter, t):
is_morning = not iter % 2 # TODO: Make this a simulation variable
if is_morning:
return self.apply_morning(pop, group, iter, t)
else:
return self.apply_evening(pop, group, iter, t)
def apply_morning(self, pop, group, iter, t):
if pop.sim.get_var('closedown'):
return None
if group.has_attr({ disease_name: 'IS' }):
return [
GroupSplitSpec(p=1.0 - self.p_home_IS, rel_set={ Site.AT: group.get_rel(self.default_dst_site) }),
GroupSplitSpec(p= self.p_home_IS)
]
return [GroupSplitSpec(p=1.0, rel_set={ Site.AT: group.get_rel(self.default_dst_site) })]
def apply_evening(self, pop, group, iter, t):
return [GroupSplitSpec(p=1.0, rel_set={ Site.AT: group.get_rel('home') })]
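`DailyBehaviorRule` runs two iterations per simulated day and dispatches on iteration parity (`not iter % 2`); the mobility rules further below recover the ordinal day number as `iter // 2 + 1`. A small sketch of that bookkeeping:

```python
def phase(iteration):
    # Even iterations are mornings, odd iterations are evenings.
    return 'morning' if not iteration % 2 else 'evening'


def day_of_sim(iteration):
    # Ordinal day number, counting from 1, at two iterations per day.
    return iteration // 2 + 1


print([phase(i) for i in range(4)])       # ['morning', 'evening', 'morning', 'evening']
print([day_of_sim(i) for i in range(4)])  # [1, 1, 2, 2]
```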
# ----------------------------------------------------------------------------------------------------------------------
class DiseaseRule(Rule):
def __init__(self, primary_E_site, r0, p_E_IA, p_IA_IS, p_IS_R, p_home_E, p_social_E, soc_dist_comp_young, soc_dist_comp_old, p_fat_by_age_group):
if p_home_E + p_social_E > 1.0:
raise ValueError('p_home_E + p_social_E cannot be greater than 1.')
self.primary_E_site = primary_E_site # primary exposure site
self.r0 = r0
self.p_E_IA = p_E_IA
self.p_IA_IS = p_IA_IS
self.p_IS_R = p_IS_R
self.p_home_E = p_home_E
self.p_social_E = p_social_E
self.soc_dist_comp_young = soc_dist_comp_young # social distancing compliance (young people)
self.soc_dist_comp_old = soc_dist_comp_old # social distancing compliance (old people)
self.p_fat_by_age_group = p_fat_by_age_group
super().__init__(f'disease-progress-{self.primary_E_site}', group_qry=GroupQry(cond=[lambda g: g.has_rel(self.primary_E_site)]))
def apply(self, pop, group, iter, t):
if group.has_attr({ disease_name: 'S' }):
p_E = 0.0
if group.is_at_site_name(self.primary_E_site):
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
p_E = min(1.0, self.r0 * (prop_I))
elif self.p_home_E > 0 or self.p_social_E > 0:
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
if group.get_attr('age_group') == '0-50':
soc_dist_comp = self.soc_dist_comp_young
else:
soc_dist_comp = self.soc_dist_comp_old
p_E = min(1.0, self.r0 * (prop_I))
p_E = p_E * self.p_home_E + p_E * self.p_social_E * (1.0 - soc_dist_comp)
if p_E == 0: # nothing to split (TODO: Can this be worked into PyPRAM?)
return None
else:
return [
GroupSplitSpec(p=1 - p_E, attr_set={ disease_name: 'S' }),
GroupSplitSpec(p= p_E, attr_set={ disease_name: 'E' })
]
if group.has_attr({ disease_name: 'E' }):
return [
GroupSplitSpec(p=1 - self.p_E_IA, attr_set={ disease_name: 'E' }),
GroupSplitSpec(p= self.p_E_IA, attr_set={ disease_name: 'IA' }),
]
if group.has_attr({ disease_name: 'IA' }):
return [
GroupSplitSpec(p=1 - self.p_IA_IS, attr_set={ disease_name: 'IA' }),
GroupSplitSpec(p= self.p_IA_IS, attr_set={ disease_name: 'IS' })
]
if group.has_attr({ disease_name: 'IS' }):
p_fat = self.p_fat_by_age_group.get(group.get_attr('age_group'))  # no 0.0 default: unknown age groups should fail loudly below, not silently get zero fatality
if p_fat is None:
raise ValueError(f"Unexpected age group: {group.get_attr('age_group')}")
return [
GroupSplitSpec(p= p_fat, attr_set=Group.VOID), # TODO: Make this a special class
GroupSplitSpec(p=1 - self.p_IS_R, attr_set={ disease_name: 'IS' }),
GroupSplitSpec(p= self.p_IS_R - p_fat, attr_set={ disease_name: 'R' })
]
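The susceptible-branch arithmetic in `apply` above caps the force of infection at 1 and then mixes a home channel with a compliance-damped social channel. A standalone sketch with illustrative (uncalibrated) numbers:

```python
def p_exposure(r0, prop_infectious, p_home, p_social, compliance):
    """Exposure probability as computed in the rule's susceptible branch."""
    p = min(1.0, r0 * prop_infectious)
    # Split exposure between the home channel and the social channel,
    # with social contacts damped by distancing compliance.
    return p * p_home + p * p_social * (1.0 - compliance)


p = p_exposure(r0=2.5, prop_infectious=0.1, p_home=0.4, p_social=0.4, compliance=0.5)
print(round(p, 3))  # 0.15
```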
# ----------------------------------------------------------------------------------------------------------------------
class DiseaseRule2(Rule):
def __init__(self, primary_E_site, sei2r_params, p_home_E, p_social_E, soc_dist_comp_young, soc_dist_comp_old, p_fat_by_age_group):
if p_home_E + p_social_E > 1.0:
raise ValueError('p_home_E + p_social_E cannot be greater than 1.')
self.primary_E_site = primary_E_site # primary exposure site
self.sei2r_params = sei2r_params
self.p_home_E = p_home_E
self.p_social_E = p_social_E
self.soc_dist_comp_young = soc_dist_comp_young # social distancing compliance (young people)
self.soc_dist_comp_old = soc_dist_comp_old # social distancing compliance (old people)
self.p_fat_by_age_group = p_fat_by_age_group
super().__init__(f'disease-progress-{self.primary_E_site}', group_qry=GroupQry(cond=[lambda g: g.has_rel(self.primary_E_site)]))
def apply(self, pop, group, iter, t):
if group.has_attr({ disease_name: 'S' }):
p_E = 0.0
if group.is_at_site_name(self.primary_E_site):
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
p_E = min(1.0, self.sei2r_params.r0 * (prop_I))
elif self.p_home_E > 0 or self.p_social_E > 0:
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
if group.get_attr('age_group') == '0-50':
soc_dist_comp = self.soc_dist_comp_young
else:
soc_dist_comp = self.soc_dist_comp_old
p_E = min(1.0, self.sei2r_params.r0 * (prop_I))
p_E = p_E * self.p_home_E + p_E * self.p_social_E * (1.0 - soc_dist_comp)
if p_E == 0: # nothing to split (TODO: Can this be worked into PyPRAM?)
return None
else:
return [
GroupSplitSpec(p=1 - p_E, attr_set={ disease_name: 'S' }),
GroupSplitSpec(p= p_E, attr_set={ disease_name: 'E' })
]
if group.has_attr({ disease_name: 'E' }):
return [
GroupSplitSpec(p=1 - self.sei2r_params.kappa_1, attr_set={ disease_name: 'E' }),
GroupSplitSpec(p= self.sei2r_params.kappa_1, attr_set={ disease_name: 'IA' }),
]
if group.has_attr({ disease_name: 'IA' }):
return [
GroupSplitSpec(p=1 - self.sei2r_params.kappa_2, attr_set={ disease_name: 'IA' }),
GroupSplitSpec(p= self.sei2r_params.kappa_2, attr_set={ disease_name: 'IS' })
]
if group.has_attr({ disease_name: 'IS' }):
p_fat = self.p_fat_by_age_group.get(group.get_attr('age_group'))  # no 0.0 default: unknown age groups should fail loudly below, not silently get zero fatality
if p_fat is None:
raise ValueError(f"Unexpected age group: {group.get_attr('age_group')}")
return [
GroupSplitSpec(p= p_fat, attr_set=Group.VOID), # TODO: Make this a special class
GroupSplitSpec(p=1 - self.sei2r_params.gamma, attr_set={ disease_name: 'IS' }),
GroupSplitSpec(p= self.sei2r_params.gamma - p_fat, attr_set={ disease_name: 'R' })
]
# ----------------------------------------------------------------------------------------------------------------------
class DiseaseRuleMobility(Rule):
def __init__(self, primary_E_site, r0, p_E_IA, p_IA_IS, p_IS_R, p_home_E, soc_dist_comp_young, soc_dist_comp_old, p_fat_by_age_group, pop_mobility=None, p_social_E_max=1.0):
self.primary_E_site = primary_E_site # primary exposure site
self.r0 = r0
self.p_E_IA = p_E_IA
self.p_IA_IS = p_IA_IS
self.p_IS_R = p_IS_R
self.p_home_E = p_home_E
self.soc_dist_comp_young = soc_dist_comp_young # social distancing compliance (young people)
self.soc_dist_comp_old = soc_dist_comp_old # social distancing compliance (old people)
self.p_social_E_max = p_social_E_max
self.p_fat_by_age_group = p_fat_by_age_group
if isinstance(pop_mobility, str): # pop_mobility is assumed to contain database file path
self.pop_mobility = PopulationLocation(pop_mobility)
self.pop_mobility.set_mobility_first_day_of_year(61)
self.pop_mobility.set_contacts_first_day_of_year(61)
else:
self.pop_mobility = pop_mobility
super().__init__(f'disease-progress-mobility-{self.primary_E_site}', group_qry=GroupQry(cond=[lambda g: g.has_rel(self.primary_E_site)]))
def apply(self, pop, group, iter, t):
if group.has_attr({ disease_name: 'S' }):
p_E = 0.0
if group.is_at_site_name(self.primary_E_site):
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
p_E = min(1.0, self.r0 * (prop_I))
else:
if isinstance(self.pop_mobility, PopulationLocation):
day = iter // 2 + 1 # ordinal number of day from the start of simulation (two iterations per day)
p_social_E = min((self.pop_mobility.get_contacts_by_day_of_year(group.get_rel('home').name, 2020, day, True) or 0) / 100, self.p_social_E_max)  # parenthesised so the 0 fallback applies before the percent-to-probability scaling
elif isinstance(self.pop_mobility, float):
p_social_E = min(self.pop_mobility, self.p_social_E_max)
else:
p_social_E = 0.0  # no mobility data: assume no social-contact exposure
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
if group.get_attr('age_group') == '0-50':
soc_dist_comp = self.soc_dist_comp_young
else:
soc_dist_comp = self.soc_dist_comp_old
p_E = min(1.0, self.r0 * (prop_I))
p_E = p_E * self.p_home_E + p_E * p_social_E * (1.0 - soc_dist_comp)
if p_E == 0: # nothing to split (TODO: Can this be worked into PyPRAM?)
return None
else:
return [
GroupSplitSpec(p=1 - p_E, attr_set={ disease_name: 'S' }),
GroupSplitSpec(p= p_E, attr_set={ disease_name: 'E' })
]
if group.has_attr({ disease_name: 'E' }):
return [
GroupSplitSpec(p=1 - self.p_E_IA, attr_set={ disease_name: 'E' }),
GroupSplitSpec(p= self.p_E_IA, attr_set={ disease_name: 'IA' }),
]
if group.has_attr({ disease_name: 'IA' }):
return [
GroupSplitSpec(p=1 - self.p_IA_IS, attr_set={ disease_name: 'IA' }),
GroupSplitSpec(p= self.p_IA_IS, attr_set={ disease_name: 'IS' })
]
if group.has_attr({ disease_name: 'IS' }):
p_fat = self.p_fat_by_age_group.get(group.get_attr('age_group'))  # no 0.0 default: unknown age groups should fail loudly below, not silently get zero fatality
if p_fat is None:
raise ValueError(f"Unexpected age group: {group.get_attr('age_group')}")
return [
GroupSplitSpec(p= p_fat, attr_set=Group.VOID), # TODO: Make this a special class
GroupSplitSpec(p=1 - self.p_IS_R, attr_set={ disease_name: 'IS' }),
GroupSplitSpec(p= self.p_IS_R - p_fat, attr_set={ disease_name: 'R' })
]
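The mobility-based rules scale a contacts figure from the location database into a probability while substituting 0 for missing data. When a lookup like that combines `or` with division, precedence matters: `x or 0 / 100` parses as `x or (0 / 100)`, so the division applies only to the fallback, never to a looked-up value. A quick demonstration:

```python
contacts_pct = 50  # illustrative percentage from a lookup

print(contacts_pct or 0 / 100)    # 50   (division applies only to the fallback)
print((contacts_pct or 0) / 100)  # 0.5  (fallback first, then percent -> probability)
print(None or 0 / 100)            # 0.0  (missing value: both forms agree)
```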
# ----------------------------------------------------------------------------------------------------------------------
class DiseaseRuleMobility2(Rule):
def __init__(self, primary_E_site, sei2r_params, p_home_E, soc_dist_comp_young, soc_dist_comp_old, p_fat_by_age_group, pop_mobility=None, p_social_E_max=1.0):
self.primary_E_site = primary_E_site # primary exposure site
self.sei2r_params = sei2r_params
self.p_home_E = p_home_E
self.soc_dist_comp_young = soc_dist_comp_young # social distancing compliance (young people)
self.soc_dist_comp_old = soc_dist_comp_old # social distancing compliance (old people)
self.p_social_E_max = p_social_E_max
self.p_fat_by_age_group = p_fat_by_age_group
if isinstance(pop_mobility, str): # pop_mobility is assumed to contain database file path
self.pop_mobility = PopulationLocation(pop_mobility)
self.pop_mobility.set_mobility_first_day_of_year(61)
self.pop_mobility.set_contacts_first_day_of_year(61)
else:
self.pop_mobility = pop_mobility
super().__init__(f'disease-progress-mobility-{self.primary_E_site}', group_qry=GroupQry(cond=[lambda g: g.has_rel(self.primary_E_site)]))
def apply(self, pop, group, iter, t):
if group.has_attr({ disease_name: 'S' }):
p_E = 0.0
if group.is_at_site_name(self.primary_E_site):
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
p_E = min(1.0, self.sei2r_params.r0 * (prop_I))
else:
if isinstance(self.pop_mobility, PopulationLocation):
day = iter // 2 + 1 # ordinal number of day from the start of simulation (two iterations per day)
p_social_E = min((self.pop_mobility.get_contacts_by_day_of_year(group.get_rel('home').name, 2020, day, True) or 0) / 100, self.p_social_E_max)  # parenthesised so the 0 fallback applies before the percent-to-probability scaling
elif isinstance(self.pop_mobility, float):
p_social_E = min(self.pop_mobility, self.p_social_E_max)
else:
p_social_E = 0.0  # no mobility data: assume no social-contact exposure
prop_IA = group.get_site_at().get_mass_prop(gq_IA)
prop_IS = group.get_site_at().get_mass_prop(gq_IS)
prop_I = prop_IA + prop_IS
if group.get_attr('age_group') == '0-50':
soc_dist_comp = self.soc_dist_comp_young
else:
soc_dist_comp = self.soc_dist_comp_old
p_E = min(1.0, self.sei2r_params.r0 * (prop_I))
p_E = p_E * self.p_home_E + p_E * p_social_E * (1.0 - soc_dist_comp)
if p_E == 0: # nothing to split (TODO: Can this be worked into PyPRAM?)
return None
else:
return [
GroupSplitSpec(p=1 - p_E, attr_set={ disease_name: 'S' }),
GroupSplitSpec(p= p_E, attr_set={ disease_name: 'E' })
]
if group.has_attr({ disease_name: 'E' }):
return [
GroupSplitSpec(p=1 - self.sei2r_params.kappa_1, attr_set={ disease_name: 'E' }),
GroupSplitSpec(p= self.sei2r_params.kappa_1, attr_set={ disease_name: 'IA' }),
]
if group.has_attr({ disease_name: 'IA' }):
return [
GroupSplitSpec(p=1 - self.sei2r_params.kappa_2, attr_set={ disease_name: 'IA' }),
GroupSplitSpec(p= self.sei2r_params.kappa_2, attr_set={ disease_name: 'IS' })
]
if group.has_attr({ disease_name: 'IS' }):
p_fat = self.p_fat_by_age_group.get(group.get_attr('age_group'))  # no 0.0 default: unknown age groups should fail loudly below, not silently get zero fatality
if p_fat is None:
raise ValueError(f"Unexpected age group: {group.get_attr('age_group')}")
return [
GroupSplitSpec(p= p_fat, attr_set=Group.VOID), # TODO: Make this a special class
GroupSplitSpec(p=1 - self.sei2r_params.gamma, attr_set={ disease_name: 'IS' }),
GroupSplitSpec(p= self.sei2r_params.gamma - p_fat, attr_set={ disease_name: 'R' })
]
# ----------------------------------------------------------------------------------------------------------------------
class ClosedownIntervention(SimRule):
    def __init__(self, prop_pop_IS_threshold):
        super().__init__('closedown-intervention')
        self.vars = { 'closedown': False }
        self.prop_pop_IS_threshold = prop_pop_IS_threshold

    def apply(self, sim, iter, t):
        if not sim.get_var('closedown'):
            prop_inf = sim.pop.get_groups_mass_prop(gq_IS)
            if prop_inf >= self.prop_pop_IS_threshold:
                sim.set_var('closedown', True)


# ----------------------------------------------------------------------------------------------------------------------
class ReopenIntervention(SimRule):
    def __init__(self, prop_pop_IS_threshold):
        super().__init__('reopen-intervention')
        self.vars = { 'closedown': False }
        self.prop_pop_IS_threshold = prop_pop_IS_threshold

    def apply(self, sim, iter, t):
        if sim.get_var('closedown'):
            prop_inf = sim.pop.get_groups_mass_prop(gq_IS)
            if prop_inf <= self.prop_pop_IS_threshold:
                sim.set_var('closedown', False)
9dfb360e8e73a010f6cc6b9e6529ab413ba9785b | 15208 | py | Python | proliantutils/tests/redfish/resources/system/test_bios.py | openstack/deb-python-proliantutils | ["Apache-2.0"]
# Copyright 2017 Hewlett Packard Enterprise Development LP
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import json
import mock
import sushy
import testtools
from proliantutils import exception
from proliantutils.redfish.resources.system import bios
from proliantutils.redfish.resources.system import constants as sys_cons
class BIOSSettingsTestCase(testtools.TestCase):

    def setUp(self):
        super(BIOSSettingsTestCase, self).setUp()
        self.conn = mock.MagicMock()
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])

        self.bios_inst = bios.BIOSSettings(
            self.conn, '/redfish/v1/Systems/1/bios',
            redfish_version='1.0.2')

    def test_attributes(self):
        self.assertEqual(sys_cons.BIOS_BOOT_MODE_UEFI,
                         self.bios_inst.boot_mode)
        self.assertEqual(sys_cons.SRIOV_ENABLED,
                         self.bios_inst.sriov)
        self.assertEqual(sys_cons.CPUVT_ENABLED,
                         self.bios_inst.cpu_vt)

    def test_pending_settings(self):
        self.assertIsNone(self.bios_inst._pending_settings)
        self.conn.get.return_value.json.reset_mock()
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['BIOS_pending_settings_default'])
        actual_settings = self.bios_inst.pending_settings
        self.assertIsInstance(actual_settings,
                              bios.BIOSPendingSettings)
        self.conn.get.return_value.json.assert_called_once_with()

        # reset mock
        self.conn.get.return_value.json.reset_mock()
        self.assertIs(actual_settings,
                      self.bios_inst.pending_settings)
        self.conn.get.return_value.json.assert_not_called()

    def test_boot_settings(self):
        self.assertIsNone(self.bios_inst._boot_settings)
        self.conn.get.return_value.json.reset_mock()
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])
        actual_settings = self.bios_inst.boot_settings
        self.assertIsInstance(actual_settings,
                              bios.BIOSBootSettings)
        self.conn.get.return_value.json.assert_called_once_with()

        # reset mock
        self.conn.get.return_value.json.reset_mock()
        self.assertIs(actual_settings,
                      self.bios_inst.boot_settings)
        self.conn.get.return_value.json.assert_not_called()

    def test__get_base_configs(self):
        self.assertIsNone(self.bios_inst._base_configs)
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_base_configs.json', 'r') as f:
            self.conn.get.return_value.json.return_value = json.loads(f.read())
        default_settings = self.bios_inst._get_base_configs()
        self.assertIsInstance(default_settings, bios.BIOSBaseConfigs)

    def test_pending_settings_on_refresh(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['BIOS_pending_settings_default'])
        actual_settings = self.bios_inst.pending_settings
        self.assertIsInstance(actual_settings,
                              bios.BIOSPendingSettings)

        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])
        self.bios_inst.refresh()
        self.assertIsNone(self.bios_inst._pending_settings)

        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['BIOS_pending_settings_default'])
        self.assertIsInstance(actual_settings,
                              bios.BIOSPendingSettings)

    def test_boot_settings_on_refresh(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])
        actual_settings = self.bios_inst.boot_settings
        self.assertIsInstance(actual_settings,
                              bios.BIOSBootSettings)

        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])
        self.bios_inst.refresh()
        self.assertIsNone(self.bios_inst._boot_settings)

        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])
        self.assertIsInstance(actual_settings,
                              bios.BIOSBootSettings)

    def test__get_base_configs_on_refresh(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_base_configs.json', 'r') as f:
            self.conn.get.return_value.json.return_value = json.loads(f.read())
        default_settings = self.bios_inst._get_base_configs()
        self.assertIsInstance(default_settings, bios.BIOSBaseConfigs)

        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])
        self.bios_inst.refresh()
        self.assertIsNone(self.bios_inst._base_configs)

        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_base_configs.json', 'r') as f:
            self.conn.get.return_value.json.return_value = json.loads(f.read())
        self.assertIsInstance(default_settings, bios.BIOSBaseConfigs)
class BIOSBaseConfigsTestCase(testtools.TestCase):

    def setUp(self):
        super(BIOSBaseConfigsTestCase, self).setUp()
        self.conn = mock.MagicMock()
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_base_configs.json', 'r') as f:
            self.conn.get.return_value.json.return_value = json.loads(f.read())

        self.bios_base_inst = bios.BIOSBaseConfigs(
            self.conn, '/redfish/v1/Systems/1/bios/baseconfigs',
            redfish_version='1.0.2')

    def test_attributes(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_base_configs.json', 'r') as f:
            bios_default = json.loads(f.read())['BaseConfigs'][0]['default']
        self.assertEqual(bios_default, self.bios_base_inst.default_config)
class BIOSPendingSettingsTestCase(testtools.TestCase):

    def setUp(self):
        super(BIOSPendingSettingsTestCase, self).setUp()
        self.conn = mock.MagicMock()
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['BIOS_pending_settings_default'])

        self.bios_settings_inst = bios.BIOSPendingSettings(
            self.conn, '/redfish/v1/Systems/1/bios/settings',
            redfish_version='1.0.2')

    def test_attributes(self):
        self.assertEqual(sys_cons.BIOS_BOOT_MODE_UEFI,
                         self.bios_settings_inst.boot_mode)

    def test_set_pending_boot_mode_bios(self):
        self.bios_settings_inst.set_pending_boot_mode(
            sys_cons.BIOS_BOOT_MODE_LEGACY_BIOS)
        data = {
            'Attributes': {
                'BootMode': 'LegacyBios'
            }
        }
        self.bios_settings_inst._conn.patch.assert_called_once_with(
            '/redfish/v1/Systems/1/bios/settings', data=data)

    def test_set_pending_boot_mode_uefi(self):
        self.bios_settings_inst.set_pending_boot_mode(
            sys_cons.BIOS_BOOT_MODE_UEFI)
        data = {
            'Attributes': {
                'BootMode': 'Uefi',
                'UefiOptimizedBoot': 'Enabled'
            }
        }
        self.bios_settings_inst._conn.patch.assert_called_once_with(
            '/redfish/v1/Systems/1/bios/settings', data=data)

    def test_update_bios_data_by_post(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_base_configs.json', 'r') as f:
            bios_settings = json.loads(f.read())['BaseConfigs'][0]['default']
        target_uri = '/redfish/v1/Systems/1/bios/settings'
        data = {
            'Attributes': bios_settings
        }
        self.bios_settings_inst.update_bios_data_by_post(bios_settings)
        self.bios_settings_inst._conn.post.assert_called_once_with(target_uri,
                                                                   data=data)

    def test_update_bios_data_by_patch(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_base_configs.json', 'r') as f:
            bios_settings = json.loads(f.read())['BaseConfigs'][0]['default']
        target_uri = '/redfish/v1/Systems/1/bios/settings'
        data = {
            'Attributes': bios_settings
        }
        self.bios_settings_inst.update_bios_data_by_patch(bios_settings)
        self.bios_settings_inst._conn.patch.assert_called_once_with(target_uri,
                                                                    data=data)
class BIOSBootSettingsTestCase(testtools.TestCase):

    def setUp(self):
        super(BIOSBootSettingsTestCase, self).setUp()
        self.conn = mock.MagicMock()
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            self.conn.get.return_value.json.return_value = (
                json.loads(f.read())['Default'])

        self.bios_boot_inst = bios.BIOSBootSettings(
            self.conn, '/redfish/v1/Systems/1/bios/boot',
            redfish_version='1.0.2')

    def test__attributes(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (json.loads(f.read())['Default'])
        self.assertEqual(boot_json['BootSources'],
                         self.bios_boot_inst.boot_sources)
        self.assertEqual(boot_json['PersistentBootConfigOrder'],
                         self.bios_boot_inst.persistent_boot_config_order)

    def test_get_persistent_boot_device(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (json.loads(f.read())['Default'])
        self.bios_boot_inst.persistent_boot_config_order = (
            boot_json['PersistentBootConfigOrder'])
        self.bios_boot_inst.boot_sources = boot_json['BootSources']
        result = self.bios_boot_inst.get_persistent_boot_device()
        self.assertEqual(result, sushy.BOOT_SOURCE_TARGET_HDD)

    def test_get_persistent_boot_device_without_boot(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (json.loads(f.read())['BIOS_boot_without_boot'])
        self.bios_boot_inst.boot_sources = boot_json['BootSources']
        self.bios_boot_inst.persistent_boot_config_order = (
            boot_json['PersistentBootConfigOrder'])
        self.assertRaisesRegex(
            exception.IloError,
            'Persistent boot device failed, as no matched boot sources '
            'found for device:',
            self.bios_boot_inst.get_persistent_boot_device)

    def test_get_persistent_boot_device_none(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (
                json.loads(f.read())['BIOS_persistent_boot_device_none'])
        self.bios_boot_inst.boot_sources = boot_json['BootSources']
        self.bios_boot_inst.persistent_boot_config_order = (
            boot_json['PersistentBootConfigOrder'])
        result = self.bios_boot_inst.get_persistent_boot_device()
        self.assertEqual(result, sushy.BOOT_SOURCE_TARGET_NONE)

    def test_get_persistent_boot_device_boot_sources_is_none(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (
                json.loads(f.read())['BIOS_boot_without_boot_sources'])
        self.bios_boot_inst.boot_sources = boot_json['BootSources']
        self.bios_boot_inst.persistent_boot_config_order = (
            boot_json['PersistentBootConfigOrder'])
        self.assertRaisesRegex(
            exception.IloError,
            'Boot sources or persistent boot config order not found',
            self.bios_boot_inst.get_persistent_boot_device)

    def test_get_uefi_boot_string(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (json.loads(f.read())['Default'])
        self.bios_boot_inst.boot_sources = boot_json['BootSources']
        result = self.bios_boot_inst.get_uefi_boot_string('C4346BB7EF30')
        self.assertEqual(result, 'NIC.LOM.1.1.iSCSI')

    def test_get_uefi_boot_string_boot_sources_is_none(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (
                json.loads(f.read())['BIOS_boot_without_boot_sources'])
        self.bios_boot_inst.boot_sources = boot_json['BootSources']
        self.assertRaisesRegex(
            exception.IloError,
            'Boot sources are not found',
            self.bios_boot_inst.get_uefi_boot_string, '123456')

    def test_get_uefi_boot_string_mac_invalid(self):
        with open('proliantutils/tests/redfish/'
                  'json_samples/bios_boot.json', 'r') as f:
            boot_json = (json.loads(f.read())['Default'])
        self.bios_boot_inst.boot_sources = boot_json['BootSources']
        self.assertRaisesRegex(
            exception.IloError,
            'MAC provided "123456" is Invalid',
            self.bios_boot_inst.get_uefi_boot_string, '123456')
d183279da4a4c0b4f41a6aef2f32036651d8e5e5 | 83 | py | Python | backend/app/views.py | jnthnrzr/food-craving-survey | ["MIT"]
from app import app
@app.route('/')
def hello_world():
    return "Hello World!"
d184c2af2067863ca227b797cf940cd863acd0a8 | 11961 | py | Python | tests/pm/test_clean.py | ssjunnebo/scilifelab | ["MIT"]
"""
Test cleaning operations
"""
import os
import sys
import glob
from cement.core import handler
from cement.utils import shell
from test_default import PmTest, safe_makedir
from scilifelab.pm.core.project import ProjectController
flowcell = "120829_AA001AAAXX"
filedir = os.path.abspath(os.path.dirname(os.path.realpath(__file__)))
intermediate = os.path.join(filedir, "data", "projects", "j_doe_00_02", "intermediate")
data = os.path.join(filedir, "data", "projects", "j_doe_00_02", "data")
j_doe_00_04 = os.path.join(filedir, "data", "projects", "j_doe_00_04")
class CleanTest(PmTest):
    ## FIX ME: move to empty_files.py
    INPUT_FILES = {
        'P1_106F_index6': [
            '1_120829_AA001AAAXX_nophix_10_1_fastq.txt',
            '1_120829_AA001AAAXX_nophix_10_2_fastq.txt'],
        'P1_107_index7': [
            '1_120829_AA001AAAXX_nophix_12_1_fastq.txt.gz',
            '1_120829_AA001AAAXX_nophix_12_2_fastq.txt.gz']
    }
    RESULT_FILES = {
        'P1_106F_index6': [
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign.bam',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign.bam.bai',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal.bam',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal.bai',
            '1_120829_AA001AAAXX_nophix_10-sort.bigwig',
            '1_120829_AA001AAAXX_nophix_10-sort-dup-target.crisp_pileup.gz',
            '1_120829_AA001AAAXX_nophix_10-sort-dup-target.pileup.gz',
            '1_120829_AA001AAAXX_nophix_10-sort-dup-target.crisp_pileup',
            '1_120829_AA001AAAXX_nophix_10-sort-dup-target.pileup',
            'alignments/1_120829_AA001AAAXX_nophix_10_1_fastq-fastq.bam',
            'alignments/1_120829_AA001AAAXX_nophix_10_2_fastq-fastq.bam',
            'alignments/1_120829_AA001AAAXX_nophix_10_1.sai',
            'alignments/1_120829_AA001AAAXX_nophix_10_2.sai',
            'alignments/1_120829_AA001AAAXX_nophix_10.sam',
            'alignments/1_120829_AA001AAAXX_nophix_10.bam',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr1-realign-subsetchr1.bam',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr1-realign-subsetchr1.bam.bai',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr1-realign.bam',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr1-realign.intervals',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr2-realign-subsetchr2.bam',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr2-realign-subsetchr2.bam.bai',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr2-realign.bam',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-chr2-realign.intervals',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-variants-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-chr1-variants.vcf',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-variants-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-chr2-variants.vcf',
            '1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-variants-split/1_120829_AA001AAAXX_nophix_10-sort-gatkrecal-realign-chr3-variants.vcf'
        ],
        'P1_107_index7': [
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign.bam',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign.bam.bai',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal.bam',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal.bai',
            '1_120829_AA001AAAXX_nophix_12-sort.bigwig',
            '1_120829_AA001AAAXX_nophix_12-sort-dup-target.crisp_pileup.gz',
            '1_120829_AA001AAAXX_nophix_12-sort-dup-target.pileup.gz',
            '1_120829_AA001AAAXX_nophix_12-sort-dup-target.crisp_pileup',
            '1_120829_AA001AAAXX_nophix_12-sort-dup-target.pileup',
            'alignments/1_120829_AA001AAAXX_nophix_12_1_fastq-fastq.bam',
            'alignments/1_120829_AA001AAAXX_nophix_12_2_fastq-fastq.bam',
            'alignments/1_120829_AA001AAAXX_nophix_12_1.sai',
            'alignments/1_120829_AA001AAAXX_nophix_12_2.sai',
            'alignments/1_120829_AA001AAAXX_nophix_12.sam',
            'alignments/1_120829_AA001AAAXX_nophix_12.bam',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr1-realign-subsetchr1.bam',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr1-realign-subsetchr1.bam.bai',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr1-realign.bam',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr1-realign.intervals',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr2-realign-subsetchr2.bam',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr2-realign-subsetchr2.bam.bai',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr2-realign.bam',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-chr2-realign.intervals',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-variants-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-chr1-variants.vcf',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-variants-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-chr2-variants.vcf',
            '1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-variants-split/1_120829_AA001AAAXX_nophix_12-sort-gatkrecal-realign-chr3-variants.vcf'
        ]
    }
    def setUp(self):
        super(CleanTest, self).setUp()
        ## Setup pre-casava results
        for k in self.RESULT_FILES.keys():
            for f in self.RESULT_FILES[k]:
                outfile = os.path.join(intermediate, flowcell, f)
                if not os.path.exists(os.path.dirname(outfile)):
                    safe_makedir(os.path.dirname(outfile))
                exit_code = shell.exec_cmd2(['touch', outfile])
        for k in self.INPUT_FILES.keys():
            for f in self.INPUT_FILES[k]:
                outfile = os.path.join(data, flowcell, '1_120829_AA001AAAXX_barcode', f)
                if not os.path.exists(os.path.dirname(outfile)):
                    safe_makedir(os.path.dirname(outfile))
                exit_code = shell.exec_cmd2(['touch', outfile])
        ## Setup casava results
        for k in self.RESULT_FILES.keys():
            for f in self.RESULT_FILES[k]:
                outfile = os.path.join(data, k, flowcell, f)
                if not os.path.exists(os.path.dirname(outfile)):
                    safe_makedir(os.path.dirname(outfile))
                exit_code = shell.exec_cmd2(['touch', outfile])
        for k in self.INPUT_FILES.keys():
            for f in self.INPUT_FILES[k]:
                outfile = os.path.join(data, k, flowcell, '1_120829_AA001AAAXX_barcode', f)
                if not os.path.exists(os.path.dirname(outfile)):
                    safe_makedir(os.path.dirname(outfile))
                exit_code = shell.exec_cmd2(['touch', outfile])
    def test_clean_dry(self):
        self.app = self.make_app(argv=['project', 'clean', 'j_doe_00_02', '--pileup', '-n', '--intermediate', '--force'])
        handler.register(ProjectController)
        self._run_app()

    def test_clean(self):
        before = glob.glob(os.path.join(intermediate, "120829_AA001AAAXX", "*"))
        self.app = self.make_app(argv=['project', 'clean', 'j_doe_00_02', '--pileup', '--intermediate', '--force'])
        handler.register(ProjectController)
        self._run_app()
        after = glob.glob(os.path.join(intermediate, "120829_AA001AAAXX", "*"))
        diff = [os.path.basename(x) for x in list(set(before).difference(set(after)))]
        self.eq(set(diff), set(['1_120829_AA001AAAXX_nophix_12-sort-dup-target.pileup.gz', '1_120829_AA001AAAXX_nophix_12-sort-dup-target.pileup', '1_120829_AA001AAAXX_nophix_12-sort-dup-target.crisp_pileup.gz', '1_120829_AA001AAAXX_nophix_12-sort-dup-target.crisp_pileup', '1_120829_AA001AAAXX_nophix_10-sort-dup-target.crisp_pileup.gz', '1_120829_AA001AAAXX_nophix_10-sort-dup-target.crisp_pileup', '1_120829_AA001AAAXX_nophix_10-sort-dup-target.pileup', '1_120829_AA001AAAXX_nophix_10-sort-dup-target.pileup.gz']))

    def test_clean_fastqbam(self):
        before = glob.glob(os.path.join(data, "P1_106F_index6/120829_AA001AAAXX/alignments", "*"))
        self.app = self.make_app(argv=['project', 'clean', 'j_doe_00_02', '--data', '--fastqbam', '--force'])
        handler.register(ProjectController)
        self._run_app()
        after = glob.glob(os.path.join(data, "P1_106F_index6/120829_AA001AAAXX/alignments", "*"))
        diff = len(before) - len(after)
        self.eq(diff, 2)
        diff = [os.path.basename(x) for x in list(set(before).difference(set(after)))]
        self.eq(set(diff), set(['1_120829_AA001AAAXX_nophix_10_1_fastq-fastq.bam', '1_120829_AA001AAAXX_nophix_10_2_fastq-fastq.bam']))

    def test_clean_split_tmp(self):
        before = {'realign-split': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-split", "*.*")),
                  'realign-split-tx': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-split", "tx", "*.*")),
                  'variants-split': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-variants-split", "*.*")),
                  'variants-split-tx': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-variants-split", "tx", "*.*")),
                  'all': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "*"))}
        self.app = self.make_app(argv=['project', 'clean', 'j_doe_00_04', '--intermediate', '--tmp', '--split', '--force'])
        handler.register(ProjectController)
        self._run_app()
        after = {'realign-split': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-split", "*.*")),
                 'realign-split-tx': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-split", "tx", "*.*")),
                 'variants-split': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-variants-split", "*.*")),
                 'variants-split-tx': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "1_120924_CC003CCCXX_7-sort-dup-gatkrecal-realign-variants-split", "tx", "*.*")),
                 'all': glob.glob(os.path.join(j_doe_00_04, "intermediate", "analysis_1", "*"))}
        self.eq(set(before['all']), set(after['all']))
        self.eq(len(after['realign-split']), 0)
        self.eq(len(after['realign-split-tx']), 0)
        self.eq(len(after['variants-split']), 0)
        self.eq(len(after['variants-split-tx']), 0)
d185ab277901d9bb2b78a9a57565118e9ad6ddbf | 12578 | py | Python | app/smpa/resources/address.py | LBHackney-IT/smpa-backend | ["MIT"]
import falcon
from typing import Optional
from .core import Resource, ListResource
from ..services.address import (
    _addresses, _site_addresses, _bs7666_addresses, _external_addresses, _international_addresses
)
from ..models.address import Address, SiteAddress # NOQA
class AddressPost(ListResource):
    _service = _addresses
    auth = {
        'exempt_methods': ['OPTIONS', 'GET']
    }

    def on_get(self, req: falcon.Request, resp: falcon.Response) -> None:
        """
        ---
        summary: Get all Addresses from the DB
        tags:
            - Address
        parameters:
            - in: query
              schema: CoreListSchema
        produces:
            - application/json
        responses:
            200:
                description: All Addresses
                schema:
                    type: array
                    items: Address
            401:
                description: Unauthorized
        """
        super().on_get(req, resp)

    def on_post(self, req: falcon.Request, resp: falcon.Response) -> None:
        """
        ---
        summary: Add a new Address to the database
        tags:
            - Address
        parameters:
            - in: body
              schema: Address
        consumes:
            - application/json
        produces:
            - application/json
        responses:
            201:
                description: Address created successfully
                schema: Address
            401:
                description: Unauthorized
            422:
                description: Input body formatting issue
        """
        super().on_post(req, resp)
class AddressPatch(Resource):
    _service = _addresses

    def on_get(self, req: falcon.Request, resp: falcon.Response, id: Optional[str] = None) -> None:
        """
        ---
        summary: Get one or more Addresses from the database
        tags:
            - Address
        parameters:
            - in: path
              schema: CoreGetSchema
        produces:
            - application/json
        responses:
            200:
                description: One or more Addresses
                schema:
                    type: array
                    items: Address
            401:
                description: Unauthorized
        """
        super().on_get(req, resp, id)

    def on_patch(self, req: falcon.Request, resp: falcon.Response, id: str) -> None:
        """
        ---
        summary: Update an Address in the database
        tags:
            - Address
        parameters:
            - in: body
              schema: Address
        consumes:
            - application/json
        produces:
            - application/json
        responses:
            200:
                description: Returns updated Address
                schema: Address
            401:
                description: Unauthorized
            404:
                description: Object does not exist
            422:
                description: Input body formatting issue
        """
        super().on_patch(req, resp, id)
class SiteAddressPost(ListResource):
    _service = _site_addresses

    def on_get(self, req: falcon.Request, resp: falcon.Response) -> None:
        """
        ---
        summary: Get all SiteAddresses from the DB
        tags:
            - Address
        parameters:
            - in: query
              schema: CoreListSchema
        produces:
            - application/json
        responses:
            200:
                description: All SiteAddresses
                schema:
                    type: array
                    items: SiteAddress
            401:
                description: Unauthorized
        """
        super().on_get(req, resp)

    def on_post(self, req: falcon.Request, resp: falcon.Response) -> None:
        """
        ---
        summary: Add new SiteAddress to the database
        tags:
            - SiteAddress
        parameters:
            - in: body
              schema: SiteAddress
        consumes:
            - application/json
        produces:
            - application/json
        responses:
            201:
                description: SiteAddress created successfully
                schema: SiteAddress
            401:
                description: Unauthorized
            422:
                description: Input body formatting issue
        """
        super().on_post(req, resp)
class SiteAddressPatch(Resource):
    _service = _site_addresses

    def on_get(self, req: falcon.Request, resp: falcon.Response, id: Optional[str] = None) -> None:
        """
        ---
        summary: Get one or more SiteAddresses from the database
        tags:
            - SiteAddress
        parameters:
            - in: path
              schema: CoreGetSchema
        produces:
            - application/json
        responses:
            200:
                description: One or more SiteAddresses
                schema: SiteAddress
            401:
                description: Unauthorized
        """
        super().on_get(req, resp, id)

    def on_patch(self, req: falcon.Request, resp: falcon.Response, id: str) -> None:
        """
        ---
        summary: Update a SiteAddress in the database
        tags:
            - SiteAddress
        parameters:
            - in: body
              schema: SiteAddress
        consumes:
            - application/json
        produces:
            - application/json
        responses:
            200:
                description: Returns updated SiteAddress
                schema: SiteAddress
            401:
                description: Unauthorized
            404:
                description: Object does not exist
            422:
                description: Input body formatting issue
        """
        super().on_patch(req, resp, id)
class BS7666AddressResource(Resource):
    _service = _bs7666_addresses

    def on_get(self, req: falcon.Request, resp: falcon.Response, id: Optional[str] = None) -> None:
        """
        ---
        summary: Get one or more BS7666Addresses from the database
        tags:
            - BS7666Address
        parameters:
            - in: path
              schema: CoreGetSchema
        produces:
            - application/json
        responses:
            200:
                description: One or more BS7666Addresses
                type: array
                items: BS7666Address
            401:
                description: Unauthorized
        """
        super().on_get(req, resp, id)

    def on_patch(self, req: falcon.Request, resp: falcon.Response, id: str) -> None:
        """
        ---
        summary: Update a BS7666Address in the database
        tags:
            - BS7666Address
        parameters:
            - in: body
              schema: BS7666Address
        consumes:
            - application/json
        produces:
            - application/json
        responses:
            200:
                description: Returns updated BS7666Address
                schema: BS7666Address
            401:
                description: Unauthorized
            404:
                description: Object does not exist
            422:
                description: Input body formatting issue
        """
        super().on_patch(req, resp, id)

    def on_post(self, req: falcon.Request, resp: falcon.Response) -> None:
        """
        ---
        summary: Add new BS7666Address to the database
        tags:
            - BS7666Address
        parameters:
            - in: body
              schema: BS7666Address
        consumes:
            - application/json
        produces:
            - application/json
        responses:
            201:
                description: BS7666Address created successfully
                schema: BS7666Address
            401:
                description: Unauthorized
            422:
                description: Input body formatting issue
        """
        super().on_post(req, resp)
class ExternalAddressResource(Resource):
    _service = _external_addresses

    def on_get(self, req: falcon.Request, resp: falcon.Response, id: Optional[str] = None) -> None:
        """
        ---
        summary: Get one or more ExternalAddresses from the database
        tags:
          - ExternalAddress
        parameters:
          - in: path
            schema: CoreGetSchema
        produces:
          - application/json
        responses:
          200:
            description: One or more ExternalAddresses
            schema:
              type: array
              items: ExternalAddress
          401:
            description: Unauthorized
        """
        super().on_get(req, resp, id)

    def on_patch(self, req: falcon.Request, resp: falcon.Response, id: str) -> None:
        """
        ---
        summary: Update an ExternalAddress in the database
        tags:
          - ExternalAddress
        parameters:
          - in: body
            schema: ExternalAddress
        consumes:
          - application/json
        produces:
          - application/json
        responses:
          200:
            description: Returns updated ExternalAddress
            schema: ExternalAddress
          401:
            description: Unauthorized
          404:
            description: Object does not exist
          422:
            description: Input body formatting issue
        """
        super().on_patch(req, resp, id)

    def on_post(self, req: falcon.Request, resp: falcon.Response) -> None:
        """
        ---
        summary: Add new ExternalAddress to the database
        tags:
          - ExternalAddress
        parameters:
          - in: body
            schema: ExternalAddress
        consumes:
          - application/json
        produces:
          - application/json
        responses:
          201:
            description: ExternalAddress created successfully
            schema: ExternalAddress
          401:
            description: Unauthorized
          422:
            description: Input body formatting issue
        """
        super().on_post(req, resp)
class InternationalAddressResource(Resource):
    _service = _international_addresses

    def on_get(self, req: falcon.Request, resp: falcon.Response, id: Optional[str] = None) -> None:
        """
        ---
        summary: Get one or more InternationalAddresses from the database
        tags:
          - InternationalAddress
        parameters:
          - in: path
            schema: CoreGetSchema
        produces:
          - application/json
        responses:
          200:
            description: One or more InternationalAddresses
            schema:
              type: array
              items: InternationalAddress
          401:
            description: Unauthorized
        """
        super().on_get(req, resp, id)

    def on_patch(self, req: falcon.Request, resp: falcon.Response, id: str) -> None:
        """
        ---
        summary: Update an InternationalAddress in the database
        tags:
          - InternationalAddress
        parameters:
          - in: body
            schema: InternationalAddress
        consumes:
          - application/json
        produces:
          - application/json
        responses:
          200:
            description: Returns updated InternationalAddress
            schema: InternationalAddress
          401:
            description: Unauthorized
          404:
            description: Object does not exist
          422:
            description: Input body formatting issue
        """
        super().on_patch(req, resp, id)

    def on_post(self, req: falcon.Request, resp: falcon.Response) -> None:
        """
        ---
        summary: Add new InternationalAddress to the database
        tags:
          - InternationalAddress
        parameters:
          - in: body
            schema: InternationalAddress
        consumes:
          - application/json
        produces:
          - application/json
        responses:
          201:
            description: InternationalAddress created successfully
            schema: InternationalAddress
          401:
            description: Unauthorized
          422:
            description: Input body formatting issue
        """
        super().on_post(req, resp)
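A minimal sketch of the delegation pattern these resource classes rely on: each concrete class only carries an OpenAPI docstring and forwards the call to a shared `_service` via `super()`. `FakeService` and the dict-based response are stand-ins for illustration only; the real code uses `falcon.Request`/`falcon.Response` and a database-backed service.

```python
class FakeService:
    """Stand-in for the DB-backed address service."""

    def list_all(self):
        return [{"id": "addr-1"}]


class ListResource:
    _service = None

    def on_get(self, req, resp):
        # The base class does the actual work; subclasses just pick a service.
        resp["media"] = self._service.list_all()


class SiteAddressList(ListResource):
    _service = FakeService()


resp = {}
SiteAddressList().on_get(req=None, resp=resp)
print(resp["media"])  # → [{'id': 'addr-1'}]
```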
# ---- File: test/test_BO.py (repo: Basvanstein/MIP-EGO, license: MIT) ----
import numpy as np
import sys, os
from mipego import ParallelBO, BO, ContinuousSpace, OrdinalSpace, \
    NominalSpace, RandomForest
from mipego.Extension import MultiAcquisitionBO
from mipego.GaussianProcess import GaussianProcess
from mipego.GaussianProcess.trend import constant_trend
np.random.seed(123)
import unittest
class TestBO(unittest.TestCase):
    def test_pickling(self):
        dim = 5
        lb, ub = -1, 5

        def fitness(x):
            x = np.asarray(x)
            return np.sum(x ** 2)

        space = ContinuousSpace([lb, ub]) * dim
        mean = constant_trend(dim, beta=None)
        thetaL = 1e-10 * (ub - lb) * np.ones(dim)
        thetaU = 10 * (ub - lb) * np.ones(dim)
        theta0 = np.random.rand(dim) * (thetaU - thetaL) + thetaL
        model = GaussianProcess(
            mean=mean, corr='squared_exponential',
            theta0=theta0, thetaL=thetaL, thetaU=thetaU,
            nugget=0, noise_estim=False,
            optimizer='BFGS', wait_iter=3, random_start=dim,
            likelihood='concentrated', eval_budget=100 * dim
        )
        opt = BO(
            search_space=space,
            obj_fun=fitness,
            model=model,
            DoE_size=5,
            max_FEs=10,
            verbose=True,
            n_point=1
        )
        opt.step()
        opt.save('picklingtest')
        opt = BO.load('picklingtest')
        print(opt.run())
        os.remove('picklingtest')

    def test_pickling2(self):
        dim = 5
        lb, ub = -1, 5

        def fitness(x):
            x = np.asarray(x)
            return np.sum(x ** 2)

        space = ContinuousSpace([lb, ub]) * dim
        mean = constant_trend(dim, beta=None)
        thetaL = 1e-10 * (ub - lb) * np.ones(dim)
        thetaU = 10 * (ub - lb) * np.ones(dim)
        theta0 = np.random.rand(dim) * (thetaU - thetaL) + thetaL
        model = GaussianProcess(
            mean=mean, corr='squared_exponential',
            theta0=theta0, thetaL=thetaL, thetaU=thetaU,
            nugget=0, noise_estim=False,
            optimizer='BFGS', wait_iter=3, random_start=dim,
            likelihood='concentrated', eval_budget=100 * dim
        )
        opt = BO(
            search_space=space,
            obj_fun=fitness,
            model=model,
            DoE_size=5,
            max_FEs=10,
            verbose=True,
            n_point=1,
            logger='log1'
        )
        opt.save('picklingtest')
        opt = BO.load('picklingtest')
        print(opt.run())
        os.remove('picklingtest')
        try:
            os.remove('log1')
        except Exception:
            pass

    def test_continuous(self):
        dim = 5
        lb, ub = -1, 5

        def fitness(x):
            x = np.asarray(x)
            return np.sum(x ** 2)

        space = ContinuousSpace([lb, ub]) * dim
        mean = constant_trend(dim, beta=None)
        thetaL = 1e-10 * (ub - lb) * np.ones(dim)
        thetaU = 10 * (ub - lb) * np.ones(dim)
        theta0 = np.random.rand(dim) * (thetaU - thetaL) + thetaL
        model = GaussianProcess(
            mean=mean, corr='squared_exponential',
            theta0=theta0, thetaL=thetaL, thetaU=thetaU,
            nugget=0, noise_estim=False,
            optimizer='BFGS', wait_iter=3, random_start=dim,
            likelihood='concentrated', eval_budget=100 * dim
        )
        opt = BO(
            search_space=space,
            obj_fun=fitness,
            model=model,
            DoE_size=5,
            max_FEs=10,
            verbose=True,
            n_point=1
        )
        print(opt.run())

    def test_mix_space(self):
        dim_r = 2  # dimension of the real values

        def obj_fun(x):
            x_r = np.array([x['continuous_%d' % i] for i in range(dim_r)])
            x_i = x['ordinal']
            x_d = x['nominal']
            _ = 0 if x_d == 'OK' else 1
            return np.sum(x_r ** 2) + abs(x_i - 10) / 123. + _ * 2

        search_space = ContinuousSpace([-5, 5], var_name='continuous') * dim_r + \
            OrdinalSpace([5, 15], var_name='ordinal') + \
            NominalSpace(['OK', 'A', 'B', 'C', 'D', 'E', 'F', 'G'], var_name='nominal')
        model = RandomForest(levels=search_space.levels)
        opt = ParallelBO(
            search_space=search_space,
            obj_fun=obj_fun,
            model=model,
            max_FEs=9,
            DoE_size=3,       # the initial DoE size
            eval_type='dict',
            acquisition_fun='MGFI',
            acquisition_par={'t': 2},
            n_job=3,          # number of processes
            n_point=3,        # number of candidate solutions proposed in each iteration
            verbose=True      # turn this off, if you prefer no output
        )
        xopt, fopt, stop_dict = opt.run()
        print('xopt: {}'.format(xopt))
        print('fopt: {}'.format(fopt))
        print('stop criteria: {}'.format(stop_dict))

    def test_multi_acquisition(self):
        dim_r = 2  # dimension of the real values

        def obj_fun(x):
            x_r = np.array([x['continuous_%d' % i] for i in range(dim_r)])
            x_i = x['ordinal']
            x_d = x['nominal']
            _ = 0 if x_d == 'OK' else 1
            return np.sum(x_r ** 2) + abs(x_i - 10) / 123. + _ * 2

        search_space = ContinuousSpace([-5, 5], var_name='continuous') * dim_r + \
            OrdinalSpace([5, 15], var_name='ordinal') + \
            NominalSpace(['OK', 'A', 'B', 'C', 'D', 'E', 'F', 'G'], var_name='nominal')
        model = RandomForest(levels=search_space.levels)
        opt = MultiAcquisitionBO(
            search_space=search_space,
            obj_fun=obj_fun,
            model=model,
            max_FEs=8,
            DoE_size=4,       # the initial DoE size
            eval_type='dict',
            n_job=4,          # number of processes
            n_point=4,        # number of candidate solutions proposed in each iteration
            verbose=True      # turn this off, if you prefer no output
        )
        xopt, fopt, stop_dict = opt.run()
        print('xopt: {}'.format(xopt))
        print('fopt: {}'.format(fopt))
        print('stop criteria: {}'.format(stop_dict))


if __name__ == '__main__':
    unittest.main()
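The pickling tests above exercise `BO.save`/`BO.load`; the underlying save/load round-trip can be sketched with nothing but the standard library. `Optimizer` here is a stand-in for illustration, not the mipego class:

```python
import os
import pickle
import tempfile


class Optimizer:
    """Stand-in for an optimizer whose state must survive a save/load cycle."""

    def __init__(self):
        self.iteration = 0

    def step(self):
        self.iteration += 1


opt = Optimizer()
opt.step()

# Serialize to disk, reload, and confirm the state came back intact.
path = os.path.join(tempfile.gettempdir(), "picklingtest")
with open(path, "wb") as f:
    pickle.dump(opt, f)
with open(path, "rb") as f:
    restored = pickle.load(f)
os.remove(path)

print(restored.iteration)  # → 1
```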
# ---- File: sati/__init__.py (repo: yuksk/sati, license: MIT) ----
from sati.distributions import *
from sati.model import Model
from sati.planes import *
from sati.preprocessing import GuessInitRsp
| 26.4 | 43 | 0.833333 | 18 | 132 | 6.111111 | 0.444444 | 0.290909 | 0.254545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 132 | 4 | 44 | 33 | 0.948276 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# ---- File: porerefiner/protocols/minknow/rpc/instance_grpc.py (repo: CFSAN-Biostatistics/porerefiner, license: MIT) ----
# Generated by the Protocol Buffers compiler. DO NOT EDIT!
# source: minknow/rpc/instance.proto
# plugin: grpclib.plugin.main
import abc
import typing
import grpclib.const
import grpclib.client
if typing.TYPE_CHECKING:
    import grpclib.server
from . import acquisition_pb2
from . import device_pb2
from . import protocol_pb2
from . import instance_pb2
class InstanceServiceBase(abc.ABC):

    @abc.abstractmethod
    async def get_version_info(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.GetVersionInfoRequest, minknow.rpc.instance_pb2.GetVersionInfoResponse]') -> None:
        pass

    @abc.abstractmethod
    async def get_output_directories(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.GetOutputDirectoriesRequest, minknow.rpc.instance_pb2.OutputDirectories]') -> None:
        pass

    @abc.abstractmethod
    async def get_default_output_directories(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.GetDefaultOutputDirectoriesRequest, minknow.rpc.instance_pb2.OutputDirectories]') -> None:
        pass

    @abc.abstractmethod
    async def set_output_directory(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.SetOutputDirectoryRequest, minknow.rpc.instance_pb2.SetOutputDirectoryResponse]') -> None:
        pass

    @abc.abstractmethod
    async def set_reads_directory(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.SetReadsDirectoryRequest, minknow.rpc.instance_pb2.SetReadsDirectoryResponse]') -> None:
        pass

    @abc.abstractmethod
    async def get_disk_space_info(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.GetDiskSpaceInfoRequest, minknow.rpc.instance_pb2.GetDiskSpaceInfoResponse]') -> None:
        pass

    @abc.abstractmethod
    async def get_machine_id(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.GetMachineIdRequest, minknow.rpc.instance_pb2.GetMachineIdResponse]') -> None:
        pass

    @abc.abstractmethod
    async def get_host_type(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.GetHostTypeRequest, minknow.rpc.instance_pb2.GetHostTypeResponse]') -> None:
        pass

    @abc.abstractmethod
    async def stream_instance_activity(self, stream: 'grpclib.server.Stream[minknow.rpc.instance_pb2.StreamInstanceActivityRequest, minknow.rpc.instance_pb2.StreamInstanceActivityResponse]') -> None:
        pass
    def __mapping__(self) -> typing.Dict[str, grpclib.const.Handler]:
        return {
            '/ont.rpc.instance.InstanceService/get_version_info': grpclib.const.Handler(
                self.get_version_info,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.GetVersionInfoRequest,
                minknow.rpc.instance_pb2.GetVersionInfoResponse,
            ),
            '/ont.rpc.instance.InstanceService/get_output_directories': grpclib.const.Handler(
                self.get_output_directories,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.GetOutputDirectoriesRequest,
                minknow.rpc.instance_pb2.OutputDirectories,
            ),
            '/ont.rpc.instance.InstanceService/get_default_output_directories': grpclib.const.Handler(
                self.get_default_output_directories,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.GetDefaultOutputDirectoriesRequest,
                minknow.rpc.instance_pb2.OutputDirectories,
            ),
            '/ont.rpc.instance.InstanceService/set_output_directory': grpclib.const.Handler(
                self.set_output_directory,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.SetOutputDirectoryRequest,
                minknow.rpc.instance_pb2.SetOutputDirectoryResponse,
            ),
            '/ont.rpc.instance.InstanceService/set_reads_directory': grpclib.const.Handler(
                self.set_reads_directory,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.SetReadsDirectoryRequest,
                minknow.rpc.instance_pb2.SetReadsDirectoryResponse,
            ),
            '/ont.rpc.instance.InstanceService/get_disk_space_info': grpclib.const.Handler(
                self.get_disk_space_info,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.GetDiskSpaceInfoRequest,
                minknow.rpc.instance_pb2.GetDiskSpaceInfoResponse,
            ),
            '/ont.rpc.instance.InstanceService/get_machine_id': grpclib.const.Handler(
                self.get_machine_id,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.GetMachineIdRequest,
                minknow.rpc.instance_pb2.GetMachineIdResponse,
            ),
            '/ont.rpc.instance.InstanceService/get_host_type': grpclib.const.Handler(
                self.get_host_type,
                grpclib.const.Cardinality.UNARY_UNARY,
                minknow.rpc.instance_pb2.GetHostTypeRequest,
                minknow.rpc.instance_pb2.GetHostTypeResponse,
            ),
            '/ont.rpc.instance.InstanceService/stream_instance_activity': grpclib.const.Handler(
                self.stream_instance_activity,
                grpclib.const.Cardinality.UNARY_STREAM,
                minknow.rpc.instance_pb2.StreamInstanceActivityRequest,
                minknow.rpc.instance_pb2.StreamInstanceActivityResponse,
            ),
        }

class InstanceServiceStub:

    def __init__(self, channel: grpclib.client.Channel) -> None:
        self.get_version_info = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/get_version_info',
            minknow.rpc.instance_pb2.GetVersionInfoRequest,
            minknow.rpc.instance_pb2.GetVersionInfoResponse,
        )
        self.get_output_directories = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/get_output_directories',
            minknow.rpc.instance_pb2.GetOutputDirectoriesRequest,
            minknow.rpc.instance_pb2.OutputDirectories,
        )
        self.get_default_output_directories = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/get_default_output_directories',
            minknow.rpc.instance_pb2.GetDefaultOutputDirectoriesRequest,
            minknow.rpc.instance_pb2.OutputDirectories,
        )
        self.set_output_directory = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/set_output_directory',
            minknow.rpc.instance_pb2.SetOutputDirectoryRequest,
            minknow.rpc.instance_pb2.SetOutputDirectoryResponse,
        )
        self.set_reads_directory = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/set_reads_directory',
            minknow.rpc.instance_pb2.SetReadsDirectoryRequest,
            minknow.rpc.instance_pb2.SetReadsDirectoryResponse,
        )
        self.get_disk_space_info = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/get_disk_space_info',
            minknow.rpc.instance_pb2.GetDiskSpaceInfoRequest,
            minknow.rpc.instance_pb2.GetDiskSpaceInfoResponse,
        )
        self.get_machine_id = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/get_machine_id',
            minknow.rpc.instance_pb2.GetMachineIdRequest,
            minknow.rpc.instance_pb2.GetMachineIdResponse,
        )
        self.get_host_type = grpclib.client.UnaryUnaryMethod(
            channel,
            '/ont.rpc.instance.InstanceService/get_host_type',
            minknow.rpc.instance_pb2.GetHostTypeRequest,
            minknow.rpc.instance_pb2.GetHostTypeResponse,
        )
        self.stream_instance_activity = grpclib.client.UnaryStreamMethod(
            channel,
            '/ont.rpc.instance.InstanceService/stream_instance_activity',
            minknow.rpc.instance_pb2.StreamInstanceActivityRequest,
            minknow.rpc.instance_pb2.StreamInstanceActivityResponse,
        )
# ---- File: tests/graphs/algorithms/test_cutting.py (repo: ref-humbold/AlgoLib_Python, license: Apache-2.0) ----
# -*- coding: utf-8 -*-
"""Tests: Algorithms for graph cutting"""
import unittest
from assertpy import assert_that
from algolib.graphs import UndirectedSimpleGraph
from algolib.graphs.algorithms import find_edge_cut, find_vertex_cut
class CuttingTest(unittest.TestCase):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
    @staticmethod
    def test__find_edge_cut__when_present_bridges():
        # given
        graph = UndirectedSimpleGraph(range(12))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(1))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(7))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(3))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(4))
        graph.add_edge_between(graph.get_vertex(3), graph.get_vertex(5))
        graph.add_edge_between(graph.get_vertex(4), graph.get_vertex(5))
        graph.add_edge_between(graph.get_vertex(5), graph.get_vertex(6))
        graph.add_edge_between(graph.get_vertex(7), graph.get_vertex(8))
        graph.add_edge_between(graph.get_vertex(7), graph.get_vertex(9))
        graph.add_edge_between(graph.get_vertex(7), graph.get_vertex(11))
        graph.add_edge_between(graph.get_vertex(8), graph.get_vertex(9))
        graph.add_edge_between(graph.get_vertex(9), graph.get_vertex(10))
        graph.add_edge_between(graph.get_vertex(9), graph.get_vertex(11))
        graph.add_edge_between(graph.get_vertex(10), graph.get_vertex(11))
        # when
        result = find_edge_cut(graph)
        # then
        assert_that(sorted(result)).is_equal_to([graph.get_edge(0, 7), graph.get_edge(5, 6)])

    @staticmethod
    def test__find_edge_cut__when_no_bridges():
        # given
        graph = UndirectedSimpleGraph(range(6))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(1))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(3))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(4))
        graph.add_edge_between(graph.get_vertex(3), graph.get_vertex(5))
        graph.add_edge_between(graph.get_vertex(4), graph.get_vertex(5))
        # when
        result = find_edge_cut(graph)
        # then
        assert_that(list(result)).is_empty()

    @staticmethod
    def test__find_vertex_cut__when_present_separators():
        # given
        graph = UndirectedSimpleGraph(range(12))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(1))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(7))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(3))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(4))
        graph.add_edge_between(graph.get_vertex(3), graph.get_vertex(5))
        graph.add_edge_between(graph.get_vertex(4), graph.get_vertex(5))
        graph.add_edge_between(graph.get_vertex(5), graph.get_vertex(6))
        graph.add_edge_between(graph.get_vertex(7), graph.get_vertex(8))
        graph.add_edge_between(graph.get_vertex(7), graph.get_vertex(9))
        graph.add_edge_between(graph.get_vertex(7), graph.get_vertex(11))
        graph.add_edge_between(graph.get_vertex(8), graph.get_vertex(9))
        graph.add_edge_between(graph.get_vertex(9), graph.get_vertex(10))
        graph.add_edge_between(graph.get_vertex(9), graph.get_vertex(11))
        graph.add_edge_between(graph.get_vertex(10), graph.get_vertex(11))
        # when
        result = find_vertex_cut(graph)
        # then
        assert_that(sorted(result)).is_equal_to(
            [graph.get_vertex(0), graph.get_vertex(1), graph.get_vertex(5), graph.get_vertex(7)])

    @staticmethod
    def test__find_vertex_cut__when_no_separators():
        # given
        graph = UndirectedSimpleGraph(range(6))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(1))
        graph.add_edge_between(graph.get_vertex(0), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(2))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(3))
        graph.add_edge_between(graph.get_vertex(1), graph.get_vertex(4))
        graph.add_edge_between(graph.get_vertex(2), graph.get_vertex(3))
        graph.add_edge_between(graph.get_vertex(3), graph.get_vertex(5))
        graph.add_edge_between(graph.get_vertex(4), graph.get_vertex(5))
        # when
        result = find_vertex_cut(graph)
        # then
        assert_that(list(result)).is_empty()
# ---- File: src/plotting_framework/plot_test.py (repo: aymatveev/drawing_framework, license: MIT) ----
from typing import Iterable, Tuple
#from . import plot
import plot
import importlib
def test_plot8():
    importlib.reload(plot)
    data: Iterable[Iterable[Iterable[Tuple[int, int]]]] = ((
        ("points", ((0, 0), (-10, -10))),
        ("points", ((10, 10), (-9, -9))),
        ("points", ((5, 5), (-6, 8))),
        ("vector", ((-2, -1), (15, 5)))
    ), (
        ("points", ((9, 0), (-10, -10))),
        ("points", ((19, 10), (-9, -9))),
        ("points", ((9, 5), (-6, 8))),
        ("vector", ((-2, -1), (15, 5)))
    ), (
        ("points", ((5, 0), (-10, -10))),
        ("points", ((15, 10), (-9, -9))),
        ("points", ((5, 5), (-6, 8))),
        ("vector", ((-2, -1), (15, 5)))
    ))
    result = plot.plot(data, (-20, -20), (20, 20), 3)
    print(result)


def test_plot7():
    importlib.reload(plot)
    data: Iterable[Iterable[Iterable[Tuple[int, int]]]] = ((
        ("points", ((0, 0), (-10, -10))),
        ("points", ((10, 10), (-9, -9))),
        ("points", ((5, 5), (-6, 8)))
    ), (
        ("points", ((9, 0), (-10, -10))),
        ("points", ((19, 10), (-9, -9))),
        ("points", ((9, 5), (-6, 8)))
    ), (
        ("points", ((5, 0), (-10, -10))),
        ("points", ((15, 10), (-9, -9))),
        ("points", ((5, 5), (-6, 8)))
    ))
    result = plot.plot(data, (-20, -20), (20, 20), 3)
    print(result)


def test_plot6():
    importlib.reload(plot)
    data: Iterable[Iterable[Iterable[Tuple[int, int]]]] = ((
        ((0, 0), (-10, -10)),
        ((10, 10), (-9, -9)),
        ((5, 5), (-6, 8))
    ),)
    result = plot.scatter_animate(data, (-20, -20), (20, 20), 3)
    print(result)

def test_plot5():
    importlib.reload(plot)
    data: Iterable[Iterable[Iterable[Tuple[int, int]]]] = ((
        ((0, 0), (-10, -10)),
        ((10, 10), (-9, -9)),
        ((5, 5), (-6, 8))
    ), (
        ((9, 0), (-10, -10)),
        ((19, 10), (-9, -9)),
        ((9, 5), (-6, 8))
    ), (
        ((5, 0), (-10, -10)),
        ((15, 10), (-9, -9)),
        ((5, 5), (-6, 8))
    ))
    result = plot.scatter_animate(data, (-20, -20), (20, 20), 3)
    print(result)


def test_plot4():
    importlib.reload(plot)
    data: Iterable[Iterable[Iterable[Tuple[int, int]]]] = ((
        ((0, 0), (1, 1)),
    ), (
        ((5, 5), (10, 10)),
    ), (
        ((10, 10), (19, 19)),
    ))
    result = plot.scatter_animate(data, (-20, -20), (20, 20), 3)
    print(result)


def test_plot3():
    importlib.reload(plot)
    data: Iterable[Iterable[Tuple[int, int]]] = (
        ((0, 0), (-10, -10)),
        ((10, 10), (-9, -9)),
        ((5, 5), (-6, 8))
    )
    result = plot.scatter(data, (-20, -20), (20, 20))
    print(result)


def test_plot2():
    importlib.reload(plot)
    data: Iterable[Tuple[int, int]] = ((0, 0), (-10, -10))
    result = plot.scatter_xy(data, (-20, -20), (20, 20))
    print(result)


def test_plot1():
    importlib.reload(plot)
    data: Iterable[Tuple[int, int]] = ((0, 0), (10, 10))
    print(f"Data: {data}")
    result = plot.scatter_xy(data, (-20, -20), (20, 20))
    print(result)
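Every call in these tests passes the plotting window as `(-20, -20)`/`(20, 20)` bounds alongside the point data. A small standalone sketch of the kind of bounds check such a plotting routine needs (illustrative only; this is not the `plot` module under test):

```python
from typing import Iterable, Tuple


def in_bounds(points: Iterable[Tuple[int, int]],
              lower: Tuple[int, int], upper: Tuple[int, int]) -> bool:
    """True when every (x, y) point lies inside the plotting window."""
    return all(lower[0] <= x <= upper[0] and lower[1] <= y <= upper[1]
               for x, y in points)


print(in_bounds(((0, 0), (-10, -10)), (-20, -20), (20, 20)))  # → True
print(in_bounds(((25, 0),), (-20, -20), (20, 20)))            # → False
```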
# ---- File: practiceforloop/nestedforloop.py (repo: poojavaibhavsahu/Pooja_Python, license: bzip2-1.0.6) ----
for i in range(1, 4, 1):
    for j in range(1, 4, 1):
        print("*", end=" ")
    print()

k = 1
for i in range(1, 4, 1):
    i = i + 1
    for j in range(1, 4, 1):
        print(k, end=" ")
        k = k + 1
    print()

k = 1
for i in range(1, 4, 1):
    for j in range(1, i + 1, 1):
        print(k, end=" ")
        k = k + 1
    print()

for i in range(1, 4, 1):
    for j in range(1, i + 1, 1):
        print("*", end=" ")
    print()

a = 65
for i in range(1, 4, 1):
    for j in range(1, i + 1, 1):
        ch = chr(a)
        print(ch, end=" ")
        a = a + 1
    print()
# ---- File: examples/05-fdem/plot_0_fdem_analytic.py (repo: Prithwijit-Chak/simpeg, license: MIT) ----
"""
Simulation with Analytic FDEM Solutions
=======================================
Here, the module *SimPEG.electromagnetics.analytics.FDEM* is used to simulate
harmonic electric and magnetic field for both electric and magnetic dipole
sources in a wholespace.
"""
#########################################################################
# Import modules
# --------------
#
import numpy as np
from SimPEG import utils
from SimPEG.electromagnetics.analytics.FDEM import (
ElectricDipoleWholeSpace,
MagneticDipoleWholeSpace,
)
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm
#####################################################################
# Magnetic Fields for a Magnetic Dipole Source
# --------------------------------------------
#
# Here, we compute the magnetic fields for a harmonic magnetic dipole
# source in the z direction. Based on the geometry of the problem, we
# expect magnetic fields in the x and z directions, but none in the y
# direction.
#
# Defining electric dipole location and frequency
source_location = np.r_[0, 0, 0]
frequency = 1e3
# Defining observation locations (avoid placing observation at source)
x = np.arange(-100.5, 100.5, step=1.0)
y = np.r_[0]
z = x
observation_locations = utils.ndgrid(x, y, z)
# Define wholespace conductivity
sig = 1e-2
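A side calculation (not part of the original SimPEG example): for this frequency and conductivity, the quasi-static skin depth sets the length scale over which the fields decay, and it is comparable to the observation grid.

```python
import math

# Skin depth delta = sqrt(2 / (omega * mu_0 * sigma)) with f = 1e3 Hz and
# sigma = 1e-2 S/m, matching the values defined above; about 159 m, so the
# fields vary appreciably across the +/-100 m grid.
mu_0 = 4 * math.pi * 1e-7
omega = 2 * math.pi * 1e3
delta = math.sqrt(2.0 / (omega * mu_0 * 1e-2))
print(round(delta, 1))  # → 159.2
```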
# Compute the fields
Hx, Hy, Hz = MagneticDipoleWholeSpace(
    observation_locations,
    source_location,
    sig,
    frequency,
    moment="Z",
    fieldType="h",
    mu_r=1,
    eps_r=1,
)
# Plot
fig = plt.figure(figsize=(14, 5))
hxplt = Hx.reshape(x.size, z.size)
hzplt = Hz.reshape(x.size, z.size)
ax1 = fig.add_subplot(121)
absH = np.sqrt(Hx.real ** 2 + Hy.real ** 2 + Hz.real ** 2)
pc1 = ax1.pcolor(x, z, absH.reshape(x.size, z.size), norm=LogNorm())
ax1.streamplot(x, z, hxplt.real, hzplt.real, color="k", density=1)
ax1.set_xlim([x.min(), x.max()])
ax1.set_ylim([z.min(), z.max()])
ax1.set_title("Real Component")
ax1.set_xlabel("x")
ax1.set_ylabel("z")
cb1 = plt.colorbar(pc1, ax=ax1)
cb1.set_label("Re[H] (A/m)")
ax2 = fig.add_subplot(122)
absH = np.sqrt(Hx.imag ** 2 + Hy.imag ** 2 + Hz.imag ** 2)
pc2 = ax2.pcolor(x, z, absH.reshape(x.size, z.size), norm=LogNorm())
ax2.streamplot(x, z, hxplt.imag, hzplt.imag, color="k", density=1)
ax2.set_xlim([x.min(), x.max()])
ax2.set_ylim([z.min(), z.max()])
ax2.set_title("Imaginary Component")
ax2.set_xlabel("x")
ax2.set_ylabel("z")
cb2 = plt.colorbar(pc2, ax=ax2)
cb2.set_label("Im[H] (A/m)")
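For context, the skin depth for these parameters (1 kHz in a 0.01 S/m
wholespace) is roughly 160 m, comparable to the 200 m plotting window, so
noticeable attenuation and phase rotation are expected within the plotted
region. A quick back-of-the-envelope check with plain NumPy (a standard EM
identity, not anything SimPEG-specific):

```python
import numpy as np

mu_0 = 4 * np.pi * 1e-7  # free-space magnetic permeability (H/m)
frequency = 1e3          # Hz, as in the example above
sigma = 1e-2             # S/m, as in the example above

omega = 2 * np.pi * frequency
skin_depth = np.sqrt(2.0 / (omega * mu_0 * sigma))  # quasi-static skin depth (m)
print(skin_depth)
```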
#####################################################################
# Electric Fields for a Magnetic Dipole Source
# --------------------------------------------
#
# Here, we compute the electric fields for a harmonic magnetic dipole
# source in the y direction. Based on the geometry of the problem, we
# expect rotational electric fields in the x and z directions, but none in the y
# direction.
#
# Defining magnetic dipole location and frequency
source_location = np.r_[0, 0, 0]
frequency = 1e3
# Defining observation locations (avoid placing observation at source)
x = np.arange(-100.5, 100.5, step=1.0)
y = np.r_[0]
z = x
observation_locations = utils.ndgrid(x, y, z)
# Define wholespace conductivity
sig = 1e-2
# Predict the fields
Ex, Ey, Ez = MagneticDipoleWholeSpace(
observation_locations,
source_location,
sig,
frequency,
moment="Y",
fieldType="e",
mu_r=1,
eps_r=1,
)
# Plot
fig = plt.figure(figsize=(14, 5))
explt = Ex.reshape(x.size, z.size)
ezplt = Ez.reshape(x.size, z.size)
ax1 = fig.add_subplot(121)
absE = np.sqrt(Ex.real ** 2 + Ey.real ** 2 + Ez.real ** 2)
pc1 = ax1.pcolor(x, z, absE.reshape(x.size, z.size), norm=LogNorm())
ax1.streamplot(x, z, explt.real, ezplt.real, color="k", density=1)
ax1.set_xlim([x.min(), x.max()])
ax1.set_ylim([z.min(), z.max()])
ax1.set_title("Real Component")
ax1.set_xlabel("x")
ax1.set_ylabel("z")
cb1 = plt.colorbar(pc1, ax=ax1)
cb1.set_label("Re[E] (V/m)")
ax2 = fig.add_subplot(122)
absE = np.sqrt(Ex.imag ** 2 + Ey.imag ** 2 + Ez.imag ** 2)
pc2 = ax2.pcolor(x, z, absE.reshape(x.size, z.size), norm=LogNorm())
ax2.streamplot(x, z, explt.imag, ezplt.imag, color="k", density=1)
ax2.set_xlim([x.min(), x.max()])
ax2.set_ylim([z.min(), z.max()])
ax2.set_title("Imaginary Component")
ax2.set_xlabel("x")
ax2.set_ylabel("z")
cb2 = plt.colorbar(pc2, ax=ax2)
cb2.set_label("Im[E] (V/m)")
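The decay and oscillation of these harmonic fields are governed by the
quasi-static wavenumber k, with k² = -iωμσ (assuming an e^{+iωt} time
convention); its real and imaginary parts each have magnitude 1/δ, where δ
is the skin depth. A short sketch of this standard relationship (plain
NumPy, not SimPEG-specific):

```python
import numpy as np

mu_0 = 4 * np.pi * 1e-7  # free-space magnetic permeability (H/m)
omega = 2 * np.pi * 1e3  # 1 kHz, as above
sigma = 1e-2             # S/m, as above

# Quasi-static wavenumber: k**2 = -1j * omega * mu_0 * sigma
# (e^{+i omega t} time convention assumed).
k = np.sqrt(-1j * omega * mu_0 * sigma)
skin_depth = np.sqrt(2.0 / (omega * mu_0 * sigma))

# Decay rate (Re k) and oscillation rate (|Im k|) both equal 1/skin_depth.
print(k.real * skin_depth, abs(k.imag) * skin_depth)
```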
#####################################################################
# Electric Field from a Harmonic Electric Current Dipole Source
# -------------------------------------------------------------
#
# Here, we compute the electric fields for a harmonic electric current dipole
# source in the z direction. Based on the geometry of the problem, we
# expect electric fields in the x and z directions, but none in the y
# direction.
#
# Defining electric dipole location and frequency
source_location = np.r_[0, 0, 0]
frequency = 1e3
# Defining observation locations (avoid placing observation at source)
x = np.arange(-100.5, 100.5, step=1.0)
y = np.r_[0]
z = x
observation_locations = utils.ndgrid(x, y, z)
# Define wholespace conductivity
sig = 1e-2
# Predict the fields
Ex, Ey, Ez = ElectricDipoleWholeSpace(
observation_locations,
source_location,
sig,
frequency,
moment=[0, 0, 1],
fieldType="e",
mu_r=1,
eps_r=1,
)
# Plot
fig = plt.figure(figsize=(14, 5))
explt = Ex.reshape(x.size, z.size)
ezplt = Ez.reshape(x.size, z.size)
ax1 = fig.add_subplot(121)
absE = np.sqrt(Ex.real ** 2 + Ey.real ** 2 + Ez.real ** 2)
pc1 = ax1.pcolor(x, z, absE.reshape(x.size, z.size), norm=LogNorm())
ax1.streamplot(x, z, explt.real, ezplt.real, color="k", density=1)
ax1.set_xlim([x.min(), x.max()])
ax1.set_ylim([z.min(), z.max()])
ax1.set_title("Real Component")
ax1.set_xlabel("x")
ax1.set_ylabel("z")
cb1 = plt.colorbar(pc1, ax=ax1)
cb1.set_label("Re[E] (V/m)")
ax2 = fig.add_subplot(122)
absE = np.sqrt(Ex.imag ** 2 + Ey.imag ** 2 + Ez.imag ** 2)
pc2 = ax2.pcolor(x, z, absE.reshape(x.size, z.size), norm=LogNorm())
ax2.streamplot(x, z, explt.imag, ezplt.imag, color="k", density=1)
ax2.set_xlim([x.min(), x.max()])
ax2.set_ylim([z.min(), z.max()])
ax2.set_title("Imaginary Component")
ax2.set_xlabel("x")
ax2.set_ylabel("z")
cb2 = plt.colorbar(pc2, ax=ax2)
cb2.set_label("Im[E] (V/m)")
#####################################################################
# Magnetic Field from a Harmonic Electric Current Dipole Source
# -------------------------------------------------------------
#
# Here, we compute the magnetic fields for a harmonic electric current dipole
# source in the y direction. Based on the geometry of the problem, we
# expect rotational magnetic fields in the x and z directions, but no fields
# in the y direction.
#
# Defining electric dipole location and frequency
source_location = np.r_[0, 0, 0]
frequency = 1e3
# Defining observation locations (avoid placing observation at source)
x = np.arange(-100.5, 100.5, step=1.0)
y = np.r_[0]
z = x
observation_locations = utils.ndgrid(x, y, z)
# Define wholespace conductivity
sig = 1e-2
# Predict the fields
Hx, Hy, Hz = ElectricDipoleWholeSpace(
observation_locations,
source_location,
sig,
frequency,
moment=[0, 1, 0],
fieldType="h",
mu_r=1,
eps_r=1,
)
# Plot
fig = plt.figure(figsize=(14, 5))
hxplt = Hx.reshape(x.size, z.size)
hzplt = Hz.reshape(x.size, z.size)
ax1 = fig.add_subplot(121)
absH = np.sqrt(Hx.real ** 2 + Hy.real ** 2 + Hz.real ** 2)
pc1 = ax1.pcolor(x, z, absH.reshape(x.size, z.size), norm=LogNorm())
ax1.streamplot(x, z, hxplt.real, hzplt.real, color="k", density=1)
ax1.set_xlim([x.min(), x.max()])
ax1.set_ylim([z.min(), z.max()])
ax1.set_title("Real Component")
ax1.set_xlabel("x")
ax1.set_ylabel("z")
cb1 = plt.colorbar(pc1, ax=ax1)
cb1.set_label("Re[H] (A/m)")
ax2 = fig.add_subplot(122)
absH = np.sqrt(Hx.imag ** 2 + Hy.imag ** 2 + Hz.imag ** 2)
pc2 = ax2.pcolor(x, z, absH.reshape(x.size, z.size), norm=LogNorm())
ax2.streamplot(x, z, hxplt.imag, hzplt.imag, color="k", density=1)
ax2.set_xlim([x.min(), x.max()])
ax2.set_ylim([z.min(), z.max()])
ax2.set_title("Imaginary Component")
ax2.set_xlabel("x")
ax2.set_ylabel("z")
cb2 = plt.colorbar(pc2, ax=ax2)
cb2.set_label("Im[H] (A/m)")
plt.show()
import pytest
from helpers import make_collector
from stories.exceptions import ContextContractError
# TODO:
#
# [ ] Show collected arguments of the story composition in the error
# messages.
#
# [ ] Show violation values in validation error messages.
#
# [ ] Write correct and verbose docstrings for each test in this
# module.
def test_assign_existed_variables(m):
"""
We can not write a variable with the same name to the context
twice.
"""
class T(m.ParamChildWithNull, m.StringMethod):
pass
class Q(m.ParentWithNull, m.StringParentMethod, T):
pass
class J(m.ParentWithNull, m.StringParentMethod):
def __init__(self):
self.x = T().x
# Simple.
expected = """
These variables are already present in the context: 'bar', 'foo'
Function returned value: T.one
Use different names for Success() keyword arguments.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x(foo=1, bar=[2])
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
T().x.run(foo=1, bar=[2])
assert str(exc_info.value) == expected
# Substory inheritance.
expected = """
These variables are already present in the context: 'bar', 'foo'
Function returned value: Q.one
Use different names for Success() keyword arguments.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
Q().a.run()
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These variables are already present in the context: 'bar', 'foo'
Function returned value: T.one
Use different names for Success() keyword arguments.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
J().a.run()
assert str(exc_info.value) == expected
def test_context_variables_normalization(m):
"""
We apply normalization to the context variables if the story
defines a context contract. If a story step returns a string
holding a number, we should store a number in the context.
"""
class T(m.Child, m.StringMethod):
pass
class Q(m.Parent, m.NormalParentMethod, T):
pass
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Simple.
getter = make_collector()
T().x()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
T().x.run()
assert getter().foo == 1
assert getter().bar == [2]
# Substory inheritance.
getter = make_collector()
Q().a()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
Q().a.run()
assert getter().foo == 1
assert getter().bar == [2]
# Substory DI.
getter = make_collector()
J().a()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
J().a.run()
assert getter().foo == 1
assert getter().bar == [2]
def test_context_variables_normalization_conflict(m):
"""
More than one substory can declare an argument with the same name.
This means the validators of both substories must return the same
result.
"""
# FIXME: Normalization conflict can consist of two
# variables. The first variable can be set by one
# substory. The second variable can be set by
# another substory.
class T(m.ParamChild, m.NormalMethod):
pass
class E(m.NextParamChildWithString, m.NormalNextMethod):
pass
class Q(m.SequentialParent, m.StringParentMethod, T, E):
pass
class J(m.SequentialParent, m.StringParentMethod):
def __init__(self):
self.x = T().x
self.y = E().y
# Substory inheritance.
expected = """
These arguments have normalization conflict: 'bar', 'foo'
Story method: Q.x
Story normalization result:
- bar: [2]
- foo: 1
Story method: Q.y
Story normalization result:
- bar: ['2']
- foo: '1'
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
Q().a.run()
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These arguments have normalization conflict: 'bar', 'foo'
Story method: E.y
Story normalization result:
- bar: ['2']
- foo: '1'
Story method: T.x
Story normalization result:
- bar: [2]
- foo: 1
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
J().a.run()
assert str(exc_info.value) == expected
def test_story_arguments_normalization(m):
"""
We apply normalization to the story arguments if the story defines
a context contract. If the story was called with a string argument
holding a number, we should store a number in the context.
"""
class T(m.ParamChild, m.NormalMethod):
pass
class Q(m.ParamParent, m.StringParentMethod, T):
pass
class J(m.ParamParent, m.StringParentMethod):
def __init__(self):
self.x = T().x
# Simple.
getter = make_collector()
T().x(foo="1", bar=["2"])
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
T().x.run(foo="1", bar=["2"])
assert getter().foo == 1
assert getter().bar == [2]
# Substory inheritance.
getter = make_collector()
Q().a(ham="1", eggs="2")
assert getter().ham == 1
assert getter().eggs == 2
getter = make_collector()
Q().a.run(ham="1", eggs="2")
assert getter().ham == 1
assert getter().eggs == 2
# Substory DI.
getter = make_collector()
J().a(ham="1", eggs="2")
assert getter().ham == 1
assert getter().eggs == 2
getter = make_collector()
J().a.run(ham="1", eggs="2")
assert getter().ham == 1
assert getter().eggs == 2
def test_story_arguments_normalization_many_levels(m):
"""
We apply normalization to the story arguments at any level of
story composition.
"""
class T(m.ParamChild, m.NormalMethod):
pass
class Q(m.ParamParent, m.NormalParentMethod, T):
pass
class J(m.ParamParent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
class R(m.ParamRoot, m.NormalRootMethod, Q):
pass
class F(m.ParamRoot, m.NormalRootMethod):
def __init__(self):
self.a = J().a
# Substory inheritance.
getter = make_collector()
Q().a(ham="1", eggs="2", foo="3", bar=["4"])
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
getter = make_collector()
Q().a.run(ham="1", eggs="2", foo="3", bar=["4"])
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
getter = make_collector()
R().i(fizz="0", ham="1", eggs="2", foo="3", bar=["4"])
assert getter().fizz == 0
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
getter = make_collector()
R().i.run(fizz="0", ham="1", eggs="2", foo="3", bar=["4"])
assert getter().fizz == 0
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
# Substory DI.
getter = make_collector()
Q().a(ham="1", eggs="2", foo="3", bar=["4"])
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
getter = make_collector()
Q().a.run(ham="1", eggs="2", foo="3", bar=["4"])
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
getter = make_collector()
F().i(fizz="0", ham="1", eggs="2", foo="3", bar=["4"])
assert getter().fizz == 0
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
getter = make_collector()
F().i.run(fizz="0", ham="1", eggs="2", foo="3", bar=["4"])
assert getter().fizz == 0
assert getter().ham == 1
assert getter().eggs == 2
assert getter().foo == 3
assert getter().bar == [4]
def test_story_arguments_normalization_conflict(m):
"""
Story and substory can have an argument with the same name. They
both will define validators for this argument. If the
normalization results of the two contracts mismatch, we should
raise an error.
"""
class T(m.ParamChild, m.NormalMethod):
pass
class Q(m.ParamParentWithSameWithString, m.NormalParentMethod, T):
pass
class J(m.ParamParentWithSameWithString, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Substory inheritance.
expected = """
These arguments have normalization conflict: 'bar', 'foo'
Story method: Q.a
Story normalization result:
- bar: ['2']
- foo: '1'
Story method: Q.x
Story normalization result:
- bar: [2]
- foo: 1
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a(foo="1", bar=["2"])
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
Q().a.run(foo="1", bar=["2"])
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These arguments have normalization conflict: 'bar', 'foo'
Story method: J.a
Story normalization result:
- bar: ['2']
- foo: '1'
Story method: T.x
Story normalization result:
- bar: [2]
- foo: 1
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a(foo="1", bar=["2"])
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
J().a.run(foo="1", bar=["2"])
assert str(exc_info.value) == expected
def test_context_variables_validation(m):
"""
We apply validators to the context variables if the story defines
a context contract.
"""
class T(m.Child, m.WrongMethod):
pass
class Q(m.Parent, m.NormalParentMethod, T):
pass
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Simple.
expected = """
These variables violates context contract: 'bar', 'foo'
Function returned value: T.one
Violations:
bar:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x()
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
T().x.run()
assert str(exc_info.value).startswith(expected)
# Substory inheritance.
expected = """
These variables violates context contract: 'bar', 'foo'
Function returned value: Q.one
Violations:
bar:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a()
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
Q().a.run()
assert str(exc_info.value).startswith(expected)
# Substory DI.
expected = """
These variables violates context contract: 'bar', 'foo'
Function returned value: T.one
Violations:
bar:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a()
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
J().a.run()
assert str(exc_info.value).startswith(expected)
def test_story_arguments_validation(m):
"""
We apply validators to the story arguments if the story defines a
context contract. This check is performed during the story call,
not during execution.
"""
class T(m.ParamChild, m.ExceptionMethod):
pass
class Q(m.ParamParent, m.ExceptionParentMethod, m.Child, m.NormalMethod):
pass
class J(m.ParamParent, m.ExceptionParentMethod):
def __init__(self):
class T(m.Child, m.NormalMethod):
pass
self.x = T().x
# Simple.
expected = """
These arguments violates context contract: 'bar', 'foo'
Story method: T.x
Violations:
bar:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x(foo="<boom>", bar=["<boom>"])
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
T().x.run(foo="<boom>", bar=["<boom>"])
assert str(exc_info.value).startswith(expected)
# Substory inheritance.
expected = """
These arguments violates context contract: 'eggs', 'ham'
Story method: Q.a
Violations:
eggs:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a(ham="<boom>", eggs="<boom>")
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
Q().a.run(ham="<boom>", eggs="<boom>")
assert str(exc_info.value).startswith(expected)
# Substory DI.
expected = """
These arguments violates context contract: 'eggs', 'ham'
Story method: J.a
Violations:
eggs:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a(ham="<boom>", eggs="<boom>")
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
J().a.run(ham="<boom>", eggs="<boom>")
assert str(exc_info.value).startswith(expected)
def test_story_arguments_validation_many_levels(m):
"""
We apply contract validation to the story arguments at any level
of story composition.
"""
class T(m.ParamChild, m.NormalMethod):
pass
class Q(m.Parent, m.ExceptionParentMethod, T):
pass
class J(m.Parent, m.ExceptionParentMethod):
def __init__(self):
self.x = T().x
class R(m.Root, m.ExceptionRootMethod, m.Parent, m.NormalParentMethod, T):
pass
class F(m.Root, m.ExceptionRootMethod):
def __init__(self):
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
self.a = J().a
# Substory inheritance.
expected = """
These arguments violates context contract: 'foo'
Story method: R.i
Violations:
foo:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
R().i(foo="<boom>", bar=[1])
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
R().i.run(foo="<boom>", bar=[1])
assert str(exc_info.value).startswith(expected)
# Substory DI.
expected = """
These arguments violates context contract: 'foo'
Story method: F.i
Violations:
foo:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
F().i(foo="<boom>", bar=[1])
assert str(exc_info.value).startswith(expected)
with pytest.raises(ContextContractError) as exc_info:
F().i.run(foo="<boom>", bar=[1])
assert str(exc_info.value).startswith(expected)
def test_composition_contract_variable_conflict(m):
"""
Story and substory contracts can not declare the same variable
twice.
"""
class T(m.Child, m.NormalMethod):
pass
class Q(m.ParentWithSame, m.NormalParentMethod, T):
pass
class J(m.ParentWithSame, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Substory inheritance.
expected = """
Repeated variables can not be used in a story composition.
Variables repeated in both context contracts: 'bar', 'baz', 'foo'
Story method: Q.a
Substory method: Q.x
Use variables with different names.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a
assert str(exc_info.value) == expected
# Substory DI.
expected = """
Repeated variables can not be used in a story composition.
Variables repeated in both context contracts: 'bar', 'baz', 'foo'
Story method: J.a
Substory method: T.x
Use variables with different names.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a
assert str(exc_info.value) == expected
def test_composition_contract_variable_conflict_many_levels(m):
"""
Story and substory contracts can not declare the same variable
twice.
"""
class T(m.Child, m.NormalMethod):
pass
class Q(m.Parent, m.NormalParentMethod, T):
pass
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
class R(m.RootWithSame, m.NormalRootMethod, Q):
pass
class F(m.RootWithSame, m.NormalRootMethod):
def __init__(self):
self.a = J().a
# Substory inheritance.
expected = """
Repeated variables can not be used in a story composition.
Variables repeated in both context contracts: 'bar', 'baz', 'foo'
Story method: R.i
Substory method: R.x
Use variables with different names.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
R().i
assert str(exc_info.value) == expected
# Substory DI.
expected = """
Repeated variables can not be used in a story composition.
Variables repeated in both context contracts: 'bar', 'baz', 'foo'
Story method: F.i
Substory method: T.x
Use variables with different names.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
F().i
assert str(exc_info.value) == expected
def test_composition_contract_variable_conflict_sequential(m):
"""
Story and substory contracts can not declare the same variable
twice.
"""
class T(m.Child, m.NormalMethod):
pass
class E(m.NextChildWithSame, m.NormalMethod):
pass
class Q(m.SequentialParent, m.StringParentMethod, T, E):
pass
class J(m.SequentialParent, m.StringParentMethod):
def __init__(self):
self.x = T().x
self.y = E().y
# Substory inheritance.
expected = """
Repeated variables can not be used in a story composition.
Variables repeated in both context contracts: 'bar', 'baz', 'foo'
Story method: Q.x
Substory method: Q.y
Use variables with different names.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a
assert str(exc_info.value) == expected
# Substory DI.
expected = """
Repeated variables can not be used in a story composition.
Variables repeated in both context contracts: 'bar', 'baz', 'foo'
Story method: T.x
Substory method: E.y
Use variables with different names.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a
assert str(exc_info.value) == expected
def test_composition_contract_variable_conflict_sequential_reuse(m):
"""
A story and a substory can reuse the same contract. The substory
can have more arguments than the story. Another sequential
substory can have the same arguments as the previous substory.
"""
class E(m.NextParamChildReuse, m.NormalMethod):
pass
class Q(m.ParamParentWithSame, m.NormalParentMethod):
pass
class V(m.NextParamParentReuse, m.NormalParentMethod, E):
pass
class R(m.SequentialRoot, m.StringWideRootMethod, Q, V):
pass
class F(m.SequentialRoot, m.StringWideRootMethod):
def __init__(self):
self.a = Q().a
self.b = V().b
# Substory inheritance.
R().i()
result = R().i.run()
assert result.value is None
# Substory DI.
F().i()
result = F().i.run()
assert result.value is None
def test_composition_incompatible_contract_types(m):
"""Forbid using different contract types in the story composition."""
class T(m.Child, m.NormalMethod):
pass
class Q(m.ParentWithNull, m.NormalParentMethod, T):
pass
class J(m.ParentWithNull, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Substory inheritance.
expected = """
Story and substory context contracts has incompatible types:
Story method: Q.a
Story context contract: None
Substory method: Q.x
Substory context contract:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a
assert str(exc_info.value).startswith(expected)
# Substory DI.
expected = """
Story and substory context contracts has incompatible types:
Story method: J.a
Story context contract: None
Substory method: T.x
Substory context contract:
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a
assert str(exc_info.value).startswith(expected)
def test_composition_use_same_contract_instance(m):
"""
The same contract class or instance can be used in a story and a
substory. This should not lead to an incompatible contract
composition error. A variable declared there can be assigned in
either story, and it is declared only once within the contract.
"""
class T(m.ChildReuse, m.NormalMethod):
pass
class Q(m.ParentReuse, m.NormalParentMethod, T):
pass
class J(m.ParentReuse, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Substory inheritance.
Q().a
# Substory DI.
J().a
def test_unknown_context_variable(m):
"""
A step can't use a Success() keyword argument name that was not
specified in the contract.
"""
class T(m.Child, m.UnknownMethod):
pass
class Q(m.Parent, m.NormalParentMethod, T):
pass
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Simple.
expected = """
These variables were not defined in the context contract: 'quiz', 'spam'
Available variables are: 'bar', 'baz', 'foo'
Function returned value: T.one
Use different names for Success() keyword arguments or add these names to the contract.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
T().x.run()
assert str(exc_info.value) == expected
# Substory inheritance.
expected = """
These variables were not defined in the context contract: 'quiz', 'spam'
Available variables are: 'bar', 'baz', 'foo'
Function returned value: Q.one
Use different names for Success() keyword arguments or add these names to the contract.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
Q().a.run()
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These variables were not defined in the context contract: 'quiz', 'spam'
Available variables are: 'bar', 'baz', 'foo'
Function returned value: T.one
Use different names for Success() keyword arguments or add these names to the contract.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
J().a.run()
assert str(exc_info.value) == expected
@pytest.mark.parametrize("child", ["ParamChild", "ParamChildWithNull"])
@pytest.mark.parametrize(
"parent,base", [("ParamParent", "Child"), ("ParamParentWithNull", "ChildWithNull")]
)
def test_unknown_story_arguments(m, child, parent, base):
"""
Allow passing only known story and substory arguments to the call.
"""
class T(getattr(m, child), m.NormalMethod):
pass
class Q(getattr(m, parent), m.NormalParentMethod, getattr(m, base), m.NormalMethod):
pass
class J(getattr(m, parent), m.NormalParentMethod):
def __init__(self):
class T(getattr(m, base), m.NormalMethod):
pass
self.x = T().x
# Simple.
expected = """
These arguments are unknown: baz, fox
Story method: T.x
Story composition arguments: foo, bar
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x(baz=1, fox=2)
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
T().x.run(baz=1, fox=2)
assert str(exc_info.value) == expected
# Substory inheritance.
expected = """
These arguments are unknown: beans, fox
Story method: Q.a
Story composition arguments: ham, eggs
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a(beans=1, fox=2)
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
Q().a.run(beans=1, fox=2)
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These arguments are unknown: beans, fox
Story method: J.a
Story composition arguments: ham, eggs
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a(beans=1, fox=2)
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
J().a.run(beans=1, fox=2)
assert str(exc_info.value) == expected
@pytest.mark.parametrize(
"child,parent", [("Child", "Parent"), ("ChildWithNull", "ParentWithNull")]
)
def test_unknown_story_arguments_with_empty(m, child, parent):
"""
Deny any arguments in the call if the story and substory have no
arguments specified.
"""
class T(getattr(m, child), m.NormalMethod):
pass
class Q(getattr(m, parent), m.NormalParentMethod, T):
pass
class J(getattr(m, parent), m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Simple.
expected = """
These arguments are unknown: baz, fox
Story method: T.x
Story composition has no arguments.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x(baz=1, fox=2)
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
T().x.run(baz=1, fox=2)
assert str(exc_info.value) == expected
# Substory inheritance.
expected = """
These arguments are unknown: beans, fox
Story method: Q.a
Story composition has no arguments.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a(beans=1, fox=2)
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
Q().a.run(beans=1, fox=2)
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These arguments are unknown: beans, fox
Story method: J.a
Story composition has no arguments.
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a(beans=1, fox=2)
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
J().a.run(beans=1, fox=2)
assert str(exc_info.value) == expected
def test_require_story_arguments_present_in_context(m):
"""Check story and substory arguments are present in the context."""
class T(m.ParamChildWithNull, m.NormalMethod):
pass
class Q(m.ParentWithNull, m.NormalParentMethod, T):
pass
class J(m.ParentWithNull, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Simple.
expected = """
These variables are missing from the context: bar, foo
Story method: T.x
Story arguments: foo, bar
T.x
Context()
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x() # FIXME: This should be arguments error (not substory call error).
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
T().x.run()
assert str(exc_info.value) == expected
# Substory inheritance.
expected = """
These variables are missing from the context: bar, foo
Story method: Q.x
Story arguments: foo, bar
Q.a
before
x
Context()
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
Q().a.run()
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These variables are missing from the context: bar, foo
Story method: T.x
Story arguments: foo, bar
J.a
before
x (T.x)
Context()
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a()
assert str(exc_info.value) == expected
with pytest.raises(ContextContractError) as exc_info:
J().a.run()
assert str(exc_info.value) == expected
def test_parent_steps_set_story_arguments(m):
"""
Steps of parent stories should be able to set child story
arguments with `Success` marker keyword arguments.
"""
class T(m.ParamChild, m.NormalMethod):
pass
class Q(m.Parent, m.StringParentMethod, T):
pass
class J(m.Parent, m.StringParentMethod):
def __init__(self):
self.x = T().x
class R(m.Root, m.StringRootMethod, m.Parent, m.NormalParentMethod, T):
pass
class F(m.Root, m.StringRootMethod):
def __init__(self):
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
self.a = J().a
# Substory inheritance.
getter = make_collector()
Q().a()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
Q().a.run()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
R().i()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
R().i.run()
assert getter().foo == 1
assert getter().bar == [2]
# Substory DI.
getter = make_collector()
J().a()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
J().a.run()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
F().i()
assert getter().foo == 1
assert getter().bar == [2]
getter = make_collector()
F().i.run()
assert getter().foo == 1
assert getter().bar == [2]
def test_sequential_story_steps_set_story_arguments(m):
"""
There are a few sequential substories with one common parent
story. One substory should be able to set a variable to provide
an argument to the next sequential story.
"""
class T(m.ChildWithShrink, m.StringMethod):
pass
class E(m.NextParamChildWithString, m.NormalNextMethod):
pass
class Q(m.SequentialParent, m.NormalParentMethod, T, E):
pass
class J(m.SequentialParent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
self.y = E().y
# Substory inheritance.
getter = make_collector()
Q().a()
assert getter().foo == "1"
assert getter().bar == ["2"]
getter = make_collector()
Q().a.run()
assert getter().foo == "1"
assert getter().bar == ["2"]
# Substory DI.
getter = make_collector()
J().a()
assert getter().foo == "1"
assert getter().bar == ["2"]
getter = make_collector()
J().a.run()
assert getter().foo == "1"
assert getter().bar == ["2"]
def test_arguments_should_be_declared_in_contract(m):
"""
We should require all story arguments to be declared in the
context contract.
"""
class T(m.ParamChildWithShrink, m.NormalMethod):
pass
class Q(m.Parent, m.NormalParentMethod, T):
pass
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# Simple.
expected = """
These arguments should be declared in the context contract: bar, foo
Story method: T.x
Story arguments: foo, bar, baz
""".strip()
with pytest.raises(ContextContractError) as exc_info:
T().x
assert str(exc_info.value) == expected
# Substory inheritance.
expected = """
These arguments should be declared in the context contract: bar, foo
Story method: Q.x
Story arguments: foo, bar, baz
""".strip()
with pytest.raises(ContextContractError) as exc_info:
Q().a
assert str(exc_info.value) == expected
# Substory DI.
expected = """
These arguments should be declared in the context contract: bar, foo
Story method: T.x
Story arguments: foo, bar, baz
""".strip()
with pytest.raises(ContextContractError) as exc_info:
J().a
assert str(exc_info.value) == expected
# Aliases.
def test_story_variable_alias_normalization_store_same_object(m):
"""
When a story step sets a set of variables, some of them may be
aliases of each other. If the type and the value of an alias are
equal to the origin value, we should preserve the same reference
to the value.
"""
class T(m.ChildAlias, m.AliasMethod):
pass
# Simple.
getter = make_collector()
T().x()
assert getter().foo is getter().bar
assert getter().foo == {"key": "1"}
assert getter().bar == {"key": "1"}
assert getter().baz == {"key": 1}
getter = make_collector()
T().x.run()
assert getter().foo is getter().bar
assert getter().foo == {"key": "1"}
assert getter().bar == {"key": "1"}
assert getter().baz == {"key": 1}
# FIXME: Substory inheritance.
# FIXME: Substory DI.
def test_story_argument_alias_normalization_store_same_object(m):
"""
When a story has a set of arguments, some of them may be aliases
of each other. If the type and the value of an alias are equal to
the origin value, we should preserve the same reference to the
value.
"""
class T(m.ParamChildAlias, m.NormalMethod):
pass
# Simple.
value = {"key": "1"}
getter = make_collector()
T().x(foo=value, bar=value, baz=value)
assert getter().foo is getter().bar
assert getter().foo == {"key": "1"}
assert getter().bar == {"key": "1"}
assert getter().baz == {"key": 1}
getter = make_collector()
T().x.run(foo=value, bar=value, baz=value)
assert getter().foo is getter().bar
assert getter().foo == {"key": "1"}
assert getter().bar == {"key": "1"}
assert getter().baz == {"key": 1}
# FIXME: Substory inheritance.
# FIXME: Substory DI.
# Representation.
def test_story_contract_representation_with_spec(m):
"""
Show collected story composition contract as mounted story
attribute.
"""
class T(m.Child, m.StringMethod):
pass
class Q(m.Parent, m.NormalParentMethod, T):
pass
class J(m.Parent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
class R(m.Root, m.NormalRootMethod, Q):
pass
class F(m.Root, m.NormalRootMethod):
def __init__(self):
self.a = J().a
# Simple.
expected = """
Contract:
foo: ... # T.x variable
bar: ... # T.x variable
baz: ... # T.x variable
""".strip()
assert repr(T().x.contract) == expected
# Substory inheritance.
expected = """
Contract:
ham: ... # Q.a variable
eggs: ... # Q.a variable
beans: ... # Q.a variable
foo: ... # Q.x variable
bar: ... # Q.x variable
baz: ... # Q.x variable
""".strip()
assert repr(Q().a.contract) == expected
expected = """
Contract:
fizz: ... # R.i variable
buzz: ... # R.i variable
ham: ... # R.a variable
eggs: ... # R.a variable
beans: ... # R.a variable
foo: ... # R.x variable
bar: ... # R.x variable
baz: ... # R.x variable
""".strip()
assert repr(R().i.contract) == expected
# Substory DI.
expected = """
Contract:
ham: ... # J.a variable
eggs: ... # J.a variable
beans: ... # J.a variable
foo: ... # T.x variable
bar: ... # T.x variable
baz: ... # T.x variable
""".strip()
assert repr(J().a.contract) == expected
expected = """
Contract:
fizz: ... # F.i variable
buzz: ... # F.i variable
ham: ... # J.a variable
eggs: ... # J.a variable
beans: ... # J.a variable
foo: ... # T.x variable
bar: ... # T.x variable
baz: ... # T.x variable
""".strip()
assert repr(F().i.contract) == expected
def test_story_contract_representation_with_spec_with_args(m):
"""
Show collected story composition contract as mounted story
attribute. We show each story's arguments.
"""
class T(m.ParamChild, m.StringMethod):
pass
class Q(m.ParamParent, m.NormalParentMethod, T):
pass
class J(m.ParamParent, m.NormalParentMethod):
def __init__(self):
self.x = T().x
class R(m.ParamRoot, m.NormalRootMethod, Q):
pass
class F(m.ParamRoot, m.NormalRootMethod):
def __init__(self):
self.a = J().a
# Simple.
expected = """
Contract:
foo: ... # T.x argument
bar: ... # T.x argument
baz: ... # T.x variable
""".strip()
assert repr(T().x.contract) == expected
# Substory inheritance.
expected = """
Contract:
ham: ... # Q.a argument
eggs: ... # Q.a argument
foo: ... # Q.x argument
bar: ... # Q.x argument
beans: ... # Q.a variable
baz: ... # Q.x variable
""".strip()
assert repr(Q().a.contract) == expected
expected = """
Contract:
fizz: ... # R.i argument
ham: ... # R.a argument
eggs: ... # R.a argument
foo: ... # R.x argument
bar: ... # R.x argument
buzz: ... # R.i variable
beans: ... # R.a variable
baz: ... # R.x variable
""".strip()
assert repr(R().i.contract) == expected
# Substory DI.
expected = """
Contract:
ham: ... # J.a argument
eggs: ... # J.a argument
foo: ... # T.x argument
bar: ... # T.x argument
beans: ... # J.a variable
baz: ... # T.x variable
""".strip()
assert repr(J().a.contract) == expected
expected = """
Contract:
fizz: ... # F.i argument
ham: ... # J.a argument
eggs: ... # J.a argument
foo: ... # T.x argument
bar: ... # T.x argument
buzz: ... # F.i variable
beans: ... # J.a variable
baz: ... # T.x variable
""".strip()
assert repr(F().i.contract) == expected
def test_story_contract_representation_with_spec_with_args_conflict(m):
"""
Show collected story composition contract as mounted story
attribute. An argument is shown in multiline mode if the same
name was declared in multiple substories.
"""
class T(m.ParamChild, m.NormalMethod):
pass
class Q(m.ParamParentWithSameWithString, m.NormalParentMethod, T):
pass
class J(m.ParamParentWithSameWithString, m.NormalParentMethod):
def __init__(self):
self.x = T().x
# FIXME: Implement this.
#
# class R(..., m.NormalRootMethod, Q):
# pass
#
# class F(..., m.NormalRootMethod):
# def __init__(self):
# self.a = J().a
# Substory inheritance.
expected = """
Contract:
foo:
... # Q.a argument
... # Q.x argument
bar:
... # Q.a argument
... # Q.x argument
baz: ... # Q.x variable
""".strip()
assert repr(Q().a.contract) == expected
# expected = """
# Contract:
# fizz: ... # R.i argument
# ham: ... # R.a argument
# eggs: ... # R.a argument
# foo: ... # R.x argument
# bar: ... # R.x argument
# buzz: ... # R.i variable
# beans: ... # R.a variable
# baz: ... # R.x variable
# """.strip()
#
# assert repr(R().i.contract) == expected
# Substory DI.
expected = """
Contract:
foo:
... # J.a argument
... # T.x argument
bar:
... # J.a argument
... # T.x argument
baz: ... # T.x variable
""".strip()
assert repr(J().a.contract) == expected
# expected = """
# Contract:
# fizz: ... # F.i argument
# ham: ... # J.a argument
# eggs: ... # J.a argument
# foo: ... # T.x argument
# bar: ... # T.x argument
# buzz: ... # F.i variable
# beans: ... # J.a variable
# baz: ... # T.x variable
# """.strip()
#
# assert repr(F().i.contract) == expected
# File: tests/unit_tests/test_tethys_portal/test_views/test_psa.py
# Repo: ezrajrice/tethys (license: BSD-2-Clause)
import unittest
from unittest import mock
from django.http import HttpResponseBadRequest
from django.test import override_settings
from django.contrib.auth import REDIRECT_FIELD_NAME
from django.core.exceptions import ImproperlyConfigured
from social_core.backends.base import BaseAuth
from social_django.views import _do_login
from tethys_services.backends.multi_tenant_mixin import MultiTenantMixin
from tethys_portal.forms import SsoTenantForm
from .mock_decorator import mock_decorator
# Fixes the Cache-Control error in tests. Must appear before view imports.
mock.patch('django.views.decorators.cache.never_cache', lambda x: x).start()
mock.patch('social_django.utils.psa', side_effect=mock_decorator).start()
from tethys_portal.views.psa import tenant, auth, complete # noqa: E402
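The patch-before-import trick used above can be reproduced in isolation. The following is a hypothetical, standalone sketch (not part of the test module): `functools.lru_cache` stands in for `never_cache`, and the patch must be started before the decorated function is defined, just as the patches above must run before the view imports.

```python
from unittest import mock
import functools

# Start the patch BEFORE the decorated function is defined, so the
# decoration applied at definition/import time becomes a no-op.
patcher = mock.patch("functools.lru_cache", lambda *args, **kwargs: (lambda f: f))
patcher.start()

@functools.lru_cache(maxsize=None)
def double(x):
    return 2 * x

patcher.stop()

assert double(3) == 6
assert not hasattr(double, "cache_info")  # decoration was a no-op
```

Stopping the patch afterwards does not "re-decorate" `double`; only definitions made while the patch was active are affected.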
class TethysPortalViewsAccountsTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
@mock.patch('tethys_portal.views.psa.do_auth')
def test_auth_not_mtm(self, mock_do_auth):
mock_backend = mock.MagicMock(spec=BaseAuth) # Not a MultiTenantMixin
mock_backend.name = 'other backend'
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = auth(mock_request, url_backend)
mock_do_auth.assert_called_with(mock_backend, redirect_name=REDIRECT_FIELD_NAME)
self.assertEqual(mock_do_auth(), ret)
@mock.patch('tethys_portal.views.psa.do_auth')
def test_auth_is_mtm_multi_tenant_none(self, mock_do_auth):
mock_backend = mock.MagicMock(spec=MultiTenantMixin)
mock_backend.name = 'foo'
mock_backend.setting = mock.MagicMock(return_value=None) # MULTI_TENANT returns None
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = auth(mock_request, url_backend)
mock_do_auth.assert_called_with(mock_backend, redirect_name=REDIRECT_FIELD_NAME)
self.assertEqual(mock_do_auth(), ret)
@mock.patch('tethys_portal.views.psa.redirect')
@mock.patch('tethys_portal.views.psa.do_auth')
def test_auth_is_mtm_configured(self, mock_do_auth, mock_redirect):
mock_backend = mock.MagicMock(spec=MultiTenantMixin)
mock_backend.name = 'foo'
mock_backend.setting = mock.MagicMock(return_value={'foo bar': {}}) # MULTI_TENANT returns settings
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = auth(mock_request, url_backend)
mock_redirect.assert_called_with('social:tenant', backend=url_backend)
mock_do_auth.assert_not_called()
self.assertEqual(mock_redirect(), ret)
@mock.patch('tethys_portal.views.psa.do_complete')
def test_complete_not_mtm(self, mock_do_complete):
mock_backend = mock.MagicMock(spec=BaseAuth) # Not a MultiTenantMixin
mock_backend.name = 'other backend'
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = complete(mock_request, url_backend)
mock_do_complete.assert_called_with(
mock_backend,
_do_login,
user=mock_request.user,
redirect_name=REDIRECT_FIELD_NAME,
request=mock_request,
)
self.assertEqual(mock_do_complete(), ret)
@mock.patch('tethys_portal.views.psa.log')
@mock.patch('tethys_portal.views.psa.redirect')
@mock.patch('tethys_portal.views.psa.do_complete')
def test_complete_is_mtm_no_saved_tenant(self, mock_do_complete, mock_redirect, mock_log):
mock_backend = mock.MagicMock(
spec=MultiTenantMixin,
strategy=mock.MagicMock(
session_get=mock.MagicMock(return_value=None)
)
)
mock_backend.name = 'foo'
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = complete(mock_request, url_backend)
mock_log.error.assert_called_with('Session contains no value for "tenant".')
mock_redirect.assert_called_with('accounts:login')
mock_do_complete.assert_not_called()
self.assertEqual(mock_redirect(), ret)
@mock.patch('tethys_portal.views.psa.do_complete')
def test_complete_is_mtm_tenant_valid(self, mock_do_complete):
mock_backend = mock.MagicMock(
spec=MultiTenantMixin,
strategy=mock.MagicMock(
session_get=mock.MagicMock(return_value='Foo')
)
)
mock_backend.name = 'foo'
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = complete(mock_request, url_backend)
self.assertEqual(mock_backend.tenant, 'Foo')
mock_do_complete.assert_called_with(
mock_backend,
_do_login,
user=mock_request.user,
redirect_name=REDIRECT_FIELD_NAME,
request=mock_request,
)
self.assertEqual(mock_do_complete(), ret)
@mock.patch('tethys_portal.views.psa.log')
@mock.patch('tethys_portal.views.psa.redirect')
@mock.patch('tethys_portal.views.psa.do_complete')
def test_complete_is_mtm_improperly_configured(self, mock_do_complete, mock_redirect, mock_log):
mock_backend = mock.MagicMock(
spec=MultiTenantMixin,
strategy=mock.MagicMock(
session_get=mock.MagicMock(return_value='Foo')
)
)
type(mock_backend).tenant = mock.PropertyMock(side_effect=ImproperlyConfigured('some error'))
mock_backend.name = 'foo'
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = complete(mock_request, url_backend)
mock_log.error.assert_called_with('some error')
mock_redirect.assert_called_with('accounts:login')
mock_do_complete.assert_not_called()
self.assertEqual(mock_redirect(), ret)
@mock.patch('tethys_portal.views.psa.log')
@mock.patch('tethys_portal.views.psa.redirect')
@mock.patch('tethys_portal.views.psa.do_complete')
def test_complete_is_mtm_value_error(self, mock_do_complete, mock_redirect, mock_log):
mock_backend = mock.MagicMock(
spec=MultiTenantMixin,
strategy=mock.MagicMock(
session_get=mock.MagicMock(return_value='Foo')
)
)
type(mock_backend).tenant = mock.PropertyMock(side_effect=ValueError('some error'))
mock_backend.name = 'foo'
mock_request = mock.MagicMock(method='GET', backend=mock_backend) # GET request
url_backend = 'foo'
ret = complete(mock_request, url_backend)
mock_log.error.assert_called_with('some error')
mock_redirect.assert_called_with('accounts:login')
mock_do_complete.assert_not_called()
self.assertEqual(mock_redirect(), ret)
@override_settings(SSO_TENANT_ALIAS='foo bar')
@mock.patch('tethys_portal.views.psa.log')
@mock.patch('tethys_portal.views.psa.redirect')
def test_tenant_get_backend_not_mtm(self, mock_redirect, mock_log):
mock_backend = mock.MagicMock(spec=BaseAuth) # Not a MultiTenantMixin
mock_backend.name = 'other backend'
mock_request = mock.MagicMock(method='GET', GET=dict(), backend=mock_backend) # GET request
url_backend = 'foo'
ret = tenant(mock_request, backend=url_backend)
mock_log.error.assert_called_with('Backend "other backend" does not support MULTI_TENANT features.')
mock_redirect.assert_called_with('accounts:login')
self.assertEqual(mock_redirect(), ret)
@override_settings(SSO_TENANT_ALIAS='foo bar')
@mock.patch('tethys_portal.views.psa.SsoTenantForm', spec=SsoTenantForm)
@mock.patch('tethys_portal.views.psa.render')
def test_tenant_get(self, mock_render, mock_tenant_form):
mock_backend = mock.MagicMock(spec=MultiTenantMixin)
mock_request = mock.MagicMock(method='GET', GET=dict(), backend=mock_backend) # GET request
url_backend = 'foo'
ret = tenant(mock_request, backend=url_backend)
mock_tenant_form.assert_called()
mock_render.assert_called_with(
mock_request,
'tethys_portal/accounts/sso_tenant.html',
{
'form': mock_tenant_form(),
'form_title': 'Foo Bar',
'page_title': 'Foo Bar',
'backend': url_backend
}
)
self.assertEqual(mock_render(), ret)
@override_settings(SSO_TENANT_ALIAS='foo bar')
@mock.patch('tethys_portal.forms.SsoTenantForm', spec=SsoTenantForm)
def test_tenant_view_post_no_submit(self, mock_tenant_form):
mock_backend = mock.MagicMock(spec=MultiTenantMixin)
mock_request = mock.MagicMock(method='POST', POST=dict(), backend=mock_backend) # Empty POST dict
url_backend = 'foo'
ret = tenant(mock_request, url_backend)
mock_tenant_form.assert_not_called()
self.assertIsInstance(ret, HttpResponseBadRequest)
@override_settings(SSO_TENANT_ALIAS='foo bar')
@mock.patch('tethys_portal.views.psa.do_auth')
@mock.patch('tethys_portal.views.psa.SsoTenantForm', spec=SsoTenantForm)
def test_tenant_view_post_valid(self, mock_tenant_form, mock_do_auth):
mock_tenant_form.is_valid = mock.MagicMock(return_value=True)
mock_tenant_form().cleaned_data = {'tenant': 'GitHub'}
mock_backend = mock.MagicMock(spec=MultiTenantMixin)
post_params = {
'sso-tenant-submit': 'submit',
'tenant': 'GitHub',
'remember': False
}
mock_request = mock.MagicMock(method='POST', POST=post_params, backend=mock_backend) # valid POST request
url_backend = 'foo'
ret = tenant(mock_request, url_backend)
# Make sure form is bound to POST data
mock_tenant_form.assert_called_with(mock_request.POST)
mock_tenant_form().is_valid.assert_called()
mock_do_auth.assert_called_with(mock_backend, redirect_name=REDIRECT_FIELD_NAME)
self.assertEqual('GitHub', mock_backend.tenant)
self.assertEqual(mock_do_auth(), ret)
@override_settings(SSO_TENANT_ALIAS='foo bar')
@mock.patch('tethys_portal.views.psa.log')
@mock.patch('tethys_portal.views.psa.redirect')
@mock.patch('tethys_portal.views.psa.do_auth')
@mock.patch('tethys_portal.views.psa.SsoTenantForm', spec=SsoTenantForm)
def test_tenant_view_post_improperly_configured(self, mock_tenant_form, mock_do_auth, mock_redirect, mock_log):
mock_tenant_form.is_valid = mock.MagicMock(return_value=True)
mock_tenant_form().cleaned_data = {'tenant': 'GitHub'}
mock_backend = mock.MagicMock(spec=MultiTenantMixin)
type(mock_backend).tenant = mock.PropertyMock(side_effect=ImproperlyConfigured('some error message'))
post_params = {
'sso-tenant-submit': 'submit',
'tenant': 'GitHub',
'remember': False
}
mock_request = mock.MagicMock(method='POST', POST=post_params, backend=mock_backend) # valid POST request
url_backend = 'foo'
ret = tenant(mock_request, url_backend)
# Make sure form is bound to POST data
mock_tenant_form.assert_called_with(mock_request.POST)
mock_tenant_form().is_valid.assert_called()
mock_do_auth.assert_not_called()
mock_log.error.assert_called_with('some error message')
mock_redirect.assert_called_with('accounts:login')
self.assertEqual(mock_redirect(), ret)
@override_settings(SSO_TENANT_ALIAS='Thingy')
@mock.patch('tethys_portal.views.psa.render')
@mock.patch('tethys_portal.views.psa.do_auth')
@mock.patch('tethys_portal.views.psa.SsoTenantForm', spec=SsoTenantForm)
def test_tenant_view_post_value_error(self, mock_tenant_form, mock_do_auth, mock_render):
mock_tenant_form.is_valid = mock.MagicMock(return_value=True)
mock_tenant_form().cleaned_data = {'tenant': 'GitHub'}
mock_backend = mock.MagicMock(spec=MultiTenantMixin)
type(mock_backend).tenant = mock.PropertyMock(side_effect=ValueError)
post_params = {
'sso-tenant-submit': 'submit',
'tenant': 'GitHub',
'remember': False
}
mock_request = mock.MagicMock(method='POST', POST=post_params, backend=mock_backend) # valid POST request
url_backend = 'foo'
ret = tenant(mock_request, url_backend)
# Make sure form is bound to POST data
mock_tenant_form.assert_called_with(mock_request.POST)
mock_tenant_form().is_valid.assert_called()
mock_do_auth.assert_not_called()
mock_tenant_form().add_error.assert_called_with('tenant', 'Invalid thingy provided.')
mock_render.assert_called_with(
mock_request,
'tethys_portal/accounts/sso_tenant.html',
{
'form': mock_tenant_form(),
'form_title': 'Thingy',
'page_title': 'Thingy',
'backend': url_backend
}
)
self.assertEqual(mock_render(), ret)
# File: grammar_school/main.py
# Repo: cbfield/grammar_school (license: MIT)
def do_something_cool():
return 'Something Cool'
# File: tests/internal/instance_type/test_instance_type_c6i_auto.py
# Repo: frolovv/aws.ec2.compare (license: Apache-2.0)
# Testing module instance_type.c6i
import pytest
import ec2_compare.internal.instance_type.c6i
def test_get_internal_data_instance_type_c6i_get_instances_list():
assert len(ec2_compare.internal.instance_type.c6i.get_instances_list()) > 0
def test_get_internal_data_instance_type_c6i_get():
assert len(ec2_compare.internal.instance_type.c6i.get) > 0
# File: retinopathy/models/heads/rank.py
# Repo: RamsteinWR/Diabetic-Retinopathy-Blindness-Detection (license: MIT)
from torch import nn
from retinopathy.rank_pooling import GlobalRankPooling
class RankPoolingHeadModel(nn.Module):
def __init__(self, feature_maps, num_classes: int, dropout=0.):
super().__init__()
self.features_size = feature_maps[-1]
self.rank_pool = GlobalRankPooling(self.features_size, 16 * 16)
self.dropout = nn.AlphaDropout(dropout)
self.logits = nn.Linear(self.features_size, num_classes)
# Regression to grade using SSD-like module
self.regression = nn.Sequential(
nn.Linear(self.features_size, 16),
nn.ELU(inplace=True),
nn.Linear(16, 16),
nn.ELU(inplace=True),
nn.Linear(16, 1)
)
self.ordinal = nn.Linear(self.features_size, num_classes - 1)
def forward(self, features):
features = self.rank_pool(features[-1])
features = self.dropout(features)
logits = self.logits(features)
regression = self.regression(features)
if regression.size(1) == 1:
regression = regression.squeeze(1)
ordinal = self.ordinal(features).sigmoid().sum(dim=1)
return {
'features': features,
'logits': logits,
'regression': regression,
'ordinal': ordinal
}
class RankPoolingHeadModelV2(nn.Module):
def __init__(self, feature_maps, num_classes: int, dropout=0.):
super().__init__()
self.features_size = 512
self.bottleneck = nn.Conv2d(feature_maps[-1], self.features_size, kernel_size=1)
self.rank_pool = GlobalRankPooling(self.features_size, 16 * 16)
self.dropout = nn.AlphaDropout(dropout)
self.logits = nn.Linear(self.features_size, num_classes)
# Regression to grade using SSD-like module
self.regression = nn.Sequential(
nn.Linear(self.features_size, 16),
nn.ELU(inplace=True),
nn.Linear(16, 16),
nn.ELU(inplace=True),
nn.Linear(16, 1)
)
self.ordinal = nn.Linear(self.features_size, num_classes - 1)
def forward(self, features):
features = self.bottleneck(self.dropout(features[-1]))
features = self.rank_pool(features)
logits = self.logits(features)
regression = self.regression(features)
if regression.size(1) == 1:
regression = regression.squeeze(1)
ordinal = self.ordinal(features).sigmoid().sum(dim=1)
return {
'features': features,
'logits': logits,
'regression': regression,
'ordinal': ordinal
}
# File: codegen_template.py
# Repo: septimit/odatapy-client (license: MIT)
# OData Python Client and Server Libraries ver. 1.0.0
# Copyright (c) Microsoft Corporation
# All rights reserved.
# MIT License
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
# THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
# class_name, edm_namespace, edm_name
EDM_INFO = r"""
_namespace = r"{1}"
_typename = r"{2}"
@staticmethod
def get_full_name():
return {0}._namespace + '.' + {0}._typename
@staticmethod
def get_type_name():
return {0}._typename
"""
GET_ROOT_URL = r"""
def get_root_url(self):
if self._service_context is not None:
return self._service_context.get_root_url()
else:
return ""
"""
# edm_namespace
GET_ENUM_TYPE_NAMESPACE = r"""
@staticmethod
def get_enum_type_namespace():
return "{0}"
"""
BEGIN_GET_ENUM_TYPE_FROM_STRING = r"""
@staticmethod
def get_enum_value_from_string(enum_string):"""
# class_name, edm_name, class_member_name
ON_GET_ENUM_TYPE_FROM_STRING = r"""
if enum_string == "{1}":
return {0}.{2}"""
END_GET_ENUM_TYPE_FROM_STRING = r"""
return 0
"""
BEGIN_GET_STRING_FROM_ENUM_TYPE = r"""
@staticmethod
def get_string_from_enum_value(enum_value):"""
# class_name, edm_name, class_member_name
ON_GET_STRING_FROM_ENUM_TYPE = r'''
if enum_value == {0}.{2}:
return "{1}"'''
END_GET_STRING_FROM_ENUM_TYPE = r"""
return ""
"""
# base_class_name
BEGIN_COMPLEX_CONSTRUCTOR = r"""
def __init__(self, service_context):
{0}.__init__(self, service_context)"""
# class_member_name, default_value
ON_PROPERTY_IN_COMPLEX_CONSTRUCTOR = r"""
self._{0} = {1}"""
# class_member_name, edm_name, primitive_resolver
PRIMITIVE_PROPERTY_IN_COMPLEX_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_value):
self._{0} = property_value
def _get_{0}_from_complex(self, complex):
property_value = odata_client_python.odata_value()
if not complex.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
primitive_value = odata_client_python.to_primitive_value(property_value)
if primitive_value is not None:
try:
self._{0} = {2}
except Exception:
self._{0} = primitive_value.to_string()
def _set_{0}_to_complex(self, complex):
if complex is None or self._{0} is None:
return
complex.set_value("{1}", self._{0})
"""
# class_member_name, edm_name, class_member_type
COMPLEX_PROPERTY_IN_COMPLEX_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_value):
self._{0} = property_value
def _get_{0}_from_complex(self, complex):
property_value = odata_client_python.odata_value()
if not complex.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
if property_value.get_value_type().get_type_kind() == odata_client_python.Complex:
complex_value = odata_client_python.to_complex_value(property_value)
self._{0} = {2}.create_instance_from_complex(complex_value, self._service_context)
def _set_{0}_to_complex(self, complex):
if complex is None or self._{0} is None:
return
complex.set_value("{1}", self._{0}.to_value())
"""
# class_member_name, edm_name, class_member_type
ENUM_PROPERTY_IN_COMPLEX_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_value):
self._{0} = property_value
def _get_{0}_from_complex(self, complex):
if complex is None:
return
property_value = odata_client_python.odata_value()
if not complex.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
enum_value = odata_client_python.to_enum_value(property_value)
if enum_value is not None:
self._{0} = {2}.get_enum_value_from_string(enum_value.to_string())
def _set_{0}_to_complex(self, complex):
if complex is None or self._{0} is None:
return
complex_type = odata_client_python.to_complex_type(complex.get_value_type())
if complex_type is None:
return
edm_property = complex_type.find_property("{1}")
if edm_property is None:
return
property_type = edm_property.get_property_type()
enum_value = odata_client_python.odata_enum_value(property_type, {2}.get_string_from_enum_value(self._{0}))
complex.set_value("{1}", enum_value)
"""
# class_member_name, edm_name, primitive_resolver
COLLECTION_PRIMITIVE_PROPERTY_IN_COMPLEX_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_values):
self._{0} = property_values
def add_to_{0}(self, property_value):
if self._{0} is None:
self._{0} = []
self._{0}.append(property_value)
def _get_{0}_from_complex(self, complex):
if complex is None:
return
property_value = odata_client_python.odata_value()
if not complex.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
property_collection_value = odata_client_python.to_collection_value(property_value)
if property_collection_value is None:
return
self._{0} = []
for odata_value in property_collection_value.get_collection_values():
primitive_value = odata_client_python.to_primitive_value(odata_value)
if primitive_value is None:
continue
try:
value = {2}
except Exception:
value = primitive_value.to_string()
self._{0}.append(value)
def _set_{0}_to_complex(self, complex):
if complex is None or self._{0} is None:
return
complex_type = odata_client_python.to_complex_type(complex.get_value_type())
if complex_type is None:
return
edm_property = complex_type.find_property("{1}")
if edm_property is None:
return
property_type = edm_property.get_property_type()
collection_value_type = odata_client_python.to_collection_type(property_type)
if collection_value_type is None:
return
collection_value = odata_client_python.odata_collection_value(collection_value_type)
for primitive in self._{0}:
collection_value.add_collection_value(odata_client_python.odata_primitive_value.make_primitive_value(primitive))
complex.set_value("{1}", collection_value)
"""
# class_member_name, edm_name, class_member_type
COLLECTION_ENUM_PROPERTY_IN_COMPLEX_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_values):
self._{0} = property_values
def add_to_{0}(self, property_value):
if self._{0} is None:
self._{0} = []
self._{0}.append(property_value)
def _get_{0}_from_complex(self, complex):
if complex is None:
return
property_value = odata_client_python.odata_value()
if not complex.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
property_collection_value = odata_client_python.to_collection_value(property_value)
if property_collection_value is None:
return
self._{0} = []
for odata_value in property_collection_value.get_collection_values():
enum_value = odata_client_python.to_enum_value(odata_value)
if enum_value is None:
continue
self._{0}.append({2}.get_enum_value_from_string(enum_value.to_string()))
def _set_{0}_to_complex(self, complex):
if self._{0} is None:
return
if complex is None or complex.get_value_type() is None:
return
complex_type = odata_client_python.to_complex_type(complex.get_value_type())
if complex_type is None:
return
edm_property = complex_type.find_property("{1}")
if edm_property is None:
return
property_type = edm_property.get_property_type()
collection_value_type = odata_client_python.to_collection_type(property_type)
if collection_value_type is None:
return
collection_value = odata_client_python.odata_collection_value(collection_value_type)
for property in self._{0}:
collection_value.add_collection_value(odata_client_python.odata_enum_value(collection_value_type.get_element_type(), {2}.get_string_from_enum_value(property)))
complex.set_value("{1}", collection_value)
"""
# class_name, base_class_name
BEGIN_COMPLEX_INSTANCE_CREATOR = r"""
@staticmethod
def create_instance_from_complex(complex_value, service_context):
if complex_value is None:
return None
real_type_name = complex_value.get_value_type().get_name()
if real_type_name != "{0}":
if real_type_name in {0}._derived_creator_map:
instance = {0}._derived_creator_map[real_type_name](service_context)
else:
return None
else:
instance = {0}(service_context)
instance.from_value(complex_value)
return instance
def from_value(self, complex_value):
self._{1}__from_value(complex_value)"""
# class_member_name
ON_PROPERTY_IN_COMPLEX_INSTANCE_CREATOR = r"""
self._get_{0}_from_complex(complex_value)"""
END_COMPLEX_INSTANCE_CREATOR = r"""
__from_value = from_value
"""
BEGIN_TO_COMPLEX_VALUE = r"""
def to_value(self):
if self._service_context is None or self._service_context.get_edm_model() is None:
return None
complex_type = self._service_context.get_edm_model().find_complex_type(self._typename)
complex_value = odata_client_python.odata_complex_value(complex_type)"""
# base_class_name
BEGIN_TO_COMPLEX_VALUE_WITH_BASE_CLASS = r"""
def to_value(self):
if self._service_context is None or self._service_context.get_edm_model() is None:
return None
complex_type = self._service_context.get_edm_model().find_complex_type(self._typename)
complex_value = self._{0}__to_value()
complex_value.set_value_type(complex_type)"""
# class_member_name
ON_TO_COMPLEX_VALUE = r"""
self._set_{0}_to_complex(complex_value)"""
END_TO_COMPLEX_VALUE = r"""
if self._namespace != "" and self._typename != "":
complex_value.set_value(odata_client_python.odata_json_constants.PAYLOAD_ANNOTATION_TYPE, odata_client_python.odata_primitive_value(odata_client_python.edm_payload_annotation_type(odata_client_python.odata_json_constants.PAYLOAD_ANNOTATION_TYPE), "#" + self._namespace + "." + self._typename))
return complex_value
__to_value = to_value
"""
# base_class_name
BEGIN_ENTITY_CONSTRUCTOR = r"""
def __init__(self, service_context):
{0}.__init__(self, service_context)
"""
# class_member_name, default_value
ON_PROPERTY_IN_ENTITY_CONSTRUCTOR = r"""
self._{0} = {1}"""
# class_member_name, edm_name, primitive_resolver
PRIMITIVE_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_value):
self._{0} = property_value
def _get_{0}_from_entity(self, entity):
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
primitive_value = odata_client_python.to_primitive_value(property_value)
if primitive_value is not None:
try:
self._{0} = {2}
except Exception:
self._{0} = primitive_value.to_string()
def _set_{0}_to_entity(self, entity):
if entity is None or self._{0} is None:
return
entity.set_value("{1}", self._{0})
"""
# class_member_name, edm_name, class_member_type
ENUM_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_value):
self._{0} = property_value
def _get_{0}_from_entity(self, entity):
if entity is None:
return
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
enum_value = odata_client_python.to_enum_value(property_value)
if enum_value is not None:
self._{0} = {2}.get_enum_value_from_string(enum_value.to_string())
def _set_{0}_to_entity(self, entity):
if entity is None or self._{0} is None:
return
entity_type = odata_client_python.to_entity_type(entity.get_value_type())
if entity_type is None:
return
edm_property = entity_type.find_property("{1}")
if edm_property is None:
return
property_type = edm_property.get_property_type()
enum_value = odata_client_python.odata_enum_value(property_type, {2}.get_string_from_enum_value(self._{0}))
entity.set_value("{1}", enum_value)
"""
# class_member_name, edm_name, class_member_type
COMPLEX_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_value):
self._{0} = property_value
def _get_{0}_from_entity(self, entity):
if entity is None:
return
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
if property_value.get_value_type().get_type_kind() == odata_client_python.Complex:
complex_value = odata_client_python.to_complex_value(property_value)
self._{0} = {2}.create_instance_from_complex(complex_value, self._service_context)
def _set_{0}_to_entity(self, entity):
if entity is None or self._{0} is None:
return
entity.set_value("{1}", self._{0}.to_value())
"""
# class_member_name, edm_name, class_member_type
NAVIGATION_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, navigation_value):
self._{0} = navigation_value
def load_{0}(self):
if self._service_context is None:
return
path = self._service_context.get_relative_path(self._edit_link) + '/' + "{1}"
values = self._service_context.query(path)
if values is None:
return
self._{0} = None
if len(values) == 1:
entity_value = odata_client_python.to_entity_value(values[0])
if entity_value is None:
return
self._{0} = {2}.create_instance_from_entity(entity_value, self._service_context)
def _get_{0}_from_entity(self, entity):
if entity is None:
return
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
if property_value.get_value_type().get_type_kind() == odata_client_python.Entity:
entity_value = odata_client_python.to_entity_value(property_value)
self._{0} = {2}.create_instance_from_entity(entity_value, self._service_context)
"""
# class_member_name, edm_name, primitive_resolver
COLLECTION_PRIMITIVE_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_values):
self._{0} = property_values
def add_to_{0}(self, property_value):
if self._{0} is None:
self._{0} = []
self._{0}.append(property_value)
def _get_{0}_from_entity(self, entity):
if entity is None:
return
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
property_collection_value = odata_client_python.to_collection_value(property_value)
if property_collection_value is None:
return
self._{0} = []
for odata_value in property_collection_value.get_collection_values():
primitive_value = odata_client_python.to_primitive_value(odata_value)
if primitive_value is None:
continue
try:
value = {2}
except Exception:
value = primitive_value.to_string()
self._{0}.append(value)
def _set_{0}_to_entity(self, entity):
if self._{0} is None:
return
if entity is None or entity.get_value_type() is None:
return
entity_type = odata_client_python.to_entity_type(entity.get_value_type())
if entity_type is None:
return
edm_property = entity_type.find_property("{1}")
if edm_property is None:
return
property_type = edm_property.get_property_type()
collection_value_type = odata_client_python.to_collection_type(property_type)
if collection_value_type is None:
return
collection_value = odata_client_python.odata_collection_value(collection_value_type)
for primitive in self._{0}:
collection_value.add_collection_value(odata_client_python.odata_primitive_value.make_primitive_value(primitive))
entity.set_value("{1}", collection_value)
"""
# class_member_name, edm_name, class_member_type
COLLECTION_ENUM_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_values):
self._{0} = property_values
def add_to_{0}(self, property_value):
if self._{0} is None:
self._{0} = []
self._{0}.append(property_value)
def _get_{0}_from_entity(self, entity):
if entity is None:
return
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
property_collection_value = odata_client_python.to_collection_value(property_value)
if property_collection_value is None:
return
self._{0} = []
for odata_value in property_collection_value.get_collection_values():
enum_value = odata_client_python.to_enum_value(odata_value)
if enum_value is None:
continue
self._{0}.append({2}.get_enum_value_from_string(enum_value.to_string()))
def _set_{0}_to_entity(self, entity):
if self._{0} is None:
return
if entity is None or entity.get_value_type() is None:
return
entity_type = odata_client_python.to_entity_type(entity.get_value_type())
if entity_type is None:
return
edm_property = entity_type.find_property("{1}")
if edm_property is None:
return
property_type = edm_property.get_property_type()
collection_value_type = odata_client_python.to_collection_type(property_type)
if collection_value_type is None:
return
collection_value = odata_client_python.odata_collection_value(collection_value_type)
for property in self._{0}:
collection_value.add_collection_value(odata_client_python.odata_enum_value(collection_value_type.get_element_type(), {2}.get_string_from_enum_value(property)))
entity.set_value("{1}", collection_value)
"""
# class_member_name, edm_name, class_member_type
COLLECTION_COMPLEX_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_values):
self._{0} = property_values
def add_to_{0}(self, property_value):
if self._{0} is None:
self._{0} = []
self._{0}.append(property_value)
def _get_{0}_from_entity(self, entity):
if entity is None:
return
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
property_collection_value = odata_client_python.to_collection_value(property_value)
if property_collection_value is None:
return
self._{0} = []
for odata_value in property_collection_value.get_collection_values():
complex_value = odata_client_python.to_complex_value(odata_value)
if complex_value is None:
continue
self._{0}.append({2}.create_instance_from_complex(complex_value, self._service_context))
def _set_{0}_to_entity(self, entity):
if self._{0} is None:
return
if entity is None or entity.get_value_type() is None:
return
entity_type = odata_client_python.to_entity_type(entity.get_value_type())
if entity_type is None:
return
edm_property = entity_type.find_property("{1}")
if edm_property is None:
return
property_type = edm_property.get_property_type()
collection_value_type = odata_client_python.to_collection_type(property_type)
if collection_value_type is None:
return
collection_value = odata_client_python.odata_collection_value(collection_value_type)
for property in self._{0}:
if property is not None:
collection_value.add_collection_value(property.to_value())
entity.set_value("{1}", collection_value)
"""
# class_member_name, edm_name, class_member_type
COLLECTION_NAVIGATION_PROPERTY_IN_ENTITY_MAPPING = r"""
def get_{0}(self):
return self._{0}
def set_{0}(self, property_values):
self._{0} = property_values
def add_to_{0}(self, property_value):
if self._{0} is None:
self._{0} = []
self._{0}.append(property_value)
def load_{0}(self):
if self._service_context is None:
return
path = self._service_context.get_relative_path(self._edit_link) + '/' + "{1}"
values = self._service_context.query(path)
if values is None:
return
self._{0} = []
for value in values:
entity_value = odata_client_python.to_entity_value(value)
if entity_value is None:
return
self._{0}.append({2}.create_instance_from_entity(entity_value, self._service_context))
def _get_{0}_from_entity(self, entity):
if entity is None:
return
property_value = odata_client_python.odata_value()
if not entity.get_property_value("{1}", property_value):
return
if odata_client_python.is_nullptr(property_value):
property_value = None
if property_value is None:
return
property_collection_value = odata_client_python.to_collection_value(property_value)
if property_collection_value is None:
return
self._{0} = []
for odata_value in property_collection_value.get_collection_values():
entity_value = odata_client_python.to_entity_value(odata_value)
if entity_value is None:
continue
self._{0}.append({2}.create_instance_from_entity(entity_value, self._service_context))
"""
# class_name, base_class_name
BEGIN_ENTITY_INSTANCE_CREATOR = r"""
@staticmethod
def create_instance_from_entity(entity_value, service_context):
if entity_value is None:
return None
real_type_name = entity_value.get_value_type().get_name()
if real_type_name != "{0}":
if real_type_name in {0}._derived_creator_map:
instance = {0}._derived_creator_map[real_type_name](service_context)
else:
return None
else:
instance = {0}(service_context)
instance.from_value(entity_value)
return instance
def from_value(self, entity_value):
self._{1}__from_value(entity_value)
self._edit_link = odata_client_python.odata_entity_model_builder.compute_edit_link(self.get_root_url(), entity_value, "", False)"""
# class_member_name
ON_PROPERTY_IN_ENTITY_INSTANCE_CREATOR = r"""
self._get_{0}_from_entity(entity_value)"""
END_ENTITY_INSTANCE_CREATOR = r"""
__from_value = from_value
"""
BEGIN_TO_ENTITY_VALUE = r"""
def to_value(self):
if self._service_context is None or self._service_context.get_edm_model() is None:
return None
entity_type = self._service_context.get_edm_model().find_entity_type(self._typename)
entity_value = odata_client_python.odata_entity_value(entity_type)"""
# base_class_name
BEGIN_TO_ENTITY_VALUE_WITH_BASE_CLASS = r"""
def to_value(self):
if self._service_context is None or self._service_context.get_edm_model() is None:
return None
entity_type = self._service_context.get_edm_model().find_entity_type(self._typename)
entity_value = self._{0}__to_value()
entity_value.set_value_type(entity_type)"""
# class_member_name
ON_TO_ENTITY_VALUE = r"""
self._set_{0}_to_entity(entity_value)"""
END_TO_ENTITY_VALUE = r"""
if self._namespace != "" and self._typename != "":
entity_value.set_value(odata_client_python.odata_json_constants.PAYLOAD_ANNOTATION_TYPE, odata_client_python.odata_primitive_value(odata_client_python.edm_payload_annotation_type(odata_client_python.odata_json_constants.PAYLOAD_ANNOTATION_TYPE), "#" + self._namespace + "." + self._typename))
return entity_value
__to_value = to_value
"""
ENTITY_CONTAINER_CONSTRUCTOR = r"""
def __init__(self, baseAddress, options=odata_client_python.client_options()):
odata_service_context.__init__(self, baseAddress, options)
"""
# class_member_name, edm_name, strong_type_name
QUERY_ENTITY_SET_IN_ENTITY_CONTAINER = r"""
def {0}(self, key=None, filter=None, top=None, skip=None, orderby=None, select=None, expand=None):
if self._client is None:
return
query_ex = self.get_query_expression("{1}", key=key, filter=filter, top=top, skip=skip, orderby=orderby, select=select, expand=expand)
values = self._client.get_data_from_server(query_ex).get()
if values is None:
return
ret = []
for value in values:
entity_value = odata_client_python.to_entity_value(value)
if entity_value is None:
continue
ret.append({2}.create_instance_from_entity(entity_value, self))
return ret
"""
# class_member_name, edm_name, strong_type_name
QUERY_SINGLETON_IN_ENTITY_CONTAINER = r"""
def {0}(self, key=None, filter=None, top=None, skip=None, orderby=None, select=None, expand=None):
if self._client is None:
return
query_ex = self.get_query_expression("{1}", key=key, filter=filter, top=top, skip=skip, orderby=orderby, select=select, expand=expand)
values = self._client.get_data_from_server(query_ex).get()
if values is None:
return
if len(values) == 1:
entity_value = odata_client_python.to_entity_value(values[0])
if entity_value is None:
return None
return {2}.create_instance_from_entity(entity_value, self)
else:
return None
"""
# class_member_name, arguments
BEGIN_OPERATION = r"""
def {0}({1}):
if self._service_context is None:
return None
function_query_url = self._service_context.get_relative_path(self._edit_link) + '/'
function_query_url += self._namespace + '.' + "{0}"
parameters = odata_client_python.vector_odata_parameter()"""
# class_member_name, arguments
BEGIN_OPERATION_IMPORT = r"""
def {0}({1}):
function_query_url = "{0}"
parameters = odata_client_python.vector_odata_parameter()"""
# member_name, edm_name
ON_PRIMITIVE_PARAMETER_IN_OPERATION = r"""
if {0} is not None:
primitive_value = odata_client_python.odata_primitive_value.make_primitive_value({0})
if primitive_value is not None:
parameters.push_back(odata_client_python.odata_parameter("{1}", primitive_value))"""
# member_name, edm_name, member_strong_type_name
ON_CLASS_PARAMETER_IN_OPERATION = r"""
if isinstance({0}, {2}):
value = {0}.to_value()
if value is not None:
parameters.push_back(odata_client_python.odata_parameter("{1}", value))"""
# member_name, edm_name, member_strong_type_name
ON_ENUM_PARAMETER_IN_OPERATION = r"""
if {0} is not None:
enum_string = {2}.get_string_from_enum_value({0})
enum_type = odata_client_python.edm_enum_type("", "", "", False)
enum_value = odata_client_python.odata_enum_value(enum_type, enum_string)
if enum_value is not None:
parameters.push_back(odata_client_python.odata_parameter("{1}", enum_value))"""
# member_name, edm_name
ON_COLLECTION_PRIMITIVE_PARAMETER_IN_OPERATION = r"""
if isinstance({0}, (list, tuple)):
collection_value = odata_client_python.odata_collection_value(odata_client_python.edm_collection_type("action parameter"))
for value in {0}:
if value is None:
continue
primitive_value = odata_client_python.odata_primitive_value.make_primitive_value(value)
if primitive_value is not None:
collection_value.add_collection_value(primitive_value)
parameters.push_back(odata_client_python.odata_parameter("{1}", collection_value))"""
# member_name, edm_name, member_strong_type_name
ON_COLLECTION_CLASS_PARAMETER_IN_OPERATION = r"""
if isinstance({0}, (list, tuple)):
collection_value = odata_client_python.odata_collection_value(odata_client_python.edm_collection_type("action parameter"))
for ins in {0}:
if isinstance(ins, {2}):
value = ins.to_value()
if value is not None:
collection_value.add_collection_value(value)
parameters.push_back(odata_client_python.odata_parameter("{1}", collection_value))"""
# member_name, edm_name, member_strong_type_name
ON_COLLECTION_ENUM_PARAMETER_IN_OPERATION = r"""
if isinstance({0}, (list, tuple)):
collection_value = odata_client_python.odata_collection_value(odata_client_python.edm_collection_type("action parameter"))
for value in {0}:
if value is None:
continue
enum_string = {2}.get_string_from_enum_value(value)
enum_type = odata_client_python.edm_enum_type("", "", "", False)
enum_value = odata_client_python.odata_enum_value(enum_type, enum_string)
if enum_value is not None:
collection_value.add_collection_value(enum_value)
parameters.push_back(odata_client_python.odata_parameter("{1}", collection_value))
"""
# executor_name, is_function, return_type
END_OPERATION_WITH_RETURN_VALUE = r"""
return self._service_context.{0}(function_query_url, parameters, {1}, {2})
"""
# executor_name, is_function
END_OPERATION_VOID = r"""
return self._service_context.{0}(function_query_url, parameters, {1})
"""
# executor_name, is_function, return_type
END_OPERATION_IMPORT_WITH_RETURN_VALUE = r"""
return self.{0}(function_query_url, parameters, {1}, {2})
"""
# executor_name, is_function
END_OPERATION_IMPORT_VOID = r"""
return self.{0}(function_query_url, parameters, {1})
"""
"""Generated wrapper for Ori20 Solidity contract."""
# pylint: disable=too-many-arguments
import json
import time
from typing import (  # pylint: disable=unused-import
    Optional,
    Tuple,
    Union,
)
from eth_utils import to_checksum_address
from hexbytes import HexBytes
from moody import Bolors
from moody.libeb import MiliDoS
from moody.m.bases import ContractMethod, Validator, ContractBase, Signatures
from moody.m.tx_params import TxParams
from web3.contract import ContractFunction
from web3.datastructures import AttributeDict
from web3.exceptions import ContractLogicError
# Try to import a custom validator class definition; if there isn't one,
# declare one that we can instantiate for the default argument to the
# constructor for Ori20 below.
try:
    # both mypy and pylint complain about what we're doing here, but this
    # works just fine, so their messages have been disabled here.
    from . import (  # type: ignore # pylint: disable=import-self
        Ori20Validator,
    )
except ImportError:
    class Ori20Validator(  # type: ignore
        Validator
    ):
        """No-op input validator."""

try:
    from .middleware import MIDDLEWARE  # type: ignore
except ImportError:
    pass
class AddMinterMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the addMinter method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("addMinter")
def validate_and_normalize_inputs(self, account: str) -> any:
"""Validate the inputs to the addMinter method."""
self.validator.assert_valid(
method_name='addMinter',
parameter_name='account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
return (account)
def block_send(self, account: str, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method(account)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: add_minter")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on add_minter: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: add_minter")
def send_transaction(self, account: str, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).transact(tx_params.as_dict())
def build_transaction(self, account: str, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).buildTransaction(tx_params.as_dict())
def estimate_gas(self, account: str, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).estimateGas(tx_params.as_dict())
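block_send above assumes a JSON-RPC revert surfaces as a ValueError whose first argument is a dict carrying a "message" key. A standalone sketch of that extraction, independent of web3 (the helper name is illustrative):

```python
# Mirrors the ValueError branch in block_send: pull the revert message out
# of err.args[0] when the RPC layer packed an error dict there.
def extract_revert_message(err):
    if err.args and isinstance(err.args[0], dict) and "message" in err.args[0]:
        return err.args[0]["message"]
    return None

print(extract_revert_message(ValueError({"message": "execution reverted"})))  # → execution reverted
print(extract_revert_message(ValueError("plain error")))                      # → None
```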
class AllowanceMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the allowance method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("allowance")
def validate_and_normalize_inputs(self, owner: str, spender: str) -> any:
"""Validate the inputs to the allowance method."""
self.validator.assert_valid(
method_name='allowance',
parameter_name='owner',
argument_value=owner,
)
owner = self.validate_and_checksum_address(owner)
self.validator.assert_valid(
method_name='allowance',
parameter_name='spender',
argument_value=spender,
)
spender = self.validate_and_checksum_address(spender)
return (owner, spender)
def block_call(self, owner: str, spender: str, debug: bool = False) -> int:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method(owner, spender)
returned = _fn.call({
'from': self._operate
})
return int(returned)
def estimate_gas(self, owner: str, spender: str, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(owner, spender) = self.validate_and_normalize_inputs(owner, spender)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(owner, spender).estimateGas(tx_params.as_dict())
class ApproveMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the approve method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("approve")
def validate_and_normalize_inputs(self, spender: str, amount: int) -> Any:
"""Validate the inputs to the approve method."""
self.validator.assert_valid(
method_name='approve',
parameter_name='spender',
argument_value=spender,
)
spender = self.validate_and_checksum_address(spender)
self.validator.assert_valid(
method_name='approve',
parameter_name='amount',
argument_value=amount,
)
# safeguard against fractional inputs
amount = int(amount)
return (spender, amount)
def block_send(self, spender: str, amount: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Sign and broadcast the underlying contract method as a raw transaction.
:param receiptListen: when True, wait for the transaction receipt.
"""
_fn = self._underlying_method(spender, amount)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: approve")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on approve: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: approve")
def send_transaction(self, spender: str, amount: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(spender, amount) = self.validate_and_normalize_inputs(spender, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, amount).transact(tx_params.as_dict())
def build_transaction(self, spender: str, amount: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(spender, amount) = self.validate_and_normalize_inputs(spender, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, amount).buildTransaction(tx_params.as_dict())
def estimate_gas(self, spender: str, amount: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(spender, amount) = self.validate_and_normalize_inputs(spender, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, amount).estimateGas(tx_params.as_dict())
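ApproveMethod and AllowanceMethod are thin wrappers over the standard ERC-20 approve/allowance pair. A tiny in-memory model of the on-chain semantics they call into (the `AllowanceBook` class is purely illustrative; the real state lives in the deployed contract):

```python
class AllowanceBook:
    """Illustrative in-memory model of the approve/allowance pair."""

    def __init__(self):
        self._allowances = {}  # (owner, spender) -> approved amount

    def approve(self, owner: str, spender: str, amount: int) -> bool:
        # Same "safeguard against fractional inputs" as the wrapper above.
        self._allowances[(owner, spender)] = int(amount)
        return True

    def allowance(self, owner: str, spender: str) -> int:
        # Unset pairs read as zero, just as uninitialized chain storage does.
        return self._allowances.get((owner, spender), 0)
```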
class BalanceOfMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the balanceOf method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("balanceOf")
def validate_and_normalize_inputs(self, account: str) -> Any:
"""Validate the inputs to the balanceOf method."""
self.validator.assert_valid(
method_name='balanceOf',
parameter_name='account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
return (account)
def block_call(self, account: str, debug: bool = False) -> int:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method(account)
returned = _fn.call({
'from': self._operate
})
return int(returned)
def estimate_gas(self, account: str, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).estimateGas(tx_params.as_dict())
class BurnMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the burn method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("burn")
def validate_and_normalize_inputs(self, amount: int) -> Any:
"""Validate the inputs to the burn method."""
self.validator.assert_valid(
method_name='burn',
parameter_name='amount',
argument_value=amount,
)
# safeguard against fractional inputs
amount = int(amount)
return (amount)
def block_send(self, amount: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Sign and broadcast the underlying contract method as a raw transaction.
:param receiptListen: when True, wait for the transaction receipt.
"""
_fn = self._underlying_method(amount)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: burn")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on burn: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: burn")
def send_transaction(self, amount: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(amount) = self.validate_and_normalize_inputs(amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(amount).transact(tx_params.as_dict())
def build_transaction(self, amount: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(amount) = self.validate_and_normalize_inputs(amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(amount).buildTransaction(tx_params.as_dict())
def estimate_gas(self, amount: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(amount) = self.validate_and_normalize_inputs(amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(amount).estimateGas(tx_params.as_dict())
class BurnFromMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the burnFrom method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("burnFrom")
def validate_and_normalize_inputs(self, account: str, amount: int) -> Any:
"""Validate the inputs to the burnFrom method."""
self.validator.assert_valid(
method_name='burnFrom',
parameter_name='account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
self.validator.assert_valid(
method_name='burnFrom',
parameter_name='amount',
argument_value=amount,
)
# safeguard against fractional inputs
amount = int(amount)
return (account, amount)
def block_send(self, account: str, amount: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Sign and broadcast the underlying contract method as a raw transaction.
:param receiptListen: when True, wait for the transaction receipt.
"""
_fn = self._underlying_method(account, amount)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: burn_from")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on burn_from: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: burn_from")
def send_transaction(self, account: str, amount: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(account, amount) = self.validate_and_normalize_inputs(account, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, amount).transact(tx_params.as_dict())
def build_transaction(self, account: str, amount: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(account, amount) = self.validate_and_normalize_inputs(account, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, amount).buildTransaction(tx_params.as_dict())
def estimate_gas(self, account: str, amount: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account, amount) = self.validate_and_normalize_inputs(account, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, amount).estimateGas(tx_params.as_dict())
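BurnMethod destroys the caller's own tokens; BurnFromMethod spends the caller's allowance to destroy another account's tokens. A self-contained model of that bookkeeping (the `BurnableLedger` class is illustrative only, not part of this module):

```python
class BurnableLedger:
    """Illustrative in-memory model of ERC-20 burn/burnFrom bookkeeping."""

    def __init__(self, balances: dict):
        self._balances = dict(balances)
        self._allowances = {}  # (owner, spender) -> remaining allowance

    def _burn(self, account: str, amount: int) -> None:
        if self._balances.get(account, 0) < amount:
            raise ValueError("burn amount exceeds balance")
        self._balances[account] -= amount

    def burn(self, caller: str, amount: int) -> None:
        # burn() destroys the caller's own tokens.
        self._burn(caller, int(amount))

    def burn_from(self, caller: str, account: str, amount: int) -> None:
        # burnFrom() spends the caller's allowance on `account`'s tokens.
        amount = int(amount)
        allowed = self._allowances.get((account, caller), 0)
        if allowed < amount:
            raise ValueError("burn amount exceeds allowance")
        self._allowances[(account, caller)] = allowed - amount
        self._burn(account, amount)

    def approve(self, owner: str, spender: str, amount: int) -> None:
        self._allowances[(owner, spender)] = int(amount)

    def balance_of(self, account: str) -> int:
        return self._balances.get(account, 0)
```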
class CapMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the cap method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("cap")
def block_call(self, debug: bool = False) -> int:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class DecimalsMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the decimals method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("decimals")
def block_call(self, debug: bool = False) -> int:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class DecreaseAllowanceMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the decreaseAllowance method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("decreaseAllowance")
def validate_and_normalize_inputs(self, spender: str, subtracted_value: int) -> Any:
"""Validate the inputs to the decreaseAllowance method."""
self.validator.assert_valid(
method_name='decreaseAllowance',
parameter_name='spender',
argument_value=spender,
)
spender = self.validate_and_checksum_address(spender)
self.validator.assert_valid(
method_name='decreaseAllowance',
parameter_name='subtractedValue',
argument_value=subtracted_value,
)
# safeguard against fractional inputs
subtracted_value = int(subtracted_value)
return (spender, subtracted_value)
def block_send(self, spender: str, subtracted_value: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Sign and broadcast the underlying contract method as a raw transaction.
:param receiptListen: when True, wait for the transaction receipt.
"""
_fn = self._underlying_method(spender, subtracted_value)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: decrease_allowance")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on decrease_allowance: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: decrease_allowance")
def send_transaction(self, spender: str, subtracted_value: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(spender, subtracted_value) = self.validate_and_normalize_inputs(spender, subtracted_value)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, subtracted_value).transact(tx_params.as_dict())
def build_transaction(self, spender: str, subtracted_value: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(spender, subtracted_value) = self.validate_and_normalize_inputs(spender, subtracted_value)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, subtracted_value).buildTransaction(tx_params.as_dict())
def estimate_gas(self, spender: str, subtracted_value: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(spender, subtracted_value) = self.validate_and_normalize_inputs(spender, subtracted_value)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, subtracted_value).estimateGas(tx_params.as_dict())
class GetDecimalsMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the getDecimals method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("getDecimals")
def block_call(self, debug: bool = False) -> int:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class GovMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the gov method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("gov")
def block_call(self, debug: bool = False) -> str:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return str(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class IncreaseAllowanceMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the increaseAllowance method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("increaseAllowance")
def validate_and_normalize_inputs(self, spender: str, added_value: int) -> Any:
"""Validate the inputs to the increaseAllowance method."""
self.validator.assert_valid(
method_name='increaseAllowance',
parameter_name='spender',
argument_value=spender,
)
spender = self.validate_and_checksum_address(spender)
self.validator.assert_valid(
method_name='increaseAllowance',
parameter_name='addedValue',
argument_value=added_value,
)
# safeguard against fractional inputs
added_value = int(added_value)
return (spender, added_value)
def block_send(self, spender: str, added_value: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Sign and broadcast the underlying contract method as a raw transaction.
:param receiptListen: when True, wait for the transaction receipt.
"""
_fn = self._underlying_method(spender, added_value)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: increase_allowance")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on increase_allowance: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: increase_allowance")
def send_transaction(self, spender: str, added_value: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(spender, added_value) = self.validate_and_normalize_inputs(spender, added_value)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, added_value).transact(tx_params.as_dict())
def build_transaction(self, spender: str, added_value: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(spender, added_value) = self.validate_and_normalize_inputs(spender, added_value)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, added_value).buildTransaction(tx_params.as_dict())
def estimate_gas(self, spender: str, added_value: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(spender, added_value) = self.validate_and_normalize_inputs(spender, added_value)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(spender, added_value).estimateGas(tx_params.as_dict())
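DecreaseAllowanceMethod and IncreaseAllowanceMethod exist to sidestep the classic approve() race: they adjust the allowance relative to its current value instead of overwriting it. The on-chain arithmetic, sketched as plain functions over a dict (hypothetical helpers, not part of this module):

```python
def increase_allowance(allowances: dict, owner: str, spender: str, added_value: int) -> int:
    """Add to the current allowance; returns the new allowance."""
    key = (owner, spender)
    allowances[key] = allowances.get(key, 0) + int(added_value)
    return allowances[key]

def decrease_allowance(allowances: dict, owner: str, spender: str, subtracted_value: int) -> int:
    """Subtract from the current allowance; reverts below zero, as ERC-20 does."""
    key = (owner, spender)
    current = allowances.get(key, 0)
    subtracted_value = int(subtracted_value)
    if subtracted_value > current:
        raise ValueError("decreased allowance below zero")
    allowances[key] = current - subtracted_value
    return allowances[key]
```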
class IsMinterMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the isMinter method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("isMinter")
def validate_and_normalize_inputs(self, account: str) -> Any:
"""Validate the inputs to the isMinter method."""
self.validator.assert_valid(
method_name='isMinter',
parameter_name='account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
return (account)
def block_call(self, account: str, debug: bool = False) -> bool:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method(account)
returned = _fn.call({
'from': self._operate
})
return bool(returned)
def estimate_gas(self, account: str, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).estimateGas(tx_params.as_dict())
class MintMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the mint method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("mint")
def validate_and_normalize_inputs(self, account: str, amount: int) -> Any:
"""Validate the inputs to the mint method."""
self.validator.assert_valid(
method_name='mint',
parameter_name='account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
self.validator.assert_valid(
method_name='mint',
parameter_name='amount',
argument_value=amount,
)
# safeguard against fractional inputs
amount = int(amount)
return (account, amount)
def block_send(self, account: str, amount: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Sign and broadcast the underlying contract method as a raw transaction.
:param receiptListen: when True, wait for the transaction receipt.
"""
_fn = self._underlying_method(account, amount)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: mint")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on mint: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: mint")
def send_transaction(self, account: str, amount: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(account, amount) = self.validate_and_normalize_inputs(account, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, amount).transact(tx_params.as_dict())
def build_transaction(self, account: str, amount: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(account, amount) = self.validate_and_normalize_inputs(account, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, amount).buildTransaction(tx_params.as_dict())
def estimate_gas(self, account: str, amount: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account, amount) = self.validate_and_normalize_inputs(account, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, amount).estimateGas(tx_params.as_dict())
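MintMethod pairs naturally with CapMethod above: a capped token rejects any mint that would push the total supply past the cap. A self-contained model of that invariant (the `CappedMint` class is illustrative only, not part of this module):

```python
class CappedMint:
    """Illustrative model of mint() guarded by cap(), as in a capped ERC-20."""

    def __init__(self, cap: int):
        self._cap = int(cap)
        self._total_supply = 0
        self._balances = {}

    def cap(self) -> int:
        return self._cap

    def total_supply(self) -> int:
        return self._total_supply

    def balance_of(self, account: str) -> int:
        return self._balances.get(account, 0)

    def mint(self, account: str, amount: int) -> bool:
        amount = int(amount)
        # The on-chain guard: minting must never push supply past the cap.
        if self._total_supply + amount > self._cap:
            raise ValueError("cap exceeded")
        self._total_supply += amount
        self._balances[account] = self.balance_of(account) + amount
        return True
```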
class NameMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the name method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("name")
def block_call(self, debug: bool = False) -> str:
"""Execute underlying contract method via eth_call and return the result."""
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return str(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class RemoveMinterMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the removeMinter method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("removeMinter")
def validate_and_normalize_inputs(self, account: str) -> Any:
"""Validate the inputs to the removeMinter method."""
self.validator.assert_valid(
method_name='removeMinter',
parameter_name='account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
return (account)
def block_send(self, account: str, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
"""Sign and broadcast the underlying contract method as a raw transaction.
:param receiptListen: when True, wait for the transaction receipt.
"""
_fn = self._underlying_method(account)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: remove_minter")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on remove_minter: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: remove_minter")
def send_transaction(self, account: str, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).transact(tx_params.as_dict())
def build_transaction(self, account: str, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).buildTransaction(tx_params.as_dict())
def estimate_gas(self, account: str, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).estimateGas(tx_params.as_dict())
class RenounceMinterMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the renounceMinter method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("renounceMinter")
def block_send(self, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
        """Build, sign and broadcast the underlying contract call as a raw transaction.

        :param gas: gas limit for the transaction
        :param price: gas price in wei
        :param val: ether value to attach, in wei
        :param debug: print the built transaction and the receipt
        :param receiptListen: block until the transaction receipt arrives
        """
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: renounce_minter")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
                print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on renounce_minter: {message}")
            else:
                print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: renounce_minter")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
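The `block_send` implementations above all follow the same sequence: build the transaction, attach the sender's nonce, sign locally, then broadcast the raw bytes. A minimal sketch of that sequence using stand-in stub objects (every name below is a hypothetical placeholder, not part of this library or of web3):

```python
class StubSigned:
    # stands in for the SignedTransaction object returned by signing
    rawTransaction = b"\x01\x02"

class StubEth:
    # stands in for web3.eth with just the three calls block_send relies on
    def getTransactionCount(self, addr):
        return 7                      # next nonce for the sender
    def sign_transaction(self, tx):
        return StubSigned()           # local signing step
    def sendRawTransaction(self, raw):
        return b"\xaa" * 32           # broadcast; returns the tx hash

def send_like_block_send(eth, built_tx, sender):
    # 1. attach the sender's current nonce to the built transaction
    built_tx["nonce"] = eth.getTransactionCount(sender)
    # 2. sign locally, 3. broadcast the raw signed bytes
    signed = eth.sign_transaction(built_tx)
    return eth.sendRawTransaction(signed.rawTransaction)

tx_hash = send_like_block_send(StubEth(), {"from": "0xabc", "gas": 21000}, "0xabc")
```

With `receiptListen` set, the real code then polls for the receipt of that hash before returning.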
class SymbolMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the symbol method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("symbol")
def block_call(self, debug: bool = False) -> str:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return str(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class TokenNameMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the tokenName method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("tokenName")
def block_call(self, debug: bool = False) -> str:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return str(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class TokenSymbolMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the tokenSymbol method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("tokenSymbol")
def block_call(self, debug: bool = False) -> str:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return str(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class TotalSupplyMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the totalSupply method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("totalSupply")
def block_call(self, debug: bool = False) -> int:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class TransferMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the transfer method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("transfer")
    def validate_and_normalize_inputs(self, recipient: str, amount: int) -> Tuple[str, int]:
"""Validate the inputs to the transfer method."""
self.validator.assert_valid(
method_name='transfer',
parameter_name='recipient',
argument_value=recipient,
)
recipient = self.validate_and_checksum_address(recipient)
self.validator.assert_valid(
method_name='transfer',
parameter_name='amount',
argument_value=amount,
)
# safeguard against fractional inputs
amount = int(amount)
return (recipient, amount)
    def block_send(self, recipient: str, amount: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
        """Build, sign and broadcast the underlying contract call as a raw transaction.

        :param gas: gas limit for the transaction
        :param price: gas price in wei
        :param val: ether value to attach, in wei
        :param debug: print the built transaction and the receipt
        :param receiptListen: block until the transaction receipt arrives
        """
_fn = self._underlying_method(recipient, amount)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: transfer")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
                print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on transfer: {message}")
            else:
                print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: transfer")
def send_transaction(self, recipient: str, amount: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(recipient, amount) = self.validate_and_normalize_inputs(recipient, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(recipient, amount).transact(tx_params.as_dict())
def build_transaction(self, recipient: str, amount: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(recipient, amount) = self.validate_and_normalize_inputs(recipient, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(recipient, amount).buildTransaction(tx_params.as_dict())
def estimate_gas(self, recipient: str, amount: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(recipient, amount) = self.validate_and_normalize_inputs(recipient, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(recipient, amount).estimateGas(tx_params.as_dict())
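The `amount = int(amount)` safeguard in `validate_and_normalize_inputs` truncates fractional inputs, so callers are expected to pass amounts already scaled to the token's base units. A hedged sketch of that scaling (the helper name is hypothetical, not part of this library), using `Decimal` to avoid float rounding:

```python
from decimal import Decimal

def to_base_units(amount: str, decimals: int = 18) -> int:
    # Scale a human-readable token amount to integer base units.
    # Going through Decimal avoids binary-float artifacts that can
    # appear when multiplying a float by 10**18 and truncating.
    return int(Decimal(amount) * 10 ** decimals)

to_base_units("1.5")   # 1500000000000000000
```

Passing the result to `transfer`/`transferFrom` keeps the int() safeguard a no-op instead of a silent truncation.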
class TransferFromMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the transferFrom method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator = None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("transferFrom")
    def validate_and_normalize_inputs(self, sender: str, recipient: str, amount: int) -> Tuple[str, str, int]:
"""Validate the inputs to the transferFrom method."""
self.validator.assert_valid(
method_name='transferFrom',
parameter_name='sender',
argument_value=sender,
)
sender = self.validate_and_checksum_address(sender)
self.validator.assert_valid(
method_name='transferFrom',
parameter_name='recipient',
argument_value=recipient,
)
recipient = self.validate_and_checksum_address(recipient)
self.validator.assert_valid(
method_name='transferFrom',
parameter_name='amount',
argument_value=amount,
)
# safeguard against fractional inputs
amount = int(amount)
return (sender, recipient, amount)
    def block_send(self, sender: str, recipient: str, amount: int, gas: int, price: int, val: int = 0, debug: bool = False, receiptListen: bool = False) -> None:
        """Build, sign and broadcast the underlying contract call as a raw transaction.

        :param gas: gas limit for the transaction
        :param price: gas price in wei
        :param val: ether value to attach, in wei
        :param debug: print the built transaction and the receipt
        :param receiptListen: block until the transaction receipt arrives
        """
_fn = self._underlying_method(sender, recipient, amount)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': gas,
'gasPrice': price
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if val > 0:
_t['value'] = val
if debug:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if receiptListen is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if debug:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if receiptListen is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: transfer_from")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
                print(f"{Bolors.FAIL}Error Revert {Bolors.RESET} on transfer_from: {message}")
            else:
                print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}: transfer_from")
def send_transaction(self, sender: str, recipient: str, amount: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(sender, recipient, amount) = self.validate_and_normalize_inputs(sender, recipient, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(sender, recipient, amount).transact(tx_params.as_dict())
def build_transaction(self, sender: str, recipient: str, amount: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(sender, recipient, amount) = self.validate_and_normalize_inputs(sender, recipient, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(sender, recipient, amount).buildTransaction(tx_params.as_dict())
def estimate_gas(self, sender: str, recipient: str, amount: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(sender, recipient, amount) = self.validate_and_normalize_inputs(sender, recipient, amount)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(sender, recipient, amount).estimateGas(tx_params.as_dict())
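The `except ValueError` branches above dig the node's revert reason out of `err.args[0]`, which web3 populates with a JSON-RPC error dict. That extraction can be factored into a defensive helper; a sketch under that assumption (the helper name is hypothetical):

```python
def revert_message(err: ValueError) -> str:
    # web3 raises ValueError whose first argument is usually a
    # JSON-RPC error payload like {"code": -32000, "message": "..."}.
    # Fall back to a generic label when no message is present or the
    # payload is not a dict (e.g. a plain string).
    payload = err.args[0] if err.args else None
    if isinstance(payload, dict) and "message" in payload:
        return payload["message"]
    return "execution reverted (no message)"

revert_message(ValueError({"code": -32000, "message": "gas too low"}))  # 'gas too low'
```

Guarding with `isinstance` avoids the `TypeError` that the bare `"message" in err.args[0]` check would raise on a non-dict payload.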
class SignatureGenerator(Signatures):
"""
    Builds and holds the canonical function signatures for the contract ABI,
    exposing one named accessor per contract method.
"""
def __init__(self, abi: any):
super().__init__(abi)
def add_minter(self) -> str:
return self._function_signatures["addMinter"]
def allowance(self) -> str:
return self._function_signatures["allowance"]
def approve(self) -> str:
return self._function_signatures["approve"]
def balance_of(self) -> str:
return self._function_signatures["balanceOf"]
def burn(self) -> str:
return self._function_signatures["burn"]
def burn_from(self) -> str:
return self._function_signatures["burnFrom"]
def cap(self) -> str:
return self._function_signatures["cap"]
def decimals(self) -> str:
return self._function_signatures["decimals"]
def decrease_allowance(self) -> str:
return self._function_signatures["decreaseAllowance"]
def get_decimals(self) -> str:
return self._function_signatures["getDecimals"]
def gov(self) -> str:
return self._function_signatures["gov"]
def increase_allowance(self) -> str:
return self._function_signatures["increaseAllowance"]
def is_minter(self) -> str:
return self._function_signatures["isMinter"]
def mint(self) -> str:
return self._function_signatures["mint"]
def name(self) -> str:
return self._function_signatures["name"]
def remove_minter(self) -> str:
return self._function_signatures["removeMinter"]
def renounce_minter(self) -> str:
return self._function_signatures["renounceMinter"]
def symbol(self) -> str:
return self._function_signatures["symbol"]
def token_name(self) -> str:
return self._function_signatures["tokenName"]
def token_symbol(self) -> str:
return self._function_signatures["tokenSymbol"]
def total_supply(self) -> str:
return self._function_signatures["totalSupply"]
def transfer(self) -> str:
return self._function_signatures["transfer"]
def transfer_from(self) -> str:
return self._function_signatures["transferFrom"]
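The accessors above read from a name-to-signature map derived from the ABI. One plausible way such a map is built (a sketch of the idea, not the actual `Signatures` implementation): for each `function` entry, join the input types into the canonical form `name(type1,type2,...)`.

```python
def function_signatures(abi: list) -> dict:
    # Map each function name in the ABI to its canonical signature
    # string, e.g. "transfer(address,uint256)". The canonical form is
    # what gets hashed to produce the 4-byte selector.
    sigs = {}
    for entry in abi:
        if entry.get("type") == "function":
            arg_types = ",".join(arg["type"] for arg in entry.get("inputs", []))
            sigs[entry["name"]] = f'{entry["name"]}({arg_types})'
    return sigs

abi = [{"type": "function", "name": "transfer",
        "inputs": [{"type": "address"}, {"type": "uint256"}]}]
function_signatures(abi)  # {'transfer': 'transfer(address,uint256)'}
```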
# pylint: disable=too-many-public-methods,too-many-instance-attributes
class Ori20(ContractBase):
"""Wrapper class for Ori20 Solidity contract."""
_fn_add_minter: AddMinterMethod
"""Constructor-initialized instance of
:class:`AddMinterMethod`.
"""
_fn_allowance: AllowanceMethod
"""Constructor-initialized instance of
:class:`AllowanceMethod`.
"""
_fn_approve: ApproveMethod
"""Constructor-initialized instance of
:class:`ApproveMethod`.
"""
_fn_balance_of: BalanceOfMethod
"""Constructor-initialized instance of
:class:`BalanceOfMethod`.
"""
_fn_burn: BurnMethod
"""Constructor-initialized instance of
:class:`BurnMethod`.
"""
_fn_burn_from: BurnFromMethod
"""Constructor-initialized instance of
:class:`BurnFromMethod`.
"""
_fn_cap: CapMethod
"""Constructor-initialized instance of
:class:`CapMethod`.
"""
_fn_decimals: DecimalsMethod
"""Constructor-initialized instance of
:class:`DecimalsMethod`.
"""
_fn_decrease_allowance: DecreaseAllowanceMethod
"""Constructor-initialized instance of
:class:`DecreaseAllowanceMethod`.
"""
_fn_get_decimals: GetDecimalsMethod
"""Constructor-initialized instance of
:class:`GetDecimalsMethod`.
"""
_fn_gov: GovMethod
"""Constructor-initialized instance of
:class:`GovMethod`.
"""
_fn_increase_allowance: IncreaseAllowanceMethod
"""Constructor-initialized instance of
:class:`IncreaseAllowanceMethod`.
"""
_fn_is_minter: IsMinterMethod
"""Constructor-initialized instance of
:class:`IsMinterMethod`.
"""
_fn_mint: MintMethod
"""Constructor-initialized instance of
:class:`MintMethod`.
"""
_fn_name: NameMethod
"""Constructor-initialized instance of
:class:`NameMethod`.
"""
_fn_remove_minter: RemoveMinterMethod
"""Constructor-initialized instance of
:class:`RemoveMinterMethod`.
"""
_fn_renounce_minter: RenounceMinterMethod
"""Constructor-initialized instance of
:class:`RenounceMinterMethod`.
"""
_fn_symbol: SymbolMethod
"""Constructor-initialized instance of
:class:`SymbolMethod`.
"""
_fn_token_name: TokenNameMethod
"""Constructor-initialized instance of
:class:`TokenNameMethod`.
"""
_fn_token_symbol: TokenSymbolMethod
"""Constructor-initialized instance of
:class:`TokenSymbolMethod`.
"""
_fn_total_supply: TotalSupplyMethod
"""Constructor-initialized instance of
:class:`TotalSupplyMethod`.
"""
_fn_transfer: TransferMethod
"""Constructor-initialized instance of
:class:`TransferMethod`.
"""
_fn_transfer_from: TransferFromMethod
"""Constructor-initialized instance of
:class:`TransferFromMethod`.
"""
SIGNATURES: SignatureGenerator = None
def __init__(
self,
core_lib: MiliDoS,
contract_address: str,
validator: Ori20Validator = None,
):
        """Get an instance of the wrapper bound to the deployed contract."""
# pylint: disable=too-many-statements
super().__init__()
self.contract_address = contract_address
web3 = core_lib.w3
if not validator:
validator = Ori20Validator(web3, contract_address)
# if any middleware was imported, inject it
try:
MIDDLEWARE
except NameError:
pass
else:
try:
for middleware in MIDDLEWARE:
web3.middleware_onion.inject(
middleware['function'], layer=middleware['layer'],
)
            except ValueError as value_error:
                if value_error.args == ("You can't add the same un-named instance twice",):
                    pass
                else:
                    raise
self._web3_eth = web3.eth
functions = self._web3_eth.contract(address=to_checksum_address(contract_address), abi=Ori20.abi()).functions
signed = SignatureGenerator(Ori20.abi())
validator.bindSignatures(signed)
self.SIGNATURES = signed
self._fn_add_minter = AddMinterMethod(core_lib, contract_address, functions.addMinter, validator)
self._fn_allowance = AllowanceMethod(core_lib, contract_address, functions.allowance, validator)
self._fn_approve = ApproveMethod(core_lib, contract_address, functions.approve, validator)
self._fn_balance_of = BalanceOfMethod(core_lib, contract_address, functions.balanceOf, validator)
self._fn_burn = BurnMethod(core_lib, contract_address, functions.burn, validator)
self._fn_burn_from = BurnFromMethod(core_lib, contract_address, functions.burnFrom, validator)
self._fn_cap = CapMethod(core_lib, contract_address, functions.cap, validator)
self._fn_decimals = DecimalsMethod(core_lib, contract_address, functions.decimals, validator)
self._fn_decrease_allowance = DecreaseAllowanceMethod(core_lib, contract_address, functions.decreaseAllowance, validator)
self._fn_get_decimals = GetDecimalsMethod(core_lib, contract_address, functions.getDecimals, validator)
self._fn_gov = GovMethod(core_lib, contract_address, functions.gov, validator)
self._fn_increase_allowance = IncreaseAllowanceMethod(core_lib, contract_address, functions.increaseAllowance, validator)
self._fn_is_minter = IsMinterMethod(core_lib, contract_address, functions.isMinter, validator)
self._fn_mint = MintMethod(core_lib, contract_address, functions.mint, validator)
self._fn_name = NameMethod(core_lib, contract_address, functions.name, validator)
self._fn_remove_minter = RemoveMinterMethod(core_lib, contract_address, functions.removeMinter, validator)
self._fn_renounce_minter = RenounceMinterMethod(core_lib, contract_address, functions.renounceMinter, validator)
self._fn_symbol = SymbolMethod(core_lib, contract_address, functions.symbol, validator)
self._fn_token_name = TokenNameMethod(core_lib, contract_address, functions.tokenName, validator)
self._fn_token_symbol = TokenSymbolMethod(core_lib, contract_address, functions.tokenSymbol, validator)
self._fn_total_supply = TotalSupplyMethod(core_lib, contract_address, functions.totalSupply, validator)
self._fn_transfer = TransferMethod(core_lib, contract_address, functions.transfer, validator)
self._fn_transfer_from = TransferFromMethod(core_lib, contract_address, functions.transferFrom, validator)
def event_approval(
self, tx_hash: Union[HexBytes, bytes]
) -> Tuple[AttributeDict]:
"""
        Get log entry for the Approval event emitted by Ori20.
:param tx_hash: hash of transaction emitting Approval event
"""
tx_receipt = self._web3_eth.getTransactionReceipt(tx_hash)
return self._web3_eth.contract(address=to_checksum_address(self.contract_address), abi=Ori20.abi()).events.Approval().processReceipt(tx_receipt)
def event_minter_added(
self, tx_hash: Union[HexBytes, bytes]
) -> Tuple[AttributeDict]:
"""
        Get log entry for the MinterAdded event emitted by Ori20.
:param tx_hash: hash of transaction emitting MinterAdded event
"""
tx_receipt = self._web3_eth.getTransactionReceipt(tx_hash)
return self._web3_eth.contract(address=to_checksum_address(self.contract_address), abi=Ori20.abi()).events.MinterAdded().processReceipt(tx_receipt)
def event_minter_removed(
self, tx_hash: Union[HexBytes, bytes]
) -> Tuple[AttributeDict]:
"""
        Get log entry for the MinterRemoved event emitted by Ori20.
:param tx_hash: hash of transaction emitting MinterRemoved event
"""
tx_receipt = self._web3_eth.getTransactionReceipt(tx_hash)
return self._web3_eth.contract(address=to_checksum_address(self.contract_address), abi=Ori20.abi()).events.MinterRemoved().processReceipt(tx_receipt)
def event_transfer(
self, tx_hash: Union[HexBytes, bytes]
) -> Tuple[AttributeDict]:
"""
        Get log entry for the Transfer event emitted by Ori20.
:param tx_hash: hash of transaction emitting Transfer event
"""
tx_receipt = self._web3_eth.getTransactionReceipt(tx_hash)
return self._web3_eth.contract(address=to_checksum_address(self.contract_address), abi=Ori20.abi()).events.Transfer().processReceipt(tx_receipt)
def add_minter(self, account: str) -> None:
"""
        Grant the minter role to account (contract method addMinter).
"""
return self._fn_add_minter.block_send(account, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def allowance(self, owner: str, spender: str) -> int:
"""
        Return the remaining allowance that owner has granted to spender.
"""
return self._fn_allowance.block_call(owner, spender)
def approve(self, spender: str, amount: int) -> bool:
"""
        Approve spender to transfer up to amount tokens on the caller's behalf.
"""
return self._fn_approve.block_send(spender, amount, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def balance_of(self, account: str) -> int:
"""
        Return the token balance of account.
"""
return self._fn_balance_of.block_call(account)
def burn(self, amount: int) -> None:
"""
        Destroy amount tokens from the caller's balance.
"""
return self._fn_burn.block_send(amount, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def burn_from(self, account: str, amount: int) -> None:
"""
        Destroy amount tokens from account, drawing down the caller's allowance.
"""
return self._fn_burn_from.block_send(account, amount, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def cap(self) -> int:
"""
        Return the cap on the total token supply.
"""
return self._fn_cap.block_call()
def decimals(self) -> int:
"""
        Return the number of decimals the token uses.
"""
return self._fn_decimals.block_call()
def decrease_allowance(self, spender: str, subtracted_value: int) -> bool:
"""
        Lower the allowance granted to spender by subtracted_value.
"""
return self._fn_decrease_allowance.block_send(spender, subtracted_value, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def get_decimals(self) -> int:
"""
        Return the token decimals via getDecimals.
"""
return self._fn_get_decimals.block_call()
def gov(self) -> str:
"""
        Return the governance address.
"""
return self._fn_gov.block_call()
def increase_allowance(self, spender: str, added_value: int) -> bool:
"""
        Raise the allowance granted to spender by added_value.
"""
return self._fn_increase_allowance.block_send(spender, added_value, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def is_minter(self, account: str) -> bool:
"""
        Return True if account holds the minter role.
"""
return self._fn_is_minter.block_call(account)
def mint(self, account: str, amount: int) -> bool:
"""
        Mint amount new tokens to account.
"""
return self._fn_mint.block_send(account, amount, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def name(self) -> str:
"""
        Return the token name.
"""
return self._fn_name.block_call()
def remove_minter(self, account: str) -> None:
"""
        Revoke the minter role from account (contract method removeMinter).
"""
return self._fn_remove_minter.block_send(account, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def renounce_minter(self) -> None:
"""
        Renounce the caller's own minter role.
"""
return self._fn_renounce_minter.block_send(self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def symbol(self) -> str:
"""
        Return the token symbol.
"""
return self._fn_symbol.block_call()
def token_name(self) -> str:
"""
        Return the token name via tokenName.
"""
return self._fn_token_name.block_call()
def token_symbol(self) -> str:
"""
        Return the token symbol via tokenSymbol.
"""
return self._fn_token_symbol.block_call()
def total_supply(self) -> int:
"""
        Return the total token supply.
"""
return self._fn_total_supply.block_call()
def transfer(self, recipient: str, amount: int) -> bool:
"""
        Transfer amount tokens from the caller to recipient.
"""
return self._fn_transfer.block_send(recipient, amount, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def transfer_from(self, sender: str, recipient: str, amount: int) -> bool:
"""
        Transfer amount tokens from sender to recipient using the caller's allowance.
"""
return self._fn_transfer_from.block_send(sender, recipient, amount, self.call_contract_fee_amount, self.call_contract_fee_price, 0, self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def CallContractWait(self, t_long: int) -> "Ori20":
self._fn_add_minter.setWait(t_long)
self._fn_allowance.setWait(t_long)
self._fn_approve.setWait(t_long)
self._fn_balance_of.setWait(t_long)
self._fn_burn.setWait(t_long)
self._fn_burn_from.setWait(t_long)
self._fn_cap.setWait(t_long)
self._fn_decimals.setWait(t_long)
self._fn_decrease_allowance.setWait(t_long)
self._fn_get_decimals.setWait(t_long)
self._fn_gov.setWait(t_long)
self._fn_increase_allowance.setWait(t_long)
self._fn_is_minter.setWait(t_long)
self._fn_mint.setWait(t_long)
self._fn_name.setWait(t_long)
self._fn_remove_minter.setWait(t_long)
self._fn_renounce_minter.setWait(t_long)
self._fn_symbol.setWait(t_long)
self._fn_token_name.setWait(t_long)
self._fn_token_symbol.setWait(t_long)
self._fn_total_supply.setWait(t_long)
self._fn_transfer.setWait(t_long)
self._fn_transfer_from.setWait(t_long)
return self
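CallContractWait applies a single wait time to every bound method and then returns self, so it can be chained directly off the constructor. The pattern in isolation, with stub classes (all names below are hypothetical illustrations, not this library's API):

```python
class StubMethod:
    # stands in for a ContractMethod that sleeps self._wait between sends
    def __init__(self):
        self._wait = 0
    def setWait(self, t_long: int) -> None:
        self._wait = t_long

class StubWrapper:
    def __init__(self):
        self._methods = [StubMethod() for _ in range(3)]
    def CallContractWait(self, t_long: int) -> "StubWrapper":
        # configure every bound method, then return self for chaining
        for m in self._methods:
            m.setWait(t_long)
        return self

# fluent usage: configure the wait as part of construction
w = StubWrapper().CallContractWait(5)
```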
@staticmethod
def abi():
"""Return the ABI to the underlying contract."""
return json.loads(
'[{"inputs":[],"payable":false,"stateMutability":"nonpayable","type":"constructor"},{"anonymous":false,"inputs":[{"indexed":true,"internalType":"address","name":"owner","type":"address"},{"indexed":true,"internalType":"address","name":"spender","type":"address"},{"indexed":false,"internalType":"uint256","name":"value","type":"uint256"}],"name":"Approval","type":"event"},{"anonymous":false,"inputs":[{"indexed":true,"internalType":"address","name":"account","type":"address"}],"name":"MinterAdded","type":"event"},{"anonymous":false,"inputs":[{"indexed":true,"internalType":"address","name":"account","type":"address"}],"name":"MinterRemoved","type":"event"},{"anonymous":false,"inputs":[{"indexed":true,"internalType":"address","name":"from","type":"address"},{"indexed":true,"internalType":"address","name":"to","type":"address"},{"indexed":false,"internalType":"uint256","name":"value","type":"uint256"}],"name":"Transfer","type":"event"},{"constant":false,"inputs":[{"internalType":"address","name":"account","type":"address"}],"name":"addMinter","outputs":[],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[{"internalType":"address","name":"owner","type":"address"},{"internalType":"address","name":"spender","type":"address"}],"name":"allowance","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"spender","type":"address"},{"internalType":"uint256","name":"amount","type":"uint256"}],"name":"approve","outputs":[{"internalType":"bool","name":"","type":"bool"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[{"internalType":"address","name":"account","type":"address"}],"name":"balanceOf","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"uin
t256","name":"amount","type":"uint256"}],"name":"burn","outputs":[],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"account","type":"address"},{"internalType":"uint256","name":"amount","type":"uint256"}],"name":"burnFrom","outputs":[],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"cap","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"decimals","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"spender","type":"address"},{"internalType":"uint256","name":"subtractedValue","type":"uint256"}],"name":"decreaseAllowance","outputs":[{"internalType":"bool","name":"","type":"bool"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"getDecimals","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"gov","outputs":[{"internalType":"address","name":"","type":"address"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"spender","type":"address"},{"internalType":"uint256","name":"addedValue","type":"uint256"}],"name":"increaseAllowance","outputs":[{"internalType":"bool","name":"","type":"bool"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[{"internalType":"address","name":"account","type":"address"}],"name":"isMinter","outputs":[{"internalType":"bool","name":"","type":"bool"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"account","type":"address"},{"inter
nalType":"uint256","name":"amount","type":"uint256"}],"name":"mint","outputs":[{"internalType":"bool","name":"","type":"bool"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"name","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"account","type":"address"}],"name":"removeMinter","outputs":[],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":false,"inputs":[],"name":"renounceMinter","outputs":[],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"symbol","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"tokenName","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"tokenSymbol","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"totalSupply","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"recipient","type":"address"},{"internalType":"uint256","name":"amount","type":"uint256"}],"name":"transfer","outputs":[{"internalType":"bool","name":"","type":"bool"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":false,"inputs":[{"internalType":"address","name":"sender","type":"address"},{"internalType":"address","name":"recipient","type":"address"},{"internalType":"uint256","name":"amount","type":"uint256"}],"name":"transferFrom","outputs":[{"internalType":"bool","name":"","type":"bool"}],"payable":false,"stateMutability":
"nonpayable","type":"function"}]'
# noqa: E501 (line-too-long)
)
# pylint: disable=too-many-lines
| 42.693848 | 6,045 | 0.6383 | 9,527 | 87,437 | 5.632413 | 0.035268 | 0.036974 | 0.033917 | 0.022288 | 0.867946 | 0.823183 | 0.798416 | 0.771674 | 0.759653 | 0.719381 | 0 | 0.004397 | 0.240459 | 87,437 | 2,047 | 6,046 | 42.714704 | 0.802596 | 0.119698 | 0 | 0.676642 | 1 | 0.000831 | 0.168056 | 0.09702 | 0 | 0 | 0 | 0 | 0.018288 | 1 | 0.131338 | false | 0.002494 | 0.0133 | 0.019119 | 0.287614 | 0.109726 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
f50169703b9c24d53f3cd1695248bb26a2a3a53c | 8,902 | py | Python | naturMagCovers.py | petrvanblokland/TYPETR-Upgrade-site | 25aa4025d9d4592b73c0265cf7b3b4f9556bc9a6 | [
"MIT"
] | null | null | null | naturMagCovers.py | petrvanblokland/TYPETR-Upgrade-site | 25aa4025d9d4592b73c0265cf7b3b4f9556bc9a6 | [
"MIT"
] | 1 | 2018-01-15T23:49:44.000Z | 2018-01-16T00:18:46.000Z | naturMagCovers.py | petrvanblokland/TYPETR-Upgrade-site | 25aa4025d9d4592b73c0265cf7b3b4f9556bc9a6 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: UTF-8 -*-
# -----------------------------------------------------------------------------
# Copyright (c) 2016+ Buro Petr van Blokland + Claudia Mens & Font Bureau
# www.pagebot.io
#
# P A G E B O T
#
# Free to use. Licensed under MIT conditions
#
# Supporting usage of DrawBot, www.drawbot.com
# Supporting usage of Flat, https://github.com/xxyxyz/flat
# -----------------------------------------------------------------------------
#
#     naturMagCovers.py
#
from pagebot.contexts import defaultContext as context
from pagebot.fonttoolbox.objects.font import getFontByName
from pagebot.style import A4, CENTER, DISPLAY_BLOCK, RIGHT, LEFT
from pagebot import Gradient, Shadow
from pagebot.toolbox.dating import now
shadow = Shadow(offset=(6, -6), blur=10, color=(0.2, 0.2, 0.2, 0.5))
W, H = A4[0]*3/4, A4[1]*3/4
magazineTitle = u'Natur'
def buildCoverPages(w, h, year):
r, g, b = 0x50/255, 0x80/255, 0xc0/255
M = 2 # Margin
ML, MR, MT, MB = M, 0.75*M, M, 1.5*M
cw = w-ML-MR
# Page 66
context.newPage(w, h)
# Draw image, covering all page, scaled.
context.image('docs/images/IMG_0735-50.jpg', (-1, -10), h=h+20)
context.save()
#context.setGradient(gradient, (0, h*3/4), w, h/5) # Add self to define start/end from relative size.
context.fill((r, g, b, 0.8))
context.rect(0, h*4/5, w, h/5)
context.restore()
y = h
    # Title of cover, make it fit in width and add shadow
# PageBot bug: automatic sizing with tracking does not work now
#coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1, rTracking=-0.5)
#bs = context.newString(magazineTitle, style=coverTitleStyle, w=w-4*M)
coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1)
bs = context.newString(magazineTitle, style=coverTitleStyle)
bs += context.newString('2', style=dict(font='Upgrade-Medium', textFill=1, fontSize=100, openTypeFeatures=dict(sinf=True)))
tw, th = bs.size()
context.setShadow(shadow)
context.text(bs, (M*4, y-th*0.62))
context.resetShadow()
y -= th
    # Subtitle of cover, make it fit in width and add shadow
style = dict(font='Upgrade-Book', fontSize=h/21, textFill=1, rLeading=1.1)
bs = context.newString('Upgraded\nBranches', style=style)
tw, th = bs.size()
context.text(bs, (w*2/3, y+th*1.33))
    # Month line of cover, make it fit in width and add shadow
coverTitleStyle = dict(font='Upgrade-LightItalic', fontSize=28, textFill=1)
bs = context.newString('February', style=coverTitleStyle)
context.text(bs, (M*6, MB+16))
def buildCoverPages1(w, h, year):
r, g, b = 0x18/255, 0x24/255, 0x35/255
M = 2 # Margin
ML, MR, MT, MB = M, 0.75*M, M, 1.5*M
cw = w-ML-MR
# Page 66
context.newPage(w, h)
# Draw image, covering all page, scaled.
context.image('docs/images/IMG_2643-50.jpg', (-1000, 0), h=h)
context.save()
#context.setGradient(gradient, (0, h*3/4), w, h/5) # Add self to define start/end from relative size.
context.fill((r, g, b, 0.8))
context.rect(0, h*4/5, w, h/5)
context.restore()
y = h
    # Title of cover, make it fit in width and add shadow
# PageBot bug: automatic sizing with tracking does not work now
#coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1, rTracking=-0.5)
#bs = context.newString(magazineTitle, style=coverTitleStyle, w=w-4*M)
coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1)
bs = context.newString(magazineTitle, style=coverTitleStyle)
bs += context.newString('3', style=dict(font='Upgrade-Medium', textFill=1, fontSize=100, openTypeFeatures=dict(sups=True)))
tw, th = bs.size()
context.setShadow(shadow)
context.text(bs, (M*4, y-th*0.62))
context.resetShadow()
y -= th
    # Subtitle of cover, make it fit in width and add shadow
style = dict(font='Upgrade-Book', fontSize=h/21, textFill=1, rLeading=1.1)
bs = context.newString('Upgraded\nBrass & Blue', style=style)
tw, th = bs.size()
context.text(bs, (w*2/3, y+th*1.33))
    # Month line of cover, make it fit in width and add shadow
coverTitleStyle = dict(font='Upgrade-LightItalic', fontSize=28, textFill=1)
bs = context.newString('March', style=coverTitleStyle)
context.text(bs, (M*6, MB+16))
def buildCoverPages2(w, h, year):
r, g, b = 0x40/255, 0x76/255, 0x1F/255
M = 2 # Margin
ML, MR, MT, MB = M, 0.75*M, M, 1.5*M
cw = w-ML-MR
# Page 66
context.newPage(w, h)
# Draw image, covering all page, scaled.
context.image('docs/images/IMG_1728-50.jpg', (-100, -10), h=h+20)
context.save()
#context.setGradient(gradient, (0, h*3/4), w, h/5) # Add self to define start/end from relative size.
context.fill((r, g, b, 0.8))
context.rect(0, h*4/5, w, h/4)
context.restore()
y = h
    # Title of cover, make it fit in width and add shadow
# PageBot bug: automatic sizing with tracking does not work now
#coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1, rTracking=-0.5)
#bs = context.newString(magazineTitle, style=coverTitleStyle, w=w-4*M)
coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1)
bs = context.newString(magazineTitle, style=coverTitleStyle)
bs += context.newString('8', style=dict(font='Upgrade-Medium', textFill=1, fontSize=100, openTypeFeatures=dict(sinf=True)))
tw, th = bs.size()
context.setShadow(shadow)
context.text(bs, (M*4, y-th*0.62))
context.resetShadow()
y -= th
    # Subtitle of cover, make it fit in width and add shadow
style = dict(font='Upgrade-Book', fontSize=h/21, textFill=1, rLeading=1.1)
bs = context.newString('Upgraded\nGingers', style=style)
tw, th = bs.size()
context.text(bs, (w*2/3, y+th*1.33))
    # Month line of cover, make it fit in width and add shadow
coverTitleStyle = dict(font='Upgrade-LightItalic', fontSize=28, textFill=1)
bs = context.newString('August', style=coverTitleStyle)
context.text(bs, (w*2/3, MB+12))
def buildCoverPages3(w, h, year):
r, g, b = 0x45/255, 0x76/255, 0x76/255
#gradient = Gradient(start=(0, 0), end=(0, 1), locations=(0, 1), colors=((r, g, b, 0 ), (r, g, b, 1)))
M = 2 # Margin
ML, MR, MT, MB = M, 0.75*M, M, 1.5*M
cw = w-ML-MR
# Page 66
context.newPage(w, h)
# Draw image, covering all page, scaled.
context.image('docs/images/IMG_0750-50.jpg', (-w/2.4, -8), h=h+16)
context.save()
#context.setGradient(gradient, (0, h*3/4), w, h/5) # Add self to define start/end from relative size.
context.fill((r, g, b, 0.8))
context.rect(0, h*4/5, w, h/4)
context.restore()
y = h
    # Title of cover, make it fit in width and add shadow
# PageBot bug: automatic sizing with tracking does not work now
#coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1, rTracking=-0.5)
#bs = context.newString(magazineTitle, style=coverTitleStyle, w=w-4*M)
coverTitleStyle = dict(font='Upgrade-Medium', fontSize=100, textFill=1)
bs = context.newString(magazineTitle, style=coverTitleStyle)
bs += context.newString('9', style=dict(font='Upgrade-Medium', textFill=1, fontSize=100, openTypeFeatures=dict(sups=True)))
tw, th = bs.size()
context.setShadow(shadow)
context.text(bs, (M*4, y-th*0.62))
context.resetShadow()
y -= th
    # Subtitle of cover, make it fit in width and add shadow
style = dict(font='Upgrade-Book', fontSize=h/21, textFill=1, rLeading=1.1)
bs = context.newString('Upgraded\nExperience', style=style)
tw, th = bs.size()
context.text(bs, (w*2/3, y+th*1.33))
    # Month line of cover, make it fit in width and add shadow
coverTitleStyle = dict(font='Upgrade-LightItalic', fontSize=28, textFill=1)
bs = context.newString('September', style=coverTitleStyle)
context.text(bs, (w*2/3, MB+12))
IMAGES = (
('docs/documents/naturCoverPages.pdf', W, H, now().year, buildCoverPages),
('docs/images/naturCoverPages.png', W, H, now().year, buildCoverPages),
('docs/documents/naturCoverPages1.pdf', W, H, now().year+1, buildCoverPages1),
('docs/images/naturCoverPages1.png', W, H, now().year+1, buildCoverPages1),
('docs/documents/naturCoverPages2.pdf', W, H, now().year+1, buildCoverPages2),
('docs/images/naturCoverPages2.png', W, H, now().year+1, buildCoverPages2),
('docs/documents/naturCoverPages3.pdf', W, H, now().year+1, buildCoverPages3),
('docs/images/naturCoverPages3.png', W, H, now().year+1, buildCoverPages3),
)
for path, w, h, year, m in IMAGES:
    # newDrawing() and saveImage() are DrawBot globals, available when this
    # script is run inside the DrawBot environment.
    newDrawing()
    m(w, h, year)
    saveImage(path, multipage=True)
    print(path)
| 39.215859 | 653 | 0.629971 | 1,327 | 8,902 | 4.222306 | 0.158252 | 0.009638 | 0.053543 | 0.034267 | 0.803141 | 0.800821 | 0.771908 | 0.750491 | 0.750491 | 0.750491 | 0 | 0.054789 | 0.202426 | 8,902 | 227 | 654 | 39.215859 | 0.734366 | 0.302404 | 0 | 0.641221 | 0 | 0 | 0.117743 | 0.060823 | 0 | 0 | 0.007806 | 0 | 0 | 0 | null | null | 0 | 0.038168 | null | null | 0.007634 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f506c01f94347f952d8c551109f6a08fef8c3231 | 116 | py | Python | src/analytics/lambdas/test/test_coronavirus_gov_importer.py | shehir12/covid19-app-system-public | 63184012d85335f7c499fe41ab534a0ef935a4b8 | [
"MIT"
] | null | null | null | src/analytics/lambdas/test/test_coronavirus_gov_importer.py | shehir12/covid19-app-system-public | 63184012d85335f7c499fe41ab534a0ef935a4b8 | [
"MIT"
] | null | null | null | src/analytics/lambdas/test/test_coronavirus_gov_importer.py | shehir12/covid19-app-system-public | 63184012d85335f7c499fe41ab534a0ef935a4b8 | [
"MIT"
] | null | null | null | from . import check_code
def test_code_is_valid():
assert check_code('importers/coronavirus_gov_importer.py')
| 19.333333 | 62 | 0.793103 | 17 | 116 | 5 | 0.823529 | 0.211765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12069 | 116 | 5 | 63 | 23.2 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.318966 | 0.318966 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.666667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
018107436ed7014aba853cd3343b07ffdd93f3d3 | 1,935 | py | Python | tests/test_Board.py | PoppeThomas/connect4-nn | 7af9f00991724c9d18ea7d0946f423218ff2fa0d | [
"MIT"
] | null | null | null | tests/test_Board.py | PoppeThomas/connect4-nn | 7af9f00991724c9d18ea7d0946f423218ff2fa0d | [
"MIT"
] | null | null | null | tests/test_Board.py | PoppeThomas/connect4-nn | 7af9f00991724c9d18ea7d0946f423218ff2fa0d | [
"MIT"
] | null | null | null | from game.Board import Board
from game.Constants import HUMAN, COMPUTER
def test_detect_win_vertical():
board = Board()
row, win = board.play(HUMAN, 0)
assert not win
row, win = board.play(HUMAN, 0)
assert not win
row, win = board.play(HUMAN, 0)
assert not win
row, win = board.play(HUMAN, 0)
assert win
def test_detect_win_horizontal():
board = Board()
row, win = board.play(COMPUTER, 0)
assert not win
row, win = board.play(COMPUTER, 1)
assert not win
row, win = board.play(COMPUTER, 3)
assert not win
row, win = board.play(COMPUTER, 2)
assert win
def test_detect_win_horizontal_not_same_player():
board = Board()
row, win = board.play(COMPUTER, 0)
assert not win
row, win = board.play(HUMAN, 1)
assert not win
row, win = board.play(COMPUTER, 3)
assert not win
row, win = board.play(COMPUTER, 2)
assert not win
def test_detect_win_diagonal_1():
board = Board()
row, win = board.play(COMPUTER, 0)
assert not win
row, win = board.play(HUMAN, 1)
row, win = board.play(COMPUTER, 1)
assert not win
row, win = board.play(HUMAN, 2)
row, win = board.play(HUMAN, 2)
row, win = board.play(COMPUTER, 2)
assert not win
row, win = board.play(HUMAN, 3)
row, win = board.play(COMPUTER, 3)
row, win = board.play(HUMAN, 3)
row, win = board.play(COMPUTER, 3)
assert win
def test_detect_win_diagonal_2():
board = Board()
row, win = board.play(COMPUTER, 3)
assert not win
row, win = board.play(HUMAN, 2)
row, win = board.play(COMPUTER, 2)
assert not win
row, win = board.play(HUMAN, 0)
row, win = board.play(COMPUTER, 0)
row, win = board.play(HUMAN, 0)
row, win = board.play(COMPUTER, 0)
assert not win
row, win = board.play(HUMAN, 1)
row, win = board.play(HUMAN, 1)
row, win = board.play(COMPUTER, 1)
assert win
| 25.12987 | 49 | 0.630491 | 299 | 1,935 | 4.013378 | 0.080268 | 0.16 | 0.293333 | 0.4 | 0.915 | 0.915 | 0.864167 | 0.8025 | 0.8025 | 0.7975 | 0 | 0.023529 | 0.25323 | 1,935 | 76 | 50 | 25.460526 | 0.80692 | 0 | 0 | 0.890625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.3125 | 1 | 0.078125 | false | 0 | 0.03125 | 0 | 0.109375 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
6dd5902dc5c4476d14c700d3861d8d352d8ea7a4 | 74,640 | py | Python | tests/data/curry/eqconstr/generate_test_programs.py | andyjost/Sprite | 7ecd6fc7d48d7f62da644e48c12c7b882e1a2929 | [
"MIT"
] | 1 | 2022-03-16T16:37:11.000Z | 2022-03-16T16:37:11.000Z | tests/data/curry/eqconstr/generate_test_programs.py | andyjost/Sprite | 7ecd6fc7d48d7f62da644e48c12c7b882e1a2929 | [
"MIT"
] | null | null | null | tests/data/curry/eqconstr/generate_test_programs.py | andyjost/Sprite | 7ecd6fc7d48d7f62da644e48c12c7b882e1a2929 | [
"MIT"
] | null | null | null | #!/usr/bin/python
import sys
sys.path.insert(0, '../../scripts')
from generate_test_programs_lib import generate_test_programs
# This script generates many test files. The reason for doing it this way is
# simply that it is easier to maintain this script than work with scores of
# tiny files.
#
# The programs are arranged into several lists based on the type definitions
# they require. There is a naming convention. The constructors are named as
# A, B, C, ... and files are encoded as a concatenation of the lowercase names,
# each followed by its arity. So, "a0" tests a type with one constructor
# taking no arguments, and "a1b2c0" tests a type with three constructors that
# take 1, 2, and 0 arguments, respectively.
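# The file-name encoding described above can be sketched as a small helper
# (hypothetical, not part of the generator library): pair each constructor
# arity with the lowercase names a, b, c, ... and concatenate.

```python
import string

def encode_arities(arities):
    """Encode constructor arities as a test-file name, e.g. (1, 2, 0) -> 'a1b2c0'."""
    # zip stops at the shorter sequence, so at most 26 constructors are named.
    return ''.join(name + str(arity)
                   for name, arity in zip(string.ascii_lowercase, arities))
```

# For example, encode_arities([0]) yields 'a0' and encode_arities([1, 2, 0])
# yields 'a1b2c0', matching the convention above.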
PROGRAMS = [
# Constructors.
'main = 0 =:= 0'
, 'main = 0 =:= 1'
, 'main = [0,0] =:= [0,0]'
, 'main = [0,0] =:= [0,1]'
# Functions.
, 'f = True\n'
'main = f =:= f'
, 'f = True\n'
'main = True =:= f'
, 'f = True\n'
'main = f =:= True'
, 'f = True\n'
'main = False =:= f'
, 'f = True\n'
'main = f =:= False'
# Choices.
, 'main = 0 =:= (0?1)'
, 'main = (0?1) =:= 1'
, 'main = (0?1) =:= (0?1)'
# Failures.
, 'main = 0 =:= failed'
, 'main = failed =:= 1'
, 'main = failed =:= (0?1)'
# Free variables.
, 'main = True =:= x where x free'
, 'main = True == x where x free'
, 'f True True = True\n'
'main = f True x where x free'
, 'main = (x::())=:=y where x,y free'
, 'main = (True:x)=:=(True:y) where x,y free'
, 'main = (True:x)=:=(False:y) where x,y free'
, 'main = ((x::()):xs)=:=(y:ys) where x,y,xs,ys free'
, 'main = (True:xs)=:=(y:ys) where y,xs,ys free'
# Forward nodes.
, 'fwd x = x\n' # fwd constructor
'main = fwd True =:= True'
, 'fwd x = x\n'
'main = fwd True =:= False'
, 'fwd x = x\n'
'main = True =:= fwd True'
, 'fwd x = x\n'
'main = False =:= fwd True'
, 'fwd x = x\n' # fwd function
'f = True\n'
'main = fwd f =:= True'
, 'fwd x = x\n'
'f = True\n'
'main = fwd f =:= False'
, 'fwd x = x\n'
'f = True\n'
'main = True =:= fwd f'
, 'fwd x = x\n'
'f = True\n'
'main = False =:= fwd f'
, 'fwd x = x\n' # fwd choice
'f = True ? False\n'
'main = fwd f =:= True'
, 'fwd x = x\n'
'f = True ? False\n'
'main = fwd f =:= False'
, 'fwd x = x\n'
'f = True ? False\n'
'main = True =:= fwd f'
, 'fwd x = x\n'
'f = True ? False\n'
'main = False =:= fwd f'
, 'fwd x = x\n' # fwd free
'main = fwd x =:= True where x free'
, 'fwd x = x\n'
'main = fwd x =:= False where x free'
, 'fwd x = x\n'
'main = True =:= fwd x where x free'
, 'fwd x = x\n'
'main = False =:= fwd x where x free'
# More complex.
, 'main = ((x =:= True) &> x) ? x where x free'
, 'main = ((True =:= x) &> x) ? x where x free'
, 'main = x ? ((x =:= True) &> x) where x free'
, 'main = x ? ((True =:= x) &> x) where x free'
, 'main :: Bool\n'
'main = ((x =:= x) &> x) ? x where x free'
, 'main = (((x::Bool) =:= x & (y::Bool) =:= y) &> x) ? x where x,y free'
, 'main :: Bool\n'
'main = ((x =:= x & y =:= y & x =:= y) &> x) ? x where x,y free'
, 'main = ((x =:= x & y =:= y & x =:= y & y =:= True) &> x) ? x where x,y free'
, 'main = ((y =:= True & x =:= x & y =:= y & x =:= y) &> x) ? x where x,y free'
, 'main = ((y =:= True & x =:= x & y =:= y & x =:= y & y =:= True) &> x) ? x where x,y free'
, 'main = ((x =:= True & x =:= True) &> x) ? x where x free'
, 'main = ((x =:= True & x =:= False) &> x) ? x where x free'
, 'main = ((x =:= True & y =:= x & y =:= True) &> x) ? x where x,y free'
, 'main = ((x =:= True & y =:= x & y =:= False) &> x) ? x where x,y free'
# Inconsistent cases.
, 'main = ((x =:= True) & (y =:= False) & (x =:= y)) &> x where x,y free'
, 'main = ((True =:= x) & (y =:= False) & (x =:= y)) &> x where x,y free'
, 'main = ((x =:= True) & (False =:= y) & (x =:= y)) &> x where x,y free'
, 'main = ((True =:= x) & (False =:= y) & (x =:= y)) &> x where x,y free'
, 'main = ((x =:= True) & (y =:= False) & (y =:= x)) &> x where x,y free'
, 'main = ((True =:= x) & (y =:= False) & (y =:= x)) &> x where x,y free'
, 'main = ((x =:= True) & (False =:= y) & (y =:= x)) &> x where x,y free'
, 'main = ((True =:= x) & (False =:= y) & (y =:= x)) &> x where x,y free'
, 'main = ((x =:= True) & (x =:= y) & (y =:= False)) &> x where x,y free'
, 'main = ((True =:= x) & (x =:= y) & (y =:= False)) &> x where x,y free'
, 'main = ((x =:= True) & (x =:= y) & (False =:= y)) &> x where x,y free'
, 'main = ((True =:= x) & (x =:= y) & (False =:= y)) &> x where x,y free'
, 'main = ((x =:= True) & (y =:= x) & (y =:= False)) &> x where x,y free'
, 'main = ((True =:= x) & (y =:= x) & (y =:= False)) &> x where x,y free'
, 'main = ((x =:= True) & (y =:= x) & (False =:= y)) &> x where x,y free'
, 'main = ((True =:= x) & (y =:= x) & (False =:= y)) &> x where x,y free'
, 'main = ((y =:= False) & (x =:= True) & (x =:= y)) &> x where x,y free'
, 'main = ((y =:= False) & (True =:= x) & (x =:= y)) &> x where x,y free'
, 'main = ((False =:= y) & (x =:= True) & (x =:= y)) &> x where x,y free'
, 'main = ((False =:= y) & (True =:= x) & (x =:= y)) &> x where x,y free'
, 'main = ((y =:= False) & (x =:= True) & (y =:= x)) &> x where x,y free'
, 'main = ((y =:= False) & (True =:= x) & (y =:= x)) &> x where x,y free'
, 'main = ((False =:= y) & (x =:= True) & (y =:= x)) &> x where x,y free'
, 'main = ((False =:= y) & (True =:= x) & (y =:= x)) &> x where x,y free'
, 'main = ((x =:= y) & (x =:= True) & (y =:= False)) &> x where x,y free'
, 'main = ((x =:= y) & (True =:= x) & (y =:= False)) &> x where x,y free'
, 'main = ((x =:= y) & (x =:= True) & (False =:= y)) &> x where x,y free'
, 'main = ((x =:= y) & (True =:= x) & (False =:= y)) &> x where x,y free'
, 'main = ((y =:= x) & (x =:= True) & (y =:= False)) &> x where x,y free'
, 'main = ((y =:= x) & (True =:= x) & (y =:= False)) &> x where x,y free'
, 'main = ((y =:= x) & (x =:= True) & (False =:= y)) &> x where x,y free'
, 'main = ((y =:= x) & (True =:= x) & (False =:= y)) &> x where x,y free'
, 'main = ((y =:= False) & (x =:= y) & (x =:= True)) &> x where x,y free'
, 'main = ((y =:= False) & (x =:= y) & (True =:= x)) &> x where x,y free'
, 'main = ((False =:= y) & (x =:= y) & (x =:= True)) &> x where x,y free'
, 'main = ((False =:= y) & (x =:= y) & (True =:= x)) &> x where x,y free'
, 'main = ((y =:= False) & (y =:= x) & (x =:= True)) &> x where x,y free'
, 'main = ((y =:= False) & (y =:= x) & (True =:= x)) &> x where x,y free'
, 'main = ((False =:= y) & (y =:= x) & (x =:= True)) &> x where x,y free'
, 'main = ((False =:= y) & (y =:= x) & (True =:= x)) &> x where x,y free'
, 'main = ((x =:= y) & (y =:= False) & (x =:= True)) &> x where x,y free'
, 'main = ((x =:= y) & (y =:= False) & (True =:= x)) &> x where x,y free'
, 'main = ((x =:= y) & (False =:= y) & (x =:= True)) &> x where x,y free'
, 'main = ((x =:= y) & (False =:= y) & (True =:= x)) &> x where x,y free'
, 'main = ((y =:= x) & (y =:= False) & (x =:= True)) &> x where x,y free'
, 'main = ((y =:= x) & (y =:= False) & (True =:= x)) &> x where x,y free'
, 'main = ((y =:= x) & (False =:= y) & (x =:= True)) &> x where x,y free'
, 'main = ((y =:= x) & (False =:= y) & (True =:= x)) &> x where x,y free'
, 'main = ((x=:=y), x=:=True, y) where x,y free'
# Binding the same variable multiple times.
, 'main = x=:=True & x=:=not False &> x where x free'
, 'main :: ([Bool], [Bool])\n'
'main = x=:=y &> (x, y) where x,y free'
]
# Programs for integer narrowing.
INTEGER_PROGRAMS = [
'main = x =:= 1 where x free'
, 'main = 0 =:= x where x free'
, 'f 0 1 = 1\n'
'main :: Int\n'
'main = f 0 x where x free'
, 'f 0 True = True\n'
'main :: Bool\n'
'main = f 0 x where x free'
, 'main = (0:xs)=:=(y:ys) where y,xs,ys free'
, 'main :: Int\n'
'main = x=:=5 &> x where x free'
, 'f 0 = 0\n'
'f 1 = 1\n'
'g 0 = 0\n'
'g 2 = 2\n'
'main :: (Int, Int)\n'
'main = (f x, g x) where x free'
, 'main :: Int\n'
'main = x=:=1 &> x+2 where x free'
]
# Programs for character narrowing.
CHAR_PROGRAMS = [
"main = x =:= 'a' where x free"
, "main = 'a' =:= x where x free"
, "f 'a' 'b' = 'a'\n"
"main :: Char\n"
"main = f 'a' x where x free"
, "f 'a' True = True\n"
"main :: Bool\n"
"main = f 'a' x where x free"
, "main = ('a':xs)=:=(y:ys) where y,xs,ys free"
, "main :: Char\n"
"main = x=:='f' &> x where x free"
, "f 'a' = 'a'\n"
"f 'b' = 'b'\n"
"g 'a' = 'a'\n"
"g 'c' = 'c'\n"
"main :: (Char, Char)\n"
"main = (f x, g x) where x free"
, "main :: Int\n"
"main = x=:='b' &> ord x where x free"
]
# Programs for floating-point narrowing.
FLOAT_PROGRAMS = [
"main = x =:= 3.14 where x free"
, "main = 3.14 =:= x where x free"
, "f 3.14 2.72 = 3.14\n"
"main :: Float\n"
"main = f 3.14 x where x free"
, "f 3.14 True = True\n"
"main :: Bool\n"
"main = f 3.14 x where x free"
, "main = (3.14:xs)=:=(y:ys) where y,xs,ys free"
, "main :: Float\n"
"main = x=:=6.0 &> x where x free"
, "f 3.14 = 3.14\n"
"f 2.72 = 2.72\n"
"g 3.14 = 3.14\n"
"g 5.0 = 5.0\n"
"main :: (Float, Float)\n"
"main = (f x, g x) where x free"
, "main :: Float\n"
"main = x=:=2.72 &> -x where x free"
]
# Programs requiring: data T = A
A0_PROGRAMS = [
'x =:= A &> x where x free'
, 'A =:= x &> x where x free'
, '(x =:= A &> x) ? x where x free'
, '(A =:= x &> x) ? x where x free'
, '(x =:= y & y =:= A) &> x where x,y free'
, '(x =:= y & y =:= A) &> y where x,y free'
, '(x =:= y & A =:= y) &> x where x,y free'
, '(x =:= y & A =:= y) &> y where x,y free'
, '(y =:= x & y =:= A) &> x where x,y free'
, '(y =:= x & y =:= A) &> y where x,y free'
, '(y =:= x & A =:= y) &> x where x,y free'
, '(y =:= x & A =:= y) &> y where x,y free'
, '(y =:= A & x =:= y) &> x where x,y free'
, '(y =:= A & x =:= y) &> y where x,y free'
, '(A =:= y & x =:= y) &> x where x,y free'
, '(A =:= y & x =:= y) &> y where x,y free'
, '(y =:= A & y =:= x) &> x where x,y free'
, '(y =:= A & y =:= x) &> y where x,y free'
, '(A =:= y & y =:= x) &> x where x,y free'
, '(A =:= y & y =:= x) &> y where x,y free'
, '((x =:= y & y =:= A) &> x) ? x where x,y free'
, '((x =:= y & y =:= A) &> y) ? x where x,y free'
, '((x =:= y & A =:= y) &> x) ? x where x,y free'
, '((x =:= y & A =:= y) &> y) ? x where x,y free'
, '((y =:= x & y =:= A) &> x) ? x where x,y free'
, '((y =:= x & y =:= A) &> y) ? x where x,y free'
, '((y =:= x & A =:= y) &> x) ? x where x,y free'
, '((y =:= x & A =:= y) &> y) ? x where x,y free'
, '((y =:= A & x =:= y) &> x) ? x where x,y free'
, '((y =:= A & x =:= y) &> y) ? x where x,y free'
, '((A =:= y & x =:= y) &> x) ? x where x,y free'
, '((A =:= y & x =:= y) &> y) ? x where x,y free'
, '((y =:= A & y =:= x) &> x) ? x where x,y free'
, '((y =:= A & y =:= x) &> y) ? x where x,y free'
, '((A =:= y & y =:= x) &> x) ? x where x,y free'
, '((A =:= y & y =:= x) &> y) ? x where x,y free'
, '((x =:= y & y =:= A) &> x) ? y where x,y free'
, '((x =:= y & y =:= A) &> y) ? y where x,y free'
, '((x =:= y & A =:= y) &> x) ? y where x,y free'
, '((x =:= y & A =:= y) &> y) ? y where x,y free'
, '((y =:= x & y =:= A) &> x) ? y where x,y free'
, '((y =:= x & y =:= A) &> y) ? y where x,y free'
, '((y =:= x & A =:= y) &> x) ? y where x,y free'
, '((y =:= x & A =:= y) &> y) ? y where x,y free'
, '((y =:= A & x =:= y) &> x) ? y where x,y free'
, '((y =:= A & x =:= y) &> y) ? y where x,y free'
, '((A =:= y & x =:= y) &> x) ? y where x,y free'
, '((A =:= y & x =:= y) &> y) ? y where x,y free'
, '((y =:= A & y =:= x) &> x) ? y where x,y free'
, '((y =:= A & y =:= x) &> y) ? y where x,y free'
, '((A =:= y & y =:= x) &> x) ? y where x,y free'
, '((A =:= y & y =:= x) &> y) ? y where x,y free'
, '((x =:= y & y =:= A) &> x) ? x ? y where x,y free'
, '((x =:= y & y =:= A) &> y) ? x ? y where x,y free'
, '((x =:= y & A =:= y) &> x) ? x ? y where x,y free'
, '((x =:= y & A =:= y) &> y) ? x ? y where x,y free'
, '((y =:= x & y =:= A) &> x) ? x ? y where x,y free'
, '((y =:= x & y =:= A) &> y) ? x ? y where x,y free'
, '((y =:= x & A =:= y) &> x) ? x ? y where x,y free'
, '((y =:= x & A =:= y) &> y) ? x ? y where x,y free'
, '((y =:= A & x =:= y) &> x) ? x ? y where x,y free'
, '((y =:= A & x =:= y) &> y) ? x ? y where x,y free'
, '((A =:= y & x =:= y) &> x) ? x ? y where x,y free'
, '((A =:= y & x =:= y) &> y) ? x ? y where x,y free'
, '((y =:= A & y =:= x) &> x) ? x ? y where x,y free'
, '((y =:= A & y =:= x) &> y) ? x ? y where x,y free'
, '((A =:= y & y =:= x) &> x) ? x ? y where x,y free'
, '((A =:= y & y =:= x) &> y) ? x ? y where x,y free'
, '((x =:= y & y =:= A) &> x) ? y ? x where x,y free'
, '((x =:= y & y =:= A) &> y) ? y ? x where x,y free'
, '((x =:= y & A =:= y) &> x) ? y ? x where x,y free'
, '((x =:= y & A =:= y) &> y) ? y ? x where x,y free'
, '((y =:= x & y =:= A) &> x) ? y ? x where x,y free'
, '((y =:= x & y =:= A) &> y) ? y ? x where x,y free'
, '((y =:= x & A =:= y) &> x) ? y ? x where x,y free'
, '((y =:= x & A =:= y) &> y) ? y ? x where x,y free'
, '((y =:= A & x =:= y) &> x) ? y ? x where x,y free'
, '((y =:= A & x =:= y) &> y) ? y ? x where x,y free'
, '((A =:= y & x =:= y) &> x) ? y ? x where x,y free'
, '((A =:= y & x =:= y) &> y) ? y ? x where x,y free'
, '((y =:= A & y =:= x) &> x) ? y ? x where x,y free'
, '((y =:= A & y =:= x) &> y) ? y ? x where x,y free'
, '((A =:= y & y =:= x) &> x) ? y ? x where x,y free'
, '((A =:= y & y =:= x) &> y) ? y ? x where x,y free'
, 'x ? ((x =:= y & y =:= A) &> x) where x,y free'
, 'x ? ((x =:= y & y =:= A) &> y) where x,y free'
, 'x ? ((x =:= y & A =:= y) &> x) where x,y free'
, 'x ? ((x =:= y & A =:= y) &> y) where x,y free'
, 'x ? ((y =:= x & y =:= A) &> x) where x,y free'
, 'x ? ((y =:= x & y =:= A) &> y) where x,y free'
, 'x ? ((y =:= x & A =:= y) &> x) where x,y free'
, 'x ? ((y =:= x & A =:= y) &> y) where x,y free'
, 'x ? ((y =:= A & x =:= y) &> x) where x,y free'
, 'x ? ((y =:= A & x =:= y) &> y) where x,y free'
, 'x ? ((A =:= y & x =:= y) &> x) where x,y free'
, 'x ? ((A =:= y & x =:= y) &> y) where x,y free'
, 'x ? ((y =:= A & y =:= x) &> x) where x,y free'
, 'x ? ((y =:= A & y =:= x) &> y) where x,y free'
, 'x ? ((A =:= y & y =:= x) &> x) where x,y free'
, 'x ? ((A =:= y & y =:= x) &> y) where x,y free'
, 'y ? ((x =:= y & y =:= A) &> x) where x,y free'
, 'y ? ((x =:= y & y =:= A) &> y) where x,y free'
, 'y ? ((x =:= y & A =:= y) &> x) where x,y free'
, 'y ? ((x =:= y & A =:= y) &> y) where x,y free'
, 'y ? ((y =:= x & y =:= A) &> x) where x,y free'
, 'y ? ((y =:= x & y =:= A) &> y) where x,y free'
, 'y ? ((y =:= x & A =:= y) &> x) where x,y free'
, 'y ? ((y =:= x & A =:= y) &> y) where x,y free'
, 'y ? ((y =:= A & x =:= y) &> x) where x,y free'
, 'y ? ((y =:= A & x =:= y) &> y) where x,y free'
, 'y ? ((A =:= y & x =:= y) &> x) where x,y free'
, 'y ? ((A =:= y & x =:= y) &> y) where x,y free'
, 'y ? ((y =:= A & y =:= x) &> x) where x,y free'
, 'y ? ((y =:= A & y =:= x) &> y) where x,y free'
, 'y ? ((A =:= y & y =:= x) &> x) where x,y free'
, 'y ? ((A =:= y & y =:= x) &> y) where x,y free'
, 'x ? ((x =:= y & y =:= A) &> x) ? y where x,y free'
, 'x ? ((x =:= y & y =:= A) &> y) ? y where x,y free'
, 'x ? ((x =:= y & A =:= y) &> x) ? y where x,y free'
, 'x ? ((x =:= y & A =:= y) &> y) ? y where x,y free'
, 'x ? ((y =:= x & y =:= A) &> x) ? y where x,y free'
, 'x ? ((y =:= x & y =:= A) &> y) ? y where x,y free'
, 'x ? ((y =:= x & A =:= y) &> x) ? y where x,y free'
, 'x ? ((y =:= x & A =:= y) &> y) ? y where x,y free'
, 'x ? ((y =:= A & x =:= y) &> x) ? y where x,y free'
, 'x ? ((y =:= A & x =:= y) &> y) ? y where x,y free'
, 'x ? ((A =:= y & x =:= y) &> x) ? y where x,y free'
, 'x ? ((A =:= y & x =:= y) &> y) ? y where x,y free'
, 'x ? ((y =:= A & y =:= x) &> x) ? y where x,y free'
, 'x ? ((y =:= A & y =:= x) &> y) ? y where x,y free'
, 'x ? ((A =:= y & y =:= x) &> x) ? y where x,y free'
, 'x ? ((A =:= y & y =:= x) &> y) ? y where x,y free'
, 'y ? ((x =:= y & y =:= A) &> x) ? x where x,y free'
, 'y ? ((x =:= y & y =:= A) &> y) ? x where x,y free'
, 'y ? ((x =:= y & A =:= y) &> x) ? x where x,y free'
, 'y ? ((x =:= y & A =:= y) &> y) ? x where x,y free'
, 'y ? ((y =:= x & y =:= A) &> x) ? x where x,y free'
, 'y ? ((y =:= x & y =:= A) &> y) ? x where x,y free'
, 'y ? ((y =:= x & A =:= y) &> x) ? x where x,y free'
, 'y ? ((y =:= x & A =:= y) &> y) ? x where x,y free'
, 'y ? ((y =:= A & x =:= y) &> x) ? x where x,y free'
, 'y ? ((y =:= A & x =:= y) &> y) ? x where x,y free'
, 'y ? ((A =:= y & x =:= y) &> x) ? x where x,y free'
, 'y ? ((A =:= y & x =:= y) &> y) ? x where x,y free'
, 'y ? ((y =:= A & y =:= x) &> x) ? x where x,y free'
, 'y ? ((y =:= A & y =:= x) &> y) ? x where x,y free'
, 'y ? ((A =:= y & y =:= x) &> x) ? x where x,y free'
, 'y ? ((A =:= y & y =:= x) &> y) ? x where x,y free'
, 'x ? y ? ((x =:= y & y =:= A) &> x) where x,y free'
, 'x ? y ? ((x =:= y & y =:= A) &> y) where x,y free'
, 'x ? y ? ((x =:= y & A =:= y) &> x) where x,y free'
, 'x ? y ? ((x =:= y & A =:= y) &> y) where x,y free'
, 'x ? y ? ((y =:= x & y =:= A) &> x) where x,y free'
, 'x ? y ? ((y =:= x & y =:= A) &> y) where x,y free'
, 'x ? y ? ((y =:= x & A =:= y) &> x) where x,y free'
, 'x ? y ? ((y =:= x & A =:= y) &> y) where x,y free'
, 'x ? y ? ((y =:= A & x =:= y) &> x) where x,y free'
, 'x ? y ? ((y =:= A & x =:= y) &> y) where x,y free'
, 'x ? y ? ((A =:= y & x =:= y) &> x) where x,y free'
, 'x ? y ? ((A =:= y & x =:= y) &> y) where x,y free'
, 'x ? y ? ((y =:= A & y =:= x) &> x) where x,y free'
, 'x ? y ? ((y =:= A & y =:= x) &> y) where x,y free'
, 'x ? y ? ((A =:= y & y =:= x) &> x) where x,y free'
, 'x ? y ? ((A =:= y & y =:= x) &> y) where x,y free'
, 'y ? x ? ((x =:= y & y =:= A) &> x) where x,y free'
, 'y ? x ? ((x =:= y & y =:= A) &> y) where x,y free'
, 'y ? x ? ((x =:= y & A =:= y) &> x) where x,y free'
, 'y ? x ? ((x =:= y & A =:= y) &> y) where x,y free'
, 'y ? x ? ((y =:= x & y =:= A) &> x) where x,y free'
, 'y ? x ? ((y =:= x & y =:= A) &> y) where x,y free'
, 'y ? x ? ((y =:= x & A =:= y) &> x) where x,y free'
, 'y ? x ? ((y =:= x & A =:= y) &> y) where x,y free'
, 'y ? x ? ((y =:= A & x =:= y) &> x) where x,y free'
, 'y ? x ? ((y =:= A & x =:= y) &> y) where x,y free'
, 'y ? x ? ((A =:= y & x =:= y) &> x) where x,y free'
, 'y ? x ? ((A =:= y & x =:= y) &> y) where x,y free'
, 'y ? x ? ((y =:= A & y =:= x) &> x) where x,y free'
, 'y ? x ? ((y =:= A & y =:= x) &> y) where x,y free'
, 'y ? x ? ((A =:= y & y =:= x) &> x) where x,y free'
, 'y ? x ? ((A =:= y & y =:= x) &> y) where x,y free'
]
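# A minimal sketch (hypothetical helper, NOT part of the test data) of how a
# subset of the hand-enumerated guard permutations above could be generated
# programmatically. All names here (gen_guard_programs, its parameters) are
# assumptions for illustration only; it covers just the orientations of the
# two equations and the choice of guarded result, not every listed variant.
from itertools import product

def gen_guard_programs(constructors=('A',)):
    # Enumerate '((eq1 & eq2) &> r) ? x where x,y free' for both orientations
    # of each equation and both guarded results r in {x, y}.
    programs = []
    for c in constructors:
        eq_xy = ('x =:= y', 'y =:= x')          # orientations of x/y equation
        eq_cy = ('y =:= ' + c, c + ' =:= y')    # orientations of constructor equation
        for e1, e2, r in product(eq_xy, eq_cy, ('x', 'y')):
            programs.append('((%s & %s) &> %s) ? x where x,y free' % (e1, e2, r))
    return programs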
# Programs requiring: data T = A | B | C.
A0B0C0_PROGRAMS = [
'x =:= A &> x where x free'
, 'x =:= B &> x where x free'
, 'x =:= C &> x where x free'
, 'A =:= x &> x where x free'
, 'B =:= x &> x where x free'
, 'C =:= x &> x where x free'
, '(x =:= A &> x) ? x where x free'
, '(x =:= B &> x) ? x where x free'
, '(x =:= C &> x) ? x where x free'
, '(A =:= x &> x) ? x where x free'
, '(B =:= x &> x) ? x where x free'
, '(C =:= x &> x) ? x where x free'
, '((x =:= y & y =:= A) &> x) ? x where x,y free'
, '((x =:= y & y =:= B) &> x) ? x where x,y free'
, '((x =:= y & y =:= C) &> x) ? x where x,y free'
, '((x =:= y & A =:= y) &> x) ? x where x,y free'
, '((x =:= y & B =:= y) &> x) ? x where x,y free'
, '((x =:= y & C =:= y) &> x) ? x where x,y free'
, '((y =:= x & y =:= A) &> x) ? x where x,y free'
, '((y =:= x & y =:= B) &> x) ? x where x,y free'
, '((y =:= x & y =:= C) &> x) ? x where x,y free'
, '((y =:= x & A =:= y) &> x) ? x where x,y free'
, '((y =:= x & B =:= y) &> x) ? x where x,y free'
, '((y =:= x & C =:= y) &> x) ? x where x,y free'
, '((y =:= A & x =:= y ) &> x) ? x where x,y free'
, '((y =:= B & x =:= y ) &> x) ? x where x,y free'
, '((y =:= C & x =:= y ) &> x) ? x where x,y free'
, '((A =:= y & x =:= y ) &> x) ? x where x,y free'
, '((B =:= y & x =:= y ) &> x) ? x where x,y free'
, '((C =:= y & x =:= y ) &> x) ? x where x,y free'
, '((y =:= A & y =:= x ) &> x) ? x where x,y free'
, '((y =:= B & y =:= x ) &> x) ? x where x,y free'
, '((y =:= C & y =:= x ) &> x) ? x where x,y free'
, '((A =:= y & y =:= x ) &> x) ? x where x,y free'
, '((B =:= y & y =:= x ) &> x) ? x where x,y free'
, '((C =:= y & y =:= x ) &> x) ? x where x,y free'
, '((x =:= y & y =:= A) &> y) ? x where x,y free'
, '((x =:= y & y =:= B) &> y) ? x where x,y free'
, '((x =:= y & y =:= C) &> y) ? x where x,y free'
, '((x =:= y & A =:= y) &> y) ? x where x,y free'
, '((x =:= y & B =:= y) &> y) ? x where x,y free'
, '((x =:= y & C =:= y) &> y) ? x where x,y free'
, '((y =:= x & y =:= A) &> y) ? x where x,y free'
, '((y =:= x & y =:= B) &> y) ? x where x,y free'
, '((y =:= x & y =:= C) &> y) ? x where x,y free'
, '((y =:= x & A =:= y) &> y) ? x where x,y free'
, '((y =:= x & B =:= y) &> y) ? x where x,y free'
, '((y =:= x & C =:= y) &> y) ? x where x,y free'
, '((y =:= A & x =:= y ) &> y) ? x where x,y free'
, '((y =:= B & x =:= y ) &> y) ? x where x,y free'
, '((y =:= C & x =:= y ) &> y) ? x where x,y free'
, '((A =:= y & x =:= y ) &> y) ? x where x,y free'
, '((B =:= y & x =:= y ) &> y) ? x where x,y free'
, '((C =:= y & x =:= y ) &> y) ? x where x,y free'
, '((y =:= A & y =:= x ) &> y) ? x where x,y free'
, '((y =:= B & y =:= x ) &> y) ? x where x,y free'
, '((y =:= C & y =:= x ) &> y) ? x where x,y free'
, '((A =:= y & y =:= x ) &> y) ? x where x,y free'
, '((B =:= y & y =:= x ) &> y) ? x where x,y free'
, '((C =:= y & y =:= x ) &> y) ? x where x,y free'
, '((x =:= y & y =:= A) &> x) ? y where x,y free'
, '((x =:= y & y =:= B) &> x) ? y where x,y free'
, '((x =:= y & y =:= C) &> x) ? y where x,y free'
, '((x =:= y & A =:= y) &> x) ? y where x,y free'
, '((x =:= y & B =:= y) &> x) ? y where x,y free'
, '((x =:= y & C =:= y) &> x) ? y where x,y free'
, '((y =:= x & y =:= A) &> x) ? y where x,y free'
, '((y =:= x & y =:= B) &> x) ? y where x,y free'
, '((y =:= x & y =:= C) &> x) ? y where x,y free'
, '((y =:= x & A =:= y) &> x) ? y where x,y free'
, '((y =:= x & B =:= y) &> x) ? y where x,y free'
, '((y =:= x & C =:= y) &> x) ? y where x,y free'
, '((y =:= A & x =:= y ) &> x) ? y where x,y free'
, '((y =:= B & x =:= y ) &> x) ? y where x,y free'
, '((y =:= C & x =:= y ) &> x) ? y where x,y free'
, '((A =:= y & x =:= y ) &> x) ? y where x,y free'
, '((B =:= y & x =:= y ) &> x) ? y where x,y free'
, '((C =:= y & x =:= y ) &> x) ? y where x,y free'
, '((y =:= A & y =:= x ) &> x) ? y where x,y free'
, '((y =:= B & y =:= x ) &> x) ? y where x,y free'
, '((y =:= C & y =:= x ) &> x) ? y where x,y free'
, '((A =:= y & y =:= x ) &> x) ? y where x,y free'
, '((B =:= y & y =:= x ) &> x) ? y where x,y free'
, '((C =:= y & y =:= x ) &> x) ? y where x,y free'
, '((x =:= y & y =:= A) &> y) ? y where x,y free'
, '((x =:= y & y =:= B) &> y) ? y where x,y free'
, '((x =:= y & y =:= C) &> y) ? y where x,y free'
, '((x =:= y & A =:= y) &> y) ? y where x,y free'
, '((x =:= y & B =:= y) &> y) ? y where x,y free'
, '((x =:= y & C =:= y) &> y) ? y where x,y free'
, '((y =:= x & y =:= A) &> y) ? y where x,y free'
, '((y =:= x & y =:= B) &> y) ? y where x,y free'
, '((y =:= x & y =:= C) &> y) ? y where x,y free'
, '((y =:= x & A =:= y) &> y) ? y where x,y free'
, '((y =:= x & B =:= y) &> y) ? y where x,y free'
, '((y =:= x & C =:= y) &> y) ? y where x,y free'
, '((y =:= A & x =:= y ) &> y) ? y where x,y free'
, '((y =:= B & x =:= y ) &> y) ? y where x,y free'
, '((y =:= C & x =:= y ) &> y) ? y where x,y free'
, '((A =:= y & x =:= y ) &> y) ? y where x,y free'
, '((B =:= y & x =:= y ) &> y) ? y where x,y free'
, '((C =:= y & x =:= y ) &> y) ? y where x,y free'
, '((y =:= A & y =:= x ) &> y) ? y where x,y free'
, '((y =:= B & y =:= x ) &> y) ? y where x,y free'
, '((y =:= C & y =:= x ) &> y) ? y where x,y free'
, '((A =:= y & y =:= x ) &> y) ? y where x,y free'
, '((B =:= y & y =:= x ) &> y) ? y where x,y free'
, '((C =:= y & y =:= x ) &> y) ? y where x,y free'
, '((x =:= y & y =:= A) &> x) ? x ? y where x,y free'
, '((x =:= y & y =:= B) &> x) ? x ? y where x,y free'
, '((x =:= y & y =:= C) &> x) ? x ? y where x,y free'
, '((x =:= y & A =:= y) &> x) ? x ? y where x,y free'
, '((x =:= y & B =:= y) &> x) ? x ? y where x,y free'
, '((x =:= y & C =:= y) &> x) ? x ? y where x,y free'
, '((y =:= x & y =:= A) &> x) ? x ? y where x,y free'
, '((y =:= x & y =:= B) &> x) ? x ? y where x,y free'
, '((y =:= x & y =:= C) &> x) ? x ? y where x,y free'
, '((y =:= x & A =:= y) &> x) ? x ? y where x,y free'
, '((y =:= x & B =:= y) &> x) ? x ? y where x,y free'
, '((y =:= x & C =:= y) &> x) ? x ? y where x,y free'
, '((y =:= A & x =:= y ) &> x) ? x ? y where x,y free'
, '((y =:= B & x =:= y ) &> x) ? x ? y where x,y free'
, '((y =:= C & x =:= y ) &> x) ? x ? y where x,y free'
, '((A =:= y & x =:= y ) &> x) ? x ? y where x,y free'
, '((B =:= y & x =:= y ) &> x) ? x ? y where x,y free'
, '((C =:= y & x =:= y ) &> x) ? x ? y where x,y free'
, '((y =:= A & y =:= x ) &> x) ? x ? y where x,y free'
, '((y =:= B & y =:= x ) &> x) ? x ? y where x,y free'
, '((y =:= C & y =:= x ) &> x) ? x ? y where x,y free'
, '((A =:= y & y =:= x ) &> x) ? x ? y where x,y free'
, '((B =:= y & y =:= x ) &> x) ? x ? y where x,y free'
, '((C =:= y & y =:= x ) &> x) ? x ? y where x,y free'
, '((x =:= y & y =:= A) &> y) ? x ? y where x,y free'
, '((x =:= y & y =:= B) &> y) ? x ? y where x,y free'
, '((x =:= y & y =:= C) &> y) ? x ? y where x,y free'
, '((x =:= y & A =:= y) &> y) ? x ? y where x,y free'
, '((x =:= y & B =:= y) &> y) ? x ? y where x,y free'
, '((x =:= y & C =:= y) &> y) ? x ? y where x,y free'
, '((y =:= x & y =:= A) &> y) ? x ? y where x,y free'
, '((y =:= x & y =:= B) &> y) ? x ? y where x,y free'
, '((y =:= x & y =:= C) &> y) ? x ? y where x,y free'
, '((y =:= x & A =:= y) &> y) ? x ? y where x,y free'
, '((y =:= x & B =:= y) &> y) ? x ? y where x,y free'
, '((y =:= x & C =:= y) &> y) ? x ? y where x,y free'
, '((y =:= A & x =:= y ) &> y) ? x ? y where x,y free'
, '((y =:= B & x =:= y ) &> y) ? x ? y where x,y free'
, '((y =:= C & x =:= y ) &> y) ? x ? y where x,y free'
, '((A =:= y & x =:= y ) &> y) ? x ? y where x,y free'
, '((B =:= y & x =:= y ) &> y) ? x ? y where x,y free'
, '((C =:= y & x =:= y ) &> y) ? x ? y where x,y free'
, '((y =:= A & y =:= x ) &> y) ? x ? y where x,y free'
, '((y =:= B & y =:= x ) &> y) ? x ? y where x,y free'
, '((y =:= C & y =:= x ) &> y) ? x ? y where x,y free'
, '((A =:= y & y =:= x ) &> y) ? x ? y where x,y free'
, '((B =:= y & y =:= x ) &> y) ? x ? y where x,y free'
, '((C =:= y & y =:= x ) &> y) ? x ? y where x,y free'
, '((x =:= y & y =:= A) &> x) ? y ? x where x,y free'
, '((x =:= y & y =:= B) &> x) ? y ? x where x,y free'
, '((x =:= y & y =:= C) &> x) ? y ? x where x,y free'
, '((x =:= y & A =:= y) &> x) ? y ? x where x,y free'
, '((x =:= y & B =:= y) &> x) ? y ? x where x,y free'
, '((x =:= y & C =:= y) &> x) ? y ? x where x,y free'
, '((y =:= x & y =:= A) &> x) ? y ? x where x,y free'
, '((y =:= x & y =:= B) &> x) ? y ? x where x,y free'
, '((y =:= x & y =:= C) &> x) ? y ? x where x,y free'
, '((y =:= x & A =:= y) &> x) ? y ? x where x,y free'
, '((y =:= x & B =:= y) &> x) ? y ? x where x,y free'
, '((y =:= x & C =:= y) &> x) ? y ? x where x,y free'
, '((y =:= A & x =:= y ) &> x) ? y ? x where x,y free'
, '((y =:= B & x =:= y ) &> x) ? y ? x where x,y free'
, '((y =:= C & x =:= y ) &> x) ? y ? x where x,y free'
, '((A =:= y & x =:= y ) &> x) ? y ? x where x,y free'
, '((B =:= y & x =:= y ) &> x) ? y ? x where x,y free'
, '((C =:= y & x =:= y ) &> x) ? y ? x where x,y free'
, '((y =:= A & y =:= x ) &> x) ? y ? x where x,y free'
, '((y =:= B & y =:= x ) &> x) ? y ? x where x,y free'
, '((y =:= C & y =:= x ) &> x) ? y ? x where x,y free'
, '((A =:= y & y =:= x ) &> x) ? y ? x where x,y free'
, '((B =:= y & y =:= x ) &> x) ? y ? x where x,y free'
, '((C =:= y & y =:= x ) &> x) ? y ? x where x,y free'
, '((x =:= y & y =:= A) &> y) ? y ? x where x,y free'
, '((x =:= y & y =:= B) &> y) ? y ? x where x,y free'
, '((x =:= y & y =:= C) &> y) ? y ? x where x,y free'
, '((x =:= y & A =:= y) &> y) ? y ? x where x,y free'
, '((x =:= y & B =:= y) &> y) ? y ? x where x,y free'
, '((x =:= y & C =:= y) &> y) ? y ? x where x,y free'
, '((y =:= x & y =:= A) &> y) ? y ? x where x,y free'
, '((y =:= x & y =:= B) &> y) ? y ? x where x,y free'
, '((y =:= x & y =:= C) &> y) ? y ? x where x,y free'
, '((y =:= x & A =:= y) &> y) ? y ? x where x,y free'
, '((y =:= x & B =:= y) &> y) ? y ? x where x,y free'
, '((y =:= x & C =:= y) &> y) ? y ? x where x,y free'
, '((y =:= A & x =:= y ) &> y) ? y ? x where x,y free'
, '((y =:= B & x =:= y ) &> y) ? y ? x where x,y free'
, '((y =:= C & x =:= y ) &> y) ? y ? x where x,y free'
, '((A =:= y & x =:= y ) &> y) ? y ? x where x,y free'
, '((B =:= y & x =:= y ) &> y) ? y ? x where x,y free'
, '((C =:= y & x =:= y ) &> y) ? y ? x where x,y free'
, '((y =:= A & y =:= x ) &> y) ? y ? x where x,y free'
, '((y =:= B & y =:= x ) &> y) ? y ? x where x,y free'
, '((y =:= C & y =:= x ) &> y) ? y ? x where x,y free'
, '((A =:= y & y =:= x ) &> y) ? y ? x where x,y free'
, '((B =:= y & y =:= x ) &> y) ? y ? x where x,y free'
, '((C =:= y & y =:= x ) &> y) ? y ? x where x,y free'
, 'y ? ((x =:= y & y =:= A) &> x) ? x where x,y free'
, 'y ? ((x =:= y & y =:= B) &> x) ? x where x,y free'
, 'y ? ((x =:= y & y =:= C) &> x) ? x where x,y free'
, 'y ? ((x =:= y & A =:= y) &> x) ? x where x,y free'
, 'y ? ((x =:= y & B =:= y) &> x) ? x where x,y free'
, 'y ? ((x =:= y & C =:= y) &> x) ? x where x,y free'
, 'y ? ((y =:= x & y =:= A) &> x) ? x where x,y free'
, 'y ? ((y =:= x & y =:= B) &> x) ? x where x,y free'
, 'y ? ((y =:= x & y =:= C) &> x) ? x where x,y free'
, 'y ? ((y =:= x & A =:= y) &> x) ? x where x,y free'
, 'y ? ((y =:= x & B =:= y) &> x) ? x where x,y free'
, 'y ? ((y =:= x & C =:= y) &> x) ? x where x,y free'
, 'y ? ((y =:= A & x =:= y ) &> x) ? x where x,y free'
, 'y ? ((y =:= B & x =:= y ) &> x) ? x where x,y free'
, 'y ? ((y =:= C & x =:= y ) &> x) ? x where x,y free'
, 'y ? ((A =:= y & x =:= y ) &> x) ? x where x,y free'
, 'y ? ((B =:= y & x =:= y ) &> x) ? x where x,y free'
, 'y ? ((C =:= y & x =:= y ) &> x) ? x where x,y free'
, 'y ? ((y =:= A & y =:= x ) &> x) ? x where x,y free'
, 'y ? ((y =:= B & y =:= x ) &> x) ? x where x,y free'
, 'y ? ((y =:= C & y =:= x ) &> x) ? x where x,y free'
, 'y ? ((A =:= y & y =:= x ) &> x) ? x where x,y free'
, 'y ? ((B =:= y & y =:= x ) &> x) ? x where x,y free'
, 'y ? ((C =:= y & y =:= x ) &> x) ? x where x,y free'
, 'y ? ((x =:= y & y =:= A) &> y) ? x where x,y free'
, 'y ? ((x =:= y & y =:= B) &> y) ? x where x,y free'
, 'y ? ((x =:= y & y =:= C) &> y) ? x where x,y free'
, 'y ? ((x =:= y & A =:= y) &> y) ? x where x,y free'
, 'y ? ((x =:= y & B =:= y) &> y) ? x where x,y free'
, 'y ? ((x =:= y & C =:= y) &> y) ? x where x,y free'
, 'y ? ((y =:= x & y =:= A) &> y) ? x where x,y free'
, 'y ? ((y =:= x & y =:= B) &> y) ? x where x,y free'
, 'y ? ((y =:= x & y =:= C) &> y) ? x where x,y free'
, 'y ? ((y =:= x & A =:= y) &> y) ? x where x,y free'
, 'y ? ((y =:= x & B =:= y) &> y) ? x where x,y free'
, 'y ? ((y =:= x & C =:= y) &> y) ? x where x,y free'
, 'y ? ((y =:= A & x =:= y ) &> y) ? x where x,y free'
, 'y ? ((y =:= B & x =:= y ) &> y) ? x where x,y free'
, 'y ? ((y =:= C & x =:= y ) &> y) ? x where x,y free'
, 'y ? ((A =:= y & x =:= y ) &> y) ? x where x,y free'
, 'y ? ((B =:= y & x =:= y ) &> y) ? x where x,y free'
, 'y ? ((C =:= y & x =:= y ) &> y) ? x where x,y free'
, 'y ? ((y =:= A & y =:= x ) &> y) ? x where x,y free'
, 'y ? ((y =:= B & y =:= x ) &> y) ? x where x,y free'
, 'y ? ((y =:= C & y =:= x ) &> y) ? x where x,y free'
, 'y ? ((A =:= y & y =:= x ) &> y) ? x where x,y free'
, 'y ? ((B =:= y & y =:= x ) &> y) ? x where x,y free'
, 'y ? ((C =:= y & y =:= x ) &> y) ? x where x,y free'
, 'x ? ((x =:= y & y =:= A) &> x) ? y where x,y free'
, 'x ? ((x =:= y & y =:= B) &> x) ? y where x,y free'
, 'x ? ((x =:= y & y =:= C) &> x) ? y where x,y free'
, 'x ? ((x =:= y & A =:= y) &> x) ? y where x,y free'
, 'x ? ((x =:= y & B =:= y) &> x) ? y where x,y free'
, 'x ? ((x =:= y & C =:= y) &> x) ? y where x,y free'
, 'x ? ((y =:= x & y =:= A) &> x) ? y where x,y free'
, 'x ? ((y =:= x & y =:= B) &> x) ? y where x,y free'
, 'x ? ((y =:= x & y =:= C) &> x) ? y where x,y free'
, 'x ? ((y =:= x & A =:= y) &> x) ? y where x,y free'
, 'x ? ((y =:= x & B =:= y) &> x) ? y where x,y free'
, 'x ? ((y =:= x & C =:= y) &> x) ? y where x,y free'
, 'x ? ((y =:= A & x =:= y ) &> x) ? y where x,y free'
, 'x ? ((y =:= B & x =:= y ) &> x) ? y where x,y free'
, 'x ? ((y =:= C & x =:= y ) &> x) ? y where x,y free'
, 'x ? ((A =:= y & x =:= y ) &> x) ? y where x,y free'
, 'x ? ((B =:= y & x =:= y ) &> x) ? y where x,y free'
, 'x ? ((C =:= y & x =:= y ) &> x) ? y where x,y free'
, 'x ? ((y =:= A & y =:= x ) &> x) ? y where x,y free'
, 'x ? ((y =:= B & y =:= x ) &> x) ? y where x,y free'
, 'x ? ((y =:= C & y =:= x ) &> x) ? y where x,y free'
, 'x ? ((A =:= y & y =:= x ) &> x) ? y where x,y free'
, 'x ? ((B =:= y & y =:= x ) &> x) ? y where x,y free'
, 'x ? ((C =:= y & y =:= x ) &> x) ? y where x,y free'
, 'x ? ((x =:= y & y =:= A) &> y) ? y where x,y free'
, 'x ? ((x =:= y & y =:= B) &> y) ? y where x,y free'
, 'x ? ((x =:= y & y =:= C) &> y) ? y where x,y free'
, 'x ? ((x =:= y & A =:= y) &> y) ? y where x,y free'
, 'x ? ((x =:= y & B =:= y) &> y) ? y where x,y free'
, 'x ? ((x =:= y & C =:= y) &> y) ? y where x,y free'
, 'x ? ((y =:= x & y =:= A) &> y) ? y where x,y free'
, 'x ? ((y =:= x & y =:= B) &> y) ? y where x,y free'
, 'x ? ((y =:= x & y =:= C) &> y) ? y where x,y free'
, 'x ? ((y =:= x & A =:= y) &> y) ? y where x,y free'
, 'x ? ((y =:= x & B =:= y) &> y) ? y where x,y free'
, 'x ? ((y =:= x & C =:= y) &> y) ? y where x,y free'
, 'x ? ((y =:= A & x =:= y ) &> y) ? y where x,y free'
, 'x ? ((y =:= B & x =:= y ) &> y) ? y where x,y free'
, 'x ? ((y =:= C & x =:= y ) &> y) ? y where x,y free'
, 'x ? ((A =:= y & x =:= y ) &> y) ? y where x,y free'
, 'x ? ((B =:= y & x =:= y ) &> y) ? y where x,y free'
, 'x ? ((C =:= y & x =:= y ) &> y) ? y where x,y free'
, 'x ? ((y =:= A & y =:= x ) &> y) ? y where x,y free'
, 'x ? ((y =:= B & y =:= x ) &> y) ? y where x,y free'
, 'x ? ((y =:= C & y =:= x ) &> y) ? y where x,y free'
, 'x ? ((A =:= y & y =:= x ) &> y) ? y where x,y free'
, 'x ? ((B =:= y & y =:= x ) &> y) ? y where x,y free'
, 'x ? ((C =:= y & y =:= x ) &> y) ? y where x,y free'
# , 'x ? y ? ((x =:= y & y =:= A) &> x) where x,y free'
# , 'x ? y ? ((x =:= y & y =:= B) &> x) where x,y free'
# , 'x ? y ? ((x =:= y & y =:= C) &> x) where x,y free'
# , 'x ? y ? ((x =:= y & A =:= y) &> x) where x,y free'
# , 'x ? y ? ((x =:= y & B =:= y) &> x) where x,y free'
# , 'x ? y ? ((x =:= y & C =:= y) &> x) where x,y free'
# , 'x ? y ? ((y =:= x & y =:= A) &> x) where x,y free'
# , 'x ? y ? ((y =:= x & y =:= B) &> x) where x,y free'
# , 'x ? y ? ((y =:= x & y =:= C) &> x) where x,y free'
# , 'x ? y ? ((y =:= x & A =:= y) &> x) where x,y free'
# , 'x ? y ? ((y =:= x & B =:= y) &> x) where x,y free'
# , 'x ? y ? ((y =:= x & C =:= y) &> x) where x,y free'
# , 'x ? y ? ((y =:= A & x =:= y ) &> x) where x,y free'
# , 'x ? y ? ((y =:= B & x =:= y ) &> x) where x,y free'
# , 'x ? y ? ((y =:= C & x =:= y ) &> x) where x,y free'
# , 'x ? y ? ((A =:= y & x =:= y ) &> x) where x,y free'
# , 'x ? y ? ((B =:= y & x =:= y ) &> x) where x,y free'
# , 'x ? y ? ((C =:= y & x =:= y ) &> x) where x,y free'
# , 'x ? y ? ((y =:= A & y =:= x ) &> x) where x,y free'
# , 'x ? y ? ((y =:= B & y =:= x ) &> x) where x,y free'
# , 'x ? y ? ((y =:= C & y =:= x ) &> x) where x,y free'
# , 'x ? y ? ((A =:= y & y =:= x ) &> x) where x,y free'
# , 'x ? y ? ((B =:= y & y =:= x ) &> x) where x,y free'
# , 'x ? y ? ((C =:= y & y =:= x ) &> x) where x,y free'
# , 'x ? y ? ((x =:= y & y =:= A) &> y) where x,y free'
# , 'x ? y ? ((x =:= y & y =:= B) &> y) where x,y free'
# , 'x ? y ? ((x =:= y & y =:= C) &> y) where x,y free'
# , 'x ? y ? ((x =:= y & A =:= y) &> y) where x,y free'
# , 'x ? y ? ((x =:= y & B =:= y) &> y) where x,y free'
# , 'x ? y ? ((x =:= y & C =:= y) &> y) where x,y free'
# , 'x ? y ? ((y =:= x & y =:= A) &> y) where x,y free'
# , 'x ? y ? ((y =:= x & y =:= B) &> y) where x,y free'
# , 'x ? y ? ((y =:= x & y =:= C) &> y) where x,y free'
# , 'x ? y ? ((y =:= x & A =:= y) &> y) where x,y free'
# , 'x ? y ? ((y =:= x & B =:= y) &> y) where x,y free'
# , 'x ? y ? ((y =:= x & C =:= y) &> y) where x,y free'
# , 'x ? y ? ((y =:= A & x =:= y ) &> y) where x,y free'
# , 'x ? y ? ((y =:= B & x =:= y ) &> y) where x,y free'
# , 'x ? y ? ((y =:= C & x =:= y ) &> y) where x,y free'
# , 'x ? y ? ((A =:= y & x =:= y ) &> y) where x,y free'
# , 'x ? y ? ((B =:= y & x =:= y ) &> y) where x,y free'
# , 'x ? y ? ((C =:= y & x =:= y ) &> y) where x,y free'
# , 'x ? y ? ((y =:= A & y =:= x ) &> y) where x,y free'
# , 'x ? y ? ((y =:= B & y =:= x ) &> y) where x,y free'
# , 'x ? y ? ((y =:= C & y =:= x ) &> y) where x,y free'
# , 'x ? y ? ((A =:= y & y =:= x ) &> y) where x,y free'
# , 'x ? y ? ((B =:= y & y =:= x ) &> y) where x,y free'
# , 'x ? y ? ((C =:= y & y =:= x ) &> y) where x,y free'
# , 'y ? x ? ((x =:= y & y =:= A) &> x) where x,y free'
# , 'y ? x ? ((x =:= y & y =:= B) &> x) where x,y free'
# , 'y ? x ? ((x =:= y & y =:= C) &> x) where x,y free'
# , 'y ? x ? ((x =:= y & A =:= y) &> x) where x,y free'
# , 'y ? x ? ((x =:= y & B =:= y) &> x) where x,y free'
# , 'y ? x ? ((x =:= y & C =:= y) &> x) where x,y free'
# , 'y ? x ? ((y =:= x & y =:= A) &> x) where x,y free'
# , 'y ? x ? ((y =:= x & y =:= B) &> x) where x,y free'
# , 'y ? x ? ((y =:= x & y =:= C) &> x) where x,y free'
# , 'y ? x ? ((y =:= x & A =:= y) &> x) where x,y free'
# , 'y ? x ? ((y =:= x & B =:= y) &> x) where x,y free'
# , 'y ? x ? ((y =:= x & C =:= y) &> x) where x,y free'
# , 'y ? x ? ((y =:= A & x =:= y ) &> x) where x,y free'
# , 'y ? x ? ((y =:= B & x =:= y ) &> x) where x,y free'
# , 'y ? x ? ((y =:= C & x =:= y ) &> x) where x,y free'
# , 'y ? x ? ((A =:= y & x =:= y ) &> x) where x,y free'
# , 'y ? x ? ((B =:= y & x =:= y ) &> x) where x,y free'
# , 'y ? x ? ((C =:= y & x =:= y ) &> x) where x,y free'
# , 'y ? x ? ((y =:= A & y =:= x ) &> x) where x,y free'
# , 'y ? x ? ((y =:= B & y =:= x ) &> x) where x,y free'
# , 'y ? x ? ((y =:= C & y =:= x ) &> x) where x,y free'
# , 'y ? x ? ((A =:= y & y =:= x ) &> x) where x,y free'
# , 'y ? x ? ((B =:= y & y =:= x ) &> x) where x,y free'
# , 'y ? x ? ((C =:= y & y =:= x ) &> x) where x,y free'
# , 'y ? x ? ((x =:= y & y =:= A) &> y) where x,y free'
# , 'y ? x ? ((x =:= y & y =:= B) &> y) where x,y free'
# , 'y ? x ? ((x =:= y & y =:= C) &> y) where x,y free'
# , 'y ? x ? ((x =:= y & A =:= y) &> y) where x,y free'
# , 'y ? x ? ((x =:= y & B =:= y) &> y) where x,y free'
# , 'y ? x ? ((x =:= y & C =:= y) &> y) where x,y free'
# , 'y ? x ? ((y =:= x & y =:= A) &> y) where x,y free'
# , 'y ? x ? ((y =:= x & y =:= B) &> y) where x,y free'
# , 'y ? x ? ((y =:= x & y =:= C) &> y) where x,y free'
# , 'y ? x ? ((y =:= x & A =:= y) &> y) where x,y free'
# , 'y ? x ? ((y =:= x & B =:= y) &> y) where x,y free'
# , 'y ? x ? ((y =:= x & C =:= y) &> y) where x,y free'
# , 'y ? x ? ((y =:= A & x =:= y ) &> y) where x,y free'
# , 'y ? x ? ((y =:= B & x =:= y ) &> y) where x,y free'
# , 'y ? x ? ((y =:= C & x =:= y ) &> y) where x,y free'
# , 'y ? x ? ((A =:= y & x =:= y ) &> y) where x,y free'
# , 'y ? x ? ((B =:= y & x =:= y ) &> y) where x,y free'
# , 'y ? x ? ((C =:= y & x =:= y ) &> y) where x,y free'
# , 'y ? x ? ((y =:= A & y =:= x ) &> y) where x,y free'
# , 'y ? x ? ((y =:= B & y =:= x ) &> y) where x,y free'
# , 'y ? x ? ((y =:= C & y =:= x ) &> y) where x,y free'
# , 'y ? x ? ((A =:= y & y =:= x ) &> y) where x,y free'
# , 'y ? x ? ((B =:= y & y =:= x ) &> y) where x,y free'
# , 'y ? x ? ((C =:= y & y =:= x ) &> y) where x,y free'
, '((x =:= y & x =:= A & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= A & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= A & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & x =:= B & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= B & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= B & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & x =:= C & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= C & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= C & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & x =:= A & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= A & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= A & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & x =:= B & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= B & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= B & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & x =:= C & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= C & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & x =:= C & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & A =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & A =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & B =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & C =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & A =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & A =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & B =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & C =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & x =:= A & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= A & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= A & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & x =:= B & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= B & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= B & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & x =:= C & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= C & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= C & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & x =:= A & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= A & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= A & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & x =:= B & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= B & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= B & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & x =:= C & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= C & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & x =:= C & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & A =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & A =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & B =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & C =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & A =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & A =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & B =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & C =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & y =:= A & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= B & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= C & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & y =:= A & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= B & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= C & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & y =:= A & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= B & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= C & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & y =:= A & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= B & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= C & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & y =:= A & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= B & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= C & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & y =:= A & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= B & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & y =:= C & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= y & A =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & B =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= y & C =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & y =:= A & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= B & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= C & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & y =:= A & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= B & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= C & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & y =:= A & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= B & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= C & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & y =:= A & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= B & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= C & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & y =:= A & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= B & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= C & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & y =:= A & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= B & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & y =:= C & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= x & A =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & B =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= x & C =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= y & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= y & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= y & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= y & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= y & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= y & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & y =:= x & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & y =:= x & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & y =:= x & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & y =:= x & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & y =:= x & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & y =:= x & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & y =:= x & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & y =:= x & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & y =:= x & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & y =:= x & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & y =:= x & x =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & y =:= x & x =:= A) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & y =:= x & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & y =:= x & x =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & y =:= x & x =:= B) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & y =:= x & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & y =:= x & x =:= C) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & y =:= x & x =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & y =:= x & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & y =:= x & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & y =:= x & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & y =:= x & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & y =:= x & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & y =:= x & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & y =:= x & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & y =:= x & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & y =:= x & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & y =:= x & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & y =:= x & A =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & y =:= x & A =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & y =:= x & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & y =:= x & B =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & y =:= x & B =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & y =:= x & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & y =:= x & C =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & y =:= x & C =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & x =:= y & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & x =:= y & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & x =:= y & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & x =:= y & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & x =:= y & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & x =:= y & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & x =:= y & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & x =:= y & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & x =:= y & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & x =:= y & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & x =:= y & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & x =:= y & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & x =:= y & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & x =:= y & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & x =:= y & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & x =:= y & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & x =:= y & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & x =:= y & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & x =:= y & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & x =:= y & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & x =:= y & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & x =:= y & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & x =:= y & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & x =:= y & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & x =:= y & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & x =:= y & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & x =:= y & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & x =:= y & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & x =:= y & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & x =:= y & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & x =:= y & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & x =:= y & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & x =:= y & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & x =:= y & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & x =:= y & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & x =:= y & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & y =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & y =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & y =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & y =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & y =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & y =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & y =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & y =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & y =:= x & y =:= A) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= x & y =:= B) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= x & y =:= C) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & y =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & y =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & y =:= x & A =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= x & B =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= x & C =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & A =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & A =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & A =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & B =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & B =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & B =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & C =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & C =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & C =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & A =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & A =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & A =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & B =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & B =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & B =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & C =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & C =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & C =:= x & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & x =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & x =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & x =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & x =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & x =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & x =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & A =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & A =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & A =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & B =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & B =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & B =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((y =:= A & C =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= B & C =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((y =:= C & C =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & A =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & A =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & A =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & B =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & B =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & B =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= y & C =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= y & C =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= y & C =:= x & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & y =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & y =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & y =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & A =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & B =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & C =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & A =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & B =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & C =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & A =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & B =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & C =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & y =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & y =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & y =:= A & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= B & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= C & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & A =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & B =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & C =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & A =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & B =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & C =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & A =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & B =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & C =:= y & x =:= y) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & y =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & y =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & y =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & y =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & y =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & y =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= A & A =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & B =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= A & C =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= B & A =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & B =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= B & C =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((x =:= C & A =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & B =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((x =:= C & C =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & y =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & y =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & y =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & y =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & y =:= A & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= B & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & y =:= C & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((A =:= x & A =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & B =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((A =:= x & C =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((B =:= x & A =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & B =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((B =:= x & C =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
, '((C =:= x & A =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & B =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
# , '((C =:= x & C =:= y & y =:= x) &> (x, y)) ? (x, y) where x,y free'
]
# Programs requiring: data T = A T T | B T | C.
A2B1C0_PROGRAMS = [
'(x =:= A x1 x2) &> x where x,x1,x2 free'
, '(x =:= B x1 ) &> x where x,x1 free'
, '(x =:= C ) &> x where x free'
, '(x =:= A (A x11 x12) x2) &> x where x,x11,x12,x2 free'
, '(x =:= A (A x11 x12) (A x21 x22)) &> x where x,x11,x12,x21,x22 free'
, '(x =:= A (A x11 x12) (B x21 )) &> x where x,x11,x12,x21 free'
, '(x =:= A (A x11 x12) C ) &> x where x,x11,x12 free'
, '(x =:= A (B x11) x2) &> x where x,x11,x2 free'
, '(x =:= A (B x11 ) (A x21 x22)) &> x where x,x11,x21,x22 free'
, '(x =:= A (B x11 ) (B x21 )) &> x where x,x11,x21 free'
, '(x =:= A (B x11 ) C ) &> x where x,x11 free'
, '(x =:= A C            x2) &> x where x,x2 free'
, '(x =:= A C (A x21 x22)) &> x where x,x21,x22 free'
, '(x =:= A C (B x21 )) &> x where x,x21 free'
, '(x =:= A C C ) &> x where x free'
]
generate_test_programs([
# programtext fileprefix digits predef
# +------------------+-----------+-------+-----------------------------------
(PROGRAMS , 'prog' , 2 , '' )
, (INTEGER_PROGRAMS, 'iprog' , 2 , '' )
, (CHAR_PROGRAMS , 'cprog' , 2 , '' )
, (FLOAT_PROGRAMS , 'fprog' , 2 , '' )
, (A0_PROGRAMS , 'a0_' , 3 , 'data T = A\nmain = ' )
, (A0B0C0_PROGRAMS , 'a0b0c0_' , 3 , 'data T = A | B | C\nmain = ' )
, (A2B1C0_PROGRAMS , 'a2b1c0_' , 3 , 'data T = A T T | B T | C\nmain = ')
])
| 56.717325 | 94 | 0.322481 | 13,933 | 74,640 | 1.725974 | 0.010622 | 0.250166 | 0.306512 | 0.481204 | 0.93987 | 0.9249 | 0.910887 | 0.898328 | 0.882485 | 0.872297 | 0 | 0.004635 | 0.329327 | 74,640 | 1,315 | 95 | 56.760456 | 0.475759 | 0.355466 | 0 | 0.294529 | 1 | 0.572759 | 0.83513 | 0.00044 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.002328 | 0 | 0.002328 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 14 |
6dfe93c020bafb7b127a88e7386fd9b8a47ada15 | 3,062 | py | Python | configcascade/tests/test_settings.py | movermeyer/configcascade | c786436a7f63242b99b7b36990dc127a88de13cd | [
"MIT"
] | null | null | null | configcascade/tests/test_settings.py | movermeyer/configcascade | c786436a7f63242b99b7b36990dc127a88de13cd | [
"MIT"
] | null | null | null | configcascade/tests/test_settings.py | movermeyer/configcascade | c786436a7f63242b99b7b36990dc127a88de13cd | [
"MIT"
] | 1 | 2018-03-03T16:14:55.000Z | 2018-03-03T16:14:55.000Z | from unittest import TestCase
from configcascade import Settings, YamlFileLoader
import os
class SettingsTestCase(TestCase):
def test_with_merge(self):
config_file = os.path.realpath("%s/fixture/config/settings_test.yml" % os.path.dirname(os.path.realpath(__file__)))
file_loader = YamlFileLoader()
settings_loader = Settings(file_loader, ['routes', 'parameters', 'services'])
settings = settings_loader.compile(config_file)
expected_parameters = {'driver': {'surname': 'Carmona', 'complete_name': '{{ driver.name }} {{ driver.surname }}', 'name': 'Felix'}}
expected_routes = {'my_controller_with_service': {'path': '/my_controller/with/service', 'controller': 'felix.SomeController'}, 'hello': {'path': '/hello/{name}', 'controller': 'felix.HelloController'}}
expected_server = {'foo': 'bar'}
expected_server_adapter = None
expected_services = {'car': {'class': 'felix.services.Car', 'arguments': ['@driver']}, 'driver': {'class': 'felix.services.Driver', 'arguments': ['{{ driver.complete_name }}']}}
expected_error_handler = 'felix.handler.DefaultErrorHandler'
expected_debug = False
self.assertEqual(settings['parameters'], expected_parameters)
self.assertEqual(settings['routes'], expected_routes)
self.assertEqual(settings['server'], expected_server)
self.assertEqual(settings['server_adapter'], expected_server_adapter)
self.assertEqual(settings['services'], expected_services)
self.assertEqual(settings['error_handler'], expected_error_handler)
self.assertEqual(settings['debug'], expected_debug)

    def test_without_merge(self):
config_file = os.path.realpath("%s/fixture/config/settings_test.yml" % os.path.dirname(os.path.realpath(__file__)))
file_loader = YamlFileLoader()
settings_loader = Settings(file_loader)
settings = settings_loader.compile(config_file)
expected_parameters = {'driver': {'surname': 'Carmona', 'complete_name': '{{ driver.name }} {{ driver.surname }}', 'name': 'Felix'}}
expected_routes = {'my_controller_with_service': {'path': '/my_controller/with/service', 'controller': 'felix.SomeController'}, 'hello': {'path': '/hello/{name}', 'controller': 'felix.HelloController'}}
expected_server = {'foo': 'bar'}
expected_server_adapter = None
expected_services = {'car': {'class': 'felix.services.Car', 'arguments': ['@driver']}}
expected_error_handler = 'felix.handler.DefaultErrorHandler'
expected_debug = False
self.assertEqual(settings['parameters'], expected_parameters)
self.assertEqual(settings['routes'], expected_routes)
self.assertEqual(settings['server'], expected_server)
self.assertEqual(settings['server_adapter'], expected_server_adapter)
self.assertEqual(settings['services'], expected_services)
self.assertEqual(settings['error_handler'], expected_error_handler)
self.assertEqual(settings['debug'], expected_debug)
| 61.24 | 210 | 0.693991 | 312 | 3,062 | 6.564103 | 0.176282 | 0.102539 | 0.157227 | 0.044922 | 0.895508 | 0.895508 | 0.895508 | 0.895508 | 0.895508 | 0.895508 | 0 | 0 | 0.156107 | 3,062 | 49 | 211 | 62.489796 | 0.79257 | 0 | 0 | 0.761905 | 0 | 0 | 0.28968 | 0.099608 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.047619 | false | 0 | 0.071429 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
096722d65f87ff9dd9d9b8fc068fe95b69cf1d51 | 17,392 | py | Python | kingpin/tests/kazoo_utils/test_kazoo_utils.py | fakeNetflix/pinterest-repo-kingpin | baea08ae941a4e57edb9129658fe3e7d40e4d0c3 | [
"Apache-2.0"
] | 76 | 2016-01-27T21:16:53.000Z | 2021-09-23T02:23:49.000Z | kingpin/tests/kazoo_utils/test_kazoo_utils.py | fakeNetflix/pinterest-repo-kingpin | baea08ae941a4e57edb9129658fe3e7d40e4d0c3 | [
"Apache-2.0"
] | 2 | 2016-02-26T02:37:46.000Z | 2018-02-23T09:03:41.000Z | kingpin/tests/kazoo_utils/test_kazoo_utils.py | fakeNetflix/pinterest-repo-kingpin | baea08ae941a4e57edb9129658fe3e7d40e4d0c3 | [
"Apache-2.0"
] | 22 | 2016-01-27T21:16:58.000Z | 2020-12-24T11:26:01.000Z | #!/usr/bin/python
#
# Copyright 2016 Pinterest, Inc
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Test cases for ServerSet."""
import socket
import os
import tempfile
import gevent
from gevent.event import Event
import mock
from mock import Mock, patch
from nose.plugins.attrib import attr
import unittest
from kingpin.kazoo_utils import DataWatcher, KazooClientManager, ServerSet
from kingpin.kazoo_utils.file_watch import FileWatch
import testutil

ZK_HOSTS = ["datazk001:2181", "datazk002:2181"]


class KazooClientManagerSingletonTestCase(unittest.TestCase):
@patch("kazoo.client.KazooClient.__new__",
new=Mock(side_effect=testutil.get_mock_kazoo_client))
def test_get_zk_hosts_directly(self):
""" Test passing zk_hosts in directly.
"""
testutil.initialize_kazoo_client_manager(ZK_HOSTS)
kz_client_manager = KazooClientManager(ZK_HOSTS)
self.assertEqual(kz_client_manager.get_client().hosts, ",".join(ZK_HOSTS))
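The class name advertises singleton behavior: `KazooClientManager(ZK_HOSTS)` is expected to hand back one shared client per host list. That one-instance-per-key pattern (an illustration of the idea, not the real `KazooClientManager` code) can be sketched as:

```python
class SingletonPerKey(object):
    """One shared instance per key -- an illustration of the pattern,
    not the real KazooClientManager implementation."""
    _instances = {}

    def __new__(cls, key):
        if key not in cls._instances:
            inst = super(SingletonPerKey, cls).__new__(cls)
            inst.key = key
            cls._instances[key] = inst
        return cls._instances[key]

a = SingletonPerKey("hosts-a")
b = SingletonPerKey("hosts-a")   # same key -> the very same object
c = SingletonPerKey("hosts-b")   # different key -> a distinct instance
```

Constructing with the same key twice returns the identical object, which is what lets the test below detect client replacement with an `is` check.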


class DataWatcherTestCase(unittest.TestCase):
TEST_PATH = "/test_data_watcher"
DATA_0 = "foo"
DATA_1 = "bar"

    @mock.patch("kazoo.client.KazooClient.__new__",
new=mock.Mock(side_effect=testutil.get_mock_kazoo_client))
def test_data_watcher(self):
"""Test various scenarios for data watcher:
1. When data get changed, watcher callback should be invoked.
2. When the underlying zk client disconnects and then recovers,
the watcher callback should be invoked.
3. When the underlying zk client messes up beyond recovery,
the underlying client should be replaced, and once the new client
is in place, the watcher callback should be invoked again.
"""
data_stat = []
watcher_triggered = Event()
def data_watch(data, stat):
while data_stat:
data_stat.pop()
data_stat.append(data)
data_stat.append(stat)
watcher_triggered.set()
testutil.initialize_kazoo_client_manager(ZK_HOSTS)
client = KazooClientManager().get_client()
client.create(DataWatcherTestCase.TEST_PATH,
DataWatcherTestCase.DATA_0)
data_watcher = DataWatcher(DataWatcherTestCase.TEST_PATH,
ZK_HOSTS,
waiting_in_secs=0.01)
data_watcher.watch(data_watch).join()
watcher_triggered.wait(1)
# Now the data and version should be foo and 0.
self.assertEqual(data_stat[0], DataWatcherTestCase.DATA_0)
self.assertEqual(data_stat[1].version, 0)
watcher_triggered.clear()
client.set(DataWatcherTestCase.TEST_PATH, DataWatcherTestCase.DATA_1)
watcher_triggered.wait(1)
# Make sure that watch callback is triggered.
self.assertEqual(data_stat[0], DataWatcherTestCase.DATA_1)
self.assertEqual(data_stat[1].version, 1)
data_stat.pop()
data_stat.pop()
# Test recoverable failure
watcher_triggered.clear()
client.stop()
client.start()
        # Here the client will actually re-check the znode in the
        # background.
watcher_triggered.wait(1)
# Since nothing changed, no notification from the client.
self.assertFalse(data_stat)
# Test client change
client.stop()
watcher_triggered.clear()
        # Give the monitor greenlet a chance to detect failures.
gevent.sleep(1)
# Assert the client has been replaced with a new one.
self.assertFalse(KazooClientManager().get_client() is client)
watcher_triggered.wait(1)
# Make sure that watch callback is triggered when client is replaced.
self.assertEqual(data_stat[0], DataWatcherTestCase.DATA_1)
self.assertEqual(data_stat[1].version, 1)


class DataWatcherWithFileTestCase(unittest.TestCase):
    """Test the data watcher with a local file provided."""

    TEST_PATH = "/test_data_watcher"
    DATA_0 = "foo"
    DATA_1 = "bar"
    # Initialize a singleton file watch with a low polling wait time.
    FILE_WATCH = FileWatch(polling_wait_in_seconds=0.5)

    @mock.patch("kazoo.client.KazooClient.__new__",
                new=mock.Mock(side_effect=testutil.get_mock_kazoo_client))
    def test_data_watcher(self):
        """Test the data watcher with a local file:

        1. When the data gets changed, the watcher callback should be invoked.
        2. When the underlying zk client disconnects and then recovers,
           the watcher callback should be invoked.
        3. When the underlying zk client messes up beyond recovery,
           the underlying client should be replaced, and once the new client
           is in place, the watcher callback should be invoked again.

        Although not all of the code paths for the behaviors above are
        affected when a local file is being watched, we still want to test
        all the scenarios to make sure nothing breaks when a file is used.

        """
        data_stat = []
        watcher_triggered = Event()
        fd, tmp_file = tempfile.mkstemp()
        with open(tmp_file, 'w') as f:
            f.write(self.DATA_0)

        def data_watch(data, stat):
            while data_stat:
                data_stat.pop()
            data_stat.append(data)
            data_stat.append(stat)
            watcher_triggered.set()

        data_watcher = DataWatcher(DataWatcherWithFileTestCase.TEST_PATH,
                                   ZK_HOSTS,
                                   waiting_in_secs=0.01,
                                   file_path=tmp_file)
        data_watcher.watch(data_watch).join()
        watcher_triggered.wait(1)
        # Now the data and version should be foo and the mtime of the file.
        mtime = os.path.getmtime(tmp_file)
        self.assertEqual(data_stat[0], DataWatcherWithFileTestCase.DATA_0)
        self.assertEqual(data_stat[1].version, mtime)
        self.assertEqual(data_watcher.get_data()[0],
                         DataWatcherWithFileTestCase.DATA_0)
        self.assertEqual(data_watcher.get_data()[1].version, mtime)
        watcher_triggered.clear()
        gevent.sleep(1)
        with open(tmp_file, 'w') as f:
            f.write(self.DATA_1)
        watcher_triggered.wait(1)
        # Make sure that the watch callback is triggered.
        mtime = os.path.getmtime(tmp_file)
        self.assertEqual(data_stat[0], DataWatcherWithFileTestCase.DATA_1)
        self.assertEqual(data_stat[1].version, mtime)
        self.assertEqual(data_watcher.get_data()[0],
                         DataWatcherWithFileTestCase.DATA_1)
        self.assertEqual(data_watcher.get_data()[1].version, mtime)
        data_stat.pop()
        data_stat.pop()
        # Recoverable failures are not exercised here: a watcher backed by a
        # file path does not change any implementation or behavior in that
        # part, so we only clean up to leave no watches behind.
        watcher_triggered.clear()
        self.FILE_WATCH._clear_all_watches()
        os.remove(tmp_file)


class ServerSetTestCase(unittest.TestCase):
    """Test server set."""

    SERVER_SET_PATH = "/test_server_set"
    SERVER_SET_DESTROY_PATH = "/test_server_set_destroy"
    PORT_1 = 8080
    PORT_2 = 8081
    END_POINT_1 = ServerSet._create_endpoint(PORT_1, True)
    END_POINT_2 = ServerSet._create_endpoint(PORT_2, True)
    END_POINTS = [END_POINT_1, END_POINT_2]

    @mock.patch("kazoo.client.KazooClient.__new__",
                new=mock.Mock(side_effect=testutil.get_mock_kazoo_client))
    def test_server_set(self):
        """Test various failure scenarios in the server set implementation.

        1. When a new server joins the set, the watcher should be notified.
        2. When the underlying zk client disconnects and then recovers,
           the server set should be transparent to server set participants
           and watchers.
        3. When the underlying zk client messes up beyond recovery,
           the underlying client should be replaced, and this should be
           transparent to server set participants and watchers.

        """
        all_children = []
        watcher_triggered = Event()

        def server_set_watcher(children):
            while all_children:
                all_children.pop()
            for child in children:
                all_children.append(child)
            watcher_triggered.set()

        testutil.initialize_kazoo_client_manager(ZK_HOSTS)
        client = KazooClientManager().get_client()
        server_set = ServerSet(ServerSetTestCase.SERVER_SET_PATH,
                               ZK_HOSTS,
                               waiting_in_secs=0.01)
        server_set.join(ServerSetTestCase.PORT_1, use_ip=True).join()
        server_set.monitor(server_set_watcher).join()
        watcher_triggered.wait(1)
        # Now the server set should only contain end point 1.
        self.assertEqual(all_children, [ServerSetTestCase.END_POINT_1])
        watcher_triggered.clear()
        server_set.join(ServerSetTestCase.PORT_2, use_ip=True).join()
        watcher_triggered.wait(1)
        all_children.sort()
        # Now the server set should contain both end points 1 and 2.
        self.assertEqual(all_children, ServerSetTestCase.END_POINTS)
        # Test recoverable failure.
        client.stop()
        watcher_triggered.clear()
        client.start()
        watcher_triggered.wait(1)
        # The server set should remain the same when the client recovers.
        all_children.sort()
        self.assertEqual(all_children, ServerSetTestCase.END_POINTS)
        # Test client change.
        client.stop()
        watcher_triggered.clear()
        # Give the monitoring greenlet a chance to detect failures.
        gevent.sleep(1)
        # Assert the client has been replaced with a new one.
        self.assertFalse(KazooClientManager().get_client() is client)
        watcher_triggered.wait(1)
        # The server set should survive the underlying client being swapped out.
        all_children.sort()
        self.assertEqual(all_children, ServerSetTestCase.END_POINTS)

    @attr('destroy_serverset')
    @mock.patch("kazoo.client.KazooClient.__new__",
                new=mock.Mock(side_effect=testutil.get_mock_kazoo_client))
    def test_serverset_destroy(self):
        testutil.initialize_kazoo_client_manager(ZK_HOSTS)
        client = KazooClientManager().get_client()
        server_set = ServerSet(ServerSetTestCase.SERVER_SET_DESTROY_PATH,
                               ZK_HOSTS,
                               waiting_in_secs=0.01)
        server_set.join(ServerSetTestCase.PORT_1, use_ip=False)
        server_set.join(ServerSetTestCase.PORT_2, use_ip=False)
        # Give the server set join some time to do its magic.
        gevent.sleep(1)
        server_set._destroy(ServerSetTestCase.END_POINT_1)
        gevent.sleep(1)
        children = client.get_children(
            ServerSetTestCase.SERVER_SET_DESTROY_PATH)
        for child in children:
            self.assertFalse(child.endswith(ServerSetTestCase.END_POINT_1))


class ServerSetWithFileTestCase(unittest.TestCase):
    """Test server set with a local file."""

    SERVER_SET_PATH = "/test_server_set"
    SERVER_SET_DESTROY_PATH = "/test_server_set_destroy"
    PORT_1 = 8088
    PORT_2 = 8189
    END_POINT_1 = "%s:%d" % (socket.gethostname(), PORT_1)
    END_POINT_2 = "%s:%d" % (socket.gethostname(), PORT_2)
    END_POINTS = [END_POINT_1, END_POINT_2]
    # Initialize a singleton file watch with a low polling wait time.
    FILE_WATCH = FileWatch(polling_wait_in_seconds=0.5)
    FILE_WATCH._clear_all_watches()

    @mock.patch("kazoo.client.KazooClient.__new__",
                new=mock.Mock(side_effect=testutil.get_mock_kazoo_client))
    def test_server_set(self):
        """Test various failure scenarios in the server set implementation.

        1. When a new server joins the set, the watcher should be notified.
           In practice there is a daemon monitoring the server set change in
           zk and updating the local file.
        2. When the underlying zk client disconnects and then recovers,
           the server set should be transparent to server set participants
           and watchers.
        3. When the underlying zk client messes up beyond recovery,
           it should be transparent to server set participants and watchers.

        Although not all of the code paths for the behaviors above are
        affected when a local file is being watched, we still want to test
        all the scenarios to make sure nothing breaks when a file is used.

        NOTE: to simulate the behavior in practice, when a server joins or
        leaves, we assume that there is a daemon to make the corresponding
        change to the local file.

        """
        fd, tmp_file = tempfile.mkstemp()
        all_children = []
        watcher_triggered = Event()

        def server_set_watcher(children):
            while all_children:
                all_children.pop()
            for child in children:
                all_children.append(child)
            watcher_triggered.set()

        testutil.initialize_kazoo_client_manager(ZK_HOSTS)
        client = KazooClientManager().get_client()
        server_set = ServerSet(ServerSetWithFileTestCase.SERVER_SET_PATH,
                               ZK_HOSTS,
                               waiting_in_secs=0.01,
                               file_path=tmp_file)
        server_set.join(ServerSetWithFileTestCase.PORT_1, use_ip=False).join()
        # Update the local file manually here; in practice a daemon does it.
        with open(tmp_file, 'w') as f:
            f.write(ServerSetWithFileTestCase.END_POINT_1)
        gevent.sleep(1)
        server_set.monitor(server_set_watcher).join()
        watcher_triggered.wait(1)
        # Now the server set should only contain end point 1.
        self.assertEqual(all_children, [ServerSetWithFileTestCase.END_POINT_1])
        watcher_triggered.clear()
        server_set.join(ServerSetWithFileTestCase.PORT_2, use_ip=False).join()
        # Update the local file manually here; in practice a daemon does it.
        with open(tmp_file, 'w') as f:
            f.write(ServerSetWithFileTestCase.END_POINT_1 +
                    "\n" +
                    ServerSetWithFileTestCase.END_POINT_2)
        gevent.sleep(1)
        watcher_triggered.wait(1)
        all_children.sort()
        # Now the server set should contain both end points 1 and 2.
        self.assertEqual(all_children, ServerSetWithFileTestCase.END_POINTS)
        # Test recoverable failure.
        client.stop()
        watcher_triggered.clear()
        client.start()
        watcher_triggered.wait(1)
        # The server set should remain the same when the client recovers.
        all_children.sort()
        self.assertEqual(all_children, ServerSetWithFileTestCase.END_POINTS)
        # Test client change.
        client.stop()
        watcher_triggered.clear()
        # Give the monitoring greenlet a chance to detect failures.
        gevent.sleep(1)
        watcher_triggered.wait(1)
        # The server set should survive the underlying client being swapped out.
        all_children.sort()
        self.assertEqual(all_children, ServerSetWithFileTestCase.END_POINTS)
        self.FILE_WATCH._clear_all_watches()
        os.remove(tmp_file)

    @attr('destroy_serverset')
    @mock.patch("kazoo.client.KazooClient.__new__",
                new=mock.Mock(side_effect=testutil.get_mock_kazoo_client))
    def test_serverset_destroy(self):
        testutil.initialize_kazoo_client_manager(ZK_HOSTS)
        client = KazooClientManager().get_client()
        client.start()
        fd, tmp_file = tempfile.mkstemp()
        server_set = ServerSet(
            ServerSetWithFileTestCase.SERVER_SET_DESTROY_PATH,
            ZK_HOSTS,
            waiting_in_secs=0.01)
        server_set.join(ServerSetWithFileTestCase.PORT_1, use_ip=False)
        server_set.join(ServerSetWithFileTestCase.PORT_2, use_ip=False)
        # Update the local file manually here; in practice a daemon does it.
        with open(tmp_file, 'w') as f:
            f.write(ServerSetWithFileTestCase.END_POINT_1 +
                    "\n" +
                    ServerSetWithFileTestCase.END_POINT_2)
        # Give the server set join some time to do its magic.
        gevent.sleep(1)
        server_set._destroy(ServerSetWithFileTestCase.END_POINT_1)
        # Update the local file manually here; in practice a daemon does it.
        with open(tmp_file, 'w') as f:
            f.write(ServerSetWithFileTestCase.END_POINT_2)
        gevent.sleep(1)
        children = client.get_children(
            ServerSetWithFileTestCase.SERVER_SET_DESTROY_PATH)
        for child in children:
            self.assertFalse(
                child.endswith(ServerSetWithFileTestCase.END_POINT_1))
        self.FILE_WATCH._clear_all_watches()
        os.remove(tmp_file)
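The callback-plus-Event synchronization pattern used throughout these tests can be shown in isolation. The sketch below is a standalone illustration using the standard library's `threading.Event` rather than gevent's event type (an assumption made here for portability), with a plain `0` standing in for a real ZooKeeper stat object.

```python
from threading import Event

data_stat = []
watcher_triggered = Event()

def data_watch(data, stat):
    # Replace any previous (data, stat) pair, then signal the waiting test,
    # mirroring the data_watch callbacks defined in the test cases above.
    del data_stat[:]
    data_stat.extend([data, stat])
    watcher_triggered.set()

data_watch("foo", 0)  # simulate the watcher firing
assert watcher_triggered.wait(1)
assert data_stat == ["foo", 0]
```

The `wait(1)` call bounds how long a test blocks when the callback never fires, which is why the tests above always pair it with an explicit assertion on `data_stat`.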

# File: pydsa/tests/test_mod_inverse.py (repo: mish24/pydsa)
from pydsa.mod_inverse import mod_inverse


def test_mod_inverse():
    assert mod_inverse(3, 13) == 9
    assert mod_inverse(30, 120000) == -1
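For context, here is a self-contained sketch of what a routine with this contract might compute (an assumption; the real pydsa implementation may differ, and the name `mod_inverse_sketch` is made up here). It runs the extended Euclidean algorithm and returns -1 when no inverse exists, matching the assertions above.

```python
def mod_inverse_sketch(a, m):
    # Invariant: old_r == old_s * a (mod m). When the remainder old_r
    # reaches gcd(a, m) == 1, old_s is the modular inverse of a mod m.
    old_r, r = a % m, m
    old_s, s = 1, 0
    while r:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
    return old_s % m if old_r == 1 else -1

assert mod_inverse_sketch(3, 13) == 9      # 3 * 9 == 27 == 2 * 13 + 1
assert mod_inverse_sketch(30, 120000) == -1  # gcd(30, 120000) != 1, no inverse
```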

# File: chap04/DiscriminantAnalysis.py (repo: ppham27/MLaPP-solutions)
import numpy as np
from scipy import stats


class QDA:
    def fit(self, X, y):
        assert len(X) == len(y)
        self.n = len(y)  # number of observations
        self.p = X.shape[1]  # number of features
        self.classes = np.unique(y)
        self.C = len(self.classes)
        self.theta = np.empty(self.C, dtype=np.float64)
        self.covariances = np.empty((self.C, self.p, self.p), dtype=np.float64)
        self.mu = np.empty((self.C, self.p), dtype=np.float64)
        for i in range(self.C):
            sub_X = X[y == self.classes[i]]
            self.theta[i] = len(sub_X)/len(y)
            self.covariances[i] = np.cov(sub_X, rowvar=False, bias=False)
            self.mu[i] = np.mean(sub_X, axis=0)
        return self.mu

    def predict(self, X):
        if X.ndim == 1:
            X = X.reshape((1, len(X)))
        predictions = np.empty(len(X), dtype=self.classes.dtype)
        for i in range(len(X)):
            class_numerator = np.empty(self.C, dtype=np.float64)
            for c in range(self.C):
                class_numerator[c] = self.theta[c]*stats.multivariate_normal.pdf(
                    x=X[i], mean=self.mu[c], cov=self.covariances[c])
            predictions[i] = self.classes[np.argmax(class_numerator)]
        return predictions

    def score(self, X, y):
        predictions = self.predict(X)
        return sum(predictions == y)/len(y)


class LDA:
    def fit(self, X, y):
        assert len(X) == len(y)
        self.n = len(y)  # number of observations
        self.p = X.shape[1]  # number of features
        self.classes = np.unique(y)
        self.C = len(self.classes)
        self.theta = np.empty(self.C, dtype=np.float64)
        self.covariance = np.zeros((self.p, self.p), dtype=np.float64)
        self.mu = np.empty((self.C, self.p), dtype=np.float64)
        for i in range(self.C):
            sub_X = X[y == self.classes[i]]
            self.theta[i] = len(sub_X)/len(y)
            self.mu[i] = np.mean(sub_X, axis=0)
            self.covariance += np.cov(sub_X, rowvar=False, ddof=len(sub_X) - 1)
        self.covariance /= self.n
        return self.mu

    def predict(self, X):
        if X.ndim == 1:
            X = X.reshape((1, len(X)))
        predictions = np.empty(len(X), dtype=self.classes.dtype)
        for i in range(len(X)):
            class_numerator = np.empty(self.C, dtype=np.float64)
            for c in range(self.C):
                class_numerator[c] = self.theta[c]*stats.multivariate_normal.pdf(
                    x=X[i], mean=self.mu[c], cov=self.covariance)
            predictions[i] = self.classes[np.argmax(class_numerator)]
        return predictions

    def score(self, X, y):
        predictions = self.predict(X)
        return sum(predictions == y)/len(y)
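The argmax over prior-weighted class densities that `predict` computes can be illustrated with a tiny one-dimensional, equal-variance example. This is a plain-stdlib sketch; the means, variance, and priors below are made up for illustration and the names `gauss_pdf`/`predict_point` are not part of the classes above.

```python
import math

def gauss_pdf(x, mean, var):
    # Univariate normal density: the 1-D analogue of the
    # stats.multivariate_normal.pdf call inside predict().
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def predict_point(x, means=(0.0, 3.0), var=1.0, priors=(0.5, 0.5)):
    # argmax over c of theta_c * p(x | c), the same decision rule as
    # QDA.predict / LDA.predict for two classes.
    scores = [p * gauss_pdf(x, m, var) for p, m in zip(priors, means)]
    return scores.index(max(scores))

assert predict_point(0.2) == 0  # closer to the class-0 mean at 0.0
assert predict_point(2.9) == 1  # closer to the class-1 mean at 3.0
```

With equal priors and variances the decision boundary falls at the midpoint of the two means (1.5 here), which is exactly the LDA behavior; QDA generalizes this by allowing each class its own covariance.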

# File: test_thumbnail_maker.py (repo: jgyy/python-concurrency)
"""
Thumbnail maker script; may change the source.
"""
from thumbnail_maker import ThumbnailMakerService

IMG_URLS = [
'https://i.kinja-img.com/gawker-media/image/upload/s--wjW8UU49--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/oebzojig8nnkgtxujstr.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--rk2aB5tu--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/dfeszj0zpugm1uuwbj19.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--_E6mQi9a--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/pyzg0xblsc69kmmtzzsr.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--0ISrJi5n--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/jkjhb3gv0qmieppvq0b2.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--3lZHtAvN--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/alscw3vhlcfzdkzfnz3e.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--4RWC_Q8y--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/juiqfgnxsnd8hztashdj.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--53FOkEHQ--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/rwsm1niblyxtdve1yuvq.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--7qH7m17g--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ofqe9ruxjn04komwar9o.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--8oh2GJ8---/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/d4mwn9uy4ednf1nyt44d.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--9rg3INuV--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/lbkhmc8mvaz7ve0w2j5q.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--BNU4s31Q--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/r2fxeggcuapki9ze86g5.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--E9JvR6zD--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ycikkm50timeaevel40l.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--FgqaCeME--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ajs0pcmeh4ollbnt5gxw.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--FHzgs724--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/thok0ygjqbbucnocsb0h.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--gXbd2e1V--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/jvp8ftnuptptzrkbmwwy.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--qXHAzcVS--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ant9bupe0imzmc7em6za.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--hVbrpwjH--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/k9jzpxlvcccaz4mzmwxt.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--Ie8pBtnS--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/zdux9lg2pierc8t6weay.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--jHAkq6mQ--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ccwlwicp2umrn1dnp2h1.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--L_vCdQkY--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/zwxmqlcov5otterd3837.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--lSlAC-ng--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/lsn8nvlmnevmslwolxgu.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--mV5iGurd--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ywooldudzdkem6blteiw.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--pSC41gMP--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/buulcxrypashtjuqkisc.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--ROdOrqxj--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/muotybwrl9sm4s2jgoig.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--RPPFz0Qr--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/vteo9oglifm6uykzi90a.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--TqpTY9t6--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/or8j0njxkotoo0sve2jy.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--C3g1bWMZ--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/rfn9ykjpcnaxuyfqwzf9.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--FACPTWC0--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/zeumriyb2xp29m2akvcx.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--DzuAg0e2--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/gzdeepkdu9xik9dbhsux.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--rsurpTHL--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/zmxjrzddhnrd2u9oogln.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--s0_a10yn--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/rnjkov3ygfnlouzbblsz.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--kfwsjd-S--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/zjwrlnjnt4joucs92av7.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--SX6NJkRp--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/bmy8w7gyegwa6l06ocn5.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--6VGkwAXX--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ynsxzkpy4nhzjhp6bo2z.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--V2d6coaQ--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/jy65mxlgcmau9xe6ahqj.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--ZL0yx1Ky--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/eolrqijtojpctifrruus.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--XXMEFrPv--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/u99smdxbfdznyjn4ftso.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--D5S1EO7C--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/utejdsgttgmwblw7zera.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--UFBa0kpw--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/di1ucunfwa9gxy5lu2mh.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--HWwzwQ1q--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/iofbudezbje6gnqddef1.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--HEK5Si0z--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/s4vo17kcic5nctl9qwk6.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--ho5zOxOS--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/onx5yksjnr8ulmchw1nw.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--sJJDHlcS--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/wsvwtlwjp62efu20dxbb.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--sZs0RU_X--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/ympzuvjshxjfoen6ax3h.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--vUgA6_-L--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/i1pzwzhvx4jj7mysu2bz.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--yR0DdAoE--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/m7vmbgzimblv4su9m2kb.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--XhHj4A_x--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/dee608xivehmprkchnhx.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--ZeHgo4MW--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/q6qmahfckjhawbtjbjkk.jpg',
'https://i.kinja-img.com/gawker-media/image/upload/s--ZelFOYrg--/c_scale,dpr_2.0,f_auto,fl_progressive,q_80,w_1600/vdx7igcimjfcqhe1yfwy.jpg'
]


def test_thumbnail_maker():
    """
    Run the thumbnail maker function.
    """
    tn_maker = ThumbnailMakerService()
    tn_maker.make_thumbnails(IMG_URLS)


if __name__ == '__main__':
    test_thumbnail_maker()

# File: tests/pytests/unit/states/test_kmod.py (repo: markgras/salt)
"""
return length * width * height
def get_volume_of_cuboid2(length, width, height):
return length * width * height
| 24 | 49 | 0.738095 | 21 | 168 | 5.761905 | 0.47619 | 0.363636 | 0.561983 | 0.380165 | 0.661157 | 0.661157 | 0.661157 | 0 | 0 | 0 | 0 | 0.007246 | 0.178571 | 168 | 6 | 50 | 28 | 0.869565 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
1142d570a734c5276fd49ef731bfc2d806537bdd | 257 | py | Python | multirunnable/concurrent/__init__.py | Chisanan232/pyocean | b5710660652ad4abe6845693e0576e99f9155084 | [
"Apache-2.0"
] | null | null | null | multirunnable/concurrent/__init__.py | Chisanan232/pyocean | b5710660652ad4abe6845693e0576e99f9155084 | [
"Apache-2.0"
] | null | null | null | multirunnable/concurrent/__init__.py | Chisanan232/pyocean | b5710660652ad4abe6845693e0576e99f9155084 | [
"Apache-2.0"
] | null | null | null | from multirunnable.concurrent.features import ThreadQueueType, ThreadLock, ThreadCommunication
from multirunnable.concurrent.strategy import ConcurrentStrategy, ThreadStrategy, ThreadPoolStrategy
from multirunnable.concurrent.result import ConcurrentResult
| 64.25 | 100 | 0.898833 | 22 | 257 | 10.5 | 0.636364 | 0.220779 | 0.350649 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062257 | 257 | 3 | 101 | 85.666667 | 0.958506 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
115d6d119723cea02ff9621d46069cbe0f206305 | 7,371 | py | Python | tests/pytests/unit/states/test_kmod.py | markgras/salt | d66cd3c935533c63870b83228b978ce43e0ef70d | [
"Apache-2.0"
] | 9,425 | 2015-01-01T05:59:24.000Z | 2022-03-31T20:44:05.000Z | tests/pytests/unit/states/test_kmod.py | markgras/salt | d66cd3c935533c63870b83228b978ce43e0ef70d | [
"Apache-2.0"
] | 33,507 | 2015-01-01T00:19:56.000Z | 2022-03-31T23:48:20.000Z | tests/pytests/unit/states/test_kmod.py | markgras/salt | d66cd3c935533c63870b83228b978ce43e0ef70d | [
"Apache-2.0"
] | 5,810 | 2015-01-01T19:11:45.000Z | 2022-03-31T02:37:20.000Z | """
:codeauthor: Jayesh Kariya <jayeshk@saltstack.com>
"""
import pytest
import salt.states.kmod as kmod
from tests.support.mock import MagicMock, patch
@pytest.fixture
def configure_loader_modules():
return {kmod: {}}
def test_present():
"""
Test to ensure that the specified kernel module is loaded.
"""
name = "cheese"
ret = {"name": name, "result": True, "comment": "", "changes": {}}
mock_mod_list = MagicMock(return_value=[name])
with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
comment = "Kernel module {} is already present".format(name)
ret.update({"comment": comment})
assert kmod.present(name) == ret
mock_mod_list = MagicMock(return_value=[])
with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
with patch.dict(kmod.__opts__, {"test": True}):
comment = "Kernel module {} is set to be loaded".format(name)
ret.update({"comment": comment, "result": None})
assert kmod.present(name) == ret
mock_mod_list = MagicMock(return_value=[])
mock_available = MagicMock(return_value=[name])
mock_load = MagicMock(return_value=[name])
with patch.dict(
kmod.__salt__,
{
"kmod.mod_list": mock_mod_list,
"kmod.available": mock_available,
"kmod.load": mock_load,
},
):
with patch.dict(kmod.__opts__, {"test": False}):
comment = "Loaded kernel module {}".format(name)
ret.update(
{"comment": comment, "result": True, "changes": {name: "loaded"}}
)
assert kmod.present(name) == ret


def test_present_multi():
    """
    Test to ensure that multiple kernel modules are loaded.
    """
    name = "salted kernel"
    mods = ["cheese", "crackers"]
    ret = {"name": name, "result": True, "changes": {}}

    mock_mod_list = MagicMock(return_value=mods)
    with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
        call_ret = kmod.present(name, mods=mods)

        # Check comment independently: makes test more stable on PY3
        comment = call_ret.pop("comment")
        assert "cheese" in comment
        assert "crackers" in comment
        assert "are already present" in comment

        # Assert against all other dictionary key/values
        assert ret == call_ret

    mock_mod_list = MagicMock(return_value=[])
    with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
        with patch.dict(kmod.__opts__, {"test": True}):
            call_ret = kmod.present(name, mods=mods)
            ret.update({"result": None})

            # Check comment independently: makes test more stable on PY3
            comment = call_ret.pop("comment")
            assert "cheese" in comment
            assert "crackers" in comment
            assert "are set to be loaded" in comment

            # Assert against all other dictionary key/values
            assert ret == call_ret

    mock_mod_list = MagicMock(return_value=[])
    mock_available = MagicMock(return_value=mods)
    mock_load = MagicMock(return_value=mods)
    with patch.dict(
        kmod.__salt__,
        {
            "kmod.mod_list": mock_mod_list,
            "kmod.available": mock_available,
            "kmod.load": mock_load,
        },
    ):
        with patch.dict(kmod.__opts__, {"test": False}):
            call_ret = kmod.present(name, mods=mods)
            ret.update(
                {"result": True, "changes": {mods[0]: "loaded", mods[1]: "loaded"}}
            )

            # Check comment independently: makes test more stable on PY3
            comment = call_ret.pop("comment")
            assert "cheese" in comment
            assert "crackers" in comment
            assert "Loaded kernel modules" in comment

            # Assert against all other dictionary key/values
            assert ret == call_ret


def test_absent():
    """
    Test to verify that the named kernel module is not loaded.
    """
    name = "cheese"
    ret = {"name": name, "result": True, "comment": "", "changes": {}}

    mock_mod_list = MagicMock(return_value=[name])
    with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
        with patch.dict(kmod.__opts__, {"test": True}):
            comment = "Kernel module {} is set to be removed".format(name)
            ret.update({"comment": comment, "result": None})
            assert kmod.absent(name) == ret

    mock_mod_list = MagicMock(return_value=[name])
    mock_remove = MagicMock(return_value=[name])
    with patch.dict(
        kmod.__salt__, {"kmod.mod_list": mock_mod_list, "kmod.remove": mock_remove}
    ):
        with patch.dict(kmod.__opts__, {"test": False}):
            comment = "Removed kernel module {}".format(name)
            ret.update(
                {"comment": comment, "result": True, "changes": {name: "removed"}}
            )
            assert kmod.absent(name) == ret

    mock_mod_list = MagicMock(return_value=[])
    with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
        with patch.dict(kmod.__opts__, {"test": True}):
            comment = "Kernel module {} is already removed".format(name)
            ret.update({"comment": comment, "result": True, "changes": {}})
            assert kmod.absent(name) == ret


def test_absent_multi():
    """
    Test to verify that multiple kernel modules are not loaded.
    """
    name = "salted kernel"
    mods = ["cheese", "crackers"]
    ret = {"name": name, "result": True, "changes": {}}

    mock_mod_list = MagicMock(return_value=mods)
    with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
        with patch.dict(kmod.__opts__, {"test": True}):
            ret.update({"result": None})
            call_ret = kmod.absent(name, mods=mods)

            # Check comment independently: makes test more stable on PY3
            comment = call_ret.pop("comment")
            assert "cheese" in comment
            assert "crackers" in comment
            assert "are set to be removed" in comment

            # Assert against all other dictionary key/values
            assert ret == call_ret

    mock_mod_list = MagicMock(return_value=mods)
    mock_remove = MagicMock(return_value=mods)
    with patch.dict(
        kmod.__salt__, {"kmod.mod_list": mock_mod_list, "kmod.remove": mock_remove}
    ):
        with patch.dict(kmod.__opts__, {"test": False}):
            call_ret = kmod.absent(name, mods=mods)
            ret.update(
                {"result": True, "changes": {mods[0]: "removed", mods[1]: "removed"}}
            )

            # Check comment independently: makes test more stable on PY3
            comment = call_ret.pop("comment")
            assert "cheese" in comment
            assert "crackers" in comment
            assert "Removed kernel modules" in comment

            # Assert against all other dictionary key/values
            assert ret == call_ret

    mock_mod_list = MagicMock(return_value=[])
    with patch.dict(kmod.__salt__, {"kmod.mod_list": mock_mod_list}):
        with patch.dict(kmod.__opts__, {"test": True}):
            comment = "Kernel modules {} are already removed".format(", ".join(mods))
            ret.update({"comment": comment, "result": True, "changes": {}})
            assert kmod.absent(name, mods=mods) == ret
# -*- coding: utf-8 -*-
"""
This module
"""
import attr
import typing

from ..core.model import (
    Property, Resource, Tag, GetAtt, TypeHint, TypeCheck,
)
from ..core.constant import AttrMeta

#--- Property declaration ---
@attr.s
class DetectorLabel(Property):
    """
    AWS Object Type = "AWS::FraudDetector::Detector.Label"

    Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html

    Property Document:

    - ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-arn
    - ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-createdtime
    - ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-description
    - ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-inline
    - ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-lastupdatedtime
    - ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-name
    - ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-tags
    """
    AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector.Label"

    p_Arn: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Arn"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-arn"""

    p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-createdtime"""

    p_Description: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Description"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-description"""

    p_Inline: bool = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(bool)),
        metadata={AttrMeta.PROPERTY_NAME: "Inline"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-inline"""

    p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-lastupdatedtime"""

    p_Name: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Name"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-name"""

    p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
        default=None,
        converter=Tag.from_list,
        validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
        metadata={AttrMeta.PROPERTY_NAME: "Tags"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-label.html#cfn-frauddetector-detector-label-tags"""
@attr.s
class DetectorEntityType(Property):
    """
    AWS Object Type = "AWS::FraudDetector::Detector.EntityType"

    Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html

    Property Document:

    - ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-arn
    - ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-createdtime
    - ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-description
    - ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-inline
    - ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-lastupdatedtime
    - ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-name
    - ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-tags
    """
    AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector.EntityType"

    p_Arn: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Arn"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-arn"""

    p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-createdtime"""

    p_Description: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Description"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-description"""

    p_Inline: bool = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(bool)),
        metadata={AttrMeta.PROPERTY_NAME: "Inline"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-inline"""

    p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-lastupdatedtime"""

    p_Name: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Name"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-name"""

    p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
        default=None,
        converter=Tag.from_list,
        validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
        metadata={AttrMeta.PROPERTY_NAME: "Tags"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-entitytype.html#cfn-frauddetector-detector-entitytype-tags"""
@attr.s
class DetectorModel(Property):
    """
    AWS Object Type = "AWS::FraudDetector::Detector.Model"

    Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-model.html

    Property Document:

    - ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-model.html#cfn-frauddetector-detector-model-arn
    """
    AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector.Model"

    p_Arn: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Arn"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-model.html#cfn-frauddetector-detector-model-arn"""
@attr.s
class DetectorOutcome(Property):
    """
    AWS Object Type = "AWS::FraudDetector::Detector.Outcome"

    Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html

    Property Document:

    - ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-arn
    - ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-createdtime
    - ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-description
    - ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-inline
    - ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-lastupdatedtime
    - ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-name
    - ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-tags
    """
    AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector.Outcome"

    p_Arn: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Arn"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-arn"""

    p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-createdtime"""

    p_Description: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Description"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-description"""

    p_Inline: bool = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(bool)),
        metadata={AttrMeta.PROPERTY_NAME: "Inline"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-inline"""

    p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-lastupdatedtime"""

    p_Name: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Name"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-name"""

    p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
        default=None,
        converter=Tag.from_list,
        validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
        metadata={AttrMeta.PROPERTY_NAME: "Tags"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-outcome.html#cfn-frauddetector-detector-outcome-tags"""
@attr.s
class EventTypeEntityType(Property):
    """
    AWS Object Type = "AWS::FraudDetector::EventType.EntityType"

    Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html

    Property Document:

    - ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-arn
    - ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-createdtime
    - ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-description
    - ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-inline
    - ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-lastupdatedtime
    - ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-name
    - ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-tags
    """
    AWS_OBJECT_TYPE = "AWS::FraudDetector::EventType.EntityType"

    p_Arn: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Arn"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-arn"""

    p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-createdtime"""

    p_Description: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Description"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-description"""

    p_Inline: bool = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(bool)),
        metadata={AttrMeta.PROPERTY_NAME: "Inline"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-inline"""

    p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-lastupdatedtime"""

    p_Name: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Name"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-name"""

    p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
        default=None,
        converter=Tag.from_list,
        validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
        metadata={AttrMeta.PROPERTY_NAME: "Tags"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-entitytype.html#cfn-frauddetector-eventtype-entitytype-tags"""
@attr.s
class DetectorEventVariable(Property):
    """
    AWS Object Type = "AWS::FraudDetector::Detector.EventVariable"

    Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html

    Property Document:

    - ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-arn
    - ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-createdtime
    - ``p_DataSource``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-datasource
    - ``p_DataType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-datatype
    - ``p_DefaultValue``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-defaultvalue
    - ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-description
    - ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-inline
    - ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-lastupdatedtime
    - ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-name
    - ``p_VariableType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-variabletype
    - ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-tags
    """
    AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector.EventVariable"

    p_Arn: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Arn"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-arn"""

    p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-createdtime"""

    p_DataSource: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "DataSource"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-datasource"""

    p_DataType: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "DataType"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-datatype"""

    p_DefaultValue: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "DefaultValue"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-defaultvalue"""

    p_Description: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Description"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-description"""

    p_Inline: bool = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(bool)),
        metadata={AttrMeta.PROPERTY_NAME: "Inline"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-inline"""

    p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-lastupdatedtime"""

    p_Name: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Name"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-name"""

    p_VariableType: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "VariableType"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-variabletype"""

    p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
        default=None,
        converter=Tag.from_list,
        validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
        metadata={AttrMeta.PROPERTY_NAME: "Tags"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventvariable.html#cfn-frauddetector-detector-eventvariable-tags"""
@attr.s
class EventTypeEventVariable(Property):
    """
    AWS Object Type = "AWS::FraudDetector::EventType.EventVariable"

    Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html

    Property Document:

    - ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-arn
    - ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-createdtime
    - ``p_DataSource``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-datasource
    - ``p_DataType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-datatype
    - ``p_DefaultValue``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-defaultvalue
    - ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-description
    - ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-inline
    - ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-lastupdatedtime
    - ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-name
    - ``p_VariableType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-variabletype
    - ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-tags
    """
    AWS_OBJECT_TYPE = "AWS::FraudDetector::EventType.EventVariable"

    p_Arn: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Arn"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-arn"""

    p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-createdtime"""

    p_DataSource: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "DataSource"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-datasource"""

    p_DataType: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "DataType"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-datatype"""

    p_DefaultValue: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "DefaultValue"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-defaultvalue"""

    p_Description: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Description"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-description"""

    p_Inline: bool = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(bool)),
        metadata={AttrMeta.PROPERTY_NAME: "Inline"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-inline"""

    p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-lastupdatedtime"""

    p_Name: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "Name"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-name"""

    p_VariableType: TypeHint.intrinsic_str = attr.ib(
        default=None,
        validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
        metadata={AttrMeta.PROPERTY_NAME: "VariableType"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-variabletype"""

    p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
        default=None,
        converter=Tag.from_list,
        validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
        metadata={AttrMeta.PROPERTY_NAME: "Tags"},
    )
    """Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-eventvariable.html#cfn-frauddetector-eventtype-eventvariable-tags"""
@attr.s
class EventTypeLabel(Property):
"""
AWS Object Type = "AWS::FraudDetector::EventType.Label"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html
Property Document:
- ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-arn
- ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-createdtime
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-description
- ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-inline
- ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-lastupdatedtime
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-name
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::EventType.Label"
p_Arn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Arn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-arn"""
p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-createdtime"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-description"""
p_Inline: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "Inline"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-inline"""
p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-lastupdatedtime"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-name"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-eventtype-label.html#cfn-frauddetector-eventtype-label-tags"""
@attr.s
class DetectorRule(Property):
"""
AWS Object Type = "AWS::FraudDetector::Detector.Rule"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html
Property Document:
- ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-arn
- ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-createdtime
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-description
- ``p_DetectorId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-detectorid
- ``p_Expression``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-expression
- ``p_Language``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-language
- ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-lastupdatedtime
- ``p_Outcomes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-outcomes
- ``p_RuleId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-ruleid
- ``p_RuleVersion``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-ruleversion
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector.Rule"
p_Arn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Arn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-arn"""
p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-createdtime"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-description"""
p_DetectorId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "DetectorId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-detectorid"""
p_Expression: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Expression"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-expression"""
p_Language: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Language"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-language"""
p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-lastupdatedtime"""
p_Outcomes: typing.List[typing.Union['DetectorOutcome', dict]] = attr.ib(
default=None,
converter=DetectorOutcome.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(DetectorOutcome), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Outcomes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-outcomes"""
p_RuleId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "RuleId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-ruleid"""
p_RuleVersion: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "RuleVersion"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-ruleversion"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-rule.html#cfn-frauddetector-detector-rule-tags"""
@attr.s
class DetectorEventType(Property):
"""
AWS Object Type = "AWS::FraudDetector::Detector.EventType"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html
Property Document:
- ``p_Arn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-arn
- ``p_CreatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-createdtime
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-description
- ``p_EntityTypes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-entitytypes
- ``p_EventVariables``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-eventvariables
- ``p_Inline``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-inline
- ``p_Labels``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-labels
- ``p_LastUpdatedTime``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-lastupdatedtime
- ``p_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-name
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector.EventType"
p_Arn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Arn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-arn"""
p_CreatedTime: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "CreatedTime"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-createdtime"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-description"""
p_EntityTypes: typing.List[typing.Union['DetectorEntityType', dict]] = attr.ib(
default=None,
converter=DetectorEntityType.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(DetectorEntityType), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "EntityTypes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-entitytypes"""
p_EventVariables: typing.List[typing.Union['DetectorEventVariable', dict]] = attr.ib(
default=None,
converter=DetectorEventVariable.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(DetectorEventVariable), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "EventVariables"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-eventvariables"""
p_Inline: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "Inline"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-inline"""
p_Labels: typing.List[typing.Union['DetectorLabel', dict]] = attr.ib(
default=None,
converter=DetectorLabel.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(DetectorLabel), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Labels"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-labels"""
p_LastUpdatedTime: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "LastUpdatedTime"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-lastupdatedtime"""
p_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-name"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-frauddetector-detector-eventtype.html#cfn-frauddetector-detector-eventtype-tags"""
# --- Resource declaration ---
@attr.s
class EntityType(Resource):
"""
AWS Object Type = "AWS::FraudDetector::EntityType"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html
Property Document:
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#cfn-frauddetector-entitytype-name
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#cfn-frauddetector-entitytype-description
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#cfn-frauddetector-entitytype-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::EntityType"
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#cfn-frauddetector-entitytype-name"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#cfn-frauddetector-entitytype-description"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#cfn-frauddetector-entitytype-tags"""
@property
def rv_Arn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#aws-resource-frauddetector-entitytype-return-values"""
return GetAtt(resource=self, attr_name="Arn")
@property
def rv_CreatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#aws-resource-frauddetector-entitytype-return-values"""
return GetAtt(resource=self, attr_name="CreatedTime")
@property
def rv_LastUpdatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-entitytype.html#aws-resource-frauddetector-entitytype-return-values"""
return GetAtt(resource=self, attr_name="LastUpdatedTime")
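The required-property pattern above (``rp_`` attributes declared with ``default=None`` but a non-optional ``instance_of`` validator) fails fast at construction time, because attrs runs validators on defaults too. A minimal, self-contained sketch of that behavior using plain ``attrs`` (``DemoResource`` is a hypothetical name, not part of this module):

```python
import attr

@attr.s
class DemoResource:
    # Required property (rp_ pattern): default=None plus a NON-optional
    # validator, so omitting the argument raises TypeError at __init__.
    rp_Name: str = attr.ib(
        default=None,
        validator=attr.validators.instance_of(str),
    )

# Omitting the required property triggers the validator on the None default.
try:
    DemoResource()
    missing_raised = False
except TypeError:
    missing_raised = True

# Supplying a string of the expected type succeeds.
res = DemoResource(rp_Name="my-entity-type")
```

This is why the generated classes can give every attribute a keyword default (keeping argument order flexible) while still enforcing required CloudFormation properties.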
@attr.s
class Outcome(Resource):
"""
AWS Object Type = "AWS::FraudDetector::Outcome"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html
Property Document:
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#cfn-frauddetector-outcome-name
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#cfn-frauddetector-outcome-description
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#cfn-frauddetector-outcome-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::Outcome"
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#cfn-frauddetector-outcome-name"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#cfn-frauddetector-outcome-description"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#cfn-frauddetector-outcome-tags"""
@property
def rv_Arn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#aws-resource-frauddetector-outcome-return-values"""
return GetAtt(resource=self, attr_name="Arn")
@property
def rv_CreatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#aws-resource-frauddetector-outcome-return-values"""
return GetAtt(resource=self, attr_name="CreatedTime")
@property
def rv_LastUpdatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-outcome.html#aws-resource-frauddetector-outcome-return-values"""
return GetAtt(resource=self, attr_name="LastUpdatedTime")
@attr.s
class EventType(Resource):
"""
AWS Object Type = "AWS::FraudDetector::EventType"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html
Property Document:
- ``rp_EntityTypes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-entitytypes
- ``rp_EventVariables``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-eventvariables
- ``rp_Labels``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-labels
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-name
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-description
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::EventType"
rp_EntityTypes: typing.List[typing.Union['EventTypeEntityType', dict]] = attr.ib(
default=None,
converter=EventTypeEntityType.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(EventTypeEntityType), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "EntityTypes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-entitytypes"""
rp_EventVariables: typing.List[typing.Union['EventTypeEventVariable', dict]] = attr.ib(
default=None,
converter=EventTypeEventVariable.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(EventTypeEventVariable), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "EventVariables"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-eventvariables"""
rp_Labels: typing.List[typing.Union['EventTypeLabel', dict]] = attr.ib(
default=None,
converter=EventTypeLabel.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(EventTypeLabel), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Labels"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-labels"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-name"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-description"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#cfn-frauddetector-eventtype-tags"""
@property
def rv_Arn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#aws-resource-frauddetector-eventtype-return-values"""
return GetAtt(resource=self, attr_name="Arn")
@property
def rv_CreatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#aws-resource-frauddetector-eventtype-return-values"""
return GetAtt(resource=self, attr_name="CreatedTime")
@property
def rv_LastUpdatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-eventtype.html#aws-resource-frauddetector-eventtype-return-values"""
return GetAtt(resource=self, attr_name="LastUpdatedTime")
@attr.s
class Detector(Resource):
"""
AWS Object Type = "AWS::FraudDetector::Detector"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html
Property Document:
- ``rp_DetectorId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-detectorid
- ``rp_EventType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-eventtype
- ``rp_Rules``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-rules
- ``p_AssociatedModels``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-associatedmodels
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-description
- ``p_DetectorVersionStatus``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-detectorversionstatus
- ``p_RuleExecutionMode``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-ruleexecutionmode
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::Detector"
rp_DetectorId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "DetectorId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-detectorid"""
rp_EventType: typing.Union['DetectorEventType', dict] = attr.ib(
default=None,
converter=DetectorEventType.from_dict,
validator=attr.validators.instance_of(DetectorEventType),
metadata={AttrMeta.PROPERTY_NAME: "EventType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-eventtype"""
rp_Rules: typing.List[typing.Union['DetectorRule', dict]] = attr.ib(
default=None,
converter=DetectorRule.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(DetectorRule), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Rules"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-rules"""
p_AssociatedModels: typing.List[typing.Union['DetectorModel', dict]] = attr.ib(
default=None,
converter=DetectorModel.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(DetectorModel), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "AssociatedModels"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-associatedmodels"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-description"""
p_DetectorVersionStatus: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "DetectorVersionStatus"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-detectorversionstatus"""
p_RuleExecutionMode: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "RuleExecutionMode"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-ruleexecutionmode"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#cfn-frauddetector-detector-tags"""
@property
def rv_DetectorVersionId(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#aws-resource-frauddetector-detector-return-values"""
return GetAtt(resource=self, attr_name="DetectorVersionId")
@property
def rv_EventTypeArn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#aws-resource-frauddetector-detector-return-values"""
return GetAtt(resource=self, attr_name="EventType.Arn")
@property
def rv_EventTypeCreatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#aws-resource-frauddetector-detector-return-values"""
return GetAtt(resource=self, attr_name="EventType.CreatedTime")
@property
def rv_EventTypeLastUpdatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#aws-resource-frauddetector-detector-return-values"""
return GetAtt(resource=self, attr_name="EventType.LastUpdatedTime")
@property
def rv_Arn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#aws-resource-frauddetector-detector-return-values"""
return GetAtt(resource=self, attr_name="Arn")
@property
def rv_CreatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#aws-resource-frauddetector-detector-return-values"""
return GetAtt(resource=self, attr_name="CreatedTime")
@property
def rv_LastUpdatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-detector.html#aws-resource-frauddetector-detector-return-values"""
return GetAtt(resource=self, attr_name="LastUpdatedTime")
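Every `rv_*` property above returns a `GetAtt` object that later renders as the CloudFormation `Fn::GetAtt` intrinsic. A minimal stdlib sketch of that pattern — `GetAtt` and the resource class here are simplified stand-ins, not this library's real implementations:

```python
# Minimal sketch of the GetAtt pattern used by the rv_* properties above.
# `GetAtt` and `Detector` here are simplified stand-ins for illustration only.
class GetAtt:
    def __init__(self, resource, attr_name):
        self.resource = resource
        self.attr_name = attr_name

    def serialize(self):
        # Renders as the CloudFormation Fn::GetAtt intrinsic function.
        return {"Fn::GetAtt": [self.resource.logic_id, self.attr_name]}


class Detector:
    def __init__(self, logic_id):
        self.logic_id = logic_id

    @property
    def rv_Arn(self):
        return GetAtt(resource=self, attr_name="Arn")


detector = Detector(logic_id="MyDetector")
print(detector.rv_Arn.serialize())
# {'Fn::GetAtt': ['MyDetector', 'Arn']}
```

The attribute access stays lazy: nothing is resolved at Python time, the intrinsic is only expanded by CloudFormation at deploy time.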
@attr.s
class Label(Resource):
"""
AWS Object Type = "AWS::FraudDetector::Label"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html
Property Document:
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#cfn-frauddetector-label-name
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#cfn-frauddetector-label-description
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#cfn-frauddetector-label-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::Label"
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#cfn-frauddetector-label-name"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#cfn-frauddetector-label-description"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#cfn-frauddetector-label-tags"""
@property
def rv_Arn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#aws-resource-frauddetector-label-return-values"""
return GetAtt(resource=self, attr_name="Arn")
@property
def rv_CreatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#aws-resource-frauddetector-label-return-values"""
return GetAtt(resource=self, attr_name="CreatedTime")
@property
def rv_LastUpdatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-label.html#aws-resource-frauddetector-label-return-values"""
return GetAtt(resource=self, attr_name="LastUpdatedTime")
@attr.s
class Variable(Resource):
"""
AWS Object Type = "AWS::FraudDetector::Variable"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html
Property Document:
- ``rp_DataSource``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-datasource
- ``rp_DataType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-datatype
- ``rp_DefaultValue``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-defaultvalue
- ``rp_Name``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-name
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-description
- ``p_VariableType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-variabletype
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-tags
"""
AWS_OBJECT_TYPE = "AWS::FraudDetector::Variable"
rp_DataSource: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "DataSource"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-datasource"""
rp_DataType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "DataType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-datatype"""
rp_DefaultValue: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "DefaultValue"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-defaultvalue"""
rp_Name: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Name"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-name"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-description"""
p_VariableType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "VariableType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-variabletype"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#cfn-frauddetector-variable-tags"""
@property
def rv_Arn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#aws-resource-frauddetector-variable-return-values"""
return GetAtt(resource=self, attr_name="Arn")
@property
def rv_CreatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#aws-resource-frauddetector-variable-return-values"""
return GetAtt(resource=self, attr_name="CreatedTime")
@property
def rv_LastUpdatedTime(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-frauddetector-variable.html#aws-resource-frauddetector-variable-return-values"""
return GetAtt(resource=self, attr_name="LastUpdatedTime")
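Each property in these classes follows the same declarative recipe: a `converter` normalizes plain dicts into typed objects and a `validator` type-checks the result. A rough stdlib-only sketch of the `from_dict`/`from_list` converter pattern — the `Tag` class below mimics the library's property objects and its names are illustrative:

```python
# Stdlib-only sketch of the converter pattern behind attr.ib(converter=Tag.from_list).
# This Tag mimics the library's Tag property object; names are illustrative.
class Tag:
    def __init__(self, key, value):
        self.p_Key = key
        self.p_Value = value

    @classmethod
    def from_dict(cls, data):
        # Converter: accept an already-typed Tag or a plain dict.
        if isinstance(data, cls):
            return data
        return cls(key=data["Key"], value=data["Value"])

    @classmethod
    def from_list(cls, data):
        # Converter for list-typed properties such as p_Tags.
        if data is None:
            return None
        return [cls.from_dict(item) for item in data]


tags = Tag.from_list([{"Key": "env", "Value": "prod"}, Tag("team", "fraud")])
print([t.p_Key for t in tags])
# ['env', 'team']
```

This is why the type hints above read `typing.List[typing.Union[Tag, dict]]`: callers may pass raw dicts, and the converter upgrades them before the `deep_iterable` validator runs.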
# --- tests/test_create_agents_bulk.py (craft-ai/craft-ai-client-python, BSD-3-Clause) ---
import unittest
from craft_ai import Client, errors as craft_err
from . import settings
from .utils import generate_entity_id
from .data import valid_data, invalid_data
NB_AGENTS_TO_CREATE = 5
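These tests import `generate_entity_id` from `.utils` to avoid ID collisions between runs; its actual implementation is not shown here. A plausible stdlib sketch of such a helper — the random-suffix scheme is an assumption:

```python
# Hypothetical sketch of the generate_entity_id helper imported from .utils.
# The real helper may differ; the 8-char uuid suffix here is an assumption.
import uuid


def generate_entity_id(prefix):
    # Append a short random suffix so repeated test runs never reuse an ID.
    return "{}_{}".format(prefix, uuid.uuid4().hex[:8])


agent_id = generate_entity_id("test_create_agents_bulk")
print(agent_id)  # e.g. 'test_create_agents_bulk_3f9a1c2d'
```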
class TestCreateAgentsBulkSuccess(unittest.TestCase):
"""Checks that the client succeeds when creating
one or more agents with valid input"""
@classmethod
def setUpClass(cls):
cls.client = Client(settings.CRAFT_CFG)
cls.agent_id1 = generate_entity_id("test_create_agents_bulk")
cls.agent_id2 = generate_entity_id("test_create_agents_bulk")
cls.agent_name = generate_entity_id("test_create_agents_bulk")
@classmethod
def tearDownClass(cls):
for agent_id in cls.client.list_agents():
try:
cls.client.delete_agent(agent_id)
except craft_err.CraftAiError:
continue
def setUp(self):
# Makes sure that no agent with the same ID already exists
resp1 = self.client.delete_agent(self.agent_id1)
resp2 = self.client.delete_agent(self.agent_id2)
self.assertIsInstance(resp1, dict)
self.assertIsInstance(resp2, dict)
def clean_up_agent(self, aid):
# Makes sure that no agent with the standard ID remains
try:
self.client.delete_agent(aid)
except craft_err.CraftAiError:
return
def clean_up_agents(self, aids):
# Makes sure that no agent with the standard ID remains
for aid in aids:
self.clean_up_agent(aid)
def test_create_one_agent_generated_agent_id(self):
"""create_agents_bulk should succeed when the `id` field is omitted.
It should give a proper JSON response with a list containing a dict with `id` and
`configuration` fields being strings.
"""
payload = [{"configuration": valid_data.VALID_CONFIGURATION}]
resp = self.client.create_agents_bulk(payload)
self.assertIsInstance(resp[0].get("id"), str)
self.addCleanup(self.clean_up_agent, resp[0].get("id"))
def test_create_multiple_agents_generated_agent_id(self):
"""create_agents_bulk should succeed when given agents to create with no `id` field.
It should give a proper JSON response with a list containing dicts with `id` and
`configuration` fields being strings.
"""
payload = [
{"configuration": valid_data.VALID_CONFIGURATION},
{"configuration": valid_data.VALID_CONFIGURATION},
]
resp = self.client.create_agents_bulk(payload)
self.assertIsInstance(resp[0].get("id"), str)
self.assertIsInstance(resp[1].get("id"), str)
self.addCleanup(self.clean_up_agents, [resp[0].get("id"), resp[1].get("id")])
def test_create_one_agent_given_agent_id(self):
"""create_agents_bulk should succeed when given a valid string in the `id` field.
It should give a proper JSON response with a list containing a dict with `id` and
`configuration` fields being strings and `id` being the same as the one given as
a parameter.
"""
payload = [
{"id": self.agent_id1, "configuration": valid_data.VALID_CONFIGURATION}
]
resp = self.client.create_agents_bulk(payload)
self.assertEqual(resp[0].get("id"), self.agent_id1)
self.addCleanup(self.clean_up_agent, self.agent_id1)
def test_create_multiple_agents_given_agent_id(self):
"""create_agents_bulk should succeed when given valid strings in the `id` field.
It should give a proper JSON response with a list containing dicts with `id` and
`configuration` fields being strings and the `id`s being the same as the ones given
as parameters.
"""
payload = [
{"id": self.agent_id1, "configuration": valid_data.VALID_CONFIGURATION},
{"id": self.agent_id2, "configuration": valid_data.VALID_CONFIGURATION},
]
resp = self.client.create_agents_bulk(payload)
self.assertEqual(resp[0].get("id"), self.agent_id1)
self.assertEqual(resp[1].get("id"), self.agent_id2)
self.addCleanup(self.clean_up_agents, [resp[0].get("id"), resp[1].get("id")])
def test_create_agents_bulk_id_given_and_generated(self):
"""create_agents_bulk should succeed when given some agents with string `id` and some
with no `id` field.
It should give a proper JSON response with a list containing dicts with `id` and
`configuration` fields being strings and if the `id` was given as a parameter, `id`
should be the same as the one given as a parameter.
"""
payload = [
{"id": self.agent_id1, "configuration": valid_data.VALID_CONFIGURATION},
{"configuration": valid_data.VALID_CONFIGURATION},
]
resp = self.client.create_agents_bulk(payload)
self.assertEqual(resp[0].get("id"), self.agent_id1)
self.assertIsInstance(resp[1].get("id"), str)
self.addCleanup(self.clean_up_agents, [resp[0].get("id"), resp[1].get("id")])
def test_create_lot_of_agents_bulk(self):
"""create_agents_bulk should succeed when given a lot of agents to create.
It should give a proper JSON response with a list containing dicts
with `id` and `configuration` fields being strings and the first `id` being the
same as the one given as a parameter.
"""
payload = []
agents_lst = []
for i in range(NB_AGENTS_TO_CREATE):
new_agent_id = generate_entity_id("test_create_lot_of_agents_bulk")
self.client.delete_agent(new_agent_id)
payload.append(
{"id": new_agent_id, "configuration": valid_data.VALID_CONFIGURATION}
)
agents_lst.append(new_agent_id)
response = self.client.create_agents_bulk(payload)
for i, resp in enumerate(response):
self.assertEqual(resp.get("id"), agents_lst[i])
self.assertFalse("error" in resp)
self.addCleanup(self.clean_up_agents, agents_lst)
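All the success and failure cases in this suite hinge on the same payload shape: a non-empty list of dicts, each with a mandatory `configuration` and an optional string `id`. A standalone sketch of those constraints — an illustration of the expected request shape, not the client's actual validation code:

```python
# Sketch of the payload constraints exercised by these tests.
# Illustrative only; the real client-side/server-side validation may differ.
def validate_bulk_payload(payload):
    if not isinstance(payload, list) or not payload:
        raise ValueError("payload must be a non-empty list of agents")
    for agent in payload:
        if "configuration" not in agent:
            raise ValueError("each agent needs a `configuration` field")
        if "id" in agent and not isinstance(agent["id"], str):
            raise ValueError("agent `id` must be a string when present")
    return True


print(validate_bulk_payload([{"configuration": {"context": {}}}]))
# True
```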
class TestCreateAgentsBulkFailure(unittest.TestCase):
"""Checks that the client fails when creating
one or more agents with invalid input"""
@classmethod
def setUpClass(cls):
cls.client = Client(settings.CRAFT_CFG)
def setUp(self):
self.agent_id1 = generate_entity_id("test_create_agents_bulk_failure")
self.agent_id2 = generate_entity_id("test_create_agents_bulk_failure")
self.agent_name = generate_entity_id("test_create_agents_bulk_failure")
# Makes sure that no agent with the same ID already exists
resp1 = self.client.delete_agent(self.agent_id1)
resp2 = self.client.delete_agent(self.agent_id2)
self.assertIsInstance(resp1, dict)
self.assertIsInstance(resp2, dict)
def tearDown(self):
# This ensures that agents are properly deleted every time
self.client.delete_agent(self.agent_id1)
self.client.delete_agent(self.agent_id2)
self.client.delete_agent(self.agent_name)
def clean_up_agent(self, aid):
# Makes sure that no agent with the standard ID remains
self.client.delete_agent(aid)
def clean_up_agents(self, aids):
# Makes sure that no agent with the standard ID remains
for aid in aids:
self.clean_up_agent(aid)
def test_create_agents_bulk_with_existing_agent_id(self):
"""create_agents_bulk should fail when given only IDs that already exist.
It should raise an error upon request for creation of a bulk of agents with IDs
that already exist, since agent IDs should always be unique.
"""
# Calling create_agents_bulk a first time
payload = [
{"id": self.agent_id1, "configuration": valid_data.VALID_CONFIGURATION},
{"id": self.agent_id2, "configuration": valid_data.VALID_CONFIGURATION},
]
self.client.create_agents_bulk(payload)
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload
)
self.addCleanup(self.clean_up_agents, [self.agent_id1, self.agent_id2])
def test_create_agents_bulk_with_invalid_agent_id(self):
"""create_agents_bulk should fail when all agent IDs are invalid.
It should raise an error upon request for creation of all agents with invalid id.
"""
payload = [
{"id": "toto/tutu", "configuration": valid_data.VALID_CONFIGURATION},
{"id": "toto@tutu", "configuration": valid_data.VALID_CONFIGURATION},
]
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload
)
def test_create_agents_with_an_empty_payload(self):
"""create_agents_bulk should fail when given payload is empty.
It should raise an error of invalid given payload.
"""
payload = []
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload
)
def test_create_agents_bulk_with_invalid_context(self):
"""create_agents_bulk should fail when all agent contexts are invalid or the `context`
field doesn't exist.
It should raise an error upon request for creation of all agents with invalid context.
"""
payload = []
agents_lst = []
# Add all the invalid context to check
for i, invalid_context in enumerate(invalid_data.INVALID_CONTEXTS):
new_agent_id = generate_entity_id(
"test_create_agents_bulk_with_invalid_context" + str(i)
)
invalid_configuration = {
"context": invalid_data.INVALID_CONTEXTS[invalid_context],
"output": ["lightbulbColor"],
"time_quantum": 100,
}
self.client.delete_agent(new_agent_id)
payload.append({"id": new_agent_id, "configuration": invalid_configuration})
agents_lst.append(new_agent_id)
# Add an agent with no context field
new_agent_id = self.agent_name.format(len(agents_lst))
self.client.delete_agent(new_agent_id)
invalid_configuration = {"output": ["lightbulbColor"], "time_quantum": 100}
payload.append({"id": new_agent_id, "configuration": invalid_configuration})
agents_lst.append(new_agent_id)
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload
)
self.addCleanup(self.clean_up_agents, agents_lst)
def test_create_agents_bulk_undefined_config(self):
"""create_agents_bulk should fail when the configuration is undefined or the
`configuration` field doesn't exist.
It should raise an error upon request for creation of all agents with no
configuration key in the request body, since it is a mandatory field to
create an agent.
"""
payload = []
agents_lst = []
# Add all the invalid context to check
for i, empty_configuration in enumerate(invalid_data.UNDEFINED_KEY):
new_agent_id = generate_entity_id(
"test_create_agents_bulk_undef_conf_" + str(i)
)
self.client.delete_agent(new_agent_id)
payload.append(
{
"id": new_agent_id,
"configuration": invalid_data.UNDEFINED_KEY[empty_configuration],
}
)
agents_lst.append(new_agent_id)
# Add agent with no configuration
new_agent_id = self.agent_name.format(len(agents_lst))
self.client.delete_agent(new_agent_id)
payload.append({"id": new_agent_id})
agents_lst.append(new_agent_id)
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload
)
self.addCleanup(self.clean_up_agents, agents_lst)
def test_create_agents_bulk_invalid_time_quantum(self):
"""create_agents_bulk should fail when given invalid time quantums.
It should raise an error upon request for creation of all agents with an incorrect time
quantum in the configuration, since it is essential to perform any action with craft
ai.
"""
payload = []
agents_lst = []
# Add all the invalid time quantum to check
for i, inv_tq in enumerate(invalid_data.INVALID_TIME_QUANTA):
new_agent_id = generate_entity_id(
"test_create_agents_bulk_invalid_time_quantum"
)
invalid_configuration = {
"context": valid_data.VALID_CONTEXT,
"output": valid_data.VALID_OUTPUT,
"time_quantum": invalid_data.INVALID_TIME_QUANTA[inv_tq],
}
self.client.delete_agent(new_agent_id)
payload.append({"id": new_agent_id, "configuration": invalid_configuration})
agents_lst.append(new_agent_id)
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload
)
self.addCleanup(self.clean_up_agents, agents_lst)
class TestCreateAgentsBulkSomeFailure(unittest.TestCase):
"""Checks that the client succeeds when creating a mix of agents
with invalid input and agents with valid input"""
@classmethod
def setUpClass(cls):
cls.client = Client(settings.CRAFT_CFG)
@classmethod
def tearDownClass(cls):
for agent_id in cls.client.list_agents():
try:
cls.client.delete_agent(agent_id)
except craft_err.CraftAiError:
continue
def setUp(self):
self.agent_id = generate_entity_id("test_create_agents_bulk_SomeFail")
self.agent_name = generate_entity_id("test_create_agents_bulk_SomeFail")
# Makes sure that no agent with the same ID already exists
resp = self.client.delete_agent(self.agent_id)
self.assertIsInstance(resp, dict)
def clean_up_agent(self, aid):
# Makes sure that no agent with the standard ID remains
self.client.delete_agent(aid)
def clean_up_agents(self, aids):
# Makes sure that no agent with the standard ID remains
for aid in aids:
try:
self.clean_up_agent(aid)
except craft_err.CraftAiError:
continue
def test_create_some_agents_with_existing_agent_id(self):
"""create_agents_bulk should succeed when some of the given IDs already exist
and the others don't.
It should give a proper JSON response with a list containing dicts.
The ones having existing IDs have the `error` field being a CraftAiBadRequestError.
The ones having valid IDs have `configuration` field being strings.
In either case they should have 'id' being the same as the one given as a parameter.
"""
payload = [
{"id": self.agent_id, "configuration": valid_data.VALID_CONFIGURATION},
{"configuration": valid_data.VALID_CONFIGURATION},
]
resp1 = self.client.create_agents_bulk(payload)
resp2 = self.client.create_agents_bulk(payload)
self.assertEqual(resp2[0].get("id"), self.agent_id)
self.assertIsInstance(resp2[0].get("error"), craft_err.CraftAiBadRequestError)
self.assertFalse("configuration" in resp2[0])
self.assertIsInstance(resp1[1].get("id"), str)
self.assertTrue("configuration" in resp1[1])
self.assertIsInstance(resp2[1].get("id"), str)
self.assertTrue("configuration" in resp2[1])
self.addCleanup(
self.clean_up_agents,
[self.agent_id, resp1[1].get("id"), resp2[1].get("id")],
)
def test_create_some_agents_with_invalid_agent_id(self):
"""create_agents_bulk should succeed when some of the given IDs are invalid
and the others are valid.
It should give a proper JSON response with a list containing dicts.
The ones having invalid IDs have the `error` field being a CraftAiBadRequestError.
The ones having valid IDs have `configuration` field being strings.
In either case they should have 'id' being the same as the one given as a parameter.
"""
payload = [
{"id": "toto/tutu", "configuration": valid_data.VALID_CONFIGURATION},
{"id": self.agent_id, "configuration": valid_data.VALID_CONFIGURATION},
]
resp = self.client.create_agents_bulk(payload)
self.assertEqual(resp[0].get("id"), "toto/tutu")
self.assertIsInstance(resp[0].get("error"), craft_err.CraftAiBadRequestError)
self.assertFalse("configuration" in resp[0])
self.assertEqual(resp[1].get("id"), self.agent_id)
self.assertTrue("configuration" in resp[1])
self.addCleanup(self.clean_up_agent, self.agent_id)
def test_create_same_agents_in_bulk(self):
"""create_agents_bulk should succeed when multiple agents in a bulk are given the same ID.
It should give a proper JSON response with a list containing two dicts.
The first one should have 'id' being the same as the one given as a parameter,
and the `configuration` field being strings.
The second one should have `id` being the same as the one given as a parameter
and the 'error' field being a CraftAiBadRequestError.
"""
# Calling create_agents_bulk a first time
payload = [
{"id": self.agent_id, "configuration": valid_data.VALID_CONFIGURATION},
{"id": self.agent_id, "configuration": valid_data.VALID_CONFIGURATION},
]
resp = self.client.create_agents_bulk(payload)
self.assertEqual(resp[0].get("id"), self.agent_id)
self.assertEqual(resp[1].get("id"), self.agent_id)
self.assertTrue("configuration" in resp[0] or "configuration" in resp[1])
if "configuration" in resp[0]:
self.assertIsInstance(
resp[1].get("error"), craft_err.CraftAiBadRequestError
)
elif "configuration" in resp[1]:
self.assertIsInstance(
resp[0].get("error"), craft_err.CraftAiBadRequestError
)
self.addCleanup(self.clean_up_agent, self.agent_id)
def test_create_some_agents_bulk_invalid_context(self):
"""create_agents_bulk should succeed when some agents have an invalid context
and others have a valid one.
It should give a proper JSON response with a list containing dicts.
The ones having invalid context have the `error` field being a CraftAiBadRequestError.
The ones having valid ids have the `id` field being string and 'configuration' field
being a dict.
"""
# Add valid agent with a valid configuration
payload = [
{"id": self.agent_id, "configuration": valid_data.VALID_CONFIGURATION}
]
agents_lst = [self.agent_id]
# Add all the invalid context to check
for i, invalid_context in enumerate(invalid_data.INVALID_CONTEXTS):
new_agent_id = generate_entity_id(
"test_create_some_agents_bulk_invalid_context"
)
invalid_configuration = {
"context": invalid_data.INVALID_CONTEXTS[invalid_context],
"output": ["lightbulbColor"],
"time_quantum": 100,
}
self.client.delete_agent(new_agent_id)
payload.append({"id": new_agent_id, "configuration": invalid_configuration})
agents_lst.append(new_agent_id)
# Add an agent with no context field
new_agent_id = self.agent_name.format(len(agents_lst))
self.client.delete_agent(new_agent_id)
invalid_configuration = {"output": ["lightbulbColor"], "time_quantum": 100}
payload.append({"id": new_agent_id, "configuration": invalid_configuration})
agents_lst.append(new_agent_id)
resp = self.client.create_agents_bulk(payload)
self.assertEqual(resp[0].get("id"), self.agent_id)
self.assertTrue("configuration" in resp[0])
self.assertFalse("error" in resp[0])
for i in range(1, len(resp)):
self.assertEqual(resp[i].get("id"), agents_lst[i])
self.assertTrue("error" in resp[i])
self.assertFalse("configuration" in resp[i])
self.addCleanup(self.clean_up_agents, agents_lst)
def test_create_some_agents_undef_config(self):
"""create_agents_bulk should fail when some agents in the bulk have an undefined
`configuration` field, even if the other agents are valid.
It should raise a CraftAiBadRequestError, since `configuration` is a mandatory
field for every agent.
"""
# Add valid agent with a valid configuration
payload = [
{"id": self.agent_id, "configuration": valid_data.VALID_CONFIGURATION}
]
agents_lst = [self.agent_id]
# Add all the invalid configuration to check
for i, empty_configuration in enumerate(invalid_data.UNDEFINED_KEY):
new_agent_id = generate_entity_id(
"test_create_some_agents_undef_config" + str(i)
)
self.client.delete_agent(new_agent_id)
payload.append(
{
"id": new_agent_id,
"configuration": invalid_data.UNDEFINED_KEY[empty_configuration],
}
)
agents_lst.append(new_agent_id)
# Add agent with no configuration
new_agent_id = self.agent_name.format(len(agents_lst))
self.client.delete_agent(new_agent_id)
payload.append({"id": new_agent_id})
agents_lst.append(new_agent_id)
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload,
)
def test_create_some_agents_inval_time_quant(self):
"""create_agents_bulk should fail when an agent has an invalid time quantum
in its configuration.
It should raise a CraftAiBadRequestError.
"""
# Add invalid configuration with invalid time quantum
new_agent_id = generate_entity_id("test_create_some_agents_inval_time_quant")
invalid_configuration = {
"context": valid_data.VALID_CONTEXT,
"output": valid_data.VALID_OUTPUT,
"time_quantum": invalid_data.INVALID_TIME_QUANTA["negative_tq"],
}
payload = [{"id": new_agent_id, "configuration": invalid_configuration}]
self.assertRaises(
craft_err.CraftAiBadRequestError, self.client.create_agents_bulk, payload,
)
# --- evulate_segmentation_ISBI2017_UNet.py (UpCoder/YNe, MIT) ---
# -*- coding=utf-8 -*-
import numpy as np
import math
import tensorflow as tf
from preprocessing import ssd_vgg_preprocessing, segmentation_preprocessing
import util
import cv2
from nets import UNetWL as UNet
from glob import glob
from datasets.medicalImage import fill_region, close_operation, open_operation, save_mhd_image
slim = tf.contrib.slim
import config
import os
import pickle
import pydensecrf.densecrf as dcrf
import gc
# scales = [[256, 256], [272, 272], [288, 288], [304, 304], [320, 320], [352, 352], [336, 336]]
scales = [[512, 512]]
# scales = [[480, 480], [496, 496], [512, 512], [528, 528], [544, 544], [560, 560]]
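# The evaluation loops below run the network once per entry in `scales`, resize
# each score map back to the slice's native resolution with cv2.resize, and
# average the maps per pixel. A minimal numpy-only sketch of that averaging
# (nearest-neighbour resize stands in for cv2.resize; `average_multiscale` is an
# illustrative helper, not project code):

```python
import numpy as np

def average_multiscale(score_maps, out_shape):
    # Resize each scale's score map to out_shape (nearest neighbour),
    # then take the per-pixel mean across scales.
    resized = []
    for m in score_maps:
        ys = np.arange(out_shape[0]) * m.shape[0] // out_shape[0]
        xs = np.arange(out_shape[1]) * m.shape[1] // out_shape[1]
        resized.append(m[np.ix_(ys, xs)])
    return np.mean(resized, axis=0)
```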
# =========================================================================== #
# Checkpoint and running Flags
# =========================================================================== #
tf.app.flags.DEFINE_string('checkpoint_path', None,
'the path of pretrained model to be used. If there are checkpoints\
in train_dir, this config will be ignored.')
tf.app.flags.DEFINE_float('gpu_memory_fraction', -1,
'the gpu memory fraction to be used. If less than 0, allow_growth = True is used.')
# =========================================================================== #
# I/O and preprocessing Flags.
# =========================================================================== #
tf.app.flags.DEFINE_integer(
'num_readers', 1,
'The number of parallel readers that read data from the dataset.')
tf.app.flags.DEFINE_integer(
'num_preprocessing_threads', 4,
'The number of threads used to create the batches.')
tf.app.flags.DEFINE_bool('preprocessing_use_rotation', False,
'Whether to use rotation for data augmentation')
# =========================================================================== #
# Dataset Flags.
# =========================================================================== #
tf.app.flags.DEFINE_string(
'dataset_name', 'icdar2015', 'The name of the dataset to load.')
tf.app.flags.DEFINE_string(
'dataset_split_name', 'test', 'The name of the train/test split.')
tf.app.flags.DEFINE_string('dataset_dir',
util.io.get_absolute_path('~/dataset/ICDAR2015/Challenge4/ch4_test_images'),
'The directory where the dataset files are stored.')
tf.app.flags.DEFINE_integer('eval_image_width', 256, 'Eval image width')
tf.app.flags.DEFINE_integer('eval_image_height', 256, 'Eval image height')
tf.app.flags.DEFINE_bool('using_moving_average', True,
                         'Whether to use ExponentialMovingAverage')
tf.app.flags.DEFINE_float('moving_average_decay', 0.9999,
                          'The decay rate of ExponentialMovingAverage')
tf.app.flags.DEFINE_string('pred_dir', '', '')
tf.app.flags.DEFINE_string('pred_vis_dir', '', '')
tf.app.flags.DEFINE_string('decoder', 'upsampling', '')
tf.app.flags.DEFINE_string('pred_assign_label_path', '', '')
tf.app.flags.DEFINE_string('recovery_img_dir', '', '')
tf.app.flags.DEFINE_string('recovery_feature_map_dir', '', '')
tf.app.flags.DEFINE_bool('update_center', False, '')
tf.app.flags.DEFINE_integer('batch_size', None, 'The number of samples in each batch.')
tf.app.flags.DEFINE_bool('test_flag', False, '')
tf.app.flags.DEFINE_bool('attention_flag', False, '')
tf.app.flags.DEFINE_bool('center_block_flag', True, '')
tf.app.flags.DEFINE_bool('nii_flag', False, '')
tf.app.flags.DEFINE_integer('num_centers_k', 2, 'split the image into k^2 block to compute the center')
tf.app.flags.DEFINE_bool('nii_case_flag', False, '')
tf.app.flags.DEFINE_bool('full_annotation_flag', False, '')
FLAGS = tf.app.flags.FLAGS
def config_initialization():
# image shape and feature layers shape inference
image_shape = (FLAGS.eval_image_height, FLAGS.eval_image_width)
if not FLAGS.dataset_dir:
raise ValueError('You must supply the dataset directory with --dataset_dir')
tf.logging.set_verbosity(tf.logging.DEBUG)
# config.load_config(FLAGS.checkpoint_path)
# config.load_config(util.io.get_dir(FLAGS.checkpoint_path))
# config.load_config('/home/give/PycharmProjects/ISBI_Detection')
config.load_config('./')
config.init_config(image_shape,
batch_size = FLAGS.batch_size,
pixel_conf_threshold = 0.5,
link_conf_threshold = 0.1,
num_gpus = 1,
)
    util.proc.set_proc_name('test_pixel_link_on' + '_' + FLAGS.dataset_name)
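# The evaluation functions below import `dice` and `IoU` from the project's
# metrics module. As a hedged reference for what such binary-mask metrics
# typically compute (dice_np/iou_np are illustrative stand-ins, not the
# project's implementation):

```python
import numpy as np

def dice_np(gt, pred):
    # Dice = 2|A∩B| / (|A| + |B|) over binary masks.
    gt = np.asarray(gt, bool)
    pred = np.asarray(pred, bool)
    denom = gt.sum() + pred.sum()
    return 2.0 * np.logical_and(gt, pred).sum() / denom if denom else 1.0

def iou_np(gt, pred):
    # IoU = |A∩B| / |A∪B| over binary masks.
    gt = np.asarray(gt, bool)
    pred = np.asarray(pred, bool)
    union = np.logical_or(gt, pred).sum()
    return np.logical_and(gt, pred).sum() / float(union) if union else 1.0
```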
def evulate_dir_nii_weakly_new():
'''
    Updated version of the evaluation code.
:return:
'''
from metrics import dice, IoU
from datasets.medicalImage import convertCase2PNGs, image_expand
from post_processing import cluster_postprocessing, net_center_posprocessing
nii_dir = '/home/give/Documents/dataset/ISBI2017/Training_Batch_1'
save_dir = '/home/give/Documents/dataset/ISBI2017/weakly_label_segmentation_V4/Batch_1/DLSC_0/niis'
# restore_path = '/home/give/PycharmProjects/weakly_label_segmentation/logs/ISBI2017_V2/1s_agumentation_weakly-upsampling-2/model.ckpt-168090'
restore_path = '/home/give/PycharmProjects/weakly_label_segmentation/logs/ISBI2017_V2/UNet-upsampling-1/model.ckpt-137572'
nii_parent_dir = os.path.dirname(nii_dir)
with tf.name_scope('test'):
image = tf.placeholder(dtype=tf.float32, shape=[None, None, 3])
image_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
input_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
processed_image = segmentation_preprocessing.segmentation_preprocessing(image, None, None,
out_shape=input_shape_placeholder,
is_training=False)
b_image = tf.expand_dims(processed_image, axis=0)
net = UNet.UNet(b_image, None, None, is_training=False, decoder=FLAGS.decoder,
update_center_flag=FLAGS.update_center, batch_size=FLAGS.batch_size,
update_center_strategy=1,
num_centers_k=FLAGS.num_centers_k,
full_annotation_flag=False, output_shape_tensor=input_shape_placeholder)
# net = UNet.UNet(b_image, None, None, is_training=False, decoder=FLAGS.decoder,
# update_center_flag=FLAGS.update_center,
# batch_size=2, init_center_value=None, update_center_strategy=1,
# num_centers_k=FLAGS.num_centers_k, full_annotation_flag=False,
# output_shape_tensor=input_shape_placeholder)
# print slim.get_variables_to_restore()
global_step = slim.get_or_create_global_step()
sess_config = tf.ConfigProto(log_device_placement=False, allow_soft_placement=True)
if FLAGS.gpu_memory_fraction < 0:
sess_config.gpu_options.allow_growth = True
elif FLAGS.gpu_memory_fraction > 0:
sess_config.gpu_options.per_process_gpu_memory_fraction = FLAGS.gpu_memory_fraction
# Variables to restore: moving avg. or normal weights.
if FLAGS.using_moving_average:
variable_averages = tf.train.ExponentialMovingAverage(
FLAGS.moving_average_decay)
variables_to_restore = variable_averages.variables_to_restore()
variables_to_restore[global_step.op.name] = global_step
else:
variables_to_restore = slim.get_variables_to_restore()
    saver = tf.train.Saver(var_list=variables_to_restore)
nii_pathes = glob(os.path.join(nii_dir, 'volume-*.nii'))
checkpoint = restore_path
# pixel_recovery_features = tf.image.resize_images(net.pixel_recovery_features, image_shape_placeholder)
with tf.Session(config=sess_config) as sess:
saver.restore(sess, checkpoint)
global_gt = []
global_pred = []
global_pred_kmeans = []
global_pred_centers = []
global_pred_crf = []
case_dices = []
case_IoUs = []
case_dices_kmeans = [] # kmeans
case_IoUs_kmeans = [] # kmeans
case_dices_centers = [] # net centers
case_IoUs_centers = [] # net centers
case_dices_crf = []
case_IoUs_crf = []
for iter, nii_path in enumerate(nii_pathes):
# if os.path.basename(nii_path) not in ['volume-5.nii', 'volume-12.nii', 'volume-13.nii', 'volume-15.nii', 'volume-19.nii', 'volume-20.nii', 'volume-21.nii', 'volume-25.nii', 'volume-26.nii']:
# continue
if os.path.basename(nii_path) in ['volume-15.nii', 'volume-25.nii']:
continue
# if os.path.basename(nii_path) != 'volume-0.nii':
# continue
nii_path_basename = os.path.basename(nii_path)
pred_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred')
pred_vis_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_vis')
recovery_img_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'recovery_img')
if not os.path.exists(pred_dir):
os.makedirs(pred_dir)
if not os.path.exists(pred_vis_dir):
os.makedirs(pred_vis_dir)
if not os.path.exists(recovery_img_dir):
os.makedirs(recovery_img_dir)
seg_path = os.path.join(nii_dir, 'segmentation-' + nii_path.split('.')[0].split('-')[1] + '.nii')
case_preds = []
case_preds_kmeans = []
case_preds_centers = []
case_gts = []
# case_recover_features = []
print(nii_path, seg_path)
imgs, tumor_masks, liver_masks, tumor_weak_masks = convertCase2PNGs(nii_path, seg_path, save_dir=None)
            print(len(imgs), len(tumor_masks), len(liver_masks), len(tumor_weak_masks))
for slice_idx, (image_data, liver_mask, whole_mask) in enumerate(zip(imgs, liver_masks, tumor_masks)):
pixel_cls_scores_ms = []
pixel_recover_feature_ms = []
for single_scale in scales:
pixel_recover_feature, pixel_cls_scores, b_image_v, global_step_v, net_centers = sess.run(
[net.pixel_recovery_features, net.pixel_cls_scores, b_image, global_step, net.centers],
feed_dict={
image: image_data,
image_shape_placeholder: np.shape(image_data)[:2],
input_shape_placeholder: single_scale
})
pixel_cls_scores_ms.append(
cv2.resize(pixel_cls_scores[0, :, :, 1], tuple(np.shape(image_data)[:2][::-1])))
pixel_recover_feature_ms.append(pixel_recover_feature[0])
del pixel_recover_feature
pixel_cls_scores = np.mean(pixel_cls_scores_ms, axis=0)
pixel_recover_feature = np.mean(pixel_recover_feature_ms, axis=0)
# case_recover_features.append(pixel_recover_feature)
# pred = cv2.resize(pos_score, tuple(np.shape(image_data)[:2][::-1]), interpolation=cv2.INTER_NEAREST)
if np.sum(whole_mask) != 0:
pred = np.asarray(pixel_cls_scores > 0.6, np.uint8)
                    # Opening: erosion first, then dilation
                    # Closing: dilation first, then erosion
# pred = close_operation(pred, kernel_size=3)
pred = open_operation(pred, kernel_size=3)
pred = fill_region(pred)
                    # Then compute the k-means result
pred_kmeans = np.asarray(image_expand(pred, kernel_size=5), np.uint8)
# pred_seg = image_expand(pred_seg, 5)
pred_kmeans = cluster_postprocessing(pred_kmeans, whole_mask, pixel_recover_feature, k=2)
pred_kmeans[image_data[:, :, 1] < (10./255.)] = 0
pred_kmeans = close_operation(pred_kmeans, kernel_size=5)
pred_kmeans = fill_region(pred_kmeans)
                    # Compute the result derived from the network centers
# pixel_recover_feature, net_centers
# pred_centers = np.asarray(image_expand(pred, kernel_size=5), np.uint8)
pred_centers = np.asarray(pred, np.uint8)
pred_centers = net_center_posprocessing(pred_centers, centers=net_centers,
pixel_wise_feature=pixel_recover_feature, gt=whole_mask)
pred_centers[image_data[:, :, 1] < (10. / 255.)] = 0
pred_centers = close_operation(pred_centers, kernel_size=5)
pred_centers = fill_region(pred_centers)
else:
pred = np.zeros_like(whole_mask)
pred_kmeans = np.zeros_like(whole_mask)
pred_centers = np.zeros_like(whole_mask)
global_gt.append(whole_mask)
case_gts.append(whole_mask)
case_preds.append(pred)
case_preds_kmeans.append(pred_kmeans)
case_preds_centers.append(pred_centers)
global_pred.append(pred)
global_pred_kmeans.append(pred_kmeans)
global_pred_centers.append(pred_centers)
print '%d / %d: %s' % (slice_idx + 1, len(imgs), os.path.basename(nii_path)), np.shape(
pixel_cls_scores), np.max(
pixel_cls_scores), np.min(pixel_cls_scores), np.shape(pixel_recover_feature)
del pixel_recover_feature, pixel_recover_feature_ms
gc.collect()
save_mhd_image(np.transpose(np.asarray(case_preds, np.uint8), axes=[0, 2, 1]),
os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred.mhd'))
save_mhd_image(np.transpose(np.asarray(case_preds_kmeans, np.uint8), axes=[0, 2, 1]),
os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_kmeans.mhd'))
save_mhd_image(np.transpose(np.asarray(case_preds_centers, np.uint8), axes=[0, 2, 1]),
os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_centers.mhd'))
case_dice = dice(case_gts, case_preds)
case_IoU = IoU(case_gts, case_preds)
case_dice_kmeans = dice(case_gts, case_preds_kmeans)
case_IoU_kmeans = IoU(case_gts, case_preds_kmeans)
case_dice_centers = dice(case_gts, case_preds_centers)
case_IoU_centers = IoU(case_gts, case_preds_centers)
print('case dice: ', case_dice)
print('case IoU: ', case_IoU)
print('case dice kmeans: ', case_dice_kmeans)
print('case IoU kmeans: ', case_IoU_kmeans)
print('case dice centers: ', case_dice_centers)
print('case IoU centers: ', case_IoU_centers)
case_dices.append(case_dice)
case_IoUs.append(case_IoU)
case_dices_kmeans.append(case_dice_kmeans)
case_IoUs_kmeans.append(case_IoU_kmeans)
case_dices_centers.append(case_dice_centers)
case_IoUs_centers.append(case_IoU_centers)
print 'global dice is ', dice(global_gt, global_pred)
print 'global IoU is ', IoU(global_gt, global_pred)
print('mean of case dice is ', np.mean(case_dices))
print('mean of case IoU is ', np.mean(case_IoUs))
print 'global dice (kmeans) is ', dice(global_gt, global_pred_kmeans)
print 'global IoU (kmeans) is ', IoU(global_gt, global_pred_kmeans)
print 'mean of case dice (kmeans) is ', np.mean(case_dices_kmeans)
        print 'mean of case IoU (kmeans) is ', np.mean(case_IoUs_kmeans)
print 'global dice (centers) is ', dice(global_gt, global_pred_centers)
print 'global IoU (centers) is ', IoU(global_gt, global_pred_centers)
print 'mean of case dice (centers) is ', np.mean(case_dices_centers)
print 'mean of case IoU (centers) is ', np.mean(case_IoUs_centers)
def evulate_dir_nii():
threshold = 0.6
print('threshold = ', threshold)
from metrics import dice, IoU
from datasets.medicalImage import convertCase2PNGs
nii_dir = '/home/give/Documents/dataset/ISBI2017/Training_Batch_1'
save_dir = '/home/give/Documents/dataset/ISBI2017/weakly_label_segmentation_V4/Batch_1/UNet/niis'
restore_path = '/home/give/PycharmProjects/weakly_label_segmentation/logs/ISBI2017_V2/UNet-upsampling/model.ckpt-121234'
nii_parent_dir = os.path.dirname(nii_dir)
with tf.name_scope('test'):
image = tf.placeholder(dtype=tf.float32, shape=[None, None, 3])
image_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
input_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
processed_image = segmentation_preprocessing.segmentation_preprocessing(image, None, None,
out_shape=input_shape_placeholder,
is_training=False)
b_image = tf.expand_dims(processed_image, axis=0)
net = UNet.UNet(b_image, None, None, is_training=False, decoder=FLAGS.decoder,
update_center_flag=FLAGS.update_center,
batch_size=2, init_center_value=None, update_center_strategy=2,
num_centers_k=FLAGS.num_centers_k, full_annotation_flag=True,
output_shape_tensor=input_shape_placeholder)
# print slim.get_variables_to_restore()
global_step = slim.get_or_create_global_step()
sess_config = tf.ConfigProto(log_device_placement=False, allow_soft_placement=True)
if FLAGS.gpu_memory_fraction < 0:
sess_config.gpu_options.allow_growth = True
elif FLAGS.gpu_memory_fraction > 0:
sess_config.gpu_options.per_process_gpu_memory_fraction = FLAGS.gpu_memory_fraction
# Variables to restore: moving avg. or normal weights.
if FLAGS.using_moving_average:
variable_averages = tf.train.ExponentialMovingAverage(
FLAGS.moving_average_decay)
variables_to_restore = variable_averages.variables_to_restore()
variables_to_restore[global_step.op.name] = global_step
else:
variables_to_restore = slim.get_variables_to_restore()
    saver = tf.train.Saver(var_list=variables_to_restore)
nii_pathes = glob(os.path.join(nii_dir, 'volume-*.nii'))
checkpoint = restore_path
# pixel_recovery_features = tf.image.resize_images(net.pixel_recovery_features, image_shape_placeholder)
with tf.Session(config=sess_config) as sess:
saver.restore(sess, checkpoint)
IoUs = []
dices = []
global_gt = []
global_pred = []
case_dices = []
case_IoUs = []
for iter, nii_path in enumerate(nii_pathes):
# if os.path.basename(nii_path) in ['volume-5.nii', 'volume-12.nii', 'volume-13.nii', 'volume-15.nii', 'volume-19.nii', 'volume-20.nii', 'volume-21.nii', 'volume-25.nii', 'volume-26.nii']:
# continue
if os.path.basename(nii_path) in ['volume-15.nii', 'volume-25.nii']:
continue
# if not nii_path.endswith('14.nii'):
# continue
nii_path_basename = os.path.basename(nii_path)
pred_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred')
pred_vis_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_vis')
recovery_img_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'recovery_img')
if not os.path.exists(pred_dir):
os.makedirs(pred_dir)
if not os.path.exists(pred_vis_dir):
os.makedirs(pred_vis_dir)
if not os.path.exists(recovery_img_dir):
os.makedirs(recovery_img_dir)
seg_path = os.path.join(nii_dir, 'segmentation-' + nii_path.split('.')[0].split('-')[1] + '.nii')
case_preds = []
case_gts = []
print(nii_path, seg_path)
imgs, tumor_masks, liver_masks, tumor_weak_masks = convertCase2PNGs(nii_path, seg_path, save_dir=None)
            print(len(imgs), len(tumor_masks), len(liver_masks), len(tumor_weak_masks))
for slice_idx, (image_data, liver_mask, whole_mask) in enumerate(zip(imgs, liver_masks, tumor_masks)):
pixel_cls_scores_ms = []
for single_scale in scales:
pixel_cls_scores, b_image_v, global_step_v = sess.run(
[net.pixel_cls_scores, b_image, global_step],
feed_dict={
image: image_data,
image_shape_placeholder: np.shape(image_data)[:2],
input_shape_placeholder: single_scale
})
pixel_cls_scores_ms.append(
cv2.resize(pixel_cls_scores[0, :, :, 1], tuple(np.shape(image_data)[:2][::-1])))
pixel_cls_scores = np.mean(pixel_cls_scores_ms, axis=0)
print '%d / %d: %s' % (slice_idx + 1, len(imgs), os.path.basename(nii_path)), np.shape(
pixel_cls_scores), np.max(
pixel_cls_scores), np.min(pixel_cls_scores)
# pred = cv2.resize(pos_score, tuple(np.shape(image_data)[:2][::-1]), interpolation=cv2.INTER_NEAREST)
if np.sum(whole_mask) != 0:
pred = np.asarray(pixel_cls_scores > threshold, np.uint8)
                    # Opening: erosion first, then dilation
# pred = open_operation(pos_score, kernel_size=5)
pred = fill_region(pred)
IoUs.append(IoU(whole_mask, pred))
dices.append(dice(whole_mask, pred))
else:
pred = np.zeros_like(whole_mask)
global_gt.append(whole_mask)
case_gts.append(whole_mask)
case_preds.append(pred)
global_pred.append(pred)
save_mhd_image(np.transpose(np.asarray(case_preds, np.uint8), axes=[0, 2, 1]),
os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred.mhd'))
case_dice = dice(case_gts, case_preds)
case_IoU = IoU(case_gts, case_preds)
print('case dice: ', case_dice)
print('case IoU: ', case_IoU)
case_dices.append(case_dice)
case_IoUs.append(case_IoU)
print 'mean of Dice is ', np.mean(dices)
print 'mean of IoU is ', np.mean(IoUs)
print 'global dice (seg) is ', dice(global_gt, global_pred)
print 'global IoU (seg) is ', IoU(global_gt, global_pred)
print('mean of case dice is ', np.mean(case_dices))
print('mean of case IoU is ', np.mean(case_IoUs))
def dense_crf(feature_map, output_probs):
h = output_probs.shape[0]
w = output_probs.shape[1]
output_probs = np.expand_dims(output_probs, 0)
output_probs = np.append(1 - output_probs, output_probs, axis=0)
d = dcrf.DenseCRF2D(w, h, 2)
U = -np.log(output_probs)
U = U.reshape((2, -1))
U = np.ascontiguousarray(U)
img = np.ascontiguousarray(feature_map)
d.setUnaryEnergy(U)
from pydensecrf.utils import unary_from_softmax, create_pairwise_bilateral
from sklearn.decomposition import PCA
    # Reduce high-dimensional recovery features with PCA before building the
    # bilateral pairwise term; low-channel inputs (e.g. an RGB image) are used as-is.
    n_feat = np.shape(img)[-1]
    if n_feat > 16:
        pca = PCA(n_components=16, whiten=True)
        pca_image = pca.fit_transform(np.reshape(img, [-1, n_feat]))
        img = np.ascontiguousarray(np.reshape(pca_image, [h, w, 16]))
pairwise_energy = create_pairwise_bilateral(sdims=(20, 20), schan=(3,), img=img, chdim=2)
d.addPairwiseGaussian(sxy=3, compat=3)
d.addPairwiseEnergy(pairwise_energy, compat=10)
# d.addPairwiseBilateral(sxy=2, srgb=3, rgbim=img, compat=10)
Q = d.inference(5)
Q = np.argmax(np.array(Q), axis=0).reshape((h, w))
return Q
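# Hedged sketch of the unary construction in dense_crf above: the foreground
# probability map is stacked with its complement, converted to negative
# log-probabilities, and flattened to the (n_labels, H*W) float32 layout that
# DenseCRF2D.setUnaryEnergy expects. `build_unary` is an illustrative helper,
# not part of this project:

```python
import numpy as np

def build_unary(prob_fg, eps=1e-8):
    # (2, H, W): background probability first, then foreground.
    probs = np.stack([1.0 - prob_fg, prob_fg], axis=0)
    unary = -np.log(np.clip(probs, eps, 1.0))  # clip avoids log(0)
    return np.ascontiguousarray(unary.reshape((2, -1)).astype(np.float32))
```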
def evulate_dir_nii_weakly_feature_crf():
'''
    Updated version of the evaluation code.
:return:
'''
from metrics import dice, IoU
from datasets.medicalImage import convertCase2PNGs, image_expand
nii_dir = '/home/give/Documents/dataset/ISBI2017/Training_Batch_1'
save_dir = '/home/give/Documents/dataset/ISBI2017/weakly_label_segmentation_V4/Batch_1/DLSC_0/niis'
# restore_path = '/home/give/PycharmProjects/weakly_label_segmentation/logs/ISBI2017_V2/1s_agumentation_weakly-upsampling-2/model.ckpt-168090'
restore_path = '/home/give/PycharmProjects/weakly_label_segmentation/logs/ISBI2017_V2/UNet-upsampling-1/model.ckpt-137572'
nii_parent_dir = os.path.dirname(nii_dir)
with tf.name_scope('test'):
image = tf.placeholder(dtype=tf.float32, shape=[None, None, 3])
image_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
input_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
processed_image = segmentation_preprocessing.segmentation_preprocessing(image, None, None,
out_shape=input_shape_placeholder,
is_training=False)
b_image = tf.expand_dims(processed_image, axis=0)
net = UNet.UNet(b_image, None, None, is_training=False, decoder=FLAGS.decoder,
update_center_flag=FLAGS.update_center,
batch_size=2, init_center_value=None, update_center_strategy=2,
num_centers_k=FLAGS.num_centers_k, full_annotation_flag=False,
output_shape_tensor=input_shape_placeholder)
# print slim.get_variables_to_restore()
global_step = slim.get_or_create_global_step()
sess_config = tf.ConfigProto(log_device_placement=False, allow_soft_placement=True)
if FLAGS.gpu_memory_fraction < 0:
sess_config.gpu_options.allow_growth = True
elif FLAGS.gpu_memory_fraction > 0:
sess_config.gpu_options.per_process_gpu_memory_fraction = FLAGS.gpu_memory_fraction
# Variables to restore: moving avg. or normal weights.
if FLAGS.using_moving_average:
variable_averages = tf.train.ExponentialMovingAverage(
FLAGS.moving_average_decay)
variables_to_restore = variable_averages.variables_to_restore()
variables_to_restore[global_step.op.name] = global_step
else:
variables_to_restore = slim.get_variables_to_restore()
    saver = tf.train.Saver(var_list=variables_to_restore)
nii_pathes = glob(os.path.join(nii_dir, 'volume-*.nii'))
checkpoint = restore_path
# pixel_recovery_features = tf.image.resize_images(net.pixel_recovery_features, image_shape_placeholder)
with tf.Session(config=sess_config) as sess:
saver.restore(sess, checkpoint)
global_gt = []
global_pred = []
global_pred_crf = []
case_dices = []
case_IoUs = []
case_dices_crf = []
case_IoUs_crf = []
for iter, nii_path in enumerate(nii_pathes):
# if os.path.basename(nii_path) not in ['volume-5.nii', 'volume-12.nii', 'volume-13.nii', 'volume-15.nii', 'volume-19.nii', 'volume-20.nii', 'volume-21.nii', 'volume-25.nii', 'volume-26.nii']:
# continue
if os.path.basename(nii_path) in ['volume-15.nii', 'volume-25.nii']:
continue
# if os.path.basename(nii_path) != 'volume-0.nii':
# continue
nii_path_basename = os.path.basename(nii_path)
pred_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred')
pred_vis_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_vis')
recovery_img_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'recovery_img')
if not os.path.exists(pred_dir):
os.makedirs(pred_dir)
if not os.path.exists(pred_vis_dir):
os.makedirs(pred_vis_dir)
if not os.path.exists(recovery_img_dir):
os.makedirs(recovery_img_dir)
seg_path = os.path.join(nii_dir, 'segmentation-' + nii_path.split('.')[0].split('-')[1] + '.nii')
case_preds = []
case_preds_crf = []
case_gts = []
# case_recover_features = []
print(nii_path, seg_path)
imgs, tumor_masks, liver_masks, tumor_weak_masks = convertCase2PNGs(nii_path, seg_path, save_dir=None)
            print(len(imgs), len(tumor_masks), len(liver_masks), len(tumor_weak_masks))
for slice_idx, (image_data, liver_mask, whole_mask) in enumerate(zip(imgs, liver_masks, tumor_masks)):
pixel_cls_scores_ms = []
pixel_recover_feature_ms = []
for single_scale in scales:
pixel_recover_feature, pixel_cls_scores, b_image_v, global_step_v, net_centers = sess.run(
[net.pixel_recovery_features, net.pixel_cls_scores, b_image, global_step, net.centers],
feed_dict={
image: image_data,
image_shape_placeholder: np.shape(image_data)[:2],
input_shape_placeholder: single_scale
})
pixel_cls_scores_ms.append(
cv2.resize(pixel_cls_scores[0, :, :, 1], tuple(np.shape(image_data)[:2][::-1])))
pixel_recover_feature_ms.append(pixel_recover_feature[0])
del pixel_recover_feature
pixel_cls_scores = np.mean(pixel_cls_scores_ms, axis=0)
pixel_recover_feature = np.mean(pixel_recover_feature_ms, axis=0)
# case_recover_features.append(pixel_recover_feature)
# pred = cv2.resize(pos_score, tuple(np.shape(image_data)[:2][::-1]), interpolation=cv2.INTER_NEAREST)
if np.sum(whole_mask) != 0:
pred = np.asarray(pixel_cls_scores > 0.6, np.uint8)
                    # Opening: erosion first, then dilation
                    # Closing: dilation first, then erosion
# pred = close_operation(pred, kernel_size=3)
pred = open_operation(pred, kernel_size=3)
pred = fill_region(pred)
                    # Then compute the CRF result
pred_crf = dense_crf(pixel_recover_feature, pixel_cls_scores)
pred_crf = np.asarray(pred_crf, np.uint8)
if np.sum(pred_crf) != 0:
pred_crf = open_operation(pred_crf, kernel_size=3)
pred_crf = fill_region(pred_crf)
else:
pred = np.zeros_like(whole_mask)
pred_crf = np.zeros_like(whole_mask)
global_gt.append(whole_mask)
case_gts.append(whole_mask)
case_preds.append(pred)
case_preds_crf.append(pred_crf)
global_pred.append(pred)
global_pred_crf.append(pred_crf)
print '%d / %d: %s' % (slice_idx + 1, len(imgs), os.path.basename(nii_path)), np.shape(
pixel_cls_scores), np.max(
pixel_cls_scores), np.min(pixel_cls_scores), np.shape(pixel_recover_feature)
del pixel_recover_feature, pixel_recover_feature_ms
gc.collect()
save_mhd_image(np.transpose(np.asarray(case_preds_crf, np.uint8), axes=[0, 2, 1]),
os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_feature_crf.mhd'))
case_dice = dice(case_gts, case_preds)
case_IoU = IoU(case_gts, case_preds)
case_dice_crf = dice(case_gts, case_preds_crf)
case_IoU_crf = IoU(case_gts, case_preds_crf)
print('case dice: ', case_dice)
print('case IoU: ', case_IoU)
            print('case dice crf: ', case_dice_crf)
            print('case IoU crf: ', case_IoU_crf)
case_dices.append(case_dice)
case_IoUs.append(case_IoU)
case_dices_crf.append(case_dice_crf)
case_IoUs_crf.append(case_IoU_crf)
print 'global dice is ', dice(global_gt, global_pred)
print 'global IoU is ', IoU(global_gt, global_pred)
print('mean of case dice is ', np.mean(case_dices))
print('mean of case IoU is ', np.mean(case_IoUs))
print 'global dice (crf) is ', dice(global_gt, global_pred_crf)
print 'global IoU (crf) is ', IoU(global_gt, global_pred_crf)
print 'mean of case dice (crf) is ', np.mean(case_dices_crf)
print 'mean of case IoU (crf) is ', np.mean(case_IoUs_crf)
def evulate_dir_nii_weakly_crf():
from metrics import dice, IoU
from datasets.medicalImage import convertCase2PNGs, image_expand
nii_dir = '/home/give/Documents/dataset/ISBI2017/Training_Batch_1'
save_dir = '/home/give/Documents/dataset/ISBI2017/weakly_label_segmentation_V4/Batch_1/DLSC_0/niis'
restore_path = '/home/give/PycharmProjects/weakly_label_segmentation/logs/ISBI2017_V2/UNet-upsampling-1/model.ckpt-137572'
nii_parent_dir = os.path.dirname(nii_dir)
with tf.name_scope('test'):
image = tf.placeholder(dtype=tf.float32, shape=[None, None, 3])
image_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
input_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
processed_image = segmentation_preprocessing.segmentation_preprocessing(image, None, None,
out_shape=input_shape_placeholder,
is_training=False)
b_image = tf.expand_dims(processed_image, axis=0)
net = UNet.UNet(b_image, None, None, is_training=False, decoder=FLAGS.decoder,
update_center_flag=FLAGS.update_center,
batch_size=2, init_center_value=None, update_center_strategy=2,
num_centers_k=FLAGS.num_centers_k, full_annotation_flag=False,
output_shape_tensor=input_shape_placeholder)
# print slim.get_variables_to_restore()
global_step = slim.get_or_create_global_step()
sess_config = tf.ConfigProto(log_device_placement=False, allow_soft_placement=True)
if FLAGS.gpu_memory_fraction < 0:
sess_config.gpu_options.allow_growth = True
elif FLAGS.gpu_memory_fraction > 0:
sess_config.gpu_options.per_process_gpu_memory_fraction = FLAGS.gpu_memory_fraction
# Variables to restore: moving avg. or normal weights.
if FLAGS.using_moving_average:
variable_averages = tf.train.ExponentialMovingAverage(
FLAGS.moving_average_decay)
variables_to_restore = variable_averages.variables_to_restore()
variables_to_restore[global_step.op.name] = global_step
else:
variables_to_restore = slim.get_variables_to_restore()
    saver = tf.train.Saver(var_list=variables_to_restore)
nii_pathes = glob(os.path.join(nii_dir, 'volume-*.nii'))
checkpoint = restore_path
# pixel_recovery_features = tf.image.resize_images(net.pixel_recovery_features, image_shape_placeholder)
with tf.Session(config=sess_config) as sess:
saver.restore(sess, checkpoint)
IoUs = []
dices = []
global_gt = []
global_pred = []
global_pred_crf = []
case_dices = []
case_IoUs = []
case_dices_crf = []
case_IoUs_crf = []
for iter, nii_path in enumerate(nii_pathes):
# if os.path.basename(nii_path) in ['volume-5.nii', 'volume-12.nii', 'volume-13.nii', 'volume-15.nii', 'volume-19.nii', 'volume-20.nii', 'volume-21.nii', 'volume-25.nii', 'volume-26.nii']:
# continue
if os.path.basename(nii_path) in ['volume-15.nii', 'volume-25.nii']:
continue
nii_path_basename = os.path.basename(nii_path)
pred_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred')
pred_vis_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_vis')
recovery_img_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'recovery_img')
if not os.path.exists(pred_dir):
os.makedirs(pred_dir)
if not os.path.exists(pred_vis_dir):
os.makedirs(pred_vis_dir)
if not os.path.exists(recovery_img_dir):
os.makedirs(recovery_img_dir)
seg_path = os.path.join(nii_dir, 'segmentation-' + nii_path.split('.')[0].split('-')[1] + '.nii')
case_preds = []
case_preds_crf = []
case_gts = []
# case_recover_features = []
print(nii_path, seg_path)
imgs, tumor_masks, liver_masks, tumor_weak_masks = convertCase2PNGs(nii_path, seg_path, save_dir=None)
            print(len(imgs), len(tumor_masks), len(liver_masks), len(tumor_weak_masks))
for slice_idx, (image_data, liver_mask, whole_mask) in enumerate(zip(imgs, liver_masks, tumor_masks)):
pixel_cls_scores_ms = []
for single_scale in scales:
pixel_cls_scores, b_image_v, global_step_v = sess.run(
[net.pixel_cls_scores, b_image, global_step],
feed_dict={
image: image_data,
image_shape_placeholder: np.shape(image_data)[:2],
input_shape_placeholder: single_scale
})
pixel_cls_scores_ms.append(
cv2.resize(pixel_cls_scores[0, :, :, 1], tuple(np.shape(image_data)[:2][::-1])))
pixel_cls_scores = np.mean(pixel_cls_scores_ms, axis=0)
# case_recover_features.append(pixel_recover_feature)
# pred = cv2.resize(pos_score, tuple(np.shape(image_data)[:2][::-1]), interpolation=cv2.INTER_NEAREST)
if np.sum(whole_mask) != 0:
pred = np.asarray(pixel_cls_scores > 0.6, np.uint8)
                    # Opening: erosion first, then dilation
                    # Closing: dilation first, then erosion
# pred = close_operation(pred, kernel_size=3)
pred = open_operation(pred, kernel_size=3)
pred = fill_region(pred)
IoUs.append(IoU(whole_mask, pred))
dices.append(dice(whole_mask, pred))
pred_crf = dense_crf(np.asarray(image_data * 255., np.uint8), pixel_cls_scores)
pred_crf = np.asarray(pred_crf, np.uint8)
if np.sum(pred_crf) != 0:
pred_crf = open_operation(pred_crf, kernel_size=3)
pred_crf = fill_region(pred_crf)
else:
pred = np.zeros_like(whole_mask)
pred_crf = np.zeros_like(whole_mask)
global_gt.append(whole_mask)
case_gts.append(whole_mask)
case_preds.append(pred)
case_preds_crf.append(pred_crf)
global_pred.append(pred)
global_pred_crf.append(pred_crf)
print '%d / %d: %s' % (slice_idx + 1, len(imgs), os.path.basename(nii_path)), np.shape(
pixel_cls_scores), np.max(
pixel_cls_scores), np.min(pixel_cls_scores)
save_mhd_image(np.transpose(np.asarray(case_preds, np.uint8), axes=[0, 2, 1]),
os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred.mhd'))
save_mhd_image(np.transpose(np.asarray(case_preds_crf, np.uint8), axes=[0, 2, 1]),
os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_crf.mhd'))
case_dice = dice(case_gts, case_preds)
case_IoU = IoU(case_gts, case_preds)
case_dice_crf = dice(case_gts, case_preds_crf)
case_IoU_crf = IoU(case_gts, case_preds_crf)
print('case dice: ', case_dice)
print('case IoU: ', case_IoU)
print('case dice crf: ', case_dice_crf)
print('case IoU crf: ', case_IoU_crf)
case_dices.append(case_dice)
case_IoUs.append(case_IoU)
case_dices_crf.append(case_dice_crf)
case_IoUs_crf.append(case_IoU_crf)
print 'global dice is ', dice(global_gt, global_pred)
print 'global IoU is ', IoU(global_gt, global_pred)
print('mean of case dice is ', np.mean(case_dices))
print('mean of case IoU is ', np.mean(case_IoUs))
print 'global dice (crf) is ', dice(global_gt, global_pred_crf)
print 'global IoU (crf) is ', IoU(global_gt, global_pred_crf)
print 'mean of case dice is ', np.mean(case_dices_crf)
print 'mean of case IoU is ', np.mean(case_IoUs_crf)
def evulate_dir_nii_weakly_kmeans_pixel():
    """
    Updated version of the evaluation code (adds k-means pixel post-processing).
    :return:
    """
    from metrics import dice, IoU
    from datasets.medicalImage import convertCase2PNGs, image_expand
    from post_processing import net_center_posprocessing, cluster_postprocessing
    nii_dir = '/home/give/Documents/dataset/ISBI2017/Training_Batch_1'
    save_dir = '/home/give/Documents/dataset/ISBI2017/weakly_label_segmentation_V4/Batch_1/DLSC_0/niis'
    restore_path = '/home/give/PycharmProjects/weakly_label_segmentation/logs/ISBI2017_V2/UNet-upsampling-1/model.ckpt-137572'
    with tf.name_scope('test'):
        image = tf.placeholder(dtype=tf.float32, shape=[None, None, 3])
        image_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
        input_shape_placeholder = tf.placeholder(tf.int32, shape=[2])
        processed_image = segmentation_preprocessing.segmentation_preprocessing(image, None, None,
                                                                                out_shape=input_shape_placeholder,
                                                                                is_training=False)
        b_image = tf.expand_dims(processed_image, axis=0)
        net = UNet.UNet(b_image, None, None, is_training=False, decoder=FLAGS.decoder,
                        update_center_flag=FLAGS.update_center,
                        batch_size=2, init_center_value=None, update_center_strategy=2,
                        num_centers_k=FLAGS.num_centers_k, full_annotation_flag=False,
                        output_shape_tensor=input_shape_placeholder)
        # print(slim.get_variables_to_restore())
        global_step = slim.get_or_create_global_step()
    sess_config = tf.ConfigProto(log_device_placement=False, allow_soft_placement=True)
    if FLAGS.gpu_memory_fraction < 0:
        sess_config.gpu_options.allow_growth = True
    elif FLAGS.gpu_memory_fraction > 0:
        sess_config.gpu_options.per_process_gpu_memory_fraction = FLAGS.gpu_memory_fraction
    # Variables to restore: moving avg. or normal weights.
    if FLAGS.using_moving_average:
        variable_averages = tf.train.ExponentialMovingAverage(
            FLAGS.moving_average_decay)
        variables_to_restore = variable_averages.variables_to_restore()
        variables_to_restore[global_step.op.name] = global_step
    else:
        variables_to_restore = slim.get_variables_to_restore()
    saver = tf.train.Saver()
    nii_pathes = glob(os.path.join(nii_dir, 'volume-*.nii'))
    checkpoint = restore_path
    # pixel_recovery_features = tf.image.resize_images(net.pixel_recovery_features, image_shape_placeholder)
    with tf.Session(config=sess_config) as sess:
        saver.restore(sess, checkpoint)
        global_gt = []
        global_pred = []
        global_pred_kmeans = []
        global_pred_crf = []
        case_dices = []
        case_IoUs = []
        case_dices_kmeans = []  # kmeans
        case_IoUs_kmeans = []  # kmeans
        for iter, nii_path in enumerate(nii_pathes):
            if os.path.basename(nii_path) in ['volume-15.nii', 'volume-25.nii']:
                continue
            nii_path_basename = os.path.basename(nii_path)
            pred_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred')
            pred_vis_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_vis')
            recovery_img_dir = os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'recovery_img')
            if not os.path.exists(pred_dir):
                os.makedirs(pred_dir)
            if not os.path.exists(pred_vis_dir):
                os.makedirs(pred_vis_dir)
            if not os.path.exists(recovery_img_dir):
                os.makedirs(recovery_img_dir)
            seg_path = os.path.join(nii_dir, 'segmentation-' + nii_path.split('.')[0].split('-')[1] + '.nii')
            case_preds = []
            case_preds_kmeans = []
            case_gts = []
            # case_recover_features = []
            print(nii_path, seg_path)
            imgs, tumor_masks, liver_masks, tumor_weak_masks = convertCase2PNGs(nii_path, seg_path, save_dir=None)
            print(len(imgs), len(tumor_masks), len(liver_masks), len(tumor_weak_masks))
            for slice_idx, (image_data, liver_mask, whole_mask) in enumerate(zip(imgs, liver_masks, tumor_masks)):
                pixel_cls_scores_ms = []
                for single_scale in scales:
                    pixel_cls_scores, b_image_v, global_step_v, net_centers = sess.run(
                        [net.pixel_cls_scores, b_image, global_step, net.centers],
                        feed_dict={
                            image: image_data,
                            image_shape_placeholder: np.shape(image_data)[:2],
                            input_shape_placeholder: single_scale
                        })
                    pixel_cls_scores_ms.append(
                        cv2.resize(pixel_cls_scores[0, :, :, 1], tuple(np.shape(image_data)[:2][::-1])))
                pixel_cls_scores = np.mean(pixel_cls_scores_ms, axis=0)
                if np.sum(whole_mask) != 0:
                    pred = np.asarray(pixel_cls_scores > 0.6, np.uint8)
                    # opening: erode first, then dilate
                    # closing: dilate first, then erode
                    # pred = close_operation(pred, kernel_size=3)
                    pred = open_operation(pred, kernel_size=3)
                    pred = fill_region(pred)
                    # then compute the k-means result
                    pred_kmeans = np.asarray(image_expand(pred, kernel_size=5), np.uint8)
                    # pred_seg = image_expand(pred_seg, 5)
                    pred_kmeans = cluster_postprocessing(pred_kmeans, whole_mask, image_data, k=2)
                    pred_kmeans[image_data[:, :, 1] < (10. / 255.)] = 0
                    pred_kmeans = close_operation(pred_kmeans, kernel_size=5)
                    pred_kmeans = fill_region(pred_kmeans)
                else:
                    pred = np.zeros_like(whole_mask)
                    pred_kmeans = np.zeros_like(whole_mask)
                global_gt.append(whole_mask)
                case_gts.append(whole_mask)
                case_preds.append(pred)
                case_preds_kmeans.append(pred_kmeans)
                global_pred.append(pred)
                global_pred_kmeans.append(pred_kmeans)
                print('%d / %d: %s' % (slice_idx + 1, len(imgs), os.path.basename(nii_path)),
                      np.shape(pixel_cls_scores), np.max(pixel_cls_scores), np.min(pixel_cls_scores))
            gc.collect()
            save_mhd_image(np.transpose(np.asarray(case_preds, np.uint8), axes=[0, 2, 1]),
                           os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred.mhd'))
            save_mhd_image(np.transpose(np.asarray(case_preds_kmeans, np.uint8), axes=[0, 2, 1]),
                           os.path.join(save_dir, nii_path_basename.split('.')[0].split('-')[1], 'pred_kmeans.mhd'))
            case_dice = dice(case_gts, case_preds)
            case_IoU = IoU(case_gts, case_preds)
            case_dice_kmeans = dice(case_gts, case_preds_kmeans)
            case_IoU_kmeans = IoU(case_gts, case_preds_kmeans)
            print('case dice: ', case_dice)
            print('case IoU: ', case_IoU)
            print('case dice kmeans: ', case_dice_kmeans)
            print('case IoU kmeans: ', case_IoU_kmeans)
            case_dices.append(case_dice)
            case_IoUs.append(case_IoU)
            case_dices_kmeans.append(case_dice_kmeans)
            case_IoUs_kmeans.append(case_IoU_kmeans)
        print('global dice is ', dice(global_gt, global_pred))
        print('global IoU is ', IoU(global_gt, global_pred))
        print('mean of case dice is ', np.mean(case_dices))
        print('mean of case IoU is ', np.mean(case_IoUs))
        print('global dice (kmeans) is ', dice(global_gt, global_pred_kmeans))
        print('global IoU (kmeans) is ', IoU(global_gt, global_pred_kmeans))
        print('mean of case dice (kmeans) is ', np.mean(case_dices_kmeans))
        print('mean of case IoU (kmeans) is ', np.mean(case_IoUs_kmeans))
def main(_):
    config_initialization()
    print('full annotation flag')
    # evulate_dir_nii()
    # evulate_dir_nii_weakly_new()
    # evulate_dir_nii_weakly_feature_crf()
    # evulate_dir_nii_weakly_crf()
    evulate_dir_nii_weakly_kmeans_pixel()


if __name__ == '__main__':
    tf.app.run()
# 544 / 828: volume-4.nii (1, 256, 256, 2) 0.0005035501 1.9400452e-09 (1, 256, 256, 1) 0.016835693 6.260295e-05 0.003921569 0.0 (1, 256, 256, 3)
# pytorch_pfn_extras/writing/__init__.py (belltailjp/pytorch-pfn-extras, MIT)
from pytorch_pfn_extras.writing._writer_base import Writer  # NOQA
from pytorch_pfn_extras.writing._writer_base import StandardWriter # NOQA
from pytorch_pfn_extras.writing._simple_writer import SimpleWriter # NOQA
from pytorch_pfn_extras.writing._parallel_writer import ThreadWriter # NOQA
from pytorch_pfn_extras.writing._parallel_writer import ProcessWriter # NOQA
from pytorch_pfn_extras.writing._queue_writer import QueueWriter # NOQA
from pytorch_pfn_extras.writing._queue_writer import ThreadQueueWriter # NOQA
from pytorch_pfn_extras.writing._queue_writer import ProcessQueueWriter # NOQA
from pytorch_pfn_extras.writing._tensorboard_writer import TensorBoardWriter # NOQA
# docker_images/asteroid/app/pipelines/__init__.py (huggingface/api-inference-community, Apache-2.0)
from app.pipelines.base import Pipeline, PipelineException  # isort:skip
from app.pipelines.audio_source_separation import AudioSourceSeparationPipeline
from app.pipelines.audio_to_audio import AudioToAudioPipeline
# time.py (BeatSkip/HP-Prime-uAsyncIO, MIT)
from hpprime import *
from math import *


def ticks_ms():
    return eval("ticks()")


def ticks_diff(ticks1, ticks2):
    return ticks1 - ticks2


def ticks_add(ticks1, ticks2):
    return ticks1 + ticks2
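A hedged, desktop-Python sketch of the `ticks_*` convention above. The HP Prime exposes a `ticks()` builtin via `hpprime`; here `time.monotonic()` stands in for it (an assumption), so the deadline/elapsed-time pattern can be exercised anywhere:

```python
# Sketch of the ticks_* timing convention; time.monotonic() is a stand-in
# for the HP Prime's ticks() builtin (assumption, not the real device API).
import time


def ticks_ms():
    # milliseconds from an arbitrary, monotonically increasing origin
    return int(time.monotonic() * 1000)


def ticks_diff(ticks1, ticks2):
    # elapsed time between two samples: newer sample first, older second
    return ticks1 - ticks2


def ticks_add(ticks1, ticks2):
    # offset a tick sample, e.g. to compute a deadline
    return ticks1 + ticks2


start = ticks_ms()
deadline = ticks_add(start, 50)          # a point 50 ms in the future
elapsed = ticks_diff(ticks_ms(), start)  # non-negative elapsed ms
```

This is the usual MicroPython-style idiom: always compare via `ticks_diff` rather than subtracting raw samples inline, so wrap-around handling can be centralized.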
# teacher/datasets/__init__.py (Kaysera/fuzzy-lore, MIT)
from ._base import load_german, load_adult, load_compas, load_heloc, load_beer, load_pima, load_breast
__all__ = [
    "load_german",
    "load_adult",
    "load_compas",
    "load_heloc",
    "load_beer",
    "load_pima",
    "load_breast",
]
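A minimal sketch of what an `__all__` list like the one above buys: a star-import pulls in only the listed names. The module and loader names below are illustrative stand-ins, not the real `teacher.datasets` package:

```python
# Demonstrate __all__ gating a star-import with a synthetic module.
import sys
import types

mod = types.ModuleType("demo_datasets")
exec(
    "def load_german(): return 'german credit'\n"
    "def _internal_helper(): return 'hidden'\n"
    "__all__ = ['load_german']\n",
    mod.__dict__,
)
sys.modules["demo_datasets"] = mod

ns = {}
exec("from demo_datasets import *", ns)
assert "load_german" in ns            # exported because it is in __all__
assert "_internal_helper" not in ns   # filtered out by the star-import
```

Without `__all__`, a star-import would instead pull in every public (non-underscore) top-level name.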
# src/configs/sandybridge/kernel_gen_two.py (flame/tblis-strassen, BSD-3-Clause)
import sys
from common import is_one, is_negone, is_nonzero, write_line, write_break, transpose, printmat, contain_nontrivial
# Round-robin way to get a general-purpose register
def get_reg(avoid_reg=''):
    get_reg.counter += 1
    res_reg = get_reg.reg_pool[get_reg.counter % len(get_reg.reg_pool)]
    if res_reg == avoid_reg:
        get_reg.counter += 1
        res_reg = get_reg.reg_pool[get_reg.counter % len(get_reg.reg_pool)]
    return res_reg
get_reg.counter = -1
get_reg.reg_pool = [ 'rcx', 'rdx', 'r8', 'r9', 'r10', 'r11', 'r12', 'r13', 'r14' ]
#get_reg.reg_pool = [ 'rcx', 'rdx', 'rsi', 'r8', 'r9', 'r10', 'r11', 'r12', 'r13', 'r14' ]
# rdi, rax, rbx, r15, already occupied.
# (rcx, rdx, rsi, r8, r9, r10, r11, r12, r13, r14): register allocation algorithm
# Round-robin way to get an AVX 256-bit register
def get_avx_reg(avoid_reg=''):
    get_avx_reg.counter += 1
    res_reg = get_avx_reg.avx_reg_pool[get_avx_reg.counter % len(get_avx_reg.avx_reg_pool)]
    if res_reg == avoid_reg:
        get_avx_reg.counter += 1
        res_reg = get_avx_reg.avx_reg_pool[get_avx_reg.counter % len(get_avx_reg.avx_reg_pool)]
    return res_reg
get_avx_reg.counter = -1
get_avx_reg.avx_reg_pool = [ 'ymm0', 'ymm1', 'ymm2', 'ymm3', 'ymm4', 'ymm5', 'ymm6', 'ymm7' ]
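The `get_reg` / `get_avx_reg` helpers above implement the same small scheme: walk a fixed pool cyclically and skip at most one register the caller asks to avoid. A standalone closure-based sketch of that scheme (names here are illustrative, not tied to the generator's state):

```python
# Round-robin register picker: cycle through a fixed pool, advancing one
# extra step if the current pick collides with the register to avoid.
from itertools import count


def make_round_robin(pool):
    counter = count()

    def pick(avoid=''):
        reg = pool[next(counter) % len(pool)]
        if reg == avoid:
            reg = pool[next(counter) % len(pool)]
        return reg

    return pick


pick = make_round_robin(['rcx', 'rdx', 'r8'])
first = pick()               # 'rcx'
second = pick(avoid='rdx')   # lands on 'rdx', skips it, yields 'r8'
```

Note the scheme only guards against a single collision per call, which is enough here because at most one register (the alpha register) must stay live across picks.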
def gen_updatec_assembly(myfile):
    c03_ymm_list = ['ymm9', 'ymm11', 'ymm13', 'ymm15']  # c00:c33
    c47_ymm_list = ['ymm8', 'ymm10', 'ymm12', 'ymm14']  # c40:c73
    for idx in range(4):
        myfile.write( \
'''\
"vextractf128 $1, %%{0}, %%xmm1 \\n\\t"
"vmovlpd (%%{2}), %%xmm0, %%xmm0 \\n\\t" // load c0{1} and c1{1},
"vmovhpd (%%{2},%%rsi), %%xmm0, %%xmm0 \\n\\t"
"vmulpd %%xmm6, %%xmm{3}, %%xmm2 \\n\\t" // scale by alpha,
"vaddpd %%xmm2, %%xmm0, %%xmm2 \\n\\t" // add the gemm result,
"vmovlpd %%xmm2, (%%{2}) \\n\\t" // and store back to memory.
"vmovhpd %%xmm2, (%%{2},%%rsi) \\n\\t"
"vmovlpd (%%{2},%%r12), %%xmm0, %%xmm0 \\n\\t" // load c2{1} and c3{1},
"vmovhpd (%%{2},%%r13), %%xmm0, %%xmm0 \\n\\t"
"vmulpd %%xmm6, %%xmm1, %%xmm2 \\n\\t" // scale by alpha,
"vaddpd %%xmm2, %%xmm0, %%xmm2 \\n\\t" // add the gemm result,
"vmovlpd %%xmm2, (%%{2},%%r12) \\n\\t" // and store back to memory.
"vmovhpd %%xmm2, (%%{2},%%r13) \\n\\t"
"addq %%rdi, %%{2} \\n\\t" // c += cs_c;
" \\n\\t"
'''.format( c03_ymm_list[idx], str(idx), 'rbx', c03_ymm_list[idx][3:] ) )
    for idx in range(4):
        myfile.write( \
'''\
"vextractf128 $1, %%{0}, %%xmm1 \\n\\t"
"vmovlpd (%%{2}), %%xmm0, %%xmm0 \\n\\t" // load c4{1} and c5{1},
"vmovhpd (%%{2},%%rsi), %%xmm0, %%xmm0 \\n\\t"
"vmulpd %%xmm6, %%xmm{3}, %%xmm2 \\n\\t" // scale by alpha,
"vaddpd %%xmm2, %%xmm0, %%xmm2 \\n\\t" // add the gemm result,
"vmovlpd %%xmm2, (%%{2}) \\n\\t" // and store back to memory.
"vmovhpd %%xmm2, (%%{2},%%rsi) \\n\\t"
"vmovlpd (%%{2},%%r12), %%xmm0, %%xmm0 \\n\\t" // load c6{1} and c7{1},
"vmovhpd (%%{2},%%r13), %%xmm0, %%xmm0 \\n\\t"
"vmulpd %%xmm6, %%xmm1, %%xmm2 \\n\\t" // scale by alpha,
"vaddpd %%xmm2, %%xmm0, %%xmm2 \\n\\t" // add the gemm result,
"vmovlpd %%xmm2, (%%{2},%%r12) \\n\\t" // and store back to memory.
"vmovhpd %%xmm2, (%%{2},%%r13) \\n\\t"
"addq %%rdi, %%{2} \\n\\t" // c += cs_c;
" \\n\\t"
'''.format( c47_ymm_list[idx], str(idx), 'rdx', c47_ymm_list[idx][3:] ) )
def write_updatec_assembly(myfile, nonzero_coeffs):
    nnz = len(nonzero_coeffs)
    write_line(myfile, 1, '"movq %{0}, %%rax \\n\\t" // load address of alpha_list'.format(nnz + 6))
    for j, coeff in enumerate(nonzero_coeffs):
        alpha_avx_reg = get_avx_reg()
        myfile.write( \
'''\
" \\n\\t"
"vbroadcastsd (%%rax), %%{3} \\n\\t" // load alpha_list[ i ] and duplicate
"movq %{0}, %%{2} \\n\\t" // load address of c
" \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{4} \\n\\t" // {4} = c{1}( 0:3, 0 )
"vmulpd %%{3}, %%ymm9, %%{5} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm9( c{1}( 0:3, 0 ) )
"vaddpd %%{4}, %%{5}, %%{4} \\n\\t" // {4} += {5}
"vmovapd %%{4}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 0 ) = {4}
"vmovapd 1 * 32(%%{2}), %%{6} \\n\\t" // {6} = c{1}( 4:7, 0 )
"vmulpd %%{3}, %%ymm8, %%{7} \\n\\t" // scale by alpha, {7} = {3}( alpha ) * ymm8( c{1}( 4:7, 0 ) )
"vaddpd %%{6}, %%{7}, %%{6} \\n\\t" // {6} += {7}
"vmovapd %%{6}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 0 ) = {6}
"addq %%rdi, %%{2} \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{8} \\n\\t" // {8} = c{1}( 0:3, 1 )
"vmulpd %%{3}, %%ymm11, %%{9} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm11( c{1}( 0:3, 1 ) )
"vaddpd %%{8}, %%{9}, %%{8} \\n\\t" // {8} += {7}
"vmovapd %%{8}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 1 ) = {8}
"vmovapd 1 * 32(%%{2}), %%{10} \\n\\t" // {10} = c{1}( 4:7, 1 )
"vmulpd %%{3}, %%ymm10, %%{11} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm10( c{1}( 4:7, 1 ) )
"vaddpd %%{10}, %%{11}, %%{10} \\n\\t" // {10} += {9}
"vmovapd %%{10}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 1 ) = {10}
"addq %%rdi, %%{2} \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{12} \\n\\t" // {12} = c{1}( 0:3, 2 )
"vmulpd %%{3}, %%ymm13, %%{13} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm13( c{1}( 0:3, 2 ) )
"vaddpd %%{12}, %%{13}, %%{12} \\n\\t" // {12} += {11}
"vmovapd %%{12}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 2 ) = {12}
"vmovapd 1 * 32(%%{2}), %%{14} \\n\\t" // {14} = c{1}( 4:7, 2 )
"vmulpd %%{3}, %%ymm12, %%{15} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm12( c{1}( 4:7, 2 ) )
"vaddpd %%{14}, %%{15}, %%{14} \\n\\t" // {14} += {13}
"vmovapd %%{14}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 2 ) = {14}
"addq %%rdi, %%{2} \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{16} \\n\\t" // {16} = c{1}( 0:3, 3 )
"vmulpd %%{3}, %%ymm15, %%{17} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm15( c{1}( 0:3, 3 ) )
"vaddpd %%{16}, %%{17}, %%{16} \\n\\t" // {16} += {15}
"vmovapd %%{16}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 3 ) = {16}
"vmovapd 1 * 32(%%{2}), %%{18} \\n\\t" // {18} = c{1}( 4:7, 3 )
"vmulpd %%{3}, %%ymm14, %%{19} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm14( c{1}( 4:7, 3 ) )
"vaddpd %%{18}, %%{19}, %%{18} \\n\\t" // {18} +={17}
"vmovapd %%{18}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 3 ) = {18}
"addq $1 * 8, %%rax \\n\\t" // alpha_list += 8
" \\n\\t"
'''.format( str(j+6), str(j), get_reg(), alpha_avx_reg, get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ) ) )
def write_updatec_two_assembly(myfile):
    # nnz = len(nonzero_coeffs)
    nnz = 2
    write_line(myfile, 1, '"movq %{0}, %%rax \\n\\t" // load address of alpha_list'.format(nnz + 6))
    for j in range(nnz):
        # for j, coeff in enumerate(nonzero_coeffs):
        #     print("coeff not 1 / -1!")
        alpha_avx_reg = get_avx_reg()
        myfile.write( \
'''\
" \\n\\t"
"vbroadcastsd (%%rax), %%{3} \\n\\t" // load alpha_list[ i ] and duplicate
"movq %{0}, %%{2} \\n\\t" // load address of c
" \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{4} \\n\\t" // {4} = c{1}( 0:3, 0 )
"vmulpd %%{3}, %%ymm9, %%{5} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm9( c{1}( 0:3, 0 ) )
"vaddpd %%{4}, %%{5}, %%{4} \\n\\t" // {4} += {5}
"vmovapd %%{4}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 0 ) = {4}
"vmovapd 1 * 32(%%{2}), %%{6} \\n\\t" // {6} = c{1}( 4:7, 0 )
"vmulpd %%{3}, %%ymm8, %%{7} \\n\\t" // scale by alpha, {7} = {3}( alpha ) * ymm8( c{1}( 4:7, 0 ) )
"vaddpd %%{6}, %%{7}, %%{6} \\n\\t" // {6} += {7}
"vmovapd %%{6}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 0 ) = {6}
"addq %%rdi, %%{2} \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{8} \\n\\t" // {8} = c{1}( 0:3, 1 )
"vmulpd %%{3}, %%ymm11, %%{9} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm11( c{1}( 0:3, 1 ) )
"vaddpd %%{8}, %%{9}, %%{8} \\n\\t" // {8} += {7}
"vmovapd %%{8}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 1 ) = {8}
"vmovapd 1 * 32(%%{2}), %%{10} \\n\\t" // {10} = c{1}( 4:7, 1 )
"vmulpd %%{3}, %%ymm10, %%{11} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm10( c{1}( 4:7, 1 ) )
"vaddpd %%{10}, %%{11}, %%{10} \\n\\t" // {10} += {9}
"vmovapd %%{10}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 1 ) = {10}
"addq %%rdi, %%{2} \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{12} \\n\\t" // {12} = c{1}( 0:3, 2 )
"vmulpd %%{3}, %%ymm13, %%{13} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm13( c{1}( 0:3, 2 ) )
"vaddpd %%{12}, %%{13}, %%{12} \\n\\t" // {12} += {11}
"vmovapd %%{12}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 2 ) = {12}
"vmovapd 1 * 32(%%{2}), %%{14} \\n\\t" // {14} = c{1}( 4:7, 2 )
"vmulpd %%{3}, %%ymm12, %%{15} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm12( c{1}( 4:7, 2 ) )
"vaddpd %%{14}, %%{15}, %%{14} \\n\\t" // {14} += {13}
"vmovapd %%{14}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 2 ) = {14}
"addq %%rdi, %%{2} \\n\\t"
"vmovapd 0 * 32(%%{2}), %%{16} \\n\\t" // {16} = c{1}( 0:3, 3 )
"vmulpd %%{3}, %%ymm15, %%{17} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm15( c{1}( 0:3, 3 ) )
"vaddpd %%{16}, %%{17}, %%{16} \\n\\t" // {16} += {15}
"vmovapd %%{16}, 0(%%{2}) \\n\\t" // c{1}( 0:3, 3 ) = {16}
"vmovapd 1 * 32(%%{2}), %%{18} \\n\\t" // {18} = c{1}( 4:7, 3 )
"vmulpd %%{3}, %%ymm14, %%{19} \\n\\t" // scale by alpha, {5} = {3}( alpha ) * ymm14( c{1}( 4:7, 3 ) )
"vaddpd %%{18}, %%{19}, %%{18} \\n\\t" // {18} +={17}
"vmovapd %%{18}, 32(%%{2}) \\n\\t" // c{1}( 4:7, 3 ) = {18}
"addq $1 * 8, %%rax \\n\\t" // alpha_list += 8
" \\n\\t"
'''.format( str(j+6), str(j), get_reg(), alpha_avx_reg, get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ), get_avx_reg( alpha_avx_reg ) ) )
def main():
    myfile = open('a.c', 'w')
    nonzero_coeffs = ['1', '-1']
    # write_updatec_assembly(myfile, nonzero_coeffs)
    # gen_updatec_assembly(myfile)
    write_updatec_two_assembly(myfile)
    myfile.close()


if __name__ == '__main__':
    main()
# stubs/cloudtrail.py (claytonbrown/troposphere, BSD-2-Clause)
from . import AWSObject, AWSProperty
from .validators import *
from .constants import *
# -------------------------------------------
class CloudTrailTrail(AWSObject):
    """# AWS::CloudTrail::Trail - CloudFormationResourceSpecification version: 1.4.0
{
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html",
"Properties": {
"CloudWatchLogsLogGroupArn": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-cloudwatchlogsloggrouparn",
"PrimitiveType": "String",
"Required": false,
"UpdateType": "Mutable"
},
"CloudWatchLogsRoleArn": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-cloudwatchlogsrolearn",
"PrimitiveType": "String",
"Required": false,
"UpdateType": "Mutable"
},
"EnableLogFileValidation": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-enablelogfilevalidation",
"PrimitiveType": "Boolean",
"Required": false,
"UpdateType": "Mutable"
},
"IncludeGlobalServiceEvents": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-includeglobalserviceevents",
"PrimitiveType": "Boolean",
"Required": false,
"UpdateType": "Mutable"
},
"IsLogging": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-islogging",
"PrimitiveType": "Boolean",
"Required": true,
"UpdateType": "Mutable"
},
"IsMultiRegionTrail": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-ismultiregiontrail",
"PrimitiveType": "Boolean",
"Required": false,
"UpdateType": "Mutable"
},
"KMSKeyId": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-kmskeyid",
"PrimitiveType": "String",
"Required": false,
"UpdateType": "Mutable"
},
"S3BucketName": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-s3bucketname",
"PrimitiveType": "String",
"Required": true,
"UpdateType": "Mutable"
},
"S3KeyPrefix": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-s3keyprefix",
"PrimitiveType": "String",
"Required": false,
"UpdateType": "Mutable"
},
"SnsTopicName": {
"Documentation": "http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-snstopicname",
"PrimitiveType": "String",
"Required": false,
"UpdateType": "Mutable"
}
}
}
    """

    resource_type = "AWS::CloudTrail::Trail"

    props = {
        'CloudWatchLogsLogGroupArn': (basestring, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-cloudwatchlogsloggrouparn'),
        'CloudWatchLogsRoleArn': (basestring, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-cloudwatchlogsrolearn'),
        'EnableLogFileValidation': (boolean, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-enablelogfilevalidation'),
        'IncludeGlobalServiceEvents': (boolean, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-includeglobalserviceevents'),
        'IsLogging': (boolean, True, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-islogging'),
        'IsMultiRegionTrail': (boolean, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-ismultiregiontrail'),
        'KMSKeyId': (basestring, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-kmskeyid'),
        'S3BucketName': (basestring, True, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-s3bucketname'),
        'S3KeyPrefix': (basestring, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-s3keyprefix'),
        'SnsTopicName': (basestring, False, 'Mutable', 'http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudtrail-trail.html#cfn-cloudtrail-trail-snstopicname')
    }
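A hedged sketch of how a props table like the one above can drive validation: each entry maps a property name to a tuple whose first two fields are the expected type and whether the property is required. This standalone checker is illustrative only, not troposphere's real validation logic, and the trimmed-down `props` dict here is a hypothetical stand-in:

```python
# Illustrative property validation driven by a (type, required) table,
# mimicking the shape of the props dict above in simplified form.
props = {
    'IsLogging': (bool, True),
    'S3BucketName': (str, True),
    'S3KeyPrefix': (str, False),
}


def validate(resource):
    for name, (expected_type, required) in props.items():
        if name not in resource:
            if required:
                raise ValueError("missing required property: " + name)
            continue
        if not isinstance(resource[name], expected_type):
            raise TypeError(name + " must be " + expected_type.__name__)


validate({'IsLogging': True, 'S3BucketName': 'my-trail-bucket'})  # passes
```

Keeping validation data-driven like this is why the stub stores metadata tuples instead of hand-written per-property checks.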
# tests/test_lowering.py (lite-david/polymath, Apache-2.0)
from polymath.srdfg.passes.compiler_passes import NormalizeGraph, Lower
import polymath as pm
import pprint
import numpy as np
from pathlib import Path
from tests.util import reco, sigmoid, svm, logistic, linear, set_shape_and_lower, conv
import pytest
CWD = Path(f"{__file__}").parent
BASE_PATH = f"{CWD}/pmlang_examples"
OUTPATH = f"{BASE_PATH}/outputs"
def test_single_dim():
with pm.Node(name="elem5") as graph:
m = pm.parameter(name="m")
x = pm.input("x", shape=m)
w = pm.state("w", shape=m)
i = pm.index(0, m-1, name="i")
w[i] = (w[i]*x[i])
x_ = np.random.randint(0, 10, 3)
w_ = np.random.randint(0, 10, 3)
coarse_eval = graph("w", x=x_, w=w_)
np_result = x_*w_
np.testing.assert_allclose(coarse_eval, np_result)
shape_pass = NormalizeGraph({"m": 3})
graph_shapes = shape_pass(graph)
shape_res = graph_shapes("w", x=x_, w=w_)
np.testing.assert_allclose(shape_res, np_result)
lower_pass = Lower({})
lowered_graph = lower_pass(graph_shapes)
input_info = {f"w/w({i},)": w_[i] for i in range(len(w_))}
input_info.update({f"x/x({i},)": x_[i] for i in range(len(x_))})
fine_grained_eval = lowered_graph("w/w(1,)", input_info)
assert fine_grained_eval == np_result[1]
pb_path = f"{OUTPATH}/{graph.name}.srdfg"
pm.pb_store(lowered_graph, OUTPATH)
loaded_node = pm.pb_load(pb_path)
input_info = {f"w/w({i},)": w_[i] for i in range(len(w_))}
input_info.update({f"x/x({i},)": x_[i] for i in range(len(x_))})
fine_grained_eval = loaded_node("w/w(1,)", input_info)
assert fine_grained_eval == np_result[1]
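The key convention used above, where each element of a lowered tensor becomes a node named like `w/w(1,)`, recurs in every test in this file. A stand-alone sketch of that mapping (the helper name `flatten_inputs` is hypothetical, not part of the polymath API):

```python
import numpy as np

def flatten_inputs(name, arr):
    """Map each element of `arr` to the lowered-graph key convention
    used in these tests: "w/w(0,)" for 1-D, "w/w(0, 1)" for 2-D."""
    keys = {}
    for idx in np.ndindex(arr.shape):
        if arr.ndim == 1:
            key = f"{name}/{name}({idx[0]},)"
        else:
            key = f"{name}/{name}({', '.join(str(i) for i in idx)})"
        keys[key] = arr[idx]
    return keys
```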
@pm.register_pass
def get_children(node, ctx):
for a in node.args:
if a.name in ctx:
ctx[a.name]["children"].append(node.name)
@pm.register_pass
def non_class_pass(node, ctx):
ctx['dtype'] = node.type_modifier
return node
def test_multi_dim():
with pm.Node(name="elem4") as graph:
m = pm.parameter(name="m")
n = pm.parameter(name="n")
x = pm.input("x", shape=(m,n))
w = pm.state("w", shape=(m,n))
i = pm.index(0, m-1, name="i")
j = pm.index(0, n-1, name="j")
w[i,j] = (w[i,j]*x[i,j])
m_ = 3
n_ = 4
x_ = np.random.randint(0, 10, m_*n_).reshape((m_,n_))
w_ = np.random.randint(0, 10, m_*n_).reshape((m_,n_))
coarse_eval = graph("w", x=x_, w=w_)
np_result = x_*w_
np.testing.assert_allclose(coarse_eval, np_result)
shape_pass = NormalizeGraph({"m": m_, "n": n_})
graph_shapes = shape_pass(graph)
shape_res = graph_shapes("w", x=x_, w=w_)
np.testing.assert_allclose(shape_res, np_result)
lower_pass = Lower({})
lowered_graph = lower_pass(graph_shapes)
input_info = {}
for i in range(m_):
for j in range(n_):
input_info[f"w/w({i}, {j})"] = w_[i,j]
input_info[f"x/x({i}, {j})"] = x_[i,j]
fine_grained_eval = lowered_graph("w/w(2, 3)", input_info)
assert fine_grained_eval == np_result[2,3]
def test_single_dim_op_slice():
with pm.Node(name="elem3") as graph:
m = pm.parameter(name="m")
x = pm.input("x", shape=m)
w = pm.state("w", shape=m)
i = pm.index(0, m-1, name="i")
out = (w[i]*x[i])
w[i] = (out[i] - w[i])
m_ = 3
x_ = np.random.randint(0, 10, m_)
w_ = np.random.randint(0, 10, m_)
coarse_eval = graph("w", x=x_, w=w_)
np_result = x_*w_ - w_
np.testing.assert_allclose(coarse_eval, np_result)
shape_pass = NormalizeGraph({"m": 3})
graph_shapes = shape_pass(graph)
shape_res = graph_shapes("w", x=x_, w=w_)
np.testing.assert_allclose(shape_res, np_result)
lower_pass = Lower({})
lowered_graph = lower_pass(graph_shapes)
input_info = {f"w/w({i},)": w_[i] for i in range(len(w_))}
input_info.update({f"x/x({i},)": x_[i] for i in range(len(x_))})
fine_grained_eval = lowered_graph("w/w(2,)", input_info)
assert fine_grained_eval == np_result[2]
def test_multi_dim_op_slice():
with pm.Node(name="elem2") as graph:
m = pm.parameter(name="m")
n = pm.parameter(name="n")
mu = pm.parameter(name="mu", default=2.0)
x = pm.input(name="x", shape=(m,n))
w = pm.state(name="w", shape=(m,n))
i = pm.index(0, m-1, name="i")
j = pm.index(0, n-1, name="j")
out = (x[i,j]*w[i,j]).set_name("w_out")
w[i,j] = (mu * (out[i,j] - w[i,j]) )
m_ = 3
n_ = 2
x_ = np.random.randint(0, 10, m_*n_).reshape((m_, n_))
w_ = np.random.randint(0, 10, m_*n_).reshape((m_, n_))
coarse_eval = graph("w", x=x_, w=w_)
np_result = (x_*w_ - w_)*2.0
np.testing.assert_allclose(coarse_eval, np_result)
shape_pass = NormalizeGraph({"m": m_, "n": n_})
graph_shapes = shape_pass(graph)
shape_res = graph_shapes("w", x=x_, w=w_)
np.testing.assert_allclose(shape_res, np_result)
lower_pass = Lower({})
lowered_graph = lower_pass(graph_shapes)
input_info = {}
for i in range(m_):
for j in range(n_):
input_info[f"w/w({i}, {j})"] = w_[i,j]
input_info[f"x/x({i}, {j})"] = x_[i,j]
fine_grained_eval = lowered_graph("w/w(2, 1)", input_info)
assert fine_grained_eval == np_result[2, 1]
def test_lower_group_op():
with pm.Node(name="linear_reg1") as graph:
m = pm.parameter(name="m")
x = pm.input("x", shape=(m))
y = pm.input("y")
w = pm.state("w", shape=(m))
i = pm.index(0, m-1, name="i")
h = pm.sum([i], w[i] * x[i], name="h")
m_ = 3
n_ = 3
x_ = np.random.randint(0, 10, m_)
w_ = np.random.randint(0, 10, (m_))
np_result = np.sum(x_ * w_)
np.testing.assert_allclose(graph("h", {"w": w_, "x": x_}), np_result)
np.testing.assert_allclose(graph("h", w=w_, x=x_), np_result)
shape_pass = NormalizeGraph({"m": m_, "n": n_})
graph_shapes = shape_pass(graph)
shape_res = graph_shapes("h", x=x_, w=w_)
np.testing.assert_allclose(shape_res, np_result)
lower_pass = Lower({})
lowered_graph = lower_pass(graph_shapes)
input_info = {f"w/w({i},)": w_[i] for i in range(len(w_))}
input_info.update({f"x/x({i},)": x_[i] for i in range(len(x_))})
fine_grained_eval = lowered_graph("h/h(4,)", input_info)
assert fine_grained_eval == np_result
pb_path = f"{OUTPATH}/linear_reg1.srdfg"
pm.pb_store(lowered_graph, OUTPATH)
loaded_node = pm.pb_load(pb_path)
input_info = {f"w/w({i},)": w_[i] for i in range(len(w_))}
input_info.update({f"x/x({i},)": x_[i] for i in range(len(x_))})
loaded_res = loaded_node("h/h(4,)", input_info)
assert loaded_node.func_hash() == lowered_graph.func_hash()
assert loaded_res == np_result
def test_single_dim_norm():
with pm.Node(name="elem1") as graph:
m = pm.parameter("m")
x = pm.input("x", shape=m)
w = pm.state("w", shape=m)
i = pm.index(0, m-1, name="i")
w[i] = (w[i]*x[i])
x_ = np.random.randint(0, 10, 3)
w_ = np.random.randint(0, 10, 3)
coarse_eval = graph("w", x=x_, w=w_)
np_result = x_*w_
np.testing.assert_allclose(coarse_eval, np_result)
shape_pass = NormalizeGraph({"m": 3})
graph_shapes = shape_pass(graph)
shape_res = graph_shapes("w", x=x_, w=w_)
np.testing.assert_allclose(shape_res, np_result)
lower_pass = Lower({})
lowered_graph = lower_pass(graph_shapes)
input_info = {f"w/w({i},)": w_[i] for i in range(len(w_))}
input_info.update({f"x/x({i},)": x_[i] for i in range(len(x_))})
fine_grained_eval = lowered_graph("w/w(1,)", input_info)
assert fine_grained_eval == np_result[1]
pb_path = f"{OUTPATH}/{graph.name}.srdfg"
pm.pb_store(lowered_graph, OUTPATH)
loaded_node = pm.pb_load(pb_path)
input_info = {f"w/w({i},)": w_[i] for i in range(len(w_))}
input_info.update({f"x/x({i},)": x_[i] for i in range(len(x_))})
fine_grained_eval = loaded_node("w/w(1,)", input_info)
assert fine_grained_eval == np_result[1]
def test_multi_dim_norm():
with pm.Node(name="elem") as graph:
m = pm.parameter(name="m")
n = pm.parameter(name="n")
x = pm.input("x", shape=(m,n))
w = pm.state("w", shape=(m,n))
i = pm.index(0, m-1, name="i")
j = pm.index(0, n-1, name="j")
w[i,j] = (w[i,j]*x[i,j])
m_ = 3
n_ = 4
x_ = np.random.randint(0, 10, m_*n_).reshape((m_,n_))
w_ = np.random.randint(0, 10, m_*n_).reshape((m_,n_))
coarse_eval = graph("w", x=x_, w=w_)
np_result = x_*w_
np.testing.assert_allclose(coarse_eval, np_result)
shape_pass = NormalizeGraph({"m": m_, "n": n_})
graph_shapes = shape_pass(graph)
shape_res = graph_shapes("w", x=x_, w=w_)
np.testing.assert_allclose(shape_res, np_result)
lower_pass = pm.Lower({})
lowered_graph = lower_pass(graph_shapes)
input_info = {}
for i in range(m_):
for j in range(n_):
input_info[f"w/w({i}, {j})"] = w_[i,j]
input_info[f"x/x({i}, {j})"] = x_[i,j]
fine_grained_eval = lowered_graph("w/w(2, 3)", input_info)
assert fine_grained_eval == np_result[2,3]
def test_reco():
m_ = 3
n_ = 3
k_ = 2
graph, input_info, out_info, keys = reco(m=m_, n=n_, k=k_, coarse=True)
shape_val_pass = pm.NormalizeGraph({"m": m_, "n": n_, "k": k_})
new_graph = shape_val_pass(graph)
test_res = new_graph(keys, input_info)
np.testing.assert_allclose(test_res[0], out_info["w1"])
np.testing.assert_allclose(test_res[1], out_info["w2"])
graph, input_info, new_out_info, keys = reco(m=m_, n=n_, k=k_)
flatten_pass = pm.Lower({})
flattened_g = flatten_pass(new_graph)
all_vals = flattened_g(keys, input_info)
out1 = np.asarray(list(all_vals[0:6])).reshape(new_out_info["w2"].shape)
out2 = np.asarray(list(all_vals[6:])).reshape(new_out_info["w2"].shape)
np.testing.assert_allclose(new_out_info["w1"], out1)
np.testing.assert_allclose(new_out_info["w2"], out2)
@pytest.mark.parametrize('m_',[
3, 54
])
def test_svm(m_):
shape_dict = {"m": m_}
graph, input_info, out_info, keys = svm(**shape_dict, coarse=True)
shape_val_pass = pm.NormalizeGraph(shape_dict)
new_graph = shape_val_pass(graph)
test_res = new_graph(keys, input_info)
np.testing.assert_allclose(test_res, out_info["w"])
graph, input_info, new_out_info, keys = svm(**shape_dict)
flatten_pass = pm.Lower({})
flattened_g = flatten_pass(new_graph)
all_vals = flattened_g(keys, input_info)
np.testing.assert_allclose(new_out_info["w"], all_vals)
@pytest.mark.parametrize('m_',[
10
])
def test_linear(m_):
shape_dict = {"m": m_}
graph, input_info, out_info, keys = linear(**shape_dict, coarse=True)
shape_val_pass = pm.NormalizeGraph(shape_dict)
new_graph = shape_val_pass(graph)
test_res = new_graph(keys, input_info)
np.testing.assert_allclose(test_res, out_info["w"])
graph, input_info, new_out_info, keys = linear(**shape_dict)
flatten_pass = pm.Lower({})
flattened_g = flatten_pass(new_graph)
all_vals = flattened_g(keys, input_info)
np.testing.assert_allclose(new_out_info["w"], all_vals)
@pytest.mark.parametrize('m_',[
3
])
def test_sigmoid(m_):
with pm.Node(name="logistic1") as graph:
m = pm.parameter(name="m")
n = pm.parameter(name="n")
x = pm.input("x", shape=(m))
w = pm.state("w", shape=(m))
i = pm.index(0, m-1, name="i")
o = pm.sigmoid(pm.sum([i], w[i]*x[i]), name="out")
x_ = np.random.randint(0, 10, m_)
w_ = np.random.randint(0, 10, m_)
input_dict = {"x": x_, "w": w_}
np_res = int(sigmoid(np.sum(x_*w_)))
shape_dict = {"m": m_}
coarse_eval = graph("out", x=x_, w=w_)
np.testing.assert_allclose(np_res, coarse_eval)
lowered = set_shape_and_lower(graph, shape_dict)
@pytest.mark.parametrize('m_',[
3
])
def test_multidim_sigmoid(m_):
with pm.Node(name="logistic") as graph:
m = pm.parameter(name="m")
n = pm.parameter(name="n")
x = pm.input("x", shape=(m))
w = pm.state("w", shape=(m))
i = pm.index(0, m-1, name="i")
o = pm.sigmoid(w[i]*x[i], name="out")
x_ = np.random.randint(0, 10, m_).astype(float)
w_ = np.random.randint(0, 10, m_).astype(float)
shape_dict = {"m": m_}
input_dict = {"x": x_, "w": w_}
np_res = sigmoid((x_*w_))
coarse_eval = graph("out", input_dict)
np.testing.assert_allclose(np_res, coarse_eval)
lowered = set_shape_and_lower(graph, shape_dict)
keys = [f"out/out({i},)" for i in range(m_)]
x_ = np.random.randint(0, 10, m_).astype(float)
w_ = np.random.randint(0, 10, m_).astype(float)
input_dict = {}
for i in range(m_):
input_dict[f"x/x({i},)"] = x_[i]
input_dict[f"w/w({i},)"] = w_[i]
np_res = sigmoid((x_*w_))
lower_res = np.asarray(lowered(keys, input_dict)).reshape(np_res.shape)
np.testing.assert_allclose(lower_res, np_res)
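The sigmoid tests above import `sigmoid` from `tests.util`; the expected behavior is the standard logistic function. A minimal NumPy version, assuming `tests.util` implements the usual definition:

```python
import numpy as np

def sigmoid(z):
    """Standard logistic function, 1 / (1 + e^-z), elementwise."""
    return 1.0 / (1.0 + np.exp(-np.asarray(z, dtype=float)))
```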
| 34.941019 | 86 | 0.610143 | 2,211 | 13,033 | 3.320669 | 0.066033 | 0.055162 | 0.053119 | 0.081449 | 0.852629 | 0.824707 | 0.798965 | 0.761782 | 0.753201 | 0.737401 | 0 | 0.01561 | 0.208624 | 13,033 | 372 | 87 | 35.034946 | 0.696238 | 0 | 0 | 0.695925 | 0 | 0 | 0.049282 | 0.007983 | 0 | 0 | 0 | 0 | 0.115987 | 1 | 0.043887 | false | 0.137931 | 0.021944 | 0 | 0.068966 | 0.003135 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
42c5a7192b46e8f5994f1b0fe22d8d4c36136395 | 16,671 | py | Python | yacht/agents/modules/recurrent.py | IusztinPaul/yacht | c68ab7c66bde860bb91534c29e97772ba328adb5 | [
"Apache-2.0"
] | 5 | 2021-09-03T10:16:50.000Z | 2022-02-28T07:32:43.000Z | yacht/agents/modules/recurrent.py | IusztinPaul/yacht | c68ab7c66bde860bb91534c29e97772ba328adb5 | [
"Apache-2.0"
] | null | null | null | yacht/agents/modules/recurrent.py | IusztinPaul/yacht | c68ab7c66bde860bb91534c29e97772ba328adb5 | [
"Apache-2.0"
] | 1 | 2022-03-05T16:06:46.000Z | 2022-03-05T16:06:46.000Z | from typing import List, Optional, Dict
import gym
import torch
from stable_baselines3.common.torch_layers import BaseFeaturesExtractor
from torch import nn
from yacht.agents.misc import unflatten_observations
from yacht.agents.modules.torch_layers import SimplifiedVariableSelectionNetwork, LinearStack, Resample, AddNorm
class DayRecurrentFeatureExtractor(BaseFeaturesExtractor):
def __init__(
self,
observation_space: gym.Space,
features_dim: List[int],
window_size: int,
intervals: List[str],
features: List[str],
env_features_len: int,
num_assets: int,
include_weekends: bool,
activation_fn: nn.Module,
rnn_layer_type: nn.Module
):
super().__init__(observation_space, features_dim[-1])
assert len(features_dim) >= 3
assert len(set(features_dim[1:-1])) == 1, 'The features_dim of the recurrent layers should be equal.'
self.window_size = window_size
self.intervals = intervals
self.features = features
self.env_features_len = env_features_len
self.num_assets = num_assets
self.include_weekends = include_weekends
self.num_rnn_layers = len(features_dim[1:-1])
self.public_mlp = nn.Sequential(
nn.Linear(in_features=len(self.features) * self.num_assets, out_features=features_dim[0]),
activation_fn()
)
self.public_recurrent = rnn_layer_type(
features_dim[0],
features_dim[1],
num_layers=self.num_rnn_layers,
batch_first=True
)
self.private_mlp = nn.Sequential(
nn.Linear(in_features=env_features_len, out_features=features_dim[0]),
activation_fn()
)
self.private_recurrent = rnn_layer_type(
features_dim[0],
features_dim[1],
num_layers=self.num_rnn_layers,
batch_first=True
)
self.output_mlp = nn.Sequential(
nn.Linear(features_dim[1] * 2, features_dim[-1]),
activation_fn()
)
def forward(self, observations: torch.Tensor) -> torch.Tensor:
observations = unflatten_observations(
observations=observations,
intervals=self.intervals,
num_env_features=self.env_features_len,
num_assets=self.num_assets,
include_weekends=self.include_weekends
)
batch_size, window_size, bar_size, num_assets_size, features_size = observations['1d'].shape
public_input = observations['1d']
public_input = public_input.reshape(batch_size, window_size, -1)
batch_size, window_size, env_features = observations['env_features'].shape
private_input = observations['env_features']
public_input = self.public_mlp(public_input)
public_input, _ = self.public_recurrent(public_input)
public_input = public_input[:, -1, :]
private_input = self.private_mlp(private_input)
private_input, _ = self.private_recurrent(private_input)
private_input = private_input[:, -1, :]
output = torch.cat([public_input, private_input], dim=-1)
output = output.reshape(batch_size, -1)
output = self.output_mlp(output)
return output
class DayVSNRecurrentFeatureExtractor(BaseFeaturesExtractor):
def __init__(
self,
observation_space: gym.Space,
features_dim: List[int],
window_size: int,
intervals: List[str],
features: List[str],
env_features_len: int,
num_assets: int,
include_weekends: bool,
activation_fn: nn.Module,
rnn_layer_type: nn.Module,
dropout: Optional[float] = None,
attention_head_size: int = 1,
add_attention: bool = False,
add_normalization: bool = False,
add_output_vsn: bool = False,
add_residual: bool = False
):
super().__init__(observation_space, features_dim[-1])
assert len(features_dim) >= 3
assert len(set(features_dim[1:-1])) == 1, 'The features_dim of the recurrent layers should be equal.'
self.window_size = window_size
self.intervals = intervals
self.features = features
self.env_features_len = env_features_len
self.num_assets = num_assets
self.include_weekends = include_weekends
self.num_rnn_layers = len(features_dim[1:-1])
self.dropout = dropout if dropout and dropout > 0 else None
self.attention_head_size = attention_head_size
self.add_attention = add_attention
self.add_normalization = add_normalization
self.add_output_vsn = add_output_vsn
self.add_residual = add_residual
self.public_vsn = SimplifiedVariableSelectionNetwork(
public_features_len=len(self.features),
private_features_len=None,
num_assets=self.num_assets,
hidden_features=features_dim[0],
activation_fn=activation_fn,
dropout=self.dropout,
layers_type='grn',
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
self.public_recurrent = rnn_layer_type(
features_dim[0],
features_dim[1],
num_layers=self.num_rnn_layers,
batch_first=True
)
if self.add_residual is True:
self.public_resample = Resample(
in_features=features_dim[0],
out_features=features_dim[1],
activation_fn=activation_fn,
trainable_add=False,
dropout=self.dropout
)
self.public_add_norm = AddNorm(
out_features=features_dim[1],
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
self.private_mlp = LinearStack(
in_features=env_features_len,
out_features=features_dim[0],
activation_fn=activation_fn,
n=1,
hidden_features=features_dim[0],
dropout=self.dropout
)
self.private_recurrent = rnn_layer_type(
features_dim[0],
features_dim[1],
num_layers=self.num_rnn_layers,
batch_first=True
)
if self.add_residual is True:
self.private_resample = Resample(
in_features=features_dim[0],
out_features=features_dim[1],
activation_fn=activation_fn,
trainable_add=False,
dropout=self.dropout
)
self.private_add_norm = AddNorm(
out_features=features_dim[1],
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
if self.add_output_vsn is True:
self.output_vsn = SimplifiedVariableSelectionNetwork(
public_features_len=features_dim[1],
private_features_len=features_dim[1],
num_assets=1,
hidden_features=features_dim[1],
activation_fn=activation_fn,
dropout=self.dropout,
layers_type='grn',
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
if self.add_attention is True:
self.output_attn = nn.MultiheadAttention(
embed_dim=features_dim[1] if self.add_output_vsn is True else features_dim[1] * 2,
num_heads=self.attention_head_size,
dropout=self.dropout if self.dropout is not None else 0.0
)
if self.add_residual is True:
self.output_add_norm = AddNorm(
out_features=features_dim[1] if self.add_output_vsn is True else features_dim[1] * 2,
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
self.output_mlp = LinearStack(
in_features=features_dim[1] if self.add_output_vsn is True else features_dim[1] * 2,
out_features=features_dim[2],
activation_fn=activation_fn,
n=1,
hidden_features=features_dim[2],
dropout=self.dropout
)
def forward(self, observations: torch.Tensor) -> torch.Tensor:
observations = unflatten_observations(
observations=observations,
intervals=self.intervals,
num_env_features=self.env_features_len,
num_assets=self.num_assets,
include_weekends=self.include_weekends
)
batch_size, window_size, env_features = observations['env_features'].shape
private_input = observations.pop('env_features')
public_variables = self.split_variables(observations)
public_vsn_output = self.public_vsn(public_variables)
public_rnn_output, _ = self.public_recurrent(public_vsn_output)
if self.add_residual is True:
residual = self.public_resample(public_vsn_output)
public_rnn_output = self.public_add_norm(public_rnn_output, residual)
private_mlp_output = self.private_mlp(private_input)
private_rnn_output, _ = self.private_recurrent(private_mlp_output)
if self.add_residual is True:
residual = self.private_resample(private_mlp_output)
private_rnn_output = self.private_add_norm(private_rnn_output, residual)
if self.add_output_vsn:
variables = {
'public_features_0': public_rnn_output,
'private_features': private_rnn_output
}
aggregated_output = self.output_vsn(variables)
else:
aggregated_output = torch.cat([public_rnn_output, private_rnn_output], dim=-1)
if self.add_attention is True:
output, _ = self.output_attn(
query=aggregated_output[:, -1:],
key=aggregated_output,
value=aggregated_output
)
if self.add_residual is True:
output = self.output_add_norm(output, aggregated_output[:, -1:])
else:
output = aggregated_output[:, -1, :]
output = output.reshape(batch_size, -1)
output = self.output_mlp(output)
return output
@classmethod
def split_variables(cls, observations: Dict[str, torch.Tensor]) -> Dict[str, torch.Tensor]:
variables = dict()
for key, value in observations.items():
if key == 'env_features':
variables['private_features'] = value
elif key == '1d':
num_assets = value.shape[3]
for asset_idx in range(num_assets):
asset_values = value[..., asset_idx, :]
batch_size, window_size = asset_values.shape[:2]
asset_values = asset_values.view(batch_size, window_size, -1)
variables[f'public_features_{asset_idx}'] = asset_values
else:
raise RuntimeError(f'Unsupported observation type: {key}')
return variables
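`split_variables` slices the `1d` observation tensor per asset and flattens the bar and feature axes into one trailing dimension. The same index arithmetic, illustrated with NumPy (the shapes here are arbitrary example sizes, not values from the model):

```python
import numpy as np

# (batch, window, bars, assets, features) -- arbitrary example sizes
value = np.arange(2 * 5 * 1 * 3 * 4, dtype=float).reshape(2, 5, 1, 3, 4)
asset_idx = 0
asset_values = value[..., asset_idx, :]           # -> (batch, window, bars, features)
batch_size, window_size = asset_values.shape[:2]
# bars and features collapse into a single per-timestep feature vector
asset_values = asset_values.reshape(batch_size, window_size, -1)
```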
class OnlyVSNRecurrentFeatureExtractor(BaseFeaturesExtractor):
def __init__(
self,
observation_space: gym.Space,
features_dim: List[int],
window_size: int,
intervals: List[str],
features: List[str],
env_features_len: int,
num_assets: int,
include_weekends: bool,
activation_fn: nn.Module,
rnn_layer_type: nn.Module,
dropout: Optional[float] = None,
attention_head_size: int = 1,
add_attention: bool = False,
add_normalization: bool = False,
add_residual: bool = False
):
super().__init__(observation_space, features_dim[-1])
assert len(features_dim) >= 3
assert len(set(features_dim[1:-1])) == 1, 'The features_dim of the recurrent layers should be equal.'
self.window_size = window_size
self.intervals = intervals
self.features = features
self.env_features_len = env_features_len
self.num_assets = num_assets
self.include_weekends = include_weekends
self.num_rnn_layers = len(features_dim[1:-1])
self.dropout = dropout if dropout and dropout > 0 else None
self.attention_head_size = attention_head_size
self.add_attention = add_attention
self.add_normalization = add_normalization
self.add_residual = add_residual
self.vsn = SimplifiedVariableSelectionNetwork(
public_features_len=len(self.features),
private_features_len=None,
num_assets=self.num_assets,
hidden_features=features_dim[0],
activation_fn=activation_fn,
dropout=self.dropout,
layers_type='linear',
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
self.recurrent = rnn_layer_type(
features_dim[0],
features_dim[1],
num_layers=self.num_rnn_layers,
batch_first=True
)
if self.add_residual is True:
self.resample = Resample(
in_features=features_dim[0],
out_features=features_dim[1],
activation_fn=activation_fn,
trainable_add=False,
dropout=self.dropout
)
self.add_norm = AddNorm(
out_features=features_dim[1],
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
if self.add_attention is True:
self.output_attn = nn.MultiheadAttention(
embed_dim=features_dim[1],
num_heads=self.attention_head_size,
dropout=self.dropout if self.dropout is not None else 0.0
)
if self.add_residual is True:
self.output_add_norm = AddNorm(
out_features=features_dim[1],
add_normalization=self.add_normalization,
add_residual=self.add_residual
)
self.output_mlp = LinearStack(
in_features=features_dim[1],
out_features=features_dim[2],
activation_fn=activation_fn,
n=1,
hidden_features=features_dim[2],
dropout=self.dropout
)
def forward(self, observations: torch.Tensor) -> torch.Tensor:
observations = unflatten_observations(
observations=observations,
intervals=self.intervals,
num_env_features=self.env_features_len,
num_assets=self.num_assets,
include_weekends=self.include_weekends
)
batch_size, _, _ = observations['env_features'].shape
variables = self.split_variables(observations)
vsn_output = self.vsn(variables)
rnn_output, _ = self.recurrent(vsn_output)
if self.add_residual is True:
residual = self.resample(vsn_output)
rnn_output = self.add_norm(rnn_output, residual)
if self.add_attention is True:
output, _ = self.output_attn(
query=rnn_output[:, -1:],
key=rnn_output,
value=rnn_output
)
if self.add_residual is True:
output = self.output_add_norm(output, rnn_output[:, -1:])
else:
output = rnn_output[:, -1, :]
output = output.reshape(batch_size, -1)
output = self.output_mlp(output)
return output
@classmethod
def split_variables(cls, observations: Dict[str, torch.Tensor]) -> Dict[str, torch.Tensor]:
variables = dict()
for key, value in observations.items():
if key == 'env_features':
variables['private_features'] = value
elif key == '1d':
num_assets = value.shape[3]
for asset_idx in range(num_assets):
asset_values = value[..., asset_idx, :]
batch_size, window_size = asset_values.shape[:2]
asset_values = asset_values.view(batch_size, window_size, -1)
variables[f'public_features_{asset_idx}'] = asset_values
else:
raise RuntimeError(f'Unsupported observation type: {key}')
return variables
| 38.769767 | 112 | 0.604703 | 1,813 | 16,671 | 5.237728 | 0.071704 | 0.070661 | 0.042965 | 0.030118 | 0.840143 | 0.80139 | 0.784225 | 0.775906 | 0.770114 | 0.765796 | 0 | 0.008943 | 0.315878 | 16,671 | 429 | 113 | 38.86014 | 0.823674 | 0 | 0 | 0.701799 | 0 | 0 | 0.027833 | 0.003239 | 0 | 0 | 0 | 0 | 0.015424 | 1 | 0.020566 | false | 0 | 0.017995 | 0 | 0.059126 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6e1d70e08944e1d2c1d332221e636bd060a08e44 | 13,146 | py | Python | app/eventFrameNotes/routes.py | DeschutesBrewery/brewerypi | 5459dfc6b1ed415920c13a8a7c9a2d3d3c82099f | [
"MIT"
] | 27 | 2017-11-27T05:01:05.000Z | 2020-11-14T19:52:26.000Z | app/eventFrameNotes/routes.py | DeschutesBrewery/brewerypi | 5459dfc6b1ed415920c13a8a7c9a2d3d3c82099f | [
"MIT"
] | 259 | 2017-11-23T00:43:26.000Z | 2020-11-03T01:07:30.000Z | app/eventFrameNotes/routes.py | DeschutesBrewery/brewerypi | 5459dfc6b1ed415920c13a8a7c9a2d3d3c82099f | [
"MIT"
] | 8 | 2018-10-29T04:39:29.000Z | 2020-10-01T22:18:12.000Z | from flask import flash, redirect, render_template, request, url_for
from flask_login import current_user, login_required
from . import eventFrameNotes
from . forms import EventFrameNoteForm
from .. import db
from .. decorators import permissionRequired
from .. models import EventFrame, EventFrameGroup, EventFrameNote, Note, Permission
modelName = "Event Frame Notes"
@eventFrameNotes.route("/eventFrameNotes/add/<int:eventFrameId>", methods = ["GET", "POST"])
@eventFrameNotes.route("/eventFrameNotes/add/<int:eventFrameId>/<int:eventFrameGroupId>", methods = ["GET", "POST"])
@login_required
@permissionRequired(Permission.DATA_ENTRY)
def addEventFrameNote(eventFrameId, eventFrameGroupId = None):
operation = "Add"
form = EventFrameNoteForm()
# Add a new event frame note.
if form.validate_on_submit():
note = Note(Note = form.note.data, Timestamp = form.utcTimestamp.data, UserId = current_user.get_id())
db.session.add(note)
db.session.commit()
eventFrameNote = EventFrameNote(NoteId = note.NoteId, EventFrameId = eventFrameId)
db.session.add(eventFrameNote)
db.session.commit()
flash("You have successfully added a new Event Frame Note.", "alert alert-success")
return redirect(form.requestReferrer.data)
# Present a form to add a new event frame note.
if form.requestReferrer.data is None:
form.requestReferrer.data = request.referrer
eventFrame = EventFrame.query.get_or_404(eventFrameId)
if eventFrameGroupId is None:
eventFrameGroup = None
else:
eventFrameGroup = EventFrameGroup.query.get_or_404(eventFrameGroupId)
if eventFrame.ParentEventFrameId:
if eventFrameGroup is None:
breadcrumbs = [{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Root"),
"text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Enterprise",
selectedId = eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.Enterprise.EnterpriseId),
"text" : eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.Enterprise.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Site",
selectedId = eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.SiteId),
"text" : eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "ElementTemplate",
selectedId = eventFrame.origin().EventFrameTemplate.ElementTemplate.ElementTemplateId),
"text" : eventFrame.origin().EventFrameTemplate.ElementTemplate.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "EventFrameTemplate",
selectedId = eventFrame.origin().EventFrameTemplate.EventFrameTemplateId), "text" : eventFrame.origin().EventFrameTemplate.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.origin().EventFrameId), "text": eventFrame.origin().Name}]
for eventFrameAncestor in eventFrame.ancestors([]):
if eventFrameAncestor.ParentEventFrameId is not None:
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrameAncestor.EventFrameId),
"text" : "{} / {}".format(eventFrameAncestor.EventFrameTemplate.Name, eventFrameAncestor.Name)})
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId),
"text" : "{} / {}".format(eventFrame.EventFrameTemplate.Name, eventFrame.Name)})
else:
breadcrumbs = [{"url" : url_for("eventFrameGroups.listEventFrameGroups"), "text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrameGroups.dashboard", eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : eventFrameGroup.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.origin().EventFrameId,
eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : eventFrame.origin().Name}]
for eventFrameAncestor in eventFrame.ancestors([]):
if eventFrameAncestor.ParentEventFrameId is not None:
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrameAncestor.EventFrameId,
eventFrameGroupId = eventFrameGroup.EventFrameGroupId),
"text" : "{} / {}".format(eventFrameAncestor.EventFrameTemplate.Name, eventFrameAncestor.Name)})
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId,
eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : "{} / {}".format(eventFrame.EventFrameTemplate.Name, eventFrame.Name)})
else:
if eventFrameGroup is None:
breadcrumbs = [{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Root"),
"text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Enterprise",
selectedId = eventFrame.EventFrameTemplate.ElementTemplate.Site.Enterprise.EnterpriseId),
"text" : eventFrame.EventFrameTemplate.ElementTemplate.Site.Enterprise.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Site",
selectedId = eventFrame.EventFrameTemplate.ElementTemplate.Site.SiteId),
"text" : eventFrame.EventFrameTemplate.ElementTemplate.Site.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "ElementTemplate",
selectedId = eventFrame.EventFrameTemplate.ElementTemplate.ElementTemplateId),
"text" : eventFrame.EventFrameTemplate.ElementTemplate.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "EventFrameTemplate",
selectedId = eventFrame.EventFrameTemplate.EventFrameTemplateId), "text" : eventFrame.EventFrameTemplate.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId), "text" : eventFrame.Name}]
else:
breadcrumbs = [{"url" : url_for("eventFrameGroups.listEventFrameGroups"), "text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrameGroups.dashboard", eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : eventFrameGroup.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId, eventFrameGroupId = eventFrameGroup.EventFrameGroupId),
"text" : eventFrame.Name}]
return render_template("addEditWithTimestamp.html", breadcrumbs = breadcrumbs, form = form, modelName = modelName, operation = operation)
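# Both the add and edit views build the same "{template} / {name}" breadcrumb
# label for each ancestor event frame. The formatting step in isolation (a
# sketch; the helper name is hypothetical and not part of this module):

```python
def ancestor_breadcrumb_text(template_name, frame_name):
    """Label used for nested event frames in the dashboard breadcrumbs."""
    return "{} / {}".format(template_name, frame_name)
```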
@eventFrameNotes.route("/eventFrameNotes/delete/<int:eventFrameId>/<int:noteId>", methods = ["GET", "POST"])
@login_required
@permissionRequired(Permission.DATA_ENTRY)
def deleteEventFrameNote(eventFrameId, noteId):
eventFrameNote = EventFrameNote.query.filter_by(EventFrameId = eventFrameId, NoteId = noteId).first()
eventFrameNote.delete()
db.session.commit()
flash("You have successfully deleted the event frame note.", "alert alert-success")
return redirect(request.referrer)
@eventFrameNotes.route("/eventFrameNotes/edit/<int:eventFrameId>/<int:noteId>", methods = ["GET", "POST"])
@eventFrameNotes.route("/eventFrameNotes/edit/<int:eventFrameId>/<int:noteId>/<int:eventFrameGroupId>", methods = ["GET", "POST"])
@login_required
@permissionRequired(Permission.DATA_ENTRY)
def editEventFrameNote(eventFrameId, noteId, eventFrameGroupId = None):
operation = "Edit"
note = Note.query.get_or_404(noteId)
form = EventFrameNoteForm()
# Edit an existing event frame note.
if form.validate_on_submit():
note.Note = form.note.data
note.Timestamp = form.utcTimestamp.data
note.UserId = current_user.get_id()
db.session.commit()
flash("You have successfully edited the event frame note.", "alert alert-success")
return redirect(form.requestReferrer.data)
# Present a form to edit an existing event frame note.
form.note.data = note.Note
form.timestamp.data = note.Timestamp
if form.requestReferrer.data is None:
form.requestReferrer.data = request.referrer
eventFrame = EventFrame.query.get_or_404(eventFrameId)
if eventFrameGroupId is None:
eventFrameGroup = None
else:
eventFrameGroup = EventFrameGroup.query.get_or_404(eventFrameGroupId)
if eventFrame.ParentEventFrameId:
if eventFrameGroup is None:
breadcrumbs = [{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Root"),
"text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Enterprise",
selectedId = eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.Enterprise.EnterpriseId),
"text" : eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.Enterprise.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Site",
selectedId = eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.SiteId),
"text" : eventFrame.origin().EventFrameTemplate.ElementTemplate.Site.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "ElementTemplate",
selectedId = eventFrame.origin().EventFrameTemplate.ElementTemplate.ElementTemplateId),
"text" : eventFrame.origin().EventFrameTemplate.ElementTemplate.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "EventFrameTemplate",
selectedId = eventFrame.origin().EventFrameTemplate.EventFrameTemplateId), "text" : eventFrame.origin().EventFrameTemplate.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.origin().EventFrameId), "text": eventFrame.origin().Name}]
for eventFrameAncestor in eventFrame.ancestors([]):
if eventFrameAncestor.ParentEventFrameId is not None:
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrameAncestor.EventFrameId),
"text" : "{} / {}".format(eventFrameAncestor.EventFrameTemplate.Name, eventFrameAncestor.Name)})
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId),
"text" : "{} / {}".format(eventFrame.EventFrameTemplate.Name, eventFrame.Name)})
breadcrumbs.append({"url": None, "text": note.Timestamp})
else:
breadcrumbs = [{"url" : url_for("eventFrameGroups.listEventFrameGroups"), "text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrameGroups.dashboard", eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : eventFrameGroup.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.origin().EventFrameId,
eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : eventFrame.origin().Name}]
for eventFrameAncestor in eventFrame.ancestors([]):
if eventFrameAncestor.ParentEventFrameId is not None:
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrameAncestor.EventFrameId,
eventFrameGroupId = eventFrameGroup.EventFrameGroupId),
"text" : "{} / {}".format(eventFrameAncestor.EventFrameTemplate.Name, eventFrameAncestor.Name)})
breadcrumbs.append({"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId,
eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : "{} / {}".format(eventFrame.EventFrameTemplate.Name, eventFrame.Name)})
breadcrumbs.append({"url": None, "text": note.Timestamp})
else:
if eventFrameGroup is None:
breadcrumbs = [{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Root"),
"text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Enterprise",
selectedId = eventFrame.EventFrameTemplate.ElementTemplate.Site.Enterprise.EnterpriseId),
"text" : eventFrame.EventFrameTemplate.ElementTemplate.Site.Enterprise.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "Site",
selectedId = eventFrame.EventFrameTemplate.ElementTemplate.Site.SiteId),
"text" : eventFrame.EventFrameTemplate.ElementTemplate.Site.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "ElementTemplate",
selectedId = eventFrame.EventFrameTemplate.ElementTemplate.ElementTemplateId),
"text" : eventFrame.EventFrameTemplate.ElementTemplate.Name},
{"url" : url_for("eventFrames.selectEventFrame", selectedClass = "EventFrameTemplate",
selectedId = eventFrame.EventFrameTemplate.EventFrameTemplateId), "text" : eventFrame.EventFrameTemplate.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId), "text" : eventFrame.Name},
{"url" : None, "text" : note.Timestamp}]
else:
breadcrumbs = [{"url" : url_for("eventFrameGroups.listEventFrameGroups"), "text" : "<span class = \"glyphicon glyphicon-home\"></span>"},
{"url" : url_for("eventFrameGroups.dashboard", eventFrameGroupId = eventFrameGroup.EventFrameGroupId), "text" : eventFrameGroup.Name},
{"url" : url_for("eventFrames.dashboard", eventFrameId = eventFrame.EventFrameId, eventFrameGroupId = eventFrameGroup.EventFrameGroupId),
"text" : eventFrame.Name}]
breadcrumbs.append({"url": None, "text": note.Timestamp})
return render_template("addEditWithTimestamp.html", breadcrumbs = breadcrumbs, form = form, modelName = modelName, operation = operation)
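Both views above hand the template a `breadcrumbs` list of `{"url", "text"}` dicts, with `url = None` marking the non-clickable current entry. A minimal standalone illustration (the URL strings are hypothetical literals standing in for the `url_for(...)` calls):

```python
# Illustration only: the breadcrumb structure the views above build.
breadcrumbs = [
    {"url": "/eventFrameGroups", "text": "Home"},
    {"url": "/eventFrameGroups/dashboard/1", "text": "Group 1"},
    {"url": "/eventFrames/dashboard/7", "text": "Frame 7"},
    {"url": None, "text": "2019-02-18 13:13:00"},  # current page: no link
]

# The template renders entries left to right; only entries with a URL link.
linked = [b["text"] for b in breadcrumbs if b["url"] is not None]
assert linked == ["Home", "Group 1", "Frame 7"]
```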
| 64.758621 | 141 | 0.753689 | 1,223 | 13,146 | 8.039248 | 0.097302 | 0.027461 | 0.040277 | 0.07323 | 0.903478 | 0.896359 | 0.884154 | 0.862998 | 0.847335 | 0.837978 | 0 | 0.001282 | 0.110224 | 13,146 | 202 | 142 | 65.079208 | 0.839275 | 0.011867 | 0 | 0.794444 | 0 | 0 | 0.195071 | 0.114363 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016667 | false | 0 | 0.038889 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
280c71a89af496ef418a925a6006df37d116de7c | 3,686 | py | Python | backend/audits/migrations/0004_auto_20190218_1313.py | donroyco/falco | 305889f867907660c8a67e6f40b38289dcd22ab7 | [
"MIT"
] | 796 | 2019-10-19T19:58:12.000Z | 2022-03-22T14:02:37.000Z | backend/audits/migrations/0004_auto_20190218_1313.py | Johann-S/falco | c4713d0ab38f3a8a4601b1594fd22a4be170933d | [
"MIT"
] | 224 | 2019-10-19T17:45:12.000Z | 2022-03-24T20:46:29.000Z | backend/audits/migrations/0004_auto_20190218_1313.py | Johann-S/falco | c4713d0ab38f3a8a4601b1594fd22a4be170933d | [
"MIT"
] | 33 | 2019-10-22T21:17:09.000Z | 2021-12-23T06:08:26.000Z | # Generated by Django 2.1.5 on 2019-02-18 13:13
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [("audits", "0003_auto_20190206_1513")]
operations = [
migrations.RemoveField(model_name="auditresults", name="wpt_metric_tti"),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_first_view_first_contentful_paint",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_first_view_first_meaningful_paint",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_first_view_first_paint",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_first_view_load_time",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_first_view_speed_index",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_first_view_time_to_first_byte",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_first_view_tti",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_lighthouse_performance",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_repeat_view_first_contentful_paint",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_repeat_view_first_meaningful_paint",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_repeat_view_first_paint",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_repeat_view_load_time",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_repeat_view_speed_index",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_repeat_view_time_to_first_byte",
field=models.IntegerField(blank=True, null=True),
),
migrations.AddField(
model_name="auditresults",
name="wpt_metric_repeat_view_tti",
field=models.IntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name="auditstatushistory",
name="status",
field=models.CharField(
choices=[
("REQUESTED", "REQUESTED"),
("PENDING", "PENDING"),
("ERROR", "ERROR"),
("SUCCESS", "SUCCESS"),
],
max_length=10,
),
),
]
| 36.49505 | 81 | 0.587629 | 349 | 3,686 | 5.91404 | 0.183381 | 0.074128 | 0.162791 | 0.193798 | 0.837694 | 0.837694 | 0.821221 | 0.821221 | 0.821221 | 0.821221 | 0 | 0.012906 | 0.306294 | 3,686 | 100 | 82 | 36.86 | 0.79429 | 0.012208 | 0 | 0.659574 | 1 | 0 | 0.234955 | 0.154713 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.010638 | 0 | 0.042553 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
282ff3e6d51fe05a38aa463ea215772a815dd22f | 84 | py | Python | starfish/core/image/__init__.py | haoxusci/starfish | d7bd856024c75f2ce41504406f2a663566c3814b | [
"MIT"
] | 164 | 2018-03-21T21:52:56.000Z | 2022-03-23T17:14:39.000Z | starfish/core/image/__init__.py | lbgbox/starfish | 0e879d995d5c49b6f5a842e201e3be04c91afc7e | [
"MIT"
] | 1,728 | 2018-03-15T23:16:09.000Z | 2022-03-12T00:09:18.000Z | starfish/core/image/__init__.py | lbgbox/starfish | 0e879d995d5c49b6f5a842e201e3be04c91afc7e | [
"MIT"
] | 66 | 2018-03-25T17:21:15.000Z | 2022-01-16T09:17:11.000Z | from ._registration import ApplyTransform
from ._registration import LearnTransform
| 28 | 41 | 0.880952 | 8 | 84 | 9 | 0.625 | 0.444444 | 0.611111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 84 | 2 | 42 | 42 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
956510b39080a8021f0600003b0efb393f0fcfcd | 1,091 | py | Python | tests/lamper/test_decorators.py | epopeia/lamper | d6aaea78b4df4ef1535ecf81ca42f9f35a635e12 | [
"MIT"
] | 1 | 2020-09-27T01:18:06.000Z | 2020-09-27T01:18:06.000Z | tests/lamper/test_decorators.py | epopeia/lamper | d6aaea78b4df4ef1535ecf81ca42f9f35a635e12 | [
"MIT"
] | null | null | null | tests/lamper/test_decorators.py | epopeia/lamper | d6aaea78b4df4ef1535ecf81ca42f9f35a635e12 | [
"MIT"
] | null | null | null | from lamper import decorators
def test_post_decorator():
def my_func(): ...
dec = decorators.Mapping().post('/test')
dec_func = dec(my_func)
assert type(dec_func).__name__ == 'function'
def test_put_decorator():
def my_func(): ...
dec = decorators.Mapping().put('/test')
dec_func = dec(my_func)
assert type(dec_func).__name__ == 'function'
def test_delete_decorator():
def my_func(): ...
dec = decorators.Mapping().delete('/test')
dec_func = dec(my_func)
assert type(dec_func).__name__ == 'function'
def test_patch_decorator():
def my_func(): ...
dec = decorators.Mapping().patch('/test')
dec_func = dec(my_func)
assert type(dec_func).__name__ == 'function'
def test_options_decorator():
def my_func(): ...
dec = decorators.Mapping().options('/test')
dec_func = dec(my_func)
assert type(dec_func).__name__ == 'function' | 21.392157 | 48 | 0.648946 | 142 | 1,091 | 4.56338 | 0.140845 | 0.111111 | 0.12963 | 0.166667 | 0.895062 | 0.895062 | 0.895062 | 0.660494 | 0.660494 | 0.660494 | 0 | 0 | 0.191567 | 1,091 | 51 | 49 | 21.392157 | 0.734694 | 0 | 0 | 0.709677 | 0 | 0 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0.193548 | 1 | 0.387097 | false | 0 | 0.032258 | 0 | 0.419355 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
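The five near-identical tests above could be driven by a single loop over the HTTP verbs. `Mapping` below is a hypothetical stand-in with the same call shape as `decorators.Mapping`, not the real lamper implementation:

```python
# Sketch only: a stand-in Mapping that mimics the verb(path) -> decorator shape.
class Mapping:
    def _register(self, verb, path):
        def decorator(func):
            func.route = (verb, path)  # annotate instead of real routing
            return func
        return decorator

    def __getattr__(self, verb):
        if verb in ("get", "post", "put", "delete", "patch", "options"):
            return lambda path, v=verb: self._register(v, path)
        raise AttributeError(verb)

def check(verb):
    def my_func(): ...
    dec = getattr(Mapping(), verb)('/test')
    dec_func = dec(my_func)
    assert type(dec_func).__name__ == 'function'
    assert dec_func.route == (verb, '/test')

for verb in ("post", "put", "delete", "patch", "options"):
    check(verb)
```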
95b7d82bceeefd0999f13d326ddfa0848da859e3 | 24,627 | py | Python | snips_nlu/tests/test_crf_utils.py | CharlyBlavier/snips-nlu-Copy | 829d513ac464e0421a264fd64d8b94f59a09875e | [
"Apache-2.0"
] | 3,764 | 2018-02-27T08:25:52.000Z | 2022-03-30T17:59:22.000Z | snips_nlu/tests/test_crf_utils.py | unicorns18/snips-nlu | 74b2893c91fc0bafc919a7e088ecb0b2bd611acf | [
"Apache-2.0"
] | 305 | 2018-02-28T13:45:23.000Z | 2022-03-10T15:33:35.000Z | snips_nlu/tests/test_crf_utils.py | unicorns18/snips-nlu | 74b2893c91fc0bafc919a7e088ecb0b2bd611acf | [
"Apache-2.0"
] | 559 | 2018-03-04T15:44:15.000Z | 2022-03-21T17:00:21.000Z | from __future__ import unicode_literals
from builtins import range
from mock import patch
from snips_nlu.constants import LANGUAGE_EN
from snips_nlu.preprocessing import Token, tokenize
from snips_nlu.result import unresolved_slot
from snips_nlu.slot_filler.crf_utils import (
BEGINNING_PREFIX, INSIDE_PREFIX, LAST_PREFIX, OUTSIDE, TaggingScheme,
UNIT_PREFIX, end_of_bilou_slot, end_of_bio_slot, negative_tagging,
positive_tagging, start_of_bilou_slot, start_of_bio_slot, tags_to_slots,
utterance_to_sample)
from snips_nlu.tests.utils import SnipsTest
class TestCRFUtils(SnipsTest):
def test_io_tags_to_slots(self):
# Given
language = LANGUAGE_EN
slot_name = "animal"
intent_slots_mapping = {"animal": "animal"}
tags = [
{
"text": "",
"tags": [],
"expected_slots": []
},
{
"text": "nothing here",
"tags": [OUTSIDE, OUTSIDE],
"expected_slots": []
},
{
"text": "i am a blue bird",
"tags": [OUTSIDE, OUTSIDE, OUTSIDE,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(7, 16),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "i am a bird",
"tags": [OUTSIDE, OUTSIDE, OUTSIDE,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(7, 11),
value="bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "bird",
"tags": [INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 4),
value="bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "blue bird",
"tags": [INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 9),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "light blue bird blue bird",
"tags": [INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 25),
value="light blue bird blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "bird birdy",
"tags": [INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 10),
value="bird birdy",
entity=slot_name,
slot_name=slot_name
)
]
}
]
for data in tags:
# When
slots = tags_to_slots(
data["text"], tokenize(data["text"], language),
data["tags"], TaggingScheme.IO,
intent_slots_mapping)
# Then
self.assertEqual(slots, data["expected_slots"])
def test_bio_tags_to_slots(self):
# Given
language = LANGUAGE_EN
slot_name = "animal"
intent_slots_mapping = {"animal": "animal"}
tags = [
{
"text": "",
"tags": [],
"expected_slots": []
},
{
"text": "nothing here",
"tags": [OUTSIDE, OUTSIDE],
"expected_slots": []
},
{
"text": "i am a blue bird",
"tags": [OUTSIDE, OUTSIDE, OUTSIDE,
BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(7, 16),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "i am a bird",
"tags": [OUTSIDE, OUTSIDE, OUTSIDE,
BEGINNING_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(7, 11),
value="bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "bird",
"tags": [BEGINNING_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 4),
value="bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "blue bird",
"tags": [BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 9),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "light blue bird blue bird",
"tags": [BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 15),
value="light blue bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(16, 25),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "bird birdy",
"tags": [BEGINNING_PREFIX + slot_name,
BEGINNING_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 4),
value="bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(5, 10),
value="birdy",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "blue bird and white bird",
"tags": [BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
OUTSIDE,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 9),
value="blue bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(14, 24),
value="white bird",
entity=slot_name,
slot_name=slot_name
)
]
}
]
for data in tags:
# When
slots = tags_to_slots(
data["text"], tokenize(data["text"], language),
data["tags"], TaggingScheme.BIO,
intent_slots_mapping)
# Then
self.assertEqual(slots, data["expected_slots"])
def test_bilou_tags_to_slots(self):
# Given
language = LANGUAGE_EN
slot_name = "animal"
intent_slots_mapping = {"animal": "animal"}
tags = [
{
"text": "",
"tags": [],
"expected_slots": []
},
{
"text": "nothing here",
"tags": [OUTSIDE, OUTSIDE],
"expected_slots": []
},
{
"text": "i am a blue bird",
"tags": [OUTSIDE, OUTSIDE, OUTSIDE,
BEGINNING_PREFIX + slot_name,
LAST_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(7, 16),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "i am a bird",
"tags": [OUTSIDE, OUTSIDE, OUTSIDE,
UNIT_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(7, 11),
value="bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "bird",
"tags": [UNIT_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 4),
value="bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "blue bird",
"tags": [BEGINNING_PREFIX + slot_name,
LAST_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 9),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "light blue bird blue bird",
"tags": [BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
LAST_PREFIX + slot_name,
BEGINNING_PREFIX + slot_name,
LAST_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 15),
value="light blue bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(16, 25),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "bird birdy",
"tags": [UNIT_PREFIX + slot_name,
UNIT_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 4),
value="bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(5, 10),
value="birdy",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "light bird bird blue bird",
"tags": [BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name,
UNIT_PREFIX + slot_name,
BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 10),
value="light bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(11, 15),
value="bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(16, 25),
value="blue bird",
entity=slot_name,
slot_name=slot_name
)
]
},
{
"text": "bird bird bird",
"tags": [LAST_PREFIX + slot_name,
BEGINNING_PREFIX + slot_name,
UNIT_PREFIX + slot_name],
"expected_slots": [
unresolved_slot(
match_range=(0, 4),
value="bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(5, 9),
value="bird",
entity=slot_name,
slot_name=slot_name
),
unresolved_slot(
match_range=(10, 14),
value="bird",
entity=slot_name,
slot_name=slot_name
)
]
},
]
for data in tags:
# When
slots = tags_to_slots(
data["text"], tokenize(data["text"], language),
data["tags"], TaggingScheme.BILOU,
intent_slots_mapping)
# Then
self.assertEqual(slots, data["expected_slots"])
def test_positive_tagging_should_handle_zero_length(self):
# Given
slot_name = "animal"
slot_size = 0
# When
tags = []
for scheme in TaggingScheme:
tags.append(positive_tagging(scheme, slot_name, slot_size))
# Then
expected_tags = [[]] * len(TaggingScheme)
self.assertEqual(tags, expected_tags)
@patch('snips_nlu.slot_filler.crf_utils.positive_tagging')
def test_utterance_to_sample(self, mocked_positive_tagging):
# Given
language = LANGUAGE_EN
def mock_positive_tagging(_, slot, slot_size):
return [INSIDE_PREFIX + slot for _ in range(slot_size)]
mocked_positive_tagging.side_effect = mock_positive_tagging
slot_name = "animal"
query_data = [{"text": "i am a "},
{"text": "beautiful bird", "slot_name": slot_name}]
expected_tagging = [OUTSIDE, OUTSIDE, OUTSIDE,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name]
expected_tokens = [Token(value='i', start=0, end=1),
Token(value='am', start=2, end=4),
Token(value='a', start=5, end=6),
Token(value='beautiful', start=7, end=16),
Token(value='bird', start=17, end=21)]
expected_sample = {"tokens": expected_tokens,
"tags": expected_tagging}
# When
sample = utterance_to_sample(query_data, TaggingScheme.IO, language)
# Then
self.assertEqual(sample, expected_sample)
@patch('snips_nlu.slot_filler.crf_utils.positive_tagging')
def test_utterance_to_sample_with_partial_slots(self,
mocked_positive_tagging):
# Given
language = LANGUAGE_EN
def mock_positive_tagging(_, slot, slot_size):
return [INSIDE_PREFIX + slot for _ in range(slot_size)]
mocked_positive_tagging.side_effect = mock_positive_tagging
slot_name = "animal"
query_data = [{"text": "i am a b"},
{"text": "eautiful bird", "slot_name": slot_name}]
expected_tagging = [OUTSIDE, OUTSIDE, OUTSIDE, OUTSIDE,
INSIDE_PREFIX + slot_name,
INSIDE_PREFIX + slot_name]
expected_tokens = [Token(value='i', start=0, end=1),
Token(value='am', start=2, end=4),
Token(value='a', start=5, end=6),
Token(value='b', start=7, end=8),
Token(value='eautiful', start=8, end=16),
Token(value='bird', start=17, end=21)]
expected_sample = {"tokens": expected_tokens, "tags": expected_tagging}
# When
sample = utterance_to_sample(query_data, TaggingScheme.IO, language)
# Then
mocked_positive_tagging.assert_called()
self.assertEqual(sample, expected_sample)
def test_negative_tagging(self):
# Given
size = 3
# When
tagging = negative_tagging(size)
# Then
expected_tagging = [OUTSIDE, OUTSIDE, OUTSIDE]
self.assertListEqual(tagging, expected_tagging)
def test_positive_tagging_with_io(self):
# Given
tagging_scheme = TaggingScheme.IO
slot_name = "animal"
slot_size = 3
# When
tags = positive_tagging(tagging_scheme, slot_name, slot_size)
# Then
t = INSIDE_PREFIX + slot_name
expected_tags = [t, t, t]
self.assertListEqual(tags, expected_tags)
def test_positive_tagging_with_bio(self):
# Given
tagging_scheme = TaggingScheme.BIO
slot_name = "animal"
slot_size = 3
# When
tags = positive_tagging(tagging_scheme, slot_name, slot_size)
# Then
expected_tags = [BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name, INSIDE_PREFIX + slot_name]
self.assertListEqual(tags, expected_tags)
def test_positive_tagging_with_bilou(self):
# Given
tagging_scheme = TaggingScheme.BILOU
slot_name = "animal"
slot_size = 3
# When
tags = positive_tagging(tagging_scheme, slot_name, slot_size)
# Then
expected_tags = [BEGINNING_PREFIX + slot_name,
INSIDE_PREFIX + slot_name, LAST_PREFIX + slot_name]
self.assertListEqual(tags, expected_tags)
def test_positive_tagging_with_bilou_unit(self):
# Given
tagging_scheme = TaggingScheme.BILOU
slot_name = "animal"
slot_size = 1
# When
tags = positive_tagging(tagging_scheme, slot_name, slot_size)
# Then
expected_tags = [UNIT_PREFIX + slot_name]
self.assertListEqual(tags, expected_tags)
def test_start_of_bio_slot(self):
# Given
tags = [
OUTSIDE,
BEGINNING_PREFIX,
INSIDE_PREFIX,
OUTSIDE,
INSIDE_PREFIX,
OUTSIDE,
BEGINNING_PREFIX,
OUTSIDE,
INSIDE_PREFIX,
BEGINNING_PREFIX,
OUTSIDE,
BEGINNING_PREFIX,
BEGINNING_PREFIX,
INSIDE_PREFIX,
INSIDE_PREFIX
]
# When
starts_of_bio = [start_of_bio_slot(tags, i) for i in range(len(tags))]
# Then
expected_starts = [
False,
True,
False,
False,
True,
False,
True,
False,
True,
True,
False,
True,
True,
False,
False
]
self.assertListEqual(starts_of_bio, expected_starts)
def test_end_of_bio_slot(self):
# Given
tags = [
OUTSIDE,
BEGINNING_PREFIX,
INSIDE_PREFIX,
OUTSIDE,
INSIDE_PREFIX,
OUTSIDE,
BEGINNING_PREFIX,
OUTSIDE,
INSIDE_PREFIX,
BEGINNING_PREFIX,
OUTSIDE,
BEGINNING_PREFIX,
BEGINNING_PREFIX,
INSIDE_PREFIX,
INSIDE_PREFIX
]
# When
ends_of_bio = [end_of_bio_slot(tags, i) for i in range(len(tags))]
# Then
expected_ends = [
False,
False,
True,
False,
True,
False,
True,
False,
True,
True,
False,
True,
False,
False,
True
]
self.assertListEqual(ends_of_bio, expected_ends)
def test_start_of_bilou_slot(self):
# Given
tags = [
OUTSIDE,
LAST_PREFIX,
UNIT_PREFIX,
BEGINNING_PREFIX,
UNIT_PREFIX,
INSIDE_PREFIX,
LAST_PREFIX,
LAST_PREFIX,
UNIT_PREFIX,
UNIT_PREFIX,
LAST_PREFIX,
OUTSIDE,
LAST_PREFIX,
BEGINNING_PREFIX,
INSIDE_PREFIX,
INSIDE_PREFIX,
LAST_PREFIX
]
# When
starts_of_bilou = [start_of_bilou_slot(tags, i) for i in
range(len(tags))]
# Then
expected_starts = [
False,
True,
True,
True,
True,
True,
False,
True,
True,
True,
True,
False,
True,
True,
False,
False,
False
]
self.assertListEqual(starts_of_bilou, expected_starts)
def test_end_of_bilou_slot(self):
# Given
tags = [
OUTSIDE,
LAST_PREFIX,
UNIT_PREFIX,
BEGINNING_PREFIX,
UNIT_PREFIX,
INSIDE_PREFIX,
LAST_PREFIX,
LAST_PREFIX,
UNIT_PREFIX,
UNIT_PREFIX,
LAST_PREFIX,
OUTSIDE,
INSIDE_PREFIX,
BEGINNING_PREFIX,
OUTSIDE,
BEGINNING_PREFIX,
INSIDE_PREFIX,
INSIDE_PREFIX,
LAST_PREFIX
]
# When
ends_of_bilou = [end_of_bilou_slot(tags, i) for i in range(len(tags))]
# Then
expected_ends = [
False,
True,
True,
True,
True,
False,
True,
True,
True,
True,
True,
False,
True,
True,
False,
False,
False,
False,
True
]
self.assertListEqual(ends_of_bilou, expected_ends)
| 31.776774 | 79 | 0.417631 | 1,977 | 24,627 | 4.884674 | 0.062721 | 0.142487 | 0.083256 | 0.102723 | 0.895516 | 0.861033 | 0.847779 | 0.834835 | 0.831521 | 0.798281 | 0 | 0.009744 | 0.504081 | 24,627 | 774 | 80 | 31.817829 | 0.780971 | 0.009705 | 0 | 0.704918 | 0 | 0 | 0.063717 | 0.003944 | 0 | 0 | 0 | 0 | 0.023845 | 1 | 0.025335 | false | 0 | 0.011923 | 0.002981 | 0.041729 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
95c1b3f12dc681f52b29b063ae7736a498d316cc | 15,048 | py | Python | sds_bayesian_numpy/initial.py | thlautenschlaeger/sds | 710e7c6b1b15afd0476b6f0f60c03415571bebeb | [
"MIT"
] | null | null | null | sds_bayesian_numpy/initial.py | thlautenschlaeger/sds | 710e7c6b1b15afd0476b6f0f60c03415571bebeb | [
"MIT"
] | null | null | null | sds_bayesian_numpy/initial.py | thlautenschlaeger/sds | 710e7c6b1b15afd0476b6f0f60c03415571bebeb | [
"MIT"
] | null | null | null | import numpy as np
import autograd.numpy.random as npr
from scipy.special import digamma, logsumexp
from scipy.stats import multivariate_normal as mvn
from sds_bayesian_numpy.ext.stats import multivariate_normal_logpdf as mvn_logpdf
class InitialState:
def __init__(self, n_states, prior={'omega0':1}):
self.n_states = n_states
self.prior = prior
self.posterior = self.init_posterior()
def init_posterior(self):
return {'omega': np.random.random(size=(self.n_states))}
@property
def log_init(self):
""" Sub normalized log probabilities
:returns: [num_states]
"""
_tmp = digamma(np.sum(self.posterior['omega'], axis=0))
return np.array([digamma(omega) - _tmp for omega in self.posterior['omega']])
@property
def log_prob(self):
""" Normalized log probability
:returns: [num_states]
"""
return self.log_init - logsumexp(self.log_init)
@property
def initial_distribution(self):
return np.exp(self.log_prob)
def sample(self):
return npr.choice(self.n_states, p=self.initial_distribution)
def m_step(self, gamma, **kwargs):
_gamma = sum([_w[0, :] for _w in gamma])
_gamma = _gamma / sum(_gamma)
self.posterior['omega'] = self.prior['omega0'] + _gamma
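A standalone sketch of the computation behind `InitialState.log_init` and `log_prob`: for a Dirichlet posterior with parameters `omega`, E[log pi_k] = digamma(omega_k) - digamma(sum_j omega_j), which is sub-normalized; a `logsumexp` normalization then yields a proper distribution (the `omega` values here are hypothetical):

```python
import numpy as np
from scipy.special import digamma, logsumexp

omega = np.array([2.0, 5.0, 3.0])              # hypothetical posterior counts
log_init = digamma(omega) - digamma(omega.sum())
log_prob = log_init - logsumexp(log_init)      # normalize in log space
init_dist = np.exp(log_prob)

assert np.isclose(init_dist.sum(), 1.0)
assert init_dist.argmax() == 1                 # largest omega -> largest mass
```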
class GaussianInitState:
def __init__(self, state_dim, obs_dim, prior=None, reg=1e-128):
self.state_dim = state_dim
self.obs_dim = obs_dim
self.reg = reg
self.mean = np.random.random((state_dim, obs_dim))
self.cov = np.random.random((state_dim, obs_dim, obs_dim))
for k in range(state_dim):
self.cov[k] = 0.5 * (self.cov[k] + self.cov[k].T)
self.cov[k] += self.obs_dim * np.eye(self.obs_dim)
@property
def params(self):
return self.mean, self.cov
@params.setter
def params(self, value):
self.mean, self.cov = value[0], value[1]
def sample(self, z):
_x = mvn(mean=self.mean[z], cov=self.cov[z, ...]).rvs()
return np.atleast_1d(_x)
def log_likelihood(self, x, u=None):
loglik = []
for _x in x:
_loglik = np.column_stack([mvn.logpdf(_x[0, :], self.mean[k], self.cov[k], allow_singular=True)
for k in range(self.state_dim)])
loglik.append(_loglik)
return loglik
def update_gauss_params(self, x, gamma):
_norm = 0
mu = 0
sig = 0
for _x, _gamma in zip(x, gamma):
# Only the first observation of each sequence parameterises the initial
# state, matching log_likelihood above, which scores _x[0, :] only.
_norm += np.sum(_gamma[0:1, :, None], axis=0) # + self.reg
mu += np.sum(_gamma[0:1, :, None] * _x[0:1, None, :], axis=0)
mu /= _norm + self.reg
for _x, _gamma in zip(x, gamma):
resid = _x[0:1, None, :] - mu
sig += np.sum(_gamma[0:1, :, None, None] * resid[:, :, None, :] * resid[:, :, :, None], axis=0)
sig /= _norm[:, None]
self.params = mu, sig
def m_step(self, x, gamma, u=None, weights=None, **kwargs):
self.update_gauss_params(x, gamma)
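A standalone sketch (synthetic data) of the responsibility-weighted mean/covariance update performed by `GaussianInitState.update_gauss_params`, written out for one sequence with T steps, K states, and dimension D:

```python
import numpy as np

rng = np.random.default_rng(0)
T, K, D = 50, 2, 3
x = rng.normal(size=(T, D))
gamma = rng.random(size=(T, K))
gamma /= gamma.sum(axis=1, keepdims=True)       # responsibilities per step

norm = gamma.sum(axis=0)[:, None]               # (K, 1) effective counts
mu = (gamma[:, :, None] * x[:, None, :]).sum(axis=0) / norm
resid = x[:, None, :] - mu                      # (T, K, D)
sig = (gamma[:, :, None, None] * resid[:, :, None, :]
       * resid[:, :, :, None]).sum(axis=0) / norm[:, None]

# The weighted mean matches numpy's reference implementation.
assert np.allclose(mu[0], np.average(x, axis=0, weights=gamma[:, 0]))
assert np.allclose(sig[0], sig[0].T)            # covariances are symmetric
```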
class GaussianArInitState:
def __init__(self, state_dim, obs_dim, prior=None, reg=1e-128, ar_steps=15):
self.state_dim = state_dim
self.obs_dim = obs_dim
self.reg = reg
self.ar_steps = ar_steps
self.mean = np.random.random((state_dim, obs_dim))
self.cov = np.random.random((state_dim, obs_dim, obs_dim))
for k in range(state_dim):
self.cov[k] = 0.5 * (self.cov[k] + self.cov[k].T)
self.cov[k] += np.eye(self.obs_dim)
@property
def params(self):
return self.mean, self.cov
@params.setter
def params(self, value):
self.mean, self.cov = value[0], value[1]
def sample(self, z):
_x = mvn(mean=self.mean[z], cov=self.cov[z, ...]).rvs()
return np.atleast_1d(_x)
def log_likelihood(self, x):
loglik = []
for _x in x:
_loglik = np.column_stack([mvn.logpdf(_x[0:self.ar_steps, :], self.mean[k], self.cov[k], allow_singular=True)
for k in range(self.state_dim)])
loglik.append(_loglik)
return loglik
def update_gauss_params(self, x, gamma):
_norm = 0
mu = 0
sig = 0
for _x, _gamma in zip(x, gamma):
            _norm += np.sum(_gamma[0:self.ar_steps, :, None], axis=0)  # + self.reg
            mu += np.sum(_gamma[0:self.ar_steps, :, None] * _x[0:self.ar_steps, None, :], axis=0)
mu /= _norm + self.reg
for _x, _gamma in zip(x, gamma):
            resid = _x[0:self.ar_steps, None, :] - mu
            sig += np.sum(_gamma[0:self.ar_steps, :, None, None] * resid[:, :, None, :] * resid[:, :, :, None], axis=0)
sig /= _norm[:, None]
self.params = mu, sig
def m_step(self, x, gamma, weights=None, **kwargs):
self.update_gauss_params(x, gamma)
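The responsibility-weighted mean/covariance update that `update_gauss_params` performs can be sketched in isolation as follows. This is a minimal NumPy sketch, not the class method itself: it assumes a single sequence `x` of shape `(T, D)` and responsibilities `gamma` of shape `(T, K)`, and the function name is illustrative.

```python
import numpy as np


def weighted_gauss_update(x, gamma, reg=1e-128):
    """Responsibility-weighted mean and covariance, one pair per state."""
    norm = gamma.sum(axis=0)[:, None]                                    # (K, 1)
    mu = (gamma[:, :, None] * x[:, None, :]).sum(axis=0) / (norm + reg)  # (K, D)
    resid = x[:, None, :] - mu                                           # (T, K, D)
    sig = (gamma[:, :, None, None]
           * resid[:, :, None, :] * resid[:, :, :, None]).sum(axis=0)    # (K, D, D)
    sig /= norm[:, :, None]
    return mu, sig
```

With one-hot responsibilities each state's mean collapses onto its assigned observation and the covariances vanish, which is a quick sanity check on the broadcasting.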
class GaussianBayesianInitState:
def __init__(self, state_dim, obs_dim, prior=None, reg=1e-128):
self.state_dim = state_dim
self.obs_dim = obs_dim
self.reg = reg
self.prior = prior
self.posterior = self.init_posterior()
self.mean = np.random.random((state_dim, obs_dim))
self.cov = np.random.random((state_dim, obs_dim, obs_dim))
for k in range(state_dim):
self.cov[k] = 0.5 * (self.cov[k] + self.cov[k].T)
self.cov[k] += np.eye(self.obs_dim)
def init_posterior(self):
        # Initialize a positive-definite Wishart scale matrix
W = np.random.random(size=(self.state_dim, self.obs_dim, self.obs_dim))
for k in range(self.state_dim):
W[k] = 0.5 * (W[k] + W[k].T)
W[k] = W[k] + np.eye(self.obs_dim)
nu = np.abs(np.random.random(size=self.state_dim)) + 5 #+ self.obs_dim
m = np.random.multivariate_normal(np.zeros(self.obs_dim), np.eye(self.obs_dim), size=self.state_dim)
beta = np.abs(np.random.random(size=self.state_dim))
return {'W': W, 'nu': nu, 'm': m, 'beta': beta}
@property
def params(self):
return self.mean, self.cov
@params.setter
def params(self, value):
self.mean, self.cov = value[0], value[1]
# log lamb
@property
def log_lamb(self):
loglamb = np.empty(shape=self.state_dim)
for k in range(self.posterior['nu'].shape[0]):
_tmp = self.obs_dim * np.log(2) + np.log(np.linalg.det(self.posterior['W'][k]))
loglamb[k] = np.sum(
[digamma((self.posterior['nu'][k] + 1 - i) / 2)
for i in range(1, self.obs_dim + 1)]) + _tmp
return loglamb
def sample(self, z):
_x = mvn(mean=self.mean[z], cov=self.cov[z, ...]).rvs()
return np.atleast_1d(_x)
def param_posterior_estimate(self, x):
D = self.obs_dim
post_ests = []
for _x in x:
_post_est = []
for k in range(self.state_dim):
res = (self.posterior['m'][k] - _x[0:])[:, None]
tmp = res.T @ self.posterior['W'][k] @ res
tmp = D * (1 / self.posterior['beta'][k]) + self.posterior['nu'][k] * tmp
_post_est.append(tmp)
post_ests.append(np.vstack(_post_est))
return post_ests
def log_likelihood_bayes(self, x):
D = self.obs_dim
param_post_ests = self.param_posterior_estimate(x)
logliks = []
for _x, _post_est in zip(x, param_post_ests):
_loglik = np.empty(shape=(self.state_dim, 1))
for k in range(self.state_dim):
                _loglik[k] = 0.5 * (self.log_lamb[k] - D * np.log(2 * np.pi) - _post_est[k])  # D * ln(2*pi), not ln(2*pi*D)
logliks.append(_loglik.T)
return logliks
def log_likelihood(self, x):
loglik = []
for _x in x:
_loglik = np.column_stack([mvn.logpdf(_x[0, :], self.mean[k], self.cov[k], allow_singular=True)
for k in range(self.state_dim)])
loglik.append(_loglik)
return loglik
def update_gauss_params(self, x, gamma):
_norm = 0
mu = 0
sig = 0
for _x, _gamma in zip(x, gamma):
_norm += np.sum(_gamma[0:, :, None], axis=0) # + self.reg
mu += np.sum(_gamma[0:, :, None] * _x[:, None, :], axis=0)
mu /= _norm + self.reg
for _x, _gamma in zip(x, gamma):
resid = _x[0:, None, :] - mu
sig += np.sum(_gamma[0:, :, None, None] * resid[:, :, None, :] * resid[:, :, :, None], axis=0)
sig /= _norm[:, None]
self.params = mu, sig
def m_step(self, x, gamma, weights=None, **kwargs):
self.update_gauss_params(x, gamma)
_gamma = np.vstack([g[0] for g in gamma])
e_counts = _gamma.sum(axis=0)
_x = np.vstack([_x[0] for _x in x])
self.posterior['beta'] = self.prior['beta0'] + e_counts
self.posterior['nu'] = self.prior['nu0'] + e_counts
self.posterior['m'] = (self.prior['beta0'] * self.prior['m0']
+ (_gamma[:, :, None] * _x[:, None, :]).sum(axis=0)) \
/ self.posterior['beta'][:, None]
for k in range(self.state_dim):
_resid_0 = self.mean[k] - self.prior['m0']
_resid_1 = _x[:, None, :] - self.mean[k]
# _resid_2 = self.mean[k] - self.posterior['m'][k]
NS = (_gamma[:, k, None, None] * (_resid_1[:, :, None, :] * _resid_1[:, :, :, None]).squeeze(axis=1)).sum(axis=0)
_W_inv = np.linalg.inv(self.prior['W0']) + NS \
+ ((self.prior['beta0'] * e_counts[k]) \
/ (self.prior['beta0'] + e_counts[k])) \
* (_resid_0[None, :] * _resid_0[:, None])
self.posterior['W'][k] = np.linalg.inv(_W_inv)
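The conjugate updates for `beta`, `nu` and `m` computed in `m_step` above follow the standard Gaussian-Wishart pattern: counts are added to the prior strengths and the posterior mean is a precision-weighted blend of the prior mean and the data. A stripped-down sketch (hypothetical helper; prior keys mirror `self.prior`, with scalar `beta0`/`nu0` and `m0` of shape `(D,)`):

```python
import numpy as np


def gauss_wishart_mean_update(x, gamma, prior):
    """Update beta, nu and m from responsibility-weighted counts.

    x: (N, D) observations, gamma: (N, K) responsibilities.
    """
    e_counts = gamma.sum(axis=0)                                     # (K,)
    beta = prior['beta0'] + e_counts
    nu = prior['nu0'] + e_counts
    weighted_sum = (gamma[:, :, None] * x[:, None, :]).sum(axis=0)   # (K, D)
    m = (prior['beta0'] * prior['m0'] + weighted_sum) / beta[:, None]
    return beta, nu, m
```

For example, with a zero prior mean and `beta0 = 1`, a state that claims a single observation gets a posterior mean of half that observation, since prior and data carry equal weight.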
class ArGaussianBayesianInitState:
def __init__(self, state_dim, obs_dim, prior=None, reg=1e-128, ar_steps=10):
self.state_dim = state_dim
self.obs_dim = obs_dim
self.reg = reg
self.prior = prior
self.ar_steps = ar_steps
self.posterior = self.init_posterior()
self.mean = np.random.random((state_dim, obs_dim))
self.cov = np.random.random((state_dim, obs_dim, obs_dim))
for k in range(state_dim):
self.cov[k] = 0.5 * (self.cov[k] + self.cov[k].T)
self.cov[k] += np.eye(self.obs_dim)
def init_posterior(self):
        # Initialize a positive-definite Wishart scale matrix
W = np.random.random(size=(self.state_dim, self.obs_dim, self.obs_dim))
for k in range(self.state_dim):
W[k] = 0.5 * (W[k] + W[k].T)
W[k] = W[k] + np.eye(self.obs_dim)
nu = np.abs(np.random.random(size=self.state_dim)) + 5 #+ self.obs_dim
m = np.random.multivariate_normal(np.zeros(self.obs_dim), np.eye(self.obs_dim), size=self.state_dim)
beta = np.abs(np.random.random(size=self.state_dim))
return {'W': W, 'nu': nu, 'm': m, 'beta': beta}
@property
def params(self):
return self.mean, self.cov
@params.setter
def params(self, value):
self.mean, self.cov = value[0], value[1]
# log lamb
@property
def log_lamb(self):
loglamb = np.empty(shape=self.state_dim)
for k in range(self.posterior['nu'].shape[0]):
_tmp = self.obs_dim * np.log(2) + np.log(np.linalg.det(self.posterior['W'][k]))
loglamb[k] = np.sum(
[digamma((self.posterior['nu'][k] + 1 - i) / 2)
for i in range(1, self.obs_dim + 1)]) + _tmp
return loglamb
def sample(self, z):
_x = mvn(mean=self.mean[z], cov=self.cov[z, ...]).rvs()
return np.atleast_1d(_x)
def param_posterior_estimate(self, x):
D = self.obs_dim
post_ests = []
for _x in x:
_post_est = []
for k in range(self.state_dim):
                res = (self.posterior['m'][k] - _x[0:self.ar_steps])[:, None]
for i in range(self.ar_steps):
tmp = res[i] @ self.posterior['W'][k] @ res[i].T
tmp = D * (1 / self.posterior['beta'][k]) + self.posterior['nu'][k] * tmp
_post_est.append(tmp)
post_ests.append(np.vstack(_post_est))
return post_ests
def log_likelihood_bayes(self, x):
D = self.obs_dim
param_post_ests = self.param_posterior_estimate(x)
logliks = []
for _x, _post_est in zip(x, param_post_ests):
_loglik = np.empty(shape=(self.state_dim, 1))
for k in range(self.state_dim):
                _loglik[k] = 0.5 * (self.log_lamb[k] - D * np.log(2 * np.pi) - _post_est[k])  # D * ln(2*pi), not ln(2*pi*D)
logliks.append(_loglik.T)
return logliks
def log_likelihood(self, x):
loglik = []
for _x in x:
_loglik = np.column_stack([mvn.logpdf(_x[0, :], self.mean[k], self.cov[k], allow_singular=True)
for k in range(self.state_dim)])
loglik.append(_loglik)
return loglik
def update_gauss_params(self, x, gamma):
_norm = 0
mu = 0
sig = 0
for _x, _gamma in zip(x, gamma):
            _norm += np.sum(_gamma[0:self.ar_steps, :, None], axis=0)  # + self.reg
            mu += np.sum(_gamma[0:self.ar_steps, :, None] * _x[0:self.ar_steps, None, :], axis=0)
mu /= _norm + self.reg
for _x, _gamma in zip(x, gamma):
            resid = _x[0:self.ar_steps, None, :] - mu
            sig += np.sum(_gamma[0:self.ar_steps, :, None, None] * resid[:, :, None, :] * resid[:, :, :, None], axis=0)
sig /= _norm[:, None]
self.params = mu, sig
def m_step(self, x, gamma, weights=None, **kwargs):
self.update_gauss_params(x, gamma)
_gamma = np.vstack([g[0:self.ar_steps] for g in gamma])
e_counts = _gamma.sum(axis=0)
_x = np.vstack([_x[0:self.ar_steps] for _x in x])
self.posterior['beta'] = self.prior['beta0'] + e_counts
self.posterior['nu'] = self.prior['nu0'] + e_counts
self.posterior['m'] = (self.prior['beta0'] * self.prior['m0']
+ (_gamma[:, :, None] * _x[:, None, :]).sum(axis=0)) \
/ self.posterior['beta'][:, None]
for k in range(self.state_dim):
_resid_0 = self.mean[k] - self.prior['m0']
_resid_1 = _x[:, None, :] - self.mean[k]
# _resid_2 = self.mean[k] - self.posterior['m'][k]
NS = (_gamma[:, k, None, None] * (_resid_1[:, :, None, :] * _resid_1[:, :, :, None]).squeeze(axis=1)).sum(axis=0)
_W_inv = np.linalg.inv(self.prior['W0']) + NS \
+ ((self.prior['beta0'] * e_counts[k]) \
/ (self.prior['beta0'] + e_counts[k])) \
* (_resid_0[None, :] * _resid_0[:, None])
self.posterior['W'][k] = np.linalg.inv(_W_inv)
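The `log_lamb` property in the two Bayesian classes above evaluates the Wishart expectation E[ln|Lambda|] = sum_{i=1..D} digamma((nu + 1 - i) / 2) + D ln 2 + ln|W| per state. A single-state sketch of the same quantity (assumes SciPy; the function name is illustrative):

```python
import numpy as np
from scipy.special import digamma


def expected_log_det(W, nu):
    """E[ln|Lambda|] for Lambda ~ Wishart(W, nu), with W of shape (D, D)."""
    D = W.shape[0]
    psi_sum = sum(digamma((nu + 1 - i) / 2.0) for i in range(1, D + 1))
    return psi_sum + D * np.log(2.0) + np.log(np.linalg.det(W))
```

In one dimension with `W = [[1]]` this reduces to `digamma(nu / 2) + ln 2`, which is handy for spot-checking the loop in `log_lamb`.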
| 35.076923 | 125 | 0.541999 | 2,135 | 15,048 | 3.624356 | 0.068384 | 0.037994 | 0.049625 | 0.025588 | 0.879555 | 0.868312 | 0.862755 | 0.862755 | 0.855777 | 0.855777 | 0 | 0.015579 | 0.300439 | 15,048 | 428 | 126 | 35.158879 | 0.719483 | 0.024721 | 0 | 0.86262 | 0 | 0 | 0.010673 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.13738 | false | 0 | 0.015974 | 0.022364 | 0.249201 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
254dc6108990379217a9e440915d667a633c6181 | 181 | py | Python | xrpl/core/binarycodec/exceptions.py | mDuo13/xrpl-py | 70f927dcd2dbb8644b3e210b0a8de2a214e71e3d | [
"0BSD"
] | null | null | null | xrpl/core/binarycodec/exceptions.py | mDuo13/xrpl-py | 70f927dcd2dbb8644b3e210b0a8de2a214e71e3d | [
"0BSD"
] | null | null | null | xrpl/core/binarycodec/exceptions.py | mDuo13/xrpl-py | 70f927dcd2dbb8644b3e210b0a8de2a214e71e3d | [
"0BSD"
] | null | null | null | """General XRPL Binary Codec Exceptions."""
from xrpl import XRPLException
class XRPLBinaryCodecException(XRPLException):
"""General XRPL Binary Codec Exception."""
pass
| 20.111111 | 46 | 0.751381 | 18 | 181 | 7.555556 | 0.666667 | 0.161765 | 0.25 | 0.323529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154696 | 181 | 8 | 47 | 22.625 | 0.888889 | 0.40884 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
c2cc04b9d25e1d001e123581168e340cd5b4a3e4 | 298 | py | Python | sds/__init__.py | AndrewOwenMartin/sds | 78c78c378c91472ee503c471fb2701f8a1410af5 | [
"Apache-2.0"
] | 9 | 2018-12-12T22:04:38.000Z | 2019-03-10T19:42:35.000Z | sds/__init__.py | AndrewOwenMartin/sds | 78c78c378c91472ee503c471fb2701f8a1410af5 | [
"Apache-2.0"
] | null | null | null | sds/__init__.py | AndrewOwenMartin/sds | 78c78c378c91472ee503c471fb2701f8a1410af5 | [
"Apache-2.0"
] | null | null | null | from sds.standard import (
Agent,
D_passive,
DH_uniform,
H_fixed,
I_sync,
SDS,
Swarm,
T_boolean,
TM_uniform,
)
__all__ = [
"Agent",
"D_passive",
"DH_uniform",
"H_fixed",
"I_sync",
"SDS",
"Swarm",
"T_boolean",
"TM_uniform",
]
| 12.416667 | 26 | 0.52349 | 35 | 298 | 4 | 0.514286 | 0.085714 | 0.185714 | 0.214286 | 0.828571 | 0.828571 | 0.828571 | 0.828571 | 0.828571 | 0.828571 | 0 | 0 | 0.33557 | 298 | 23 | 27 | 12.956522 | 0.707071 | 0 | 0 | 0 | 0 | 0 | 0.214765 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.090909 | 0.045455 | 0 | 0.045455 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
c2e06c9134c5246c3c30e9e52192500586357453 | 222 | py | Python | tests/test_generate_random_password.py | JacobChesslo/random-password | ed1d87c53d144a408436322e25251e03ccb488d7 | [
"MIT"
] | 1 | 2021-11-17T01:34:18.000Z | 2021-11-17T01:34:18.000Z | tests/test_generate_random_password.py | JacobChesslo/random-password | ed1d87c53d144a408436322e25251e03ccb488d7 | [
"MIT"
] | null | null | null | tests/test_generate_random_password.py | JacobChesslo/random-password | ed1d87c53d144a408436322e25251e03ccb488d7 | [
"MIT"
] | null | null | null | import unittest
from password.generate_random_password import generate_random_password
class TestGenerateRandomPassword(unittest.TestCase):
pass
# TODO finish tests
if __name__ == '__main__':
unittest.main() | 22.2 | 70 | 0.792793 | 24 | 222 | 6.833333 | 0.666667 | 0.170732 | 0.268293 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144144 | 222 | 10 | 71 | 22.2 | 0.863158 | 0.076577 | 0 | 0 | 1 | 0 | 0.039216 | 0 | 0 | 0 | 0 | 0.1 | 0 | 1 | 0 | true | 0.5 | 0.333333 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 7 |
6c365e65f5e893e734096bb136989d58ee01f3f0 | 33,925 | py | Python | spark_fhir_schemas/stu3/complex_types/medicationdispense.py | icanbwell/SparkFhirSchemas | 8c828313c39850b65f8676e67f526ee92b7d624e | [
"Apache-2.0"
] | 2 | 2020-10-31T23:25:01.000Z | 2021-06-09T14:12:42.000Z | spark_fhir_schemas/stu3/complex_types/medicationdispense.py | icanbwell/SparkFhirSchemas | 8c828313c39850b65f8676e67f526ee92b7d624e | [
"Apache-2.0"
] | null | null | null | spark_fhir_schemas/stu3/complex_types/medicationdispense.py | icanbwell/SparkFhirSchemas | 8c828313c39850b65f8676e67f526ee92b7d624e | [
"Apache-2.0"
] | null | null | null | from typing import Union, List, Optional
from pyspark.sql.types import (
StructType,
StructField,
StringType,
ArrayType,
BooleanType,
DataType,
)
# This file is auto-generated by generate_schema so do not edit manually
# noinspection PyPep8Naming
class MedicationDispenseSchema:
"""
Indicates that a medication product is to be or has been dispensed for a named
person/patient. This includes a description of the medication product
(supply) provided and the instructions for administering the medication. The
medication dispense is the result of a pharmacy system responding to a
medication order.
"""
# noinspection PyDefaultArgument
@staticmethod
def get_schema(
max_nesting_depth: Optional[int] = 6,
nesting_depth: int = 0,
nesting_list: List[str] = [],
max_recursion_limit: Optional[int] = 2,
include_extension: Optional[bool] = False,
extension_fields: Optional[List[str]] = [
"valueBoolean",
"valueCode",
"valueDate",
"valueDateTime",
"valueDecimal",
"valueId",
"valueInteger",
"valuePositiveInt",
"valueString",
"valueTime",
"valueUnsignedInt",
"valueUri",
"valueQuantity",
],
extension_depth: int = 0,
max_extension_depth: Optional[int] = 2,
) -> Union[StructType, DataType]:
"""
Indicates that a medication product is to be or has been dispensed for a named
person/patient. This includes a description of the medication product
(supply) provided and the instructions for administering the medication. The
medication dispense is the result of a pharmacy system responding to a
medication order.
id: The logical id of the resource, as used in the URL for the resource. Once
assigned, this value never changes.
extension: May be used to represent additional information that is not part of the basic
definition of the resource. In order to make the use of extensions safe and
manageable, there is a strict set of governance applied to the definition and
use of extensions. Though any implementer is allowed to define an extension,
there is a set of requirements that SHALL be met as part of the definition of
the extension.
meta: The metadata about the resource. This is content that is maintained by the
infrastructure. Changes to the content may not always be associated with
version changes to the resource.
implicitRules: A reference to a set of rules that were followed when the resource was
constructed, and which must be understood when processing the content.
language: The base language in which the resource is written.
text: A human-readable narrative that contains a summary of the resource, and may be
used to represent the content of the resource to a human. The narrative need
not encode all the structured data, but is required to contain sufficient
detail to make it "clinically safe" for a human to just read the narrative.
Resource definitions may define what content should be represented in the
narrative to ensure clinical safety.
contained: These resources do not have an independent existence apart from the resource
that contains them - they cannot be identified independently, and nor can they
have their own independent transaction scope.
resourceType: This is a MedicationDispense resource
identifier: Identifier assigned by the dispensing facility - this is an identifier
assigned outside FHIR.
partOf: The procedure that the dispense is done because of.
status: A code specifying the state of the set of dispense events.
category: Indicates type of medication dispense and where the medication is expected to
be consumed or administered.
medicationCodeableConcept: Identifies the medication being administered. This is either a link to a
resource representing the details of the medication or a simple attribute
carrying a code that identifies the medication from a known list of
medications.
medicationReference: Identifies the medication being administered. This is either a link to a
resource representing the details of the medication or a simple attribute
carrying a code that identifies the medication from a known list of
medications.
subject: A link to a resource representing the person or the group to whom the
medication will be given.
context: The encounter or episode of care that establishes the context for this event.
supportingInformation: Additional information that supports the medication being dispensed.
performer: Indicates who or what performed the event. It should be assumed that the
performer is the dispenser of the medication.
authorizingPrescription: Indicates the medication order that is being dispensed against.
type: Indicates the type of dispensing event that is performed. For example, Trial
Fill, Completion of Trial, Partial Fill, Emergency Fill, Samples, etc.
quantity: The amount of medication that has been dispensed. Includes unit of measure.
daysSupply: The amount of medication expressed as a timing amount.
whenPrepared: The time when the dispensed product was packaged and reviewed.
whenHandedOver: The time the dispensed product was provided to the patient or their
representative.
destination: Identification of the facility/location where the medication was shipped to,
as part of the dispense event.
receiver: Identifies the person who picked up the medication. This will usually be a
patient or their caregiver, but some cases exist where it can be a healthcare
professional.
note: Extra information about the dispense that could not be conveyed in the other
attributes.
dosageInstruction: Indicates how the medication is to be used by the patient.
substitution: Indicates whether or not substitution was made as part of the dispense. In
some cases substitution will be expected but does not happen, in other cases
substitution is not expected but does happen. This block explains what
substitution did or did not happen and why. If nothing is specified,
substitution was not done.
detectedIssue: Indicates an actual or potential clinical issue with or between one or more
active or proposed clinical actions for a patient; e.g. Drug-drug interaction,
duplicate therapy, dosage alert etc.
notDone: True if the dispense was not performed for some reason.
notDoneReasonCodeableConcept: Indicates the reason why a dispense was not performed.
notDoneReasonReference: Indicates the reason why a dispense was not performed.
eventHistory: A summary of the events of interest that have occurred, such as when the
dispense was verified.
"""
from spark_fhir_schemas.stu3.complex_types.extension import ExtensionSchema
from spark_fhir_schemas.stu3.complex_types.meta import MetaSchema
from spark_fhir_schemas.stu3.complex_types.narrative import NarrativeSchema
from spark_fhir_schemas.stu3.simple_types.resourcelist import ResourceListSchema
from spark_fhir_schemas.stu3.complex_types.identifier import IdentifierSchema
from spark_fhir_schemas.stu3.complex_types.reference import ReferenceSchema
from spark_fhir_schemas.stu3.complex_types.codeableconcept import (
CodeableConceptSchema,
)
from spark_fhir_schemas.stu3.complex_types.medicationdispense_performer import (
MedicationDispense_PerformerSchema,
)
from spark_fhir_schemas.stu3.complex_types.quantity import QuantitySchema
from spark_fhir_schemas.stu3.complex_types.annotation import AnnotationSchema
from spark_fhir_schemas.stu3.complex_types.dosage import DosageSchema
from spark_fhir_schemas.stu3.complex_types.medicationdispense_substitution import (
MedicationDispense_SubstitutionSchema,
)
if (
max_recursion_limit
and nesting_list.count("MedicationDispense") >= max_recursion_limit
) or (max_nesting_depth and nesting_depth >= max_nesting_depth):
return StructType([StructField("id", StringType(), True)])
# add my name to recursion list for later
my_nesting_list: List[str] = nesting_list + ["MedicationDispense"]
schema = StructType(
[
# The logical id of the resource, as used in the URL for the resource. Once
# assigned, this value never changes.
StructField("id", StringType(), True),
# May be used to represent additional information that is not part of the basic
# definition of the resource. In order to make the use of extensions safe and
# manageable, there is a strict set of governance applied to the definition and
# use of extensions. Though any implementer is allowed to define an extension,
# there is a set of requirements that SHALL be met as part of the definition of
# the extension.
StructField(
"extension",
ArrayType(
ExtensionSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# The metadata about the resource. This is content that is maintained by the
# infrastructure. Changes to the content may not always be associated with
# version changes to the resource.
StructField(
"meta",
MetaSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# A reference to a set of rules that were followed when the resource was
# constructed, and which must be understood when processing the content.
StructField("implicitRules", StringType(), True),
# The base language in which the resource is written.
StructField("language", StringType(), True),
# A human-readable narrative that contains a summary of the resource, and may be
# used to represent the content of the resource to a human. The narrative need
# not encode all the structured data, but is required to contain sufficient
# detail to make it "clinically safe" for a human to just read the narrative.
# Resource definitions may define what content should be represented in the
# narrative to ensure clinical safety.
StructField(
"text",
NarrativeSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# These resources do not have an independent existence apart from the resource
# that contains them - they cannot be identified independently, and nor can they
# have their own independent transaction scope.
StructField(
"contained",
ArrayType(
ResourceListSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# This is a MedicationDispense resource
StructField("resourceType", StringType(), True),
# Identifier assigned by the dispensing facility - this is an identifier
# assigned outside FHIR.
StructField(
"identifier",
ArrayType(
IdentifierSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# The procedure that the dispense is done because of.
StructField(
"partOf",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# A code specifying the state of the set of dispense events.
StructField("status", StringType(), True),
# Indicates type of medication dispense and where the medication is expected to
# be consumed or administered.
StructField(
"category",
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# Identifies the medication being administered. This is either a link to a
# resource representing the details of the medication or a simple attribute
# carrying a code that identifies the medication from a known list of
# medications.
StructField(
"medicationCodeableConcept",
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# Identifies the medication being administered. This is either a link to a
# resource representing the details of the medication or a simple attribute
# carrying a code that identifies the medication from a known list of
# medications.
StructField(
"medicationReference",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# A link to a resource representing the person or the group to whom the
# medication will be given.
StructField(
"subject",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# The encounter or episode of care that establishes the context for this event.
StructField(
"context",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# Additional information that supports the medication being dispensed.
StructField(
"supportingInformation",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# Indicates who or what performed the event. It should be assumed that the
# performer is the dispenser of the medication.
StructField(
"performer",
ArrayType(
MedicationDispense_PerformerSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# Indicates the medication order that is being dispensed against.
StructField(
"authorizingPrescription",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# Indicates the type of dispensing event that is performed. For example, Trial
# Fill, Completion of Trial, Partial Fill, Emergency Fill, Samples, etc.
StructField(
"type",
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# The amount of medication that has been dispensed. Includes unit of measure.
StructField(
"quantity",
QuantitySchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# The amount of medication expressed as a timing amount.
StructField(
"daysSupply",
QuantitySchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# The time when the dispensed product was packaged and reviewed.
StructField("whenPrepared", StringType(), True),
# The time the dispensed product was provided to the patient or their
# representative.
StructField("whenHandedOver", StringType(), True),
# Identification of the facility/location where the medication was shipped to,
# as part of the dispense event.
StructField(
"destination",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# Identifies the person who picked up the medication. This will usually be a
# patient or their caregiver, but some cases exist where it can be a healthcare
# professional.
StructField(
"receiver",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# Extra information about the dispense that could not be conveyed in the other
# attributes.
StructField(
"note",
ArrayType(
AnnotationSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# Indicates how the medication is to be used by the patient.
StructField(
"dosageInstruction",
ArrayType(
DosageSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# Indicates whether or not substitution was made as part of the dispense. In
# some cases substitution will be expected but does not happen, in other cases
# substitution is not expected but does happen. This block explains what
# substitution did or did not happen and why. If nothing is specified,
# substitution was not done.
StructField(
"substitution",
MedicationDispense_SubstitutionSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# Indicates an actual or potential clinical issue with or between one or more
# active or proposed clinical actions for a patient; e.g. Drug-drug interaction,
# duplicate therapy, dosage alert etc.
StructField(
"detectedIssue",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
# True if the dispense was not performed for some reason.
StructField("notDone", BooleanType(), True),
# Indicates the reason why a dispense was not performed.
StructField(
"notDoneReasonCodeableConcept",
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# Indicates the reason why a dispense was not performed.
StructField(
"notDoneReasonReference",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
),
True,
),
# A summary of the events of interest that have occurred, such as when the
# dispense was verified.
StructField(
"eventHistory",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
)
),
True,
),
]
)
if not include_extension:
schema.fields = [
c
if c.name != "extension"
else StructField("extension", StringType(), True)
for c in schema.fields
]
return schema
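The schema builder above caps FHIR's unbounded nesting with `max_nesting_depth` / `nesting_depth` counters, degrading deep fields (like `extension`) to plain strings once the limit is hit. A minimal pure-Python sketch of that bounded-recursion idea (the dict shape and names here are illustrative, not this file's actual `StructType` API):

```python
# Hypothetical sketch: bounded recursion when generating a nested schema.
# A struct type recurses into itself until max_nesting_depth is reached,
# after which it degrades to a plain "string" leaf -- analogous to the
# generated code swapping deep fields for StringType.

def build_schema(name, nesting_depth=0, max_nesting_depth=2):
    if nesting_depth >= max_nesting_depth:
        # Truncate the recursion: emit a flat leaf instead of a struct.
        return {"name": name, "type": "string"}
    return {
        "name": name,
        "type": "struct",
        "fields": [
            build_schema(name + ".child", nesting_depth + 1, max_nesting_depth)
        ],
    }

schema = build_schema("reference", max_nesting_depth=2)
```

This is why the generated `get_schema` threads `nesting_depth + 1` into every child schema call: each level of recursion consumes one unit of the budget.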
# File: qualys_cs_api/api/image_api.py (repo: jlk/qualys-cs-python-client, license: Apache-2.0)

# coding: utf-8
"""
Container Security APIs
All features of the Container Security are available through REST APIs.<br/>Access support information at www.qualys.com/support/<br/><br/><b>Permissions:</b><br/>User must have the Container module enabled<br/>User must have API ACCESS permission # noqa: E501
The version of the OpenAPI document: 1.0.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from qualys_cs_api.api_client import ApiClient
from qualys_cs_api.exceptions import (
ApiTypeError,
ApiValueError
)
class ImageApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def delete_images_using_delete(self, image_delete_request, **kwargs): # noqa: E501
"""Delete images in your account # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_images_using_delete(image_delete_request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param ImageDeleteRequest image_delete_request: Provide one or more image Ids or filters in the format shown under Example Value. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.delete_images_using_delete_with_http_info(image_delete_request, **kwargs) # noqa: E501
def delete_images_using_delete_with_http_info(self, image_delete_request, **kwargs): # noqa: E501
"""Delete images in your account # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_images_using_delete_with_http_info(image_delete_request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param ImageDeleteRequest image_delete_request: Provide one or more image Ids or filters in the format shown under Example Value. (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(str, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['image_delete_request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_images_using_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'image_delete_request' is set
if self.api_client.client_side_validation and ('image_delete_request' not in local_var_params or # noqa: E501
local_var_params['image_delete_request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `image_delete_request` when calling `delete_images_using_delete`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'image_delete_request' in local_var_params:
body_params = local_var_params['image_delete_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/v1.1/images', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
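The docstrings above describe the client's sync/async switch: with `async_req=True` the call is submitted to a worker thread and the method returns a handle whose `.get()` blocks for the result. A self-contained sketch of that pattern (the `call_api` body here is a stand-in, not the real HTTP client):

```python
# Hypothetical sketch of the async_req idiom used by these methods:
#   thread = api.delete_images_using_delete(body, async_req=True)
#   result = thread.get()
from multiprocessing.pool import ThreadPool

def call_api(payload, async_req=False):
    def do_request():
        # Stand-in for the real HTTP DELETE round trip.
        return {"deleted": payload}
    if async_req:
        pool = ThreadPool(1)
        # apply_async returns an AsyncResult; .get() blocks until done.
        return pool.apply_async(do_request)
    return do_request()

thread = call_api({"imageIds": ["sha256:abc"]}, async_req=True)
result = thread.get()  # blocks until the worker thread finishes
```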
def get_image_association_using_get(self, image_id, **kwargs): # noqa: E501
"""Show associations for an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_association_using_get(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: imageId (required)
:param str filter: filter
:param str type: type
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: ImageAssociation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_image_association_using_get_with_http_info(image_id, **kwargs) # noqa: E501
def get_image_association_using_get_with_http_info(self, image_id, **kwargs): # noqa: E501
"""Show associations for an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_association_using_get_with_http_info(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: imageId (required)
:param str filter: filter
:param str type: type
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(ImageAssociation, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['image_id', 'filter', 'type'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_image_association_using_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'image_id' is set
if self.api_client.client_side_validation and ('image_id' not in local_var_params or # noqa: E501
local_var_params['image_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `image_id` when calling `get_image_association_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'image_id' in local_var_params:
path_params['imageId'] = local_var_params['image_id'] # noqa: E501
query_params = []
if 'filter' in local_var_params and local_var_params['filter'] is not None: # noqa: E501
query_params.append(('filter', local_var_params['filter'])) # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/v1.1/images/{imageId}/association', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ImageAssociation', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
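Every `*_with_http_info` method above starts by checking the caller's keyword arguments against a declared `all_params` list and raising `ApiTypeError` on anything unexpected. That validation step, isolated as a runnable sketch (the `ApiTypeError` here is a local stand-in for the package's exception class):

```python
# Hypothetical sketch of the kwargs validation these generated
# methods perform before building the request.

class ApiTypeError(TypeError):
    """Stand-in for qualys_cs_api.exceptions.ApiTypeError."""

def validate_kwargs(method_name, all_params, kwargs):
    for key in kwargs:
        if key not in all_params:
            raise ApiTypeError(
                "Got an unexpected keyword argument '%s' to method %s"
                % (key, method_name)
            )
    return kwargs

params = validate_kwargs(
    "get_image_association_using_get",
    ["image_id", "filter", "type", "async_req"],
    {"filter": "name:nginx"},
)
```

Rejecting unknown keywords eagerly surfaces typos (e.g. `fliter=`) at the call site instead of silently dropping the parameter from the request.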
def get_image_details_using_get(self, image_id, **kwargs): # noqa: E501
"""Show details of an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_details_using_get(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: ImageDetails
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_image_details_using_get_with_http_info(image_id, **kwargs) # noqa: E501
def get_image_details_using_get_with_http_info(self, image_id, **kwargs): # noqa: E501
"""Show details of an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_details_using_get_with_http_info(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(ImageDetails, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['image_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_image_details_using_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'image_id' is set
if self.api_client.client_side_validation and ('image_id' not in local_var_params or # noqa: E501
local_var_params['image_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `image_id` when calling `get_image_details_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'image_id' in local_var_params:
path_params['imageId'] = local_var_params['image_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/v1.1/images/{imageId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ImageDetails', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_image_installed_software_using_get(self, image_id, **kwargs): # noqa: E501
"""Show software installed on an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_installed_software_using_get(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
:param str filter: Filter the image vulnerability details by providing a query using Qualys syntax. <a href='/cs/help/search/language.htm' target='_blank'>Click here</a> for help with creating your query.
:param str sort: Sort the results using a Qualys token. For example qid:asc. <a href='/cs/help/search_tips/sortable_tokens.htm'>Click here</a> for a listing of tokens.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: SoftwarePivotListResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_image_installed_software_using_get_with_http_info(image_id, **kwargs) # noqa: E501
def get_image_installed_software_using_get_with_http_info(self, image_id, **kwargs): # noqa: E501
"""Show software installed on an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_installed_software_using_get_with_http_info(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
:param str filter: Filter the image vulnerability details by providing a query using Qualys syntax. <a href='/cs/help/search/language.htm' target='_blank'>Click here</a> for help with creating your query.
:param str sort: Sort the results using a Qualys token. For example qid:asc. <a href='/cs/help/search_tips/sortable_tokens.htm'>Click here</a> for a listing of tokens.
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(SoftwarePivotListResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['image_id', 'filter', 'sort'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_image_installed_software_using_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'image_id' is set
if self.api_client.client_side_validation and ('image_id' not in local_var_params or # noqa: E501
local_var_params['image_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `image_id` when calling `get_image_installed_software_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'image_id' in local_var_params:
path_params['imageId'] = local_var_params['image_id'] # noqa: E501
query_params = []
if 'filter' in local_var_params and local_var_params['filter'] is not None: # noqa: E501
query_params.append(('filter', local_var_params['filter'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/v1.1/images/{imageId}/software', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SoftwarePivotListResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_image_pivot_data_with_list_using_get(self, page_number, page_size, **kwargs): # noqa: E501
"""Show a list of images in your account # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_pivot_data_with_list_using_get(page_number, page_size, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int page_number: The page to be returned. (required)
:param int page_size: The number of records per page to be included in the response. (required)
:param str filter: Filter the images list by providing a query using Qualys syntax. <a href='/cs/help/search/language.htm' target='_blank'>Click here</a> for help with creating your query.
:param str sort: Sort the results using a Qualys token. For example created:desc. <a href='/cs/help/search_tips/sortable_tokens.htm'>Click here</a> for a listing of tokens.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: PivotListResponseAbstractImage
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_image_pivot_data_with_list_using_get_with_http_info(page_number, page_size, **kwargs) # noqa: E501
def get_image_pivot_data_with_list_using_get_with_http_info(self, page_number, page_size, **kwargs): # noqa: E501
"""Show a list of images in your account # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_pivot_data_with_list_using_get_with_http_info(page_number, page_size, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int page_number: The page to be returned. (required)
:param int page_size: The number of records per page to be included in the response. (required)
:param str filter: Filter the images list by providing a query using Qualys syntax. <a href='/cs/help/search/language.htm' target='_blank'>Click here</a> for help with creating your query.
:param str sort: Sort the results using a Qualys token. For example created:desc. <a href='/cs/help/search_tips/sortable_tokens.htm'>Click here</a> for a listing of tokens.
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(PivotListResponseAbstractImage, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['page_number', 'page_size', 'filter', 'sort'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_image_pivot_data_with_list_using_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'page_number' is set
if self.api_client.client_side_validation and ('page_number' not in local_var_params or # noqa: E501
local_var_params['page_number'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `page_number` when calling `get_image_pivot_data_with_list_using_get`") # noqa: E501
# verify the required parameter 'page_size' is set
if self.api_client.client_side_validation and ('page_size' not in local_var_params or # noqa: E501
local_var_params['page_size'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `page_size` when calling `get_image_pivot_data_with_list_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'filter' in local_var_params and local_var_params['filter'] is not None: # noqa: E501
query_params.append(('filter', local_var_params['filter'])) # noqa: E501
if 'page_number' in local_var_params and local_var_params['page_number'] is not None: # noqa: E501
query_params.append(('pageNumber', local_var_params['page_number'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('pageSize', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/v1.1/images', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PivotListResponseAbstractImage', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
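The image-list method above assembles its query string from required paging parameters plus optional `filter`/`sort` tokens, appending each only when it is not `None`. A small sketch of that assembly (tuple-list shape mirrors the generated `query_params`; the function name is illustrative):

```python
# Hypothetical sketch of the query-parameter assembly in
# get_image_pivot_data_with_list_using_get: required paging values are
# always appended, optional ones only when the caller supplied them.

def build_query_params(page_number, page_size, filter=None, sort=None):
    query_params = []
    if filter is not None:
        query_params.append(("filter", filter))
    query_params.append(("pageNumber", page_number))
    query_params.append(("pageSize", page_size))
    if sort is not None:
        query_params.append(("sort", sort))
    return query_params

qp = build_query_params(1, 50, sort="created:desc")
```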
def get_image_vuln_count_using_get(self, image_id, **kwargs): # noqa: E501
"""Show vulnerability count for an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_vuln_count_using_get(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: dict(str, int)
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_image_vuln_count_using_get_with_http_info(image_id, **kwargs) # noqa: E501
def get_image_vuln_count_using_get_with_http_info(self, image_id, **kwargs): # noqa: E501
"""Show vulnerability count for an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_image_vuln_count_using_get_with_http_info(image_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
:param _return_http_data_only: response data without head status code
and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(dict(str, int), status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['image_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_image_vuln_count_using_get" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'image_id' is set
        if self.api_client.client_side_validation and ('image_id' not in local_var_params or  # noqa: E501
                                                       local_var_params['image_id'] is None):  # noqa: E501
            raise ApiValueError("Missing the required parameter `image_id` when calling `get_image_vuln_count_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'image_id' in local_var_params:
            path_params['imageId'] = local_var_params['image_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/v1.1/images/{imageId}/vuln/count', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='dict(str, int)',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_image_vuln_details_using_get(self, image_id, **kwargs):  # noqa: E501
        """Show vulnerability details for an image  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_image_vuln_details_using_get(image_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
        :param str filter: Filter the image vulnerability details by providing a query using Qualys syntax. <a href='/cs/help/search/language.htm' target='_blank'>Click here</a> for help with creating your query.
        :param str type: Specify the type of information to be fetched: Summary, Details, All
        :param str sort: Sort the results using a Qualys token. For example qid:asc. <a href='/cs/help/search_tips/sortable_tokens.htm'>Click here</a> for a listing of tokens.
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: ImageVulnResponseFacade
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.get_image_vuln_details_using_get_with_http_info(image_id, **kwargs)  # noqa: E501

    def get_image_vuln_details_using_get_with_http_info(self, image_id, **kwargs):  # noqa: E501
        """Show vulnerability details for an image  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_image_vuln_details_using_get_with_http_info(image_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str image_id: Specify the ID or SHA value of a specific image in the user’s scope. (required)
        :param str filter: Filter the image vulnerability details by providing a query using Qualys syntax. <a href='/cs/help/search/language.htm' target='_blank'>Click here</a> for help with creating your query.
        :param str type: Specify the type of information to be fetched: Summary, Details, All
        :param str sort: Sort the results using a Qualys token. For example qid:asc. <a href='/cs/help/search_tips/sortable_tokens.htm'>Click here</a> for a listing of tokens.
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(ImageVulnResponseFacade, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['image_id', 'filter', 'type', 'sort']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_image_vuln_details_using_get" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'image_id' is set
        if self.api_client.client_side_validation and ('image_id' not in local_var_params or  # noqa: E501
                                                       local_var_params['image_id'] is None):  # noqa: E501
            raise ApiValueError("Missing the required parameter `image_id` when calling `get_image_vuln_details_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'image_id' in local_var_params:
            path_params['imageId'] = local_var_params['image_id']  # noqa: E501

        query_params = []
        if 'filter' in local_var_params and local_var_params['filter'] is not None:  # noqa: E501
            query_params.append(('filter', local_var_params['filter']))  # noqa: E501
        if 'type' in local_var_params and local_var_params['type'] is not None:  # noqa: E501
            query_params.append(('type', local_var_params['type']))  # noqa: E501
        if 'sort' in local_var_params and local_var_params['sort'] is not None:  # noqa: E501
            query_params.append(('sort', local_var_params['sort']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/v1.1/images/{imageId}/vuln', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ImageVulnResponseFacade',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

tests/smoke/test_international.py | mayank-sfdc/directory-tests | 6684a4321934ad8e78608fbb3cf81173e7091f03 | MIT

# -*- coding: utf-8 -*-
import pytest
from rest_framework.status import HTTP_200_OK

import allure
from directory_tests_shared import URLs
from tests.smoke.cms_api_helpers import get_and_assert

pytestmark = [allure.suite("International site"), allure.feature("International site")]


@pytest.mark.dev
@pytest.mark.parametrize(
    "url",
    [
        URLs.INTERNATIONAL_REGIONS_MIDLANDS.absolute,
        URLs.INTERNATIONAL_REGIONS_NORTH_ENGLAND.absolute,
        URLs.INTERNATIONAL_REGIONS_NORTHERN_IRELAND.absolute,
        URLs.INTERNATIONAL_REGIONS_SOUTH_ENGLAND.absolute,
        URLs.INTERNATIONAL_REGIONS_WALES.absolute,
    ],
)
def test_region_pages(url, basic_auth):
    get_and_assert(
        url=url, status_code=HTTP_200_OK, auth=basic_auth, allow_redirects=True
    )


@pytest.mark.dev
@pytest.mark.parametrize(
    "url",
    [
        URLs.INTERNATIONAL_INDUSTRY_AEROSPACE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_AUTOMOTIVE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_CREATIVE_INDUSTRIES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_EDUCATION.absolute,
        URLs.INTERNATIONAL_INDUSTRY_ENERGY.absolute,
        URLs.INTERNATIONAL_INDUSTRY_ENGINEERING_AND_MANUFACTURING.absolute,
        URLs.INTERNATIONAL_INDUSTRY_FINANCIAL_SERVICES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_HEALTH_AND_LIFE_SCIENCES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_LEGAL_SERVICES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_REAL_ESTATE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_SPACE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_TECHNOLOGY.absolute,
    ],
)
def test_industry_pages_dev(url, basic_auth):
    get_and_assert(
        url=url, status_code=HTTP_200_OK, auth=basic_auth, allow_redirects=True
    )


@pytest.mark.stage
@pytest.mark.parametrize(
    "url",
    [
        URLs.INTERNATIONAL_INDUSTRY_CREATIVE_INDUSTRIES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_ENERGY.absolute,
        URLs.INTERNATIONAL_INDUSTRY_ENGINEERING_AND_MANUFACTURING.absolute,
        URLs.INTERNATIONAL_INDUSTRY_FINANCIAL_AND_PROFESSIONAL_SERVICES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_FINANCIAL_SERVICES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_HEALTH_AND_LIFE_SCIENCES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_LEGAL_SERVICES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_REAL_ESTATE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_TECHNOLOGY.absolute,
    ],
)
def test_industry_pages_stage(url, basic_auth):
    get_and_assert(
        url=url, status_code=HTTP_200_OK, auth=basic_auth, allow_redirects=True
    )


@pytest.mark.uat
@pytest.mark.parametrize(
    "url",
    [
        URLs.INTERNATIONAL_INDUSTRY_AEROSPACE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_AGRICULTURAL_TECHNOLOGY.absolute,
        URLs.INTERNATIONAL_INDUSTRY_AUTOMOTIVE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_CREATIVE_INDUSTRIES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_CYBER_SECURITY.absolute,
        URLs.INTERNATIONAL_INDUSTRY_EDUCATION.absolute,
        URLs.INTERNATIONAL_INDUSTRY_ENGINEERING_AND_MANUFACTURING.absolute,
        URLs.INTERNATIONAL_INDUSTRY_FINANCIAL_SERVICES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_FOOD_AND_DRINK.absolute,
        URLs.INTERNATIONAL_INDUSTRY_HEALTH_AND_LIFE_SCIENCES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_LEGAL_SERVICES.absolute,
        URLs.INTERNATIONAL_INDUSTRY_MARITIME.absolute,
        URLs.INTERNATIONAL_INDUSTRY_NUCLEAR_ENERGY.absolute,
        URLs.INTERNATIONAL_INDUSTRY_OIL_AND_GAS.absolute,
        URLs.INTERNATIONAL_INDUSTRY_RETAIL.absolute,
        URLs.INTERNATIONAL_INDUSTRY_SPACE.absolute,
        URLs.INTERNATIONAL_INDUSTRY_SPORTS_ECONOMY.absolute,
        URLs.INTERNATIONAL_INDUSTRY_TECHNOLOGY.absolute,
    ],
)
def test_industry_pages_uat(url, basic_auth):
    get_and_assert(
        url=url, status_code=HTTP_200_OK, auth=basic_auth, allow_redirects=True
    )

main.py | inffbl/automsqos | dd01323680d1bb18142e9003d2ab6314554d5d49 | Apache-2.0

from funcs.read_table import f_read_table
from funcs.read_sql import f_read_sql
from funcs.read_excel import f_read_excel
from funcs.insert_to_db import f_insert_to_db
import pandas as pd

if __name__ == '__main__':
    print("Hello World!")

data/batchers/__init__.py | df424/ml | dd3f3992e00fba8375e7d96703d78c8b03f699f7 | MIT
from ml.data.batchers.batcher import Batcher
from ml.data.batchers.bucket_batcher import BucketBatcher

routes/image_classify.py | moonrailgun/ai-api | 06bbf536ce833ebad37aa1d17b220e90b66fbd00 | MIT

# -*- coding: utf-8 -*-
from flask import request, jsonify
from flask.blueprints import Blueprint
from ai_clients import image_classify_client
import base64

image_classify = Blueprint('image_classify', __name__)

ALLOWED_EXTENSIONS = ['png', 'jpg', 'bmp']


def allowed_file(filename):
    return '.' in filename and filename.rsplit('.', 1)[1] in ALLOWED_EXTENSIONS


@image_classify.route('/general', methods=['post'])
def advanced_general():
    image = request.files.get('image')
    if not image:
        return jsonify({
            'result': False,
            'msg': u'缺少image',
        })
    if not (image and allowed_file(image.filename)):
        return jsonify({
            'result': False,
            'msg': u'文件上传失败,只允许jpg/png/bmp',
        })
    # do not persist the uploaded file
    data = image.read()
    res = image_classify_client.advancedGeneral(data)
    if res.get('error_code'):
        return jsonify({
            'result': False,
            'msg': res.get('error_msg', ''),
        })
    return jsonify({
        'result': True,
        'data': res,
    })


@image_classify.route('/dishDetect', methods=['post'])
def dish_detect():
    image = request.files.get('image')
    top_num = request.form.get('top_num', 5)
    if not image:
        return jsonify({
            'result': False,
            'msg': u'缺少image',
        })
    if not (image and allowed_file(image.filename)):
        return jsonify({
            'result': False,
            'msg': u'文件上传失败,只允许jpg/png/bmp',
        })
    # do not persist the uploaded file
    data = image.read()
    res = image_classify_client.dishDetect(data, {'top_num': top_num})
    if res.get('error_code'):
        return jsonify({
            'result': False,
            'msg': res.get('error_msg', ''),
        })
    return jsonify({
        'result': True,
        'data': res,
    })


@image_classify.route('/carDetect', methods=['post'])
def car_detect():
    image = request.files.get('image')
    top_num = request.form.get('top_num', 5)
    if not image:
        return jsonify({
            'result': False,
            'msg': u'缺少image',
        })
    if not (image and allowed_file(image.filename)):
        return jsonify({
            'result': False,
            'msg': u'文件上传失败,只允许jpg/png/bmp',
        })
    # do not persist the uploaded file
    data = image.read()
    res = image_classify_client.carDetect(data, {'top_num': top_num})
    if res.get('error_code'):
        return jsonify({
            'result': False,
            'msg': res.get('error_msg', ''),
        })
    return jsonify({
        'result': True,
        'data': res,
    })


@image_classify.route('/logoSearch', methods=['post'])
def logo_search():
    image = request.files.get('image')
    custom_lib = request.form.get('custom_lib', False)
    if not image:
        return jsonify({
            'result': False,
            'msg': u'缺少image',
        })
    if not (image and allowed_file(image.filename)):
        return jsonify({
            'result': False,
            'msg': u'文件上传失败,只允许jpg/png/bmp',
        })
    # do not persist the uploaded file
    data = image.read()
    res = image_classify_client.logoSearch(data, {'custom_lib': custom_lib})
    if res.get('error_code'):
        return jsonify({
            'result': False,
            'msg': res.get('error_msg', ''),
        })
    return jsonify({
        'result': True,
        'data': res,
    })


@image_classify.route('/animalDetect', methods=['post'])
def animal_detect():
    image = request.files.get('image')
    top_num = request.form.get('top_num', 6)
    if not image:
        return jsonify({
            'result': False,
            'msg': u'缺少image',
        })
    if not (image and allowed_file(image.filename)):
        return jsonify({
            'result': False,
            'msg': u'文件上传失败,只允许jpg/png/bmp',
        })
    # do not persist the uploaded file
    data = image.read()
    res = image_classify_client.animalDetect(data, {'top_num': top_num})
    if res.get('error_code'):
        return jsonify({
            'result': False,
            'msg': res.get('error_msg', ''),
        })
    return jsonify({
        'result': True,
        'data': res,
    })


@image_classify.route('/plantDetect', methods=['post'])
def plant_detect():
    image = request.files.get('image')
    if not image:
        return jsonify({
            'result': False,
            'msg': u'缺少image',
        })
    if not (image and allowed_file(image.filename)):
        return jsonify({
            'result': False,
            'msg': u'文件上传失败,只允许jpg/png/bmp',
        })
    # do not persist the uploaded file
    data = image.read()
    res = image_classify_client.plantDetect(data)
    if res.get('error_code'):
        return jsonify({
            'result': False,
            'msg': res.get('error_msg', ''),
        })
    return jsonify({
        'result': True,
        'data': res,
    })


@image_classify.route('/objectDetect', methods=['post'])
def object_detect():
    image = request.files.get('image')
    with_face = request.form.get('with_face', False)
    if not image:
        return jsonify({
            'result': False,
            'msg': u'缺少image',
        })
    if not (image and allowed_file(image.filename)):
        return jsonify({
            'result': False,
            'msg': u'文件上传失败,只允许jpg/png/bmp',
        })
    # do not persist the uploaded file
    data = image.read()
    res = image_classify_client.objectDetect(data, {'with_face': with_face})
    if res.get('error_code'):
        return jsonify({
            'result': False,
            'msg': res.get('error_msg', ''),
        })
    return jsonify({
        'result': True,
        'data': res,
    })

kaylee-hudson/mutation.py | homologus/2020-intermediate-class | 06c943219a857781455f7dc246311698ea78546b | MIT

from Bio.Seq import Seq
read = Seq("GGGTCAACTGCTATGATGTGTTTGGAGCCCAGTCACCCTTTGGTGGCTACAAGATGTCGGGGAGTGGCCGGGAGTTGGGCGAGTACGGGCTGCAGGCATACACTAAAGTGAAAACTGT")
print(read.translate())

makmal/forms.py | ParmenidesSartre/Makmal-Record-System | 6646467cb2275023e9d83735a3fa58017bfdabeb | MIT

from django import forms
from .models import Report


class ReportForm(forms.ModelForm):
    project_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    contractor_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    pipe_type = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    pipe_size = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    result_1 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}))
    result_2 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}))
    result_3 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}), required=False)
    result_4 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}), required=False)
    done_on = forms.DateTimeField(widget=forms.DateInput(format='%Y/%m/%d %H:%M', attrs={
        'id': 'dateTimeFlatpickr',
        'class': "form-control flatpickr flatpickr-input active",
        'placeholder': 'Pilih Tarikh...'}))

    class Meta:
        model = Report
        fields = '__all__'


class AddRecordForm(forms.ModelForm):
    project_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    contractor_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    pipe_type = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    pipe_size = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
    result_1 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}))
    result_2 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}))
    result_3 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}), required=False)
    result_4 = forms.IntegerField(widget=forms.TextInput(attrs={'class': "form-control"}), required=False)
    done_on = forms.DateTimeField(widget=forms.DateInput(format='%Y/%m/%d %H:%M', attrs={
        'id': 'dateTimeFlatpickr',
        'class': "form-control flatpickr flatpickr-input active",
        'placeholder': 'Pilih Tarikh...'}))

    class Meta:
        model = Report
        fields = '__all__'

nautobot/dcim/migrations/0006_auto_slug.py | psmware-ltd/nautobot | c6bf466c75c1145e1e2bc71730532c3e1df3293e | Apache-2.0

# Generated by Django 3.1.13 on 2021-08-05 19:24

from django.db import migrations

import nautobot.core.fields


class Migration(migrations.Migration):

    dependencies = [
        ("dcim", "0005_device_local_context_schema"),
    ]

    operations = [
        migrations.AlterField(
            model_name="devicerole",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="name", unique=True),
        ),
        migrations.AlterField(
            model_name="devicetype",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="model", unique=None),
        ),
        migrations.AlterField(
            model_name="manufacturer",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="name", unique=True),
        ),
        migrations.AlterField(
            model_name="platform",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="name", unique=True),
        ),
        migrations.AlterField(
            model_name="rackgroup",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="name", unique=None),
        ),
        migrations.AlterField(
            model_name="rackrole",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="name", unique=True),
        ),
        migrations.AlterField(
            model_name="region",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="name", unique=True),
        ),
        migrations.AlterField(
            model_name="site",
            name="slug",
            field=nautobot.core.fields.AutoSlugField(blank=True, max_length=100, populate_from="name", unique=True),
        ),
    ]

Codes examples in the book/chs16-20_code/mathproj/comp/numeric/n1.py | almazkun/The-Quick-Python-Book-Third-Edition | c6cbf3a662233558b79dff5fe202044e11927961 | Apache-2.0

from mathproj import version
from mathproj.comp import c1
from mathproj.comp.numeric.n2 import h


def g():
    print("version is", version)
    print(h())

src/text_selection_tests/common/ngram_extractor.py/test_get_ngrams_generator.py | stefantaubert/text-selection | af34a0ee93abb591acfb92f17a6181d7d4ec11c3 | MIT

from text_selection.common.ngram_extractor import get_ngrams_generator


def test_n1_empty__returns_empty():
    symbols = tuple()

    result = list(get_ngrams_generator(symbols, n=1))

    assert result == []


def test_n2_empty__returns_empty():
    symbols = tuple()

    result = list(get_ngrams_generator(symbols, n=2))

    assert result == []


def test_n3_empty__returns_empty():
    symbols = tuple()

    result = list(get_ngrams_generator(symbols, n=3))

    assert result == []


def test_n2_a__returns_empty():
    symbols = ("a",)

    result = list(get_ngrams_generator(symbols, n=2))

    assert result == []


def test_n3_ab__returns_empty():
    symbols = ("a", "b")

    result = list(get_ngrams_generator(symbols, n=3))

    assert result == []


def test_n1_abc__returns_ab_bc():
    symbols = ("a", "b", "c")

    result = list(get_ngrams_generator(symbols, n=1))

    assert result == [("a",), ("b",), ("c",)]


def test_n2_abc__returns_ab_bc():
    symbols = ("a", "b", "c")

    result = list(get_ngrams_generator(symbols, n=2))

    assert result == [("a", "b"), ("b", "c")]


def test_n3_abc__returns_abc():
    symbols = ("a", "b", "c")

    result = list(get_ngrams_generator(symbols, n=3))

    assert result == [("a", "b", "c")]
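
# The assertions above pin down the n-gram semantics exactly (sliding window of
# width n, empty output whenever len(symbols) < n). A minimal generator that
# satisfies the same assertions is sketched below; `ngrams` is a hypothetical
# stand-in for illustration only, not the real `get_ngrams_generator` from
# text_selection.common.ngram_extractor.

```python
from typing import Generator, Tuple


def ngrams(symbols: Tuple[str, ...], n: int) -> Generator[Tuple[str, ...], None, None]:
    """Yield every contiguous n-gram of `symbols` as a tuple."""
    # range() is empty whenever len(symbols) < n, so nothing is yielded,
    # matching the "returns empty" cases exercised by the tests above.
    for i in range(len(symbols) - n + 1):
        yield tuple(symbols[i:i + n])


# The same checks as the test suite above:
assert list(ngrams(tuple(), n=1)) == []
assert list(ngrams(("a", "b"), n=3)) == []
assert list(ngrams(("a", "b", "c"), n=1)) == [("a",), ("b",), ("c",)]
assert list(ngrams(("a", "b", "c"), n=2)) == [("a", "b"), ("b", "c")]
```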

src/saltext/vmware/modules/vmc_vpn_statistics.py | kdsalvy/salt-ext-modules-vmware-1 | af459004e59f5651e457bedfa558e9f805f4a95c | Apache-2.0

"""
Salt execution module for VPN statistics
Provides methods to Display VPN Statistics and Sessions.
"""
import logging

from saltext.vmware.utils import vmc_constants
from saltext.vmware.utils import vmc_request

log = logging.getLogger(__name__)

__virtualname__ = "vmc_vpn_statistics"


def __virtual__():
    return __virtualname__


def get_ipsec_statistics(
    hostname,
    refresh_key,
    authorization_host,
    org_id,
    sddc_id,
    locale_service_id,
    service_id,
    session_id,
    tier0_id=None,
    tier1_id=None,
    verify_ssl=True,
    cert=None,
    enforcement_point_path=None,
):
    """
    Retrieves VPN IPSec statistics from the given SDDC.

    CLI Example:

    .. code-block:: bash

        salt vm_minion vmc_vpn_statistics.get_ipsec_statistics hostname=nsxt-manager.local ...

    hostname
        The host name of NSX-T manager

    refresh_key
        refresh_key to get access token

    authorization_host
        hostname to get access token

    org_id
        org_id of the SDDC

    sddc_id
        sddc_id from which ipsec statistics should be retrieved

    locale_service_id
        id of locale service for example default

    service_id
        id of service for example default

    session_id
        id of session

    Enter one of the tier0 or tier1 id

    tier0_id
        id of tier0 for example vmc

    tier1_id is currently not supported
        id of tier1

    verify_ssl
        (Optional) Option to enable/disable SSL verification. Enabled by default.
        If set to False, the certificate validation is skipped.

    cert
        (Optional) Path to the SSL client certificate file to connect to VMC Cloud Console.
        The certificate can be retrieved from browser.

    enforcement_point_path
        (Optional) String Path of the enforcement point
    """
    log.info("Retrieving ipsec statistics for SDDC %s", sddc_id)
    if tier0_id and tier1_id:
        log.error(vmc_constants.VPN_ERROR_SPECIFY_ONE)
        return {"error": vmc_constants.VPN_ERROR_SPECIFY_ONE}
    elif not (tier0_id or tier1_id):
        log.error(vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE)
        return {"error": vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE}

    if tier0_id:
        tier = "tier-0s"
        tier_id = tier0_id
    else:
        tier = "tier-1s"
        tier_id = tier1_id

    api_url_base = vmc_request.set_base_url(hostname)
    api_url = (
        "{base_url}vmc/reverse-proxy/api/orgs/{org_id}/sddcs/{sddc_id}/"
        "policy/api/v1/infra/{tier}/{tier_id}/locale-services/{locale_service_id}/"
        "ipsec-vpn-services/{service_id}/sessions/{session_id}/statistics"
    )
    api_url = api_url.format(
        base_url=api_url_base,
        org_id=org_id,
        sddc_id=sddc_id,
        tier=tier,
        tier_id=tier_id,
        locale_service_id=locale_service_id,
        service_id=service_id,
        session_id=session_id,
    )
    params = vmc_request._filter_kwargs(
        allowed_kwargs=["enforcement_point_path"], enforcement_point_path=enforcement_point_path
    )
    return vmc_request.call_api(
        method=vmc_constants.GET_REQUEST_METHOD,
        url=api_url,
        refresh_key=refresh_key,
        authorization_host=authorization_host,
        description="vmc_vpn_statistics.get_ipsec_statistics",
        verify_ssl=verify_ssl,
        cert=cert,
        params=params,
    )


def get_ipsec_sessions(
    hostname,
    refresh_key,
    authorization_host,
    org_id,
    sddc_id,
    locale_service_id,
    service_id,
    tier0_id=None,
    tier1_id=None,
    verify_ssl=True,
    cert=None,
    cursor=None,
    page_size=None,
    sort_by=None,
    sort_ascending=None,
):
    """
    Retrieves ipsec sessions from the given SDDC.
    This also includes l2vpn sessions.

    CLI Example:

    .. code-block:: bash

        salt vm_minion vmc_vpn_statistics.get_ipsec_sessions hostname=nsxt-manager.local ...

    hostname
        The host name of NSX-T manager

    refresh_key
        refresh_key to get access token

    authorization_host
        hostname to get access token

    org_id
        org_id of the SDDC

    sddc_id
        sddc_id from which public ips should be retrieved

    locale_service_id
        id of locale service for example default

    service_id
        id of service for example default

    Enter one of the tier0 or tier1 id

    tier0_id
        id of tier0 for example vmc

    tier1_id is currently not supported
        id of tier1

    verify_ssl
        (Optional) Option to enable/disable SSL verification. Enabled by default.
        If set to False, the certificate validation is skipped.

    cert
        (Optional) Path to the SSL client certificate file to connect to VMC Cloud Console.
        The certificate can be retrieved from browser.

    cursor
        (Optional) Opaque cursor to be used for getting next page of records (supplied by current result page)

    page_size
        (Optional) Maximum number of results to return in this page. Default page size is 1000.

    sort_by
        (Optional) Field by which records are sorted

    sort_ascending
        (Optional) Boolean value to sort result in ascending order. Enabled by default.
    """
    log.info("Retrieving ipsec sessions for SDDC %s", sddc_id)
    if tier0_id and tier1_id:
        log.error(vmc_constants.VPN_ERROR_SPECIFY_ONE)
        return {"error": vmc_constants.VPN_ERROR_SPECIFY_ONE}
    elif not (tier0_id or tier1_id):
        log.error(vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE)
        return {"error": vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE}

    if tier0_id:
        tier = "tier-0s"
        tier_id = tier0_id
    else:
        tier = "tier-1s"
        tier_id = tier1_id

    api_url_base = vmc_request.set_base_url(hostname)
api_url = (
"{base_url}vmc/reverse-proxy/api/orgs/{org_id}/sddcs/{sddc_id}/"
"policy/api/v1/infra/{tier}/{tier_id}/locale-services/{locale_service_id}/"
"ipsec-vpn-services/{service_id}/sessions"
)
api_url = api_url.format(
base_url=api_url_base,
org_id=org_id,
sddc_id=sddc_id,
tier=tier,
tier_id=tier_id,
locale_service_id=locale_service_id,
service_id=service_id,
)
params = vmc_request._filter_kwargs(
allowed_kwargs=["cursor", "page_size", "sort_ascending", "sort_by"],
cursor=cursor,
page_size=page_size,
sort_by=sort_by,
sort_ascending=sort_ascending,
)
return vmc_request.call_api(
method=vmc_constants.GET_REQUEST_METHOD,
url=api_url,
refresh_key=refresh_key,
authorization_host=authorization_host,
description="vmc_vpn_statistics.get_ipsec_sessions",
verify_ssl=verify_ssl,
cert=cert,
params=params,
)
def get_l2vpn_statistics(
hostname,
refresh_key,
authorization_host,
org_id,
sddc_id,
locale_service_id,
service_id,
session_id,
tier0_id=None,
tier1_id=None,
verify_ssl=True,
cert=None,
enforcement_point_path=None,
source=None,
):
"""
Retrieves L2VPN Statistics from Given SDDC
CLI Example:
.. code-block:: bash
salt vm_minion vmc_vpn_statistics.get_l2vpn_statistics hostname=nsxt-manager.local ...
hostname
The host name of NSX-T manager
refresh_key
refresh_key to get access token
authorization_host
hostname to get access token
org_id
org_id of the SDDC
sddc_id
sddc_id from which ipsec statistics should be retrieved
locale_service_id
id of locale service for example default
service_id
id of service for example default
session_id
id of session
Enter one of the tier0 or tier1 id
tier0_id
id of tier0 for example vmc
tier1_id is currently not supported
id of tier1
verify_ssl
(Optional) Option to enable/disable SSL verification. Enabled by default.
If set to False, the certificate validation is skipped.
cert
(Optional) Path to the SSL client certificate file to connect to VMC Cloud Console.
The certificate can be retrieved from browser.
enforcement_point_path
(Optional) String Path of the enforcement point
source
(Optional) valid options are realtime, cached
"""
log.info("Retrieving l2vpn statistics for SDDC %s", sddc_id)
if tier0_id and tier1_id:
log.error(vmc_constants.VPN_ERROR_SPECIFY_ONE)
return {"error": vmc_constants.VPN_ERROR_SPECIFY_ONE}
elif not (tier0_id or tier1_id):
log.error(vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE)
return {"error": vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE}
if tier0_id:
tier = "tier-0s"
tier_id = tier0_id
else:
tier = "tier-1s"
tier_id = tier1_id
api_url_base = vmc_request.set_base_url(hostname)
api_url = (
"{base_url}vmc/reverse-proxy/api/orgs/{org_id}/sddcs/{sddc_id}/"
"policy/api/v1/infra/{tier}/{tier_id}/locale-services/{locale_service_id}/"
"l2vpn-services/{service_id}/sessions/{session_id}/statistics"
)
api_url = api_url.format(
base_url=api_url_base,
org_id=org_id,
sddc_id=sddc_id,
tier=tier,
tier_id=tier_id,
locale_service_id=locale_service_id,
service_id=service_id,
session_id=session_id,
)
params = vmc_request._filter_kwargs(
allowed_kwargs=["enforcement_point_path", "source"],
enforcement_point_path=enforcement_point_path,
source=source,
)
return vmc_request.call_api(
method=vmc_constants.GET_REQUEST_METHOD,
url=api_url,
refresh_key=refresh_key,
authorization_host=authorization_host,
description="vmc_vpn_statistics.get_l2vpn_statistics",
verify_ssl=verify_ssl,
cert=cert,
params=params,
)
def get_l2vpn_sessions(
hostname,
refresh_key,
authorization_host,
org_id,
sddc_id,
locale_service_id,
service_id,
tier0_id=None,
tier1_id=None,
verify_ssl=True,
cert=None,
cursor=None,
page_size=None,
sort_by=None,
sort_ascending=None,
):
"""
Retrieves l2vpn session from Given SDDC
CLI Example:
.. code-block:: bash
salt vm_minion vmc_vpn_statistics.get_l2vpn_sessions hostname=nsxt-manager.local ...
hostname
The host name of NSX-T manager
refresh_key
refresh_key to get access token
authorization_host
hostname to get access token
org_id
org_id of the SDDC
sddc_id
sddc_id from which public ips should be retrieved
locale_service_id
id of locale service for example default
service_id
id of service for example default
Enter one of the tier0 or tier1 id
tier0_id
id of tier0 for example vmc
tier1_id is currently not supported
id of tier1
verify_ssl
(Optional) Option to enable/disable SSL verification. Enabled by default.
If set to False, the certificate validation is skipped.
cert
(Optional) Path to the SSL client certificate file to connect to VMC Cloud Console.
The certificate can be retrieved from browser.
cursor
(Optional) Opaque cursor to be used for getting next page of records (supplied by current result page)
page_size
(Optional) Maximum number of results to return in this page. Default page size is 1000.
sort_by
(Optional) Field by which records are sorted
sort_ascending
(Optional) Boolean value to sort result in ascending order. Enabled by default.
"""
log.info("Retrieving l2vpn sessions for SDDC %s", sddc_id)
if tier0_id and tier1_id:
log.error(vmc_constants.VPN_ERROR_SPECIFY_ONE)
return {"error": vmc_constants.VPN_ERROR_SPECIFY_ONE}
elif not (tier0_id or tier1_id):
log.error(vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE)
return {"error": vmc_constants.VPN_ERROR_SPECIFY_ATLEAST_ONE}
if tier0_id:
tier = "tier-0s"
tier_id = tier0_id
else:
tier = "tier-1s"
tier_id = tier1_id
api_url_base = vmc_request.set_base_url(hostname)
api_url = (
"{base_url}vmc/reverse-proxy/api/orgs/{org_id}/sddcs/{sddc_id}/"
"policy/api/v1/infra/{tier}/{tier_id}/locale-services/{locale_service_id}/"
"l2vpn-services/{service_id}/sessions"
)
api_url = api_url.format(
base_url=api_url_base,
org_id=org_id,
sddc_id=sddc_id,
tier=tier,
tier_id=tier_id,
locale_service_id=locale_service_id,
service_id=service_id,
)
params = vmc_request._filter_kwargs(
allowed_kwargs=["cursor", "page_size", "sort_ascending", "sort_by"],
cursor=cursor,
page_size=page_size,
sort_by=sort_by,
sort_ascending=sort_ascending,
)
return vmc_request.call_api(
method=vmc_constants.GET_REQUEST_METHOD,
url=api_url,
refresh_key=refresh_key,
authorization_host=authorization_host,
description="vmc_vpn_statistics.get_l2vpn_sessions",
verify_ssl=verify_ssl,
cert=cert,
params=params,
)
# controller/api/tests/test_scheduler.py (shopkeep/deis, Apache-2.0)
"""
Unit tests for the Deis api app.
Run the tests with "./manage.py test api"
"""
from __future__ import unicode_literals
import json
from django.test import TransactionTestCase
from scheduler import chaos
class SchedulerTest(TransactionTestCase):
"""Tests creation of containers on nodes"""
fixtures = ['tests.json']
def setUp(self):
self.assertTrue(
self.client.login(username='autotest', password='password'))
body = {'id': 'autotest', 'domain': 'autotest.local', 'type': 'chaos',
'hosts': 'host1,host2', 'auth': 'base64string', 'options': {}}
response = self.client.post('/api/clusters', json.dumps(body),
content_type='application/json')
self.assertEqual(response.status_code, 201)
# start without any chaos
chaos.CREATE_ERROR_RATE = 0
chaos.DESTROY_ERROR_RATE = 0
chaos.START_ERROR_RATE = 0
chaos.STOP_ERROR_RATE = 0
def test_create_chaos(self):
url = '/api/apps'
body = {'cluster': 'autotest'}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
app_id = response.data['id']
# post a new build
url = "/api/apps/{app_id}/builds".format(**locals())
body = {'image': 'autotest/example', 'sha': 'a'*40,
'procfile': json.dumps({'web': 'node server.js', 'worker': 'node worker.js'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 1)
# scale to zero for consistency
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 0}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 204)
# let's get chaotic
chaos.CREATE_ERROR_RATE = 0.5
# scale up but expect a 503
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 20}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 503)
# inspect broken containers
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 20)
# make sure some failed
states = set([c['state'] for c in response.data['results']])
self.assertEqual(states, set(['error', 'created']))
def test_start_chaos(self):
url = '/api/apps'
body = {'cluster': 'autotest'}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
app_id = response.data['id']
# post a new build
url = "/api/apps/{app_id}/builds".format(**locals())
body = {'image': 'autotest/example', 'sha': 'a'*40,
'procfile': json.dumps({'web': 'node server.js', 'worker': 'node worker.js'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 1)
# scale to zero for consistency
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 0}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 204)
# let's get chaotic
chaos.START_ERROR_RATE = 0.5
# scale up, which will allow some crashed containers
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 20}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 204)
# inspect broken containers
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 20)
# make sure some failed
states = set([c['state'] for c in response.data['results']])
self.assertEqual(states, set(['crashed', 'up']))
def test_destroy_chaos(self):
url = '/api/apps'
body = {'cluster': 'autotest'}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
app_id = response.data['id']
# post a new build
url = "/api/apps/{app_id}/builds".format(**locals())
body = {'image': 'autotest/example', 'sha': 'a'*40,
'procfile': json.dumps({'web': 'node server.js', 'worker': 'node worker.js'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 1)
# scale up
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 20}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 204)
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 20)
# let's get chaotic
chaos.DESTROY_ERROR_RATE = 0.5
# scale to zero but expect a 503
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 0}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 503)
# inspect broken containers
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
states = set([c['state'] for c in response.data['results']])
self.assertEqual(states, set(['error']))
# make sure we can cleanup after enough tries
containers = 20
for _ in range(100):
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 0}
response = self.client.post(url, json.dumps(body), content_type='application/json')
# break if we destroyed successfully
if response.status_code == 204:
break
            self.assertEqual(response.status_code, 503)
# inspect broken containers
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
containers = len(response.data['results'])
def test_build_chaos(self):
url = '/api/apps'
body = {'cluster': 'autotest'}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
app_id = response.data['id']
# post a new build
url = "/api/apps/{app_id}/builds".format(**locals())
body = {'image': 'autotest/example', 'sha': 'a'*40,
'procfile': json.dumps({'web': 'node server.js', 'worker': 'node worker.js'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
# inspect builds
url = "/api/apps/{app_id}/builds".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 2)
# inspect releases
url = "/api/apps/{app_id}/releases".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 2)
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 1)
# scale up
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 20}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 204)
# simulate failing to create containers
chaos.CREATE_ERROR_RATE = 0.5
chaos.START_ERROR_RATE = 0.5
# post a new build
url = "/api/apps/{app_id}/builds".format(**locals())
body = {'image': 'autotest/example', 'sha': 'b'*40,
'procfile': json.dumps({'web': 'node server.js', 'worker': 'node worker.js'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 503)
# inspect releases
url = "/api/apps/{app_id}/releases".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 2)
# inspect containers
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 20)
# make sure all old containers are still up
states = set([c['state'] for c in response.data['results']])
self.assertEqual(states, set(['up']))
def test_config_chaos(self):
url = '/api/apps'
body = {'cluster': 'autotest'}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
app_id = response.data['id']
# post a new build
url = "/api/apps/{app_id}/builds".format(**locals())
body = {'image': 'autotest/example', 'sha': 'a'*40,
'procfile': json.dumps({'web': 'node server.js', 'worker': 'node worker.js'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
# inspect releases
url = "/api/apps/{app_id}/releases".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 2)
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 1)
# scale up
url = "/api/apps/{app_id}/scale".format(**locals())
body = {'web': 20}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 204)
# simulate failing to create or start containers
chaos.CREATE_ERROR_RATE = 0.5
chaos.START_ERROR_RATE = 0.5
# post a new config
url = "/api/apps/{app_id}/config".format(**locals())
body = {'values': json.dumps({'NEW_URL1': 'http://localhost:8080/'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 503)
# inspect releases
url = "/api/apps/{app_id}/releases".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 2)
# inspect containers
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 20)
# make sure all old containers are still up
states = set([c['state'] for c in response.data['results']])
self.assertEqual(states, set(['up']))
def test_run_chaos(self):
url = '/api/apps'
body = {'cluster': 'autotest'}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
app_id = response.data['id']
# post a new build
url = "/api/apps/{app_id}/builds".format(**locals())
body = {'image': 'autotest/example', 'sha': 'a'*40,
'procfile': json.dumps({'web': 'node server.js', 'worker': 'node worker.js'})}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 201)
# inspect builds
url = "/api/apps/{app_id}/builds".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 2)
# inspect releases
url = "/api/apps/{app_id}/releases".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 2)
url = "/api/apps/{app_id}/containers".format(**locals())
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['results']), 1)
# block all create operations
chaos.CREATE_ERROR_RATE = 1
# make sure the run fails with a 503
url = '/api/apps/{app_id}/run'.format(**locals())
body = {'command': 'ls -al'}
response = self.client.post(url, json.dumps(body), content_type='application/json')
self.assertEqual(response.status_code, 503)
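The tests above steer failures by flipping module-level rates such as `chaos.CREATE_ERROR_RATE` on the `scheduler.chaos` backend, whose implementation is not shown here. A minimal sketch of how such a knob could gate an operation, assuming the backend simply compares a uniform draw against the configured rate (the `maybe_fail` name is hypothetical):

```python
import random

def maybe_fail(error_rate, rng=random.Random(42)):
    """Return True when the operation should be failed on purpose.

    Sketch only: the real chaos scheduler is not shown in this file;
    this just illustrates the error-rate semantics used by the tests.
    """
    return rng.random() < error_rate

assert not maybe_fail(0)  # rate 0: operations always succeed
assert maybe_fail(1)      # rate 1: operations always fail
```

With `error_rate=0.5`, roughly half of the scheduled container operations fail, which is why the tests above accept either a 204 or a 503 and then inspect the mix of container states.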
# userManagement/User.py (micv-dev/DeepKubeGPUCluster, Apache-2.0)
class User:
    def __init__(self, user_name, password):
        self.user_name = user_name
        self.password = password

    def validate_user(self):
        return True
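`validate_user` is currently a stub that always returns `True`. A hypothetical hardened variant (not part of this module) would store a salted digest instead of the plaintext password and compare digests on validation:

```python
import hashlib

class HashedUser:
    """Hypothetical variant of User: stores a salted SHA-256 digest
    rather than the plaintext password. Sketch only."""

    def __init__(self, user_name, password, salt="demo-salt"):
        self.user_name = user_name
        self._salt = salt
        self._digest = hashlib.sha256((salt + password).encode()).hexdigest()

    def validate_user(self, password):
        # Recompute the digest for the candidate password and compare.
        candidate = hashlib.sha256((self._salt + password).encode()).hexdigest()
        return candidate == self._digest

u = HashedUser("alice", "s3cret")
assert u.validate_user("s3cret")
assert not u.validate_user("wrong")
```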
# tests/tree_node_test.py (paulo-erichsen/leetcode, MIT)
from algorithms.tree_node import make_tree, TreeNode
def test_make_tree1():
v = make_tree([6, 2, 8, 0, 4, 7, 9, -1, -1, 3, 5])
s0 = TreeNode(6)
s1 = TreeNode(2)
s2 = TreeNode(8)
s3 = TreeNode(0)
s4 = TreeNode(4)
s5 = TreeNode(7)
s6 = TreeNode(9)
s7 = TreeNode(-1)
s8 = TreeNode(-1)
s9 = TreeNode(3)
s10 = TreeNode(5)
s0.left = s1
s0.right = s2
s1.left = s3
s1.right = s4
s2.left = s5
s2.right = s6
s3.left = s7
s3.right = s8
s4.left = s9
s4.right = s10
assert len(v) == 11
assert v[0].val == s0.val
assert v[1].val == s1.val
assert v[2].val == s2.val
assert v[3].val == s3.val
assert v[4].val == s4.val
assert v[5].val == s5.val
assert v[6].val == s6.val
assert v[7].val == s7.val
assert v[8].val == s8.val
assert v[9].val == s9.val
assert v[10].val == s10.val
def test_make_tree2():
v = make_tree([6, 2, 8, 0, 4, 7, 9, None, None, 3, 5])
s0 = TreeNode(6)
s1 = TreeNode(2)
s2 = TreeNode(8)
s3 = TreeNode(0)
s4 = TreeNode(4)
s5 = TreeNode(7)
s6 = TreeNode(9)
s9 = TreeNode(3)
s10 = TreeNode(5)
s0.left = s1
s0.right = s2
s1.left = s3
s1.right = s4
s2.left = s5
s2.right = s6
s4.left = s9
s4.right = s10
assert len(v) == 11
assert v[0].val == s0.val
assert v[1].val == s1.val
assert v[2].val == s2.val
assert v[3].val == s3.val
assert v[4].val == s4.val
assert v[5].val == s5.val
assert v[6].val == s6.val
assert v[3].left is None
assert v[3].right is None
assert v[9].val == s9.val
assert v[10].val == s10.val
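Both tests exercise the same level-order construction: `make_tree` turns a flat list into linked `TreeNode`s, with `None` marking a missing child (test 1 uses `-1` as a real value, not a sentinel). A sketch of what `algorithms.tree_node.make_tree` appears to do, under that assumption (`make_tree_sketch` is an illustrative name, not the library function):

```python
from collections import deque

class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def make_tree_sketch(values):
    """Build a binary tree level by level; None means 'no node here'.

    Returns the list of created nodes in level order, mirroring the
    behaviour the tests above rely on.
    """
    nodes = [None if v is None else TreeNode(v) for v in values]
    kids = deque(nodes[1:])
    for node in nodes:
        if node is not None:   # absent nodes do not consume children
            if kids:
                node.left = kids.popleft()
            if kids:
                node.right = kids.popleft()
    return nodes

v = make_tree_sketch([6, 2, 8, 0, 4, None, None])
assert v[0].left.val == 2 and v[0].right.val == 8
assert v[1].left.val == 0 and v[1].right.val == 4
```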
# methodsAL.py (bo1929/active-learning-named-entity-recognition, MIT)
import os
import math
import random
import operator
import numpy as np
import pandas as pd
import wrappers.wrapper_UMAP as umap_
import matplotlib.pyplot as plt
import statsmodels.api as sm
from itertools import accumulate
def fit_distribution(lengths):
dens = sm.nonparametric.KDEUnivariate(lengths)
dens.fit()
return dens
def lenght_prob(dens, len_sent):
# prob = dens.evaluate(len_sent)
prob = math.sqrt(dens.evaluate(len_sent))
return prob
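`fit_distribution` estimates a density over sentence lengths with statsmodels' univariate KDE, and `lenght_prob` takes the square root of that density, flattening the prior so very common lengths do not dominate the acquisition score. A hand-rolled stand-in with a fixed Gaussian bandwidth (the bandwidth value and `gaussian_kde_prior` name are illustrative, not what statsmodels chooses):

```python
import numpy as np

def gaussian_kde_prior(lengths, bandwidth=2.0):
    """Return a callable p(len) from a simple Gaussian KDE sketch."""
    lengths = np.asarray(lengths, dtype=float)

    def density(x):
        # Average of Gaussian bumps centred on each observed length.
        z = (x - lengths) / bandwidth
        return np.mean(np.exp(-0.5 * z * z)) / (bandwidth * np.sqrt(2 * np.pi))

    # Square root flattens the prior, as in lenght_prob() above.
    return lambda x: np.sqrt(density(x))

prior = gaussian_kde_prior([5, 7, 8, 8, 9, 12, 20])
assert prior(8) > prior(40)  # typical lengths score higher than outliers
```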
def rs(idx_pool, batch_size, seed):
random.seed(a=seed, version=2)
idx_q = random.sample(idx_pool, batch_size)
return idx_q, [s for s in idx_pool if s not in idx_q]
def lss(sent_lengths, idx_pool, batch_size):
    batch = np.argpartition(np.array(sent_lengths) * (-1), batch_size - 1)[
        :batch_size
    ].tolist()
    idx_q = [idx_pool[i] for i in batch]
    return idx_q, [i for i in idx_pool if i not in idx_q]
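`lss` (longest-sentence selection) queries the `batch_size` longest sentences: negating the lengths turns `np.argpartition`'s smallest-k behaviour into a top-k selection without a full sort. A toy run, with a self-contained copy of the logic:

```python
import numpy as np

def select_longest(sent_lengths, idx_pool, batch_size):
    # Mirror of lss(): argpartition on negated lengths yields the top-k.
    batch = np.argpartition(np.array(sent_lengths) * (-1), batch_size - 1)[
        :batch_size
    ].tolist()
    idx_q = [idx_pool[i] for i in batch]
    return idx_q, [i for i in idx_pool if i not in idx_q]

queried, remaining = select_longest([3, 12, 7, 25, 1], [10, 11, 12, 13, 14], 2)
assert sorted(queried) == [11, 13]       # the 12- and 25-token sentences
assert sorted(remaining) == [10, 12, 14]
```

Note that `argpartition` returns the top-k in arbitrary order, which is fine here since only membership in the query batch matters.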
def tp(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
tp = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
tp[i] = 1 - min([(max(m_pool[i][j].values())) for j in range(len_sent)])
batch = np.argpartition(tp * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
def ttp(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
ttp = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
ttp[i] = sum([1 - (max(m_pool[i][j].values())) for j in range(len_sent)])
batch = np.argpartition(ttp * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
def ntp(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
ntp = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
ntp[i] = (
sum([1 - (max(m_pool[i][j].values())) for j in range(len_sent)]) / len_sent
)
batch = np.argpartition(ntp * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
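`tp`, `ttp`, and `ntp` are three aggregations of the same least-confidence signal, `1 - max(marginal)` per token: the worst token (`tp`), the sum over tokens (`ttp`), and the length-normalized sum (`ntp`). The difference on a toy pool of per-token marginals:

```python
# One 3-token sentence and one 1-token sentence; values are toy marginals.
m_pool = [
    [{"O": 0.6, "ENT": 0.4}, {"O": 0.9, "ENT": 0.1}, {"O": 0.5, "ENT": 0.5}],
    [{"O": 0.7, "ENT": 0.3}],
]

tp_scores = [1 - min(max(tok.values()) for tok in sent) for sent in m_pool]
ttp_scores = [sum(1 - max(tok.values()) for tok in sent) for sent in m_pool]
ntp_scores = [s / len(sent) for s, sent in zip(ttp_scores, m_pool)]

assert abs(tp_scores[0] - 0.5) < 1e-9       # worst token: max prob 0.5
assert abs(ttp_scores[0] - 1.0) < 1e-9      # 0.4 + 0.1 + 0.5
assert abs(ntp_scores[0] - 1.0 / 3) < 1e-9  # averaged over 3 tokens
```

`ttp` tends to favour long sentences (more terms in the sum), which is exactly the bias `ntp` removes by dividing by sentence length.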
# Minimum Token Margin
def tm(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
tm = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
tm[i] = 1 - min(
[
max(m_pool[i][j].values()) - sorted(m_pool[i][j].values())[-2]
for j in range(len_sent)
]
)
batch = np.argpartition(tm * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
def ttm(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
ttm = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
ttm[i] = sum(
[
1 - (max(m_pool[i][j].values()) - sorted(m_pool[i][j].values())[-2])
for j in range(len_sent)
]
)
batch = np.argpartition(ttm * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [s for s in idx_pool if s not in idx_q]
def ntm(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
ntm = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
ntm[i] = (
sum(
[
1 - (max(m_pool[i][j].values()) - sorted(m_pool[i][j].values())[-2])
for j in range(len_sent)
]
)
/ len_sent
)
batch = np.argpartition(ntm * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [s for s in idx_pool if s not in idx_q]
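The margin family (`tm`, `ttm`, `ntm`) replaces least confidence with the gap between the best and second-best class probability per token: a small margin means the model is torn between two labels. A toy check of the per-sentence `tm` score:

```python
m_pool = [
    [{"O": 0.6, "PER": 0.3, "LOC": 0.1}],    # margin 0.6 - 0.3 = 0.30
    [{"O": 0.45, "PER": 0.40, "LOC": 0.15}], # margin 0.05 — most ambiguous
]

margins = [
    min(max(tok.values()) - sorted(tok.values())[-2] for tok in sent)
    for sent in m_pool
]
tm_scores = [1 - m for m in margins]

assert abs(tm_scores[0] - 0.70) < 1e-9
assert tm_scores[1] > tm_scores[0]  # the ambiguous sentence is queried first
```

`sorted(tok.values())[-2]` is the second-largest probability, so the expression inside `min` is exactly the best-vs-runner-up margin used in `tm`/`ttm`/`ntm` above.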
def te(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
te = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
te[i] = max(
[
(-1) * sum([p * math.log2(p) for p in m_pool[i][j].values() if p > 0])
for j in range(len_sent)
]
)
batch = np.argpartition(te * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [s for s in idx_pool if s not in idx_q]
def tte(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
tte = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
tte[i] = sum(
[
(-1) * sum([p * math.log2(p) for p in m_pool[i][j].values() if p > 0])
for j in range(len_sent)
]
)
batch = np.argpartition(tte * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
def nte(m_pool, idx_pool, batch_size):
num_sent = len(m_pool)
nte = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
nte[i] = (
sum(
[
(-1)
* sum([p * math.log2(p) for p in m_pool[i][j].values() if p > 0])
for j in range(len_sent)
]
)
/ len_sent
)
batch = np.argpartition(nte * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
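The entropy family (`te`, `tte`, `nte`) scores each token by the Shannon entropy of its marginal, skipping zero-probability classes so `log2` stays defined. A small standalone check of the per-token term:

```python
import math

def token_entropy(dist):
    # Shannon entropy in bits, skipping p == 0 terms as te() does.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

uniform = {"O": 0.5, "ENT": 0.5}
peaked = {"O": 0.99, "ENT": 0.01}

assert abs(token_entropy(uniform) - 1.0) < 1e-9  # maximal for 2 classes
assert token_entropy(peaked) < token_entropy(uniform)
```

Unlike the margin, entropy uses the full distribution, so a token spread thinly over many classes scores high even when the top two probabilities are far apart.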
def ap(m_pool, y_pred, idx_pool, batch_size):
num_sent = len(m_pool)
ap = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
ap[i] = 1 - min([(m_pool[i][j][y_pred[i][j]]) for j in range(len_sent)])
batch = np.argpartition(ap * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
def tap(m_pool, y_pred, idx_pool, batch_size):
num_sent = len(m_pool)
tap = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
tap[i] = sum([1 - m_pool[i][j][y_pred[i][j]] for j in range(len_sent)])
batch = np.argpartition(tap * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
def nap(m_pool, y_pred, idx_pool, batch_size):
num_sent = len(m_pool)
nap = np.zeros(num_sent)
for i in range(num_sent):
len_sent = len(m_pool[i])
nap[i] = (
sum([1 - m_pool[i][j][y_pred[i][j]] for j in range(len_sent)]) / len_sent
)
batch = np.argpartition(nap * (-1), batch_size - 1)[:batch_size].tolist()
idx_q = [idx_pool[i] for i in batch]
return idx_q, [i for i in idx_pool if i not in idx_q]
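The `ap`/`tap`/`nap` family differs from `tp`/`ttp`/`ntp` in one detail: it scores the marginal probability of the *decoded* label `y_pred[i][j]` (e.g. a Viterbi path) rather than the per-token maximum, so the two disagree whenever the decoder picks a non-argmax label. A one-token illustration:

```python
m_pool = [[{"O": 0.6, "ENT": 0.4}]]
y_pred = [["ENT"]]  # decoder chose the non-argmax label, e.g. via Viterbi

ap_score = 1 - min(m_pool[0][j][y_pred[0][j]] for j in range(len(m_pool[0])))
tp_score = 1 - min(max(tok.values()) for tok in m_pool[0])

assert abs(ap_score - 0.6) < 1e-9  # 1 - P(ENT) = 1 - 0.4
assert abs(tp_score - 0.4) < 1e-9  # 1 - max marginal = 1 - 0.6
```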


# Note: fit_distribution and lenght_prob (sic) are helpers defined elsewhere
# in this module.


def _select_batch(scores, idx_pool, batch_size):
    """Return the pool ids of the `batch_size` highest-scoring sentences,
    plus the ids that remain in the pool."""
    batch = np.argpartition(scores * (-1), batch_size - 1)[:batch_size].tolist()
    idx_q = [idx_pool[i] for i in batch]
    return idx_q, [i for i in idx_pool if i not in idx_q]


def _cluster_pool(cfg, embeddings_ann, embeddings_pool, y_ann):
    """Shared preprocessing for the cluster-aware query strategies below:
    run semi-supervised UMAP + HDBSCAN, save a scatter plot of the pool,
    and compute the majority ("non-entity") cluster id plus a per-token
    outlier mask for the pool."""
    experiment_dir = cfg["experiment_directory"]
    tag_dict = cfg["tag_dict"]
    kwargs = {**cfg["umap_al"], **cfg["hdbscan_al"]}
    (
        embeddings_ann,
        embeddings_pool,
        clusters_ann,
        clusters_pool,
        clusterer,
        count_clusters,
    ) = umap_.ss_umap_r_hdbscan_c(
        embeddings_ann, embeddings_pool, y_ann, tag_dict, seed=cfg["seed"], **kwargs
    )
    sent_len_pool = [0] + [len(sent) for sent in embeddings_pool]
    sent_idx_pool = list(accumulate(sent_len_pool))
    fig, ax = plt.subplots(1, figsize=(18, 10))
    clr = [c for sent in clusters_pool for c in sent]
    coord = np.array([xy for sent in embeddings_pool for xy in sent])
    ax.scatter(coord[:, 0], coord[:, 1], c=clr, s=0.4, cmap="Spectral", alpha=0.5)
    fig.suptitle(
        "Semi-supervised UMAP + HDBSCAN, " + str(len(y_ann)) + " sentences labeled",
        fontsize=18,
    )
    plt.savefig(
        os.path.join(experiment_dir, "ss_umap_r_hdbscan_c_" + str(len(y_ann)) + ".png"),
        dpi=700,
    )
    # The most populated cluster is taken to be the non-entity cluster.
    n_ent = max(count_clusters.items(), key=operator.itemgetter(1))[0]
    pool_scores = clusterer.outlier_scores_[len(embeddings_ann):]
    threshold = pd.Series(pool_scores).quantile(cfg["hdbscan_al"]["mask_outlier"])
    outliers = np.where(pool_scores > threshold)[0]
    mask_out = np.zeros(len(pool_scores))
    mask_out[outliers] = 1
    mask_out = [
        mask_out[sent_idx_pool[i - 1] : sent_idx_pool[i]]
        for i in range(1, len(sent_idx_pool))
    ]
    return embeddings_pool, clusters_pool, n_ent, mask_out


def ptp(cfg, embeddings_ann, embeddings_pool, y_ann, m_pool, idx_pool, batch_size):
    PDF = fit_distribution([len(sent) for sent in embeddings_pool])
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            1 - max(m_pool[i][j].values())
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        ) * lenght_prob(PDF, len_sent)
    return _select_batch(entity_rich, idx_pool, batch_size)


def otp(cfg, embeddings_ann, embeddings_pool, y_ann, m_pool, idx_pool, batch_size):
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            1 - max(m_pool[i][j].values())
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        )
    return _select_batch(entity_rich, idx_pool, batch_size)


def otm(cfg, embeddings_ann, embeddings_pool, y_ann, m_pool, idx_pool, batch_size):
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            1 - max(m_pool[i][j].values()) - sorted(m_pool[i][j].values())[-2]
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        )
    return _select_batch(entity_rich, idx_pool, batch_size)


def ptm(cfg, embeddings_ann, embeddings_pool, y_ann, m_pool, idx_pool, batch_size):
    PDF = fit_distribution([len(sent) for sent in embeddings_pool])
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            1 - max(m_pool[i][j].values()) - sorted(m_pool[i][j].values())[-2]
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        ) * lenght_prob(PDF, len_sent)
    return _select_batch(entity_rich, idx_pool, batch_size)


def pte(cfg, embeddings_ann, embeddings_pool, y_ann, m_pool, idx_pool, batch_size):
    PDF = fit_distribution([len(sent) for sent in embeddings_pool])
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            (-1) * sum(p * math.log2(p) for p in m_pool[i][j].values() if p > 0)
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        ) * lenght_prob(PDF, len_sent)
    return _select_batch(entity_rich, idx_pool, batch_size)


def ote(cfg, embeddings_ann, embeddings_pool, y_ann, m_pool, idx_pool, batch_size):
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            (-1) * sum(p * math.log2(p) for p in m_pool[i][j].values() if p > 0)
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        )
    return _select_batch(entity_rich, idx_pool, batch_size)


def pap(
    cfg, embeddings_ann, embeddings_pool, y_ann, y_pred, m_pool, idx_pool, batch_size
):
    PDF = fit_distribution([len(sent) for sent in embeddings_pool])
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            1 - m_pool[i][j][y_pred[i][j]]
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        ) * lenght_prob(PDF, len_sent)
    return _select_batch(entity_rich, idx_pool, batch_size)


def oap(
    cfg, embeddings_ann, embeddings_pool, y_ann, y_pred, m_pool, idx_pool, batch_size
):
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        entity_rich[i] = sum(
            1 - m_pool[i][j][y_pred[i][j]]
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        )
    return _select_batch(entity_rich, idx_pool, batch_size)


def pas(cfg, embeddings_ann, embeddings_pool, y_ann, idx_pool, batch_size):
    PDF = fit_distribution([len(sent) for sent in embeddings_pool])
    embeddings_pool, clusters_pool, n_ent, mask_out = _cluster_pool(
        cfg, embeddings_ann, embeddings_pool, y_ann
    )
    num_sent = len(embeddings_pool)
    entity_rich = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(embeddings_pool[i])
        # Count of entity-like / outlier tokens, weighted by sentence-length
        # probability.
        entity_rich[i] = sum(
            1
            for j in range(len_sent)
            if (clusters_pool[i][j] != n_ent) or (mask_out[i][j] == 1)
        ) * lenght_prob(PDF, len_sent)
    return _select_batch(entity_rich, idx_pool, batch_size)
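
# For reference, a minimal self-contained sketch of how one of these acquisition
# functions selects its batch -- here the normalized token-entropy score `nte`,
# re-declared below so the example runs on its own. The toy `m_pool` (one dict of
# tag probabilities per token, one list of dicts per sentence) is made up for
# illustration and mirrors the structure assumed by the code above.

```python
import math

import numpy as np


def nte(m_pool, idx_pool, batch_size):
    """Normalized token entropy: mean per-token entropy of each pooled sentence."""
    num_sent = len(m_pool)
    scores = np.zeros(num_sent)
    for i in range(num_sent):
        len_sent = len(m_pool[i])
        scores[i] = sum(
            -sum(p * math.log2(p) for p in m_pool[i][j].values() if p > 0)
            for j in range(len_sent)
        ) / len_sent
    # argpartition with kth = batch_size - 1 puts the top-`batch_size`
    # scores (by negated value) in the first positions without a full sort.
    batch = np.argpartition(-scores, batch_size - 1)[:batch_size].tolist()
    idx_q = [idx_pool[i] for i in batch]
    return idx_q, [i for i in idx_pool if i not in idx_q]


# Three one-token "sentences": confident, maximally uncertain, in between.
m_pool = [
    [{"O": 0.99, "ENT": 0.01}],  # near-zero entropy
    [{"O": 0.5, "ENT": 0.5}],    # entropy = 1 bit
    [{"O": 0.8, "ENT": 0.2}],    # ~0.72 bits
]
queried, remaining = nte(m_pool, idx_pool=[10, 11, 12], batch_size=1)
print(queried, remaining)  # → [11] [10, 12]: the most uncertain sentence wins
```

The same selection pattern (score each sentence, take the top `batch_size` with `np.argpartition`, split queried from remaining ids) is shared by all the strategies in this module.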
a52fdbb0ccdab0b7b4077b5b4a9cf345e667b730 | 141 | py | Python | extensions/.stubs/clrclasses/System/Runtime/Hosting/__init__.py | vicwjb/Pycad | 7391cd694b7a91ad9f9964ec95833c1081bc1f84 | ["MIT"] | 1 | 2020-03-25T03:27:24.000Z | 2020-03-25T03:27:24.000Z
from __clrclasses__.System.Runtime.Hosting import ActivationArguments
from __clrclasses__.System.Runtime.Hosting import ApplicationActivator
a569ae7343dfd0987fb48a227c0bc9a2eca31973 | 137 | py | Python | tirelire-auth/app/adapters/repository/__init__.py | AgRenaud/tirelire | 0ac42dbf735dea4ecb741057bd037c18657b95c7 | ["MIT"]
from app.adapters.repository.repository import AbstractUserRepository
from app.adapters.repository.user_repository import UserRepository
b1c89d6db178a9ad8f289e03b6fadc98302ca3b7 | 3576 | py | Python | tests/test_mail.py | olivierdalang/django-toosimple-q | eaab43f43ed83541fa3197133fffdf1dfa6b9e01 | ["MIT"] | 13 | 2020-04-14T18:15:18.000Z | 2022-03-28T19:32:34.000Z
from django.core import mail, management
from django.core.mail import send_mail
from django.test import TestCase
from django.test.utils import override_settings
from django_toosimple_q.models import Task
from .utils import QueueAssertionMixin
class TestMail(QueueAssertionMixin, TestCase):
def setUp(self):
mail.outbox.clear()
@override_settings(
EMAIL_BACKEND="django_toosimple_q.contrib.mail.backend.QueueBackend",
TOOSIMPLEQ_EMAIL_BACKEND="django.core.mail.backends.locmem.EmailBackend",
)
def test_queue_mail(self):
self.assertQueue(0)
send_mail(
"Subject here",
"Here is the message.",
"from@example.com",
["to@example.com"],
)
self.assertQueue(1, state=Task.QUEUED)
self.assertQueue(1)
self.assertEquals(len(mail.outbox), 0)
management.call_command("worker", "--until_done")
self.assertQueue(1, state=Task.SUCCEEDED)
self.assertQueue(1)
self.assertEquals(len(mail.outbox), 1)
@override_settings(
EMAIL_BACKEND="django_toosimple_q.contrib.mail.backend.QueueBackend",
TOOSIMPLEQ_EMAIL_BACKEND="django.core.mail.backends.locmem.EmailBackend",
)
def test_queue_mail_two(self):
self.assertQueue(0)
send_mail(
"Subject here",
"Here is the message.",
"from@example.com",
["to@example.com"],
)
send_mail(
"Other subject here",
"Here is the message.",
"from@example.com",
["to@example.com"],
)
self.assertQueue(2, state=Task.QUEUED)
self.assertQueue(2)
self.assertEquals(len(mail.outbox), 0)
management.call_command("worker", "--until_done")
self.assertQueue(2, state=Task.SUCCEEDED)
self.assertQueue(2)
self.assertEquals(len(mail.outbox), 2)
@override_settings(
EMAIL_BACKEND="django_toosimple_q.contrib.mail.backend.QueueBackend",
TOOSIMPLEQ_EMAIL_BACKEND="django.core.mail.backends.locmem.EmailBackend",
)
def test_queue_mail_duplicate(self):
self.assertQueue(0)
send_mail(
"Subject here",
"Here is the message.",
"from@example.com",
["to@example.com"],
)
send_mail(
"Subject here",
"Here is the message.",
"from@example.com",
["to@example.com"],
)
self.assertQueue(1, state=Task.QUEUED)
self.assertQueue(1)
self.assertEquals(len(mail.outbox), 0)
management.call_command("worker", "--until_done")
self.assertQueue(1, state=Task.SUCCEEDED)
self.assertQueue(1)
self.assertEquals(len(mail.outbox), 1)
@override_settings(
EMAIL_BACKEND="django_toosimple_q.contrib.mail.backend.QueueBackend",
TOOSIMPLEQ_EMAIL_BACKEND="failing_backend",
)
    def test_queue_mail_failing_backend(self):
self.assertQueue(0)
send_mail(
"Subject here",
"Here is the message.",
"from@example.com",
["to@example.com"],
)
self.assertQueue(1, state=Task.QUEUED)
self.assertQueue(1)
self.assertEquals(len(mail.outbox), 0)
management.call_command("worker", "--until_done")
self.assertQueue(1, state=Task.FAILED, replaced=True)
self.assertQueue(1, state=Task.SLEEPING)
self.assertQueue(2)
self.assertEquals(len(mail.outbox), 0)
5926afefc110413ddd88d32ad41a9a9db102e548 | 6162 | py | Python | Algorithms/Search_Algorithm/HashMap.py | UnimaidElectrical/PythonForBeginners | b5f6ae01c86112b78258491a3efe636ec6a0c5e8 | ["MIT"] | 1 | 2021-08-12T14:25:51.000Z | 2021-08-12T14:25:51.000Z
#HashMap
#Format Of Records
class AlgoHashTable:
def __name__(self, size):
self.size = size
#self.hash_table = [[] for _ in range(self.size)]
#self.hash_table = self.create_buckets()
#def create_buckets(self):
# return [[] for _ in range(self.size)]
def set_val(self, key, value):
pass
def get_val(self, key):
pass
def __str__(self):
#return "Hello!"
#return "".join(str(item) for item in self.hash_table)
hash_table = AlgoHashTable(256)
print(hash_table)
'''
*************************************************************************************
*************************************************************************************
*************************************************************************************
class AlgoHashTable:
def __name__(self, size):
self.size = size
self.hash_table = [[] for _ in range(self.size)]
#self.hash_table = self.create_buckets()
#def create_buckets(self):
# return [[] for _ in range(self.size)]
def set_val(self, key, value):
pass
def get_val(self, key):
pass
def __str__(self):
return "".join(str(item) for item in self.hash_table)
hash_table = AlgoHashTable(256)
print(hash_table)
*************************************************************************************
*************************************************************************************
*************************************************************************************
class AlgoHashTable:
def __init__(self, size):
self.size = size
self.hash_table = [[] for _ in range(self.size)]
#print(self.hash_table)
def __str__(self):
return "".join(str(item) for item in self.hash_table)
hash_table = AlgoHashTable(256)
print(hash_table)
*************************************************************************************
*************************************************************************************
*************************************************************************************
#placing two key value pair in one list/location
class AlgoHashTable:
def __init__(self, size):
self.size = size
self.hash_table = self.create_buckets()
def create_buckets(self):
return [[] for _ in range(self.size)]
def set_val(self,key,value):
hashed_key = 10 #hash(key)%self.size
bucket = self.hash_table[hashed_key]
bucket.append((key,value))
def __str__(self):
return "".join(str(item) for item in self.hash_table)
hash_table = AlgoHashTable(256)
print(hash_table)
hash_table.set_val('mashrur@example.com', 'some value')
hash_table.set_val('johndoe@example.com', 'some other value')
print(hash_table)
** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** *
** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** *
** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** *
# edit an item in the key/value pair like the way it is implemented in the dictionary DataStructure
class AlgoHashTable:
def __init__(self, size):
self.size = size
self.hash_table = self.create_buckets()
def create_buckets(self):
return [[] for _ in range(self.size)]
def set_val(self, key, value):
hashed_key = 10 # hash(key)%self.size
bucket = self.hash_table[hashed_key]
bucket.append((key, value))
def __str__(self):
return "".join(str(item) for item in self.hash_table)
hash_table = AlgoHashTable(256)
print(hash_table)
hash_table.set_val('mashrur@example.com', 'some value')
hash_table.set_val('johndoe@example.com', 'some other value')
print(hash_table)
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
the key must be an int or a str and it cannot be a list because it cannot be changed.
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
'''
class AlgoHashTable:
def __init__(self, size):
self.size = size
#self.hash_table = [[] for _ in range(self.size)]
#print(self.hash_table)
self.hash_table = self.create_buckets()
def create_buckets(self):
return [[] for _ in range(self.size)]
def set_val(self, key, value):
hashed_key = hash(key)%self.size
bucket = self.hash_table[hashed_key]
found_key = False
for index, record in enumerate(bucket):
record_key, record_value = record
if record_key == key:
found_key = True
break
if found_key:
bucket[index] = (key, value)
else:
bucket.append((key, value))
bucket.append((key,value))
def get_val(self, key):
pass
def __str__(self):
#pass
#return "Hello!"
return "".join(str(item) for item in self.hash_table)
hash_table = AlgoHashTable(256)
print(hash_table)
hash_table.set_val('tegaslink@gmail.com','some value')
hash_table.set_val('JohnDoe@gmail.com','some other value')
print(hash_table)
records = [ ('mashrur@example.com','Hello World'),
('JohnDoe@example.com','Hello to you too'),
('Janedoe@example.com', 'Python is awesome')
]
for record in records:
key, value = record #This unpackes the tuple into key and value
print(key, value)
for index, record in enumerate(records):
key, value = record
print(index,key,value,)
print(records[index])
for index, record in enumerate(records):
key, value = record
if key == 'JohnDoe@example.com':
break
print(index,key,value,)
print(records[index]) | 27.145374 | 152 | 0.471762 | 625 | 6,162 | 4.4368 | 0.1408 | 0.133069 | 0.093761 | 0.045438 | 0.797692 | 0.781464 | 0.781464 | 0.745041 | 0.73278 | 0.688785 | 0 | 0.004521 | 0.210321 | 6,162 | 227 | 153 | 27.145374 | 0.565351 | 0.061019 | 0 | 0.553571 | 0 | 0 | 0.09717 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.053571 | 0 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
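
# To round the tutorial off, here is a hypothetical self-contained variant that
# also completes get_val (left as `pass` above), so insert, update, and lookup
# can all be exercised end to end. The class and method names are illustrative,
# not part of the original file.

```python
class SimpleHashTable:
    def __init__(self, size):
        self.size = size
        self.buckets = [[] for _ in range(self.size)]

    def set_val(self, key, value):
        bucket = self.buckets[hash(key) % self.size]
        for index, (record_key, _) in enumerate(bucket):
            if record_key == key:
                bucket[index] = (key, value)  # update existing key in place
                return
        bucket.append((key, value))  # new key: append to the bucket

    def get_val(self, key):
        bucket = self.buckets[hash(key) % self.size]
        for record_key, record_value in bucket:
            if record_key == key:
                return record_value
        return None  # key not present


table = SimpleHashTable(4)
table.set_val("a", 1)
table.set_val("a", 2)  # overwrites; no duplicate entry is created
print(table.get_val("a"))        # → 2
print(table.get_val("missing"))  # → None
```

Updating in place (rather than appending a second tuple) is what keeps each key unique within its bucket, mirroring dict semantics.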
3cd90ea6ba3521002d3c5ff9e21bcb08091b3af9 | 105 | py | Python | historia/culture/__init__.py | eranimo/historia | 5e0b047d4bcdd534f48f8b9bf19d425b0b31a3fd | ["MIT"] | 6 | 2016-04-26T18:39:36.000Z | 2021-09-01T09:13:38.000Z
from historia.culture.culture import Culture
from historia.culture.culture_variant import CultureVariant